When testing performance, we often need to take a timestamp before and after an operation and compute the difference. In Java this is very simple: System.currentTimeMillis() returns the current time with millisecond accuracy. In C, most date functions, such as time() and localtime(), are only accurate to the second. For millisecond accuracy there are ftime() and gettimeofday(), both of which return their result through a structure. ftime() is declared in &lt;sys/timeb.h&gt; and fills in a struct timeb, which contains both a seconds field and a milliseconds field.
Now here is the problem: I need to make a CORBA call from a C program into a Java program and back, and measure the C-to-Java and Java-to-C call times. Java's System.currentTimeMillis() returns a large long value; according to the documentation it is the number of milliseconds since January 1, 1970 (more than 30 years, converted to milliseconds, a huge number). On the C side, taking the result of ftime(), multiplying the seconds by 1000 and adding the milliseconds actually gave me a negative number. It did keep incrementing correctly within each second, but its range only seemed to cover a bit more than 17 days. What did that mean? The key question was that the two languages apparently used different reference times, so how could I compute the elapsed time between them? I was stuck.
After more testing, ha, I found the problem. It turns out that the seconds field in the timeb structure is an int-sized type: when I multiplied it by 1000 it overflowed and turned negative, even though the original value was positive. After a simple calculation it was also clear that this count starts from January 1, 1970. So C and Java measure time from the same epoch, and I can take the value obtained in C and the value obtained in Java and simply subtract one from the other.
You can also see C's pursuit of efficiency here: to save memory, the seconds are stored in a single int-sized value, with overflow as the hidden cost when you scale it. As for gettimeofday(), its struct nominally reports microseconds, but the actual resolution I observed was relatively poor, for reasons I don't know (in practice it depends on the operating system's clock tick).