Millisecond time on Linux and Windows
Question:
I want to get the system time in milliseconds (I don't care whether it is wall-clock time; I just want it to be as accurate as possible). Is this a good way to do it?
#ifdef WIN32
#include <windows.h>

unsigned long long get_ms_time() {
    // Counts per second; constant after boot, so query it once.
    static LARGE_INTEGER freq = { 0 };
    if (freq.QuadPart == 0)
        QueryPerformanceFrequency(&freq);
    LARGE_INTEGER t;
    QueryPerformanceCounter(&t);
    // Multiply before dividing to keep millisecond resolution.
    return t.QuadPart * 1000ULL / freq.QuadPart;
}
#else
#include <time.h>

unsigned long long get_ms_time() {
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t); // monotonic: unaffected by clock changes
    return (unsigned long long)t.tv_sec * 1000ULL + t.tv_nsec / 1000000;
}
#endif
How can I fit this value into a signed int? I tried a plain cast and got negative values like these (on Linux; I don't know about Windows):
~ start
-2083002438
~ 15 seconds after..
-2082987440
~ 15 seconds after..
-2082972441
What I would like is this:
~ start
X
~ 15 seconds after..
X + 14998
~ 15 seconds after..
X + 29997
where X is a positive number. (I want the output to be positive and increasing.)
Answer:
I do something like this in my code...
timespec specStart, specStop;

// Get start time (seconds since an arbitrary monotonic epoch)...
clock_gettime(CLOCK_MONOTONIC_RAW, &specStart);
std::cout << "start time : " << specStart.tv_sec << std::endl;
...
// Get stop time...
clock_gettime(CLOCK_MONOTONIC_RAW, &specStop);
std::cout << "stop time : " << specStop.tv_sec << std::endl;

// Time diff from start time to stop time, in nanoseconds.
unsigned long long timeStart = specStart.tv_sec * 1000000000ULL + specStart.tv_nsec;
unsigned long long timeStop  = specStop.tv_sec * 1000000000ULL + specStop.tv_nsec;
unsigned long long timeDelta = timeStop - timeStart;

int microSec = timeDelta / 1000;        // ns -> µs
int mSec     = timeDelta / 1000000;     // ns -> ms
int sec      = timeDelta / 1000000000;  // ns -> s

std::cout << "time diff : " << std::endl
          << sec << " s" << std::endl
          << mSec << " ms" << std::endl
          << microSec << " µs" << std::endl
          << timeDelta << " ns" << std::endl;
Why don't you show us your code? – Nick 2013-05-09 11:30:18
The code you showed looks fine, so I suspect you are printing the wrong value or making some other mistake. – 2013-05-09 11:37:28
int x = (int)get_ms_time(); cout – dan 2013-05-09 11:37:45