
I was wondering about a method where we could use the RAM I/O speed to calculate time. For example: if the RAM transfer rate is 1000 MB/s, then when the RAM is half full (about 500 MB written), dividing 500 by 1000 gives us the time, which is epoch + 0.5 seconds. All in all, we would have three routines, roughly as sketched below the list:

  1. one for writing data (zeros, for example) into RAM
  2. another for checking when a specific amount has been written
  3. and a third for calculating the time
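
A minimal sketch of what those three routines could look like in C, assuming a constant, known write rate of 1000 MB/s and a fixed 100 MB chunk size; both figures are illustrative assumptions, and the answers below explain why a constant rate cannot be taken for granted on real hardware:

    /* Estimate elapsed time purely from how many bytes have been written,
     * assuming the write rate is a known constant. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define RATE_MB_PER_S 1000.0            /* assumed constant RAM write rate */
    #define CHUNK_MB      100               /* size written per iteration */

    int main(void)
    {
        size_t chunk_bytes = (size_t)CHUNK_MB * 1024 * 1024;
        unsigned char *buf = malloc(chunk_bytes);
        if (buf == NULL)
            return 1;

        double mb_written = 0.0;

        for (int i = 0; i < 5; i++) {
            memset(buf, 0, chunk_bytes);    /* routine 1: write zeros into RAM */
            mb_written += CHUNK_MB;         /* routine 2: track how much is written */

            /* routine 3: derive time from the assumed rate */
            double elapsed_s = mb_written / RATE_MB_PER_S;
            printf("%.0f MB written -> about %.3f s since start\n",
                   mb_written, elapsed_s);
        }

        free(buf);
        return 0;
    }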

Note: Sorry if this sounds stupid; this is not my field of study.

3 Answers


You could try to measure time from RAM transfer rates, but there would have to be a precise, fixed clock ratio between the RAM and the CPU, and you would need to know it. I don't think that is the case on typical PC architectures: the RAM and the CPU have their own clocks, and I'm not sure the RAM's peak transfer rate is even a constant.

You'd need to block all other sources of interrupts, because if the CPU is responding to some other interrupt, it may notice too late that the RAM is ready, so the time measurement would tend to drift.

The nail in the coffin is that while the CPU is waiting for the RAM, it isn't doing anything else. The same objection goes for anything that requires the CPU to perform computations to obtain the time — a CPU that does nothing but measure the time has no time left to do useful things.
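
To make that objection concrete, here is a hypothetical sketch of a delay routine that keeps time by polling a made-up hardware byte counter; neither the counter nor the constant rate exists as such on a real PC, and the point is simply that the CPU spends the entire interval spinning:

    #include <stdint.h>

    /* Hypothetical: a hardware counter of bytes transferred so far and an
     * assumed constant transfer rate; neither exists as such on a real PC. */
    extern volatile uint64_t bytes_transferred;
    #define BYTES_PER_SECOND 1000000000ULL

    /* Busy-waits until roughly `seconds` have passed. Every cycle spent in
     * the loop below is a cycle the CPU cannot spend on useful work. */
    void busy_wait_seconds(uint64_t seconds)
    {
        uint64_t target = bytes_transferred + seconds * BYTES_PER_SECOND;

        while (bytes_transferred < target) {
            /* spin: the CPU is fully occupied with "measuring time" */
        }
    }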

Gilles 'SO- stop being evil'

On some video-gaming platforms where the performance characteristics are very rigidly defined and consistent, you can make a very good approximation of elapsed time by counting how many frames have elapsed. For example, if the hardware is designed to refresh the screen every 1/60th of a second, and you are able to detect each time the screen refreshes, you can make a good approximation that 1 second has passed for every 60 refreshes. This allows you to track with decent accuracy the amount of time since the system or application was started; if you punched in the real time when the program started, you could then approximate the real time as long as the program was running.
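
A minimal sketch of that frame-counting approach, assuming a hypothetical wait_for_vblank() that blocks until the next refresh on fixed 60 Hz hardware (real platforms expose this differently):

    #include <stdint.h>
    #include <stdio.h>

    #define REFRESH_HZ 60                  /* fixed refresh rate of the hardware */

    extern void wait_for_vblank(void);     /* hypothetical: returns once per refresh */

    int main(void)
    {
        uint64_t frames = 0;

        for (;;) {
            wait_for_vblank();             /* one screen refresh has completed */
            frames++;

            if (frames % REFRESH_HZ == 0) {
                /* 60 refreshes at 1/60 s each is roughly one second */
                printf("about %llu s elapsed\n",
                       (unsigned long long)(frames / REFRESH_HZ));
            }
        }
    }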

Some console games measure time based on how many new frames have been drawn, rather than by how many screen refreshes have taken place, and thus you can observe in-game timers actually slowing down if the framerate decreases.

user45623

As you suggest, there would be ways in specific environments to measure time indirectly, but, as the commenters say, most environments impose heavy constraints that make this very unwieldy. For example, the moment you have any non-maskable interrupts (never mind full multi-tasking), you need to account for those interruptions.

Mark Hurd