Hi again,

I'm a bit confused about these two properties. In the common MDX framework we had a single parameter, float elapsedTime, and now we have two times. Can someone tell me the physical difference between them?

p.s. I debugged the milliseconds and seconds of both times. The result is strange enough: only the game milliseconds value is non-zero, while the real milliseconds keeps jumping from zero up to approximately the same value as the game time...

p.p.s. In addition there is a TotalMilliseconds property, so altogether there are four milliseconds properties; totally confused :S. Update: this part is solved. Milliseconds is an int, TotalMilliseconds is a double :)

Re: XNA Framework ElapsedGameTime vs ElapsedRealTime


Usually you want your game to run smoothly on all types of computers, so the best choice is to call the game logic at a stabilized frequency. It's your design-time task to organize all the other game components so that the rendering time stays below the rated logic-loop time on the lowest class of graphics card your game supports. There are techniques for this, e.g. tweening. In a non-XNA game you have to arrange all of it yourself: initialization, loading, updating, rendering and other auxiliary tasks. XNA does a big job for us by setting up the game loop automatically, with the Update and Draw methods already wired in. So it gives us ElapsedGameTime, the time elapsed since the last Update call, and ElapsedRealTime, the time elapsed since the last Draw call.
ElapsedGameTime must be the same on every frame for smooth movement. If your calculations exceed TargetElapsedTime, the IsRunningSlowly property is set to true. In that case you have to simplify the calculations (e.g. use approximate algorithms, generate fewer particles, and so on). ElapsedRealTime varies from 0 (if your measurement unit is milliseconds) up to ElapsedGameTime, and beyond it when the video card does not support your game the way you assumed at design time. The reciprocal of ElapsedRealTime (in seconds) is the fps in the usual sense.
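As a rough sketch of how this looks in practice (assuming the classic XNA 3.x Game class; UpdateWorld and ReduceParticleCount are hypothetical helpers, not XNA APIs):

```csharp
// Sketch of a fixed-timestep Update override in an XNA Game subclass.
protected override void Update(GameTime gameTime)
{
    // With Game.IsFixedTimeStep = true (the default), ElapsedGameTime
    // equals TargetElapsedTime (1/60 s by default) on every call,
    // which is what keeps the movement smooth.
    float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;
    UpdateWorld(dt);

    // If a frame's work exceeds TargetElapsedTime, XNA sets this flag;
    // the usual reaction is to cheapen the per-frame work.
    if (gameTime.IsRunningSlowly)
        ReduceParticleCount();

    base.Update(gameTime);
}
```

The point of the sketch: game logic only ever sees the fixed ElapsedGameTime, while the wall-clock cost of a frame shows up indirectly through IsRunningSlowly (and through ElapsedRealTime in Draw).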

All the XNA timings are derived from the .NET TimeSpan struct. The minimal piece of time is the tick = 0.0000001 sec (100 ns). The system counter just adds up ticks continuously, and the component properties like Milliseconds are integers. But the Total* properties like TotalMilliseconds, TotalDays etc. express the whole duration in a single unit, which can be a huge number of ticks too large for the integer component types, so they have floating-point (double) types.
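This also answers the p.p.s. above. A minimal self-contained demonstration of the difference between the integer component Milliseconds (0..999) and the double TotalMilliseconds:

```csharp
using System;
using System.Diagnostics;

class TickDemo
{
    static void Main()
    {
        // One tick is 100 ns, i.e. 0.0000001 s, so 1.5 s = 15,000,000 ticks.
        TimeSpan t = TimeSpan.FromSeconds(1.5);
        Debug.Assert(t.Ticks == 15000000L);

        // Milliseconds is the integer *component* of the duration (0..999)...
        Debug.Assert(t.Milliseconds == 500);

        // ...while TotalMilliseconds is the *whole* duration as a double.
        Debug.Assert(t.TotalMilliseconds == 1500.0);

        Console.WriteLine(t.Milliseconds);       // 500
        Console.WriteLine(t.TotalMilliseconds);  // 1500
    }
}
```

So the "4 milliseconds properties" are really two pairs: an int component and a double total, once for game time and once for real time.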