Debunking The Need For 200FPS

Digital_Fusion writes: "If you follow the gaming sites at all, you will hear about people who have tweaked their gaming rigs to get 200fps in games like Quake3. Do we really need this kind of framerate? Well, no, according to this paper. It seems that anything over 72fps is just wasted, as our visual pipeline is saturated past that point." On the other hand, I'm glad that companies make it possible to show crazy-fast framerates, for the trickle-down effect of cheaper "normal" cards.
  • There are two things to consider here ....
    • if your screen's phosphors are long-persistence (a TV, for example) they limit how fast your screen can refresh anyway (it's a tricky balance - if the phosphor decays too fast, dumb VGA screens flicker; too slow, and fast-refresh screens blur ...)
    • how fast is the display subsystem - refreshing a large screen above about a 250MHz pixel clock (say 85Hz) is pretty close to impossible due to the immense amount of data that has to be fetched for the refresh process - this takes bandwidth away from poly rendering - above 250MHz is also a big problem for the analog portion of the subsystem (DACs, cables etc. - the result can be smeared pixels); a rough pixel-clock calculation follows below
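    (A rough back-of-the-envelope check of that pixel-clock figure - an illustration only, not from the post above; the ~30% blanking overhead is an assumed typical value:)

    /* Rough pixel-clock estimate for a large, fast CRT mode.
       The ~30% blanking overhead is an assumed typical figure, not a spec. */
    #include <stdio.h>

    int main(void) {
        const double width = 1600, height = 1200, refresh_hz = 85.0;
        const double blanking_overhead = 1.30;   /* assumed horizontal + vertical blanking */

        double active_pixel_rate = width * height * refresh_hz;
        double pixel_clock_hz = active_pixel_rate * blanking_overhead;

        printf("Active pixel rate : %.0f MHz\n", active_pixel_rate / 1e6);
        printf("Approx pixel clock: %.0f MHz\n", pixel_clock_hz / 1e6);
        /* Roughly 163 MHz of active pixels, ~212 MHz with blanking --
           the same ballpark as the 250MHz figure quoted above. */
        return 0;
    }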
  • these "frame rates" are what are probably causing me to lose chess matches online.

    I'm sure they're also responsible for my miserable Karma.
  • First of all, NTSC is an INTERLACED format: 30fps, 60 _fields_per_second_. Also, most (if not all) NTSC monitors have phosphors that decay a lot slower than today's XGA+ monitors. The slower decay coupled with 60 fields per second means that you achieve motion that will look smoother than 60Hz on a computer monitor (at which most people will still perceive flicker). Coupled with (as someone mentioned) the camera's natural motion blur, this all adds up to motion smooth enough to fool the human eye. The point is that, all technical errata aside, PCs and TVs are two different beasties.
  • This whole debate has gotten silly. I personally don't agree with this, "I read in my psych book humans see at XX fps" tie in to Q3 frames.

    Btw, this whole debate was held about a year and a half ago, only it was 30 vs 60 fps, because film only plays at 24 fps. 3DFX even released a 30 vs. 60 demo.

    First, no matter what, your monitor is a strobe light. It is not a constant light stream. Unless you have perfectly synched frame generation with your refresh rate, there are going to be syncing issues. Some frames skipped, some drawn twice, etc. Those discrepancies are noticed.

    Second of all, even if every refresh of your monitor drew one and only one frame of the world, it would still be a strobe light which is not synched up with your vision. Any person who sits at a computer all day with his/her refresh set to 100Hz can easily tell when the refresh is dropped down to 72Hz. Just as dropping down to 60Hz and opening up a solid white window is noticeable to people in 75Hz land.

    This was just a silly rant, but anyone who plays at 100+ fps can tell the difference between the two. There are just way too many factors going into your visualization for any of us or the Canadian and his snowflake story to put a hard number on it.
  • by andyh1978 ( 173377 ) on Monday October 30, 2000 @02:23PM (#664304) Homepage
    I do know that sometimes when playing games with an uber-high frame rate, I sometimes get a 'blurring' effect that I don't get at lower frame rates.
    Your card may be rendering at 200fps, but I bet your monitor isn't set to 200Hz vertical refresh!

    And I don't know many monitors that would even handle that.

    Since the only way to render at framerates above the monitor's vertical refresh rate is (obviously) to disable vertical sync (which pauses rendering until the screen has been updated), you'll get tearing effects: part of the screen is drawn from data rendered in one frame, and the next part of the refresh uses the next rendered frame.

    In fact, this shows that your data is being wasted; say, for example, at 200fps on a 100Hz monitor only half the data from each frame is actually drawn.

    At high frame rates, the tearing effect probably causes the 'blurring' you describe.
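    (A minimal sketch of that arithmetic - illustration only; it just assumes each refresh scans out whatever is in the framebuffer at the time:)

    /* With vsync disabled, each monitor refresh scans out whatever happens to
       be in the framebuffer, so when the render rate exceeds the refresh rate
       each rendered frame only occupies a slice of the scanout. */
    #include <stdio.h>

    int main(void) {
        double refresh_hz = 100.0;   /* monitor refresh */
        double render_fps = 200.0;   /* card render rate */

        if (render_fps > refresh_hz) {
            double frames_per_refresh = render_fps / refresh_hz;
            printf("~%.1f frames are torn together per refresh;\n", frames_per_refresh);
            printf("on average only %.0f%% of each rendered frame reaches the screen.\n",
                   100.0 / frames_per_refresh);
        } else {
            printf("Every rendered frame can be shown in full.\n");
        }
        return 0;
    }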
  • Greetings (my first post). In games such as Quake III Arena, there are other very important factors to consider regarding smoothness, such as the mouse's sampling rate and sensitivity. Since you directly control the in-game POV with your mouse, it is imperative that your input is in sync with the video. Since high-performance mice have high sampling rates (125Hz for USB), 72 fps is simply not enough.
  • I'm not sure what the eye's "maximum refresh rate" might be, but I know this is a stupid way to measure it. They should at least do it outside in full sunshine.

    Not precisely, full darkness turns out to be better, which makes sense if you think about it.

    But your point about interference with 60 Hz (US) rates is dead-on correct. Working in the interactive TV biz, I often have to deal with PAL monitors in the US, and there's a noticeable worsening in the quality of the PAL TVs (50 Hz refresh) when they're illuminated by fluorescent lighting (I'm in the US, where the lighting flickers at 60 Hz). So I turn the lights off and work from the daylight from the window. (In my case, even then there may also be electrical interference yanking on the signal; I haven't figured out how to prove whether that contributes much to my perception of flicker on PAL TVs in the US--yet.)

    --j

  • I've seen this argument before. This time it's gaming. Last time, I think, it was IMAX. It is not simply a matter of anything beyond the limits of human perception being wasted. It's about providing a more intense experience. I occasionally run a psychedelic light show. Rule number one is that you START at the point of sensory overload, then pile on.

    While you may not be able to visually process 200fps, you can most certainly distinguish a qualitative difference between 60fps and 200fps. The sheep who patronize Blockbuster don't know that the 220 lines they're seeing don't approach broadcast quality, let alone DVD/LaserDisc/Satellite - they're still able to boggle when they see a nice quadrupled high-res image at or above 1600 lines.

    More data gives the brain more to work with, whether you can process every individual frame or not, subliminally you're going to pick up MUCH more information, and the result is a more intense experience.

    Of course, not everyone runs their games on a 90" screen, either.
  • This is correct. What we should be looking at is worst-case framerate. It doesn't matter how pretty CounterStrike looks when it runs at an average of 99fps. When it drops below 30 in wide-open spaces with smoke grenades, then it really hurts. I want a guarantee of 72fps. As for this $600 video card nonsense, the GeForce2 MX does the job admirably, almost as well as the GeForce 2 GTS. It's only a hundred-odd bucks, and makes a bigger difference than you think.

    Captain_Frisk
  • Isn't this what they do in Q3? I love the performance I get on my dual-CPU machine. The framerate doesn't drop so rapidly when there are a lot of things going on. It's definitely using 2 threads of execution.
  • That's only true if you keep up with buying (or more commonly, warezing) the newest Direct3D games, which are the same games as they were in 1994, with more intensive graphics.

    I still play Quake and some addons for Quake2. I bought Quake3 for cheap on Boxing Day but didn't really like it.
    Why would I want a 128MB AGP 4x card that pushes 200fps on some new Direct3D game that I will never play?
  • The human brain can hardly process more than 25 to 30 images per second. Furthermore, the images persist for about 0.05 to 0.1 seconds on our retinas. Cinema and TV are based on these sample values.

    What's important is image and colour resolution. This discussion already took place when the PAL standard was being defined. NTSC produces 30 fps, PAL only 25, but with better resolution.

    And for graphics boards, the number of polygons per second matters more, which strongly depends on the transfer rate between RAM and VRAM (see the PS2 threads....). I would also say that, given that textures add a lot of realism with "little effort", it's more important to support lots of textures than to push beyond 60-72 frames per second (yes, yes, it sounds like Murphy's law, but it also serves to avoid screen refresh artifacts).

    --ricardo

  • That's a reason for higher framerates - at 200 fps you can render the scene multiple times and add the images together to create proper motion blurring.

    Of course... a lot of games emulate motion blur anyway.... (who builds models for the bullets streaking towards you when you can just do a streak in the air?)

    There are motion-blurring algorithms; I wonder when hardware motion blur will become a requirement for the next generation of graphics cards?
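    (For illustration, a minimal accumulation-style motion-blur sketch; the "scene", buffer sizes and subframe count are made up, and this is not any particular game's code:)

    /* Accumulation-style motion blur sketch: render several temporal subframes
       within one displayed frame and average them. The "scene" is just a white
       square moving horizontally. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    #define W 64
    #define H 48
    #define SUBFRAMES 4   /* e.g. a 200fps-capable renderer feeding a 50Hz display */

    static void render_scene(double t, uint8_t img[H][W]) {
        memset(img, 0, (size_t)H * W);
        int x0 = (int)(t * 100.0) % (W - 8);      /* square moves 100 px/s */
        for (int y = 20; y < 28; y++)
            for (int x = x0; x < x0 + 8; x++)
                img[y][x] = 255;
    }

    static void render_blurred_frame(double t0, double dt, uint8_t out[H][W]) {
        static uint8_t sub[H][W];
        static uint32_t accum[H][W];
        memset(accum, 0, sizeof accum);
        for (int s = 0; s < SUBFRAMES; s++) {
            render_scene(t0 + dt * s / SUBFRAMES, sub);   /* sample inside the frame */
            for (int y = 0; y < H; y++)
                for (int x = 0; x < W; x++)
                    accum[y][x] += sub[y][x];
        }
        for (int y = 0; y < H; y++)
            for (int x = 0; x < W; x++)
                out[y][x] = (uint8_t)(accum[y][x] / SUBFRAMES);  /* temporal average */
    }

    int main(void) {
        static uint8_t frame[H][W];
        render_blurred_frame(0.0, 1.0 / 50.0, frame);     /* one 50Hz display frame */
        printf("trailing-edge pixel = %u\n", frame[24][0]); /* half-covered -> mid grey */
        return 0;
    }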
  • That's pretty impressive, I've never seen a monitor with a 200Hz refresh rate...mine maxes out at 72Hz, although I'm sure newer monitors are better. I know that if the FPS is higher than your monitor's refresh rate, you really won't be able to tell the difference in 100FPS vs. 200FPS, because there won't be any...
  • But don't movie cameras introduce temporal anti-aliasing, which would help reduce the effects of a lower framerate?
    Absolutely.

    Since computer graphics are instantaneous snapshots of the scene, this accentuates the frame rate a great deal.

    Movie cameras show a 'blur' of the movement over the duration of the frame... i.e. temporal antialiasing.

    Graphics cards are supporting spatial antialiasing, which gives the impression of a higher resolution and smooths those nasty jaggies at the edges. Temporal antialiasing could be the 'next big thing' (although the methods using accumulation buffers are years old, the hardware has to catch up). 3dfx have their T-buffer, which could do such a thing, and aren't ATI producing a card with accumulation buffer support?
  • This writer makes many mistakes early on about the anatomy and physiology of the retina/brain system. He/she doesn't understand cones and rods, nor the pathways (superior colliculus or lateral geniculate nucleus). Because of that, the rest of his article is flawed (remember, false assumptions lead to false conclusions).

    Indeed, the major fallacy is that we live in an analog world. No we don't! We live in a quantum world, and light is quantized. That in itself should set up the rest of the system (retina/brain) to behave more like a digital system. This is (partly) why we perceive moderate flicker at the movies; 24 fps doesn't cut it. Nor does 60Hz (30Hz really, taking the interlace into account). Flicker stops when some part of the retina/brain system is fully saturated. I don't know if there is such a point; our bodies seem to have very large bandwidth.

    Just a side note, I can see flicker on my 21 inch monitor at 70 and 75 Hz. I can still make it out at 80Hz, and at 100Hz it seems solid. Thank god for my video card...

    Kawaldeep

  • Why is it that nothing attacking the paper itself has been moderated up?

    There was nothing scientific about how the 72+fps limit was calculated! As far as I can tell, the author judged how much detail we can perceive by how much flicker he could perceive in the refresh of his monitor.

    That's crap.

    The refresh rate will only tell us about our persistence of vision effects. A refresh below a certain threshold does not trigger the POV to kick in, so that we can see the flashes of the monitor, whereas a refresh above that rate means our POV will start to blend the frames together.

    The argument of 72fps doesn't limit the human visual system from seeing a fast-moving object; if something traveling at a certain speed gets drawn on screen twice at 72fps, it will get drawn 4 times at 140fps, and with a decent monitor at the right resolution, those four frames should be seen on screen. The real argument, then, is whether the human reflex is fast enough to react to those 4 images (whether the visual system is fast enough to see all four frames, or just blurs them together into one image, is irrelevant, I think). Can a person dodge a railgun?

    Well, at least that's my 2cents

    The nick is a joke! Really!
  • There is never, ever any need to try and push your framerate above the vertical refresh rate of your monitor.
    Not true. Plenty of games have synchronous event handling. Faster framerate won't give you faster display but it will mean more responsiveness and/or more accurate physics in many games.
  • I can't remember what FPS rate it takes to fool cats and dogs into thinking there's motion on a screen, but I know it's significantly higher than for humans. I believe that's why it's rare to see an animal that pays much attention to a TV screen. To most dogs, cats, etc., it's just a bunch of poorly-formed still images.

    Perhaps this is a worthy cause for pursuing high framerates. Maybe someday you'll keep your pet occupied at home (watching TV) while you're out, or have your pet participate in your video games ... who knows.
  • I beg to differ! Since the dark ages of MS-DOS, games were parallel, and I mean games like Commander Keen, Wolf 3D, DOOM, etc.!

    There would be an Int08 handler (possibly reprogrammed to fire at more than the meagre 18.2Hz, like in DOOM) which would simply DRAW the CURRENT view scene - that's it, nothing else, except for a safety check to avoid re-entering the same handler if rendering was slow for any reason.

    Int09 (keyboard) would handle movements, which would only be something like changing the player actor's X to X-10, and the Int08 handler would redraw it next time. The same goes for the mouse interrupt, Int33, if my memory serves me well (callback function).

    So, there was no such thing as the loop you described in all more or less well-built games (id's). I actually did this kind of thing 6-7 years ago; no one in his sane mind would have done it differently. These days it's no longer the state-of-the-art approach.
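    (A rough, modernized sketch of the structure described above, with plain functions standing in for the Int08/Int09 handlers - illustration only, not actual id Software code:)

    /* Sketch of the interrupt-driven structure described above, with ordinary
       functions standing in for the DOS Int08 (timer) and Int09 (keyboard)
       handlers. Illustration only. */
    #include <stdio.h>

    static volatile int player_x = 100;   /* game state mutated by "input interrupts" */
    static volatile int in_render = 0;    /* re-entrancy guard, as mentioned above */

    /* "Int09": input handler only mutates state, never draws. */
    static void on_key_left(void)  { player_x -= 10; }
    static void on_key_right(void) { player_x += 10; }

    /* "Int08": timer tick only draws the CURRENT state, nothing else. */
    static void on_timer_tick(void) {
        if (in_render) return;            /* skip if the previous render is still going */
        in_render = 1;
        printf("render frame: player at x=%d\n", player_x);
        in_render = 0;
    }

    int main(void) {
        /* Fake a few interleaved "interrupts" to show the decoupling. */
        on_timer_tick();
        on_key_left();
        on_key_left();
        on_timer_tick();
        on_key_right();
        on_timer_tick();
        return 0;
    }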
  • Recently on Slashdot, somebody posted an article about a new Sony CD player (Click. [slashdot.org]) that dramatically increases CD sound quality. Takes it from 44.1kHz sampling to 2.82MHz.. My dad and I got into a discussion about whether you would hear the difference or not.
    It got me thinking: maybe we can't now, but if we start getting used to incredible sound quality, would we then listen to our current 44.1kHz and be confused at how we used to listen to 44.1kHz every day?
    I started thinking this because if you listen to a lousy stereo system all the time, you get used to it, and it starts sounding decent. Then you travel to the nearest audio store and listen to the newer, better stereos; you go back to the previous one and you suddenly realize how horrible yours sounds.

    So is it the same with video? Sure, right now we (supposedly) can't see above 72 frames per second, but if our eyes got used to 200 frames per second, would our eyes... adjust, so to say?
  • There is one small detail that is almost ALWAYS forgotten when someone writes an article like this and talks about when you can and cannot see flicker. People don't watch Q3, they play Q3. Adding the hand-eye feedback loop to the situation makes Q3 (et al.) a radically different experience from passive viewing. Human reaction times can drop below a hundredth of a second for expected events. This is why drag racers use the Christmas-tree starting lights and eliminate driver reaction as part of the race. Assuming the monitor, mouse, vid card, etc. are all up to spec and can handle >120 or >150 fps, the difference over 60-70 fps is very real for the player, but a passive viewer won't be able to tell. Simply watching for flicker to go away while you crank up the refresh is like asking the guy in the passenger's seat how the car is handling. He really has no clue.
  • Why would you want to debunk something, what does that mean anyway? Wouldn't it become bunk then, and isn't something that is bunk, not good?
  • Surely the advantage of better graphics cards is to provide higher resolution rather than huge fps at low res?

    I mean if you can do 200fps at 640x480 you should be able to bump it up to 1024x768 and still maintain a decent frame rate.

    Benno
  • At 60fps, it takes an absolute minimum of 1/60th of a second for your input to make the round trip from the mouse, through the game simulation, onto the screen and thence into your eyes. As your framerate increases, that latency drops off to almost nothing. 1/60th of a second of latency between input and feedback on that input quite definitely is perceptible - introduce an additional frame of latency (e.g. by using triple buffering) and the controls definitely feel more laggy.

    So the reason you get that millisecond jump on someone running at a lower framerate is that you're effectively seeing a few tens of milliseconds into his future.
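    (The framerate term of that latency, worked out for a few rates - a simple model that ignores mouse polling, monitor scanout and driver queues:)

    /* Back-of-the-envelope latency floor from the framerate alone: roughly one
       frame time per buffered frame in flight. Ignores mouse polling, monitor
       scanout and driver queues. */
    #include <stdio.h>

    int main(void) {
        const double fps[] = { 30, 60, 100, 200 };
        for (int i = 0; i < 4; i++) {
            double frame_ms = 1000.0 / fps[i];
            printf("%3.0f fps: >= %.1f ms (double buffered), >= %.1f ms (triple buffered)\n",
                   fps[i], frame_ms, 2.0 * frame_ms);
        }
        return 0;
    }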
  • I couldn't agree more. The more CPU time you have idle while playing a game, the more detail can be thrown into the next version.

    rosie_bhjp
  • Even fairly weak monitors refresh at 60Mhz (thousand cycles per second).
    Wrong on two counts.
    1. Monitors refresh at 60-150Hz (approximately). That's 60-150 cycles per second.
    2. MHz is not thousand cycles per second. It's million cycles per second. kHz is thousands of cycles per second.
  • What about the maximum resolution that the human eye can accept? How small does a pixel have to be before you start confusing the computer screen with reality? I would venture to say that sound quality is pretty close to faking reality, but visual display technology is a whole different game.
  • An average monitor refreshes at 60Hz, not MHz; take a look at the article you pointed to yourself... :)
  • the old "feel" was put back in the latest beta, the jump based on FPS however is not in the newest beta as it gave people an unfair advantage and amounted to cheating (flame me all you want I don't care) which they're ardently against. I personally think the damage through walls thing was wise to take out, as it's not realistic, however they could have special brushes for metal gratings or something that would give the damage through walls effect when it's useful, and realistic. I don't know which side of the fence you're on with regards to strafe jumping but I'm all for it, I don't know how coordinated you are, but I can run sideways and jump sideways and I would assume most other people can. The only thing that bothers me about quake3 is that bunny hopping is faster than running. If I remember correctly in the early tests it wasn't, but Thresh bitched about it so to quiet him Carmack made it so.

  • "Well no, according to this paper. It seems that anything over 72fps is just wasted as our visual pipeline is saturated past that point." On the other hand, I'm glad that companies make it possible to show crazy-fast framerates, for the trickle-down effect of cheaper "normal" cards."

    I do know that sometimes when playing games with an uber-high frame rate, I sometimes get a 'blurring' effect that I don't get at lower frame rates. Plus, if you don't have a top-of-the-line graphics card, the lower frame rate will allow you to process things faster. Plus, who really needs 200 fps??

    Eric Gearman
    --
  • Framerate also affects Quake3's physics:

    Quake3World Messageboard Post [quake3world.com]

    Hardcore people get a GeForce2 and play at 640*480*16 to maintain a constant high framerate, because the physics rounding errors are greatest at the higher framerates. Then they can make crazy jumps like the rail jump in q3dm6 and the megahealth in q3dm13. The 1.25 patch is supposed to fix this, though.

    And on topic: visually, I can't tell the difference over 70fps in any game, so it's ridiculous to play at 200fps when you could bump up the resolution or turn on other options instead.
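    (Why per-frame integration can make jump height depend on framerate - a generic explicit-Euler jump for illustration, NOT Quake 3's actual movement code; the gravity and jump-velocity numbers are assumptions:)

    /* Generic explicit-Euler jump: the apex you reach depends slightly on the
       integration step (i.e. the framerate). The 800 u/s^2 gravity and 270 u/s
       jump velocity are illustrative assumptions. */
    #include <stdio.h>

    static double peak_height(double fps) {
        const double gravity = 800.0, jump_v = 270.0;
        double dt = 1.0 / fps, z = 0.0, vz = jump_v, peak = 0.0;
        while (vz > 0.0 || z > 0.0) {
            vz -= gravity * dt;          /* velocity first, then position: this  */
            z  += vz * dt;               /* shifts the apex by roughly v*dt/2    */
            if (z > peak) peak = z;
            if (z < 0.0) break;          /* back on the ground */
        }
        return peak;
    }

    int main(void) {
        const double rates[] = { 60, 125, 200 };
        for (int i = 0; i < 3; i++)
            printf("%3.0f fps -> apex %.2f units\n", rates[i], peak_height(rates[i]));
        /* The apex differs by a unit or two between framerates -- in a game that
           integrates this way, just enough to barely make or miss a ledge. */
        return 0;
    }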

  • Er...what?

    Monitors' refresh rates are in Hz, not MHz, you freak.

  • Yes, Quake 3 may put out a needless 200 fps, but tomorrow's games will have larger environments and textures. Such hardware is needed; it's just a step ahead of its time. And even with today's games, will you constantly get > 200 fps? It's more of a cushion than anything else.
  • And on what concerns monitors: for me and several other people, 60Hz is deadly painful! Sit in front of a 60Hz monitor for the whole day and you will surely get some serious headaches (especially in the temples and behind the eyes). It looks like someone furiously turning the lights on and off. At 72-75Hz the flickering is still visible. The minimum frequency for aliens/mutants like me is no less than 85Hz. And sincerely, one still gets tired working on such a monitor. My good level is 100Hz.

    One thing to note is that this does depend on the ambient light level. If the room is bright, and the monitor is turned to maximum brightness, you notice flicker much more than if the room is more dimly lit, and the monitor brightness is turned down a little.

    Personally I really like LCDs, especially when I'm working for long periods - no flicker at all!

    - Fzz

  • >I think this whole "they project each frame
    >multiple times" thing is some weird urban
    >legend...

    Sorry, but speculation does not make fact. If you bother to go to the rec.arts.movie.tech FAQ, they refer to how the use of a double-bladed shutter does in fact cause each frame to be shown twice.

    The heart of the matter is how the human eye perceives light. By cutting the display time in half, even if it is the same image, the eye perceives change, thereby creating an optical illusion that cuts down on the perception of the actual jerkiness of the changing images.

    As I am not disciplined in this field, I did not retain the information on this particularly well. However I have come across a large number of sites backing this up. (Mostly stumbled across when looking into HDTV and progressive scan video technology).

    Matt
  • "I'm surprised to read that games like Q3 don't do this. (Physics depending on your refresh rate is just nutty.)"

    Quake does do this, so the physics engine is not dependent on the frame rate. *However* there is a *bug* in Quake3Arena that makes the physics engine slightly different at different frame rates, but it is a bug, and doesn't have to do with the game loop design. The physics engine runs in a .qvm module, which runs on a virtual machine, and apparently there is a floating point rounding error in the virtual machine implementation itself that causes the bug. The newest patch is supposed to fix it.

    As far as LionKimbro's post goes, I don't think there's too much difference between how you've described it and parallelizing it. Parallelizing it makes things quite a bit more complex - it sounds pretty beautiful on the surface: run your model and view each in their own thread and they can update at their own rates. But somewhere along the line these two threads must exchange data - and this happens often - every time something moves, every time something new joins, or geometry changes, etc. etc. (all the time, in other words) - you can't just update that while your rendering thread is rendering. This isn't an easy problem to solve; it requires very careful design and thought about how the threads will communicate. We currently do our stuff the parallelization way, but one of the main reasons we do it is so that the application main thread cannot "accidentally" stop the simulation - e.g. if a modal dialog box pops up in the server application, or the user presses 'alt' by mistake and the window enters the menu loop, then you don't want everything on the network to suddenly stop updating. So we run the simulation stuff in another thread.
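    (For illustration, a generic fixed-timestep loop of the kind described above, where physics advances in constant-size ticks no matter how long each rendered frame takes; this is a common pattern, not the actual Quake 3 game loop:)

    /* Fixed-timestep sketch: physics advances in constant 10 ms steps, so the
       trajectory does not depend on how fast frames are rendered. */
    #include <stdio.h>

    #define PHYS_DT 0.010                 /* fixed 100Hz physics step */

    typedef struct { double z, vz; } Body;

    static void physics_step(Body *b) {   /* same integration a real tick would do */
        b->vz -= 800.0 * PHYS_DT;
        b->z  += b->vz * PHYS_DT;
    }

    static double run(const double *frame_times, int n) {
        Body b = { 0.0, 270.0 };
        double accumulator = 0.0, peak = 0.0;
        for (int i = 0; i < n; i++) {
            accumulator += frame_times[i];        /* however long this frame took */
            while (accumulator >= PHYS_DT) {      /* consume it in fixed-size ticks */
                physics_step(&b);
                accumulator -= PHYS_DT;
                if (b.z > peak) peak = b.z;
            }
            /* the renderer would interpolate and draw here; omitted */
        }
        return peak;
    }

    int main(void) {
        /* one second of smooth 50fps frames vs. one second of uneven frames */
        double smooth[50], uneven[20];
        for (int i = 0; i < 50; i++) smooth[i] = 0.020;
        for (int i = 0; i < 20; i++) uneven[i] = (i % 2) ? 0.080 : 0.020;
        printf("peak (smooth frames): %.3f\n", run(smooth, 50));
        printf("peak (uneven frames): %.3f\n", run(uneven, 20));  /* same result */
        return 0;
    }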

  • "or the user presses 'alt' by mistake and the window enters the menu loop, then you don't want everything on the network to suddenly stop updating"

    Forgot to mention .. an interesting example of this .. grab your mouse down on the scroll bar of a Quake3 dedicated server dialog box while people are playing .. :)

  • Yeah, I'd agree.. which was at least implied, if not stated, in my post.

  • They should run at 48 fps, but every other frame is simply the interlaced combination of the current and next frames. That I'd like to see.
    ---


  • ...women can see up to 30% more colors than men, so if a woman doesn't think your outfit matches, she is probably right, go change)...

    What? I have *never* heard this one. I'm currently a med student and have been more or less "in the neuro field" for some time before that. Perhaps I'm wrong, but I'd love to see where that came from. A reference please?

  • Another fine example of why I will never be caught dead on IRC.
    ---
  • (Note: I have no real expertise in visual perception, I am just applying some common sense)

    I can't say I agree with the author's proof that we can perceive 60fps+. His simple experiment involved human perception of flicker, NOT animation. Yes, it is true that most people can perceive flicker up to about 72Hz. But what we are talking about there is a cycle between two completely different states: the monitor screen is blank for 1/120th of a second, then on for 1/120th of a second (OK, this is oversimplifying, but you get the point).

    This is completely different from animation. Animation is a gradual change from one scene to the next. It is much more difficult to distinguish subtle changes from one scene to the next than it is to tell whether something is on or off.

    Imagine looking at a painting for a moment, looking away, then looking back again. Would you notice a subtle change in the scene? Probably not (we are talking fractions of a second here). Now imagine you look at the painting, look away, look back, and it is completely gone. Then I am certain you would notice. The two examples are COMPLETELY different situations.

    What is the human threshold of perception for fps? I really don't know, but I would say it is well below 72fps or even 60fps. I would estimate it to be somewhere between 40 and 60 fps. Anything more is a waste of CPU cycles.

    Personally I would gladly trade 60+ fps for better image quality or resolution.

    (Note: many posters have also pointed out the difference between average and peak fps, so I feel no need for further comment on that here)
  • I find it ironic that the graphics engine behind quake made it famous yet in the quest to do well in the game every graphical option is turned off. Makes you wonder why they didn't just make it 2d stick figures? Oh yeah, then nobody would have bought it. But boy, people would use it.
    ---
  • It's usually not the video card's fault for incorrectly guessing what the monitor can do; blame the monitor detection scheme and/or database. It generally underestimates by a lot, usually defaulting to 60Hz or so at 1024x768 (which is why I avoid going down the computer-system aisles in any store - flicker!!).

    A good program that fixes this is PowerStrip (not free, but nagware). All you need to do once (and then any time you reinstall) is ask it to first get the best rates for your monitor, then store those in the registry, so that you can pick and choose the refresh rate to use for each particular resolution. This works with nearly all video cards. It's also got various tweaks, but it's best to go with card-specific tweaking programs for those.

  • Plenty of games have synchronous event handling. Faster framerate won't give you faster display but it will mean more responsiveness and/or more accurate physics in many games.

    Even in that situation, is there any point in actually rendering to the card? You're not going to see that frame, since your monitor can't keep up. Instead, they could do the event handling and then wait the amount of time it would've taken to render the frame, or perhaps even do additional event-handling cycles...

    Of course, doing event handling synchronously with rendering is a bad idea from the start.
  • The editor even said it is above 72, somewhere in the 100 range. I personally (and most men I know) have a hard time making out a 60 Hz refresh, even. But the majority of women (in a statistically uselessly small sample) could make out 70+

    Also note that if you really could see 60, fluorescent bulbs would seem to strobe for you. They don't for me, but ask around and you'll be surprised. (It works best with 1 direct bulb. More bulbs, especially on different circuits, can be at different parts of the cycle and meld together.)

    But you CAN see a much shorter flash than 1/60th of a second. You don't see in strobe, you see the average of all light in the slice - the "shutter" is open the entire duration. Which is why you see a blur: it's the average of all the images from 1/whatever of a second. This averaging is why the sleepy hollow cardinal trick (and many others) work.

    I'm not sure what good 200 fps does when your monitor rate (for a regular monitor, admittedly) tops out in the 80s. I think there are two reasons:

    1. mentioned above: extra capacity. A 200 fps average might equal 60 fps during a fight scene.

    But another reason is that even if you're only displaying 60 Hz (the monitor limit), to get maximum smoothness you need a new frame every 1/60th of a second, not just an average of 1/60th. And if frames take varying amounts of time to process, which they do, you could be unlucky and have 2 frame completions in 1 monitor refresh and then none in another... it would look like 30 Hz, because every other monitor refresh would just repeat the previous frame. This can happen even with BOTH the monitor and the fps at 60 Hz, if the frame time varies (sinusoidally, in my example) and the two are not in sync.

    FPS is not regular, and the reality is that fps is a measure of speed, NOT a reliable timing device. 200 fps != 1/200th of a second per frame... the first frame after you turn is going to have to make many more changes, so it's going to take a lot longer, whereas many things will be reused in the next one (this assumes nothing has to move to the card over the bus, which I won't go into). So if that frame takes 4x longer than average, you'd be down to 50Hz for that frame. THEN you have to use an integer number of monitor refreshes, so it's going to be 30 Hz as viewed. Too much math, perhaps.

    I predict that eventually (probably 1 more generation) many of the objects will be dynamically generated in sync with the monitor refresh. The framerate will be fixed at the (variable) monitor refresh rate. For each frame, one class of objects will be redrawn each time, no more and no less. The problem is that that class has to redraw asynchronously with any other kind of redraw, and that can be bad. But it's good for many kinds of animations... and depending on the architecture it should be no worse in any case.
    you heard it here first.
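    (A small simulation of that frame-pacing point - illustration only; the sinusoidal +/-40% frame-time swing is an arbitrary model:)

    /* Even at an average of 60fps on a 60Hz monitor, varying frame times mean
       some refreshes get no new frame (a repeat) while others get two (one of
       which is never shown). */
    #include <math.h>
    #include <stdio.h>

    /* Duration of the n-th frame: averages 1/60 s but swings +/- 40%. */
    static double frame_time(int n) { return (1.0 / 60.0) * (1.0 + 0.4 * sin((double)n)); }

    int main(void) {
        const double refresh = 1.0 / 60.0;
        double next_frame_done = frame_time(0);
        int frames_done = 0, repeats = 0, never_shown = 0;

        for (int r = 1; r <= 60; r++) {                 /* one second of 60Hz refreshes */
            int ready = 0;
            while (next_frame_done <= r * refresh) {    /* frames finished before this vblank */
                frames_done++;
                ready++;
                next_frame_done += frame_time(frames_done);
            }
            if (ready == 0) repeats++;                  /* refresh repeats the old frame */
            if (ready > 1) never_shown += ready - 1;    /* extra frames get overwritten */
        }
        printf("rendered %d frames; %d refreshes repeated; %d frames never displayed\n",
               frames_done, repeats, never_shown);
        return 0;
    }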

  • I'm male, but I'm young, and I have pretty good eyesight. My monitor is set to 70Hz right now, and it looks like a strobe at the edges. Wherever I'm looking looks fine, but the top of the screen (if I'm looking at the bottom) is flickering.

    It drives me insane. I just got a new monitor that will do 87hz at acceptable resolution, but I haven't gotten around to adjusting it yet (linux).

    But there's a big difference between refresh rate and frames per second. I'm guessing if you got tricky enough with simulating motion blur, you could drop the frame rate down to around 20-30fps (film is 18-24) and still get acceptable quality.
  • I've given up counting the mistakes in the article. Here are a few corrections and interesting facts:
    • Rods are sensitive but quite slow, compared to cones.
    • An individual cone is either short, medium or long wavelength detecting. They are commonly imprecisely called red, green and blue. Red cones peak at green wavelengths.
    • The bandwidth of the optic nerve is spectacularly low. John Carmack compared it to a 57.6kbps modem. He should know about these things. I tend to believe him.
    • Cones are not 'inefficient due to their complex nature'. They are designed to be used in daylight viewing conditions, having fewer buckets of photosensitive goo than rods. Their small size makes them quick to respond.
    • Most humans can reliably detect a flash of light dim enough that only one in a hundred rods will receive a photon.
    • The optic nerve thingy is obviously more than 2-3cm in length, considering that the visual cortex is at the back of the head.
    • Blurring is just what happens when a moving image is integrated over time. The visual cortex doesn't have to do anything special here.

    It's hard to say what the maximum frame rate a human eye can perceive directly is. It depends on the viewing conditions and the observer. In daylight, I can easily watch the progress of the video beam on a 50Hz TV as it makes its way from the top to the bottom in each field of each frame.

    If 'frameless rendering' can be used (an option if real-time raytracing is feasible), then the natural smearing and removal of temporal aliasing in a quickly changing scene will lessen the need for a very high frame rate. Try searching for 'Frameless rendering'. I'm looking forward to Quake XXIV Bitchfight implementing it.

    For pretty pictures and interesting reading, see
    http://gwis2.circ.gwu.edu/~atkins/Neuroweb/retina.html [gwu.edu]


    - I mean to win the wimbledon!
  • 60Mhz would mean 60 Million cycles per second (not thousands).

    And monitors have a vertical refresh rate, usually around 60-85Hz (and higher). This is how often the beam retraces from the bottom to the top of the screen (how many screens/second you get).

    The horizontal sync signal is usually measured in kHz, which might be what you are thinking of, but this is only used to move the beam back to the left (or right, whatever...).

  • by Ektanoor ( 9949 ) on Monday October 30, 2000 @03:06PM (#664355) Journal
    What this guy writes is a mixture of secondary-school knowledge and flamebait. Yes, he goes into some detail on how our eyes work, but he strongly lacks deeper scientific knowledge. A clear example:

    "The visual cortex is where all the information is put together. Humans only have so much room in the brain, so there are some tricks it uses to give us the most information possible in the smallest, most efficient structure. One of these tricks is the property of motion blur."

    Some tricks that produce motion blur... However, he does not explain any details of what these tricks are. How the human brain compresses information is still an open question, but this guy doesn't even touch on it. Only "tricks of the trade". Sorry people, but he is very superficial. I am no expert on these things, but I have seen books and I know people who would explain these things more clearly for the layman. Once, Scientific American published an excellent book exclusively dedicated to this problem. I think it would be worth searching for it.

    On what concerns 72fps: is he nuts? I can clearly tell a 60-70 fps picture from a 110 fps one! At that level you can still see things hiccup.

    And on what concerns monitors: for me and several other people, 60Hz is deadly painful! Sit in front of a 60Hz monitor for the whole day and you will surely get some serious headaches (especially in the temples and behind the eyes). It looks like someone furiously turning the lights on and off. At 72-75Hz the flickering is still visible. The minimum frequency for aliens/mutants like me is no less than 85Hz. And sincerely, one still gets tired working on such a monitor. My good level is 100Hz. Yes, there I can work without feeling any stress. Btw, when working I spend more than 12 hours a day in front of the bright head of the computer. In fact, my work frequently turns into 36-hour shifts (like today; I'm in the 17th hour). So guys, maybe I mutated too strongly... >:E

    Well, I don't know where this guy got his theories, but my everyday work tells me he's nuts. So much for the theory.

  • Because we're bored with their crappy engine, and have been waiting patiently FOREVER for tribes II to come out.
  • The goal of hardware tweakers is to get the maximum effects while not dropping below that critical point of 60 fps (or 72, as that article claims).

    So, why is the common practice to quote the maximum framerate and not the minimum framerate?

    Why are "timedemo" tests usually lightweight compared to actual gameplay?

    You're right on, but I don't think the dicksizing motivations should be ruled out either.
  • Though everything you say about flicker is true...

    Many people can notice flicker, even in a completely dark room.

    I usually work in a dark room, and even at 75Hz I notice flicker on my screen. 80 is tolerable; 85 is just fine.

    Interesting note about the tool shop, btw..
  • I don't believe 30 is enough to see really fluid motion. Motion on a TV never really looks fluid, and I believe that a lot of the disorientation that accompanies some Imax films is less from the very large screen than from the refresh rate being too low to permit really smooth motion.

    I've found even a very simple scene, such as watching a wheeled vehicle move, in an Imax movie to be disorienting. The part of the view that seemed to cause me particular trouble was the wheels turning.

    I think it also depends on the "velocity" of an object in motion on the screen. Very fast moving objects require faster update than slower moving objects. The test (which is admittedly imperfect) that I use is to turn my head rapidly from side to side. Even at 80 Hz refresh rate I can see the discrete frames. Again, this particular test has a lot of problems, and I may be testing the wrong thing, but I suspect that the reality lies somewhere between the 72-80'ish fps of the article and the 200 fps that some people are trying for.
  • No.. that simply adds to retention.. so the image is 'there' longer.. there is still the same amount of animation happening.
  • Fluorescent lights at 60Hz certainly make 60Hz refresh much worse, but 60 Hz is very flickery for me in any light (or none). It's clear this perception is different for different people.

    Rick

  • by ichimunki ( 194887 ) on Monday October 30, 2000 @01:46PM (#664372)
    Are you people saying that my video card and these "frame rates" are what are probably causing me to lose chess matches online and to never quite get all 40 bonus bugs in the Galaga challenge rounds? Should I be looking for a new card, or will adjusting the resolution help?
  • A "real 3D world" or better saying 3D systems that give you the feeling of a real 3D world. Really this is the result of a conversation i had with one friend about 3D glasses & Co.

    As far as I know there is always a lag between screen frequencies & fps. On glasses systems this is quite visible. To get a 50 fps you need a 100Hz monitor as minimum. To get higher rates you need a monitor going nearly 2 times the fps rate. So it is quite logical to try to achieve 200fps as they also have to be divided in glasses systems. However then, monitors should reach a cool 200-300Hz to give a chance for your eyes.

    I have never see a glasses system but some friends around here tell that presently that is the same as burning you eyes for good. So let's wait the 200's
  • by Seumas ( 6865 ) on Monday October 30, 2000 @02:01PM (#664392)
    Playing Q3 in 1024x768, trilinear, high-detail, full quality everything, I peak at 130fps on my 64MB Radeon and 150fps on my 64MB GeForce Ultra2.

    In the midst of battle, with body parts and rockets flying everywhere (clarification: my body parts; someone else's rockets), my rate easily drops down to 90fps. Very rarely, I'll catch it plummeting as low as 70fps or 60fps. I can't really tell any difference between 70fps and 150fps, but anything below about 60fps is noticeable to varying degrees.

    As long as you can still aim and shoot fluidly, you're fine. Anyone who is still moving fluidly at the heaviest point of graphic intensity shouldn't worry about tweaking every last frame out of their system. Unless there is some revolutionary change in the industry, I don't plan to upgrade my cards for a long time to come (until we see games that drop my frame rate enough that I can notice it). I'm certainly not about to dump a few hundred more on a card just because I can achieve 200fps, when 150fps will more than do.

    Besides, what is more irritating is the games with the poor net framework that make finding a fast server impossible. While Q3's code seems to be sleek (I usually find a lot of servers averaging between 12 and 30 ping), other games (Unreal Tournament, to name one) rarely have anything below 100 and only a few under 200 ping. Even the sweetest frame rate can't help poor network performance.
    ---
    seumas.com

  • Hey.. don't have to 'splain it to me. I love Tribes... it just started to get a bit stale.
    And Tribes II is taking an awfully long time, though I'm definitely a customer when it comes out.

    The 'render extremely large areas' should read 'render extremely large areas with very little detail'. Any map that has lots of structure on it bogs down in a hurry; rolling terrain is great. On that note, though, I agree, that's what made Tribes really cool.

    Watch what you say about rendering, though.. indoors or outdoors, as soon as you have lots of structure or detail, Tribes bogs down.

    And tribes rocks on team fortress, I agree. Tribes is awesome.
  • So you get 200fps. Big deal, if your vertical refresh is only 85Hz, then you only SEE 85fps, regardless. An earlier poster had it right, the reason you want a 200fps max is so that you don't drop below the critical rate of 72 (60) fps where animation artifacts start to show up.
  • Of course, once you throw in the USB mouse, the high powered SB Live! doing surround sound, the 21" monitor, etc., your frame rate is sure to drop even more.

    If I can push 30 I'm perfectly happy.

  • by zlite ( 199781 ) on Monday October 30, 2000 @12:53PM (#664413)
    It's the frame rate at peak load that matters: i.e. the frame rate at the moment you've got the largest number of objects on the screen. Just like you don't want only enough server capacity to handle your average traffic, but instead need a lot extra to handle peak load, so frame rates above 72 ensure that you won't drop below 60 or so even in the most complex scenes.
  • What people haven't mentioned so far is _control lag_, which is independent of any input sampling method: the fact that if your frames stutter past you at 20 fps, you have a harder time aiming, because how people aim is a kind of feedback loop between what they see and the adjustment they make with the mouse. Control lag can easily add as much or more lag to aiming than a slow connection does. I've included a very good explanation of this (taken from a messageboard that doesn't exist anymore) below.

    Couple this with what other people have mentioned (primarily that MINIMUM fps is all that matters: the difference between a normal timedemo and an intense fight can be as much as 5x to 10x - try getting a 72fps minimum for that!) and the fact that the physics of all Quake-derived games are biased towards high fps, and 200 fps is actually kinda low. This is of course if you want to play "competitively"; if all you want is a relaxing frag for half an hour after work, 50 fps average will do you fine.

    The post:


    from: Menace (matthewv@best.com):

    That Penstar link is interesting, but gets a lot of things wrong and misses all the most
    important points.

    Here goes.

    Visually, the eye can distinguish between separate "flickers" at a varying rate depending on
    such factors as brightness (brighter requires higher refresh rate--movies can get away with 48
    flickers/sec because they are actually quite DIM--contrary to what that website says), field of
    view (peripheral vision is more sensitive to flicker than straight-ahead vision), age of the
    viewer, etc. The specific rate is known as the "critical flicker frequency."

    So if you have a tiny, very dim point of light which is only slightly brighter than the
    background and which just gets a little brighter and dimmer, and look at it directly, it can
    flicker very slowly and still appear to be a steady point of light (you might not see the flicker
    even if it's flashing at under 10Hz, or even slower), but a bright full-view image (such as a
    bright strobe light in a dark room) may need to refresh at 200Hz or more to appear
    completely steady and continuous.

    It's less clear how much the eye/brain can integrate and distinguish motion, but my opinion is
    that we CAN distinguish smoother/more detailed motion up to nearly the flicker frequency
    limit. It may not always be noticeable, since often successive images differ little from the
    preceding one (which is a critical factor in motion video compression), but when the whole
    image moves quickly or fast dramatic action occurs, you can see the difference.

    Anyway, the 24fps (24 frames, each projected twice for 48 "flickers" per second) of film is
    clearly NOT enough--I can see the lack of motion discrimination in movies quite easily,
    though you tend to get used to it within the first couple of minutes. The 20-25fps range often
    quoted as "all the eye can see" is really the MINIMUM limit below which most people have a
    hard time fooling their brains into thinking they are seeing continuous motion, as opposed to
    a series of distinct still images in succession. But that doesn't at all mean that increasing the
    framerate doesn't result in dramatic improvements. The one serious effort I know of to study
    this in regards to film was done by Douglas Trumbull, who concluded that 60fps was the
    desirable minimum for smooth and accurate motion reproduction--and apparently such film
    looks startlingly more realistic than normal film.

    HOWEVER, all this talk of "what the eye can see" blah blah blah doesn't even begin to address
    the biggest problem in terms of computer games. We seem to be forgetting that instead of
    just passively WATCHING our computer games, we INTERACT with them. So now you have
    to think a little about what "framerate" means in these terms--if the game can render a certain
    number of frames per second, that means that each frame takes a certain amount of time to
    render. So from the point that the game has accepted all input, it takes that much time before
    you can see it on screen. In addition, if you give input while it's in the middle of the previous
    screen, it has to wait additional time until that screen is finished before it even begins on the
    next one.

    What this means is that there is an inverse relationship between framerate and what I call
    "control latency." The amount of latency can be pretty shocking--at 10fps, your control latency
    (the time between, say, moving the mouse and seeing the image on screen move as a result)
    is actually 100-200ms! This is even worse than having a 100-200ms ping (a LOT worse) since it
    affects aiming and all other client-side actions. At 25fps, the delay is 40-80ms, which is still
    pretty significant. In order to keep that delay under 20ms at all times you need to have a
    framerate which never drops below 100fps!

    This helps explain the feeling of sudden lag that can occur during intense battles--in addition
    to likely ping spikes from too much data saturating your lowly modem connection, if your
    framerate plummets at the same time (which it probably will), it's quite possible you would
    literally experience a brief increase in total latency approaching a full second!

    On top of this of course you have to consider the maximum possible number of angles you
    can aim at in a given period of time. At 10fps, in the half second you typically have to aim,
    there are only 5 possible angles you can aim at, and it will jump from one to the next, instead
    of moving smoothly between them. Combined with the latency (100-200ms) this makes it
    VERY difficult to aim. Compare this to 100fps where there would be 50 distinct angles to
    cycle through in that half second, and only 10-20ms delay--it's MUCH easier to aim precisely,
    and to make quick adjustments if necessary.

    On top of all that of course is the fact that what we should be concerned with is our MINIMUM
    framerates under the worst possible conditions, since the average framerate in a particular
    situation is usually as much as double the minimum, and an intense situation is usually far
    more demanding than your average "timedemo." (Compare bigass1.dem to Quake's built-in
    demos, or crusher.dm2 to Q2's demo1.dm2 for instance, then spend some time watching
    your "average" "minimum" and "maximum" numbers in Unreal Tournament with the timedemo
    utility enabled. Based on the variation, take a guess what your minimum framerate in bigass1
    or crusher would be, then compare that to the average reported score for demo1...) The
    point being that even if all we want is a bare minimum of 30fps at all times, we may need to
    get 50-100fps as the average score in a standard timedemo. And that's assuming that 30fps is
    "enough," which, based on the information above, I contend is not the case--while a "casual"
    gamer may not find it worthwhile to spend the money or sacrifice the visual quality for
    improved framerate, there clearly IS a benefit to it. Certainly there's a limit--I doubt going
    above 100fps or so does any additional good, but again, to ensure that fps ALL THE TIME
    may take a timedemo score approaching 200fps...
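    (The arithmetic from the quoted post, reproduced for a few framerates - a simple model that ignores network ping and input polling:)

    /* Worst-case control latency is roughly one to two frame times (input may
       arrive just after a frame starts), and the number of distinct aim updates
       in a typical half-second aiming window is fps/2. */
    #include <stdio.h>

    int main(void) {
        const double fps[] = { 10, 25, 60, 100 };
        for (int i = 0; i < 4; i++) {
            double frame_ms = 1000.0 / fps[i];
            printf("%3.0f fps: latency %3.0f-%3.0f ms, %4.1f aim updates per half second\n",
                   fps[i], frame_ms, 2.0 * frame_ms, fps[i] / 2.0);
        }
        return 0;
    }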

  • What? Are human eyes subject to Moore's Law too, now?
  • by Thorson ( 91904 ) on Monday October 30, 2000 @12:54PM (#664427)
    I worked for a research and development group about 15 years ago. One of our areas of research was frame rates. We discovered that frame rates are a function of age, genetics, ambient light, and a number of other smaller effects. The highest rate we saw before fusion was 78 fps. Some don't see flicker at rates as low as 55 fps. Everyone saw flicker at 50 fps.
  • by Sanchi ( 192386 ) on Monday October 30, 2000 @12:54PM (#664434)
    While on my gaming computer I average 70 FPS, there are times that I drop to well below 20 during heavy effects (like 5 bodies being gibbed at once). The goal of hardware tweakers is to get the maximum effects while not dropping below that critical point of 60 fps (or 72, as that article claims).

    And I promise that I can tell the difference between a computer averaging 72 FPS and 200 FPS.

    Sanchi
  • I'm personally glad for the graphics cards that are giving us crazy frame rates, for 2 reasons... 1) all my games look really good; 2) I can keep my card for a long time without needing to replace it every two months when a new, better-looking game with more polys that plays at a higher resolution comes out.
  • by Domini ( 103836 ) on Tuesday October 31, 2000 @12:05AM (#664444) Journal
    Firstly: Quake3

    The game is designed in such a manner that there is always a server and a client, even in single-player mode. Quake has this little oddity (which hardcore Quake players use a lot) that allows them to achieve a bit more height at certain update frequencies. And somehow the updates are linked to certain FPS values. For instance: being able to just jump normally up onto the megahealth platform on the Q3DM13 level, without any other aid, has certain advantages. I can only do it with an FPS of 120 or 140.
    With an FPS of 140 I can rocket-jump higher, and with an FPS of 120 gravity seems to pull a little less hard, and I can jump from the railgun to the rocket-launcher platform (and back) on Q3DM6. (Using a combination of circle-jumping and strafe-jumping techniques that exploit some other physics features - these are so difficult to master that they were left in from previous bugs.)

    Thus for Quake I need a sustained 120. It is possible in Quake3 to cap the framerate at a certain value, but then you must be sure you can keep it there. Besides, there are certain jerking phenomena with my mouse (which has an update frequency of 120Hz) and my monitor (which refreshes at 120Hz) if I cannot keep 120 FPS in Quake. (Which makes railing more inaccurate.)

    These things are only important in competitive playing, for which Quake3 was designed.

    Secondly: Other games.
    Mostly similar to the reasons I stated above - mouse jitter on certain systems, as well as sustaining the same FPS even in highly demanding scenes. Most of the FPS ratings were done with certain detail settings off, and were only an average. You need about 150+ on average to have >80 in the worst scenario.
  • by Temporal ( 96070 ) on Monday October 30, 2000 @12:55PM (#664449) Journal
    The point is, with a faster card, you get better visual quality, be it from FSAA, multiple rendering passes, higher GeoLOD, or just higher resolution. Also, if you get 72fps on average, that might drop noticeably when you least want it to (during big firefights), which would be bad. At 200fps, your performance could take a sudden 64% hit and you wouldn't be able to see it.

    ------
  • You always wonder whether there are any more things to be done besides optimizing the frame renderer, when the programmer spends all day trying to get from 192fps to 193fps.
  • As in most games the clock is not async, I doubt every object was updated by 1/200th of T and all polys resubmitted 200 times a second. More likely it's 140 "no change" frames and 60 or so actual updates.
    TMOICBW.
  • That's probably because it does more positional calculations (once for each frame). If the frame doesn't 'catch' you at the very top of the parabola, it looks like you didn't jump quite as high.

    I remember dealing with this in a moon lander program for the Radio Shack Model I. The original version did a non-realtime cycle of 1 "frame" per second. I found it annoying that you could be 3 feet up dropping at 50 feet per second, and then, after a heavy burn, be 10 feet up climbing at 80 feet per second. I resolved the problem by calculating the lower bound of the curve to see if you touched the ground. If you touched the ground, I'd calculate your speed at touchdown time to decide whether or not you cratered.

    Later on, I did a realtime version -- peek commands for keyboard scan codes and input editing routines. I think I got it up to about 5 FPS. At the time, that was considered pretty hot.
    `ø,,ø`ø,,ø!
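    (A sketch of the trick described above: check analytically whether the trajectory crosses the ground during the timestep and compute the touchdown speed - the gravity/thrust numbers are made up for illustration:)

    /* Instead of only sampling altitude once per frame, solve for the earliest
       time the quadratic trajectory reaches the ground within the timestep. */
    #include <math.h>
    #include <stdio.h>

    /* altitude h (ft), vertical velocity v (ft/s, up positive), net accel a (ft/s^2) */
    static int lands_this_frame(double h, double v, double a, double dt, double *v_touch) {
        /* h(t) = h + v t + 0.5 a t^2 ; find the earliest root in [0, dt] */
        double A = 0.5 * a, B = v, C = h;
        if (fabs(A) < 1e-9) {                       /* essentially linear motion */
            if (v >= 0.0) return 0;
            double t = -h / v;
            if (t > dt) return 0;
            *v_touch = v;
            return 1;
        }
        double disc = B * B - 4.0 * A * C;
        if (disc < 0.0) return 0;
        double r1 = (-B - sqrt(disc)) / (2.0 * A);
        double r2 = (-B + sqrt(disc)) / (2.0 * A);
        double t = 1e30;
        if (r1 >= 0.0 && r1 <= dt) t = r1;
        if (r2 >= 0.0 && r2 <= dt && r2 < t) t = r2;
        if (t > dt) return 0;
        *v_touch = v + a * t;                        /* speed when we actually hit */
        return 1;
    }

    int main(void) {
        double v_touch;
        /* 3 ft up, falling at 50 ft/s, heavy burn giving +80 ft/s^2 net upward accel,
           1-second frame: without this check the next frame would show us climbing. */
        if (lands_this_frame(3.0, -50.0, 80.0, 1.0, &v_touch))
            printf("touched down during the frame at %.1f ft/s -> cratered\n", -v_touch);
        else
            printf("still airborne at the end of the frame\n");
        return 0;
    }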

  • by g_mcbay ( 201099 ) on Monday October 30, 2000 @12:56PM (#664460)
    Moderate this up! Hardly anyone (with a clue) argues that they need a constant 200 FPS. The reason you want a base of a couple hundred FPS is so that you're (hopefully) above 72 FPS (or whatever lower number you think is the minimum detectable) when there are 25 other guys on the screen all hurling rockets left & right.
  • I usually require > 150 fps if I'm on a heavy duty, super caffeinated drink. Everything else looks like I'm on 'ludes, or something.
  • The point is that games like Q3A and UT make great CPU/bandwidth benchmarks at low res. Look at Tom's Hardware: he uses 640x480 as a CPU benchmark, and then ignores anything below at least 800x600 when looking at graphics. He had a great article a while back (I don't remember the link, sorry) explaining exactly that. Also, if the card/CPU manufacturers give more speed, then that lets the game developers add better physics/AI and more complex graphics. Anyway, no one had mentioned this yet, thought I might.
  • by Vegeta99 ( 219501 ) <rjlynn.gmail@com> on Monday October 30, 2000 @12:58PM (#664487)

    The article doesn't debunk it. It supports it.

    While we still may be several years away from photographic quality in 3D accelerators, it is important to keep the speed up there. Looks like 3dfx isn't so full of it.
    Read up before you post, Tim.
  • So, why is the common practice to quote the maximum framerate and not the minimum framerate?

    Because it's nearly impossible to measure the minimum framerate in many games, we assume that a card or system that has a higher average FPS will have a higher minimum FPS in gameplay conditions. This point is debatable, but not illogical.

    Why are "timedemo" tests usually lightweight compared to actual gameplay?

    We can't compare actual gameplay FPS taken from different sources, so a standard demo is used to give a consistent basis for comparison. As above, it isn't likely to truly represent gameplay, but the contrast between two cards or systems is likely to be the same.

    In a busy situation, both will likely take the same performance hit, and the one with the higher average FPS will probably still be on the top.
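
    For what it's worth, here's a small illustrative C sketch of the gap between the average FPS that gets quoted and the minimum FPS that gameplay actually feels; the frame times are made up, not real benchmark data:

    #include <stdio.h>

    /* frame_ms holds per-frame render times in milliseconds */
    void report_framerate(const double *frame_ms, int n)
    {
        double total = 0.0, worst = 0.0;
        int i;

        for (i = 0; i < n; i++) {
            total += frame_ms[i];
            if (frame_ms[i] > worst)
                worst = frame_ms[i];        /* slowest frame of the run */
        }
        printf("average: %.1f fps\n", 1000.0 * n / total);
        printf("minimum: %.1f fps (slowest frame)\n", 1000.0 / worst);
    }

    int main(void)
    {
        /* mostly fast frames with one heavy firefight in the middle:
         * this averages ~143 fps, but the worst frame is only 40 fps */
        double frame_ms[] = { 5, 5, 5, 5, 25, 5, 5, 5, 5, 5 };
        report_framerate(frame_ms, 10);
        return 0;
    }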

    --
  • Running sideways is pretty god damned hard. You can do it, but you really have to practice at it without tripping over yourself or turning around in the process.

    Despite that, I don't think a game like Quake needs to be realistic. In fact, I like Quake because it isn't truly realistic. Do people bitch that Tetris isn't realistic? Why should Quake be any different?

    I'm a little confused by the current rash of realism-based games like SOF or CS. The "realism" seems quite arbitrary and independent of its effect on gameplay. You have a gun that takes five seconds to reload, but you conveniently forget the fact that you can't really reload a gun while dodging bullets, switching weapons and running for cover. Real SWAT people spend months training for fifteen seconds of action and don't particularly find it fun.

    --
  • by shinji1911 ( 238955 ) on Monday October 30, 2000 @01:00PM (#664497)
    It's a known fact that most "super-jumps" (questionable physics be damned) in Quake3 cannot be made at anything less than 125 frames per second or so.

    The jump from the rail on DM6, the swing jump to get the health in the middle of Tourney 4, etc., cannot be done at lower framerates.

    Granted, this has nothing to do with perception, but gameplay is also kinda important.

  • by _xeno_ ( 155264 ) on Monday October 30, 2000 @01:00PM (#664498) Homepage Journal
    There is never, ever any need to try to push your framerate above the vertical refresh rate of your monitor. Since my monitor tops out around 120 Hz (or 120 frames per second), someone running a game at 200 FPS (or 200 Hz) would be dropping frames, since the monitor can't keep up.

    Most games use this to their advantage, so that when I play Half-Life, my frame rate never goes above 72 FPS since my refresh rate is around 72 Hz. This prevents "tearing", where one frame is drawn during the first half of the screen's refresh sweep and a different one during the rest. Going above your refresh rate will actually make your game look worse.

    Even if the card is capable of 200fps, it should never actually do that - unless you have a ridiculously fast-refreshing monitor, you're just drawing frames that you won't see or that will simply tear. Plus, I believe it's been stated that the human eye cannot discern framerates above about 60FPS anyway. Although it is quite nice to be able to play Half-Life at 1280x960 at a constant 72fps (again, locked to 72fps since anything higher would tear on my display).
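
    A rough sketch of that kind of frame cap in C; get_time_us(), sleep_us() and render_frame() are hypothetical stand-ins for whatever the engine and OS actually provide, and in practice you would normally just enable vsync in the driver or via wglSwapIntervalEXT / glXSwapIntervalSGI:

    #define REFRESH_HZ 72
    #define FRAME_US   (1000000 / REFRESH_HZ)   /* 13888 us per refresh at 72 Hz */

    extern long get_time_us(void);   /* hypothetical microsecond clock      */
    extern void sleep_us(long us);   /* hypothetical fine-grained sleep     */
    extern void render_frame(void);  /* hypothetical "draw one frame" call  */

    void render_loop(void)
    {
        for (;;) {
            long start = get_time_us();
            long elapsed;

            render_frame();

            /* If the frame finished early, wait out the rest of the refresh
             * interval instead of drawing frames the monitor can never show. */
            elapsed = get_time_us() - start;
            if (elapsed < FRAME_US)
                sleep_us(FRAME_US - elapsed);
        }
    }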

  • For one thing, I can perceive flicker at up to 80 Hz, as can a few other people I know. This is why I have to run at a res/refresh combination that allows at least an 85 Hz refresh. I would imagine that there are probably people out there that can perceive up to 100Hz.

    Further, as others have mentioned, average frame rates are relatively irrelevant compared to framerates while rendering complex scenarios.

    Also, frame rates in *existing* games are often tuned by limiting the number of polygons on screen at a time. Dynamic T&L engines can add further strain. So even if Q3 could get 200 FPS on a highly complex scene (as opposed to an average one), the only thing that would mean is that it's time to build more complex scenes.

    At some point we'll have photorealistic engines at 200 fps for the most complex scene imaginable, but that day is far in the future.
  • I remember at the LAN party, after that massive gib, 2 computers hard locked. Damn glad I was running Linux :)

    Sanchi
  • Granted, the 24 fps used in the cinema does show signs of jerkiness during action scenes, but most people are surprised to know *just* how slow film still is.

    There is never a need to go beyond 75 fps for video because this is the refresh rate of most monitors above 1024x768. Push more than that and you will have frames that are rendered, but the electron gun will never draw those pixels!

    Of course, what *really* matters is sustained frame rate under scenes of high complexity, but as long as you can always manage 60-75 fps you'll never see the difference.

    -p.

  • I saw somewhere (maybe /.?) that graphics card technology was improving at a rate of Moore's Law cubed... doubling performance roughly every 6 months instead of every 18 months. Looks to me like we're starting to see the beginning of the limits of "useful FPS". Since it looks like the "useful FPS" dragon will be slain soon... then what? What I'm really hoping we'll see in the next few years is some viable consumer-grade VR applications and hardware. The companies are starting to produce some mainstream graphics cards that have some serious horsepower... the kind of horsepower that will be needed to drive stereoscopic HMDs...
  • Of course the human eye has limits, but that's not what 200+ FPS is for. The more frames, the faster you can do fake radiosity, environment mapping, and other effects that involve multiple frames composited to form the final image.

    I would like to see 307200 FPS so we can run a separate pipeline for each pixel on a 640*480 screen. Oh... and I'd like the card that does that to be so cheap that when it burns out you just run down to the drugstore and get one for $1.98. It'll happen eventually.
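
    One rough sketch of spending surplus speed on compositing several sub-frames into each displayed frame, using the old (standard but slow) OpenGL accumulation buffer; render_scene_at() is a hypothetical helper that clears and draws the whole world at a given intra-frame time offset:

    #include <GL/gl.h>

    extern void render_scene_at(float t);   /* hypothetical: draw world at time t */

    void render_composited_frame(float frame_time, int subframes)
    {
        int i;

        glClear(GL_ACCUM_BUFFER_BIT);
        for (i = 0; i < subframes; i++) {
            float t = frame_time * (float)i / (float)subframes;
            render_scene_at(t);
            /* add this sub-frame, weighted so the sum averages to 1.0 */
            glAccum(GL_ACCUM, 1.0f / (float)subframes);
        }
        glAccum(GL_RETURN, 1.0f);            /* copy the average back to screen */
    }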


    Eenie meenie miney moe
    Stupid voters have to go.
    Inca dinca dinca do
    I can do it, why can't you?
  • In terms of the human visual system, the maximum perceivable frame rate is somewhere between 60 and 100 FPS, depending on the viewer. The research done when Showscan [showscan.com] was developed established this. Showscan settled on 60FPS; IMAX uses 48FPS or sometimes even 96FPS. The effect is striking on a big screen; the annoying 24FPS strobing during pans of standard film disappears. That's the goal to shoot for.

    But above 100FPS, it just doesn't matter. Besides, you're limited by monitor sweep rate and phosphor decay rate.

  • When your video card is drawing 200fps, it is also getting input 200 times a second, so your control is much smoother and more accurate.
  • well, I know that even on a regular screen, *depending on the monitor*, I can tell the difference between 72 fps and 90 fps or so...

    after that, it is all sort of gravy, depending on the other bells and whistles and effects and such....

    seriously, even for regular applications, some monitors look worse at "standard frame rates" compared to others....

    not that it matters *that* much ...[smile]

  • by LionKimbro ( 200000 ) on Monday October 30, 2000 @01:10PM (#664540) Homepage

    60/72 Hz is what we want when we play games, but you generally have to target above that for when your potentially visible set (PVS) changes dramatically - if you move *really* fast (missile cam), say, or if you just turn your head 90 degrees (and look down a completely different hallway).

    A current trend in games is to separate the rendering cycle from the simulation cycle.

    Historically, games have been implemented with a read-eval-print loop like this:

    1. read user input for a given cycle
    2. evaluate what happens next
    3. display the result

    Now, we (FPS, 3D) seem to be moving towards the parallelization of the read/eval (simulation) cycles and the print (display) cycles. That way they can be controlled independently: the display can be given just the cycles it needs to provide 60/72Hz, and the simulation lives in its own space. The display routines have their own prediction mechanisms to make sure that they can keep pace.
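
    A condensed C sketch of that kind of decoupled loop: the simulation advances in fixed ticks while the renderer draws as often as the display allows, interpolating between the last two simulation states. All the helper names here are hypothetical placeholders, not any particular engine's API:

    #define TICK_MS 25.0                      /* 40 simulation updates per second */

    extern double now_ms(void);               /* hypothetical clock               */
    extern void   read_input(void);
    extern void   simulate(double dt_ms);     /* advance the world by one tick    */
    extern void   render(double alpha);       /* draw, blending the previous and
                                                 current state by alpha in [0,1]  */

    void game_loop(void)
    {
        double accumulator = 0.0;
        double last = now_ms();

        for (;;) {
            double current = now_ms();
            accumulator += current - last;
            last = current;

            /* run as many fixed simulation ticks as real time demands */
            while (accumulator >= TICK_MS) {
                read_input();
                simulate(TICK_MS);
                accumulator -= TICK_MS;
            }

            /* render whenever we get here; alpha says how far we are
             * between the previous and the current simulation tick */
            render(accumulator / TICK_MS);
        }
    }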

  • Under optimal conditions these cards can do 150-200+ fps; then when things get a little CPU intensive it only drops to 80-90, still a comfort zone. If they listened to all these scientists (who really need to stop worrying about us) and made their cards target the 72-75 fps mark, the moment things got CPU heavy we'd drop to 30, and then we'd get mad and kick the machine. Excess power and speed is needed to ensure a playable game. And for bragging rights :)
  • This is one of my beefs with Quake 3: the physics of the game depend on framerate. I remember reading that the first 1.25 beta fixed this, but the old physics code was put back in the latest beta. Oh well. People will bitch no matter what. People whined when id said they were taking out damage through walls. Of course, I bitch about the stupidity of strafe jumping, so I am one of those that will be bitching about id no matter what.
  • "As an illustration: when you watch a movie in a theater, there are 24 different images per second, but each image is displayed three times per second to yield a refresh rate of 72 Hz"

    But don't movie cameras introduce temporal anti-aliasing, which would help reduce the effects of a lower framerate?
  • Developers and gameplayers alike don't seem to recognize the huge difference that steady, high framerate makes for both the feel and playability of a game.

    A game that runs 72fps most of the time but drops down below that, even if it's only a little, will not feel as solid or be as playable as one that holds a steady, say, 60fps.

    PC games have always been the worst in this regard, partially because developers assume that better hardware is going to come out that will make their game run faster in the future, and partially because hardware is so unpredictable that getting it running smoothly on one machine doesn't necessarily mean that it will run smoothly on the next.

    It's excellent that hardware manufacturers keep pushing the level of performance of their products, but it's not so that we can achieve 200fps with Quake 3. It's so that the game won't drop *below* its peak framerate, ever, even on a complex level with lots of enemies. (First person games have a special need for a high framerate because of the speed with which your viewing angle changes.)
  • That's impressive, given that depending on resolution and monitor, you're limited to 75, 85, or maybe a max of 100 fps
  • In the article referenced, the author states that motion blur would not be good in games because it would create imprecise locations for game objects, making determination of hits impossible. Here's a quote:

    "The lack of motion blur with current rendering techniques is a huge setback for smooth playback. Even if you could put motion blur into games, it really is not a good idea whatsoever. We live in an analog world, and in doing so, we receive information continuously. We do not perceive the world through frames. In games, motion blur would cause the game to behave erratically. An example would be playing a game like Quake II, if there was motion blur used, there would be problems calculating the exact position of an object, so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned."

    This is just a failure to distinguish between a software *model* and its screen rendering, or *view* (Smalltalk programmers will see this at once). It is perfectly possible to maintain a precise location for an object in the game's model of its world, while only *rendering* a motion-blurred version of the object. This would allow extremely fast-moving objects (projectiles, shrapnel, etc.) to be rendered realistically, while still keeping the game's internal world model as precise as necessary to determine hits, collisions, etc.

    In this context, it should be noted that movie special effects make *extensive* use of motion blur to produce extremely realistic renderings of non-existent scenes using very low frame rates. Motion blur should really be seen as the key to realistic rendering, since frame rates will never reach the threshold necessary to freeze extremely fast-moving objects. After all, in the real world, one needs a very high-speed strobe to freeze a bullet. Frame rates, especially in demanding scenes (lots of objects, lots of motion), are not going to hit the 1000 fps mark any time soon. If fast-moving objects are to be rendered realistically, then they'll have to be done with motion blur, just as film professionals, like ILM, discovered years ago.
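
    To make that model/view split concrete, a small hypothetical C sketch (all names made up): hit detection reads the object's exact coordinates, while the renderer smears ghost copies along the path the object covered this frame.

    typedef struct {
        float prev_x, prev_y, prev_z;   /* exact position last tick  (model) */
        float x, y, z;                  /* exact position this tick  (model) */
    } object_t;

    /* hypothetical renderer call: draw the object at a point with an opacity */
    extern void draw_object_at(const object_t *o, float x, float y, float z,
                               float alpha);

    /* View: smear the object along the path it covered this frame. */
    void draw_blurred(const object_t *o, int samples)
    {
        int i;

        if (samples < 2)
            samples = 2;
        for (i = 0; i < samples; i++) {
            float t = (float)i / (float)(samples - 1);
            draw_object_at(o,
                           o->prev_x + t * (o->x - o->prev_x),
                           o->prev_y + t * (o->y - o->prev_y),
                           o->prev_z + t * (o->z - o->prev_z),
                           1.0f / (float)samples);   /* fade each ghost */
        }
    }

    /* Model: collision still uses the exact endpoint, never the blur. */
    int hit_test(const object_t *o, float px, float py, float pz, float radius)
    {
        float dx = o->x - px, dy = o->y - py, dz = o->z - pz;
        return dx * dx + dy * dy + dz * dz <= radius * radius;
    }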
  • How much of this can be attributed to the CPU having to work on other things too? I play Q3 under Win2K on my dual-CPU machine. First of all, I get a base frame rate about 40% higher. On top of that, I don't get such major slowdowns when the action gets heavy.

    I play with maximum graphical detail on a dual P2-450. I get about 90fps, and major gibbing doesn't kill my frame rate the same way it does when I set r_smp = 0. I should point out that at 640x480, the performance bottleneck for Q3 is the CPU. Turning off all of the detail or increasing the resolution to 800x600 makes little difference. I just like to see things in glorious technicolour rather than hi-res.
  • Try moving your hand between your eyes and your screen. Where did all those fingers come from?! It's a strobe, and that's what you see when you turn your head from side to side too. You can see the dark/light contrast.

    The faster the refresh, the more fluid that hand motion will be, and the less your screen will seem to flash as you look rapidly from one corner of your much-too-big monitor to the other.

    200 FPS may really be better.

    Poster does not play games.

  • In case anyone was wondering, this happens because Quake 3 uses a virtual machine by default for a large portion of the code on x86 and Motorola. This occurs because of a bug in the VM where a float is rounded up instead of chopped off, as is supposed to happen when you typecast a float.

    This does not happen when you use the hard compiled DLLs, obviously. Also, you do not need a high framerate in order to exploit this bug. Rather, there are framerate points where it is exploitable. 37 frames per second is one point.
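
    A tiny illustrative C snippet of why rounding up instead of truncating matters for framerate-dependent physics; the jump velocity and frame time below are made-up numbers, not pulled from the Q3 source:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        float jump_velocity = 270.0f;                /* illustrative units/s   */
        float frame_dt      = 1.0f / 125.0f;         /* 125 fps frame time     */
        float step          = jump_velocity * frame_dt;  /* 2.16 units/frame   */

        int truncated = (int)step;          /* 2 -- what a truncating cast gives */
        int rounded   = (int)ceilf(step);   /* 3 -- what a round-up bug gives    */

        printf("per-frame step: truncated=%d  rounded=%d\n", truncated, rounded);
        /* Accumulated over many frames, that extra unit is the sort of thing
         * that makes certain jumps possible only at particular framerates. */
        return 0;
    }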

    Michael Labbe
