Antiocheian's Journal: How I improved energy savings on my nVidia 8800 GTS

Journal by Antiocheian

The following is a walkthrough for better power management of the nVidia 8800 GTS video card (it should work on most nVidia or ATI cards). Experienced RivaTuner users only need to read this much: you can achieve better power management by keying your high-frequency / low-frequency switching to video memory usage instead of the hardware acceleration boolean.

I didn't like nVidia's own energy-saving settings because they didn't save much energy: temperatures stayed quite high even with the lower energy settings enabled. In particular, the built-in energy saving:

  1. Does not change the memory frequency at all
  2. Does not lower the GPU frequency much
  3. Is disabled whenever a "hardware acceleration" call is invoked

The first two items are self-explanatory, but #3 needs further attention: most CPU power management depends on CPU usage; that is, when you use your CPU intensively, it is switched to higher frequencies and voltages. There is currently no utility (or, more importantly, hardware call) to measure GPU usage, so an alternative has to be found.

nVidia's alternative is the hardware acceleration boolean. This boolean is switched on when 3D hardware acceleration features are being used and disabled otherwise. nVidia's theory, then, is that whenever 3D acceleration is requested, your card should go to full-power mode.

This is a bad idea because playing a video in a standard video player, or a simple game with minimal 3D acceleration requirements, will invoke such calls, yet neither of these requires more than 1% of the monstrous 8800 GPU. (Windows Vista uses pixel shaders and therefore invokes hardware acceleration calls all the time, but I am not using Vista and neither should anyone interested in saving energy.)

Video memory, on the other hand, provides a much better signal to rely on. With a few exceptions (mostly diagnostics and some benchmarks), demanding games require a lot of video memory while non-demanding games (and video playback) require little. My threshold is set to 98 MB: anything that uses more video RAM runs at standard speed, while anything that uses less runs on the energy-saving settings.

Interested? You'll need:

  1. A standard or modified Forceware driver (version 150 or higher)
  2. The RivaTuner utility from Guru3D (version 2.08 or higher)

And this is what we'll do:

  1. Enable monitoring of various metrics
  2. Create a standard profile
  3. Create an underclocked profile
  4. Automate switching between profiles based on the video memory usage

1) Enable monitoring of various metrics

Install RivaTuner, then go from the Main tab to Target Adapter, Customize..., Hardware monitoring. Click the background monitoring button (the red Record-style button at the bottom left), then click Setup, Plugins. Enable the VidMem.dll plugin, click OK, and tick Local videomemory usage in the data sources list (you are still in Setup, right?). It is also not a bad idea to tick Core clock, Shader clock and Memory clock to see what is going on. Click OK and inspect your graphs for a moment. You can then close the hardware monitoring window; it will keep monitoring the data sources in the background as you've set it up.

2) Create a standard profile

Now go to the Power User tab and change the RivaTuner\Overclocking\Global\MinClockLimit setting to 9. This will allow you to specify really low frequency settings. Go back to the Main tab and open the System tweaks\Overclocking tab; save your current (default) clocks as a preset named "HIGH CLOCKS". You'll need this later.

3) Create an underclocked profile

Now it's time for some underclocking: set your Core clock and Memory clock to a low frequency combination (such as 128 Core / 200 Memory) and click "Test" to see what happens. It will probably be fine, and you should hit "Apply" after that. If you wish, try an even lower combination. I use 80 Core / 80 Memory and I am fine with that.
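To get a feel for what this buys you, recall that dynamic power in CMOS logic scales roughly as P = C x V^2 x f. Since RivaTuner changes only the clocks here (not the voltage), dynamic power falls roughly in proportion to frequency. A back-of-the-envelope sketch in Python, assuming stock clocks of roughly 500 MHz core and 800 MHz memory (those stock figures are my assumption for illustration, not measured values):

```python
# Rough estimate of dynamic-power savings from underclocking alone.
# Dynamic CMOS power scales roughly as P = C * V^2 * f, so at a fixed
# voltage it falls linearly with clock frequency.  Stock clocks below
# are assumptions for illustration, not measurements.

def relative_dynamic_power(underclocked_mhz, stock_mhz):
    """Fraction of stock dynamic power remaining at the same voltage."""
    return underclocked_mhz / stock_mhz

core = relative_dynamic_power(80, 500)   # 80 MHz core vs ~500 MHz stock
mem = relative_dynamic_power(80, 800)    # 80 MHz memory vs ~800 MHz stock

print(f"core dynamic power:   {core:.0%} of stock")
print(f"memory dynamic power: {mem:.0%} of stock")
```

Note this covers only dynamic power; static (leakage) power is unaffected by the clocks, which is one reason a voltage drop on top of the underclock would help so much more (voltage enters the formula squared).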

Make sure those settings are enabled for both standard 2D and performance 3D (the combo box at the top of the window). Now watch a video clip, play a game or run a short 3D speed test. Naturally the reported FPS will be quite low (although not as low as you'd expect, which is proof of how fast these cards really are). If all goes well (and I don't see why it shouldn't), go back to the System tweaks\Overclocking tab, tick "Apply overclocking at Windows startup" and save the settings as a profile with the intuitive name "LOW CLOCKS" or something like that.

Note: newer versions of the Forceware drivers allow over- and underclocking of the shader domain clock. Change that as well for even better energy savings.

4) Automate switching between profiles based on the video memory usage

The point is that whenever you play a demanding game your standard settings have to kick in; otherwise you should stay on the lower, cooler clocks. That used to be quite difficult with RivaTuner, but fortunately the good (and donation-deserving) author has improved it a lot.

Go to the Launcher tab and click on the plus icon; choose Regular Item and type the name "HIGH CLOCKS" (again). Tick the "Associated overclocking profile" and pick the "HIGH CLOCKS" item from the combo box beneath. Click OK and do the same for the low clocks profile.

We are almost done; go to the Scheduler tab and click on the plus icon. Use the following settings to create the low clocks launcher:

Launch: item
Name: LOW CLOCKS
Run task: on hardware monitoring threshold event
Data source: Local videomemory usage
Threshold: 98
Direction: downward
Display threshold on hardware monitoring graph: yes (a bright color will be fine)

and another launcher for high clocks:

Launch: item
Name: HIGH CLOCKS
Run task: on hardware monitoring threshold event
Data source: Local videomemory usage
Threshold: 99
Direction: upward
Display threshold on hardware monitoring graph: yes (a bright color will be fine)
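Together, these two scheduler entries implement a simple hysteresis rule: drop to the low clocks when video memory usage falls below 98 MB, and return to the standard clocks when it rises above 99 MB. The 1 MB gap between the thresholds prevents the profiles from toggling rapidly when usage hovers right at the boundary. A minimal Python sketch of the same decision logic (the profile names and poll loop are just for illustration; RivaTuner handles this internally through its threshold events):

```python
# Illustrative sketch of the two-threshold (hysteresis) rule that the
# scheduler entries implement.  Not RivaTuner's API -- just the logic.

LOW_THRESHOLD = 98    # MB: falling below this switches to LOW CLOCKS
HIGH_THRESHOLD = 99   # MB: rising above this switches to HIGH CLOCKS

def next_profile(current, vram_mb):
    """Return the profile to apply, given the active profile and VRAM usage."""
    if current == "HIGH CLOCKS" and vram_mb < LOW_THRESHOLD:
        return "LOW CLOCKS"
    if current == "LOW CLOCKS" and vram_mb > HIGH_THRESHOLD:
        return "HIGH CLOCKS"
    return current  # inside the 98-99 MB band: no change (hysteresis)

# Example trace: desktop use stays low; a demanding game forces high.
profile = "HIGH CLOCKS"
for usage in [40, 60, 98.5, 120, 98.5, 90]:
    profile = next_profile(profile, usage)
    print(f"{usage:6} MB -> {profile}")
```

Notice in the trace that 98.5 MB changes nothing in either direction; only a clean crossing of 98 MB downward or 99 MB upward flips the profile.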

That's it. You can play a game and you'll see no difference, because the standard clocks will kick in. While working on the desktop, though, you'll notice much lower temperatures (and lower fan noise) from your card.

It's a shame nVidia didn't pay greater attention to energy saving. It's not only a matter of doing something for the environment; keeping the video card's temperature low keeps the overall system temperature low. It also means less heat is dumped into the room, which results in lower (if any) air conditioning bills for those living in warmer climates.

It would be really great if nVidia published an updated BIOS that automatically lowered power settings based on GPU usage instead of video memory. nVidia could also switch the voltage settings, which would make the difference between the low and standard clocks really huge.

Thanks for reading; comments are welcome.
