I usually go by the performance index of www.3dcenter.org, which gives an average performance value relative to the Radeon HD 5750/6750 GDDR5 (defined as 100%).
The index is not based on theoretical GFLOPS, but on tests by various review sites (mostly gaming) and calculated for benchmark results at 1920x1080 with 4x multisampling anti-aliasing.
This explains why Nvidia looks better in the 3dcenter.org ranking, as they usually get more gaming performance out of cards with the same GFLOPS.
3dcenter.org also calculates a performance/watt rating by dividing the performance index by the typical power consumption in games. The result is performance (in percent) per watt, and as explained above, it favors Nvidia. Of course, if you do something other than gaming, your results may differ.
The best result at the moment is for the GTX 980 4GB at 3.45, closely followed by the 750 Ti at 3.44. I used the 750 Ti as an example of a midrange card that still performs quite nicely compared to high-end cards of a few years ago. The current market price is 130-145 Euro. The Fury X is listed with a performance per watt of 2.32.
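The rating is simple division, so it is easy to reproduce yourself. A minimal sketch (the index and wattage values below are hypothetical placeholders, not 3dcenter.org's actual figures for any card):

```python
def perf_per_watt(performance_index: float, typical_gaming_watts: float) -> float:
    """Divide the 3dcenter.org-style performance index (in percent,
    relative to the HD 5750/6750 = 100%) by typical gaming power draw."""
    return performance_index / typical_gaming_watts

# Hypothetical example: a card rated at 520% drawing 150 W in games
rating = perf_per_watt(520, 150)
print(f"{rating:.2f} %/W")
```

A card's rating improves either by raising gaming performance at the same power or by cutting power draw at the same performance, which is why efficient midrange cards can match flagship cards on this metric.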
BTW, Wikipedia says that
Full-height cards may increase their power after configuration. They can use up to 75 W (3.3 V × 3 A + 12 V × 5.5 A).
Graphics card manufacturers routinely use that to save a few cents on the extra power connector.
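You can check that 75 W figure against the rail limits yourself; summing voltage times current per rail gives just under 76 W, which the spec rounds to 75 W:

```python
# PCIe slot power budget per rail: (volts, max amps)
rails = [
    (3.3, 3.0),   # 3.3 V rail, up to 3 A
    (12.0, 5.5),  # 12 V rail, up to 5.5 A
]

# Power is volts * amps, summed over both rails
total_watts = sum(volts * amps for volts, amps in rails)
print(f"{total_watts:.1f} W")  # ~75.9 W, commonly quoted as 75 W
```

So any card staying within that slot budget can omit the 6-pin connector entirely.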