It all depends on the particular case; you can't draw a conclusion from general stats.
Choice of hardware plays a large part in the first place. For example, if you don't really need a quad core, the extra idle power can mean tens or hundreds of dollars in extra electricity cost per year. Ditto for a high-end video card, some of which run as hot as a CPU.
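Just to put rough numbers on that claim, here is a back-of-envelope sketch; the wattage gaps and the electricity rate are assumptions for illustration, not measurements:

```python
# Back-of-envelope yearly cost of extra idle power.
# The wattages and $/kWh rate below are assumptions, not measurements.

HOURS_PER_YEAR = 24 * 365  # machine left on around the clock

def yearly_cost(extra_watts, dollars_per_kwh=0.12):
    """Cost of drawing `extra_watts` continuously for a year."""
    kwh = extra_watts * HOURS_PER_YEAR / 1000.0
    return kwh * dollars_per_kwh

print(round(yearly_cost(20)))    # ~21  -> e.g. a modest extra-cores idle gap
print(round(yearly_cost(100)))   # ~105 -> e.g. a hot high-end GPU idling all year
```

So a 20 W difference left idling lands in the tens of dollars per year, and a 100 W difference gets into the hundreds once electricity rates climb above the assumed 12 cents/kWh.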
I noticed that I get almost 30 minutes more on battery with OpenSuSE 10.3 (64-bit) than I had with Vista (32-bit) on the same laptop: from 1:30 to almost 2 hours. (Yeah, only 2 hours; the 17" screen slurps a lot and the batteries are cheap rubbish.)
The test above ran Red Hat and Windows 2008 on the same setup, so the savings they saw came from the OS on identical hardware, not from scaling down the hardware.
What I was trying to point out is that you may get better savings from choosing the hardware appropriately than from choosing the OS. Desktop users seldom pick an OS based on power consumption anyway; you generally choose the apps you want to run first.
And desktop workloads are much harder to measure than server workloads. There are also more settings to tweak (CPU frequency governor, screen brightness, disk spin-down, and so on).
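If you want to compare configurations on a laptop, one rough approach is to sample the battery's reported power draw while running your normal desktop workload. The sketch below assumes a Linux sysfs path (`/sys/class/power_supply/BAT0/power_now`); it exists on many laptops, but the battery name varies and some expose `current_now`/`voltage_now` instead of `power_now`.

```python
# Rough sketch: average a laptop's battery power draw from sysfs so two
# OS/settings configurations can be compared under the same workload.
# The BAT0/power_now path (microwatts) is an assumption; adjust for your machine.
import time

POWER_NOW = "/sys/class/power_supply/BAT0/power_now"  # assumed path, in microwatts

def average_draw_watts(samples=30, interval=2.0):
    """Average power draw over samples * interval seconds while on battery."""
    readings = []
    for _ in range(samples):
        with open(POWER_NOW) as f:
            readings.append(int(f.read()) / 1_000_000)  # microwatts -> watts
        time.sleep(interval)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"average draw: {average_draw_watts():.1f} W")
```

Even then the numbers jump around with screen brightness, background indexing, and whatever apps happen to be open, which is exactly why desktop comparisons are harder to pin down than server benchmarks.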