Data centers are among the fastest-growing power consumers, accounting for something like 1.5-2% of all electricity use today (e.g. http://www.analyticspress.com/datacenters.html). We should all be excited that Facebook, Google, Apple, et al. are seriously working on reducing that impact.
Clearly there's a lot more to be done (manufacturing, transportation, etc), but I don't think that undermines the progress being made in datacenters.
I wonder how much of the growth of data centers is due to people moving their computational workload "out of the closet" and into "the cloud"?
If cloud virtualization and app hosting is truly a significant driver of data center growth, it seems likely that DCs represent a category shift and a net reduction in power consumption. So someone concerned with overall global or national energy consumption shouldn't want anyone "working on reducing that impact"; they'd actually want DC power consumption to increase.
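A back-of-envelope sketch of that "out of the closet" argument: workloads that each occupied a lightly-utilized dedicated server get consolidated onto fewer, better-utilized cloud hosts. All the numbers below are illustrative assumptions, not measurements from the report.

```python
# Hypothetical consolidation scenario: same total workload, fewer machines.
CLOSET_SERVERS = 100          # standalone closet machines being replaced
CLOSET_UTILIZATION = 0.10     # assumed low utilization of a dedicated box
CLOUD_UTILIZATION = 0.60      # assumed utilization after virtualization
WATTS_PER_SERVER = 300.0      # assume comparable hardware either way

# Useful work stays constant, so the cloud needs proportionally fewer hosts.
cloud_servers = CLOSET_SERVERS * CLOSET_UTILIZATION / CLOUD_UTILIZATION

closet_kw = CLOSET_SERVERS * WATTS_PER_SERVER / 1000.0
cloud_kw = cloud_servers * WATTS_PER_SERVER / 1000.0

print(f"closet: {closet_kw:.1f} kW -> cloud: {cloud_kw:.1f} kW")
# Under these assumptions the same workload draws ~1/6 the power,
# even before counting cooling/infrastructure (PUE) differences.
```

So even though the DC line item on the national power bill grows, the closets it emptied out stop drawing power, and the net can still be a reduction.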
From your linked report:
Growth in the installed base of servers in data centers had already begun to slow by early 2007 because of virtualization and other factors. The 2008 financial crisis, the associated economic slowdown, and further improvements in virtualization led to a significant reduction in actual server installed base by 2010 compared to the IDC installed base forecast published in 2007.
Also, perhaps more directly:
Because cloud computing installations typically have much higher server utilization levels and infrastructure efficiencies than do in-house data centers (with PUEs for some specific facilities lower than 1.1), increased adoption of cloud architectures will result in lower electricity use than if the same computing services were delivered using more conventional approaches.
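For anyone unfamiliar with the metric in that quote: PUE (Power Usage Effectiveness) is total facility power divided by IT equipment power, so 1.0 is the theoretical ideal and the ~1.1 cited for the best cloud facilities means only ~10% overhead for cooling and power delivery. The facility numbers below are made-up illustrations, not figures from the report.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility draw / useful IT draw."""
    return total_facility_kw / it_equipment_kw

# Hypothetical in-house server room: 100 kW of IT load plus 100 kW of
# cooling and distribution overhead.
in_house = pue(200.0, 100.0)   # PUE of 2.0

# Hypothetical efficient cloud facility: 100 kW of IT load plus only
# 10 kW of overhead, matching the ~1.1 best-case figure in the quote.
cloud = pue(110.0, 100.0)      # PUE of 1.1

print(f"in-house PUE: {in_house:.2f}, cloud PUE: {cloud:.2f}")
```

Under those assumed numbers, moving the same 100 kW of IT work from the in-house room to the cloud facility cuts total draw from 200 kW to 110 kW before any utilization gains are counted.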