A Carnegie Mellon University study concluded that the energy cost of data transfer and storage is about 7 kWh per gigabyte. An assessment at a conference of the American Council for an Energy-Efficient Economy reached a lower number: 3.1 kWh per gigabyte. (A gigabyte is enough data to save a few hundred high-resolution photos or an hour of video.)

Compared with your personal hard disk, which requires only about 0.000005 kWh per gigabyte to save your data, this is a huge amount of energy. Saving and storing 100 gigabytes of data in the cloud per year would result in a carbon footprint of about 0.2 tons of CO2, based on the usual U.S. electricity mix (source: https://medium.com/stanford-magazine/carbon-and-the-cloud-d6f481b79dfe). That said, a lot of these numbers are pretty questionable.
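The footprint figure above can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, assuming a grid carbon intensity of 0.6 kg CO2 per kWh (a rough stand-in for the U.S. mix, not a figure from the sources above):

```python
def storage_footprint_kg(gigabytes: float, kwh_per_gb: float,
                         kg_co2_per_kwh: float) -> float:
    """CO2 footprint in kg for transferring/storing `gigabytes` of data."""
    return gigabytes * kwh_per_gb * kg_co2_per_kwh

# Assumed grid intensity (kg CO2 per kWh) -- an illustrative value, not sourced.
GRID = 0.6

# The per-gigabyte estimates quoted above, applied to 100 GB/year:
cmu = storage_footprint_kg(100, 7.0, GRID)          # ~420 kg
aceee = storage_footprint_kg(100, 3.1, GRID)        # ~186 kg, i.e. ~0.2 tons
local_disk = storage_footprint_kg(100, 0.000005, GRID)  # well under 1 gram

print(cmu, aceee, local_disk)
```

With these assumptions the lower (ACEEE) estimate lands right around the 0.2 tons quoted above, while the local-disk figure is smaller by roughly six orders of magnitude.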

It is hard, then, to argue for communal computing infrastructure when it is much more energy-efficient to centralize and virtualize computer servers.

Amazon claims that AWS is 3.6 times more energy-efficient than the median U.S. enterprise data center.

Is there still an argument for local-first here?