
The German consumer test magazine "Stiftung Warentest" tested video streaming platforms in issue 1/2020 of their magazine. As a side note, they write on page 35:

Laut Studien der grünen französischen Denkfabrik "The Shift Project" erzeugt Videostreaming enorme CO2-Emissionen. Sie sollen knapp der Hälfte jener Emissionen entsprechen, die durch den gesamten zivilen Flugverkehr entstehen. Das Streamen eines zehnminütigen HD-Videos auf einem Handy verbraucht laut Forschern genauso viel Energie wie ein 2 000-Watt Backofen bei voller Leistung in fünf Minuten. Stromfresser sind vor allem die riesigen Serverfarmen der Anbieter, von denen aus die Videos gestreamt werden und die zudem ständig gekühlt werden müssen.

My translation:

According to studies by the green French think tank "The Shift Project", video streaming causes enormous CO2 emissions. They are said to correspond to nearly half of the emissions caused by worldwide civil aviation. According to the researchers, streaming a ten-minute HD video on a smartphone consumes as much energy as a 2,000-watt oven running at full power for five minutes. The main power guzzlers are the providers' huge server farms, from which the videos are streamed and which additionally require constant cooling.

They don't mention it explicitly, but it seems reasonable to assume that a "2,000-watt oven" draws 2,000 watts when running at full power. If that oven consumes in 5 minutes the same amount of energy as 10 minutes of video streaming, then video streaming would have an average power draw of 1,000 watts.
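A quick sketch of that arithmetic (all values are taken from the claim itself, not from the study):

```python
# Back-of-the-envelope check of the oven comparison.
# 2,000 W for 5 minutes and 10 minutes of streaming are the values
# quoted by Stiftung Warentest.
oven_power_w = 2000      # oven power draw at full power
oven_minutes = 5
stream_minutes = 10

oven_energy_wh = oven_power_w * oven_minutes / 60        # ≈ 166.7 Wh
stream_power_w = oven_energy_wh / (stream_minutes / 60)  # ≈ 1000 W

print(f"Oven energy: {oven_energy_wh:.1f} Wh")
print(f"Implied average streaming power: {stream_power_w:.0f} W")
```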

These numbers sound highly exaggerated to me. Smartphones consume very little energy and will have a negligible impact on the calculation, so the consumption must happen nearly exclusively in the data centers and the transmission infrastructure.

This article assumes a power consumption of up to 350 W for a gaming PC under full load. That seems plausible. In a data center we additionally need supporting infrastructure such as cooling. This article writes that about 50% of power consumption goes to supporting infrastructure, so we'd need about 700 W for a gaming PC under full load in a data center. To put this in relation: to reach 1,000 W, significantly more than one gaming PC under full load would need to be running in a data center just to provide a single video stream. As I have been able to stream videos from a Raspberry Pi on my local network, this sounds extremely exaggerated.
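A rough sketch of that comparison (the 350 W full-load figure and the 50 % overhead share are taken from the linked articles; reading the overhead as half of the total consumption is my interpretation):

```python
# How many gaming PCs under full load, including data center overhead,
# would be needed per stream to reach the claimed 1,000 W?
gaming_pc_w = 350          # full-load draw of a gaming PC
overhead_fraction = 0.5    # share of total consumption for cooling etc.

pc_in_datacenter_w = gaming_pc_w / (1 - overhead_fraction)  # ≈ 700 W
claimed_stream_w = 1000

print(f"One PC incl. overhead: {pc_in_datacenter_w:.0f} W")
print(f"PCs per stream implied by the claim: "
      f"{claimed_stream_w / pc_in_datacenter_w:.2f}")       # ≈ 1.4
```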

The cited "The Shift Project" published the REPORT / LEAN ICT: TOWARDS DIGITAL SOBRIETY and on page 33 I found:

Spending 10 minutes watching a high definition video by streaming on a smartphone is equivalent to using a 2,000W electric oven at full power for 5 minutes

This indirectly references:

Described in "[Lean ICT Materials] 1byte Model". Produced by The Shift Project.

At this point I start having difficulties following the reference.

Does streaming a video to a smartphone consume 1,000 W of electricity?

Notes:

  • This is primarily about popular video on demand services such as Netflix, Amazon, YouTube, etc.
  • I assume that the total energy consumed is linear in the time watched, e.g. a 20-minute video will consume power at the same rate as a 10-minute video and will thus consume twice as much energy. In case there is any evidence that this assumption is wrong to a degree that influences the outcome of the question, let me know.
yankee
  • Energy is measured in watt hours, not watts. Power is measured in watts. The interesting comparison is the energy consumed, not the power. A 2000 watt oven operated at full power for 5 minutes consumes 166 watt hours of energy. – JRE Dec 27 '19 at 12:48
  • 1000 watts for 10 minutes is also 166 watt hours (as you expected.) – JRE Dec 27 '19 at 12:49
  • @JRE: I know the difference between watts and watt hours. Alternatively I could have asked "does streaming a one hour video to a smartphone consume 1000 watt hours?", but this only complicates the question, right? So I am not sure what you are trying to say... – yankee Dec 27 '19 at 12:55
  • I'm trying to say that you shouldn't confuse things. The question only makes sense in terms of energy. Your question flip-flops between energy and power and treats them both the same. – JRE Dec 27 '19 at 13:00
  • @JRE there's no confusion. As you clearly understand, energy and power are related by time, so asking whether streaming video for a particular length of time consumes a given amount of energy is equivalent to asking (without mentioning time) whether streaming video consumes a given amount of power. Similarly, "does a 2000-watt oven consume 2000 watts of electricity at full power" is just as valid a question as "does a 2000-watt oven consume 200 watt-hours of electricity over 6 minutes of use at full power." – phoog Dec 27 '19 at 13:32
  • @JRE unless you want to get into the language and say that "consume" is the wrong verb for power, or that the time-free version of the question should be "does it consume electricity at the rate of..." or "...at the same rate as...." – phoog Dec 27 '19 at 13:34
  • @Nat: I added a note about this to the end of my question. – yankee Dec 27 '19 at 14:57
  • There is some minor overhead energy lost in serving the video, due to the site's servers needing to find the requested video. Videos are definitely indexed, so the search should be very quick and cost little energy, but it might not be negligible on short videos. – Ryan_L Dec 27 '19 at 16:31
  • I call absolute shenanigans on the claim: it doesn't pass the sniff test. If it cost 166 watt-hours to show a 10 minute video, Netflix would be out of cash in a few hours charging only ~$10/mo per subscriber. At a lowish rate of $0.10/[kwh](https://www.eia.gov/electricity/monthly/epm_table_grapher.php?t=epmt_5_6_a), that's $16.80 per week (if you have the stream going 24 hours a day). There is no way this could possibly even be *close* to correct. – warren Dec 27 '19 at 19:00
  • @warren: of those 166 Wh the mobile internet provider has to pay for 150 Wh. And I'm sure they account for that in their data plan prices. Netflix has to cover about 12 Wh. (according to the Lean ICT model, see my answer) – cbeleites unhappy with SX Dec 27 '19 at 19:21
  • @cbeleitessupportsMonica - according to OP, "*Power guzzlers are mainly the huge server farms of the providers which provide the videos and also require continuous cooling.*" Your answer indicates that is patently false (which lines up with my back-of-the-envelope math) – warren Dec 27 '19 at 19:34
  • @warren: also, *data* centers usually don't run servers providing computational power like gaming PCs. And of course, every W of power consumed for computation/data retrieval/storage/backup as well as all losses in the power chain (transforming etc.) has to be gotten rid of. I.e. unless you build that data center at the northern polar circle (like facebook in Luleå - and then you'll have to calculate internet transmission over long distances) you'll have to have serious air conditioning. Also, the smaller the building, the more power is needed to transport away heat produced at the same rate. – cbeleites unhappy with SX Dec 27 '19 at 19:53
  • @cbeleitessupportsMonica - no, data centers usually run much higher power-draw systems than a mere desktop :) – warren Dec 28 '19 at 00:25
  • @warren: ...but not per served data volume. (which I lost during re-sorting the sentences above). I'd also expect that they have power supplies with better efficiency. OTOH, they are on 24/7, whereas I'm going to switch off my PC now. – cbeleites unhappy with SX Dec 28 '19 at 00:48

1 Answer


Summary:

  • The data (model + coefficients together with source citations) of the Lean ICT study are available (see below).

  • HD video streaming consuming between 250 W and 1 kW power equivalent is plausible when compared to a study from 2012 that found 780 W.

  • The Stiftung Warentest formulation cited by OP is IMHO misleading:

    • it does not mention the crucial fact that this 1 kW equivalent estimate applies to mobile internet connections - but the connection mode (mobile internet vs. WLAN vs. wired LAN) is the most important factor in the Lean ICT model.
      For comparison, that same model estimates about 500 W equivalent for wired access and 250 W equivalent for WLAN streaming on the mobile phone.
    • In contrast, the study estimates the difference between laptop and mobile phone at only 2 Wh for the example video (6.5 W equivalent [no kilo here!]).
  • I found one point where the plausibility of the model used in the Lean ICT study is unclear to me. However, that would mean the mobile and WLAN internet estimates are too low (by something between < 60 W and 440 W equivalent) rather than implausibly high.
    In any case, Lean ICT explicitly states that their numbers should be read as orders of magnitude only, due to extensive uncertainty. The order of magnitude would probably still be OK.

Long version

Direct Comparison with other estimates

Checking where the Lean ICT numbers come from (internally)

OP reasons:

Smartphones consume very little energy and will have a negligible impact on the calculation, so the consumption must happen nearly exclusively in the data centers and the transmission infrastructure.

This is actually in full agreement with the statements of the Shift Project's Lean ICT report (linked by OP in the question), which states on p. 33:

Watching a video online on the Cloud for ten minutes, for example, results in the electricity consumption equivalent to the consumption of a smartphone over ten days. In other words, the energetic impact of watching a video is about 1,500 times greater than the simple electricity consumption of a smartphone itself.
[emphasis as in source]
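As a rough consistency check, this 1,500× figure lines up with the ~1 kW equivalent derived below if one assumes a smartphone consumes something like one full battery charge (~15 Wh) per day; that daily consumption is my assumption, not a number from the report:

```python
# Consistency check of the "1,500 times" claim.
smartphone_wh_per_day = 15    # assumed: roughly one full charge per day
smartphone_avg_w = smartphone_wh_per_day / 24   # ≈ 0.6 W average draw

streaming_w = 1500 * smartphone_avg_w           # ≈ 940 W
print(f"1,500 x average smartphone power: {streaming_w:.0f} W")
# Same order of magnitude as the ~1 kW equivalent from the model.
```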

The report also states that there's large uncertainty in those estimates and that therefore they'll discuss only orders of magnitude (also p. 33).

They do have a kind of supplementary material guide, which leads to an Excel sheet containing the energy consumption model for video streaming.
That sheet gives the energy consumption for streaming a 10 min video (162 MB) on a smartphone over mobile internet as 164 Wh, so the equivalent power is 980 W.

According to that model, the 164 Wh break down as follows (my calculations from their sheet):

  • 1 Wh device impact (scales with viewing time)
  • 12 Wh impact at the data center (scales with data volume)
  • 150 Wh impact of network transmission in "mobile internet mode" (scales with data volume)

So the elephant in the room is mobile internet transmission, which is estimated to account for 92 % of the power consumption. Their model also gives numbers for WiFi and wired network streaming, which are roughly 1/6 (17 %, 26 Wh) and 1/2 (49 %, 73 Wh) of the per-byte energy consumption of mobile internet, respectively.
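A short sketch reproducing these figures from the sheet's component values (listed above; the components sum to 163 Wh, so the sheet's 164 Wh is presumably a rounding artifact):

```python
# Recomputing the equivalent power and the per-mode fractions from the
# Lean ICT sheet's component values for a 10-minute, 162 MB HD stream.
video_minutes = 10
device_wh = 1        # device impact (scales with viewing time)
datacenter_wh = 12   # data center impact (scales with data volume)
mobile_wh = 150      # mobile internet transmission

total_wh = device_wh + datacenter_wh + mobile_wh   # ≈ 163 Wh
equivalent_w = total_wh / (video_minutes / 60)     # ≈ 980 W

wifi_wh, wired_wh = 26, 73   # the sheet's alternative transmission modes
print(f"Equivalent power (mobile internet): {equivalent_w:.0f} W")
print(f"WiFi fraction of mobile: {wifi_wh / mobile_wh:.0%}")    # 17 %
print(f"Wired fraction of mobile: {wired_wh / mobile_wh:.0%}")  # 49 %
```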

I'm astonished that WiFi transmission is supposed to consume only a fraction (1/3) of the power of wired transmission, as I'd expect wireless transmission (whether LAN or WAN) to result in large losses (warming up walls etc.) compared to wired data transmission. As for not seeing this in the mobile phone's energy consumption: as a rule of thumb, the energy consumption for broadcasting wireless data is much higher than for receiving the same amount of data.


The Shift Project uses the underlying model from Andrae & Edler: On Global Electricity Usage of Communication Technology: Trends to 2030, Challenges 2015, 6(1), 117-157, with their own updates to the estimated coefficients (pp. 14-15 in the report). Wired LAN having a higher power consumption per data volume than WLAN also occurs in the Andrae & Edler model. But they also note that for wired data transmission,

The optical backbone network is responsible for the majority of the consumed electricity

This reads to me as if both the mobile internet and WLAN "modes" would need some wired data transmission added, because WLAN and mobile transmission cover only the "last mile". It is not clear to me whether and how Andrae & Edler or the Shift Project do this. If they didn't, the resulting numbers will underestimate the energy consumption for WLAN and mobile internet and overestimate that of wired transmission.


Relevant Information from Other Sources

This news post by Deutsche Welle (in German) (mostly also about the Lean ICT study) additionally cites Lutz Strobbe from the Fraunhofer Institute for Reliability and Microintegration, who says that

  • the last mile is important for total energy consumption.
    (This does not contradict the majority of power consumption in wired networks occurring in the internet backbone, if the wireless options consume similar amounts of power on the last meters as the whole backbone transmission does.)
  • mobile internet is worst in terms of power consumption due to high losses in building walls, vegetation, weather etc.
  • Wired networks using copper cable (as opposed to optical transmission) have high losses, too.
    So another possible explanation for the high power consumption of wired internet is that there are still lots of copper cables out there.
    However, at least where I am, the likely transmission "mix" for, say, WLAN is optical backbone - probably copper for the last mile to the house - WLAN (indoors). Still, the question remains whether a common wired transmission part has been added.

Aslan et al.: Electricity Intensity of Internet Data Transmission: Untangling the Estimates, J Industrial Ecology, 2017 estimate 60 Wh/GB for data transmission via the internet for 2015. This is for wired transmission and does not include the energy consumption of the home network, just IP core (backbone) network and the ISP's access network (also undersea cable transmission is not included).

For the assumed 162 MB video, that would be about 10 Wh (or less, due to improvements since 2015 such as a higher fraction of optical transmission infrastructure).
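The arithmetic behind that figure (60 Wh/GB from Aslan et al., 162 MB from the Lean ICT sheet):

```python
# Energy for wired internet transmission of the example video,
# using the Aslan et al. (2015) intensity estimate.
wh_per_gb = 60       # Wh per GB transmitted (core + ISP access network)
video_gb = 0.162     # 162 MB example video

transmission_wh = wh_per_gb * video_gb   # ≈ 9.7 Wh
print(f"Wired transmission energy: {transmission_wh:.1f} Wh")
```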

As the Shift Project estimates 73 Wh for the example video, it is quite possible that 10 Wh of this is spent on internet transmission. And it is possible that this amount is included in the WLAN and mobile internet estimates as well. However, this does not fit well with the Andrae & Edler paper's claim that most of the wired internet energy consumption occurs in the backbone.

  • Wow, great answer. Although I am still astounded: I can upload data over the 4G network with my smartphone at 50 MBit/s without problems. This does drain my battery rather quickly, but it can't be more than a couple of watts anyway considering the battery size. So the plain generation of radio waves cannot be the issue. What makes the mobile network so darn inefficient? The same is true for WiFi/LAN: Just looking at the specs of the power adapter reveals that they can't consume that much energy. So it's all in the infrastructure outside my house. Whatever they mean with "WIFI" in this context... – yankee Dec 27 '19 at 20:17
  • @yankee As you realize, these figures make absolutely no sense at all, and any back-of-the-envelope calculation that takes the power adapters into consideration makes their figures break the laws of physics, so yeah... someone is leaving something important out. Not going to read all the papers, but maybe they think that a single 100 kW cell tower only serves one person. – pipe Dec 27 '19 at 22:49
  • Honestly, even the "at data center" number seems too high to me; even my PC would consume significantly less, and I'd assume data centres are more energy efficient than home PCs. – Alice Dec 27 '19 at 23:23
  • @pipe: one would need to go down the rabbit hole and follow far more papers. I have no idea about the power consumption characteristics of cell towers, but some of the papers I glanced through gave numbers for idle/receiving/sending WLAN access points. Idle was not negligible compared to sending, and these devices tend to be idle for large parts of the day. I expect these estimates to contain numbers like: 99 % of WLAN access points are on 24/7, average video streaming is 2 h/day, other traffic another x min. So a fraction of the idle consumption should be included in the per-video numbers. – cbeleites unhappy with SX Dec 28 '19 at 00:16
  • The same should apply to cell towers: there are cell towers in remote locations that serve someone only once in a while, and even highly used cell towers probably have peak transmission times and maybe even idle times. And cell towers broadcast with high power in order to get a usable minimum signal to suboptimal locations nearby. – cbeleites unhappy with SX Dec 28 '19 at 00:19
  • @Alice: for the (marginal) power consumption per calculation and per stored/backed-up/retrieved data volume, probably. But: I don't back up the video, because the data center does that for me. Also, I switch off my PC when I'm not using it. The data center is on 24/7 (and probably doesn't even partially shut down during low-traffic hours). Also, it is built much more compactly: where my PC can dissipate its heat with a simple small fan, the data center spends significant amounts of energy on cooling. Also, if I have a lamp on while at the computer, I don't count that as computer-related... – cbeleites unhappy with SX Dec 28 '19 at 00:54
  • ... whereas the data center should do so (we're not looking at marginal power consumption, we're looking at total power consumption in relation to total hours of video streamed). All that being said, the numbers may still be off. – cbeleites unhappy with SX Dec 28 '19 at 00:56