
My physics teacher regularly said to our class:

Turning these lights [pointing to the ceiling of the classroom] off and on uses more energy than leaving them on for 30 minutes.

Is this true?

I am not sure about the exact kind of lamp used in the classroom. I guess it was some sort of neon light (it had that characteristic flickering when turning it on). But the exact kind of lamp does not matter so much. I am equally interested in an answer to the following question: "Is there any light source that was in popular use 15 years ago that uses more energy to be turned off and on than leaving it on for 30 minutes?"

user505117
  • Every electric thing I know of has a start-up amperage need that's greater than steady running. I can't say for lights, but for household appliances and garage power tools, it can be 30 to 40% higher. –  Dec 12 '18 at 22:29
  • Related: https://skeptics.stackexchange.com/questions/16760/do-electronics-have-a-startup-cost – Oddthinking Dec 12 '18 at 22:50
  • 2
    I've heard this claim on a frequent basis over the last few decades. – DJClayworth Dec 12 '18 at 22:51
  • 3
    The light he is speaking of is a fluorescent tube. – DJClayworth Dec 12 '18 at 22:56
  • 1
    Not an answer because it's "theory", but it'd be interesting to work out the current draw that would be required for this to be true, and compare to the rating of your circuit breaker. Or, to estimate the temperature increase from dissipating this much power, and compare to the bulb's melting point. – Nate Eldredge Dec 13 '18 at 01:39
  • 1
    Your physics teacher is not worth his salary? Just try to calculate the current that would be drawn during that half second that it is switching on (which is a simple school level exercise to do)! –  Dec 13 '18 at 08:29
  • 1
    I assume he wasn't trying to claim that _repeatedly_ turning it on and off for 30 minutes would draw more power? That's a pretty different claim than doing it once every 30 minutes (and would then depend on the rate of switching). – JMac Dec 13 '18 at 14:49
  • @user505117 I believe your teacher was refering to wear and tear caused to the fluorescent bulbs in repeated switching - shortening the lifetime of the bulbs which are energy intensive to manufacture - compared to just leaving them switched on. He may be quite correct, but since you've yet to specify the bulb type or exact parameters for the question - who can say? – Jiminy Cricket. Dec 13 '18 at 21:02
  • @Duckisaduckisaduck - I am not sure about wear and tear on the fluorescent lamp, but current draw through the starter which is needed to switch a fluorescent tube on may be higher than leaving the switch on. See my comment on Paul Johnson's answer – Chris Rogers Dec 15 '18 at 23:45
  • @JanDoggen How would I calculate that? And where did you get the half second from? I would be keen to see a calculation that gives the answer to my question and would make it the accepted answer in a heartbeat. Other people would also appreciate it, I believe, as it doesn't seem to be readily available online. – user505117 Dec 16 '18 at 16:33
  • @NateEldredge - Consider that a 4-foot 40W lamp draws about 0.3 amps when "warmed up". But the circuit breaker can handle at least 15 amps (US NEC standards) -- 45 times the normal current. And consider that, worst case, it might take 60 seconds to "warm up". This says that the lamp could consume 45 minutes worth of power during its warm-up cycle. So the size of the circuit breaker is not a practical limit on the warm-up cost. – Daniel R Hicks Dec 17 '18 at 01:28
  • @DanielRHicks: Sure. But it's pretty common to have several such bulbs fed from the same 15A circuit, and to turn them all on simultaneously. The fact that this doesn't normally trip the breaker does give us some kind of bound. – Nate Eldredge Dec 17 '18 at 16:16
  • @NateEldredge - Not much of one, especially considering that the breaker will tolerate a brief overload. – Daniel R Hicks Dec 17 '18 at 22:29
  • @DanielRHicks: But not 60 seconds, surely? And if you hypothesize the warmup happening faster, then the current is that much higher. – Nate Eldredge Dec 18 '18 at 00:57
  • @NateEldredge - I forget what the standards say (and I can't find anything through Google), but I'm thinking a standard breaker can tolerate a 2x overload for 5-15 seconds. A 10x overload would trip much more rapidly. – Daniel R Hicks Dec 18 '18 at 01:24

1 Answer


This page at Cambridge University says it's a myth. There is a burst of energy when you turn them on, but it's equivalent to 2 seconds of run time. Also, the light's lifetime is not seriously affected by turning it off and on again occasionally.

The energy consumed to start a typical lamp is the equivalent of 2 seconds running time, so it is wrong to say it takes a lot of power to start them. It is true there is a current surge but this takes place in less than one-eighth of a second and because it happens so quickly it takes very little energy.

The Mythbusters performed a practical experiment to measure the energy required to turn on various lights compared to their steady-state consumption when on. They found the following:

Based on the amount of energy consumed turning on the bulb, they were able to calculate how long the bulb would have to be turned off in order to make it worth the energy savings, i.e. "It's best to turn off the bulb if you are leaving the room for":

  • Incandescent: 0.36 seconds

  • CFL: 0.015 seconds

  • Halogen: 0.51 seconds

  • LED: 1.28 seconds

  • Fluorescent: 23.3 seconds

In other words, it's almost always best to turn the bulb off. Even the 23 seconds for the fluorescent lights isn't very long, and the rest of the times are pretty much blinks of an eye.
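These break-even times are just the startup surge energy divided by the bulb's steady running power. A minimal sketch of that arithmetic (the wattages and surge energies below are illustrative assumptions, not Mythbusters' measured values):

```python
def break_even_seconds(startup_energy_j, steady_power_w):
    """How long the lamp must stay off for switching it off to save energy."""
    return startup_energy_j / steady_power_w

# Assumed figures: a 60 W incandescent whose startup surge costs 21.6 J
# (equivalent to 0.36 s of running), and a 14 W CFL whose surge costs 0.21 J.
print(break_even_seconds(21.6, 60))   # ≈ 0.36 s
print(break_even_seconds(0.21, 14))   # ≈ 0.015 s
```

Anything longer than the break-even time means switching off comes out ahead, which is why even the fluorescent's 23-second figure makes "leave it on" bad advice in practice.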

It is true that switching on/off fluorescents reduces lamp life but lamps are designed to be switched on/off up to seven times a day without any effect on their life. How many times a day do your colleagues switch on/off to save energy? Probably not enough times to reduce the lamp life.

Addressing the last part of the question is harder:

Is there any light source that was in popular use 15 years ago that uses more energy to be turned off and on than leaving it on for 30 minutes?

At the risk of doing some Own Research, some back-of-the-envelope calculations suggest that this is unlikely. A typical 4-foot fluorescent tube consumes 36 watts, and a light fitting often contains two tubes, making a total steady-state power consumption of 72 watts. If the starting process takes 5 seconds, then to consume as much energy in those 5 seconds as during 30 minutes of continuous operation, the fitting would have to draw 72×30×60/5 watts, which is about 26 kW. At the UK standard of 240 volts that would be 108 amps, or about 8 times the power of an electric kettle. For US 110-volt circuits it would require 236 amps. However, fluorescent lights have never required any special cabling to deal with such high currents.
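That back-of-the-envelope calculation can be checked in a few lines (the 72 W twin-tube fitting and the 5-second start are the same assumptions as in the text):

```python
# If a twin-tube fluorescent fitting (2 × 36 W = 72 W steady) really consumed
# 30 minutes' worth of energy during a 5-second start, the implied surge power:
steady_w = 72
run_s = 30 * 60          # 30 minutes expressed in seconds
start_s = 5              # assumed length of the starting surge
surge_w = steady_w * run_s / start_s
print(surge_w)           # 25920.0 W, i.e. about 26 kW
print(surge_w / 240)     # 108.0 A on a 240 V UK circuit
print(surge_w / 110)     # ≈ 235.6 A on a 110 V US circuit
```

Currents that large would melt ordinary lighting cable and trip any domestic breaker, which is the point of the comparison.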

Paul Johnson
  • Off and on, at a relatively slow interval of 1 second, would give you 30 seconds of on time, plus 60 seconds worth of energy for the 30 start up power surges. I'd call that a confirmation. –  Dec 12 '18 at 22:25
  • 1
    @fredsbend The teacher doesn't mean that. They mean (claim) that turning them off and on once uses as much energy as 30 minutes of running. – DJClayworth Dec 12 '18 at 22:29
  • @DJclay I'm going to challenge notability then. Has anyone actually made that claim? I suspect the OP doesn't remember what his teacher actually said. –  Dec 12 '18 at 22:45
  • 9
    I remember people saying this to me frequently when I was a kid. – DJClayworth Dec 12 '18 at 22:50
  • The numbers were quite a bit different with the older starter-type lamps (50 years ago), though even then the claim was likely exaggerated. – Daniel R Hicks Dec 13 '18 at 01:04
  • 3
    The "Cambridge University" page isn't a link to experts in Electronic Engineering who have studied fluorescent lights. It is a link to administrators and co-ordinators who are trying to encourage staff to reduce costs. They reference a video, but that doesn't provide references either. So I am left wondering why I should trust these claims over those of a physics teacher? – Oddthinking Dec 13 '18 at 08:33
  • @Oddthinking I've added a reference to the Mythbusters experiment where they tested this, with results. Not peer reviewed, but seems sound enough and I doubt we're going to find anything much better. – Paul Johnson Dec 13 '18 at 08:41
  • It seems to me that this teacher mixed up some stuff and used "minutes" when the correct wording would be "seconds", if they're taking the Mythbusters experiment into account. Mixing units up like that isn't an uncommon mistake, so I wouldn't be surprised. – T. Sar Dec 13 '18 at 08:45
  • +1: Much better. MythBusters isn't super-rigorous, but their claim is pretty prosaic, so I don't think it needs very strong evidence to accept it. – Oddthinking Dec 13 '18 at 09:31
  • 1
    You might talk about normal lighting circuit breakers that would trip if starting consumed an elephant of electricity. – daniel Dec 13 '18 at 11:20
  • Hmmmm. Look at the figures for fluorescent lighting, which in all probability is what was used in the OP's classroom. If I understand the figures correctly, it is only best to turn the light off if leaving the room for 23.3 seconds or more. What if the room was empty for 15 seconds before the lights were switched on. Is that consuming more power than leaving them on? I am assuming this is because of current draw through the fluorescent lamp starter. – Chris Rogers Dec 15 '18 at 23:38
  • @ChrisRogers - There are several processes involved as a fluorescent light "warms up", and the processes have changed several times over the past 50-60 years. Plus one must also consider "wear and tear" on the lamps, ballasts, and starters if the consideration is overall cost-efficiency. I don't doubt that, at one time, the "break point" may have been on the order of 15 minutes (though newer units are much better). – Daniel R Hicks Dec 17 '18 at 01:21