ryao 5 days ago

> The decrease is almost entirely due to gains in lighting efficiency in households, and particularly the transition from incandescent (and compact fluorescent) light bulbs to LED light bulbs

I am reminded of Jevons paradox:

https://en.wikipedia.org/wiki/Jevons_paradox

I replaced all of my incandescent and fluorescent lighting with LEDs years ago. A decent amount of the hypothetical savings from more efficient lighting was eaten by having even more lumens than previously as quality of light upgrades. Despite that, I did not notice much of a difference since my household had been keeping the lights off unless we needed them.

There was a minor dip in the electric bill from other initiatives (e.g. heat pump dryer) and my solar panels started producing more than my household used. For years, I ran relatively few general-purpose computers (not counting embedded devices) in my home compared to others in computing, to try to keep this trend going. In the past few years, I got an electric car and replaced my oil heat with heat pumps. Now my solar panels only produce about 60% of the electricity I use and I have given up on limiting usage to match what I produce.

Anyway, no matter how many efficiency initiatives people adopt, electricity usage is likely to increase rather than drop. That is because we not only find new uses for electricity, but the population keeps growing.

  • roywiggins 5 days ago

    > Anyway, no matter how many efficiency initiatives people adopt, electricity usage is likely to increase rather than drop. That is because we not only find new uses for electricity, but the population keeps growing.

    It seems that energy use in the US peaked in 2007, though I do wonder how much of that is down to moving manufacturing abroad.

    https://www.eia.gov/energyexplained/us-energy-facts/

    In general energy use in the US, Canada, and Europe looks to be declining:

    https://ourworldindata.org/explorers/energy?tab=chart&time=2...

    • ryao 5 days ago

      Energy use applies to far more than just electricity. It includes petroleum used in automobiles and heating. While the total energy pie is decreasing (for now), the market share of electricity should be increasing.

      When there is no more energy usage to move to electric cars and heat pumps (and no more production to move to China), I would expect energy usage to start increasing again. Jevons paradox cannot be avoided forever.

      • pfdietz 5 days ago

        Jevons paradox is not some universal law. It's not even clear if it applies very often. One has to separate out the different reasons for increasing consumption of some resource as it gets cheaper, and not just observe a correlation and infer causation.

        • ryao 4 days ago

          It is related to supply and demand curves. The more supply is available for the same cost, the more demand increases to use it. In any case, technology moves forward, stimulating increased electricity usage, and we are healthier because of it. My heat pump and electric car eliminate nitrogen oxide and sulfur oxide emissions, as well as particulate emissions. Respiratory health is improved because of it.

          • Retric 4 days ago

            That’s only true up to a point. PC prices fell dramatically in real terms over time, as people no longer needed to spend $2,000 in mid 90’s dollars to get vastly more performance at home.

            But more interesting is the trend of upgrading less frequently.

            • pfdietz 4 days ago

              > mid 90's dollars

              To be precise: inflation from then to now is just about 2x. So, that would be $4K in current dollars.

            • ryao 4 days ago

              When prices fell, sales increased.

              • Retric 4 days ago

                Not in a way that makes those trends correlate. Worldwide PC sales peaked in 2011, but they continued to get cheaper in real terms.

                Sales trends for new technology should be broken down into new customers and replacement devices for existing customers if you want to understand what’s going on. Falling prices did little for either trend, but replacements were providing an echo of earlier growth, until that was offset by people taking ever longer to buy replacements.

                • ryao 4 days ago

                  2011 is long after the major price drops of the 90s and early 00s, where demand kept increasing with price drops. Not only that, but every few years, you could buy a computer that was a few times better than the one that preceded it at a lower price.

                  PC sales began to drop in 2012 due to the iPhone and iPad causing people to not need PCs as much. Improvements in PC technology had also started to slow and would reach a crawl a few years later. All of this meant market saturation. Supply and demand still apply, but increases in sales volume from lower pricing are much more limited than they used to be.

                  For anyone who does not remember, on May 7, 1997, Intel introduced the Pentium II at 300MHz at a cost of $1981. On March 8, 2000, Intel introduced the Pentium III at 1000MHz at a cost of $990. It was roughly 3 times faster on average, while being half the price of the older processor that launched less than 3 years prior. The performance had jumped so much that unfortunately no one seems to have included the two chips in a comparison. Anyway, no three year period from 2011 onward had performance jump so much while pricing halved. The closest was the Ryzen 7 1800X to the Ryzen 9 3950X, but the performance increase was more tame (~2.5x best case compared to ~3x average case) and the price went up by 50% instead of halving. However, you can see similar improvements if you go further back in time, such as 3 years before the Pentium II launch to the 100MHz Pentium (I cannot find how much that cost, sadly).

                  • Retric 4 days ago

                    I noticed a vastly larger jump between HDDs and SSDs, which hit for most people after 2011, than between PIIs and PIIIs. People generally buy systems, not individual components, and in practice it’s mostly bottlenecks that matter. PIIs and PIIIs had the same RAM latency as earlier and later CPUs because that’s bound by the speed of light, thus the explosion in cache size.

                    > 2011 is long after the major price drops of the 90s and early 00s

                    Not in terms of actual average PC prices, which are down ~40% from their 2011 average. That’s a big discount in real-world prices. Your price comparison on $1k+ CPUs is irrelevant when so few of them entered customers’ hands at those prices. Sure, nVidia is selling $2,000 GPUs today, but the average PC is ~$650, and that’s the number that matters.

                    • ryao 4 days ago

                      Being down 40% is nothing compared to how you used to be able to get a PC that was many times more capable than one from a few years earlier for half the price.

                      As for NAND flash SSDs, the Intel X25-M came out in 2008:

                      https://www.anandtech.com/show/2614

                      • Retric 4 days ago

                        Being available isn’t becoming mainstream. Most new computers in 2011 still had HDDs due to cost.

                        Similarly, the average price people actually paid for a PC didn’t suddenly drop by half over 3 years as you proposed. Over 15 years, that kind of price decline would mean people went from $4k in 2025 dollars to $125, which simply didn’t happen.

                        I could cherry-pick numbers suggesting opposite trends. nVidia’s top-of-the-line card more than doubled in price when adjusted for inflation between 1999 and now. But back then most people didn’t even buy 3D graphics cards, and today integrated graphics is by far the most common.

  • mmooss 5 days ago

    My power usage has dropped by maybe 50% (a bit of a guess, based partly on electricity bills), and quality of life has increased - I simply didn't consider that these options worked better overall until I tried them for power-saving purposes.

    It actually taught me a lot about innovation - there is no substitute for just trying things; you just can't know by thinking about them ahead of time. A picture is worth 1,000 words, and an experience is worth 100,000 pictures - you can't really convey it in pictures. As a result, I 'just try things' whenever I have the opportunity, and my learning rate has increased dramatically.

    There are two questions about efficiency upgrades:

    How much power do you use compared to the alternative? It may increase, but more efficiency means it increases less than the alternative.

    How much power do you use absolutely? Physics causes climate change, and doesn't care about the hypothetical alternative.

    • ryao 5 days ago

      My household energy usage has decreased, but my household electricity usage continues to rise.

      In the early days, our heating oil usage was 1000 gallons per year. Efficiency initiatives reduced that to about 430 gallons per year. 1000 gallons is about 41MWh. 430 gallons is about 17MWh. Going to the heat pump has me using about 7MWh extra electricity per year, while yearly electricity usage had been around 11 MWh per year (with 10 MWh produced by solar). This does not count the car, for which the numbers would be even more skewed by life changes since I drove about 1 to 2 orders of magnitude more when I was in college.
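      For anyone checking the arithmetic, those gallons-to-MWh figures line up with standard heating values. A quick sketch (the ~138,500 BTU/gallon figure is a typical nominal value I'm assuming, not a number from this thread):

```python
# Sanity check of the gallons-to-MWh conversion, assuming a nominal
# heating value of ~138,500 BTU per gallon of No. 2 heating oil.
BTU_PER_GALLON = 138_500
BTU_PER_KWH = 3412.14

def oil_mwh(gallons):
    """Chemical energy content of heating oil, in MWh."""
    return gallons * BTU_PER_GALLON / BTU_PER_KWH / 1000

print(round(oil_mwh(1000), 1))  # 40.6 -- matches the ~41 MWh above
print(round(oil_mwh(430), 1))   # 17.5 -- matches the ~17 MWh above
```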

      Depending on how you decide to do the accounting, my household’s total energy usage dropped by 65% (if we count the oil usage reductions on top of the heat pump) or by 35% (if we count only the heat pump), without even counting the solar panels. Still, my electricity usage has never been higher.

      If you were to try to cook my books so to speak, you could say my household’s total energy usage has decreased by 85% with an electricity usage decrease of 27%, by treating the energy from the solar panels as if it were free. I do not think that is a correct way of doing accounting (although it could be a matter of opinion).

      By the way, remarks on climate change could encourage people to claim unrealistic improvements in personal/household energy usage, such as the figures I gave for what I could claim if we “cooked the books”. Of the various figures I gave, I think the 35% reduction in total energy usage is the most honest figure. It had been achieved in the past 2 years, unlike the other factors I mentioned that are many years old.

      • _aavaa_ 5 days ago

        Thou hast sinned against the 2nd law of thermodynamics!

        1,000 gallons of heating oil have ~41 MWh of chemical potential energy (if using the HHV). Those 41 MWh are not directly comparable to 7MWh of electricity. While the units (MWh) are the same, they are measuring two completely different things.

        They have as much in common as an American dollar and a Jamaican dollar; yes they're both "dollars", but they refer to different things.

        Repent and sin no more.

        • ryao 4 days ago

          Energy usage wise, they are the same. That is why the units match after conversions. In any case, I replaced substantial energy usage from heating oil with less substantial use of electricity. Here is a fun fact. It is more energy efficient to run a heat pump off electricity from a commercial generator burning diesel to heat a home than it is to burn the diesel directly. This includes grid losses. The reason for this is that heat pumps exceed 100% efficiency because the environmental energy moved is free as far as accounting is concerned. In my case, the electricity is produced by burning natural gas rather than diesel.
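          That fun fact can be sketched with assumed round-number efficiencies (none of these are measured values; the boiler figure is the 86% cited elsewhere in this thread):

```python
GEN_EFF = 0.40     # assumed electrical efficiency of a large diesel generator
GRID_EFF = 0.95    # assumed ~5% transmission/distribution losses
COP = 3.0          # assumed heat pump coefficient of performance
BOILER_EFF = 0.86  # boiler efficiency cited in this thread

fuel_mwh = 1.0  # 1 MWh of chemical energy in the fuel
heat_via_heat_pump = fuel_mwh * GEN_EFF * GRID_EFF * COP  # ~1.14 MWh of heat
heat_via_boiler = fuel_mwh * BOILER_EFF                   # 0.86 MWh of heat
print(heat_via_heat_pump > heat_via_boiler)  # True: the indirect route delivers more heat
```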

          By the way, I realize my numbers are borderline in showing that, but the past year has been colder than years where I burned oil, and it is hard to account for that when looking at numbers as there is no strict control.

          • _aavaa_ 4 days ago

            They are not the same, though. This conflation leads to issues: prominent people (see Vaclav Smil) make the case that it will be too difficult to transition away from fossil fuels because they conflate these two things. Some do it out of ignorance; others know better and do it because it is in their financial interest.

            As long as we conflate the two, people will more easily be misinformed and think they need to replace their 41 MWh of heating oil with 41 MWh of electricity. But they don't. They need at most 41 MWh of heating. And as you said, your heat pump is probably getting you an average COP of at least 3, meaning you will need to pay for fewer "MWh" in order to get the same amount of heat to your house.

            It is more efficient, just as it's more efficient to charge an EV off a grid running on natural gas than it is to burn petrol in an ICE. Both are also far better for pollution.

            • ryao 4 days ago

              The two are the same as far as my computation of my household’s reduction in energy usage is concerned. That is why I converted to the same units.

              As for needing 41MWh of heat, that is incorrect as my boiler was only 86% efficient and the one it replaced was even less efficient. It is also incorrect as the efficiency gains had reduced oil usage to 18MWh. Heat wise, I only need around 15MWh per year (although I likely needed more this year since it was particularly cold).

              I have a suspicion that the ducted method of heat delivery used by my heat pump has more losses than the hot water system previously used to deliver heat. I had been sealing the central AC ducts in the winter to save a few hundred gallons of oil usage. I can no longer do that as heat is delivered via those ducts now.

              • _aavaa_ 4 days ago

                I don’t disagree with you about furnace efficiency.

                But my point still stands: You needed 18MWh of oil or 15MWh of heating. Neither of those numbers are how much electricity you will need to run a heat pump.

                • ryao 4 days ago

                  Dividing 15MWh by the average COP should determine it, although heat demands change from year to year and temperatures vary, causing not only the amount of heat needed to vary, but the average COP to vary too.
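                  Using the thread's own numbers (15 MWh of heat, and the ~7 MWh of extra electricity reported upthread), the implied seasonal COP works out as follows. This assumes those two figures line up cleanly, which the year-to-year weather variation makes uncertain:

```python
heat_needed_mwh = 15.0  # yearly heat demand, from the comment above
heat_pump_mwh = 7.0     # extra electricity reported upthread

implied_cop = heat_needed_mwh / heat_pump_mwh
print(round(implied_cop, 1))  # 2.1 -- a plausible cold-climate seasonal COP
print(heat_needed_mwh / 3.0)  # 5.0 -- electricity needed at an assumed COP of 3
```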

              • 11101010001100 4 days ago

                Ask yourself what is the difference between a heat pump and an electric resistance heater?

                • ryao 4 days ago

                  This does not matter for total energy usage calculations unless you consider the environmental heat to be an input, but as far as the industry is concerned, it is free, which is why the COP for heat pumps is greater than 1.

            • mech9879876 4 days ago

              If you were comparing gasoline in a car to an EV, I would maybe see what you are talking about: engines are like 30% efficient, so the conversion to useful energy requires a large multiple of the potential energy.

              But in the case of an oil- or gas- fired furnace, their thermal efficiency is at least 80%, and often more, so their potential energy usage is close enough to directly comparable to their heating value.

              • ryao 4 days ago

                Mine was 86% efficient, although for accounting purposes, I considered the waste energy to be part of energy consumption, which seems to be the most sensible way of doing it.

              • _aavaa_ 4 days ago

                I don’t know that I disagreed with you. Yes it might be close to their heating value but it’s fundamentally measuring something different.

                • ryao 4 days ago

                  Energy usage does not care whether any of the energy is wasted. If it did, a 100W incandescent light bulb would only use 1W of power.

        • schiffern 4 days ago

            >Thou hast sinned against the 2nd law of thermodynamics!

            >...Repent and sin no more.
          
          Sure.

          What is a specific edit to the post that could have been made to avoid your wrath? What text, exactly, should have been written? Thanks.

          • 11101010001100 4 days ago

            Ryao is comparing a heat engine to a heat pump. Thermodynamically different objects.

            • ryao 4 days ago

              Where did I compare a heat engine to a heat pump? An oil boiler certainly is not a heat engine.

      • mmooss 4 days ago

        FWIW I'm not cooking books or shifting between energy sources. I changed habits and use much less energy total.

        • ryao 4 days ago

          I did not mean to imply you did. I just stated that my own data on my energy usage shows a decrease, but how much depends on how the accounting is done, and I can “cook the books” to produce some truly absurd figures by considering savings from the not so recent past, and treating my solar panels as causing a net decrease.

  • stephen_g 5 days ago

    Electricity use went up in your case, but switching from oil heat and an internal combustion car to heat pumps and an EV should mean that your overall energy use has gone down fairly significantly (down to 1/3 or 1/4 of the energy used on heating and driving).

    So that's not quite the Jevons paradox unless you're going to drive three times the distance or expand to heating four times as much space in your house.

    • ryao 4 days ago

      Overall energy efficiency has improved by about 35% in my napkin math, but I find that whenever I improve efficiency, my electricity usage increases. This applies to both electric and non-electric sources of energy, as something new always seems to follow the efficiency gain. The reduction in overall energy usage is a side effect of the heat pump's huge efficiency advantage (although on a dollar basis, costs remain about the same) and is an anomaly.

    • pfdietz 5 days ago

      If food gets 10x cheaper I'm sure I'll eat 10x as many calories! /s

      • ryao 4 days ago

        Food demand is relatively inelastic.

  • SilasX 5 days ago

    Yours seems like an unusual case. Lighting is something that’s least vulnerable to Jevons effects because (after electrification reached maturity) people already use all the artificial light they could ever want and don’t look for ways to use more when it becomes cheaper (more lighting per unit cost). Even if they’re less careful about turning off unused light, LEDs are so much more efficient that they will still use less energy for lighting.

    In contrast, energy itself is very vulnerable to Jevons effects because there is always a marginal use of energy it can be applied to as more is freed up.

    • ryao 4 days ago

      My thinking is that we should look at the electricity usage rather than the subset of electricity used by lighting, since the bill going down because of lighting means the bill can go up because of something new.

      • SilasX 4 days ago

        Oh yes, I agree with the general point, that the Jevons non-effect in lighting is canceled by the Jevons effect in general energy usage, and that we can't expect energy efficiency improvements alone to reduce the latter.

        So there's an important point here: the benefit of energy efficiency improvements is not that they will get total energy usage down by themselves (and thereby hit GHG targets), but rather that they will reduce the total hit to consumption/utility that we experience when regulations disincentivize (GHG-emitting) energy usage.

    • jcelerier 4 days ago

      > people already use all the artificial light they could ever want

      Huh ? People put LEDs in so many more places than they were putting incandescent lights. Under desks, beds, in closets, all around walls, over furniture, behind TVs...

      • SilasX 4 days ago

        Hm, I'd never noticed that phenomenon, but even so, the efficiency gain is so high that, even when you enumerate all those spaces, it doesn't cancel out the energy savings of the shift.

        • crusty 3 days ago

          Also, I've noticed that during the transition, when people have a mix of incandescent and LED, they internalize the potential efficiency of the LED, and then use the incandescent bulbs in a similar way. So the incandescent light on the hall table that's more appealing gets left on for longer periods as if it were an LED.

          I'm guessing this is more common in households with legacy light fixtures and legacy nostalgias, though, and that's its own diminishing set.

  • p0w3n3d 4 days ago

    I can't tell whether this is the same as the Jevons paradox, but I can observe that once some economic behaviour is imposed on or incentivized for people, the government taxes the object of that behaviour more. Also, once the government promises a subsidy, the cost of the subsidized good rises by the amount of the subsidy. I would name it the Greedo Paradox.

  • hn_throwaway_99 4 days ago

    I highly doubt that Jevons paradox is going to apply much here (and the data in the article seems to show this), simply because lighting was already cheap enough so that most people never gave a second thought to using lightbulbs (indeed, ask any parent that has ever yelled "turn off the lights" to their kids). Jevons paradox applies when there is strong demand for a good, but it is too expensive to be widely deployed, but a decrease in price of that good then allows many more uses of it. While sure there may be some edge cases (a sibling comment mentioned big flood lights for their yard), it's not like most people had dark rooms before where they thought "darn, if only light were cheaper"...

    > A decent amount of the hypothetical savings from more efficient lighting was eaten by having even more lumens than previously as quality of light upgrades

    I'm not sure what you mean by "quality of light upgrades". To be honest, I hate the quality of light that LEDs give off. It's too "white bright", and I much prefer the quality of incandescent bulbs. I usually keep LEDs much dimmer than they can go because otherwise I feel like there are floodlights in my house. Relatedly, I hate the switch to LED headlights. They're much too bright for oncoming drivers, and many car brands position them much too high.

    • ryao 4 days ago

      It was a pun. It should have been quality of life. Our rooms are better lit now, as the lumen output is higher.

      On a more serious note, the CRI improved when going from fluorescent lighting to LEDs, when better quality LEDs were used. Thus quality of light is actually meaningful as more than a pun.

  • aqueueaqueue 5 days ago

    Yeah I have a couple of 150w leds flooding the yard. No way I'd have done this if I needed a 1kw halogen. Would have made do with a dark garden.

roenxi 5 days ago

> The decrease is almost entirely due to gains in lighting efficiency in households...

The article is an interesting treatment of how lighting is getting more efficient and well worth a read. But pedantically zooming in on this one throwaway phrase for a second... this is a misinterpretation of the data on 2 levels.

1) The (badly labelled) graph seems to be displaying a very very slight linear uptrend for "residential".

2) Energy is literally the first example of where we expect to see Jevons paradox [0]. If its use is going down, that is because energy is getting more expensive in real terms. If the only trend here was lighting getting more efficient, households on aggregate would find ways to use more electricity because it is extremely fungible.

By default the proper way to interpret the data (if for the sake of argument I say what I would interpret as a slight uptrend is actually a downtrend) is that electricity is getting more expensive real terms. The impact that has on living standards is cushioned somewhat by improvements in lighting efficiency. But if electricity costs were steady and lighting efficiency improved we'd expect to see an increase in electricity use.

[0] https://en.wikipedia.org/wiki/Jevons_paradox

  • hansvm 5 days ago

    > Jevons paradox

    That depends on the demand elasticity of the thing in question. If everyone already had about all the lights they wanted in their homes (a somewhat reasonable assumption, give or take a proportion of families turning off lights religiously), making them cheaper wouldn't make you go out and buy more lights -- certainly not enough to make up for the cheaper cost.

    Stated somewhat differently, when you switched from incandescents to LEDs, did you add more or less than 8x the total lumens to your home? I'm guessing quite a bit less. My apartment is a little small, but I couldn't manage 8x if I doubled the brightness in every room and always left every light on. No lighting efficiency improvements are going to convince me to go beyond that threshold.
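    For concreteness, the ~8x figure follows from typical efficacies of roughly 15 lm/W for incandescent and 120 lm/W for LED (assumed ballpark values, not numbers from this thread):

```python
# Assumed typical efficacies; their ratio is the ~8x in the comment above.
INCANDESCENT_LM_PER_W = 15
LED_LM_PER_W = 120

def watts(lumens, lm_per_w):
    return lumens / lm_per_w

old = watts(800, INCANDESCENT_LM_PER_W)  # one ~800 lm incandescent: ~53.3 W
new = watts(800 * 8, LED_LM_PER_W)       # 8x the lumens on LED: ~53.3 W
print(round(old, 1) == round(new, 1))    # True: only at 8x the lumens do you break even
```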

    The point about households on aggregate finding ways to use more electricity doesn't apply because it's not power those houses now have in comparative excess, but rather dollars -- you don't have to buy as much power to do the things you want, so you have more dollars and can choose to spend them on additional electricity consumption or massively inflated rent or a vacation or whatever. The _unit cost_ of electricity (note that Jevons paradox applies to unit costs, not substituted goods or alternative sources of income or savings) didn't change by making lighting more efficient (and, as you've noted, has gotten more expensive), so you're not incentivized more than you already were to spend those dollars on electricity consumption instead of other things you could buy instead.

  • taeric 5 days ago

    I think this just doesn't come to terms with how much more efficient modern lights are?

    I remember when folks were resisting LED lights at the start. Folks would literally promote turning off the lights earlier to save energy. Remember back when making sure the lights were out was a big deal?

    Turns out, 60-100 watts down to 10 is just ridiculously hard to come to terms with. Turning off the lights early just doesn't compete. Not even close.
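    As a rough illustration with assumed usage (a single bulb run 4 hours a day, not a figure from the thread):

```python
DAYS = 365

def annual_kwh(watts, hours_per_day):
    return watts * hours_per_day * DAYS / 1000

swap_saving = annual_kwh(60, 4) - annual_kwh(10, 4)  # replace 60 W with a 10 W LED
turn_off_saving = annual_kwh(60, 1)                  # turn the 60 W bulb off 1 h earlier
print(round(swap_saving, 1), round(turn_off_saving, 1))  # 73.0 21.9 (kWh/year)
```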

    This also ignores how much more efficient other things are. Televisions would be an amusing one. It isn't as dramatic, sure, but it is about a quarter of the energy?

    • dylan604 5 days ago

      > I remember when folks were resisting LED lights at the start.

      At the start, LEDs were horrible. There were early versions where you could point your cell phone camera in video mode at the lights and see them blinking. We did that just to prove that some of us could see these lights strobing. Now they have such high flicker rates that you'll never see them, except maybe with the highest of high-frame-rate cameras that mere mortals will never own.

      I would be curious as to which was worse between the early LEDs and CFLs. I hated both.

      • taeric 5 days ago

        To be sure! Early low-flow toilets were also pretty bad. Modern ones are a whole new thing, though.

        It's a lot like battery technology. It progressed a ton with people not realizing it. If you said I could get a battery powered lawn mower, I'd have assumed you meant as a toy not even a decade ago.

        • dylan604 5 days ago

          I feel the same way about all of the generative AI stuff that keeps getting posted here. Some of it is just laughable at how bad it is that you have to seriously ask why would they show it off to people. Maybe in the next decade, we'll find the road out of uncanny valley.

          There's a risk of people thinking the early stuff is so bad that it must be a joke, but the people working on it have to promote what they are doing, hoping for name recognition/funding. Sometimes it pays off to be first with low quality that matures; other times it's a death knell.

    • ryao 5 days ago

      Decreasing time for lights on has a linear impact on power usage. Increasing the efficiency of the lights has a constant factor impact on power usage. It is a big constant, but stopping particularly egregious wastes of electricity can overcome it. That said, both at once is the best of both worlds.

      • taeric 5 days ago

        I get what you are aiming at, but strictly, both have a linear relationship here?

        More, you can only cut out so much of the time, and since you can run a modern light for many hours on the energy you would save by turning an incandescent off an hour early, it is kind of silly at a personal level to think that will work out. (City level and larger, sure?)

        Pulling it back to this article, the assertion is that getting more efficient with energy use for lights would just be offset by us using more lights. Which, I think it is fair to say that we do. Considerably more lights, in many instances. It doesn't completely negate the energy savings, but largely because of how big that energy savings was.

        • ryao 4 days ago

          As far as the integral is concerned, the rate is a constant while time is a linear variable, so the improvement from LEDs is a constant time improvement and the improvement from running the lights less is a linear time improvement. Otherwise, we are in agreement.

          That said, I do think that there is an incentive to spend the energy savings on things other than lighting.

          • taeric 3 days ago

            But the power use is also constant over time t, such that any reduction in time use is effectively a constant drop in cost. You are literally shrinking either the width (time) or the height (electricity use) of the curve, as it were.

            For your point to be stronger, you would have to show that using lights for longer makes them less efficient, such that you increase the unit cost of electricity over the time variable. Which may indeed be the case. The naive model, though, is just Cost = (electrical use) * (time), and reducing time is the same as reducing electrical use via efficiency.
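            That naive model can be stated as a toy check (illustrative numbers only):

```python
def energy_wh(power_w, hours):
    # Naive model from above: E = P * t
    return power_w * hours

base = energy_wh(60, 4)               # 240 Wh
print(energy_wh(30, 4) == base / 2)   # True: halving the power halves the energy
print(energy_wh(60, 2) == base / 2)   # True: halving the time does the same
```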

    • roenxi 5 days ago

      The Jevons effect has nothing to do with how efficient something gets. Lightbulbs could be free to build & operate and it'd still trigger Jevons. Electricity use won't drop unless it got more expensive to produce electricity. Or, ironically, unless electric goods suddenly become much less efficient for some reason.

      It is built into the laws of supply and demand. If you want to argue that the Jevons effect won't apply you basically have to argue that the supply/demand curves are funky. In this case there is no reason to believe they are.

      • dahart 4 days ago

        > The Jevons effect has nothing to do with how efficient something gets.

        Why do you say that? From the first sentence of the link you posted: “In economics, the Jevons paradox occurs when technological advancements make a resource more efficient to use (thereby reducing the amount needed for a single application); however, as the cost of using the resource drops, if the price is highly elastic, this results in overall demand increases causing total resource consumption to rise.”

        It has everything to do with efficiency, as higher efficiency is what makes it a paradox. It’s not even interesting, let alone a paradox, if we ignore efficiency and talk about lower prices leading to increased demand.

      • drdeca 5 days ago

        Huh?

        Why would the demand curve for lighting not be “funky”?

        If someone offered to pay for all of your home lighting, no matter how much power your lighting took, how much lighting would you get? Presumably a bounded amount! But in this case, the price you are paying for power for lighting is zero. So, reducing the energy use per amount of lighting would presumably not increase the amount of lighting you use beyond this limit, and therefore would decrease the energy you use for lighting.

        Where’s the hole in this argument?

        • roenxi 5 days ago

          > Where’s the hole in this argument?

          The unrealistic assumptions - the argument started by assuming that resources are unlimited. Most of economics is expected to break down from that starting point. The point of all the models is theorising about how people will distribute limited resources.

          It is like talking about the market for air or trying to measure its supply/demand curves. They are completely degenerate because it is too abundant for anything meaningful to be said.

          • drdeca 4 days ago

            The unrealistic assumption of someone offering to pay the part of your electricity bill that comes from lighting, no matter how big? Ok, yes, that’s clearly unrealistic, but, the cost of this presumably wouldn’t be like, totally insane, right?

            Of course, I don’t mean that in realistic scenarios the energy used for lighting would drop by exactly the proportion given by the efficiency gain; that would only happen if you had no demand for more lighting than you already have, even if it were free. But that doesn’t mean energy use would necessarily increase, just that it wouldn’t fall as far as it would if you kept consuming the same amount of lighting.

      • taeric 5 days ago

        I mean, the Jevons paradox is that increased efficiency can lead to increased use? No?

        So, the idea would be that being able to more efficiently use electricity for running lights might still see us use more electricity for lights as we use them in more and more places. And, at a personal level, that kind of tracks. We don't hesitate to put lights in places that we used to accept as dark.

        You certainly see this with televisions. The dramatic increase in efficiency afforded by new display technologies has led us both to put screens everywhere and to buy ever-larger televisions.

  • clcaev 5 days ago

    Is Jevons applicable here? People only have a fixed square footage in their house that needs to be lit, and often negative utility to having rooms lit all of the time.

    • formerly_proven 5 days ago

      If electricity were cheaper you might turn the lights up instead of balancing cost against comfort, skip the eco-mode on the dishwasher that occasionally leaves dishes dirty, not think twice about washing clothes at 40 °C instead of 30 °C, use a dryer instead of clothes racks blocking the living room for a day, use the more comfortable tankless water heater, properly preheat the oven to get the results you want, and so on. The list is endless.

      But electricity often costs upwards of 30 cents/kWh nowadays, so you avoid doing all those comfy things. 'cause they're expensive.
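
      A rough annual figure for just one of those habits (the kWh-per-load, loads-per-week, and the cheaper comparison rate are assumptions, not measurements):

```python
# Back-of-envelope: what one "comfy" habit costs per year at an expensive
# vs. a cheap electricity rate (all usage figures assumed for illustration).

DRYER_KWH_PER_LOAD = 3.0   # assumed typical vented dryer
LOADS_PER_WEEK = 4         # assumed

def annual_cost_usd(price_per_kwh):
    return DRYER_KWH_PER_LOAD * LOADS_PER_WEEK * 52 * price_per_kwh

expensive = annual_cost_usd(0.30)  # ~30 cents/kWh
cheap = annual_cost_usd(0.11)      # a cheaper US rate, assumed

assert expensive > 2.5 * cheap  # the rate gap alone changes the calculus
```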

      • mikeyouse 5 days ago

        My power is still $0.11/kWh... I haven't turned off my Christmas lights in 3 years. There are huge swaths of the US where power is still (relatively) dirt cheap and nobody thinks twice about the heavy-soil cycle on the dishwasher or leaving landscaping lights on.

        This list matches my experience: https://www.electricchoice.com/electricity-prices-by-state/

      • dworkr 5 days ago

        If I have to spend more than 30% of my monthly budget on power, I will not be taking cold showers or living in the cold. Consuming energy replaces other hobbies. High energy prices have been normalized, at least in my state. Same with gas. People had to stop caring, or leave.

    • edflsafoiewq 5 days ago

      An HOA I know wants every house to keep more lights on all night (for "safety"). They explicitly say LEDs are what make this cost-effective.

      • dmix 5 days ago

        How is that enforced in practice?

  • devwastaken 5 days ago

    LED lighting is both better and worse.

    - purposely driven at line frequency (60 Hz), which means it's actually constantly blinking. You can see this with LED Christmas lights by moving them.

    - packed with electrolytic capacitors designed to fail well before the rest of the board.

    - thermals are too hot and fry the diodes or the rectifier ICs.

    - wrong colors sold everywhere. A proper 4000 K, 90+ CRI LED is hard to find and more expensive. The two most commonly available are 2700 K (yellow) and 5000 K (blue).

    • adgjlsfhk1 4 days ago

      Why do you want 4000K? Incandescent/halogen are 2700K/3000K, and sunlight is 5000K. 4000K is fluorescent, and I don't think anyone wants lighting to look like that. That said, if that's what you want, Lowes has it (technically 3500K at CRI 90): https://www.lowes.com/pd/GE-Pro-60-Watt-EQ-A19-Bright-White-...

      Also, only the absolute cheapest LEDs have noticeable flicker. Decent ones have big enough capacitors to make it unnoticeable.

    • HankB99 4 days ago

      Wasn't planned obsolescence invented with the incandescent bulb?

      The apple doesn't fall far from the tree.