• Mr_Blott@lemmy.world · 9 months ago

    I’m guessing it’s an EU model. They have all sorts of “eco” modes to pass environmental laws, but you wouldn’t use them IRL

    So yes, it could, but fuck that, stick it on dynamic HDR and drive your eco-friendly-ish car to compensate lol

    • counselwolf@lemmy.world (OP) · 9 months ago

      Is it possible that the local version of Energy Star for my TV used the Eco mode setting for the tests?

      • AggressivelyPassive@feddit.de · 9 months ago

        They usually test whatever the manufacturer says is the default setting, and that most likely happens to be the lowest-power mode, which barely resembles reasonable usage.

    • fuckwit_mcbumcrumble@lemmy.world · 9 months ago

      That was my first guess as well. By default the TV is in an eco mode and would only use around 50 watts, but as soon as you put it in a mode that's actually usable, its power draw will roughly double.

      OP, if you want a worst-case scenario for your TV's power consumption, just look at the power supply. If it's an external brick, look at the DC output printed on the brick and multiply the voltage by the amperage. If you're running it off a battery-powered inverter, then the power factor and the efficiency of the brick come into play, but it shouldn't be too much worse than the absolute highest the brick is rated to output. A rough sketch of the math is below.
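
      Something like this back-of-the-envelope sketch shows the arithmetic; the 19 V / 6.3 A label values, the 85% brick efficiency, and the 0.6 power factor are all made-up assumptions for illustration, not measurements from any particular TV:

      ```python
      # Back-of-the-envelope worst-case power estimate for a TV with an
      # external power brick. The 19 V / 6.3 A values below are hypothetical;
      # read the actual numbers off your own brick's label.

      def worst_case_watts(volts: float, amps: float) -> float:
          """The brick can never deliver more than its rated V * A."""
          return volts * amps

      def inverter_va_estimate(dc_watts: float,
                               brick_efficiency: float = 0.85,
                               power_factor: float = 0.6) -> float:
          """Rough apparent power (VA) drawn from a battery inverter,
          folding in assumed brick efficiency and power factor."""
          return dc_watts / (brick_efficiency * power_factor)

      dc_ceiling = worst_case_watts(19.0, 6.3)  # ~120 W absolute ceiling
      print(f"Worst-case DC draw: {dc_ceiling:.0f} W")
      print(f"Inverter sizing estimate: {inverter_va_estimate(dc_ceiling):.0f} VA")
      ```

      The worst-case number is just the rated ceiling of the brick; the TV will normally draw well under that, so this is only useful as an upper bound (for example, for sizing an inverter).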