Technological feat aside:

Revolutionary heat dissipating coating effectively reduces temperatures by more than 10%

78.5C -> 70C: (78.5 - 70) / 78.5 ≈ 0.108, i.e. about 10.8%. More than 10%, right?!

Well, not really. Celsius is an arbitrary temperature scale. The same values in Kelvin would be:

351.65K -> 343.15K: (351.65 - 343.15) / 351.65 ≈ 0.024, i.e. about 2.4% (???)

So that’s why you shouldn’t express temperature changes as percentages. A more entertaining version: https://www.youtube.com/watch?v=vhkYcO1VxOk&t=374s
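
To make the scale-dependence concrete, here is a quick Python sketch of the same arithmetic (all values are the ones above, nothing extra assumed):

```python
# One physical temperature drop, expressed as a "percentage" on two scales.
old_c, new_c = 78.5, 70.0                      # degrees Celsius
old_k, new_k = old_c + 273.15, new_c + 273.15  # the same temperatures in Kelvin

drop = old_c - new_c                           # 8.5 degrees on either scale

print(f"Celsius: {drop / old_c:.1%}")          # ~10.8%
print(f"Kelvin:  {drop / old_k:.1%}")          # ~2.4%
# Same 8.5-degree drop, two different percentages: the ratio depends entirely
# on where the scale happens to put its zero.
```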

  • RedWeasel@lemmy.world · 7 months ago

    Shouldn’t the math be degrees over ambient? With a 22C ambient that would be (56.5C - 48C) / 56.5C ≈ 15%. Seems like marketing speak.
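
    A rough Python sketch of that idea, assuming the 22C ambient above (78.5C and 70C are the figures from the original post):

    ```python
    # Percentage change of the temperature rise over ambient,
    # rather than of the absolute reading.
    ambient = 22.0                # assumed room temperature, degrees C
    old_c, new_c = 78.5, 70.0     # figures from the original post

    old_delta = old_c - ambient   # 56.5 C over ambient
    new_delta = new_c - ambient   # 48.0 C over ambient

    print(f"{(old_delta - new_delta) / old_delta:.1%}")  # ~15.0%
    ```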

  • 7heo@lemmy.ml · edited · 7 months ago

    I would argue that what makes sense when considering temperature percentages with respect to dissipation is the difference between the old and new temperatures, divided by the difference between the old temperature and the temperature of the system at rest.

    That makes it a ratio of two offsets, rather than a ratio of one offset to a value measured from an arbitrarily defined origin.

    In this case, it is fair to assume the temperature of the system at rest is around 292K, or 19C.

    Which would give: (78.5C - 70C) / (78.5C - 19C) = 14.29%, or (351.65K - 343.15K) / (351.65K - 292.15K) = 14.29%.
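
    The same thing as a short Python sketch (the 19C / 292.15K rest temperature is the assumption stated above; the other figures are from the original post):

    ```python
    # Ratio of offsets: the drop divided by the headroom above the system at rest.
    rest_c = 19.0                 # assumed temperature of the system at rest
    old_c, new_c = 78.5, 70.0

    ratio_c = (old_c - new_c) / (old_c - rest_c)

    # The identical calculation in Kelvin gives the same ratio, because the
    # arbitrary zero of the scale cancels out of both differences.
    rest_k, old_k, new_k = (t + 273.15 for t in (rest_c, old_c, new_c))
    ratio_k = (old_k - new_k) / (old_k - rest_k)

    print(f"{ratio_c:.2%}  {ratio_k:.2%}")    # 14.29%  14.29%
    ```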

    • conciselyverbose@sh.itjust.works · 7 months ago

      GamersNexus presents their temperature testing in terms of difference from room temperature, so this is probably how they’d do this comparison.

      I’m not sure they’d see a reason to cover RAM temperature unless it was approaching actual risk of harm or enabled higher clocks, though. Comparing cases or CPU coolers by temperature makes sense. Comparing GPUs when they’re using the same chip and cooling performance is a big part of the difference between models? Sure. But RAM? Who cares.