• 0 Posts
  • 19 Comments
Joined 1 year ago
Cake day: July 19th, 2023

  • TL;DR: Depends on what you mean.

    Long version:

    Disclaimer: I’m not an expert by any means, I haven’t vetted the links properly (or at all), they’re mostly there for illustration and if you want to read further. Also, the last time I actually read up on this is quite some years ago, so stuff may have changed in the industry and/or my memory on specifics is foggy. Many of the links lead to Tesla sources since I first looked into this topic back before Musk made it known to the public that he’s an insufferable human being.

    In modern EVs, batteries are usually structurally integrated into the chassis, since that yields space (and often weight) savings and is easier/faster to do in manufacturing.

    With that knowledge, it is safe to assume that replacing a car’s battery is a difficult or next to impossible task, outside of end-of-life reuse.

    But this is actually where it gets interesting, since EV batteries last many years anyways: What happens when the car’s time has come?

    Well… the batteries can be reused. It’s not a trivial process, and there are several ways to do it, but the most intuitive explanation I’ve found is this: in raw ore, lithium and other metals are present at maybe 0.1–1% of the material. In batteries, it’s maybe 99% reusable, expensive material. Even if you call it 90% due to inefficiencies in recovery, or whatever, it’ll still make way more sense financially to work with old batteries – once you have the process figured out and automated machinery in place to get it done.
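    To put rough numbers on that intuition – a tiny sketch, with the percentages taken straight from the comment above (illustrative, not vetted industry figures):

```python
# Back-of-the-envelope comparison: recoverable metal per tonne of feedstock,
# raw ore vs. spent battery packs. Both fractions are the (unvetted)
# illustrative numbers from the comment, not real industry figures.
ORE_FRACTION = 0.005      # assumed midpoint of the ~0.1-1% metal content in ore
BATTERY_FRACTION = 0.90   # ~90% recoverable material after recovery losses

def recoverable_kg(feedstock_kg: float, fraction: float) -> float:
    """Mass of usable material recovered from a given mass of feedstock."""
    return feedstock_kg * fraction

ore_kg = recoverable_kg(1000, ORE_FRACTION)          # 5 kg per tonne of ore
battery_kg = recoverable_kg(1000, BATTERY_FRACTION)  # 900 kg per tonne of cells
print(f"ore: {ore_kg:.0f} kg/t, batteries: {battery_kg:.0f} kg/t "
      f"(~{battery_kg / ore_kg:.0f}x richer feedstock)")
```

    Even with these crude assumptions, the spent-battery feedstock comes out two orders of magnitude richer per tonne – which is the whole financial argument.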

    All that assumes total destruction of the existing cells, which, depending on their state, may not even be necessary at all. In fact, it looks like more than 80% of batteries may not need any of that. Wow!

    And we all know the best way to ensure companies actually do something is for the financial incentives to align with the goal. Here, it’s in their best self-interest to be able to do this – and to actually do it.

    So: Replaceability per car – eh, doesn’t look too great. Replaceability across the industry? Perfect.






  • Yes, it does. With full access to the disassembled hardware, and given enough research time and resources, they could do practically anything: emulating the Secure Enclave chip with a “fraudulent” version, replacing the firmware running on any semiconductor in the phone, isolating the storage… I don’t know the details, but let your imagination loose.

    Physical, uninterrupted access is unlikely for most people, but it’s bad news for any threat model it does apply to.



  • “there is no chance you would get back to the Intel system and plug it in every 2 hours.”

    “don’t be irrealistic. most laptops in the Macbook price range will have 8 hours of usage in low consumption mode or around 6 or 5 if you need more power.”

    While I completely agree on the repairability front, which is really quite unfortunate and quite frankly a shame (at least iPhones have been getting more repairable, silver lining I guess? damned need for neverending profits), it’s just… not unrealistic.

    That being said, unified memory kind of sucks but it’s still understandable due to the advantages it brings, and fixed-in-place main storage that also stores the OS is just plain shitty. It’ll render all these devices unusable once that SSD gives out.

    Anyhow, back from the tangent: I have Stats installed for general system monitoring, as well as AlDente to limit charging to 80% of maximum battery capacity. All that to say: after around 1.5 years of owning the M2 MacBook Air (which, btw, I’d been waiting to buy since late 2019), I know pretty well which wattages to expect and can gauge its power usage fairly well.

    I’ll try to give a generalized rundown:

    • High-intensity workloads (mostly in shorter bursts for me): typically around 10W. I’ve installed Minecraft before once just to test it, and I get reasonable frames (both modded and unmodded), where it seemed to draw maybe 15W, thus still being able to charge (!) the battery off a 30W power supply. It doesn’t ever really go above 20W as a rule of thumb, and the CPU/GPU will be capable enough for easily 80-90% of the general population.
    • Idle/suspended: unnoticeable. I use my machine every day, with maybe an exception or three per month, but from what I’ve read from others, the battery will dip slightly after a month of standby – I’d assume that’s mostly battery chemistry, though, not actual background usage.
    • Idle/running, light usage (yes, it’s the same category*): it actually depends on the screen brightness (edit: whoops, I’d originally written “size”). Energy consumption from CPU usage is by far the minority portion. I’d say 2–4W, maybe. A really bright screen makes it jump to 8–9W; darker-but-not-minimum brightnesses leave it at… 5W, maybe.

    Given the spec sheet’s 52 Wh battery, you can draw your own conclusions about the actual runtime of this thing by simple division. I leave it mostly plugged in to preserve the battery for when it becomes a couch laptop in around 5–8 years, so I can’t actually speak to real-world runtimes yet – I just know the numbers.
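    Since the comment leaves the division to the reader, here’s a quick sketch using the wattages above and the 52 Wh spec-sheet capacity. This is idealized – real runtime varies with workload, display brightness, and battery wear:

```python
# Idealized runtime: battery capacity divided by average power draw.
# 52 Wh is the spec-sheet figure quoted above; the wattages are the
# commenter's own rough observations, not benchmarks.
BATTERY_WH = 52

def runtime_hours(avg_draw_watts: float) -> float:
    """Hours of runtime at a constant average power draw."""
    return BATTERY_WH / avg_draw_watts

for label, watts in [("light use", 4), ("bright screen", 8), ("heavy load", 15)]:
    print(f"{label:>13}: ~{runtime_hours(watts):.1f} h at {watts} W")
```

    At the observed 4–5W light-usage draw, simple division lands in the 10–13 hour range, which matches the MacBook’s reputation.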

    I didn’t mean for this to come off as fanboi-y as it did. I also really want to support Framework, but recommending it universally – from my great-aunt to my colleagues – is not as easy as it is with the MacBook. Given they’re a company probably 1,000 times smaller than Apple, what they’re doing is still tremendously impressive, but in all honesty, I don’t see myself leaving the ARM architecture anytime soon. It’s just too damn efficient.

    *At least for my typical usage, which will be browser with far too many tabs and windows open + a few shell sessions + a (may or may not be shell) text editor, sometimes full-fledged IDE, but mostly just text editors with plugins.