There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple’s claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won’t be able to use it. There’s a memory requirement for Predictive Code Completion in Xcode 16, and it’s the closest thing we’ll get from Apple to an admission that 8GB of memory isn’t really enough for a new Mac in 2024.

  • rottingleaf@lemmy.zip · 5 months ago

    256MB or 512MB was enough for high-quality content in 2002, so what was that, then?

    Suppose the number of pixels and everything else quadrupled. OK, then 2GB it is.

    But 4GB not being enough? Do you realize how much 4GB actually is?

    • Aux@lemmy.world · 5 months ago

      One frame for a 4K monitor takes about 33MB of memory. You need three of them for the triple buffering used back in 2002, so nearly 100MB of your 256MB went simply to displaying a bloody UI. But there’s more! Today we use compositing, so the more apps you run, the more memory you need just to display the UI. And that’s only what the OS uses to render the final result; your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
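
      A rough back-of-the-envelope sketch of that claim, assuming 8-bit RGBA (4 bytes per pixel) and three buffers:

      # Framebuffer memory for a 4K display with triple buffering,
      # assuming 4 bytes per pixel (8-bit RGBA).
      width, height = 3840, 2160
      bytes_per_pixel = 4
      buffers = 3

      one_frame_mb = width * height * bytes_per_pixel / 1e6  # ~33 MB
      total_mb = one_frame_mb * buffers                       # ~100 MB
      print(f"one frame: {one_frame_mb:.0f} MB, x{buffers}: {total_mb:.0f} MB")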

      I can tell you an anecdote. My partner was making a set of photo collages, about seven artworks to be printed in large format (think 5m+ per side). Those seven collages, with their source material saved on an external drive, took 500 gigs. Tell me more about 256MB, lol.
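
      For a sense of scale, even a single flattened image at that print size is huge. A hypothetical sketch, assuming a 5 m edge at 300 DPI and 4 bytes per pixel (large-format work is often printed at lower DPI, so treat these as ballpark numbers):

      # Hypothetical: pixel count and uncompressed size of one flattened
      # 5 m x 5 m artwork, assuming 300 DPI and 4 bytes per pixel.
      dpi = 300
      side_m = 5.0
      side_px = int(side_m / 0.0254 * dpi)  # metres -> inches -> ~59,000 px
      pixels = side_px ** 2                  # ~3.5 gigapixels
      size_gb = pixels * 4 / 1e9             # ~14 GB uncompressed
      print(f"{side_px} px per side, ~{size_gb:.0f} GB uncompressed")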

      • rottingleaf@lemmy.zip · 5 months ago (edited)

        Yes, you wouldn’t have 4K in 2002.

        “4GB today is nothing.”

        My normal usage would be kinda strained with it, but possible.

        $ free -h
                       total        used        free      shared  buff/cache   available
        Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
        Swap:          2,0Gi          0B       2,0Gi
        $ 
        
    • lastweakness@lemmy.world · 5 months ago

      They didn’t just quadruple; pixel counts and asset sizes are orders of magnitude higher these days. So content is a real factor.

      But that’s not what’s actually being discussed here: memory usage these days is much more a problem of bad practices than of content alone.
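
      As a rough illustration of that growth, here is a sketch comparing decoded (in-memory) photo sizes, assuming 4 bytes per pixel; the megapixel figures are ballpark assumptions, not measurements:

      # Decoded in-memory size of a typical photo, assuming 4 bytes per pixel.
      def decoded_mb(megapixels: float, bytes_per_pixel: int = 4) -> float:
          return megapixels * 1e6 * bytes_per_pixel / 1e6

      photo_2002 = decoded_mb(2)    # ~2 MP consumer camera of 2002 -> ~8 MB
      photo_today = decoded_mb(48)  # ~48 MP phone sensor today -> ~192 MB
      print(f"{photo_2002:.0f} MB -> {photo_today:.0f} MB decoded, "
            f"about {photo_today / photo_2002:.0f}x")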

      • rottingleaf@lemmy.zip · 5 months ago

        I know. BTW, if something is done in a way that is an order of magnitude less efficient than it could be, and it is, one might consider it the result of an intentional policy aimed at neutering development. It’s just not clear whose. There are fewer corporations influencing this than there are big governments, and those are capable of reaching a consensus from time to time. So it’s not a conspiracy theory.