i wouldn’t normally be concerned, since any company releasing a VR product with this price tag is obviously going to fail… but it’s apple, and somehow, through exquisite branding and sleek design, they’ve managed to create something that resonates with “tech reviewers” and rich folk who can afford it.

what’s really concerning is that it’s not marketed as a new VR headset; apple and these “tech reviewers” are marketing it as the new iphone: something you take with you everywhere, do your daily tasks in, consume content in, etc…

and it’s dystopian. imagine you’re watching youtube on this thing and an ad shows up: you can’t look away, because even if you try, they can track your eye movement and just move the window; you can’t mute it; you certainly can’t install an adblocker. you’re forced to watch the ad until it satisfies apple, or you just give up and take off the headset.

this is why i think all these tech giants (google, meta, apple, etc.) were/are interested in the “metaverse”: it holds both your vision and your hearing hostage, and you can’t do anything else while using it. a 100%-efficiency attention machine, completely cutting you off from the outside world.

i’m not that concerned about this iteration, the same way people aren’t really hyped about this iteration. just like how people are hyped for the next apple vision, i’m more worried about the next iterations, with a somewhat lower price tag and better software availability. i hope it flops, and i know it probably won’t achieve any sort of mainstream adoption even if it’s deemed a success, because it probably can’t get much less bulky or look less dorky, but the possibility is still worrying. what are your thoughts?

  • fluckx@lemmy.world · 9 months ago

    What is it used for then? Face recognition?

    Edit: honest question before I come off as aggressive :)

    • bagfatnick@kulupu.duckdns.org · 9 months ago

      Thanks for the question, it actually made me go look for the API. Looks like I misremembered it, and there aren’t actually any exposed APIs for developers regarding attention. Internally it’s used by iOS to check whether you’re looking at the screen for Face ID, and to keep the screen lit while you’re reading.

      There are APIs for developers that expose the position of the user’s head, but apparently they exclude eye information. They also look pretty resource intensive, and are mainly meant for AR applications.
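
      If the head-pose API being referred to is ARKit’s face tracking (that’s my assumption; the class name and prints below are just illustrative), a minimal sketch of reading the head transform without touching any gaze data looks roughly like this:

      ```swift
      import ARKit

      // Minimal sketch (hypothetical): read the user's head pose via ARKit face
      // tracking. Only the face anchor's transform is used here, no eye/gaze data.
      final class HeadPoseReader: NSObject, ARSessionDelegate {
          private let session = ARSession()

          func start() {
              // Face tracking requires a TrueDepth camera, hence the support check.
              guard ARFaceTrackingConfiguration.isSupported else { return }
              session.delegate = self
              session.run(ARFaceTrackingConfiguration())
          }

          func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
              for case let faceAnchor as ARFaceAnchor in anchors {
                  // 4x4 matrix with the head's position and orientation in world space.
                  let headTransform = faceAnchor.transform
                  print("head position:", headTransform.columns.3)
              }
          }
      }
      ```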

      The Face ID / Touch ID API essentially only returns “authenticated”, “authenticating”, and “unauthenticated”. The prompts / UI are stock iOS and cannot be altered, save for supplying a reason string.
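
      If that’s the LocalAuthentication framework (which is what I assume is meant by the Face ID / Touch ID API; the reason string below is just a placeholder), a minimal sketch shows how little the app actually gets back:

      ```swift
      import LocalAuthentication

      // Minimal sketch (hypothetical): the app supplies only a reason string;
      // the system draws the Face ID / Touch ID prompt and reports success or failure.
      func authenticateUser() {
          let context = LAContext()
          var error: NSError?

          // Check whether biometrics are available on this device at all.
          guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
              print("biometrics unavailable:", error?.localizedDescription ?? "unknown")
              return
          }

          context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                 localizedReason: "Unlock the app") { success, evalError in
              // No biometric data reaches the app, just a boolean and an optional error.
              if success {
                  print("authenticated")
              } else {
                  print("unauthenticated:", evalError?.localizedDescription ?? "cancelled")
              }
          }
      }
      ```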