• 4 Posts
  • 920 Comments
Joined 2 years ago
Cake day: June 29th, 2023

  • If all you’re saying is that neural networks could develop consciousness one day, sure, and nothing I said contradicts that. Our brains are neural networks, so it stands to reason they could do what our brains can do. But the technical hurdles are huge.

    You need at least two things to get there:

    1. Enough computing power to support it.
    2. Insight into how consciousness is structured.

    1 is hard because a single brain alone is about as powerful as a significant chunk of worldwide computing; the gulf between our current capacity and what we would need is essentially the whole distance. We are woefully under-resourced for it. You also need to solve how to power the computers without cooking the planet, which is not something we’re even close to doing.

    2 means that we can’t just throw more power or training at the problem. Modern NN models have an underlying theory that makes them work: they’re essentially statistical curve-fitting machines. We don’t currently have a good theoretical model that would let us structure an NN to create a consciousness. It’s not even on the horizon yet.

    Those are two enormous hurdles. I think saying modern NN design can create consciousness is like Jules Verne in 1867 saying we can get to the Moon with a cannon because of “what progress artillery science has made in the last few years”.

    Moon rockets are essentially artillery science in many ways, yes, but Jules Verne was still a century away in terms of supporting technologies, raw power, and essential insights into how to do it.


  • You’re definitely overselling how AI works and underselling how human brains work here, but there is a kernel of truth to what you’re saying.

    Neural networks are a biomimicry technology. They work by loosely mimicking how our own neurons behave, and, surprise surprise, they produce eerily humanlike responses.

    The thing is, LLMs don’t have anything close to reasoning the way human brains reason. We are actually capable of understanding and creating meaning; LLMs are not.

    So how are they human-like? Our brains are made up of many subsystems, each doing extremely focussed, specific tasks.

    We have so many: sound recognition, speech recognition and language recognition on the input side; then on the flip side, language planning, speech planning, and the motor centres dedicated to producing the speech sounds we’ve planned. The first three get sound into your brain and turn it into ideas; the last three take ideas and turn them into speech.

    We have made neural network versions of each of these systems, and even tied them together. An LLM is analogous to our brain’s language planning centre. That’s the part that decides how to put words in sequence.

    That’s why LLMs sound like us: they sequence words in a very similar way.

    However, each of these subsystems in our brains can loop back on itself to check the output. I can get my language planner to say “mary sat on the hill”, then loop that through my language recognition centre to see how my conscious brain likes it. My consciousness might notice that “the hill” is wrong and request new words until it gets “a hill”, which it finds more fitting. It might even notice that “mary” is the wrong name and look for others, cycling through martha, marge, maths, maple, may. Yes, that one. Okay, “may sat on a hill”. Then it sends that to the speech planning centres to eventually come out of my mouth.

    Your brain does this so much you generally don’t notice it happening.
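    That loop-back process is essentially a generate/check/revise cycle, and it can be sketched in code. Everything below is a toy illustration: the candidate word lists and the `sounds_right` check are invented stand-ins for the language planner and the recognition centre, not a model of anything real.

```python
# Toy generate/check/revise loop. The "planner" blindly proposes word
# sequences; sounds_right() stands in for the conscious
# does-this-sound-right judgement.

def sounds_right(draft):
    # Toy recogniser: rejects "the hill" and every name except "may".
    return draft.startswith("may ") and draft.endswith("a hill")

names = ["mary", "martha", "marge", "maths", "maple", "may"]
articles = ["the", "a"]

final = None
for name in names:                  # cycle through candidate names
    for article in articles:        # and candidate articles
        draft = f"{name} sat on {article} hill"
        if sounds_right(draft):     # loop the output back through the checker
            final = draft
            break
    if final:
        break

print(final)  # may sat on a hill
```

    The point is only that the planner on its own never judges its output; the judgement lives in a separate checking loop.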

    In the 80s there was a craze around so-called “automatic writing”, which was essentially zoning out and writing whatever popped into your head without editing. You’d get fragments of ideas and really strange, often very emotionally charged things that seemed to come from some mysterious place. Maybe ghosts, demons, past lives, who knows? It was just our internal LLM-equivalent being given free rein, but people got spooked into believing it was a real person, just like people think LLMs are people today.

    In reality we have no idea how to even start constructing a consciousness. It’s such a complex task and requires so much more linking and understanding than just a probabilistic connection between words. I wouldn’t be surprised if we were more than a century away from AGI.




  • Assume, he says, that the distribution of holdings in a given society is just according to some theory based on patterns or historical circumstances—e.g., the egalitarian theory, according to which only a strictly equal distribution of holdings is just.

    Okay well this is immediately a false premise because nobody seriously makes this argument. This is a strawman of the notion of egalitarianism.

    Also, we don’t need Wilt Chamberlain to create an unequal society, we just need money. It’s easy enough to show that simply keeping an account of wealth and then randomly shuffling money around creates the unequal distribution that we see in the real world:

    https://charlie-xiao.github.io/assets/pdf/projects/inequality-process-simulation.pdf

    And every actor there began from the strictly egalitarian starting point. No actor was privileged in any way or had any merit whatsoever, yet some wound up on top of an extremely unequal system.

    So Nozick just needs to look a little deeper at his own economic system to see the problem. There is no reason why we need a strict numerical accounting of wealth.
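    The random-shuffling claim is easy to demonstrate. Below is a minimal sketch of that kind of random-transfer simulation; the agent count, round count, and one-unit transfer size are my own choices, not taken from the linked paper.

```python
import random

def simulate_exchanges(n_agents=1000, n_rounds=200_000, start=100):
    """Everyone starts with identical wealth; each round a random pair
    meets and one hands the other a single unit (if they can afford it)."""
    wealth = [start] * n_agents
    for _ in range(n_rounds):
        giver = random.randrange(n_agents)
        taker = random.randrange(n_agents)
        if giver != taker and wealth[giver] > 0:
            wealth[giver] -= 1   # transfers are pure luck, no merit involved
            wealth[taker] += 1
    return wealth

wealth = sorted(simulate_exchanges())
top_decile_share = sum(wealth[-100:]) / sum(wealth)
# Despite the perfectly equal start, the top 10% reliably end up
# holding noticeably more than 10% of all wealth.
```

    Total wealth is conserved throughout, so any concentration at the top comes purely from the random shuffling, which is exactly the point.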


    I’ve done it. It was a grub screw - so the hex was entirely within the shaft - that was surrounded by Loctite, and frankly I never had a chance to get it out. The hex rounded out immediately, just with hand pressure. I ended up having to use a screw extractor.

    I was told this was a common problem on ARRMA vehicles and that I should get a more precise type of hex driver. They were expensive but I haven’t had the problem since.




  • Excrubulent@slrpnk.net to Lemmy Shitpost@lemmy.world: No looky for you!

    Your gnomes shouldn’t be dead, they’re technically immortal and a stint in the dishwasher is their ticket out of the salt mines. If you’ve got dead gnomes the last thing you want is to keep their bodies on the premises. If you leave them in the cartridge they can be revived when you exchange it for the new cartridge. If you put them in the ground they will find… other ways back to their realm, and they will remember what you did.

    And please remember to buy gnomane dishwashing tablets, I cannot stress enough how much they should not be dead.

    Also don’t ask me why the gnome salt mine slavery exists, I didn’t create it, I just benefit from it.




  • In my experience that is not a true defence against disappointment.

    My expectations weren’t low enough to guard against my boss’s husband getting drunk, shoving his kid around in front of me then driving like a lunatic with us both in the car. When I quit over it I didn’t get my last paycheck or even an email back.

    There had been red flags in the hiring process which these days I would absolutely bounce over, especially since they’d taken so long to get my contract in order, but you just don’t expect people to be complete monsters.

    When I emailed again a week later to ask what was taking so long the second business partner’s email bounced. Apparently she’d left in that time. The business is still somehow running because you don’t need to be competent to get startup capital and run a business, you just need to talk fast enough to get the bag.







  • Always remember that “the market” is just a signal to the landlord that they could get more if the property were on the market today. It’s still their choice to squeeze you to take advantage of that. “It’s the market” is code for “because I can”.

    Also they know that people don’t want to move every year or two, so they can absolutely raise the rent above market level without you wanting to leave yet. This has the effect of pushing the market higher. The switching cost is very high, so it’s in their favour that way too.

    A landlord I knew about through a friend said they never raised the rent while the property was being paid off, because they would rather have it occupied and earning than have the tenants leave and the place sit empty.

    Not to say that’s a good landlord by any means, but there is a choice. The market isn’t a mandate.