Jfc just spent 15 minutes trying to cancel a newspaper subscription this morning. Shame I couldn’t wait six months to do so.
For me and mine, it’s carrots. Do you know how difficult it is to find carrot-free items? Impossible.
Devastating loss for the science community. I used this database in my PhD, and didn’t expect it to shut down ever.
Agreed, seems like a no-brainer. Typically this stuff is handled at an institutional level, with bad professors losing or failing to achieve tenure. But some results have much bigger implications than just “Uh oh, I cited that paper and it was a bad one.” Often, entire clinical pipelines are developed off of bad research, which wastes millions of dollars.
See also, the recent scandals in Alzheimer’s research. https://www.science.org/content/article/potential-fabrication-research-images-threatens-key-theory-alzheimers-disease
In grad school I worked with MRI data (hence the username). I had to upload ~500GB to our supercomputing cluster. That was somewhere around 100,000 MRI images, and I wrote 20 or so different machine learning algorithms to process them. All said and done, I ended up with about 2.5TB on the supercomputer. About 500MB ended up being useful and made it into my thesis.
Don’t stay in school, kids.
Nah. Fenced epee for a bit in a college club. Height advantage was pretty great. I guess it just depends on the weapon.
I’m not sure I agree here - I think the resin printer might not be a good entry point, but I’m curious to hear what others think. I’ve heard resin printers require special ventilation and the photo-resin is carcinogenic. Once dialed in, an FDM can do pretty great for detailed parts. Especially with a smaller nozzle. So I’m not convinced jumping straight into a resin printer is wise.
I used my Ender 3 for a few years making miniatures, and they came out pretty great. Of course, then I tried switching to a larger nozzle and I still haven’t managed to get it running… but that’s my fault.
They raised my rent 20% over two years and priced me out of two apartments. Glad to see progress.
It’s been in development for a while: https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o
Even before the above paper, I recall efforts to connect (rat) brains to computers in the late 90s/early 2000s. https://link.springer.com/article/10.1023/A:1012407611130
It’s a bunch of neurons that speak to a computer with a microelectrode array. So they “speak to” the neurons with electric impulses, and then “listen to” what they have to say. The computer it’s connected to uses binary, but the neurons are somewhere in between. Yes, the change in electrical potential is analog, but neurons are typically in their “on” state, recovering from their “on” state, or just chilling out.
The brain is incredible because of the network of connections between neurons that store information. It’ll be interesting to see if a small scale system like this can be used for anything larger scale.
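The “on / recovering from on / just chilling out” behavior above can be sketched as a toy state machine. This is a deliberately simplified model I made up for illustration (real membrane dynamics are continuous differential equations), but it shows the point: the input potential is analog, while the observable behavior is roughly discrete.

```python
from enum import Enum

class State(Enum):
    RESTING = "resting"        # "just chilling out"
    FIRING = "firing"          # the "on" state
    REFRACTORY = "refractory"  # recovering from the "on" state

class ToyNeuron:
    """Toy three-state neuron: analog accumulation, discrete output."""

    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.potential = 0.0
        self.state = State.RESTING

    def step(self, stimulus=None):
        if self.state == State.FIRING:
            # Right after firing, the neuron can't fire again.
            self.state = State.REFRACTORY
        elif self.state == State.REFRACTORY:
            self.potential = 0.0
            self.state = State.RESTING
        elif stimulus is not None:
            self.potential += stimulus  # analog part
            if self.potential >= self.threshold:
                self.state = State.FIRING  # discrete "on"
        return self.state

n = ToyNeuron()
states = [n.step(s) for s in (0.4, 0.4, 0.4, None, None)]
# Sub-threshold input leaves it resting; crossing threshold fires it,
# then it cycles through refractory back to resting.
```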
Believe it or not, I studied this in school. There’s some niche applications for alternative computers like this. My favorite is the way you can use DNA to solve the traveling salesman problem (https://en.wikipedia.org/wiki/DNA_computing?wprov=sfla1)
There have been other “bioprocessors” before this one, some of which have used neurons for simple image detection, e.g. https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o. But this seems to be the first commercial application. Yes, it’ll use less energy, but the applications will probably be just as niche. Artificial neural networks can do most of the important parts (like “learn” and “remember”) and are less finicky to work with.
My job is 8:30 - 5 with a 30 minute lunch break. So almost.
But, we also get 2 days/week at home, and can flex time as required. Tons of international work, so the flexible hours are a godsend when time zones are against us.
It’s a salaried position, and depending on your supervisor and stage of your career, you’re expected to work 40-45 hours a week. Deadlines and ugly projects tend to increase hours worked. I’m very lucky, though, as my industry can be pretty brutal with sudden ends to projects and unexpected layoffs.
Thanks for the recommendation, I was worried they would be missing some of my artists but they had 99% of my music. Can’t wait to ditch Spotify.
ETA: dear lord the sound quality is so much better. I had no idea what I was missing.
We’ve got some really good theories, though. Neurons make new connections and prune them over time. We know about two types of ion channels within the synapse: AMPA and NMDA. AMPA channels open within the post-synapse neuron when glutamate is released by the pre-synapse neuron, and the open AMPA receptor allows sodium ions into the cell, causing it to activate.
If the post-synapse cell fires for a long enough time, i.e. receives strong enough input from other cells / enough AMPA receptors open, the NMDA receptor opens and calcium enters the cell. (Typically a magnesium ion keeps it closed.) Once opened, it triggers a series of cellular mechanisms that cause the connection between the neurons to get stronger.
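In pseudocode-ish Python (names are mine, not standard notation), the two receptors behave very differently: AMPA is a simple gate, while NMDA is a coincidence detector that needs glutamate AND depolarization at the same time.

```python
def ampa_open(glutamate_bound: bool) -> bool:
    """Toy AMPA receptor: opens whenever glutamate is bound,
    letting sodium into the post-synaptic cell."""
    return glutamate_bound

def nmda_open(glutamate_bound: bool, depolarized: bool) -> bool:
    """Toy NMDA receptor as a coincidence detector: it needs glutamate
    AND enough depolarization to expel the magnesium ion blocking it."""
    return glutamate_bound and depolarized

# Glutamate alone opens AMPA but not NMDA; only the coincidence of
# glutamate + a strongly depolarized cell opens NMDA.
```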
This is how Donald Hebb’s theory of learning works. https://en.wikipedia.org/wiki/Hebbian_theory?wprov=sfla1
Cells that fire together, wire together.
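The classic textbook form of Hebb’s rule is tiny; here’s a minimal sketch (the learning rate and toy spike trains are made up for illustration):

```python
def hebbian_update(w, pre, post, lr=0.1):
    """Hebb's rule: the weight grows only when the pre- and
    post-synaptic neurons are active at the same time."""
    return w + lr * pre * post

# Toy run: pre/post activity as 0/1 spikes.
w = 0.2
for pre, post in [(1, 1), (1, 1), (1, 0), (0, 1)]:
    w = hebbian_update(w, pre, post)
# Only the two co-active steps strengthen the connection: 0.2 -> 0.4
```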
Actually, neuron-based machine learning models can handle this. The connections between the fake neurons can be modeled as a “strength”, or the probability that activating neuron A leads to activation of neuron B. Advanced learning models just change the strength of these connections. If the probability is zero, that’s a “lost” connection.
Those models don’t have physical connections between neurons, but mathematical/programmed connections. Those are easy to change.
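Concretely, “easy to change” looks something like this, a toy weight matrix I made up where zeroing an entry is the whole pruning operation:

```python
# Toy "connectome": weights[i][j] is the strength of connection i -> j.
weights = [
    [0.0, 0.9, -0.1, 0.3],
    [0.6, 0.0, 0.7, -0.8],
    [0.2, -0.4, 0.0, 0.5],
    [-0.9, 0.1, 0.6, 0.0],
]

def prune(w, threshold=0.5):
    """Zero out weak connections: a zero weight is a 'lost' connection."""
    return [[x if abs(x) >= threshold else 0.0 for x in row] for row in w]

pruned = prune(weights)
# No surgery required -- changing a connection is just assigning a number.
```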
I’ve been quoting Jordan Peterson for years?! Ahhh fuck.
Actually, we’ve got some pretty sophisticated models of neurons. https://en.wikipedia.org/wiki/Blue_Brain_Project?wprov=sfla1
See my other comment for an example of how little we truly understand about neurons.
Even assuming we can model the same number of (simple machine learning model) neurons, it’s the connections that matter. The number of possible wiring patterns in the human brain is literally greater than the number of atoms in the universe.
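Back-of-envelope check (using the common rough estimates of ~8.6e10 neurons and ~10^80 atoms): the raw count of neuron *pairs* is only about 10^21, but the number of possible wiring patterns, i.e. which pairs are connected, is 2 to that power, which dwarfs the atom count.

```python
import math

NEURONS = 8.6e10    # rough estimate of neurons in a human brain
ATOMS_LOG10 = 80    # common order-of-magnitude estimate for the universe

pairs = NEURONS * (NEURONS - 1) / 2     # possible pairwise connections
patterns_log10 = pairs * math.log10(2)  # log10 of 2**pairs wiring patterns

# The pair count alone (~10^21) is smaller than the atom count,
# but the number of possible wiring patterns is 10^(~10^21).
```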
It’s not a terrible idea by any means. It’s pretty hard to do, though. Check out the Blue Brain Project. https://en.wikipedia.org/wiki/Blue_Brain_Project?wprov=sfla1
Unironically, I had to delete this game from my phone because I wasn’t getting work done. This game slaps.