Here’s a great example of dystopian tech being rolled out without guardrails. Brought to you by Axon, which you may know as the company that rebranded after Taser became a liability as a name.
So, AI that is strictly incapable of generating new ideas is going to be fed decades of police reports as its database, and use that data to discern what makes a good police report?
Surely this won’t replicate decades-old systemic problems with racial profiling. I mean, all these police reports are certainly objective, with no hint of bias to be found in the officers’ writing.