- cross-posted to:
- technology@lemmy.ml
cross-posted from: https://lemmy.ml/post/15741608
They offer a thing they’re calling an “opt-out.”
The opt-out (a) is only available to companies that are Slack customers, not to end users, and (b) doesn't actually opt you out.
When a company account holder tries to opt out, Slack says their data will still be used to train LLMs, but the results won't be shared with other companies.
LOL no. That's not an opt-out. The only way to opt out is to stop using Slack.
https://slack.com/intl/en-gb/trust/data-management/privacy-principles
Instead of working on their platform to get Discord users to jump ship, they decide to go in the same direction. Also, pretty sure training LLMs on someone's data after they opt out is illegal?
Wait, Discord is also doing this?
Not currently and publicly, at least. They're feeding your messages into an LLM (https://twitter.com/DiscordPreviews/status/1790065494432608432), but that's not as bad as training one on your messages.
Why? There have been a couple of lawsuits launched in various jurisdictions claiming LLM training is a copyright violation, but IMO they're pretty weak, and none of them has reached a conclusion. Whether the writer opted in or out doesn't seem relevant if copyright doesn't apply in the first place.
Well, thankfully, it’s not up to you.
Nor is it up to you. But the fact remains: it's not illegal until there are actually laws against it. The court cases that might determine whether current laws cover it are still ongoing.
If copyright applies, only you and Slack own the data. You can opt out, but 99% of users don't. No users get any money. Google or Microsoft buys Slack so only they can use the data. We get only subscription-based AI, and open source dies.
If copyright doesn't apply, everyone can use the data. The users still don't get any money, but they get free, open-source AI built off their work instead of closed-source AI built off it.
Giving the website copyright over the content in the context of AI training would be a fucking disaster.