Wouldn’t that give people who use it for bad things easier access? It should be made illegal to create one if they don’t legally have access to that data.
The “illegally trained LLMs” they’re talking about are trained on copyrighted data that they didn’t have permission to use; this isn’t about LLMs that have been trained to do illegal things. OpenAI (ChatGPT) is being sued because there is a lot of evidence that they used copyrighted content for training, like NY Times articles. OpenAI is so profitable that they’ll probably treat these lawsuits as a business expense and keep doing it. Most people won’t sue anyway…
I know that by “illegally trained LLMs” they are talking about training on copyrighted data. (By “legally have access to,” I meant that they are legally allowed to train AI on it.)
It’s ridiculous that companies can just ignore laws.
Oh, I’m not sure what you meant in your first comment then?
I think he’s talking about people using LLMs for illegal and unethical activities such as phishing. There are already a lot of people using open-source LLMs without ethics restrictions to do bad stuff; with the power of GPT-4 behind them, they would be a lot more effective.