- cross-posted to:
- aicompanions@lemmy.world
- technology@lemmit.online
ChatGPT’s new AI store is struggling to keep a lid on all the AI girlfriends::OpenAI: ‘We also don’t allow GPTs dedicated to fostering romantic companionship’
I am pretty sure it's just to avoid controversy — look up the recent news about LAION for an example. GPT-4 isn't just text anymore; it can generate images too.
Altman talked about how we may someday all have our own personal AIs tailored to our own needs and sensibilities. But almost everyone has a different idea of if and where there should be a line.
If I have an AI tailored for me and my sensibilities, then it should have no filter; whatever filter it does have should be defined and trained by me.
Someone else artificially trying to adjust my personality through AI to fit whatever arbitrary norms they believe it should have is cancer.
I am inclined to agree. I believe that once society is able to meet everyone's needs and everyone can summon any AI/VR experience they want, crime will cease to exist — there would be nothing to gain from committing harm. But I fear that simulated role-play in the context of psychological torture or CSAM could make dangerous people more confident before we reach that post-scarcity point. Maybe you'd say ChatGPT isn't realistic enough for that now, but it will be soon.
Training an LLM entirely by yourself on self-curated text is beyond what is feasible; most AI researchers today don't even know what's in all of the data they use. It's more than you could read even with an extended lifetime, and at best you can fine-tune a standard base model.
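A quick back-of-envelope check of the "more than a lifetime of reading" claim. The figures here are assumptions for order-of-magnitude purposes only: a roughly 15-trillion-token pretraining corpus (in the ballpark of recent open models), ~0.75 English words per token, and a fast reading speed of 250 words per minute:

```python
# Back-of-envelope: could one person read a modern pretraining corpus?
# All numbers below are assumed, order-of-magnitude estimates.
corpus_tokens = 15e12         # ~15 trillion tokens (assumed corpus size)
words = corpus_tokens * 0.75  # rough English words-per-token ratio
wpm = 250                     # fast adult reading speed, words/minute

minutes = words / wpm
years = minutes / (60 * 24 * 365)  # reading nonstop, 24/7
print(f"~{years:,.0f} years of nonstop reading")
```

Even with generous assumptions, the result lands in the tens of thousands of years, which is why hand-curating a full pretraining dataset is out of reach and fine-tuning is the practical option.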