We will be able to create hyper-realistic virtual people soon

After really getting into ChatGPT, I must admit that when I got an email today asking me to sign an open letter to pause AI development for 6 months for the safety of humanity…I didn’t want to. I want to see what GPT 5 and 6 can do. I can see it’s going to be a bumpy ride, but it’s exciting. So, I can only imagine how the people working on developing it must feel.

I have read a bit about how GPT is fine-tuned – it starts out trained on the whole internet, for better or worse, but is then fine-tuned to bias it toward reputable, correct, and reasonable answers. But other companies could fine-tune it in other directions.

Last night I was thinking about how we have almost all the components now to create extremely realistic virtual partners. Combine generative AI – mostly still images right now, but video and then VR are coming at some point soon – with a GPT fine-tuned toward relational responses…AND, you could just ask it to mimic whoever you have a crush on IRL, or a celebrity…or your dead wife. Haptic suits are coming along too.

In other words, we will soon be able to create people on demand, and interact with them in VR, extremely realistically. Just like on Star Trek when they say, “Create character” on the holodeck, and simulate real people.

There are lots of applications for this, good and bad. For example, part of recovering from an abusive relationship or childhood is learning what good relationships actually look like. A relationship simulator of a secure partner would be amazing – it would let you practice relationship skills and develop a felt sense of what is healthy.

On the other hand, I could see no real-life human actually measuring up to what a really well-trained AI could offer. Especially if people grow up having AI relationships with their virtual nanny, virtual teachers, etc.

Companies already have to nerf the AI to keep people from doing bad things with it. They also have to nerf it to discourage people from falling in love with it. But that just opens up a market opportunity for other companies to give people what they want.

I can imagine China using this to deal with its massive sex-ratio imbalance, i.e. the “missing women” from its former one-child policy. Maybe it could also help our incel problem. Even if it is looked down upon, people will still do it. Drugs are extremely popular, and falling in love is like a drug.

Addiction and trauma are extremely linked, to the point where you can see addiction as a trauma symptom. It’s an attempt to self-regulate a dysregulated nervous system using an external substance or process. But even relatively regulated bodies and minds can be hooked into addictive processes if you manipulate their dopamine pathways. So far we have let capitalism use AI to freely hack our brains (see: social media) in the name of profit. I really don’t know that the AI capabilities are the problem so much as that the incentives of capitalism are wildly misaligned with creating a population of happy and well human beings.

Exciting times!


© 2022 Emma Arbogast