It seems like we’re asking AI to feel with us now.
To hold our grief.
To help us make meaning.
To “understand” us.
Without dismissing any of that (and Eliza somewhat anticipated our current predicament), what happens when "it" also starts representing us?
Not just as companions… but as
proxies.
For work.
For relationships.
For identity itself.
When the "it" is not just with us, but also acting among other humans on our behalf?
Digital Twins. Somehow that feels even more problematic.
But hold on…
Suppose we worked really, really hard to imagine how that might turn out well rather than ending in existential conflagration.
We're prone to spotting the calamity first. I get that.
Remember Fredric Jameson?
“It's easier to imagine the end of the world than the end of capitalism.”
Sure you do… or some variation thereof.
I do this too.
It's as if we're programmed to see the worst of all possible outcomes, particularly when we feel like we have no control.
What does this idea of a 'Digital Twin' (okay… not the most evocative name) look like if it delivers non-extractive benefits?
If you've got 50 minutes with it per week rather than continuous, always-on access?
Just wondering out loud.
Wonder with me into our preferred near future ⇪