Exhalation: Transformative Experience and AI


Ted Chiang’s Exhalation is one of the best collections of short science fiction that you can find. It also happens to have several stories directly concerned with transformative experience and decision-making. That’s unusual. Most short science fiction stories are idea driven. They take a concept and run with it.

In Exhalation, for example, Chiang writes about agent regret and predestination (The Merchant and the Alchemist’s Gate), hope in the face of death (the title story, Exhalation), the non-literal nature of truth (The Truth of Fact, the Truth of Feeling), our blind spot in the search for intelligent life (The Great Silence), and the possibility of existential despair even in a god-created universe (Omphalos). These are terrific stories, and if you haven’t read Chiang, you should. If you enjoyed Arrival, it’s based on his Story of Your Life (collected under that name).

But as good as these stories are, they aren’t about personal transformation: how experience shapes us, what kind of life to lead, and what kind of person to try to be. And you might think a story called “The Lifecycle of Software Objects” is particularly unlikely to be about transformative experience, but Chiang is his usual surprising self. It’s the longest story in Exhalation (really a novella) and while it’s not the best story in the collection, it has something profound to say about how we learn and what it means to be intelligent.

The main character (Ana) in Lifecycle is recruited to work on digitally engineered creatures built via simulation of genetic development. These digients are sentient creatures but, like humans, they are created with very little pre-programmed knowledge. The goal is to breed “digital pets” for people, and Ana’s background in zookeeping makes her a plausible choice for training archetypal digients.

Unsurprisingly, she becomes quite attached to her primary digient even as the commercial program grows, falls out of fashion, and becomes the personal project of a small community of “owners” who are slowly raising their digients through childhood into something very much like adults.

It’s an immense amount of work, very much a matter of parenting. These are fully sentient creatures, albeit trapped in a digital world (and one that, since the commercial failure of their virtual world, is shrinking). What’s interesting in Chiang’s story – and resonant – is that their human parents slowly begin to realize that the only path to real intelligence is via experience. Digients become truthful, responsible, thoughtful, and wise the same way humans do – by learning from others. For though they are digital intelligences, their intelligence is exactly like ours. They are adaptive learners who are changed by experience.

As the digients become more sophisticated, other companies express interest in using or studying them, even though they have long since fallen off the market. Ana is recruited by one company that wants to make a super-intelligence, but she ultimately rejects their offer and their vision…

“They want something that responds like a person, but isn’t owed the same obligations as a person, and that’s something she can’t give them.

No one can give it to them, because it’s an impossibility. The years she spent raising Jax didn’t just make him fun to talk to, didn’t just provide him with hobbies and a sense of humor. They were what gave him all the attributes Exponential is looking for: fluency in navigating the real world, creativity at solving new problems, judgment you could entrust with an important decision. Every quality that made a person more valuable than a database was a product of experience.

She wants to tell them that Blue Gamma was more right than it knew; experience isn’t merely the best teacher; it’s the only teacher. If she’s learned anything from Jax, it’s that there are no shortcuts. If you want to create the common sense that comes from twenty years of being in the world, you need to devote twenty years to the task.”

This is interesting as applied to AIs, though it is a brave soul who will confidently predict how and what an AI might learn. But insofar as it applies to people, there is no doubt about its truth. There are no moral systems, rational rules, or logical processes we can give someone that will make them funny, wise, or sympathetic. We do not learn humor and wisdom and empathy from reading Kant or Bentham. Experience is how we learn, and it is most fundamentally the experience of others that makes us who we are.

People matter far more than ideas. Friends and teachers matter. Parents matter most. Our ethical shape and intellectual character may feel as if they are the product of some set of books, but at the basic level of how we think, react, and learn, it is always people who have taught us.

And if we must apply this lesson to the world of AI, we should expect that while an AI may start with vastly greater capabilities and knowledge than a baby (or a digient in Chiang’s story), it too will be an adaptive learner. And what it experiences from us will, quite inevitably, shape what it is. We can only hope that whenever and however such an AI is created, it (and we) are as lucky as the digients were in their parents.
