A slight departure from my usual musings to share that I’ve started a new role as Storyteller at the AI company Imbue. Here’s why, and what this role means.
When this year began, I did not expect to join an AI company. My primary interest has always been in human lives and the humanities: in the stories that help us understand the world and each other; in the literature and art that nourish our hearts and broaden our souls; in the intimate moral questions of our everyday lives. This desire to understand and improve the human condition led me to work in journalism through high school and college, to study political philosophy, and to eventually, unexpectedly, land in the startup realm.
My gravitation toward tech, and AI in particular, seemed mystifying to many. But I believe that, to truly understand humanity, one must understand the technology we create. Humans have always been technologists, in our continual efforts to invent tools that better our lives — whether communication technologies such as the alphabet, social technologies such as the law, or digital technologies such as the personal computer. Technology has always served as a power amplifier, allowing us to do more with our limited time, energy, and capabilities. But technology is not value neutral. Our tools shape us just as we shape them; they have the power to both liberate and shackle, depending on how they are built and used.
For years, I have been haunted by the foreword to Neil Postman’s Amusing Ourselves to Death, in which he writes of a dystopian future à la Aldous Huxley’s Brave New World where “people will come to love their oppression, to adore the technologies that undo their capacities to think.” As AI tools have become increasingly ubiquitous, we’ve observed how they can be used in ways that make our lives more frictionless, but may ultimately erode our cognitive and creative abilities. But I also see the great possibilities of AI to expand our capacity to learn and create, freeing us from tedium to do what we find most enriching and enlivening. As with all technologies, the dangers and benefits are not black and white; the implications only become evident through careful and critical examination as these technologies are being developed and adopted.
AI’s rapid development in recent years has raised age-old questions of what it means to be human: How can we learn to better care for and connect with each other, when algorithmically trained models promise to do so more seamlessly? When AI can nearly match or even surpass many human abilities, what differentiates us from machines? How can we create technologies that elevate and enrich our lives rather than supplant us?
These questions are, at their core, about the human good: the very question the humanities have always sought to answer. And though they may stir up existential anxieties, they also raise new possibilities for human life. What I’m most interested in is not mitigating the tail-end possibilities of existential risk (though I’m glad others are working on this), or building AGI that extends human consciousness to Mars, but the ways in which AI can truly better our everyday lives. We need new social and moral infrastructures to help us navigate the rapid changes this new technology will inevitably bring about. And as we’ve learned from the age of social networks, it is essential to create the right incentives for the creation and use of these technologies, and to ensure that their benefits are distributed rather than concentrated in the hands of a powerful few.
I’ve met few people grappling with these questions more earnestly and thoughtfully than the Imbue team. From the first day I met them, it was clear that the desire to create a world that honors the dignity and innate gifts of every person lies at the heart of every decision they make. I couldn’t be more delighted to have joined the team as Storyteller: to craft a vision for AI that is centered around the human good, and to illuminate a path that leads us there.
In sculpting my role, I found myself referring to two quotes. The first, from Steve Jobs:
“The storyteller sets the vision, values, and agenda of an entire generation that is to come.”
The second, from E.B. White:
“The writer’s role is what it has always been: he is a custodian, a secretary. Science and technology have perhaps deepened his responsibility but not changed it. In ‘The Ring of Time,’ I wrote: ‘As a writing man, or secretary, I have always felt charged with the safekeeping of all unexpected items of worldly or unworldly enchantment, as though I might be held personally responsible if even a small one were to be lost. But it is not easy to communicate anything of this nature.’”
It is my humble hope to be a worthy custodian of our strange, fantastic time: to safeguard what is good, beautiful, and sacred amidst great technological upheaval; to elevate values and virtues that orient us toward the human good; to provide inspiration and guidance, questions and challenges, for our brave new world to come.