In June of 2022 Google suspended, and later fired, an AI engineer for making a loud public statement that its AI was sentient. At the time, the news was amusing, and my social circle joked about it. One of those jokes has stuck with me for the past 10 months.
Associative learning
Why has this silly take stayed with me for so long? I’m similar to Blake Lemoine (the Google engineer) in that recent advancements have changed my opinions about sentience. However, that change has taken me in the opposite direction: I don’t have a higher opinion of machine sentience, I have a lower opinion of my own.
Over the past few years I’ve been dabbling in the generative AI space. I’ve used LLMs and image diffusion models to write a children’s book (although, who hasn’t 😅), used GPT-3 to generate flashcard decks, graphed character relationships in fiction, and developed what is essentially an “ai-dungeon” Discord bot.
Throughout this process I’ve been stunned by the results LLMs and diffusion models are able to produce through statistical weighting. They are funny, creative, and often surprising in the originality of their output. This makes sense, since we trained them on billions of online interactions where those “sentient” qualities were well-represented.
In the sense of training, we’re very much looking in a mirror when we ask AI to remix and regurgitate the creativity of human beings. But the process of association and weighting by which the results are achieved doesn’t strike me as all that different from social development. We embed in society when we’re similarly young and neuroplastic and develop the associations that allow us to recognize that mobsters in media should talk a certain way, and that certain artistic movements can be alluded to with visual archetypes.
Humans also learn and develop via association; we just do it more slowly, and with wetware instead of circuitry and linear algebra.
Feelings
Does this indicate AI is more human than we think? No. It means we are vain.
Human creativity isn’t special. Taking in your environment and parroting it back with a twist isn’t an indicator of sentience, it’s an indicator of learning. Any living system is capable of this, be it through reinforcement and habituation (a la Pavlov), or sheer tenacity and adaptation through selection (a la drug resistant bacteria).
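The Pavlovian kind of learning mentioned above is simple enough to sketch in a few lines of code. Below is a toy version of the classic Rescorla–Wagner model of conditioning (my illustrative choice, not something from this essay): the associative strength of a stimulus is nudged toward the reward it predicts on every pairing, which is the same prediction-error shape that drives much of modern machine learning.

```python
# Toy Rescorla-Wagner conditioning model (illustrative sketch).
# V is the associative strength of a conditioned stimulus (the bell);
# lambda_ is the maximum strength the reward (the food) supports.
def condition(trials, alpha=0.3, lambda_=1.0):
    v = 0.0
    history = []
    for _ in range(trials):
        v += alpha * (lambda_ - v)  # prediction-error update
        history.append(v)
    return history

strengths = condition(10)
print(round(strengths[0], 3))   # 0.3 after the first pairing
print(round(strengths[-1], 3))  # 0.972 -- near saturation after ten pairings
```

Nothing here resembles rumination or qualia; the dog (or the model) simply comes to expect food when the bell rings, because the error between prediction and outcome shrinks with every trial.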
Most often we define sentience as the ability to feel: to experience qualia and emotion, to ruminate internally and arrive at new conclusions about the nature of being. Machines lack that capacity by nature of being created rather than evolved.
Biological sentience was bootstrapped by billions of years of selection, the selection needed to bind complex organisms into coordinated action via signaling mechanisms like hormones and innervation. The “feelings” we use to arrive at logical courses of action are just an interface the rest of our nervous system uses to communicate somatic state to the part of our body making decisions. Grief, rage, lust: all just dashboards the executive team in our brain uses to get a vague sense of the best course of action.
We have feelings because of their adaptive utility. If they didn’t make our predecessors more fit to survive and pass on their genes, we wouldn’t have them. It’s possible to make AI feel, but we haven’t added that functionality to the current models. There’s no reason for it to develop spontaneously.
Ego death
What the current stage of machine intelligence has to offer is an opportunity to humble ourselves and our intellect. We aren’t special. Other animals learn and feel. This aligns with much of what the animal welfare movement has already put forward. Now that we have machines up to speed on the learning part, the morality of sentience should only extend to AI if we decide to take the inhumane path of programming emotions and qualia in alongside learning and association.
In the meantime, accept that human society isn’t sentient, only the pieces that make it up. As a collective, we’re as intelligent as an anthill or a slime mold: a senseless squirming mass, all thumbs and greedy meat. We need to surrender the vanity that expression makes us special, and accept our idiocy.