Hey, it’s Data! The android we all know and love, from his brilliant scientific mind to his love of theatre and his sweet lil cat called Spot. He is an asset to the starship Enterprise, well liked by the crew, and has friends, particularly Geordi La Forge.
He is determined to make friends and to explore what it means to be human, given that he serves alongside a crew mostly comprising such creatures. Through the lens of Data, we are treated to some very tender and emotional stories, even though they star an emotionless android. We enjoy his genuine friendship with Geordi and the others, we laugh along with the crew when he gets something wrong, and we gain some insights into the fundamental nature of humanity. It’s all good stuff.
But it is part of a wider trope that I’m overall less fond of, the Pinocchio syndrome: the non-human strives to become human. In particular, I have grown weary of androids, robots, and AIs wanting to do this; as an SF writer, I have my biases here. While I do love Data and all his episodes throughout seven glorious seasons of TNG, I’m really bored of this trope now!
And then I look online for examples of this trope and realise I haven’t seen all that many… There’s Haley Joel Osment in AI, there’s silent Gosling and his pet hologram in the incredibly sexist Blade Runner 2049, and others, but it’s not as all-pervasive as I thought! Perhaps it’s because Data was so incredibly vocal about his desire to be more than the sum of his processors and programming (and because I rewatched all 170-something episodes in a very short space of time…) that I thought it was a more widespread problem. Perhaps it’s also because Data just becomes obsessed with the idea of emotions and humanity, but I never quite felt the reasons for him to do so. It was almost just assumed that that’s what he must want, and that striving towards humanity must innately be a positive thing, when Data’s pretty awesome already.
Still, it feels like the shorthand narrative for writing an AI character who isn’t an antagonist is to make them want to be human or to have emotions. It’s also common for characters in science fiction to put down robots and AI by constantly reminding them of their lack of value as a result of their absence of emotions or humanity. In a future world, would this still really happen? People today are able to connect with machines in a more sensitive manner than these Future Trekkers do, such as with the everlastingly sweet and gentle PARO robot being used in care homes. How we connect and empathise with machines is an ethical debate that is driving a lot of the conversations over how we govern AI and technology, but according to SF we just hate on it most of the time, except for when it saves the day with its superior computing power.
What I want to see in stories is an AI/android/robot being comfortable, even proud of who and what they are. So what if they don’t have emotions in exactly the same way as a human does? They might have something that is relatively analogous to a particular emotion. If a machine – and we’re talking SF here, so something with general artificial intelligence, not just a chess computer – reacts suddenly in response to provocation, it could be said to be angry. Just because it was not programmed to be angry does not mean that it could not have some form of emotion that could be compared with anger. There’s no way for a human to disprove it, just as any human has absolutely no idea what anger feels like to another human, or even whether they see colours the same way. Until we have a better word to represent what a computer feels, let’s just stick with the emotional ones. And then the story can focus on something other than just Pinocchio – we’ve seen and read these stories before, not least with the original Pinocchio!
It would also be nice to see some more positive interactions between humans and machines, rather than distrust or just disregard. Becky Chambers’ The Long Way to a Small, Angry Planet is a wonderful example of different people, aliens, and machines all working together, respecting each other’s differences, and having love and emotions that transcend any boundaries of culture and upbringing. Definite recommendation from me there.
If anyone has any recommendations that reverse or subvert this trope, I’d love to hear them. I’ve already been told about Autonomous by Annalee Newitz and added it to the old To Read pile, but I’d love to read others, as I am trying to write my own stories that feature androids and AIs who are comfortable with remaining artificial.
But I do still love Data, don’t you dare think otherwise.