Twig technology

People think tools are people

Or, why you get mad at your printer…

You know the feeling: your car breaks down yet again. Your shoelace breaks right as you’re rushing out the door. Your printer insists you’ve given it the wrong settings and there’s nothing it can do. These things, these inanimate objects, refuse to do the simple jobs they were designed for. It’s enough to make you shout at them in frustration, or even physically attack them (as the 1999 film Office Space memorably showed).

But why? What triggers such passion towards lumps of plastic or metal? And could that frustration be a clue to the reason that we’re able to use complex tools in the first place?

Today’s post covers one of my own peer-reviewed academic papers, from the August 2023 issue of the Journal of Comparative Psychology. In that study, I trace the links between our evolution as an ultra-social species, reliant on other people, and our ability to build and use ever more intricate tools and devices. You can find the full paper, including the full reference list, in the Research section of this site—this post is its shorter, more reader-friendly companion. Along the way, we’ll encounter the idea of machination, delve into the concept of overimitation, and ask whether other animals might share some of our abilities to use (and abuse) their tools.

Social rewards

Our starting point is that people need people. There are physical systems within our bodies that reward us for interacting with other humans, to the point that most of us take great pleasure in social interaction and suffer withdrawal-like symptoms when we are too lonely. This cooperative spirit allows us to work, play and live together in incredibly dense numbers compared with any other primate.

As part of this friends and family focus, we have an evolved tendency to cooperate and expect the same in return. That’s fine when talking to your co-workers, but the problem—I am suggesting—is that we often take a similar stance with non-human objects. We mistakenly treat our tools and machines as other forms of people, an error that I have termed machination. That term obviously refers to the fact that machines are involved, but it also hints at the secret schemes or plans we might imagine our tools to have. Such as the printer deliberately refusing to work, innocently claiming it’s our fault (for not using the right toner, paper size or ritual sacrifices) then probably laughing at us behind our backs.

Anthropomorphic resonance

The tendency to see non-humans as similar to ourselves is called anthropomorphism (from the Greek for ‘human-shaped’). Every time a cartoon animal talks, or you imagine gods or ghosts as having human form, or ascribe human-like motivation to your pet cat or snake, that’s anthropomorphism. The animals roaming the 2016 Disney film Zootopia are perfect examples, as you can see in the image below. We should note that this concept has a close cousin, anthropocentrism (‘human-centred’), that involves seeing humans as the centre of the universe, and a standard by which all else must be judged, but we’re not dealing with that today.

Anthropomorphism actually has its uses, for example aiding conservation efforts when we think of polar bears or other endangered animals as fluffy versions of ourselves. But in general it is something that scientists try to avoid. That isn’t always easy: biological research is full of people saying that a particular gene or cell ‘wants’ to do something, for example, even if they are careful to include the quotation marks. This scientific caution is warranted, because rampant anthropomorphism risks adding fuzzy or irrelevant details to our understanding of how the world works.

However, there is usually little harm in thinking of our everyday tools as little people. We might call out to a lost pair of glasses, or get worried about how a toy feels when we leave it outside in the rain. In my paper, I take that notion a step further, and argue that in fact seeing our devices and tools as having their own wants and needs has been beneficial to our development as a species. The core idea is that feeling like we are working with a cooperative partner makes us less concerned with how it all works, allowing us to use incredibly diverse and complex machines without being overcome by panic or confusion. It is a mistake to think that our coffee mugs and laptop computers really do have wants or needs, but it is a useful mistake nonetheless.

Overimitation

To make things more concrete, in the paper I consider the example of overimitation. This term describes the way that people—both children and adults—will copy irrelevant actions shown to them by someone who is demonstrating how to use a device. Opening an unfamiliar box might just involve lifting a latch using a stick, but if the demonstrator first taps the stick on top of the box (which does nothing to help open it), there is a strong chance that the watching person will do that too. They are not just imitating what they see to get the result, they are overimitating by following useless extra steps. Humans do this a lot more than other apes, to the point that this mindless copying has been suggested as a key difference in the flowering of diverse cultures in the human lineage.

From a machination perspective, people may not just be copying an experimenter (perhaps as a way to please them), but might be treating the tool as having its own needs. Basically, if the tool wants to be tapped on the box, you should do that. In many overimitation experiments, the demonstrator isn’t in the room when the watcher tries to follow their instructions, to avoid any possibility of bias. The test subject may also not be told that they are being watched or filmed. That means that the main relationship during the active part of the experiment is between the user and the device. Cooperation brings rewards in the form of happy hormones and a feeling of comfort, so it may just feel right to work with the tool on its own terms, even if you aren’t able to explain exactly why.

The breakdown in the cooperative relationship, where you end up shouting at a toaster, shows the strength of the expectation of shared goals and trust. Research has shown that there is a moral dimension to machination. When a machine or tool fails to cooperate, we can automatically feel let down not only because it didn’t work as it was designed to, but because it had a duty to help us. As I say in the paper:

working with tools can involve both making them work correctly in a physical sense, and making them do what is correct in a socially conforming sense.

Taking a stance

There are similarities between my proposal and the idea of the intentional stance put forward by philosopher Daniel Dennett. Both involve seeing an object as having a purpose, not just a particular form or design. In Dennett’s view, taking an intentional stance allows you to understand the world better and more quickly by treating something as trying to reach a goal. He suggests that thinking about intentions allows us to treat non-human things as having both underlying desires and beliefs (opinions about the world). The intentional stance works well with many such things, including plants, bacteria, and complex computer programs.

One key difference, though, is that machination as I see it does not involve tools having beliefs, only wants or desires. Cooperation with tools is not about understanding how they see the world, it is about the team we make when we and our devices work together towards a common goal. This leads to a second difference, which is that machination is temporary—it only happens when we’re directly interacting with the tool or device. And a third difference is that machination is an unconscious error, not a reasoned or thought-through approach to using a tool.

It may be that the more experience you have with a tool, and the more you understand how it is made and how its parts interact, the less prone you would be to machination. If your Bluetooth speaker won’t work, you might get mad at it or blame it for ruining your picnic. Or, if you have the relevant experience, you may know exactly what has gone wrong inside, and fix it by opening it up and reattaching a wire. In the latter case, the tool becomes a collection of inert parts again, not a faithful companion who has betrayed your trust. Knowledge about one type of tool or system wouldn’t automatically shield you from falling into the anthropomorphic error in other parts of your life, though.

Beyond us

Our social lives are all-encompassing, and it should not be a surprise if our intrinsic social reward systems mistakenly capture things that don’t really belong there. Favourite toys and much-loved kitchen utensils or items of clothing show us that there is a place for treating objects as friends or partners. But what about other animals? Might it be, as I suggest, that machination has played a central role in allowing us to build ever more complex devices from all manner of materials and forms without becoming overwhelmed—something that other animals seem not to have done?

I’m not suggesting that this idea is the sole divider between humans and other animals. There’s no such thing. But machination does give us new ways to look at how other animals interact with objects in their world. For example, how might we view evidence that non-humans have favourite objects, things that they continue to return to and use even when doing so is inefficient or more costly than finding a new one? Both chimpanzees and orangutans in the wild have been seen carrying stick or leaf ‘dolls’—is that an example of them misattributing ape-hood to an inanimate thing? What (if anything) do crows think that a testing apparatus wants, and do they get frustrated when it doesn’t play along?

We have a way to go before these questions can be answered. Here’s Jane Goodall’s first photograph of the Gombe chimpanzee she named David Greybeard, fishing for termites with a tool. More than 60 years later, we still know very little about the value that animals like him place on their tools:

Machination is predicted to happen most readily in situations of stress, such as being short of time. People tend to anthropomorphise more often when stressed, and it is precisely in those situations that treating tools as little helpers, rather than as inert lumps or complicated machinery, would be most helpful. This behaviour will not manifest in the same way for everyone, just as we see variation in how people respond to social situations (after all, to our subconscious, machination is just another social interaction). We should therefore expect that not all animals will respond the same way either. But one prediction is that those individual animals most comfortable using novel or complicated tools are more likely to be drawing on some form of mistaken social companionship.

It is also possible that our tools, machines, robots, programs and so on will eventually have their own legitimate goals, independent of ours, at which point machination will no longer be a category error but a sensible perspective—it will join the intentional stance. For example, I wrote my paper in 2021, before the public release of large language models like ChatGPT. Those models have since become heavily anthropomorphised, to the point that some people believe they really do have the ability to reason and follow their own goals. Whatever the truth of that, I think we can already begin to look for new angles on how we and other animals approach tools. We simply need to ask ourselves: what does the tool want?


Source: Haslam, M. (2023) Anthropomorphism as a Contributor to the Success of Human (Homo sapiens) Tool Use. Journal of Comparative Psychology 137, 200-208.

Main image credit: Office Space (1999) || Second image credit: Zootopia (2016) || Third image credit: Shutterstock || Fourth image credit: Jane Goodall/Jane Goodall Institute