
Why we place too much trust in machines


While many people might claim to be sceptical of autonomous technology, we may have a deeply ingrained trust in machines that traces back to our evolutionary past.

As Air France Flight 447 hurtled belly-first towards the Atlantic Ocean at nearly 300km per hour (186mph), pilot Pierre-Cédric Bonin wrestled with the controls. He and his crew had taken over after the autopilot suddenly switched itself off, apparently due to a build-up of ice on the aircraft. It was a situation that demanded manual intervention.

The pilots, unfamiliar with this scenario, struggled to steady the plane. They were bombarded with confusing messages and alarms from the aircraft's computer, which suggested the craft was not stalling when in fact it was. Bonin's last known words, captured on the flight recorder, were, "We're going to crash – this can't be true. But what's happening?"

All 228 passengers and crew on board perished that day, 1 June 2009. When accidents like this occur, involving humans and machines, there are usually multiple factors or causes at work. But analysts have blamed the tragedy of Flight 447 partly on an excessive reliance or trust in machines. They pointed to the flight crew's expectation that the autopilot would stay switched on, and that the plane's information systems would provide accurate information. It is far from the only incident in which an over-reliance on technology has contributed to fatalities.

It is a well-studied phenomenon known as automation bias, which sometimes also leads to automation complacency, where people are less able to spot malfunctions when a computer is running the show. But what is perhaps surprising is that our tendency to "overtrust" machinery may be directly influenced by millions of years of evolution.

"Technology overtrust is an error of staggering proportion," writes Patricia Hardré of the University of Oklahoma in a book chapter on why we sometimes put too much faith in machines. She argues that people generally lack the ability to judge how reliable a specific technology is. This can actually go both ways. We might dismiss the help of a computer in situations where it would benefit us – or blindly trust such a device, only for it to end up harming us or our livelihoods.

The behaviour of one of our closest relatives, the chimpanzee, may contain a clue about why we are so bad at assessing the trustworthiness of machines. It could be because we are primed to evaluate other members of our species instead.

A zoo in Japan tried to teach its chimpanzees to use a vending machine by letting them watch children use it – such interactions with technology are entirely built on trust

In a recent experiment, researchers set up an apparatus in which chimpanzees at a sanctuary in Kenya could pull on a rope to retrieve a food reward. One rope offered a basic food reward – one piece of banana. But they were also presented with a second option – a bigger reward of two pieces of banana and a slice of apple that could be dished out either by a machine or a fellow chimpanzee.


Sometimes it was the machine on the other end of the rope, other times it was another chimp, but never both. However, sometimes the machine would not deliver the reward, and sometimes the other chimp would choose not to share. So while there was potentially a bigger reward, it was a less certain choice to pull the second rope.

The chimpanzee participant was thus faced with either a social or a non-social condition. They'd need to trust either the machine or the other chimp in order to have the chance of a bigger food reward.

The study found that, when a fellow chimpanzee presided over the uncertain option, the primates were less likely to go for it. They refused to engage with the social trials 12% of the time, but they only displayed this aversion 4% of the time in non-social trials where a machine presided over the reward. In other words, they had greater trust in the machine.

"They were much more hesitant… when the partner was another chimpanzee," says Lou Haux at the Max Planck Institute for Human Development, who designed and ran the experiment with her colleagues. It's one of just a few studies revealing that social risk plays a big part in how chimpanzees and humans navigate the world.

It's called "betrayal aversion", says Haux: "The fear of being duped by another human [or chimpanzee], which is thought to cause stronger emotions." She likens this to putting money into a vending machine only for it to fail to dispense the soft drink you requested. That could prompt irritation, no doubt, but imagine how you'd feel if a bartender took your cash and then proceeded to drink your cola right in front of you. You'd probably be livid. Of course, the vending machine didn't make a decision to dupe you; it just failed to deliver. The bartender, by contrast, decided to drink your order despite knowing how that might make you feel.

The research team didn't stop there, however. They went on to perform another experiment involving chimpanzees who had already gained an understanding of how likely they were to get a better food reward when selecting the uncertain option, thanks to having participated in the first experiment. The uncertain option was, in truth, no longer completely uncertain – the chimps now possessed a sense of what kind of risk they were taking.

And that's when a surprise emerged. The chimpanzees stopped discriminating between the social and non-social options – they no longer seemed to trust the machine more than the fellow chimpanzee.

"That's why we think it's an exciting finding, that they distinguish between the social world and the non-social world in cases where there's still a lot of uncertainty around," says Haux.

It makes sense when you think about how important it is for primates to negotiate their social environment, says Darby Proctor, a psychologist at the Florida Institute of Technology.

"With a machine, there's no future implications," she explains. "You don't have that additional potential social cost." After all, chimpanzees in these experiments often have to go and spend time with their fellow participants once the experiment is over – any displeasure caused by taking part could have onward consequences for their relationships.

Most of us are aware that blindly following a satellite navigation system can lead to sticky situations


Proctor and colleagues previously carried out similar tests and also found that chimpanzees were more likely to trust objects than fellow chimpanzees in the search for food rewards. Proctor mentions that when one of the primates was seemingly let down by another chimpanzee who failed to give out a hearty food reward, the dejected chimp made their feelings known by spitting water at the partner. "A common display of unhappiness," says Proctor.

Proctor questions whether the chimpanzees in these experiments really trusted the machines more. It could also be described as individuals simply having a more muted reaction to a bad deal when a social partner isn't involved.

“It’s not that we have confidence that the machine will give us a good payout, it might be that we don’t see it as emotionally salient, so we might just be more inclined to gamble or take risks with this inanimate object,” she hypothesises.

But either way, evolution seems to have influenced primates' willingness to engage with uncertainty, based on whether we feel we're taking a social risk or not.

Evolution hasn't really prepared us for the fact that it can be quite costly to be betrayed by a machine, argues Francesca de Petrillo at the Institute for Advanced Study in Toulouse, who studies primates. For millions of years, there was no need to develop an ability to assess machines as carefully as we assess members of the same species. But today, when technology can have a huge impact on people's lives, there arguably is.

There are other factors involved here. Evolution aside, our willingness to trust technology is also influenced by personal knowledge about a machine or device, and by cultural expectations. A 2019 study found that people were, on average, 29% more likely to give away their credit card details during a text-based chat if they thought they were speaking to a computer rather than another human being. The researchers found that this effect was even more pronounced among those who had a pre-existing expectation that machines were more secure or trustworthy than humans.

Then again, sometimes people report a strong aversion to trusting technology. Many surveys have suggested that people are often uncomfortable with the idea of self-driving cars or handing over work responsibilities to machines. There are lots of reasons why suspicions about new technology can take hold. People might fear losing a part of their identity if an automaton takes over. Or they might simply feel sceptical that a computer will approach certain tasks with the required caution and dexterity. When you've seen a hundred videos of robots falling over, or experienced an obstinate computer refusing to function correctly, that's not necessarily surprising.

Among those who have studied what can influence an individual's willingness to trust a particular technological system is Philipp Kulms, a social psychologist at Bielefeld University in Germany. He and a colleague came up with a Tetris-style puzzle game in which participants collaborated with a computer partner that also had control over some of the pieces. When the computer played the game well (demonstrating competence) and gave the human player access to high-value pieces that netted them extra points (demonstrating warmth), participants were more likely to report that they trusted the computer player. They were also more likely to exchange puzzle pieces with it in a collaborative effort, expressing that confidence in-game as well. It was trust formed by mechanics, if you like.

"To our surprise, this very limited set of variables that we could manipulate was apparently enough," says Kulms.

If we accept that people are generally ill-equipped to assess the trustworthiness of machines because, from an evolutionary standpoint, we are primed to judge trustworthiness based on social cues, then this makes perfect sense. It also chimes with other research that suggests gamblers will gamble more when playing on slot machines that have been designed to display human-like properties.

Most interactions with automated systems are benign, but should we really trust them unreservedly to dispense medical prescriptions?


In other words, we aren't just bad at weighing up the trustworthiness of a machine, we're also easily seduced by mechanical objects when they start behaving a bit like a social partner who has our best interests at heart.

So, in contrast to the person who has been stung by unresponsive computers and therefore distrusts them all, the individual who has learned to put their faith in certain systems – such as aircraft autopilots – may struggle to understand how they could be wrong, even when they are.

De Petrillo notes that she's experienced a feeling of confidence in computers when interacting with voice-activated assistants like Apple's Siri or Amazon's Alexa.

"I assume that they are acting in my best interests, so I don't need to question them," she says. So long as they appear competent and reasonably warm, Kulms' study suggests, that will continue to be the case, for de Petrillo and many others. Kulm points out that this is why it's so important for designers of technology to make sure their systems are ethical and their functionality transparent.

The great irony of all this is that behind a seemingly trustworthy machine may lie a malicious human with nefarious intentions. Undue trust in a faulty machine is dangerous enough, never mind one designed to deceive.

"If we were taking the same type of risk with a human, we would be much more attuned to what the potential negative outcomes could be," says Proctor. She and Haux agree that more work is needed to tease out how much the chimpanzees in their studies genuinely trust machines, and to what extent that reveals truths about human behaviour.

But there's a hint here that our occasional, sometimes disastrous, overtrust in technology has been influenced by a simple fact: we evolved to be social animals in a world where machines did not exist. Now they do and we put our money, our personal data and even our lives under their supervision all the time. That's not necessarily the wrong thing to do – it's just that we're often pretty bad at judging when it's right.
