Clemson study: Should computers, smartphones apologize?

The failures are inevitable: Voice-recognition software won’t understand us, hard drives will crash on us and GPS apps will lead us astray.

So wouldn’t it be nice if our computers and cellphones could just say they’re sorry for their glitches, crashes and error messages?

That’s a question being posed by Clemson University psychology professor Richard Pak, who wants to know whether something as simple as an apology could smooth our relationships with the automated technology that increasingly guides our day-to-day lives.

It’s a pressing issue: Automation is already commonplace, having found its way into our cellphones, laptops and cars in the form of everything from turn-by-turn directions that avoid congested roads to in-car sensors that hit the brakes in a traffic jam.

“Just 10 years ago, automation was really only within military or nuclear power plant control operators or industrial settings. The everyday layperson really didn’t have much exposure to that much advanced automation,” Pak said. “But just in the past 10 years, automation’s everywhere.”

And the pace of change is only accelerating. Self-driving cars are on the not-so-distant horizon, drone deliveries seem increasingly feasible, and robots, which have already reshaped factories and warehouses, are starting to creep into Americans’ homes.

But the question of how to maintain — or repair — relationships between humans and the automation they use each day is still mostly unanswered, Pak said.

He and his collaborator on the project, George Mason University researcher Ewart de Visser, plan to give their test subjects a piece of software designed to fail and see how they react when it owns up to the issue and explains what happened instead of, say, blaming user error.

The study, which is still in the planning stages, is expected to take about a year, with preliminary results this fall. It’s backed by a $97,000 grant from Google’s research arm. The technology giant, which operates a large data center near Goose Creek, didn’t respond to a request for comment.

Pak said he plans to follow up the study with more detailed research into how different kinds of technology and errors might require different kinds of apologies.

The research strikes at a dilemma in automation: Mistakes are inevitable because no technology is fail-proof, but users tend to expect perfection all the same. Automation doesn’t get the benefit of the doubt people give one another.

“Machines fall from grace more dramatically,” de Visser said. “You just expect more from machines.”

Adding to the challenge is that the rise of automation has often put people in the position of overseeing a task instead of doing it themselves, like making sure a self-driving car stays in its lane instead of taking the wheel. That relationship requires trust — to depend on technology as a decision-maker and not just another tool.

Still, software playing to human emotions can be a risky proposition, said Jonathan Gratch, director for virtual human research at the University of Southern California’s Institute for Creative Technologies.

Emotions, after all, aren’t all positive.

So a smartphone’s apology might work as a patch after a mistake, but it’s not a long-term fix. Apologizing, after all, is “a commitment to do better next time,” said Gratch, who isn’t affiliated with the Clemson study.

“If the machine just says it’s sorry but doesn’t do anything better, that could actually backfire,” Gratch said.

And our response to automated apologies could sour from forgiveness to frustration.

Reach Thad Moore at 843-937-5703 or follow @thadmoore.