
Automation and Ethics: The Human Side of Technology

On my way home from work last week, I listened to a story on NPR featuring Nicholas Carr, author of The Shallows and The Big Switch. Carr’s books often explore the effect technology has on our society and ourselves, and they serve as warnings of the price we pay for our tech-enriched lives.

In this episode of NPR’s “All Tech Considered,” host Robert Siegel borrows a 2014 Mercedes-Benz S550 4Matic to pick up Carr from his hotel. This Mercedes is the company’s most technologically advanced model, with automated features that keep the driver from drifting out of his lane, turn on the windshield wipers when needed, and maintain a safe following distance from the car ahead.

Siegel uses this car to demonstrate automation technology—the focus of Carr’s latest book, The Glass Cage, which explores the consequences of so much automation in our lives. One hazard is what Marshall McLuhan referred to as the principle of auto-amputation: When we automate a task, we stop doing work we otherwise would do manually. As a result, we eventually “amputate” the knowledge and skills needed to perform that task. For example, thanks to smartphones, we no longer need to learn phone numbers, read maps, or memorize facts.

While the NPR story was fascinating, one point stuck with me long after I turned off the radio. What happens when the skill we lose is how to control a vehicle? Real, live people develop the automation software for self-driving cars and autopiloted airplanes, so someone has to decide what to tell the machine to do when things go wrong. That’s hard.

"You have to start programming difficult moral, ethical decisions into the car," Carr says. "If you are gonna crash into something, what do you crash into? Do you go off the road and crash into a telephone pole rather than hitting a pedestrian?"

Automation on this scale requires considering some unsavory scenarios. For example, as Aeon magazine posits: “If your vehicle encounters a busload of schoolchildren skidding across the road, do you want to live in a world where it automatically swerves, at a speed you could never have managed, saving them but putting your life at risk? Or would you prefer to live in a world where it doesn’t swerve but keeps you safe?”
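To make Carr’s point concrete, here is a deliberately toy sketch, in Python, of what “programming an ethical decision” could look like. Everything in it (the Maneuver class, the expected_harm scoring, the numbers) is invented for illustration and bears no resemblance to any real vehicle’s software.

```python
# A deliberately toy model of the trade-off Carr describes. The names,
# numbers, and structure are invented for illustration; real
# autonomous-vehicle software looks nothing like this.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    risk_to_occupants: float  # estimated chance of serious harm, 0.0 to 1.0
    risk_to_others: float     # estimated chance of serious harm, 0.0 to 1.0

def expected_harm(m: Maneuver, occupant_weight: float) -> float:
    """Score a maneuver. The occupant_weight parameter *is* the ethical
    choice: above 1.0 favors the passengers, below 1.0 favors everyone
    else. Someone has to pick that number."""
    return occupant_weight * m.risk_to_occupants + m.risk_to_others

def choose(maneuvers: list[Maneuver], occupant_weight: float) -> Maneuver:
    # Pick the maneuver with the lowest weighted expected harm.
    return min(maneuvers, key=lambda m: expected_harm(m, occupant_weight))

# The bus-of-schoolchildren scenario from the Aeon quote, as toy numbers:
options = [
    Maneuver("brake in lane", risk_to_occupants=0.1, risk_to_others=0.9),
    Maneuver("swerve off road", risk_to_occupants=0.7, risk_to_others=0.05),
]
print(choose(options, occupant_weight=1.0).name)   # swerve off road
print(choose(options, occupant_weight=10.0).name)  # brake in lane
```

The uncomfortable part is not the code. It is that occupant_weight has no technically correct value; choosing it is exactly the moral decision Carr is talking about, and someone on the development team has to write it down.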

What are the ethical and moral responsibilities of software developers and testers working on these projects?

I don’t have an answer, but I’d like to hear yours. If you work on automation technology or other safety-critical projects where tough decisions like these need to be made, who makes the decisions that can result in loss of life? Do you feel increased stress working on these projects, or are they just another assignment? What about the testing side? What happens if a bug slips past?

If you don’t work on automation projects but you want to weigh in on the moral and ethical responsibilities of software developers and testers, I want to hear from you, too.

Send me your stories, thoughts, and concerns, and I’ll compile them into a future blog post or article exploring the human side of developing automation technology.
