In response to
"Wouldn't self drive cars be basically AI, making thousands of decisions per minute, many of which could cause harm if choosing poorly. -- nm"
by
zeitgeist
|
The concern with AI isn't robots seeking to destroy humans, but rather robots killing humans in the process of carrying out their duties.
Posted by
Mop (aka Rburriel)
Aug 11 '17, 22:43
|
For instance, imagine a car assembly factory charged with building cars in the most efficient way possible, which might mean seizing humans and working them to death. The AI was simply doing what it was programmed to do. In that context, a self-driving car poses exactly that kind of risk. The trolley problem is a big factor here. But what if your car were simply programmed to get you from A to B the fastest? Over sidewalks, through parks, running over pedestrians... it did what it was programmed to do.
|
Responses:
|