Friday, July 29, 2016

Robot Violates Asimov's First Law (part 5/5)


Considering the statement from Knightscope about their K5 robot injuring a young child, it's obvious that we'll be seeing more of this kind of response: in an effort to protect the company, robotics-based businesses will insist their robots functioned as programmed and were not at fault (even if that happens to be true, as it appears to be in this case). Keep an eye on this, because we'll likely see more of the same over the next few years as robots become more commonplace in public. How long will it take before some kind of programming standard is mandated to protect the public? And then, how long before that standard is somehow violated? The next few decades will likely be a period of adjustment as the “kinks” are worked out while a robot presence is introduced to the public. Whatever stage we reach with robots, and however thoroughly they have integrated into our society, one question will remain: how much will it take, and how big will the shockwave be, when a robot injures another person? The answer might be the collapse of entire economies that rely heavily on robots, a scenario that occurs in one of the possible endings to the Solar Echoes mission, “The Seeds of Chaos.”
