Remember the movie I, Robot? You know, the one where robots attempt to take over in order to protect humans at all costs? In that movie, the main character, portrayed by Will Smith, harbors a sincere hatred for robots long before they start revolting against the human population, and he has good reason. At one point, it is revealed that a robot let a child drown, saving Smith's character instead because he was slightly more likely to survive. The robot had no soul; just algorithms that helped it decide which person to save in the middle of a tragic situation.
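The film's cold calculus can be sketched in a few lines of code. This is a hypothetical illustration only; the function name and the survival probabilities are assumptions made for the example, not anything taken from the movie itself:

```python
# Hypothetical sketch of the kind of triage logic described above.
# The survival probabilities are made-up inputs for illustration.

def choose_rescue(candidates):
    """Pick the person with the highest estimated survival probability.

    candidates: list of (name, survival_probability) tuples.
    """
    return max(candidates, key=lambda c: c[1])[0]

# The robot weighs the adult against the child and picks the better odds.
print(choose_rescue([("adult", 0.45), ("child", 0.11)]))  # → adult
```

No weighing of right and wrong, no hesitation; just a comparison of numbers. That is exactly the quality that unsettles Smith's character.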
There’s that word: algorithm. Does it sound familiar? It should. Algorithms are exactly what make autonomous technology work. And as we transition into cars with better autonomous technologies, the artificial intelligence in those cars will be tasked with making more, potentially deadly, decisions. Think about that for a minute. Potentially deadly. But aren’t autonomous cars supposed to make the roads safer for everyone? They are, yet a serious dilemma is unfolding with each passing day.
What happens if an autonomous car has to choose between its occupants and a pedestrian? At a quick glance, the answer seems pretty obvious, right? The pedestrian has the right of way, so the pedestrian should draw the long straw. Well, that’s not the way everyone sees it, and a recent study shows that the general public is torn over whether they would be willing to give their lives to save others in the event that an autonomous car can’t avoid an accident. Worse, that divide could even make the roads, and the use of autonomous cars, less safe than they are now.
Keep reading for the full story.
from Top Speed http://ift.tt/290gN4C