Shanice Ashley

Engineering Intuition

Updated: Apr 6, 2019

In our effort to stay one step ahead of autonomous technology, we seem no closer to answering the question: how close to human thought can artificial intelligence get? And will AI ever be sentient?


The motive here is not to have robots that can laugh and cry along with us. In fact, it is to find and create that thing which makes us human: that feeling which tells you something may be awry even if you cannot see what it is - intuition. Although intuition does not control most of our decision-making, it does seem to be a driving factor in the most important ones.


Self-driving cars have come an extremely long way since their inception. Today, the world's most sophisticated AVs can drive themselves in most situations. However, engineers everywhere are beginning to realize that the quest for Level 5 autonomy is more difficult than it seemed at the start, which raises a new question:


"How human does an AI have to be in order to prove completely capable of driving itself and others?"


Perhaps the answer is...it doesn't need to be human at all. I believe there are many tasks better suited to AI than to humans. Other facets of human-machine interaction, however, seem likely to reach peak efficiency through the synchronization of artificial capabilities and human input; it remains highly plausible that artificial intelligence peaks in capability when paired with human input. Your phone finishes your text messages best once it has learned your most-used language patterns. Spotify connects you to new music once it has learned your listening habits. You have to wonder how successful these technologies would be without human input. We all remember the days of having to go online and search for our music manually. Thanks to the plethora of data gathered by drivers all over the world, one can only imagine the capabilities and efficiency of autonomous vehicles.


Perhaps those intuitive situations don't need to be engineered at all. I often wonder whether the "tricky situations" an AV finds itself unable to navigate stem from AI's inability to process things that are "incorrect". After all, we program machines to do things properly: a calculator doesn't answer 2 + 2 with 3, because that doesn't make sense. It's incorrect. By comparison, accidents, traffic, and hazardous driving conditions are caused by humans failing to operate their vehicles properly, not the other way around. If an AV has difficulty interacting with other vehicles while following the rules of the road, perhaps that's because those rules were written for human drivers and the AV doesn't truly need them. Furthermore, there are aspects of driving that today pertain only to human beings. Improving that institution may call for a modernization of transportation policy, infrastructure, and industry. And as we write regulation, we should remember that we are effectively setting the boundary lines for "a better driver" than we've ever known (Krafcik, 2018).
