you can enter a destination hundreds of miles away and your car will take you there.”
Ethics and misconceptions: what needs to be considered?
Across all applications of AI, be it AI in healthcare, military operations or facial recognition, bias and ethics are, and will remain, a huge issue.
Outlining these ethical biases, Nicholson stated that there are a number that need to be considered:
• “Avoid visual bias, such as whether it can spot a person of colour just as well as a white person, or a woman compared to a man
• Avoiding technological bias, such as whether it can see a cyclist because they have bought a device that broadcasts the cyclist's location, but the cyclist who can't afford that device has a higher chance of being hit
• Decision-making bias is also crucial, with questions around whether there's a 10% chance of hitting a male or a 10% chance of hitting a female, making sure that there isn't any bias in the choice of action
• Finally, self-preferential bias, such as if a car is a Google car and it gives way to a fellow Google car as opposed to a Ford vehicle at the next junction.”

Misconceptions in the media mean that many people see AI as a threat to humankind. To overcome these misconceptions and fears around autonomous vehicles, Rodriguez believes the narrative should be more open and testing needs to remain rigorous: “It is important that the technology is proven in controlled environments before we deploy driverless solutions globally. Each road across every city and country has its own specific nuances, which bring a variety of navigation challenges.”
He concluded: “One of the key elements will be acceptance of the technology. This may be somewhat generational, as the younger generation, being less car-centric, are happy to adopt new tech that makes their lives easier and less costly in towns and cities. Why own a car when it is so easy and cheap to call up a driverless vehicle which is guaranteed to be safer than a human-operated vehicle?”