Companies must adopt proactive, more responsible policies to mitigate bias in artificial intelligence (AI)-based decision-making within organizations, according to a study by the Aapti Institute and the United Nations Development Programme (UNDP), released on Wednesday. “The results are timely, as AI has the potential to transform many industries in the coming years and can be used to improve the lives of vulnerable people,” said Jeremy Ulman, founder and CEO of Aapti. “This code of ethics will help protect society from the damage that could be caused by AI-based decision-making.”
The study found that a significant number of firms are struggling with the practicalities and potential impact of AI technology, underscoring the need for more robust policies to ensure these technologies are used ethically. More than half (52%) of respondents surveyed within these organizations report bias in their decision-making, resulting in unnecessary risk and lost revenue. Nearly half (46%) believe internal data collection is a potential risk in AI systems, while 51% believe it could harm their company internally. Of those who expressed concerns, most want to address them within their organizations by offering greater transparency and controls.
Across sectors, respondents identified four common challenges to achieving ethical decision-making:
- a strong demand for transparency at all levels, combined with little room for error;
- a lack of resources;
- doubts about ethics;
- and a scarcity of guidance on how to direct resources toward ethical approaches.
“In order for businesses to develop world-leading approaches to better decisions in AI-based technologies like voice recognition and natural language understanding, innovators should recognize that these are new technologies that present unprecedented opportunities for companies but also require new ways of thinking about business processes across industries,” said Aapti CEO Jeremy Ulman.