AI needs auditing, for humanity's sake

Two of the most important sectors - financial services and the public sector - are defined by a risk management culture. They are often criticized for their caution, but their longevity hints at some success from this approach. An audit process acts as the foundation for that risk management culture. Artificial intelligence (AI) and automation technologies require the same auditable culture in order to protect citizens and, ultimately, organizations, says ForHumanity, a not-for-profit organization that advocates for the auditing of AI.

Ryan Carrier founded ForHumanity in 2016 following eight years in hedge fund management and a 25-year career in financial services. Today ForHumanity has 46 fellows in all four corners of the globe, six streams of work, a leadership board, and over 800 contributors to its knowledge banks, codes of ethics and conduct, and audit programme.

Carrier's time in hedge funds brought him into contact with AI, which was used to manage the diversification of investment portfolios. Parenthood would shift Carrier's perception of AI, though, he says: 

Instead of managing risk, the technology industry became hooked on the vernacular of 'move fast and break things', whilst the word disruption took on a positive connotation, rather than the pain and negativity it had always been associated with in the past. Carrier says he founded ForHumanity not to put an end to innovation at pace, but to ensure there is some thought about the consequences. He says:

I don't mind some disruption. I am a capitalist, after all. But what I started to see was that moving fast and breaking things was breaking people and breaking relationships. I played that out with my boy's future in mind, and I got scared. In financial services, you are coming from a heavy risk management culture, but looking at technology, and in particular Silicon Valley, you see none at all.

ForHumanity believes that the audit process, to which publicly listed companies are subject, has now become vital to the safe development of AI and automation technology. Carrier says the principles of transparency are tried and tested. He explains:

ForHumanity admits that the audit is far from foolproof. Carrier adds:

The alternative is to continue with no system of redress or analysis, which Carrier says is not sustainable. He defends audits and says: 

ForHumanity claims that if the technology is to benefit society and the organizations deploying it, then an audit is where the process begins.

AI and automation are challenging existing liability frameworks. In the automotive industry, for example, liability for a product has traditionally lain with the vehicle manufacturer, but if the autonomous driving software used in a vehicle comes from another provider, then the liability becomes complicated. Carrier says the European Union is leading the way with its proposed Artificial Intelligence Act, which defines the liability of software.

ForHumanity believes that auditing and legislation will change the risk/reward dynamic, which currently favours the software industry too heavily. Carrier says:

ForHumanity proposes shifting the risk/reward dynamic in three ways. The first is government legislation, of which GDPR and the UK's Children's Code are examples. Although they are not perfect, Carrier says these acts are important steps towards balancing the risk/reward. He adds:

Secondly, ForHumanity calls for giving citizens the right to legal action. This would give individuals the right to sue software companies for harm caused by automation and AI tools. Carrier says:

Thirdly, ForHumanity says consumers must demand responsible AI products in the same way they expect food hygiene standards. Carrier adds:

ForHumanity has a mission statement, which says: 

Carrier says of it: 

The organization is also releasing a data taxonomy to help CIOs and data leaders define data types, metrics, outcomes, pipelines and the process flow across the organization.

ForHumanity also recently worked with the UK Information Commissioner's Office to define UK GDPR, a derivative of the EU GDPR made necessary by the Conservative Party's Brexit policy. Carrier explains that UK GDPR will place greater accountability on the data protection officer than currently exists under the EU GDPR. However, ForHumanity is concerned by other government proposals, which would reduce the levels of protection that citizens in the UK receive compared to their European neighbours. He says:

To remain sustainable, ForHumanity will offer licences to its data auditing and trust models, as well as training, as its core revenue streams.

Recently an old friend shared a meme on social media featuring an image of various single-use plastic water bottles, captioned with the claim that capitalism drives innovation. The statement was not untrue, but it ignored the fact that it is well-regulated capitalism that drives innovation. Single-use plastic bottles and their proliferation are an example of a product that could and should be regulated out of production. If AI and automation technologies are to benefit business and society alike, then good regulation will become necessary - and as consumers, we should demand it. Although far from perfect, auditing leads to disclosure and transparency, which more often than not leads to good business practice.
