All interesting problems are Goldilocks problems. You want the porridge not to be too hot, nor to be too cold, but to be just the right temperature.
Popular media does not deal in balance. The way the press would deal with the porridge problem is to tell the story of a child who suffered horrific burns to their tongue after consuming boiling porridge. Headlined ‘Mouth Burn Porridge Hell’, the story would run to five paragraphs detailing the gruesome injuries caused by porridge that was too hot. The last paragraph, included to provide ‘balance’, would be a quote from a spokesperson for the Right to Hot Porridge Association: “The RHPA expresses its deep sympathy to the child in question but would point out that 99.87% of hot porridge is consumed with no harmful effects”.
On Saturday the Guardian published an article headlined, “Tech firms say laws to protect us from bad AI will limit ‘innovation’. Well, good.”
Regulation in any sector is a Goldilocks problem.
Too little regulation and bad actors will do bad things that can be hugely detrimental to society. The 2008 financial crisis, the collapse of Enron and the Fukushima nuclear disaster can all be directly attributed to too little regulation.
Some examples of poor regulation
Too much regulation stops progress being made. For centuries charging interest on loans was prohibited. Lending money is important for societal progress because it enables people without personal wealth to start and expand businesses, to access and benefit from education and to own and trade assets. Preventing this stifles economic growth and prevents the creation of jobs.
Up to 1940, 26 states in the US had regulations that prevented married women from being employed in state government jobs. This blocked both the progress of the women denied those jobs and that of the institutions themselves.
It is difficult to remember that, up until 1994, most shops in the UK were prohibited from trading on a Sunday. It is increasingly hard to find anyone in the UK who now believes that Sunday trading regulation was a good thing.
Regulation is a particularly thorny issue in tech. The vast majority of legislators simply do not understand it. Technology moves very fast. In the past 12 months AI technology has moved very fast indeed. Legislation moves painfully slowly. With legislation virtually always lagging technological innovation the temptation can be to over-regulate in the short-term.
The EU AI Act and an 18th Century Clergyman
Nearly 300 years ago the Reverend Thomas Bayes put forward the theorem on conditional probability that would bear his name. The British mathematician Harold Jeffreys described Bayes’ Theorem as “to the theory of probability what Pythagoras’s theorem is to geometry”.
Why am I mentioning this? The new EU AI Act explicitly seeks to regulate the use of Bayes’ Theorem, a centuries-old piece of mathematics which remains critical to probability theory and statistics. That is an enormously broad definition of AI, and one with far-reaching, unknowable consequences. This feels like an example of legislative over-reach based on a lack of understanding rather than a proportionate response to a new technology.
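To see how ordinary the mathematics in question is, here is Bayes’ Theorem at work in a few lines of Python. The scenario and all of the numbers are hypothetical, chosen only to illustrate the calculation: a diagnostic test for a rare condition.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), computed via Bayes' Theorem:
    P(A|B) = P(B|A) * P(A) / P(B)."""
    # Total probability of a positive test (law of total probability)
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Hypothetical numbers: 1% prevalence, 95% sensitivity, 5% false-positive rate
print(round(posterior(0.01, 0.95, 0.05), 3))  # → 0.161
```

Even with a positive result from a fairly accurate test, the probability of actually having the condition is only about 16%, because the condition is rare. This is the kind of everyday statistical reasoning, used in medicine, insurance and spam filtering alike, that an overly broad definition of AI would sweep into scope.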
Striving for balance
Regulate too little and people and society are harmed. The Guardian article linked to above references the very real problems of algorithms churning out harmful content to children and AI systems that perpetuate and accentuate discrimination against minority groups online. We want regulation that prevents these abuses. Clearly tech companies are not going to do this themselves or they would have done so already. Preventing abuse of technology costs money. For-profit organisations very rarely volunteer to spend additional money on non-revenue-generating activities.
We want to regulate to protect children and to combat extremism and prejudice. We don’t want to legislate in such a way that would have prevented DeepMind solving protein folding or would stop cancer diagnoses being made more accurately and more quickly. People can suffer and die as a result of over-regulation as well as under-regulation.
At AutogenAI we care passionately about using technology for good. For example, my colleague, James Huckle, has written about how we build our Language Engines to reflect the modern world as we would want it to be.
Can the AI help us regulate AI?
I asked AutogenAI’s general Language Engine “How could government regulate the tech industry better?” Here are 10 ideas that the Artificial Intelligence came up with:
1) Work with industry leaders to create clear rules and regulations regarding the use of AI.
2) Create incentives for companies to develop new technologies that promote the general good.
3) Invest in research and development to keep pace with the rapidly changing landscape of the tech industry.
4) Develop universal standards and best practices in collaboration with think tanks, universities and business.
5) Increase funding for enforcement and investigations into potential violations of laws and regulations.
6) Create a dedicated task force or agency to oversee the tech industry.
7) Establish clear penalties for companies that violate laws or regulations, including financial penalties and revoking of licenses.
8) Work with other countries to develop coordinated approaches to regulating the tech industry.
9) Hold public hearings and consultations on proposed changes to laws and regulations.
10) Review existing laws and regulations regularly to ensure they are still effective and relevant.
AI is a tool. It is a tool used by humans. Like most tools it can be used for good or ill. We need regulation that prevents AI from being used for harm without getting in the way of its being used for good. Regulation is not easy. At AutogenAI we want to actively support informed, proportionate regulation of these new technologies to make sure that they genuinely work for the common good.