Crippling staff shortages and demands for greater diversity have prompted AI leaders to urge arts and humanities students to join the technological elite developing Artificial Intelligence systems.
In the latest PassW0rd radio programme from Future Intelligence, ‘Entering the AI World’, some of the world’s leading AI voices have called for students from across all disciplines to become involved in future AI development to improve the technology.
The calls reflect research carried out in 2014 for Oxford University by Dr Mike Osborne, Associate Professor of Machine Learning, which found that the ideal combination for a future-proof job in 2024 would be mathematical skill combined with artistic flair – in short, that AI needs artists, and preferably artists who combine science with their art.
To hear this edition of PassW0rd click: Entering the AI World
The Oxford University research, featured in Future Intelligence at the time, said that driverless cars would replace taxi drivers. In an eerie echo of Dr Osborne’s findings, last week Imperium Drive announced the launch of ‘Fetch’, a remotely controlled driverless car service that delivers self-drive vehicles to human customers and, once they have finished with them, driverlessly returns them or takes them on to the next customer.
According to Janet Adams, Chief Operating Officer of SingularityNET, the cryptocurrency and AI brainchild of Artificial General Intelligence guru Ben Goertzel, AI now needs people from other disciplines to develop properly.
“We need people from all walks of life, from all communities, from all races, backgrounds, gender identities, sexual preferences, religions and ages. We need our AIs because they are potentially the most powerful technology ever invented. They need to reflect the whole beauty and the joy and the glory and the full diversity of humankind.
Diversity improves AI
“So I think it’s really important for everybody to find some way of getting involved, and that we grow our AIs in such a way that they are inclusive enough to benefit everyone, not just the few,” said Adams.
Adams was echoing a call made by Dr Stephen Cave, Director of Cambridge University’s Leverhulme Centre for the Future of Intelligence, at the launch of the House of Lords report into AI development, ‘AI in the UK: ready, willing and able?’, an event attended by PassW0rd, at which Dr Cave said that AI would fail unless all sections of the community were involved in its rollout.
A perhaps slightly evangelical view of technology, but one reflected by Professor Jim Hendler, author of ‘Social Machines: The Coming Collision of Artificial Intelligence, Social Networks, and Humanity’ and one of the US’s leading data scientists. Professor Hendler is the Director of the Institute for Data Exploration and Applications and the Tetherless World Professor of Computer, Web and Cognitive Sciences at Rensselaer Polytechnic Institute, the acting director of the RPI-IBM Artificial Intelligence Research Collaboration, and a member of the Board of the UK’s charitable Web Science Trust. He is also a former Chief Scientist of the Information Systems Office at the US Defense Advanced Research Projects Agency.

“The most important thing you can do is educate your workforce: make sure that the people who are going to be deploying this technology understand what its boundaries are, so that they can decide when to deploy it and whether they want to or not. They also need to understand it so that when questions start coming in, when things start happening that don’t make sense, they know what questions to ask and who to ask them of,” said Professor Hendler, who added that the power of AI now meant that a basic understanding of how it worked and how to use the technology should be taught in schools.
“There’s some legislation that asks the National Science Foundation to actually look at the K-12 education space. But what I do think is a mistake in that space is that too much of it focuses on the programming. We say we want to make our students AI literate in the sense of being able to program it. Well, first of all, I don’t know why you would want everyone to do that, but secondly, that’s not the best way to understand what the system does,” said Professor Hendler, speaking to PassW0rd in his role as a member of the US Association for Computing Machinery’s Technology Policy Committee.
Professor Hendler is not alone in arguing that entry to the advanced computing world should not demand computing skills. Sheffield University’s Professor of Search and Analytics, Paul Clough, who also heads the business consultancy Peak Indicators, underlined that the industry is now moving to a ‘skills-lite’ model.
“I think there will be some people at the kind of cutting edge who are driving the development of the AI technologies. That tends to be limited to some big tech companies, together with certain individuals who have maybe gone to various universities and so on. But I think the way it’s going is that a lot of AI technology is becoming democratised, in the sense that you and I could access tools where we don’t have to write any code. Yet we can use those tools, for example, for machine learning, to build a predictive analytics model or tool and so on.
“I think there will be different types of people. Some will be writing the code and developing new algorithms. Others, and I’d probably include myself in this, will be users of the technology, applying it to various applications. So I think it’s going to vary. To be honest, I don’t think you’ll lose out if you’re not a coder.”
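Professor Clough’s point about democratisation is easy to see in practice: even where some code is still involved, a working predictive model can now be a handful of lines. A minimal sketch, using entirely made-up numbers purely for illustration:

```python
import numpy as np

# Hypothetical data: advertising spend (x) against sales (y).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# One line fits the model: least-squares straight line y = a*x + b.
a, b = np.polyfit(x, y, 1)

def predict(spend):
    """Predict sales for a given spend using the fitted line."""
    return a * spend + b

print(predict(6.0))  # ≈ 12.0
```

No-code and low-code platforms wrap exactly this kind of fit-then-predict step behind a point-and-click interface, which is why Clough suggests non-coders will not lose out.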
AI needs artists
Indeed, both Professor Clough and SingularityNET’s Adams point to the fact that art, as the Oxford University research noted, can teach AI new things.
“I think machines typically are not particularly creative,” said Professor Clough. “Now, there are examples of AI, particularly the new kind of deep learning methods that are coming along, trained on huge amounts of incredibly complex data, which are showing elements of intelligence through creative writing, or through creating art on their own, completely unsupervised. They can be left to do it on their own because they’ve been trained on enough examples,” Clough added. It is a creativity that Google demonstrated by feeding artworks into a neural network architecture, in a process it called ‘Inceptionism’, overseen by eleven Google engineers and artists. The networks learned from the pictures the rules and associations on which they were based, and the resulting images were sold at an exhibition in San Francisco in 2016.
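The mechanism behind ‘Inceptionism’ can be sketched in miniature (this is an illustrative toy, not Google’s code): run an image through a pattern detector the system has learned, then nudge the pixels by gradient ascent so the detector fires more strongly, gradually amplifying whatever the model has learned to see. Here a single hand-written edge filter stands in for the whole network:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.1, size=(16, 16))   # start from near-noise
k = np.array([[1.0, 0.0, -1.0]] * 3)        # toy vertical-edge detector

def response(img, k):
    """Valid cross-correlation of the image with the filter."""
    h, w = k.shape
    out = np.zeros((img.shape[0] - h + 1, img.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+h, j:j+w] * k)
    return out

def score(img):
    """How strongly the filter fires over the whole image."""
    return np.sum(response(img, k) ** 2)

def grad(img):
    """Gradient of the score w.r.t. each pixel (scatter 2*r through k)."""
    r = response(img, k)
    g = np.zeros_like(img)
    h, w = k.shape
    for i in range(r.shape[0]):
        for j in range(r.shape[1]):
            g[i:i+h, j:j+w] += 2.0 * r[i, j] * k
    return g

before = score(img)
for _ in range(20):
    img += 0.01 * grad(img)   # gradient ascent: amplify the pattern
after = score(img)
# after > before: the image now contains more of what the filter "looks for"
```

DeepDream-style systems do the same thing with deep network layers instead of one filter, which is why the outputs echo the imagery the network was trained on.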
As Adams said: “You don’t just have to be scientific. You can also be artistic, and love how the possibilities of artificial intelligence and science and technology can enhance art and help to create new forms of art.”
No Code and Low Code
The recognition by the AI elite that the field needs artists also reflects an employment position now being laid out to businesses desperate to fill IT roles in both AI and cyber security following the change to working practices caused by the pandemic – one that, according to Zandra Moore of the Leeds-based company Panintelligence, means people with little coding ability should be encouraged to get involved in the technology industry.
“I’ve seen many people come up with ideas to help address the digital skills shortage and how to make tech careers more accessible and the industry more diverse for people of all backgrounds. One solution I strongly believe could help tackle these issues is No Code and Low Code technology. These platforms are set to change the way we digitally innovate and will help to radically reduce barriers to entry into technology careers, as they don’t require users to have advanced technical skills to use them. This provides an opportunity to rapidly accelerate the diversity of people who can enter the STEM community.”
It is a possible solution to alarming research from the Learning and Work Institute (LWI), which recently revealed that, far from being at the cutting edge of technology as is so often claimed, the younger generations that have grown up in the digital age with the internet and mobile phones are entering the workforce without the necessary digital skills. The LWI research showed that over 70% of young people expect employers to invest in teaching advanced digital skills on the job – yet many employers are unable to fund such training.
It is a worrying problem for the tech industry at a time when demand for digital skills is soaring as technology capabilities continuously evolve and, as we have heard from those developing the AI world, the systems themselves are growing ever more complex.
So perhaps the saving of a digitally lacking humanity might be an AI that teaches it rather than learning from it.