This essay was written by Dr Christopher Daley, London School of Economics and Political Science, and forms part of a publication entitled ‘Perspectives on the role of London’s higher education sector in global AI leadership: A collection of essays’, which is being published on the London Higher website one part at a time.
We are regularly reminded of the transformative impact of AI on our social and working lives, but the revolutionary potential of AI can sometimes lead us towards a form of technological determinism, in which AI is the seismic change and humans are simply swept along by its unceasing progress. As John Tasioulas argues, we need to keep in mind that “AI is not a matter of destiny, but instead involves successive waves of highly consequential human choices”. This is where SHAPE (Social Sciences, Humanities and Arts for the People and Economy) disciplines have a crucial role to play: not only in studying how AI is changing society, but also as active participants in directing and regulating that change through areas such as ethics, governance, human-centred design, creativity and responsible AI usage. This piece outlines some of the areas where SHAPE disciplines have taken a leading role in AI advancement, whilst also highlighting how London’s AI ecosystem has provided a fertile environment for SHAPE-led initiatives to thrive.
Since 2020, the British Academy and UCL Public Policy have collaborated on a project focused on AI and the future of work, producing (amongst other outputs) a series of briefing papers aimed at advising policymakers and business leaders on the ways in which AI is shaping the workplace and, in particular, employee attitudes towards the technology. Through interactions between social scientists, parliamentarians, trade associations and businesses, the project has demonstrated how social science-led initiatives can systematically bring together a variety of societal perspectives to deliver actionable insights on policy and processes.
Similarly, JournalismAI, a collaboration between the London School of Economics and Political Science (LSE) and the Google News Initiative, has worked with news organisations globally to inform and train journalists about responsible AI use. The project is an important example of how academic and professional expertise in media and communications can be utilised to interrogate AI, with the knowledge gained then applied to advancing journalistic practice.
The UK has been keen to position itself as a leader in AI safety, hosting an AI Safety Summit at Bletchley Park in November 2023 and recently establishing the AI Safety Institute within the Department for Science, Innovation and Technology. While technical solutions are an important component of building safe and responsible AI systems, there will also be a need to develop governance, legal frameworks and equitable practices. This is most clearly reflected in the government’s 2022 AI Action Plan, where two of its three pillars imply a significant role for SHAPE disciplines: pillar two focuses on delivering the benefits of AI equally across the country, and pillar three is dedicated to effective AI governance. Delivering these priorities will require a detailed understanding of regional inequalities, social attitudes and the legal and ethical implications, all areas encompassed by SHAPE disciplines.
The AHRC’s BRAID programme is a £15.9 million investment designed to integrate arts and humanities research within the responsible AI landscape. The programme has funded a range of collaborative fellowships which embed arts and humanities perspectives within public and private sector institutions. A notable example is Dr Oonagh Murphy’s work at Goldsmiths, University of London. Dr Murphy’s BRAID fellowship is delivered in collaboration with Arts Council England and explores the impact of AI on artistic practice, encouraging the creative sector to experiment with new work and business models through responsible AI use, whilst also examining how the intellectual property of artists can be protected in an increasingly AI-dominated world.
The potentially revolutionary uses of AI in the healthcare sector will also require SHAPE expertise. King’s College London was recently announced as the host of one of the EPSRC’s Digital Health Hubs, undertaking research into the use of digital health technologies such as tracking apps and wearable devices. The EPSRC hubs will also examine how emerging technologies can tackle health inequalities and explore the steps needed to prevent digital exclusion. The hubs will therefore require considerable input from disciplines such as health economics, behavioural psychology and public policy to ensure emerging technologies have positive impacts across society.
The Greater London Authority (GLA) calculates that the creative industries accounted for 795,500 jobs in London in 2021, approximately one in seven of all jobs in the capital. Recognising the vibrancy of the creative industries in London and the South East, the AHRC recently funded the CoSTAR National Lab, a £51 million investment in creative technologies led by Royal Holloway, University of London. The lab will be based at Pinewood Studios in Buckinghamshire and will include a state-of-the-art creative AI compute facility, with the aim of putting the UK at the leading edge of applied research in creative technologies.
Long-term STEM, SHAPE and industry collaboration
London is the UK’s hotspot for AI activity. Datacity states that 45% of all UK AI companies are based in London, amounting to a total turnover of £17 billion. The capital is also home to a range of AI-focused research infrastructure, including the Alan Turing Institute, the Ada Lovelace Institute and a strong collection of research centres and institutes at London universities.
To guarantee both the long-term sustainability and coherence of this ecosystem, there is a need for continual mapping and strategic partnership development across sectors. For London universities this will mean more cross-institution collaboration, such as that between the Data Science Institutes at Imperial and LSE, which allows SHAPE expertise to naturally complement strengths in STEM. Additionally, London universities offer fantastic collaborative opportunities to the capital’s commercial AI sector, whether through R&D partnerships or through access to a diverse pool of skilled graduates. Both academics and graduates from SHAPE disciplines are a crucial part of this cross-sectoral AI ecosystem, providing the critical and social thinking required to ensure that transformative AI technologies have safe and human-centred applications.