Artificial intelligence was high on the agenda at the recent political party conferences, with the CII hosting fringe events at both the Labour and Conservative gatherings. Lawrence Finkle was there…
Chancellor Philip Hammond used his address to this year’s Conservative Party Conference to set out a vision for the future of the UK economy. At the heart of that vision was the fourth industrial revolution.
“Technological change is transforming not only our economy, but our society and our politics at a rate that none of us have seen in our lifetimes,” said Mr Hammond.
This was the backdrop to the CII’s participation at the party conferences this year. Together with the Chartered Body Alliance – a joint initiative with two other major professional bodies: the Chartered Institute for Securities & Investment and the Chartered Banker Institute – the CII hosted timely fringe events at both the Labour and Conservative conferences on the ethics of artificial intelligence in financial services.
CII chief executive Sian Fisher spoke on the panel at the Labour Conference hosted by Phil Brown, head of policy at LV=, and alongside shadow minister for industrial strategy Chi Onwurah MP, member of the Treasury Select Committee Wes Streeting MP, and Adrian Weller, the AI programme director at the Alan Turing Institute, the UK’s national institute for data science and artificial intelligence.
Ms Fisher welcomed the new possibilities afforded to consumers by the adoption of new technologies by the financial services sector, but said that financial services firms had to ensure they were “not just getting carried away and really excited about building the tools, but actually taking responsibility for the outcomes”.
Ms Onwurah was keen to emphasise the benefits of technology advances for customers but warned the increasing complexity of the systems was bringing new challenges for the sector.
“There is huge potential in reaching out to the financially excluded, but that will not be achieved without applying appropriate transparency and regulation,” she said.
“The opportunity lies with the ability of algorithms to personalise and customise the experience of everyone, but the challenge is that algorithms also automate and industrialise bias.”
And Ms Onwurah called on the government to ditch what she referred to as a “light touch” approach to regulation in this space, instead urging tech companies to prioritise transparency by providing regulators with access to their algorithms to ensure that firms were responsible for the long-term impacts of their work.
“Just as open-source software has not only increased availability and driven down costs, but has also been improved and debugged through continuous testing, there is an argument for at least sharing the principles behind these algorithms – so that we can be sure they are working to help and empower people in accessing financial services, not imposing decision making upon them.”
Dr Adrian Weller highlighted how the increasing use of specialised algorithms had already delivered a series of benefits for consumers. He said: “[They have brought] great improvements in fraud detection, trade processing, compliance and, where appropriate, personalised recommendations of products – and for insurance, personalised pricing depending on individual risks.”
Dr Weller agreed that companies should be held accountable for their use of AI and machine learning, adding: “If an algorithm is doing something, we need to make sure that whoever is behind it is held accountable if the algorithm goes wrong.”
REPLACING TRADITIONAL SYSTEMS
The panel event at the Conservative conference was chaired by the former treasury minister and current chair of the Office of Tax Simplification Angela Knight. She was joined by co-chair of the cross-party Parliamentary Commission on Technology Ethics Lee Rowley MP, head of HSBC Digital Bank UK Raman Bhatia, CEO of the Chartered Institute for Securities & Investment Simon Culhane, and CEO of the Chartered Banker Institute Simon Thompson, as well as Dr Weller.
AI systems are rapidly replacing traditional systems, Ms Knight remarked at the Conservative conference, with the main difference being that artificial intelligence learns from its customer base in a way that traditional systems do not.
The panel again agreed that this change had been largely beneficial. However, as technology advances, concerns over the ethical adoption of AI and those who use it are growing.
For Mr Culhane, AI has created an imbalance in society: “We have seen a great rise in inequality, a huge power shift to a few firms and individuals – the rise of the Facebooks and the Amazons.”
What these firms are able to achieve through the application of AI has left consumers wary – especially when it comes to data gathering.
“Customer consent in a world where AI is using our data is even more critical,” said Mr Bhatia.
General Data Protection Regulation (GDPR) laws aimed at controlling how data is handled have been welcomed by many as a positive step towards greater transparency. However, Mr Rowley was unsure whether this was the best way forward, explaining: “There is an underlying question as to where we are heading with privacy in society and we are currently at a crossroads. It is either going to go down the GDPR road where everyone owns their own data – about which I’m a bit sceptical – or where people start pulling back in certain areas about the kind of data they share.”
To address those concerns, Mr Bhatia suggested that companies should concentrate on how to demystify complex AI models: “If banks are using AI for the risk assessment of a customer, they have to be able to explain how a decision was made. Can they always do that? No they can’t, so that is a key concern.”
This issue is particularly prominent in the insurance sector, as noted by Mr Rowley, who added: “Insurance is built on the idea of pooled risk, but when you understand exactly what the profile is of the person coming to ask you for insurance, that does pose some fundamental questions.
“There are lots of opportunities but also a lot of ethical questions to answer in terms of who you insure, how you insure, and the premium you insure them with.”
Mr Thompson said the focus needs to be on openness: “There should not be black boxes working away unseen in the background, there should be glass boxes; transparent technology with clear lines of accountability.
“Whether that is us as professionals, as customers, or as policymakers putting the frameworks around it – we need to be sure we can understand, monitor and explain what that technology is doing and how it actually works, how it reaches those decisions and whether those decisions are actually in line with expectations when there are unexpected consequences.”
He reiterated that it was essential for companies, regulators and politicians to have a basic understanding of how the technology works.
“When we lack curiosity, fail to understand what’s underneath technology and just expect it to ‘do its thing’ and give us the results we want, perhaps we shouldn’t be surprised when it doesn’t meet our expectations,” said Mr Thompson.