AN EYE ON AI

8 November 2019


Duncan Minty provides a window into the FCA's thinking on artificial intelligence

Around five years ago, the UK Financial Conduct Authority (FCA) realised that it needed to really get to grips with big data and artificial intelligence (AI). It had some data scientists working in what was to become the Behavioural Economics and Data Science Unit (BDU). Their work had produced some eye-catching results, described as "the regulatory equivalent of the leap from black and white to glorious technicolour". But how was the FCA to scale this up?

The FCA's response was clever. It began building a partnership with the Alan Turing Institute (ATI), the UK's national institute for data science and artificial intelligence. This provided the FCA with access to top data scientists from across 13 UK universities. And the FCA then used that partnership to attract top talent to its growing BDU team, with the best data scientist recruits being given a Visiting Fellow position at the ATI.

The FCA then used the BDU's new expertise to explore a wide range of financial conduct issues. Two examples were models predicting the probability and likely location of advisers mis-selling financial products, and analysis of the pricing of personal lines insurance products.
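To make the first of those examples more concrete, here is a minimal, purely illustrative sketch of how a probability-of-mis-selling model might be built. The feature names, synthetic data and choice of a scikit-learn logistic regression are all assumptions made for illustration; the article does not describe the FCA's actual data or methodology.

```python
# Illustrative sketch only: a simple probability-of-mis-selling classifier.
# Feature names and data are hypothetical; the FCA's actual models are not public.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_advisers = 5_000

# Hypothetical adviser-level features: complaint rate, product switch rate,
# average commission (in £000s) and years of experience.
X = np.column_stack([
    rng.beta(2, 50, n_advisers),        # complaint_rate
    rng.beta(2, 20, n_advisers),        # switch_rate
    rng.gamma(2.0, 0.5, n_advisers),    # avg_commission_k
    rng.integers(1, 30, n_advisers),    # years_experience
])

# Synthetic label: a mis-selling flag loosely driven by the first two features.
logits = 40 * X[:, 0] + 10 * X[:, 1] - 3.0
y = rng.random(n_advisers) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted probability of mis-selling for each adviser in the hold-out set.
p_mis_sell = model.predict_proba(X_test)[:, 1]
print("Hold-out AUC:", round(roc_auc_score(y_test, p_mis_sell), 3))
```

In practice, a model like this would only be a starting point; the governance, explainability and outcomes evidence discussed later in this article matter at least as much as the model itself.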

This data science capability surprises many insurance professionals, who are used to a more prosaic form of engagement with the regulator, one that usually shows little sign of algorithmic expertise. Yet should that be surprising? Policyholders have no contact with insurers' data scientists either, so why should the regulator's expertise be handled any differently?

What many insurers did experience, though, were calls for big datasets, particularly in relation to the personal lines pricing review. The regulator's new algorithmic models needed data -- and this was feeding time.

The FCA/ATI partnership has clearly proved fruitful. In July 2019, the two organisations went public about it at a conference on the policy and scientific implications of 'AI Ethics in Financial Services'. A joint programme of work was announced but, perhaps of equal importance, a signal was sent to the market that the regulator was taking data and ethics seriously and that boards should do so too. As the FCA's Christopher Woolard put it: "If firms are deploying AI and machine learning, they need to ensure they have a solid understanding of the technology and the governance around it. This is true of any new product or service but will be especially pertinent when considering ethical questions around data. We want to see boards asking themselves, 'What is the worst thing that can go wrong?' and providing mitigations against those risks."


MARKET IMPACT

So, what can insurance markets expect from all this? Certainly, a focus on the transparency and explainability of the artificial intelligence tools being used by firms. Guidance on this will be published in 2020 and firms should expect to see it presented firmly within the context of the Senior Managers & Certification Regime.

The guidance is expected to cover both corporate and social accountability, as well as explainability to boards, customers and the significant stakeholders in between. On paper, this is all pretty straightforward. The challenge for senior management function holders is to put it into practice and deliver the outcomes.

Insurance professionals have sometimes complained that the FCA is big on requirements but short on advice on how to deliver them. AI ethics is not going to be different. However, a window into the FCA's thinking could come from the academic papers on AI accountability and explainability written by ATI researchers such as Luciano Floridi, Sandra Wachter, Chris Russell and Brent Mittelstadt. These papers are detailed and influential.
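By way of illustration only, the sketch below shows one generic way a firm might begin to make a model's behaviour explainable to a board or a customer: ranking which inputs drive its predictions. The pricing example, feature names and the choice of scikit-learn's permutation_importance are assumptions for illustration; neither the article nor the forthcoming guidance prescribes any particular technique.

```python
# Illustrative sketch only: ranking which inputs drive a pricing model's output
# using permutation importance. Feature names and data are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
n_policies = 2_000
feature_names = ["driver_age", "vehicle_value", "annual_mileage", "postcode_risk"]

X = np.column_stack([
    rng.integers(18, 80, n_policies),        # driver_age
    rng.gamma(2.0, 8_000.0, n_policies),     # vehicle_value
    rng.normal(9_000, 3_000, n_policies),    # annual_mileage
    rng.random(n_policies),                  # postcode_risk score
])
# Synthetic premium: driven mainly by vehicle value and postcode risk.
premium = 200 + 0.02 * X[:, 1] + 300 * X[:, 3] + rng.normal(0, 20, n_policies)

model = GradientBoostingRegressor(random_state=0).fit(X, premium)

# Permutation importance: how much does shuffling each feature degrade the model?
result = permutation_importance(model, X, premium, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean),
                          key=lambda item: -item[1]):
    print(f"{name:>15}: {score:.3f}")
```

A simple ranking like this is not a complete answer to explainability, but it is the kind of evidence a board could ask for when testing how well it understands the models the firm is deploying.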

One further point emphasised by the FCA concerned how artificial intelligence might be used (inadvertently or otherwise) to facilitate anti-competitive behaviour by firms. So, while the FCA will firmly support collaboration at the market level to address, say, digital anti-money laundering projects, it will expect the data architecture of such projects to protect against anti-competitive behaviour. The ATI's expertise will inform that FCA scrutiny.

We know that AI is going to be big in insurance. We should expect it to be big in the regulation of insurance too.


CONTROLLING DIGITAL RISK

Four things insurance firms should be doing:

  • Have a clear and effective governance structure for your digital projects, products and services;
  • Know the ethical risks that arise from how the firm is using data and analytics;
  • Control for those risks through a mix of existing and new policies and procedures;
  • Have outcomes evidence to show how well those controls are working.

Duncan Minty is an ethics consultant at Duncan Minty consultancy
