Artificial intelligence (AI) currently being used by insurance companies has failed to remove gender bias from the profession's claims, underwriting and marketing processes.
A CII report has identified that insurers have more to do to address gender biases.
The report reveals that the datasets used to train the algorithms that support AI systems are rooted in outdated gender concepts.
Algorithms learn by being trained on historic data, and the report notes that more and more of that data is now unstructured, coming from text, audio, video and sensors. Embedded in that data, the report warns, are decisions shaped by past biases, particularly around gender.
The report concludes that insurance firms need to prepare a structured response to this issue, starting with visible leadership on tackling gender bias in AI.
To read the report, please visit: www.cii.co.uk/82803