Blog 27 November 2020

Solving the data disconnect in the London market: Driving change (Part 1)

by Netcall


The London Market, like many others, is struggling to leverage its data in an efficient and cost-effective way. Multiple stakeholders and many disparate systems mean that creating frictionless processes is a real challenge. Currently, up to 40% of every premium is swallowed by acquisition costs.

We spoke to Paul Rich, London Market Consultant at MOTOSI Consulting, to get an insight into the specific challenges of managing data in the London Market, and how this could be improved to create a London Market for the future. Paul works with many different businesses in the London Market, designing solutions that help clients maximise operational efficiency while removing friction and solving data disconnects.

We asked Paul how data could be leveraged to deliver reduced operating costs and improved CX.

How do you use data to solve the expense problem for the London Market?
Paul: I believe using data in a more intelligent and structured way will give companies the insights they need to better understand and ultimately solve problems like acquisition costs and operational expenses. By understanding all of their operational data, companies can build a picture of how the business is performing, or not, of course. The data needs to be presented and interrogated, and rigorous disciplines applied. The resulting insight gives the transparency required to optimise efficiencies in all areas of the business and, across the broader market, helps to address problems such as the much-quoted 40% acquisition cost challenge. Access to actionable data is, fundamentally, the way in which costs like these can be managed and controlled.

So if data is the key, do you think companies are currently too protective with their data?
Paul: In many ways they are. Everyone understands there’s a requirement to manage data in a proper and fair way, while being respectful of the different types of data that a company has to handle in order to deliver its products and services. However, my sense is that companies have become overly sensitive when it comes to protecting the data we are talking about here. There are common datasets that could be used across various stakeholders, and it’s these datasets that benefit from being standardised. Companies can then apply their own insight, rigour, and analysis to the huge amounts of other data they have.

In terms of some of these datasets, do you have some specific examples?
Paul: Well, if you take this down to particular product lines, certain classes of insurance have a common dataset. Whether that’s upstream energy business, private motor insurance, home insurance, or any other insurance you wish to choose, they all have common and standardised datasets and therefore lend themselves to being codified and formed into basic data standards.

Is competitive advantage an issue? If the dataset is common and everybody’s using the same data, they don’t really need to be so protective.
Paul: Absolutely right, and that is the basis on which companies should view it. I think in many respects there is an element of data paralysis, as they struggle to differentiate between the different types of data and how best to apply them. Data comes into their businesses and needs to be segregated: splitting out the commercially sensitive, the privately sensitive, and the data subject to regulation and increased scrutiny and governance from the common, highly standardised sets of data, which can form the pillars of underwriting propositions, claims propositions, and associated deliverables.

So, how do we go about solving this problem of companies being too protective over their data?
Paul: Well, it’s a problem the London Market has certainly been struggling with for many, many years, particularly because of the way in which risks are syndicated, using the variety of different methods available. You’ve got multiple companies on multiple contracts with multiple stakeholders, and that in itself presents the challenge. Understanding which agreements could be standardised and which datasets could be applied and used in a common form is where the market has struggled in the past. It’s the fact that people do think, “Well, if I’ve got this data, I don’t want my competitor seeing this data,” without fully understanding the downstream effect of distributing that data. If there is data that is common, and it doesn’t give commercial advantage to anyone, then it should be shared, and that transparency should be there.

Sounds like a great idea, but how are we going to make it work in reality? Will people start to share?
Paul: I’d like to think so. The marketplace is at a point where it needs to change now; it’s not a nice-to-have anymore. As with all things, it will take a pretty seismic change in mindset, and given that the market has always struggled with how best to adopt standards and manage the huge amounts of data it ingests, I can’t see a straightforward or easy route to market-wide adoption.

My thoughts are that if key market stakeholders are motivated by the key principles and the potential positive outcomes of such a move, then there’s a good chance it will work. It will, however, be interesting to see how this plays out under the roadmap of the Lloyd’s Blueprint work, as this must be a market-wide initiative engaging all parts of the market. The mindset needs to be right, and the attitude of the people in the marketplace needs to be focused on change.

A digital-first approach is critical to the future of the London Market. Businesses must be able to keep pace with changing times. The market must look to systems that optimise processes and unlock value. This includes using technology, such as low-code, that can be easily integrated with existing systems, so data utilisation can be implemented in a more innovative way.
