Solving the data disconnect in the London Market: Building a better future (Part 2)
We recently published a blog looking at the data challenges present in the London Market. We discussed these with Paul Rich, London Market Consultant at MOTOSI Consulting, who gave us an insight into the specific challenges of managing data and how these could be addressed to create a London Market fit for the future.
In this blog we continue our discussion, looking at how the London Market can change its approach to managing data, the perils of inactivity, and examples of organisations leading the way.
So, last time we spoke we discussed the change in mindset that needs to happen in the London Market. But what challenges does this bring? What motivations will we need to provide to give people the incentive to open up their data, do you think?
Paul: Well, you can if you can demonstrate the value with real information. If you can show that a particular problem could be addressed by having standardised data, and by applying common data standards across the whole insurance distribution value chain, the value becomes easy to see. Consider the whole process from the customer through to the (re)insurance end of the chain: standardisation has the potential to remove huge amounts of friction. And by taking out friction, you're tackling the biggest challenge the London Market faces at the moment, which is reducing its disproportionately large expense base. The market simply cannot afford to trade and operate at current expense levels. So data could provide the motivation, if you express that in a meaningful and commercial way.
If not, then the market's continued relevance as a global insurance hub will be challenged. It's being challenged now: the global insurance marketplace has opened up to a point where London isn't necessarily the only, or even the first, choice. There are many other centres around the world, so by applying data standards, establishing common agreements on what is required, and expressing the commercial value in doing so, there's hope for change.
So, it’s a case of surviving and staying relevant in the world market?
Paul: Completely. And if nothing were to happen, the potential outcomes could be very damaging. My personal fear is that the London Market would slip further down the global pecking order, facing a very challenging uphill battle to demonstrate its ongoing relevance as a global insurance marketplace, especially as other centres around the world adopt these more centralised models. International buyers of UK-based insurance would have an increasing menu of choices, which could be very damaging for the London Market in particular.
Who needs to take the lead? Does the London Market itself need to set an example here, to the organisations that operate within it?
Paul: I think so. It will take some strong leadership. There are a variety of stakeholders in the marketplace that would have a vested interest in supporting these data standards, whether that's associations such as the IUA or the LMA. The brokers are obviously a very important part of the puzzle, so you would need the support of associations like LIIBA. Another very important potential stakeholder in all of this could be ACORD. ACORD has the structural integrity globally to create the platform and the base upon which these standards could be delivered, administered and governed, and it would, to my mind, be the natural place for such data structures to sit.
Looking forward, are you seeing any evidence of things changing yet?
Paul: We’re certainly seeing lots of change, right across the industry. Companies that are switched on to this idea of standardising their own internal datasets, then enriching them with externally sourced data, are recognising the value of the aggregated output in supporting their business delivery requirements. I don’t, however, think this approach is market-wide, and I for one would like to see far more adoption along these lines. You could look at certain pockets of the London Market, certain ecosystems in their own right, whether that’s claims, the delegated authority space or the open market syndicated model, apply the disciplines required, and then bring all of those together to create a platform or centre of commonality.
Are there any organisations who are leading the way on this?
Paul: In terms of a part of the market, you could point to somewhere like the Marine market. It has traditionally been weighed down by the sheer volume of data that’s available to it and that needs to be processed for it to function properly. It is, however, making good progress, as it is starting to really understand the value of the data at its disposal. For example, tracking the real-time geolocation of vessels at sea, and the various perils they may encounter in the course of a transit, means insurers need access to real-time, qualitative data to better manage the associated risks. There are certainly examples I’ve seen where they are able to get much better insight into the risks being posed to their particular portfolios, and therefore manage those risks in a more sustained and predictive way.
A lot of the companies you’re starting to see aren’t just digesting data and using it in a one-dimensional fashion; they’re trying to model it predictively, meaning they can forecast, or at least model, potential outcomes far more accurately and dynamically. It’s like turning on the headlights when you drive through a tunnel. You’ll get through the tunnel without them, but with the lights on you can see the potential pitfalls and hazards more clearly.
What does the future for the London Market look like, if we can make the changes that you’re suggesting?
Paul: The changes that are needed would create the modern London Market that people like John Neal and others in the marketplace are looking to deliver through the Future at Lloyd’s. Data fundamentally underpins pretty much all of the work being done, and if it can be done, it has the ability to transform the way business is conducted. You will see people re-assessing and, hopefully, re-engaging with the market, which should really start to yield results and push it back up the pecking order.
My hope is that the Lloyd’s Blueprint documents are used to paint a picture of what is possible and to demonstrate the outcomes achievable through adoption of best practices around data use and governance. Because with all of the technology and improving processes available today, data really is the fuel to facilitate that change.
If they can start to visualise it to the market, then I think that adoption levels will increase and the more adoption you get, the more interest there is in it and it becomes a self-fulfilling process.
Change doesn’t need to be slow and painful. Low-code is one technology in the ‘intelligent automation’ space that’s pioneering a fresh approach to digital transformation. Recognising that big transformation projects can be expensive and slow, low-code lets you make change happen quickly, improving one process at a time.
Read part 1 of this interview. Discover how low-code enables seamless data utilisation to reduce cost and improve efficiency.
Learn about low-code and why it’s the easiest way to develop business applications, fast.