I attended the annual Advisen Cyber Risk Insights Conference in New York. The event this year had over 900 attendees and is frequented by insurance representatives, brokers, risk managers and lawyers, all of whom work in the cyber and cyber insurance space. It’s probably the premier conference of its kind.
I’ve been attending this conference for about five years. When I first started coming, it was all about sales, marketing and underwriting. I remember sitting at a table my first year, having lunch with a handful of 20-something underwriters who were gushing over their new insurance product called cyber insurance. They trumpeted the freedom it gave them: no standard forms, and the ability to make up policy language as they went along. When I spoke up and questioned future claims, it was as if I was speaking in a foreign language. They were completely baffled.
Fast forward five years. The theme at this year’s conference (October 25) was “Solving the Cyber Risk Equation.” The conference was devoted primarily, if not exclusively, to what to do about claims and the enormous and uncertain risk cyber poses under the policies. Things have a way of evolving along a natural course.
The good news is the attendees and speakers have become much more sophisticated about exactly what carriers are getting into when they write cyber policies and what kinds of risk they are biting off. They are also beginning to look long and hard at the future and the problems and issues that the proliferation of data through the Internet of Things and data analytics may pose for all insurance—not just cyber.
The conference kicked off with a keynote address by retired Adm. Michael Rogers, former director of the National Security Agency. Rogers talked about his view of the cyber landscape and the threats. He made the point that cyber carriers and, for that matter, society as a whole are dealing with a highly fluid and ever-changing cyber threat where past data does not correlate well with future outcomes. In short, the situation is a potential insuring nightmare. But Rogers drew another conclusion: cyber insurance is uniquely positioned to strengthen insureds’ cybersecurity by incentivizing sound practices.
This all sounded well and good, but in practice it may also carry a dark side that society may or may not find acceptable.
For example, Nicole Eagan of Darktrace, a cybersecurity analytics firm, noted in her panel presentation that many cyberattacks occur because of human missteps and errors. According to Eagan, we now have the ability, through artificial intelligence and data analytics, to take a virtual look at a company’s employees and determine what normal behavior is for each of them. Analytics can then figure out when an employee isn’t acting consistently with his or her norm, which could trigger steps to reduce cyber risk and stop improper behavior before it causes harm. That is kind of spooky. How much do we really want our employers monitoring our every step and labeling what we do as normal or abnormal?
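To make that mechanism a bit more concrete, here is a minimal sketch in Python of per-employee behavioral baselining. It is purely illustrative and uses invented data and thresholds; it is not Darktrace’s method, just the simplest version of flagging behavior that strays from an individual’s own norm.

```python
# Hypothetical illustration only: build a per-employee baseline of daily
# file-access counts and flag days that deviate sharply from that baseline.
# The data, function names and z-score threshold are all invented for this sketch.
from statistics import mean, stdev

def flag_anomalies(history, recent, z_threshold=3.0):
    """Return the recent observations that fall far outside the employee's baseline."""
    mu = mean(history)
    sigma = stdev(history) or 1.0  # avoid division by zero on a perfectly flat baseline
    return [x for x in recent if abs(x - mu) / sigma > z_threshold]

# Example: an employee who normally touches 40-60 files a day suddenly touches 500.
baseline = [52, 47, 55, 60, 49, 51, 58, 44, 53, 50]
today = [48, 500]
print(flag_anomalies(baseline, today))  # -> [500]
```

Even a toy rule like this has to ingest fine-grained records of what each employee does all day, which is exactly the monitoring question raised above.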
What happens when insurance becomes involved? Should insurance carriers who insure against cyber risk have access to this data? Should they insist on having it? That would certainly enable a carrier to put protections in place to reduce the risk of a data breach. And what if a carrier offered a premium discount to a business that allowed this kind of access? I think most businesses would jump at the chance to, in essence, sell their employees’ information in exchange for reduced premiums and better protection.
Mark Jenkins of Palo Alto Networks, a cybersecurity protection and detection firm, kicked off the next panel by raising the possibility that a carrier could use data collected from IoT devices to make real-time algorithmic decisions about coverages and premiums. Jenkins laid out a hypothetical in which data collected by your automobile and communicated to your carrier would signal whether you were driving too fast for too long, or show that you frequented areas of town where the risks of accident or theft were high. Your carrier could then use this data to adjust your premium in real time based on your behavior. Jenkins thinks this is not that far off.
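As a rough illustration of what such real-time, usage-based pricing could look like, here is a toy Python sketch. The signals and surcharge factors are my own invented assumptions, not anything Jenkins or any carrier described; real pricing models would be far richer.

```python
# Hypothetical illustration only: nudge a base premium up or down from simple
# telematics signals. All thresholds and surcharge percentages are invented.
def adjust_premium(base_monthly, avg_speed_mph, minutes_over_limit, high_risk_trips):
    factor = 1.0
    if minutes_over_limit > 60:        # sustained speeding over the billing period
        factor += 0.10
    if avg_speed_mph > 75:             # habitually fast driving
        factor += 0.05
    factor += 0.02 * high_risk_trips   # trips into high-theft or high-accident areas
    return round(base_monthly * factor, 2)

# Example: a $100 base premium, 90 minutes of speeding, three risky trips.
print(adjust_premium(100.00, 72, 90, 3))  # -> 116.0
```

Even this crude rule makes the trade-off plain: to price this way, the carrier has to know where you went and how fast you drove.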
These ideas did not go unchallenged, however. Adam Cottini of Arthur J. Gallagher, an insurance brokerage firm, pushed back and asked whether this is something we really want. Do we as a society and as consumers really want to give up our privacy and let our carriers know where we go and how fast we drive in exchange for this kind of monitoring and decision making? He thought many would not. And later in the day, Shiraz Saeed of Starr Insurance raised the same issue: “You have to have some level of anonymity in this country. Our Constitution guarantees it. We have all made mistakes in life we would just as soon not everyone know about.”
So the day quickly morphed from a cyber insurance conference into, in part, a philosophical debate over whether insurance companies should have and use analytical tools to reduce risk and premiums, or whether doing so invades basic human privacy. And it raises related questions: who owns the data, what is it worth, and who controls it? I recently interviewed Kathleen Dooley, general counsel of hu-manity.co, a company whose mission is to use blockchain to let consumers decide who can access their health data and even sell it if they so desire. Dooley made the point that personal health data is property and that we have certain inherent rights with respect to it.
I actually predicted many of these very issues in a post from 2016. In particular, I was interested in how businesses and insurance carriers could use data to reduce costs. I focused, for example, on a new electric toothbrush that would tell your dentist how often you brushed your teeth. I postulated that the same information could be communicated to your dental carrier, which could change your premium in real time. Would that change behavior? It would mine. Do I really want Delta Dental to know this? It depends, I suppose, on what the data is worth and how much I would save. Implicit in my analysis is that my tooth-brushing data does have value: I can trade it for a reduced premium.
My guess is that people and businesses will elect to buy cheaper insurance if all it costs them is some level of privacy. Indeed, we make this election every day. (How many of us trade privacy for the communication abilities Facebook offers in exchange for that free service?) We will make choices based on the implicit monetary value of our data. Given how willingly, consistently and even happily we have given up privacy rights for convenience and cost, it seems inevitable that this is where we are headed.
But this isn’t all bad. If the insurance company monitoring my tooth-brushing habits means I will brush my teeth more often, have fewer cavities and be healthier, that is a real benefit; after all, the fact that insurance pays for filling my cavities doesn’t make sitting through the drilling any less of an ordeal.
About the author: Stephen Embry is a frequent speaker, blogger and writer. He publishes TechLaw Crossroads, a blog devoted to the examination of the tension between technology, the law and the practice of law. He is co-chair of the ABA’s Legal Technology Resource Center and on the Board of Editors of the Law Practice Today webzine. He is a member of the Leadership Council of the ABA Law Practice Division. Stephen serves as Chair of the Kentucky Bar Association’s Law Practice Task Force and Webinar Chair and Steering Committee member of Defense Research Institute’s Law Practice Management section. He is Chair of the Data Breach, Privacy and Cyber Insurance Section of the Federation of Defense and Corporate Counsel (FDCC), an invitation-only premier defense trial lawyer association.