
Advisen Cyber FPN - Friday, December 21, 2018

Analysis of FTC hearing on competition and consumer protection: Part II



Advisen is pleased to have Goldberg Segalla provide our readers with a summary of these recent hearings. This is Part II of their analysis. Read Part I here.

Making a Case for a Federal Privacy Regulatory Framework

Another component of the first day of the hearings concerned institutional incentives to invest in data security. One of the more telling threads was whether, and to what extent, there is a true consumer appetite for privacy. The presumption probably is yes, but the reality may not be so clear cut. The considerations that were dissected included customer trust, reputation, ex ante compliance, ex post liability, customer demand, competitive advantage, cost reduction and cost savings, and cyber insurance. As one anecdote so aptly put it, a Chief Information Security Officer (CISO) can “spend millions on security that is compromised by a $500 piece of rented malware.” The question becomes: how do institutions prioritize cyber risk among all of the other risks they have to manage? Data security and privacy protections compete with every other organizational investment, essentially boiling data security down to a resource-allocation proposition: a cost-saving project versus a revenue-generating project. Can there ever be such a thing as over-investment in prevention? “Cyber resilience” can be defined as an entity’s ability to continue to operate effectively despite adverse cyber events (for example, a security incident or data breach). Resilience doesn’t come cheap.

The cyber insurance market continues to grow and the product continues to evolve. It was suggested during the hearings that cyber insurance may be a catalyst for improved data security. The analysis makes sense. Insurance carriers essentially mandate better cybersecurity from their insureds. Further, potential insureds have to lay bare their infrastructure and investment in data security, giving the industry a unique look at a company’s investment, protocols, and preparedness. Nevertheless, application assessments as they currently exist may not be enough. Enter the hope for virtual herd immunity, because, let’s be realistic, the approximately $5 billion in cyber insurance premium is infinitesimally small compared to the $2.1 trillion in non-life gross written premium. The coverage is out there, as are the risks. Businesses across the small, middle, and large markets should get at it. As noted during a rather robust debate, the insurance industry is not stopping cyber attacks; it is managing the risks associated with them. Although not all security incidents are accurately reflected in claims frequency, the insurance industry is positioned to collect, via reported claims, a wealth of information identifying where vulnerabilities exist, what controls are lacking, and other data points. Cyber insurance will remain one of the fastest-growing lines of insurance in an increasingly connected world.

Interestingly, the first panel on December 12 was largely focused on accurately assessing network security risk, underscoring the distinct drought in available historical information, which is in part attributable to the speed of innovation. In contrast to some of the dialogue on the first day, the panelists agreed that the insurance industry, as a whole, contributed in part to this factual famine with undisciplined underwriting, often using simplified, short-form applications with no additional diligence to understand the nature of the cyber risk it was assuming. Putting aside this informational deficiency, the panel also identified several other facets of network security that require further investigation, most notably patching and vendor control. For simplicity’s sake, let’s call this virtual blocking and tackling, the pick-and-roll, the 6-4-3 double play — that is, fundamentals.

Dissecting the practice of patching, the panel suggested that patching is underutilized, may not be possible for older or legacy systems, and is often applied haphazardly, without fully assessing how a patch may create or expose vulnerabilities in parts of the company’s system seemingly unrelated to the patch. With respect to vendors, the panel stressed that the risk of infiltration is not limited to what we would traditionally view as IT-related vendors (e.g., cloud services — also known as “someone else’s server”). Instead, the threat can come from any vendor, regardless of industry, as demonstrated by large-scale breaches dating all the way back to the 2013 Target breach, in which the threat actors gained access to Target’s network through its HVAC vendor. Wake up, small and mid-size businesses — the future of cyber risk management is calling, and it knows where you live and how vulnerable you are. Windows XP is no longer supported!

One of the more unique sessions was a one-on-one “fireside chat” on emerging threats between Joshua Corman, founder of @IAmTheCavalry, and FTC Commissioner Rebecca Kelly Slaughter. One topic stood out head and shoulders above the rest: the need to shift the focus from simply safeguarding data for privacy’s sake to ensuring that a security incident does not compromise safety. How, you ask? Cyber-physical systems, of course. The National Institute of Standards and Technology (NIST) defines them as follows: “Cyber Physical Systems (CPS) comprise interacting digital, analog, physical, and human components engineered for function through integrated physics and logic.” In other words: interconnected devices, interconnected people, interconnected programs, and interconnected machinery. For example, a data breach can have an impact on the healthcare sector beyond dissemination of private health information. The good old days, when the worst outcome was theft of identifiable information, are over thanks to the Internet of Things. Connected devices are vulnerable. Forget that your refrigerator, in the wrong hands, can be a passageway to a denial-of-service attack.

Consider medical devices. More than ever, they are part of the IoT, which makes them hackable. This means that a threat actor not only has access to data, but can control the equipment. As Mr. Corman so aptly stated, there is currently “more incentive to have a corpse with its privacy intact.” As an example, he identified real-world impacts on medical devices, explaining that hospitals routinely use bedside infusion pumps to deliver medication. The pump is connected and vulnerable, but not patchable. Let that sink in. Take it from the FDA: “[M]any medical devices contain configurable embedded computer systems that can be vulnerable to cybersecurity intrusions and exploits. As medical devices become increasingly interconnected via the Internet, hospital networks, other medical devices, and smartphones, there is an increased risk of exploitation of cybersecurity vulnerabilities, some of which could affect how a medical device operates.” In this particular space, stakeholders have worked to refocus the security discussion so that vulnerabilities in physical systems, particularly as they touch human health care, are being addressed by government and industry alike. It’s downright scary: malware that can start a fire, shut down utilities, take over an autonomous vehicle (or a fleet of vehicles), or hijack a defibrillator.

As the second day of hearings moved into the home stretch, another panel addressed the U.S. approach to data security. The panel overwhelmingly concluded that there is no cohesive “U.S. approach.” Rather, we currently are governed by a cacophony of data security rules that are not uniform and do not dovetail. The panelists deliberated whether we, as a society, need a cohesive methodology. Although there was no consensus on the correct approach to regulating data security, the panelists agreed that the FTC and other governing bodies need to provide some baseline standard for data security as a floor, and build from that toward individualized, reasonable data security models that fit within corporate constraints. There is a lot of work to do, and the partnership between the public and private sectors needs to be forged sooner rather than later.

During the analysis, one panelist suggested that we are collectively failing to address a fundamental flaw in how we view data security. He insisted, for example, that we should not be worried about the disclosure of Social Security numbers; rather, we need to change how we use them. He explained that, in the current environment, Social Security numbers have strayed from their original purpose and are now used for authentication, much as we use a password. The problem lies in this misuse: Social Security numbers were intended for identification, much as we use a username. While disclosure of a username is not typically an alarming security concern, disclosure of a password is. Regulators and industry professionals therefore need to reconsider which pieces of information we use as identifiers and which as authenticators. Yet more food for thought.

The big finish was a panel tasked with the seemingly sensitive discussion of FTC data security enforcement. Recognizing that the FTC’s powers are somewhat limited by its unfairness authority under Section 5, several panelists urged the FTC to be more transparent with the public in its consent orders and closing letters, noting that there is very little utility in these orders and letters if other data custodians cannot extrapolate why some conduct is sanctionable and some is not. The panelists suggested that by limiting the analysis in its consent orders and closing letters, the FTC was in turn limiting the impact of each to its specific facts rather than serving the broader purpose of deterrence. One could argue that this is tantamount to treating the symptom while ignoring the disease. A federal agency tasked with the level of responsibility the FTC carries in this rapidly evolving climate needs the mandate, the resources, and the teeth to support its mission.

One of the FTC’s associate directors summed up the two days of hearings in three succinct takeaways: (1) more empirical data is needed regarding which processes and practices are most effective for security and deterrence; (2) data custodians already have multiple incentives to adequately protect their networks; and (3) a one-size-fits-all solution to network security will not work. Although this neatly packaged the overall themes of the hearings, each panel conveyed its own messages, much to its credit.

Consumer demand for data security was one of the more telling topics during the hearings. In an age when a smart TV can collect, compile, and report information, learning behaviors in the process, the more-than-rhetorical question was, “How important is perceived security to consumers making purchase decisions?” The potential answers ranged from important to moderately important to unimportant. The qualitative aspect revolved around whether the individual consumer is responsible, whether product and service providers are responsible, or whether the duty to protect information is shared. Might as well throw the federal government into the mix. Whether consumers are more concerned about functionality or security is up for debate; however, the sound bite of the session was “we have made security violations of the consumers very painless.” Data security and data privacy education, protection, and enforcement should be painless as well. Let’s get together and share the burden. Everyone’s got skin in the game.

About the authors: Todd D. Kremin is a partner in Goldberg Segalla's Global Insurance Services Practice Group and Cybersecurity and Data Privacy Practice Group. He maintains a focus on claims management, insurance coverage disputes, and other corporate and individual exposure arising from a wide variety of risks, including securities disputes, cyber risks, business combinations, employment practices, and professional errors and omissions. His practice also includes reinsurance disputes, insurance regulatory matters, and defense of insurance agents and brokers and securities broker-dealers and registered representatives in connection with alleged errors and omissions. Robert F. McCarthy, a trial and appellate lawyer with over 25 years of experience, is a partner in Goldberg Segalla's Global Insurance Services Practice Group and Cybersecurity and Data Privacy Practice Group. He is a Certified Litigation Management Professional (CLMP) and a Certified Claims Professional (CCP) in Cyber Claims, and, among other industry leadership positions, is a member of DRI's Data Management and Security Committee. He maintains a particular focus on complex and high-exposure litigation involving commercial, excess and surplus, and specialty lines of insurance. Before joining Goldberg Segalla, Bob spent two decades in-house at Nationwide, first as a trial attorney, and eventually with national responsibility for large-loss, complex commercial and construction defect litigation.
