Friday, March 26, 2021

Making Software Companies Liable

It is long past time to start suing software companies for the security holes in their products. There is precedent. The automobile was invented in the 1880s, but the court system reacted to the new technology by essentially asserting that the driver assumed all responsibility for safety, the auto manufacturer virtually none. This attitude towards safety and security persisted until 1965, when Ralph Nader published "Unsafe at Any Speed", exposing the dangers modern auto manufacturers knew were baked into their products but deliberately ignored. At first, the response was dismissive.

A TIME essay the following year portrayed Nader as a polemicist who was trying to paint auto accidents as solely the fault of the machines, with no account for driver error, and asserted that the book was “an arresting, though one-sided, lawyer’s brief that accuses Detroit of just about everything except starting the Vietnamese war.”

But, after a series of Congressional inquiries into auto safety, the courts soon sided with Nader's analysis and began holding auto manufacturers responsible for the safety of the people riding in their products. It took nearly a century for the auto industry to move from "new technology", wherein the user assumed all risk, to "established industry", wherein the industry had to take responsibility for the safety of its products.

Modern software design and development arguably dates from the late 1960s and 70s, when mainframes began to infiltrate the business segment of American industry. In the succeeding fifty years, software has come to dominate almost every aspect of every industry. It is certainly time for this industry to take responsibility for the safety and security of its products. 

However, remarks like this do not engender confidence that this has happened:

"Open-source developers say securing their code is a soul-withering waste of time" (TechRepublic, Dec 9, 2020). "A survey of nearly 1,200 FOSS contributors found security to be low on developers' list of priorities.... Moreover, responses indicated that many respondents had little interest in increasing time and effort on security. One respondent commented that they "find the enterprise of security a soul-withering chore and a subject best left for the lawyers and process freaks," while another said: "I find security an insufferably boring procedural hindrance.""

GitLab's "2019 Global Developer Report: DevSecOps" survey of over 4,000 software professionals agrees:

"Nearly half of security pros surveyed, 49%, said they struggle to get developers to make remediation of vulnerabilities a priority. Worse still, 68% of security professionals feel fewer than half of developers can spot security vulnerabilities later in the life cycle. Roughly half of security professionals said they most often found bugs after code is merged in a test environment.

At the same time, nearly 70% of developers said that while they are expected to write secure code, they get little guidance or help. One disgruntled programmer said, "It's a mess, no standardization, most of my work has never had a security scan." Another problem is it seems many companies don't take security seriously enough. Nearly 44% of those surveyed reported that they're not judged on their security vulnerabilities."

And then there is this gem from a study of freelance developers:

Of the 18 who had to resubmit their code, 15 developers were part of the group that was never told the user registration system needed to store passwords securely, showing that developers don't inherently think about security when writing code.... The other three were from the half that was told to use a secure method to store passwords, but who stored passwords in plaintext anyway....

Of the programmers who actually bothered to submit a password storage system that was encrypted, almost 75% of the solutions were deprecated; that is, they used already-compromised ciphers. The "solutions" were broken before they shipped. Of those failed solutions, nearly 20% used Base64, which isn't even an encryption algorithm.

The 2019 study showed that current developers aren't any better than unsupervised students.
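The distinction the study turns on can be shown in a few lines of Python's standard library: Base64 is a reversible encoding, not encryption, while "storing passwords securely" means a salted, iterated one-way hash. The sketch below uses PBKDF2 purely as one illustrative approach (the parameter choices here are assumptions, not the study's recommendations):

```python
import base64
import hashlib
import hmac
import os

# Base64 is an encoding, not encryption: anyone can reverse it.
encoded = base64.b64encode(b"hunter2")
print(base64.b64decode(encoded))  # the original password, recovered instantly

# A minimal sketch of salted password hashing with PBKDF2 from the
# standard library; iteration count and salt size are illustrative.
def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)  # unique per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("hunter2")
assert verify_password("hunter2", salt, digest)
assert not verify_password("wrong", salt, digest)
```

Unlike the Base64 "solution", the stored salt and digest cannot be trivially reversed back into the password; an attacker who steals the database must grind through the iterated hash for every guess.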

It is perfectly obvious that companies aren't demanding appropriate security training, so no one is providing it. Why aren't companies demanding it? Because software companies cannot be sued for creating insecure code. As one technical article pointed out:

"There is no software liability and there is no standard of care or 'building code' for software, so as a result, there are security holes in your [products] that are allowing attackers to compromise you over and over."—Joshua Corman

The end-user license agreement (EULA)—a dense legalistic disclaimer that only 8% of people read, according to data collected in 2011—essentially states that people do not have to use the software, but if they do, the developer is not responsible for any damages. EULAs have been the primary way that software makers have escaped liability for vulnerabilities for the past three decades.  

This laxity on the part of software and hardware companies has produced a multi-billion dollar international crime industry in which: 

Only 3 cyber incidents out of 1000 see an arrest [much less a conviction]... By comparison, the clearance rate for property crimes was approximately 18% and for violent crimes 46%, according to the Federal Bureau of Investigation’s (FBI) Uniform Crime Report (UCR) for 2016.

But the use of information technology is so ubiquitous that:
... the US Department of Treasury has designated cybersecurity incidents as one of the biggest threats to the stability of the entire US financial system....  
Third Way’s analysis estimates that the enforcement rate for reported incidents in the IC3 database is 0.3%. Taking into account that cybercrime victims often do not report cases, the effective enforcement rate estimate may be closer to 0.05%. [emphasis added]...  
The FBI reports that using the IC3 data to develop law enforcement referrals, it only secured nine convictions in 2016, down from nineteen cases the previous year.

Make no mistake: the cybercrime wave has been created by the poor coding practices at every software firm in the world. This has to end. For years, the IT industry has been urging everyone to wrap their products, data and business processes in an IT blanket. But when that same IT wrapper's poor design and lack of security began enabling the corruption or destruction of the products, data and business processes entrusted to it, the IT industry suddenly decided it was not responsible for the result.

It is impossible to refer to members of the industry as "IT professionals" when those same people refuse to take responsibility for the security disasters their indifference has helped create. That isn't a professional attitude. It isn't even an adult attitude. It's time for IT to grow up.

If an organization is successfully hacked, someone in the IT supply chain needs to be fined and/or jailed. Maybe it's the software company, maybe it's the integrators, maybe it's the developers, maybe the C-suite that runs the company, maybe the hardware people. Maybe all of them, or maybe just a selection. It doesn't matter. Someone or everyone in the chain needs to face consequences. Until IT people start facing serious consequences for their failure to care for the data and processes entrusted to them, they aren't going to change. We've waited fifty years. How much longer do we wait?


UPDATE: Well, this ZDNet article on website password storage is rather chilling. This article describes how passwords SHOULD be stored, and this one gives more technical details.

UPDATE II: It's 2022, and nothing has changed.
