EU cyber laws ‘will’ make FOSS devs liable

European lawmakers want all software makers to be liable for security holes. Even non-profit or hobbyist developers could be sued for negligence.

The EU’s draft Cyber Resilience Act (CRA) and Product Liability Act (PLA) would “create a chilling effect” and do “irreparable harm,” according to the organization behind Python and PyPI. If that chilling effect spreads across the rest of the software supply chain ecosystem, the whole house of cards could come crashing down, as devs race to limit their liability.

The goal might be laudable, but some aspects need a major rethink. In this week’s Secure Software Blogwatch, we fear unintended consequences.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Snail vs. fruit.


What’s the craic? Thomas Claburn reports — “Python head hisses at looming Euro cybersecurity rules”:

“Liable for the consequences”
The Python Software Foundation (PSF) is concerned that proposed EU cybersecurity laws will leave open-source organizations and individuals unfairly liable for distributing incorrect code. [It] argues that holding open-source developers liable for code contributions would discourage contributors.

The PSF argues the EU lawmakers should provide clear exemptions for public software repositories that serve the public good — and for organizations and developers hosting packages on public repositories. … Anyone who made a substantive change to an open source project would be liable for the consequences of that change.

What about the U.S.? Andreas Kuehn and Alexandra Paulus foresee it happening here, too — “A Toolbox for Policymakers”:

“Important milestones”
As security flaws keep software and entire supply chains vulnerable, it is time now for policymakers to set regulatory lanes for developers to build safe and secure technology. They must create incentives for the private sector to prioritize software supply chain security, deter insecure practices and negligent behavior, and help coordinate among industry, government, and civil society stakeholders.

The idea of product liability for software is not new but has received little traction so far, especially given industry fears of excessive lawsuits. … The 2022 Cyber Resilience Act draft legislation envisions a strict liability regime [and] the new U.S. National Cybersecurity Strategy details important strides in this direction. [But] software-developing entities rely significantly on a multitude of software components developed, delivered, and maintained by others. While software-developing entities can and should test the security of these components, complete security is usually not feasible.

While the maturity of supply chain security efforts varies, [the] basic building blocks for increasing supply chain security have been in the making for more than half a decade. [But] adoption … has been slow [because of] in part, the lack of direct incentives, market pressure, and government demand. … The new U.S. National Cybersecurity Strategy as well as current EU cybersecurity proposals are important milestones.

Horse’s mouth? The PSF’s Deb Nicholson warns of “Unintended Consequences”:

“We need it to be crystal clear”
After reviewing the proposed Cyber Resilience Act and Product Liability Act, [we’ve] found issues that put … the open-source software community at risk. … Overly broad policies will unintentionally harm the users they are intended to protect.

If … enforced as currently written, the authors of open-source components might bear legal and financial responsibility for the way their components are applied in someone else’s commercial product. … Any policy that does not provide clear carve outs … for the public good will do irreparable harm to … modern software development. … Assigning liability to every upstream developer would create less security, not more [and] would almost certainly create a chilling effect.

We need it to be crystal clear who is on the hook for both the assurances and the accountability that software consumers deserve … We believe that increased liability should be carefully assigned to the entity that has entered into an agreement with the consumer. … Python users in [the EU] may wish to write to their MEP voicing their concerns about the proposed CRA law before April 26th.

What the WHAT? layer8 stacks the standard response:

It [should] be tied to the “productization.” That is, the liability chain will only go as far as there is someone who turned the software into a product for monetization.

If a company sells a product that uses Linux, they will be liable regardless of whether they contributed to Linux development or not. If part of the product was itself purchased from a third party, the third party will be liable for that part. But open-source developers who don’t monetize the software won’t be.

And gweihir agrees, but with reservations:

I agree. … For-profit organizations distributing FOSS commercially (directly or indirectly) in any way should have just the same responsibility for the quality of the product they are selling as anybody selling directly commercial software. Providing stuff truly free and not having any commercial interests … is the only reasonable exception I see.

Companies like Red Hat or SUSE or any IoT provider or the like would love to continue not being responsible for the stuff they sell. That is not acceptable though.

The GDPR did not cause the sky to fall. Sure, some ****ers got called out on their bad practices, but that is a good thing. Most just needed to do some cleanup and better processes, but that is it. It will be the same for this thing.

But surely it’s not the lawmakers’ intent to hold open source devs liable? Here’s Brewster’s Angle Grinder:

The concerns [seem] genuine. It wouldn’t hurt [lawmakers] to say what they mean, rather than hope the courts correctly infer what they meant after everybody has spent a lot of money on lawyers (assuming people can afford the lawyers, and individuals and open-source organisations don’t go bankrupt because they can’t).

Basing it on monetization is also flawed. So says octacat:

That is sooo bad. Basically anyone can give you support with some open source code (i.e., consultancy — they fix a bug/deploy/tweak for you and go away) except the authors of the code. Because if the authors do this, they are liable for the whole … product. Nice. Also, many open source projects have very complex authorship, good luck digging which company is responsible.

Also, basically your favourite cloud provider could host your favourite open source database, but the authors … would be liable. Because, “This Regulation does not regulate services, such as … SaaS.”

Those meddling Eurocrats. SwashbucklingCowboy looks to a west that’s not so wild:

This is one case where the US actually has a better vision (a rarity, I know): a reasonable safe harbor, where if you securely develop your software, and can show that, then you won’t be held liable for vulnerabilities.

The requirements of the CRA are going to make proprietary software more secure … but probably more expensive as well. … If open source isn’t exempted from these laws … expect a lot of open source to disappear from the Internet.

What will happen? lars_francke has some good news:

This section has been rewritten … in the latest (internal) draft of the CRA, based on feedback [from] various open source foundations, as far as I know. I’m not sure how much I’m allowed to share, but it’ll be public at some point in April, I believe.

Meanwhile, Spazturtle thinks it’s time for a colorful metaphor:

If I bake some cakes and give them away on the street for free and people get sick … then I would be prosecuted. Why should software be treated differently?

And Finally:

Cyclophoroidea vs. Fragaria


Previously in And finally

You have been reading Secure Software Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.

Image sauce: Guillaume Périgois (via Unsplash; leveled and cropped)

*** This is a Security Bloggers Network syndicated blog from ReversingLabs Blog authored by Richi Jennings. Read the original post at: