
Open source and the unintended consequences of the EU’s Cyber Resilience Act

by Luis Villa
on February 22, 2023


On September 15, 2022, the EU unveiled a draft of the Cyber Resilience Act (CRA), an eighty-seven-page document detailing proposed new rules meant to minimize the risk from hardware and software attacks. Like the White House cybersecurity executive order 14028 and the U.S. Office of Management and Budget memorandum M-22-18, the act calls for measures to make software more resilient. With software supply chain vulnerabilities increasingly prevalent, this attention to improving cybersecurity should be welcome news for anyone who uses software, but it comes with some potential unintended consequences for the people who create and maintain open source software.

The CRA requires organizations to know exactly what is in their products, to the point of being able to recall specific components. To quote the CRA: “it is of particular importance for manufacturers to ensure that their products do not contain vulnerable components developed by third parties.” This goes beyond having a Software Bill of Materials (SBOM), because it won’t be enough to simply log the ingredients in a piece of software: organizations will need to understand and act on every item in their SBOMs. (Learn more about why SBOMs alone aren’t the silver bullet.)
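To make that distinction concrete, here is a minimal sketch in Python of what “acting on” an SBOM might look like, as opposed to merely producing one. It is not CRA-mandated tooling or anything Tidelift ships; the SBOM shape is a simplified CycloneDX-style JSON document, and the package names and advisory data are hypothetical stand-ins for a real vulnerability feed.

# Sketch only: flag SBOM components that match known-vulnerable name/version pairs.
# The advisory data below is hypothetical; in practice it would come from a
# vulnerability database or feed, not a hard-coded set.
import json

KNOWN_VULNERABLE = {
    ("example-http-parser", "2.1.0"),
    ("example-crypto-lib", "0.9.4"),
}

def flag_vulnerable_components(sbom_json: str) -> list[dict]:
    """Return the SBOM components whose name and version match a known advisory."""
    sbom = json.loads(sbom_json)
    return [
        component
        for component in sbom.get("components", [])
        if (component.get("name"), component.get("version")) in KNOWN_VULNERABLE
    ]

if __name__ == "__main__":
    sample_sbom = json.dumps({
        "components": [
            {"name": "example-http-parser", "version": "2.1.0"},
            {"name": "another-small-library", "version": "1.3.0"},
        ]
    })
    for component in flag_vulnerable_components(sample_sbom):
        print(f"needs action: {component['name']} {component['version']}")

Even this toy version shows the gap the CRA exposes: the SBOM itself is just the input, and the real work is keeping the advisory data current and deciding what to do with each flagged component.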

Organizations will also need to attest that they have followed the requirements set out in the CRA, similar to regulations that have long been in place for consumer electronics. The rigor of that attestation will vary, ranging from self-attestation for products that aren’t “critical” to mandatory third-party validation for the most important categories of infrastructure software, such as networking components.

Unintended consequences?

These requirements are, at some level, common sense—when software forms the core of the modern economy, requiring it to be safe and secure is hard to argue with! The rub comes with open source software—which is often a critical but accidental part of the supply chain. The current draft of the CRA attempts to thread this needle by sheltering “non-commercial” open source, while not allowing commercial software providers to escape liability simply by releasing software as open source. To quote the draft:

In order not to hamper innovation or research, free and open-source software developed or supplied outside the course of a commercial activity should not be covered by this Regulation. … [C]ommercial activity might be characterized not only by charging a price for a product, but also by charging a price for technical support services, [or] by providing a software platform through which the manufacturer monetises other services…

(Emphasis ours; Proposal for a Regulation on cybersecurity requirements for products with digital elements: Cyber Resilience Act (2022))

While this callout to open source is well-intentioned, the language as currently drafted would stratify open source into two tiers: a heavily commercial tier, whose sponsors can afford to follow the strict rules of the CRA, and a second tier whose authors must refuse any support, for fear of becoming liable.

This would create not just an unfunded mandate, but potentially an unfundable one: any “commercial activity” (such as paying open source maintainers) would create new liability, perversely pushing more maintainers to treat core infrastructure as a hobby rather than the paying career many of them would prefer it to be, and perhaps even dissuading some from working on open source code in the first place.

The exception also suffers from another potential issue: as the Open Source Initiative pointed out in its filing to the EU, it is not part of the binding text of the law (what the EU calls an “article”) but is instead an advisory “recital” that EU countries can choose not to implement in their national versions of the rule. So it’s possible that, once adopted, even open source developed purely by volunteers would be subject to the rule.

The act attempts to soften the blow by limiting the rule to “critical” products, particularly those involved in networking and operating system security. Unfortunately, these “critical” areas are some of the fields where open source has been most successful, such as web browsers and network-connected devices.

Much of the open source in these products is part of the “accidental” supply chain, built by developers who started doing something for fun and never asked for their work to be used in “critical” infrastructure. If they are faced with legal liability, instead of making their software more secure, they may simply walk away from their projects, an even worse outcome than the one we face today. (NLnet, a non-profit whose work supports several likely “critical” networking infrastructure tools, goes into great detail on this in their blog post on the CRA.)

What’s next?

At Tidelift, we’re following these developments closely and are inclined to agree with the organizations that have filed responses with the EU. The intent of this act is noble: just as we invest in our roads and buildings, Tidelift firmly believes we must also invest in the core infrastructure that powers our modern world. But if the act becomes law as written, it will not only have the intended effect of reshaping how software is used, it will also have the unintended effect of dividing open source maintainers into those who can and those who cannot get paid for their work. Many packages will be abandoned just when they become most critical, for fear that further usage will create liability for their creators and maintainers. We hope the EU can reach a compromise in which maintainers can get paid for the important work they are doing to build a better software world for all of us.
