
The XZ vulnerability: opening the (back) door to regulatory risk?

The XZ back door was a near-miss that has shaken the cyber security world. How does this kind of vulnerability fit into the new software regulatory landscape?

The XZ Vulnerability

A little over a week ago, a critical vulnerability was discovered in the XZ Utils library (an open source package which provides data compression functionality). It has received a severity rating of 10.0, the highest possible score on the CVSS scale and well above the threshold required to be deemed “critical”. Three characteristics of the vulnerability led one prominent security expert to describe it as a “nightmare scenario”.

Severity: The vulnerability allowed an attacker to remotely execute code on an affected system by altering the operation of OpenSSH, another commonly deployed package (not itself directly affected by the vulnerability). Remote code execution is usually regarded as the most severe kind of vulnerability, as it exposes the affected system, the data held on it, and the networks it is connected to, to essentially any kind of further exploitation.

Scale: The XZ Utils package is incorporated into most distributions of Linux (much like OpenSSL, the library affected by the infamous Heartbleed bug a decade ago). Since almost all IT estates and businesses rely on Linux to some extent, the vulnerability would have exposed most organisations to an attack. Fortunately, it was caught quickly, before it flowed down to the Linux releases in common use.

Sabotage: Most concerning is the fact that the back door appears to have been introduced into the XZ source code deliberately and covertly. A malicious actor going by the name of “Jia Tan” first established themselves as a contributor to the XZ Utils project through multiple bona fide code submissions over the course of several years. They then made a series of commits to the codebase to create the back door, some of which were obfuscated to make them harder to spot in the publicly visible source code. This was a well-planned, long-running attack, leading some to speculate that it may have been sponsored by a nation state.
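For readers wanting to relate this to their own estate: the malicious code shipped only in two releases of XZ Utils (5.6.0 and 5.6.1, tracked as CVE-2024-3094), so establishing whether a particular system received a compromised build is largely a matter of checking the installed version. The Python sketch below is a minimal illustration of that check; it assumes the `xz` binary is on the path and that its `--version` output follows the usual format, both of which may differ between systems.

```python
import re
import subprocess

# XZ Utils releases known to contain the back door (CVE-2024-3094).
COMPROMISED_VERSIONS = {"5.6.0", "5.6.1"}


def installed_xz_version():
    """Return the version reported by the local `xz` binary, or None if unavailable."""
    try:
        output = subprocess.run(
            ["xz", "--version"], capture_output=True, text=True, check=True
        ).stdout
    except (OSError, subprocess.CalledProcessError):
        return None  # xz is not installed or could not be run
    # The first line of output typically looks like: "xz (XZ Utils) 5.4.5"
    match = re.search(r"xz \(XZ Utils\) (\d+\.\d+\.\d+)", output)
    return match.group(1) if match else None


if __name__ == "__main__":
    version = installed_xz_version()
    if version is None:
        print("xz does not appear to be installed (or its version could not be parsed)")
    elif version in COMPROMISED_VERSIONS:
        print(f"WARNING: xz {version} is a known compromised release - update immediately")
    else:
        print(f"xz {version} is not one of the known compromised releases")
```

In practice, the advisories and updated packages published by the affected Linux distributions are the authoritative reference; a check like this is only a quick triage aid.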

The open source security debate

This episode is likely to reignite the classic debate on the relative security of open source and closed source software. Some will argue that the attacker would have found it harder to introduce the malicious code to the XZ Utils project if it were a proprietary product developed only by employees (although where nation states are involved, it is not impossible to imagine a scenario where an employee is compromised). 

The obvious response is that, if the vulnerability had existed in a closed source product, it would likely have remained concealed for much longer. The vulnerability was discovered by Andres Freund, a Microsoft developer (not involved with the XZ Utils project) who was driven to investigate after he observed the package behaving strangely. He was able to inspect the publicly available source code to uncover the vulnerability and then trace it upstream to its source.

Our view is that this kind of debate no longer reflects the reality of software development and modern enterprise IT – if it ever did. Most businesses depend on a mixed technology stack for their critical workloads, involving a combination of closed source and open source software. Moreover, dig into the source code of almost any proprietary product and you are likely to find open source dependencies. The reality is that this kind of attack cannot be avoided entirely, only mitigated.

For suppliers – Cyber Resilience Act and new Product Liability Directive

The incident highlights the need for software vendors to properly understand their source code supply chain and the risks contained in it. This goes beyond producing the traditional software bill of materials (SBOM): it is important to consider the risk associated with each individual component. The XZ Utils project was maintained by a single person on a voluntary basis, which was likely a factor in enabling a bad actor to contribute their malicious code unnoticed.
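To make that concrete, the sketch below shows one way a vendor might go a step beyond producing an SBOM: reading the components out of a CycloneDX-format SBOM and querying the public OSV vulnerability database for each one. The file name `sbom.json` and the `Debian` ecosystem value are illustrative assumptions, and an automated check of this kind only surfaces known vulnerabilities; assessing softer risk factors, such as a package maintained by a single volunteer, still requires qualitative judgment.

```python
import json
import urllib.request

SBOM_PATH = "sbom.json"  # illustrative: a CycloneDX JSON SBOM exported from a build
OSV_QUERY_URL = "https://api.osv.dev/v1/query"


def known_vulnerabilities(name, version, ecosystem="Debian"):
    """Ask the OSV database for known vulnerabilities affecting a single component."""
    query = {"package": {"name": name, "ecosystem": ecosystem}, "version": version}
    request = urllib.request.Request(
        OSV_QUERY_URL,
        data=json.dumps(query).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        result = json.load(response)
    return [vuln["id"] for vuln in result.get("vulns", [])]


if __name__ == "__main__":
    with open(SBOM_PATH) as f:
        sbom = json.load(f)

    # CycloneDX SBOMs list dependencies in a top-level "components" array.
    for component in sbom.get("components", []):
        name, version = component.get("name"), component.get("version")
        if not name or not version:
            continue
        vulns = known_vulnerabilities(name, version)
        print(f"{name} {version}: {', '.join(vulns) if vulns else 'no known advisories'}")
```

A real supply chain review would typically feed results like these into a broader risk assessment covering maintenance activity, contributor vetting and the availability of commercial support.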

Cyber Resilience Act

The upcoming Cyber Resilience Act (“CRA”) will enforce the secure development of software products sold on the EU market. While the CRA obligations will not apply in full until 2027, it is interesting to consider how it might have applied to the XZ vulnerability. Manufacturers of software products will be legally required to (amongst other things) ensure that products are securely developed and do not contain any known exploitable vulnerabilities, address and remediate vulnerabilities without delay (including by providing security updates), and publicly disclose information about fixed vulnerabilities.

In this case, the situation is complicated by the fact that the XZ Utils package is open source. The recently agreed text of the CRA creates a complex regime around open source software, with the effect that the maintainers of XZ Utils would probably not have been in scope because the package is not distributed in the course of a commercial activity. However, the many downstream companies that incorporate XZ Utils into their own commercial products would likely have been in scope. So too would any “open source software stewards” supporting the project (albeit with a much reduced set of obligations).

Following discovery of the vulnerability, providers of affected Linux distributions voluntarily pushed alerts to their customers and addressed the back door almost immediately, ensuring that there was minimal opportunity for attackers to exploit it. Under the CRA, many of the providers would have been legally required to do so, giving a new regulatory impetus to their existing vulnerability response processes. 

Software vendors should consider how their own development and vulnerability management processes would have fared in this situation, and whether they will be compliant with the requirements of the CRA.

New Product Liability Directive 

For software companies, it is also relevant to consider how the revised Product Liability Directive (“PLD”) would apply in this situation (although it is not expected to come into effect until 2026). Under the new PLD, manufacturers of software will have near-strict liability for any damage (including loss of non-commercial data) that their defective or vulnerable software causes to an end user.

Like the CRA, the PLD does not apply to entities distributing open source software outside the context of a commercial activity (as with the XZ Utils project). However, it would have applied to the many businesses which incorporate the XZ Utils package into their commercial distributions, making them potentially liable for damages caused by exploitation of the back door. When dealing with such a serious vulnerability in such a widely distributed package, the PLD could result in a large number of organisations being exposed to substantial liabilities.

NIS 2

Software supply chain security is not just a concern for software companies. The XZ incident demonstrates the real risk posed by software supply chain attacks and the importance of obtaining software from a trusted source. 

Under the EU NIS 2 Directive (to be implemented by October 2024 at the latest), “essential” and “important” entities must take appropriate technical, operational and organisational measures to manage their cyber security risk. This includes explicit obligations to ensure supply chain security (likely through due diligence) and address software maintenance. These obligations are backed up by fines of up to EUR 10,000,000 or 2% of annual worldwide turnover, whichever is higher (and Member States may set higher maximums).

It is unlikely that any reasonable due diligence process would have discovered a back door so well hidden and so far upstream. However, regulators are likely to expect entities in scope of NIS 2 to have support arrangements in place to ensure that they receive security notifications and updates as quickly as possible, and to ensure that any affected systems are patched without delay. This has long been good security practice for all businesses, but (for some) will soon have force of law behind it as well.

Tags

data breaches, cyber security, it and digital