
Data security depends on a secure software-development supply chain

As 2020 finally came to an end and 2021 began, The New York Times reported that Russian agents had used hacked SolarWinds software to infiltrate at least 18,000 government and private networks. As a result, the data within these networks (user IDs, passwords, financial records, source code) is presumed to be in the hands of Russian intelligence. While the media has written numerous stories about the effects of the breach, there has been a noticeable lack of discussion of the type of attack that was perpetrated: a supply-chain hack. This article describes the nature of this type of attack in more detail, along with some proposed best practices for supply-chain security to thwart such incidents in the future. Finally, we'll explore whether the open source community, which is designed to be transparent and collaborative, can offer guidance on developing software with a security-first mindset.

What is a supply-chain hack? As an analogy, consider the Chicago Tylenol murders of the 1980s. Someone took Tylenol bottles from store shelves in the Chicago area, laced the capsules with cyanide, and returned the bottles to the shelves. People who consumed the laced pills became gravely ill, and several died. A supply-chain attack, on software or on infrastructure, works the same way: an attacker breaks in where the software is built or distributed, planting a small backdoor or slipping in malicious code that will later take over machines or otherwise damage the eventual consumers of the software. In the case of the SolarWinds hack, the attackers compromised the build process of a network-management product used heavily by military and government contractors.

A small, stealthy attack on the infrastructure used to deliver software (or on the software itself) can have enormous impact. It's stealthy because it is very hard to trace, all the way to the left of the supply chain, exactly what went wrong. In a similar manner, those responsible for lacing the Tylenol back in the eighties were never caught. Here's the thing: supply-chain attacks are not new. We've known about them at least since Ken Thompson's famous 1984 paper, Reflections on Trusting Trust. Why haven't we taken them seriously until now? Likely because other, easier attacks were available, so there was no need.

In today's world, where open source software is pervasive, supply-chain attacks are even more damaging because there are hundreds of thousands of "ingredients" contributed by multiple parties. Considering the full dependency tree of any package, there are many more points where somebody can come in and attack. That's not to say that open source is to blame for this and other supply-chain attacks. There are so many open-source components running on private or closed-source infrastructure today that the whole open-source versus closed-source debate is moot. The key challenge is: how can we secure an ecosystem that is mostly a hybrid of open-source and closed-source software?

The primary obstacle to overcome is cultural. The very nature of open source development is based on trust and transparency; developers are essentially giving source code to everybody to consume for free. For example, consider Libtiff, a component created 33 years ago to render a particular type of image. Today it is used by the Sony PSP, the Chrome browser, Windows, Linux, iOS, and many others. Its creator never imagined it would be used so pervasively across the ecosystem. If malicious code were introduced into such a root component, imagine the widespread damage.
Given the cultural background and approach to open source that is pervasive today, what practical steps can we all take to limit the danger of future supply-chain hacks?

First and foremost, developers need to start building protections into the software development pipeline itself. Put protocols in place that help the ecosystem understand how components are made and what they're expected to be used for. In the same way that you wouldn't plug in a USB key you found sitting on the sidewalk outside your building, you shouldn't run a random open-source package from the internet on your machine either. Unfortunately, nearly every developer does exactly that a hundred times a day.
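
The "don't run random packages" advice can be made concrete with something as simple as pinning and checking a digest before anything gets installed or executed. Below is a minimal Python sketch of that idea; the file name and the pinned digest are placeholder assumptions, not values from any real package.

```python
# Minimal sketch: refuse to use a downloaded artifact unless its SHA-256
# digest matches a value pinned ahead of time. The file name and digest
# below are hypothetical placeholders.
import hashlib
import sys

PINNED_SHA256 = "replace-with-the-digest-published-by-the-project"

def sha256_of(path):
    """Stream the file through SHA-256 so large artifacts aren't read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    artifact = "downloaded-package.tar.gz"  # hypothetical downloaded component
    actual = sha256_of(artifact)
    if actual != PINNED_SHA256:
        sys.exit(f"Digest mismatch for {artifact}; refusing to install it")
    print(f"{artifact} matches its pinned digest; proceeding")
```

A check like this only proves the file is the one somebody published, not that the published file is trustworthy, which is exactly why the rest of the supply chain also needs attention.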

Second, convey all of this information to users and consumers so they can make educated decisions. How can we best prove transparency in software processes, not only in open source but across the whole pipeline from open to closed? Going back to the Tylenol metaphor: as a result of that horrible event, tamper-evident seals on bottles were introduced. In a similar way, the software supply chain is starting to identify the crucial parts that need fixing to safeguard it from attacks.

One of them is communicating the components, or ingredients, through a software bill of materials (SBOM). It's about building infrastructure that allows this information to travel throughout the supply chain. A number of projects are working on this, including in-toto, Grafeas, SPDX, and 3T SBOM. They are all trying to shift verification left and shift transparency right. Back to the metaphor: if somebody can look at an FDA approval seal on the Tylenol bottle, they know they can consume it and that there were checks and balances along the line to ensure its safety. We need this kind of primitive in the software supply chain so we can better communicate to downstream consumers of the software.
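
To make the "list of ingredients" idea tangible, here is a minimal Python sketch that enumerates the packages installed in the current environment and emits their names and versions as JSON. Real SBOM formats such as SPDX record far more (licenses, hashes, relationships between components); this is only an illustration of the underlying concept, not any project's actual format.

```python
# Minimal illustration of an SBOM's core idea: enumerate the components
# (here, the Python packages installed in this environment) and emit
# machine-readable metadata about each one.
import json
from importlib.metadata import distributions

def minimal_component_list():
    components = []
    for dist in distributions():
        components.append({
            "name": dist.metadata["Name"],
            "version": dist.version,
        })
    return sorted(components, key=lambda c: (c["name"] or "").lower())

if __name__ == "__main__":
    print(json.dumps({"components": minimal_component_list()}, indent=2))
```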

Let's not ignore the lazy factor. Developers know they're supposed to use cryptography, to sign artifacts and check signatures before using them, but it's inconvenient and not taken seriously. The software build and CI/CD process is usually the most neglected part: often it's a machine sitting under somebody's desk that was set up once and never looked at again. Unfortunately, that is exactly the point we most need to protect. But it's not a priority today (so many other fires to attend to!), as evidenced by the Linux Foundation's 2020 FOSS Contributor Survey. In a collaborative open source ecosystem where many parties are involved, the producers (developers) are not incentivized to communicate the software components, because the compromise happens elsewhere in the supply chain. SolarWinds, for example, was the entry point for the attack, but it was their customers who bore the damage. There needs to be an acknowledgement from every single individual in the chain that surfacing and identifying components is paramount at every level.

Diving deeper, we need a cryptographic paper trail: verifiable, cryptographically signed evidence of how each step in the process was carried out. The Linux Foundation recently published a blog post citing this among other recommendations for preventing supply-chain attacks like SolarWinds. The ecosystem needs to be able to confirm that every single act in the supply chain was the right one: that every software artifact was created by the right person and consumed by the right person, with no tampering along the way. By emphasizing verification throughout the software supply chain, the resulting transparency will make it harder for bad actors' hacks to go undetected, limiting the downstream impact and damage on software consumers. Such a supply-chain audit trail also makes it far easier to investigate what happened should an attack occur.
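
As a rough illustration of what such a paper trail could look like, the Python sketch below describes one build step as JSON, signs it with an Ed25519 key, and verifies the signature downstream. It borrows the spirit of projects like in-toto but none of their actual formats; the field names, builder identity, and artifact name are all hypothetical, and the third-party cryptography package is assumed to be available.

```python
# Sketch of a signed supply-chain record: describe a build step, sign it,
# and verify the signature before trusting the record. All field values
# are illustrative placeholders.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The builder's key pair. In practice the private key stays inside the
# build system and only the public key is distributed to consumers.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# A hypothetical record of one step in the supply chain.
attestation = {
    "step": "build",
    "builder": "ci.example.com",         # hypothetical builder identity
    "artifact": "app-1.2.3.tar.gz",      # hypothetical output artifact
    "artifact_sha256": "digest-of-the-artifact-would-go-here",
}
payload = json.dumps(attestation, sort_keys=True).encode()
signature = private_key.sign(payload)

# A downstream consumer verifies the signature before trusting the record.
try:
    public_key.verify(signature, payload)
    print("Attestation signature is valid; the record was not tampered with.")
except InvalidSignature:
    print("Signature check failed; do not trust this artifact.")
```

Chaining many such records together, one per step, is what turns individual signatures into the kind of end-to-end audit trail described above.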

While tedious open source security work pains many of us today, open source managers, security experts, and developers have an opportunity to be the unexpected heroes in the fight against those who aim to harm our systems. With some intention and consistency, we are in a position, thanks to the pervasiveness of the software we've built, to help solve one of the biggest technology challenges of our time.

Santiago Torres-Arias is Assistant Professor of Electrical and Computer Engineering at Purdue University. He conducts research on software supply chain security, operating systems, privacy, open source security, and binary analysis.

Dan Lorenc is a Software Engineer at Google focused on open source cloud technologies. He leads an engineering team focused on making it easier to build and deliver systems for Kubernetes. He created the Minikube, Skaffold, and Tekton open-source projects, and is a member of the Technical Oversight Committee for the Continuous Delivery Foundation.
