Biden Admin Declares War on C++

Breakdown of the White House report calling for the adoption of memory-safe programming languages and measurable software security.


Today the White House Office of the National Cyber Director released a report titled “Back to the Building Blocks: A Path Toward Secure and Measurable Software,” which outlines actions the White House wants developers nationwide to take to ensure the cybersecurity of the United States.

Roadmap to Memory Safety

The suggestion from the report that is immediately actionable for developers is switching to memory-safe programming languages. The report argues that there has never been a better time to switch to a memory-safe language, and that all new projects should therefore make the jump. It even goes as far as calling on existing projects to start refactoring their codebases as well.

The report explicitly calls out C/C++ as the popular problematic languages and suggests that Rust should be used instead. No other language is mentioned in the report; cue crab rave. It does acknowledge that while on paper Rust checks all the boxes needed to replace C/C++, it is not yet battle-proven in some domains (specifically space applications). The argument goes that broader adoption brings improvements, such as toolchain development and workforce education, that could really take Rust to the Moon.
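To make "memory safety" concrete, here is a minimal sketch of the kind of bug the report is worried about. In C, reading past the end of an array is undefined behavior: the program may silently return garbage or corrupt memory. In Rust, the same access is either caught at runtime (a panic on `buf[10]`) or handled explicitly through checked indexing, as shown below. The array contents here are just illustrative.

```rust
fn main() {
    let buf = [1u8, 2, 3, 4];

    // In C, `buf[10]` would read past the end of the array:
    // undefined behavior, and a classic source of exploits.
    // Rust's checked indexing returns an Option instead.
    match buf.get(10) {
        Some(v) => println!("value: {v}"),
        None => println!("index out of bounds, caught safely"),
    }

    // In-bounds access works as expected.
    assert_eq!(buf.get(2), Some(&3));

    // Unchecked syntax `buf[10]` would panic at runtime rather
    // than silently corrupting memory as C would allow.
}
```

The key design point is that the unsafe outcome is unrepresentable by default: out-of-bounds access is either a recoverable `None` or a deterministic panic, never silent corruption.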

The most important discussion in this paper is how incentives are not aligned to favor memory safety. Huge systemic changes need to happen just to dethrone C++ as the default systems language, let alone to get the ball rolling on refactoring old code. I don't think the measures proposed in the report go far enough. They are mostly education- and incentive-based, with a focus on the very long term. I would like to see strict punishments put in place for malice or negligence in system design. The cost of refactoring even the critically vulnerable parts of a codebase in a new language is orders of magnitude more expensive than the fines and PR costs it takes to sweep a massive cybersecurity incident under the rug. Over time I think the better languages will prevail, but there is a strong need for swift action to push the market onto the correct course.

Open source is also mentioned as a critical part of the overall software ecosystem. I found it surprising that there was no mention of supporting open source work. The Log4Shell vulnerability is highlighted as a key reason why we need better observability into open source, but there is a void of any plan to fix it. I think open source is one of the few things where resources invested create orders of magnitude more economic activity. It truly is the backbone of all forms of modern engineering and of progress at large.

Interestingly, the report also discusses how hardware is another avenue toward a memory-safe future. There are projects in development that could allow patching memory vulnerabilities just by switching hardware. This is still a new area that doesn't appear to have made it outside of a lab yet, but given the cost of refactoring ancient C monoliths, I could see small wins in this field gaining a lot of traction and snowballing.

Measuring Cybersecurity Quality

The other required step is far less actionable: figuring out a way to measure code safety. This is a huge problem that I think the report really nailed. There is no simple way to describe the safety of a software system. How is anyone supposed to make informed decisions about the safety of their data and systems in the current ecosystem? Even if you had full teams dedicated to poring over every piece of code your organization uses, would they catch things like Log4Shell or Heartbleed? The answer is no.

This is an extremely hard area of research that I think is a mostly open problem. I think it's awesome that the White House is willing to tackle something this big. It really does affect every single part of the software ecosystem. From CTOs to developers to consumers, nobody has any actionable data about the safety of a system. The report says that groundbreaking academic research is an essential part of getting to a solution here, so hopefully that will inspire the upcoming generation of PhDs to tackle these difficult problems.

So what can we measure today? It seems odd that the only real indicator is which language a project was written in. Don't tell Hacker News, but in the near future, seeing Rust as the main language of a project is probably going to be a singular selling point in sensitive areas. Maybe AI will save us.


So in conclusion, stop writing code in C/C++. The biggest dent you can make in the security of your code is writing it in a memory-safe language. If you need a low-level systems language you should probably use Rust instead, but it's likely you don't actually need one. Hopefully hardware wins will soon make unsafe code a little safer, and in the longer term we will have actionable metrics about the safety of our systems.

Here is my highlighted copy of the report: