A recent opinion piece on C and C++ opens with a confession: “I must be a glutton for punishment. Not only was my first programming language IBM 360 Assembler, but my second language was C. Programming anything in them wasn't easy. Programming safely in either is much harder.” So, when the US Cybersecurity and Infrastructure Security Agency (CISA) and the Federal Bureau of Investigation (FBI) announced they were doubling down on their efforts to persuade software manufacturers to abandon "memory-unsafe" programming languages such as C and C++, it came as no surprise.
The report on Product Security Bad Practices warns software manufacturers that developing "new product lines for use in service of critical infrastructure or national critical functions (NCFs) in a memory-unsafe language (e.g., C or C++) where there are readily available alternative memory-safe languages that could be used is dangerous and significantly elevates risk to national security, national economic security, and national public health and safety." In short, don't use C or C++. Yeah, that's going to happen.[1]
If this sounds familiar, it's because CISA has been preaching on this point for years. Earlier in 2024, CISA, along with partner agencies including the FBI, Australian Signals Directorate's Australian Cyber Security Centre, and the Canadian Centre for Cyber Security, aka the Five Eyes, published a report, Exploring Memory Safety in Critical Open Source Projects, which analyzed 172 critical open source projects. The findings revealed that over half of these projects contain code written in memory-unsafe languages, accounting for 55% of the total lines of code across the examined projects. Specifically, "Memory-unsafe languages require developers to properly manage memory use and allocation. Mistakes, which inevitably occur, can result in memory-safety vulnerabilities such as buffer overflows and use after free. Successful exploitation of these types of vulnerabilities can allow adversaries to take control of software, systems, and data." Tell us something we didn't know.
CISA added that memory-safety vulnerabilities account for roughly 70% of all security vulnerabilities. To address this, it recommends that developers transition to memory-safe programming languages such as Rust, Java, C#, Go, Python, and Swift. These languages incorporate built-in protections against common memory-related errors, making them more secure from the code up. Sounds good, doesn't it?
If only it were that easy to snap your fingers and magically transform your code base from C to Rust. Spoiler alert: It's not. Take Rust in Linux, for example. Even with support from Linux's creator, Linus Torvalds, Rust is moving into Linux at a snail's pace. The problem is, as Torvalds said at Open Source Summit Europe 2024, "The whole Rust versus C discussion has taken almost religious overtones," with harsh arguments that have led to one Rust in Linux maintainer throwing up his hands in disgust and walking away. You see, people who've spent years and sometimes decades mastering C don't want to master the very different Rust. They don't see the point. After all, they can write memory-safe code in C, so why can't you? Well, because most developers don't have those years of experience, for one thing.
It's more than just old, grumpy developers. Converting existing large codebases to memory-safe languages can be an enormous undertaking. It's time-consuming, resource-intensive, requires careful planning to maintain functionality, and, frankly, it's a pain in the rump.
Another problem is that memory-safe languages may introduce performance slowdowns compared to C and C++. There's a reason we're still using these decades-old, difficult languages; with them, developers can produce the fastest programs. Given a choice between speed and security, programmers and the companies that employ them go for the fastest code every time. Besides the sheer migration cost, companies also face the expense of replacing existing development tools, debuggers, and testing frameworks to support the new languages. Then, of course, there's the work of integrating the new programs with the old code and libraries.
CISA is insisting that this be done. Or, at the least, that companies come up with roadmaps for migrating their existing codebases by 1 January 2026. The agency argues that the long-term benefits in reduced vulnerabilities and improved security outweigh the initial investment.
I know businesses. They're not going to buy this argument. In the modern corporate world, it's all about maximizing the profits for the next quarter. Spending money today to save money in 2027? It's not going to happen. Eventually, painfully, slowly, we'll move to memory-safe languages. It really is a good idea. Personally, though, I don't expect it to happen this decade. In the 2030s? Yes. The 2020s? No. Neither businesses nor programmers have sufficient reason to make the jump. Sorry, CISA, that's just the way it is.
This article is shared at no charge for educational and informational purposes only.
Red Sky Alliance is a Cyber Threat Analysis and Intelligence Service organization. We provide indicators of compromise information via a notification service (RedXray) or an analysis service (CTAC). For questions, comments or assistance, please contact the office directly at 1-844-492-7225, or feedback@redskyalliance.com
Weekly Cyber Intelligence Briefings:
- Reporting: https://www.redskyalliance.org/
- Website: https://www.redskyalliance.com/
- LinkedIn: https://www.linkedin.com/company/64265941
REDSHORTS - Weekly Cyber Intelligence Briefings
https://register.gotowebinar.com/register/5378972949933166424
[1] https://www.theregister.com/2024/11/08/the_us_government_wants_developers/?td=rt-3a