department of hack
1874 stories · 16 followers

01apr2024


What autoconf got right


Thanks to the xz backdoor, many people are now talking about the state of Linux packaging tools, and in particular build systems. As a maintainer of Void Linux and packager of many things, I have my five cents to add, so today I’ll be the contrarian and argue what autoconf got right. This is not an apology for GNU autotools; we are all well familiar with the issues they bring—yet some prospective replacements manage to be worse in certain aspects.

It provides a standardized interface.

This is of course the hardest point to tackle for any new contender that has not yet reached critical mass.

In Void Linux, the GNU configure build style is the most popular: roughly 2250 of about 14300 package templates use it, and an additional 120 use the generic configure build style, which works similarly.

As a packager, the worst thing is to find a custom-made build system that behaves totally differently from everything we know—if you decide to write your own ./configure scripts, please stick to the conventions! We packagers really have better things to do than figure out yet another homebrew build system that’s used exactly once.

These conventions are standardized as part of the GNU Coding Standards, and they specify many features that packagers expect but that developers without packaging experience of their own are likely to miss. One example is support for staged installation, i.e. DESTDIR. This is essential for building packages that contain only the files the package actually ships. And no, support for --prefix is not enough to make up for this (if you wonder why, please read up on the standards).
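As a sketch of why the two knobs are distinct, here is roughly what a packager does with a DESTDIR-aware install step. The `hello` script and the paths are stand-ins for a real package's files, not anything from a specific project:

```shell
# Hypothetical stand-in for a package's "make install" step: the prefix
# is recorded at configure time, while the packager supplies DESTDIR
# only at install time to redirect the files into a staging tree.
prefix=/usr/local            # what ./configure --prefix=/usr/local records
DESTDIR=/tmp/stage           # the packager's staging directory

printf '#!/bin/sh\necho hello\n' > /tmp/hello
mkdir -p "$DESTDIR$prefix/bin"
install -m 755 /tmp/hello "$DESTDIR$prefix/bin/hello"

# the package archive is then built from /tmp/stage, which contains
# exactly this package's files and nothing from the live system
find "$DESTDIR" -type f
```

With a real autotools package this is simply `./configure --prefix=/usr && make && make DESTDIR=/tmp/stage install`; the point is that the prefix baked into the binaries and the directory the files land in are two independent settings.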

It is based on checking features.

People who have been staring at ./configure output for too long may want to disagree, but let me make my point: check-based configuration is the only way to write software that will continue to work properly in the future. If you instead keep a table of broken systems and workarounds, it (a) will not be updated for future systems, and (b) won’t detect that a system has actually been fixed (either by patching a bug or by adding a missing feature). It’s also very unlikely that the software builds on a system unknown to the build system, even if that system is otherwise standards-compliant.

Of course, the checks should be reasonable (and in practice, often are excessive). If your code assumes a C99 environment, you don’t need to check whether all C99 functions you use are available. Likewise, if you don’t need macros for certain sizeof values, you don’t need to check for them, either. And you never need to check if sizeof char is actually 1—it literally can’t be anything else. Also, checking for functions can be done incorrectly.

Overrides are possible.

While checks are good, sometimes they are broken, or a certain configuration needs a special override because a feature can’t be checked (for example, when cross-compiling). In this case, autoconf scripts provide options to override checks with a predetermined result; usually you can set an environment variable like gt_cv_func_printf_posix=yes.
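A minimal sketch of how these cache-variable overrides work. The script below is a hypothetical toy, but real autoconf-generated scripts consult their ac_cv_*/gt_cv_* cache variables in the same way before running a check:

```shell
# Toy model of autoconf's cache logic: if the cache variable is already
# set (e.g. inherited from the environment), the check is skipped.
cat > /tmp/configure-demo <<'EOF'
#!/bin/sh
if [ -z "$gt_cv_func_printf_posix" ]; then
    # a real script would compile and *run* a test program here, which
    # is impossible when cross-compiling -- hence the override
    gt_cv_func_printf_posix=no
fi
echo "checking for POSIX printf... $gt_cv_func_printf_posix"
EOF
chmod +x /tmp/configure-demo

/tmp/configure-demo                               # prints "... no"
gt_cv_func_printf_posix=yes /tmp/configure-demo   # prints "... yes"
```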

Likewise, if a library is installed in an unusual location, it’s also easy to tell configure to use it, e.g. by passing the appropriate CPPFLAGS and LDFLAGS.

The config.log file tells you what happened.

Many other systems run checks, but only report that something failed, which can be difficult to debug. Autoconf writes what it does into a config.log file, which is often helpful for debugging a failed check.
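For illustration, here is a fabricated config.log fragment in the general shape autoconf produces: each check logs the exact compiler invocation and its output, so a failed check can be reproduced by hand (real logs also include the failing test program source and the final values of all variables):

```shell
# Fabricated example; the line numbers and check are made up.
cat > /tmp/config.log <<'EOF'
configure:3021: checking for liblzma
configure:3042: cc -o conftest -O2 conftest.c -llzma >&5
/usr/bin/ld: cannot find -llzma
configure:3042: $? = 1
configure:3050: result: no
EOF

# find out which check failed and the exact command that failed:
grep -B 2 'cannot find' /tmp/config.log
```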

There is support for cross-compiling and for host/target separation.

Cross-compilation is a build system feature that is often treated as an afterthought, but as a maintainer of a system that makes heavy use of it, I have my fair share of experience and can say that autotools is one of the best systems when it comes to cross-compilation support. Custom-made build systems in particular are often severely lacking here. Cross-compiling C programs is not particularly hard in principle, but your build system needs to know which code is going to run on the target, and that programs which need to run during compilation (e.g. to precompute tables) need to be compiled for the host instead (with different CFLAGS and so on).
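As a sketch, a typical cross-compiling configure invocation looks like this. The triplets and tool names are hypothetical; note that in autoconf's own terminology, --build is the machine doing the compiling and --host is the machine the result will run on:

```shell
# Hypothetical triplets/toolchain; adjust to your setup.
# configure uses --build/--host to pick ${host}-prefixed tools and to
# skip checks that would require running target code.
./configure \
    --build=x86_64-unknown-linux-gnu \
    --host=aarch64-unknown-linux-gnu \
    CC=aarch64-linux-gnu-gcc
# Helper programs that run *during* the build (e.g. table generators)
# are compiled with the build machine's compiler instead, which
# autotools-based projects conventionally accept as CC_FOR_BUILD.
```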

It has few runtime dependencies.

This is also a defining feature of autoconf: usually a basic POSIX shell environment (or, say, something like busybox) is enough to run the configure scripts. This is particularly important for packages needed for bootstrapping. If your build system needs Python, well, then you need to compile Python first; but to compile Python, you need to compile all of its dependencies, which hopefully don’t themselves need Python to build…

However, for packages not directly relevant to bootstrapping a system this is not such an essential feature.

NP: Policy of 3—Let It Build

Read the whole story
brennen
3 days ago
reply
Boulder, CO
Share this story
Delete

reflections on distrusting xz


Was the ssh backdoor the only goal that "Jia Tan" was pursuing with their multi-year operation against xz?

I doubt it, and if not, then every fix so far has been incomplete, because everything is still running code written by that entity.

If we assume that they had a multilayered plan, that their every action was calculated and malicious, then we have to think about the full threat surface of using xz. This quickly gets into nightmare scenarios of the "trusting trust" variety.

What if xz contains a hidden buffer overflow or other vulnerability that can be exploited by the xz file it’s decompressing? This would let the attacker target other packages, as needed.

Let's say they want to target gcc. Well, gcc contains a lot of documentation, which includes png images. So they spend a while getting accepted as a documentation contributor on that project, and get added to it a png file that is specially constructed, it has additional binary data appended that exploits the buffer overflow. And instructs xz to modify the source code that comes later when decompressing gcc.tar.xz.

More likely, they wouldn’t bother with an actual trusting-trust attack on gcc, which would be a lot of work to get right. One problem with the ssh backdoor is that, well, not all servers on the internet run ssh. (Or systemd.) So webservers seem a likely target for this kind of second-stage attack. Apache’s docs include png files; nginx’s do not, but there’s always scope to add improved documentation to a project.

When would such a vulnerability have been introduced? In February, "Jia Tan" wrote a new decoder for xz. This added 1000+ lines of new C code across several commits. So much code, and in just the right place to insert something like this. And why take on such a significant project just two months before inserting the ssh backdoor? "Jia Tan" was already fully accepted as maintainer and doing lots of other work; it doesn’t seem to me that they needed to start this rewrite as part of their cover.

They were working closely with xz’s author Lasse Collin on this, by all indications exchanging patches off-list as they developed it. So Lasse Collin’s commits in this time period are also worth scrutiny, because they could have been influenced by "Jia Tan". One that caught my eye comes immediately afterwards: it "prepares the code for alternative C versions and inline assembly". Multiple versions and assembly mean even more places to hide such a security hole.

I stress that I have not found such a security hole, I'm only considering what the worst case possibilities are. I think we need to fully consider them in order to decide how to fully wrap up this mess.

Whether such stealthy security holes have been introduced into xz by "Jia Tan" or not, there are definitely indications that the ssh backdoor was not the end of what they had planned.

For one thing, the "test file" based system they introduced was extensible. They could have been planning to add more test files later that backdoored xz in further ways.

And then there's the matter of the disabling of the Landlock sandbox. This was not necessary for the ssh backdoor, because the sandbox is only used by the xz command, not by liblzma. So why did they potentially tip their hand by adding that rogue "." that disables the sandbox?

A sandbox would not prevent the kind of attack I discuss above, where xz is just modifying code that it decompresses. Disabling the sandbox suggests that they were going to make xz run arbitrary code that perhaps wrote to files it shouldn’t be touching, in order to install a backdoor in the system.

Both deb and rpm use xz compression, and with the sandbox disabled, whether they link with liblzma or run the xz command, a backdoored xz can write to any file on the system while dpkg or rpm is running, and no one is likely to notice, because that’s the kind of thing a package manager does.

My impression is that all of this was well planned and they were in it for the long haul. They had no reason to stop with backdooring ssh, except for the risk of additional exposure. But they decided to take that risk with the sandbox disabling. So they planned to do more, and every commit by "Jia Tan", and really every commit that they could have influenced, needs to be distrusted.

This is why I've suggested to Debian that they revert to an earlier version of xz. That would be my advice to anyone distributing xz.

I do have an xz-unscathed fork, which I’ve carefully constructed to avoid all "Jia Tan"-involved commits. It feels good to not need to worry about dpkg and tar. I only plan to maintain this fork minimally, e.g. security fixes. Hopefully Lasse Collin will consider these possibilities and address them in his response to the attack.

brennen (3 days ago, Boulder, CO)

Types of Eclipse Photo

The most rare, top-tier eclipse photo would be the Solar Earth Eclipse, but the Apollo 12 crew's attempt to capture it was marred by camera shake. They said it looked spectacular, though.
brennen (3 days ago, Boulder, CO): Tag yourself. I'm "focus issues", "reaction shot", and "traffic jam". (Unlisted here: "Selfie with focus issues".)

https://sarahcandersen.com/post/746752075494572032

brennen (9 days ago, Boulder, CO)

ode to a faux grecian urn


Howdy everyone,

Today’s house, built in 2001, comes to you from, you guessed it, the Chicago suburbs. The house is a testimony to traditional craftsmanship and traditional values (having lots of money). The cost of painting this house greige is approximately the GDP of Slovenia, so the owners have decided to keep it period-perfect (beige). Anyway.

This 5 bedroom, 7.5 bathroom house clocks in at a completely reasonable 12,700 square feet. If you like hulking masses and all-tile interiors, it could be all yours for the reasonable price of $2.65 million.

The problem with having a house that is 12,700 square feet is that they have to go somewhere. At least 500 of them were devoted to this foyer. Despite the size, I consider this a rather cold and lackluster welcome. Cold feet, anyone?

The theme of this house is, vaguely, “old stuff.” Kind of like if Chuck E Cheese did the sets for Spartacus. Why the dining room is on a platform is a good question. The answer: the American mind desires clearly demarcated space, which, sadly, is verboten in our culture.

The other problem with a 12,700 square foot house is that even huge furniture looks tiny in it.

Entering cheat codes in “Kitchen Building Sim 2000” because I spent my entire $70,000 budget on the island.

Of course, a second sitting room (without television) is warranted. Personally speaking, I’m team Prince.

I wonder why rich people do this. Surely they must know it’s tacky, right? That it’s giving Liberace? (Ask your parents, kids.) That it’s giving Art.com 75%-off sale if you enter the code ROMANEMPIRE.

Something about the bathroom really just says “You know what, I give up. Who cares?” But this is not even the worst part of the bathroom…

Not gonna lie, this activates my fight or flight response.

If you remember Raggedy Ann you should probably schedule your first colonoscopy.

Anyways, that does it for the interior. Let’s take a nice peek at what’s out back.

I love mowing in a line. I love monomaniacal tasks that are lethal to gophers.

Alright, that does it for this edition of McMansion Hell. Back to the book mines for me. Bonus posts up on Patreon soon.

If you like this post and want more like it, support McMansion Hell on Patreon for as little as $1/month for access to great bonus content including a discord server, extra posts, and livestreams.

Not into recurring payments? Try the tip jar! Student loans just started back up!

brennen (11 days ago, Boulder, CO): "I love monomaniacal tasks that are lethal to gophers."