Project Zero Exploits 'Unexploitable' Glibc Bug 98
NotInHere (3654617) writes with news that Google's Project Zero has been hard at work. A month ago they reported an off-by-one error in glibc that overwrites a byte on the heap with NUL, and were met with skepticism that it could be used in an attack. To convince the skeptical glibc developers, Project Zero devised an exploit of the out-of-bounds NUL write that gains root access through the setuid binary pkexec. 44 days after being reported, the bug has been fixed.
They even managed to defeat address space randomization on 32-bit platforms by tweaking ulimits. 64-bit systems should remain safe if they are using address space randomization.
Re: microsofties here is your chance to party (Score:5, Informative)
Embedded stuff would typically use uClibc. Android uses Bionic libc.
Most ARM might be 32 bit but most ARM doesn't use Glibc.
Summary is completely exaggerated (Score:5, Informative)
I read through the thread, and at no point was the bug considered "unexploitable". Even skepticism is too strong a word. The only doubt raised was the question "How likely is that?"
C Needs Bounds Checking (Score:5, Informative)
Meanwhile, sloppy programming in any language results in unintended side effects.
Yes, but the lack of bounds checking in C is kind of crazy. The compiler now goes out of its way to delete error-checking code simply because it runs into "undefined behavior," but no matter how obvious a bounds violation is, the compiler won't even mention it. Go ahead and try it: create an array, index it with an immediate value of negative one, and compile. It won't complain at all. But god forbid you accidentally write code that depends on signed overflow to function correctly, because that's something the compiler needs to notice and do something about; namely, it removes your overflow-detection code, because obviously you've memorized the C standard in its entirety and you're infallible, and there's no chance whatsoever that anyone ever thought "undefined behavior" might mean "it'll just do whatever the platform the code was compiled for happens to do" rather than "it can do anything at all, no matter how little sense it makes."
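To make the overflow-detection complaint concrete, here is a minimal sketch (function names are illustrative, not from any real codebase): the first check relies on signed overflow actually wrapping, which is undefined behavior, so an optimizing compiler is permitted to delete it; the second compares against INT_MAX and INT_MIN before adding, so no overflow ever occurs and the check survives any optimization level.

```c
#include <limits.h>

/* UB-reliant check: if a + b overflows, behavior is undefined, so the
 * compiler may assume it never happens and remove this test entirely --
 * exactly the complaint above. */
int overflow_check_ub(int a, int b) {
    return a + b < a;   /* undefined when the sum overflows */
}

/* Well-defined check: compares against the limits *before* adding, so
 * no signed overflow ever occurs and the compiler has no license to
 * delete the test. */
int overflow_check_safe(int a, int b) {
    return (b > 0 && a > INT_MAX - b) ||
           (b < 0 && a < INT_MIN - b);
}
```

The safe form is the idiom the standard effectively forces on you once you accept that signed overflow is undefined rather than platform-defined wraparound.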
Due to just how well GCC optimizes code, bounds checking wouldn't be a huge detriment to program execution speed. In some cases the compiler could verify at compile time that bounds violations will not occur. At other times, it could find more logical ways to check, like if there's a "for (int i = 0; i < some_variable; i++)" used to index an array, the compiler would know that simply checking "some_variable" against the bounds of the array before executing the loop is sufficient. I've looked at the code GCC generates, and optimizations like these are well within its abilities. The end result is that bounds checking wouldn't hinder execution speeds as much as everyone thinks. A compare and a conditional jump isn't a whole lot of code to begin with, and with the compiler determining that a lot of those tests aren't even necessary, it simply wouldn't be a big deal.
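The hoisting described above can be sketched by hand. This is an illustration of the transformation a compiler could perform, not GCC's actual output; the function names are made up. The first version pays a compare and branch on every iteration; the second proves all accesses in bounds with a single comparison before the loop, since `i` only ranges over `[0, n)`.

```c
#include <stddef.h>

/* Naive bounds checking: a compare and branch on every iteration, the
 * cost people usually object to. Returns -1 on a would-be
 * out-of-bounds access. */
int sum_checked(const int *arr, size_t arr_len, size_t n) {
    int total = 0;
    for (size_t i = 0; i < n; i++) {
        if (i >= arr_len)          /* checked n times */
            return -1;
        total += arr[i];
    }
    return total;
}

/* Hoisted check: i only takes values in [0, n), so comparing n against
 * the array length once before the loop proves every access is in
 * bounds -- the single test described above. */
int sum_hoisted(const int *arr, size_t arr_len, size_t n) {
    if (n > arr_len)               /* one check replaces n checks */
        return -1;
    int total = 0;
    for (size_t i = 0; i < n; i++)
        total += arr[i];
    return total;
}
```

Both functions return the same results; only the number of comparisons executed differs.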
Re:Summary is completely exaggerated (Score:5, Informative)
I chose the word skepticism, and I still think it fits. I agree that the word "unexploitable" was a bit exaggerated, but that was added by unknown lamer.
Florian Weimer [sourceware.org] said:
My assessment is "not exploitable" because it's a NUL byte written into malloc metadata. But Tavis disagrees. He is usually right. And that's why I'm not really sure.
It's true, however, that he corrected himself a bit later the same day:
>> if not maybe the one byte overflow is still exploitable.
>
> Hmm. How likely is that? It overflows in to malloc metadata, and the
> glibc malloc hardening should catch that these days.
Not necessarily on 32-bit architectures, so I agree with Tavis now, and
we need a CVE.
Re:"Unexploitable" sudo bug pre-1.6.3p6 (Score:4, Informative)
I've read a bit through the threads, and I think the reason it took so long is that they decided to remove a feature [openwall.com] to fix the problem:
I believe the current plan is to completely remove the transliteration
module support, as it hasn't worked for 10+ years.
The git commit message says the same. There really were some problems in that function: https://sourceware.org/ml/libc... [sourceware.org]
Re:C Needs Bounds Checking (Score:5, Informative)
Personally, I never worry about code not executing [quickly] enough.
You know, people say stuff like that all the time, but all it proves is that you're not a programmer who develops speed-critical applications. Guess what? There are lots of people who are. Game programmers (me). Simulation programmers. OS/kernel developers. There are some situations where fast is never fast enough. You're thinking like a desktop developer who writes business applications that probably aren't that demanding of the CPU. Get a faster processor? I wish! That's not possible for console developers, or when you're running software in data centers with thousands of machines. Those are real problems, and they require highly optimized code, not more hardware. Most programmers have no idea how much the constant push for efficiency colors everything we do.
Just the other day I was looking at a test case where a complicated pathfinding scenario pegs my 8-core CPU when a lot of units are on-screen at once. That's not some theoretical problem, and telling users they need some uber-machine to play the game is a non-starter. I either need to ensure my game design avoids those scenarios or I'll need to further optimize the pathfinding systems to allow for more units in the game.
That being said, I agree with your complaint about C's fundamental insecurity, but it's not as simple as adding a compiler switch. For the most common and checkable kinds of bounds problems, and for library functions that can cause problems, Microsoft's C/C++ compiler already does what you've suggested to a degree (I'm less certain about GCC). The big problem with bounds checking in C is that arrays are simply pointers to memory. The compiler doesn't always know how big that block of memory is, because there's no size associated with the pointer. It's possible in some cases to do bounds checking, but not in many others. It's a fundamental difficulty with the language, and it's impossible for the compiler to check all those bounds without help from the language or the programmer.
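The "help from the programmer" part can be sketched in plain C (the struct and function names here are illustrative, not a standard API): once an array decays to a bare `int *`, its length is gone and no compiler switch can recover it, but bundling the length with the pointer into a slice-style "fat pointer" makes the check expressible.

```c
#include <stddef.h>

/* A bare int* carries no length, so neither the callee nor the
 * compiler can check bounds through it. A "fat pointer" bundles the
 * length with the pointer so the check becomes possible. */
struct int_slice {
    int   *data;
    size_t len;
};

/* Bounds-checked read: returns 0 on success and fills *out, or -1 when
 * the index is out of range -- a condition a bare int* could never let
 * us detect. */
int slice_get(struct int_slice s, size_t i, int *out) {
    if (i >= s.len)
        return -1;
    *out = s.data[i];
    return 0;
}
```

This is essentially what languages with checked arrays do under the hood: the length travels with the pointer, so every access can be validated.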