Heartbleed Coder: Bug In OpenSSL Was an Honest Mistake 447
nk497 (1345219) writes "The Heartbleed bug in OpenSSL wasn't placed there deliberately, according to the coder responsible for the mistake — despite suspicions from many that security services may have been behind it. OpenSSL logs show that German developer Robin Seggelmann introduced the bug into OpenSSL when working on the open-source project two and a half years ago, according to an Australian newspaper. The change was logged on New Year's Eve 2011. 'I was working on improving OpenSSL and submitted numerous bug fixes and added new features,' Seggelmann told the Sydney Morning Herald. 'In one of the new features, unfortunately, I missed validating a variable containing a length.' His work was reviewed, but the reviewer also missed the error, and it was included in the released version of OpenSSL."
It goes to show that... (Score:5, Insightful)
Coding and champagne do not mix.
Not malicious but not honest? (Score:5, Insightful)
Hmm, considering that the real bug is OpenSSL's malloc, where it was reusing 'freed' memory, I think that's the bug that is critical. The developer who put in the TLS support itself may have been under the assumption that since OpenSSL didn't die/crash once it was implemented, everything was hunky-dory, and the reviewer likewise.
Re:Not malicious but not honest? (Score:5, Insightful)
Ditto. Writing a custom malloc is insane for a sensitive security library like this... especially when it is done so carelessly.
The fact that OpenSSL won't even work using the regular malloc() suggests that there are more issues waiting to pop up here.
Re:Not malicious but not honest? (Score:5, Informative)
considering that the real bug is OpenSSL's malloc, where it was reusing 'freed' memory I think that's the bug that is critical.
Well, no.
The bad bit of code is, per http://www.tedunangst.com/flak/post/heartbleed-vs-mallocconf [tedunangst.com]:
struct {
    unsigned short len;   /* attacker-controlled claimed payload length */
    char payload[];
} *packet;

packet = malloc(amt);
read(s, packet, amt);                          /* "amt" bytes actually arrive */
buffer = malloc(packet->len);                  /* but packet->len is trusted blindly... */
memcpy(buffer, packet->payload, packet->len);  /* ...so this can read past the end of "packet" */
write(s, buffer, packet->len);
The bad bit is that "amt" is not checked against "packet->len", so the copy into "buffer" reads off the end of the allocated data structure "packet". The data read may be freed memory, or it may be allocated memory.
The only way malloc could completely protect against the bug would be by putting an unmapped guard page after every malloced block - making every malloced block at least one page long and slowing malloc down by the time needed for all those munmaps. (Probably making malloc slow enough to incite OpenSSL devs to implement their own malloc layer...)
I think the real bug is in the RFC.
Look at the fix:
if (1 + 2 + 16 > s->s3->rrec.length)
    return 0;  /* record too short for type + length + minimum padding: discard silently */
hbtype = *p++;
n2s(p, payload);  /* read the 16-bit claimed payload length */
if (1 + 2 + payload + 16 > s->s3->rrec.length)
    return 0;  /* claimed payload does not fit in the record: discard silently */
pl = p;
Why does the heartbeat request even contain the length of the heartbeat block? We know the length of the SSL record!
(Not even bothering with the whole problem that the heartbeat thing is ridiculous - there are already ways of keeping connections alive at the TCP level - why does every layer of the protocol need its own keepalive?)
Re:Not malicious but not honest? (Score:5, Interesting)
I know this is redundant but this is funny:
Seggelmann told the Sydney Morning Herald. 'In one of the new features, unfortunately, I missed validating a variable containing a length.'
https://tools.ietf.org/html/rfc6520 [ietf.org]
Internet Engineering Task Force (IETF)
Request for Comments: 6520
Category: Standards Track
ISSN: 2070-1721
R. Seggelmann, M. Tuexen: Muenster Univ. of Appl. Sciences
M. Williams: GWhiz Arts & Sciences
February 2012
Transport Layer Security (TLS) and Datagram Transport Layer Security (DTLS) Heartbeat Extension ...
If the payload_length of a received HeartbeatMessage is too large, the received HeartbeatMessage MUST be discarded silently.
Can't even implement an RFC he wrote himself. Nice one.
Re:Not malicious but not honest? (Score:5, Insightful)
Look at the recent Apple security hole: all they did was leave in a redundant "goto" statement that unfortunately had the effect of negating the purpose of the previous check and opening a hole. It could have been as simple as not deleting the line due to a distraction somewhere else in the code, and it likely passed all the unit tests (can't easily write a test for all of these types of mistakes.)
Programmers are human. They'll make a ton of mistakes. I can't say I blame him for making one; anyone who has written enough C code is used to making plenty of mistakes before something usable comes out the other end.
Re:Not malicious but not honest? (Score:5, Insightful)
The process of decomposing a big problem into low-level steps, as C programming requires, is mentally brutal. You can't just say "I want to save the text that was deleted and restore it when they hit the undo key." You have to translate that into variables, pointers, structs, mallocs, and glue logic. You have to take into account every corner case: how do you undo multiple lines of pasted text instead of single characters? How do you store undo information for a paste-over operation, where you must keep a record of what was deleted but also record that it overwrites a (possibly different length) chunk of the text buffer? How do you store the undo data for find-and-replace operations? Now that you've managed to figure out ways to store all these different types of text editing operations for undoing, how do you actually undo each one? Where does all this stuff need to hook into the existing program, and how will you hook it in without breaking existing functionality?
It's so easy to forget trivially simple things when you're trying to flow your mind through that complex glue logic that makes the magic happen, especially when you make a change, it appears to work, and you have no way to test for the bug you created by accidental error or omission.
Re: (Score:3, Interesting)
Doctors are human. We hold them accountable for their mistakes. Engineers are human. We hold them accountable for their mistakes. Indeed, we hold just about everybody accountable for their on-the-job mistakes and the consequences of their mistakes result in everything from terminations to criminal proceedings.
So, when should programmers be held accountable for their mistakes, and how should we respond as a society?
Re:Not malicious but not honest? (Score:5, Insightful)
A doctor performing surgery or an engineer designing a manufacturing machine (both at significant expense to the customer) is quite different from something you got completely for free, with full design schematics and an upfront "there is no guarantee that this will work, but we hope you find it useful" warning, leaking its contents for some reason.
Re:Not malicious but not honest? (Score:4, Insightful)
Re: (Score:3)
Programmers are human. They'll make a ton of mistakes
Humans make certain classes of mistake. Things like array bounds checking are really easy to miss.
If only machines were good at this stuff. How come we don't have any languages that do this for us yet?
Re: (Score:3)
(I hope the sarcasm in my comment was obvious.)
Re:Not malicious but not honest? (Score:5, Insightful)
Re: (Score:3)
Re: (Score:3, Interesting)
Why does the heartbeat request even contain the length of the heartbeat block? We know the length of the SSL record!
The record has two variable length fields, so you need a length field for either the payload or the padding. In this case the payload has the length field and the padding length is implicit.
Re:Not malicious but not honest? (Score:5, Insightful)
If they had stuck to their area of expertise and simply used the malloc provided with the system, like all the regular chaps would do, even in their situation, the code would have crashed upon running (freed memory access) and the bug would have surfaced already on New Year's Eve 2012-2013, when Seggelmann was hopefully test-running it. So, even though the code you quoted is indeed the "bad bit", the real and broader issue probably is the team's questionable approach to development in general, in particular their false belief that someone writing a security library should consider themselves expert in rewriting heap management. Which ultimately cost them and their users. Sloppy, sloppy.
This kind of practice of overestimating one's area of expertise should be frowned upon every time, for good reason. We (developers) need to get it into our heads: not all algorithms are equal, and even though you and I may be prime experts at, say, writing a perfectly safe implementation of SSL/TLS, we probably should steer clear of the stuff others know much more about, like heaps, strings and what not. Time and again, someone comes along with the "brilliant" idea of "optimizing" the system heap allocator by caching memory blocks. True genius. No offense Robin, but WHY?! Yes, maybe the system malloc is slower than you'd like - still, it is NOT YOUR PROBLEM. Division of responsibility, man. Let the glibc folks optimize malloc, or submit a patch THEY can review, if you have wonderful malloc improvement ideas.
Re: (Score:3, Insightful)
If they had stuck to their area of expertise and simply used the malloc provided with the system, like all the regular chaps would do, even in their situation, the code would have crashed upon running (freed memory access) and the bug would have surfaced already on New Year's Eve 2012-2013, when Seggelmann was hopefully test-running it.
Only if Seggelmann thought to test it with a malformed packet.
If the code was run with a malloc that splattered unmapped pages around like OpenBSD malloc apparently does then it would crash when it was exploited, and people would be complaining "OpenBSD breaks OpenSSL, fix your shit Theo" and the problem would have been found earlier.
Comment removed (Score:5, Informative)
Whatever you may think ... (Score:5, Interesting)
I suspect s/he'll get pilloried in the press and may end up with some lawsuits (?) but I, for one, recognize a person big enough to take responsibility.
Re: (Score:2, Informative)
The RFC and the Git commit both have his name firmly attached to the "feature". He hasn't admitted anything that wasn't proven already.
Re:Whatever you may think ... (Score:5, Insightful)
may end up with some lawsuits (?)
very difficult;
a) the license clearly states the lack of warranty
b) the source code is available so you could have audited if you want
c) I for one will contribute considerably if that happens unless decent evidence is uncovered that he works for the NSA. I'm sure there are several others who will do the same.
Re:Whatever you may think ... (Score:4, Funny)
I'm sure the next issue of Newsweek will have his confession.
Re:Whatever you may think ... (Score:4, Insightful)
Devil's advocate here: I have to thank the developer for even working on OpenSSL. I fear that the bad press and consequences coming down on these developers will scare more people off from core OSS projects and we will end up with more app developers on smartphones instead of infrastructure improvements.
It would be nice if they had some sort of code review in place for this sort of stuff. However, this isn't a paid project, so the developers writing this are doing arguably the best they can.
Re: (Score:2)
Well, he hasn't admitted to anything. He has two options: 1) Admit it was malicious 2) Admit it was a mistake. It doesn't take a genius to figure out 2 is the best option.
Re:Whatever you may think ... (Score:5, Insightful)
... likely because everyone thought someone else would have been looking at it.
Re:Whatever you may think ... (Score:5, Insightful)
Two reasons:
The idea that many eyes make all bugs shallow is a myth. Even most programmers don't bother auditing the open source code they download. I bet most of them don't really look beyond the API documentation.
Also, OpenSSL is one of the worst code bases you'll ever set eyes on. It's poorly documented and so complex, it'll make your eyes bleed.
Re: (Score:3)
It's not the language that's at fault, it's the attitude that resulted in a gimpy "replacement" for malloc() being used for all platforms because some platforms had a slow malloc() once upon a time.
Re:Whatever you may think ... (Score:4, Interesting)
Occam's confession? Don't admit to being malicious if you can easily claim you made a mistake? :-)
Re:Whatever you may think ... (Score:4, Insightful)
Well, he hasn't admitted to anything.
He has clearly stated it was his mistake. The third option you hint at is of course "Admit nothing"; it's the preferred option for 10/10 corporations, governments, religions. This guy is an engineer, one whose moral compass points to option 2. He has stood up publicly and owned his mistake so others can be aware of the problem and do something about it (coincidentally, "what to do about HB" was the topic of discussion at work today).
It doesn't take a genius to figure out 2 is the best option.
Correct, doing the "right thing" doesn't require brains, it requires principles and the balls to live up to them.
Re:Whatever you may think ... (Score:5, Insightful)
Boy, if there's one thing that could ever kill Open Source it would be being held legally liable for a commit with a bug in it.
Which is why all projects are released AS-IS without any liability.
Re:Whatever you may think ... (Score:5, Insightful)
Boy, if there's one thing that could ever kill Open Source it would be being held legally liable for a commit with a bug in it.
It burns me that RSA is not held liable for their $10M NSA backdoor in the Dual_EC_DRBG PRNG. Customers should be fleeing in droves, but RSA gives out enough swag at conferences that the suits don't care.
Your privacy sold off for $10M and some mouse pads.
Re:Whatever you may think ... (Score:5, Informative)
RSA has denied having knowledge of the backdoor, says NSA tricked them, and has never denied the $10M payout. Some of Snowden's leaks mention it.
Reuters has a summary [reuters.com]
There is also a proof-of-concept backdoor with a link to the GitHub repo.
None of that is a smoking gun, but there is enough smoke to tell me there is a fire.
Re:Whatever you may think ... (Score:4, Interesting)
There is also a nice proof-of-concept backdoor [0xbadc0de.be] with a link to the github repo.
Re: (Score:3, Informative)
I'll go as far as it is reasonable to suspect there could be a fire. All of the background reading that I have done only shows that it may be possible that a backdoor exists, not that there actually is one. As for the RSA affair, it is entirely possible that they were simply trying to promote what was at the time a very hot and promising encryption technology.
Of course, keep in mind that it took people 20 years to figure out that the NSA had strengthened DES against cryptanalysis methods that were still secret when it was designed.
Re:Whatever you may think ... (Score:5, Informative)
From the proof-of-concept page [0xbadc0de.be] I mentioned above.
Here is the Github repo for the PoC code. [github.com]
This PRNG is not the NSA making a crypto system stronger à la DES; it's a backdoor.
Re:Whatever you may think ... (Score:5, Interesting)
Re:Whatever you may think ... (Score:5, Insightful)
Re:Whatever you may think ... (Score:5, Funny)
hats off to the developer who admits a mistake.
It's laudable but insufficient; to genuinely move towards making the aggrieved parties whole, I think it demands nothing short of a full refund.
Re: (Score:3)
may end up with some lawsuits (?)
If you have ever wondered why all the popular open source licenses, like GPL, BSD and Apache, include the "warranty" and "limitation of liability" clauses, this is exactly why. The clauses usually state something like "this software is provided 'as is' and without any warranty. The user of the software assumes all risks that may arise. In no event shall the project or its contributors be liable for any damages."
Re: (Score:3)
This is not the US. There will be no lawsuits. "No warranty" actually has meaning here.
Re:Whatever you may think ... (Score:4, Interesting)
Well pretty much anyone can start a lawsuit. But what damages are they suing for? Reimbursement of the purchase price?
If you're using it, you're agreeing to the license:
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
Now, I am not a lawyer, and there are always folk looking for an opportunity to sue, but the license terms surely get them off to a bad start.
Re: (Score:3)
It's not just the implementation (Score:5, Interesting)
The design of the feature looks like a backdoor too. A heartbeat function with a variable length payload, and there is a superfluous field for the payload length, and all running on top of TCP, which already has a keep-alive function? And then the feature contains a "rookie mistake", but still passes review. Yes, we totally believe you. It was a mistake.
Re:It's not just the implementation (Score:5, Informative)
and all running on top of TCP
Not necessarily. OpenVPN is an SSL VPN which defaults to UDP/1194.
Re: (Score:3)
Side-note: as OpenVPN does not use vanilla SSL sockets, simple-minded Heartbleed exploits that work against HTTPS etc. won't be usable against it, but it is possible to hand-craft a Heartbleed attack [serverfault.com] against OpenVPN servers (or clients) running with u
Re:It's not just the implementation (Score:4, Insightful)
The feature was TLS/DTLS heartbeats, with the "D" in DTLS standing for datagram.
HTTPS isn't the only thing using OpenSSL. As grub pointed out, OpenVPN uses it as well, and will use DTLS by default.
The advantage of tunneling traffic in datagrams is that if you try to tunnel TCP traffic within TCP, performance suffers due to dueling TCP backoff.
Re:It's not just the implementation (Score:4, Insightful)
It's a possibility, but pretty paranoid reasoning.
Assuming you have written code in your life (your language implies so), there were errors. I certainly made more than a few. Some the compilers caught, some the testing, some the users, some the accountants; some... probably nobody.
Care to explain why any of yours were not intentional? Because by this reasoning it sure looks that way. Hindsight is cheap.
My knee-jerk reaction was also that the length would be superfluous - from zero context and no understanding of the spec/RFC. But there could be a lot of reasons. If your job is to process a formally defined struct, you are not going to review the struct in an attempt to change the standard.
Which brings us back to where we began. If this was a conspiracy, intentional coding would be required for the exploit. Given that, I find it difficult to accept that an agency intentionally first complicated the spec by including a length field (which *could* be checked, in the name of security, for protocol robustness rather than local memory ops), and then perverted a particular implementation in a manner that looks exceedingly garden variety. Easier to never have the length field in the first place.
Re:It's not just the implementation (Score:5, Insightful)
The developer who wrote this code also wrote the spec [ietf.org], as part of his PhD research. To me the more worrying aspect of this whole affair is that OpenSSL accepted into the trunk an extension which at the time hadn't even reached "Proposed standard" status (and still doesn't seem to have progressed beyond it).
Re:It's not just the implementation (Score:5, Informative)
As others have indicated, the primary stated use of heartbeat is for DTLS, which is not over TCP.
The payload length is not actually superfluous. The packet has an arbitrary amount of both payload and padding, of which only the payload is echoed to the sender. Roughly: { uint16 payload_len; char payload[]; char padding[]; } The intent of payload_len is to tell you which of the bytes following it are payload rather than padding. Of course, you need to check that it's less than the remaining data in the packet. (Per the spec, at least 16 less -- at least 16 bytes of random padding are required.)
on purpose or not, couldn't happen if... (Score:2, Interesting)
All I know is the organization I work for has prohibited use of C or C++ for mission-critical software for years now. The languages we use would not ALLOW code to execute which tries to copy 64K out of a buffer that only holds a few bytes.
Part of software engineering is to use the right tool for the right job. When a buffer overrun can destroy the security of the entire internet, you damn well better not be using C as your tool. Or assembly language for that matter.
Do that where I work, and you'd be fired.
Re: (Score:3)
This is BS. You can screw up just as badly in any other language. It will just be less obvious. Of course, the combination of an incompetent code writer and an incompetent reviewer will quite often have spectacularly bad results.
Re:Not true at all (Score:5, Insightful)
OpenSSL would be meaningless and not in use if it had been written in Ada. Also, Ada is not at all the "magic bullet" some people make it out to be. In fact, it suffers from unreadable code even more than clean C code does. Also remember that this is a library. Unless Ada allows creating libraries with C bindings, it is not even usable for the task. And if you invest the same time on the C side, you get a similar level of quality. Hell, this bug would have been found with some halfway competent fuzzing, or completely avoided with mandatory time-of-use checking of all bounds. Which secure coding guidelines can make mandatory, unless you explicitly explain in each instance where you omit it why that is ok.
Sure, languages play an important role, but it always falls back to Assembler-level (or C if you want to be somewhat portable) and there competent people can do all things that imperative languages can do. The problem with most languages is that they restrict too much what you can do and/or foster bloat and/or have intransparent behavior.
I have done Eiffel, Sather, Python, Lua, Pascal, Modula, Haskell, Gofer, Prolog, Lisp, C++, several assembler variants, Basic, Java, JavaScript, Pizza, and some more things. I am back to doing all "library" level things in C and the glue in Python or Lua (or C where that is not an option). Basically all OO languages suck badly, except Eiffel, which unfortunately is too exotic to be useful. Functional languages are nice and compact for some things, but suffer from interfacing problems. (Will have a look at Erlang though, that may be better.) Logical languages have some value in some specific situations, but do not even approach "general purpose". C++ and Java are bloated, intransparent, non-orthogonal atrocities made by people that do not understand OO. These languages typically make things worse unless you have really experienced and capable people. The typical mediocre-to-bad coders just throw the features of these languages around indiscriminately, resulting in an awful mess. IDEs make that worse, as suddenly it becomes easy to write more code and distribute it into even more files.
Yes, you need discipline, experience and insight to do things well in C. The current OpenSSL disaster shows all three are missing in the OpenSSL team. Yes, many C programmers suffer from a terminal desire to place efficiency above everything else. But that problem is with the programmer, not the language. And you get just the same stupid mistakes in other languages and they add their own problems.
That is not to say C is a good language. It is (besides assembler) just the only language where code quality rests completely with the coder and that does not stand in the way of the coder. It is the only "native" language if you do not want to do assembler, and that gives it a special place - and anyone who cannot work competently with it, a clear limitation. At the same time, all approaches to force people to write "good code" by language features and restrictions have failed. There is even bloated, unreadable, unreliable code in Python or Eiffel, and you have to really work at that in these languages.
After being in this for 30 years and having seen and reviewed countless pieces of code by others, I am convinced that language selection is mostly irrelevant. The only thing language can give you is problems in the form of restrictions. If the coder is not top-notch, the code will suck and be insecure, slow, unreliable and unmaintainable, regardless of the language used. If the coder is top-notch, the code will be secure, fast, reliable and easy to maintain, mostly regardless of language used (within the restrictions the language comes with). However a top-notch coder will know or be able to figure out fast what a specific language can do and what its restrictions are and know early whether it is suitable for a specific project or not.
Of course, that said, you are right that people that want to do everything in one specific language (often Java these days
code review idea (Score:2)
Well, maybe this is a blessing. While it's open source, maybe multiple eyes need to look at it for final validation.
While I have never worked with open source, I do work with data very often. I re-validate entries a lot and just scan data looking for something odd. I am amazed at the number of errors I find, so I report them and go from that point onwards.
sometimes I work with teams of people and have a kind of race looking to find the most errors, then we go out and the winner only buys 1 round, and does not p
Re:code review idea (Score:5, Insightful)
Well, maybe this is a blessing. While it's open source, maybe multiple eyes need to look at it for final validation.
No, it's a curse. I have input fuzzing, unit tests, code coverage profiling and Valgrind memory tests. Such a bug wouldn't have slipped past me with both eyes shut -- no, seriously! If I fuck up accidentally like this, THE COMPUTER TELLS ME SO without my ever having to do anything but make the mistake and type make test all. I test every line of code on every side of my #ifdef options, in all my projects. If you're implementing ENCRYPTION AND/OR SECURITY SOFTWARE then I expect such practices as the absolute minimum effort -- I mean, that's what I do, even when it's just me coding on dinky indie games as a hobby. I don't want to be known as the guy whose game was used to compromise users' credentials or data; that would be game over for me.
These ass-hats have just shown the world that they can't be trusted to use the fucking tools we wrote that would have prevented this shit if they'd just run them. It's really not acceptable. It's hard to comprehend the degree of unacceptable this is. It reeks of intentional disaster masquerading as coy "accidental" screw up: "silly me, I just didn't do anything you're supposed to do when you're developing industry standard security software". No. Just, no. An ancient optimization that was made default even though it only mattered on SOME slower platforms? Yeah, OK, that's fucking dumb, I can buy it as an accident. However, NOT TESTING BOTH BRANCHES for that option? What the actual fuck? I could see someone missing an edge case in their unit test, but not even using input fuzzing at all? It's not hard; shit, I have a script that generates the basic unit fuzzing code from the function signatures in .H files, you know, so you don't miss a stub...
"Never attribute to malice what can be adequately explained by stupidity." -- The level of stupidity required is unexplainable. How the fuck are they this inept and in charge of THIS project? THAT'S the real issue. This isn't even the fist time OpenSSL shit the bed so bad. [taint.org] In <- this linked example, it was Debian maintainers and not the OpenSSL maintainers fault (directly): Instead of adding an exception to the Valgrind ignore list (which you most frequently must have in any moderately sized project, esp one that handles its own memory management) they instead commented out the source of entropy, making all the SSL connections and keys generated by OpenSSL easily exploitable since it gutted the entropy of the random number generator (which is a known prime target for breakage that's very hard to get right even if you're not evil, so any change thereto needs to be extremely well vetted). Last time the OpenSSL maintainers brazenly commented they "would have fallen about laughing, and once we had got our breath back, told them what a terrible idea this was." -- Except that they silently stopped paying attention to to the public bug tracker / questions and quietly moved to another dev area, making it nearly impossible to contact them to ask them about anything (a big no-no in Open Source dev), but it gives you a better idea about the sort of maintainers these fuck-tards are.
We don't know absolutely for sure, but we're pretty damn close to absolutely certain that OpenSSL and other security products (see: RSA's BSafe) are being targeted for anti-sec by damn near all the powers that be. So, now we find out OpenSSL has an obsolete optimization -- a custom memory pool (red flag goes up right away if you see memory reuse in a security product; that shit MUST be even more thoroughly checked than entropy pools, since it can cause remote code execution, memory leaks, and internal state exposure... you don't say?). We find the bug it hid would have been caught by a basic fuzz test with Valgrind, which apparently folks have been using previously according to the comments in the prior S
Re:code review idea (Score:5, Interesting)
Serious question: Why don't you become the new maintainer yourself, if you honestly believe you can do a significantly better job at it than the current person(s)?
I don't do it myself because I can not guarantee that I wouldn't make even worse mistakes. I'm glad there are people out there who are willing to do the job, and I'm in no position to bite their heads off when they mess it up. And you're probably glad that I'm not a maintainer of anything even remotely security-related :-)
Re: (Score:3)
Bingo. There are a lot of people who are willing to declare that the unpaid volunteers don't know what they're doing... but have absolutely no patience to try and contribute or agitate for change from within. Which really makes me wonder whether they'd have the wherewithal to run a new project, or, you know, do anything at all.
Code that gets written gets run.
Re: (Score:3)
Instead of adding an exception to the Valgrind ignore list (which you most frequently must have in any moderately sized project, esp one that handles its own memory management)
If you need to add exceptions to get a tool to work... the tool is wrong for the job.
for a library... (Score:5, Insightful)
... so much of the internet depends on for security, just one reviewer for a commit seems way, way, way too little. Honestly, checking anything into openssl (or gnutls) should be at least a 4-step approval process (submitter -> maintainer for that area -> overall library maintainer -> security officer); for any code that involves buffers/malloc, especially if related to user-supplied data, the final security review should be a panel.
Everybody makes mistakes, everybody can have a 'brown paper bag' coding moment (especially around Christmas/New Year's like it happened in this case), 2 people having a 'brown paper bag' moment at the same time around the holidays is definitely not that unlikely, for something as important as a crypto library on which so many things depend a single reviewer is just not enough.
I do feel for the original developer, and I hope he won't suffer more over this than he already has (any developer worth their salt feels quite bad about the bugs they introduce, let alone ones that lead to this many problems). We've all made coding mistakes, no matter how experienced we are, so the focus should not be on "who" but on "what kind of process can we introduce so this does not happen again".
Moving away from C would, in my opinion, just be a band-aid. Other languages don't expose you to this particular bug, fine, but for security software the vetting process for what goes into the codebase matters a lot more than the language it's written in. It's also not that "hard" to write "secure C", especially if one leans on the various available tools/libraries and writes proper unit tests. In this case, for example, had the malloc decision not been driven by performance concerns (on unspecified platforms), this would not have been as big a deal as it was.
Re:for a library... (Score:5, Insightful)
Moving away from C just means you now have to have faith in some bytecode virtual machine's memory and buffer management. Is it a more secure approach? Maybe, but if the root complaint is putting faith in complex software, coding in Java or some .NET language means trusting the people coding those engines are equally capable of screwing up. All these higher level virtual machines and interpreters are ultimately written in C.
Re: (Score:3)
Moving away from C just means you now have to have faith in some bytecode virtual machine's memory and buffer management. Is it a more secure approach? Maybe, but if the root complaint is putting faith in complex software, coding in Java or some .NET language means trusting the people coding those engines are equally capable of screwing up. All these higher level virtual machines and interpreters are ultimately written in C.
Or you could just use C++ complete with their bounds-checked containers.
Re: (Score:3)
And what languages are these languages themselves written in? At some point you're working with something written in C, C++ or assembler, and if those languages are dangerous to directly write apps in, then surely they must be equally dangerous to write the compilers and platforms on which your non-VM language runs.
At some point it's turtles all the way down. By writing in some other language, you're putting your faith in the people writing the interpreters, VMs and/or compilers, and in many cases those dev
Re: (Score:3)
And what languages are these languages themselves written in?
Languages aren't written in anything. (Though, the specification for a language could be written in English, for instance.) Compilers and interpreters are written in a language.
and if those languages are dangerous to directly write apps in, then surely they must be equally dangerous to write the compilers and platforms on which your non-VM language runs.
A program isn't run by a compiler, it's compiled by a compiler. There's a pretty significant difference.
There's also a significant difference between writing an application in C and writing an application in a language that is compiled by a compiler written in C. For one, the compiler has a much smaller attack surface. There is a lot
Re: (Score:3)
Great, the OpenSSL project needs more people working on it (even if only reviewers).
You are volunteering I see ?
Re:for a library... (Score:5, Insightful)
Just because everyone can look at the code, doesn't mean anyone IS looking at the code.
Security is only one good reason to have the source code. Another very big convenience for programmers is the ability to figure out what went wrong when a library returns a bug.
Furthermore, no one ever said open source was perfect, just that it was better than proprietary code. And that is probably true; the worst open source code I've seen is much better than the worst proprietary code I've seen.
Re: (Score:3)
The ironic thing is, in the last 5 years, I've seen more memory leaks go into production from Java code than from C code.
Memory leaks that waste memory aren't good.
Memory leaks that expose memory contents are bad.
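The distinction the parent draws can be made concrete with a toy allocator in the spirit of the freelist discussion above: a recycled buffer that is handed back without being cleared lets a new "request" read stale secrets. This is a hedged illustration only; the names and the single-slab pool are made up and are not OpenSSL's actual allocator.

```c
#include <string.h>

/* Toy one-slab pool: memory is reused on free, but never scrubbed. */
static char slab[64];
static int slab_in_use;

static char *pool_alloc(void) {
    if (slab_in_use) return 0;        /* pool exhausted */
    slab_in_use = 1;
    return slab;                      /* handed back as-is, no clearing */
}

static void pool_free(char *p) {
    (void)p;
    slab_in_use = 0;                  /* marked free; contents remain */
}

/* Returns 1 if a fresh allocation still holds the previous user's data. */
int recycled_buffer_leaks_contents(void) {
    char *a = pool_alloc();
    strcpy(a, "secret key material");
    pool_free(a);                     /* "freed", but bytes untouched   */
    char *b = pool_alloc();           /* same slab comes straight back  */
    int exposed = strcmp(b, "secret key material") == 0;
    pool_free(b);
    return exposed;
}
```

A plain leak just loses track of the slab; the exposure case is this one, where the next caller can read what the previous caller left behind.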
Bigger problem: stupid 'optimizations' (Score:3, Interesting)
The bigger problem is coders that think they need to optimize for speed.
Read the horror here: http://article.gmane.org/gmane... [gmane.org]
Ugh... premature optimization, the root of all evil. And now also the root of the biggest security hole ever.
Re:Bigger problem: stupid 'optimizations' (Score:4, Insightful)
...then there is a real world cost to not optimizing this code.
Turns out there's a real-world cost to optimizing it, too!
Why is he even excusing himself ? (Score:5, Insightful)
Re:Why is he even excusing himself ? (Score:5, Insightful)
Thank you for a bit of sanity. What you wrote should be the summary text for every article on the subject: "He gave it a best effort shot given the resources. If you can do better, fucking do it."
As an open-source dev myself, I often wonder why the fuck I do anything useful for others when they'll just turn on me the moment their toys don't work exactly as desired because -- gorsh -- I'm not perfect, though I work very hard to be.
Re:Why is he even excusing himself ? (Score:5, Insightful)
As an open-source dev myself, I often wonder why the fuck I do anything useful for others when they'll just turn on me the moment their toys don't work exactly as desired because -- gorsh -- I'm not perfect, though I work very hard to be.
Well, I'm a developer too. Mostly open source. Thing is, I don't bite off more than I can chew. This is a security product. They're not using basic code coverage tools on every line, or input fuzzing. They missed a unit test that should have been automatically generated. This is like offering a free oil change service boasting A+ Certified Mechanics, then forgetting to put oil in the damn car. Yeah, it was a free oil change, but come the fuck-on man. You really can't fuck up this bad unless you're stoned! I mean, if you change the oil, you check the oil level after you're done to ensure it hasn't been over-filled... You check all the code-paths, and fuzz test to make sure both sides of the #ifdef validate the same, or else why even keep that code? "I can accept the responsibility of maintaining and contributing to an industry standard security product" "YIKES I Didn't Fully Test my Contribution! Don't blame me! I never said I could accept the responsibility of contributing to or maintaining an industry standard security product!"
It's cancerous shit like you that give open source a bad name. Own up, or Fuck off.
Re:Why is he even excusing himself ? (Score:5, Interesting)
Welcome to Engineering. Scott Adams (of Dilbert fame) best summarized this disconnect between commendation and blame in the Engineers Explained [boulder.co.us] chapter of his book:
not a single mistake, symptom of bigger problem (Score:3, Interesting)
http://article.gmane.org/gmane... [gmane.org]
Re: (Score:3)
Indeed. While both this person and the reviewer messed up badly and their competence is rightfully in question, they were also set up by the omission of basically everything you need to do to produce secure code in OpenSSL, and by sabotage (likely due to terminal stupidity) of safeguards that _were_ available.
To produce a catastrophe, several people have to screw up spectacularly. This is what happened here. Quite often, when you dig down, you find a combination of big egos and small skills.
Sloppy code (Score:5, Informative)
Re:Sloppy code (Score:4, Interesting)
I glanced at some of the OpenSSL C code, in particular the new code that introduced this bug.
I don't disagree about the 'coding style' issue, but that kinda misses the point. The points are:
There's a memcpy() - where is the bounds checking? Hello? It's not 1976. We all know memcpy is dangerous. Where there's a memcpy there should be a bounds check... even in a fart app. If the project has "secure" in the title there should be paranoid, anal-retentive checking of both the source and destination buffers.
The code uses data that has come from teh interwebs - again, where's the obsessive-compulsive validity checking on everything that comes in?
However, that's still not the point. Programmers make mistakes - and this bug was at least a bit more subtle than the usual one where the bad hat sends an over-length string.
The problem is with the oft-made claim that open source security software is extra-safe because the code is public and has been seen by many eyeballs. That claim is dead. Possibly crypto experts have been all over the actual encryption/decryption algorithms in OpenSSL like flies on shit - however, clearly none of them looked at the boring heartbeat stuff. That shouldn't be the death of open source, though - Windows is proprietary, and look at the sheer terror caused by the prospect of running Windows XP for one day after the security patches stop...
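For what it's worth, the kind of bounds check the grandparent is asking for is short. Here's a hedged sketch (function and names are made up, not OpenSSL's actual code) of parsing a heartbeat-style record of the form [1-byte type][2-byte claimed payload length][payload][16-byte padding], where the claimed length is checked against what actually arrived before any memcpy happens:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Illustrative parser: copies the payload out of a received record
 * only if the length field inside the record is consistent with the
 * number of bytes that were actually received.
 * Returns the payload length, or -1 on any inconsistency. */
int parse_heartbeat(const uint8_t *rec, size_t rec_len,
                    uint8_t *out, size_t out_cap)
{
    if (rec == NULL || out == NULL)
        return -1;
    if (rec_len < 1 + 2 + 16)          /* too short for header + padding */
        return -1;

    uint16_t claimed = (uint16_t)((rec[1] << 8) | rec[2]);

    /* The Heartbleed-class check: the claimed payload length must fit
     * inside the received record, minus header and padding. */
    if ((size_t)claimed > rec_len - 1 - 2 - 16)
        return -1;
    if ((size_t)claimed > out_cap)     /* and inside the destination */
        return -1;

    memcpy(out, rec + 3, claimed);     /* bounded by both checks above */
    return (int)claimed;
}
```

With this shape, a request claiming 65535 bytes of payload while actually sending a handful of bytes gets rejected, instead of the memcpy walking off into adjacent heap memory.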
Re: (Score:3)
Re: (Score:3)
The real failure is in the process (Score:4, Insightful)
Re: (Score:3)
Unit Tests are Not Optional Anymore (Score:3)
Re: (Score:3)
No production code without unit tests. Every possible type or class of input must be tested. All assumptions must be tested. All outputs must be verified for each possible combination of inputs. All failure modes must be exercised. No excuses, just do it.
Nope. It's a waste of time. Much of the time the people writing the unit tests are the same people writing the code, so their assumptions are also in the unit tests.
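The shared-assumptions point is fair, but boundary tests are fairly mechanical and don't require the author to question their own assumptions, just to probe the edges. A hedged sketch (the validator here is hypothetical, invented purely for this illustration, assuming a 3-byte header and 16 bytes of padding):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical validator: does a claimed payload length fit inside a
 * record of rec_len bytes, after the 1-byte type, 2-byte length field,
 * and 16 bytes of padding are accounted for? */
static int length_is_valid(size_t claimed, size_t rec_len)
{
    return rec_len >= 1 + 2 + 16 && claimed <= rec_len - 1 - 2 - 16;
}

/* The mechanical boundary tests: exactly-fits, one-over, adversarial
 * maximum, and a record too short to even hold the header. */
static void test_length_is_valid(void)
{
    assert( length_is_valid(0,     19));  /* minimal valid record    */
    assert( length_is_valid(4,     23));  /* payload exactly fits    */
    assert(!length_is_valid(5,     23));  /* one byte over: reject   */
    assert(!length_is_valid(65535, 23));  /* the Heartbleed request  */
    assert(!length_is_valid(0,     18));  /* shorter than the header */
}
```

The one-over and maximum cases are exactly the ones an author's happy-path assumptions tend to skip, which is the counterpoint to "their assumptions are also in the unit tests".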
There's only one thing to say to this coder ... (Score:3)
Robin Seggelmann, thank you and the entire OpenSSL team for your contributions to free open source software. Glad we could find a serious security flaw, that you're helping to find out how it happened, and that the OpenSSL crew is so fast in coming up with a fix.
With just about any other development paradigm and folks like MS we'd've waited for weeks for that to happen.
Carry on with the good work, you guys rock!
Re:Names! (Score:5, Informative)
Re:Names! (Score:5, Informative)
Have a look for yourself. [github.com] The reviewer "steve" is Stephen Henson.
Re: (Score:3, Informative)
Lawyers love EULA's and licenses. The OpenSSL license disclaims all liability
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
https://www.openssl.org/source... [openssl.org]
If you never agreed to that license, you're violating their copyright.
Re: (Score:3)
Lawyers love EULA's and licenses. The OpenSSL license disclaims all liability
* THIS SOFTWARE IS PROVIDED BY THE OpenSSL PROJECT ``AS IS'' AND ANY
* EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE OpenSSL PROJECT OR
* ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
https://www.openssl.org/source... [openssl.org]
If you never agreed to that license, you're violating their copyright.
OK, switching from humor to serious. The above can be challenged in court. And being correct/innocent does not necessarily determine the outcome of a case. As a hostile lawyer once explained: the facts of the matter are irrelevant, my client can afford to go to court, you cannot. See "pyrrhic victory".
Re:He's sorry now ... (Score:4, Insightful)
The above cannot be challenged in court. No court in the Universe holds jurisdiction over this. The contributor didn't sell you OpenSSL, he didn't force you to use it, it didn't tell you to use it, he didn't make any guarantees about its functionality, you have no contract, no warranty, no expectation for it to actually do anything, etc.
You may as well sue someone after walking into their house uninvited, listening to them whistle while they're sitting on the toilet, and hearing a missed note.
Re:He's sorry now ... (Score:5, Interesting)
I found your post on slashdot and used it as legal advice.
It turns out you were negligent in checking your facts so I'm suing you for damages.
Re: (Score:2)
Re: (Score:2)
Russia now has full access to any of those back doors.
Re: (Score:3)
They always did, the US and British intel agencies have always been double-agent ridden. I doubt that much of what Snowden revealed was a surprise to Moscow.
Re:Doesn't seem to be on purpose (Score:5, Interesting)
Re:Doesn't seem to be on purpose (Score:4, Insightful)
we do it just to bug YOU, cold fnord.
we know it pisses you off to have your boys look bad.
DEAL WITH IT. or just leave. either is fine with most of us.
Re: (Score:3)
UDP does not have keep-alive. TCP-over-TCP is an inherently broken combo, so a VPN would prefer UDP. In crypto, it's necessary to hide the nature of packets to make traffic analysis harder, which is probably why there is all that length stuff (I did not check whether the RFC explains the reasons).
Re: (Score:3)
Re: (Score:3)
You're likely to get modded Troll; but this really does remind me a bit of Ford vs. Toyota. For years Ford was fixed in people's minds as the exploding Pinto company, and Toyota was high quality. Now Toyota isn't what it used to be, and Ford is better... but neither is perfect.
If nothing else this is a good argument against monoculture. We have different systems with different bugs, so it's not a total loss. If the market shares were evenly distributed among 10 different vendors, the black-hat task would
Re: (Score:3)
In this case, the code is open, so pretty much everyone understands exactly what happens, exactly how bad it is, and how to fix it.
Ahem, sir. Open source can be useful, but it is not a magic bullet like that. Even I can read and understand the C code in OpenSSL, but seeing the bigger picture and understanding how this particular software actually works and is arranged is a completely different story. I bet that in the case of OpenSSL, fewer than 100 people in the world can "understand exactly what happens, exactly how bad it is, and how to fix it".