Interbase Backdoor, Secret for Six Years, Revealed in Source 260
Diesel Dave writes "CERT Advisory CA-2001-01 announced today that the Interbase server database contains a compiled-in back door account. The thing is, it was not the result of a malicious code infection, but a direct addition by the original Borland/Inprise authors done before the program was released as open source." The backdoor was installed sometime between 1992 and 1994, and has been included in every version of Interbase released since then.
Re:Security patches - apologies to QuantumG (Score:2)
Re:A mixed bag (Score:2)
Linus Torvalds
I'm sorry, but I just had to comment. When I saw your sig, the following line just popped into my head.
"Me, the creator of the Linux Kernel? in las Vegas? with showgirls? What were they thinking?"
People who've seen "The Fast Show" ("Brilliant" in the US) will know what I'm talking about.
Rich
You aint seen me, right!
Re:Why the surprise? (Score:3)
At least with opensource, things like this get found. Obviously Borland's security audit didn't find it when they originally released this as a commercial product! If it wasn't for opensource, this would probably still be being silently exploited by the original programmers and the few people they told.
Re:Why the surprise? (Score:3)
Urban Myth? (Score:2)
This is probably just an urban myth. With the amount of personal firewall software people are running these days, someone would have logged the unauthorized data being transmitted, and there would be sufficient evidence to get M$ in a whole load of shit.
Re:Why the surprise? (Score:5)
OpenBSD has been undergoing a security audit for years. A couple of months ago they were able to claim there had been no known root hacks in the current release for 3 years. (That is, they were able to fix root hacks before they were discovered for the last 3 years.) Well, sometime this summer someone discovered a root hack in the released system, despite all those audits. (To be fair, they had fixed that hole in the unreleased code stream; nobody realized it was exploitable at the time, though, so there was no hurry to release it early.)
Audits are good, but they take time. OpenBSD has proven they take a lot of time. There is no open source project with as much work in security auditing as OpenBSD. (Probably no closed source project either.) No other open source project cares as much, yet even they can't always get it right despite 5 years of work. To criticize any other project for not discovering all security holes is a mistake. Even if the OpenBSD audit team had decided to work on this with as much effort as went into OpenBSD, there is no reason to believe they would have discovered this sooner.
root=backdoor? (was Re:Open source = no backdoor) (Score:3)
In the case of root, the existence of the backdoor is well known, but the details (password) are nominally only known by a few people. On some systems, the 'root' name is changed to something else (e.g. toor) for obscurity reasons.
In the case of Inprise, the existence and details of the backdoor were known to external persons (developers) but unknown by the actual user and the details are unchangable without source code. (note: it looks like a quick fix here would be to edit the backdoor details in the source and recompile). This was entirely 'security by obscurity' and, now that the cat is out of the bag, almost every user of the software is at risk.
Point to be made here: Opening the source code simply made it much easier to find the backdoor. Overall, I think that this is a good thing. There may be some hackers out there who knew of this backdoor for many years. Now we have the knowledge and impetus to clean it up.
I don't think that this was a malicious backdoor. The design of the software seemed to require it (oops!). The big mistake is that nobody who had access questioned its existence. The lesson to be learned is that people who have access to source code and see this sort of stuff should make waves to open up the process.
The best generic solution is to remove the need for internal 'backdoors' in code. Where that is infeasible, the software should be changed so that the details of the backdoor are editable by the end user (or randomized on every start of the software). Obviously, the user has to be made aware of the need to edit this data. That solution, of course, has its own security implications (exercise for the reader).
`ø,,ø!
Re:Open source = no backdoor (Score:3)
I have two machines linked together by a crossover ethernet cable. Can you hack into that network? I'd be impressed if you could
A fairly simple matter of splitting the cable and installing my own junction, or attaching my laptop to one of your machines via a serial port
Anyway, as soon as I saw your comment, I got into your master server (which I noticed connected to the Internet on 127.0.0.1 hah!!), and have told the police about your massive pr0n and war3z collection! You should now notice your hard disk is thrashing as my rm -r * takes effect suX0r!
Whoops! Hang on... why is MY disk thrashing
Re:Security patches (Score:2)
In this case, it was worse, because when Interbase was Open Sourced, it was not buildable. Important scripts, Makefiles, and other assets were missing. So, you had this whole mess of random source that you could begin to guess the function of, but if you hadn't been a former Interbase developer at Inprise, it was all a black box to you.
Have you ever "read the code from start to finish with a pen and paper next to them" on any major project? Have you ever heard someone do that? Frequently? Or are you just trying to be a troll? That's just not the way it happens.
Personally, I'm impressed that they've been able to find this. I mean the Firebird project found a working binary from Inprise and a collection of code rumored to, through some magic process, produce that binary. They worked out the magic process, produced their own binary, and moved on from there.
Only then, after they had source code corresponding to a working program could they start doing the poking and reverse engineering it took to figure out the parts and the places. They had some luck, since they do have the original creators of Interbase on their side, but there were quite a few hurdles to go through before they could even start to make heads or tails of what they were looking at.
Then again, I dunno, maybe I have it all wrong and these guys were just sitting on their thumbs all day.
Re:Here's a buffer overflow (Score:2)
There is a fundamental problem at the root of this, which is that the C standard library is hideously irregular, and the C language itself is not meant to be "safe". It's an okay language for writing hardware drivers and other low-level system components, but a safer, more abstract language would be a better choice for applications.
Re:These lines of code like sand.. (Score:2)
Rich
Re:Security patches (Score:2)
Re:Hits on port 3050/tcp already on the increase (Score:3)
Which is why I like
the AC
Open source = no backdoor (Score:2)
Is it a good thing or not?
Is there a good use for back doors?
Here's a buffer overflow (Score:5)
SCHAR home_directory[256];
...
#ifdef UNIX
/* If a Unix system, get home directory from environment */
startup_file = getenv("HOME");
if (startup_file == NULL)
{
    startup_file = ".qli_startup";
}
else
{
    strcpy(home_directory, startup_file);
    strcat(home_directory, "/.qli_startup");
    startup_file = home_directory;
}
#endif
That's called a "buffer overflow" and I doubt it is the only one. Just a short grep over the files gives an idea here: 642 strcpys, 139 strcats and 945 sprintfs. The first thing to do is replace those with safe alternatives (strncpy, strncat, snprintf) and then the fun begins. And I just know that next week I'm going to be asked to install an Interbase server
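To make the fix concrete, here is a hedged sketch of what a safe version of the quoted snippet could look like, using snprintf so the path can never overrun the buffer. The function name qli_startup_path and the truncation fallback are my inventions for illustration, not Interbase code:

```c
#include <stdio.h>
#include <stdlib.h>

/* Build "$HOME/.qli_startup" without any chance of an overflow.
 * Falls back to the bare filename if HOME is unset or too long. */
const char *qli_startup_path(char *buf, size_t buflen)
{
    const char *home = getenv("HOME");

    if (home == NULL)
        return ".qli_startup";

    /* snprintf writes at most buflen bytes, NUL included, and
     * returns the length it *wanted*, so truncation is detectable */
    int n = snprintf(buf, buflen, "%s/.qli_startup", home);
    if (n < 0 || (size_t)n >= buflen)
        return ".qli_startup";

    return buf;
}
```

The same pattern (one bounded write plus an explicit truncation check) replaces both the strcpy and the strcat, since the whole path is built in a single snprintf.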
Re:Here's a buffer overflow (Score:2)
Are there any *good* choices for Interbase users? (Score:3)
Anybody running a pre-open-source Interbase seems to have only really unpleasant choices:
I'm glad I'm not in that position.
Re:https?? (Score:3)
-russ
Re:Other Borland Products (Score:2)
You're assuming that the disassembler wasn't also in on the joke, and that it wouldn't recognize that it was disassembling the compiler and casually omit the incriminating code. For that matter, you're assuming that the C compiler wasn't smart enough to recognize when it was compiling a disassembler and insert the appropriate code to implement the above.
There are two questions to ask yourself: "am I being paranoid?" and "am I being paranoid enough?".
Re:Open source = no backdoor (Score:2)
I'd also be inclined to bet on the probability that code is being designed and written today with these sorts of problems in them. Probably these people are justifying it to themselves.
Investment in security will continue until the cost of the security exceeds the cost of a breach -- or until someone insists on getting some useful work done. -- Murphy's laws file, ~1979
`ø,,ø!
Re:Dogma (Score:3)
Correction: how many years it took anyone to discover and announce this. Just because it was only now announced doesn't mean someone didn't know about it two years ago and kept quiet about it.
Re:Here's a buffer overflow (Score:2)
OpenBSD have addressed this issue. glibc has not yet adopted their solution, although GLib is expected to adopt g_strlcat and g_strlcpy in version 2.
http://www.openbsd.com/papers/strlcpy-paper.ps
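For readers without the paper handy, the OpenBSD semantics can be sketched in a few lines. This is a minimal reimplementation (named my_strlcpy to avoid clashing with systems that ship the real one), not the official code: it always NUL-terminates and returns the full source length so the caller can detect truncation.

```c
#include <stddef.h>
#include <string.h>

/* Copy src into dst (of the given size), always NUL-terminating.
 * Returns strlen(src); a return value >= size means truncation. */
size_t my_strlcpy(char *dst, const char *src, size_t size)
{
    size_t srclen = strlen(src);

    if (size > 0) {
        size_t n = (srclen >= size) ? size - 1 : srclen;
        memcpy(dst, src, n);
        dst[n] = '\0';
    }
    return srclen;
}
```

Unlike strncpy, there is no unterminated-string case and no pointless zero-padding of the rest of the buffer.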
Re:Why the surprise? (Score:3)
So even he didn't think this would ever happen, and the bug in ftpd was a direct result of this. No one knew it was there because no one knew that such a bug even existed (and if it did, it was most probably not possible to exploit). That is definitely not the case here. This is an obvious flaw in security, written by a programmer who obviously never thought the code would be open sourced. It should have been one of those things that you picked up on the first day and said "this is bad, you never should have done this."
Re:Security patches (Score:2)
395521 lines in
116496 lines in
you call this big? From a security analysis point of view, this is a baby.
Re:Right on, dude! (Score:2)
Re:Security patches (Score:2)
Re:A Compiler written in Assembler will stop this (Score:2)
Unless the compiler source has obfuscated backdoors of course.
The solution is to have a basic compiler written in Assembler. This way you do not need to start with a binary compiler; you can know with 100% certainty that it is clean of any bad things
And now you assume that more than about 1% (if even that) of the programming community have the skill to analyze 20000 lines of assembler looking for backdoors! I'd much rather try and find a backdoor in 30000 lines of C than 20000 lines of assembler.
Re:Security patches (Score:2)
Re:Hits on port 3050/tcp already on the increase (Score:2)
Now that I know what to simulate, I'll rig one of the honeypots and see if the script tries the exploit, or if the crackers wait until later after a positive hit to try their luck. But that will wait until tomorrow, beer is calling
And besides, if I ever choke out one of the routers, its good justification to accounting to buy bigger routers
the AC
Re:Security patches (Score:2)
Re:Mmmmm.. (Score:2)
Re:Here's a buffer overflow (Score:2)
Re:Here's a buffer overflow (Score:2)
Hey, don't forget to trap those null pointers. And handle them. Then free that buffer when you've finished with it.
One item in particular that I was referring to was a quick one-liner to debug to check that part of a project was working properly by feeding and receiving values. Minimal development time assigned to it, no likelihood of exploitation, and it only had to run once and its job was done. It turned out that another project needed exactly the same functionality on a regular and reliable basis. Now, the code was available (I wrote it) and the necessary tightening was done. But just as easily, someone else could have used it out of the box. I think this points not to needing over-engineered, perfectly written software down to "Hello World", but rather to ensuring that you know the security implications of any software you exec.
Rich
Re:Why the surprise? (Score:2)
Re:Here's a buffer overflow (Score:2)
Re:Security patches - apologies to QuantumG (Score:2)
So how would you go about making a backdoor? (Score:2)
There can't be a way to completely hide it. Just make the trail harder to follow. So, as an exercise of what to look for, how would you go about pulling something like this off?
Jason
Re:Security patches (Score:2)
Actually, that's exactly the way it happens. It's called a "security audit" and it involves reading the source. It is best done by a security expert who reads through the source, writes down everything that he is suspicious of, and then sits the programmers down in a room and asks them question by question what each of the variables involved are, where they come from, what resultant binaries they are used in, etc. I know this because I used to do security audits for a living, and it was during this actual hands-on experience with software that I decided that open source was better because you can get more people reading the source simultaneously. As for whether I am trolling? No, but I appear to have attracted a few flame throwers anyway!
Re:Here's a buffer overflow (Score:4)
Add to this that this will be a HUGE source base with many, many lines of code, that open source contributors generally want to produce things rather than read over other people's code, and that reading other people's code (and that usually includes the "you" from >6 months ago) sucks, sucks, sucks
But those criticisms aside, it does indicate that open source probably does need to consider security more. Especially when inheriting code from closed source projects, but just as importantly for existing open source projects. It seems that OpenBSD is doing a good job of auditing their code. While I wouldn't even think of saying that open source projects *must* do x or y, perhaps there could be a central security auditing project which ranks other projects on their security and offers suggestions on common security errors and auditing methodology. Projects could apply these techniques or not as they desired, but the end user could check the security status by going to the security site. Interbase would have been ranked red_unsecure_not-yet-audited, sendmail could be blue_unsecure_script-kiddie-heaven etc.
My second comment is more a query. Are there header files available which make sure that strcpy and friends can't be used? It would go some way to helping if you could use these headers and WARNING: STRCPY USED. COMPILE ABORTED would pop up as appropriate. It wouldn't be a final fix, but it would help and might get programmers out of the habit of using these awful functions in the first place.
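Something close to this exists already, at least on GCC: #pragma GCC poison makes any later use of a listed identifier a hard compile error. A sketch of the kind of ban-strcpy header being asked for (the pragma is GCC-specific, and the file name no_strcpy.h is made up):

```c
/* no_strcpy.h -- include after the system headers to outlaw
 * the unbounded string functions for the rest of the file. */
#include <stdio.h>
#include <string.h>

#ifdef __GNUC__
#pragma GCC poison strcpy strcat gets
#endif

/* This compiles because it uses the bounded alternative... */
static void greet(char *buf, size_t len, const char *name)
{
    snprintf(buf, len, "hello, %s", name);
}
/* ...but a strcpy() call anywhere below the pragma dies at
 * compile time with an "attempt to use poisoned" error. */
```

On other compilers you would fall back on the #define trick mentioned elsewhere in this thread.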
Finally, with the front page story yesterday being about OOP, this is clearly the kind of thing where OOP helps. A good string class will take you a long way. Also, OOP is easier to read and understand in small chunks, so it's easier to audit (and easier to get people to audit)
Rich
Re:Security patches (Score:2)
A code backdoor is NOT a "security flaw"! Any decent C programmer can spot a buffer overflow in 20 minutes, but very few programmers could spot an obfuscated backdoor in a major application like a relational database system without a major investigation by a dedicated team of people!
Why is it you seem to think you know anything about security analysis? Do you do this for a living? Well I have
Well I'm a security consultant and could probably spot a hole in a set of firewall rules in 20 minutes, but it doesn't mean I could find a route through a unicode vulnerability in a www server, which accesses an open share on another server, which has trusted access through another firewall to a back-end Oracle system in 20 minutes
Please stop being defensive, and stand back and look at this particular situation!
Re:Security patches (Score:2)
binary patch for interbase (Score:2)
Rich
Re:Are there any *good* choices for Interbase user (Score:2)
Security Through Obscurity Works! (Score:2)
It worked for 6 years. How much longer would it have worked if they hadn't opened their source?
Of course you don't want obscurity to be your only method, but you shouldn't rely on peer review as your only method either. It's just that I've grown tired of people saying that obscurity is of no value at all.
Re:Hits on port 3050/tcp already on the increase (Score:2)
They actually let you run a honeypot? You lucky thing! The chances of me actually managing to produce a business justification for one are pretty slim. Management happily spend money on top-end NetRangers etc. which is nice, but this is one step too far for them!
And besides, if I ever choke out one of the routers, its good justification to accounting to buy bigger routers
Extremely good point: like accounting would ever understand that processor saturation is down to multiple ACLs
Re:Security Through Obscurity Works! (Score:2)
Did it? Are you sure? Do you know that the Interbase people who had it didn't abuse it to go poking around in companies' databases, reading people's private messages? Are you sure they didn't tell any friends? Can you be sure that Interbase didn't supply confidential information obtained illegally about one of their users to a "friendly" competitor? (I mean, I'm sure they haven't, but the possibility is there)
Can you be sure this hasn't been exploited somewhere somehow?
Rich
binary patch for interbase (oops) (Score:2)
>"BACKDOOR_PASSWORD\0My_Sekret_Password\0"
Rich
Re:Here's a buffer overflow (Score:2)
#define strcpy STRCPY_NOT_ALLOWED_BABY!@#!@#@!
which would cause a compiler error
Re:Why the surprise? (Score:5)
The backdoor was introduced in the commercial version of the software. It's only now that it is open source that we could even see the error. The people paying for the 'presumably...high-quality app' you extol the virtue of were receiving the backdoor-enabled product. Rather than being a failure of open-source software, I'd say this one was a success. I only wonder what other kind of 'crap' exists in all those apps whose sources are closed.
Re:Security patches (Score:2)
grep -R 'obvious backdoor' `find . -name '*.[ch]' -print` | Mail -s 'Fix these' me
(It's a one-liner. Re-assemble if necessary. Modify appropriately for other languages.)
anybody who takes this seriously deserves to .
`ø,,ø!
Interesting in light of NSA secure Linux (Score:2)
Re:Here's a buffer overflow (Score:2)
$ echo $NIG
12345678901234567890123456789012345678901234567
My environment variables hold way more than 256 bytes.. where you gettin' yours?
Re:Here's a buffer overflow (Score:2)
True. But that's up to the developer to decide when they are drawing up the project spec. For a small internal utility where I know no one is likely to want (or have the need) to perform an exploit, I might be lazy and use char[2000] and strcpy for paths (though I still think it would be better if those functions disappeared; note also that strncpy is not proof against buffer overflows, and there are buffer overflow problems where strings may not be terminated properly [particularly in networked software]). For an absolutely robust, mission-critical system such as one that stores credit card numbers (a database), I would be tempted to go for a string class, and for something that needed to be robust but definitely needed to be lean, I would probably go for writing some specialised string functions (strcpy and friends do not have to be used in dangerous ways).
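The strncpy caveat above deserves a concrete illustration. When the source fills the whole buffer, strncpy does not write a terminating NUL, so the "safe" call quietly leaves an unterminated string; a thin wrapper (the name safe_copy is mine) restores the guarantee:

```c
#include <string.h>

/* strncpy with guaranteed NUL termination (truncating if needed). */
void safe_copy(char *dst, const char *src, size_t size)
{
    strncpy(dst, src, size);
    if (size > 0)
        dst[size - 1] = '\0';  /* the byte strncpy omits when src is too long */
}
```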
Rich
Open source and security (Score:2)
If Borland could have it, why not Oracle, Sun, IBM? (Well, to be honest, you could get the source for Solaris from Sun.)
/ Balp
Re:Here's a buffer overflow (Score:2)
I should state here that I have sometimes seen those small internal utilities go into full scale production systems, usually requiring a rewrite to remove all those little nasties. It's probably best to not be lazy in general :)
Rich
Re:Recent MS break in? (Score:2)
Re:Reasons_for_strong_firewall++; (Score:2)
Re:Recent MS break in? (Score:2)
Whether it's profit-driven, back-doors, or mass-murder: How often have you heard the phrase:
"That's just the way we do things."?
`ø,,ø!
Right on, dude! (Score:2)
The "trusted" unsafe C codebase should be as small as possible.
Re:Security patches (Score:2)
return (!strcmp (name, "USER") && !strcmp (project, "LOCKSMITH"));
and immediately ask "what's this LOCKSMITH thing?" and then take a grep around and discover the #define LOCKSMITH PWD_ls_user() and have a look at that and discover
char *PWD_ls_user()
{
    if (strcmp(ls_user, "Firebird ") == 0)
    {
        mk_pwd(ls_user);
    }
    return ls_user;
}

char *PWD_ls_pw()
{
    if (strcmp(ls_pw, "Phoenix") == 0)
    {
        mk_pwd(ls_pw);
    }
    return ls_pw;
}
and say "hey, this thing which is obviously a username and password is hard coded here? What the fuck?" and quickly come to the conclusion that there is a backdoor in the code. When I filed my security report I would include a section on the LOCKSMITH backdoor, and when the programmers told me that they did that intentionally I would have a little laugh and explain to them the risks of doing that and how to do it properly. They would tell me what is right and wrong with my proposed solution and the problem would get solved.
BTW, here we distinguish between you guys as "network security" and us guys as "software security", but I've also done network security.
Re:This is serious fuel for open source (Score:2)
It's really given me pause about entrusting my financial data to any online merchant. As such, I make a concerted effort to only use one cc for online purchasing, which I periodically "lose" so I get a new number.
I recommend you all do this.
Um, how do you know it worked? (Score:3)
Re:Why the surprise? (Score:2)
Re:Here's a buffer overflow (Score:2)
char *foo;
...
foo = NULL;
if (str_add(&foo, "Hello") != 0) {
    /* process error */
}
Where str_add is a function which attempts to allocate a buffer long enough to hold the concatenation of the two arguments, concatenates them, frees the original foo, then sticks the result back into foo. If the buffer cannot be allocated, it doesn't mess with the foo pointer and returns an error.
Note that this is not efficient if you are creating long buffers from small chunks of data. In that case, I would make foo a struct containing a char * and an int, and store the length of the allocated buffer in the int. I would #define a BUFF_CHUNK_SIZE and bump the size of the buffer up by that as required.
But not for a ten line quickie program :)
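A minimal sketch of a str_add along these lines is below; the name and contract come from the comment above, but the implementation is only my guess at it (allocate, copy, and leave the caller's pointer untouched on failure):

```c
#include <stdlib.h>
#include <string.h>

/* Append src to *dst, growing the buffer; *dst may start as NULL.
 * Returns 0 on success. On allocation failure returns nonzero
 * and leaves *dst exactly as it was. */
int str_add(char **dst, const char *src)
{
    size_t oldlen = (*dst != NULL) ? strlen(*dst) : 0;
    size_t srclen = strlen(src);
    char *buf = malloc(oldlen + srclen + 1);

    if (buf == NULL)
        return -1;                          /* don't touch the caller's pointer */

    if (oldlen > 0)
        memcpy(buf, *dst, oldlen);
    memcpy(buf + oldlen, src, srclen + 1);  /* the +1 copies the NUL too */

    free(*dst);                             /* free(NULL) is a no-op */
    *dst = buf;
    return 0;
}
```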
Rich
Re:Open source = no backdoor (Score:2)
Yes, this is a good thing; backdoors should be eliminated from commercial products. I don't want anyone sneaking into my database. Although Borland might not be too happy about this...
---
pb Reply or e-mail; don't vaguely moderate [ncsu.edu].
Well, not exactly... (Score:2)
I'm sure other people could think of more scenarios
More juice ... (Score:2)
Re:Oh no! (Score:2)
It's an age-old debate. Older than the computer. Some people feel that it's just torture to tell a terminally ill patient that they're about to die. Others welcome the opportunity to say goodbye to friends and spend their retirement money.
`ø,,ø!
Hits on port 3050/tcp already on the increase (Score:2)
It will be interesting to see what various inquiries produce as to why this was put into the code, and why it existed for years in open source before being discovered.
Off to modify some router ACLs to log and drop...
the AC
Re:Are there any *good* choices for Interbase user (Score:4)
Have a closer look ;-)
The code is initialised to the values in the .h file, and when the server starts up it replaces them with random data using chars with ASCII values 1-255.
So every time the server starts up you get a different random password.
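For readers curious what that startup scramble might look like, here is a toy sketch (my names, not Firebird's) that overwrites a compiled-in credential with bytes in the range 1-255, so the string stays NUL-terminated but can no longer match anything a client could type:

```c
#include <stdlib.h>

/* Overwrite every byte of the credential except the final NUL
 * with a random value in 1..255 (never 0, so the length holds). */
static void scramble_credential(char *cred, size_t len)
{
    for (size_t i = 0; i + 1 < len; i++)
        cred[i] = (char)(1 + rand() % 255);
    cred[len - 1] = '\0';
}
```

A real fix would draw from a cryptographic source rather than rand(); this only illustrates the shape of the trick.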
I've posted somewhere else a bit about how this was done, just prior to Christmas, to fix the problem and not introduce any unknowns.
A more permanent fix will be applied; we found it when we were doing a review of the security.
There are problems, but in Firebird we have several people who do crypto/PKI things for their day job, and we were doing a security review, which in part explains how we've found these. It also places us in a good position to fix these things. As far as Borland are concerned, they seem to be ignoring us.
They wouldn't tell Jim they were working on a patch for prior versions of InterBase, so he felt compelled to write his own.
But for now it's a good time to keep your Firebird/InterBase server locked behind a firewall
Cheers
Mark O'Donohue
--
Your database needs YOU!
http://firebird.sourceforge.net
Re:Dogma (Score:2)
Huh? This strikes me as a rather semantic argument. It all depends on how you define the word "problem". In any event, I'd say (and I think most others would agree) that the most pressing security concern of any product consumer, be it open source or closed, is the effective security of the product. How the problem came to be is not nearly as relevant to the consumer as IF and WHEN it becomes known. Notice: this is not the same thing as saying that just because a problem is unknown at some point in time it is irrelevant...as long as there is an (actual) risk, it is relevant. However, it is equally stupid to somehow imply that any closed source product with any backdoor (no matter WHEN or IF it is discovered) is somehow, necessarily, more problematic for everyone than an Open Source product with a zillion "accidental" security flaws that are discovered haphazardly in great number.
Put simply, I'd rather face the risk of ONE developer knowing a backdoor (or bug or flaw) exists than face a zillion hackers armed with many different exploits on the comparable open source product long before. One might argue this empirically: the percentage of highly exposed Interbase dbs that were hacked versus the somewhat equivalent MySQL database (which, incidentally, has seen its fair share of security problems).
All of this, however, is entirely beside my original point. The point is that the poster I was replying to, and indeed a great deal of open source dogma, says that such backdoors are impossible to hide for an extended period of time in a popular Open Source project. I simply assert that if security flaws can lie dormant for years as the result of improper coding in popular Open Source products, then an honest-to-god backdoor can certainly be hidden in there by an intelligent coder with equal or greater success, even if it isn't triggered by something as trivial as "MY VOICE IS MY PASSWORD".
Little Brother Is Watching? (Score:2)
I fully expect that somewhere, in a corporation's database, is the tacit knowledge that I wear brightly colored underpants each and every second Thursday of the month... };-)
Re:Here's a buffer overflow (Score:2)
Re:A mixed bag (Score:2)
When I was an MIS director, I had all the critical passwords written down on separate, sealed envelopes with my signature on them and put in a safe deposit box which could only be opened by the VP of finance -- specifically to guard against the event that the key sysadmins and/or I should come to an untimely end.
Backdoor (Score:2)
Re:Mmmmm.. (Score:2)
Re:Security patches (Score:2)
Re:Here's a buffer overflow (Score:2)
Re:Who ensures the safety of the ensurers of safet (Score:2)
Hey, I only suggest Java because it is similar enough to C to possibly make my dreams come true in the short term. I don't like it either. =)
You'll probably be interested to know that we DO have compilers which generate provably safe code. One piece of the puzzle is TAL: Type safe assembly language. Another is TILT: A type-preserving ML compiler. They've also got projects on compiling safe-c, proof carrying code for transmitting this stuff over the network (without sandboxing), etc. The technology is almost there.
And while I agree with you that the compiler is an important source of more bugs... wouldn't it be nice to plug up holes on the programmer end (since compiler bugs right now also introduce more non-safety) while we wait for this stuff?
Re:Open source = no backdoor (Score:3)
Is there a good use for back doors?
I can't think of one. The CERT advisory makes it sound like this particular one is there because the design of the system requires it:
So, at least it doesn't seem to be a Borland/Inprise employee being sneaky. But still, leaving such a gaping hole in the software, even by design, is stupid. Especially considering the password for said account is hard coded! I can't imagine that idea passing the giggle test for any security expert.
Re:Security patches (Score:2)
Re:Your attitude sucks. (Score:2)
Recent MS break in? (Score:3)
Since the source was released, it's obvious that the developers who added the backdoor have already left Borland, since it wasn't removed, and the other developers haven't noticed that there is a backdoor.
So, if it can go undetected even when the whole world has access to the source, might this indicate a very real possibility that the crackers who broke into MS DID backdoor the source?
Re:Recent MS break in? (Score:2)
FYI: Godwin's law is about comparing someone to a Nazi, or such. Simply using the Nazis as an example of something isn't covered.
For instance, if I was to talk about military uniforms through the ages, discussing Nazis is perfectly reasonable.
Now, Nazis aren't related to databases, and it is a stretch, but the poster is right, people see things happen and get comfortable, then they don't think where those things could get to if abused.
Extra info (Score:3)
BTW, it seems that, as usual, they were not very concerned.
Re:More juice ... I like this part (Score:5)
For security reasons, the patch is available only as a binary and you will be required to register for this download.
Nice, eh?
M.
what did you expect? (Score:3)
Losing this kind of control is one of the things that makes industry afraid of going open...
Other Borland Products (Score:4)
Has something like that ever happened before?
Re:Open source = no backdoor (Score:3)
Re:Backdoors vs. default passwords (Score:4)
Of course, any computer is only as secure as its administrator.
Re:Hits on port 3050/tcp already on the increase (Score:5)
Correction... Note that the blurb above says "...a direct addition by the original Borland/Inprise authors done before the program was released as open source." This wasn't done after the Open Source release.
Furthermore, Interbase has only been under an Open Source license for less than a year. Inprise was considering the move around last December [slashdot.org], and it was finally (although missing parts, and amidst great controversy which eventually forked the code [sourceforge.net]) released under an Open Source license around July 2000 [slashdot.org].
So, the thing is from what I can see, this is an instance where an Open Source release allowed a security hole, hidden for years as closed source, to be found finally. Which is, of course, the complete opposite of what you said.
Why the surprise? (Score:5)
If anyone truly believes that things like this should be found faster, they should try reading through this amount of code. When their heads stop spinning they will probably have a change of heart.
This is serious fuel for open source (Score:3)
What most guys don't realise is that much of the closed-source software currently running on many computers contains such backdoors, generally implanted to ease remote maintenance (and cut down costs). I, for one, would be _very_ surprised if there were no such backdoors in the various incarnations of proprietary operating systems.
Cheers,
--fred
One year since source release.. (Score:4)
You can download the source Here [sourceforge.net]
According to the page it was registered at SourceForge on 2000-Jan-28 15:37
--
Why pay for drugs when you can get Linux for free ?
Re:Hits on port 3050/tcp already on the increase (Score:5)
Even more... If you read the saga of the backdoor here [interbase2000.com], it seems that not only was the backdoor known about by Inprise R & D engineers, but that when the original creators of Interbase (no longer a part of Inprise, but now part of the Firebird development fork) brought the security breach to their attention, engineers at Inprise were forbidden to speak to them.
Furthermore, they realized that the backdoor was not only in the Open Source release, but also in the last three closed-source versions of the database. So they fixed the Firebird source, and, even with the company itself forbidding its own engineers to contact them, they also wrote a binary patch program to disable the backdoor in previous versions.
Imagine that. Even while being slapped in the face, these guys fixed their product for them.
Re:what did you expect? (Score:3)
Not surprised that it took time to be found (Score:5)
1. Interbase wasn't officially released under an open source license until last summer. I, at least, did not spend any serious time with it until the license was correct.
2. The open source interbase got off to a very slow start. Here's why:
- Borland didn't release all the tools required to build and test interbase code.
- Many of the original developers had left Borland, meaning that there was a shortage of mentors for new developers.
- Borland yanked startup funding at the last minute from the group that was going to take over the management of the code base, causing many to question interbase's future.
- Documentation of the code base is still unfinished.
- The codebase is large and complex.
Independent Interbase builds (Firebird on SourceForge) didn't start happening until very recently. In my mind, they found this bug faster than I would have expected.
-OT
Dogma (Score:5)
It would take a hacker a significant amount of time to discover a properly hidden and hardcoded backdoor in a closed source product. Notice how many years it took ANYONE to discover this. That is "difficult", or rather time-consuming, for the hacker. You might say it's easy to reproduce, but that's true for literally hundreds of Open Source security flaws. Once a hacker discovers a means and releases an exploit, the work is done. It doesn't matter to the hax0r, aka script kiddy, if exploit.c sends "LET ME IN BACKDOOR" or a bunch of machine code to the target host. Furthermore, it's quite easy to test for the existence (or at least the probable existence) of a security flaw caused by improper bounds checking. In other words, you just send a bunch of different programs extra long strings on various inputs until something crashes, then you simply do the work to make the exploit happen. Compare this with trying to find a well-hidden backdoor in a closed source product: you either try to reverse engineer the binary or you try brute force. In either case, it's much harder to detect.
So the question remains, easier for whom and how is that relevant? It's really not terribly relevant if you ask me. The question is how secure is YOUR product at the end of the day in YOUR environment for YOUR needs. If you start overgeneralizing by saying "Open Source is secure, Closed Source is not" then you're making a fundamental mistake. Rhetoric and dogma are not conducive to practical security.
Re:Open source = no backdoor (Score:3)
Re:Are there any *good* choices for Interbase user (Score:4)
char *PWD_ls_user()
{
    /* if the stored account name is still the stock "Firebird " literal,
       call mk_pwd() on it before handing it back */
    if (strcmp(ls_user, "Firebird ") == 0)
    {
        mk_pwd(ls_user);
    }
    return ls_user;
}

char *PWD_ls_pw()
{
    /* same check for the stock "Phoenix" password literal */
    if (strcmp(ls_pw, "Phoenix") == 0)
    {
        mk_pwd(ls_pw);
    }
    return ls_pw;
}
Perhaps you mean it doesn't use the same backdoor password? If you are using Firebird, I would suggest you change these lines in interbase/jrd/pwd.c to something else for the time being (note: *QUICKFIX* only). If there are any Firebird developers around, I wouldn't mind hearing reasons why this isn't the same problem. What's more, the "solution" described on the home page, namely "change the super secret backdoor password to something else", won't work. That's security through obscurity in its purest form.