
Gosling Claims Huge Security Hole in .NET 687

renai42 writes "Java creator James Gosling this week called Microsoft's decision to support C and C++ in the common language runtime in .NET one of the 'biggest and most offensive mistakes that they could have made.' Gosling further commented that by including the two languages into Microsoft's software development platform, the company 'has left open a security hole large enough to drive many, many large trucks through.'" Note that this isn't a particular vulnerability, just a system of typing that makes it easy to introduce vulnerabilities, which last time I checked, all C programmers deal with.
This discussion has been archived. No new comments can be posted.

  • Advertisement? (Score:5, Insightful)

    by nuclear305 ( 674185 ) * on Friday February 04, 2005 @08:51PM (#11578840)
    I actually RTFA since it included a sensationalistic phrase like "biggest and most offensive mistakes that they could have made."

    To me, it sounded like a big advertisement for Java.

    It's the developer's decision to use unsafe code in the .NET platform. I certainly wouldn't call this a huge mistake made by MS.

    A hunting rifle can be used to kill people. Does that mean the trigger should only work after inserting a valid and current hunting license?

  • by Anonymous Coward on Friday February 04, 2005 @08:52PM (#11578855)
    So you mean to tell me that the father of Java won't be slightly biased?

    C'mon now. There is no vulnerability. Don't post this sort of crap. It's strictly knee-jerk material meant to bend a few people out of shape and start flames.

    .NET is great (for its target area)
    J2EE is great (for its target area)

    Both are secure, stable and reasonably fast if you are a GOOD programmer. ANYONE who does ANY C or C++ code that will be used in industry needs to ENSURE that they take a few extra precautions and are aware of secure coding techniques in both languages. It's really quite simple.

    To sum it up: nothing to see here folks.
  • by BlueCup ( 753410 ) on Friday February 04, 2005 @08:53PM (#11578859) Homepage Journal
    I don't disagree with Microsoft's position. Yes, errors are possible, but it's a programming language, and not Microsoft's responsibility. With programming, it is the programmer's responsibility to release code without exploits. C and C++ are fast, and they have many advantages other languages (such as Java) don't have; if a programmer decides to take advantage of that, with a slight bump in risk, then I say more power to them.
  • What a surprise! (Score:5, Insightful)

    by Anonymous Coward on Friday February 04, 2005 @08:53PM (#11578861)
    This could have just as easily read "Java Creator Disses Rival Product, Ignores Flaws in His Own."

    In Java, everything is an object! Oh...except for the basic types, you need to use object wrappers for those.

  • by patniemeyer ( 444913 ) * <pat@pat.net> on Friday February 04, 2005 @08:54PM (#11578870) Homepage
    This is what really distinguishes Java from other languages. The Java verifier is a sort of theorem prover that examines the byte-code and can guarantee that it does not violate certain rules, such as forging the type of a reference or under/over-flowing the stack. Because this is done at the verify stage, it is still possible to compile the bytecode down to machine-level instructions afterward and run at full speed. This is why Java is both safe and fast.

    To support C/C++ semantics (ad-hoc pointers) you'd have to throw all that out the window and I assume that's what he's talking about.

    Pat Niemeyer,
    Author of Learning Java, O'Reilly & Associates and the BeanShell Java Scripting language.
  • by apoplectic ( 711437 ) on Friday February 04, 2005 @08:58PM (#11578910)
    New languages such as C# and Visual Basic.NET only produce managed code.

    Hey, what about the keyword unsafe in C#? Sheesh.
  • Re:Advertisement? (Score:2, Insightful)

    by haystor ( 102186 ) on Friday February 04, 2005 @08:59PM (#11578913)
    I'm just curious, but what language(s) of ultimate security are Java and Solaris written in?
  • Re:Advertisement? (Score:5, Insightful)

    by TWX ( 665546 ) on Friday February 04, 2005 @08:59PM (#11578914)
    As much as I think his presentation method is tacky, I can agree with some of what he says.

    C and C++ allow for buffer overflows. They allow for improper or intentional coding to cause software to try to violate memory space of other functions or programs. They allow for memory allocation without necessarily providing any cleanup later. In the hands of bad, sloppy, lazy, or malicious programmers these traits have always proven to be a problem time and again on many different platforms. This doesn't mean that these languages are the wrong tool; I'd argue that part of Linux's success is because the kernel and most of the GNU-implemented services are written in these languages, which are flexible. Too much flexibility for the wrong purpose leads to problems though, just as too much rigidity leads to problems when things need to be flexible.
  • Re:Advertisement? (Score:2, Insightful)

    by rackhamh ( 217889 ) on Friday February 04, 2005 @08:59PM (#11578917)
    A hunting rifle can be used to kill people. Does that mean the trigger should only work after inserting a valid and current hunting license?

    DISCLAIMER: COMPLETELY OFF-TOPIC

    I don't know what the law is, but if a hunting rifle can only be legally used for hunting, this is actually a pretty good idea. The card mechanism could also be used to enforce hunting seasons.

    I realize this offends some people's sense of rights, but I'm not particularly inclined to defend somebody's "right" to use a firearm outside its intended purpose.
  • by ahdeoz ( 714773 ) on Friday February 04, 2005 @09:00PM (#11578927)
    Gosling forgets to mention that Java is inherently insecure, as it is written in C.
  • Why oh why (Score:4, Insightful)

    by Anonymous Coward on Friday February 04, 2005 @09:03PM (#11578947)
    When elevators were first invented, people didn't want to use them because people thought they were not safe. They were right. Elevators were not safe. When Mr. Otis invented the safety elevator, which had a catch so that it would not fall even if the cable were cut, people started using elevators. It would be foolish to trust your life to an elevator without such a safety system. You could use it to lift bales of hay or cement powder or something but you wouldn't put a human being in it.

    It's the same with C. We should know by now "you cannot use C to handle untrusted data (ie, data from untrusted machines on the net)". All such data need to be handled in a sandboxed system, a system with safe memory access. This means something like Java or similar things.

    A lot of people will make posts that say things like "C doesn't cause the problems, it's incompetent or lazy programmers who cause the problems." Whatever. No excuse. That's like saying "we shouldn't need seat belts or airbags; all we need is to make sure that drivers don't make mistakes." Drivers and programmers do make mistakes and that's why we need safety mechanisms in both cases. C provides none. Programming in C is like driving around in a car from the fifties, with no seat belts, no airbags, no head rests, no ABS.

    So any decision to extend the use of C is just foolish. What is the purpose of doing this? If people must use horrible legacy code then just use it, but why drag that into new frameworks like .NET?

    It does not compute, for me at least.

  • by Anonymous Coward on Friday February 04, 2005 @09:04PM (#11578966)
    What about all the open-source applications built with C and C++? I.e. virtually all OSS apps?
  • by I judge you ( 796415 ) on Friday February 04, 2005 @09:07PM (#11578989)
    It would be really nice if you KNEW WHAT THE FUCK YOU WERE TALKING ABOUT.

    Of course the article contains a few brief quotes from Gosling and fails to clarify whether he knew (and spoke of) the differences between safe managed code, unsafe managed code, and native code...

    However, it is clear that *you* do not know.

  • Re:Advertisement? (Score:5, Insightful)

    by miu ( 626917 ) on Friday February 04, 2005 @09:07PM (#11578994) Homepage Journal
    I think he is talking about the fact that the type system of managed code itself could potentially be subverted by unmanaged code added by other developers.

    The article is heavy on sensationalism and short on content so it is difficult to tell what is actually being debated here, but I think that Gosling is claiming that support of C type handling in itself creates a chink in the armor of the CLR, regardless of any particular project's use of that feature.

  • Re:Advertisement? (Score:5, Insightful)

    by n0-0p ( 325773 ) on Friday February 04, 2005 @09:14PM (#11579066)
    He's not wrong about the pitfalls of C/C++. It's just that his argument is downright silly when taken in the appropriate context. The .NET "unsafe" code segments are really no different than JNI, except that they integrate much more cleanly into the platform. As much as I dislike Microsoft in general, .NET is an extremely well designed and secure platform. I say this as someone who has spent almost a decade making a living performing software security assessments and developing secure architectures. If you take the time to research it you will find that .NET really feels like the next incremental step after Java, and it takes advantage of a decade's lessons learned in Java.
  • by TheKingAdrock ( 834418 ) on Friday February 04, 2005 @09:23PM (#11579134)
    It's not the language, but the VM that matters. With the new C++/CLI (http://msdn.microsoft.com/visualc/homepageheadlines/ecma/default.aspx [microsoft.com]) you'll be able to write code that is verifiable, just like C#, or have the option to call into "unsafe" native code. Imagine that, a migration path for users!

    .NET supports many languages, and they can all "play nice", and interoperate (compare this to Java).

  • Re:Phew! (Score:3, Insightful)

    by ArbitraryConstant ( 763964 ) on Friday February 04, 2005 @09:26PM (#11579156) Homepage
    C/C++ programs have terrible security track records. They're behind basically every arbitrary code exploit out there. That class of bug just doesn't happen in managed languages. And I don't mean just Java and .NET. An open source example is Python.
  • Re:Rediculous (Score:5, Insightful)

    by janoc ( 699997 ) on Friday February 04, 2005 @09:29PM (#11579181)
    Yeah, right - the same problem as with signed ActiveX - once a buffer overflow in the trusted code is found, your security is fair game - the attacker only has to persuade e.g. your browser to load the buggy but trusted code. Managed languages like C# and Java were invented precisely to prevent this kind of hole.

    To me this looks like a similar problem to allowing native code to run via ActiveX. Yeah, we have permissions, signing and whatever - how much does it take for a trusted but buggy ActiveX applet to be exploited?

    Huge mistake, IMHO. And do not compare this to JNI - I am no Java expert, but AFAIK you simply cannot call JNI functions from something like a web applet, by design, whereas here it is at the discretion of the app developer.

  • Re:Advertisement? (Score:3, Insightful)

    by damiangerous ( 218679 ) <1ndt7174ekq80001@sneakemail.com> on Friday February 04, 2005 @09:36PM (#11579226)
    I don't know what the law is

    Obviously.

    but if a hunting rifle can only be legally used for hunting

    A hunting license licenses the owner to take a certain type of game (deer season, etc) on certain land (assigned state land, private land, etc) during certain times (hunting seasons, obviously) with certain tools (shotgun only, bow, etc). It only grants this, in the case of firearms, to people who already legally own them. A "hunting rifle" is simply a subset of rifle suitable for a certain task (which varies for the types of game). In every case the "hunting rifle" set overlaps other sets such as the "target shooting" set or "clay pigeon" set.

    this actually a pretty good idea. The card mechanism could also be used to enforce hunting seasons.

    No, it's a terrible idea. Even setting aside basic rights, and assuming for a moment that a "hunting rifle" with no other legitimate purpose exists, you're proposing that it be completely inoperable for 11 months (or whatever) of the year? And you see nothing wrong with forbidding use of a dangerous tool except for the brief times we let people loose in the woods with them? Not allowing people to become comfortable, or even passingly familiar, with it until they're hopped up on adrenaline in the forest? You see nothing inherently dangerous with that at all?

  • Re:Phew! (Score:3, Insightful)

    by bnenning ( 58349 ) on Friday February 04, 2005 @09:38PM (#11579235)
    I'd like to see a kernel written in either language.

    Fair enough, but at least 90% of the stuff written in C and C++ doesn't need to be.
  • Re:Advertisement? (Score:5, Insightful)

    by Zeinfeld ( 263942 ) on Friday February 04, 2005 @09:42PM (#11579260) Homepage
    The problem here is that it will be very difficult to take Gosling seriously when he talks about anything in the future. This does not make me think any better of Sun.

    Nobody is going to use C or C++ to write a completely new program under .NET. There are occasions where I might use C for something I wanted to make cross platform but no way would I ever go near C++.

    Most people who are going to use the new .NET support are people who have legacy C programs and want to gradually transition them to the .NET base in stages. That makes a good deal of sense.

    The other constituency is folk who are writing stuff that is almost but not quite at driver level.

  • Re:JNI (Score:3, Insightful)

    by spideyct ( 250045 ) on Friday February 04, 2005 @09:46PM (#11579292)
    It may be fair to point out, but it is kinda silly without any context.

    Java lets you write to the user's filesystem. Does that make it insecure? You could run a program to wipe out your hard drive!

    But Java allows for a "sandbox". So does .NET. And if your code runs in that .NET "sandbox" (for example, if it is running from a network resource), it won't let you run unsafe code.
  • Re:Phew! (Score:4, Insightful)

    by abigor ( 540274 ) on Friday February 04, 2005 @09:49PM (#11579312)
    You're entirely correct. But in the real world, systems programming accounts for only a small portion of written software. Most is firmly at the application level.

    I feel confident in predicting that most of userland will eventually run in the context of some virtual machine or other. Of course, that doesn't exactly make me a prophet, since that's the plan for Longhorn, but I think it will become the norm on other platforms as well.

    It would be nice if, in the long run, operating systems became irrelevant when it comes to choosing applications. You go with whatever has the best track record for speed, security, or whatever, and then just choose whatever applications you like. Since the virtual machine runs everywhere, so will your software.

  • Re:Phew! (Score:5, Insightful)

    by andreyw ( 798182 ) on Friday February 04, 2005 @09:59PM (#11579378) Homepage
    No offense, but give a fool a hammer and he'll crack his skull. C is not inherently insecure. C++ is not inherently insecure. If you don't know how to program, please step aside and let others through. I am not some sort of anti-managed-language zealot, I love Python, but to claim that C *as a language* has a terrible security track record is ridiculous. The applications, not the language, might have a terrible track record due to the ineptness of the programmer.

    I mean seriously, this is like claiming ASSEMBLY is a worthless insecure language because you can hang the system while in supervisor mode, due to ineptness? Sheesh.
  • Re:Phew! (Score:3, Insightful)

    by ArbitraryConstant ( 763964 ) on Friday February 04, 2005 @10:02PM (#11579397) Homepage
    Fair enough, but how often does the kernel do things like parsing strings?
  • Re:Advertisement? (Score:3, Insightful)

    by iabervon ( 1971 ) on Friday February 04, 2005 @10:02PM (#11579404) Homepage Journal
    His point is essentially that .NET does not protect the user against untrusted code, while Java does. If you run .NET code, you have to trust the developer, because the system won't protect you against a malicious or careless developer. If you run Java code in a sandbox, you're safe, because the system will watch what's going on and can be sure of the safety of its information.

    A hunting rifle is fine for some purposes, but decorating your house with them is unwise. Java, effectively, has support for making absolutely certain that the rifle cannot be fired, and therefore you can feel okay about having it on your mantle.

    Of course, he's theoretically wrong; the C standard actually does not exclude the possibility of preventing programs from doing bad things, by, for example, giving a bus error if you dereference a pointer to freed memory. You could have garbage collection in C if you really wanted, because there is a limited amount you can do to pointers and still be necessarily able to use them again. It's just that C implementations almost never do anything like this, because it would be slower and more resource-hungry than Java (because Java has limitations which then permit optimizations by the system). On the other hand, it might be worthwhile having such an environment, so that you could run untrusted code by developers who expect to be trusted.
  • by jeif1k ( 809151 ) on Friday February 04, 2005 @10:10PM (#11579459)
    This is what really distinguishes Java from other languages. The Java verifier is a sort of theorem prover that examines the byte-code and can guarantee that it does not violate certain rules such as forging the type of a reference or under/over-flowing the stack.

    You are kidding, right? Do you seriously believe Java is the first or only language to guarantee runtime safety? Safe languages are the rule, not the exception.

    To support C/C++ semantics (ad-hoc pointers) you'd have to throw all that out the window and I assume that's what he's talking about.

    C# distinguishes safe and unsafe code. C#'s safe code is as safe as "pure" Java code. You can think of C#'s unsafe code (or its equivalent in C/C++) as code linked in through the JNI interface, except that C#'s unsafe code has far better error checking and integration with the C# language than anything invoked through JNI.

    Altogether, C#'s "unsafe" construct results in safer and more portable code than the Java equivalent, native code linked in through JNI.

    Pat Niemeyer, Author of Learning Java, O'Reilly & Associates and the BeanShell Java Scripting language.

    Well, then I suggest you learn some languages other than Java before making such ridiculous statements.
  • by cecom ( 698048 ) on Friday February 04, 2005 @10:20PM (#11579521) Journal

    Yes, actually, it does. Have you checked it recently? The only overhead that natively compiled Java code would have over comparable C++ is that it always does array bounds checking. Other than that you just have to ask yourself, what kind of optimization can a static compiler (C/C++) do that a dynamic, profiling runtime compiler (Java) can't do?

    Java suffers a penalty of about 500% on average because of pointer chasing. No compiler or optimization exists today that can eliminate it, but Java as a language requires it, while C/C++ does not. (Think about how an array of complex numbers can be implemented in either language, for example.)

    Static versus dynamic compilation has little to do with this, since both Java and C/C++ can be compiled statically or dynamically (although Java tends to benefit more from dynamic compilation).

  • by YetAnotherAnonymousC ( 594097 ) on Friday February 04, 2005 @10:29PM (#11579579)
    That's not all. Even natively compiled Java code has additional overhead in casting operations and in object instantiation.
  • Horses for courses.

    I run projects developing business database-based software. Stuff like java and .net suits me fine. Do I need to access hardware beyond what those languages give me? Nope. Ever likely to? Probably not, but there are ways to interface.

    How critical is performance? It's as important as it needs to be. If a daily process has a 12 hour window to run in and takes 1 hour instead of 30 minutes, do I care? It's fast enough.

  • Re:Phew! (Score:5, Insightful)

    by owlstead ( 636356 ) on Friday February 04, 2005 @11:02PM (#11579759)
    Pffft, I am working with a couple of high-grade C++ programmers. When they go down to using pointers etc., you can be sure they introduce some overflow errors. You need at least a code checker to make sure that the most common mistakes are avoided. This is like saying that Internet Explorer is not insecure, as long as you visit the right websites.

    For most applications assembly is a worthless, insecure language, and you should stick to a higher-level language if you don't want to introduce problems (for anything larger than "hello world", and probably even that).
  • by clem.dickey ( 102292 ) on Friday February 04, 2005 @11:07PM (#11579788)
    It's been about eight years now that Java has been touted (not necessarily by present company) as faster than C. Do I really need to check again? I did check yesterday, and Java took about one minute to bring up an application login menu. That is slower than any C program, with the possible exception of Lotus Notes. So there's speed and then there's speed. Initialization time is still not too good in Java. Maybe Sun (and IBM, whence my Java comes) could take a few pointers from Microsoft (Word) and Apple (OS X boot). Keep "hot Java environments" ready for use, starting at boot time. Figure out how to run in less than ~0.5 GB of RAM. Mac OS X has seen memory requirements steadily *shrink* since initial release. That is unusual, if not unprecedented. Amit Singh describes some of Apple's tricks at kernelthread.com. I'd like to see Java do that.

    Still, some of us C/C++ coders get pretty tired of the assumption that all technology benefits accrue to Java (or will Real Soon Now - the 4GHz and 5GHz Pentiums should really help) while none accrue to C/C++.

    Oh, and can I mention that heap? Java's insistence on the garbage collection model prevents the deterministic destruction of objects, upon which some programming idioms rely ("resource acquisition is initialization" is one). I don't mind that a heap exists for programmers who wish to use it, but It Would Be Nice If Java also permitted automatic allocation (and, more importantly, deallocation) in the C++ model. The RTE would have to deal with stale references to automatic objects, but Java could take care of that by raising an exception when auto destruction would create a stale reference.
  • Re:Phew! (Score:5, Insightful)

    by ArbitraryConstant ( 763964 ) on Friday February 04, 2005 @11:28PM (#11579879) Homepage
    "No offense, but give a fool a hammer and he'll crack his skull. C is not inherently insecure. C++ is not inherently insecure. If you don't know how to program, please step aside and let others through. I am not some sort of anti-managed-language zealot, I love Python, but to claim that C *as a language* has a terrible security track record is ridiculous. The applications, not the language, might have a terrible track record due to the ineptness of the programmer."

    That's just restating the question.

    If managed languages make a certain class of exploits impossible or very unlikely while C doesn't, then C is insecure relative to those languages.

    A good C programmer might be able to cut the exploit rate down to some very small value, but they're going to work pretty hard to get to that point while people in managed languages get it for free. And good C programmers still fuck up sometimes.

    Of course, there's other ways to screw up. No language is immune from security problems. Using a "managed" language is nothing more than risk management, but it's pretty effective.
  • Re:Phew! (Score:5, Insightful)

    by timeOday ( 582209 ) on Friday February 04, 2005 @11:56PM (#11580053)
    No offense, but give a fool a hammer and he'll crack his skull. C is not inherently insecure. C++ is not inherently insecure. If you don't know how to program, please step aside and let others through.
    No, the programmer is irrelevant to this argument. Pick any programmer you like, any one in the world. He will make mistakes at some rate. In C, those bugs will translate directly to security holes, whereas in a typesafe language they will not. It's just that simple.

  • Re:Advertisement? (Score:5, Insightful)

    by ndykman ( 659315 ) on Friday February 04, 2005 @11:59PM (#11580075)
    Not so. Well, unless you hacked the .Net type verification and loading code and managed to install it over the .Net Framework (not easy, really).

    All use of unsafe features in .Net is marked as such and can't be hidden. So pointers, unsafe casts, etc. all stick out to the type loader. In fact, if a .Net assembly tries to mark itself as safe and it has unsafe features, the loader won't load it.

    As far as I know, there is no example of unmanaged code that can violate the managed code type system, and .Net was explicitly built to keep this from happening.

    Also, this ignores that C/C++ support is much more complicated in .Net. Yes, there is the IJW (It Just Works) stuff that allows unmodified code to compile to unsafe .Net assemblies, but there is also the C++/CLI stuff, which creates a CLS version of C++.

    Frankly, this seems like a bit of sour grapes to me. .Net does really improve on Java in lots of ways. Yes, James, Java isn't the last word on programming languages. .Net isn't either.
  • Re:Phew! (Score:4, Insightful)

    by Transcendent ( 204992 ) on Saturday February 05, 2005 @12:26AM (#11580220)
    Pffft, I am working with a couple of high grade C++ programmers. When they go down using pointers etc. you can be sure they introduce some overflow errors.

    Are they really that high grade then?
  • Re:Phew! (Score:2, Insightful)

    by relay_mod ( 525998 ) on Saturday February 05, 2005 @01:06AM (#11580391)
    • In C, those bugs will translate directly to security holes, whereas
    • in a typesafe language they will not. It's just that simple.

    Bullshit. The rate of security holes will be lower, yes. But type safety never guarantees correctness. As such, it never guarantees security.

  • Re:Phew! (Score:3, Insightful)

    by jtshaw ( 398319 ) on Saturday February 05, 2005 @01:10AM (#11580406) Homepage
    It is true, any programmer, no matter how good, will make a mistake here and there. However, buffer overflows and such in a single program don't have to be the security nightmare that they often are these days.

    It all comes down to bad OS design in general. Take the IE exploits, for example. Why the heck can you get so much system access through an exploit in a web browser?!? Let's be honest here, the security model employed in most of today's OSes is mind-boggling in its ineptness.

    Linux is not immune either. Many distributions out there still have ridiculous setups like having server daemons running as the root user. Run each server daemon under its own user account, give that user no permissions on anything it doesn't need for that particular daemon, and you can at least save the rest of the system if the daemon is hacked.

    I love Linux, but I'm sick of having to apply SELinux patches, PaX/grsecurity patches, ACL patches, and set up complicated user jails just to feel like my system is safe.
  • Re:Phew! (Score:3, Insightful)

    by andreyw ( 798182 ) on Saturday February 05, 2005 @01:19AM (#11580450) Homepage
    Did I just DROP an important SQL table? ;-) ;-)
  • by Earlybird ( 56426 ) <slashdot@purefict i o n.net> on Saturday February 05, 2005 @01:50AM (#11580574) Homepage
    • Note that this isn't a particular vulnerability, just a system of typing that makes it easy to introduce vulnerabilities, which last time I checked, all C programmers deal with.

    Yes, CowboyNeal, but do they want to deal with it, and should they deal with it?

    For every programmer who reads security bulletins and keeps tabs on the latest string-copying buffer overflow issues and fundamental security principles, there are a hundred who don't know or care.

    C is a high-level language that:

    • Has direct access to every part of the operating system and executes instructions directly from memory. This means that malicious code can slip into its memory space through buffer overflow exploitations and the like.
    • Is, in almost all cases/operating systems, running with the same capabilities as the logged-in user, which means it has virtually endless power that ranges from formatting your hard drive to infecting other nodes with worms or looking through your email app's address book. It's not limited to the desktop computer of the hapless Windows user, either: Unix daemons running on servers as non-root users can cause serious havoc.

    Programmers want to be productive -- most want to make colourful stuff happen on the screen, not fiddle around with buffer guard code. So the more security can be built into the language and its running environment, the better.

    Many languages, such as Python or Ruby, provide security against what I mention in my first bullet, through a virtual machine. They're not impenetrable, and are of course, as dynamic languages, subject to a different class of security holes (e.g., string evaluation of code), but they're a step up from the C level.

    Other languages, like Java, provide capability-based security models, allowing for sandbox environments with fine-grained control over what a program may or may not do. Java's security system is ambitious, but since most Java apps run on the server these days, it's not frequently used, and except for browser applets, Java code tends to run unchecked.

    In a way, Java tries to do what the OS should be doing. Programs run on behalf of their human users, and their destructive power is scary. Why should any given program running on my PC have full access to my documents or personal data? We're entering an age where we have more and smaller programs, and the difference between "my PC" and "the net" is increasingly blurred. Operating systems need to evolve to be able to distinguish between the different capabilities they can grant to programs, or processes -- we need to think about our programs as servants that are doing work for us by proxy.

    The same way you wouldn't let a personal servant manage your credit cards, you don't want to let your program do it -- unless, of course, it was a servant (or, following this metaphor, program) hired to deal with credit cards, which introduces the idea of trust. The personal accountant trusts the bank clerk, who trusts the people handling the vault, who trust the people who built the vault, and so on.

    In short, any modern computer system needs to support the notions of delegated powers, and trust.

    Programmers will certainly never stop having to consider vulnerabilities in code. But painstakingly working around pitfalls inherent in one's language, be it C or indeed .NET, is something we need to evolve past. The users, upon whom we exert so much power, certainly deserve it.

  • Re:Phew! (Score:2, Insightful)

    by malfunct ( 120790 ) on Saturday February 05, 2005 @01:53AM (#11580587) Homepage
    If you examine the fault model for C/C++ programs as compared to managed programs you will find that the classes of errors are very different. Its much easier to overflow buffers and overflow integers in an unmanaged language. These types of errors lead to exploitable holes and are just way more common in C/C++ programs.
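    A classic instance of the integer-overflow class mentioned above is a size calculation that wraps around, yielding a too-small allocation that later gets overrun. A minimal sketch of the guard (the function name is illustrative, not from any particular codebase):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* Returns a buffer of count * size bytes, or NULL if the multiplication
     * would wrap around SIZE_MAX -- the classic unmanaged-code bug where the
     * overflowed, too-small allocation is later overrun. */
    void *checked_alloc(size_t count, size_t size) {
        if (size != 0 && count > SIZE_MAX / size) {
            return NULL; /* refuse: count * size would overflow */
        }
        return malloc(count * size);
    }

    int main(void) {
        void *ok = checked_alloc(16, sizeof(int));
        assert(ok != NULL);
        free(ok);

        /* An overflowing request is rejected instead of silently wrapping. */
        assert(checked_alloc(SIZE_MAX / 2, 4) == NULL);
        return 0;
    }
    ```

    A managed runtime performs this kind of check for you on every allocation and array access; in C it only happens if the programmer remembers to write it.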
  • Re:Phew! (Score:5, Insightful)

    by Brandybuck ( 704397 ) on Saturday February 05, 2005 @02:16AM (#11580698) Homepage Journal
    Then they're not "high grade". If you need to use strings, you use the string class. If you need to use a bounded array you use the STL vector. If you can't use the STL, you guard your arrays. Those three things, which are normal "high grade" C++ coding style, avoids the vast majority of potential overflows.
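    A minimal illustration of those three habits (assuming a C++ compiler with the standard library; the values are arbitrary):

    ```cpp
    #include <cassert>
    #include <stdexcept>
    #include <string>
    #include <vector>

    int main() {
        // std::string manages its own storage; there is no fixed char
        // buffer to overrun when the contents grow.
        std::string s = "hello";
        s += ", world";

        // std::vector with .at() performs bounds checking and throws
        // std::out_of_range instead of silently corrupting memory.
        std::vector<int> v{1, 2, 3};
        bool caught = false;
        try {
            (void)v.at(10); // out of range
        } catch (const std::out_of_range &) {
            caught = true;
        }
        assert(caught);
        assert(s == "hello, world");
        return 0;
    }
    ```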
  • by EventHorizon ( 41772 ) on Saturday February 05, 2005 @03:12AM (#11580873)
    Incompetence leads to trouble in any language. For instance, a recursive function which does not enforce a recursion limit may:

    - segfault in C
    - throw an unhandled exception in python
    - churn up 2GB swap in java (or something similar)

    In any case, think DoS. The solution is to program competently, regardless of language.
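    Enforcing the recursion limit yourself, rather than letting the runtime blow up in whatever way it chooses, might look like this (a sketch in Python; the function and its budget are made up for illustration):

    ```python
    def depth_limited_sum(values, limit=100, _depth=0):
        """Recursive sum that enforces its own recursion budget, so the
        failure mode is a controlled error rather than a segfault (C),
        an unhandled stack overflow, or runaway memory use."""
        if _depth > limit:
            raise RecursionError("recursion budget exceeded")
        if not values:
            return 0
        return values[0] + depth_limited_sum(values[1:], limit, _depth + 1)

    assert depth_limited_sum([1, 2, 3]) == 6
    ```

    Feeding it attacker-sized input now produces a catchable error instead of a denial of service.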
  • Re:Phew! (Score:3, Insightful)

    by HiThere ( 15173 ) * <charleshixsn@LIONearthlink.net minus cat> on Saturday February 05, 2005 @03:45AM (#11580966)
    Actually, C *is* inherently insecure. You are required to use unbounded pointers. And, yes, assembly language is insecure in exactly the same way.

    It would help if C automatically initialized RAM (and I've known at least one C implementation that did, but it isn't part of the language spec), just like it helps not to be able to use 0 as a pointer address. But the only thing that would make C safe would be to put it in a box and not let it look outside. Even then you'd get buffer overflows within the code, but restricting it from using pointer references outside the bounds of the program would help.

    Saying that "C is not inherently insecure" is like saying a sharp knife without a hilt isn't dangerous to the user. With reasonable care one can usually insure that nothing adverse happens to one, but it always takes a lot of care and attention, and the time that you slip can hurt you badly.
  • Re:Phew! (Score:5, Insightful)

    by jilles ( 20976 ) on Saturday February 05, 2005 @04:22AM (#11581076) Homepage
    ASSEMBLY is a totally inappropriate language for the vast majority of applications, including operating system kernels, video card drivers and games. The complexity and security tradeoff simply doesn't justify the performance gains. That's why it isn't used anymore in most of the software industry. The same is true to a lesser extent for C and C++.

    These languages are inherently insecure because they allow for mistakes that other languages do not allow for. Combine this with the fact that you don't have to be a fool to make mistakes and you have the perfect proof that C is inherently insecure. Refute either of those arguments and you might have a point.

    The problem with C is that it takes the inhuman capability to not make mistakes to end up with secure software. That's why all of the long lived C software projects have to be constantly patched to correct mistakes. Buffer overflows are no accidents, they are the predictable result of using C. Use C -> expect to deal with buffer overflows, memory leaks, etc. Every good C programmer knows this.

    The difference between good C software and bad C software is that in the latter case the programmers are surprised, whereas in the former the programmers actually do some testing and prevention (because they expect things to go wrong). Mozilla Firefox comes with a talkback component because the developers know that the software will crash sometimes and want to be able to do a post mortem. The software crashes because the implementation language of some components is C. IMHO Mozilla Firefox is good software because it crashes less than Internet Explorer and offers nice features.

    Of course we have learned to work around C's limitations. Using custom memory allocation routines, code verifiers and checkers, and extensive code reviews, we can actually build pretty stable C software. The only problem is that C programmers are extremely reluctant to let go of their bad habits. Some actually think they are gods when they sit down behind their editors and do such stupid things as using string manipulation functions everyone recommends avoiding like the plague, trying to outsmart garbage-collecting memory allocators by not using them, etc. If you built a switch into the compiler which enforced the use of the improvements listed above, most of the popular C software simply wouldn't compile. But then it wouldn't be C anymore, because the whole point of C is to allow the programmer to do all the bad things listed above, even accidentally.

    IMHO programmer reeducation is an inherently bad solution to inherent security problems of the C language. You can't teach people not to make mistakes. You need to actively prevent them from making mistakes. You can make mistakes in other languages but they are a subset of the mistakes you can make in C. Therefore you should use those languages rather than C unless you have a good reason not to. Any other reason than performance is not good enough.
  • Re:Phew! (Score:3, Insightful)

    by tomstdenis ( 446163 ) <tomstdenis@gmGINSBERGail.com minus poet> on Saturday February 05, 2005 @04:33AM (#11581104) Homepage
    How did your post get insightful?

    Buffer overflows are not the only kind of bug that plagues development. Quite a few "plain old logic errors" or "insecure designs" are a source of problems.

    I mean I just reinstalled pam last night [for the second day in a row... diff versions] with maybe 20 patches applied to it. I doubt all 20 [or any at all] were due to buffer overflows.

    A proper programmer would do proper bounds checking on their own [e.g. I need to store N bytes, do I have N bytes available]. People who don't shouldn't be writing software. Period.
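    The "do I have N bytes available" check described above might be sketched like this (the function name is illustrative):

    ```c
    #include <assert.h>
    #include <string.h>

    /* Copy src into dst only if it fits, including the terminating NUL.
     * Returns 0 on success, -1 if the destination is too small --
     * refusing the copy instead of corrupting adjacent memory. */
    int bounded_copy(char *dst, size_t dst_size, const char *src) {
        size_t needed = strlen(src) + 1;
        if (needed > dst_size) {
            return -1;
        }
        memcpy(dst, src, needed);
        return 0;
    }

    int main(void) {
        char buf[8];
        assert(bounded_copy(buf, sizeof buf, "short") == 0);
        assert(strcmp(buf, "short") == 0);
        assert(bounded_copy(buf, sizeof buf, "far too long") == -1);
        return 0;
    }
    ```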

    And yes, "shit happens" but you can just as easily screw up the logic or other aspects in a typesafe language and end up with lowered security.

    SSL rollback anyone?

    Tom
  • by reflective recursion ( 462464 ) on Saturday February 05, 2005 @05:10AM (#11581202)
    I was thinking more along the lines of Lisp or Scheme. Please do your own research, I really don't have time to school a newbie. Java is nothing new. Hell, even Pascal had P-code. Welcome to the 1980s and prior. GC is some new technology according to Java people... what a fucking joke.
  • by notany ( 528696 ) on Saturday February 05, 2005 @05:16AM (#11581221) Journal
    A good architect will know how to choose tools to match the problem. (If you can't, either you are not educated or you are a code slave.)

    Rule:

    If you don't need to spend 5-10% of your development time speed/size-optimizing your program to make it usable, you are not using a language/abstractions high-level enough for your task.

    Explanation:

    If I use a high-level language (say Haskell/OCaml/Clean/Common Lisp) and use all its abstraction powers, the program code will usually be 10-50% of the size of the same program written in C/C++ (and even Java). Now that X% (50-90%) of slack will take X% of the development time and contain X% of the bugs. It will make the program much harder to change, too. You can see that the 5-10% spent on optimization (you can even write the fast parts in C if you like) will pay off.

    If you don't believe it, compare the code of gnu-arch [srparish.net] to darcs [abridgegame.org]. Both are similar version control systems.

  • Re:Phew! (Score:1, Insightful)

    by Anonymous Coward on Saturday February 05, 2005 @06:07AM (#11581344)
    "If you don't know how to program, please step aside and let others through."

    Ah...another one that has never been a novice.
  • Re:Phew! (Score:3, Insightful)

    by Taladar ( 717494 ) on Saturday February 05, 2005 @10:37AM (#11582267)
    The problem with .NET as I see it is that VMs don't make much sense if they only run on one platform. They are just an additional layer of crap that slows down your program.
  • Re:Advertisement? (Score:3, Insightful)

    by Bodrius ( 191265 ) on Saturday February 05, 2005 @06:36PM (#11585682) Homepage
    You are correct, I should say that C# fully supports static inner classes.

    "Nested type" and "inner class" are, as far as I know, equivalent terms and the latter has been a common term used in both languages. However, this is the first time I know of someone who differentiates the two terms in practical use.

    What makes the implicit "this.Outer" the essential feature of an "inner class" in your terminology? I'm also curious as to why you consider it such an important feature.

    Once more this seems to be a matter of designer taste, but unlike anonymous inner classes, I do not see how this is particularly useful.

    I have seen 2 broad categories where the reference to the outer class is useful:
    - Inner classes for logic encapsulation in a complex class: passing the reference explicitly is no pain here, though.
    - Anonymous inner classes: most useful because the class declaration itself is overkill, doing the constructor plumbing even more so.

    I always saw the non-static inner class as a nice element of syntactic sugar that quickly gets out of control, because it solves a non-problem if you ignore anonymous inner classes. In the process it complicates the language with special semantics and syntax for little need. Of course, once you bring this in to support anonymous inner classes, it pays for itself. But the problem is that anonymous inner classes have almost always been a crufty solution themselves... 99% of the time I used them effectively they were functors, and what I wanted was a method, not a full class.

    Ignoring anonymous inner classes for a sec:

    In Java, I do not have to pass an instance of my object to the inner class, so I get 'for free' the ability to write this (horrible) syntax:

    int x = this.ExtremelyLongNameBecauseWeKnowClassesGetLongNamesTheseDays.privateMember.foo();

    If the class has a non-trivial size (which for non-anonymous classes can easily happen) or the reader of the code is not familiar with the special semantics of Java's non-static inner classes, it's not immediately obvious where this reference came from.

    As a programmer I have to make an explicit decision NOT to keep an extra reference in my object. If I forget the static modifier, I get this dangling implicit this.Outer 'for free' too, when normally I just don't need it, and it pollutes the contents of the object.

    In C#, you have to explicitly decide you'll need a reference to your outer type, and then use it:

    public MyFooInnerClass(MyBarOuterClass bar)
    {
        myBar = bar;
    }
    ...

    int x = this.myBar.privateMember.foo();

    The latter code seems to me more straightforward and readable, not depending on special semantics to know where myBar came from.

    Sure, I have to pass the 'outer' object to the constructor myself instead of letting the compiler do it, but in exchange I don't have to worry about a different instantiation or reference syntax. Anyone who has never seen this done before will not be too surprised this works, and will have little trouble understanding what is going on.

    In order to keep things clean in Java, in my code I always made inner classes static until I actually NEEDED the outer reference. That is no different from explicitly passing the reference to the outer object in the constructor when you find you need to, only the latter is always an explicit decision by the programmer.
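    The "static until needed" pattern could be sketched like this in Java (class and member names are made up for illustration):

    ```java
    public class Outer {
        private int secret = 42;

        // Static nested class: no implicit Outer.this. The dependency on
        // the outer instance is explicit in the constructor, just as in
        // the C# version above.
        static class Inner {
            private final Outer outer;

            Inner(Outer outer) {
                this.outer = outer;
            }

            int readSecret() {
                // Nested classes may access private members of the
                // enclosing class, static or not.
                return outer.secret;
            }
        }

        public static void main(String[] args) {
            Outer o = new Outer();
            Inner inner = new Inner(o);
            System.out.println(inner.readSecret()); // prints 42
        }
    }
    ```

    Anyone reading `inner.outer` knows exactly where the reference came from, which is the readability point being argued here.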

    That seems to me a good thing, but once more it's a matter of taste.

  • by bXTr ( 123510 ) on Saturday February 05, 2005 @07:25PM (#11586008) Homepage
    Talk about not eating your own dogfood?
