
Gosling Claims Huge Security Hole in .NET

renai42 writes "Java creator James Gosling this week called Microsoft's decision to support C and C++ in the common language runtime in .NET one of the 'biggest and most offensive mistakes that they could have made.' Gosling further commented that by including the two languages into Microsoft's software development platform, the company 'has left open a security hole large enough to drive many, many large trucks through.'" Note that this isn't a particular vulnerability, just a system of typing that makes it easy to introduce vulnerabilities, which last time I checked, all C programmers deal with.
This discussion has been archived. No new comments can be posted.


Comments:
  • by Saint Stephen ( 19450 ) on Friday February 04, 2005 @08:54PM (#11578876) Homepage Journal
    I hate to defend MS on this, but you have to have a certain type of permission to call unsafe code. As soon as you call anything such as that, the whole program becomes immediately unverifiable.
  • Ridiculous (Score:2, Interesting)

    by SnprBoB86 ( 576143 ) on Friday February 04, 2005 @09:01PM (#11578938) Homepage
    Assemblies (.NET DLLs and EXEs) require special permission to run unsafe code. In the eyes of .NET, all unmanaged code or any use of pointers is considered unsafe. This includes every C/C++ application ever. .NET's philosophy on security is clear:
    A .NET assembly is secure except by special request to use unsafe code. Over time, all assemblies should be completely devoid of unsafe code, except for assemblies from trusted sources.

    For example: The end user can grant unsafe permissions to the Microsoft Managed DirectX assemblies. Anyone could then use these assemblies without needing unsafe permissions. If you trust MS MDX to use unsafe code, and you trust the app you downloaded to use MS MDX, you don't need to give the app permission to use unsafe code.
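    The same idea of gating unsafe code behind an explicit grant has a rough Java-side analogue. A minimal sketch (class and library names invented for illustration): a SecurityManager can refuse to let any code link against native libraries, which is about the closest Java gets to denying the "unsafe" permission.

        // Sketch: deny all native library loading via a SecurityManager.
        public class DenyNativeCode {
            public static void main(String[] args) {
                System.setSecurityManager(new SecurityManager() {
                    public void checkLink(String lib) {
                        // System.loadLibrary() consults checkLink() before loading anything.
                        throw new SecurityException("native code not permitted: " + lib);
                    }
                });
                try {
                    System.loadLibrary("somelib"); // hypothetical library name
                } catch (SecurityException e) {
                    System.out.println("Blocked: " + e.getMessage());
                }
            }
        }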
  • All C programmers? (Score:2, Interesting)

    by Space Coyote ( 413320 ) on Friday February 04, 2005 @09:10PM (#11579014) Homepage
    which last time I checked, all C programmers deal with.

    "Trust the programmer" is the most asinine statement ever put to paper, with the possible exception of "security through obscurity."

    We have an operating system so that programmers don't have to do boilerplate file and memory operations themselves, we have good type-safe languages so they don't have to spend time profiling all of their code to make sure it doesn't have any buffer overrun risks.
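    As a throwaway illustration of the point (class name invented for the example): the same off-by-one that silently scribbles past an allocation in C is trapped at runtime by the bounds check in a type-safe language like Java.

        // Sketch: an off-by-one write is caught instead of corrupting adjacent memory.
        public class BoundsCheckDemo {
            public static void main(String[] args) {
                byte[] buffer = new byte[16];
                try {
                    for (int i = 0; i <= buffer.length; i++) { // deliberate off-by-one
                        buffer[i] = 0x41;
                    }
                } catch (ArrayIndexOutOfBoundsException e) {
                    // In C this last write would land in whatever happens to follow the buffer.
                    System.out.println("Overrun attempt stopped: " + e.getMessage());
                }
            }
        }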
  • Beware the agenda (Score:4, Interesting)

    by BillsPetMonkey ( 654200 ) on Friday February 04, 2005 @09:14PM (#11579057)
    C++ allowed you to do arbitrary casting, arbitrary adding of integers and pointers, and converting them back and forth between pointers in a very, very unstructured way.

    Unstructured? Yes. A huge security hole? No more than any other language using COM objects. You can write crappy spaghetti code in any language. The type interface for .NET and the unsafe keyword for managed code are there to restrict how you use native objects.

    What Gosling is really criticising is the way .NET handles managed code, which Java can't do so easily (remember Jini? Me neither) - so what .NET should really do, according to Gosling, is have a sandboxed runtime with no, or severely restricted, access to the native interfaces - to hell with the performance compared to native methods? Oh, that'll be just like, ummm... Java then.
  • by bitflip ( 49188 ) on Friday February 04, 2005 @09:23PM (#11579129)
    Actually, this is the kind of thing I like to see. It is definitely technology related; its omission would be an error, IMO. If I'd seen this someplace else, and didn't see a discussion of it on /., I'd be concerned.

    The fact that the editors actually chose to point out the flaw in the argument (in MS' favor!!!), rather than adding to the sensationalism is a welcome and refreshing change.
  • by Anonymous Brave Guy ( 457657 ) on Friday February 04, 2005 @09:36PM (#11579224)
    Sure it does, compiled Java code can be just as fast as C++ code and nearly as fast as C code.

    You imply that compiled C code is faster than compiled C++ code, which IME is rarely the case these days. In particular, optimisations performed by C++ compilers have almost caught up with their C brethren. With almost perfect zero-overhead implementations of all the major C++-only language features now in common use and the added performance boost from things like inlined code in templates, the balance often tips significantly in C++'s favour now.

    There is numeric code available for Java today that is just as fast as equivalent libraries for C++.

    Can you give some examples of high quality numerical libraries written in pure Java (i.e., without JNI)?

    Disclaimer: I'm a professional C++ developer, and I write high performance maths libraries for a living.

  • Re:Advertisement? (Score:3, Interesting)

    by MerlinTheWizard ( 824941 ) on Friday February 04, 2005 @09:51PM (#11579327)
    C and C++ allow for buffer overflows.

    It's irrelevant, actually. A bug is a bug. You can make them in any language. The consequences of the potential bugs are what matters. But only the implementation defines what a "buffer overflow" will actually do. Granted, you can try to write past some allocated buffer in C (and C++). That doesn't mean the write should actually occur. That's the responsibility of the implementation, and mostly of the underlying operating system. I already said that earlier: the major problems we have been facing for decades on mainstream systems (and even some critical servers) are, in my opinion, these:

    Allowing executing code from a purely 'data memory' space. That should never, ever be possible under any circumstances. I'll fight for that cause if I have to.

    Poor 'data memory' protection. Ideally, the OS should be able to protect individual data areas, down to application buffers and variables. There is nothing that would prevent writing a C or C++ compiler for such an environment. Absolutely nothing.

    You may not be able to "overflow" some data buffer in Java, but you can always write garbage to it. That's the same. As I said, the languages need not be fixed. The systems and the memory models do.
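    To make that last point concrete, a minimal sketch (class and variable names invented for the example): memory safety stops the overflow, but nothing stops you from writing the wrong data to the right place, which is an ordinary logic bug in any language.

        // Sketch: in bounds, but wrong. No overflow, data still ruined.
        public class GarbageInBounds {
            public static void main(String[] args) {
                int[] accountBalances = new int[4];
                int customer = 2;
                // Logic bug: credits the wrong customer. Perfectly "safe", perfectly wrong,
                // and no amount of bounds checking will catch it.
                accountBalances[customer - 1] += 100;
                System.out.println(java.util.Arrays.toString(accountBalances));
            }
        }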

  • by afidel ( 530433 ) on Friday February 04, 2005 @09:53PM (#11579339)
    You should be able to find plenty of starting points here [nist.gov].
  • Re:Advertisement? (Score:3, Interesting)

    by andreyw ( 798182 ) on Friday February 04, 2005 @10:03PM (#11579410) Homepage
    C is a language. It's not an OS executive. It can't stop you from overwriting 0xdeadbeef with gobbledygook if your OS has no VMM... which, unless you're still running DOS, writing a kernel, or programming an embedded device, is not the problem you might imagine.
  • Free and malloc are NOT very simple algorithms. They have arbitrary complexity. malloc ranges from the normal case of about 100 instructions (when a block of memory of the correct size is available) to unbounded (when data must be paged out to make enough room for the new allocation and the data for tracking it). Free can also be arbitrarily slow, as releasing one block may trigger a coalesce operation.

    So you have to measure time per malloc and time per free, then total them up and compare it to GC's time per allocation and time spent in GC. In some cases, one will be significantly larger than the other, but in most nontrivial programs, using modern malloc/free and modern GC, it comes out pretty close to even.

    Some argue that the "pause" from GC is a problem. Maybe. Except that as mentioned before, malloc can also "pause" for arbitrarily long times. And a lot of work has been done on "concurrent" GC that doesn't pause. If you can afford paging in from disk (swap file), you can also afford GC's "pause".

    Finally, when you write a big program, you spend incredible effort in your program tracking memory. That takes cycles. "If x then save a copy cause we'll have to free it later, etc."

    The bottom line is that there are some cases where GC still won't work, but those cases are getting smaller and smaller. For most cases, the argument that GC is slow or inefficient just isn't true. Go do some real benchmarks, or go study up on the already published benchmarks. GC is pretty efficient, and malloc/free has no significant speed advantage anymore.
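    The GC side of that comparison is easy to poke at yourself. A crude sketch of such a measurement (class name, allocation count and size are arbitrary; a fair comparison also needs warm-up runs and a matching malloc/free loop in C on the same box):

        // Crude sketch: wall-clock time per small, short-lived allocation, GC time included.
        public class AllocMicrobench {
            static Object sink; // keeps the allocations from being discarded outright

            public static void main(String[] args) {
                final int count = 10000000;
                long start = System.currentTimeMillis();
                for (int i = 0; i < count; i++) {
                    sink = new byte[32]; // short-lived garbage; most of it dies young
                }
                long elapsed = System.currentTimeMillis() - start;
                System.out.println(count + " allocations in " + elapsed + " ms ("
                        + (elapsed * 1.0e6 / count) + " ns each)");
            }
        }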
  • I hate Java. (Score:1, Interesting)

    by Anonymous Coward on Friday February 04, 2005 @10:24PM (#11579549)
    A little off-topic, but I just have to say it now that someone mentioned Gosling...

    The "enterprise" label is given to any application that meets the following requirements:

    o Must be Java-based.
    o Must be a bloated/slow POS that crashes every day making oncall a total nightmare.
    o Must have little to no documentation (requires an expensive consultant to install)
    o Must cost a lot of money cause if it was cheap it must mean it sucks. Right? uhhh

    At least this is true with the company I work for.
    I long for the days most webapps were written in scripting languages...

    So I'd like to say thank you Gosling for making working as an IT sysadmin suck.
  • Re:Phew! (Score:5, Interesting)

    by pivo ( 11957 ) on Friday February 04, 2005 @10:26PM (#11579559)
    I think the point is that it's much easier to inadvertently create security holes when you write code in lower level languages like C. Lots of excellent programmers have written code with security problems, simply because they're focusing on making their code work and not thinking about security. It's an extremely common problem, and while it may be a problem with the developer's focus, it's not generally a problem of low skill levels.

  • Re:Phew! (Score:2, Interesting)

    by Anonymous Coward on Friday February 04, 2005 @10:32PM (#11579600)
    Congratulations. You have just described Microsoft's worst case scenario. They will never let this happen, so long as they have any say.

    You are correct in saying that Microsoft wants userland running in a virtual machine - specifically THEIR virtual machine. And ONLY their virtual machine. This will become evident as soon as the gloves come off with .NET.
  • by Anonymous Coward on Friday February 04, 2005 @10:37PM (#11579637)
    Sorry. This doesn't really cut it. Note the dates on the website. It's been moribund for about 2-3 years (the unfortunate side effects of downsizing useful government initiatives to fund the military/security complex and tax cuts, but that is another sad story).

    Of the two "numeric" libraries mentioned on the website only one handles complex numbers and the implementation in java leaves much to be desired (relative to assembly or C). To my knowledge, the Lau numerical libraries based on ALGOL routines are good and probably the most extensive available in java, but there are numerous performance issues relative to FORTRAN and only some are even marginally optimized to run in parallel.

    With the advent of dual core opterons in the near future and the commercialization of grid computing, as a SUNW investor, I'm still hopeful that James Gosling and Sun will step up to the plate here. So far, I haven't seen my fervent hopes come true, but perhaps you know something I don't. If so, I'd love to hear about it.

    Investing aside, this is really unfortunate as 1) I love to program in java, 2) my area of interest is in the application of numeric algorithms for bioimaging and 3) I would like to make better use of such in threaded, object oriented/actionlistener/GUI contexts for which java excels. When one attempts to call numeric libraries to produce actionlisteners attached to dynamic graphics calls, even minor performance penalties can notably degrade graphics performance, particularly with the matrix sizes that result from even NTSC video resolutions.

    I'm not too thrilled about invoking JNI as there can be a substantial overhead on calling native code such as BLAS from within java (not to mention that it is complicated, even for relatively simple function calls, and of course non-portable). This also ignores the problems raised about java's floating point representation and its inability to code for addition and multiplication within a single clock cycle as can be done in FORTRAN. Such problems are especially acute when dealing with eigenvalue problems in which the results contain roots with multiplicity and where ill-conditioning can be an issue for iterative solutions. Obviously, 64 bits will help in such circumstances, but it's not really a general panacea.

    This is all somewhat off topic, but I am always on the lookout for someone who knows better than I, as I am keen to prove myself wrong (and hence be in the position to write more effective java code).

    It is ironic that Sun has moved aggressively to grid computing, but has still not fully addressed the limitations of java's numeric routines. James Gosling made some noises about attending to these defects on the forum you mentioned quite a few years ago, but to my knowledge there has been no real response to the criticisms raised in the references found via the website you cite. At least none that I am aware of.

  • by sapgau ( 413511 ) on Friday February 04, 2005 @10:43PM (#11579665) Journal
    Man, why don't you tell us why you really hate Java.

    No, Java is not suitable (or useful) for what an engineer would call a "critical" application. Those applications are coded in C or C++ (or Assembler).

    I'm using java because that was the business decision made by my boss (or my boss' boss). So I'm just told what I have to do (what interface the user expects, what system I have to connect to, etc.) But for the company I work for, Java might be a critical part of their business plan.

    For example, you won't find Java in a heart monitor in the hospital, but you will probably find that the server that keeps your health records is written in Java. Whoever is developing the health record system can (more or less) pass the code to a new developer to continue working on it without expecting the new guy to be an expert on that particular system.

    Anyway, this could all be bullshit if sound coding practices are not followed in ANY language.
  • by David's Boy Toy ( 856279 ) on Friday February 04, 2005 @10:56PM (#11579735)
    Java is way, way overhyped, not to mention proprietary.

    You can program C++ using classes such as std::string in a manner similar to the Java string class. This eliminates most of the buffer overrun issues that plague many C programs. But unlike Java you can bit twiddle when you need to, ideally encapsulating your twiddling in a class.

    Java is simply a straitjacket for programmers, but straitjackets can't prevent logic errors. Your data is still at risk through stupid programming mistakes.
  • Re:Advertisement? (Score:3, Interesting)

    by owlstead ( 636356 ) on Friday February 04, 2005 @11:08PM (#11579796)
    I agree with almost everything in your post. However, I must say that not every feature .NET has over Java is an improvement. And not everything has such a long track record either. It took MS almost no time at all to implement almost every single feature of Java 1.5 into C#. Then again, Java added auto-boxing, which seems to stem from C#. None of the features of either language is really original, of course.

    Note that Microsoft's Java VM was not that safe. I am very curious whether .NET will have a better track record. Safe code is a very interesting thing to have - if it runs on a safe virtual machine.
  • Come on... (Score:2, Interesting)

    by RedHatRebel0 ( 800752 ) on Friday February 04, 2005 @11:14PM (#11579831) Homepage
    I dislike Microsoft just as much as the next Linux user, but come on... Good programmers know how to deal with the problems inherent in C/C++. Not to mention that those problems extend beyond .NET to all C/C++ environments. Besides, why should anyone listen to the anti-C++ rantings of one of the top dogs in the Java world? Just remember that today's security is tomorrow's vulnerability.
  • Re:Advertisement? (Score:5, Interesting)

    by gburgyan ( 28359 ) on Saturday February 05, 2005 @12:06AM (#11580113) Homepage
    I have to agree -- and I'll try to extend your arguments even further.

    In my current job, which involves quite a bit of C#, I had the opportunity to port large chunks of our legacy application from C++ to Managed C++. We didn't gain security benefits, nor did we gain speed; we didn't lose any either. However, we gained a lot of maintainability, since we now have a single stack trace to deal with that bridges all of the languages that we have (now reduced to C# and C++ -- down significantly from when we relied heavily on COM).

    The fact that MS gave us that choice is wonderful. If we had been stuck using JNI (which I've had the unlucky opportunity to use), we'd not have made much progress at all.

  • by skraps ( 650379 ) on Saturday February 05, 2005 @01:33AM (#11580511)
    That is just because of the auto-boxing. The CLI specifies primitive types like
    bool, int, uint, int16, uint16, int32, uint32, int64, uint64, char, and byte.
    These are quite distinct from their corresponding boxed types in the System namespace,
    Boolean, IntPtr, UIntPtr, Int16, UInt16, Int32, UInt32, Int64, UInt64, Char, and Byte.
    The "box" and "unbox" CLI instructions allow for the translation between the two sets above. There is nothing automatic about this at the CLI level.

    The reason you've probably never seen it, and the reason your code snippet works, is because C# is smart enough to automatically insert box and unbox instructions where appropriate.

    If you want to prove this to yourself, try reading the ECMA Standard 335, which covers the topic.
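    Java 1.5 grew the same kind of compiler-inserted boxing, and the boxed values really are distinct objects. A small sketch of the Java-side analogue, not the CLI itself (class name invented for the example):

        // Sketch: autoboxing wraps the primitive in a java.lang.Integer behind your back.
        public class BoxingDemo {
            public static void main(String[] args) {
                Integer a = 1000;            // compiler inserts Integer.valueOf(1000)
                Integer b = 1000;            // a second, distinct boxed object
                int c = a;                   // compiler inserts a.intValue()

                System.out.println(a == b);      // false on a stock JVM: two different objects
                System.out.println(a.equals(b)); // true: same numeric value
                System.out.println(c == 1000);   // true: plain primitive comparison
            }
        }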

  • by Unordained ( 262962 ) <unordained_slashdotNOSPAM@csmaster.org> on Saturday February 05, 2005 @02:01AM (#11580623)
    Why do people think of Pascal as inherently safe? If you want to mess with pointers, it's really rather easy. Stick 'em in a variant (union) record, do your arithmetic against another named integer, and there you go. I mean really ...

    "Object-oriented programming" is ill-defined. It encompasses a lot of languages that go about it in entirely different ways. To me, the most it can mean is "calling functions with an assumed this pointer." What does OO mean to you? Virtuals? What makes it "real"?

    There are benefits to C beyond speed and direct access to memory, hardware, etc. People seem to forget that for us to make software "work together", calling conventions across libraries need to be compatible. Which is why we picked C calling conventions. It's not necessarily the most expressive if you're into fancy things, but it is flexible enough for most everything. My main problem with Java isn't the language -- it's the libraries. Lots of them, packaged in their own special way, not really designed for use by any language.

    Languages, most of the time, aren't the issue. We haven't gained all that many 'new' features with new languages, at least not anything we can't easily live without. Access to symbols is an issue, however, and a really important one from the point of view of integration, code re-use, and even making sure you're using trusted/proven code.

    Regardless of buffer overflows, you can still write infinite loops, incorrect logic, etc. in just about any language. These language wars are about markets -- they're about money.
  • Re:Phew! (Score:3, Interesting)

    by Brandybuck ( 704397 ) on Saturday February 05, 2005 @02:08AM (#11580659) Homepage Journal
    All programming mistakes are security holes, because any software that doesn't behave as intended is a security hole. They might not all grant root access to random passersby, but they are security holes nonetheless.

    Anecdote time. After five years of working on a million+ line C/C++ codebase, I ran across my first buffer overflow last Monday. I've seen many potential buffer overflows (and fixed them when I found them), but this was the first I've seen actually get thrown over the wall to QA.

    If buffer overflows are getting past your unit tests, it's because you're not writing proper unit tests. Using a language as a substitute for proper testing is pathological. If a tenth of the energy spent proselytizing for certain languages were spent on proselytizing for correct software testing, we wouldn't have this problem. Of course, I don't always do unit tests myself, so remember to do what I say and not what I do...
  • by StarsAreAlsoFire ( 738726 ) on Saturday February 05, 2005 @04:16AM (#11581057)
    I call bullshit.

    If 'greenhorn' C++ developers can make an app that is even ONE PERCENT faster, then the Java Developers WERE NOT 'highly skilled'. Period. But TWICE as fast? As in, C++ takes 1/2 the time to execute 'x' as the Java version? No way. Not even if we are talking linear algebra code* [home.cern.ch].

    An experienced Java programmer knows you have to manage memory in large apps. Yes, Java will *always* use more memory than an equally well written C++ app; however, unless you are working *exclusively* with *huge* arrays, Java will be damn near as fast as, and often faster than, equivalent C++ code. Hell, if those arrays have to be collected in C++, Java will be faster. *UNLESS* you optimize the living crap out of your C++ code and ignore optimizations on your Java code.

    I have *plenty* of issues with Java (for instance, who the f*ck decided on a 64MB default max memory space for the JVM?). Speed has always been one of my Java PLUS points.

    * A quote for the lazy:
    "For example, IBM Watson's Ninja project showed that Java can indeed perform BLAS matrix computations up to 90% as fast as optimized Fortran."
  • by Anonymous Brave Guy ( 457657 ) on Saturday February 05, 2005 @02:21PM (#11583779)

    Java has several pretty fundamental disadvantages when it comes to serious numerical work, compared to a language like C or C++.

    The most obvious is the "everything is an object" principle. If you can't create value types for things like vectors or complex numbers, you're imposing performance overheads for dereferencing before you even start doing any maths.

    Moreover, serious maths work often involves large data sets. We work with graphs with many thousands of nodes pretty routinely, which can make fine control of how much memory each node occupies very important even on powerful workstations with lots of RAM. When you're constrained to do everything using indirection and using a limited set of primitive types, this is difficult to impossible.
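    To put that in concrete terms (the Complex class below is invented for the example): storing complex numbers as objects means one small heap allocation per element, reached through an array of references, where C or C++ would lay the same data out flat. The usual Java workaround is to flatten it by hand:

        // Sketch: per-element objects versus a hand-flattened primitive array.
        public class ComplexLayout {
            static class Complex {           // hypothetical value-like class
                double re, im;
                Complex(double re, double im) { this.re = re; this.im = im; }
            }

            public static void main(String[] args) {
                int n = 100000;

                // Object layout: n separate heap objects, each behind a reference.
                Complex[] boxed = new Complex[n];
                for (int i = 0; i < n; i++) {
                    boxed[i] = new Complex(i, -i);
                }

                // Flattened layout: one allocation, re/im interleaved, no per-element dereference.
                double[] flat = new double[2 * n];
                for (int i = 0; i < n; i++) {
                    flat[2 * i] = i;
                    flat[2 * i + 1] = -i;
                }

                System.out.println(boxed[42].re + " == " + flat[84]);
            }
        }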

    Then of course there's Java's floating point requirements, which were technically impossible to meet for a long time IIRC. I'm not sure whether they've been fixed even today, but certainly if you require a VM to do a manual series expansion to calculate trigonometric functions according to your strict requirements, while everyone else is using a single FPU instruction and getting an answer that is either identical or off by one in the last binary place, you are not going to be winning in the performance stakes.
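    For what it's worth, Java 1.2 made the strict behaviour opt-in rather than mandatory: by default the VM may use the FPU's extended exponent range for intermediates, and you only ask for bit-exact reproducibility where you need it, via the strictfp modifier. A minimal sketch (class and method names invented for the example):

        // Sketch: strictfp requests bit-for-bit reproducible IEEE 754 double arithmetic
        // for this class only; everything else keeps the (faster) default semantics.
        public strictfp class StrictFpDemo {
            static double scaled(double x) {
                // Under strictfp the intermediate may not borrow x87's extended exponent
                // range, so any overflow here happens identically on every platform.
                return (x * 1.0e308) / 1.0e308;
            }

            public static void main(String[] args) {
                System.out.println(scaled(4.0)); // Infinity under strict semantics
            }
        }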

    The bottom line is that the same things that are strengths for safety/security in general applications -- lack of low-level control and banning dangerous primitive constructs -- can be huge weaknesses when those are necessary to achieve an acceptable result in the real world.

    Despite Sun's propaganda, I suspect C and C++ are still considerably more portable than Java. I don't know how you managed to get code only compiling on one machine on one platform. We routinely build our code on something like 15 different compiler/platform combinations, with many more having come and gone in the past, and anything not building on any platform is usually an old compiler failing to support a standard feature properly so we rewrite that code to work around the problem. Java's "perfectly portable" floating point requirements might be an advantage in this area -- we do occasionally see very minor discrepancies in the outputs on different platforms -- but I don't see Java as an advantage for actually compiling your code across different platforms.

  • by DavidHopwood ( 853221 ) on Sunday February 06, 2005 @01:38AM (#11587995)
    .NET and Java are both insecure, because they both rely on too much code written in unsafe languages.

    If you want to implement a system based on language-level security using a mixture of code in safe and unsafe languages, as little as possible of the system must be written in the unsafe language(s), and that part must be treated as being in the system TCB.

    Some unsafe code is unavoidable if you want the system to be able to use OS facilities on Windows and Unix. However, it must be written by people who know how to write secure code, and gone over with a fine-tooth comb for possible weaknesses.

    It is completely disingenuous for either Microsoft or Sun to claim that these platforms are secure, given that their implementations depend on millions of lines of unsafe-language code that no-one is ever going to review properly. Even more so since both .NET and Java allow any arbitrary application to load unsafe code.

    So basically, Gosling's argument is correct: .NET will never be secure with its current architecture. Neither will Java. I am personally convinced that language-based security can be made to work (using a capability security model), but not the way Microsoft or Sun are doing it.
