Gosling Claims Huge Security Hole in .NET
renai42 writes "Java creator James Gosling this week called Microsoft's decision to support C and C++ in the common language runtime in .NET one of the 'biggest and most offensive mistakes that they could have made.' Gosling further commented that by including the two languages in Microsoft's software development platform, the company 'has left open a security hole large enough to drive many, many large trucks through.'" Note that this isn't a particular vulnerability, just a system of typing that makes it easy to introduce vulnerabilities, which, last time I checked, all C programmers deal with.
Phew! (Score:5, Funny)
Oh. Never mind!
Re:Phew! (Score:2)
Re:Phew! (Score:3, Insightful)
Re:Phew! (Score:4, Informative)
Re:Phew! (Score:3, Insightful)
Fair enough, but at least 90% of the stuff written in C and C++ doesn't need to be.
Re:Phew! (Score:5, Insightful)
I mean seriously, this is like claiming ASSEMBLY is a worthless insecure language because you can hang the system while in supervisor mode, due to ineptness? Sheesh.
Re:Phew! (Score:5, Interesting)
Re:Phew! (Score:3, Informative)
I'd like to point out that in this day and age most C programmers have heard about the problems and make some effort to prevent them. Programmers in "safe" languages (VB), on the other hand, generally have not heard of these problems, so while such bugs are harder to create, those programmers are also less likely to recognize them. In fact, problems in C are generally minor mistakes that are easy (though tedious) to fix, while in the other languages the same problem tends to be a major design-level issue that is hard to correct.
Re:Phew! (Score:3, Funny)
Just imagine how secure the world would be if we wrote everything in PHP! :)
Re:Phew! (Score:3, Insightful)
Re:Phew! (Score:5, Insightful)
For most applications assembly is a worthless, insecure language, and you should stick to a higher-level language if you don't want to introduce problems (for anything larger than a trivial program, probably including "hello world").
Re:Phew! (Score:4, Insightful)
Are they really that high grade then?
Re:Phew! (Score:5, Insightful)
Re:Phew! (Score:3, Funny)
Give a man a gun, and he can kill many people with it.
Give that same man a pencil and... eh, not so much.
Re:Phew! (Score:5, Insightful)
That's just restating the question.
If managed languages make a certain class of exploits impossible or very unlikely while C doesn't, then C is insecure relative to those languages.
A good C programmer might be able to cut the exploit rate down to some very small value, but they're going to work pretty hard to get to that point while people in managed languages get it for free. And good C programmers still fuck up sometimes.
Of course, there's other ways to screw up. No language is immune from security problems. Using a "managed" language is nothing more than risk management, but it's pretty effective.
Re:Phew! (Score:5, Insightful)
Re:Phew! (Score:3, Insightful)
It all comes down to bad OS design in general. Take the IE exploits, for example. Why the heck can you get so much system access through an exploit in a web browser?!? Let's be honest here, the security model employed in most of today's OSes is mind-boggling in its ineptness.
Linux is not immune either.
Re:Phew! (Score:3, Interesting)
Anecdote time. After five years of working on a million+ line C/C++ codebase, I ran across my first buffer overflow last Monday. I've seen many potential buffer overflows (and fixed them when I found them), but this was the first I've seen actually get thrown over the wall to QA.
If buffer over
Different != Better (Score:4, Insightful)
- segfault in C
- throw an unhandled exception in python
- churn up 2GB swap in java (or something similar)
In any case, think DoS. The solution is to program competently, regardless of language.
Re:Phew! (Score:3, Insightful)
Buffer overflows are not the only kind of bug that plagues development. Quite a few "plain old logic errors" or "insecure designs" are a source of problems.
I mean I just reinstalled pam last night [for the second day in a row... diff versions] with maybe 20 patches applied to it. I doubt all 20 [or any at all] were due to buffer overflows.
A proper programmer would do proper bounds checking on their own [e.g. I need to store N bytes, do I have N bytes available]. People w
Re:Phew! (Score:3, Insightful)
It would help if C automatically initialized RAM (and I've known at least one C implementation that did, but it isn't part of the language spec), just as it helps not to be able to use 0 as a pointer address. But the only thing that would make C safe would be to put it in a box and not let it look outside. Even then you'd get buffer overflows within the code, but if
Re:Phew! (Score:5, Insightful)
These languages are inherently insecure because they allow for mistakes that other languages do not allow for. Combine this with the fact that it doesn't take a fool to make mistakes and you have the perfect proof that C is inherently insecure. Refute either of those arguments and you might have a point.
The problem with C is that it takes the inhuman capability of not making mistakes to end up with secure software. That's why all of the long-lived C software projects have to be constantly patched to correct mistakes. Buffer overflows are no accident; they are the predictable result of using C. Use C -> expect to deal with buffer overflows, memory leaks, etc. Every good C programmer knows this.
The difference between good C software and bad C software is that in the latter case the programmers are surprised whereas in the first case the programmers actually do some testing and prevention (because they expect things to go wrong). Mozilla Firefox comes with a talkback component because the developers know that the software will crash sometimes and want to be able to do a post mortem. The software crashes because the implementation language of some components is C. IMHO mozilla firefox is good software because it crashes less than internet explorer and offers nice features.
Of course we have learned to work around C's limitations. Using custom memory allocation routines, code verifiers and checkers, and extensive code reviews, we can actually build pretty stable C software. The only problem is that C programmers are extremely reluctant to let go of their bad habits. Some actually think they are gods when they sit down behind their editors and do such stupid things as using string manipulation functions everyone recommends avoiding like the plague, trying to outsmart garbage-collecting memory allocators by not using them, etc. If you built a switch into the compiler that enforced the use of the improvements listed above, most of the popular C software simply wouldn't compile. But then it wouldn't be C anymore, because the whole point of C is to allow the programmer to do all the bad things listed above, even accidentally.
IMHO programmer reeducation is an inherently bad solution to inherent security problems of the C language. You can't teach people not to make mistakes. You need to actively prevent them from making mistakes. You can make mistakes in other languages but they are a subset of the mistakes you can make in C. Therefore you should use those languages rather than C unless you have a good reason not to. Any other reason than performance is not good enough.
Rule for selecting programming language: (Score:3, Insightful)
Rule:
If you don't need to spend 5-10% of your development time speed/size-optimizing your program to make it usable, you are not using a language/abstractions that are high-level enough for your task.
Explanation:
If I use a high-level language (say Haskell/OCaml/Clean/Common Lisp) and use all its abstraction powers, the program code will usually be 10-50% of the size compared to
Re:Phew! (Score:3, Insightful)
Re:Phew! (Score:3, Funny)
What Visual Studio
LK
Re:Phew! (Score:4, Insightful)
I feel confident in predicting that most of userland will eventually run in the context of some virtual machine or other. Of course, that doesn't exactly make me a prophet, since that's the plan for Longhorn, but I think it will become the norm on other platforms as well.
It would be nice if, in the long run, operating systems became irrelevant when it comes to choosing applications. You go with whatever has the best track record for speed, security, or whatever, and then just choose whatever applications you like. Since the virtual machine runs everywhere, so will your software.
Re:Phew! (Score:3, Insightful)
Advertisement? (Score:5, Insightful)
To me, it sounded like a big advertisement for Java.
It's the developer's decision to use unsafe code in the
A hunting rifle can be used to kill people. Does that mean the trigger should only work after inserting a valid and current hunting license?
Re:Advertisement? (Score:2, Insightful)
Re:Advertisement? (Score:5, Insightful)
C and C++ allow for buffer overflows. They allow for improper or intentional coding to cause software to try to violate memory space of other functions or programs. They allow for memory allocation without necessarily providing any cleanup later. In the hands of bad, sloppy, lazy, or malicious programmers these traits have always proven to be a problem time and again on many different platforms. This doesn't mean that these languages are the wrong tool; I'd argue that part of Linux's success is because the kernel and most of the GNU-implemented services are written in these languages, which are flexible. Too much flexibility for the wrong purpose leads to problems though, just as too much rigidity leads to problems when things need to be flexible.
Re:Advertisement? (Score:5, Insightful)
Re:Advertisement? (Score:3, Interesting)
Note that Microsoft's Java VM was not that safe. I am very curious if
Re:Advertisement? (Score:3, Informative)
Re:Advertisement? (Score:4, Informative)
Unless you're talking about anonymous inner classes (a different animal, typically with a different purpose altogether when it comes to design motivation).
This seems to have been a matter of designer taste on the part of Hejlsberg. 2.0 should bring anonymous methods, which aims to solve the same type of problems anonymous classes did, in a neater way.
Re:Advertisement? (Score:3, Insightful)
"Nested type" and "inner class" are, as far as I know, equivalent terms and the latter has been a common term used in both languages. However, this is the first time I know of someone who differentiates the two terms in practical use.
What makes the implicit "this.Outer" the essential feature of an "inner class" in your terminology? I'm also curious as to why you consider it such an important feature.
Once more this seems to be a matt
Re:Advertisement? (Score:5, Interesting)
In my current job, which involves quite a bit of C#, I had the opportunity to port large chunks of our legacy application from C++ to Managed C++. We didn't gain security benefits, nor did we gain speed; we didn't lose any either. However, we gained a lot of maintainability, since we now have a single stack trace to deal with that bridges all of the languages that we have (now reduced to C# and C++ -- down significantly from when we relied heavily on COM).
The fact that MS gave us that choice is wonderful. If we wanted to be using JNI (which I had the unlucky opportunity to use), we'd not have made much progress at all.
Re:Advertisement? (Score:3, Informative)
Re:Advertisement? (Score:5, Insightful)
Nobody is going to use C or C++ to write a completely new program under .NET. There are occasions where I might use C for something I wanted to make cross platform but no way would I ever go near C++.
Most people who are going to use the new .NET support are people who have legacy C programs and want to gradually transition them to the .NET base in stages. That makes a good deal of sense.
The other constituency is folk who are writing stuff that is almost but not quite at driver level.
Re:Advertisement? (Score:3, Informative)
Not saying this can't be defeated but there are tools in the languages to protect yourself.
Re:Advertisement? (Score:3, Informative)
Re:Advertisement? (Score:3, Informative)
Sigh. There are plenty of garbage collectors and bounds checkers for C/C++. Heck, there are several bounds-checking extensions/patches for GCC that cover both stack and heap. Boehm is a fine garbage collector if you like that sort of thing. They go back at least 10 years.
Combined with a non-executable stack and heap, randomized address-space layout, and signed return addresses (StackGuard XORs with a random value and verifies before return
Re:Advertisement? (Score:3, Interesting)
It's irrelevant, actually. A bug is a bug. You can make them in any language. The consequences of the potential bugs are what matters. But only the implementation defines what a "buffer overflow" will actually do. Granted you can try and write past some allocated buffer in C (and C++). That doesn't mean the write should actually occur. That's the responsibility of the implementation, and mostly of the underlying operating system. I already said that earlier: the major
Re:Advertisement? (Score:3, Interesting)
Pascal isn't safe either (Score:4, Interesting)
"Object-oriented programming" is ill-defined. It encompasses a lot of languages that go about it in entirely different ways. To me, the most it can mean is "calling functions with an assumed this pointer." What does OO mean to you? Virtuals? What makes it "real"?
There are benefits to C beyond speed and direct access to memory, hardware, etc. People seem to forget that for us to make software "work together", calling conventions across libraries need to be compatible. Which is why we picked C calling conventions. It's not necessarily the most expressive if you're into fancy things, but it is flexible enough for most everything. My main problem with Java isn't the language -- it's the libraries. Lots of them, packaged in their own special way, not really designed for use by any language.
Languages, most of the time, aren't the issue. We haven't gained all that many 'new' features with new languages, at least not anything we can't easily live without. Access to symbols is an issue, however, and a really important one from the point of view of integration, code re-use, and even making sure you're using trusted/proven code.
Regardless of buffer overflows, you can still write infinite loops, incorrect logic, etc. in just about any language. These language wars are about markets -- they're about money.
Re:Advertisement? (Score:2, Insightful)
DISCLAIMER: COMPLETELY OFF-TOPIC
I don't know what the law is, but if a hunting rifle can only be legally used for hunting, this is actually a pretty good idea. The card mechanism could also be used to enforce hunting seasons.
I realize this offends some people's sense of rights, but I'm not particularly inclined to defend somebody's "right" to use a firearm outside its
Re:Advertisement? (Score:3, Insightful)
Obviously.
but if a hunting rifle can only be legally used for hunting
A hunting license licenses the owner to take a certain type of game (deer season, etc) on certain land (assigned state land, private land, etc) during certain times (hunting seasons, obviously) with certain tools (shotgun only, bow, etc). It only grants this, in the case of firearms, to people who already legally own them. A "hunting rifle" is simply a subset of rifle suitable for a certain task (which var
Re:Advertisement? (Score:5, Insightful)
The article is heavy on sensationalism and short on content so it is difficult to tell what is actually being debated here, but I think that Gosling is claiming that support of C type handling in itself creates a chink in the armor of the CLR, regardless of any particular project's use of that feature.
Re:Advertisement? (Score:5, Insightful)
All use of unsafe features in
As far as I know, there is no example of unmanaged code that can violate the managed code type system, and
Also, this ignores that C/C++ support is much more complicated in
Frankly, this seems like a bit of sour grapes to me.
Re:Advertisement? (Score:3, Funny)
Me: "Oh, that would be, umm, James Gosling."
He: "No, that's not the name. It was a lady. Let me check
Me "Who?"
A web search revealed that Ms. Centoni's position was "Director of Java Marketing." Out of the mouths of babes come all wise sayings.
Re:Advertisement? (Score:3, Insightful)
A hunting rifle is fine for some purposes, but decorating your house with them is unwise. Java, effectively, has support for making
Re:Advertisement? (Score:3, Informative)
If that is the point, he's dead wrong.
Just like you can run Java code in or out of the sandbox, you can run
Re:Advertisement? (Score:3, Informative)
The developer can choose to do whatever he/she wants. The nice thing about
Woah (Score:5, Funny)
So you mean to tell me (Score:5, Insightful)
C'mon now. There is no vulnerability. Don't post this sort of crap. It's strictly knee-jerk material meant to bend a few people out of shape and start flames.
J2EE is great (for its target area)
Both are secure, stable, and reasonably fast if you are a GOOD programmer. ANYONE who does ANY C or C++ code that will be used in industry needs to ENSURE that they take a few extra precautions and are aware of secure coding techniques in both languages. It's really quite simple.
To sum it up: nothing to see here folks.
Re:So you mean to tell me (Score:3, Interesting)
The fact that the editors actually chose to point out the flaw in the argument (in MS' favor!!!), rather than adding to the sensationalism is a welcome and refreshing change.
Don't disagree with Microsoft... (Score:5, Insightful)
What a surprise! (Score:5, Insightful)
In Java, everything is an object! Oh...except for the basic types, you need to use object wrappers for those.
Re:What's the flaw again? (Score:3, Informative)
It speaks to the purity of the language. Being able to deal with *everything* as an object is a distinct advantage since it allows you to, potentially, extend basic types into more complex ones and it also prevents you from having to "box" primitives in objects and "unbox" them on the way out.
BTW, JDK 1.5 (aka Java 5) has a new feature called "autoboxing" which does the above boxing for you. This doesn't really count as those types being objects; it's more of a kludge.
Re:What's the flaw again? (Score:3, Interesting)
These are quite distinct from their corresponding boxed types in the System namespace,
The "box" and "unbox" CLI instructions allow for the translation between the two sets above. There is nothing automatic about this at the CLI level.
The reason you've probabl
Re:What's the flaw again? (Score:3, Informative)
It's ugly and non-orthogonal. Look at the Arrays [sun.com] class, for example; there are dozens of duplicated methods that are identical except that they take bytes, chars, shorts, ints, etc. as arguments. I'd much rather have everything be a true object; any performance issues can be handled by the compiler, runtime, or Moore's law. Autoboxing helps, but better to fix it for real than with syntactic sugar.
JNI (Score:3, Informative)
Re:JNI (Score:5, Informative)
It is completely fair to point out that
Pat Niemeyer
Author of Learning Java, O'Reilly & Associates
Re:JNI (Score:3, Insightful)
Java lets you write to the user's filesystem. Does that make it insecure? You could run a program to wipe out your hard drive!
But Java allows for a "sandbox". So does
A truck, eh? (Score:5, Funny)
Like, say, a truck about the size of Sun's Java runtime environment.
Java is a type-safe language at the VM level... (Score:5, Insightful)
To support C/C++ semantics (ad-hoc pointers) you'd have to throw all that out the window and I assume that's what he's talking about.
Pat Niemeyer,
Author of Learning Java, O'Reilly & Associates and the BeanShell Java Scripting language.
Re:Java is a type-safe language at the VM level... (Score:2)
The Microsoft CLR is also type-safe at the VM level. If you choose to use pointers in Managed C++, though, you lose any ability to assert heap access safety, and therefore must mark your code as unsafe, because you can perform pointer arithmetic.
you got it backwards (Score:4, Insightful)
You are kidding, right? Do you seriously believe Java is the first or only language to guarantee runtime safety? Safe languages are the rule, not the exception.
To support C/C++ semantics (ad-hoc pointers) you'd have to throw all that out the window and I assume that's what he's talking about.
C# distinguishes safe and unsafe code. C#'s safe code is as safe as "pure" Java code. You can think of C#'s unsafe code (or its equivalent in C/C++) as code linked in through the JNI interface, except that C#'s unsafe code has far better error checking and integration with the C# language than anything invoked through JNI.
Altogether, C#'s "unsafe" construct results in safer and more portable code than the Java equivalent, native code linked in through JNI.
Pat Niemeyer, Author of Learning Java, O'Reilly & Associates and the BeanShell Java Scripting language.
Well, then I suggest you learn some languages other than Java before making such ridiculous statements.
Re:Java is a type-safe language at the VM level... (Score:2)
Re:Java is a type-safe language at the VM level... (Score:4, Interesting)
You imply that compiled C code is faster than compiled C++ code, which IME is rarely the case these days. In particular, optimisations performed by C++ compilers have almost caught up with their C brethren. With almost perfect zero-overhead implementations of all the major C++-only language features now in common use and the added performance boost from things like inlined code in templates, the balance often tips significantly in C++'s favour now.
Can you give some examples of high quality numerical libraries written in pure Java (i.e., without JNI)?
Disclaimer: I'm a professional C++ developer, and I write high performance maths libraries for a living.
Re:Java is a type-safe language at the VM level... (Score:3, Interesting)
Re:Java numeric libraries (Score:3, Interesting)
Of the two "numeric" libraries mentioned on the website only one handles complex numbers and the implementation in java leaves much to be desired (relative to assembly or C). To my knowledge, the Lau Numerical libraries based on algol routines ar
Java has many potential drawbacks for numerics (Score:3, Interesting)
Java has several pretty fundamental disadvantages when it comes to serious numerical work, compared to a language like C or C++.
The most obvious is the "everything is an object" principle. If you can't create value types for things like vectors or complex numbers, you're imposing performance overheads for dereferencing before you even start doing any maths.
Moreover, serious maths work often involves large data sets. We work with graphs with many thousands of nodes pretty routinely, which can make fine c
Re:Java is a type-safe language at the VM level... (Score:5, Informative)
Garbage collection in Java has been faster than free/malloc in C for years. This is in large part due to the fact that the runtime can recognize very short lived objects and put them on a special part of the heap.
It's not necessary to use unsafe languages to get performance any more.
Pat Niemeyer
Re:Java is a type-safe language at the VM level... (Score:5, Informative)
You're quite right that Java's speed is excellent these days (for non-GUI code, at least). I've spent a lot of time recently working with a large system that was first implemented in Java (by highly skilled developers) and then ported to C++ (by greenhorns). The C++ port is only 50-100% faster, which isn't worth the price in developer time that's been wasted on memory leaks and other forms of memory corruption that were never a factor in Java. Besides that, supporting multiple platforms with the C++ version is the #definition of pain.
However, the C++ version uses only about 1/4 or 1/5 as much memory as the Java version, and starts up far more quickly. If a *desktop* application needs to be deployed on older machines, or if the application is so memory-intensive it taxes the limits of today's server hardware, Java still falls flat.
Re:Java is a type-safe language at the VM level... (Score:3, Interesting)
If 'greenhorn' C++ developers can make an app that is even ONE PERCENT faster, then the Java Developers WERE NOT 'highly skilled'. Period. But TWICE as fast? As in, C++ takes 1/2 the time to execute 'x' as the Java version? No way. Not even if we are talking linear algebra code* [home.cern.ch].
An experienced Java programmer knows you have to memory manage large apps. Yes, Java will *always* use more memory than an equally well written C++ app; however, unless you are working *exclusively* with *huge* arr
Re:Java is a type-safe language at the VM level... (Score:3, Insightful)
Re:Java is a type-safe language at the VM level... (Score:3, Insightful)
Re:Java is a type-safe language at the VM level... (Score:5, Interesting)
So you have to measure time per malloc and time per free, then total them up and compare it to GC's time per allocation and time spent in GC. In some cases, one will be significantly larger than the other, but in most nontrivial programs, using modern malloc/free and modern GC, it comes out pretty close to even.
Some argue that the "pause" from GC is a problem. Maybe. Except that as mentioned before, malloc can also "pause" for arbitrarily long times. And a lot of work has been done on "concurrent" GC that doesn't pause. If you can afford paging in from disk (swap file), you can also afford GC's "pause".
Finally, when you write a big program, you spend incredible effort in your program tracking memory. That takes cycles. "If x then save a copy cause we'll have to free it later, etc."
The bottom line is that there are some cases where GC still won't work, but those cases are getting smaller and smaller. For most cases, the argument that GC is slow or inefficient just isn't true. Go do some real benchmarks, or go study up on the already published benchmarks. GC is pretty efficient, and malloc/free has no significant speed advantage anymore.
malloc+free: fast, simple, and can be even faster (Score:5, Informative)
I don't know where you got your understanding of malloc, and especially free, but it's severely out of date. Knuth published about "boundary tags" no later than 1973 (citeseer is down, so no link). Saying that a coalesce operation "can be arbitrarily slow" is just FUD. A boundary tag makes free() a fast O(1) operation: check the previous block in memory to see if it's free and if so join it; check the next block in memory to see if it's free and if so coalesce; add the free block to the free list; done. Yes, it's not zero work like a GC implementation sort of is, but "arbitrarily slow"? It's basically at least as fast as malloc().
Allocation requests can hit disk, sure, but so can GC allocations even if they're just bumping a pointer: it all depends on the working set size. GC compaction can reduce fragmentation to reduce working set size, but that is only a big win if there's a lot of fragmentation, and most apps using a good malloc() don't exhibit that much. (It is also possible for a GC to rearrange memory so more in-working-set data is on pages together, reducing the working set page count without changing the total memory used. I don't know of any in-use implementations of this, since you need hardware support to know which objects are more in use; generally this is only available at the page level, where it's no help. I think maybe an early microcode-based Smalltalk implementation might have done this.)
If your malloc has to walk giant free lists to find an open block, then sure, that can be slow. That's why people use trees of free lists based on size and such to make it more O(log N), and O(1) for small allocations. (On large allocations, actually using the memory amortizes the cost.) Read about dlmalloc [oswego.edu], for example.
Furthermore, let's not misrepresent GC. Stop-and-collect GCs have obvious extra costs beyond the free-of-charge free (or lack of need for one). Incremental GCs that don't pause are usually slower overall and only preferred for interactive programs. For example, incremental GCs usually require "write barriers" or "read barriers", which add several extra instructions on every fetch from memory or every write of a pointer variable in memory. This can add up across the entire program. Incremental GCs also tend to be conservative, and only end up collecting things that, say, were garbage at the start of the most recent collection round, and generational collectors allow garbage to accumulate in later generations for some time, so they don't actually necessarily have a smaller working set than a non-leaky malloc()/free() program.
Another big win in non-GC systems is that you can use pointers that don't come off the heap. That way you can avoid allocation and deallocation and GC entirely. (You can actually do some of this in a GC system too if it's a 'conservative' GC that copes with pointers into the middle of blocks. Those pretty much only get used for adding GC to C and C++, though.) Here are some common ways this happens:
Of course, doing all these things requires that you balance your different types of malloc()s with the correct, matching type of free(). In practice, GC proponents overestimate the diffi
Unsafe code is called -- duh, Unsafe (Score:3, Interesting)
Why is this here? (Score:2)
What a well researched article! (Score:5, Insightful)
Hey, what about the keyword unsafe in C#? Sheesh.
It's called unsafe code for a reason (Score:5, Informative)
Applications that require safety (for example running plugins downloaded from the net) simply don't allow those assemblies to be loaded.
Where is the problem again?
Someone should tell (Score:2, Insightful)
Re:Someone should tell (Score:2)
Too funny. (Score:2)
In
B:
In Java you can write code that harms a computer and deletes files.
A:
You can write code in C#, in which case it is managed and helps prevent you from making stupid mistakes.
B:
You can write code in Java, in which case it is somewhat managed and helps prevent you from making stupid mistakes.
A:
Under
B:
Under Java you read Sun's doc
Ridiculous (Score:2, Interesting)
A
For example: The end user can grant unsafe permissions to the Microsoft Managed Di
Re:Ridiculous (Score:5, Insightful)
To me this looks like a similar problem as allowing running native code via ActiveX. Yeah, we have permissions, signing and what ever - how much does it take for a trusted but buggy ActiveX applet to be exploited?
Huge mistake, IMHO. And do not compare this to JNI - I am no Java expert, but AFAIK you simply cannot call JNI functions from something like a web applet by design, whereas here it is at the discretion of the app developer.
Why oh why (Score:4, Insightful)
It's the same with C. We should know by now "you cannot use C to handle untrusted data (ie, data from untrusted machines on the net)". All such data need to be handled in a sandboxed system, a system with safe memory access. This means something like Java or similar things.
A lot of people will make posts that say things like "C doesn't cause the problems, it's incompetent or lazy programmers who cause the problems." Whatever. No excuse. That's like saying "we shouldn't need seat belts or airbags; all we need is to make sure that drivers don't make mistakes." Drivers and programmers do make mistakes and that's why we need safety mechanisms in both cases. C provides none. Programming in C is like driving around in a car from the fifties, with no seat belts, no airbags, no head rests, no ABS.
So any decision to extend the use of C is just foolish. What is the purpose of doing this? If people must use horrible legacy code then just use it, but why drag that into new frameworks like .NET?
It does not compute, for me at least.
Re:Why oh why (Score:3, Informative)
So any decision to extend the use of C is just foolish. What is the purpose of doing this? If people must use horrible legacy code then just use it, but why drag that into new frameworks like .NET?
Managed C++ is basically a compatibility language. It exists to give developers an easy way to interface legacy C/C++ code with .NET code. By providing MC++, Microsoft is actually offering developers a way to migrate slowly to more modern languages (sorta like JNI with Java; imagine if you couldn't make
Homeowners!! Beware! (Score:5, Funny)
--The Elmer's Glue Foundation for Strength and Security
Beware the agenda (Score:4, Interesting)
Unstructured? Yes. A huge security hole? No more than any other language using COM objects. You can write crappy spaghetti code in any language. The type interface for
What Gosling is really criticising is the way
I believe Gosling is wrong (Score:5, Funny)
Gosling is dead wrong. I believe that Microsoft will soon prove they are capable of even bigger and more offensive security mistakes.
Also, the choice to actually use .NET is at least as big of a security error.
pots and kettles, you know... (Score:4, Funny)
oh. nevermind.
Yay, more rhetoric from Sun (Score:4, Informative)
In order to use "unsafe" code from managed C++ (or unsafe blocks in C#) you must have "FullTrust" security rights, otherwise the code fails to run.
You could shoot yourself in the foot, but the runtime is perfectly capable of detecting and coping with corruption of the managed heap (generally by shutting down the offending AppDomain). Of course you can write a COM component in C++ and call it from dotnet, which is in effect the exact same thing! (I dare you to try and stop me from trashing Java or dotnet once I'm loaded in-process via JNI or COM...)
CAS (Code Access Security) means that no other code can call your "unsafe" methods without FullTrust either, so there is no danger from code running off the web of doing this.
JNI is the same thing; Sun just gets to hide behind the lie because the risks aren't acknowledged by or integrated with the platform. At least with unsafe code the runtime is fully aware of the pointer voodoo magic you are trying to pull and can deal with it appropriately.
In other words Game Developer X can hand-tune the rendering algorithm inside the "unsafe" code areas, but develop the rest of the platform in fully managed code, making the development process much easier to write, test, and debug.
(As an aside, thanks to the antitrust ruling Microsoft is not allowed to comment on a great many things, including competitors. I don't know if this falls under that heading, but in many cases Microsoft's employees can't just come out and call bullshit when they see it for legal reasons.)
In conclusion: Sun should shut the hell up.
C programmer deals with it (Score:5, Insightful)
Yes, CowboyNeal, but do they want to deal with it, and should they deal with it?
For every programmer who reads security bulletins and keeps tabs on the latest string-copying buffer overflow issues and fundamental security principles, there are a hundred who don't know or care.
C is a high-level language that:
Programmers want to be productive -- most want to make colourful stuff happen on the screen, not fiddle around with buffer guard code. So the more security can be built into the language and its runtime environment, the better.
Many languages, such as Python or Ruby, provide protection against what I mention in my first bullet through a virtual machine. They're not impenetrable, and as dynamic languages they are of course subject to a different class of security holes (e.g., string evaluation of code), but they're a step up from the C level.
Other languages, like Java, provide capability-based security models, allowing for sandbox environments with fine-grained control over what a program may or may not do. Java's security system is ambitious, but since most Java apps run on the server these days it's not frequently used, and except for browser applets, Java code tends to run unchecked.
In a way, Java tries to do what the OS should be doing. Programs run on behalf of their human users, and their destructive power is scary. Why should any given program running on my PC have full access to my documents or personal data? We're entering an age of more and smaller programs, where the difference between "my PC" and "the net" is increasingly blurred. Operating systems need to evolve to distinguish between the different capabilities they can grant to programs, or processes -- we need to think of our programs as servants doing work for us by proxy.
The same way you wouldn't let a personal servant manage your credit cards, you don't want to let your program do it -- unless, of course, it was a servant (or, following this metaphor, program) hired to deal with credit cards, which introduces the idea of trust. The personal accountant trusts the bank clerk, who trusts the people handling the vault, who trust the people who built the vault, and so on.
In short, any modern computer system needs to support the notions of delegated powers, and trust.
Programmers will certainly never stop having to consider vulnerabilities in code. But painstakingly working around pitfalls inherent in one's language, be it C or indeed .NET -- we need to evolve past that. The users, upon whom we exert so much power, certainly deserve it.
Gosling Emacs security holes + spyware + malware! (Score:4, Funny)
Emacs has a notorious "shell" facility that can actually run a shell and send it arbitrary commands!!!
In fact, there's even a built-in scripting language called "Mocklisp" that enables hackers and viruses to totally reprogram the behavior of the editor (and it looks like Lisp, but without any of those confusing lexical closures and list processing functions).
Gosling Emacs is actually spyware, because it has a hidden "keyboard macro" facility that can spy on every character you type! Emacs is also malware, because at any point it can instantly undo any editing changes you've made!
One of the biggest most offensive mistakes is that James Gosling has not fixed these huge security holes in Emacs, after all these years. In fact, many of the security holes have been reimplemented in another notorious piece of communist spyware called Gnu Emacs!
All Emacs should be banned!!!
-Don
Gosling is right, but Java has the same problem (Score:3, Interesting)
If you want to implement a system based on language-level security using a mixture of code in safe and unsafe languages, as little as possible of the system must be written in the unsafe language(s), and that part must be treated as being in the system TCB.
Some unsafe code is unavoidable if you want the system to be able to use OS facilities on Windows and Unix. However, it must be written by people who know how to write secure code, and gone over with a fine-tooth comb for possible weaknesses.
It is completely disingenuous for either Microsoft or Sun to claim that these platforms are secure, given that their implementations depend on millions of lines of unsafe-language code that no-one is ever going to review properly. Even more so since both
So basically, Gosling's argument is correct:
Re:James Gosling is an expert in this area (Score:4, Interesting)
No, Java is not suitable (or useful) for what an engineer would call a "critical" application. Those applications are coded in C or C++ (or Assembler).
I'm using Java because that was the business decision made by my boss (or my boss's boss). So I'm just told what I have to do (what interface the user expects, what system I have to connect to, etc.). But for the company I work for, Java might be a critical part of their business plan.
For example, you won't find Java in a heart monitor in the hospital, but you'll probably find that the server keeping your health records is written in Java. Whoever is developing the health record system can (more or less) hand the code to a new developer to continue working on it without expecting the new guy to be an expert on that particular system.
Anyway, this could all be bullshit if sound coding practices are not followed, in ANY language.
Re:James Gosling is an expert in this area (Score:3, Insightful)