Security Flaw Found That Allows Control of iPhone
i_like_spam writes "The NYTimes is running a story about an iPhone flaw that has been found and documented by researchers from Independent Security Evaluators. Attackers were able to gain full control of the iPhone either through WiFi or by visiting a website with malicious code. The exploit will be demonstrated at BlackHat on Aug. 2nd at 4:45pm. Until then, 'details on the vulnerability, but not a step-by-step guide to hacking the phone, can be found at www.exploitingiphone.com, which the researchers said would be unveiled today.'"
no wonder they don't allow programming the thing (Score:3, Insightful)
Evidently (and, I suppose, not surprisingly), an OS X-based phone lacks these safeguards. I guess that's the real reason Apple has been refusing to permit third-party phone apps on the iPhone, even though they don't cause problems on other phones: the iPhone software architecture just doesn't seem designed for it.
Re:The technical paper is the article (Score:4, Insightful)
*sigh* I hate this argument.
Why not just disallow anyone incapable of avoiding buffer overflows from ever programming a device? It's not a language issue. I'm sure that unqualified folks will figure out how to cause all sorts of new vulnerabilities in "safe" languages. (Hint: there is no such thing as a "safe" language.)
I guess maybe I'm just in the "old school" camp that thinks that unqualified people shouldn't be allowed to do things. There's a reason why, for instance, amateur carpenters aren't allowed to construct public buildings.
Maybe we should actually require people to be licensed programmers... While that would weed out some of the problem, though, we all know that professional engineering licenses or whatever don't guarantee competence...
</rant>
Re:The technical paper is the article (Score:4, Insightful)
First off, it's not true. Even experienced programmers make mistakes, and I've had issues where the API I was writing to was incomplete, and my implementation and the other guy's implementation were just different enough that, in the right circumstances, you could have a problem. That stuff happens. It's always going to happen.
Second, in my experience, no C programmer ever thinks that they make these mistakes; it's always crappy programmers from somewhere else. At some point, you have to own up to the fact that it's a problem of the language and that, as long as you use the language, you're going to have the problem.
Now, if we could find something that ran with the performance curve of well-programmed and tuned C, but without C's baggage of errors, leaks, and overflows, this would be a no-brainer. Unfortunately, that language does not exist, and hardware hasn't yet reached a level that makes language performance irrelevant for consumer applications.
So we have to put up with C's issues. But if a new language is developed, or if processors keep getting faster, the day will come when C joins Fortran as one of those languages that really only academics need to know, and it will be because of crap like this.
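For anyone unfamiliar with the bug class being argued about, here is a minimal sketch (function names are illustrative, not from any real codebase): C happily lets you copy past the end of a fixed buffer, while a bounds-checked copy merely truncates.

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: the classic unchecked copy. If `src` is longer
 * than 15 bytes, strcpy writes past the end of buf -- the exact class
 * of bug behind most buffer-overflow exploits. */
void unchecked_copy(const char *src) {
    char buf[16];
    strcpy(buf, src);           /* no bounds check at all */
    printf("%s\n", buf);
}

/* A bounds-checked variant: snprintf never writes more than dstlen
 * bytes and always NUL-terminates, so the worst case is truncation. */
void checked_copy(char *dst, size_t dstlen, const char *src) {
    snprintf(dst, dstlen, "%s", src);
}
```

The point of the debate above is that nothing in the language stops you from writing the first version; the second is a discipline the programmer must remember to apply.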
Comment removed (Score:2, Insightful)
Re:The technical paper is the article (Score:3, Insightful)
Don't be silly! (Score:4, Insightful)
Java did not exist when NeXT chose Objective-C as their development language. Objective-C was arguably the most technically advanced language for application development at the time. It was (is!) cleaner and more object-oriented than C++.
C-like languages may increasingly have less merit for application development today, but they certainly still have their place. You and I know little about the iPhone. It may indeed be the case that running a JVM on it for all apps is a poor choice.
Re:Excellent! (Score:2, Insightful)
Mac zealots have long argued that Windows sees far more viruses and malware, and that Macs are therefore superior.
The truth is that all devices and operating systems are hackable; what's required is the motivation to do so, and a large userbase is a huge part of that motivation.
Re:The technical paper is the article (Score:4, Insightful)
Your post highlighted all the dangers of ad-hoc programming.
Personally, I've seen a lot of "experienced" C programmers but have met only a few good C programmers. I don't blame the language; instead I blame the programmer who gives in to temptation and thinks they can just quickly code an application on the fly without any planning. A decent analogy: while a large number of people are familiar with the English language, only a small percentage of them can write a decent novel.
I am a C/C++ programmer, and I believe that I am capable of making even the dumbest mistakes when coding (especially during an exceptionally long programming session). This is why I test! I write my unit tests with the intent of making sure my procedures error out correctly, as well as verifying that they work correctly. I put special effort into data objects that are prone to buffer allocation errors.
I also design the overall program ahead of time, define the interfaces between modules, and go over expected application behavior with the test engineer.
I go through this because (1) the only true way to make quality software is to put forth the effort to design, test, and maintain the code from the beginning, and (2) I never expect any language to handle errors correctly.
Number 2 is important, since even if a higher-level language does handle out-of-bounds conditions, does it handle them in a way that doesn't cause a problem somewhere else in the data chain?
For an (off-the-top-of-my-head) example: if I had a string buffer that automatically grew to 518 bytes to accommodate an erroneous data entry, and it was stored in a database record that only holds 512 bytes, does it truncate the data or does it post a warning? Remember, this error wasn't caught by the programmer, so are we to assume that the programmer would check for it when it came time to place the data in the database? Or what if the application simply threw an out-of-bounds exception? If this wasn't tested for during production, then it could happen in the field. In my line of business, it must be caught during production.
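A hedged sketch of the kind of check being described (the name `store_record` and the 512-byte record size are hypothetical, chosen to match the example): the store routine reports truncation explicitly, so a unit test can assert on both the stored data and the warning, instead of the mismatch surfacing later in the data chain.

```c
#include <string.h>

#define RECORD_SIZE 512   /* hypothetical fixed-width database field */

/* Copies `input` into a fixed-width record. Returns 1 if the input
 * had to be truncated to fit, so the caller can log a warning or
 * reject the entry instead of losing data silently. */
int store_record(char record[RECORD_SIZE], const char *input) {
    size_t len = strlen(input);
    int truncated = (len >= RECORD_SIZE);
    if (truncated)
        len = RECORD_SIZE - 1;
    memcpy(record, input, len);
    record[len] = '\0';
    return truncated;
}
```

A test would then feed it both a normal entry and an oversized one, and assert that the oversized case is flagged rather than silently clipped.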
The reason *I* use C and C++ is that I need the speed that comes from not having the language do all my thinking for me. If *you* think C has issues, then maybe *you* need to use another language more appropriate for your requirements and your skill set. BTW, my colleagues would take issue with your Fortran comment too!
I am not saying C/C++ is the best language for every situation. I am just saying that I use C/C++ because of performance and space requirements. I am also saying that no high-level language is a good substitute for good programming practices.
The issues related to this story require even more effort in testing. We are not trying to catch simple runtime errors; we are trying to stop someone who intends to break the security of a program. That falls outside the realm of high-level languages, even given the issue that caused this particular "breach."
For another (off-the-top-of-my-head) example: even the best programs can fall victim to a compromised configuration file. So effort must be made to give an application the minimum security privileges necessary to do its task *and* to make sure that its configuration files cannot be modified without higher privileges.
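A rough POSIX sketch of that last point (the policy and the function name are my own, not from the article or the post): refuse to trust a configuration file that anyone other than its owner can modify.

```c
#include <stdio.h>
#include <sys/stat.h>

/* Hypothetical policy check: a config file is trusted only if it is a
 * regular file and is not writable by group or other. A lower-privileged
 * attacker then cannot swap in a modified configuration. */
int config_is_trusted(const char *path) {
    struct stat st;
    if (stat(path, &st) != 0)
        return 0;                          /* missing or unreadable */
    if (!S_ISREG(st.st_mode))
        return 0;                          /* not a regular file */
    if (st.st_mode & (S_IWGRP | S_IWOTH))
        return 0;                          /* writable by non-owner */
    return 1;
}
```

In practice this check would run at startup, before the application drops whatever elevated privileges it needed to read the file.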
Re:The technical paper is the article (Score:3, Insightful)
Because that's not realistic.
If you were to group bugs into two categories, "buffer overflow bugs" and "other bugs", the C programming language makes it possible for developers to code both kinds of bugs. If, by using a safer programming language, you can at least eliminate "buffer overflow bugs", why wouldn't you want to do that?
(Hint: I never said there was such a thing as a "safe" language. I said "safer".)
Yeah, right. That'll happen, right after every little girl on the planet gets her very own pony, and we all slide down rainbows to work, instead of commuting.
As an experienced C programmer, I'm not just sitting around bashing C for fun. I'm just recognizing the fact that as our world becomes increasingly networked, security becomes increasingly important. If we can use alternative languages that minimize whole categories of severe bugs such as buffer overflows, we need to consider using them.
Good job there's no SDK, eh? (Score:4, Insightful)
this is backwards (Score:3, Insightful)
The main differentiating feature of the iPhone software is that it is a brand-new GUI designed specifically for a portable communicator. It is specifically and closely tailored to the physical attributes of the device and the applications it contains, and merges the software functionality with the physical functionality into a seamless whole. Its very success is tied to this integration and to the fact that it does *not* simulate a desktop environment.
The main differentiating feature of WinCE/Windows Mobile (and its most touted feature at release) is that it *was* built as a clone of the Windows desktop, complete with Start button and taskbar. Its failures as a mobile OS are directly proportional to the degree to which it tries to emulate a Windows desktop. It even forces users to manipulate the tiny screen of the mobile device in the same way they would a desktop (albeit a very small one), by necessitating the use of a tiny "pick" (stylus) to emulate the mouse on the reduced scale of the mobile desktop.
Re:Excellent! (Score:4, Insightful)
Userbase isn't what makes a target "big." It's potential media exposure. The iPhone hype makes it a key target. The closed nature of the iPhone means lots of talented people are trying to break in. The iPhone, moreover, probably isn't as secure as a desktop computer. It's not designed to be open, and from what I understand about this particular hack, it builds on previous "doors" that can easily be closed. It's not surprising that once you've gained access to a device, you'll be able to manipulate it (particularly when they're all configured with the same username and password).
There's plenty of media attention given to every half-assed attempt to break OS X. So far, nothing worth losing sleep over. You don't think that the media attention for OS X would be worth the payoff? Linux might have some protection because it's somewhat obscure and not mainstream--but while it might not show up on CNN, people here certainly would take note, and so would large corporations using Linux servers.
Re:The technical paper is the article (Score:4, Insightful)
Now, not trying to be snarky or anything, honestly, but how often do you find that you run out of time on projects? Do you ever find that being so careful in your coding means that you get hounded for being late or behind schedule?