
He's got great skills, and NSA training is as good as it gets, but he explicitly violated the rule to not download and run code from a server, to see if the rule would be enforced. They enforced it, just as he'd known they would. There was no point to his doing that other than to get headlines.


No, he explicitly violated the rule in order to test the hypothesis that a security hole he'd uncovered would allow unsigned code to be downloaded after release into the app store and run on the device.
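The general pattern being tested is an app that fetches code from a server after review and executes it at runtime, so the reviewed, signed binary never contains the payload. Here is a minimal sketch of that pattern in Python with the network fetch simulated by a local string; this is an illustration of the concept only, not the actual iOS exploit, which involved native code and a code-signing bypass.

```python
# Sketch of the "download and run code at runtime" pattern (assumption:
# the fetch is simulated locally; a real app would pull the payload over
# HTTP from a server the developer controls).

def fetch_payload() -> str:
    # Stand-in for downloading unsigned code from a remote server.
    # App review only ever saw the innocuous host app, not this text.
    return "result = 2 + 2"

def run_unsigned(payload: str) -> dict:
    # Execute text that was never part of the reviewed, signed app.
    scope = {}
    exec(payload, scope)
    return scope

scope = run_unsigned(fetch_payload())
print(scope["result"])  # 4
```

The point of the demo was that once this works on a development device, nothing in the signing model stops it from working on a device that installed the app from the store.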

The sane response to this would be "Oh, we better fix that. Thanks. We're removing your app BTW." The Apple response was typical of a bureaucracy.


But he did more than just test it and remove the app from the store. He left it up there and people presumably were downloading it. Another Forbes article[1] says:

> But the researcher for the security consultancy Accuvant argues that he was only trying to demonstrate a serious security issue with a harmless demo, and that revoking his developer rights is “heavy-handed” and counterproductive.

But as he demonstrated in his YouTube video, it wasn't just a harmless demo: he had a shell he could run on the phone of anyone who downloaded his app.

Further, he didn't need to put the app up for sale at all except to perform his publicity stunt. The code signing requirements don't change when you're developing on your device locally: if the exploit worked in development, it would work on the store, so once the app was approved there was really nothing left to test. Of course he's a curious guy - I think we can all relate to and appreciate that - so he could have submitted the app, waited for approval, put it up for sale only long enough to try it out, and then removed it from sale again.

I don't agree that they should have terminated his account, but neither were they really that far out of line in doing so. I also don't think he would have opened a shell on anyone else's phone, but the fact remains that he had the ability to do so.

[1] http://www.forbes.com/sites/andygreenberg/2011/11/07/apple-e...


The lesson I would take away from this is that Apple should provide a mechanism for security vulnerabilities to be reported officially, so that researchers don't have to engage in these sorts of dubious activities. Whether they listen to the reports or not is another matter.

Anyway, is there any special reason why reporting via https://ssl.apple.com/support/security/ won't work?


Charlie is one of the founders of the controversial "no more free bugs" movement.

The amount of skill necessary to identify AND exploit bugs is so great that the bug reports themselves have value, far beyond attribution in the patch notes and a T-shirt. This is especially true when there is in fact a black market of bad people willing to pay good money for 0-day vulns.

Thus, reporting vulns that way doesn't necessarily make sense. Charlie's walking a fine line: he is not a BadGuy, but he also isn't giving away security consulting to companies with $200 billion market capitalizations. Apple should pay him good money to look at this stuff. Otherwise, it's going to be only BadGuys.


"Whether they listen to the reports or not is another matter." - It's kind of the point: the instinct of a bureaucracy that is not serious about security is just to keep things quiet in the belief that no noise means no problem. Schneier's excellent essay, "Full Disclosure of Security Vulnerabilities a 'Damned Good Idea'", observes that this reflex is in fact economically rational.

http://www.schneier.com/essay-146.html



