There have been quite a few exploits in Tails.

I suspect you're better off with a more obscure project, because then your adversary is less likely to have a 'ready to go' exploit.



Wouldn't that be security through obscurity? That's bad security and a good way to get exploited. I thought that having more eyes on a system made it more secure, because more people find the exploits.


It depends. Monocultures are also bad for computer security, since the failure mode is catastrophic.

Ideally, there would be a few Tails-style projects competing with each other (there are; see sibling threads), and the internet would be more federated (for instance, if GitHub is completely compromised right now, many people reading this will git pull malware in the next day or so).


Also, if you’re rolling your own, you’re much more likely to fall behind on updates and miss patching everything that comes up.


Depends how you roll your own; something lightly modified from a "normal" distro can still take upstream package updates, which puts you in a good spot.
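For instance, a minimal sketch (assuming a Debian bookworm-based respin; /srv/my-overlay is a hypothetical flat repo holding your own modified packages): keep the upstream security repo in your sources and let unattended-upgrades apply its patches, so your customizations never block upstream fixes.

    # /etc/apt/sources.list -- upstream repos stay, your overlay sits alongside
    deb http://deb.debian.org/debian bookworm main
    deb http://deb.debian.org/debian-security bookworm-security main
    deb [trusted=yes] file:/srv/my-overlay ./

    # /etc/apt/apt.conf.d/50unattended-upgrades (excerpt) -- pull upstream
    # security updates automatically
    Unattended-Upgrade::Origins-Pattern {
            "origin=Debian,codename=${distro_codename}-security,label=Debian-Security";
    };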


"Many eyes" is a failed philosophy. Even if many people could, theoretically, look at the code few actually do as evidenced by the Heartbleed defect in OpenSSL. One of the most critical pieces of software, used by literally billions of consumers and basically every trillion dollar company, and they missed glaring coding errors that any basic static analyzer would automatically tag. Nobody was looking at even some of the most critical code. The first failure is that you need people actually looking, which basically requires being paid to do full-time work (as most work on Linux is these days).

In addition, even if people are looking, finding defects is really hard. A random onlooker has basically a 0% chance of finding most of the critical zero-days afflicting Linux. It takes weeks to months of dedicated effort by technical experts with domain knowledge to find most such bugs. "Many eyes" is worthless to security; what you need is many trained technical experts with domain knowledge using high-quality techniques and processes derived from successful high-security projects.

This is not to say that "security through obscurity" is a good thing or that "open source" has no impact. Open source and open development do have a large impact; it is just mostly on your ability, as a random third party, to trust the auditing/security process, not on the security itself. The security itself demands focused technical ability, while the ability to trust the security claims derives from a technical evaluation by a technically competent, trusted party. The easiest way to get that, if you are technically competent, is to do it yourself. However, few people have that sort of time, so you farm out the work. If you are a big company or a government, you can usually get access to the source code under appropriate contractual protection and have your own technical staff (technically competent, trusted party) do the evaluation. If you are a smaller company, you might not have any technical staff suited to the task, so you farm it out to a testing body (technically competent) that can probably be trusted since you are paying it.

However, if you are just some random person, you do not have the money to pay for an evaluation, and you have no way of knowing whether "Totally Not the NSA Certification Company" can be trusted. So your best bet is inherent transparency and hoping that the unaffiliated lookers are, on average, not your enemy and technically competent. This is an okay option if you do not have access to better choices, and certainly better than nothing, but it is a far cry from the other options, where you have real control, incentive alignment, and insight into the auditing process. Only an organization incompetent at security would not use one of the better options for critical dependencies. Unfortunately, basically every large commercial IT organization, such as Google, Microsoft, Apple, Amazon, CrowdStrike, etc., is incompetent at security, and none of them actually evaluate their dependencies or do any meaningful third-party certifications.

Funnily enough, this means my advice is practically useless, because everybody's security is completely untrustworthy. Your only hope is "many eyes", because that is the only way to get any trustworthy audit at all. In the physical industries you have standards and certification bodies worth more than the paper they are written on, but in software security everything is total snake oil and you should only believe what you can see for yourself. Hope that helps.


As always, depends on the threat model.


Security through minority actually.


This has been argued before: https://medium.com/@thegrugq/tor-and-its-discontents-ef51648...

I think this is somewhat sarcastic, but the article goes as far as saying "[Tor Browser Bundle] is the only reason that FireFox is a valuable target." Firefox's sandboxing has improved since then, though I don't think it's as good as Chromium's.



