matheusalmeida's Hacker News comments

Are you serious? A well-designed ecosystem is much more important than trying to increase the number of gigahertz/megaflops every release. Just because a Corvette is more powerful and way cheaper than a Ferrari or a Porsche doesn't mean you'll have a better experience than if you'd bought one of the latter.


The iPad 1 has serious usability issues due to the limited amount of RAM. After experiencing this, I would be nervous about the lifespan of an iOS device with only 512MB. On some level, specs really do matter.


I'm sure this comment will be heavily downvoted, but I've read several times that the FBI and similar agencies are seven years ahead in terms of technology, including systems security.

I never believed those claims, and every time a situation like this happens I'm more convinced that they are no more advanced than the big companies (Microsoft, Mozilla...) and depend heavily on the updates those companies release every week... Just my 5 cents...


Just the statement that one's organization is "seven years ahead in terms of computer security" implies a fair amount of technological ignorance.

How can an organization be seven years ahead of everyone else? Do they have Norton Antivirus 17.0, whereas the rest of us are using 10.0?


Well, they could, for example, invent RSA years before the public knows about it (British intelligence did this). Or they could invent differential cryptanalysis before the public knows about it (the NSA did this).


Yes, in fact, it implies technological illiteracy. But I can imagine what kinds of people would say that.


If my experience working closely with local government IT also applies to the FBI (I would bet it does), then they, like most public-sector technology shops, are usually 10 years behind the private sector. They're mostly bored, undertrained people with no real incentive to do a good job, who would rather play office politics to improve their chances of promotion than learn what this PHP injection thingy is.

That's in contrast to the Hollywood scenes of ultra-high-tech rooms with floating transparent screens, shiny lights everywhere, and super-advanced systems that listen to your voice commands and instantly solve complex cases when you say "enhance!", which is probably the vision of the person who talked to you. Reality is more like Windows XP and programs that compile even when the unit tests fail.


How would you know if a unit test fails, if the program doesn't compile?


Hashing passwords (so you can never get back to the original password) has been used for decades.


Well, I would think the FBI would be the wrong agency for this to apply to, as they are really no more than normal police officers at the federal level.


I think people that say things like that are just terrified that no one has any idea what is going on. :)


There's no doubt that it's a very thorough book on software development, and a very good one, but I personally don't recommend it. If one really wants to learn software development, I find it much better to join an open source project and discuss the project's development with other people than to just read about software development. Otherwise it's almost like learning everything about an internal combustion engine while having no idea how to fix one.

In hindsight, perhaps I shouldn't have bought the Kindle edition of the book (Code Complete): it's a very large book, it's not the kind of book one must read cover to cover, and I don't find it very practical to read such books on a Kindle. But perhaps it's just me. Just my 2 cents.


I agree with this. As a senior, I followed Jeff's recommendation and bought Code Complete, but I found that a lot of it has become mainstream practice. A book that personally taught me more is Effective Java, because some of its sections go way beyond Java and touch on serious issues developers deal with, like how to structure and document APIs, when to use certain patterns, and so on.


I don't think I understand... On one hand you say that C isn't that complicated, but on the other you say you only use a small subset because you don't trust yourself and your colleagues to understand all the C quirks...

I think C is complicated, with its 191 undefined behaviors and 52 unspecified behaviors (the source for these numbers is the Csmith paper written by the author of the quiz). One must master C before being absolutely sure that one's code doesn't fall into one of those two black holes (undefined and unspecified behavior). For instance, almost everyone I ask tells me that dereferencing a NULL pointer yields 'Segmentation fault', but that's just not true. It's in fact undefined behavior.

The other thing is the strict aliasing rules... It's pretty much impossible to convert one pointer type to another without breaking some C rule. The experts say type punning is the way to go, but type punning relies on behavior that isn't fully defined either: reading a member of a union other than the last one written is undefined behavior.


I didn't say anything about a small subset. The rules I use are not a subset of all those special cases, they are broader rules that cover lots of potential pitfalls and are easier to remember. But C is definitely more complicated than it should be, no doubt about it.


That is true, but it's not the reason... INT_MAX + 1 must be 0 for unsigned int because the C standard defines wrap-around behavior for unsigned data types. Saturating data types are only defined in Embedded C, e.g. Sat Fract.


That is the entire reason. Because the representation of signed numbers could vary from implementation to implementation, making INT_MAX + 1 (or any signed integer overflow) undefined allows for more efficient generated code. If it were defined in some way, the generated code would need checks for overflow wherever it was possible, which would lead to bigger and slower code.

INT_MAX + 1 is never 0 for unsigned int. (unsigned)INT_MAX + 1 is equal to INT_MAX + 1. You're thinking of UINT_MAX + 1, which is always 0.


It's not true that (unsigned)INT_MAX + 1 is never 0 - it's allowed for UINT_MAX == INT_MAX (leaving the sign bit unused in the unsigned representation). I've never seen such an implementation, though.


Yep, you're right.


Yes, my mistake. I meant UINT_MAX + 1.


There is a somewhat related text by Fefe: http://www.fefe.de/intof.html


That would be true for unsigned data types. The C standard specifies overflow for signed types as undefined behavior: the compiler can then do whatever it wants...


This is mainly on HN because people here tend to value privacy, and there have already been some great discussions here about the TSA (Transportation Security Administration). I still don't know how I would react if somebody (even a security officer from a foreign country) asked me to log into my gmail/email account.

That's even worse than an employer asking an employee to log into his/her Facebook account. I keep all sorts of private information in my personal email... I've got nothing to hide but I'll think twice before flying again to a foreign country.

Edit: Typo


One could comment out the "very long command" with # and it would still appear in the history... no hacks needed.


this is hotkeyed to Meta-# by default

(one of the readline features i rediscover every three months or so and promptly forget)


Tested and it works brilliantly... it's definitely faster than C-a # RET... Thanks for the tip.


it's fully customizable too. e.g. if you work in a LISP REPL and use rlwrap to provide readline functionality, you could put this in your ~/.inputrc:

    $if lisp
            set comment-begin ;
            M-;: insert-comment
    $endif


Isn't it a bit strange that www.apple.co.uk isn't in Apple's huge list of domains?


It looks like Apple Illustration got there first in August 1996. I presume Apple Computer didn't have the foresight to go around registering with every ccTLD back then.

It must be worth a pretty penny now though.


Add to that www.apple.gr as well.


I have to agree with the author of the post. I had the habit of checking my personal email, reading the BBC to see if the world was at war, and checking HN while the project I was working on was building (I thought it was no big deal: http://xkcd.com/303/). The big deal is that it affected my productivity. Period. When the build finished, or aborted partway through, I was still thinking about something I had read. It always took me a few minutes before I was ready to continue where I left off.

Since the project I'm working on can have long build times (up to ~40 minutes), I've started several side projects to work on while waiting for the build. Since they are all related to coding, I find I can switch tasks much more easily than when reading news or checking email. And I believe I'll become a better programmer because of it (also because the projects I've started are tools that help me do my work).

