Hacker News | survivedurcode's comments

LOL, you should be upvoted: your comment perfectly captures the blind arrogance of the software industry.

When you call people computer illiterate, you are blind to the technocratic injustice inflicted on the general populace.

> “The obnoxious behavior and obscure interaction that software-based products exhibit is institutionalizing what I call ‘software apartheid.’”

> ― Alan Cooper, The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity

> “When programmers speak of "computer literacy," they are drawing red lines around ethnic groups, too, yet few have pointed this out.”

> ― Alan Cooper, The Inmates Are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity

You too can see the light and rise above the elitism of computer literacy. You know, there are many smart people who are too proud to put up with what computer people demand as computer literacy. They suffer in silence; you will not have their loyalty, and they will switch to competing software the moment they are able to.


What? I never said being computer illiterate is bad. Plenty of fine people are computer illiterate. And plenty of fine people are fantastic at things I'll never be good at. That's fine.


lol UAC is such a lazy shitshow of a security implementation…

A) there is no interception to be had. It’s a fucking “Yes I am Admin” single click a child could do unsupervised.

B) It requires training for the user to know that this is a special UAC mode. That’s high-motivation, high-knowledge user training. Pilots train to recognize unusual signs; your grandma does not train to recognize what UAC looks like, or why and when it would come up. UAC is the biggest cop-out of a security excuse, and Microsoft should be ashamed.


Sure, I guess. I don't know why UAC gets so much hate while sudo gets so much praise.

UAC is strictly better than sudo IMO.

Does UAC solve security for windows? Of course not, but we were comparing against sudo here.


> lol UAC is such a lazy shitshow of a security implementation…

It's by far the most secure and well thought out implementation of an elevation prompt across all operating systems.

A lot of thought went into designing the Secure Desktop [1] used by UAC, and frankly it's an embarrassment that macOS and Linux have nothing similar.

[1] https://learn.microsoft.com/en-us/archive/blogs/uac/user-acc...


I stand corrected, it is not a lazy shitshow.

You’re right: fake sudo prompts are how people get exploited all day long. I’ve witnessed it on macOS.

For UAC, the user still has to learn that the dimmed screen and the prompt are “serious business.” I think that when an account has a password and the user must deliberately type it, the prompt guards against automatic or accidental acceptance in a way that a button-only confirmation does not. I understand that many users have a throwaway password, which makes it barely more secure than a click on a button.

I see that Sudo for Windows has been restricted to Desktop only. https://hudsonvalleyhost.com/blog/microsoft-officially-exclu...

From the design article you linked (I realize it’s 2006-era):

> You hide the real mouse cursor and show a fake one some number of pixels offset to the real one

I think macOS has only in recent years had a “Full Desktop Control” permission in the accessibility category (a confusing category, to boot) that it enforces on apps to prevent faking the cursor.


Continuing to use a memory-unsafe language that has no recourse for safety and is full of footguns is frankly irresponsible for the software profession. God help us all.

By the way, the US government did the profession no favors by classifying C++ as a memory-unsafe language. It is possible to write memory-safe C++, with safe array dereferencing. It's just not obvious how to do it. Herb Sutter is working on it with CppFront. The point stands that C++ can be memory-safe. If you make a mistake, you might write some unsafe code in C++, but you can fix that mistake and learn to avoid it.

When you write C, you're out of luck. You have no choice: you will write memory-unsafe code and hope you don't fuck it up. You will hope that a refactor of your code doesn't fuck it up either.

Ah, C, so simple! You, only you, are responsible for handling memory safely. Don’t fuck it up, cadet. (Don’t leave it all to computers like a C++ developer would.)

Put C in the bin, where it belongs.


You can't just put a language in the bin when it has been used for 50 years and a huge percentage of present-day software infrastructure is built on it.

I see comments like yours everywhere, all the time, and I seriously think you have an unhealthy emotional relationship with this topic. You should not have that much hate in your heart for a programming language that has served us very well for many decades and continues to do so. Even if C were literally all bad (which IMHO isn't even possible), you shouldn't be that angry at it.


When you write C++, you can allocate memory all day long and write ZERO delete statements. That is possible; I’ve been writing C++ like that since 1998 (Visual C++ 5.0 and lcc). Can you imagine allocating memory and never risking a premature or forgotten delete? It is not possible in C. You can call it opinion, but I see fact. That makes C all that bad.

When I say put it in the bin, I don’t mean that good software hasn’t been written with it, or can’t be. But you should stop using it at the earliest opportunity. When given the ability to write object-oriented software, clever engineers with too much time add insane complexity justified by unproven hypotheticals. Believe me, I know very well why people shy away from C++ like a trauma response: over-engineered, over-abstracted complexity, incomprehensible template syntax, an inadequate standard library, indecipherable error messages. C++ has its warts. But it is possible to write memory-safe software in C++, and it is not possible in C (unless we are talking about little code toys). My answer is that you don’t have to write complicated garbage in C++. Keep it simple, like you are writing C. Add C++ features only to get safety. Add polymorphism only when it solves a problem. Never write an abstract class ahead of time. Never write a class ahead of time.

Downvote me all day long. Call me angry. But billions of dollars are lost because someone, in our modern age, decided to write new software in C, or to keep developing software in C instead of switching to a mixed C++/C codebase with the intent to phase out new development in C.

It’s hard not to get angry when modern software is written with avoidable CVEs in the 2020s. Use after free, buffer overflows, are you kidding me? These problems should have been relics by 2010, but here we are.


There are still applications (especially with embedded devices) where you do not dynamically allocate memory or might not even use pointers at all.


There are good tools that help improve memory safety in C, and I do not think Rust is a good language. Of course, the worst thing about Rust is its fans.


Skill issue


It's been a skill issue for 40 years. How long are we going to continue searching for those programmers who don't make mistakes?


Programmers make stupid mistakes in the safest languages too, even more so today when software is a career and not a hobby. What does it matter that the memory allocation is safe when the programmer exposes all user sessions to the internet because reading Docker's documentation is too much work? Even GitHub did a variant of this, with all their resources.


Because memory vulnerabilities don't make programs immune to other dumb mistakes. You get these vulnerabilities on top of everything else that can go wrong in a program.

Manual checking of memory-management correctness takes extra time and effort to review, debug, instrument, fuzz, etc.: things the compiler could be checking automatically and reliably. This misplaced effort wastes resources and takes focus away from dealing with all the other problems.

There's also a common line of thinking that because working in C is hard, C programmers must be smarter and more diligent, so they wouldn't make the dumb mistakes the easy-language programmers do. I don't like such an elitist view, but even if it were true, the better programmers could allocate their smarts to something more productive than expertise in programs corrupting themselves.


> programmers can allocate their smarts to something more productive than expertise in programs corrupting themselves

Amen. This is called progress.


> Because memory vulnerabilities don't make programs immune to other dumb mistakes. You get these vulnerabilities on top of everything else that can go wrong in a program.

The issue is that these great new tools don't just fix the old vulnerabilities, they also provide a lot of new, powerful footguns for people to play with. They're shipping 2000 feet of rope with every language when all we need is 6 feet to hang ourselves.


There have been a bunch of failed C killers, and C++ has massively shat the bed, so I understand that people are jaded.

However, this pessimistic tradeoff just isn't true in the case of Rust: it has been focused from the start on preventing footguns, and it actually does a great job of it. You don't trade one kind of failure for another; you replace them with compilation errors, and they've even invested a lot of effort into making those errors clear and useful.


I think there’s more to it than just messing with serotonin.

There’s something about sertraline (Zoloft) that seems to make it quite reliable at causing brain zaps. Three people I’ve known who stopped sertraline all experienced them. One of those people also stopped Prozac (cold turkey) and Lexapro (4-week taper) without zaps, but even a 4-month taper of sertraline was not enough to avoid them.

In fact, the article recommends switching to Prozac and then tapering that as a way to avoid the zaps.


I don't know enough specifics, but there are multiple subtypes of serotonin receptors. It's possible that sertraline affects different subtypes or a different combination. Perhaps the binding affinity is different. But yeah, each of these drugs is different (hence why they're different drugs, and not just generics of the same thing).

I had a brain zap once when I was using MDMA somewhat regularly for a short period in the '90s. It only happened once, so it was more a weird "What is this? I feel... something... in my brain?" moment. But I don't think I'd want to deal with it happening spontaneously and constantly, the way withdrawal symptoms present. Sounds awful.

I take Effexor, which is an SNRI and, from what I've read, one of the worst in terms of withdrawal (brain zaps and other side effects). I really don't like the idea of someday having to go off it. For now it keeps me stable, but if something happened that required me to stop, I'm not looking forward to it.


When I was taking buspirone, it would give me zaps about 30-45 minutes after taking it. It's pretty much the main reason I quit. Nobody could tell me WTF it was or why it was happening. Not something you want when you already have anxiety problems.


Prozac has a notoriously long half-life, which is one of the reasons it's indicated for adolescents who are likely to skip/forget doses.

The long half-life is also the reason it's prescribed for SSRI withdrawal symptoms.

Sertraline is a potent SSRI in terms of inhibiting the serotonin transporter protein itself. It also has a relatively short half-life. Makes sense that cessation would induce withdrawal symptoms.

It's the same principle behind prescribing Suboxone for opioid withdrawal. The drug has a ridiculously long half-life, so it has the potential to smooth out what would otherwise be acute withdrawal.


I've heard similar about Zoloft, and Effexor (which is notorious for causing withdrawal symptoms when coming off it).

I believe Lexapro is a highly selective SSRI, which might explain its lack of withdrawal symptoms (and also its "does absolutely nothing" effect for some people). Prozac has a very long half-life compared to other SSRIs, so it basically has a built-in taper, which is why it's often what you cross-taper to when coming off another SSRI.


I took Lexapro for a month one time. Luckily I happened to also be seeing an endocrinologist so I happened to have before and after blood work. A few weeks after Lexapro my prolactin level spiked to far higher than is normal for a male. The endocrinologist was worried I had a prolactinoma and had me get an MRI! I stopped Lexapro and gradually the prolactin level went back to normal, but it took a while. There are far more effects from these drugs than are well documented or acknowledged by most of the medical community. I had to go digging myself to find studies and case reports connecting SSRIs to elevated prolactin and suggest the possibility to my doctors who said that's the first they'd heard about it. If your hormones are messed up or you're feeling more gender nonconforming, work with your doctor to see if it's your SSRI.


How long did it take? I’m doing an MRI for extremely elevated prolactin. Even if there’s no tumor I’ll probably be prescribed cabergoline, but would prefer to return to normal without using another drug.


I’ve been on Lexapro for a few years now, and if I skip it for a couple days I get very unpleasant brain zaps and dizziness. If this is a lack of withdrawal symptoms then I hate to imagine what other SSRIs cause!

Still, it’s been a life changing drug for me and I haven’t really had any bad side effects while I’ve been taking it.


I took Lexapro for a short period in my early 20's. Perhaps like 3-4 months. I definitely remember having significant brain zaps when I quit. It was a pretty low dose I was taking too - 10mg IIRC. Definitely no more than 20mg. I don't remember the zaps causing me significant stress at the time, but that was primarily because I knew to expect the zaps going in to the experience. Education helps a lot with anxiety. If people know to expect it, and that we are pretty sure that it's completely harmless but uncomfortable, the general anxiety around the symptom will probably be less.


It's different for everyone. I've been on it for over ten years and have never experienced any of these zaps. When I attempted to go off I just got extremely depressed. Never any zap though.


I would check your answer. These are pauses due to time spent writing to diagnostic outputs; they are not traditional collection pauses. This affects jstat as well as writes of GC logs. (I.e., GC log writes will block the app in just the same way.)


Which is why for anything serious one should be using Flight Recorder instead.


Or /tmp should be a tmpfs as it is on most current Linux distributions.


Probably because mapped pages, even ones locked into memory, are not allowed to stay dirty forever. Does this help? https://stackoverflow.com/a/11024388 (In contrast, if you mlocked but never wrote to the pages, you probably would not encounter read pauses.)


Beware the trade-offs of interning affecting GC behavior. Now you can’t have a stack-allocation optimization, for example.


The interning feature should only be used in rare cases; it is not intended for wide use.


Just because there was research in 1970 (and in subsequent years) showing that big design up front is a bad idea doesn't mean that waterfall is a straw man. It is probably necessary when you are shipping code with extremely high costs of failure, where mistakes are extremely expensive (e.g., missiles, space shuttles). I imagine automotive ECU software is probably in that category.

The software industry has ignored research from 1970s and on and continues to ignore it today.

Look at the microservices craze. It's another way that big design up front has been brought back.

