Hacker News | AlexTes's comments

Let me guess: you do not work with JavaScript.


Maybe just not react-router.


As far as I can tell the license is okay, unless your product depends on suing Facebook while at the same time being so strongly dependent on React that you can't switch it out.


Beautiful, thank you for your story.


I don't think so. The cost would be prohibitive. It would be far cheaper to print it in plastic and then add some reflective coating.


Why would I use this over Antergos, which is my current go-to for preconfigured Arch?


Ignoring the negativity and taking it as an opportunity to improve your product and/or your understanding of a segment of your audience... I'll remember that!


I've been arguing for using Git Flow. Reading the post, I have to say the stand our lead takes against Git Flow and in favour of very cosy CI is perhaps stronger than I realised.

He argues for pushing about as often as possible. With our small team that's very doable: every push gets tested and linted on the 'blue' or 'green' environment. You're supposed to only push passing code, which you easily can by running the tests and lint locally. So instead of all the pain points mentioned in the post, you write passing code, pull and rebase on other passing code, and then push. Little code review, no worries about hasty reverting, and few / early conflicts keep us from tripping each other up or writing incompatible features.
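A minimal sketch of that loop as a shell function; the npm script names and the remote/branch names are assumptions, so substitute whatever your project actually uses:

```shell
#!/bin/sh
# Pre-push routine: run the same checks CI runs, locally, then
# rebase onto the latest passing code and push.
# ("lint"/"test" scripts and "origin master" are assumptions.)
prepush() {
  npm run lint &&
  npm test &&
  git pull --rebase origin master &&
  git push origin master
}
```

Because every pusher runs the same checks CI does, a red build should only ever mean someone skipped this step.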

The reason I argue for Git Flow? Our tree is an absolute mess: most often a single chain of commits, with features scrambled together linearly. In other words, removing one feature would be hard and require a bunch of legwork, not a couple of Git commands.
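For contrast, here's a throwaway-repo sketch of what a cleaner history buys you: when each feature lands as its own merge commit (as Git Flow arranges), backing one out is a single `git revert`. All names here are made up for the demo:

```shell
#!/bin/sh
set -e

# Build a throwaway repo with one feature merged via a merge commit.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
main=$(git symbolic-ref --short HEAD)   # 'master' or 'main', depending on git version

echo base > file
git add file && git commit -qm "base"

git checkout -qb feature
echo feature >> file
git commit -qam "feature work"

git checkout -q "$main"
git merge -q --no-ff -m "merge feature" feature

# Undo the whole feature in one commit: -m 1 keeps the mainline parent.
git revert --no-edit -m 1 HEAD

! grep -q feature file   # the feature's change is gone from the tree
```

With a linear, interleaved chain there is no single merge commit to revert, which is exactly the legwork problem described above.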

If anyone strongly feels there's a better way for a small team than lightning fast CI let me know!


I run near-weekly updates of a custom Android 6.0 ROM and daily use Arch Linux, which is a rolling release. My work is on the web, with npm keeping us on the bleeding edge. Other people often ask why my system does some little extra thing that's useful. Usually the answer is found with --version.
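For example, one flag is usually enough to see what you're actually running (git here is just an illustration):

```shell
# Most CLI tools report their version with a standard flag.
git --version
```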

Life with up-to-date tools can be real good. So unless I'm simply the only one using good tools, updates can be good.


I think it's a mixed bag, to be truthful, and it's a circumstance that's very well described by "pick your poison."

My primary OS is also Arch Linux, and while it's certainly stable, it's not without its warts. Failure to update regularly on a rolling release distro can have absolutely disastrous consequences (though you only have yourself to blame), and a healthy dose of caution is strongly recommended whenever a major update to important software is in the pipeline (think KDE4 to KDE5 transition). I think this sort of bug (Outlook) illustrates the importance of having an abundance of caution with new software where the failure modes may not be well understood by merit of its relative youth. But with rapid releases, I think the problem is a bit more focused on the end user: Someone who is unable or unwilling to take the risk of updates causing material harm to their workflow or consuming time they can't afford in order to fix potential problems should look for more conservative release cycles. I don't think it's really a matter of "good" versus "bad" tools; that may be part of it, but I can't help myself from thinking it's a matter of misplaced expectations.

That is, it's easy to fall into the mindset of erroneously believing that faster, more rapid updates is always better without fully appreciating their impact. (I've done this more times than I'm willing to admit.)

I do think, and maybe I'm wrong (which I usually am), that those of us who tend toward using rolling release distros have a bit of a bias and a rose tint to our glasses. We almost innately know what the risks are, and I think we take that for granted by assuming most others will freely accept such risks and appreciate the occupational hazards that go hand in hand with change. Not everyone has the same degree of patience, nor the same goals or motives. I think our optimism for and evangelizing of software that pushes rapid releases (like Arch, as an example) can help create an aura that lulls those we influence into expectations that don't mesh well with their use case, their personality, or their constraints. Is that a bad thing? I don't think so (evangelizing is important), because I do agree with you: Updates can be good. I just think we're all too happy to espouse advantages while sometimes glossing over potential drawbacks (guilty again as charged!). :)

Anyway, I should apologize: I didn't mean to wax philosophical. It's late in my timezone, and I saw another Arch user who provoked me into a short essay. I agree with you and username223, but I don't have any real answer. I do think that sometimes we ought to be more cautious with our advice and perhaps weigh context more heavily than our excitement allows. (I made the mistake once of suggesting Arch to someone who really ought to use something with sturdier training wheels. My only saving grace is that he never got around to installing it.)


It depends. With rather new, still-evolving ecosystems, every update is welcome. Security updates are also important for the OS and browsers.

But with stagnating platforms it goes downhill with every update; for Windows the tipping point was Windows XP (or Windows 7). The bubble burst, the user is now the product, and your files get screened and searched by the platform owners. You can run software from 1985 (Win 1.0) on 32-bit Windows 7; keeping 16-bit applications from running on a 64-bit OS is just an arbitrary limitation, as the competition shows (see Wine and ReactOS). Almost every developer has already moved on to the web or to emerging platforms like the market-share-dominating Android, or the second most popular one, iOS/OSX. End users are smart: they no longer buy into the antique, burned-out platforms of yesteryear. The application landscape is changing as well. And for everyone who is still happy with their old software, there is little reason to update; it won't get better on sinking platforms.


No, it's not an arbitrary decision; there are some pretty decent reasons why Microsoft declined to support running 16-bit apps via NTVDM/wowexec: virtual 8086 mode isn't supported in long mode. Certainly, they could have ported the functionality from NT 4 that they used to get NTVDM working on non-x86 processors (it included an i486 emulator), but to what end? Realistically, how many people are still running 16-bit-only software?

Another thing I'd point out: Microsoft almost backed themselves into a corner worshiping at the shrine of backwards compatibility, to the point that it was difficult to move their platform, or their ecosystem's software, forward to use more modern, more secure, and more reliable methods. So unless you've been very forward-looking from the start (see IBM System Z), there is a real, fundamental, and painful engineering cost to maintaining a line to yesterday without great sacrifices to tomorrow.

I'd argue that the PC (be it Windows, OSX, or Linux) is here to stay for the foreseeable future. It may not be the platform for everyone, but for many workloads and applications the web or mobile simply will not do.


> unless you've been very forward looking from the start (see IBM System Z)

Can you elaborate? System Z has always looked curious, but I don't think many people who aren't professionally involved with mainframes have had a chance to even look at it.


Everything I know of System Z I've read online: it was designed more or less for backwards compatibility from the start.


> Almost every developer already moved on to the web or emerging new platforms like the market share dominating Android or the second most popular one, iOS/OSX.

It must be nice in your filter bubble.


I disagree with you. In the query "deluge retina", "deluge" is definitely the more important bit. After all, you're looking for deluge and hoping to find it with support for retina. Otherwise you would've written something like "torrent client retina".

Now this is where a bit of guesswork comes in, but I'd say Google correctly deduces there is no such thing. Even the best result for both terms just talks about some app not supporting retina, and from the looks of it helps a former deluge user on Linux pick another client when switching to OSX (possibly when dealing with retina). But that's not deluge; considering your primary intent was finding deluge, it's a useless result that can only disappoint. So in the first three results Google decides that, to give you anything relevant at all, it needs to drop "retina": results that might still get you what you want (deluge, despite its lack of retina support) beat results that are technically relevant but definitely won't. Your query had no results that would get you what you wanted, so Google altered the query to see if it could find you something useful anyway.

I think Google trying to give you results that might be hits instead of giving you disappointment in the first place is very sensible behavior.



