The service Sheldon Brown provided to cyclists worldwide cannot be overstated. Its value also becomes apparent every time one needs to research a bike subject involving any technology that appeared after his death.
I think an important part of the Vim workflow that I haven't seen mentioned is that, at least for me, it keeps the mind engaged during "menial" operations.
When switching from dealing with the information in the text (typically thinking about code content) to moving text around and other text logistics, I often find my mind "waiting" impatiently for my hands in non-vim editors. By contrast, when using vim (I consider myself a mid-range user), I can stay somewhat closer to the flow.
At the same time, I still remember (albeit vaguely) the significant overhead of the Vim language before I became acclimated.
I don't understand why people say there is an overhead. Pressing 'i' for insert mode gives you the same behavior as a non-vim editor, so you're immediately at the same efficiency. What am I missing?
The overhead exists when a novice is trying to use Vim the way it is intended to be used in order to learn. Using insert mode to get the same behavior as non-vim and avoid normal mode is, in my opinion, bad learning practice.
"You can host your own community then": sorry, but I'm not really a fan of this "make your own (...)" attitude, especially when it's used to excuse corporate behavior toward volunteers.
I mean that there is a spectrum ranging from paying the bills and making a profit to blatantly exploiting everyone involved. Based on the Apollo figures, reddit's API pricing seems to fall on the greedy end of that spectrum. At the same time, we have examples of high-traffic websites (e.g. Wikipedia) which manage to build on volunteer effort and be sustainable without pissing everyone off within a few days.
> I mean that there is a spectrum ranging from paying the bills and making profit to blatantly exploiting everyone involved.
They're not making profit, that's the entire point.
> Sorry but I'm not really a fan of this "make your own (...)" attitude, especially when excusing corporate behavior against volunteers
I'm similarly not a fan of discounting just how much money and work goes into building a company with millions of users, having helped do that in the past. My comment was not glib, it was entirely sincere; if you want to know just how much it costs, host your own community and see just how "greedy" it is.
The thing is that they should be able to introduce their API pricing without triggering the events of the past days. I'm not even remotely an expert, but the pricing seems too aggressive and seems to contradict what was communicated earlier (again, based on the Apollo transcripts).
On the other hand they are supposed to be experts, or they should at least ask one, given the millions of users, thousands of mods, etc involved.
How do you know whether it was too aggressive or not? They could have always revised their estimates when they calculated how much each call costs, as well as how much revenue they lose to third-party apps not showing ads. Sure, they could have given more than a month's notice, but the bill comes due at some point. Based on the API call figures Apollo has posted, and having worked on API products in the past, I can entirely see how $20 million a year is reasonable given how much Apollo is pulling from Reddit's servers.
I infer it was aggressive, based on the following:
Per Christian (the Apollo app dev): "(...) Twitter's pricing was publicly ridiculed for its obscene price of $42,000 for 50 million tweets. Reddit's is still $12,000. For reference, I pay Imgur (a site similar to Reddit in user base and media) $166 for the same 50 million API calls." [1]
All assuming he is not lying (which I have no reason to believe, contrary to the reddit reps). Two orders of magnitude over Imgur pricing sounds a bit greedy, unless Imgur is also on the verge of collapse, which I'm not aware of.
There is no $166 plan. The least expensive $500/month plan is "only" 7.5M requests per month.
50M requests under the ultra plan (7.5M requests included, then $0.001 for each one after) would cost $43k/month; at that volume it would be more sensible to move to the "mega" plan, which is $10k/month for 150M API calls.
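As a back-of-the-envelope check of that figure (assuming the plan numbers quoted above are accurate), the tier arithmetic works out like this:

```python
def ultra_plan_cost(requests: int) -> float:
    """Monthly cost under the 'ultra' tier as described above:
    a $500 base fee covering 7.5M requests, then $0.001 per extra request."""
    base_fee = 500.0
    included = 7_500_000
    overage_rate = 0.001
    extra = max(0, requests - included)
    return base_fee + extra * overage_rate

# 50M requests: $500 + 42.5M * $0.001 = $43,000/month
print(ultra_plan_cost(50_000_000))
```

So 50M calls lands at $43k under the metered tier, which is why the flat $10k/month "mega" plan (150M calls) is the obvious choice well before that point.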
Not all high-traffic websites are comparable when it comes to sustainability. Comparing reddit and Wikipedia is definitely an apples and oranges situation. The software, data model and infrastructure are wildly different.
Just the simple fact that most of Wikipedia can largely be cached long(ish) term via CDNs reduces the capital needed to keep it running. reddit on the other hand, has constantly changing content, user configured listings, threaded discussion, media hosting, etc. Caching those types of things is a lot more difficult.
> reddit on the other hand, has constantly changing content
You mean like how Wikipedia has pages that anyone can edit that can (and do) get edited many times per minute?
> user configured listings
Kind of like how Wikipedia supports saved articles and reading lists?
> threaded discussion
If only each article on Wikipedia had a discussion page where you could talk about edits to the page. We could call it the Talk page. But I guess that’s too dynamic for them.
> media hosting
I’ve always thought Wikipedia would be better if they had images and maybe even videos to go with each article. Maybe we should petition them to add this.
It looks like IPv6 matching has been supported since late 2017 (version 10.0 [0]), although the changelog states that "not all ban actions are IPv6-capable now". Beyond that, I don't have any recent hands-on experience with the software's IPv6 support.
Couldn't you just ban the /64 and call it good? It's not like they get a random selection of addresses, they're all going to be the same CIDR. Or am I overlooking something here?
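To illustrate the point, here is a quick sketch with Python's `ipaddress` module (the addresses are made-up examples from the IPv6 documentation range): even when a host rotates through interface identifiers, e.g. via privacy extensions, every address it uses typically falls inside the same /64, so a single /64 ban covers all of them.

```python
import ipaddress

# Two hypothetical addresses the same host might use over time;
# the interface-identifier bits differ, the /64 prefix does not.
addr_a = ipaddress.ip_address("2001:db8:1234:5678::1")
addr_b = ipaddress.ip_address("2001:db8:1234:5678:abcd:ef01:2345:6789")

# Ban the enclosing /64 instead of individual /128 addresses.
ban_net = ipaddress.ip_network("2001:db8:1234:5678::/64")

print(addr_a in ban_net)  # True
print(addr_b in ban_net)  # True
```

The caveat would be shared infrastructure (e.g. a hoster handing one /64 to many customers), where a /64 ban catches bystanders too.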
Possibly the best solution would be an installation of the Logitech Media Server (LMS) on a raspberry pi (e.g. piCorePlayer [0]) or an already available home server. LMS can then handle all squeezeboxes (or squeezelite installations on other hardware) on the same network and allows control via a web interface.
After spending a few years in a similar situation, and not wanting to maintain my own server, I settled with using runbox.com.
It's an economical, privacy-oriented mail provider which allows (necessary for my needs) using your own domain. They have very quick and excellent support (the few times I needed it) and also offer a CalDAV/CardDAV service. Their web interface is lacking (I believe it is about to be updated), but this is a non-issue for me since I use mutt. I'm not aware of a mobile app (I personally use and recommend the excellent K-9 Mail OSS app).
Yes, that's the entire point of the discussion. Yet people still claim you should back up your backup. But where are you going to back it up to? To another encrypted drive where the enclosure might fail? No, you're going to use an unencrypted HDD and be done with it. You can always add encryption on top of an unencrypted HDD.
Sure, I completely agree. It strikes me as unnecessary, however, for a manufacturer to deliberately add a counter-intuitive and indeed dangerous element to a device commonly used for backup. In particular, many home backup systems are not created by experts who might think of things like this.
While the difficulty of reading markup vs a compiled document is subjective, there are tools such as texstudio [1] that allow real-time preview of the document synchronized with the code [2].