
I'm still waiting for the day there's a responsible entity that only publishes things when it's 100% certain of them and is willing to bet its freedom (going to prison) on it. But given that information and the influencing of opinion is a market and a means of controlling the population, I'm afraid this won't happen with official support; such an outlet would only be seen as the crazy-lunatics news agency that publishes stuff a week or a month later, after having vetted everything.

I'd like to say: leave speed reporting to the Twittersphere and mandate a clear label on unvetted news reports on any network, but I doubt politics can exert that kind of influence over the media. I would love it if news reports carried a watermark saying fresh-and-unvetted, just like "preliminary results" or "consult your doctor before taking".

First we have to encourage and support critical thinking, though too much of it may lead some influencers to misuse it in support of causes that deny humanity's past and current crimes against itself and the planet.


That's scary. 100% certain? Nothing is 100% certain.

So they can't even say "it's likely to rain tomorrow"? This doesn't seem thought through.


Not all topics warrant certainty, but given the power news outlets have to form opinion and thereby influence the population's behavior, I do think certain topics demand responsible reporting that doesn't report anything at all when uncertain. It's the same as a good police detective not disclosing speculation to the media: they pursue many leads but conclude only one, and you don't want a mob to go lynch people. The same logic applies to news outlets forming people's opinions. That's why I think there's a need for, admittedly few, certain-reports-only news agencies. That way, if you read posts on TheSun or E!Online, you get accustomed to not taking them seriously, assuming they're most likely speculation. Once something is confirmed beyond question, it can migrate to one of the few vetted-only news outlets, if it's something they cover.

That said, our weather models are pretty good, but not good enough to make certain predictions that far into the future; they can for the next few hours.

It's like a software company's model of code branches. The Apple/Google/whatever filesystem team works on something, it gets pushed into their level of production branch, then it percolates up to the shared production kernel branch, and after a couple more layers it hits the common branch, which is what public production binaries are made from and which consists of kernel, userland, and foobar modules all merged together. Not all software shops operate this way, but it's what a project can demand once it hits a certain size. The Linux kernel works this way too, to name a successful non-commercial project. You can argue this doesn't prevent regressions, and that's true, but it's hard to deny there would be more regressions (aka false reporting) with unfiltered (aka unvetted) reporting.


I can understand having some news outlets with a much higher threshold.

I still don't believe the term "100% certain" is meaningful. Maybe if they were to put a label on certain facts: "This fact is considered by our editorial board to be 93% certain." And maybe have a chart, so that figure can change over time.

I think there are better ways, and that there should be accountability. I'm just not in favor of black-and-white terms for concepts that, to me, are purely shades of gray.


I can't use st because it still has corner cases that xterm handles well but st doesn't yet, and I find having to patch the sources hard to maintain when a new release comes out with refactorings in it.

But I must say that st's Unicode font rendering is more complete than rxvt-unicode's or xterm's. With the same font configured in all three, st is the one that manages to render the most glyphs from plan9port's Unicode sample file. It would be great to have the same in urxvt or xterm.


From the description on the website, it appears not to bundle the computer, just the case and HDD. So, if we take the HDD to be 60 EUR max, compared with other WD Blue mobile disks (I couldn't find the exact model listed anywhere), that leaves 10 EUR for the case. The total of 70 EUR may or may not include shipping; I couldn't figure that out without trying to place an order. It's not cheap, but that pricing isn't expensive either. If it includes a Pi 2, then it's cheap.


> The Nextcloud box consists of the following parts:
>
> 1 TB USB3 hard drive from WDLabs
>
> Nextcloud case with room for the drive and a compute board

It's worded weirdly, but I assumed the "compute board" is the Pi.


Strictly speaking that doesn't say it includes one, just that there's room for one. But I hope you're right. I've looked at the PDF instructions and still wasn't any wiser.


It does not come with a Pi; it's just a plastic box, cables, HDD, and screws (and maybe an SD card with their distro on it).


Yeah, that's what it is. The price is lower than the separate USB3 hard drive kit from WD, so it's hard to argue it's expensive, though.

The OS is put together by Canonical; until now it was based on Ubuntu 16.04, and they'll soon update it to Snappy Ubuntu Core so you get a fully zero-maintenance solution.


The disk isn't cheaper than retail unless you consider the case to be worth a lot more than is realistic.


How do you deal with the fact that some video tracks don't provide the needed cross-frame data, or that the cut points fall at unfortunate positions which require a re-encode because a quick byte copy of the existing stream doesn't work, or at the very least will complain later about missing things like color info (although mpv plays it back correctly)?


I don't know what you mean by cross-frame data. The program is using ffmpeg's -ss, -t, and -vcodec copy functionality, and I'm not sure how it handles these issues. I know that if you cut between keyframes (which is very likely to happen), the data before the next keyframe will be lost, so it is not an exact cutting mechanism.
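
For reference, that kind of lossless cut corresponds to an ffmpeg invocation with -ss/-t and stream copy. Here's a minimal sketch of driving it from Rust's standard library; the file names and timestamps are made up:

  use std::process::Command;

  fn main() {
      // Cut 30 seconds starting at 1:00 without re-encoding:
      // -ss seeks to the start point, -t limits the duration, and
      // -c copy copies the streams as-is, which is why the actual
      // cut points effectively snap to keyframes.
      let status = Command::new("ffmpeg")
          .args(["-ss", "00:01:00", "-i", "in.mp4",
                 "-t", "30", "-c", "copy", "out.mp4"])
          .status()
          .expect("failed to run ffmpeg");
      assert!(status.success());
  }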


That's another issue, but it can also lose other data, which will cause warnings but will most likely still play with correct color reproduction.

I don't have such a source file handy to reproduce it, but if you try enough ISO MP4 containers with H.264, you'll hit one where this is the case.

Is `-t` different from `-to` which I've been using?


libavcodec tries to recover from missing frames (you need this for TV broadcasts) but it's actually not very good at it for H.264. You'd almost always see major artifacts.

Any idea what the exact warning text is?


Sounds like extensions that delegate text fields to native editors would be easier to write with a better way to expose a localhost HTTP endpoint.


How similar is this to Opera 12's built-in httpd?


Yeah, this reminded me of Opera Unite as well. It's not that similar, really, though. As I understand it:

Opera Unite was an isolated platform with prebuilt apps by Opera and custom apps you could download from an app store. FlyWeb is an API exposed to any web page.

Opera Unite gave you a public URL that was an Opera server reverse proxying to your local machine so you could share files, chat, etc. with your friends online. FlyWeb just publishes multicast DNS (Bonjour/Avahi/Zeroconf/etc.) service discovery records to your local network.



That app can't serve a local directory. It has to make a copy to a sandbox filesystem first. Try this app instead: https://github.com/kzahel/web-server-chrome (https://chrome.google.com/webstore/detail/web-server-for-chr...)


Chrome Apps are being killed everywhere but Chrome OS, though, so you won't be able to do this in a few years.


I hadn't heard that Opera has an httpd built in, so it depends: what does Opera's httpd do?


Opera 12 had one, but that version doesn't really exist anymore. Opera is Blink-based now, and I don't recall the feature being restored (yet).

http://help.opera.com/Windows/12.10/en/unite.html


I feel your pain, but the features will land when they're deemed stable. Without more frequent releases, those features would get into a release even later, so more frequent releases are a win either way.


This wasn't one of the questions in the survey, but while I have the right crowd around, I have to ask.

When will Rust get

1. function head patterns like other languages in the ML family (although Rust isn't really part of the family, but rather a distant cousin from another continent who once played with ML and family during a summer vacation)

2. support for naturally writing recursive functions


> Rust isn't really part of the family, but rather a distant cousin from another continent, which once played with ML and family during a summer vacation

It's actually more of a sibling in the family who ran away from home at the age of 6 and fell in with the crowd on the wrong side of the tracks.

Initially Rust was very much like OCaml. It isn't anymore :) Many of the normally-in-functional-languages features in Rust come from those days. Others were lost and re-added later. It's a very complex history.

> function head patterns like other languages in the ML family

Could you elaborate? I'm not familiar with this feature (I've only dabbled in SML).

> support for naturally writing recursive functions

yeah, I wish we had TCO.


> It's actually more of a sibling in the family who ran away from home at the age of 6 and fell in with the crowd on the wrong side of the tracks.

> Initially Rust was very much like OCaml. It isn't anymore :) Many of the normally-in-functional-languages features in Rust come from those days. Others were lost and re-added later. It's a very complex history.

Yeah, having tried Rust in those days, I kinda stopped when it broke every week, and was then surprised by the surface of 1.0. It felt like a different person to talk to.

I've made my peace with the C-ification of Rust as the price to pay for attracting a large crowd of developers who grew up with C, C++, C#, Java, JavaScript, Ruby, Python, the list goes on. It's a reasonable sacrifice to make, but the two basic features I mentioned aren't complex things to wish for.

> could you elaborate? I'm not familiar with this feature (only have dabbled in sml).

Imagine being able to hoist your match clauses into the function head (signature?).

  oldEnoughToDrink :: Int -> Bool
  oldEnoughToDrink 21 = True
  oldEnoughToDrink _  = False
Not all languages with support for this force you to repeat the function name, and there are good arguments for and against. For example, in Erlang, you don't repeat it when defining an anonymous function:

  OldEnough = fun(21) -> true;
                 (_)  -> false
              end,
Now, this may seem like a stupid little feature, but trust me when I say that, like recursive functions, it's a natural feature to use once you're used to it.


Oh, yeah, I see what you mean by function head patterns. I'm aware of the coding pattern from Haskell, just didn't know the name :)

I don't think Rust will get support for that. You can simulate it with macros (and, later, syntax extensions). Of course, that isn't as clean as pure language support. I know why it makes recursion (esp. tail recursion) easier to use, though. You could always bring it up on the forums and try.
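
To give an idea of what such a macro simulation could look like, here is a minimal sketch; the macro name and shape are made up, not from any existing crate:

  macro_rules! fn_match {
      ($name:ident ($in:ty) -> $out:ty { $($pat:pat => $val:expr),+ $(,)? }) => {
          fn $name(x: $in) -> $out {
              match x {
                  $($pat => $val),+
              }
          }
      };
  }

  // Expands to a plain fn whose body is a match over its argument.
  fn_match!(old_enough_to_drink(i32) -> bool {
      21 => true,
      _  => false,
  });

  fn main() {
      assert!(old_enough_to_drink(21));
      assert!(!old_enough_to_drink(18));
  }

The match is still there after expansion; the macro only hides the boilerplate.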


1 and 2 go hand in hand in recursive functions, but 1 is useful on its own.

Say you have a function that should tell you whether a file extension is likely to be that of a text file:

  isTxt("txt") -> true;
  isTxt("org") -> true;
  isTxt(_)     -> false.
With a more comfortable syntax, this can be expressed more concisely, but I just wanted to show that this isn't only useful for recursive functions.

If Rust is planned to get HKT, then I don't see why I can't get pattern matching in function heads, given there are also guards as found in ML languages.


> I just wanted to show that this isn't only useful for recursive functions.

Oh, I know that it's useful; I've used it in Haskell often. And the `fn foo(x) { match x {} }` pattern isn't uncommon in Rust.

> If Rust is planned to get HKT, then I don't see why I can't get pattern matching in function heads, given there are also guards as found in ML languages

The HKT proposal logically extends the existing associated-types syntax so that you effectively have HKT. It's particularly elegant in that it's something folks (who are unaware of what HKT is) expect associated types to support when they learn about them. I've had folks ask countless times why you can't `type Foo<T>` in an associated type.

It also opens up access to patterns that weren't possible in the past.

OTOH, function heads would just be sugar for fn + match. It doesn't open up new possibilities; it just makes some patterns easier to type. And frankly, with blocks being expressions, it's not much easier to type. You have to provide the function signature somehow, and it's there. The match arms are also there in both the Rust and Haskell versions. The only thing exclusive to the Rust version is that you need to explicitly say `match`. Meh.
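
For comparison, the isTxt example from above as fn + match in today's Rust:

  fn is_txt(ext: &str) -> bool {
      match ext {
          // an or-pattern covers both extensions in one arm
          "txt" | "org" => true,
          _ => false,
      }
  }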

Languages have a "complexity budget" -- spend too much of it and folks won't learn your language because it's too complex. Rust has spent a lot of it on the borrow checker. Adding random bits of syntax spends this budget, and you need a compelling reason to do so. This is why I'm quite fond of the current HKT proposal -- I'd always thought HKT in Rust was pie-in-the-sky, but the current proposal ends up with a very unsurprising and natural-looking syntax (and doesn't feel "new"), which makes me think that it has a good chance of succeeding.

If you can come up with a good proposal for function heads, who knows, it might happen! But it will have to be good; I don't think just proposing function heads without tons of justification and/or a syntax that fits in naturally will work.


I should add 3, Erlang's bit syntax, which would be a perfect and natural fit for the domains Rust targets. It's a very natural DSL for parsing or building (complex) bit streams. You wouldn't necessarily have to add bit syntax comprehensions.
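
For those who haven't seen it: a single Erlang binary pattern like <<Version:4, IHL:4, _:8, Len:16, Rest/binary>> destructures a packet header in one line. The Rust equivalent today is manual shifting and masking; a rough sketch with a made-up, IPv4-style layout:

  // Pull version (4 bits), header length (4 bits), and total length
  // (16 bits, big-endian) out of a byte slice by hand.
  fn parse_header(buf: &[u8]) -> Option<(u8, u8, u16)> {
      if buf.len() < 4 {
          return None;
      }
      let version = buf[0] >> 4;  // high nibble
      let ihl = buf[0] & 0x0f;    // low nibble
      let len = u16::from_be_bytes([buf[2], buf[3]]);
      Some((version, ihl, len))
  }

  fn main() {
      assert_eq!(parse_header(&[0x45, 0x00, 0x00, 0x54]), Some((4, 5, 84)));
  }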


Can you elaborate on what #2 means to you?


We are discouraged from writing recursive functions, which come naturally when expressing many algorithms. Missing TCO is one cited reason to avoid them, so I gather we're not supposed to use them, generally speaking.
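
To illustrate (a sketch; the function names are made up): the recursive call below is in tail position, but Rust doesn't guarantee its elimination, so a large enough n can overflow the stack. The loop is the idiomatic rewrite:

  fn sum_to(n: u64, acc: u64) -> u64 {
      if n == 0 {
          acc
      } else {
          // tail call, but not guaranteed to be eliminated
          sum_to(n - 1, acc + n)
      }
  }

  fn sum_to_loop(mut n: u64, mut acc: u64) -> u64 {
      while n > 0 {
          acc += n;
          n -= 1;
      }
      acc
  }

  fn main() {
      assert_eq!(sum_to(1_000, 0), sum_to_loop(1_000, 0));
  }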


Support for guaranteeing TCE is possible and has been proposed, but a few open questions remain, e.g. what to do about local variables with destructors (personally, I'd statically forbid them from being in scope at the end of tail-call-recursive functions).


I know; that's why I wrote "when", knowing it's been considered, but it doesn't appear to be missed by enough developers. I'm just spoiled from having used those two features, and any time someone cites the ML family as part of Rust's influence, it reminds me of these two basic, missing features.

Regarding semantics, without having thought about the Rust semantics too much, I'd suggest checking out the most prominent uses of recursive functions in OCaml, SML, or Haskell, and treating those as the sweet spot to support in Rust.


I don't think it's discouraged (at least, not in the sense of being considered bad); I think they're just being explicit about what guarantees you have (or rather, don't have) for recursive functions.


I don't know if it's the US production influence, but seasons 1 and 2, plus the bonus holiday episode, were raw and felt more real compared to season 3. If I had to explain it, I'd say it's because they were a purely British production. After the bad first episode, I really hoped it would get better, but it never achieved the same positive effect as the previous seasons.

That said, if the content of season 3 manages to push some of the concerns it outlines into the general public's perception and highlight them, then it's a good thing, well achieved.

The thing that stuck with me and was funniest was from the worst episode (ep 1): the adapter and planned obsolescence.

Since San Junipero is mentioned here positively, I should add that it didn't work for me; like ep 1, it seemed just too long for its content, but that's a subjective thing, I guess. I kinda found it uninteresting and boring, and honestly the ideas it brought forward weren't crazy or exciting. Maybe it's because, having lived in the tech world, the concepts feel as old as the availability of electricity; anyone who watched The Matrix or read the manga it's based on would find them "old".

Playtest reminded me of the No End House creepypasta.

My favorite episode, despite the plot flaws, was Hated in the Nation, and that one almost saved the season. In light of InternetOf*, it was the episode I enjoyed the most.


> government was behind the fake coup.

Has this been proven yet? I'm under the impression that we're still at the stage where it's better not to believe any side of the argument, because any secrets, if they exist, will be fresh and well hidden for the moment. If you ask me, I would be very careful about claiming to know who orchestrated what.

