Hacker News | geerlingguy's comments

It takes a few years, but the Broadcom chips in Pis eventually get mainline support for most peripherals, similar to modern Rockchip SoCs.

The major difference is that Raspberry Pi maintains a parallel fork of the Linux kernel and keeps it up to date with LTS and new releases, even moving Pi OS to newer kernels faster than upstream Debian does.


Also, unlike a lot of other manufacturers who only provide Linux builds for their hardware for a couple of years, even the latest version of the official Raspberry Pi OS supports every Raspberry Pi model, all the way back to the first one, via the 32-bit version of the OS.

Likewise, the 64-bit version of the OS looks like it supports every Raspberry Pi model that has a 64-bit CPU.

https://www.raspberrypi.com/software/operating-systems/


I can confirm that even my first Raspberry Pi from over a decade ago still runs fine with the newest DietPi.

Yeah, I was very impressed to be able to download a Raspberry Pi OS image last year for my original Pi Model B. Most companies would have just told me to throw it in the bin and buy the new one (at 10x the price, lol).

TFA is about an Orange Pi, with a 12-core Arm chip, a bit more than a Raspberry Pi.

They are chasing the same waterfalls though jeff

As opposed to the rivers and the lakes that they’re used to?

The even more confounding factor is that there are specific builds provided by every vendor of these Cix P1 systems: Radxa, Orange Pi, Minisforum, now MetaComputing... it is painful to try to sort it all out, even as someone who knows where to look.

I couldn't imagine recommending any of these boards to people who aren't already SBC tinkerers.


Also about half as efficient, if that matters, and 1.5-2x higher idle power consumption (again, if that matters).

Sometimes easier to acquire, but usually the same price or more expensive.


I can run my N100 NUC at 4W idle, measured at the wall socket. With turbo boost off, it stays in that range under normal load, topping out around 6W at full power, but then it is also terribly slow. With turbo boost enabled, power draw can go to 8-10W at full load.

Not sure how this compares to the Orange Pi in terms of performance per watt, but it's already pretty far into marginal-gains territory for me, given the cost of dealing with ARM, a custom housing, adapters to keep the wall-socket draw efficient, etc. An efficient pico PSU to power a Pi or Orange Pi isn't cheap either.
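To put the idle numbers in this thread in perspective, here's a back-of-envelope sketch of what the difference costs over a year of 24/7 operation. The electricity price is an assumption; adjust it for your region.

```python
# Rough yearly energy cost of an always-on box at a given idle draw.
# The price per kWh below is an assumed figure, not from the thread.

HOURS_PER_YEAR = 24 * 365          # 8760 hours
PRICE_PER_KWH = 0.30               # assumed price per kWh in your currency

def annual_cost(watts):
    """Cost of drawing `watts` continuously for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000.0
    return kwh * PRICE_PER_KWH

for label, w in [("tuned N100 idle", 4), ("typical cheap mini PC idle", 10)]:
    print(f"{label}: {w} W -> {annual_cost(w):.2f}/year")
```

At these assumed rates, the gap between a 4W and a 10W idle is on the order of 15 currency units a year, which is why this quickly becomes marginal-gains territory.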


Which NUC do you have? A lot of the no-name brands on AliExpress draw 10 watts at idle.

I have a minisforum.

Boost enabled, WiFi disabled, no changes to P-states or anything else in the BIOS. Fedora, with all of powertop's suggestions applied. I don't recall changing anything else.


Not the poster you're replying to, but I run an Acer laptop with an N305 CPU as a Plex server. Idle power draw with the lid closed is 4-5W and I keep the battery capped at 80% charge.

The N100/150/200/etc. can be clocked to use less power at idle (and capped for better thermals, especially in smaller or power-constrained devices).

A lot of the cheaper mini PCs seem to let the chip go wild, and don't implement sleep/low power states correctly, which is why the range is so wide. I've seen N100 boards idle at 6W, and others idle at 10-12W.


On the other hand, the RPi doesn't support suspend, so which one wins depends on whether your application is always-on.

Hardware. It's like the Apple model (before they got into services): they sell a full suite of hardware that works great with their software, and they see the software as a way to build goodwill and showcase their tech.

They also sell a paid version, if you want a few extra features.


Their hardware is deeply reliable, affordable, and you can see that they have super solid software chops.

I made the unconventional choice of using a Blackmagic Micro Studio 4K camera for a robotic application, and it turned out not to be a crazy choice: we get our choice of lenses with controllable focus and zoom, there's a REST API for the camera (which can connect to Ethernet), etc. To say nothing of the crisp image. And I can pick one up in 30 minutes at B&H (in NYC).

Industrial vision cameras can cost about the same, but you'll want to rip your hair out before you manage to grab an image (or change the focus; sorry, that's usually not even possible).

Huge, huge fan of Blackmagic. The rock-solid free editing software is just the cherry on top.


Interesting! How is the latency of this camera?

I can check tomorrow to give you a real answer.

We use the SDI output (that cable is sturdy and the BNC locking connector is rock solid) and a Blackmagic 12G SDI to HDMI converter, and then an Elgato HDMI capture card.

Intuitively, I'd say most of the delay is coming from the HDMI capture side (it's a pretty cheap USB dongle).


I measured 40ms, so about 1-2 frames of lag. This was the camera's HDMI output captured with the Elgato dongle (whose performance I had underestimated..!)

> They also sell a paid version, if you want a few extra features.

And the great thing about the paid version is that updates are (so far) free with no subscription bs.

I paid for it once like 10 years ago and still get every new version for free.


And from what I remember, it wasn't too expensive a license; a few hundred?

Couple hundred, or free with the cameras.

Yep, I was at a broadcaster when we bought a whole pack of their SDI capture cards... really the only ones on the market (everyone else wanted to sell you massively expensive enterprise "appliances"), for a very affordable price (I believe they were around $500 apiece for 4 SDI inputs?).

Also, they were the first to sell us USB3-based HDMI capture devices that we could carry around for live full-HD capture from cameras, also at a pretty affordable price (around $1,000?).

Whenever we needed affordable (semi) professional gear, they were consistently the ones to look at.


It's crazy that the RAW photo processing market is so underserved that a video editor can add on photo capabilities and it's immediately in the top 3 photo editors.

I mean, they all process image data, so it had that going for it, but I'm still disappointed Apple gave up on Aperture, then nobody really innovated after that, in terms of library management and workflows.


Darktable does a lot of things that are conceptually similar to what DaVinci Resolve is likely doing here.

One of the big things Darktable has been pushing for a few years is moving from the now deprecated display-referred workflow to a scene-referred one. The key idea is that you keep the image in something closer to the original scene as captured by the camera for as long as possible, instead of rendering it early into output-referred display space such as sRGB. With raw files that matters, because many editing operations behave very differently depending on where in the pipeline they happen.

That is a bit different from how tools like Adobe Lightroom tend to work. The main problem with display-referred workflows is not just reduced precision, but that you can end up clipping information and applying nonlinear transforms too early. Once that happens, later edits are working against damage that has effectively already been baked into the pipeline. So subtle tone mapping tweaks can push colors out of gamut, for example. There are a lot of ways to deal with that obviously and Adobe does a nice job of balancing tradeoffs. But they do remove a lot of choice and control from the process.
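The clipping problem described above can be shown with a toy example. This is not Darktable's actual code, just a minimal sketch of why the order of clipping versus exposure adjustment matters: a display-referred pipeline clamps to the 0..1 display range early, while a scene-referred one keeps linear scene values and clamps only at the end.

```python
# Toy illustration (not Darktable's code): early clipping to display range
# destroys highlight detail that a scene-referred pipeline would preserve.

def clip_to_display(v):
    """Output-referred step: clamp a value to the 0..1 display range."""
    return max(0.0, min(1.0, v))

# A bright highlight captured by the sensor, in linear scene-referred units.
scene_value = 2.5

# Display-referred order: clip first, then pull exposure down 2 stops.
display_first = clip_to_display(scene_value) * 0.25   # 1.0 * 0.25 = 0.25

# Scene-referred order: adjust exposure in scene space, clip only at the end.
scene_first = clip_to_display(scene_value * 0.25)     # 2.5 * 0.25 = 0.625

print(display_first, scene_first)
```

The highlight detail above 1.0 survives the exposure pull in the scene-referred order, but in the display-referred order it was already flattened to the clip point, and no later edit can recover it.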

The UX tradeoff in Darktable is that module order matters a lot and there are a lot of different modules that do similar things in different ways. You can adjust modules in any order you like, but the processing order itself is usually best left alone. That is a leaky abstraction: it is hard to explain why the order matters unless you already understand what the pipeline is doing. And of course Darktable now allows reordering because there are sometimes valid reasons to do that. But that also means users can easily make things worse if they start changing the order without understanding the consequences.

But for simple editing, Darktable is actually really nice these days. I have some auto-applied modules with rules for camera type and a few other things, and mostly it looks alright without me doing much. One of its strong points is rule-based application of particular edits based on camera or lens. With my Fuji, for example, it needs a little exposure correction because the camera intentionally underexposes to protect highlights.


I am a color science and imaging expert and couldn't make heads or tails of the Darktable UI. I wanted to like it, but it is just so horrible to use that I couldn't stick with it.

Thanks for explaining this!

Only one mention of Aperture; I suppose I can be the second one to lament the loss. Lightroom never grew on me, and I still miss Aperture's UI and workflow.

Might give this a try. I just keep holding back because I do not want to lose all my thousands upon thousands of edits.


Just this morning I was missing Aperture and mentioned it to my kids. I was organizing some pictures with Photos.app, and adding a tag or removing a photo from an album takes about 5 seconds for the UI to reflect the change. I have a decently big library, but certainly not an anomalous one. Aperture made it so fast to organize and deal with photos.

That's funny. Before it was a video editor, it was a color correction suite for RAW images.

There are quite a lot of companies competing for the raw image editing market currently. It’s sad that none of the open source options are particularly good.

I sometimes wonder if the random people sitting there hawking a pile of Amazon goods that pops up after every Amazon purchase are already AI.

This is the first I've actually heard of the name change... I used to use Varnish quite a bit, and had a decent grasp of VCL, for Drupal deployments. But I think Varnish 6 or 7 was when I started dropping off managing the caching layer as almost every project chose to offload caching to Cloudflare.

I mean I can just replace Dropbox with a shell script.

That's funny, because you could! Dropbox started as a shell script :)

Funny, though; I would assume HN people would respect how hard real-time and "hardened" software is.


I think GP is referencing this somewhat [in]famous post/comment: https://news.ycombinator.com/item?id=8863#9224

The HN audience has shifted; there are fewer technically minded people and more hustlers and farmers from other social media waste spaces. But alas.

"No wireless. Less space than a Nomad. Lame."

No, wait, that was that other site.


Worth noting: the source of this claim is anonymous, and so far the framing of the statement feels a bit more radical than what was perhaps actually said in the meeting.

Other news publications are trying to get the full story: https://x.com/jdflynn/status/2042076430406672829?s=46&t=u6IW...

I wouldn't put anything past the current admin, but I don't know what the US could stand to gain from directly antagonizing the Vatican.


I don't think the current admin knows what the US stands to gain from antagonizing the EU, China, Canada, Mexico, Japan, Catholics, atheists, Muslims, and so on (the list goes on), either.

Just adding this from a more trustworthy news source: https://www.pillarcatholic.com/p/nuncios-pentagon-meeting-wa...

While it sounds like the discussion was "tense" according to Vatican officials, it was blatantly mischaracterized in TFA, and nobody on either side recalls the Avignon papacy being mentioned or referred to.

