Hacker News | Xcelerate's comments

Is there not some concept that utilizes cryptography in a way such that information about people is accessible, but if it's accessed, then the access request is added to a ledger (akin to blockchain) such that who made the access, when, and about whom becomes provably public knowledge?
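For what it's worth, the tamper-evident part of this idea doesn't strictly need a blockchain: an append-only log where each entry commits to the hash of the previous one already makes silent edits or deletions detectable. A minimal Python sketch (all names and the entry schema are made up for illustration, not from any real transparency system):

```python
import hashlib
import json
import time

def entry_hash(entry: dict) -> str:
    # Canonical JSON so the hash is stable regardless of key order.
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_access(ledger: list, accessor: str, subject: str) -> None:
    # Each entry records who accessed whose data and when,
    # and chains to the hash of the previous entry.
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"accessor": accessor, "subject": subject,
             "time": time.time(), "prev": prev}
    entry["hash"] = entry_hash({k: v for k, v in entry.items() if k != "hash"})
    ledger.append(entry)

def verify_chain(ledger: list) -> bool:
    # Recompute every hash and check the chain links; any alteration
    # of an earlier entry breaks everything downstream.
    prev = "0" * 64
    for e in ledger:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev or entry_hash(body) != e["hash"]:
            return False
        prev = e["hash"]
    return True

ledger: list = []
append_access(ledger, "agency-42", "alice")
append_access(ledger, "agency-17", "alice")
assert verify_chain(ledger)

ledger[0]["accessor"] = "someone-else"   # tampering is now detectable
assert not verify_chain(ledger)
```

The harder part of the original question (forcing every access to actually go through the ledger, and publishing it) is a policy and key-management problem, not something the hash chain alone solves.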

> they all have an obvious and immediate majesty to them.

"Grandeur" is not the only criteria for nice national parks. I'm from the east coast, and while all of the breathtaking views in California were amazing, after a few years of living there I began to get frustrated that I couldn't find anywhere "cozy" to visit during the weekends. Some locations along the Russian River probably came the closest, but the jagged rocks and coniferous trees still didn't manifest the sort of "warm and snug" feeling one gets while river tubing along a mountain river in the Blue Ridge mountains. Temperature deciduous rainforests are actually quite rare across the planet, and particularly when the leaves change colors, it's a sight to behold.


Interesting perspective, thank you.

Would be interesting to think about what works are currently out there, published, yet will not be recognized as great intellectual achievements until much later, for whatever reason.

I’ve heard that bitterness affects children more intensely. So I wonder how much of it is an acquired taste vs bitterness just becoming “milder” over time.

My three year old loves the taste of matcha. Even when I don't prepare it quite right and it turns out very bitter. He's pretty picky about near everything else. I think it's acquisition through mimicry.

Matcha is one of the more concentrated amino acids drinks you can make; given how hungry I remember being as a kid, I bet it tastes like liquid gold. And if you’re in a climate that tolerates rhododendrons you can plant a camellia sinensis bush for it straight from the vine as a bridge from matcha to steeped tea, steaming and roasting, etc.

> Matcha is one of the more concentrated amino acids drinks

Matcha is virtually entirely water. Multiple sources say that matcha has about 270 mg of amino acids per serving. Even if matcha powder were 100% amino acids (which would taste vile), a 2g serving would still only give you 2g of them.

Milk has about 4.5 grams of amino acid content per 100g (less than half a cup).
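A quick back-of-the-envelope using the rough figures quoted in this thread (treat these as ballpark numbers, not nutrition data):

```python
# Rough numbers from the comments above, not lab measurements.
matcha_amino_g = 0.270          # ~270 mg of amino acids per matcha serving
milk_amino_g_per_100g = 4.5     # ~4.5 g of amino acids per 100 g of milk

# Amino acid content of a ~240 g (one cup) glass of milk:
milk_glass_g = 240 * milk_amino_g_per_100g / 100

# Milk comes out roughly ~40x higher per serving.
print(round(milk_glass_g / matcha_amino_g, 1))
```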


Yes, that’s one of the reasons dairy matcha lattes taste so good: not only is it more densely amino, but it’s also more broadly amino (e.g. milk is not particularly high in L-theanine), and the sweetness of the milk offsets the bitterness of the matcha, which lets you ramp up the density further beyond 2g if you like.

What I mean to say is that matcha is almost devoid of amino acid content. It’s basically a small cup of water. The small amounts of various compounds may have some beneficial effects, but amino acids are abundant in many foods and drinks. You don’t need to get them in micro doses from matcha.

Matcha may be tasty. It’s not a good source of aminos.


Okay.

It depends what I'm working on. If it's a bunch of interdependent systems that involve a large amount of data, a giant monitor is better. If the giant monitor is being used to make visible more application surfaces (Slack, email, VS Code, etc.), it makes focus worse.

The biggest improvement I've found for my focus is to force myself to close any open tabs/windows that are not absolutely necessary roughly every two hours. I used to be one of those people with 800 tabs open in the browser and 20 application windows spread across 8 desktop spaces. Was a concentration mess. Requiring myself to "clean up" periodically has really helped.


I set up my own home network with a Vertiv Liebert Li-ion UPS a few years ago and was thinking about how inefficient the whole process is regarding power. The current goes from AC to DC back to AC back to DC. Straight from the UPS as DC would work much better, and as I was teaching myself more about networking equipment, I was surprised to learn that most of it isn't DC input by default (i.e., each piece of equipment tends to come with built-in AC-DC conversion).

Then I started routing ethernet with PoE throughout my house and observed that other than a few large appliances, the majority of powered devices in a typical home in 2026 could be supplied via PoE DC current as well! Lighting, laptops, small/medium televisions. The current PoE spec allows up to 100 W, which covers like 80% of the powered devices in most homes. I think it would make more sense to have fewer AC outlets around the modern house and many more terminals for PoE instead (maybe with a more robust connector than RJ45). I wonder what sort of energy efficiency improvements this would yield. No more power bricks all over the place either.
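For fun, here's the kind of tally a "covers like 80%" claim rests on, sketched in Python. The wattages are ballpark figures I made up for illustration, and the share obviously depends entirely on which devices you count:

```python
# Max power available at the powered device under 802.3bt Type 4
# (switch ports can source up to ~100 W, but the device side is capped lower).
PD_BUDGET_W = 71.3

# Made-up but plausible wattages for common household loads.
typical_loads_w = {
    "LED ceiling light": 10, "laptop": 65, "40-inch TV": 60,
    "wifi access point": 15, "phone charger": 20, "desk fan": 30,
    "kettle": 2000, "microwave": 1100, "hair dryer": 1500,
    "space heater": 1500,
}

fits = [name for name, w in typical_loads_w.items() if w <= PD_BUDGET_W]
share = len(fits) / len(typical_loads_w)
print(f"{share:.0%} of this (made-up) list fits under PoE: {sorted(fits)}")
```

On this particular list it's the low-power electronics and lighting that fit, while anything resistive-heating-shaped blows the budget by 20x.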


"... throughout my house and observed that other than a few large appliances, the majority of powered devices in a typical home in 2026 could be supplied via PoE DC current as well!"

We installed 120 LED ceiling lights in our home circa 2020, all of which were run with high voltage (romex) and accompanied by 120 little transformer boxes that mount inside the ceiling next to them.

Later ...

We installed outdoor lighting with low voltage, outdoor rated wiring, powered by a 12V transformer[1], and I felt the same way you did: why did we use a mile of romex and install all of those little mini transformers when we could have powered the same lights with 12V and low-voltage wire?

I then learned that the energy draw of running the low-volt transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out energy savings from powering lower voltage fixtures.

You don't have this problem with outdoor lighting because the entire transformer is on a switch leg and is off most of the time.

So ... I like the idea of removing a lot of unnecessary high voltage wire but it's not as simple as "just put all of your lights behind a transformer".

[1] https://residential.vistapro.com/lex-cms/product/262396-es-s...


> I then learned that the energy draw of running the low-volt transformer all the time - especially one large enough to supply an entire house of lighting - would more than cancel out energy savings from powering lower voltage fixtures.

That's not a constraint of physics, you can absolutely build a DC power supply that is efficient in a wide load range. (Worst case it might involve paralleling and switching between multiple PSUs that target different load ranges.) But of course something like that is more expensive...


> But of course something like that is more expensive...

More expensive than an inefficient unit, but it should still be a lot cheaper than 120 separate units, right?

And I expect one big fat unit to do a better job of smoothing out voltage and avoiding flicker than a bunch of single-light units. Especially because the output capacitors are sized for the entire system, but you'll rarely have all the lights on at the same time.

Though for efficiency I'd think you'd want 48v and not 12v.


Plus you save money on the conductors running to the lights.


These days, you should not be using transformers to power small loads at all, you should be using switching power supplies. They have negligible power draw when there's no load attached.


I think we're slowly, slowly coming around to the idea of domestic DC distribution. The vast majority of consumer electronics would be perfectly happy to consume 12v. It's cheaper, safer, more efficient. Less design work and certification on inbuilt AC adapters.

I think it's highly unlikely we'll see mass scale retrofits, but if enough momentum builds up, I can see it as a great bonus feature for new builds.

I got lucky with my house and every room has a dedicated phone line meeting at a distribution panel (a couple of 2x4s with screw terminals) built in the 50s. I'm in the process of converting it to light duty DC power. The wiring is only good for an amp or two, but at 48v that's still significant power transmission.


> I think it's highly unlikely we'll see mass scale retrofits, but if enough momentum builds up, I can see it as a great bonus feature for new builds.

I imagine rooftop solar could also source DC for the house directly (or via a battery), before hitting the inverter... ?

The main problem I see is educating consumers. Maybe that starts with a standard for DC outlets and plugs that can't be confused with AC... ?

(Now I'm imagining desktop computers with much simpler power supplies; but you'd presumably have to wire for dozens of amps incoming...)


48v is what most home battery/solar systems run off. It's also, coincidentally, what PoE uses. IMO it makes a much more sensible candidate as it is still 'safe' while carrying 4x as much power for a given cable gauge. Consider that laptops are 19 or 20v, so you would essentially /need/ a minimum of 24v anyway.


> I set up my own home network with a Vertiv Liebert Li-ion UPS a few years ago and was thinking about how inefficient the whole process is regarding power. The current goes from AC to DC back to AC back to DC.

With double-conversion, generally yes.

I recently ran across the (patented?) concept of a delta conversion/transformer UPS that seems to eliminate/reduce the inefficiencies:

* https://dc.mynetworkinsights.com/what-are-the-different-type...

* a bit technical: https://www.youtube.com/watch?v=nn_ydJemqCk

* Figures 6 to 8 [pdf]: https://www.totalpowersolutions.ie/wp-content/uploads/WP1-Di...

The double-conversion only occurs when there's a 'hiccup' from utility power, otherwise if power is clean the double-conversion is not done at all so the inefficiencies don't kick in.


One of the main problems is conductor size. I wish we could access 22AWG copper in cheap and cheerful cat5e/cat6 format cable. 24AWG cat5e (sometimes CCA, even) is not great for carrying large amounts of PoE.


22AWG Cat6A is actually what I used (cheap it was not however).


> Lighting, laptops, small/medium televisions. The current PoE spec allows up to 100 W, which covers like 80% of the powered devices in most homes.

I find it a little hard to imagine that those devices outnumber things like stoves, dishwashers, washers/dryers, kettles, hair dryers... by 4:1.

Unsure why PoE would be better for LED lighting than the standard approach of screwing a bulb directly into AC, either. How many lumens do you get out of strip lights these days? And you still have AC-DC conversion for whatever's sourcing power onto the Ethernet link.


PoE is also fairly bulky, requires large connectors, and either requires a wholly isolated PD or what's basically a class 2 DC/DC converter. That's why PoE-powered stuff usually has that big transformer cube in it with a lot of clearance, slotted PCB, 2-4 kV capacitors etc.

In practice PoE will have lower efficiency than mains powered, since it'll usually be at least double conversion, often three converters in series, plus the losses of the thin network wires, and the relatively high idle losses / poor low-load efficiency of the necessarily over-dimensioned PSE.
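The cable-loss part of this is easy to put numbers on. A quick sketch using typical worst-case assumptions (52V minimum PSE voltage for 802.3bt Type 4, 12.5 ohm loop resistance per pair over a full 100m Cat5e run; both are ballpark spec-level figures, not measurements):

```python
# Worst-case 100 m Cat5e run delivering 802.3bt Type 4 power.
PSE_VOLTAGE = 52.0        # minimum PSE output voltage for Type 4, volts
PSE_POWER = 90.0          # Type 4 PSE output power, watts
LOOP_RESISTANCE = 12.5    # ohms per pair loop over 100 m of Cat5e

# Type 4 powers over all four pairs: two parallel loops, halving resistance.
r_eff = LOOP_RESISTANCE / 2

i = PSE_POWER / PSE_VOLTAGE          # total current drawn, amps
cable_loss = i ** 2 * r_eff          # I^2 R dissipated in the cable

print(f"{cable_loss:.1f} W lost in the cable "
      f"({cable_loss / PSE_POWER:.0%} of what the switch puts out)")
```

The result comes out around 19W, which lines up with the gap between the 90W a Type 4 PSE sources and the 71.3W guaranteed at the powered device; that loss is before any of the converter stages mentioned above.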


I think Ubiquiti (makers of the UniFi wifi products, as well as some of the most popular managed PoE switches) also make a ton of other PoE products such as the usual stuff like cameras, ip phones, network switches, access card readers, door locks, and, now, ceiling lights (presumably due to the latest PoE standards delivering significant wattage).

It's super nice because you only need to put the UPS/ATS at the PoE switch and then you get power redundancy everywhere you have ethernet running (i.e. the phones don't go down).


> maybe with a more robust connector than RJ45

USB-C could be that connector, using USB-PD instead of PoE. Though I'm not sure I'd want to need that much smarts for every single power outlet.


Considering the number of times I've sheared off a USB-C connector vs ethernet, I wouldn't consider USB-C to be more robust than RJ45. YMMV.


How often do you unplug RJ45s versus USB-C, though?


I'm not talking about unplugging. Normal use.


And in normal use, how often do you handle or cycle an rj45 vs a usb?

Even when it's your job the usb are still handled and cycled way more often. You might handle 100 ethernet jacks today, but it won't be the same one 100 times. You plug it in and don't touch that one again for 5 years.


You mean the ethernet cable in my laptop? Couple of times a day. The box under my desk? Often? How about the hundreds of devices I work with that have RJ45 connectors for serial access? All the time? Are you seriously telling me you hold up a (decently made) ethernet cable next to a USB-C connector and think "yeah...the one a fraction of the size of the other is obviously mechanically stronger". Is this some Apple shill campaign to try and get people to think "no, no...the smaller, thinner, narrower and shallower the connector is the better it will be someplace where people bump it all the time"?


No, but looking at the two side by side, I do think that the one with a fragile clip that shears off every time somebody trips on it is going to be more of a problem than the friction fit one that will just come off.


As opposed to the one that has no clip to hold it in, and shears off the whole connector when you trip over it. And I can use the ethernet cable without the tab.

Some people will embrace any absurdity to pretend to be right.


USB-C devices tend to be mobile. Flexing and disconnecting are much more common.


The problem is that all of those DC devices don't operate on 48V either. The vast majority of chips require a 5V or lower input, so with a 48V DC supply you're still going to need a per-device PSU to do DC-DC conversion. In other words: no getting rid of power bricks.

Efficiency isn't as straightforward either. You're still being fed by 120V/230V AC, so you're going to need some kind of centralized rectifier and down converter. It'll need to be specced for peak use, but in practice it'll usually operate at a fraction of that load - which means it'll have a pretty poor efficiency. A per-device PSU can be designed exactly for the expected load, which means it'll operate at its peak efficiency.

We also don't use 5V DC grids because the wire losses would be horrible, so a domestic DC grid should probably operate at pretty close to regular AC voltage as well. In practice this means the most sensible option would be to have a centralized rectifier and a grid operating at whatever voltage it outputs - but what would be the point?

As to PoE: I personally really like the idea, but I don't believe it'll have a bright future. For its traditional use the main issue is that there doesn't seem to be a future for twisted-pair beyond 10Gbps. 25GBASE-T might exist as a standard on paper, but the hardware never took off due to complete disinterest from the datacenter market, and it is too limited to be of use in offices and homes. I fully expect that 25G will arrive in the home and office as some form of fiber-optic interconnect - with fiber+copper hybrid for things like access points.

On the other hand, for a lot of IoT applications PoE seems to be too complicated and too expensive. It makes sense for things like cameras, but individual lights, or things like smoke sensors are probably better served in office/industrial applications by either a regular AC supply or a local DC one, plus something like KNX, X10, CAN, or Modbus for comms: just being able to be wired as a bus rather than a star topology is already a massive advantage. And for domestic use the whole "has a wire" thing is of course a massive drawback - most consumers strongly prefer using Wifi over running a dedicated wire to every single little doodad.


What if all homes had battery storage and/or solar? You could then simply use its rectifier as needed, or take 48v directly from the solar panels. That would be even more efficient than 230v.


That'd be neat. But there's no standard for voltage for home solar: The batteries might be 12, 24, 48v, 60v, or even much more. Meanwhile, the panel arrays commonly output anything as low as 0V and up to ~600V. There's not much for rules and norms here.

Even if we were to standardize a low (<50V) voltage for DC distribution within homes, we'd still need ~120/240VAC to power big stuff, or we'd instead need even-larger conductors (more copper) than we use today to do the same work with low voltage.

But, sure -- we can play it out. So let's say we have an in-home 48VDC distribution standard and decide that this is the path forward and we enshrine it in law.

We need to convert whatever the solar system has available to 48VDC. Then, we need to distribute that 48VDC using a completely separate network of cabling. Finally, we still need to convert 48VDC to whatever it is that devices can actually use.

That's not representative of a reduction in steps, or an increase in efficiency.

That is instead just an increase in installed infrastructure expense, and a decrease in device compatibility. It takes what we have, which is simply universal (at least within any given geographical area) and adds complexity.

And for what? What's the perceived benefit?


Almost all home batteries are 48v, I think it would be reasonable to standardise on that.


So 48v it is.

Is the juice worth the squeeze, though? Two sets of home wiring voltages? Substantially bigger copper wire inside the walls instead of existing copper, in order to do the same work? Two sets of appliances (of all sizes) on shelves at the store? More adapters?

Billy now needs to bring 2 wall warts to make sure he can charge his portable gear at a friend's house instead of just 1, because he's never sure until he gets there if they've got a 120 or 240v house like they all used to be, a combination house, or if it's one of those solar-only places that only has the weird plugs.

What we have now is 1 cable plant connecting the rooms of a home, and an increasing number of hybrid solar inverters that -- on a sunny day -- cheerfully convert solar power directly from whatever the panels are outputting to the 120/240 VAC wiring that both existing and future appliances know how to use. At night, these hybrid systems do the same thing from whatever voltage the battery uses, converting it to AC. There's only 1 voltage, and only 1 plug; Billy brings 1 wall wart and knows he can charge his stuff.

To be sure: What we have now is not strictly ideal, but then neither is changing things without a clear positive benefit.

Again: What's the qualitative advantage of changing this, other than change for the sake of change?

DC might feel nice and neat, but in reality it doesn't seem to be shaped that way at all to me.


I'm not sure converting the solar and battery to 48V, distributing it around, and converting it to the needed voltage at the point of use is any more efficient than inverting it into 230V and distributing it around.

Also, you'll need wires that are about 5 times thicker. Instead of needing a reasonable 1mm^2 for a normal 16A line, you'll need 5mm^2 for the same power.
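The roughly-5x figure checks out if you size the conductor for a fixed current density (the thermal limit). A quick sketch, assuming a 1.5mm^2 baseline for a 16A circuit (a common European choice; the exact baseline doesn't matter, only the ratio does):

```python
P = 3680.0     # W: what a 230 V / 16 A circuit delivers
A_230 = 1.5    # mm^2: assumed cross-section for a 16 A circuit

i_230 = P / 230
for v in (230, 48, 12):
    i = P / v
    # Fixed current density: required copper area scales with current.
    a = A_230 * i / i_230
    print(f"{v:>3} V: {i:6.1f} A -> {a:5.2f} mm^2 copper")
```

So 48V needs about 230/48 = 4.8x the copper for the same delivered power. If you instead size for a fixed percentage voltage drop, the area scales with (230/48)^2, roughly 23x, which is even worse.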


Solar and battery are already at 48v in most cases, so you are avoiding converting it to AC.

I agree, it's unserious to suggest a cooker or something high power is going to run off of 48v. But for loads like lights, PC/Laptop/TV/Audio 16a at 48v is ~770W which is adequate for these devices.


Both solar and batteries voltages vary wildly and require a converter to use.


We live in societies; switching from AC to DC because your low-power home appliances don't need AC makes no sense. Home power usage is dominated by heating and cooling, not by your 45w laptop charger.

DC infrastructure makes sense in highly specialised environments... like new gigawatt AI farms.


Large home appliances probably mostly need DC power these days too. Look at clothes washers: they all have variable-speed reversing motors, so they're probably using brushless DC motors (which use motor drives that are fed with DC, and output variable-frequency and variable-amplitude sinusoidal waves to drive the motor). HVAC seems to be similar, with variable-speed motors and compressors.

I don't think that much stuff is left which actually needs AC power (usually to run an AC induction motor).


[HVAC] true for the fans and controllers but surely you wouldn't DC feed a compressor.


Aren't compressors these days usually variable-speed?


It's fun to think about. There's advantages both ways, but I think it leans most-heavily towards keeping AC.

1. One of these is simplicity. With AC, one single home run of cabling (eg, Romex) can feed a whole room full of stuff, like a bedroom or a living room. At one end of the run is a circuit breaker (a fairly simple electromechanical device) and at the other end is a series of outlets (which are physically daisy-chained, but are functionally just wired in parallel with each other).

Since one single run of cable can feed many devices, it is easy to accomplish.

2. Another advantage is that it is universal. Anything can plug into these outlets. Whatever a person brings into the home to use, they can plug it into an outlet and it works. It works this same way in every home.

3. And there's quite a lot of power available: A common 20A 120v branch circuit cabled up with 12AWG Romex is stated to supply up to 16A continuously, or 1920W. For intermittent loads, it can supply 20A -- or 2400W. That's tiny by European standards, but it's still quite a lot of power. It's plenty to run a space heater when Grandma visits and she complains about the guest room being cold (even as you start to sweat when you cross the threshold to investigate) and a big TV and a whole world of table lamps, all at once. And you can plug this stuff into any outlets in a room, and it Just Works.

4. But, sure: Lots of devices want DC, not AC. So there's a necessary conversion step that is either integral to the device being plugged in, or in the form of the external wall warts we all know very well.

So let's compare to power-over-ethernet.

1. It's also simple, but only tangentially-so. One home-run cable per outlet, whether that outlet is used or not, is something that can be rationalized as being a simple topology. A PoE switch at the head-end instead of a central box with circuit breakers is a simple-enough thing to transition to. And a lot more individual cables are required, but they're relatively small and are generally easier to install.

2. It's standardized, but it's not universal at all. I've got a few PoE widgets around the house, but I'm pretty friggin' weird when it comes to what I do with electricity. I can't go to Wal-Mart and buy more PoE widgets to use at home, and when people visit they aren't bringing PoE adapters to charge their phones and other electronics. My computer monitor doesn't have a PoE input. I can easily imagine a table lamp or a fan that connects to PoE, and also uses it as a network connection for automation, and that sounds pretty sweet in ways that tickle my automation bones in the most filthy of fashions... but that's getting even further into the weeds compared to how regular people expect to do regular things.

3. There isn't a lot of power available. 802.3bt Type 4 is the highest spec. And within that spec: While switch ports can output up to 100W, a powered device is limited to drawing no more than 71.3W. Now, sure, that's 71.3W per port, but in a room with 10 ports that's still only ~700W -- at most -- in that room. And Grandma's space heater won't run on 71.3W, nor her electric blanket. My laptop wants more than this. The list of useful, portable things that we casually plug into a wall that draw less than 71.3W is pretty short, and most don't benefit from the main advantage of PoE, which is a combination of [some] power alongside high-speed Ethernet data.

4. We still need wall warts since PoE is nominally ~48VDC. For example: Phones use less than 71.3W while charging, but they don't run on 48V. That means 120V AC comes in from the grid, gets shifted to 48VDC for distribution within the dwelling, and then gets shifted yet again to produce the power (5, 9, 15, and 20V are common enough in the USB PD world) that devices actually want. That's more lossy conversion steps, not fewer -- and we still get to keep the extra conversion (wall warts) as punishment for our great ideas. This is not the path towards increased energy efficiency.
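The "more lossy conversion steps" point is just multiplication of per-stage efficiencies. A tiny sketch with assumed stage efficiencies (decent modern switching converters run roughly 85-95%; these exact numbers are illustrative, not measured):

```python
def chain_efficiency(stages):
    # Overall efficiency of converters in series is the product
    # of the per-stage efficiencies.
    eff = 1.0
    for e in stages:
        eff *= e
    return eff

# Today: one wall wart straight from 120 V AC to the device's voltage.
direct = chain_efficiency([0.90])

# Proposed: grid AC -> 48 VDC house distribution -> wall wart at the outlet.
via_48v = chain_efficiency([0.92, 0.90])

print(f"direct: {direct:.0%}, via 48 V distribution: {via_48v:.0%}")
```

With these assumptions the extra stage turns a ~90% path into a ~83% one, before counting the distribution wiring losses or the poor light-load efficiency of a centrally-sized converter.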

---

PoE is great for the things we use it for today. A camera, a wireless access point -- you know, fixed-location stuff that uses networked data as its primary function and also requires power.

Installed PoE light fixtures (like, say, task lights in a kitchen) also sounds neat -- unless they die prematurely and no PoE replacements are to be found. (Now, you have not just one or two problems, but many: The lights aren't working in that space and they can't be replaced with a trip to Lowes because the Romex that would normally have been installed was deliberately deleted from the plan. It could have been a 20-minute DIY fix that costs less than $100, but now it involves drywall and paint and retrofitting new cabling. Or maybe PoE replacements do exist, but it's now 2035 and the new ones don't talk the same network protocols as the old ones did.)

But there are other upsides: I've got an 8-port PoE-powered network switch that works a treat. It's a dandy little thing. And it sure would be neat to plug my streaming box in with PoE and kill two birds with one cable; I would like that very much.

But most people? Most people don't give a damn about ethernet (PoE, or not!) these days, or streaming boxes, and that trend is increasing. They just plug their lamp into the regular outlet on the wall like they always have, and deal with whatever terrible UI is built into their smart TV, and use wifi for anything that needs data.

And when they buy a home that is filled with someone else's smart infrastructure, their first task (more often than not) is to figure out who to call to erase those parts completely and put it back to being normal and boring.


I’ve always wanted kids, ever since I was a kid myself, but I was never really sure what it would be like to be a parent.

Turns out it’s quite strange, because my kids bring me more joy than anything else. I’ll sit there for hours watching them play. You may think “that’s not strange—tons of parents say that”, but for my sort of personality, it’s very strange. I’ve always thought of myself as sort of overly analytical, detached, ambitious, and a bit obsessive. Not the sort of touchy-feely person who chases a two year old around with a smile on my face and likes watching videos of cute babies. Yet here I am. I enjoy it so much I’ve even tried to figure out if there’s a way I can take a sabbatical from work to spend the last two years with my youngest at home before he goes off to school (seems unlikely given how questions about a random two year gap on my resume might affect my long-term career).

It’s funny that as a kid I always wanted to work at a tech company for the interesting tech, but now as an adult my favorite thing about it has been the 4 months of parental leave I was able to have with each newborn.


You just write "spent two years raising my youngest kid [building tree houses and whatnot]". If you keep a bit up with tech, why would anyone think twice about that? They wouldn't where I live.


I've been using ChatGPT to teach myself all sorts of interesting fields of mathematics that I've wanted to learn but never had the time previously. I use the Pro version to pull up as many actual literature references as I can.

I don't use it at all to program despite that being my day job for exactly the reason you mentioned. I know I'll totally forget how to program. During a tight crunch period, I might use it as a quick API reference, but certainly not to generate any code. (Absolutely not saying it's not useful for this purpose—I just know myself well enough to know how this is going to go haha)


How do you get ChatGPT to teach you well? I feel like no matter how dense and detailed I ask it to be, or how much I ask it to elaborate and contextualize topics with their adjacent topics to give me a full holistic understanding, it just sucks at it and always falls short of helping me truly understand and intuit the subject matter.


Yes, this is my experience as well. At some point you would be better off finding something written by a human, because the AI will just take you in circles.


This is an interesting use case, and I want to learn more about your workflow. Do you also use Lean etc. for math proofs?


I’ve always thought the issue was a bit less “Find the interesting research problem” and more “Find the resources, network, or skills that get you into the position of being able to work on the interesting research problem.”

If you asked a bunch of researchers working on the “boring” stuff to predict what the hot papers of the year will be about, do we really think they’ll be that far off base? I’m not talking about groundbreaking or truly novel ideas that seem to come out of nowhere, but rather the high impact research that’s more typical of a field.

Even in big tech companies, it’s quite obvious what the interesting stuff to work on is. But there are limited spots and many more people who want those spots than are available.


Interesting. I don't quite agree. It's one thing to predict what general topics will be hot and popular this year. But that's not the same as what particular research problem will be important and have lasting influence.

There are a few kinds of important research. One is solving a well-defined, well-known problem everyone wants to solve but nobody knows how. Another is proposing a new problem, or a new formulation of it, that people didn't realize was important.

There is also highly-cited research that isn't necessarily important, such as being the next paper to slightly lower a benchmark through some tweaks (you get cited by all the subsequent papers that slightly lower the benchmark even further).


In the book The Cuckoo's Egg, Cliff Stoll talks to, I think, Luis Alvarez. I don't have the book handy here, but Alvarez basically told him not to get distracted by grants, bosses, etc. Here is interesting science to do, so go for it. Just run faster than the rest of the world.

In a way, it was a sidetrack of the book, but for me the attitude speaking from that text was interesting and inspiring. When I could pull it off, it tended to work.


You made me order The Cuckoo's Egg. Luis Alvarez is my scientific hero since I read his memoir last year. Truly underappreciated in the pop-sci community.


“You’re not [X]—you’re [Y]” is the one that drives me nuts. [X] is typically some negative characterization that, without RLHF, the model would likely just state directly. I get enough politics/subtext from humans. I’d rather the LLM just call it straight.

