Hacker News | compounding_it's comments

The ideal work/coding resolutions and sizes for macOS that I would suggest if you are going down this rabbit hole.

- 24 inch 1080p
- 24 inch 4k (2x scaling)
- 27 inch 1440p
- 27 inch 5k (2x scaling)
- 32 inch 6k (2x scaling)

Other sizes are going to either look bizarre or you’ll have to deal with fractional scaling.

Given that 4k is common at 27/32 inches and those are cheap displays, these kinds of problems are expected. In the past I personally refused to accept that 27 inch 4k is as bad as people say and got one myself, only to regret buying it. Get the correct size and scaling and your life will be peaceful.
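For what it's worth, the arithmetic behind these pairings is just pixel density: each 2x option roughly doubles the PPI of its low-DPI counterpart, while 27 inch 4k lands awkwardly in between. A quick sketch of that calculation (sizes and resolutions taken from the list above):

```python
import math

def ppi(diag_in, w_px, h_px):
    """Pixels per inch for a given diagonal size and native resolution."""
    return math.hypot(w_px, h_px) / diag_in

# Clean ~2x density jumps between each low-DPI size and its HiDPI pair:
print(round(ppi(24, 1920, 1080)))  # ~92  (24" 1080p)
print(round(ppi(24, 3840, 2160)))  # ~184 (24" 4k, exactly 2x)
print(round(ppi(27, 2560, 1440)))  # ~109 (27" 1440p)
print(round(ppi(27, 5120, 2880)))  # ~218 (27" 5k, exactly 2x)
print(round(ppi(27, 3840, 2160)))  # ~163 (27" 4k - the awkward in-between case)
```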

I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions.


If you actually care about this stuff you are going to run something like https://github.com/waydabber/BetterDisplay, which easily allows for HiDPI @ 4K resolution; it does not "look bizarre" or "require fractional scaling". This is what the OP is about. I do the same thing: I run native res w/ HiDPI on a 27" 4K screen as my only monitor, works great.

Unfortunately BetterDisplay cannot set HiDPI @ 4K on the M5 machines - that was the first thing I tried.

Sure, and that is the real tragedy here. The person I'm replying to is just pointing out that native support for high-res displays sucks, which is true, but the real problem is the limits placed on 3rd-party support.

A 32" 4k display at fractional scaling of 1.5 (150%) is fine for my day-to-day work (Excel, VS Code, Word, web browsing, Teams etc.). It delivers sharp enough text at an effective resolution of 2560x1440 px. There are many 32" 4k displays that are affordable and good enough for office workers. I work in a brightly lit room, so I find that monitor brightness (over 350 nits) is the most important monitor feature for me, over text sharpness, color accuracy, or refresh rate.
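The 150% figure checks out: the effective ("looks like") resolution is just the native pixels divided by the scale factor, which is how 4k at 1.5x lands on 2560x1440. A quick check:

```python
def effective_res(w_px, h_px, scale):
    """Logical (looks-like) resolution at a given fractional scale factor."""
    return round(w_px / scale), round(h_px / scale)

# 4k native at 150% scaling:
print(effective_res(3840, 2160, 1.5))  # (2560, 1440)
```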

So macOS supports only a handful of low-DPI resolutions, and HiDPI must be an integer multiple of one of those?

It doesn't have to be but it's really designed to run at exactly 2x scale.

What makes you say that? Unless I am mistaken, it's only the Pro models that run at 2x by default.

For me, 4k is fine at 16-27", but as you go up to 32" I'd ideally want 5k or 6k, as the difference is quite noticeable for text (even when high-DPI scaling is working, and across operating systems).

Yup, 27" 4k with a Mac is truly awful. Don't do it. Get a 5k display.

If you're running the 4k display at 1440p, I'd agree. But I run two 4k 60hz displays on a 16" MacBook Pro work laptop at 2880x1620 effective resolution and it looks fine to me. Yes, it doesn't look as good as the Studio Display I have on my personal Mac. But even though I have the MacBook Pro screen right next to the 27" monitors, I just don't notice the difference as I switch between them all day long.

I'm not saying there is no difference. But I suspect how one reacts to it is highly dependent on the person. I wear glasses that aren't perfectly focused for either screen, but they're good enough to get the job done - and most importantly, I get to use my two 4k 27" monitors to get the same effective resolution as a Studio Display for far less money than two Studio Displays.


Disagree completely. Works great for me.

27" 4k is totally fine on Windows 11 (not a gamer). Everything is sharp at 150% scale.

Windows does not have this issue.

That's what I'm pointing out. The person I replied to thinks it does: "I have personally refused to accept in the past that 27 inch 4k isn’t as bad as people say and got one myself only to regret buying it. Get the correct size and scaling and your life will be peaceful. I would recommend the same for Linux and Windows too tbh but people who game might be fine with other sizes and resolutions."

I have dual 27" monitors, both at work and at home. At work, they're 4K monitors, because that's all they have in this size for some reason (LG if it makes a difference). At home, my own monitors are ASUS ProArt 1440p monitors. I run Linux in both places.

I really like my 1440p monitors at home more than the 4K monitors at work. At work, I'm always dealing with scaling and font size issues, but at home everything looks perfect. So I think you're onto something here: 1440p just seems to be a better resolution on a 27" panel.


Their suggestion: get the Apple monitor we just launched.

I don't think that one supports this use.

video games

>Do you actually own that /48?

In my experience the ISP generally assigns a fixed /64 to each customer. So if you change ISPs in the future, you might want to keep the host part of your addresses the same while using a script to swap in the new /64 prefix.
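That prefix swap is easy to script; here's a minimal sketch using Python's standard `ipaddress` module (the addresses are documentation examples, not real allocations):

```python
import ipaddress

def renumber(addr: str, new_prefix: str) -> str:
    """Keep the 64-bit interface identifier, swap in a new /64 prefix."""
    old = ipaddress.IPv6Address(addr)
    net = ipaddress.IPv6Network(new_prefix)
    iid = int(old) & ((1 << 64) - 1)  # low 64 bits: the interface ID
    return str(ipaddress.IPv6Address(int(net.network_address) | iid))

# Host ::42 keeps its interface ID under the new ISP's prefix:
print(renumber("2001:db8:1:2::42", "2001:db8:ffff:1::/64"))
# → 2001:db8:ffff:1::42
```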


My ISPs change the /64 more often. So I use the ULA a lot more often. My router runs its own DNS server and then it advertises this DNS server using a ULA address.

I have mentioned this elsewhere, but ISPs should make BYOIPv6 more common, and not just for business customers.

There are people like the OP who do this via a VPS provider that supports BYOIP and then tunnel to the VPS network, so there is demand.

https://news.ycombinator.com/item?id=47355038


I've never heard of an end user ISP that would announce and route a customer owned block of addresses. They'll all give you a static allocation, but it will be in their block. Maybe if you were a huge customer they could do it... but I can't believe they would go to that much trouble for the measly <$100/month they get from me.

Also, I very much don't want all my outbound internet traffic to come from a permanent address range I am publicly known to own. I'd still want an ephemeral /56 for outbound traffic that changed from time to time.


Typically it's similar to IPv4: they try to assign the same address/prefix to the same MAC/DUID. The most common reason to lose your addresses is replacing your router. Hopefully new routers allow you to set the DHCPv6 DUID somehow...

I haven't experienced this. For me it's statically assigned, but my guess is that the PON serial and/or MAC is being used, or the customer ID. I think ISPs have become very automated these days and everything seems to be some sort of SDN. It saves a lot of labour hours in troubleshooting, like customers forgetting the wifi passwords to their routers.

Interesting. Honestly I like having control over it, that would annoy me. I deliberately change the DUID in dhcpcd to force my public addresses to change every so often.
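For the curious, rotating the DUID can be sketched roughly like this. Assumptions to note: dhcpcd commonly stores its DUID in /etc/dhcpcd.duid as colon-separated hex, but the path and restart command vary by distro, so treat this as an illustration rather than a recipe:

```shell
# Build a fresh DUID-UUID (type 00:04) from 16 random bytes,
# formatted as colon-separated hex the way dhcpcd stores it.
new_duid="00:04$(od -An -N16 -tx1 /dev/urandom | tr -d ' \n' | sed 's/../:&/g')"
echo "$new_duid"
# Then, as root (path may differ on your distro):
#   echo "$new_duid" > /etc/dhcpcd.duid
#   systemctl restart dhcpcd
```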

Maybe in 50 years the caches of CPUs and GPUs will be 1TB - enough to run multiple LLMs (a model run entirely in cache for each task). Having robots like in the movies would need LLMs much, much faster than what we see today.

doubtful that we will still have this computer architecture by then

In order to go from 360p video 15 years ago to 4K HDR today, I have upgraded from 2mbps over 802.11g WiFi on a 1366x768 display to a 200mbps connection over 802.11ax and a 55 inch 4k television.

The experience is quite immersive and well worth the upgrade that happened very progressively (WiFi 5 1080p then WiFi 6/7 4K).


At the same time, we had cheap consumer gigabit ethernet, and we still have cheap consumer gigabit ethernet. 2.5GbE is getting there price-wise, but switches are still somewhat rare/expensive.

Emphasis on "somewhat" - I was able to build a 10Gb backbone for my NAS and such for less than $200 or so with a CRS305 and some direct attach cables; looks like the CRS304 would have made this even easier ...

To be fair, I only started this because the dock I got had 10G - https://www.owc.com/solutions/thunderbolt-pro-dock and I saw some 10G cards on eBay cheap and my old Nortel switch had a 10G uplink and ... well, you know how it goes!


As someone who works in networking (consumer, prosumer, enterprise, everything), the problem is far more complex than just "make it open".

Manufacturers can support devices for a long time, but it costs money that consumers and businesses aren't willing to pay for or value. Cybersecurity is a joke, and the general consensus is: we will pay for things as and when there is a fire. We don't put a price on prevention because we can't really show shareholders how we profited from the attacks we blocked. So we create an arbitrary certification and pass things according to it. This certification doesn't say anything about firmware. But if we do get attacked, then we can convince the shareholders to spend money on better equipment this financial year, and then not bother until the next time we have a problem.

Some of these certifications focus on what the devices allow you to do (like ACLs and firewalls) and check that they pass those tests. But actually looking at the firmware and finding vulnerabilities is not in scope.


>Since when were payment networks latency sensitive?

Apple Pay is extremely fast in my experience (at least the web version). There is a high rate of lost sales if payments take too long or fail. I'm sure there's a graph somewhere showing where the returns on speed diminish, but faster payments definitely help with sales.


FAANG employees here are cheap to hire. They work very hard to stay rich or to become rich from nothing (50-60 LPA will basically make you rich in 5-6 years if you save and invest well). The Leetcode grind and competitive problem solving are the bread and butter of Indian childhoods these days. And given how much househelp exists in India, this kind of model is perfectly suited to be outsourced to young and middle-aged Indians who have virtually no life beyond CTC anymore.

I’m just surprised it took them this long to outsource.

The risk, of course, is that people start their own companies learning from big tech, and Indians get more UPI-like tech.


If the Democrats were smart (they are not), they could landslide the next election (and 5 more) by running a simple campaign, "Americans First," the core of which would be to slap a 10,000% tax on any job that is outsourced: your company wants to hire someone from country X? For every dollar paid in salary, you pay $100 to the IRS.


The funny thing is that this same 60 LPA person will happily take a $300k job (assuming parity), even after a 50% tax hit, because the standard of living is still quite high in, say, Atlanta or Pennsylvania compared to Hyderabad or Bangalore.

In the above scenario the federal government is collecting zero taxes for the employees and the shareholders are getting richer.

By cutting H1Bs, America is actually losing money by outsourcing jobs and creating a larger divide between the rich and the poor - something the rich actually don't have a problem with, and something people just seem to miss.


> …something people just seem to miss

this is because "people" have stopped thinking for themselves and are overwhelmed by "social" and all the rest of the "media" pushing whatever narrative the ruling party wants them to hear. they see "oh look, we have a problem with H1B which will be solved by a $100k payment" - boom - "America First /s"

I work on a project where 40+% of the staff is off-shore; surely it is much worse in many other places.


So are you saying that no multinational company can have employees overseas?

This is really a poorly thought out proposal.


there are tens of thousands of national companies - hence the term “out”sourcing.

the “multinational” issue can be solved as well (if anyone cared to solve it)


So exactly how do you solve it? Don't let any multinational companies hire from outside the US?

Are you going to also ban “national” companies from setting up overseas departments?


we can start with the 37,574 national companies, which will take care of tens to hundreds of thousands of jobs, and expand out.

don’t be a “can’t be done” person looking for excuses, be a solution guy - it will serve you well in life. this is an actual real problem that needs a solution - be a part of that solution


And you are going to tell all of them they can't hire cheaper developers but the multinationals can? You do realize that puts them at a competitive disadvantage?

Are you going to also tell them they are not allowed to expand overseas?


you are completely missing the point…


No, you are making completely illogical proposals without thinking about the very obvious knock-on effects or how illogical they really are.


then they'd immediately create an Indian subsidiary, Google SEAsia, and then hire through them

that's what many do anyway

there could be a tax on offshoring in general but good luck getting that passed, or enforcing it


> good luck getting that passed, or enforcing it

not sure I saw this type of comment when we - for the entertainment of the masses more so than anything else - slapped a $100k fee on H1Bs, to be delivered in a manila envelope to Mar-a-Lago.

The "good luck getting that passed" part is where the problem lies: there is no "America First" that anyone actually wants; the entire "America First" slogan is missing the "Americans Last" part.

The "enforcing" it part is actually quite easy :)


The Pythagoras theorem doesn’t change even if you use an LLM. Fundamentals shouldn’t either. Don’t see why schools should see this any differently.


> The Pythagoras theorem doesn’t change even if you use an LLM.

Indeed. But it does change if you want an answer on a non-Euclidean surface, e.g. big-scale things on the surface of the Earth, where questions like "what's a square?" don't get the common-sense answer you may expect them to have.

I bring this up because one of my earlier tests of AI models is how well they deal with this, and it took a few years before I got even one correct answer to my non-Euclidean problem - and even then the model only got it right by importing a Python library into a code interpreter that did that part of the work on its behalf.
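To make the parent's point concrete (this is the classical haversine formula for great-circle distance, not whatever any particular model used): at continental scale, the flat "Pythagorean" shortcut over latitude/longitude degrees is badly wrong, while the spherical formula is not.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere - no Pythagoras involved."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

def naive_km(lat1, lon1, lat2, lon2):
    """Flat-earth 'Pythagorean' estimate over raw degrees - breaks at scale."""
    km_per_deg = 111.32  # only valid along a meridian
    return km_per_deg * math.hypot(lat2 - lat1, lon2 - lon1)

# London to New York: the real distance is roughly 5570 km;
# the naive estimate overshoots by thousands of km.
print(haversine_km(51.5074, -0.1278, 40.7128, -74.0060))
print(naive_km(51.5074, -0.1278, 40.7128, -74.0060))
```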


I agree. That's why universities should never teach any practical real world programming languages. They should stick to Scheme and MMIX.


I don't mind Scheme - love it. But MMIX is one heck of a convoluted, "fantasy-alien" assembly language that I cannot stand. Gave up reading TAOCP because of it. Knuth should have stuck to pseudo-code or plain C.


Not sure if that's sarcasm or not, but when I was in uni (late 90s), it was C++, which was very much a practical real-world language. There was a bit of JavaScript and web stuff, but not much (though JavaScript was only 4 years old when I was a senior, so...).

