It always blows my mind that people don't mention this more. The UK is blessed with some of the most plentiful and reliable wind resources on Earth.
Banning onshore wind turbines was just insanity. Despite the insanity, the UK has made great strides with offshore wind, but offshore wind is expensive and has all sorts of accompanying headaches. Onshore is super cheap and quick to build per unit of power by comparison.
Onshore wind turbines are going to be much more important to the future of UK energy independence than balcony solar.
They used to, but this generation has been hit by at least 4 independent crises that have made them more expensive over time:
1. Moore's law dying means that older nodes are still very useful, so there are still lots of customers bidding for capacity on these eight-or-whatever-year-old foundry nodes who are willing to pay a lot of money, meaning that the base costs of the CPU and GPU haven't fallen as quickly as one would expect, even in the absence of 2-4.
2. The Pandemic and then Russia's invasion of Ukraine screwed with supply chains quite a bit, and caused an inflationary spike.
3. Trump's tariffs affected the profitability of these consoles, and injected a lot of uncertainty, because nobody knows how the tariff situation will evolve over time, or whether Sony will be reimbursed for the illegal tariffs that were levied against them. The uncertainty and general animosity towards the USA have also caused the US Dollar to slump in value relative to a lot of currencies, which then pressures Sony to raise prices.
4. The current RAM, SSD, and GPU shortages caused by LLM hyperscalers are again spiking the cost of their components.
I'm surprised it took them this long. They must have had an extremely large stockpile of PS5s, or just a large willingness to lose margins, when they refused to raise prices in the USA (but raised them outside of the USA!) in response to the tariffs, and then even held steady when RAM prices went crazy.
My guess is that this is because the USA was the last holdout region for the Xbox having any degree of market penetration, so they kept prices there low in order to totally kill the Xbox, and have now decided that it's time to start recovering those lost margins.
Nintendo must be in an especially hard place. They just released their new generation of consoles, but from what I hear holiday sales did "fine". Not great, fine. So they definitely don't want to stunt their entire generation by raising their prices in the first year while they are already worried about hardware sales. Maybe that's part of why they recently reported the price differences between physical and digital games.
Yeah, but the previous US price increase was both smaller and much, much later than the corresponding increase that Microsoft made with their Xboxes. That's what makes me think this is more about strangling Xbox than anything else.
Especially because they were willing to piss off their lucrative European and Asian customers by raising *their* prices in response to US tariffs instead of raising the prices in the USA, effectively making all their worldwide customers subsidize lower costs in the USA.
It'd be nice if they did, but I don't really see how. Training these open-weight local LLMs is still insanely expensive and hard to do, even if it's cheaper and faster than what the big corps are doing.
I don't get the financial motive for someone to keep funding these open-weight model training programs other than just purposefully trying to kill the big AI providers.
Not only are third-party GPUs not supported on Apple Silicon, but Thunderbolt has significantly more latency and lower bandwidth than 'real' PCIe implementations, even ones with similarly cut-down lanes like OCuLink.
Apple tried before to push everything out into external PCIe enclosures and people hated it. Maybe this'll go differently this time; the Mac Studio is certainly a much more compelling offering than the trashcan Mac Pro. But I think this is still a shitty and painful situation for a lot of specific users.
Oh yeah, the market for these capabilities is tiny, no doubt. But at least historically, the people that wanted these things tended to be very big, lucrative customers, and also tended to be very influential word-of-mouth Apple evangelists.
Just as a random personal example, my uncle was an Apple guy since the 80s, and when I was a kid in the early 2000s he always had a Mac Pro, several Macbook pros, and a bunch of other Apple gear. He played a big role in convincing all of his 3 other brothers that Apple was the way to go, and those brothers then raised kids in households with only Apple computers.
This uncle is still a mostly Apple user, but he's increasingly pissed at Apple, and definitely no longer evangelical in recommending it to people. He needs to have a Linux server machine for his home NAS, and another one for some specialized work applications, and it really frustrates him that Apple abandoned his market segment.
I think there's a real possibility that if Apple had pissed him off like this in the early 2000s instead of the 2020s, our wider family might not have ended up so Apple-centric, so Apple may have missed out on a lot more sales than just a couple of Mac Pros and expensive software licenses.
The memory does *not* live on the same chip as the CPU and GPU, you appear to be thinking of HBM. Apple is using regular LPDDR5 RAM on separate chips, but soldered near to the CPU/GPU.
The soldering does serve a purpose though, the shorter traces allow for better signal integrity at higher speeds. This isn't something special about what Apple is doing though, Intel and AMD are doing the exact same thing with the exact same LPDDR5 chips on their respective APUs.
HBM is still almost purely reserved for datacentre GPUs.
Thunderbolt can kinda-sorta mimic PCIe, but it needs to chop up the PCIe signal into smaller packets, transmit them and then put them back together and this introduces a big jump in latency, even when bandwidth can be rather high.
For many applications this isn't a big deal, but for others it causes major problems (gaming being the big one, but really anything that's latency sensitive is going to suffer a lot).
The Framework Desktop is quite cool, but those Ryzen Max CPUs are still a pretty poor competitor to Apple's chips if what you care about is running an LLM. Ryzen Max tops out at 256 GB/s of memory bandwidth, whereas an M4 Max can hit 560 GB/s of bandwidth.
So even if the model fits in the memory buffer on the Ryzen Max, you're still going to hit something like half the tokens/second just because the GPU will be sitting around waiting for data.
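The napkin math behind that can be sketched in a few lines. This is a rough upper bound, not a benchmark, and the 40 GB model size is a hypothetical example:

```python
def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on decode speed for a memory-bandwidth-bound LLM:
    each generated token has to stream the full set of active weights out
    of RAM, so throughput is capped at bandwidth / model size."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical ~40 GB of quantized weights (e.g. a 70B-class model at ~4 bits).
model_gb = 40

print(max_tokens_per_sec(256, model_gb))  # Ryzen Max at 256 GB/s -> 6.4 tok/s
print(max_tokens_per_sec(560, model_gb))  # M4 Max at 560 GB/s   -> 14.0 tok/s
```

The ratio comes out to about 2.2x regardless of model size, which is where the "something like half the tokens/second" estimate comes from. Real-world numbers will be lower on both machines, but the ratio should hold for any model that doesn't fit in cache.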
Personally, I'd rather have the Framework machine, but if running local LLMs is your main goal, the offerings from Apple are very compelling, even when you adjust for the higher price on the Apple machine.
2 hours? That's crazy. I regularly get 6-7 hours of coding done without charging on my Framework 13 running Linux with an AMD 7840 chip in the power-saving setting.
Do you have some sort of background process eating up your battery or something? Maybe cloud sync or something that's too aggressive?
Not really, no. These CPUs are very fast. Giving them a lower power budget is usually not noticeable.
For regular tasks like web browsing and coding I cannot notice any difference. The only time I really notice it is when I try to benchmark a piece of code and the benchmark comes out ~30-40% slower than expected, and then I remember that I'm in power-saving mode. But I just have to hit one button on my dock to switch profiles.
As I said though, the computer is already so snappy and fast that I don't notice the difference between the power saving profile and the high performance profile for anything but benchmarking and gaming.
Even taking the Geekbench numbers at face value (which would be stupid), saying that another computer is 2.26x faster at something whose speed I don't even notice is not interesting to me at all.
More battery life is of course great, but to be honest 6-7 hours away from the wall is more than enough for me.
All else equal, I'd of course love to have a computer with a slicker design, more performance, and more battery life, but some things are just more important to me. I like being able to repair and upgrade my computer, and I like to have first class linux support. Those two things just make a much bigger difference to me personally.
The question though is if those lags you're noticing have anything to do with CPU or GPU performance. My editor and web browser are basically never CPU or GPU bound in their performance.
If you notice a hitch when doing something, that's almost certainly because the computer is fetching data from your SSD that's not hot in the RAM or CPU cache.
Clocking the CPU higher won't help with that at all.
For me it was really just that I constantly felt like Apple was doing everything they could to entrap me in their ecosystem and make it maximally painful to leave.
The breaking point was when I tried out their "Hide my email" feature and I just knew what direction everything was going. At that point I just decided I wanted out, and was more than happy to deal with the idiosyncrasies of Linux and Framework to get away from that.
Linux and Framework have problems, but their problems don't feel malicious and/or negligent the way problems with Apple or Microsoft feel. I'd rather deal with some annoyances but feel that I'm part of a community project to build something pro-social, open, and sustainable rather than closed and focused on entrapment and rent-seeking.
You don't need to enter their ecosystem to use the computers.
I have been working on an MBP for years now and I don't even have an Apple account. I just install my browser and whatever apps I need and then go on with my day.
The most "Apple" feature I use is Time Machine, but it's usable without any account.
Yeah, it was mostly the stuff on iOS that drove me away, macOS can be used as a relatively open and okay laptop OS without their lock-in features, but I also found that those lock-in features were the only things that were really compelling to me about their laptops.
Without their special stuff, I just find macOS to be an okay, but rather opinionated and frustrating OS to use, whereas I find KDE on Linux to be a bit less polished, but much nicer at least for me as a software dev.
I think macOS is nice if you use it exactly the way that Apple wants you to use it, otherwise it's just painful.
> I think macOS is nice if you use it exactly the way that Apple wants you to use it
Do you have an example? Apart from a few small opinionated decisions, I find macOS to mostly get out of my way.
Of course it lacks the customization that Linux offers, and there are a few UX issues with the DE (switching desktop animations, window management, etc), but for a software dev, being UNIX is pretty good and opens lots of opportunities.
Compared to Windows, which is actively hostile towards its users, it's night and day.
I don't have any major things to complain about, but those small opinionated things just build up rather heavily for me over time. There are a lot of little third-party fixes for various things, but often Apple will break those third-party fixes with new macOS releases, and once you upgrade a computer you're not allowed to downgrade, which is legitimately infuriating if it breaks something you rely on.
I think I recall something back in 2019 where the Catalina update also broke my favoured programming language because of some notarization change or other, and the process to approve improperly notarized apps was somehow broken, but I don't remember what exactly it was. That was around the time I switched to Linux, but my memory is fuzzy.
And yes, I agree I'd much rather use macOS than Windows any day. I think I would be fine on a macOS machine other than for gaming, where Linux and Windows are just way, way ahead in terms of compatibility and performance.
But given the choice between macOS and Linux, I just feel more comfortable and more respected on Linux, both in terms of customization and general ideology.
Call me paranoid, but I really do believe that Apple wishes to lock down macOS just as much as iOS is locked down; they just haven't found a way to do it yet that wouldn't cause a massive loss of users.