I think the way to think about all of these Apple products is through the lens of what I believe their long-term strategy to be.
Apple wants to onboard people into the new Apple ecosystem, which is iToys and backend services. They believe this will be better for you, not just for them. They control the iOS-based world in a way they could never control "computers". They're trying to get as many people as they can into their new Disneyland-ish ecosystem, where everything you do is approved and monitored and everything you buy is bought through them.
The low-end Mac seems to only matter now as a source of low-end users, who need carrot and stick to move them off Mac and onto iOS, so low-end Macs won't get better but low-end iOS devices will. I suspect we'll see touchscreen laptops running iOS pretty soon, and that will signal the end of low-end Macs.
The high-end Mac users have a better story, because Apple needs developers for their ecosystem, and those people need good tools. I can't imagine Apple being interested in developing dev tools that run on non-Apple devices, and I think they'll want an absolutely guaranteed source of computers with whatever features are needed to develop the best stuff for their ecosystem, so I think a high-end Mac product line will live on with each new version being a bit more iOS-like. If they want to keep the lockdown on iOS while retaining full power to develop for it, they will probably have to keep the less-locked-down macOS going for dev machines for a long time, but things like the Mac Mini have no place in this picture.
> I suspect we'll see touchscreen laptops running iOS pretty soon
Seems unlikely. From a 2016 interview:
> MacOS is meant to be operated with a mouse and keyboard. Apple says it doesn’t make sense to lean forward to touch your Mac screen. An iOS-powered iPad works best with fingers, which means you can just lean back while using it.
> “We did spend a great deal of time looking at this a number of years ago and came to the conclusion that to make the best personal computer, you can’t try to turn MacOS into an iPhone,” Schiller says. “Conversely, you can’t turn iOS into a Mac…. So each one is best at what they’re meant to be — and we take what makes sense to add from each, but without fundamentally changing them so they’re compromised.”
The split view of 'producer' vs 'consumer' often seems like a good yardstick for this kinda stuff, rather than high/low end.
For situations where you want to be productive on a device: mouse and keyboard, desktop, big screen, and a proper seat are all very important. For situations where someone consumes on a device: touch input, portability, a small screen, headphone connectivity.
That seems to match the philosophical split between macOS and iOS, and maybe future developments can best be predicted by applying that lens.
I've never met people that fall squarely into "producer" or "consumer", everyone you talk to has a creative hobby or three, and might be either a "producer" or a "consumer" at different times of day, or different moods, or as inspiration strikes.
It's a wide, diverse spectrum turned into a strange false dichotomy that leaves us with weird pseudo-categories like "prosumer" (and "conducer"?) middle grounds.
Maybe there is something to enforcing that dichotomy by leaving the two modes to different contexts and different devices: it forces you to switch hats in a physical, visceral sense. Studies do suggest that such a forced context switch can be good for letting the brain know it's time to X, where X is produce or consume (or "prosume"?). Yet that seems a strange reason to force it on users. Is it anything more than an equivalent of "It's good for you, you must eat your vegetables before dessert"?
I naturally flow between consumption and production use cases, so why shouldn't the devices I use switch to my context instead of me switching to theirs?
It's an interesting philosophical split, but I'm curious how useful it is in the real world. Microsoft's marketing reaction of "everyone can be a creator, everyone should be a creator, your devices should support you wherever you land on the creation/consumption spectrum" seems the better approach to me, philosophically. I'm not defined by a "consumer" or "producer" 'gender', and I don't see why I should be.
How does this philosophical split help Apple in the long term?
>why shouldn't the devices I use switch to my context instead of me switching to theirs?
Mostly because, in my opinion, every attempt to create a device that does that has resulted in failure, and those failures are due to fundamental opposing priorities with each 'mode'.
I may of course be wrong, in which case you're right, this framework should be junked. But so far nobody's cracked both use cases well with a single setup, and the model holds. IMO.
>How does this philosophical split help Apple in the long term?
For a brand that's betting so heavily on the walled garden and on ecosystem-based sales and marketing methods, having an 'answer' to every consumer need is essential. To not have an answer for one of your customers' needs opens the door for that customer to adopt something outside your ecosystem. Keep them within the curated suite of products you offer, and keep them happy, or you'll see attrition. Note all of the noise about the latest MBP devices and how they're no longer meeting the needs of many content creators to see how tricky this can be.
As a coder I don't want to sacrifice usability, price, and/or power to get a flexible device like this. I have an iPad specifically for the chill-out-and-consume experience. It has great battery life, lower power consumption, some simple games, and fewer distractions.
I don't need my car to make me toast either (not intended with snark, just an extreme example).
Apple sees (and hopefully continues to) an actual divide even if the edges are a bit blurred. So the question here is whether or not the tablet needs to exist as a category... and I think it does, at least for me.
> Beware, simple frameworks like this might not capture the whole picture.
I would describe it more like a model. And yeah, those never capture the whole picture, and don't need to to be useful.
> heavy marketing of the new iPad as a content creation device for schools and creative people.
You can do creation of stuff on iPads, of course. I use one a lot as a very portable writing machine. But still, I think that marketing was just... marketing. Everyone wants to think of themselves as creators, and to have their kids doing things rather than watching YouTube, but iPads are still mostly used for consumption and are much better at it than at creation, except for a few jobs/hobbies.
A better counter argument, for me, would be to just watch a non-technical person use a Windows PC. Many would be better served by a safer, simpler iOS on a 20-inch screen with an ok keyboard.
We wouldn't, because we want a more-than-just-ok keyboard, and many windows, and bash, and a many-core CPU that is great at virtualization, and...
Infantilizing "non-technical users" just means you avoid creating new "technical users", does it not? If "we" all sit in our high and mighty many-core bastions, and leave the masses to shiny, locked-down "consumption only devices" outside our gates while we look down upon them, we create a nice closed walled garden, certainly. But is that really the sort of class system we need?
I don't want to infantilize users either, but I do want to give them the right tool for the job. Some people are core users and want powerful tools. Some are ring[1] users and want simple tools to quickly and easily do simple things without having to develop a core competency.
[1] Ring User is a Jared Spool term for non-core users--they are in a ring around the core. The Core/Ring split is a more useful model than Expert/Novice, because "expert" conflates expertise in the tool with expertise in the domain area. https://articles.uie.com/multiple_personalities/
It's such bullshit. I've done development work on touchscreen Windows laptops and it's very handy. Just the fact that you can touch the mobile simulators just like you touch real devices is super useful. And for some reason Apple has no problem having people lean in to touch their useless MacBook keyboard touchstrip thing. I don't know what they're smoking at Apple these days.
I can understand how having a touchscreen helps with mobile development, but I have no use for touchscreen functionality in my development. I don't have a desire for my macbook to have a touchscreen at all. Further, I think the touch bar was a huge mistake. I've really gotta side with the idea of letting a desktop/laptop excel at being a computer rather than suffering from an identity problem.
All that said, we all get to vote with our dollars :)
Just because you don't have a use (today) doesn't mean that no one has a use for it, much less that you wouldn't find a use for it tomorrow.
Touch on a laptop screen is a huge convenience, it's not a laptop identity problem. Sometimes tapping or scrolling a thing feels more natural, even on a laptop, even on any other screen that you own. Apple has used that very logic in their own marketing of the iPhone and iPad, that touching things is the more natural feel, that directly interacting with things on the screen is quite natural.
To a child growing up with iPhones and iPads, a screen without touch is "broken".
For adult me, switching between a Windows laptop with touch and a macOS laptop without it takes a bit of mental gymnastics, because I keep wanting to just tap the shiny icons to do things and get briefly confused about why the screen is "broken" when I try to touch them. I don't care how good you think a trackpad is; it's still sometimes faster to reach up and poke something with your pointer finger than to track a pointer on the long journey over to a thing. It's a nice convenience to have touch on the screen and do it directly.
Maybe that convenience doesn't interest you, and that's fine, but it doesn't mean the convenience doesn't exist. (Plus, when you get into differently abled people, you start to find people for whom that convenience matters a lot more to their day-to-day happiness.)
I don’t really disagree with what you’re saying. I realize my viewpoint on this topic was very selfish. But that is why I pointed out that we all get to vote with our money.
The problem, for me at least, is I don't want anyone smearing their greasy fingers on my display. I like seeing things on a clean display so I would never buy a laptop with this "feature".
Exactly. The best way to take Apple is to take what they say as gospel... and ignore it completely.
If you had listened to Apple, the iPhone would forever be small enough to operate one-handed, the iPad would never go above or below 10", etc etc. All of that was true until it happened, and suddenly it was the best thing ever.
If only MS had realised this when releasing Server 2012 - I remember how awful that was because they had tried to do a touch interface on a server.
Things have improved quite a bit, but it's still a mess.
The theory, as explained by a Microsoft employee at the time, was that you would be remoting in with your tablet from wherever you happened to be, and thus needed a touch interface.
I would like to explain to that techie how remote desktop connections work, and what full-screen refreshes do over poor connections... Not to mention the joys of having "kill this production server" buttons right next to "important app" buttons when working through daisy-chained remote sessions onto low-resolution desktops...
MS always had the funny notion you'd need a GUI to manage your server. By now, I think it's mostly pride and an unwillingness to admit a good command line rules that scenario.
And I'm not talking PowerShell here. That thing is more a scripting language than a shell.
Well, they have the iPad Pro with keyboard cover already. Who's to say that they couldn't make that into an actual laptop product, replacing their lower-end macOS devices?
Also, none of your quotes mention the impossibility of making an iOS device with yet another different form factor. They merely say that the two OSes won't be merged. Plus, even if that weren't the case, it wouldn't be the first time that Apple went back on a public statement while rebranding the backtracking as the invention of a whole new computing paradigm.
2016 is a long time ago in tech terms. Given that they've just announced that iOS apps are coming to OSX, I would be very surprised if touch-screen Macbooks aren't on the way.
I've used touch enabled laptops for quite a while, and it's absolutely a good interface for many things I do as a developer. I'm sure I'm not going to use it on my desktop machine because the screens are about 1m away from me.
Change for no good reason (Windows 8?) is worse than little change (APFS, minor updates). My Gnome desktop is basically unchanged for a good couple major releases of Linux and I'm still very happy with it.
Yes, the things I want from an OS, in order of priority, are security fixes, bug fixes, support for new hardware, performance improvements, usability enhancements, and finally new features.
>They control the iOS-based world in a way they could never control "computers". They're trying to get as many people as they can into their new Disneyland-ish ecosystem, where everything you do is approved and monitored and everything you buy is bought through them.
This sounds so dystopic.
I hope that being able to publish software without having to bow to a gatekeeper won't become a thing of the past.
Not just "publishing" software but even writing private software for your own, and your friends' & families', devices is something we can't take for granted. For personal and (private) family use, I gave up on trying to do native apps for iPhone and decided to learn how to do whatever was possible with web apps.
And if iOS Safari (Apple's "gatekeeper" of last resort) falls too far behind, I'll take myself, and the whole gang I serve as "IT installer, fixer, and advisor" for, right out of Apple land.
As Mojave is getting the iOS security model, Windows is merging Win32 into UWP, ChromeOS and Android are turning into one OS, and who knows what they'll do with Fuchsia.
What do you mean by “Mojave is getting iOS security model”. What I’ve seen announced is that, as of 10.14, the system will ask the user when an app first wants access to, e.g., microphone or camera. Do you mean something else?
macOS is getting a new security runtime, the scope of Gatekeeper will be expanded, entitlements will be required for all security-critical OS areas, and application signing and registration will be required even for apps distributed outside the store.
For Mojave it will still be opt-in; later versions depend on how it is received.
They don't refresh the hardware aggressively because they just flat out don't need to. It's not about running down the Mac; if that were their aim, they wouldn't have spent tons of engineering resources on the redesigned super-slim MBPs with the new keyboard design and the brand-new whizzy Touch Bar. Those features may or may not be problematic, or even flat-out mistakes, but they're not a sign of lazy lack of focus.
The thing is the vast majority of Mac customers don't know what a clock speed is, don't know the difference between RAM memory and Flash memory, and don't care, and are buying these things by the truckload. Apple has always been tardy about hardware updates. It looks like right now they really have actually dropped the ball, and at last they recognised that with the Mac Pro. However all the actual evidence points to this is simply a miscalculation, not a conscious choice.
They feel so slow though, do you find that? I use a late-2014 model as a build machine. It's a Core i5 2.6 GHz with 8 GB RAM and a regular HDD, and it takes ~20 mins on average to build a not-very-large Unity game. The last Xcode build took 7m 45s, and exporting the IPA took 9m 17s. I haven't profiled where all that time is going yet. Maybe I should have sprung for the Fusion Drive and i7.
I’m completely convinced that my 12 inch iPad Pro blows away my 2014 Mac mini in every respect. Mac mini has 8GB ram, and an SSD. Still doesn’t feel nearly as snappy as my iPad.
Mine's late 2014 too (2.8 i5 rather than 2.6) but the Fusion Drive makes a massive difference - can't recommend it enough. (That said, I'm not using Unity.)
What about the low-end Mac (e.g. the Mac Mini) as a "dip your toe in" approach to inviting developers into the macOS/iOS ecosystem? That's always what I understood it as before—it was never an HTPC, and regular users always preferred either AIOs like the iMac, or notebooks. The Mac Mini was specifically the "you already have a Windows workstation, now unplug it and plug its peripherals into this to try out developing on a Mac" experience.
I have one - let me see, a mid-2010 - sitting under my desk. It's my media server, primarily, but I use it for other bits and pieces. It's useful to have an always-on Mac in the house, given that my primary machine is now a MacBook Pro.
It got an SSD a few years ago, prolonging its life. It's probably been turned on for 90% of the time since I got it. Current uptime: 26 days.
I use my iPad Pro as a laptop replacement. It works better than I expected, except for a few quirks around keyboard support in apps like Safari and AWS Workspaces, which are irritating enough to make coding near impossible.
But in many ways it's better than a laptop, especially for note taking and reading.
For most content dev, especially media creation, productivity is proportional to screen area.
The iPad Pro is fine for limited tasks, including simple media creation. But there’s no way anyone is going to want to edit a Hollywood movie or record and mix a complex album on one.
> The high-end Mac users have a better story, because Apple needs developers for their ecosystem, and those people need good tools. I can't imagine Apple being interested in developing dev tools that run on non-Apple devices
They could turn macOS into an exclusive dev tool for their ecosystem. Currently it's basically the main dev platform for rival ecosystems like the web, and for the companies in that space that compete with Apple.
A switch of the desktop architecture (they've already announced that this will happen in the coming years) might bring iOS-style App Store enforcement to the desktop as well, and destroy the homebrew and third-party scene.
It's fun, but you can't leave the park. If I want to install a third-party app on my $1100 iPhone without using the App Store, I'm shit out of luck. I think it's unacceptable that if I'm paying that much for a phone, I have to go to a different brand to get that freedom and flexibility. The App Store is great, but the consumer should have a choice to install apps without it, despite the potential dangers. As an adult, if I want to install a Youporn or casino app on my phone, I don't need Apple telling me that's not allowed.
That level of lockdown is a feature. I would like more freedom and control in iOS too sometimes, but I've gotten used to it. I'm still on my first iPhone (6s) and I'm happy with it for what it does. Contrast this with Androids. Every vendor has their own variation of the OS, APIs are inconsistently implemented across devices, and it's a good bet that the phone will stop receiving OS upgrades in 2 years.
I don't think iOS is perfect by any means, but I take Apple's draconian measures along with the long-term stability and higher app quality. It's the best mobile OS on the market for me. I still won't get the iPhone X because of the price and Face ID replacing Touch ID. I'd probably pay to replace the battery and use this phone for another 2 years. It's nice that I can reasonably count on that.
Well, it's not a feature for me. I'm not sure how allowing vendors to install apps onto iOS without the App Store (with a few warning popups) has anything to do with OS fragmentation on Android. Isn't it an interesting coincidence how those 'draconian measures' you mention have the side effect of being really, really good for Apple's business?
I pay Apple $800 for an iPad so there won't be Youporn or casino apps in their app store. Either go to Android (no clue what their free for all store allows) or visit the website.
The closed system is what I want and I'm happy to pay a premium for it.
You agree with what is presently censored, but are you sure that this will always be so?
If you want a few players to control the gateway to most future communications and culture so that your kids will have to work harder to see boobs online, then I question your judgment.
Geekbench is pretty much useless, especially across architectures and OSes. It claims a maybe three-watt iPhone CPU can beat a 40-watt Ryzen 1200, which is plainly ridiculous. Apple makes a good CPU, but you'd have to be crazy to think they have beaten the competition by a factor of ten in power consumption with no compromise. When you see something incredible, be incredulous.
Does anyone know of a decent iPhone benchmark? I'd be very interested in comparing their hardware to other CPUs, but none of the benchmarks I know run on iPhones. And the people who know what they're doing, e.g. Phoronix, don't test much mobile hardware.
> It claims a maybe three-watt iPhone CPU can beat a 40-watt Ryzen 1200, which is plainly ridiculous.
I've seen this claim before. The only way it makes sense to me is that the Apple chip has a much more powerful integrated GPU than the AMD one.
I suspect that the GPUs in CPUs meant for computers are more optimized for things like yield, since anyone looking for performance can just buy an expansion card. Phone SoCs don't have this luxury, so they're probably pushed to the performance limit when it comes to graphics.
To be clear - this is just a guess - but it's the only one I can think of that makes sense.
Many ARM SoCs use the "big.LITTLE"[0] CPU technique, where they can swap from a high-power dual- or quad-core section of the CPU to a low-power quad-core section.
> Inside the iPhone 8 and iPhone 8 Plus, there's a new six-core A11 Bionic chip... There are two performance cores that are 25 percent faster than the A10 Fusion chip in the iPhone 7, and four efficiency cores that are 70 percent faster.[1]
They've added 2 low-power cores, and made the cores faster than the previous version. Normally, only one set of cores could operate at once.
> The A11 features an Apple-designed 64-bit ARMv8-A six-core CPU, with two high-performance cores at 2.39 GHz, called Monsoon, and four energy-efficient cores, called Mistral.[1][6][4] The A11 uses a new second-generation performance controller, which permits the A11 to use all six cores simultaneously,[8] unlike its predecessor the A10.[2]
Here, the 2 high-power and 4 low-power cores can all operate at once, alongside the 3-core GPU - 2 cores of which can assist ML tasks too.
> There's a new Apple-designed 3-core GPU that's 30 percent faster than the previous-generation GPU, and two of the cores, the Neural Engine, make machine learning tasks faster than ever.[1,ibid]
Not sure how much power it consumes in that mode, but it's reported to use half the energy of the A10.
> In terms of graphics performance, the GPU is said to be 30% faster than last year’s, while consuming half the power when working at the same rate as the A10.[3]
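Worth noting how this looks from the software side: an app never picks a core directly. The closest you get on Apple platforms is a quality-of-service hint on dispatched work; the scheduler (backed by the A11's performance controller) then decides placement. A minimal Swift sketch, where the QoS-to-cluster mapping in the comments is an assumption about scheduler behavior, not a documented contract:

    import Dispatch

    // Hedged sketch: apps can't address Monsoon or Mistral cores directly.
    // They attach a quality-of-service class to work, and the scheduler
    // picks the core cluster.

    // Latency-sensitive work: presumably favored onto the fast cores.
    DispatchQueue.global(qos: .userInteractive).async {
        // e.g. preparing the next UI frame
    }

    // Deferrable housekeeping: presumably favored onto the efficiency cores.
    DispatchQueue.global(qos: .background).async {
        // e.g. indexing, prefetching, cleanup
    }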
The transistor count of an A11 is 4.3 billion; the Ryzen has about 4.8 billion, depending on the type. And x86 is a bit wasteful in terms of what it does effectively with those transistors, because of the front-end required to translate x86 instructions.
> And x86 is a bit wasteful in terms of what it does effectively with those transistors, because of the front-end required to translate x86 instructions.
That's a tiny amount in comparison to all the other pieces like cache, ultrawide SIMD units, and bus interfaces; and given that it acts as a sort of "decompressor" interfacing the slower memory containing denser instructions with the faster core and its wide uops, I'd say those transistors are quite well spent.
AArch32 is about as complicated as x86 to decode. About 1200 instructions with dozens of different encodings, and instructions that aren't aligned to the instruction width and can straddle a page and cache-line boundary.
This may be part of the reason why no A11-based devices support AArch32. iOS 11 dropped support for 32-bit apps, and it's highly possible the chip doesn't support the ISA at all.
You claim Geekbench's scores are wrong, but you provide no evidence that A) they are wrong or B) an alternative benchmark contradicts them. So, why again do you know that they are wrong, besides the fact that you find them "incredible" and are therefore "incredulous"?
Geekbench does not perform long-running tasks where you would observe thermal throttling. So a gaming benchmark would work. 3DMark? https://benchmarks.ul.com/3dmark
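Lacking that, a crude sustained-load probe is easy to improvise. Here's a rough Swift sketch, with workload and window sizes that are arbitrary choices of mine rather than from any benchmark suite: it runs fixed chunks of math and prints throughput per 5-second window, so throttling would show up as a decaying count.

    import Foundation

    // Crude sustained-load probe: if the device thermally throttles,
    // the chunks-per-window count decays as the run goes on. Short
    // Geekbench-style bursts never reach that state.
    func burnChunk() -> Double {
        var acc = 0.0
        for i in 1...2_000_000 { acc += sin(Double(i)) }
        return acc
    }

    let start = Date()
    var sink = 0.0
    while Date().timeIntervalSince(start) < 180 {          // run ~3 minutes
        let windowStart = Date()
        var chunks = 0
        while Date().timeIntervalSince(windowStart) < 5 {  // 5-second window
            sink += burnChunk()
            chunks += 1
        }
        print("t=\(Int(Date().timeIntervalSince(start)))s: \(chunks) chunks")
    }
    print(sink)  // keep the optimizer from discarding the work

(Single-threaded for brevity; to actually heat the chip you'd run one of these per core.)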
AnandTech tests with Basemark OS II, which is cross-platform, although they only compare Android and iOS devices and sadly haven't published an article for the iPhone 8 / X yet. Here's the iPhone 7 review they did, though, with comparisons against similar-gen Android devices: https://www.anandtech.com/show/10685/the-iphone-7-and-iphone...
Are Geekbench scores meant to be used for comparing different CPU architectures like that? I thought they only made sense when comparing already-similar processors.
"But the striking thing to me is just how much smaller the Intel NUC is."
Add the volume of the NUC's hideous and ungainly power supply brick. Then make your judgement on size.
If you've taken a Mini apart, you'd see that the easiest way its volume could be significantly reduced is by replacing the space for two 2.5" drives with space for one modern NVMe gum stick, a trick the NUC already does thanks to being developed more recently.
The Mini has Thunderbolt and digital audio in/out. The NUC has neither of these.
If you compare the interiors of the Mini and the NUC, the NUC does not stand out as a marvel of engineering or miniaturization.
Sufficiently-small-form-factor computers become "AIO enablement dongles", in the sense that you can stick one to the back of a monitor or TV (much like a "compute stick") and now that monitor or TV is an AIO PC. People use NUCs in that use-case quite a lot.
Sufficiently-SFF computers are also used as kiosks, for POS systems, and other embedded use-cases. This isn't really the "pitch" behind the Mac Mini, but it's something the NUC is quite good at.
I really don't understand how people can put up with these huge power bricks. Just because you can hide it under a desk it doesn't mean it's not there.
And speaking of all in ones, the iMac doesn't have a power brick either. If you screw your NUC to the back of a monitor/tv you will end up with two power bricks if you're unlucky. They may fit under the desk, or the cable may be too short and you'd have to keep one or both on the desk. I don't see the advantage.
Apple has other, significant problems right now, but the devices not being mini enough isn't one of them.
Every Mac I've ever seen comes with a huge power brick, often integrated directly into the plug.
I can't plug my MacBook in under my desk because the plug is only two-pronged and super heavy, so it falls out.
I don't care if it has a power brick or not, just make it easy to plug in. Bonus: a cable material that is designed to withstand years of use, not months; their cables are no better than the competition's, and they have a crazy failure rate.
I thought this thread was about desktops. You did know that Apple sells some desktops? For example the Mac Mini, about which the original article is complaining for the wrong reasons (I agree the hardware inside is painfully obsolete and no one should buy a mac mini atm).
The Intel NUC is also a desktop (well, non-battery-operated) computer, not a laptop.
Those Mac wall-wart chargers are awful. You essentially have to buy the extension cable, and voila, you have a bulkier version of the power adapter every laptop comes with.
And how in the world do they not have a grounded plug?
Because they're double insulated, so no need for ground. Also with UK plugs (The superior plug type) the apple chargers are pretty great, no way they're falling out the wall then. With the US plug they're pretty annoying though.
But I have to say I do prefer the extension (which I luckily have from previous Macs).
> I don’t expect that Apple would make a box quite this ugly—those two USB ports on the front of the case would be the first to go
That's what's annoying about Apple hardware and great about PCs in a nutshell: the constant need to put facile design looks above actual usability. Why should I have to fiddle around the back every time I want to plug or unplug a usb device?
Indeed, to the contrary I find the Apple aesthetic of making products that look like featureless blobs with an apple logo on them to be bland and ugly, but I suppose it fits with their direction of locked-down devices and fostering ignorance: the less the user can do with it, and the more control Apple has over what the user does, the better.
It's disappointing to see only 4 USB ports on the high-end Mac Pro, when high-end PC motherboards are available with double-digit numbers of USB ports.
You can always add dozens more USB ports with a powered hub. Six Thunderbolt ports, however, is unheard of, and provides great connectivity that you can't add after the fact, even on DIY PCs.
Nobody is defending the Mac Pro. Even Apple has, in a step almost unheard of for them, admitted that the current Mac Pro was a mistake, and promised its replacement will be very different and more extensible.
There's a bit of a bet going on in Apple's mindset. For most people, USB drives are no longer a thing (Dropbox, mail-to-self, etc.), so why put ports in the front? If they're in the back you can set up your keyboard cable or w/e nicely and keep those out of the way.
_If_ your USB usage is not 'often plugging/unplugging stuff into the port', then having the ports in the back is better for usability and the like.
Of course this doesn't work if the bet they make is wrong (see MBP dongles)
Because there are a zillion things other than USB drives that people plug into those ports. Phones (both to charge and occasionally to sync), headphones, 2FA authenticators, cameras, gamepads, printers--the list is long and it isn't getting much shorter over time.
The thing is, people will plug their printer in there because it's the first slot they see; then the cable management will look really bad and ruin the industrial design the PC maker wanted. I'd like to see a Mac Mini/NUC with a Qi charger on top to charge a phone.
Most 2FA USB devices do not support NFC to make them wireless. Even if you do have an NFC enabled 2FA device, Macs don't have NFC. So, wireless is not an option.
I use them frequently for cameras, dongles for stuff like steam controllers or internet etc. Apple still have them on their machines because they know people use them - if nobody used them they wouldn’t be there at all.
They put them at the back because to them how it looks in the Apple Store is more important than their users being irritated every time they use it.
I can't understand why Apple doesn't make a fanless Mini. There's a respectable market for fanless NUCs, and the original Mini was a great machine. Plus, it's a cheap entry level to the Apple ecosystem.
"Cheap" and "entry-level" are two terms Apple is particularly unfond of, however. The reason why the Mac mini is no longer being updated may be because it's simply too cheap.
This is confusing me more and more; not your comment, but Apple's position on this. I used to think the Mac Mini was a bit of a castoff because it was really the only Mac for a while that didn't encourage buying all your peripherals from Apple.
That being said, with the end of AirPorts and Apple-branded displays, it appears first-party accessories aren't as important as I thought. I'm not entirely sure why it's being left to stagnate.
4x the price of some vastly inferior tablets. Don't get me wrong, the margin is still high, but the entry-level iPad is well above the specs and build of the Kindle Fire.
Please. They lowered the price from $499 for iPad Air 2 w/16gb to $329 for iPad w/32gb. The change in 2017 was to bifurcate the line, targeting general market use and schools with the regular iPad, and creative professionals and higher end market with the pro line. The iPad Air 2 didn't fit that strategy, and schools couldn't afford the $499 entry point of iPads and were going with Chromebooks.
Re: screens, students are really rough on equipment, and laminated screens like those in iPad Air 2 and iPad Pro are quite expensive to replace. The iPad has the glass/digitizer separated from the screen which lowers repair costs significantly.
The iPhone SE is primarily aimed at emerging markets, isn’t it? In such markets, a $600 (base price) computer without any peripherals is a comparatively much more difficult sell.
The iPad at $329 is an odd exception to Apple’s typical rules, I’ll admit. There’s a large price divide between it and the iPad Pro without there being a comparatively huge feature divide, making it kind of unique in Apple’s lineup.
I disagree. The strategic product development options a cheap low-energy fanless machine gives are enormous. For example, Apple could market it as a secure privacy-respecting home server against Google cloud services.
Recently I have been contemplating getting an Apple machine at home just for Keynote, Pages, iMovie, Counter-Strike, iMessage/Facetime (and Photoshop, perhaps?), which I don't get on my low-powered Linux laptop. I've been using a Hackintosh-on-KVM kinda solution and it works OK for the most part even with my dedicated GPU, but not ideal and I'd be happy to convert to the light side to enjoy seamless upgrades and stuff.
I don't want to shell out like $1200 for a dumb macbook that I don't use, but I would be happy to shell out $500 for a Keynote machine.
For me, Keynote/Pages are the killer apps that only macOS has. iMessage/FT is the killer app that the Apple ecosystem has. Having a cheap entry to them is a great idea.
Just my 2c, as someone who went Windows - Linux - OS X - but: completely agree on Keynote. It has the best UI of any software I’ve ever used. No kidding.
Pages is good, useable, nice, but not worth buying a Mac for.
Just today, I quickly created a slideshow by writing markdown using Marp and loved the experience. I'm guessing this will become my new workflow for making slides.
I was a LaTeX/Markdown person for years before converting to PowerPoint/Keynote.
If you want your slides to be minimal and clean and you don't have a lot of time, then Markdown works. But over time I've become convinced that having the ability to do animations to explain technical ideas is great. While it's not impossible to do fly-in/fly-out/moving animations in LaTeX, presentation software makes them ten times easier to maintain.
I think that not everything is best explained in code. Sometimes you need pictures/animations to convey the ideas.
Yeah, this illustration is what can be done in PPT in 5 minutes to explain how adding more layers/data can filter noise from signal, and I don't think 5 minutes would be enough for me to write LaTeX code that does the same. That's not counting that PowerPoint allows me to draw lines and stuff easily on a canvas. If I wanted to do it in LaTeX, I'd probably need to draw the diagram in a separate program and import it back.
Sorry, the slide is clipped for some reason, but that's because I don't know how to configure OBS. It's not the best illustration ever, but you probably get the idea.
Here is an animation that isn't mine and I think it's really well done to explain a hard idea (not necessarily in PPT):
I agree. I've given talks on pretty math-heavy ideas but they can still usually be explained visually in a way that is significantly easier to understand without losing fidelity. I love LaTeX and Beamer for pure text/math, but once you want animated slides, or even including videos/gifs, PDF gets flaky fast. Keynote and PPT are pretty powerful tools.
Steven Wittens has written about this and his MathBox^2. I've never used it, and from the examples it looks like it might be too much work for most presentations to be worth it, but the examples are pretty impressive.
Oh, I'm already completely convinced of the expressive power of animations. I just wasn't sure how much is possible with PPT/Keynote. Could 3Blue1Brown videos be made in Keynote? But I can definitely see your point. :-)
... which can be quite an adventure if you get the latest generation & don't do this kind of stuff on a regular basis. You basically have to disassemble the whole machine!
They exist but they are quite limited. Even installing fonts on iOS seems like a hack, so I can't imagine doing advanced stuff such as inserting videos into a presentation being easy on it.
My guess is that the profit margin was too small to justify it, and furthermore they didn't want to cannibalize their own MacBook/iMac/whatever sales, since I assume all of those have a much higher margin for Apple than the Mac Mini does.
I mean, Apple is primed right now to deliver the "phone dock to dumb terminal" dream. Get people to buy the phones, pay for the cloud storage, and sell them a $900 "Smart Monitor" with a Bluetooth keyboard and trackpad.
And legally supports OS X. I'd probably have bought an OS X license for testing/fun/development if I were allowed to run it directly, or in a VM, on any hardware I have access to. Sadly I can't, because I don't have any Apple computers.
Yes, I do think that is imminent; we were still chasing GHz until pretty recently, but now power requirements are dropping enough that we can get, for example, 24h of 4K decoding passively.
> Apple TV 4K is tiny compared to a Mac Mini, but judging by Geekbench scores (Mac Mini; iPad Pro, which uses the A10X in the Apple TV) it’s a slightly faster computer than even the maxed-out Mac Mini configuration. Apple TV 4K probably has better GPU performance too. In addition to all the performance problems stemming from the fact that the Mac Mini hasn’t been updated in three years, it’s also inarguable that it’s no longer even “mini”. You could arrange four Apple TV units in a 2 × 2 square and they’d take up the same volume as one Mac Mini.
Hadn't thought about that. Would be awesome if it would be possible to run arbitrary software on that little TV box, e.g. use it as a real, powerful (for the size), cheap general purpose personal computer.
It's about the size of a 3DS, has an Intel Core m3 Kaby Lake dual-core 2.6 GHz CPU with integrated GPU, 8GB of RAM and a swappable M.2 SSD. Plus a built-in screen, keyboard, game controller, etc. Has an HDMI port, USB and so on, so you can connect it to more comfortable I/O devices when using it from a fixed location.
Runs Windows 10 so you can probably get a Linux prompt on it.
Weighs 1 pound and can run GTA V at playable speeds, ~$700 or the same as a mid-tier Mac Mini. The same manufacturer also produces a few other pocketable-portables.
I think it's important to understand that if a no-name manufacturer can do this, Apple should be able to do it as a rounding error, in their sleep.
I'm still using my 4-core, mid-2011 Mini. Aside from the poorly supported graphics hardware--video playback in Chrome, etc has inexplicably taken up more CPU over the years, I guess because lazy developers increasingly depend on more modern CPU and GPU features--I have no reason to upgrade and nothing enticing to upgrade to. It seems my mini won't be supported for macOS 10.14, and I have no idea what I'll do.
I'm also still hanging onto my MacBook Air, which to me is the epitome of laptop design.
I'm still rocking my mid-2012 Mini, which I upgraded to 16GB RAM and an SSD. Luckily I'm getting 10.14, but otherwise I also see zero reason to upgrade my desktop Mac atm. This machine has been more than powerful enough for most tasks I've thrown at it, and it still runs like a dream.
If Apple were to actually upgrade/update the Mini with new CPUs and a smaller form factor, I wouldn't hesitate to upgrade finally. Those Intel NUCs are starting to get real appealing to me, but I'm not ready to jump to Windows or Linux for my home machines...
I upgraded a 2012 Mini with a 240GB SanDisk [0]. It wasn't too bad, but watch the YouTube videos first. (And, as noted above, the 2014 may be trickier.)
You probably don't have hardware-level support for the VP9 codec; when YouTube switched to that, my laptop at the time would run its fans at 100% just to play a video.
I remember playing DVDs on 20-ish-year-old hardware. Even without a hardware decoder, I can't imagine how a modern CPU is unable to handle simple tasks like video decoding. Maybe it's the higher bitrate...
An HD Netflix stream has about the same bitrate as a DVD, but has drastically better image quality. Modern codecs are extremely efficient in terms of bandwidth, but that comes at the cost of computational complexity. Decoding HEVC or VP9 is far from a simple task; hardware acceleration of video decoding is absolutely crucial for power-constrained devices like smartphones and laptops.
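To put rough numbers on the throughput side alone, here's a back-of-the-envelope Swift sketch; the resolutions and frame rates are nominal assumptions, and this counts only raw pixel rate, not the heavier per-pixel work of modern codecs:

    // Back-of-the-envelope pixel rates (nominal figures, for illustration).
    let dvd   = 720.0 * 480 * 30     // DVD-era MPEG-2: ~10.4 Mpx/s
    let hd60  = 1920.0 * 1080 * 60   // 1080p60 VP9/HEVC: ~124 Mpx/s
    let uhd60 = 3840.0 * 2160 * 60   // 4K60: ~498 Mpx/s
    print(hd60 / dvd)                // ~12x the pixel rate of a DVD
    print(uhd60 / dvd)               // ~48x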
> Modern codecs are extremely efficient in terms of bandwidth, but that comes at the cost of computational complexity
Right, but as with every codec, the cost is always heavily skewed towards encoding the stream.
>Decoding HEVC or VP9 is far from a simple task; hardware acceleration of video decoding is absolutely crucial for power-constrained devices like smartphones and laptops.
I don't believe that decoding video streams should come anywhere close to saturating a modern CPU in 2018. The hardware acceleration you mentioned isn't magic, and doesn't use any crazy tech either. It's going to be some kind of low-powered ARM chip on MediaTek SoCs like the MT8581.
I'm still using a late '09 Mini with 8GB RAM and an SSD. It's still working fine for web app dev work. This has been the best Mac I've ever owned. I would love to see a new Mac Mini come out, and I've been ready to buy one, waiting on them to announce the new models for a while now.
That said, I'm ready to move over to a Linux Desktop now if Apple ditches the product entirely. I don't want to, but I know I'll do fine with it.
I was a Mac user for over 20 years -- finally ditched it a few weeks ago [for Linux] after getting fed up with their daft shenanigans.
The whole App Store / iTunes archipelago, the willful rejection of expandable and [easily] user-serviceable hardware design in favor of a 'form before function' approach, the related planned-obsolescence strategy [e.g., older hardware simply will not run recent software; a new OS breaks older third-party software], and the ghastly overpricing were among the factors that drove me off.
Mainly, I liked MacOS because I prefer putting work into projects as opposed to maintaining my own system, and MacOS "just works"... until it doesn't. These days, I run an idiot-resistant distribution of Linux on my productivity machine and a minimal one on my 'learning' machine, and am generally more content. Also, running Linux, I feel less like I'm directly interfacing with a villain from 'Black Mirror'.
I get it. I've been using Macs for 25+ years and I recently installed a Linux distro with KDE on my Mac in preparation for the ThinkPad I ordered after much deliberation (in lieu of getting one of those crappy "touch bar" MacBook Pros) on which I planned to run Linux while staying on MacOS on the Mac. I am almost certain now that I'm going to stick with Linux across the board on both machines. KDE suits me quite well.
It's been a death by a thousand cuts staying with Apple these past few years. iOS is completely locked down (and shows no signs of relenting) and I only see them moving MacOS in the same direction. Their sub-par, insulting hardware offerings only further cement my view that they no longer give a crap about what the Mac used to be. So, like you, I'm out. It was a fun ride. I've been on-board since System 7.
Totally agree. I didn't actually have a computer when System 7 came out, but I used to hang out at the UMO computer lab with my mom and play with HyperCard. Fun, quality stuff.
Good call on the Thinkpad -- after I got my T61 in the mail a few weeks ago and realized immediately that its ergonomics were vastly superior to my MBPs' which cost a heck of a lot more, I was like "gee, maybe I should have taken a hard look at this issue sooner..."
They crippled the Mac Mini (which was quad-core, 2 drive bays and upgradeable memory) to try and drive people onto the Mac Pro, when the Mini was a perfect machine for many workloads. Then of course they abandoned the Pro. So I wouldn’t hold your breath.
It's been a number of years (the Intel NUC was a new product then) since our office bought its first Mac Mini. At the time, it was the only product that fit the requirements of size, performance and cooling, and even then it was cheaper than many alternatives which weren't as good a fit. Today, there are quite a number of products, including gamer- or workstation-class machines, in this form factor. What was once exotic for its compactness is a commodity. All the while, Apple has kept this product at a near-standstill. Almost certainly they're worried about cannibalizing their iMac line, or even the Mac Pro line. Given the level of innovation that has taken place over the last five years, and the remarkable thinness of laptops which Apple helped pioneer beginning with the Air, why aren't they putting out a Mac Mini the size and thickness of a slice of bread?
Yep. The 13" Air was a really popular machine that people loved. Apple could have simply updated it with a better display, specs and USB C and they would have made a killing. Instead, they decided to make the 12" Macbook and keep the crap display on the Air...
A lot of people are saying that Apple doesn't make a better Mac Mini because they're worried it would cannibalize their iMac/Mac Pro sales.
But if your entry-level machine is providing more value per dollar than your pricier models, then perhaps there's a problem with those models.
Just looking at the iMac lineup in Canada, the prices are insane for what you get. For example, the basic 21.5" iMac, when upgraded to an SSD, costs $1700 CAD.
I still hold the belief that they should have renamed the Air to "MacBook" and not done that ugly MacBook, or released the ugly one as a MacBook Air. Just swap the names; I'd buy the MacBook if it were the MacBook Air model, otherwise I'd rather go for the Pro. The Air is really nice and doesn't get in your way. I wouldn't use it for programming, but for what I use it for it's perfect.
"They crippled the Mac Mini (which was quad-core, 2 drive bays and upgradeable memory) to try and drive people onto the Mac Pro"
Ummmm, no. That's implausible. There's no plausible way to "drive people" from a $700 computer to a $3000 computer. They're entirely different classes of buyer.
> There's no plausible way to "drive people" from a $700 computer to a $3000 computer. They're entirely different classes of buyer.
You’re thinking consumers, not businesses. Businesses and professionals will buy the tool they need to fulfill their use cases. The Mac Mini was an ideal developer workstation. So your choice was: buy the more expensive model, or find some other business to be in.
I've got a 2011-era Mac Mini that I've loved developing on -- upgraded to 8GB of RAM + an SSD, it's been a great Xamarin machine for years. However, now I'm looking for a replacement as Mojave won't support the hardware. With the dated Mac Mini and reports of many keyboard issues on MacBook Pros, I'm looking at a MacBook Air for a new development machine -- and even these are getting a bit long in the tooth. Not really sure what Apple is up to these days.
I'm pretty disappointed that the same event where Apple touted that they were keeping support for the iPhone 5S was the one where they dropped support for pre-2012 Macs.
Especially because my maxed-out 2011 Mini apparently has similar Geekbench results to the current entry-level Mini, which certainly isn't true on iOS.
So maybe what I'm most upset about is that the Mini + Pro haven't had updates in so long, but the software side is behaving like they don't have an ancient product line.
That may be the reason but I would point out that many companies that sell high end systems and OSes (IBM, Sun/Oracle, etc...) at some point require the customer pay for supporting older systems (even then there is an age cutoff). For Apple, supporting older Minis without the customer paying for this support doesn't seem to be in Apple's interest from a business standpoint.
I still really like the iMac. While it doesn’t quite get the attention it deserves, it’s still a legitimately great machine. Overall, it may be the best machine Apple makes right now.
I’m not all that fond of the Magic Keyboard (mine failed after a completely minor spill), but that’s an easy enough fix.
It seems like Apple has given up on desktops entirely. OSX lags unbearably when performing normal workloads. Forget about running Xcode or a CPU intensive workload, your machine will freeze due to excessive memory leaks and swapping. It is an issue at the framework level, and I say that because all software written for OSX has to use a set of common frameworks to interface with the OS itself.
Wait for the new hardware in September. Maybe it would be ok. If not, you can get yourself a late 2015 with all the normal ports, keyboard and performance.
I'd love to, but I need a Mojave-compatible machine prior to its release so we can get our Xamarin application updated beforehand. Might be best to go with the 2015 option anyway, thanks!
I've just finished doing something similar. My Mac Mini was from 2011 and the hard disk had just died. It was an awkward year, in that an SSD can be fitted to replace the mechanical drive, but it would be limited to half speed on the controller.
The replacement is a NUC from last year (NUC7i5BNK), but there did not seem to be any point installing macOS on it -- over the years Apple seem to have crammed in a lot of stuff that I don't like and made it much worse as a small, cheap Unix workstation.
It's running FreeBSD instead, and the difference in size / performance / efficiency is remarkable. It's sad that Apple have let the Mac Mini line die such a slow, protracted death, but unless you are determined to be walled inside Apple's ecosystem there is very little reason to get one.
Doesn't this all feed into the idea that Apple is ditching x86/x64 and going full ARM?
It would make sense that the Apple TV is so much more powerful considering Apple pours so much into custom ARM chip design and OS development. MacOS doesn't work on ARM, and it doesn't seem like Apple is interested in porting it. Seems more likely the new Mini, if there ever is one, would be ARM based.
I would be shocked to an extreme degree if Apple hasn’t been maintaining a version of macOS for their A-series chips for several years now. There’s precedent — they kept the lid sealed on their x86 OS X build for half a decade before the Intel transition was announced.
So while I agree that updates to the Mini are being held back by a pending ARM transition, I don’t think it’ll run iOS. It’ll run macOS for ARM.
Per-core performance has been stagnant in the past decade. Apple already appears to beat Intel's Y-series processors in most things. Scaling the un-core parts up to compete on desktop will take some doing, but scaling up into the relatively restricted 15-28W range is much easier.
Most importantly, Intel charges a premium for ultra-low power chips. If Apple can make a chip that's 80% of the performance for a couple hundred less per unit, it's a great deal. Moving everything to a unified ISA reduces design and developer costs and increases total control (aka vendor lock-in) which also increases profits.
Apple was a co-founder of ARM Holdings (alongside Acorn and VLSI Technology). I don't know how much stock they hold now, but Wikipedia says they held almost 15% in 1999.
Around 2010, there was a huge mystery corporation looking into buying MIPS before it was ripped into pieces (with the actual MIPS design team going to Imagination Technologies). It's worth considering that 64-bit ARMv8 is extremely similar to the MIPS ISA.
Insiders have expressed surprise at Apple's progress. It takes 4-5 years to make and launch a new architecture (and A7 was completely new in every aspect). The starting mark for a third-party design should be from Oct 2011 when the ARMv8 ISA was announced. What's the actual time to a shipping product?
* Apple A7 -- (Sept 2013) 1 year 11 Months
* A53 -- (mid 2014) 2.5 years
* Apple A8 -- (Sept 2014) 3 years
* A57 -- (Q1 2015) 3.5 years
* Apple A9 -- (Sept 2015) 4 years
* Kryo -- (Dec 2015) 4 years 3 months
* A72 -- (Q1 2016) 4.5 years -- basically a fixed A57
How did Apple design such a fast chip in less than 2 years? They did not. Even Intel, with a virtually unlimited budget, many more designers, and in-house fabs, can't make that happen (PA Semi was good, but not that good). Apple had to know about the new ISA when apparently no other companies did. If ARM told only Apple about the upcoming ISA, that could easily run afoul of regulators.
Here's what I speculate happened. ARMv7 was holding back chip design. Apple looks into buying MIPS because it's a great ISA, but changing architectures without backward compatibility is super risky (this was during the app-war period). You don't do risky if you don't have to.
Start designing a chip using a new ISA that's very similar to MIPS (a known good starting point to reduce design time). Approach ARM with the options. You can turn that ISA into a new MIPS ISA pretty easily or you can hand over that ISA for ARM to release with guaranteed backward compatibility. Between the long-time business connections, fear of losing Apple, and the fact that it's a good design, ARM agrees. MIPS is divided up with ARM buying rights to most of their patents (basically eliminating the chances of ARM getting sued over ISA similarities).
ARM starts on their own designs and releases the ISA 6-10 months later. Apple chips launch when expected (and have a multi-year lead). ARM chips launch when expected given a few months insider info. Kryo and other third-party cores launch when expected compared to the ISA release.
If I had Apple's resources, I think I'd be working on an ARM (or maybe even RISC-V) box that would replace the Apple TV and Mac Mini and serve as a new home IoT server. It could serve as a home-based backend to a constellation of Apple devices and services.
I'd agree that the processing power is pretty reasonable; it would be nice for it to handle storage too. Imagine if one were editing a set of videos on an iPad Pro, or doing fast backup/restore of other iDevices, or you wanted security video peripherals, etc.
Yep, but the Apple TV already acts as a "home IoT server" alongside the HomePod (and an iPad can do it too).
HomeKit uses one of those 3 devices to allow external access to your local sensors. For instance, I get notifications when the front door is opened and closed. I can unlock it remotely too. Am slowly adding extra sensors.
From the cousin comment:
> security video peripherals
See HomeCam https://homecam.app
Unfortunately you can't run any old app in the background on any of these closed devices. I have a quad-core Mini that does duty as media library, software update caching, video encoding and photo export... if I could do those on an ATV it would be great.
It seems like with homecam.app there's no storage of the video stream, so one would need to add another box/device for that?
I haven't bought into any home automation hubs yet, it just doesn't seem quite there in terms of being worth it from any of the vendors, and I think if it were, I'd just run a full up server for a central hub.
That doesn’t seem like the best example. I’m not sure what the context was for that video, but it doesn’t look like Steve said that tablets would fail. He said that a tablet made in that era would only be useful to rich people who want an extra device just for content consumption. That’s still most of the market for tablets today.
That's the point. Apple generally doesn't make forward-looking statements about its tech. They say that everything they don't do yet is wrong, and when they get it working they say it's so right. The "right now" part is implied.
He could have rewritten the last sentence from "They just seem to have lost any interest in making one that runs MacOS." to "They just seem to have lost any interest in MacOS.".
I think I have about 20 Mac Minis in various roles around my office. Probably the most used computers/servers in our entire environment. They're starting to show their age though, and one by one they are being retired.
I had a similar experience to the author, but with an earlier Skylake model of the Intel NUCs. The only issue I had was the predictably slow graphical performance due to the lack of a discrete GPU.
Ultimately I chose to keep using an older MacBook Pro as my main driver and installed Linux to run things that were too slow for Vagrant/VMs.
[Edit] Upon reflecting on this article and my experience, I'm going to do a little reading into how well supported AMD APUs are (e.g. the Ryzen 5 2400G).
The thing that annoys me most is that Apple won't say whether this product is dead. And if it's not dead, then I'm even more annoyed at how long it takes them to update it.
It's dead if it's not a product that will be updated.
That's the definition of dead for any line of computer products: sure, you might be able to buy one now, but if there are no more iterations coming, then the product (line) is dead.
As an old-time, MS-DOS-loving Never-Appler, I don't get this. Every few years, if I need a new computer, I look at the options and then buy or build one that meets my needs.
What does it matter if the vendor is going to, or not going to, offer a similar-but-updated one in a few years?
Well, if you're investing time and money into a platform, you don't want to waste it on a dead platform.
I'll use myself as an example. I've written some scientific visualisation software, which uses OpenGL. It uses Qt and works on multiple platforms, including FreeBSD, Linux, MacOS X and Windows. Should I continue to support MacOS and invest in Mac hardware to do so?
Right now, Apple have just deprecated OpenGL; maybe they will drop it entirely in a few years. Their hardware is anæmic, with low end GPUs and not much memory. I'm not going to invest in Metal support given the lacklustre hardware support for high end visualisation. I'm also not sure I'll have that many users on the platform, particularly if it doesn't get some serious improvements. You can order or build an excellent PC and run the software on any other operating system of your choice and have it run circles around the Mac in terms of performance and capabilities. So on balance, I'm retaining support for the moment, but I will almost certainly drop it in the next year or two unless they up their game.
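For a sense of scale on what "investing in Metal support" means, below is a minimal, self-contained Swift sketch of Metal's compute API — device, queue, pipeline, encoder — doing a trivial buffer transform. Every OpenGL code path would need to be rebuilt on this kind of plumbing. The kernel and names here are purely illustrative, not anything from my software.

```swift
import Metal

// A trivial compute kernel, compiled from source at runtime.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("Metal is unavailable on this machine")
}

let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "scale")!)

// Upload a small buffer, dispatch the kernel over it, and wait.
var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Read the results back.
let result = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { result[$0] })  // [2.0, 4.0, 6.0, 8.0]
```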
Sad, really; MacOS X was wonderful a decade back. Today it's an outdated, poorly maintained, buggy mess. And the hardware story is tragic.
That is a different context. As a consumer I don't care about OpenGL. Developers will support whatever newer APIs Apple comes out with, depending on their sales numbers. It's a simple cost/benefit calculation.
As a consumer, from a long-term perspective, I would never choose OSX because it has no history of backwards compatibility. So I would never choose OSX for running my line of business applications.
I don't agree that it's a different context. An end user might not care directly about OpenGL or any other low-level feature, but these things will directly affect the availability and quality of third-party software for the platform, and hence factor into a consumer's decision about which platform to invest in.
Okay, I'll go ahead and agree with you that it can be a factor. My opinion is that it is an insignificant factor for a person in a store buying a computer.
Nitpick: I always hated Windows, I've used Linux for the last 23 years.
But I still don't get it. Suppose in two years' time Apple axes the Mac Mini, but today the Mini meets your needs. In three years' time, you will have to shop around for whatever MacOS-supporting computer meets your needs then, but you'll have to do that regardless of which Mac you buy today.
Now it might matter if you are considering switching OSes. I.e. you fear that in a few years time, MacOS will support no computers that meet your needs. In that case, you might consider making your peace with Linux (or even Windows!) today.
I've tried all operating systems that took my interest.
For my purposes, and by my personal measures, OSX is far and away the very best OS out there. I therefore will buy whatever computer I must to use that OS. Nothing else matters to me.
But realistically, when has Apple guaranteed anything about future products? Even if they released a newer version of the mini, you might hate it because they changed something you like.
As much as I like to think Apple is really fuct and whatnot (I mean, I relish seeing them suck lately), the fact is they are mere inches away from fixing all of the recent complaints people have had with their products. I'm seriously looking forward to this year's MacBook refresh (less so the Mini, but it's still a cool piece of kit).
Context is everything. We're talking about desktop computers. We already know laptop users dwarf desktop users these days. My question implied I was curious about what percentage of desktop users use iMacs. 70%, even for a product that isn't widely used, is interesting.
Mac sales are pretty good relative to the overall market.
iMacs are almost definitely a minority compared to laptops (Apple doesn't break out sales by model), but since Apple is selling somewhere north of 10 million Macs/year, there have to be several million iMacs in active use.
Love my 5k iMac I bought 2 years ago; it replaced a 10-year-old Mac Pro when I just couldn't justify the price tag of a new computer AND display separately, with the 5k iMac starting at just $1800. A friend just bought an iMac Pro. My mom's still going on a 20" iMac from, I think, 10 years ago. I replaced the hard disk with an SSD at some point, but I can't even remember why -- possibly just to give it a performance bump. I know the disk didn't outright fail, but it's possible it was making noise.
Not the GP, no source, but anecdotally, I see lots and lots of design related offices around London littered with iMacs everywhere. Dunno if they like them, but they certainly use them.
I have a 5k iMac. No real complaints with it, although the design is pretty dated with its massive bezels and huge chin. Compute-wise it serves its purpose for me, good performance for an all-in-one.
The low end Mac isn’t about the hardware. It’s about being the cheapest possible entry-level thing required to get the macOS developer experience, i.e. being able to compile iPhone apps, run Selenium over Safari, or whatever.
At the end of the day, they aren’t really selling it for the hardware. Otherwise it’d be like 6mm thick, half the footprint and three times as fast.
Man, in the last couple of years I've been counting myself lucky that I bought a late-2012 Mini with the fastest i7 offered. It appears to have been downhill for the Mini from then on.
Can someone explain why Apple computers seemingly only have a 2-3 year lifespan? I've had the "opportunity" of owning three Macs in my life and none of them have lasted more than a couple years.
Sure, they work. I gave my grandma my old iMac and she uses it to this day. But it's impossibly slow, as is the 2015 iMac that I'm writing this on.
Restarting this $2500 computer takes several minutes, and opening applications takes a similar amount of time. The computer lags when you open ANY save dialog. Even opening Finder takes about a second.
Meanwhile, I also have a five-year-old Windows PC I built myself that restarts in under 15 seconds. As soon as I see my desktop I can launch any application instantly.
Does the iMac possibly have a HDD rather than an SSD (which your custom PC probably has)?
The SSD makes a huge difference in response time. It’s the only reason I’m still using my 5-year-old MacBook Pro, which has a PCIe SSD; I have no plans to upgrade other than to maybe get TB3 support.
I actually do real development on it, running multiple applications, running Xcode builds, the lot. It does everything I throw at it like a champ.
I’m not sure where you’re getting the 2-3 years, but that’s definitely not what you should be getting from a modern PC/Mac, especially one with an SSD.
I bought a 2011 Mac Mini used for $250. It's been great for iOS dev: it ran Xcode, the simulator, VS Code and Safari with 50+ open tabs well. Sadly it started acting up last night during an update and I need to reinstall the OS, but for 7-year-old hardware it's done fine, and I'm no Apple fan.
Probably because you don't have an SSD. It's a shame that entry level models of both the iMac and Mac Mini have hard disks rather than SSDs, because it gives a bad experience.
I have a 2012 MacBook Air (8GB, 256GB SSD) that still works great; I had to replace the battery, but that's all. I run virtual machines on it and edit photos and video with no major speed problems.
My experience is that macOS is similar to how Windows used to be. You have to wipe it out and reinstall it every year or two to get your performance back.
In fact, performance has gotten slightly better for me on my 2013 MacBook Pro, especially after the Metal/APFS upgrade after High Sierra. And I’ve been upgrading since Mavericks now, not doing a clean install.