Apple has discontinued the Mac Pro – but it’s just the first of the tower computers to go. The rest will follow soon.
Fruit-sniffers extraordinaire 9to5Mac got the news yesterday, complete with official confirmation from Apple itself. It’s happened, but there had been warning signs for months – in November 2025, Bloomberg’s Mark Gurman said “The Mac Pro is on the back burner.”
The phantom fruit-flingers of Silicon Valley launched the seven-thousand-buck Apple Silicon-based Mac Pro in June 2023, with an M2 Ultra SoC. It sported seven PCIe slots – but cash-rich customers couldn’t add the sorts of expansion that normally go into a PCIe slot, so much so that Apple publishes a page about which PCIe cards you can install in your Mac Pro (2023). Notably, the machine did not support add-on GPUs: only the GPU that’s integrated into the CPU complex along with the machine’s RAM and primary flash storage. The machine also had no RAM expansion whatsoever.
Presumably, this limited its appeal for many traditional buyers, and the machine never saw an M3 or M4 model, let alone the M5 SoC that The Register covered shortly before Bloomberg called the Arm64 cheesegrater’s fate.
I’m not understanding the logic here. Apple killed their last tower. That isn’t surprising, and their user base is perfectly happy buying nothing but SoCs.
Then there is the still-expanding PC gaming market, where building the machine from discrete parts is part of the hobby. By and large, that has never really overlapped with Apple’s user base.
The article does a poor job of explaining why we should expect non-Apple machines to go in the same direction.
It seems to be implying that SoCs will soon become so fast and efficient that they will always outperform custom builds in every category including price.
There’s an argument to be made there (not saying I agree).
There are some memory latency benefits to putting memory on a single chip, but to date, that’s largely been handled by adding cache memory to the CPU, and later adding multiple tiers of it, rather than eliminating discrete memory.
The first personal computer I used had 4kB of main memory.
My current desktop has a CPU with 1MB of L1 cache, 16MB of L2 cache, 128MB of L3 cache, and then the system as a whole has 128GB of discrete main memory.
Most of the time, the cache just does the right thing, and for software that is highly performance-sensitive, one might use a tool like Valgrind’s cachegrind to profile the critical bits and minimize cache misses.
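As a minimal sketch of the sort of thing cachegrind will flag – plain C, invented sizes, nothing from the article – the two loops below read the same 64 MB but differ wildly in cache behaviour:

```c
#include <stdio.h>
#include <stdlib.h>

#define N 4096

/* Row-major traversal: walks memory sequentially, so each cache
 * line fetched is fully used before moving on. */
static long sum_rows(const int *m) {
    long s = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i * N + j];
    return s;
}

/* Column-major traversal: strides N * sizeof(int) bytes per access,
 * so nearly every access pulls in a fresh cache line. */
static long sum_cols(const int *m) {
    long s = 0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i * N + j];
    return s;
}

int main(void) {
    int *m = calloc((size_t)N * N, sizeof *m);
    if (!m) return 1;
    printf("%ld %ld\n", sum_rows(m), sum_cols(m));
    free(m);
    return 0;
}
```

Run it under `valgrind --tool=cachegrind ./a.out` and the D1/LL miss counts for `sum_cols` dwarf those for `sum_rows`; fixing that kind of access pattern is exactly the optimization the profiling buys you.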
I could believe that, say, one could provide on-core memory that the OS is more aware of – let it have more control over the tiered storage, maybe restructure the present system. But I’m more dubious that we’ll say “there’s no reason to have a tier of expandable, volatile storage off-CPU at all on desktops”.
EDIT: That argument is mostly a technical one, but here’s another, from a business standpoint. I expect PC builders have a pretty substantial business reason not to move to SoCs. Right now, PC builders can, to some degree, use price discrimination to convert consumer surplus to producer surplus. A consumer will typically pay disproportionately more for a computer with more memory, for example, when they purchase from a given vendor. If the system is instead sized at the CPU vendor, then the CPU vendor is going to do the same thing, probably more effectively, as there’s less competition in the CPU market – and it’ll be the PC builder seeing money head over to the CPU vendor, since they’ll pay a premium for high-end SoCs.
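To make that concrete, here’s a toy sketch of the surplus-capture arithmetic – every number below is invented, purely to illustrate where the margin lands:

```c
#include <stdio.h>

/* Hypothetical numbers only: who captures the memory-upsell margin
 * in a modular build vs. an SoC-sized one. */
int main(void) {
    /* Modular desktop: the PC builder buys the DIMM and sets the upcharge. */
    double dimm_cost     = 40.0;   /* builder's cost for an extra 16 GB */
    double dimm_upcharge = 200.0;  /* what the builder charges the customer */

    /* SoC world: the CPU vendor sizes the memory and prices the SKU. */
    double soc_cost_delta  = 40.0;  /* vendor's cost for the bigger-memory SKU */
    double soc_price_delta = 180.0; /* premium the vendor charges the builder */
    double builder_markup  = 20.0;  /* what's left for the builder to add on */

    printf("Modular: builder keeps $%.0f of the upsell\n",
           dimm_upcharge - dimm_cost);
    printf("SoC:     builder keeps $%.0f, CPU vendor keeps $%.0f\n",
           builder_markup, soc_price_delta - soc_cost_delta);
    return 0;
}
```

Same $160 of consumer surplus gets converted either way; the question is just whether the PC builder or the CPU vendor pockets it.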
In Apple’s case, that’s not a factor, because Apple has vertically-integrated production. They make their own CPUs. Apple’s PC-builder guys aren’t worried about Apple’s CPU guys extracting money from them. But Dell or HP or suchlike don’t manufacture their own CPUs, and thus have a business incentive to maintain a modular system. The exception would be if the PC market as a whole transitions to a small number of vertically-integrated businesses that look like Apple – one or two giant PC makers who basically own their supply chain – but I haven’t heard of anything like that happening.
My parents bought an Acer Pentium 55 (yeah, the one with the floating point issues) after having the 8088 and 386 custom built. It was such a shitshow that when I headed to college, we considered a DEC Alpha … in the end, I got a P-II 266. 64MB of RAM and the worst reliability I’ve ever seen in a hard drive. My roommate had a K6-2 233 with 32MB of RAM. His computer never crashed. For obvious reasons, I built a K6-2 300 system, and I’d not return to Intel for a decade.
They already are. Increased speed and efficiency are solid reasons. The Mac Pro was absolutely enormous in comparison to the new Mac Studio, which blows it away in terms of performance while being a lot cheaper. Strix Halo is a great example of similar benefits on the PC front.
The vast majority of PCs aren’t sold to hobbyists. Gamers mostly benefit from the existence of other markets that these chips can be sold to. If those go, the chips get taken off the market.
This is a bad article. It’s just an Apple fanboy watching their company continue its trend of shitting on customers and assuming that everyone inevitably will, apparently never once reflecting on whether their insistence on sticking with Apple is the real problem.
Their argument boils down to CPUs increasingly integrating basic versions of other components over time, and therefore desktops will disappear… ignoring that the desktop market has stayed surprisingly flat that entire time and has certainly not disappeared.
If your argument is that integrated CPUs will outclass discrete components connected over high-speed buses, then you need to make it from an engineering standpoint, not a headline one.
I also don’t understand his reasoning that because Nvidia didn’t buy Arm they don’t get to make an integrated CPU… Nvidia made and sold an integrated Arm CPU before ever being rumoured to buy them, and they still make and sell it to this day… because Arm’s entire business model is based on companies like Nvidia licensing their designs.
It’s an opinion piece. I don’t agree with all of it, either.
This said, do you really miss having a northbridge and southbridge?
It’s just an Apple fanboy
checks article history
Almost all of their articles are about Linux.
Hey now, let’s let this user craft their own reality!
Just because Apple failed to make expandable hardware doesn’t mean it won’t still work for PCs.
The interesting thing is that the people who will care most about this are professional users, who actually did require a machine with real expandability to stuff full of the likes of SDI video I/O cards (e.g. https://www.aja.com/products/kona-5).
If you ask those people, they’ll undoubtedly gladly tell you how much it sucked dealing with Thunderbolt-to-PCIe expansion cages during the “Trashcan” era in order to use their machine for their work.
While Thunderbolt’s throughput has certainly improved a bunch since then (80Gbps symmetrical or 120/40Gbps asymmetrical for TB5, vs 20Gbps for TB2 back in that era), latency and stability still frankly leave a lot to be desired versus a real PCIe slot.
For people who already perceive Apple devices as overpriced toy computers, their further alienating what was at one point their primary target audience – high-end professional users – will certainly seem like an odd choice.
The writing on the wall is large and clear. You can still have high-end kit, but you don’t get to put it together from discrete bits. The fastest parts – the CPU, GPU, volatile and non-volatile storage – all get assembled as a single, highly integrated, non-upgradable component.
Honestly I’m shocked desktop PCs have lasted this long.
That being said, PC gaming is a growing trend, not shrinking, so I suspect there will continue to be at least some availability in the future for those components?
Additionally, while Macs are really great at some workloads, they’re still inferior in others to existing desktop machines with dedicated GPUs, and the closest competitor from Apple will still cost at least twice as much.
PC gaming is a growing trend, not shrinking
Wait until we see the 2026 stats for hardware sales. 📉
Though I think the supply issues will hurt consoles just as much.
I’m not sold that modular desktops are going away in general. SoCs have some benefits in terms of power usage, but those are most substantial on phones and least substantial on the desktop.
My understanding is that memory may move away from DIMMs to CAMM2 to permit higher speeds, but that’s still a modular system.
Yes. Apple can do these things because their SoC is their market differentiator. It’s not an overall market direction.
CAMM has been around for years now, but I’ve never seen a single model using it. Even Framework passed on it with their new desktop.
You don’t need to as long as you’re getting sufficient speeds from non-soldered DIMMs, and desktops are generally still using non-soldered DIMMs.
Desktop PCs are so much more powerful and faster than laptops of the same spec. Not to mention cheaper.
High integration on laptops decreases space and power draw, wildly increasing battery life for the same battery.
This isn’t about laptop/desktop but about modular vs. integrated processors.
Integrated processors let laptops be faster without also using more power. Strictly speaking, it’d be cheaper to just use a faster CPU, but battery life is more important than cost, so lots of money is spent on integrating processors.
Desktops are still around because they’re upgradable and faster than their laptop brothers.
…once again, not talking about laptops.
An AIO is effectively a laptop without a keyboard. They’re functionally very similar (appealing to less power-hungry users). They’re just less mobile.
Presumably it’s cheaper for Apple to just put the integrated CPUs in everything, because it’d be expensive to make another model.
I guarantee you this trade-off only makes sense for Apple. Other AIOs don’t always have the new laptop chips from Intel, because it makes more sense to use the desktop ones with all the space they have.
They put them in everything because they’re smaller and more efficient (and thus quieter) and because they’re competitive with PC desktops in performance. And economies of scale don’t hurt either.
I get what you mean. What I’m trying to say is that desktop/non-integrated CPUs are cheaper, and this cost saving carries into a large form factor. Apple doesn’t put a desktop chip in their iMacs because they don’t make one; that’s not what their customer base needs. If they did, it’d be 4x faster for the same price.
And these Arm chips are slower than x86. x86 is so much faster, at least in single-core performance, which matters a LOT more for desktop use cases.
Aside from them, discrete graphics cards are history, just as disk controllers were a few decades earlier. DIMM slots are going too. The primary storage will be built in. (The industry missed a great deal there.)
Discrete disk controllers are still around.
My last desktop had a PCI SATA card that I added after I exhausted all of the on-motherboard SATA slots.
My current one has a JBOD SATA USB Mass Storage enclosure.
We are talking about Apple, the “you’ll pay $bigbucks to have one usb port and you’ll be happy about it” Apple here…
I mean, Apple is the example the author is using to come to his conclusions, but he’s talking about the industry as a whole regarding the disk controllers.