
Apple is Arming itself

(Image source: Thenextweb.com)

I’m still following Apple and developments in hardware and software in general, as it is still somewhat of a hobby of mine (having started my professional career developing on NeXT). Even after I have given Apple bad grades here and here, I think yesterday’s announcement that Apple is moving to their own Arm chips is something that had long been coming, and I think it will potentially alter the computing landscape. While I certainly no longer have insider knowledge, or am even able to follow the latest developments in chip design, I do believe in phases or cycles of evolution and revolution in technology, and I think that is as true for information technology as it is for cars or anything else for that matter. Apple has just started to ramp up a revolution cycle.
Reading the opinions of analysts regarding Apple’s risky move from Intel to their own chips left me bewildered. After all, this is Apple’s third processor architecture change: from the Motorola 68K to the PowerPC, and then on to Intel. Three different instruction sets, and every time the migration worked without too much hassle. So for Apple this has always worked out, and it has gotten better and easier each time. The move from PowerPC to Intel was especially smooth given that, in the best case, a simple recompile of the app was all developers needed to do. Apple already had experience, not just from the previous moves, but also because the company it acquired (NeXT), the one that brought Steve Jobs back to Apple, had offered its operating system (NEXTSTEP) on four different platforms: Intel x86, Motorola 680x0, PA-RISC (HP), and SPARC (Sun). So there is, and probably continues to be, tremendous experience with a multi-processor landscape. So, let’s instead focus on what this means for Apple and the wider ecosystem:
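Before diving into that list, a quick concrete aside on the “simple recompile” point. The following is a minimal sketch, assuming a macOS machine with an Apple clang toolchain that supports both targets (the file name and program are my own illustration, not anything Apple ships): the same source is compiled for two instruction sets in a single invocation and packaged into one “universal” (fat) binary.

```c
/* hello.c - a deliberately trivial example.
 *
 * On macOS, Apple's clang can compile the same source for several
 * architectures in one invocation and merge the results into a
 * single "universal" (fat) binary:
 *
 *   clang -arch x86_64 -arch arm64 -o hello hello.c
 *   lipo -info hello    # lists the architectures inside the binary
 *
 * For portable code that makes no architecture-specific
 * assumptions, this is close to the whole migration effort.
 */
#include <stdio.h>

int main(void) {
    printf("Hello from whatever CPU is underneath.\n");
    return 0;
}
```

At launch, the operating system simply picks the slice that matches the CPU, so users never notice the difference.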

1st and foremost, the big losers are chip OEMs such as Intel and AMD (and probably others as well). Intel primarily because it has been the sole CPU provider for the Mac since 2005. For AMD it’s a double whammy: on one hand, it has been providing the GPUs for the Mac lineup; on the other, and in my opinion the bigger long-term loss, it loses the opportunity to “steal” the thunder from Intel and to position itself with its own Intel-compatible CPUs. AMD’s Ryzen and Threadripper platforms have started to outperform Intel where it counts. Keep in mind, Apple is no longer the niche PC and laptop producer it once was; today it holds roughly 10% of the global desktop and laptop market share. Those 10% will be felt by Intel and AMD.

2nd, the other significant PC OEMs will feel this as price pressure. This includes the likes of Dell, Lenovo, and HP, which use chips from both Intel and AMD; they remain dependent on those chips for as long as there is no smooth migration plan for Windows to make a clean transition to Arm or any other chip architecture. And we should note that it’s not for lack of trying: Microsoft has tried (and so far failed) to move Windows to anything other than Intel x86. There was, for example, a push to move to PowerPC in the ’90s (failed), then a push to move to Arm with the first generation of Surface products (failed), and there is another attempt to move to Arm with the latest Surface iteration. Microsoft has repeatedly failed at this for two reasons: a) fear of cannibalization, and b) it simply does not have the tools and frameworks in place to make the transition transparent to users. Where Apple shines is in its run-time layer, libraries, and SDKs, which have been mostly architecture neutral (thanks to NEXTSTEP; a short sketch of what architecture-neutral source can look like follows after this list), whereas Windows was never designed to be neutral. However, the move that Apple is planning will force these three big OEMs to rethink their strategy, because they will not be able to keep up with Apple in mobile computing: they will never reach the performance-per-watt levels of an Arm chip. So Apple’s new MacBooks with Arm chips will not only be faster than the competition, they will also be thinner and possibly fanless (no noise). I ran a benchmark on my spouse’s iPhone 11 Pro Max (Geekbench), and that thing is already 3 times faster than my Intel-based Microsoft Surface from 2017 (Geekbench)!

3rd, given the realities outlined above, Apple’s move has the potential to usher in a whole new computing paradigm. Over the years, we have seen some movement away from Intel and some breaks with long-standing traditions. For example, Facebook started to design its own chips (link), last year Tesla started to build cars with its own chips (link), the data-center behemoth Amazon is designing its own custom silicon (link), and, not to be left out, Google is creating a new AI chip (link) as well as a chip for its upcoming phone. But when it comes to the Wintel duopoly at the workplace or at home, nothing has been able to make a dent. Apple’s move might finally change that.
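As promised above, here is what “architecture neutral, with rare exceptions” can look like at the source level. This is a minimal sketch of my own (not Apple SDK code): the bulk of the program is plain C that compiles unchanged for either instruction set, and the one architecture-specific detail hides behind macros the compiler predefines for each target.

```c
/* arch_report.c - the same source builds for x86_64 and arm64. */
#include <stdio.h>

/* The compiler predefines these macros per target, so the rare
 * architecture-specific path is resolved at compile time while
 * the rest of the code stays neutral. */
static const char *arch_name(void) {
#if defined(__arm64__) || defined(__aarch64__)
    return "arm64";
#elif defined(__x86_64__)
    return "x86_64";
#else
    return "unknown";
#endif
}

int main(void) {
    printf("This slice was compiled for: %s\n", arch_name());
    return 0;
}
```

Compiled into a universal binary with the clang invocation shown earlier, the x86_64 slice reports x86_64 and the arm64 slice reports arm64, with the operating system choosing the matching slice when the program launches.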
While I’m still not crazy about how Apple is taking control of every aspect of their ecosystem, I have to admit I’m grateful that they are going all-in on their own chip design, and I hope this will send a clear and strong signal that now is the time to start thinking outside the (chip design) box and to improve computing well beyond the small incremental changes we have had to endure over the last decade.
