This year, Apple announced, to much fanfare, a somewhat obscure technical change to how its computers work: Macs will transition away from Intel’s CPUs to in-house processors known as “Apple Silicon,” more similar to the technology Apple already uses in its phones and tablets. That is a tremendous amount of hype for something rather technical, and to people used to more user-visible feature announcements, it can be somewhat disappointing, or at least confusing.

What does this actually mean for the end user? Apple claims that these new Macs will be (many times) faster, run cooler, and have much better battery life. Are these improvements as drastic as Apple claims? Will there be downsides and other adjustments that users will have to make, or will these new computers just work like faster, less power-hungry Macs?

A lot of responses I’ve seen seem pretty skeptical, which is fair. It’s been a long time since the drastic improvements of Moore’s law have been the norm in computing; we’re used to much more incremental improvements. And Apple is claiming to achieve these improvements by moving away from Intel, when Intel is the established market leader in making high-powered PC processors. Can these new computers really be that much better?

My answer, as your computer-nerdy friend, is that these computers are not only going to be a great technical improvement over previous Macs, but represent a revolution even beyond the Apple ecosystem, a turning point for PCs in general, and one that was a long time coming. To explain why, I’m going to delve into some computer history to give context to this shift, and some of the technical details of how computers work, and specifically in what ways these new Macs will work differently from the current ones.

So bear with me as we go deep. I promise it’s relevant.

Operating Systems and App Compatibility#

Do you remember when it mattered a lot more which operating system you ran?

Nowadays, most of my personal time on the computer is spent in the web browser: I do my writing on a website, my TV watching on a website, and even keep my TODO lists on one. I look at the bottom of the screen on my non-work computer, the MacBook I’m using to type this, and I see a slew of icons for various apps: a messenger app, a mail app, a calendar app, a spreadsheet, a word processor, all in all standard computer fare, but none getting much use compared to that Google Chrome icon. As a result, unless we’re doing specialized tasks, like programming (as I do for work) or CAD or photo editing, the choice of apps besides the browser is not the big deal it used to be.

But once, it was a huge deal. Every computer user had several different apps in their workflow, and using an alternative operating system (as macOS once was) risked not being able to find appropriate equivalent apps, and possibly not even being able to read documents that would be in “Windows formatting.” There was tremendous social pressure to use the same software as other people, to the extent that this webcomic rang true.

Therefore, every serious personal computer application existed as a Windows program, specifically a Windows program that ran on Intel (or Intel-compatible) processors (a combination known as “Wintel,” especially by those who criticized it as a monopoly). A company would support other types of computers only as an afterthought. And at that time, apps were distributed on CDs and other physical media. It was common to have an old version of an app lying around, unable to receive updates, and to expect it to run, unmodified, on a newly purchased computer.

In such an environment, compatibility was key to profits. Each new Intel processor, and each new version of Windows, had to support all the apps that could run on the previous version. There was no higher priority. For years, starting even before Windows 95 was released, Microsoft had another operating system, Windows NT, that was far more stable and technically superior. But Windows 95 was more similar to Windows 3.1 before it and supported more apps, so until Microsoft could get Windows NT to run all those apps, it was stuck selling the inferior product. Eventually, six years later, Microsoft shipped a version of Windows NT able to run Windows 95 apps: Windows XP. Even that transition was gnarly.

The Intel side of the “Wintel” monopoly was similar. To this day, a modern Intel processor is capable of running MS-DOS programs from the 80s directly, without requiring any emulation layer in the operating system or any modifications to those programs. The antiquated 16-bit instructions that comprise those programs will still be interpreted by the modern hardware, which also supports countless other compatibility modes for various eras of the processor history. And this is true not only for the Intel processors that power Windows PCs, where it makes some amount of sense to support all the Microsoft ecosystems of years past, but also on Macs, where this history is much shallower.

Processor Design and Instruction-Set Architecture#

See, Intel is more than just a company. Intel is also an instruction set architecture, better known as x86 (with the 64-bit version called x86-64, x64, or Intel 64). When applications are prepared to run on Intel, they are (traditionally) compiled to a file containing a sequence of instructions. The meaning of each instruction is determined by complicated standards, and the hardware of the processor must take these instructions and actually perform the designated operations.

The amount of complexity involved in the meanings of these instructions, known collectively as the instruction set architecture (ISA), is large. Intel’s ISA filled three paperback volumes back when I was a child in the early aughts, when my parents were gracious enough to order the set for me. It has only grown gnarlier since then.

Because of the vast compatibility requirements that Intel in particular has historically faced, its ISA has never been redesigned from scratch since the 1980s. This means that, instead of every instruction being a fixed length of, say, 4 bytes, Intel instructions can range from 1 to 15 bytes. Some of them do simple things like adding two numbers together, whereas others do more complicated things like copying an entire string of characters. Many of these design choices would never be made by a modern engineer, but Intel is stuck with them for historical reasons.
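To make the variable-length point concrete, here is a small illustrative sketch (Python is used only to print the bytes; the encodings themselves are standard, documented x86-64 machine code):

```python
# A few real x86-64 machine-code encodings, showing how instruction
# length varies. Contrast with 64-bit ARM, where every instruction
# is exactly 4 bytes.
encodings = [
    ("ret",                  bytes([0xC3])),                          # 1 byte
    ("push rax",             bytes([0x50])),                          # 1 byte
    ("mov eax, 1",           bytes([0xB8, 0x01, 0x00, 0x00, 0x00])),  # 5 bytes
    ("mov rax, 1 (64-bit)",  bytes([0x48, 0xB8]) + (1).to_bytes(8, "little")),  # 10 bytes
]

for asm, enc in encodings:
    print(f"{asm:22s} {len(enc):2d} byte(s): {enc.hex(' ')}")
```

A decoder for this ISA cannot even tell where one instruction ends and the next begins without partially decoding each one in turn, which is part of why Intel’s decoding hardware is so costly.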

But Intel hasn’t been able to clean this up, for the same reason that “Wintel” customers can’t switch from Intel: Any clean-up would mean old apps would not be able to run on the new computers. When Intel did attempt this, with Itanium, the world wasn’t ready, even though Intel tried to leverage the already-difficult transition to 64-bit as a reason to get people to switch to a completely new architecture. Even worse, any clean-up would mean that Intel would have to compete with other companies as equals, whereas now there is only one other company, AMD, that is allowed (for historical reasons) to design Intel-ISA processors. Embarrassingly, it was AMD that actually convinced everyone to switch to 64-bit, by providing a much more gradual transition to a 64-bit ISA far more similar to the existing 32-bit one.

Processor architectures with such convoluted ISAs are referred to as CISC, for Complex Instruction Set Computing. All of this complexity must be implemented using more complex hardware. Intel processors come with decoders to break down over-complicated instructions into smaller pieces, reorder buffers to optimize on-the-fly which pieces can be done when, and extra circuitry to handle all of the various compatibility modes that are necessary to support old software. All of this constrains processor design and, very specifically, draws extra power.

What’s the alternative? The transition to mobile, to phones and tablets, gave computing something of a fresh start. No one ever expected their “Wintel” apps to run on a phone, and so when the first iPhones and Androids came out, new apps were written from scratch. Those would be written for whatever processor architecture Apple and Google chose, and they chose a more modern, less-CISC ISA: ARM. ARM stands for Advanced RISC Machine, where RISC, or Reduced Instruction Set Computing, is the opposite of CISC.

The ARM ISA, which unlike Intel’s is available for any company to license so they can design their own compatible processors, takes a moderate position in the historical RISC/CISC wars, and in any case requires far less decoding circuitry than Intel’s. When Intel tried to make Intel-ISA processors for phones and lightweight laptops, the Atom line, it was a failure: the decoder got in the way of achieving a good combination of performance and power consumption, and the resulting phones were either unacceptably slow or unacceptably short on battery life.

And Apple Silicon is Apple’s branding for its own ARM processors: chips that support the ARM ISA but use Apple’s proprietary processor design. They’re bringing the benefits of phones to the PC world.

New Modes of App Development#

So the “Wintel” monopoly, and Intel in general, never jumped from the PC world to mobile, and as a result we have the cool, fanless, long-lasting yet high-performance phones we have today. But why, then, do Macs, which never ran Windows programs, use Intel processors to begin with? Why are they switching now? And why is this a turning point for PCs in general?

Well, for one thing, it’s not entirely true that Macs don’t run Windows programs. A key reason Apple switched from PowerPC to Intel in the first place was that Intel Macs could run Windows programs: either by running a version of Windows simultaneously with macOS, or, more simply, by rebooting the same computer into Windows, which runs on a Mac just as well as on any other type of PC. When Intel Macs first came out, this was a decisive feature for many switchers nervous about abandoning app compatibility.

And at the time, Intel was the best processor manufacturer in existence, so Intel processors, flaws and all, were still better (being produced on superior manufacturing processes) than the PowerPC RISC processors Apple had been using. Get better processors and some level of Windows compatibility: the decision was clear for Apple at the time.

But now, Windows compatibility matters to hardly any Mac users, and there are a number of reasons for that. Nowadays, a lot of software is never translated to machine code, the level at which ISAs are relevant. A lot of software is delivered to us via the browser, where a portion runs on servers in the cloud and a portion is written in JavaScript, which is then either interpreted in that form or translated to the computer’s ISA on the fly by the browser. Once the browser supports an ISA, all websites come with it.

And even for apps, many of them are written in higher-level languages that use a virtual machine or Just-In-Time compilation to be processor-architecture neutral. These programming language technologies matured after Intel had already become stuck with its backwards-compatibility advantage, and apps written with them are easily ported, which is to say, brought to a new ISA. Once the Java virtual machine (for example) is ported to ARM, all Java programs come with it.
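Python works the same way, and makes the idea easy to see: source code is compiled to a processor-neutral bytecode, so only the interpreter itself needs porting to a new ISA. A small sketch:

```python
import dis

def add(a, b):
    return a + b

# The function is compiled to architecture-neutral bytecode: the same
# opcodes whether the interpreter runs on Intel or on ARM. Only the
# interpreter executing them is machine code for a particular ISA.
dis.dis(add)
```

Run on an Intel Mac or an Apple Silicon Mac, the disassembly is identical; the ISA difference is entirely the interpreter’s problem, not the app developer’s.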

And even for programs that are compiled in a traditional fashion, written in an old-fashioned compiled programming language like C or C++ (or the Apple-specific Objective-C or Swift), ISA compatibility is no longer the issue it once was. These languages have evolved over time to make it easier to re-target ISAs, and programmers nowadays are better trained in writing their code in such a way that it can be compiled for any ISA. Creating an ARM version of a Mac app is just a switch in the compilation system, and maybe finding a few obscure bugs where the differences matter a little more deeply.
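As a sketch of what that “switch in the compilation system” looks like with Apple’s toolchain (assuming Xcode’s command-line tools are installed; `hello.c` stands in for any portable C source file):

```shell
# Compile the same source once per ISA; the -arch flag is the "switch".
clang -arch x86_64 -o hello_x86 hello.c   # Intel slice
clang -arch arm64  -o hello_arm hello.c   # Apple Silicon slice

# Glue both slices into one "universal" binary that runs natively on either.
lipo -create -output hello hello_x86 hello_arm
lipo -info hello   # reports both architectures
```

In practice Xcode does this automatically when building a universal app, so for most well-written programs the port really is close to a checkbox.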

And once the new version is made, distributing it is painless. Android and iOS both transitioned from 32-bit to 64-bit ARM, requiring every app to be rebuilt, and we as customers hardly noticed. Developers quietly prepared 64-bit versions of our apps, and when we upgraded to 64-bit-compatible phones, we didn’t notice that the app store sent us a different version. After all, we get updated versions of apps from the app store all the time. As long as the developer can adjust, the end user just has to do some more downloading – cheap and easy in an era of widespread broadband.

And so what does Apple lose with the Apple Silicon Macs? The ability to boot Windows, which is now more of a liability than an asset for them. Old programs will rapidly be ported. And thanks to modern binary-translation technology, which converts between ISAs on the fly, emulating Intel programs on the new Macs is often faster than running them directly on the old Macs.

As users, we might be mildly frustrated by the transition. I certainly will be a little worried about buying such a computer until I know that some version of Linux will run smoothly on it – Linux on ARM is currently very much a second-class citizen in the PC Linux world.

But the advantages are great. No longer constrained by hardware decoders, Macs will cheat the old trade-off between computational power and battery life, getting to have their cake and eat it too, at least for one abrupt round of improvement during the transition. And because Macs and phones now use the same processor architecture, iOS and iPadOS apps will work on macOS as well, saving time for writers of tablet applications.

And other PC manufacturers will end up having to notice. Windows on ARM exists already, though it is currently obscure. If Microsoft can cultivate as modern an ecosystem as Apple, where a combination of emulation and streamlined distribution make it easy to get ARM versions, these new Macs might start an ARMification trend.

This spells (long-term) doom for Intel. Their business model is tied to their ISA, on the premise that no one can afford to switch away from the most popular ISA, that everyone is locked in. This was never true for mobile, in spite of Intel’s best efforts, and as Apple is demonstrating, it hasn’t really been true for PCs for a while either.