The Differences Between ARM and Intel x86-64

Is your next laptop or server going to be powered by ARM? From sleek laptops to edge servers, these chips have kicked off a revolution, challenging the long reign of x86-64. But what exactly differentiates one architecture from the other, and more importantly, does ARM deliver enough advantages to justify abandoning its competitor? Let's break down the ultimate silicon showdown.

What Does x86-64 Mean Anyway?

Ever heard someone call a computer's brain an "x86"? That's the umbrella term for processors, mostly from Intel, found in desktops and laptops built on that architecture. Consider "x86" the base. Now imagine a powerful 64-bit engine sitting right atop that base: the "x86-64" architecture, which runs practically every modern desktop and laptop out there today.

Still perplexed by the great mystery of the two "Program Files" folders in Windows? They are a backstage pass to understanding how your 32-bit and 64-bit applications coexist in perfect (or sometimes imperfect) harmony.

Have you ever wondered where your applications go after you install them? Windows has a behind-the-scenes sorting system: 32-bit programs squeeze into the "Program Files (x86)" folder, while their bulkier 64-bit counterparts go elsewhere. This neat arrangement has been around since the first 64-bit editions of Windows, bringing some order to the digital world.

Think of “x86-64” as simply “x86” for now. Under the hood, they share enough DNA to keep things streamlined.

The label itself is a remnant of ancient heritage: "x86" was born from the name of Intel's 8086 processor. That one was 16-bit, a first of its kind, and it laid the foundation on which everything since has been built. Much of its instruction set survives today, the same language modern processors still speak when tackling the hardest computing problems.

Imagine: code today still running on technology conceived in the era of bell-bottoms. That's right, quite a chunk of our digital world is still waltzing to the tune of a computer architecture that came into being in 1978!

The 8086. A name that resonates through the halls of computer history, whispered in awe by all who remember the early era of personal computing. Intel built an empire on that chip, and the 80286 and 80386 came next, each in turn adding to the processing power. Then came the Pentium, a great leap away from the naming tradition. But the legacy of the 8086 lives on. Even today, every chip that speaks its language, that understands its fundamental instructions, is branded "x86," a silent salute to the processor that began the whole journey.

Believe it or not, today's 64-bit high-performance processors, for which AMD led the way with its Opteron series, owe something to a fairly humble ancestor: the 8086. That foundational chip remains the backbone of the architecture inside almost every desktop and server CPU on the market.

Instruction Sets

The processor inside your computer has a secret language: its instruction set. Think of it as a tiny built-in dictionary of actions that the chip understands. Every word in this dictionary is a basic command: add these numbers, move that data around. Everything your computer does, from displaying these words to running complicated games, is achieved through these very basic instructions. The source code you see is but the tip of the iceberg. Beneath it, a compiler translates your high-level language into the processor's native tongue: machine code, the pure, atomic instructions that actually do the magic (assembly language is simply its human-readable form).
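To make the "dictionary of actions" idea concrete, here is a toy sketch in Python (all register and instruction names are invented for illustration, not real machine code): a miniature processor whose entire instruction set is a handful of primitive operations, executed one at a time.

```python
# A toy "instruction set": every action the machine knows is one entry
# in this dictionary. Registers are just named slots for values.
registers = {"r0": 0, "r1": 0}

def load(reg, value):      # put a constant into a register
    registers[reg] = value

def add(dst, src):         # add one register into another
    registers[dst] += registers[src]

def mul(dst, src):         # multiply one register by another
    registers[dst] *= registers[src]

instruction_set = {"LOAD": load, "ADD": add, "MUL": mul}

# A "program" is nothing but a list of these primitive instructions.
program = [
    ("LOAD", "r0", 6),
    ("LOAD", "r1", 7),
    ("MUL", "r0", "r1"),   # r0 = 6 * 7
]

for op, *args in program:
    instruction_set[op](*args)

print(registers["r0"])  # → 42
```

However fancy the source code, by the time it reaches the chip it has been reduced to a sequence of entries from a fixed dictionary like this one.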

The IHS of an Intel i386

Image source: Wikimedia Commons

This is where we get to the core difference between ARM and x86 chips.

Imagine processors as chefs. An x86 processor takes the CISC way: a thousand-ingredient recipe for one very complex dish. ARM takes the RISC approach: a few basic ingredients and instructions that produce the same dish through many fast, easy steps. The CISC chef might take longer over the dish, while the RISC chef hinges on speed and efficiency.

CISC vs. RISC

Imagine two chefs: Chef CISC and Chef RISC. Chef CISC handles one big, all-encompassing order: "MULT 2, 3!" He fetches the ingredients, mixes the product, and artistically plates everything in one go. Chef RISC operates differently: first "LOAD the ingredients," then "MULTIPLY them," and finally "STORE the result away." Three precise steps for one culinary outcome. The big difference: CISC packs the whole job into one almost artistically elegant command, while RISC is the step-by-step grammar purist.

Apple M1 SoC mounted on a MacBook

Image source: Wikimedia Commons

While the CISC chip might seem more efficient because its commands appear simpler, keep in mind a few important differences:

  • First, CISC processors take a marathon-like approach, patiently completing very complex instructions that may span multiple clock cycles. RISC processors are sprinters: one instruction is fully executed in a single cycle. That elaborate MULT roughly ties with the RISC sprinter team performing the same job as a few focused instructions.

  • Second, fewer instructions mean fewer transistors. RISC's simplicity packs more power into a smaller silicon footprint than CISC, whose complex instructions demand a rich transistor count.

  • Third, the lower number of transistors required by RISC allows for lower power usage.
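The marathon-vs-sprint point can be put in rough numbers. A back-of-the-envelope sketch in Python, using purely hypothetical cycle costs (real timings vary widely by chip):

```python
# Hypothetical cycle costs for one multiplication, illustrating why the
# two approaches can tie. These figures are invented for illustration.
cisc_cost = {"MULT": 4}                       # one complex, multi-cycle instruction
risc_cost = {"LOAD": 1, "MUL": 1, "STORE": 1} # simple, single-cycle instructions

cisc_program = ["MULT"]
risc_program = ["LOAD", "LOAD", "MUL", "STORE"]

cisc_cycles = sum(cisc_cost[i] for i in cisc_program)
risc_cycles = sum(risc_cost[i] for i in risc_program)

print(cisc_cycles, risc_cycles)  # → 4 4
```

Under these made-up numbers, one four-cycle CISC instruction and four one-cycle RISC instructions finish in the same time: the work doesn't disappear, it just gets sliced differently.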

Imagine code so elegant and intuitive that your computer barely needs you to spell things out. This is what CISC promises. With such a processor, you do not have to fight with viciously obscure machine instructions. For multiplication, for example, you write foo = foo * bar, and the MULT command at the processor level looks almost the same. It's as if the computer has been half-aware of your thoughts all along, handling rather complicated instructions with the lightest whisper.

On a conceptual level, RISC processors are minimalist chefs who demand step-by-step recipes (compiled instructions) to transform raw ingredients (data) into a finished dish. Conversely, CISC processors are all-in-one food processors that perform complex operations more or less directly. In addition, a CISC processor can reach into the pantry (system memory) on its own, grabbing ingredients as desired. A RISC processor, however, must load all its ingredients onto the workstation (processor registers) before it gets to work at all.

For consumers, the RISC vs. CISC debate has no clear performance champion. But RISC has one weapon that could change the rules of engagement and grab the crown.

Power Consumption

When it comes to power efficiency, RISC architecture strongly outcompetes CISC. Picture it this way: CISC is the fuel-guzzling SUV, while RISC is your sleek electric scooter. By elegantly trimming away unwarranted adornments and focusing on a streamlined instruction set, RISC chips stay compact and sip power where the CISC architecture takes big gulps.

Interior components of a traditional ATX desktop power supply

Image source: Wikimedia Commons

Inside your everyday smartphone you find a pocketful of tiny powerhouses sipping power rather than gulping it down. These are chips designed for minimum waste and optimum efficiency. Think of them as seasoned marathon runners as opposed to bulky weightlifters. Lean and mean! By the very nature of their operation, these chips generate little heat, allowing your phone to perform at peak ability all day long, with some lasting through the whole night on a single charge!

Going forward, note that the ARM invasion of traditional computers is still in its early days. What ARM chips offer is undeniable power sipping, a huge plus for battery-powered devices. Desktop machines, however, have other priorities. Being plug-bound, they crave raw performance, and the old CISC guard sits pretty on that throne, unlikely to give up the crown anytime soon.

Should I Get an ARM or x86 System?

The tech world tosses the word ARM around. With Microsoft bringing Windows 11 to ARM and Apple going full-throttle into ARM Macs, one is left to wonder how big a part to play in the ARM revolution. In essence: do you want the utmost battery efficiency, squeezing out every milliampere-hour? Or pure, unadulterated power, with battery drain to match? Choose consciously.

For most needs, picking a system is straightforward. One last thing: don’t leave home without your charger!

Image credit: Engineer man in sterile suit is holding Microchip by DepositPhotos
