The microprocessor is 50 years old. But what makes it so special?
If you’re reading this article on an electronic device, thank the microprocessor. If you’re reading it the old-fashioned way, in print, thank the microprocessor anyway. We couldn’t have edited, designed, published and printed this newspaper, especially with most of us working from home, without little chips all over the place — from our design team’s special CPUs to those in the printing press itself.
Techies think of the microprocessor as the brain of a device — the circuitry that tells mobile phones, microwaves, Alexa, traffic signals, ATMs, MRI machines, Mars rovers and other gadgets what to do.
Us regular folk? We don’t think of the microprocessor at all. We’ve never had to.
But think of even the act of calling someone on your mobile phone. “Three microprocessors are powering our conversation,” says Kamakoti Veezhinathan, professor of computer science and engineering at the Indian Institute of Technology (IIT)-Madras, speaking via WhatsApp from Chennai. “One in your phone, one in mine, one in my car so I can speak hands-free. And in between are thousands more connecting us.”
This year marks half a century of those connections.
Intel released the 4004 in 1971. It was the first commercially available microprocessor: a complete central processing unit on a single chip, designed to work with companion memory and input-output chips. Essentially, it could fetch instructions, execute them and return results on request — and it was smaller than a fingernail.
Where previous processors had been designed to do just one thing — run an assembly line or crunch numbers, for instance — engineers could program the 4004 to perform different sets of tasks across different devices.
It was revolutionary for the time. And it became a kind of building block for tech, sparking a digital evolution, bringing the personal computer and new technologies into our homes, shaping lives, communities, economies. And, fittingly for a 50th birthday, there are worries about whether it will survive the next 50 years.
For such a superhero, the 4004 has a surprisingly bland origin story. Tech companies around the world had been experimenting with all-in-one processing panels through the late 1960s. At Intel, a young scientist, Ted Hoff, knew a better design was possible. So when a Japanese firm, Busicom, asked Intel to develop chips for their line of printing calculators, he decided to use that project to try to redesign the chip itself.
Hoff created the 4004 as a mini-computer on a single slice of silicon, one that could be fitted into more than calculators. Engineer Federico Faggin fashioned the hardware into a workable processor. Busicom, meanwhile, wondered why they were paying so much and waiting so long for Intel to deliver.
Intel was so sure of their little 4004, they paid the Japanese firm some of its money back to buy back the rights to the design. Busicom got what they wanted — a calculator that could print out its answers — and sold some 100,000 units. Intel kept improving on the design, inspired other tech firms to create their own versions, and changed the course of history.
In an essay from 1982, Hungarian-American author Dennis Báthory-Kitsz — who reported extensively on technology during the first generation of computers in the ’70s — refers to microprocessors as “the greatest body of tools since the industrial revolution, perhaps even since the beginning of civilisation”. In a talk delivered the same year, he described a processor chip as “the first tool which is at once both wheel and writing”, meaning it could do your bidding, but also understand a new command when it came.
And because it was small, cheap, versatile and efficient, it turned technology from a complicated scientific tool into a part of everyday life. Before the 1970s, computers were bulky industrial gadgets, developed mainly for manufacturing, the military, or space programmes.
The idea of adding electronics to kids’ toys, car engines, Ganpati pandal lights, hair dryers or security systems would have seemed absurd. “Now, we can’t even think of inventing a gadget without embedding technology,” Veezhinathan says.
The world map
Microprocessor manufacturing and design changed the map of world economies, boosting South Korea, Japan, Taiwan and the US. In India, it was a game-changer in other ways. By 1984, the government’s New Computer Policy had reduced import tariffs on hardware and software. Software became a de-licensed industry, making software-service exporters eligible for bank loans. IT parks were developed to create a home-grown ecosystem of techies.
Entrepreneur Maulik Jasubhai knew India was interested in the growing global market for software. Back from studying in the US, he launched Intelligent Computing Chip, a monthly tech magazine. “Many of the developments at the time were published in tech journals or business publications,” he recalls. “We hit the sweet spot, talking about new advancements to a general audience.” Issues covered Microsoft’s high-priced software, which forced users to turn to pirated versions of Windows; some editions came with highly coveted free CDs containing new software.
“Intel and other companies showed the world how tech could be democratised and made available to everyone,” Jasubhai says. It spawned a new class of developers and coders in India, and in generations that were to follow, a separate ecosystem of call-centre employees who could troubleshoot tech problems half a planet away. “This was in a country that didn’t have much of a tech history. It changed the fortunes for lakhs of families,” Veezhinathan says. “The programmer is to the microprocessor what the student is to the educational institute. We owe it a great debt.”
The next step
We’ve come a long way since those days of free CDs, bulky devices and technology as a luxury frill. Your basic smartphone has more computing power than the tech that put man on the Moon in 1969. There are now more microprocessors on Earth than people. Most of them power communications devices, household and car tech and the Internet of Things (IoT). Barely 1% of microprocessors end up in home computers.
Ted Hoff’s invention also ended up saving his own life — now 83, he relies on a microprocessor-powered pacemaker to keep his heart ticking.
As we hunger for more, the minerals, heavy metals and other components that go into making a chip are getting harder to find. A global microchip shortage in 2020 meant that several car, domestic appliance and video-game manufacturing factories had to halt production or raise prices. Several countries and global companies — Alphabet, Apple and Alibaba included — are exploring ways to create custom chips for their specific needs, especially machine learning.
At IIT-Madras, Veezhinathan leads the special laboratory that developed India’s first indigenous microprocessors. “There is a need for domain-specific architecture particularly in India,” he says. “It would help to have a separate type of processor to check air quality… something that needs low power so we don’t have to keep changing its batteries, but can still aggregate data and communicate with a central system.”
The first of these indigenous purpose-built microprocessors, Shakti, was released in 2018. Veezhinathan hopes it might be adopted everywhere from animal husbandry to your cornershop. A new one, Moushik, was launched last year and is designed specifically for IoT devices. “It’s a step towards being a self-reliant India,” Veezhinathan says.
Ajit, developed by IIT-Bombay in 2019, is the first processor to be conceptualised, designed, developed and manufactured in India. It is geared for larger systems such as robotics, automation and, eventually, India’s satellites.
Some new microprocessors aren’t even trying to be micro. A California-based start-up, Cerebras Systems, released the world’s largest computer chip in 2019. While most processors are no larger than a postage stamp, this one is the size of an iPad. The manufacturers believe it will give Artificial Intelligence systems a big boost.
For those who’ve followed the development of the chip, an interesting milestone now awaits. In the 1960s, Intel co-founder Gordon Moore predicted that the number of transistors on a chip would double roughly every two years, delivering more computing power for less money with each generation. That formula has been Intel’s driving force for five decades, setting the benchmark of what technology can be expected to do — and cost. The prediction is now reaching its limit.
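Moore’s prediction is simple compounding: doubling every two years means 25 doublings over 50 years. A quick sketch of that arithmetic, purely for illustration (the 4004’s count of roughly 2,300 transistors is Intel’s published figure; the projection is a back-of-the-envelope exercise, not a real product roadmap):

```python
def doublings(years, period=2):
    """Number of doubling periods in a span of years."""
    return years // period

def projected(start_count, years, period=2):
    """Project a transistor count forward under steady doubling."""
    return start_count * 2 ** doublings(years, period)

# The 4004 shipped with roughly 2,300 transistors in 1971.
# Fifty years at one doubling every two years is 25 doublings:
print(doublings(50))         # 25
print(projected(2300, 50))   # 77175193600 — about 77 billion
```

That compounding is why a 50-year-old design with a few thousand transistors leads, a generation later, to chips carrying tens of billions — and why even a slight slowdown in the doubling rate is industry news.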
Chip features, now measured in nanometres, are approaching the scale of individual silicon atoms and can’t shrink much further; manufacturing is no longer getting cheaper; resources are running out; and disposing of outdated tech carries a high environmental cost.
Will the chips of the future be powered by nanomagnetics, graphene, carbon nanotubes or gallium oxide? And will we ever have enough? Veezhinathan says we’re just going to have to figure it out. “We can’t live without microprocessors now. So working towards a sustainable process and e-waste management will be key.”
Jasubhai, meanwhile, hopes we look beyond what’s already been invented. “Indian techies are great innovators. We love tinkering. But we haven’t been great inventors,” he says. “Our ability to imagine has been limited by our computing power. But the inventor ecosystem is being built now. I’m optimistic that we’ll get to quantum computing, a theoretical process that does away with the chip, and offers unlimited processing power, in the coming decade.”