
The microprocessor is 50 years old. But what makes it so special?

Feb 26, 2021 08:19 PM IST

Half a century ago, Intel’s 4004 chip was born, ushering in the digital revolution. Take a closer look at the little chip that changed the world, altered India’s place in it, and quietly powers modern life

If you’re reading this article on an electronic device, thank the microprocessor. If you’re reading it the old-fashioned way, in print, thank the microprocessor anyway. We couldn’t have edited, designed, published and printed this newspaper, especially with most of us working from home, without little chips all over the place — from our design team’s special CPUs to those in the printing press itself.

Intel employees in 1971. This California facility is where the 4004 was initially manufactured. (Image courtesy: Intel)

Techies think of the microprocessor as the brain of a device — the circuitry that tells mobile phones, microwaves, Alexa, traffic signals, ATMs, MRI machines, Mars rovers and other gadgets what to do.


Us regular folk? We don’t think of the microprocessor at all. We’ve never had to.

But think of even the act of calling someone on your mobile phone. “Three microprocessors are powering our conversation,” says Kamakoti Veezhinathan, professor of computer science and engineering at the Indian Institute of Technology (IIT)-Madras, speaking via WhatsApp from Chennai. “One in your phone, one in mine, one in my car so I can speak hands-free. And in between are thousands more connecting us.”

This year marks half a century of those connections.

The one that started it all. The 4004 chip was a general-purpose mini-computer that fit onto a slice of silicon. The latest versions are smaller and thousands of times faster. (Intel)

Intel released the 4004 in 1971. It was the first commercially available microprocessor: a complete central processing unit on a single chip that, working with companion memory and input-output chips, could perform functions, store the results and recall them on request — and it was smaller than a fingernail.

Where previous processors had been designed to do just one thing — run an assembly line or crunch numbers, for instance — engineers could program the 4004 to perform different sets of tasks across different devices.

It was revolutionary for its time. And it became a kind of building block for tech, sparking a digital revolution, bringing the personal computer and new technologies into our homes, shaping lives, communities and economies. And, fittingly for a 50th birthday, there are now worries about whether it will survive the next 50 years.

Small beginnings

For such a superhero, the 4004 has a surprisingly bland origin story. Tech companies around the world had been experimenting with all-in-one processing panels through the late 1960s. At Intel, a young scientist, Ted Hoff, knew a better design was possible. So when a Japanese firm, Busicom, asked Intel to develop chips for their line of printing calculators, he decided to use that project to try and redesign the chip itself.

Hoff created the 4004 as a mini-computer on a single slice of silicon, one that could be fitted into more than calculators. Engineer Federico Faggin fashioned the hardware into a workable processor. Busicom, meanwhile, wondered why they were paying so much and waiting so long for Intel to deliver.

Intel was so sure of its little 4004 that it paid Busicom back part of its fee in exchange for the rights to the design. Busicom got what it wanted — a calculator that could print out its answers — and sold some 100,000 units. Intel kept improving on the design, inspired other tech firms to create their own versions, and changed the course of history.

Busicom’s calculators were the first devices that ran on microprocessors. But Intel knew they could be used for much, much more. (Image courtesy: Intel)

In an essay from 1982, Hungarian-American author Dennis Báthory-Kitsz — who reported extensively on technology during the first generation of computers in the ’70s — referred to microprocessors as “the greatest body of tools since the industrial revolution, perhaps even since the beginning of civilisation”. In a talk delivered the same year, he described a processor chip as “the first tool which is at once both wheel and writing”, meaning it could do your bidding, but also understand a new command when it came.

And because it was small, cheap, versatile and efficient, it turned technology from a complicated scientific tool into a part of everyday life. Before the 1970s, computers were bulky industrial machines, developed mainly for manufacturing, the military or space programmes.

The idea of adding electronics to kids’ toys, car engines, Ganpati pandal lights, hair dryers or security systems would have seemed absurd. “Now, we can’t even think of inventing a gadget without embedding technology,” Veezhinathan says.

The world map

Microprocessor manufacturing and design changed the map of world economies, boosting South Korea, Japan, Taiwan and the US. In India, it was a game-changer in other ways. By 1984, the government’s New Computer Policy had reduced import tariffs on hardware and software. Software became a de-licensed industry, making software-service exporters eligible for bank loans. IT parks were developed to create a home-grown ecosystem of techies.

The original Macintosh 128k, released in 1984, made PCs more popular among the general population. Notice that slit opening? That was for floppy discs. (Shutterstock)

Entrepreneur Maulik Jasubhai knew India was interested in the growing global market for software. Back from studying in the US, he launched Intelligent Computing Chip, a monthly tech magazine. “Much of the developments at the time were published in tech journals or business publications,” he recalls. “We hit the sweet spot, talking about new advancements to a general audience.” Issues covered Microsoft’s high-priced software, which forced users to turn to pirated versions of Windows; some editions came with highly coveted free CDs containing new software.

“Intel and other companies showed the world how tech could be democratised and made available to everyone,” Jasubhai says. It spawned a new class of developers and coders in India, and in generations that were to follow, a separate ecosystem of call-centre employees who could troubleshoot tech problems half a planet away. “This was in a country that didn’t have much of a tech history. It changed the fortunes for lakhs of families,” Veezhinathan says. “The programmer is to the microprocessor what the student is to the educational institute. We owe it a great debt.”

The next step

We’ve come a long way since those days of free CDs, bulky devices and technology as a luxury frill. Your basic smartphone has more computing power than the tech that put man on the Moon in 1969. There are now more microprocessors on Earth than people. Most of them power communications devices, household and car tech and the Internet of Things (IoT). Barely 1% of microprocessors end up in home computers.

Ted Hoff’s invention also ended up saving his own life — at 83, a microprocessor-powered pacemaker keeps his heart ticking.

Shakti, India’s first indigenous chip, is designed especially to use less power, meaning it will work well in rural areas, in city traffic signals, and perhaps even your cornershop. (IIT-Madras)

As we hunger for more, the minerals, heavy metals and other components that go into making a chip are getting harder to find. A global microchip shortage in 2020 meant that several car, domestic appliance and video-game manufacturers had to halt production or raise prices. Several countries and global companies — Alphabet, Apple and Alibaba included — are exploring ways to create custom chips for their specific needs, especially machine learning.

At IIT-Madras, Veezhinathan leads the special laboratory that developed India’s first indigenous microprocessors. “There is a need for domain-specific architecture particularly in India,” he says. “It would help to have a separate type of processor to check air quality… something that needs low power so we don’t have to keep changing its batteries, but can still aggregate data and communicate with a central system.”

The first of these indigenous purpose-built microprocessors, Shakti, was released in 2018. Veezhinathan hopes it might be adopted in animal husbandry and your cornershop. A new one, Moushik, was launched last year and is designed specifically for IoT devices. “It’s a step towards being a self-reliant India,” Veezhinathan says.

Ajit, developed by IIT-Bombay in 2019, is the first processor to be conceptualised, designed, developed and manufactured in India. It is geared for larger systems such as robotics, automation and, eventually, India’s satellites.

Bigger, better?

Some new microprocessors aren’t even trying to be micro. A California-based start-up, Cerebras Systems, released the world’s largest computer chip in 2019. Where most processors are rarely larger than a postage stamp, this one is as big as an iPad. The manufacturers believe it will give Artificial Intelligence systems a big boost.

For those who’ve followed the development of the chip, an interesting milestone now awaits. In the 1960s, Intel co-founder Gordon Moore predicted that the number of transistors that could be packed onto a chip, and with it its processing power, would double roughly every two years, even as the cost per transistor fell. That formula, now known as Moore’s Law, has been Intel’s driving force for five decades, setting the benchmark of what technology can be expected to do — and cost. That prediction is reaching its limit.
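
To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration) of what “doubling every two years” implies, starting from the 4004’s roughly 2,300 transistors in 1971. The figures are an idealised projection, not Intel’s actual product roadmap.

```python
# Back-of-the-envelope illustration of Moore's Law:
# transistor counts doubling roughly every two years,
# starting from the Intel 4004's ~2,300 transistors in 1971.

START_YEAR = 1971
START_TRANSISTORS = 2_300        # approximate count on the 4004
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Project a transistor count for a given year under a strict
    two-year doubling; an idealisation used only for illustration."""
    elapsed = year - START_YEAR
    return START_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
    # By 2021 the projection reaches the tens of billions,
    # the same order of magnitude as today's largest chips.
```

Real chips have tracked this neat exponential only roughly, but it shows why a 50-year-old prediction still frames what the industry expects each new generation to deliver.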

Chip features, now measured in nanometres, can’t shrink much further without running into the size of silicon atoms themselves; the manufacturing is no longer as cheap; resources are running out; and disposing of outdated tech carries a high environmental cost.

Will the chips of the future be powered by nanomagnetics, graphene, carbon nanotubes or gallium oxide? And will we ever have enough? Veezhinathan says we’re just going to have to figure it out. “We can’t live without microprocessors now. So working towards a sustainable process and e-waste management will be key.”

Jasubhai, meanwhile, hopes we look beyond what’s already been invented. “Indian techies are great innovators. We love tinkering. But we haven’t been great inventors,” he says. “Our ability to imagine has been limited by our computing power. But the inventor ecosystem is being built now. I’m optimistic that we’ll get to quantum computing, a theoretical process that does away with the chip, and offers unlimited processing power, in the coming decade.”

ABOUT THE AUTHOR

Rachel Lopez is a writer and editor with the Hindustan Times. She has worked with the Times Group, Time Out and Vogue, and has a special interest in city history, culture, etymology, and internet and society.
