Sixty years ago, Gordon Moore already realized how powerful chips would become

“Wonders like home computers, automatically controlled cars and personal wearable communications.” That was the prediction made in 1965 by American chemist Gordon Moore in an article for the magazine Electronics.

Such miracles were possible, Moore expected, because chips would become twice as complex and twice as powerful every year, propelled by continuous scaling. “Cramming more components onto integrated circuits,” read the somewhat irreverent headline of that article. In other words: how do you cram more components onto a chip? It’s the reason a new smartwatch is as powerful as a computer from a few years ago.

Born in 1929 in San Francisco, Gordon Earle Moore was co-founder of chip giant Intel and one of the founding figures of the chip industry. As a chemist, he worked with pioneers such as William Shockley (co-inventor of the transistor) and Robert Noyce, inventor of the first practical integrated circuit. Moore died Friday at the age of 94.

The annual doubling of the number of circuits or transistors per chip that Moore envisioned as early as 1965 took on a life of its own and became the gold standard for the chip industry. Moore’s Law became the benchmark for exponential growth. Not only for chips, but also for derivative technologies, such as improving data speeds or developing artificial intelligence. Anything that relies on computational power can theoretically develop at the speed of Moore’s Law.

“I come across Moore’s Law on Google more often than Murphy’s Law,” said Moore. He preferred to call his law an ‘observation’ or a ‘wild extrapolation’. He himself was most surprised that the figures turned out to be reasonably accurate. “My real goal was to make it clear that we had a technology that would make electronics cheaper. I never expected the estimate to be so accurate.”

“I actually had too little data to make a reliable prediction,” said Moore. He based it on a simple calculation: “The chips we were working with at the time had about eight elements, the new generation had sixteen. In the lab we were working on chips with more than thirty parts. I saw that the count roughly doubled every year; if you extend that line, more than 60,000 circuits could fit on one chip within ten years.”
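Moore’s back-of-the-envelope reasoning is plain exponential growth, and it is easy to reproduce. The sketch below is illustrative only; the starting value of 60 components is an assumption (roughly one doubling beyond the thirty-part lab chips he mentions), chosen because it lands on the “more than 60,000” figure.

```python
# Hedged sketch of Moore's 1965 extrapolation: components per chip
# doubling once a year. The starting value of 60 is an assumption,
# about one doubling beyond the ~30-part lab chips mentioned above.
start_components = 60  # assumed component count around 1965
years = 10             # extrapolation horizon: 1965 -> 1975

projection = start_components * 2 ** years
print(projection)      # 61440, i.e. "more than 60,000 circuits"
```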

Experiment

Gordon Moore was more than Moore’s Law. He was one of the pioneers of Silicon Valley, shaping both computer engineering and the entrepreneurial drive that became typical of the West Coast tech industry. “Every new idea became at least a new company,” Moore recalled in a 1994 memoir he wrote for the California Institute of Technology, where he had been a student.

As a child, Moore experimented with chemistry sets, preferably with substances that could explode. Experimentation continued to dominate his work as he searched for the ideal semiconductor for making reliable circuits on chips. Silicon, which gave Silicon Valley its name, turned out to be the most suitable.

Moore started out as a chemist in the lab of William Shockley, co-inventor of the transistor and winner of the 1956 Nobel Prize in Physics. Conditions at Shockley’s lab were too poor to design reliable electronics, Moore said afterwards. “I don’t think we made much progress; it was a filthy place, with no air conditioning. Above all, we learned what did not work.”

Moore left in 1957 with seven colleagues to start his own company, Fairchild Semiconductor. The “traitorous eight,” as they were nicknamed, each put in $500, a monthly salary at the time, and attracted the legendary Arthur Rock as a co-investor. Rock would later also be one of the early investors in Apple and Intel.

Although Texas Instruments had already invented an integrated circuit in 1958, it was Fairchild’s Robert Noyce who developed a more practical variant. Fairchild grew to a company of 30,000 employees, with Gordon Moore as director of research.

In 1968, Moore left Fairchild in the wake of Robert Noyce. Together with Andy Grove, the two founded Intel, the company that would grow into the world’s largest chip manufacturer.

In the early 1970s, Intel pioneered the design of an entire processor on a single chip: the microprocessor. It would become an essential building block of the personal computer.

Infinite

Gordon Moore was Intel’s chief executive from 1979 to 1987 and served as chairman until 1997. In his lifetime he saw his own law become reality: the number of circuits on a chip grew from a handful to billions. He called it a self-fulfilling promise: “The industry saw it as something they had to do in order not to fall behind in technology. To stay at the forefront, where the chip industry makes its biggest gains, they had to move as fast as Moore’s Law predicted.”

But scaling cannot be sustained indefinitely, and Moore’s Law has been adjusted a few times. In 1975, Moore revised it to a doubling of the number of transistors every two years; later the rate was often quoted as a doubling every 18 months. In 2015, Intel missed the step to an even finer manufacturing process, and the cadence slipped again. Competitors such as TSMC and Samsung have since moved a step ahead in miniaturization; Intel is trying to catch up in its new chip factories.
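How much those cadences differ becomes clear when you compound them; a minimal sketch (the ten-year horizon is just an illustrative choice):

```python
# Growth factor over a decade for the doubling periods that have
# been attached to Moore's Law over the years: 1, 1.5 and 2 years.
for doubling_years in (1.0, 1.5, 2.0):
    factor = 2 ** (10 / doubling_years)
    print(f"doubling every {doubling_years:g} years: "
          f"~{factor:,.0f}x more transistors in ten years")
```

A yearly doubling yields roughly a thousandfold increase per decade; a doubling every two years, only a 32-fold one.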


The Dutch company ASML is one of Intel’s most important suppliers, with lithography machines that print ever-finer patterns of connections on chips, an essential step in the production process. The most advanced lithography machines can now print chip structures at a scale of 3 nanometers, three millionths of a millimeter.

ASML expects miniaturization to continue along the lines of Moore’s Law in the coming years. There are also other ways to cram more computing power into a smaller footprint, such as stacking chips. The price per ‘bit’, per computation, then continues to fall. That principle keeps the chip industry going, because new applications for chips are constantly emerging. Moore’s Law lives on, even now that its originator is gone.

By chance

Moore considered himself an “entrepreneur by chance”: more of a doer than a seer, who later admitted that he never saw the internet coming. “We saw that computers could do useful things, but not that they would become so important for our communication.”

When asked about missed opportunities, he recounted how he once turned down a proposal from one of Intel’s designers: a computer for home use, long before Apple popularized the personal computer. “Why would anyone want a computer at home?” Moore replied gruffly. In doing so he deprived Intel of the chance to build its own PCs, although the company later profited handsomely, as a chip supplier, from the home PC that IBM introduced in 1981.

In 1985, Intel had to stop producing memory chips because competition from Japan put a lot of pressure on prices. Moore laid off a quarter of all staff. He experienced that as a defeat, he said afterwards.

Moore had two sons with his wife Betty. Together they founded the Gordon & Betty Moore Foundation, which holds their fortune, estimated by Forbes at the equivalent of about 5.5 billion US dollars. Every year, some $300 million is donated to charities.

Part of that money also went to his beloved Bay Area, the region around San Francisco. In addition, his foundation invested nearly one and a half billion dollars in basic scientific research, a cause he, as a chemist, was enthusiastic about. Moore wanted to do something practical with his wealth: “Betty and I have more than we can use. It is better to do something with it while we are still alive than to wait until we are dead.”
