1968 to 1969: Few semiconductor startups from Silicon Valley’s early days have survived as Integrated Device Manufacturers (IDMs). Intel is the only one that remains on the leading edge. One reason is a competitive culture founded on institutional learning that traces back to the strategies Gordon Moore put in place at Intel’s very beginning. While other spin-offs in the valley were emulating the giants in the southwest, Motorola and Texas Instruments, Moore would, with the help of others, lay down a series of overlapping strategies for Intel’s success that are still in play today. Some of these were employed by other companies at the time. What was unique to Moore’s approach was their number and how they coupled to align Intel’s overarching strategy from suppliers, to organization, to customers:
Don’t follow the leader: Moore’s product strategy would create the first vertically integrated semiconductor company. Up until that point, the major players had won with a horizontal strategy that consumed shelf space at distributors, as well as bookshelf space and mindshare among designers at end customers. Their chip offerings covered a spectrum of building-block functionality that a system designer could easily piece together if they bought from the same semiconductor maker. With more than 5,000 electronics manufacturers in the United States alone, the bulk of semiconductor sales went through distributors. Sales professionals at these distributors steered customers to the chip maker with the greatest breadth of functions to maximize their commissions, locking out smaller players.
Moore realized it would be impossible to play this game, so he did what they did not: find a single product with large enough demand to fill a fab and use your core strengths to be the best at it. That product would be memory.
Rely on the Supply Chain to conserve resources: In the early days of the industry, semiconductor manufacturers made their own tools, masks, and wafers. At the time, and well into the future, the common wisdom was that the ‘secret sauce’ of competitive advantage lay in the tools. But by the time Intel started, a base of merchant suppliers had grown out of the Silicon Cycle.
With every downturn came layoffs. And the first engineers to be laid off were the ones who had developed the tools, masks, and wafers: with everything in place and business down, they were no longer needed. These unemployed engineers would take their skills and start companies of their own, creating a merchant supply chain. Over time, the competitiveness of the companies they left would diminish from losing these engineers. Meanwhile, the supplier start-ups would come out with products more competitive than internally made tools, strengthened by insights gained from working with multiple customers.
Moore saw that Intel didn’t have the resources to develop a new set of internal tools … And he may well have seen the degenerative effect of the Silicon Cycle on this aging strategy. He certainly saw that internal development of such products could no longer be a core strength for a semiconductor company. Being one of the first to partner with suppliers would give Intel a leg up while conserving resources for product development. One example of such partnering was Gordon driving an investment in MicroMask so it could afford to bring in one of the first e-beam mask writers in the Valley.
Align product development strategy to manufacturing strategy to market strategy: If anyone understood Moore’s Law to the point of internalizing it, it had to have been Gordon Moore. While most others saw it as a marketing gimmick, he saw it as a force of nature. On the surface was the simple observation that scaling would bring down cost-per-transistor, which would allow designers to make ever more complex systems that could be sold at the same real price, thus expanding the market. The force of nature Moore saw was that competition for these expanding markets would drive the entire supply chain, and that Intel had to ride it. While the scientific equivalent, ‘emergent behavior,’ would not be named for decades, he understood the mechanism as well as anyone since Adam Smith. And he would incorporate it into Intel’s strategy.
This understanding was arguably an epiphany arising out of another problem he faced: rising design costs due to increasing circuit complexity (yes, rising design costs were limiting scaling as early as the sixties). EDA tools were decades away. Chip designs had to be painstakingly hand-cut in Rubylith to make masks. Meanwhile, as Moore’s Law drove the ability to put more transistors on a chip, semiconductor customers were demanding more complex custom circuits. Many a company would slay itself in the seventies trying to compete with a custom or ASIC approach. Intel would not, as Moore kept it focused on standard products, returning the company to that focus whenever customers led it astray.
Moore saw memory as a way to spread design costs over a much larger market. In doing so, he was embedding this alignment into his strategy.
Moving R&D to the production floor was another way Moore would bring this alignment to bear on Intel’s success. To quote Andy Grove: “At Fairchild, NOTHING made it from R&D to the customer!” Production reengineered products while innovation decomposed in a cesspool of corporate politics. By putting research in production, engineers could quickly see and fix problems. Two critical problems plagued early MOS IC yields: 1) Sharp edges on dielectric corners left by lithographic etching caused aluminum interconnect to fail. Moore himself solved this by doping the SiO2 layer with phosphorus and annealing it, turning the sharp corners into smooth shoulders. This secret would keep Intel in the yield lead for years. 2) Another yield headache was pinholes. This was solved by dipping the wafer in acid after the dielectric was put down and then coating it with a second layer. While both layers had pinholes, the probability that a pinhole in one layer overlapped a pinhole in the other was small, dramatically increasing yield.
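The two-layer pinhole fix works because a defect only matters where two independent pinholes land on the same spot. A minimal Monte Carlo sketch makes the point; the site count and pinhole probability below are made-up illustrative numbers, not process data:

```python
import random

# Sketch of the two-layer dielectric fix. SITES and P_PINHOLE are
# illustrative assumptions; the mechanism, not the numbers, is the point.
SITES = 100_000      # resolvable spots on one die's dielectric
P_PINHOLE = 0.01     # chance a given site has a pinhole in one layer

random.seed(1)
layer1 = {s for s in range(SITES) if random.random() < P_PINHOLE}
layer2 = {s for s in range(SITES) if random.random() < P_PINHOLE}

# A defect only bites where pinholes in BOTH layers line up:
print(f"single-layer defects: {len(layer1)}")
print(f"overlapping defects:  {len(layer1 & layer2)}")
```

With independent layers the overlap rate per site is roughly P_PINHOLE squared, so the defect count drops by about two orders of magnitude, which is exactly why the second coating paid for itself in yield.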
Make your employees owners: Gordon Moore and Bob Noyce were two of the earliest proponents of doing this with stock options. They had seen the grating effect of people leaving Fairchild because it offered no options (Charlie Sporck being one of the most notable, who left to turn National Semiconductor around).
The semiconductor business was grindingly tough in those days. But unlike traditional businesses, it needed engaged employees who exercised their minds solving tough problems. Moore had experienced the stifling Gilded Age mentality of Eastern managers at Fairchild, where power was expressed in big offices and reserved parking; neither he nor Bob Noyce wanted to repeat it. They would be among the first to architect offices in the egalitarian approach of cubicles. There was no need for open-door policies … There were no doors. But what could replace these expensive artifacts of power that so many believed in?
Moore believed that making employees owners through stock options would not only engage them in the success of the company, but also lead them to see private offices as a waste of what could be personal returns. Hence Intel became one of the earliest companies to create what is now known as Silicon Valley business culture.
It would also make many of its employees millionaires in the process as Moore and Noyce forged one of the most innovative companies in history.
Intel’s first 106 employees
There was more to the egalitarian approach Gordon Moore took than meets the eye. Eastern business culture was weighed down by the excesses of the Gilded Age, only a few decades earlier. There had been rough treatment of employees, with entrenched battle lines drawn between totalitarian employers and unions. Workers were often seen as mere cogs in the wheels of production.
Like the orchards of Silicon Valley, the tech companies that came to replace them were built by families of ordinary people with modest means. Their more egalitarian culture had emerged out of the Free Silver populist movement of the late 19th century, which pitted farmers against what was called the “gold monopoly” of the East, so wonderfully encoded in L. Frank Baum’s “The Wizard of Oz.” These had been repressive times, always on the cusp of revolt. So it should come as no surprise that Silicon Valley business culture developed into such a unique world, one other regions have found so hard to emulate.
Make Strategic Bets: Moore’s decision to build out Intel’s first fab with 2-inch wafers was a major strategic bet that would give it a critical cost advantage. It was a classic blue ocean strategy: the bet centered on the expectation that others, still depreciating existing fabs built around smaller wafers, would not follow. Depreciation fears have often caused semiconductor executives to become victims of the sunk-cost fallacy. They fail to see that money spent is long gone and depreciation is merely an accounting artifact … or as Bob Boehlke liked to say, “There is cash … And everything else is accounting.” Of course, others would eventually follow, but taking the safe route with an existing wafer size would have doomed Intel. Moving to a larger wafer size first gave Intel a cost edge that offset the burden of being at the leading edge. This is because the economic advantage of a larger wafer is larger profits, and those profits accrue with every wafer processed…
The leading edge … always has the edge — especially when cost is the advantage.
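A back-of-the-envelope sketch shows where the wafer-size advantage comes from. A wafer costs roughly the same to process regardless of size, while the die it yields grow with its area; all the numbers below are illustrative assumptions, not Intel’s actual figures:

```python
import math

# Back-of-the-envelope die-per-wafer model (all numbers are illustrative
# assumptions). Processing cost is roughly fixed per wafer, so more die
# per wafer means lower cost per die.
def gross_die_per_wafer(wafer_diameter_mm: float, die_side_mm: float) -> int:
    """Standard approximation: wafer area / die area, minus an edge-loss term."""
    die_area = die_side_mm ** 2
    area_term = math.pi * wafer_diameter_mm ** 2 / (4 * die_area)
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area)
    return int(area_term - edge_term)

COST_PER_WAFER = 100.0  # assumed processing cost, same for either size
for inches in (1.5, 2.0):
    n = gross_die_per_wafer(inches * 25.4, die_side_mm=3.0)
    print(f'{inches}" wafer: {n} die, ${COST_PER_WAFER / n:.2f} per die')
```

Under these assumptions the 2-inch wafer yields nearly twice the die of a 1.5-inch wafer for the same wafer cost, and that per-die saving repeats on every wafer run through the fab.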
Hit the customer at their pain points: It is a maxim that instead of listening to your customers, you observe your customers to find their pain points. In the late 1950s, as today, memory was a constant problem for system makers. As CPU (Central Processing Unit) development brought more compute power, the need for scratch-pad memory, or what we now call cache, grew. First designers used tubes, and by the mid-fifties they were using magnetic core. Magnetic core was more reliable than tubes and much faster, with access times of 9 microseconds compared to about 25 for tubes. And while core was still about 100 times cheaper than solid-state memory in 1968, it had to be hand-woven by humans, making it difficult to scale.
Moore believed that lithographic scaling would bring the cost of solid-state memory down to intersect core, while offering much faster performance and a much smaller size. The first result was the 3101 bipolar SRAM, with features that included “fast access times of 50 nanoseconds” … and … “power dissipation of 6 milliwatts-per-bit.” But it was priced at $99.50 each, or a whopping $1.56 per bit. Core was selling for roughly 2 cents per bit at the time. Yet the race was on.
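Moore’s bet can be sketched as a simple crossover calculation. Only the two starting prices come from the figures above; the halving-per-year pace is an assumption for illustration:

```python
import math

# Illustrative crossover sketch. The halving-per-year rate is an assumed
# pace; only the two starting prices are taken from the text.
solid_state_per_bit = 1.56   # $/bit, the 3101's launch price per bit
core_per_bit = 0.02          # $/bit, roughly, for magnetic core

# Years until solid state undercuts core, if its price-per-bit halves
# every year while core, limited by hand weaving, stays flat:
years_to_cross = math.ceil(math.log2(solid_state_per_bit / core_per_bit))
print(f"crossover in about {years_to_cross} years at that pace")  # about 7
```

In practice the gap closed far faster than any steady glide path: the jump to MOS DRAM was a discontinuity, which is why the 1103 could undercut core’s price-per-bit as soon as 1970.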
It was a race Intel would win. Within the year, Intel followed through with its 1101 256-bit SRAM, and in 1970 it introduced the 1103 DRAM, which out of the box bested core memory on price-per-bit. Here is Intel’s ad for the 1103, announcing the end of core:
Intel Delivers: This ad highlights another pain point for customers. In those days, semiconductor supply was fairly unreliable. Fab yields would crash for no apparent reason, and it would take weeks to find the culprit. The strategy behind the “intel delivers” logo may well have come about while Gordon Moore and Bob Graham, soon to be Intel’s first VP of marketing, were out at sea fishing as Noyce was securing the funding needed to start the company. That strategy was to make a brand promise that Intel would not have these crashes: that you could count on Intel to always deliver on time. Moore credits Graham with developing the logo. But the weight of this promise would fall on Intel’s first employee: Andy Grove.
Grove would one day tell how Moore and Noyce said the ‘S’ in his name was an abbreviation for Andrew “Ship-the-Shit” Grove.
Hedge your bets: Moore’s strategy of partnering with suppliers freed up engineering resources for his “Goldilocks strategy.” This entailed splitting them into three teams, each developing a memory based on a different technology: a single-chip bipolar memory, a single-chip silicon-gate metal-oxide-semiconductor (MOS) memory, and a multi-chip package that wired together four memory chips and today would fall into the advanced-packaging category of 2.5D heterogeneous integration.
The first, the bipolar chip, was too hot a porridge, because it was easy to do and would quickly draw competition. While it had the advantage of coming first as the 3101 SRAM, it was, as Moore predicted, quickly copied by Fairchild and others.
The last, based on wiring together four chips, was too cold a porridge, because it would prove too difficult to make. It had all the problems that would limit multi-chip packages for decades: yields were extremely low because there was no such thing as a Known-Good-Die (KGD); they couldn’t figure out how to test it; and it couldn’t pass the ‘shake-rattle-and-roll’ test, failing badly when Moore dropped one on the assembly line and all four chips went flying off its ceramic substrate. In an early fail-fast strategy, long before the term existed, Moore killed the project.
Moore believed the middle bowl, based on silicon-gate MOS technology, was just right competitively. While the technology still had to be developed, it would put Intel in the lead with a healthy gap over the others. It would arrive soon after as the Intel 1101 256-bit MOS SRAM. Intel was on its way.