TL;DR
This newsletter covers the semiconductor industry and AI development, and is the first in a three-part analysis of the topic. This part provides a short history of computer chips, an overview of the semiconductor supply chain, and an introduction to some of the key players in the industry.
Here are the key takeaways:
The invention of the transistor in the 1940s was a significant moment in the history of computer chips. It was with this semiconductor that the first iterations of advanced computer chips were developed, paving the way for more sophisticated computers.
Throughout the remainder of the 20th century and into the early 21st century, these chips continued to improve in accordance with what is known as Moore's law. This resulted in more powerful chips that became cheaper to produce.
Today, the supply chain for computer chips is global and complex. It involves a network of different companies that design, manufacture, test, package and distribute these components which are found in many modern electronic devices.
A variety of companies are involved in different aspects of the semiconductor supply chain. Among the most significant are TSMC, SMIC, Samsung, Intel, ASML, Huawei and Nvidia.
A short history of computer chips
This starts with George Boole in nineteenth-century England. He was a mathematician and philosopher who came up with the idea of representing logic as a series of binary values, known as binary digits or 'bits'.
These bits can, theoretically, be represented by anything, for example using red and blue M&Ms. However, 1 and 0 are used to represent bits.
Bits are the bedrock of computing. In 1938, Claude Shannon, a master's student at MIT, discovered that electronic circuits could be built using Boolean logic, with 1 and 0 representing 'on' and 'off'.
It was a transformative discovery, which paved the way for computers built using electronic components. The first programmable, electronic, digital computer would famously be used by a team of Allied codebreakers, including Alan Turing, during World War Two.1
From this development came the transistor in 1947. The transistor is a type of semiconductor device, made from material that conducts electricity only under certain conditions (hence the 'semi' in 'semiconductor').
Scientists at Bell Labs (which included John Bardeen, Walter Houser Brattain and William Shockley) realised that these semiconductors could be used to build 'logic gates'. These are "devices that could do elementary logic calculations."2
A computing device capable of performing lots of calculations can be produced by stacking these logic gates together. This is what paved the way for more sophisticated computers.
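As a minimal illustration of this idea (a Python sketch of the logic, not how real hardware is built), two elementary gates can be stacked into what is known as a 'half adder', a small circuit that adds a pair of bits:

```python
# Elementary logic gates operating on single bits (0 or 1).
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Stack two gates to add a pair of bits.

    Returns (sum_bit, carry_bit): XOR gives the low bit of the
    result, AND gives the carry into the next column.
    """
    return XOR(a, b), AND(a, b)

# 1 + 1 = binary 10: sum bit 0, carry bit 1.
print(half_adder(1, 1))  # (0, 1)
```

Chaining many such adders together (plus gates to route the carries) is, in essence, how a processor performs arithmetic, which is what the text means by stacking logic gates to build a computing device.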
When they were first developed, transistors were big and clunky, requiring further R&D to make them smaller. A breakthrough came in 1959, when Robert Noyce of Fairchild Semiconductor created the integrated circuit, which combined several very small transistors into a single component.
Due to the tiny size of these transistors, specialist equipment was needed to carry out the delicate process of creating them:
They were made through an elaborate process a little like chemical photography, called photolithography. Engineers would shine ultraviolet light through a film with a circuit design on it, much like a child's stencil. This imprints a circuit onto a silicon wafer, and the process can be repeated several times on a single wafer - until you have several transistors on top of one another. Each wafer may contain several identical copies of circuits, laid out in a grid. Slice off one copy and you have a silicon 'chip'.3
The trend of making these chips ever smaller was recognised in 1965 by Gordon Moore, a researcher at Fairchild Semiconductor. Moore found that:
The physical area of an integrated circuit was falling by half every year without any decrease in the number of transistors contained therein.
The films used for photolithography were getting more detailed, the transistors themselves were getting smaller and the components were becoming more intricate, resulting in decreasing costs and rising performance.
This is the idea behind Moore's law, whereby "every 18-24 months, chips would get twice as powerful for the same cost."4 While Moore's law is more of an observation rather than an actual law, the development it describes remained true for the semiconductor industry throughout the remainder of the 20th century and in the early 21st century:
In 1958, Fairchild Semiconductor sold 100 transistors to IBM for $150 apiece. By the 1960s, the price had fallen to $8 or so per transistor. By 1972...the average cost of a transistor had fallen to 15 cents, and the semiconductor industry was churning out between 100 billion and 1 trillion transistors a year. By 2014, humanity produced 250 billion billion transistors annually. Each second, the world's fabs - the specialised factories that turn out transistors - spewed out 8 trillion transistors; 25 times the number of stars in the Milky Way. The cost of a transistor had dropped to a few billionths of a dollar.
Why does this matter? Because it led computers to improve at an astonishing rate. The speed that a computer can process information is roughly proportional to the number of transistors that make up its processing unit. As chips gained transistors, they got faster. Much faster. At the same time, the chips themselves were getting cheaper.5
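As a back-of-the-envelope illustration (assuming an idealised, clean doubling every two years, which real chips only ever approximated), the exponential growth Moore's law describes can be sketched as:

```python
def transistors(years_elapsed, initial_count, doubling_period_years=2.0):
    """Project transistor count under an idealised Moore's law:
    the count doubles once every doubling period."""
    return initial_count * 2 ** (years_elapsed / doubling_period_years)

# Starting from roughly 2,300 transistors (about what Intel's first
# microprocessor, the 4004, contained in 1971), forty years of doubling
# every two years gives around 2.4 billion - the right order of
# magnitude for high-end CPUs of the early 2010s.
print(round(transistors(40, 2300)))
```

The point of the sketch is simply that twenty doublings multiply the starting count by about a million, which is why the per-transistor cost figures quoted above collapse so dramatically.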
How computer chips are made today
Today, the supply chain for semiconductors is global and complex. It involves a network of different companies that design, manufacture, test, package and distribute these components which are found in many modern electronic devices.
The supply chain consists of four main stages: design and development, fabrication, testing and assembly, and distribution.
Design and development
This is where companies design the architecture of the chip. How a chip is built depends on what it is being optimised for and the needs of the eventual user, meaning that different types of chips exist.
For example, memory chips, also known as random access memory, are designed to temporarily store data that the computer's processor (another type of chip) is actively working with. Other types of chips include central processing units (CPUs) and graphics processing units (GPUs).
Due to the continuation of Moore's law, computer chips are now so incredibly small that it is standard to measure them in terms of nanometers (nm), or one billionth of a meter. Today's most advanced chips, like those found in the latest iPhones, are built on a 3nm process, and the smaller the transistors, the more that can be packed into a chip, increasing its power.
There are several metrics that can be used to measure the processing power of a chip. One metric is floating point operations per second, also known as FLOPS, which measures how quickly a computer or processor can perform mathematical calculations involving floating-point numbers:6
In simple terms, FLOPS measure how quickly a chip can process math problems with decimal numbers.
The more of these problems it can complete in a given time, the more computational power the chip possesses.
For example, if a chip has a FLOPS rating of 2 gigaflops (2 billion FLOPS), it means that the computer can perform 2 billion floating-point calculations per second.
The higher the FLOPS, the more powerful the chip.
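The arithmetic behind a FLOPS rating is simply operations divided by elapsed seconds. Below is a rough, purely illustrative Python sketch of measuring it directly; interpreter overhead means it vastly understates what the hardware can actually do, so treat it as a demonstration of the metric rather than a real benchmark:

```python
import time

def estimate_flops(n_iterations=1_000_000):
    """Crudely estimate achieved FLOPS by timing a loop of
    floating-point multiply-adds. Python's interpreter overhead
    dominates, so this understates the hardware's true peak by
    several orders of magnitude."""
    x = 0.0
    start = time.perf_counter()
    for _ in range(n_iterations):
        x = x * 1.0000001 + 0.5   # two floating-point operations
    elapsed = time.perf_counter() - start
    return (2 * n_iterations) / elapsed  # operations per second

# A chip rated at 2 gigaflops would sustain 2e9 such operations per second.
print(f"{estimate_flops():.3g} FLOPS")
```

Real FLOPS figures are measured with tuned numerical kernels (dense matrix multiplication is the classic choice), but the definition, operations completed per second, is exactly what this sketch computes.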
Complex software is normally required to develop these chips given the number of layers involved and how small these devices are. The US captures around 40% of the design market, which includes electronic design automation, intellectual property, and design services revenue.
Fabrication
This involves companies, known as foundries, that operate factories with specialised equipment to produce computer chips. Taiwan dominates this market, producing around 37% of the world's semiconductors in its foundries.7
The specialised equipment is used to create and etch the chip designs onto layers of silicon wafers. This process uses photolithography, where engineers "shine ultraviolet light through a film with a circuit design on it, much like a child's stencil."8
A range of materials are needed for the fabrication process, including silicon wafers, photomasks and photoresists. According to the US Geological Survey (USGS), the top producer of silicon is China, and other top producers include Russia, Brazil, Norway and the US.
To produce semiconductors, a particular type of sand called silica, made of silicon dioxide, is used. This sand is melted and cast into a cylinder called an ingot. Thin silicon wafers can then be sliced from this ingot and the chip designs etched onto each layer.
Testing and assembly
After the chips are made, they are tested to ensure they work as intended and to identify any that may be defective. This testing is usually carried out by a company other than the fabricator.
After passing these tests, the chips are assembled into products like smartphones and laptops. The assembly process usually takes place at another company.
Distribution
The completed devices, with the chips inside, are distributed to be sold. For instance, servers equipped with GPUs are sold to cloud companies that integrate the hardware into their data centres to provide virtual computing services over the internet.
Key players in the semiconductor industry
There are several different companies involved in the various aspects of the semiconductor supply chain. The diagram below attempts to map some of these companies and where they fit within this supply chain:
There are many more companies that form part of this supply chain. Some of the more significant ones are explored in more detail below.
TSMC
The Taiwanese Semiconductor Manufacturing Company (TSMC) is arguably the most important company in the semiconductor supply chain.
As the world's biggest semiconductor manufacturer, TSMC has around 50% market share in the global foundry market, with Samsung in second place at 17%. TSMC therefore produces the majority of the most advanced chips, including those powering the data centres used for AI development.
Back in the 1960s, the Taiwanese government made a deliberate decision to give the country a crucial role in the world of semiconductors. At the time, Taiwan was already one of the leading Asian countries for assembling semiconductor devices (namely testing chips made abroad and attaching them to plastic or ceramic packages).
But it wanted to strengthen its position further in light of the growing competition from China:
Across the Taiwan Strait, Mao Zedong had died in 1976, reducing the threat of imminent invasion. But China now posed an economic challenge. Under its new, post-Mao leadership, China began integrating into the global economy by attracting some of the basic manufacturing and assembly jobs that Taiwan had used to lift itself out of poverty. With lower wages and several hundred million peasants eager to trade subsistence farming for factory jobs, China's entry into electronics assembly threatened to put Taiwan out of business.9
As such, in 1987 the Taiwanese government turned to Morris Chang to help set up a chip manufacturing company. Chang had helped Texas Instruments establish a facility in Taiwan to assemble its chips, and his experience would prove useful in the founding of TSMC.
More specifically, Chang would set up the company with a business model that had previously been rejected by Texas Instruments: specialising in the manufacture of chips that have been designed by another company:
At the time, chip firms like TI, Intel, and Motorola mostly manufactured chips they had designed in-house. Chang pitched this new business model to fellow TI executives in March 1976. "The low cost of computing power," he explained to his TI colleagues, "will open up a wealth of applications that are not now served by semiconductors," creating new sources of demand for chips, which would soon be used in everything from phones to cars to dishwashers. The firms that made these goods lacked the expertise to produce semiconductors, so they'd prefer to outsource fabrication to a specialist, he reasoned. Moreover, as technology advanced and transistors shrank, the cost of manufacturing equipment and R&D would rise. Only companies that produced large volumes of chips would be cost-competitive.10
With TSMC specialising in building chips that were designed elsewhere, so-called 'fabless' companies specialising in designing and selling chips started to emerge. This "transformed the tech sector by putting computing power in all sorts of devices."11
The creation of TSMC therefore helped to establish both the foundry business model and the fabless business model:
Foundry business model - making chips designed by others (i.e., TSMC).
Fabless business model - designing chips to be made by others.
As a result, the semiconductor industry moved away from vertical integration, in which companies both designed and manufactured their own chips. This provided the opportunity for several fabless companies to emerge and thrive, including Nvidia.
SMIC
The Semiconductor Manufacturing International Corporation (SMIC) is China's largest chip manufacturer.
It makes up only 5% of the global foundry market and is therefore a long way behind the market leader, TSMC. However, it played a key role in the development of the Kirin chip in Huawei's Mate 60 smartphone and, despite being restrained by US sanctions, is arguably close to producing chips as advanced as those made by TSMC:
SMIC is at worst only a handful of years behind TSMC. One could argue that SMIC is at most only a few years behind Intel and Samsung despite restrictions. As SMIC is replicating what has been done elsewhere, the gap could be even narrower due to their excellent engineering pool from mainland China as well as many courted immigrants from Taiwan that were formerly employed by TSMC.
The founding of SMIC was fuelled by the desire to compete with TSMC. Leveraging the country's growing dominance in assembling electronic devices, the Chinese Communist Party (CCP) turned to Richard Chang, who at Texas Instruments "became an expert in operating fabs, running TI's facilities around the world, from the U.S. to Japan, Singapore to Italy."12 The CCP trusted him to do the same for them with their own fab company.
SMIC was founded in 2000, raising more than $1.5 billion from a range of international investors, including Goldman Sachs, Motorola and Toshiba. These funds were used to hire hundreds of foreign employees to operate the fab, including four hundred from Taiwan, mirroring the strategy taken by TSMC years earlier when it hired foreigners (particularly experienced Americans).
Samsung
The Samsung Group is a South Korean technology company with a significant presence in the semiconductor industry.
It has foundries around the world, though mainly in South Korea, and largely produces memory chips for desktop computers and smartphones. This activity makes up around 35% of the company's revenue.
Samsung both designs and fabricates its own chips. However, as mentioned before, TSMC still dominates the production of the most advanced semiconductors, benefitting from being able to serve wealthy customers like Apple since, unlike Samsung, the Taiwanese company is not a direct competitor with these tech companies.
Samsung has been trying to make itself more competitive in the space, for example by mass producing 3nm chips. It remains to be seen whether these efforts will prove successful.
Intel
This American company is both a designer and fabricator of semiconductors. It owns a number of foundries worldwide designing and producing chips, though most are located in the US.
The story of Intel's founding involves one of the scientists responsible for the development of the transistor at Bell Labs in the 1940s, William Shockley. In 1955, Shockley founded Shockley Semiconductor Laboratory which was a pioneering semiconductor developer and the first company to work on silicon-based semiconductor devices in Silicon Valley.
Shockley hired eight top scientists to work with him at his organisation: Julius Blank, Victor Grinich, Jean Hoerni, Eugene Kleiner, Jay Last, Gordon Moore, Robert Noyce, and Sheldon Roberts. But as Sebastian Mallaby notes in his book, The Power Law: Venture Capital and the Art of Disruption, while Shockley was considered a "scientific genius", he was also criticised as a "maniacal despot" and consequently quite unpleasant to work for.13
As a result, these top scientists at Shockley, who would come to be known as the 'traitorous eight', decided to defect and start their own company called Fairchild Semiconductor in 1957 with the help of venture capitalist Arthur Rock. It became a leading company in making transistors.
But just over a decade after Fairchild's founding, two of the traitorous eight, Robert Noyce and Gordon Moore, left to create Intel in 1968. The company would go on to become very successful in the semiconductor industry, managing to become a publicly traded company just two years after its founding.
Intel's success is the result of its strong footing in microprocessors. The company has always specialised in the fabrication of CPUs, a type of microprocessor that is essentially the 'brain' of the computer.
The strength of CPUs is that they are reliable general purpose chips capable of running many different kinds of calculations. As such, beginning in the 80s, Intel held a dominant position in CPUs and, for a long time, almost every PC maker had to use chips produced either by Intel or AMD "because these two firms had a de facto monopoly on the x86 instruction set that PCs required."14
However, while Intel still remains a profitable company today, it has not been without its struggles. For one, it missed the opportunity presented by the growth of AI research, which has instead been seized by Nvidia and others.
Additionally, the company failed to upgrade its manufacturing equipment during the advent of extreme ultraviolet (EUV) lithography around a decade ago. This turned out to be a "historic mistake" that saw TSMC overtake the American company to become the most advanced chip maker in the world.
ASML
As already explained, modern semiconductors are incredibly small and therefore require specialist equipment to fabricate. In this space, the Dutch company Advanced Semiconductor Materials Lithography (ASML) holds the position of being the "sole provider of the latest generation of photolithography scanner equipment (extreme ultraviolet, or EUV, lithography machines)".
This is the equipment that TSMC, Samsung, Intel and other fabricators rely on to produce chips with features just nanometers in size. ASML is therefore a very important company in the supply chain, both now and for the future:
Without ASML's lithography machines, products as ubiquitous as Apple's iPhones or as sophisticated as the Nvidia chips that power ChatGPT would be impossible. Only three companies in the world - Intel, Samsung and TSMC - are capable of manufacturing the advanced processors that make these products possible; all rely on ASML's cutting-edge equipment to do so.
The innovations achieved by ASML have ensured that transistors have continued to shrink, thereby making chips more powerful. The pace of progress in the technology industry over the past five decades has been made possible by exponential increases in semiconductor transistor density.
Huawei
This Chinese telecoms company was founded by its current CEO, Ren Zhengfei, who served in the People's Liberation Army (PLA), during which time he worked in a factory producing synthetic fibre for garments. After leaving the PLA, he moved to Shenzhen, which was at the time one of the beneficiaries of Chinese economic reforms designed to spur growth and entrepreneurship through favourable laws and increased foreign investment.
It was under these conditions that Ren started a business importing telecom switches from Hong Kong with $5,000 in startup capital. But his business partners cut him off when they realised that he was making money reselling their equipment, leading him to build his own:
By the early 1990s, Huawei had several hundred people working in R&D, largely focused on building switching equipment. Since those days, the telecom infrastructure has merged with digital infrastructure. The same cell towers that transmit calls also send other types of data. So Huawei's equipment now plays an important - and in many countries, crucial - role in transmitting the world's data. Today it is one of the world's three biggest providers of equipment on cell towers, alongside Finland's Nokia and Sweden's Ericsson.15
Huawei has been accused of intellectual property (IP) theft in the past and has settled patent infringement lawsuits with the likes of Cisco and Motorola. Even so, it would not be accurate to attribute the company's success to IP theft alone:
The company has developed efficient manufacturing processes that have driven down costs and built products that customers see as high-quality. Huawei's spending on R&D, meanwhile, is world leading. The company spends several times more on R&D than other Chinese tech firms. Its roughly $15 billion annual R&D budget is paralleled by only a handful of firms, including tech companies like Google and Amazon, pharmaceutical companies like Merck, and carmakers like Daimler or Volkswagen. Even when weighing Huawei's track record of intellectual property theft, the company's multibillion-dollar R&D spending suggests a fundamentally different ethos than the "copy it" mentality of Soviet Zelenograd, or the many other Chinese firms that have tried to break into the chip industry on the cheap.16
For many years, Huawei's equipment was widely used across the world, allowing the company to gain more market share. Its growth even caused some Western telecom firms to go bust.
Such growth allowed the company to venture into smartphones and, eventually, into designing its own chips. It designs around 250 different semiconductors for its wide range of products.
Also, like many other fabless companies, it relied on TSMC to manufacture its chips:
By the end of the 2010s, Huawei's HiSilicon unit was designing some of the world's most complex chips for smartphones and had become TSMC's second-largest customer. Huawei's phones still required chips from other companies, too, like memory chips or various types of signal processors. But mastering the production of cell phone processors was an impressive feat. America's near monopoly on the world's most profitable chip design businesses was under threat. This was more evidence that Huawei was successfully replicating what South Korea's Samsung or Japan's Sony had done decades earlier: learning to produce advanced technology, wining global markets, investing in R&D, and challenging America's tech leaders.17
However, Huawei's progress on chips has been somewhat hampered by US sanctions, forcing it to rely more on Chinese companies in the supply chain to produce the components it needs. Despite this, the company does not appear to be too far behind the likes of Apple in terms of the quality of its chips.
Nvidia
As mentioned before, the founding of TSMC opened up opportunities to start fabless companies that design chips that other companies actually manufacture. Nvidia is an example of such a fabless business.
Founded in the early 90s, Nvidia originally focused on computer graphics rather than microprocessors, partly due to Intel's dominance in the market for the latter. Accordingly, its main product has always been its GPUs.
GPUs were originally designed to enable computers to render high quality 3D graphics. To achieve this, the chips are designed to perform parallel processing; they can run numerous calculations in parallel, which is required for generating graphics on a computer.
These calculations are applied to each of the thousands of pixels in the image that dictate how those pixels are rendered. Such calculations are fairly simple, but GPUs can run them on thousands of pixels simultaneously, something that CPUs struggle with.18
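This kind of workload is 'data parallel': the same simple calculation is applied independently to every pixel, so each one could in principle be computed at the same time. A small, purely illustrative Python sketch, using a hypothetical gamma-correction step as the per-pixel calculation:

```python
def gamma_correct(pixel, gamma=2.2):
    """One simple per-pixel calculation: adjust a brightness value
    in the range 0.0-1.0. Illustrative only."""
    return pixel ** gamma

# A tiny hypothetical 2x3 greyscale "image": one brightness value per pixel.
image = [[0.0, 0.5, 1.0],
         [0.2, 0.4, 0.8]]

# Each pixel's result depends only on that pixel, so every call below
# could run simultaneously on a separate GPU core. A CPU with a handful
# of cores would instead work through them largely one at a time.
corrected = [[gamma_correct(p) for p in row] for row in image]
print(corrected)
```

Scale the grid up to the millions of pixels in a real frame and the advantage of running those independent calculations across thousands of GPU cores, rather than sequentially, becomes clear.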
But in addition to designing GPUs, Nvidia has also developed a software ecosystem around them called CUDA:
In 2006, realizing that high-speed parallel computations could be used for purposes besides computer graphics, Nvidia released CUDA, software that lets GPUs be programmed in a standard programming language, without any reference to graphics at all...Huang gave away CUDA for free, but the software only works with Nvidia's chips. By making the chips useful beyond the graphics industry, Nvidia discovered a vast new market for parallel processing, from computational chemistry to weather forecasting. At the time, Huang could only dimly perceive the potential growth in what would become the biggest use case for parallel processing: artificial intelligence.19
Today, Nvidia offers full-stack computing infrastructure and is (currently) the most valuable company in the world by market capitalisation. Data centre sales make up over 80% of its revenue, driven by the strong demand for the GPUs needed to develop large language models, recommendation engines and other AI-driven apps.
In reporting its results for the first quarter of its 2025 fiscal year, Jensen Huang, CEO of Nvidia and one of its founders, emphasised the company's rising influence amid the current AI boom:
The next industrial revolution has begun. Companies and countries are partnering with NVIDIA to shift the trillion-dollar installed base of traditional data centers to accelerated computing and build a new type of data center, AI factories, to produce a new commodity, artificial intelligence.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.19.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.18.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.19.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.19.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.21.
A floating point number is used to represent numbers with decimal points that a computer can interpret.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.xxv.
Azeem Azhar, Exponential: How Accelerating Technology is Leaving Us Behind and What To Do About It (Penguin Random House 2021), p.19.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.164.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.166.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.168.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.179.
Sebastian Mallaby, The Power Law: Venture Capital and the Art of Disruption (Allen Lane 2022), p.22.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.209.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.271.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.272.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.275.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.210.
Chris Miller, Chip War: The Fight for the World’s Most Critical Technology (Simon & Schuster 2022), p.211.