2025-03-27

100x microsystems and microchips

History of microchips

Microchips fueled the Silicon Valley revolution of the 1970s and made the US the leading nation in designing, creating, and scaling the use of integrated circuits. Revolutions in Japan, Korea, and Europe soon followed.

But in the late 1990s, semiconductor fabrication started to migrate to Asia, beginning with the chip-packaging industry, and TSMC (Taiwan Semiconductor Manufacturing Company) emerged as a major player. Then the “fabless” model, in which companies design chips but outsource manufacturing, became prominent, and US manufacturers happily shipped national talent and technology abroad.

What was kept in the US was the design of integrated circuits and products. In a way, this is similar to other heavy industries like steel, chemicals, materials, and car manufacturing, which slowly globalized their operations in the 1970s and 1980s.

Today

Today the US has shed most of its production capability for advanced microchips and, to make things worse, also has a deep shortage of workers and know-how in microchip fabrication, instrumentation and tooling, and the deployment of fabrication facilities (fabs).

To make matters even worse, most production of advanced microchip components takes place on the geopolitically unstable island of Taiwan.

Europe maintained leadership in the area of microchip machinery needed in the fabs.

All the microchips you use in your car, microwave oven, fridge, computer, laptop, iPhone, cell-phone, internet router, etc. are built outside of the US. That would be OK if it weren’t that pretty much all of them are currently built only in Taiwan.

Globalization helps industry save costs by moving laborious manual work to cheaper countries, but it does not help when global trade is disrupted and there is a national need for the industry to manufacture on its own soil.

What has happened to building microchips?

The way for an advanced industrial nation like the US to stay in a prominent position is to keep advancing the field and innovating. But one can hardly innovate on something one no longer produces, with no first-hand knowledge of the challenges of production.

Many of us today would like to see microchip development follow the same trajectory as writing code or 3D printing: use cheap design tools at home, design your microchip based on a cheap or free library of components, send it off for fabrication in a cheap shared run, and receive the parts in the mail some time later.

But no… We spoke about the barrier before: https://euge-blog.github.io/2022/08/08/semiconductor-for-masses.html

In a nutshell the barriers (mountains) are:

  • No cheap CAD (computer-aided design) software for microchips, coupled with foundry design kits
  • Legal barriers in obtaining foundry design kits and circuit components (IPs, or intellectual property) that would enable Lego-like modularity
  • Expensive production runs in modern fabrication processes
  • Lack of design foundries readily available for prototyping and production, especially in niche areas of microchips like memories
  • Funding an ecosystem to break these barriers in the US has failed, because it keeps funding large corporations that have no interest in changing their modus operandi, and that have in fact squandered a fortune in taxpayer money over the last two decades obfuscating microchip design even further

These are seemingly insurmountable barriers to progress, but in reality all that is needed is better business models that foster sharing resources without sacrificing trade secrets.

Articles like this one: https://semianalysis.com/2025/03/11/america-is-missing-the-new-labor-economy-robotics-part-1/ also argue that iterating between groups on two continents 12 hours apart is not effective, and that building small cities where manufacturing is concentrated in one area can boost productivity and innovation.

Smart and industrious colleagues have spent years trying to democratize microchips: https://www.zeroasic.com/ but without the backing of a continuous stream of funding and the attention of the right leadership, they cannot move mountains alone.

Besides democratizing microchip design, another issue is innovation in fabrication techniques.

The illusion of light

Today microchips are built by light and gas. Gas is cheap but light is becoming the most expensive “material”.

Microchips today start with a flat disk of silicon that is patterned using masks and light exposure at very small wavelengths to produce nanometer-scale features. Transistor features were 1-2 microns wide in the 1990s and have progressively shrunk to today’s 3-5 nanometers. Microchip fabrication now requires light deep in the ultraviolet spectrum (smaller wavelength = smaller features), which is VERY expensive to produce and focus (conventional lenses do not work, and finding materials to bend this light is a research endeavor in itself).

If you think we could one day have the equivalent of a 3D printer at home to forge our microchips, you may be surprised to learn that the machine needed just to produce the light for microchip fabrication is the size of two shipping containers and costs a fortune. No 3D printer anytime soon 😔

Light alone is just not the way down any longer. We need to find better ways.

The missing 100x

AI today is still at least 100x less efficient than what we want it to be, taking the human brain as a proxy.

Today the digital microchips powering your computers, laptops, and cell-phones consume power proportional to:

Static: P = I_leak V
Dynamic: P = A C V^2 f

I_leak is the current that leaks through circuits when they are supposed to be OFF (but still powered)
V is the operating voltage of the circuit
C is the total capacitance switched by the circuit
A is the fraction of time the circuit is active, say 50% typically
f is the operating frequency of the circuit
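To make the two formulas concrete, here is a minimal sketch in Python that plugs in hypothetical numbers (all values below are assumptions for illustration, not measurements of any real chip):

```python
# Illustrative power model; all numbers are assumed, not measured.
I_leak = 0.5   # total leakage current, amperes (assumed)
V = 1.0        # supply voltage, volts (assumed)
A = 0.5        # activity factor: fraction of time the circuit is active
C = 1e-8       # total switched capacitance, farads (assumed)
f = 1e9        # clock frequency, hertz (1 GHz)

P_static = I_leak * V          # static:  P = I_leak V
P_dynamic = A * C * V**2 * f   # dynamic: P = A C V^2 f

print(f"static:  {P_static:.2f} W")   # 0.50 W
print(f"dynamic: {P_dynamic:.2f} W")  # 5.00 W
```

With these made-up numbers, dynamic power dominates, which is why the ideas below attack A, f, and V.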

What can be done to reduce the power consumed by your processors?

Modern circuits operate using a clock, and the clock tree alone generally consumes ~50% of a circuit’s overall power. If your laptop uses 20 W of power, 10 W are “wasted” by the clock tree.

IDEA #1: remove clocks

Circuit designers have long been trying to save power by reducing V toward the physical limits of devices, today close to 0.7 V, and by reducing C, that is, making devices smaller through light-based scaling.

Today, barring a complete change of fabrication technologies, we can only do:
IDEA #2: reduce f

This means reducing the frequency of operation, i.e., doing less per device. We cannot do this unless we also increase the number of devices accordingly: instead of running 1 device at 1 GHz, run 1 million devices at 1 kHz. Does this help?

It can only help if the I_leak of the devices is low, so that having more devices does not raise the power consumption instead, and if each device is used less per unit of time. This may mean using older microchip processes with lower leakage, at the expense of more capacitance C.
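A quick back-of-the-envelope sketch (with assumed per-device numbers) shows why leakage decides whether this trade pays off: total switching activity per second is the same in both options, but the static term scales with the device count.

```python
# Trade frequency for parallelism: same work per second, different power.
# All per-device numbers below are assumptions for illustration.
C = 1e-15       # switched capacitance per device, farads (assumed)
V = 0.7         # supply voltage, volts
A = 0.5         # activity factor
I_LEAK = 1e-9   # leakage current per device, amperes (assumed)

def total_power(n_devices, f):
    dynamic = n_devices * A * C * V**2 * f   # P = A C V^2 f, summed over devices
    static = n_devices * I_LEAK * V          # P = I_leak V, summed over devices
    return dynamic, static

# Option 1: one device at 1 GHz. Option 2: a million devices at 1 kHz.
dyn_fast, sta_fast = total_power(1, 1e9)
dyn_slow, sta_slow = total_power(1_000_000, 1e3)

print(f"dynamic: {dyn_fast:.3e} W vs {dyn_slow:.3e} W")  # essentially identical
print(f"static:  {sta_fast:.3e} W vs {sta_slow:.3e} W")  # ~1,000,000x more leakage
```

With these assumed numbers, the million-device option is dominated by leakage, which is exactly the point above: the trade only works if I_leak per device is very low.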

IDEA #3: reduce A

Instead of running all your devices at a 50% duty cycle, run them more sparsely, say 1% of the time. This would reduce dynamic power by 50x.

This requires re-thinking our algorithms so that they make use of sparsity in data or events.
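Since A enters the dynamic-power formula linearly, cutting the activity factor from 50% to 1% cuts dynamic power by the same factor. A one-line check (with assumed C, V, f):

```python
# Dynamic power scales linearly with the activity factor A.
C, V, f = 1e-8, 0.7, 1e9   # assumed values for illustration

def dynamic_power(A):
    return A * C * V**2 * f   # P = A C V^2 f

ratio = dynamic_power(0.50) / dynamic_power(0.01)
print(ratio)  # 50x reduction
```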

To the rescue

What technologies can offer the gains we need?

I argue that we ought to move away from the current microchip fabrication technologies and instead revolutionize the field with new ideas. This is not easy as the current techniques are the pinnacle of human abilities in positioning layers of atoms, manipulating material at the atomic scale.

Can we go beyond silicon technologies?

  • The brain uses biological neurons
  • The human brain is the most complex computing device in the known universe

Yet our ability to manipulate biology at the scale and resolution of microchip technologies is not yet developed.

Also do you want a brain on your desktop drinking orange juice while doing your taxes? 😀

What other technologies can we use? TBD.

Before we can answer this question we need to know what kind of “device” we are looking for. Maybe we need a “transistor” or something we can turn ON / OFF using an input signal (digital or analog). Or maybe we need a “neuron” that is again something we can turn ON/OFF or make it pulse (spike) based on MANY analog input signals.

Or maybe the basic building blocks are a transistor and a capacitor, or equivalent: one switch to get signal in and a storage tank to store some inputs.

In the most general sense we need to have a device that can take an input and use it to control its output. If we had such a device, we could build computers with it.
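As a toy illustration of the two candidate building blocks described above, both boil down to “an input controls the output”. These are pure abstractions I am assuming for this sketch, not device physics:

```python
# Two toy abstractions of a basic computing device. Purely illustrative.

def switch(gate: bool, signal: float) -> float:
    """Transistor-like device: a gate input controls whether the signal passes."""
    return signal if gate else 0.0

class SpikingNeuron:
    """Neuron-like device: integrates MANY analog inputs, leaks over time,
    and emits a pulse (spike) when an internal threshold is crossed."""
    def __init__(self, threshold: float = 1.0, leak: float = 0.9):
        self.v = 0.0
        self.threshold = threshold
        self.leak = leak

    def step(self, inputs) -> int:
        self.v = self.v * self.leak + sum(inputs)
        if self.v >= self.threshold:
            self.v = 0.0    # reset after firing
            return 1        # spike
        return 0

print(switch(True, 0.5), switch(False, 0.5))   # 0.5 0.0
n = SpikingNeuron()
print([n.step([0.6]) for _ in range(3)])       # [0, 1, 0]: accumulates, then spikes
```

Any physical substrate that can realize either abstraction, at scale and at low energy, is a candidate for building computers.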

Thoughts

In general, losing manufacturing to another country is never a good idea. It drains the brains and hands that can do specific jobs, and with them the opportunity for innovation.

Maybe one of the main reasons we do not have flying cars today is that we never tried. We were just too busy reducing costs at the expense of innovation.

But what about reducing manufacturing costs IN YOUR OWN COUNTRY? This is the true INNOVATION!

It requires innovation in automation, robotics, high-throughput machinery, streamlined processes to obtain and process raw materials, store them, retrieve them, manipulate and forge.

Without a local need for better manufacturing, we will lose this kind of innovation to other countries, slowly eroding our competitiveness.

Only the government can help maintain manufacturing leadership, because only it has the vision to see the future and stick to long-term plans spanning decades. It needs to concentrate factories and resources around a few areas and create an ecosystem.

About the author

I have more than 20 years of experience in neural networks in both hardware and software (a rare combination). About me: Medium, webpage, Scholar, LinkedIn.

If you found this article useful, please consider a donation to support more tutorials and blogs. Any contribution can make a difference!

This post is licensed under CC BY 4.0 by the author.
