Semiconductors we want to make, semiconductors we want to use

Jun OKAMURA
9 min read · Dec 28, 2024


The original note is in Japanese (linked below); this English version was translated with OpenAI o1.
https://qiita.com/jun1okamura/items/4b74af3a7b4b9e49ebff

In 1990, NHK hailed Japan as an “electronic powerhouse,” spotlighting the semiconductor industry. Now, 30 years later, it’s back in the limelight. This time around, the show's star is cutting-edge manufacturing technology, but here I want to zero in on semiconductor design. This piece follows my previous design-related posts — “PDK Then and Now” (uploaded December 3, 2023) and “A Qualitative Cost Analysis of the Semiconductor Business” (uploaded August 1, 2024). I’m grateful to everyone who read those posts; if they helped folks in the semiconductor realm get a clearer picture of design, I’d be delighted.

Today, under the heading “Semiconductors We Want to Make, Semiconductors We Want to Use,” I’m going to explain in simple terms some of the challenges around semiconductor device design. I’ve fleshed it out to be useful not just for semiconductor designers but also for those designing or planning equipment and systems that use semiconductors. I hope you’ll enjoy reading it!

Please note that this content is based on my opinions and views from past work experience. It does not represent any organization.

Comparing It to Cooking

Recently, while writing a paper on semiconductor design for a management journal, I got the idea to liken the semiconductor industry to the restaurant business. Essentially:

  • Semiconductor manufacturing equipment = Kitchen appliances
  • Semiconductor manufacturing techniques = Cooking methods
  • Semiconductor processing technology = Culinary skills
  • Semiconductor devices = Dishes served

In the semiconductor industry, attention often goes to the equipment, techniques, and processes for building chips. But in the restaurant world, what matters for business success isn’t the kitchen gear or cooking method — it’s “What kind of dish will make customers happy?”

Likewise, when it comes to semiconductors, the key question should be: “What kind of chip do users want?” For some reason, though, that question doesn’t get much airtime.

Over the 65-year history of semiconductor integration, miniaturization came to be seen as “the ultimate path to competitiveness in semiconductor manufacturing,” largely because for so long you simply couldn’t meet user specs without pushing its boundaries. On top of that, the number of companies able to manufacture cutting-edge, ultra-fine semiconductors has now shrunk to just three worldwide. Since semiconductors are viewed as strategic resources that can sway the balance of power between nations, supporting these firms is seen as indispensable for national economic security.

Certainly, ultra-advanced manufacturing processes are needed to meet the specs for AI chips. However, we still see plenty of chips made using “legacy processes” that date back over 20 years. This shows that “the required miniaturization technology varies by the user’s needs.”

Using our cooking analogy, “the necessary kitchen equipment, cooking methods, and culinary skills depend on the dish you’re making,” which is perfectly logical. Also, a restaurant owner is always thinking about how to use their limited cooking facilities, methods, and skills to devise dishes that will delight their customers. After all, that’s where the chef’s true artistry comes into play.

Of course, some super-luxury dishes simply can’t be completed without the latest kitchen gear and techniques. But that’s not a make-or-break requirement for running a restaurant.

The Chef’s Job

Semiconductor designers are essentially the “chefs” of the electronics world. Their job is to design the semiconductors that match what the customers — i.e., the users — are craving. Just like in the restaurant industry, a chef aims to excel at both the menu (specifications) and the cooking (implementation). However, in general, when people say “semiconductor designer,” especially in Japan, they tend to refer to experts in the latter — “cooking (implementation)”. It seems there’s an assumption that if you’re a master of the cooking (implementation) side, you’ll naturally be able to whip up a satisfying menu (specifications) too. But is that the case?

In reality, even if you’re a world-class cook, if you neglect the art of menu creation (specifications) — things like flavor, presentation, choice of serving dishes, or the combination and order of a course meal — you’ll never reach three-star status. The same goes for semiconductors: if you don’t pay attention to user-friendly specs, like pin layouts, interface choices, operating speed, temperature range, and driver compatibility, you’ll end up with a chip that no one wants to buy.

Of course, the reverse is also true: no matter how good the menu (specifications) is, if the cooking (implementation) skills aren’t top-notch — choosing the right circuit technology, process, architecture, etc. — then users won’t be satisfied with the final product, and the semiconductor won’t sell.

The takeaway here is that a semiconductor designer’s goal should be to excel at both the menu (specifications) and the cooking (implementation).

Be a Gourmet!

They say that ramen shop owners never skip going around and tasting ramen wherever they can. And if you’re an Italian or French chef, you’ll do the same overseas. This kind of “taste-testing” is an essential effort to refine a chef’s menu (specifications). Just as top chefs are gourmets, top semiconductor designers ought to be “chip gourmets.”

A chip gourmet is an engineer with the ability to dive deep into datasheets, combine chips and software to form a complete system, and balance performance and cost with a discerning eye. It’s someone with enough imagination to define the system’s requirements and see them through.

Back in the 1980s, when Japanese semiconductors reigned supreme, major domestic electronics manufacturers designed and produced the semiconductors for their own consumer electronics in-house. The IDM (Integrated Device Manufacturer) structure allowed engineers who could envision the required specs from a system perspective to work side by side with engineers who could implement those specs in silicon. This synergy brought semiconductor design capabilities to their peak. But from the 1990s onward, as those consumer electronics businesses faced weakening product planning and competitiveness, the semiconductor sector began to decline as well.

At the time, many large IDM semiconductor divisions were spun off to form separate subsidiaries. They aimed to fill their fab lines by touting advanced manufacturing processes to external clients for contract chip design and manufacturing. But in the end, this strategy hollowed out the base of engineers who could conceptualize system-level requirements. In a sense, there was an overwhelming focus on chip implementation and management while neglecting to foster chip gourmets.

Meanwhile, in the US, semiconductor startups and fabless chip makers forged a different path. Rather than just emphasizing implementation technology, they set themselves apart by cultivating the capability to imagine requirements as a complete system — the very skillset a chip gourmet needs.

Semiconductors We Want to Make

Looking back on my career, I’ve noticed that semiconductor designers who specialize in cooking (implementation) often pursue “semiconductors they want to make” to verify their advantages or unique approaches on the implementation side.

At semiconductor design conferences, papers and presentations are typically evaluated by how well the Figure of Merit (FOM) — things like lower power consumption, higher operating speed, and better conversion accuracy — has improved. Naturally, the goal becomes to showcase ideas that outperform these metrics on a global stage.

But will such “semiconductors we want to make” actually succeed in business? Even if you’ve got a competitive edge in terms of performance — like speed or accuracy — if you exceed user requirements, you may be weeded out by cost disadvantages.

Using the restaurant analogy, simply hiring a talented chef doesn’t guarantee a booming business. If the restaurant can’t serve a menu that fits its location and target customers, it’ll eventually fail.

So, for a “semiconductor we want to make” to become a “semiconductor people want to use” and succeed commercially, the performance has to line up with user requirements.

Semiconductors People Want to Use

Like the restaurant industry, the merits of a “menu (specifications)” — in other words, whether a semiconductor is something people want to use — are rarely judged by academic papers or conference presentations. Instead, the market itself determines how well these “semiconductors people want to use” fare by looking at actual sales.

In the semiconductor world, making sure the cooking (implementation) aligns with the menu (specifications) is also crucial. Without a “pull” of information from users, relying solely on the designer’s “push” of technical know-how ends up going nowhere.

Perhaps the gradual decline of Japan’s semiconductor industry over the last 30 years comes down to a lack of effort in gathering that “pull” from users. They may have placed too much faith in their ability to differentiate themselves solely through design and manufacturing expertise.

For Japanese manufacturing to advance — and for the semiconductor sector to recover — we need a push-pull relationship, where the “chefs” (semiconductor designers) work closely with the engineers at equipment makers, system makers, and electronics manufacturers to shape the specs they want to use.

In restaurant terms, it’s like creating an exclusive “off-menu” item based on a loyal customer’s request. Unfortunately, ever since Japan’s big IDM companies withdrew from the ASIC business, domestic users themselves seem to have lost much of their ability to craft unique specifications.

The Cost of “Never Failing” vs. Daily Training

Just as a chef’s skill and a regular’s discerning taste are honed through constant practice, so too are a semiconductor designer’s expertise and a semiconductor user’s keen eye shaped by day-to-day effort. Yet, because the development costs of cutting-edge semiconductors have risen exponentially, the industry seems to place all its emphasis on “never failing,” at the expense of giving engineers enough opportunities to develop their skills.

Have we ended up sinking more and more money into fancy design tools, essentially taking out multiple layers of “insurance” to avoid failure, rather than nurturing semiconductor designers’ abilities? Have we, in prioritizing “never failing,” chosen programmable FPGAs, general-purpose processors, or off-the-shelf boards — thereby crushing the seeds of true differentiation for end users?

It’s possible that, constrained by the notion that custom semiconductor development is too expensive, both semiconductor designers and semiconductor users find themselves stuck in a dead end.

Another big issue is the oligopoly of mega-foundries overseas, which makes it nearly impossible to get LVP (Low Volume Production) for the niche devices needed by “long-tail” equipment manufacturers. Are turnkey design vendors aligned with these mega-foundries steering semiconductor users toward expensive advanced-node proposals purely for their own benefit?

Likewise, relying on programmable FPGAs, general-purpose processors, or off-the-shelf boards might be fueling product commoditization, copycats, and knowledge leaks, ultimately undermining a company’s competitive advantage.

Legacy Semiconductors and Open-Source EDA

Just like you don’t need expensive kitchen equipment to hone a chef’s skills, you don’t necessarily need cutting-edge semiconductor manufacturing technology to train a semiconductor designer. And, just like you don’t need to dine at ultra-luxury restaurants to sharpen a connoisseur’s palate, there’s no hard requirement to explore devices that can only be made with the most advanced processes in order to sharpen the discerning eye of semiconductor users.

Both the “skills of semiconductor designers” and the “keen insight of semiconductor users” come from hands-on practice. The most important thing, in my view, is to repeatedly go through the process of turning a user’s device requirements into a working semiconductor, and then assembling and evaluating that chip in a system context.

Using legacy semiconductor technologies that even small and mid-sized companies can afford, combined with cost-friendly open-source EDA tools, makes it possible to methodically iterate through the loop of:

(1) defining device requirements → (2) implementing semiconductor designs → (3) building shuttle prototypes → (4) evaluating them in an actual system.

Using a restaurant analogy, just as new cooks refine their skills by making the staff meal, semiconductor designers also need “staff-meal” design experience to polish their craft. Legacy semiconductor tech and affordable open-source EDA tools offer exactly this kind of training opportunity.

Likewise, if elite foodies sharpen their eyes and palates by frequenting different restaurants, perhaps system designers could also benefit from “taste-testing” in device design. Again, legacy semiconductors and open-source EDA tools can provide them with exactly that chance to develop their keen sense of what makes a chip truly appetizing.

Aiming for Push-Pull Collaboration

When people hear “semiconductors,” they often think of miniaturization, cutting-edge lithography, manufacturing equipment, or specialized materials. But without an integrated circuit (IC) implemented, a semiconductor is just a piece of silicon. The key points are which IC functions should be built in and how much value they provide. In practice, the choice of manufacturing technology — the process of embedding those IC functions into the silicon — is a business decision, balancing costs against the value that the circuit delivers.

Meanwhile, the value created by an IC ultimately stems from how it’s used — by those who integrate the semiconductor into a device or system. If the final device or system doesn’t become more valuable, then there’s no real value in the semiconductor. That’s why we need a synergy between the “push” (the technical prowess of foundries and semiconductor designers) and the “pull” (the requirements of device/system manufacturers as the end users).

To achieve this push-pull collaboration, we should lower the barriers to semiconductor device development and foster an environment where semiconductor designers and system engineers can learn from each other. By making use of legacy semiconductor technologies and open-source EDA tools, we can open the door to new possibilities and drive this collaboration forward!

Jun

2024/Dec/23rd
