How Chips Are Designed
In this chapter...
- Old-Style Design Process
- New-Style Design Process
- Verifying the Design Works
- Using Outside IP
- Getting to Tape Out and Film
- Current Problems and Future Trends
Chip design has come a long way since the first semiconductor chips were assembled, literally, by hand. Rarely does a profession reinvent itself so often and so fundamentally. New chips today are designed in a completely different manner from those of just 10 years ago; it's almost certain that the job description will change again in another 10 years.
New chips are designed, naturally enough, with the help of computers. These computers (often called engineering workstations) and the chips inside them were themselves designed this way, providing a nice circular closure to the entire process. The computers that chip-design engineers use are not fundamentally different from a normal PC, and even the specialized software would not look too alien to a casual PC user.
That software is breathtakingly expensive, however, and supports a multibillion-dollar industry all by itself. Called electronic design automation (EDA), the business of creating and selling chip-design software supports thousands of computer programmers around the world. Chip-design engineers rely on their computers and their EDA software "tools" the way a carpenter relies on a collection of specialized tools. A handful of large EDA vendors, notably Synopsys (Mountain View, California), Mentor Graphics (Wilsonville, Oregon), and Cadence Design Systems (San Jose, California), are the equivalent of Craftsman, Stanley, and Black & Decker. There are also numerous smaller "boutique" EDA vendors that supply special-purpose software tools to chip designers in niche markets.
Regardless of the tools a chip designer uses, the goal is always the same: to create a working blueprint for a new chip and get it ready for manufacturing. In the chip-design world, that's called "getting to film," as film is the ultimate goal of a chip designer. The film (which is really not film any more, as we shall see) is used by chip makers in their factories to actually manufacture the chip, a process that's described in the next chapter.
Old-Style Design Process
Before personal computers, EDA, and automated tools, engineers used rubylith, a red plastic film sold by art-supply houses that is still used today by sign makers and graphic artists. Early chip designers would cut strips of rubylith with X-Acto knives and tape them to large transparent sheets hanging on their walls. Each layer of silicon or aluminum in the final chip required its own separate sheet, covered with a criss-crossing pattern of taped-on stripes.
By laying two or more of these transparent sheets atop one another and lining them up carefully, you could check to make sure that the rubylith from one touched the rubylith from another at exactly the right points. Just as important, you could make sure the tape strips didn't touch anywhere else, where contact would create an unwanted electrical short in the actual chip. This was painstaking work, to be sure, but such are the tribulations of the pioneers. The whole process was a bit like designing a tall building by drawing each floor on a separate sheet and stacking the sheets to be sure the walls, wiring, stairs, and plumbing all match up precisely.
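Today, software performs that same check automatically (it's one part of what's called design rule checking). Here is a minimal sketch of the core idea in Python; the layer names, rectangle coordinates, and list of planned contacts are all hypothetical, and real checkers are vastly more sophisticated:

```python
# A minimal sketch of the layer-overlap check that stacking rubylith
# sheets made visible by eye. All layer data here is hypothetical.

def rects_overlap(a, b):
    """True if two axis-aligned rectangles (x1, y1, x2, y2) overlap."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

# One list of taped-on strips per transparent sheet (layer).
metal = [(0, 0, 10, 2), (0, 5, 10, 7)]   # two horizontal strips
poly  = [(4, 0, 6, 7)]                   # one vertical strip

# Pairs of strips that are *supposed* to connect.
planned_contacts = {(0, 0)}

for i, m in enumerate(metal):
    for j, p in enumerate(poly):
        touching = rects_overlap(m, p)
        planned = (i, j) in planned_contacts
        if touching and not planned:
            print(f"short: metal[{i}] touches poly[{j}] where it shouldn't")
        elif planned and not touching:
            print(f"open: metal[{i}] never reaches poly[{j}]")
```

Run as-is, this flags the second metal strip, which crosses the poly strip without a planned contact: exactly the kind of unwanted short a designer once hunted for by stacking sheets against the light.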
This task was called taping out, for reasons that are fairly obvious. Tape out was (and still is) a big milestone in every chip-design project. Once you'd taped out, you were nearly done. All the preliminary design work was complete, all the calculations were checked and double-checked, and all the planning was finished. About the only thing left to do was to wait for the chip to be made.
One last step remained, however, before you could get excited about waiting for silicon. You had to make film from your oversized rubylith layers. This was a simple photographic reduction process. Each rubylith-covered layer was used as a mask, projecting criss-crossed shadows onto a small film negative. It worked just like a slide projector showing vacation snapshots on a big screen, except that instead of making the images bigger, the reduction process made them smaller. Each separate rubylith layer was projected onto a different film negative that was exactly the size of the chip itself, less than one inch on a side. The result was a film set you could send out for fabrication.
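To get a feel for the scale involved, here is a back-of-the-envelope calculation; the sheet and die sizes are hypothetical, chosen only to make the arithmetic concrete:

```python
# Hypothetical sizes: a wall-sized rubylith layer shrunk to a finished die.
sheet_width_inches = 40.0   # oversized artwork taped to the wall
die_width_inches = 0.4      # finished chip, well under an inch on a side

reduction = sheet_width_inches / die_width_inches
print(f"photographic reduction: {reduction:.0f}:1")   # -> 100:1
```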