
The World of FPGAs and ASICs: Part 1

[Image: Microsoft Catapult FPGA board. Credit: Microsoft Research]

In Part 1 of this two-part post, I’ll be looking at the origins of programmable hardware, which has reduced the chip count of many PCB designs. Part 2 will show how the FPGA and the ASIC are now replacing the microprocessor and DSP in high-speed and high-reliability applications.

Early Digital Logic

The first semiconductor-based computer to run a program was built in 1953 at Manchester University. It contained 92 point-contact transistors and 550 diodes, but could only manage an average rate of about 33 instructions/sec. At least it was a big improvement over its thermionic valve (‘tube’) based predecessors: it consumed less power and was physically smaller. I mention it because it represents an early starting-point in the rush to pack ever more processing power into an ever-smaller physical space.

All digital computers, even now, consist of interconnected logic gates, the latter providing the basic functions of AND, OR and NOT. You can build just about anything with those three types of gate. In practice, you are more likely to use them in their negative-logic form with inverted outputs: NAND and NOR. This format requires the minimum number of transistor stages; yielding AND and OR needs an extra inverting transistor, with a corresponding increase in propagation delay and power consumption. These logic functions were so useful as building-blocks that they featured amongst the first commercial integrated circuit designs. The 74-series of Transistor-Transistor Logic (TTL) chips is still available today, including the 7400 (Quad 2-input NAND gates) and 7402 (Quad 2-input NOR gates), albeit with vastly improved silicon technology.
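NAND’s role as a ‘universal’ gate is easy to demonstrate. Here is a minimal Python stand-in for real hardware (purely illustrative, not any real device’s behaviour), building NOT, AND and OR from nothing but 2-input NANDs:

```python
def nand(a, b):
    """2-input NAND: the only 'real' gate in this sketch."""
    return int(not (a and b))

def not_(a):
    return nand(a, a)               # NAND with both inputs tied together

def and_(a, b):
    return not_(nand(a, b))         # invert the NAND output

def or_(a, b):
    return nand(not_(a), not_(b))   # De Morgan: A+B = NOT(NOT A . NOT B)

# Exhaustive truth-table check of all three derived gates
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert not_(a) == (1 - a)
```

Counting the calls also shows the cost mentioned above: AND takes two NAND stages where NAND alone takes one, which is exactly the extra propagation delay a gate-level designer tries to avoid.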

Computers in the late 1960s and early 1970s contained many boards full of TTL or CMOS gate chips. As the IC technology improved, new chips with ‘special functions’ such as counters and shift registers appeared, reducing the number of boards and helping to shrink the size of the computer. Large Scale Integration (LSI) produced the first RAM chips and then the microprocessor. Even with a single-chip processor and memory chips, it was still very difficult to make a genuine single-board computer. There was still a lot of necessary ‘glue-logic’ unique to a particular design which had to be made from basic gate chips.

The Gate Array

Very quickly, chip manufacturers started producing customisable devices called Programmable Array Logic (PAL) and Uncommitted Logic Arrays (ULA). A PAL contains fixed groups of AND/OR gates called Product Terms; within each group, the gates are connected together by fuse links. The links are addressed by an external programmer, much like a Programmable Read-Only Memory. To create the desired circuit, the programmer ‘blows’ fuses where necessary to disconnect unwanted links. Hence these early devices were ‘One Time Programmable’ (OTP) and errors could not be corrected. They were relatively cheap though, and the waste of a few chips was well worth the reduction in overall chip count. Later devices used UV-erasable link memory which could be reconfigured, before EEPROM-based chips arrived that could be reprogrammed in situ. Newer devices contain flip-flops, multipliers and even RAM blocks to increase their versatility. With the advent of Very Large-Scale Integration (VLSI), these programmable devices have become very complex, with SRAM forming the internal configuration memory, boot-loaded on power-up from an external EPROM or Flash chip. Welcome to the FPGA.
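The PAL’s AND/OR structure can be modelled in a few lines: each output is the OR of several product terms, and each product term ANDs together whichever inputs (true or complemented) are left connected by intact fuses. A toy sketch, using a fuse-map representation of my own invention rather than any vendor’s real programming format:

```python
def pal_output(inputs, fuse_rows):
    """Model one PAL output.
    inputs:    list of input bits.
    fuse_rows: one entry per product term; each entry lists the
               (input_index, use_complement) connections whose fuses
               were left intact. Blown fuses simply don't appear."""
    or_result = 0
    for row in fuse_rows:
        term = 1
        for idx, use_complement in row:
            bit = inputs[idx] ^ (1 if use_complement else 0)
            term &= bit                 # AND plane: product term
        or_result |= term               # OR plane: sum of products
    return or_result

# 'Programme' the array as XOR of two inputs: (A AND NOT B) OR (NOT A AND B)
xor_fuses = [
    [(0, False), (1, True)],    # A . B'
    [(0, True),  (1, False)],   # A' . B
]
for a in (0, 1):
    for b in (0, 1):
        assert pal_output([a, b], xor_fuses) == a ^ b
```

Blowing a fuse here just means removing a pair from a row; once removed it cannot be put back, which is the software analogue of ‘One Time Programmable’.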

The ULA took a different approach to the PAL. The user started with a ‘blank canvas’: the manufacturer provided a set of software tools allowing the developer to design their own chip at the individual transistor/diode level. This meant you could create what amounted to a customised design with no redundant or unused elements, optimised for speed and power. The downside was that the designer had to generate the ‘track’ layout for the device, rather like a PCB layout, covering the inter-element links including common clock, power and ground connections. The user supplied the manufacturer with a diagram showing the exact position of each element on the silicon and a mask for the metal layer providing all the interconnections. ULAs could not contain a high density of elements, so the whole exercise was rather similar to PCB design. The ULA was, in fact, an early form of what we now call a fully-customisable Application-Specific Integrated Circuit or ASIC. An ASIC is manufactured with your design baked in, and any mistake will be very costly. So you don’t have one of these made unless the design is fully proven; probably using an FPGA. A mistake in the design of the early Pentium microprocessors, known as the FDIV Bug, cost Intel hundreds of millions of dollars.

Example of space-saving with Programmable Logic

Boards from two 1980s single-board computers are shown below, one with lots of simple ‘glue-logic’ chips (Fig.1), the other with a single ULA device (Fig.2). Most people will have at least heard of Clive Sinclair’s groundbreaking ZX81; the Jupiter Ace, which used the Forth programming language, may be less familiar. Even without the base for the membrane keyboard, it’s clear that the Jupiter Ace needs a lot more ‘real-estate’ for its many chips. The circuit of the ZX81 is very similar, but all those Small- and Medium-Scale ICs (SSI/MSI) have been combined in a single ULA chip. Interestingly, the ZX81 pre-dates the Ace by a year, which illustrates how forecast sales play a part in any design: the Ace was not expected to sell well enough to justify the cost of a ULA.

[Fig.1: Jupiter Ace board with its many SSI/MSI glue-logic chips]

[Fig.2: ZX81 board with its single ULA chip]

The FPGA has come a long way since the days of PALs and ULAs. The on-board function modules have become ever more complex, with more and more being packed onto a single chip. You ‘program’ them using hardware description languages such as VHDL and Verilog. Until recently, they were very, very expensive, way beyond the pockets of Makers. All that has changed, and in Part 2 we shall see how they are being used to replace microprocessors and DSPs in applications such as Artificial Intelligence.
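Much of an FPGA’s configurable logic boils down to small SRAM look-up tables (LUTs): the configuration loaded at power-up is, in effect, a collection of stored truth tables, so any n-input function becomes a 2ⁿ-bit memory read. A toy Python model of a single LUT (the function names are my own, purely for illustration):

```python
def make_lut(truth_table):
    """Model an FPGA-style LUT. truth_table is a list of 2^n output
    bits -- the 'SRAM contents' -- indexed by the n input bits packed
    into an address."""
    def lut(*inputs):
        addr = 0
        for bit in inputs:              # pack inputs into an address
            addr = (addr << 1) | bit
        return truth_table[addr]        # one memory read = one 'gate'
    return lut

# 'Configure' a 2-input LUT as XOR: outputs for addresses 00, 01, 10, 11
xor_lut = make_lut([0, 1, 1, 0])
assert xor_lut(0, 1) == 1
assert xor_lut(1, 1) == 0

# Reconfiguring is just loading a different table -- here, NAND
nand_lut = make_lut([1, 1, 1, 0])
assert nand_lut(1, 1) == 0
```

This is why an SRAM-based FPGA is infinitely reprogrammable where a fuse-based PAL was one-shot: changing the design just means writing new bits into the configuration memory.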

But what of simple glue-logic replacement? If you just want a small, cheap device to provide some basic logic functions, don’t worry: the PAL lives on in its new guise of a Simple Programmable Logic Device or SPLD, while its bigger siblings are, unsurprisingly, called Complex Programmable Logic Devices or CPLDs. Why don’t modern SBCs like the Raspberry Pi seem to need glue-logic anymore? Find out in Part 2.

If you're stuck for something to do, follow my posts on Twitter. I link to interesting articles on new electronics and related technologies, retweeting posts I spot about robots, space exploration, and other issues.

 

Engineer, PhD, lecturer, freelance technical writer, blogger & tweeter interested in robots, AI, planetary explorers and all things electronic. STEM ambassador. Designed, built and programmed my first microcomputer in 1976. Still learning, still building, still coding today.