03/01/2000

Design validation at first silicon

Improving engineering IC design validation is critical to the success of today's rapidly accelerating technology.

David Eastman

As microprocessor performance exceeds 850MHz, device dimensions shrink below 0.25 microns and logic density explodes into tens of millions of gates, design and simulation tools are being pushed beyond traditional models and engineering experience to successfully verify these advanced designs. The ability to achieve full design functionality at speed is being challenged. To ensure a quick hand-off to produce these higher performance designs, there is a critical need to validate design performance at first silicon, while the device is still in engineering development. This is especially the case as complex application-specific integrated circuit (ASIC) designs give way to more complex system-level-integrated or systems-on-chip (SOC) devices.

With one in five devices offering multi-functionality on a single high-performance chip - integrating logic, analog and memory functions - perhaps the single greatest impact on design success lies in design validation at first silicon. By 2003, design validation may be even more critical based on predictions that three of every five devices developed will be SOCs.

The Gating Item for Time-to-Samples

There is no point in designing the latest, greatest and fastest IC if it cannot be validated or characterized. For increasingly complex digital designs, in-circuit emulation, improved simulation technologies and formal verification tools are barely keeping pace. For mixed-signal designs, some good tools exist, but major improvements are still required to handle greater complexities. Adding memory to digital and analog designs or outsourcing other core configurations for integration onto SOCs further complicates the design-to-test-to-manufacturing process.

Of course, the complexity presented by new technology does not raise validation issues alone; there are daunting challenges related to materials, manufacturing methods, packaging, development time and cost, and reliability. For SOCs to be successful, and for time-to-market and development costs to mirror today's ASIC development, dramatically new design, test, assembly and packaging processes will need to be implemented.

Still, the single greatest challenge facing SOC development is testability. In its technology roadmaps and in predictions of future integrated circuit (IC) development, the Semiconductor Industry Association (SIA) has stated that time-to-market, time-to-yield and time-to-samples will be gated by test. This statement is supported by many semiconductor manufacturers who complain that test debug now accounts for as much as 50 percent of total IC development time. Complex designs require more time for test development, test debug, characterization and failure analysis, and present the greatest problems for quick hand-off to production. The solutions may be found in engineering validation, which analyzes failures and gives product and design engineers a clear roadmap to problem-solving (Figure 1).

Challenging Traditional Validation Methods

In addition to electronic design automation (EDA) design verification tools, built-in-self-test (BIST) and design-for-test (DFT) methods (which are largely focused on verifying the virtual model), semiconductor manufacturers employ a variety of methods to validate first silicon prototypes. This includes custom-built rack and stack instrumentation, automated test equipment (ATE) and dedicated engineering validation systems. Certainly, each system has its advantages, but none delivers the crucial information that is obtainable from a dedicated engineering validation system.

Until the advent of high-speed, high-performance microprocessors, test engineers at many small semiconductor companies developed "rack and stack" or custom-designed bench-top test instrumentation. This approach was relatively successful when ICs were populated with a few hundred transistors or had simple logic and analog capabilities. However, given today's highly advanced designs, where millions of gates and tens of millions of test vectors are the norm, bench-top instruments cannot reliably handle the full range of at-speed functional testing.

Essentially, bench-top instrumentation cannot handle the amount of analysis required on complex multi-million-transistor devices, or the interconnect environment of sub-micron semiconductors that operate at very high frequencies and very low signal levels. Testing each iteration or next generation of a product is unreliable or difficult to replicate from device to device. On a more practical level, bench-top systems tend to require specialized skill sets that are beyond the core competencies of those asked to develop them. They are difficult to construct and integrate as single, cohesive systems, and they require specialized software development, which can be a large investment.

Large semiconductor companies that use ATE for high-speed production test of packaged IC products often adapt these same ATE systems for engineering validation. Although the ATE is not specifically designed for this purpose, the main reason to reconfigure and reprogram it for engineering validation is to hand off the tested device to production as quickly as possible. "Same-system" testing appears to be a convenient and logical approach. However, using a production test system to validate engineering designs can be problematic, and new device architectures challenge the ability of the ATE to offer quality, full-function validation.

The first challenge is reprogramming the ATE - essentially a "go/no go" test system designed to quickly ascertain failures - into a fully functioning, interactive engineering validation system designed to determine why and where failures occur. Basically, production ATE is designed with a high-speed, high-throughput, multi-site, cost-per-device focus (Figure 2). This method verifies that a device meets or fails a well-defined specification: throughput speed is used to quickly identify whether the device fails, rather than to gather the information needed to locate the problem. ATE systems also tend to take up a lot of floor space, sit idle much of the time, and require on-site analysis equipment as well as sophisticated cooling and calibration. In contrast, an engineering validation system can be optimized for validation, characterization, failure analysis and yield enhancement of ICs and system-level integrated devices.
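To make the distinction concrete, the short Python sketch below (hypothetical pin names and pattern data, not any particular tester's software) contrasts a production-style go/no-go comparison, which stops at the first failing vector, with a diagnostic capture that logs every mismatching pin and cycle for later analysis.

def go_no_go(expected, captured):
    # Production-style check: report pass or fail, nothing more.
    for exp, cap in zip(expected, captured):
        if exp != cap:
            return False              # stop at the first failing vector
    return True

def diagnostic_capture(expected, captured, pins):
    # Engineering-style check: log every mismatching cycle and pin.
    failures = []
    for cycle, (exp, cap) in enumerate(zip(expected, captured)):
        for pin, e_bit, c_bit in zip(pins, exp, cap):
            if e_bit != c_bit:
                failures.append((cycle, pin, e_bit, c_bit))
    return failures

pins = ["D0", "D1", "D2", "D3"]                 # hypothetical device pins
expected = ["1010", "1100", "0110", "1111"]      # expected states per cycle
captured = ["1010", "1000", "0110", "1011"]      # states read back from the device

print(go_no_go(expected, captured))              # False - all a go/no-go run reports
for cycle, pin, e, c in diagnostic_capture(expected, captured, pins):
    print(f"cycle {cycle}: pin {pin} expected {e}, captured {c}")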

Why Use a System Optimized for Engineering IC Validation?

The answer is simple: The engineering system is optimized for validation, and its only function is analysis of the silicon prototype, not production test of the finished package.

On system-level integrated devices (where logic, analog and memory functions are being integrated into total systems on a single piece of silicon and where specifications differ across various individual cores and across the spectrum of the device), failures are becoming more difficult to discover and analyze. A validation system can provide at-speed full-functional testing.

Characterization and rapid device debug are crucial steps to bring world-class competitive ICs to market, and validation techniques are also needed to produce a successful device design. Engineering validation focuses on prototype or first silicon validation, when the virtual electronic design model is made into a real physical working device. This not only helps to ensure that the device works as designed, but it can also improve yields, quality and reliability and afford the design engineer an opportunity to push an innovative design.

It's also important to note that new devices are running at rapidly increasing clock and data rates, resulting in multiple-bus architectures that allow different sections of a device to effectively run at different speeds. Current test technology, such as custom bench-top instrumentation and the more commonly used production test equipment, can be hard-pressed to meet the timing demands of these new devices or offer at-speed, full-functional testing.

Unlike a production ATE system, an engineering validation system focuses on establishing and characterizing specifications, not just verifying conformance to specification. A digitally focused validation system collects and displays all digital pin data, allowing full logic analysis in one pass. Device speed-path errors can be examined and analyzed using cycle stretch and shrink adjustments to the drive data. This same tool can be used to stretch digital performance beyond the original specifications, enabling speed binning and higher product margins.
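As a rough illustration of how such a sweep might be driven, the Python sketch below shrinks the drive-cycle period step by step until the pattern first fails. The run_pattern_at_period call is a placeholder for whatever pattern-execution interface a given validation system provides (modeled here by a device whose speed path fails below 9.4 ns), so both the interface and the numbers are assumptions for illustration only.

def run_pattern_at_period(period_ns):
    # Placeholder for a validation-system call that retimes the drive data
    # to period_ns, runs the functional pattern set and returns True on a
    # full match. Modeled here as a device that fails below a 9.4 ns cycle.
    return period_ns >= 9.4

def find_min_period(nominal_ns=10.0, span=0.2, step_ns=0.05):
    # Sweep from a stretched (slow) cycle down toward a shrunk (fast) one
    # and return the shortest period at which the pattern still passes.
    period = nominal_ns * (1 + span)
    floor = nominal_ns * (1 - span)
    best = None
    while period >= floor:
        if not run_pattern_at_period(period):
            break                     # first speed-path failure found
        best = period
        period -= step_ns
    return best

min_period = find_min_period()
if min_period is not None:
    print(f"passes down to {min_period:.2f} ns -> roughly {1000 / min_period:.0f} MHz bin")

Running the same sweep in the stretch direction instead shows how much margin exists above the nominal cycle time.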

Digital- and analog-focused validation systems can also perform full analysis using phase- or frequency-locked digital signal processing (DSP) techniques on internal circuit blocks and cells as well as full multi-block system functions (Figure 3). DSP instrumentation covers a wide analog dynamic range and spectrum, providing static and dynamic analysis of time, frequency and phase relationships within a mixed-signal environment.
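As a simplified example of that kind of DSP post-processing, the NumPy sketch below applies a windowed FFT to a coherently sampled sine-wave capture (synthesized here rather than digitized from a device) and estimates signal-to-noise-and-distortion; the sample rate, record length and tone placement are illustrative assumptions.

import numpy as np

fs = 48_000              # sample rate in Hz (typical of an audio CODEC)
n = 4096                 # capture length
cycles = 479             # a prime cycle count keeps the capture coherent
f0 = cycles * fs / n     # resulting test-tone frequency

t = np.arange(n) / fs
captured = np.sin(2 * np.pi * f0 * t) + 1e-4 * np.random.randn(n)  # tone plus noise

window = np.blackman(n)
spectrum = np.abs(np.fft.rfft(captured * window)) ** 2

tone_bin = round(f0 * n / fs)                             # FFT bin holding the tone
signal_power = spectrum[tone_bin - 3:tone_bin + 4].sum()  # tone plus window leakage
noise_power = spectrum[1:].sum() - signal_power           # everything else, excluding DC

sinad_db = 10 * np.log10(signal_power / noise_power)
print(f"test tone {f0:.1f} Hz, SINAD ~ {sinad_db:.1f} dB")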

Selecting an Engineering Validation System

Factors to consider when choosing a validation system are generally grounded in the need to reduce back-end, post-silicon design-cycle times and cost. Features to take into account include cost, size (is it small enough to fit into an engineering lab environment?), ease of use, repeatable and traceable results, the ability to link to EDA tools, and mobility (can the system be shared by a number of design teams, even in remote, offsite locations?). A system should be able to quickly offer "what if" scenarios that allow a team to establish or stretch device specifications not in weeks but in hours. An ideal system allows intuitive and interactive investigation of prototype silicon and offers the capability to debug and characterize new silicon designs quickly.

It is also important that the software be easy to use and optimized for the type of testing needed to validate the IC design. The software tools should be able to evaluate the source EDA information and format, translate this data into usable pattern data, and then allow this information to be easily manipulated for "what-if" analysis, facilitating quick decision-making. To ensure this, the validation tools should be simple; design teams should not be required to learn difficult programs, spend time programming systems or struggle to access a system after hours.
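A minimal sketch of that translation step follows. It assumes a hypothetical event list already extracted from a simulation (time, pin, value triples) rather than any specific EDA file format, and resamples the events into one pattern vector per tester cycle so that the cycle time itself becomes the what-if knob.

def events_to_patterns(events, pins, period_ns, n_cycles):
    # Resample (time_ns, pin, value) events into one pattern string per cycle,
    # holding each pin's last driven value between events.
    state = {pin: "0" for pin in pins}
    patterns = []
    events = sorted(events)
    i = 0
    for cycle in range(n_cycles):
        cycle_end = (cycle + 1) * period_ns
        while i < len(events) and events[i][0] < cycle_end:
            _, pin, value = events[i]
            state[pin] = value
            i += 1
        patterns.append("".join(state[pin] for pin in pins))
    return patterns

pins = ["CLK", "EN", "DOUT"]                            # hypothetical pins
events = [(0, "CLK", "1"), (5, "CLK", "0"), (12, "EN", "1"),
          (20, "CLK", "1"), (33, "DOUT", "1")]          # hypothetical simulation events

print(events_to_patterns(events, pins, period_ns=10, n_cycles=4))  # nominal timing
print(events_to_patterns(events, pins, period_ns=8, n_cycles=4))   # what-if: shorter cycle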

Conclusion

Today's accelerating technology is challenging the ability of product design teams to successfully validate and characterize designs. Traditional testing methods unfortunately do not have the range of capabilities to test these new advanced designs and give product design teams confidence that they are fully validated and functional. Optimized engineering validation systems offer a solution to help semiconductor manufacturers bring ever-more complex designs to market quickly and cost-effectively.

DAVID EASTMAN, manager of corporate communications, can be contacted at Integrated Measurement Systems, 9525 S.W. Gemini Dr., Beaverton, OR 97008; 503-626-7117; Fax: 503-644-6969; E-mail: [email protected].


Figure 1. A mixed-signal IC validation system for the verification, characterization and failure analysis of high performance digital, mixed-signal prototype ICs and SOC devices.


Figure 2. Analysis of passing and failing test vectors.


Figure 3. Analysis and validation of analog signal outputs simulated in an audio CODEC device.