
WHITE PAPER



Reduced Test Time for HCI
and Electromigration Tests

Many reliability "wearout" tests monitor a performance parameter that
degrades steadily with the log of the time on stress. In most cases, a time to
10% degradation is measured. The time to 10% degradation is considered a
benchmark because many devices are tested at speeds or voltages that are 10%
greater than the certified capability of the semiconductor devices. For example,
a DRAM might be tested and found to be fully functional with a 45ns access
time but then sold as a slower 50ns device. This "guard-banding" allows for a
drift of up to 10% in the critical performance parameters without the device
falling outside of its specified performance. A reliability test must prove that
the device will not experience a critical performance parameter drift of more
than 10% over the expected product lifetime (typically ten or twenty years).
Accelerated stress levels can be used to obtain a measure of the time to
10% degradation in a shorter period. However, a good knowledge of the
failure mechanism is required in order to extrapolate the results to find the
time to 10% degradation at the use conditions. In most cases, this will require
testing at several different stress conditions to extract the relationship between
the stress condition and rate of degradation. This multiplies the cost of the test
and limits the test time reduction to what can be obtained using the lowest
stress condition.
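For temperature-accelerated electromigration stress, this relationship is commonly an Arrhenius dependence of the time to 10% degradation on temperature. The short sketch below (Python, with invented temperatures and times used purely for illustration, and an assumed 85°C use temperature) shows how results from several elevated-temperature stresses might be fitted and projected back to the use condition.

import numpy as np

# Hypothetical accelerated electromigration results: time to 10% degradation
# (hours) measured at several elevated temperatures. All values are invented
# for illustration only.
temps_C = np.array([250.0, 225.0, 200.0, 175.0])
t10_hours = np.array([12.0, 30.0, 85.0, 260.0])

k_B = 8.617e-5                                 # Boltzmann constant, eV/K
inv_kT = 1.0 / (k_B * (temps_C + 273.15))

# Arrhenius model: ln(t10) = ln(A) + Ea/kT. Fit a straight line in 1/kT.
Ea, lnA = np.polyfit(inv_kT, np.log(t10_hours), 1)

# Project to the assumed 85 degC use temperature.
inv_kT_use = 1.0 / (k_B * (85.0 + 273.15))
t10_use_years = np.exp(lnA + Ea * inv_kT_use) / 8760.0

print(f"Apparent activation energy: {Ea:.2f} eV")
print(f"Projected time to 10% degradation at 85 degC: {t10_use_years:.1f} years")

The projected time is then compared against the ten- or twenty-year lifetime requirement.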
The maximum stress conditions are typically limited by parasitic
considerations such as joule heating or source-drain punch-through voltage.
Additionally, competing failure mechanisms can cause a change in the tested
failure mechanism at higher stress conditions (e.g., the change from grain
boundary diffusion to bulk diffusion at higher temperatures for
electromigration tests). This limits the acceleration that can be applied to the
highest stress condition.

An alternative technique is to measure the time to a smaller percent degradation at the true use conditions. Because degradation is generally linear in the log-time domain, a smaller percentage change can be measured in a much shorter test time. Stressing at use-condition levels allows testing at a single stress condition and eliminates the need to extract the relationship between stress level and rate of degradation. It also removes any concern that higher stress conditions might change the rate of degradation. However, this technique requires very low instrumentation noise and very fine time resolution.
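As a minimal sketch of this approach, assuming degradation that is linear in log time and using invented readings that follow roughly 1%/decade (the same slope used in the worked example below), the extrapolation to ten years might look like this:

import numpy as np

# Hypothetical degradation readings taken at use-condition stress.
# Times are in seconds; degradation of the monitored parameter is in percent.
t_seconds = np.array([3.15, 31.5, 315.0, 3150.0, 31500.0])
degradation_pct = np.array([2.02, 2.98, 4.01, 4.97, 6.03])

# Assume degradation is linear in log10(time): D = slope * log10(t) + offset.
slope, offset = np.polyfit(np.log10(t_seconds), degradation_pct, 1)

ten_years_s = 10 * 365 * 24 * 3600
projected_pct = slope * np.log10(ten_years_s) + offset

print(f"Fitted slope: {slope:.2f} %/decade")
print(f"Projected degradation at ten years: {projected_pct:.1f} %")

A single fit at the use condition replaces the family of accelerated stresses, which is the entire point of the technique; the burden shifts to measuring the small early-time changes cleanly.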
Consider the rate of degradation seen in Figure 1. A transistor exposed to a use-condition stress is found to degrade at a rate of 1%/decade. With this slope, the requirement of less than 10% degradation in ten years means the device must show less than 9% degradation in one year. This can be extended to require less than 8% degradation in 1/10 year (36.5 days), less than 7% degradation in 3.65 days, and less than 6% degradation in 0.365 days, or 8.76 hours. If a test is to be conducted with a duration of 8.76 hours, the results will have to be extrapolated over four decades in time. For this extrapolation to have any meaning, we must be able to show data accumulated over at least four orders of magnitude in time within the test itself, which means the minimum time resolution must be 3.15 seconds. At 3.15 seconds, we would expect to measure a degradation of only 2%, and measuring this value accurately requires less than 0.2% measurement noise.
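The decade-by-decade bookkeeping above is easy to tabulate. The short sketch below reproduces it, assuming the same 1%/decade slope and the roughly 10:1 ratio between the expected signal and the allowable noise implied by the 2% and 0.2% figures:

import numpy as np

SECONDS_PER_YEAR = 365 * 24 * 3600
slope_pct_per_decade = 1.0      # slope of the use-condition curve, as above
d_at_ten_years = 10.0           # allowed degradation at ten years, percent
snr_margin = 10.0               # assumed 10:1 signal-to-noise requirement

# Walk back one decade at a time from ten years down to 3.15 seconds.
decades_back = np.arange(0, 9)
times_s = 10 * SECONDS_PER_YEAR / 10.0 ** decades_back
allowed_pct = d_at_ten_years - slope_pct_per_decade * decades_back

for t, d in zip(times_s, allowed_pct):
    print(f"{t:14.2f} s   degradation < {d:4.1f} %")

# Noise floor implied by the earliest (smallest) expected reading.
print(f"Required noise floor: < {allowed_pct[-1] / snr_margin:.2f} %")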
All of this sounds entirely feasible. However, these requirements are clearly a function of the rate of degradation. Figure 1 shows several different rates of degradation, all resulting in 10% degradation at ten years. Table 1 shows the minimum time resolution and the minimum instrument resolution needed to measure and extrapolate each of these curves with an 8.76-hour stress.


Table 1: Lower Instrument Noise Required for Steeper Rates of Degradation

Rate of Degradation (%/decade)        10        20        200       500
Minimum Time Resolution (seconds)     3.15      3.15      3.15      3.15
Minimum Instrument Resolution (%)     5.13      2.79      0.039     2.56E-5



Table 1 clearly shows that the steeper the rate of degradation, the lower the instrumentation noise must be in order to measure the degradation within a fixed test duration. Looked at another way, these data can be used to determine the minimum test time when the instrument noise is known.
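The sketch below illustrates that inverse calculation. It assumes, as in the earlier example, degradation that is linear in log time, four decades of in-test data, and a 10:1 margin between the earliest expected reading and the instrument noise; these are illustrative assumptions rather than the exact recipe behind Table 1, whose steeper curves follow the shapes plotted in Figure 1.

SECONDS_PER_TEN_YEARS = 10 * 365 * 24 * 3600

def min_test_time_s(slope_pct_per_decade, noise_pct,
                    d_at_ten_years=10.0, decades_of_data=4, snr_margin=10.0):
    """Shortest test whose earliest data point, taken decades_of_data decades
    before the end of the test, still shows degradation snr_margin times the
    instrument noise. Degradation is assumed linear in log time."""
    d_earliest = snr_margin * noise_pct
    # Decades below ten years at which the curve falls to d_earliest.
    decades_below_ten_years = (d_at_ten_years - d_earliest) / slope_pct_per_decade
    earliest_point_s = SECONDS_PER_TEN_YEARS / 10.0 ** decades_below_ten_years
    return earliest_point_s * 10.0 ** decades_of_data

# A 1%/decade curve measured with 0.2% noise reproduces the 8.76-hour case;
# lowering the noise to 0.02% shortens the required test dramatically.
print(f"{min_test_time_s(1.0, 0.2) / 3600:.2f} hours")
print(f"{min_test_time_s(1.0, 0.02) / 3600:.2f} hours")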




Figure 1. HCI Degradation = 10%/10 years


The minimum time point can be extracted from any of the curves in Figure 1 by
drawing a line on the graph to represent 10