TI collaborates with University of Texas at Dallas to improve wafer production efficiency – applying big data approaches to test data

There’s a new field in test engineering called adaptive test. It has multiple sub-areas with innovation potential, and one of them – which TI has partnered with University of Texas at Dallas researchers to study – is test-cost reduction: developing methods that increase wafer manufacturing efficiency while maintaining, or even improving, quality.

The project began about three years ago, when Yiorgos Makris, then a Yale University faculty member, was awarded a Semiconductor Research Corporation grant for adaptive test research in this area. Makris needed industrial data to demonstrate his methods, and John Carulli, a Distinguished Member of the Technical Staff at TI, was able to help.

The two worked together several times a year on exchanging research and results before Makris joined the University of Texas at Dallas faculty as an associate professor in the Erik Jonsson School of Engineering and Computer Science. That’s when they began to collaborate more frequently. “As we were demonstrating the new methods we were creating, we had to explore,” Makris says.

The basic premise of the research is that sampled test data from a few die locations on a wafer can indicate whether all of the die on that wafer are good or bad. This would make it possible to test far fewer locations yet make an educated, accurate prediction about the rest of the batch – dramatically reducing wafer test time.
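The published work models spatial correlation with more sophisticated statistical machinery, but the core idea – predict untested die from a handful of measured neighbors – can be sketched with a simple stand-in technique. The snippet below uses inverse-distance weighting (an assumption for illustration, not the paper’s actual method), and the function name and data layout are hypothetical:

```python
import math

def predict_untested(sampled, targets, power=2.0):
    """Predict a parametric test value (e.g., a voltage or frequency
    measurement) at untested die coordinates from a few sampled die,
    using inverse-distance weighting as a simple spatial model.

    sampled: dict mapping (x, y) die coordinates -> measured value
    targets: list of (x, y) die coordinates to predict
    Returns a dict mapping each target coordinate -> predicted value.
    """
    predictions = {}
    for tx, ty in targets:
        if (tx, ty) in sampled:
            # This die was actually measured, so use the real value.
            predictions[(tx, ty)] = sampled[(tx, ty)]
            continue
        weighted_sum = weight_total = 0.0
        for (sx, sy), value in sampled.items():
            # Nearby die get more weight than distant ones.
            w = 1.0 / math.dist((tx, ty), (sx, sy)) ** power
            weighted_sum += w * value
            weight_total += w
        predictions[(tx, ty)] = weighted_sum / weight_total
    return predictions

# Example: four measured die at the corners of a small region,
# predicting the die at the center.
sampled = {(0, 0): 1.0, (4, 0): 1.2, (0, 4): 1.2, (4, 4): 1.0}
print(predict_untested(sampled, [(2, 2)]))
```

A pass/fail screen could then flag any predicted value that falls outside the test limits, so only suspect die – rather than every die – receive full testing.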

Postdoctoral researcher Ke Huang, along with Makris, Carulli and former student Nathan Kupp, published a paper on the project titled “Handling Discontinuous Effects in Modeling Spatial Correlation of Wafer-level Analog/RF Tests” at the Design, Automation and Test in Europe (DATE) conference in 2013. DATE is the premier test conference in Europe, and their work earned them the best paper award.

Carulli says that there are different ways to use the results of wafer data collection. “We can leverage this data understanding to test, say, 100 die on a wafer instead of 1,000, but we could have a good idea of what those other 900 points would do,” Carulli says. “There are a lot of different things we could do with this, such as making predictions and telling testers what to look for to be faster or improve quality.”

It’s an area that many others, including TI test engineer Amit Nahar, have been working on as well. Nahar was also funded by the SRC as a Portland State University student, and he joined TI after receiving his master’s degree in 2005.

Nahar now manages TI’s adaptive test research group, where he “uses test measurements like voltages, frequencies and currents to identify adaptive test strategies,” he says, and collaborates with Carulli and Makris. Another of Makris’ students – Constantinos Xanthopoulos – will begin a TI SRC Internship in June to collaborate with Nahar and try out methods in pilot production that have been developed at UT Dallas.

Makris says the high volume production test data from TI is an important factor in the project. “We rely on data to understand industrial environments in order to make our methods more relevant, so we need to be able to test and prove our work in a lab.”

Plus, he says, “UT Dallas and TI have a long history, and they are synergistically connected in a very important way. So for us to be able to interact with TI is a win-win situation.”

While the project is slated to wrap at summer’s end, Makris is hoping that the results will show the value in additional research. “We have been among the few groups who started looking at all these statistical methods more than 10 years ago, and this is becoming a hot area with many more players,” he says.

Nahar agrees. “It’s a relatively new field, and basically there are tons of possibilities for us to improve quality and reduce test cost,” he says. “A motivator for me is to research and develop these new methods for adaptive test.”