Proving the benefits of data analysis

By Dave Lammers

The semiconductor industry is collecting massive amounts of data from fab equipment and other sources. But is the trend toward using that data in a Smart Manufacturing or Industry 4.0 approach happening fast enough in what Mike Plisinski, CEO of Rudolph Technologies, calls a “very conservative” chip manufacturing sector?

“There are a lot of buzzwords being thrown around now, and much of it has existed for a long time with APC [advanced process control], FDC [fault detection and classification], and other existing capabilities. What was inhibiting the industry in the past was the ability to align this huge volume of data,” Plisinski said.

While the industry became successful at adding sensors to tools and collecting data, the ability to track that data and make use of it in predictive maintenance or other analytics thus far “has had minimal success,” he said. With fab processes and manufacturing supply chains getting more complex, customers are trying to figure out how to move beyond implementing statistical process control (SPC) on data streams.
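SPC on a data stream typically amounts to flagging readings that drift outside statistically derived control limits. The sketch below is a minimal illustration in Python of that idea on a single sensor trace; the variable names and the numbers are hypothetical, not drawn from any real fab.

```python
# Illustrative sketch: a basic Shewhart-style SPC check on one equipment-sensor
# data stream. Control limits are set at +/- 3 sigma around the historical mean;
# readings outside the limits are flagged for review.
import numpy as np

def spc_flags(readings, baseline):
    """Return indices of readings that fall outside 3-sigma control limits."""
    mu, sigma = np.mean(baseline), np.std(baseline)
    ucl, lcl = mu + 3 * sigma, mu - 3 * sigma   # upper / lower control limits
    return [i for i, x in enumerate(readings) if x > ucl or x < lcl]

# Hypothetical chamber-temperature samples (degrees C)
baseline = np.random.normal(65.0, 0.05, 500)    # in-control history
new_readings = [65.01, 64.98, 65.25, 65.03]     # the 65.25 reading should be flagged
print(spc_flags(new_readings, baseline))
```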

What is the next step? Plisinski said now that individual processes are well understood, the next phase is data alignment across the fab’s systems. As control of leading-edge processes becomes more challenging, customers realize that the interactions between the process steps must be understood more deeply.

“Understanding these interactions requires aligning these digital threads and data streams. A customer needs to understand that when a chamber’s temperature changes by 0.1 degrees Celsius, it impacts the critical dimensions of the lithography process by X, Y, and Z. Understanding those interactions has been a significant challenge and is an area that we have focused on from a variety of angles over the last five years,” Plisinski said.
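As a rough illustration of the kind of interaction Plisinski describes: once chamber data and lithography measurements have been aligned per wafer, the sensitivity of the critical dimension to a temperature shift can be estimated with a simple fit. This is a minimal sketch on synthetic data, not Rudolph’s method; every name and value below is hypothetical.

```python
# Illustrative sketch: estimate how much a chamber-temperature excursion moves
# the lithography critical dimension (CD), using a simple linear fit on
# per-wafer aligned (synthetic) data.
import numpy as np

rng = np.random.default_rng(0)
chamber_temp = 65.0 + rng.normal(0, 0.1, 200)        # hypothetical deposition temps (C)
cd_nm = 45.0 + 2.0 * (chamber_temp - 65.0) + rng.normal(0, 0.05, 200)  # synthetic CDs (nm)

slope, intercept = np.polyfit(chamber_temp, cd_nm, 1)
print(f"Estimated CD sensitivity: {slope:.2f} nm per degree C")
# A 0.1 C excursion would then shift CD by roughly slope * 0.1 nm.
```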

Rudolph engineers have worked to integrate multiple data threads (see Figure), aligning various forms of data into one database for analysis by Rudolph’s Yield Management System (YMS). “For a number of years we’ve been able to align data. The limitation was in the database: data storage and the speed of retrieval and analysis were the bottlenecks. Recently new types of databases have come out, so that instead of relational, columnar-type databases, the new databases are well suited to factory data analysis and to streaming data. That’s been a huge enabler for the industry,” he said.

Figure. Rudolph engineers have worked to integrate multiple data threads into one database.
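For a sense of what aligning two such data threads can look like in practice, here is a minimal sketch, assuming pandas and hypothetical column names: metrology results are joined to the nearest preceding sensor record for the same wafer. It illustrates the general idea of data alignment, not Rudolph’s YMS implementation.

```python
# Illustrative sketch: align two hypothetical data threads, tool sensor traces
# and metrology results, on wafer ID and nearest timestamp so they can be
# analyzed together. Column names are assumptions for illustration only.
import pandas as pd

sensor = pd.DataFrame({
    "wafer_id": ["W01", "W01", "W02"],
    "time": pd.to_datetime(["2019-07-01 10:00", "2019-07-01 10:05", "2019-07-01 10:20"]),
    "chamber_temp_c": [65.02, 65.11, 64.97],
})
metrology = pd.DataFrame({
    "wafer_id": ["W01", "W02"],
    "time": pd.to_datetime(["2019-07-01 10:07", "2019-07-01 10:22"]),
    "cd_nm": [45.3, 45.0],
})

# Join each metrology record to the most recent sensor record for that wafer.
aligned = pd.merge_asof(
    metrology.sort_values("time"), sensor.sort_values("time"),
    on="time", by="wafer_id", direction="backward",
)
print(aligned)
```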

Leveraging AI’s capabilities

A decade ago, Rudolph launched an early neural-network-based system designed to help customers optimize yields. The software analyzed data from across a fab to learn from variations in the data.
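For context, a yield-prediction model of this general flavor can be sketched in a few lines with today’s libraries. The example below is a minimal illustration on synthetic data, assuming scikit-learn is available; it is not the system Rudolph built.

```python
# Minimal sketch: a small neural network trained on synthetic process-parameter
# summaries to predict a yield figure. Purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 8))                    # hypothetical tool/sensor summaries
y = 95 + 0.6 * X[:, 0] - 0.4 * X[:, 3] ** 2 + rng.normal(0, 0.2, 1000)  # synthetic yield %

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print(f"Held-out R^2: {model.score(X_test, y_test):.2f}")
```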

“The problem back then was that neural networks of this kind used non-linear math that was too new for our conservative industry, an industry accustomed to first-principles analytics. As artificial intelligence has been used in other industries, AI is becoming more accepted worldwide, and our industry is also looking at ways to leverage some of the capabilities of artificial intelligence,” he said.

Collecting and making use of data within a fab is “no small feat,” Plisinski said, but that leads to sharing and aligning data across the value chain: the wafer fab, packaging and assembly, and others.

“To gain increased insights from the data streams or digital threads, we have to bring these threads all together and make sense of all of it. It is what I call weaving a fabric of knowledge: taking individual data threads, bringing them together, and weaving a much clearer picture of what’s going on.”

Security concerns run deep

One of the biggest challenges is how to securely transfer data between the different factories that make up the supply chain. “Even if they are owned by one entity, transferring that large volume of data, even if it’s over a private dedicated network, is a big challenge. If you start to pick and choose to summarize the data, you are losing some of the benefit. Finding that balance is important.”

The semiconductor industry is gaining insights from companies analyzing, for instance, streaming video. The network infrastructures, compression algorithms, transfers of information from mobile wireless devices, and other technologies are making it easier to connect semiconductor fabs.

“Security is perhaps the biggest challenge. It’s a mental challenge as much as a technical one, and by that I mean there is more than reluctance, there’s a fundamental disdain for letting the data out of a factory, for even letting data into the factory,” he said.

Within fabs, there is a tug of war between equipment vendors, which want to own the data and provide value-added services, and customers, who argue that since they own the tools, they own the data. The contentious debate grows more intense when vendors talk about taking data out of the fab. “That’s one of the challenges that the industry has to work on — the concerns around security and competitive information getting leaked out.” Developing a front-end process is “a multibillion dollar bet, and if that data leaks out it can be devastating to market-share leadership,” Plisinski said.

Early adopter stories

The challenge facing Rudolph and other companies is to convince their customers of the value of sharing data, and that “the benefits will outweigh their concerns. Thus far, the proof of the benefit has been somewhat limited.”

“At least from a Rudolph perspective, we’ve had some early adopters that have seen some significant benefits. And I think as those stories get out there and as we start to highlight what some of these early adopters have seen, others at the executive level in these companies will start to question their teams about some of their assumptions and concerns. Eventually I think we’ll find a way forward. But right now that’s a significant challenge,” Plisinski said.

It is a classic chicken-and-egg problem, making it harder to get beyond theories to case-study benefits. “What helped us is that some of the early adopters had complete control of their entire value chain. They were fully integrated. And so we were able to get over the concerns about data sharing and focus on the technical challenges of transferring all that data and centralizing it in one place for analytical purposes. From there we got to see the benefits and document them in a way that we could share with others, while protecting IP.”

Aggregating data, buying databases and analytical software, building algorithms – all cost money, in most cases adding up to millions of dollars. But if yields improve by 0.25 or 0.5 percent, the payback comes in six to eight months, he said.
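A back-of-the-envelope version of that payback arithmetic, using the article’s figures for the investment and yield gain and otherwise hypothetical numbers:

```python
# Rough payback estimate: a multi-million-dollar analytics investment recovered
# through a 0.25-0.5% yield gain. The monthly revenue figure is hypothetical.
def payback_months(investment_usd, monthly_wafer_revenue_usd, yield_gain_fraction):
    monthly_benefit = monthly_wafer_revenue_usd * yield_gain_fraction
    return investment_usd / monthly_benefit

# e.g. $3M spent, $200M/month of wafer output, 0.25% yield improvement
print(round(payback_months(3_000_000, 200_000_000, 0.0025), 1), "months")  # ~6 months
```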

“It’s a very conservative industry, an applied science type of industry. Trying to prove the value of software — a kind of black magic exercise — has always been difficult. But as the industry’s problems have become so complex, it is requiring these sophisticated software solutions.”

“We will have examples of successful case studies in our booth during SEMICON West. Anyone wanting further information is invited to stop by and talk to our experts,” Plisinski added.
