BY PETE SINGER
There’s an old proverb that the shoemaker’s children always go barefoot, a way of saying that some professionals don’t apply their skills for themselves. Until recently, that has seemed to be the case with the semiconductor manufacturing industry, which has been good at collecting massive amounts of data but not so good at analyzing that data and using it to improve efficiency, boost yield and reduce costs. In short, the industry could be making better use of the technology it has developed.
That’s now changing, thanks to a worldwide focus on Industry 4.0, more commonly known as “smart manufacturing” in the U.S., which represents a new approach to automation and data exchange in manufacturing technologies. It encompasses cyber-physical systems, the Internet of Things, cloud computing, cognitive computing and the use of artificial intelligence and deep learning.
At SEMICON West this year, these trends will be showcased in a new Smart Manufacturing Pavilion, where you’ll be able to see, and experience, data-sharing breakthroughs that are creating smarter manufacturing processes, increasing yields and profits, and spurring innovation across the industry. Each machine along the Pavilion’s multi-step line is displayed, either virtually or with actual equipment on the floor, covering the flow from design and materials through front-end patterning to packaging and test, and on to final board and system assembly.
In preparation for the show, I had the opportunity to talk with Mike Plisinski, CEO of Rudolph Technologies, the sponsor of the Smart Manufacturing Pavilion, about smart manufacturing. In the past, he said, “the industry got very good at collecting a lot of data. We [put] sensors on all kinds of tools and equipment and we’d track it with the idea of being able to do predictive maintenance or predictive analytics. That, I think, had minimal success.”
What’s different now? “With the industry consolidating and supply chains and products getting more complex, that’s created the need to go beyond what existed. What was inhibiting that in the past was really the ability to align this huge volume of data,” he said. The next evolution is driven by the need to improve the processes themselves. “As we’ve gone down into sub-20 nanometer, the interactions between the process steps are more complex, so understanding that interaction requires aligning digital threads and data streams.” If a process chamber changed temperature by 0.1°C, for example, what impact did that have on CD control in the lithography process? That’s the level of detail that’s required.
“That has been a significant challenge and that’s one of the areas that we’ve focused on over the last four, five years — to provide that kind of data alignment across the systems,” Plisinski said.
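To make that concrete, here is a minimal sketch of what that kind of data alignment can look like in practice, written in Python with pandas. It is purely illustrative and not Rudolph’s software; the file names, column names and 30-minute matching tolerance are assumptions made for the example.

# Illustrative sketch: align a chamber-temperature log with downstream CD
# metrology so a small temperature drift can be checked against CD control.
# File and column names here are hypothetical.
import pandas as pd

chamber = pd.read_csv("etch_chamber_log.csv", parse_dates=["timestamp"])
litho = pd.read_csv("litho_cd_metrology.csv", parse_dates=["timestamp"])

# Match each CD measurement with the most recent chamber reading for the
# same lot, tolerating up to 30 minutes of skew between the two streams.
aligned = pd.merge_asof(
    litho.sort_values("timestamp"),
    chamber.sort_values("timestamp"),
    on="timestamp",
    by="lot_id",
    tolerance=pd.Timedelta("30min"),
    direction="backward",
)

# Once the two streams share a timeline, a simple correlation shows whether
# a 0.1°C temperature shift tracks with a change in measured CD.
print(aligned[["chamber_temp_c", "cd_nm"]].corr())

In a production fab the alignment problem spans many more tools and data formats than this, but the principle is the same: put every measurement on a common timeline, keyed to the same lot or wafer, before asking how one step affects another.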
Every company is different, of course, and some have been managing this more effectively than others, but the cobbler’s children are finally getting new shoes.