
3D-Micromac AG (booth #1645 in the South Hall) this week introduced the microPREP 2.0 laser ablation system for high-volume sample preparation of metals, semiconductors, ceramics and compound materials for microstructure diagnostics and failure analysis (FA).

Built on a highly flexible platform with a small table-top footprint, the microPREP 2.0 allows for easy integration into FA workflows. Developed jointly with the Fraunhofer Institute for Microstructure of Materials and Systems (IMWS), the microPREP 2.0 complements existing approaches to sample preparation such as focused ion beam (FIB) micromachining, offering up to 10,000 times higher ablation rates and therefore an order of magnitude lower cost of ownership (CoO) compared to FIB. As the first stand-alone, ultrashort-pulse laser-based tool for sample preparation, the microPREP 2.0 brings additional unique capabilities, such as enabling large-area and 3D-shape sampling to allow for more comprehensive testing of complex structures.

Cutting and preparing samples from semiconductor wafers, dies and packages for microstructure diagnostics and FA is an essential but time-consuming and costly step. The primary method of sample preparation used in semiconductor and electronics manufacturing today is FIB micromachining, which can take several hours to prepare a typical sample. FIB only allows for very small sample sizes, and precious FIB time is wasted by “digging” the excavations needed for cross-sectional imaging in a scanning electron microscope (SEM) or for making a transmission electron microscopy (TEM) lamella. Reaching larger depths or widths is severely restricted by the limited ablation rate.

3D-Micromac’s microPREP 2.0 significantly accelerates these critical steps, bringing sample preparation for semiconductor and materials research to a new level. By off-loading the vast majority of sample prep work from the FIB tool and relegating FIB to final polishing, or replacing it completely depending on the application, the microPREP 2.0 reduces time to final sample to less than one hour in many cases.

“This award-winning tool brings unprecedented flexibility into sample prep. We at Fraunhofer IMWS are facing the need for targeted, artifact-free and most reliable preparation workflows to be able to serve our industry customers with cutting-edge microstructure diagnostics. Made for diverse techniques like SEM inspection of advanced-packaging devices, X-ray microscopy, atom probe tomography, and micromechanics, microPREP was developed jointly with 3D-Micromac to close gaps in preparation workflows,” said Thomas Höche of Fraunhofer IMWS.

The microPREP 2.0 laser ablation system.

KLA-Tencor Corporation announced two new defect inspection products at SEMICON West this week, addressing two key challenges in tool and process monitoring during silicon wafer and chip manufacturing at the leading-edge logic and memory nodes. The Voyager 1015 system offers new capability to inspect patterned wafers, including inspection in the lithography cell immediately after development of the photoresist, when the wafer can be reworked. The Surfscan SP7 system delivers unprecedented defect detection sensitivity on bare wafers, smooth and rough films—essential for manufacturing silicon substrates intended for the 7nm logic and advanced memory device nodes, and equally critical for earliest detection of process issues during chip manufacturing. Together the two new inspection systems are designed to accelerate time-to-market for innovative electronic devices by capturing defect excursions at their source.

“With leading IC technologies, wafer and chip manufacturers have very little room for error,” said Oreste Donzella, Senior Vice President and Chief Marketing Officer at KLA-Tencor. “Critical dimensions of next-generation chips are so small that the minimum size of a yield-killing defect on bare silicon wafers or blanket-film monitor wafers has shrunk below the detection limit of available tool monitoring systems. A second key gap in the defect detection space has been reliably detecting yield-killing defects introduced early in the lithography process, whether 193i or EUV. Our engineering teams have developed two new defect inspection systems—one for unpatterned/monitor wafers and one for patterned wafers—that provide key capability for engineers to address these difficult defect issues rapidly and accurately.”

The Surfscan SP7 unpatterned wafer defect inspection system achieves its high sensitivity through innovations in illumination and sensor architecture that produce decades of improvement in resolution over that of the previous-generation Surfscan tool. This leap in resolution is the key to detection of the smallest killer defects. The new resolution realm also enables real-time classification of many defect types, such as particles, scratches, slip lines and stacking faults—without removing the wafer from the Surfscan tool or affecting the system throughput. At the same time, control over peak power density allows the Surfscan SP7 to inspect thin, delicate EUV photoresist materials.

The Voyager 1015 patterned wafer defect inspection system closes a long-standing industry gap in after-develop inspection (ADI), leveraging novel illumination, collection and sensor architecture. This revolutionary laser scattering inspection system drives sensitivity forward while reducing nuisance signals—and delivers results substantially sooner than the next-best alternatives. Like the new Surfscan SP7, the Voyager system features exceptional control of power density, allowing inline inspection of delicate photoresist materials after develop. High throughput capture of critical defects in the litho cell and other modules of the fab allows process issues to be identified and rectified rapidly.

Gases and engineering company The Linde Group (booth #5644 in the North Hall) is investing in expansion of existing products to improve business continuity planning while adding new products with improved purity to meet the growing needs of sub-10nm semiconductor factories and advanced flat panel manufacturers. Linde remains the global leader in rare gas and laser mixture production technology.

Linde has expanded capacity for fluorine/nitrogen mixtures in Medford, Oregon for etching and chamber cleaning applications.

  • This allows both low- and high-pressure fluorine and nitrogen mixture production.
  • On-site high-purity fluorine production minimizes third-party supply issues.
  • The product line is expanding to include fluorine/argon mixtures, with tri-mix (fluorine/argon/nitrogen) capability to follow later in 2018.
  • This facility complements fluorine mixture production at the Linde Alpha, New Jersey facility.

Linde is also developing deposition precursors and etch gases: silicon precursors, digermanium mixtures, high-k and metal gate precursors, isotope gases, and etch gases such as CF3I (trifluoroiodomethane) and custom fluorinated silanes.

“Linde’s story this year is continued investment for customers,” said Paul Stockman, Linde Electronics’ Head of Market Development. “What we’re doing in the US mirrors what we’re doing globally, which is investing in new materials and new production capabilities and locating them close to where our customers are. We have uniform processes and multiple sites of production, and we are looking to optimize supply chains for our customers.”

“Linde recognizes that our customers continue to make investments in new processes and technologies. We are committed to investing with them for the materials they will require now and in the future,” said Matt Adams, Head of Sales and Marketing for Linde Electronics and Specialty Products.

By Ed Korczynski

To fulfill the promise of the Internet of Things (IoT), the world needs low-cost high-bandwidth radio-frequency (RF) chips for 5th-generation (5G) internet technology. Despite standards not yet being completely defined, it is clear that 5G hardware will have to be more complex than 4G kit, because it will have to provide a total solution that is ultra-reliable with at least 10 Gb/second bandwidth. A significant challenge remains in developing new high-speed transistor technologies for RF communications with low power to allow IoT “edge” devices to operate reliably off of batteries.

At the most recent Imec Technology Forum in Antwerp, Belgium, Nadine Collaert, Distinguished Member of Technical Staff (MTS) at imec, discussed recent research results from the consortium’s High-Speed Analog and RF Program. In addition to working on core transistor fabrication technology R&D, imec has also been working on system-technology co-optimization (STCO) and design-technology co-optimization (DTCO) for RF applications.

Comparing the system specifications needed for mobile handsets to those for base stations: transmitter power consumption should be 10x lower, while receiver power consumption needs to be 2x lower. Today, using silicon CMOS transistors, four power amplifiers alone consume 65% of a transmitter chip’s power. Heterojunction bipolar transistors (HBTs) and high electron mobility transistors (HEMTs) built using compound semiconductors such as gallium arsenide (GaAs), gallium nitride (GaN), or indium phosphide (InP) provide excellent RF device results. However, compared to making CMOS chips on silicon, HBT and HEMT manufacturing on compound semiconductor substrates is inherently expensive and difficult.

HBTs and HEMTs both rely upon the precise epitaxial growth of semiconductor layers, and such growth is easier when the underlying substrate material has a similar atomic arrangement. While it is much more difficult to grow epi-layers of compound semiconductors on silicon wafers, imec does its R&D on 300-mm-diameter silicon substrates with the goal of maintaining device quality while lowering production costs. The figure shows cross-sections of the two “tracks” of III-V and GaN transistor materials being explored by imec for future RF chips.

III-V-on-silicon and GaN-on-silicon RF device cross-sections, showing work on both heterojunction bipolar transistors (HBTs) and high electron mobility transistors (HEMTs) for 5G applications. (Source: imec)

Imec’s High-Speed Analog/RF Program objectives include the following:

  • High-speed III-V RF devices using low-cost, high-volume silicon-compatible processes and modules,
  • Co-optimization with advanced silicon CMOS to reduce form factor and enable power-efficient systems with higher performance, and
  • Technology-circuit design co-optimization to enable complex RF-FEM modules with heterogeneous integration.

5G technology deployment will start with frequencies below 6 GHz, because technologies in that range have already been proven and the costs are known. However, after about five years the focus will shift to the “mm-wave” range, with the first band at ~28 GHz. GaN, with its wide bandgap and high charge density, has been a base-station technology, and it could be an ideal material for low-power mm-wave RF devices in future handsets.

This R&D leverages the III-V-on-silicon capability that imec has developed for CMOS-photonics integration. RF transistors could be stacked over CMOS transistors using wafer- or die-stacking, or both could be monolithically co-integrated on one silicon chip. Work on monolithic integration of GaN-on-silicon is happening now, and it could also be used for photonics, where faster transistors can improve the performance of optical links.

By Pete Singer

Nitrous oxide (N2O) has a variety of uses in the semiconductor manufacturing industry. It is the oxygen source for chemical vapor deposition of silicon oxy-nitride (doped or undoped) or silicon dioxide, where it is used in conjunction with deposition gases such as silane. It’s also used in diffusion (oxidation, nitridation, etc.), rapid thermal processing (RTP) and for chamber seasoning.
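
For illustration, one commonly cited overall reaction for depositing silicon dioxide from silane and nitrous oxide (the exact chemistry depends on the process and is not given in the article) is:

$$\mathrm{SiH_4} + 4\,\mathrm{N_2O} \rightarrow \mathrm{SiO_2} + 2\,\mathrm{H_2O} + 4\,\mathrm{N_2}$$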

Why these uses (and, more importantly, what happens to the gas afterward) may soon come under more scrutiny: N2O is being included for the first time in the IPCC (Intergovernmental Panel on Climate Change) greenhouse gas (GHG) guidelines. The IPCC is refining the guidelines it released in 2006 and expects to publish the revision in 2019. “Refined guidelines are actually up and coming, and the inclusion of nitrous oxide in them is a major revision from the 2006 document,” said Mike Czerniak, Environmental Solutions Business Development Manager at Edwards. Czerniak is on the IPCC committee and is lead author of the semiconductor section.

Although the semiconductor industry uses a very small amount of N2O compared to other applications (dentistry, whipped cream, drag racing, scuba diving), it is a concern because, after CO2 and CH4, N2O is the third most prevalent man-made GHG, accounting for 7% of emissions. According to the U.S. Environmental Protection Agency, 5% of U.S. N2O originates from industrial manufacturing, including semiconductor manufacturing.

Czerniak said the semiconductor industry has been very proactive about trying to offset and reduce its carbon dioxide footprint. “The aspiration set by the World Semiconductor Council is to reduce the carbon footprint of a chip to 30 percent of what it was in 2010, which itself was a massive reduction from what it used to be back in the last millennium,” he said. Unfortunately, although that trend had been heading down for the first half of the decade, it started going up again in 2016. “Although each individual processing step has a much lower carbon footprint than it used to have, the number of processing steps is much higher than it used to be,” Czerniak explained. “In the 1990s, it might take 300-400 processing steps to make a chip. Nowadays you’re looking at 2,000-4,000 steps.”

There are two ways of abating N2O so that it does not pollute the atmosphere: reduce it or oxidize it. Oxidizing it, which creates NO2 and NO (and other oxides known as NOx), is not the way to go, according to Czerniak. “These oxides have their own problems. NOx is a gas that most countries are trying to reduce emissions of. It’s usually found as a byproduct of fuel combustion, particularly in things like automobiles, and it adds to things like acid rain,” he said.

Edwards’ view is that it’s much better to minimize the formation of NOx in the first place. “The good news is that it is possible inside a combustion abatement system, where the gas comes in at the top, we burn a fuel gas and air on a combustor pad, and basically the main reactant gas then is water vapor, which we use to remove the fluorine effluent, which is the one we normally try to get rid of from chamber cleans,” Czerniak said.

The tricky part is that information from the tool is required. “We can — when there is nitrous oxide present on a signal from the processing tool — add additional methane fuel into the incoming gas specifically to act as a reducing agent to reduce the nitrous oxide to nitrogen and water vapor,” he explained. “We inject it at just the right flow rate to effectively get rid of the nitrous oxide without forming the undesirable NOx byproducts.”
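
As a sketch of the chemistry Czerniak describes, one plausible overall stoichiometry for the methane-driven reduction (the article does not give the exact reaction) is:

$$\mathrm{CH_4} + 4\,\mathrm{N_2O} \rightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O} + 4\,\mathrm{N_2}$$

Injecting CH4 at roughly this ratio consumes the N2O without leaving the oxidizing excess that would otherwise form NOx.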

Figure 1 shows how careful control of combustion conditions, through the addition of CH4, makes them reducing rather than oxidizing during the N2O step. The 30-slm N2O flow represents two typical process chambers.

“It’s not complicated technology,” Czerniak concluded. “You just have to do it right.”

10:30 am – 12:30 pm
Digital Medicine and Remote Patient Monitoring
Moscone North, TechXPOT North

10:30 am – 12:30 pm
Materials & Packaging for Automotive
Meet the Experts Theater, Moscone North, Smart Transportation Pavilion

10:30 am – 12:55 pm
Lithography at 5nm and Below
Moscone South, TechXPOT South

10:30 am – 12:30 pm
New Monitoring and Metrology Technologies, Wet and Dry
Meet the Experts Theater, Moscone South, Smart Manufacturing Pavilion

1:30 pm – 4:00 pm
Connected Car to Connected World – The Road to Monetization
Meet the Experts Theater, Moscone North, Smart Transportation Pavilion

2:00 pm – 4:00 pm
Data and AI: Ahead of the Curve — Applications Already Incorporating Big Data and AI
Moscone North, TechXPOT North

2:00 pm – 4:00 pm
Scaling Every Which Way!
Moscone South, TechXPOT South

By Pete Singer

In a keynote talk on Tuesday in the Yerba Buena Theater, Dr. John E. Kelly, III, Senior Vice President, Cognitive Solutions and IBM Research, talked about how the era of artificial intelligence (AI) is upon us and how it will dramatically change the world. “This is an era of computing which is at a scale that will dwarf the previous era, in ways that will change all of our businesses and all of our industries, and all of our lives,” he said. “This will be another 50, 60 or more years of breakthrough technology innovation that will change the world. This is the era that’s going to power our semiconductor industry forward. The number of opportunities is enormous.”

Dr. John E. Kelly, III, Senior Vice President, Cognitive Solutions and IBM Research

Kelly, with 40 years of experience in the industry, recalled how the first era of computing began with mechanical computers 100 years ago and then transitioned into the programmable era of computing. In 1980, Kelly said, “we were trying to stack two 16-kilobit DRAMs to get a 32-kilobit stack, and we were trying to cram a thousand transistors into a microprocessor.” Microprocessors today have 15 billion transistors. “It’s been a heck of a ride,” he said.

IBM’s Summit is not only the biggest computer in the world but also the smartest, according to Kelly.

Kelly pointed to the power of exponentials, noting that Moore’s Law represented the first exponential and Metcalfe’s Law, which says the value of a network increases as the square of the number of devices connected to it, is the second. Kelly said there’s no end to this second exponential, as more devices, from connected medical devices to Internet of Things devices, come online.
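
Metcalfe’s Law follows from simple counting: n devices can form n(n−1)/2, or roughly n²/2, pairwise connections, so the network’s value is usually written as

$$V(n) \propto n^2$$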

A third exponential is now upon us, Kelly said. “The core of this exponential is that data is doubling every 12 to 18 months. In fact, in some industries like healthcare, data is doubling every six months,” he said. The challenge is that the data is useless unless it can be analyzed. “Our computers are lousy at dealing with that large unstructured data and frankly there aren’t enough programmers in the world to deal with that explosion of data and extract value,” Kelly said. “The only way forward is through the use of machine learning and artificial intelligence to extract insights from that data.”
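
Kelly’s doubling claim is the familiar exponential form

$$D(t) = D_0 \cdot 2^{t/T},$$

where T is the doubling period: roughly 12 to 18 months in general, and about six months in healthcare, by his figures.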

Kelly talked about IBM’s history of AI: teaching early IBM machines to play checkers, beating chess grandmaster Garry Kasparov with Deep Blue, Watson’s Jeopardy! win and, most recently, Project Debater, which can “not only answer questions but listen to a person’s argument on something, reason and counter-argue in full natural language against that position in a full dialogue, continuously.”

What’s changed? “We continue to make advances in artificial intelligence, machine learning and deep learning algorithms that are just stunning,” Kelly said. “We are now able to learn over smaller and smaller amounts of data and translate that learning from one domain to another to another to another and start to get scale. Now is the time when this exponential is going to really explode.”

How does that equate to opportunity? Kelly said that on top of the existing $1.5-2 trillion information technology industry, there’s another $2 trillion of decision-support opportunity for artificial intelligence. “Literally every industry in the world, whether it’s industrial products, financial services, retail, every industry in the world is going to be impacted and transformed by this,” he said.

Quantum computing, which Kelly described as a fourth exponential, is also coming, and it will in turn dwarf all of the previous ones. “Beyond AI, this is going to be the most important thing I’ve ever seen in my career. Quantum computing is a complete game changer,” he said.

The bad news? During his talk, Kelly sounded one cautionary note: “Companies that lead exponentials win. Companies that don’t lead, or even try to quickly follow, fail on exponential curves. Our industry is littered with examples of that,” he said.

By Shannon Davis

Steve Jobs. Benjamin Franklin. Albert Einstein. Marie Curie. What do these world-changers all have in common? Where did their drive to innovate come from? Melissa Schilling, PhD, had to find out.

“Innovation and creativity have been a hot area of research for a long time, but we don’t tend to study outliers, in part because there are methodological challenges with that,” she explained to the audience during her keynote address on Tuesday at SEMICON West 2018.

Melissa Schilling, PhD, New York University

So, the New York University professor created a multiple-case-study research project to tackle these questions, which are addressed at length in her latest book, “Quirky: The Remarkable Story of the Traits, Foibles, and Genius of Breakthrough Innovators Who Changed the World.” The book invites us into the lives of eight world-famous game-changers (Albert Einstein, Benjamin Franklin, Elon Musk, Dean Kamen, Nikola Tesla, Marie Curie, Thomas Edison, and Steve Jobs) and identifies the common traits and experiences that drove them to make spectacular breakthroughs, again and again. Schilling believed that once we understand what makes someone a serial innovator, we can also understand the breakthrough innovation potential in all of us.

The first common trait Schilling identified in her research was a sense of separateness – a discovery that she found remarkable.

“I thought most people would be super connected with lots of diverse connections,” she said. “I was wrong about that. Every single person I studied, with the exception of Benjamin Franklin, had this…feeling of detachment.”

Einstein, said Schilling, even went so far as to say he didn’t need direct contact with individual humans, even his own family. Marie Curie and her husband eventually sent both of their daughters to be raised by their grandparents, so that they could devote more time to their research. Dean Kamen’s feelings of separateness helped to shield him when his peers didn’t believe it was possible to create a two-wheeled wheelchair (which we now know as the Segway).

What can we learn from this? “First thing we have to learn is that we need norms that permit people to be unorthodox,” said Schilling. “We need to be able to embrace weirdness.”

Schilling pushed back against the idea of brainstorming teams in the tech world, a practice she says leaves potential innovators stuck putting out ideas that are more likely to win consensus from the rest of their team. She instead suggested allowing employees to work alone first, to commit to an idea and elaborate on it before sharing it with a team.

“Brainstorming teams cause people to come to mediocre compromises,” she said.

The second shared trait of serial innovators Schilling discussed was self-efficacy.

“Self-efficacy is that faith you have that you can overcome obstacles to achieve your goals and it makes you take on bigger projects,” Schilling explained.

She pointed to Elon Musk’s persistence in developing reusable rockets, in spite of NASA’s claims that it couldn’t be done, and Nikola Tesla’s dream of harnessing the power of Niagara Falls to provide electricity, despite having only seen a picture of Niagara on a postcard as a child in Croatia.

“Encourage people to try even if they fail,” she said, and warned against rescuing people who could benefit from learning things on their own.

The third trait Schilling outlined was one she said seven of the eight innovators possessed, which was having an intensely idealistic goal that mattered more to them than just about anything else.

“When you have an idealistic goal that people in your company can identify with, they’re going to work harder, they’re going to work longer, they’re going to think bigger, and they’re going to love it more,” she said.

And while timing and luck often played an undeniable role in many of the serial innovators’ lives, Schilling was most surprised to learn that access to capital didn’t affect her research subjects’ ability to innovate.

“Every single one of these people… started out flat broke,” she said. “They did not become innovators because they had access to capital.”

What was more important, she said, was their access to other people who had resources.

“One of the most valuable things you can do is help connect people to the other people they need,” she concluded.

By Pete Singer

The importance of data gathered and analyzed in the subfab (the place where vacuum pumps, abatement systems and other supporting equipment operate) is growing. Increasingly, manufacturers are finding that these systems have a direct impact on yield, safety, cost of ownership and, ultimately, capacity and cycle time.

“The subfab is being recognized ever more as a contributor to overall fab effectiveness, particularly when the fab is looking to get the last fractions of a percent of performance efficiency,” notes Alan Ifould, Global Market Sector Manager at Edwards.

There’s also keen interest in tying this data to process data from the fab, the manufacturing execution system (MES) and, ultimately, the enterprise resource planning (ERP) system as part of today’s efforts to understand and control the entire data ecosystem.

Subfab data systems provide a volume of data related not only to vacuum and abatement equipment, but also upstream, to the foreline, gate valve and chamber. Of special interest is the monitoring of vacuum faults, which can negatively impact quality, cost and safety. “A vacuum fault is anything that results in a loss of, or a degradation in, vacuum,” said Ifould.

Ideally, faults – and the overall quality of the vacuum system — are proactively managed. Potential faults are detected days or even weeks before they occur and addressed during regularly scheduled tool maintenance, for example. “We’re finding that our ability to detect vacuum faults in the wider vacuum system comes very much to the fore,” Ifould said.

Data seen at the pump or abatement can help determine the size and location of vacuum system leaks. Algorithms based around vacuum science and thermodynamics can lead engineers to problematic leaks that, over time, can have a significant impact on yield.
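
As a rough illustration of the kind of vacuum-science calculation involved, the classic rate-of-rise test estimates a leak from how quickly pressure climbs in an isolated volume. The sketch below is a minimal example of that textbook calculation, not Edwards’ algorithm; the chamber volume and pressures are hypothetical.

```python
# Minimal rate-of-rise leak estimate: Q = V * (dP/dt).
# Illustrative only -- not Edwards' algorithm. Assumes the chamber is
# isolated, its volume is known, and outgassing is negligible.

def leak_rate(volume_l: float, p_start_mbar: float,
              p_end_mbar: float, elapsed_s: float) -> float:
    """Return the estimated leak rate in mbar*L/s."""
    return volume_l * (p_end_mbar - p_start_mbar) / elapsed_s

# Example: a 50 L chamber rising from 1e-3 to 5e-3 mbar over 10 minutes
q = leak_rate(50.0, 1e-3, 5e-3, 600.0)
print(f"Estimated leak rate: {q:.2e} mbar*L/s")  # ~3.33e-04 mbar*L/s
```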

Often, the first reaction to a loss in chamber pressure is to blame the vacuum pump, Ifould said. Vacuum pumps can be swapped out in about four hours, but if the process tool goes down while in operation, it can take in excess of 48 hours to get everything back up and running. Even then, it might be something other than the pump that caused the initial problem, such as a leak in a gate valve or in the foreline. It’s essential to accurately diagnose the problem(s) at the outset, but that can be a challenge: “You only need a small leak in a gate valve, and you immediately have problems with maintaining the base pressure in the chamber. The pump may become overloaded because of the additional gas load caused by leaks,” he said.

Edwards has developed a variety of new data collection and analysis strategies aimed at improving such decision making. The Site Management Application (SMA) is the latest addition to the company’s data analytics portfolio, focused on the subfab. As shown in the figure, SMA is designed to provide insight into maintenance activities, equipment performance and fault resolution. It is implemented in parallel with the company’s Vacuum Technique Production System (VTPS), which drives standard work and behaviors based on LEAN principles and best-known methods.

Edwards is also working on what it calls “sensorization” where, for example, the use of vibration analytics can detect anomalies otherwise missed by traditional monitoring techniques.
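
A minimal sketch of how such vibration analytics might work is to compare the spectral band energies of a new sample against a healthy baseline. The band-split approach, the 3x threshold and the synthetic signals below are assumptions for illustration, not Edwards’ “sensorization” implementation.

```python
# Illustrative vibration-spectrum anomaly check. The band-energy approach
# and the 3x threshold are assumptions, not Edwards' actual method.
import numpy as np

def band_energies(signal: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Sum spectral energy in n_bands equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    return np.array([np.sum(band ** 2) for band in np.array_split(spectrum, n_bands)])

def is_anomalous(signal: np.ndarray, baseline: np.ndarray,
                 ratio_limit: float = 3.0) -> bool:
    """Flag the sample if any band carries ratio_limit x its baseline energy."""
    return bool(np.any(band_energies(signal) > ratio_limit * baseline))

fs = 10_000.0                      # assumed accelerometer sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
baseline = band_energies(healthy)
# a developing fault adds a high-frequency tone the baseline lacks
worn = healthy + 0.8 * np.sin(2 * np.pi * 1200 * t)
print(is_anomalous(worn, baseline))  # expected: True
```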

Ifould said SMA and sensorization help improve the stability of fab operations by bringing veracity to the data. “It’s one thing to have a volume of data, but the data itself is of little value unless it’s of good quality,” he said. “When we’re looking at equipment operations and the way you have operators involved, being able to bring discipline to the behaviors of those operators and to the tasks that they perform brings discipline to the data and improves the veracity of the data.”

He said Edwards has been using this approach to “great effect” over the last year. “We can help our customers see where some of their maintenance practices need to be improved to eliminate some of the sources of error that cause some of those vacuum faults,” he said.

More recently, Edwards is looking to move beyond a simple predictive maintenance (PdM) model to a model that includes quality (PdMQ). The model covers not only the condition of the subfab equipment, but also the quality of the vacuum it provides, and therefore of the process it supports. “We’re not just considering the condition of the subfab equipment and being able to predict when that may fail, but considering the quality of the vacuum that system actually provides.”

Harnessing data from all parts of the fab ecosystem is essential, Ifould notes, but has its challenges, especially when it comes to IP. “In an ideal world, we would like to receive contextualized data which allows us to relate what’s happening in the vacuum pump into the process itself. That becomes challenging because of the IP sensitivity,” he said.

Site Management Application, the latest addition to Edwards’ data analytics portfolio, is designed to provide insight into maintenance activities, equipment performance and fault resolution.

By Pete Singer

Many new innovations were discussed at imec’s U.S. International Technology Forum (ITF) on Monday at the Grand Hyatt in San Francisco, including quantum computing, artificial intelligence, sub-3nm logic, memory computing, solid-state batteries, EUV, RF and photonics, but perhaps the most interesting was new technology that enables human cells, tissues and organs to be grown and analyzed on-chip.

After an introduction by SEMI President Ajit Manocha, who said he believes the semiconductor industry will reach $1 trillion in market size by 2030 (“there’s no shortage of killer applications,” he said), Luc Van den hove, president and CEO of imec, kicked off the afternoon session speaking about many projects underway that bring leading microelectronics technologies to bear on today’s looming healthcare crisis. “We all live longer than ever before and that’s fantastic,” he said. “But by living longer we also spend a longer part of our life being ill. What we need is a shift from extending lifespan to extending healthspan. What we need is to find ways to cure and prevent some of these diseases like cancer, like heart diseases and especially dementia.”

Drug development today is time-consuming and costly largely because of the insufficiency of existing methodologies for drug-screening assays. Current assays are based on poor cell models that limit the quality of the resulting data and undercut its biological relevance. Additionally, the assays lack spatial resolution, making it impossible to screen single cells within a cell culture. “It is rather slow, it is quite labor intensive and it provides limited information,” Van den hove said. “With our semiconductor platform we have recently developed a multi-electrode array (MEA) chip on which we can grow cells, in which we can grow tissue and organs. We can monitor processes that are happening within the cells or between the cells during massive drug testing.”

The MEA (see figure) packs 16,384 electrodes, distributed over 16 wells, and offers multiparametric analysis. Each of the 1,024 electrodes in a well can detect intracellular action potentials in addition to the traditional extracellular signals. Further, imec’s chip is patterned with microstructures that allow structured cell growth mimicking a specific organ.

A novel organ-on-chip platform for pharmacological studies with unprecedented signal quality. It fuses imec’s high-density multi-electrode array (MEA)-chip with a microfluidic well plate, developed in collaboration with Micronit Microtechnologies, in which cells can be cultured, providing an environment that mimics human physiology.

Earlier this year, in May at imec’s ITF forum in Europe, Veerle Reumers, project leader at imec, explained how the MEA works: “By using grooves, heart cells can, for example, grow into a more heart-like tissue. In this way, we fabricate miniature hearts-on-a-chip, making it possible to test the effect of drugs in a more biologically relevant context. Imec’s organ-on-chip platform is the first system that enables on-chip multi-well assays, which means that you can perform different experiments or, in other words, analyze different compounds in parallel on a single chip. This is a considerable increase in throughput compared to current single-well MEAs, and we aim to further increase the throughput by adding more wells in a system.”

Van den hove said they have been testing the chip. “The beauty of the semiconductor platform is that we can, because of the miniaturization capability, parallelize an enormous amount of this testing and accelerate drug testing. We can measure what we never measured before, at speeds that you couldn’t think of before.”

He added that imec recently embarked on a new initiative aimed at curing dementia, called Mission Lucidity. “Together with some of our clinical biomedical research teams, we are on a mission to decode dementia, to develop a cure to prevent this disease,” he said.

The MEA will be one tool used in the initiative, but also coming into play will be the group’s neuroprobes, which Van den hove said are among the world’s most advanced and are being used by nearly all the leading neuroscience research teams, along with next-generation wearables. “By combining these tools, we want to better understand the processes that are happening in the brain. We can measure those processes with much higher resolution than what could be done before. This may enable us to detect the onset of disease earlier on. By administering the right medication earlier, we hope to be able to prevent the disease from further progressing,” he said.