Category Archives: Manufacturing

Kirigami (also called “paper-cuts” or “jianzhi”) is one of the most traditional Chinese folk arts. It is widely used in window decorations, gift cards, festivals and ceremonies. Kirigami involves cutting and folding flat objects into 3D shapes. Recently, the techniques of this ancient art have been applied in various scientific and technological fields, including designs for solar arrays, biomedical devices and micro-/nano-electromechanical systems (MEMS/NEMS).

Macroscopic paper-cuts in a paper sheet and nano-kirigami in an 80-nm thick gold film. Credit: Institute of Physics

Dr. LI Jiafang, from the Institute of Physics (IOP), Chinese Academy of Sciences, has recently formed an international team to apply kirigami techniques to advanced 3D nanofabrication.

Inspired by a traditional Chinese kirigami design called “pulling flower,” the team developed a direct nano-kirigami method to work with flat films at the nanoscale. They utilized a focused ion beam (FIB) instead of knives/scissors to cut a precise pattern in a free-standing gold nanofilm, then used the same FIB, instead of hands, to gradually “pull” the nanopattern into a complex 3D shape.

The “pulling” forces were induced by heterogeneous vacancies (introducing tensile stress) and the implanted ions (introducing compressive stress) within the gold nanofilm during FIB irradiation.

By utilizing the topography-guided stress equilibrium within the nanofilm, versatile 3D shape transformations such as upward buckling, downward bending, complex rotation and twisting of nanostructures were precisely achieved.

Previous attempts to create functional kirigami devices have relied on complicated sequential procedures and have been aimed primarily at mechanical rather than optical functions. The new nano-kirigami method, in contrast, can be implemented in a single fabrication step and can be used to perform a number of optical functions.

For a proof-of-concept demonstration, the team produced a 3D pinwheel-like structure with giant optical chirality. The nanodevice achieved efficient manipulation of “left-handed” and “right-handed” circularly polarized light and exhibited strong uniaxial optical rotation effects at telecommunication wavelengths.

In this way, the team demonstrated a multidisciplinary connection between the two fields of nanomechanics and nanophotonics. This may represent a brand new direction for emerging kirigami research.

The team also developed a theoretical model to elucidate the dynamics during the nano-kirigami fabrication. This is of great significance since it allows researchers to design 3D nanogeometries based on desired optical functionalities. In contrast, previous studies relied heavily on intuitive designs.

In other words, in terms of geometric design, nano-kirigami offers an intelligent 3D nanofabrication method beyond traditional bottom-up, top-down and self-assembly nanofabrication techniques.

Its concept can be extended to broad nanofabrication platforms and could lead to the realization of complex optical nanostructures for sensing, computation, micro-/nano- electromechanical systems or biomedical devices.

This work, entitled “Nano-kirigami with giant optical chirality,” was published in Science Advances on July 6, 2018.

Broadcom Inc. (NASDAQ: AVGO), a semiconductor device supplier to the wired, wireless, enterprise storage, and industrial end markets, and CA Technologies (NASDAQ: CA), one of the world’s leading providers of information technology (IT) management software and solutions, today announced that the companies have entered into a definitive agreement under which Broadcom has agreed to acquire CA to build one of the world’s leading infrastructure technology companies.

Under the terms of the agreement, which has been approved by the boards of directors of both companies, CA’s shareholders will receive $44.50 per share in cash. This represents a premium of approximately 20% to the closing price of CA common stock on July 11, 2018, the last trading day prior to the transaction announcement, and a premium of approximately 23% to CA’s volume-weighted average price (“VWAP”) for the last 30 trading days. The all-cash transaction represents an equity value of approximately $18.9 billion, and an enterprise value of approximately $18.4 billion.
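
As a rough sanity check of the figures above (a minimal sketch; the reference prices below are implied from the stated percentages rather than quoted in the release):

```python
# Back-of-the-envelope check of the premiums stated in the release.
# The "implied" prices are derived from the stated percentages; they are
# not quoted figures from the announcement.

offer = 44.50                     # cash offer per CA share, in USD

implied_close = offer / 1.20      # price consistent with a ~20% premium
implied_vwap = offer / 1.23       # price consistent with a ~23% premium
print(f"implied July 11 close: ${implied_close:.2f}")
print(f"implied 30-day VWAP:   ${implied_vwap:.2f}")

# Share count implied by the ~$18.9 billion equity value at $44.50/share
equity_value = 18.9e9
print(f"implied shares outstanding: {equity_value / offer / 1e6:.0f} million")
```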

Hock Tan, President and Chief Executive Officer of Broadcom, said, “This transaction represents an important building block as we create one of the world’s leading infrastructure technology companies. With its sizeable installed base of customers, CA is uniquely positioned across the growing and fragmented infrastructure software market, and its mainframe and enterprise software franchises will add to our portfolio of mission critical technology businesses. We intend to continue to strengthen these franchises to meet the growing demand for infrastructure software solutions.”

“We are excited to have reached this definitive agreement with Broadcom,” said Mike Gregoire, CA Technologies Chief Executive Officer. “This combination aligns our expertise in software with Broadcom’s leadership in the semiconductor industry. The benefits of this agreement extend to our shareholders who will receive a significant and immediate premium for their shares, as well as our employees who will join an organization that shares our values of innovation, collaboration and engineering excellence. We look forward to completing the transaction and ensuring a smooth transition.”

The transaction is expected to drive Broadcom’s long-term Adjusted EBITDA margins above 55% and be immediately accretive to Broadcom’s non-GAAP EPS. On a combined basis, Broadcom expects to have last twelve months non-GAAP revenues of approximately $23.9 billion and last twelve months non-GAAP Adjusted EBITDA of approximately $11.6 billion.

As a global leader in mainframe and enterprise software, CA’s solutions help organizations of all sizes develop, manage, and secure complex IT environments that increase productivity and enhance competitiveness. CA leverages its learnings and development expertise across its Mainframe and Enterprise Solutions businesses, resulting in cross enterprise, multi-platform support for customers. The majority of CA’s largest customers transact with CA across both its Mainframe and Enterprise Solutions portfolios. CA benefits from predictable and recurring revenues with the average duration of bookings exceeding three years. CA operates across 40 countries and currently holds more than 1,500 patents worldwide, with more than 950 patents pending.

BY PAUL VAN DER HEIDE, director of materials and components analysis, imec, Leuven, Belgium

To keep up with Moore’s Law, the semiconductor industry continues to push the envelope in developing new device architectures containing novel materials. This in turn drives the need for new solid-state analytical capabilities, whether for materials characterization or inline metrology. Aside from basic R&D, these capabilities are established at critical points of the semiconductor device manufacturing line to measure, for example, the thickness and composition of a thin film, dopant profiles of a transistor’s source/drain regions, or the nature of defects on a wafer’s surface. This approach is used to reduce “time to data”: we cannot wait until the end of the manufacturing line to know whether a device will be functional. Every process step costs money, and a fully functional device can take months to fabricate. Recent advances in instrumentation and computational power have opened the door to many new, exciting analytical possibilities.

One example that comes to mind concerns the development of coherent sources. So far, coherent photon sources have been used for probing the atomic and electronic structure of materials, but only within large, dedicated synchrotron radiation facilities. Through recent developments, tabletop coherent photon sources have been introduced that could soon see demand in the semiconductor lab/fab environment.

The increased computational power now at our fingertips is also allowing us to make the most of these and other sources through imaging techniques such as ptychography. Ptychography allows the complex patterns resulting from coherent electron or photon interaction with a sample to be processed into recognizable images at a resolution close to the source’s wavelength, without the need for lenses (which tend to introduce aberrations). Potential application areas extend from non-destructive imaging of surface and subsurface structures to probing chemical reactions at sub-femtosecond timescales.

Detector developments are also benefiting many analytical techniques presently in use. As an example, transmission electron microscopy (TEM) and scanning transmission electron microscopy (STEM) can now image heavy as well as light elements with atomic resolution. Combined with increased computational power, this allows further development of imaging approaches such as tomography, holography, ptychography and differential phase contrast imaging, all of which let TEM/STEM not only examine atoms in, for example, 2D materials such as MoS2 in far greater detail, but also map electric fields and magnetic domains at unprecedented resolution.

The semiconductor industry is evolving at a very rapid pace. Since the beginning of the 21st century, we have seen numerous disruptive technologies emerge; technologies that need to serve an increasingly fragmented applications space. It’s no longer solely about ‘the central processing unit (CPU)’. Other applications ranging from the internet of things to autonomous vehicles and wearable human-electronics interfaces are being pursued, each coming with unique requirements and analytical needs.

Looking ten to fifteen years ahead, we will witness a different landscape. Existing techniques such as TEM/STEM will still be heavily used – probably more so than we realize now (we are already seeing TEM/STEM being extended into the fab) – but we will also see developments that push the boundaries of what is possible. These range from the increased use of hybrid metrology (combining results from multiple different analytical techniques and process steps) to the development of new, innovative approaches.

To illustrate the latter, I take the example of secondary ion mass spectrometry (SIMS). In SIMS, an energetic ion beam is directed at the solid sample of interest, causing atoms in the near-surface region to leave the surface. A small percentage of them are ionized and pass through a mass spectrometer, which separates the ions according to their mass-to-charge ratio. When this is done in dynamic-SIMS mode, a depth profile of the sample’s composition can be derived. Today, with this technique, we can’t focus the incoming energetic ion beam into a confined volume, i.e. onto a spot that approaches the size of a transistor. But at imec, novel concepts were introduced, resulting in what are called 1.5D SIMS and self-focusing SIMS (SF-SIMS). These approaches are based on detecting constituents within repeatable array structures, giving averaged and statistically significant information. In this way, the spatial resolution limit of SIMS was overcome.
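
For readers unfamiliar with how such a depth profile is typically quantified, the following minimal sketch illustrates the standard conversion from raw dynamic-SIMS data to a composition-versus-depth profile. The sputter rate, relative sensitivity factor (RSF) and count values are illustrative placeholders, not imec data:

```python
# Minimal sketch of dynamic-SIMS depth-profile quantification.
# Depth comes from an assumed constant sputter rate; impurity concentration
# comes from the standard RSF relation:
#   concentration = RSF * (impurity counts / matrix counts)
# All numbers below are illustrative placeholders, not measured data.

sputter_rate_nm_per_s = 0.5        # assumed erosion rate of the sample
rsf_atoms_per_cm3 = 1.0e22         # assumed RSF for this impurity/matrix pair

# Hypothetical raw data: (sputter time in s, impurity counts, matrix counts)
raw = [(0, 1200, 5.0e5), (60, 900, 5.1e5), (120, 400, 5.0e5), (180, 150, 4.9e5)]

for t, impurity_counts, matrix_counts in raw:
    depth_nm = sputter_rate_nm_per_s * t
    concentration = rsf_atoms_per_cm3 * impurity_counts / matrix_counts
    print(f"depth = {depth_nm:5.1f} nm   "
          f"concentration = {concentration:.2e} atoms/cm^3")
```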

There are also exciting developments occurring here at imec in other analytical fields such as atom probe tomography (APT), photoelectron spectroscopy (PES), Raman spectroscopy, Rutherford backscattering spectrometry (RBS) and scanning probe microscopy (SPM). One important milestone has been the development of Fast Fourier Transform-SSRM (FFT-SSRM) at imec, which allows one to measure carrier distributions in FinFETs with unparalleled sensitivity.

Yet, probably the biggest challenge materials characterization and inline metrology face over the next ten to fifteen years will be how to keep costs down. Today, we make use of highly specialized techniques developed on mutually exclusive and costly platforms. But why not make use of micro-electro-mechanical systems (MEMS) that could simultaneously perform analysis in a highly parallel fashion, and perhaps even in situ? One can imagine scenarios in which an army of such units could scan an entire wafer in a fraction of the time it takes now, or alternatively, the incorporation of such units into wafer test structure regions.

BY PETE SINGER

There’s an old proverb that the shoemaker’s children always go barefoot, meaning that some professionals don’t apply their skills for themselves. Until lately, that has seemed to be the case with the semiconductor manufacturing industry, which has been good at collecting massive amounts of data, but not so good at analyzing that data and using it to improve efficiency, boost yield and reduce costs. In short, the industry could be making better use of the technology it has developed.

That’s now changing, thanks to a worldwide focus on Industry 4.0 – more commonly known as “smart manufacturing” in the U.S. – which represents a new approach to automation and data exchange in manufacturing technologies. It includes cyber-physical systems, the Internet of Things, cloud computing, cognitive computing and the use of artificial intelligence/deep learning.

At SEMICON West this year, these trends will be showcased in a new Smart Manufacturing Pavilion where you’ll be able to see – and experience – data-sharing breakthroughs that are creating smarter manufacturing processes, increasing yields and profits, and spurring innovation across the industry. Each machine along the Pavilion’s multi-step line is displayed, virtually or with actual equipment on the floor – from design and materials through front-end patterning, to packaging and test, to final board and system assembly.

In preparation for the show, I had the opportunity to talk to Mike Plisinski, CEO of Rudolph Technologies, the sponsor of the Smart Pavilion, about smart manufacturing. He said that in the past “the industry got very good at collecting a lot of data. We put sensors on all kinds of tools and equipment and we’d track it with the idea of being able to do predictive maintenance or predictive analytics. That I think had minimal success,” he said.

What’s different now? “With the industry consolidating and the supply chains and products getting more complex, that’s created the need to go beyond what existed. What was inhibiting that in the past was really the ability to align this huge volume of data,” he said. The next evolution is driven by the need to improve the processes. “As we’ve gone down into sub-20 nanometer, the interactions between the process steps are more complex, there’s more interaction, so understanding that interaction requires aligning digital threads and data streams.” If a process chamber changed temperature by 0.1°C, for example, what impact did that have on CD control in the lithography process? That’s the level of detail that’s required.

“That has been a significant challenge and that’s one of the areas that we’ve focused on over the last four, five years — to provide that kind of data alignment across the systems,” Plisinski said.

Every company is different, of course, and some have been managing this more effectively than others, but the cobbler’s children are finally getting new shoes.

By Dave Lammers

The semiconductor industry is collecting massive amounts of data from fab equipment and other sources. But is the trend toward using that data in a Smart Manufacturing or Industry 4.0 approach happening fast enough in what Mike Plisinski, CEO of Rudolph Technologies, calls a “very conservative” chip manufacturing sector?

“There are a lot of buzzwords being thrown around now, and much of it has existed for a long time with APC, FDC, and other existing capabilities. What was inhibiting the industry in the past was the ability to align this huge volume of data,” Plisinski said.

While the industry became successful at adding sensors to tools and collecting data, the ability to track that data and make use of it in predictive maintenance or other analytics thus far “has had minimal success,” he said. With fab processes and manufacturing supply chains getting more complex, customers are trying to figure out how to move beyond implementing statistical process control (SPC) on data streams.

What is the next step? Plisinski said now that individual processes are well understood, the next phase is data alignment across the fab’s systems. As control of leading-edge processes becomes more challenging, customers realize that the interactions between the process steps must be understood more deeply.

“Understanding these interactions requires aligning these digital threads and data streams. When a customer understands that when a chamber changes temperature by point one degrees Celsius, it impacts the critical dimensions of the lithography process by X, Y, and Z. Understanding those interactions has been a significant challenge and is an area that we have focused on from a variety of angles over the last five years,” Plisinski said.
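
As a simple illustration of what such data alignment can look like in practice (a hypothetical sketch, not Rudolph’s actual pipeline; the wafer IDs, timestamps and column names are invented), a chamber-temperature trace can be joined to downstream lithography CD measurements per wafer and then checked for correlation:

```python
# Illustrative sketch of aligning two "digital threads": a process-chamber
# temperature trace and downstream litho CD metrology, joined per wafer.
# All values and column names are hypothetical.
import pandas as pd

# Hypothetical equipment trace: one row per wafer processed
trace = pd.DataFrame({
    "time": pd.to_datetime(["2018-07-10 08:00", "2018-07-10 08:05",
                            "2018-07-10 08:10"]),
    "wafer_id": ["W1", "W2", "W3"],
    "chamber_temp_C": [65.0, 65.1, 64.9],
})

# Hypothetical metrology results measured later in the line
cd = pd.DataFrame({
    "time": pd.to_datetime(["2018-07-10 09:02", "2018-07-10 09:07",
                            "2018-07-10 09:12"]),
    "wafer_id": ["W1", "W2", "W3"],
    "litho_cd_nm": [18.2, 18.5, 18.1],
})

# Align the two threads on wafer ID, then look for a temperature/CD relationship
merged = trace.merge(cd, on="wafer_id", suffixes=("_proc", "_metro"))
print(merged[["wafer_id", "chamber_temp_C", "litho_cd_nm"]])
print("correlation:", merged["chamber_temp_C"].corr(merged["litho_cd_nm"]))
```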

Rudolph engineers have worked to integrate multiple data threads (see Figure), aligning various forms of data into one database for analysis by Rudolph’s Yield Management System (YMS). “For a number of years we’ve been able to align data. The limitation was in the database: the data storage, the speed of retrieval and analysis were limitations. Recently new types of databases have come out, so that instead of relational, columnar-type databases, the new databases have been perfect for factory data analysis, for streaming data. That’s been a huge enabler for the industry,” he said.

Rudolph engineers have worked to integrate multiple data threads into one database.

Leveraging AI’s capabilities

A decade ago, Rudolph launched an early neural-network based system designed to help customers optimize yields. The software analyzed data from across a fab to learn from variations in the data.

“The problem back then was that neural networks of this kind used non-linear math that was too new for our conservative industry, an industry accustomed to first principle analytics. As artificial intelligence has been used in other industries, AI is becoming more accepted worldwide, and our industry is also looking at ways to leverage some of the capabilities of artificial intelligence,” he said.

Collecting and making use of data within a fab is “no small feat,” Plisinski said, but that leads to sharing and aligning data across the value chain: the wafer fab, packaging and assembly, and others.

“To gain increased insights from the data streams or digital threads, to bring these threads all together and make sense of all of it. It is what I call weaving a fabric of knowledge: taking individual data threads, bringing them together, and weaving a much clearer picture of what’s going on.”

Security concerns run deep

One of the biggest challenges is how to securely transfer data between the different factories that make up the supply chain. “Even if they are owned by one entity, transferring that large volume of data, even if it’s over a private dedicated network, is a big challenge. If you start to pick and choose to summarize the data, you are losing some of the benefit. Finding that balance is important.”

The semiconductor industry is gaining insights from companies analyzing, for instance, streaming video. The network infrastructures, compression algorithms, transfers of information from mobile wireless devices, and other technologies are making it easier to connect semiconductor fabs.

“Security is perhaps the biggest challenge. It’s a mental challenge as much as a technical one, and by that I mean there is more than reluctance, there’s a fundamental disdain for letting the data out of a factory, for even letting data into the factory,” he said.

Within fabs, there is a tug of war between equipment vendors which want to own the data and provide value-add services, and customers who argue that since they own the tools they own the data. The contentious debate grows more intense when vendors talk about taking data out of the fab. “That’s one of the challenges that the industry has to work on — the concerns around security and competitive information getting leaked out.” Developing a front-end process is “a multibillion dollar bet, and if that data leaks out it can be devastating to market-share leadership,” Plisinski said.

Early adopter stories

The challenge facing Rudolph and other companies is to convince their customers of the value of sharing data; that “the benefits will outweigh their concerns. Thus far, the proof of the benefit has been somewhat limited.”

“At least from a Rudolph perspective, we’ve had some early adopters that have seen some significant benefits. And I think as those stories get out there and as we start to highlight what some of these early adopters have seen, others at the executive level in these companies will start to question their teams about some of their assumptions and concerns. Eventually I think we’ll find a way forward. But right now that’s a significant challenge,” Plisinski said.

It is a classic chicken-and-egg problem, making it harder to get beyond theories to case-study benefits. “What helped us is that some of the early adopters had complete control of their entire value chain. They were fully integrated. And so we were able to get over the concerns about data sharing and focus on the technical challenges of transferring all that data and centralizing it in one place for analytical purposes. From there we got to see the benefits and document them in a way that we could share with others, while protecting IP.”

Aggregating data, buying databases and analytical software, building algorithms – all cost money, in most cases adding up to millions of dollars. But if yields improve by a quarter or half a percent, the payback comes in six to eight months, he said.
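
A rough payback calculation along those lines might look like the following sketch; the investment, fab revenue and yield-gain figures are purely illustrative assumptions, not numbers from the interview:

```python
# Rough payback-period sketch with purely illustrative inputs.

investment = 3.5e6              # assumed cost of databases, software, algorithms (USD)
fab_revenue_per_month = 200e6   # assumed monthly fab revenue (USD)
yield_gain = 0.0025             # assumed 0.25% yield improvement

extra_margin_per_month = fab_revenue_per_month * yield_gain
payback_months = investment / extra_margin_per_month
print(f"payback period: {payback_months:.1f} months")   # ~7 months with these inputs
```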

“It’s a very conservative industry, an applied science type of industry. Trying to prove the value of software — a kind of black magic exercise — has always been difficult. But as the industry’s problems have become so complex, it is requiring these sophisticated software solutions.”

“We will have examples of successful case studies in our booth during SEMICON West. Anyone wanting further information is invited to stop by and talk to our experts,” adds Plisinski.

Directly converting electrical power to heat is easy. It regularly happens in your toaster, that is, if you make toast regularly. The opposite, converting heat into electrical power, isn’t so easy.

Researchers from Sandia National Laboratories have developed a tiny silicon-based device that can harness what was previously called waste heat and turn it into DC power. Their advance was recently published in Physical Review Applied.

This tiny silicon-based device developed at Sandia National Laboratories can catch and convert waste heat into electrical power. The rectenna, short for rectifying antenna, is made of common aluminum, silicon and silicon dioxide using standard processes from the integrated circuit industry. Credit: Photo by Randy Montoya/Sandia National Laboratories

“We have developed a new method for essentially recovering energy from waste heat. Car engines produce a lot of heat and that heat is just waste, right? So imagine if you could convert that engine heat into electrical power for a hybrid car. This is the first step in that direction, but much more work needs to be done,” said Paul Davids, a physicist and the principal investigator for the study.

“In the short term we’re looking to make a compact infrared power supply, perhaps to replace radioisotope thermoelectric generators.” Called RTGs, the generators are used for such tasks as powering sensors for space missions that don’t get enough direct sunlight to power solar panels.

Davids’ device is made of common and abundant materials, such as aluminum, silicon and silicon dioxide — or glass — combined in very uncommon ways.

Silicon device catches, channels and converts heat into power

Smaller than a pinkie nail, the device is about 1/8 inch by 1/8 inch, half as thick as a dime and metallically shiny. The top is aluminum that is etched with stripes roughly 20 times smaller than the width of a human hair. This pattern, though far too small to be seen by eye, serves as an antenna to catch the infrared radiation.

Between the aluminum top and the silicon bottom is a very thin layer of silicon dioxide. This layer is about 20 silicon atoms thick, or 16,000 times thinner than a human hair. The patterned and etched aluminum antenna channels the infrared radiation into this thin layer.
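
The feature sizes quoted above can be roughly cross-checked as follows; the human-hair diameter (~75 micrometers) and silicon atomic spacing (~0.235 nm) are assumed reference values, not figures from the article:

```python
# Rough consistency check of the feature sizes described above.
# Assumed reference values (not from the article): a human hair is
# ~75 micrometers wide, and a silicon atomic spacing is ~0.235 nm.

hair_um = 75.0
stripe_um = hair_um / 20            # "roughly 20 times smaller" than a hair
print(f"antenna stripe width: ~{stripe_um:.1f} micrometers")

si_spacing_nm = 0.235
oxide_nm = 20 * si_spacing_nm       # "about 20 silicon atoms thick"
ratio = hair_um * 1000 / oxide_nm
print(f"oxide layer: ~{oxide_nm:.1f} nm, roughly {ratio:,.0f}x thinner than a hair")
```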

The infrared radiation trapped in the silicon dioxide creates very fast electrical oscillations, about 50 trillion times a second. This pushes electrons back and forth between the aluminum and the silicon in an asymmetric manner. This process, called rectification, generates net DC electrical current.
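
The rectification step can be pictured with a toy numerical model: an oscillating drive pushed through an asymmetric, diode-like current response leaves a nonzero time average, i.e. a net DC component. The response function and numbers below are illustrative only, not device parameters from the paper:

```python
# Toy illustration of rectification: a symmetric oscillation driven through
# an asymmetric (diode-like) response yields a nonzero time-averaged (DC)
# current. All values are illustrative, not device parameters.
import math

f = 50e12                        # ~50 THz oscillation, as described above
periods = 10
steps = 20000

def asymmetric_current(v):
    # Hypothetical response: current flows much more easily in one direction
    return math.exp(v) - 1.0 if v >= 0 else 0.1 * (math.exp(v) - 1.0)

dt = periods / f / steps
total_charge = 0.0
for i in range(steps):
    t = i * dt
    v = 0.2 * math.sin(2 * math.pi * f * t)   # small symmetric drive
    total_charge += asymmetric_current(v) * dt

dc_current = total_charge / (periods / f)     # time-averaged current
print(f"net DC component: {dc_current:.4f} (arbitrary units)")
```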

The team calls its device an infrared rectenna, a portmanteau of rectifying antenna. It is a solid-state device with no moving parts to jam, bend or break, and doesn’t have to directly touch the heat source, which can cause thermal stress.

Infrared rectenna production uses common, scalable processes

Because the team makes the infrared rectenna with the same processes used by the integrated circuit industry, it’s readily scalable, said Joshua Shank, electrical engineer and the paper’s first author, who tested the devices and modeled the underlying physics while he was a Sandia postdoctoral fellow.

He added, “We’ve deliberately focused on common materials and processes that are scalable. In theory, any commercial integrated circuit fabrication facility could make these rectennas.”

That isn’t to say creating the current device was easy. Rob Jarecki, the fabrication engineer who led process development, said, “There’s immense complexity under the hood and the devices require all kinds of processing tricks to build them.”

One of the biggest fabrication challenges was inserting small amounts of other elements into the silicon, or doping it, so that it would reflect infrared light like a metal, said Jarecki. “Typically you don’t dope silicon to death, you don’t try to turn it into a metal, because you have metals for that. In this case we needed it doped as much as possible without wrecking the material.”

The devices were made at Sandia’s Microsystems Engineering, Science and Applications Complex. The team has been issued a patent for the infrared rectenna and has filed several additional patents.

The version of the infrared rectenna the team reported in Physical Review Applied produces 8 nanowatts of power per square centimeter from a specialized heat lamp at 840 degrees. For context, a typical solar-powered calculator uses about 5 microwatts, so a sheet of infrared rectennas slightly larger than a standard piece of paper would be needed to power a calculator. The team therefore has many ideas for future improvements to make the infrared rectenna more efficient.
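
The paper-sized-sheet comparison is easy to verify; the only assumption added below is the size of a letter sheet:

```python
# Quick check of the calculator comparison above.

power_density = 8e-9        # reported output: 8 nanowatts per square centimeter
calculator_power = 5e-6     # typical solar calculator: ~5 microwatts

area_cm2 = calculator_power / power_density
print(f"required rectenna area: {area_cm2:.0f} cm^2")      # 625 cm^2

# A US letter sheet, 21.6 cm x 27.9 cm, is about 603 cm^2, so the required
# area is indeed slightly larger than a standard piece of paper.
print(f"letter-size sheet: {21.6 * 27.9:.0f} cm^2")
```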

Future work to improve infrared rectenna efficiency

These ideas include making the rectenna’s top pattern 2D x’s instead of 1D stripes, in order to absorb infrared light over all polarizations; redesigning the rectifying layer to be a full-wave rectifier instead of the current half-wave rectifier; and making the infrared rectenna on a thinner silicon wafer to minimize power loss due to resistance.

Through improved design and greater conversion efficiency, the power output per unit area will increase. Davids thinks that within five years, the infrared rectenna may be a good alternative to RTGs for compact power supplies.

Shank said, “We need to continue to improve in order to be comparable to RTGs, but the rectennas will be useful for any application where you need something to work reliably for a long time and where you can’t go in and just change the battery. However, we’re not going to be an alternative for solar panels as a source of grid-scale power, at least not in the near term.”

Davids added, “We’ve been whittling away at the problem and now we’re beginning to get to the point where we’re seeing relatively large gains in power conversion, and I think that there’s a path forward as an alternative to thermoelectrics. It feels good to get to this point. It would be great if we could scale it up and change the world.”

Smart technologies take center stage tomorrow as SEMICON West, the flagship U.S. event for connecting the electronics manufacturing supply chain, opens for three days of insights into leading technologies and applications that will power future industry expansion. Building on this year’s record-breaking industry growth, SEMICON West – July 10-12, 2018, at the Moscone Center in San Francisco – spotlights how cognitive learning technologies and other disruptors will transform industries and lives.

Themed BEYOND SMART and presented by SEMI, SEMICON West 2018 features top technologists and industry leaders highlighting the significance of artificial intelligence (AI) and the latest technologies and trends in smart transportation, smart manufacturing, smart medtech, smart data, big data, blockchain and the Internet of Things (IoT).

Seven keynotes and more than 250 subject matter experts will offer insights into critical opportunities and issues across the global microelectronics supply chain. The event also features new Smart Pavilions to showcase interactive technologies for immersive, virtual experiences.

Smart transportation and smart manufacturing pavilions: Applying AI to accelerate capabilities

Automotive leads all new applications in semiconductor growth and is a major demand driver for technologies in related segments such as MEMS and sensors. The SEMICON West Smart Transportation and Smart Manufacturing pavilions showcase AI breakthroughs that are enabling more intelligent transportation performance and manufacturing processes, increasing yields and profits, and spurring innovation across the industry.

Smart workforce pavilion: Connecting next-generation talent with the microelectronics industry

SEMICON West also tackles the vital industry issue of how to attract new talent with the skills to deliver future innovations. Reliant on a highly skilled workforce, the industry today faces thousands of job openings, fierce competition for workers and the need to strengthen its talent pipeline. Educational and engaging, the Smart Workforce Pavilion connects the microelectronics industry with college students and entry-level professionals.

In the Workforce Pavilion “Meet the Experts” Theater, recruiters from top companies are available for on-the-spot interviews, while career coaches offer mentoring, tips on cover letter and resume writing, job-search guidance, and more. SEMI will also host High Tech U (HTU) in conjunction with the SEMICON West Smart Workforce Pavilion. The highly interactive program supported by Advantest, Edwards, KLA-Tencor and TEL exposes high school students to STEM education pathways and useful insights about careers in the industry.

By Paula Doe, SEMI

With artificial intelligence (AI) rapidly evolving, look for applications like voice recognition and image recognition to get more efficient, more affordable, and far more common in a variety of products over the next few years. This growth in applications will drive demand for new architectures that deliver the higher performance and lower power consumption required for widespread AI adoption.

“The challenge for AI at the edge is to optimize the whole system-on-a-chip architecture and its components, all the way to semiconductor technology IP blocks, to process complex AI workloads quickly and at low power,” says Qualcomm Technologies Senior Director of Engineering Evgeni Gousev, who will provide an update on the progress of AI at the edge in a Data and AI program at SEMICON West, July 10-12 in San Francisco.

Qualcomm Snapdragon 845 uses heterogeneous computing across the CPU, GPU, and DSP for power-efficient processing for constantly evolving AI models. Source: Qualcomm

A system approach that optimizes across hardware, software, and algorithms is necessary to deliver the ultra-low power – down to a sub-1-milliwatt level, low enough to enable always-on machine vision processing – for usually energy-intensive AI computing. From the chip architecture perspective, processing AI workloads with the most appropriate engine, such as the CPU, GPU, or DSP with dedicated hardware acceleration, provides the best power efficiency – and the flexibility to deal with rapidly changing AI models and a growing diversity of applications.
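
The idea of routing each workload to the most appropriate engine can be sketched as follows; the engine names, thresholds and workload properties are hypothetical illustrations, not Qualcomm’s actual scheduling logic:

```python
# Hypothetical sketch of heterogeneous dispatch: route each AI workload to
# the engine (CPU, GPU, or DSP) expected to run it most power-efficiently.
# The heuristics and numbers are illustrative only.

def choose_engine(workload):
    """Pick a compute engine from coarse workload properties."""
    if workload["always_on"] and workload["ops_per_inference"] < 1e6:
        return "DSP"     # low-power engine for small, always-on models
    if workload["parallelism"] == "high":
        return "GPU"     # throughput-oriented, data-parallel layers
    return "CPU"         # control-heavy or irregular workloads

workloads = [
    {"name": "keyword spotting", "always_on": True,
     "ops_per_inference": 2e5, "parallelism": "low"},
    {"name": "camera object detection", "always_on": False,
     "ops_per_inference": 5e9, "parallelism": "high"},
    {"name": "on-device ranking", "always_on": False,
     "ops_per_inference": 1e7, "parallelism": "low"},
]

for w in workloads:
    print(f"{w['name']:>25} -> {choose_engine(w)}")
```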

“But we’re going to run out of brute force options, so the future opportunity is more innovations with new architectures, dedicated hardware, new algorithms, and new software.” – Evgeni Gousev, Qualcomm Technologies

“So far it’s been largely a brute force approach using conventional architectures and cloud-based infrastructure,” says Evgeni. “But we’re going to run out of brute force options, so future opportunities lie in developing innovative architectures, dedicated hardware, new algorithms, and new software. Innovation will be especially important for AI at the edge and applications requiring always-on functionality. Training is mostly in the cloud now, but in the near future it will start migrating to the device as the algorithms and hardware improve. AI at the edge will also remove some privacy concerns, an increasingly important issue for data collection and management.”

Practical AI applications at the edge where resources are constrained run the gamut, spanning smartphones, drones, autonomous vehicles, virtual reality, augmented reality and smart home solutions such as connected cameras. “More AI on the edge will create a huge opportunity for the whole ecosystem – chip designers, semiconductor and device manufacturers, applications developers, and data and service providers. And it’s going to make a significant impact on the way we work, live, and interact with the world around us,” Evgeni said.

Future generations of chips may need more disruptive systems-level change to handle high data volumes with low power

A next-generation solution for handling the massive proliferation of AI data could be a nanotechnology system, such as the collaborative N3XT (Nano-Engineered Computing Systems Technology) project, led by H.S. Philip Wong and Subhasish Mitra at Stanford. “Even with next-generation scaling of transistors and new memory chips, the bottlenecks in moving data in and out of memory for processing will remain,” says Mitra, another speaker in the SEMICON West program. “The true benefits of nanotechnology will only come from new architectures enabled by nanosystems. One thing we are certain of is that massively more capable and more energy-efficient systems will be necessary for almost any future application, so we will need to think about system-level improvements.”

Major improvements in handling high volumes of data with low energy use will require system-level improvements, such as monolithic 3D integration of carbon nanotube transistors in the multi-campus N3XT chip research effort. Source: Stanford University

That means carbon nanotube transistors for logic, high density non-volatile MRAM and ReRAM for memory, fine-grained monolithic 3D for integration, new architectures for computation immersed in memory, and new materials for heat removal. “The N3XT approach is key for the 1000X energy efficiency needed,” says Mitra.

“One thing we are certain of is that massively more capable and more energy efficient systems will be necessary for almost any future application, so we will need to think about system-level improvements.” – Subhasish Mitra, Stanford University

Researchers have demonstrated improvements in all these areas, including multiple hardware nanosystem prototypes targeting AI applications. The researchers have transferred multiple layers of as-grown carbon nanotubes to the target wafer to significantly improve CNT density. They have also developed a low-power TiN/HfOx/Pt ReRAM; the low processing temperatures of the CNT and ReRAM layers allow multiple vertical layers to be fabricated on top of one another for ultra-dense, fine-grained monolithic 3D integration.

Other speakers at the Data and AI TechXpot include Fram Akiki, VP Electronics, Siemens; Hariharan Ananthanarayanan, motion planning engineer, Osaro; and David Haynes, Sr. director, strategic marketing, Lam Research. See SEMICONWest.org.

A team headed by the TUM physicists Alexander Holleitner and Reinhard Kienberger has succeeded for the first time in generating ultrashort electric pulses on a chip using metal antennas only a few nanometers in size, then running the signals a few millimeters above the surface and reading them in again a controlled manner.

Classical electronics allows frequencies up to around 100 gigahertz. Optoelectronics uses electromagnetic phenomena starting at 10 terahertz. This range in between is referred to as the terahertz gap, since components for signal generation, conversion and detection have been extremely difficult to implement.

The TUM physicists Alexander Holleitner and Reinhard Kienberger succeeded in generating electric pulses in the frequency range up to 10 terahertz using tiny, so-called plasmonic antennas and running them over a chip. Researchers call antennas plasmonic if, because of their shape, they amplify the light intensity at the metal surfaces.

Asymmetric antennas

The shape of the antennas is important. They are asymmetrical: one side of the nanometer-sized metal structures is more pointed than the other. When a lens-focused laser pulse excites the antennas, they emit more electrons from their pointed side than from the opposite, flatter side. An electric current flows between the contacts – but only as long as the antennas are excited with the laser light.

“In photoemission, the light pulse causes electrons to be emitted from the metal into the vacuum,” explains Christoph Karnetzky, lead author of the Nature work. “All the lighting effects are stronger on the sharp side, including the photoemission that we use to generate a small amount of current.”

Ultrashort terahertz signals

The light pulses lasted only a few femtoseconds. Correspondingly short were the electrical pulses in the antennas. Technically, the structure is particularly interesting because the nano-antennas can be integrated into terahertz circuits a mere several millimeters across.

In this way, a femtosecond laser pulse with a frequency of 200 terahertz could generate an ultra-short terahertz signal with a frequency of up to 10 terahertz in the circuits on the chip, according to Karnetzky.

The researchers used sapphire as the chip material because it cannot be stimulated optically and, thus, causes no interference. With an eye on future applications, they used 1.5-micron wavelength lasers deployed in traditional internet fiber-optic cables.
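
The frequencies quoted above are mutually consistent, as a quick check shows (only the speed of light is assumed):

```python
# Consistency check: a 1.5-micrometer telecom laser corresponds to roughly
# 200 THz, matching the optical drive frequency cited above (f = c / wavelength).

c = 3.0e8                  # speed of light, m/s
wavelength = 1.5e-6        # 1.5 micrometers

f_optical = c / wavelength
print(f"optical drive frequency: {f_optical / 1e12:.0f} THz")   # ~200 THz

# The on-chip electrical signals reported above reach up to ~10 THz,
# about a factor of 20 below the optical drive.
print(f"ratio to 10 THz on-chip signal: {f_optical / 10e12:.0f}x")
```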

An amazing discovery

Holleitner and his colleagues made yet another amazing discovery: Both the electrical and the terahertz pulses were non-linearly dependent on the excitation power of the laser used. This indicates that the photoemission in the antennas is triggered by the absorption of multiple photons per light pulse.

“Such fast, nonlinear on-chip pulses did not exist hitherto,” says Alexander Holleitner. Utilizing this effect he hopes to discover even faster tunnel emission effects in the antennas and to use them for chip applications.

STMicroelectronics CEO Jean-Marc Chery and SEMI President and CEO Ajit Manocha will kick off the co-located SEMI MEMS & Sensors Industry Group’s (SEMI-MSIG’s) European MEMS & Sensors Summit 2018 and European Imaging & Sensors Summit (September 19-21 in Grenoble, France). Global technology leaders will examine the influence of megatrends, such as artificial and autonomous intelligence, hyperscale data centers, cybersecurity, authentication, human-machine interface, and virtual reality/augmented reality (VR/AR) on MEMS, sensors and imaging. Speakers will also explore new platforms, models and materials that support the performance and volume requirements of tomorrow’s MEMS, sensors and imaging devices.

In his executive keynote, NXP Semiconductors SVP/CTO Lars Reger will discuss the powerful decentralized ways that sensors allow cars to perform more human-like decision-making in autonomous driving. Mr. Reger will highlight a complex automotive ecosystem that requires both MEMS and non-MEMS sensors — as well as other electronic measurement and control systems — to advance the autonomous vehicles of today and tomorrow. CEA Leti CEO Emmanuel Sabonnadière will present on how innovation is feeding technology, providing an overview on operational excellence, innovations in technology, talent management and leadership. An additional executive keynote speaker from Renault will be announced soon.

“Our European Summits offer influential stakeholders a unique forum to explore the technological developments — and manufacturing and materials advancements — that will dramatically improve MEMS, sensors and imaging technologies — and the markets in which they play,” said Laith Altimime, president, SEMI Europe. “Whether partners, competitors, suppliers or end-customers, attendees will also benefit from mutual engagement during the exhibition and networking events that make our European Summits so unique.”

Other Highlights

  • Feature Presentations

o   Megatrends impacts on the MEMS business — Eric Mounier, Yole Développement

o   Future trends and drivers for sensors markets — Dr. Michael Alexander, Roland Berger

o   Disruption in the authentication sensor market — Manuel Tagliavini, IHS Markit

o   Image sensors technology innovations enabling market megatrends — Roberto Bez, LFoundry

o   Embracing design for manufacturing in MEMS – success and disappointment — Ian Roane, Micralyne

o   Advanced substrates for MEMS and photonic applications — Vesa-Pekka Lempinen, Okmetic Oy

o   Sensors enabling smart HMI — Christian Mandl, Infineon Technologies

o   Image and vision sensors, systems and applications for smart cities — Thierry Ligozat, Teledyne e2v

o   Trends and recent developments in 3D microscopy for biomedical applications — Michael Kempe, Carl Zeiss AG

o   AI-enabled imaging at the edge — Petronel Bigiogi, XPERI

  • MEMS and Imaging Technology Showcase — several strictly vetted companies will perform live demos of their MEMS-, imaging- or sensors-based products as they compete for audience votes.
  • Joint Show-Floor Exhibition
  • Networking events such as the welcome reception and a gala dinner held for both MEMS and Sensors and Imaging & Sensors Summit attendees
  • MEMS & Sensors Summit: stay in touch via Twitter at www.twitter.com (use #MEMSEU).
  • Imaging & Sensors Summit: stay in touch via Twitter at www.twitter.com (use #imagingEU).
  • Registration: registration is open now, with early-bird pricing available until August 17, 2018. Visit: http://www.semi.org/eu/mems-and-sensors-2018-registration

 

SEMI-MSIG’s Summits will be held at the WTC in Grenoble, France, in the heart of the French Silicon Valley (5-7 Place Robert Schuman, 38000 Grenoble, France). Premier sponsors of the Summits include: Gold Sponsors ASE Group, Presto Engineering, Inc. and SUSS MicroTec Group; Silver Sponsors Applied Materials, EV Group, LFoundry, and SPTS Technologies. Event sponsors include: JSR Micro N.V., Materion, Okmetic, and Trymax.