
By integrating the design of antenna and electronics, researchers have boosted the energy and spectrum efficiency for a new class of millimeter wave transmitters, allowing improved modulation and reduced generation of waste heat. The result could be longer talk time and higher data rates in millimeter wave wireless communication devices for future 5G applications.

The new co-design technique allows simultaneous optimization of the millimeter wave antennas and electronics. The hybrid devices use conventional materials and integrated circuit (IC) technology, meaning no changes would be required to manufacture and package them. The co-design scheme allows fabrication of multiple transmitters and receivers on the same IC chip or the same package, potentially enabling multiple-input-multiple-output (MIMO) systems as well as boosting data rates and link diversity.

Researchers from the Georgia Institute of Technology presented their proof-of-concept antenna-based outphasing transmitter on June 11 at the 2018 Radio Frequency Integrated Circuits Symposium (RFIC) in Philadelphia. Their other antenna-electronics co-design work was presented at the 2017 and 2018 IEEE International Solid-State Circuits Conference (ISSCC) and published in multiple peer-reviewed IEEE journals. The Intel Corporation and the U.S. Army Research Office sponsored the research.

Georgia Tech researchers are shown with electronics equipment and antenna setup used to measure far-field radiated output signal from millimeter wave transmitters. Shown are Graduate Research Assistant Huy Thong Nguyen, Graduate Research Assistant Sensen Li, and Assistant Professor Hua Wang. (Credit: Allison Carter, Georgia Tech)

“In this proof-of-concept, our electronics and antenna were designed so that they can work together to achieve a unique on-antenna outphasing active load modulation capability that significantly enhances the efficiency of the entire transmitter,” said Hua Wang, an assistant professor in Georgia Tech’s School of Electrical and Computer Engineering. “This system could replace many types of transmitters in wireless mobile devices, base stations and infrastructure links in data centers.”

Key to the new design is maintaining a high energy efficiency regardless of whether the device is operating at its peak or average output power. The efficiency of most conventional transmitters is high only at the peak power but drops substantially at low power levels, resulting in low efficiency when amplifying complex spectrally efficient modulations. Moreover, conventional transmitters often add the outputs from multiple electronics using lossy power combiner circuits, exacerbating the efficiency degradation.

“We are combining the output power through a dual-feed loop antenna, and by doing so with our innovation in the antenna and electronics, we can substantially improve the energy efficiency,” said Wang, who is the Demetrius T. Paris Professor in the School of Electrical and Computer Engineering. “The innovation in this particular design is to merge the antenna and electronics to achieve the so-called outphasing operation that dynamically modulates and optimizes the output voltages and currents of power transistors, so that the millimeter wave transmitter maintains a high energy efficiency both at the peak and average power.”

Beyond energy efficiency, the co-design also improves spectrum efficiency by allowing more complex modulation protocols. That enables transmission at higher data rates within fixed spectrum allocations, a significant challenge for 5G systems.

“Within the same channel bandwidth, the proposed transmitter can transmit six to ten times higher data rate,” Wang said. “Integrating the antenna gives us more degrees of freedom to explore design innovation, something that could not be done before.”

Sensen Li, a Georgia Tech graduate research assistant who received the Best Student Paper Award at the 2018 RFIC symposium, said the innovation resulted from bringing together two disciplines that have traditionally worked separately.

“We are merging the technologies of electronics and antennas, bringing these two disciplines together to break through limits,” he said. “These improvements could not be achieved by working on them independently. By taking advantage of this new co-design concept, we can further improve the performance of future wireless transmitters.”

The new designs have been implemented in 45-nanometer CMOS SOI IC devices and flip-chip packaged on high-frequency laminate boards, where testing has confirmed a minimum two-fold increase in energy efficiency, Wang said.

The antenna electronics co-design is enabled by exploring the unique nature of multi-feed antennas.

“An antenna structure with multiple feeds allows us to use multiple electronics to drive the antenna concurrently. Different from conventional single-feed antennas, multi-feed antennas can serve not only as radiating elements, but they can also function as signal processing units that interface among multiple electronic circuits,” Wang explained. “This opens a completely new design paradigm to have different electronic circuits driving the antenna collectively with different but optimized signal conditions, achieving unprecedented energy efficiency, spectral efficiency and reconfigurability.”

The cross-disciplinary co-design could also facilitate fabrication and operation of multiple transmitters and receivers on the same chip, allowing hundreds or even thousands of elements to work together as a whole system. “In massive MIMO systems, we need to have a lot of transmitters and receivers, so energy efficiency will become even more important,” Wang noted.

Having large numbers of elements working together becomes more practical at millimeter wave frequencies because the wavelength reduction means elements can be placed closer together to achieve compact systems, he pointed out. These factors could pave the way for new types of beamforming that are essential in future millimeter wave 5G systems.
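A rough sense of scale, using the generic antenna-array rule of thumb (not figures from the article) that elements are commonly spaced about half a wavelength apart, shows why dense arrays become practical as the frequency rises:

```python
C = 3e8  # speed of light, m/s

# Half-wavelength element spacing at a few representative carrier frequencies.
spacings = {f_ghz: C / (f_ghz * 1e9) / 2 * 1e3 for f_ghz in (2.4, 28, 60, 140)}
for f_ghz, mm in spacings.items():
    print(f"{f_ghz:>5} GHz: lambda/2 spacing = {mm:.2f} mm")
```

At 60 GHz the half-wavelength spacing is just 2.5 mm, so hundreds of elements can fit in a palm-sized array.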

Power demands could drive adoption of the technology for battery-powered devices, but Wang says the technology could also be useful for grid-powered systems such as base stations or wireless connections to replace cables in large data centers. In those applications, expanding data rates and reducing cooling needs could make the new devices attractive.

“Higher energy efficiency also means less energy will be converted to heat that must be removed to satisfy the thermal management,” he said. “In large data centers, even a small reduction in thermal load per device can add up. We hope to simplify the thermal requirements of these electronic devices.”

In addition to those already mentioned, the research team included Taiyun Chi, Huy Thong Nguyen and Tzu-Yuan Huang, all from Georgia Tech.

There are limits to how accurately you can measure things. Think of an X-ray image: it is likely quite blurry and something only an expert physician can interpret properly. The contrast between different tissues is rather poor but could be improved by longer exposure times, higher intensity, or by taking several images and overlapping them. But there are considerable limitations: humans can safely be exposed to only so much radiation, and imaging takes time and resources.

A well-established rule of thumb is the so-called standard quantum limit: the measurement uncertainty scales inversely with the square root of the available resources. In other words, the more resources – time, radiation power, number of images, etc. – you throw in, the more accurate your measurement will be. This will, however, only get you so far: extreme precision also means using excessive resources.
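A quick numerical illustration of this scaling (a generic averaging simulation, not the experiment described here): averaging N independent noisy readings shrinks the error of the estimate by roughly 1/√N, so each ten-fold gain in accuracy costs a hundred-fold in resources.

```python
import numpy as np

rng = np.random.default_rng(0)

# Average N independent noisy readings of a true value of 1.0; the error of
# the mean shrinks as 1/sqrt(N) -- the standard-quantum-limit scaling.
errors = []
for n in (100, 10_000, 1_000_000):
    trials = [abs(rng.normal(1.0, 1.0, n).mean() - 1.0) for _ in range(100)]
    errors.append(float(np.mean(trials)))
    print(f"N={n:>9}: typical error {errors[-1]:.4f}")
```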

A team of researchers from Aalto University, ETH Zurich, and the MIPT and Landau Institute in Moscow have pushed the envelope and come up with a way to measure magnetic fields using a quantum system – with accuracy beyond the standard quantum limit.

An artificial atom realised from superconducting strips of aluminum on a silicon chip can be employed for the detection of magnetic fields. Credit: Babi Brasileiro / Aalto University

The detection of magnetic fields is important in a variety of fields, from geological prospecting to imaging brain activity. The researchers believe that their work is a first step towards using quantum-enhanced methods for sensor technology.

‘We wanted to design a highly efficient but minimally invasive measurement technique. Imagine, for example, extremely sensitive samples: we have to either use as low intensities as possible to observe the samples or push the measurement time to a minimum,’ explains Sorin Paraoanu, leader of the Kvantti research group at Aalto University.

Their paper, published in the prestigious journal npj Quantum Information, shows how to improve the accuracy of magnetic field measurements by exploiting the coherence of a superconducting artificial atom, a qubit. It is a tiny device made of overlapping strips of aluminium evaporated on a silicon chip – a technology similar to the one used to fabricate the processors of mobile phones and computers.

When the device is cooled to a very low temperature, magic happens: the electrical current flows in it without any resistance and starts to display quantum mechanical properties similar to those of real atoms. When irradiated with a microwave pulse – not unlike the ones in household microwave ovens – the state of the artificial atom changes. It turns out that this change depends on the external magnetic field applied: measure the atom and you will figure out the magnetic field.

But to surpass the standard quantum limit, yet another trick had to be performed using a technique similar to a widely-applied branch of machine learning, pattern recognition.

‘We use an adaptive technique: first, we perform a measurement, and then, depending on the result, we let our pattern recognition algorithm decide how to change a control parameter in the next step in order to achieve the fastest estimation of the magnetic field,’ explains Andrey Lebedev, corresponding author from ETH Zurich, now at MIPT in Moscow.
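A heavily simplified sketch of the idea – emphatically not the authors' algorithm: a Ramsey-type qubit measurement returns '1' with probability (1 + cos(B·t))/2, and a grid posterior over candidate field values is updated after each shot. A fixed doubling schedule of evolution times stands in here for the pattern-recognition-driven choice of control parameter; all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
B_TRUE = 0.37          # the "unknown" field, arbitrary units (synthetic)

# Grid posterior over candidate field values; a measurement at evolution
# time t returns '1' with probability (1 + cos(B*t)) / 2.
b_grid = np.linspace(0.0, 1.0, 1001)
posterior = np.full(b_grid.size, 1.0 / b_grid.size)

for t in (1, 2, 4, 8, 16, 32):     # doubling schedule stands in for the
    for _ in range(25):            # adaptive, learned control choice
        p1 = (1 + np.cos(B_TRUE * t)) / 2
        outcome = rng.random() < p1
        like = (1 + np.cos(b_grid * t)) / 2
        posterior *= like if outcome else (1 - like)
        posterior /= posterior.sum()

estimate = b_grid[np.argmax(posterior)]
print(f"estimated field: {estimate:.3f} (true value {B_TRUE})")
```

Longer evolution times sharpen the posterior but alias it; the shorter-time data resolves the ambiguity, which is why the schedule spans several scales.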

‘This is a nice example of quantum technology at work: by combining a quantum phenomenon with a measurement technique based on supervised machine learning, we can enhance the sensitivity of magnetic field detectors to a realm that clearly breaks the standard quantum limit,’ Lebedev says.

Optimum Semiconductor Technologies, Inc., a fabless semiconductor company providing highly-integrated Systems on Chips (SoCs) for China’s thriving electronics markets, announced the GP8300 SoC. The GP8300 dramatically reduces chip cost, area, and power consumption for image recognition and object detection in a broad range of products such as self-driving cars, autonomous vehicles, smart cameras and other IoT edge devices.

Created in 28nm technology, the GP8300 includes four 2GHz ‘Unity’ CPU cores from General Processor Technologies (GPT) interconnected with a cache coherent memory supporting Heterogeneous Systems Architecture (HSA) processing for a common programming framework. The GP8300 also integrates four of GPT’s new 2GHz Variable Length Vector DSP (VLVm1) cores for signal processing applications. Within the chip, the out-of-order CPUs execute control code while very long vectors process data. In addition to these generalized compute units, the chip also integrates two 1GHz AI accelerators from GPT.

“The GP8300 brings together several of GPT’s innovative IP cores with underlying embedded artificial intelligence (eAI) algorithms in a highly-integrated design targeting a wide range of exciting applications,” said Gary Nacer, President and COO of Optimum. “The new SoC is one of the first CNN accelerators in China, and it provides the right combination of high performance, low power consumption, and the cost efficiency that our customers need as they create innovative new products.”

Building on the success of OST’s innovative SB3500 multithreaded heterogeneous computing platform for low-power software defined radio (SDR), the GP8300 represents a new architecture that achieves deep integration of eAI, edge computing, and communications on a single chip. OST provides support for CaffeNet-based training and tools for automatic fixed-point conversion and compression for inference.

A team headed by the TUM physicists Alexander Holleitner and Reinhard Kienberger has succeeded for the first time in generating ultrashort electric pulses on a chip using metal antennas only a few nanometers in size, then running the signals a few millimeters above the surface and reading them in again in a controlled manner.

Classical electronics allows frequencies up to around 100 gigahertz. Optoelectronics uses electromagnetic phenomena starting at 10 terahertz. This range in between is referred to as the terahertz gap, since components for signal generation, conversion and detection have been extremely difficult to implement.

The TUM physicists Alexander Holleitner and Reinhard Kienberger succeeded in generating electric pulses at frequencies up to 10 terahertz using tiny, so-called plasmonic antennas and running them over a chip. Researchers call antennas plasmonic if, because of their shape, they amplify the light intensity at the metal surfaces.

Asymmetric antennas

The shape of the antennas is important. They are asymmetrical: One side of the nanometer-sized metal structures is more pointed than the other. When a lens-focused laser pulse excites the antennas, they emit more electrons on their pointed side than on the opposite, flat side. An electric current flows between the contacts – but only as long as the antennas are excited with the laser light.

“In photoemission, the light pulse causes electrons to be emitted from the metal into the vacuum,” explains Christoph Karnetzky, lead author of the Nature work. “All the lighting effects are stronger on the sharp side, including the photoemission that we use to generate a small amount of current.”

Ultrashort terahertz signals

The light pulses lasted only a few femtoseconds, and the electrical pulses in the antennas were correspondingly short. Technically, the structure is particularly interesting because the nano-antennas can be integrated into terahertz circuits a mere several millimeters across.

In this way, a femtosecond laser pulse with a frequency of 200 terahertz could generate an ultra-short terahertz signal with a frequency of up to 10 terahertz in the circuits on the chip, according to Karnetzky.

The researchers used sapphire as the chip material because it cannot be stimulated optically and, thus, causes no interference. With an eye on future applications, they used 1.5-micron wavelength lasers deployed in traditional internet fiber-optic cables.

An amazing discovery

Holleitner and his colleagues made yet another amazing discovery: Both the electrical and the terahertz pulses were non-linearly dependent on the excitation power of the laser used. This indicates that the photoemission in the antennas is triggered by the absorption of multiple photons per light pulse.
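The multiphoton signature can be read off a log-log plot: if n photons are absorbed per emission event, the current scales as I ∝ Pⁿ, so the slope of log I versus log P gives n. A synthetic illustration (all values invented, not measurement data):

```python
import numpy as np

n_photons = 3                                    # assumed photon order
power = np.array([1.0, 2.0, 4.0, 8.0])           # relative laser power
current = 0.5 * power ** n_photons               # idealized photocurrent

# Slope of log(I) vs. log(P) recovers the photon order of the process.
slope = np.polyfit(np.log(power), np.log(current), 1)[0]
print(f"fitted photon order: {slope:.2f}")
```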

“Such fast, nonlinear on-chip pulses did not exist hitherto,” says Alexander Holleitner. Utilizing this effect he hopes to discover even faster tunnel emission effects in the antennas and to use them for chip applications.

IBM’s announcement that they had produced the world’s smallest computer back in March raised a few eyebrows at the University of Michigan, home of the previous champion of tiny computing.

Now, the Michigan team has gone even smaller, with a device that measures just 0.3 mm to a side—dwarfed by a grain of rice.

The reason for the curiosity is that IBM’s claim calls for a re-examination of what constitutes a computer. Previous systems, including the 2x2x4mm Michigan Micro Mote, retain their programming and data even when they are not externally powered.

Unplug a desktop computer, and its program and data are still there when it boots itself up once the power is back. These new microdevices, from IBM and now Michigan, lose all prior programming and data as soon as they lose power.

“We are not sure if they should be called computers or not. It’s more of a matter of opinion whether they have the minimum functionality required,” said David Blaauw, a professor of electrical and computer engineering, who led the development of the new system together with Dennis Sylvester, also a professor of ECE, and Jamie Phillips, an Arthur F. Thurnau Professor and professor of ECE.

In addition to the RAM and photovoltaics, the new computing devices have processors and wireless transmitters and receivers. Because they are too small to have conventional radio antennae, they receive and transmit data with visible light. A base station provides light for power and programming, and it receives the data.

One of the big challenges in making a computer about 1/10th the size of IBM’s was how to run at very low power when the system packaging had to be transparent. The light from the base station—and from the device’s own transmission LED—can induce currents in its tiny circuits.

“We basically had to invent new ways of approaching circuit design that would be equally low power but could also tolerate light,” Blaauw said.

For example, that meant exchanging diodes, which can act like tiny solar cells, for switched capacitors.

Another challenge was achieving high accuracy while running on low power, which makes many of the usual electrical signals (like charge, current and voltage) noisier.

Designed as a precision temperature sensor, the new device converts temperatures into time intervals, defined with electronic pulses. The intervals are measured on-chip against a steady time interval sent by the base station and then converted into a temperature. As a result, the computer can report temperatures in minuscule regions—such as a cluster of cells—with an error of about 0.1 degrees Celsius.
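A minimal sketch of this time-interval readout, with all constants hypothetical (the paper's actual calibration is not reproduced here): the sensor encodes temperature as a pulse interval that varies linearly with temperature, and comparing it against the base station's steady reference interval recovers the temperature.

```python
# Illustrative model: hypothetical constants, not the published calibration.
TAU0 = 1.000e-3      # sensor interval at T0, seconds (assumed)
K = 2.0e-3           # fractional interval change per degree C (assumed)
T0 = 37.0            # calibration temperature, degrees C
TAU_REF = 1.000e-3   # steady reference interval from the base station

def interval_from_temperature(t_celsius: float) -> float:
    """Sensor side: temperature -> temperature-dependent pulse interval."""
    return TAU0 * (1 + K * (t_celsius - T0))

def temperature_from_ratio(tau_meas: float) -> float:
    """Readout side: measured interval vs. reference interval -> temperature."""
    ratio = tau_meas / TAU_REF
    return T0 + (ratio * TAU_REF / TAU0 - 1) / K

tau = interval_from_temperature(37.85)
print(f"recovered temperature: {temperature_from_ratio(tau):.2f} C")
```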

The system is very flexible and could be reimagined for a variety of purposes, but the team chose precision temperature measurements because of a need in oncology. Their longstanding collaborator, Gary Luker, a professor of radiology and biomedical engineering, wants to answer questions about temperature in tumors.

Some studies suggest that tumors run hotter than normal tissue, but the data isn’t solid enough for confidence on the issue. Temperature may also help in evaluating cancer treatments.

“Since the temperature sensor is small and biocompatible, we can implant it into a mouse and cancer cells grow around it,” Luker said. “We are using this temperature sensor to investigate variations in temperature within a tumor versus normal tissue and if we can use changes in temperature to determine success or failure of therapy.”

Even as Luker’s experiments run, Blaauw, Sylvester and Phillips look forward to what purposes others will find for their latest microcomputing device.

“When we first made our millimeter system, we actually didn’t know exactly all the things it would be useful for. But once we published it, we started receiving dozens and dozens and dozens of inquiries,” Blaauw said.

And that device, the Michigan Micro Mote, may still turn out to be the world’s smallest computer – depending on what the community decides are a computer’s minimum requirements.

What good is a tiny computer? Applications of the Michigan Micro Mote:

  • Pressure sensing inside the eye for glaucoma diagnosis
  • Cancer studies
  • Oil reservoir monitoring
  • Biochemical process monitoring
  • Surveillance: audio and visual
  • Tiny snail studies

The study was presented June 21 at the 2018 Symposia on VLSI Technology and Circuits. The paper is titled “A 0.04mm3 16nW Wireless and Batteryless Sensor System with Integrated Cortex-M0+ Processor and Optical Communication for Cellular Temperature Measurement.”

The work was done in collaboration with Mie Fujitsu Semiconductor Ltd. Japan and Fujitsu Electronics America Inc.

Microelectrodes can be used for direct measurement of electrical signals in the brain or heart. These applications require soft materials, however. With existing methods, attaching electrodes to such materials poses significant challenges. A team at the Technical University of Munich (TUM) has now succeeded in printing electrodes directly onto several soft substrates.

Researchers from TUM and Forschungszentrum Jülich have successfully teamed up to perform inkjet printing onto a gummy bear. This might initially sound like scientists at play – but it may in fact point the way forward to major changes in medical diagnostics. For one thing, it was not an image or logo that Prof. Bernhard Wolfrum’s team deposited on the chewy candy, but rather a microelectrode array. These components, comprised of a large number of electrodes, can detect voltage changes resulting from activity in neurons or muscle cells, for example.

Researchers from the Technical University of Munich (TUM) have succeeded in printing microelectrode arrays directly onto several soft substrates. Soft materials are better suited for devices that directly measure electrical signals from organs like the brain or heart. Credit: N. Adly / TUM

Second, gummy bears have a property that is important when using microelectrode arrays in living cells: they are soft. Microelectrode arrays have been around for a long time. In their original form, they consist of hard materials such as silicon. This results in several disadvantages when they come into contact with living cells. In the laboratory, their hardness affects the shape and organization of the cells, for example. And inside the body, the hard materials can trigger inflammation or the loss of organ functionalities.

Rapid prototyping with inkjet printers

When electrode arrays are placed on soft materials, these problems are avoided. This has sparked intensive research into these solutions. Until now, most initiatives have used traditional methods, which are time-consuming and require access to expensive specialized laboratories. “If you instead print the electrodes, you can produce a prototype relatively quickly and cheaply. The same applies if you need to rework it,” says Bernhard Wolfrum, Professor of Neuroelectronics at TUM. “Rapid prototyping of this kind enables us to work in entirely new ways.”

Wolfrum and his team work with a high-tech version of an inkjet printer. The electrodes themselves are printed with carbon-based ink. To prevent the sensors from picking up stray signals, a neutral protective layer is then added to the carbon paths.

Materials for various applications

The researchers tested the process on various substrates: PDMS (polydimethylsiloxane), a soft silicone polymer; agarose, a substance commonly used in biology experiments; and finally various forms of gelatin, including a gummy bear that was first melted and then allowed to harden. Each of these materials has properties suitable for certain applications. For example, gelatin-coated implants can reduce unwanted reactions in living tissue.

Through experiments with cell cultures, the team was able to confirm that the sensors provide reliable measurements. With an average width of 30 micrometers, they also permit measurements on a single cell or just a few cells. This is difficult to achieve with established printing methods.

“The difficulty is in fine-tuning all of the components – both the technical set-up of the printer and the composition of the ink,” says Nouran Adly, the first author of the study. “In the case of PDMS, for example, we had to use a pre-treatment we developed just to get the ink to adhere to the surface.”

Wide range of potential applications

Printed microelectrode arrays on soft materials could be used in many different areas. They are suitable not only for rapid prototyping in research, but could also change the way patients are treated. “In the future, similar soft structures could be used to monitor nerve or heart functions in the body, for example, or even serve as a pacemaker,” says Prof. Wolfrum. At present he is working with his team to print more complex three-dimensional microelectrode arrays. They are also studying printable sensors that react selectively to chemical substances, and not only to voltage fluctuations.

Infrared spectroscopy is the benchmark method for detecting and analyzing organic compounds. But it requires complicated procedures and large, expensive instruments, making device miniaturization challenging and hindering its use for some industrial and medical applications and for data collection out in the field, such as for measuring pollutant concentrations. Furthermore, it is fundamentally limited by low sensitivities and therefore requires large sample amounts.

However, scientists at EPFL’s School of Engineering and at Australian National University (ANU) have developed a compact and sensitive nanophotonic system that can identify a molecule’s absorption characteristics without using conventional spectrometry.

The authors show a pixelated sensor metasurface for molecular spectroscopy. It consists of metapixels designed to concentrate light into nanometer-sized volumes in order to amplify and detect the absorption fingerprint of analyte molecules at specific resonance wavelengths. Simultaneous imaging-based read-out of all metapixels provides a spatial map of the molecular absorption fingerprint sampled at the individual resonance wavelengths. This pixelated absorption map can be seen as a two-dimensional barcode of the molecular fingerprint, which encodes the characteristic absorption bands as distinct features of the resulting image. Credit: EPFL

Their system consists of an engineered surface covered with hundreds of tiny sensors called metapixels, which can generate a distinct bar code for every molecule that the surface comes into contact with. These bar codes can be analyzed and classified at scale using advanced pattern recognition and sorting technology such as artificial neural networks. This research – which sits at the crossroads of physics, nanotechnology and big data – has been published in Science.

Translating molecules into bar codes

The chemical bonds in organic molecules each have a specific orientation and vibrational mode. That means every molecule has a set of characteristic energy levels, which are commonly located in the mid-infrared range – corresponding to wavelengths of around 4 to 10 microns. Therefore, each type of molecule absorbs light at different frequencies, giving each one a unique “signature.” Infrared spectroscopy detects whether a given molecule is present in a sample by seeing if the sample absorbs light rays at the molecule’s signature frequencies. However, such analyses require lab instruments with a hefty size and price tag.

The pioneering system developed by the EPFL scientists is both highly sensitive and capable of being miniaturized; it uses nanostructures that can trap light on the nanoscale and thereby provide very high detection levels for samples on the surface. “The molecules we want to detect are nanometric in scale, so bridging this size gap is an essential step,” says Hatice Altug, head of EPFL’s BioNanoPhotonic Systems Laboratory and a coauthor of the study.

The system’s nanostructures are grouped into what are called metapixels so that each one resonates at a different frequency. When a molecule comes into contact with the surface, the way the molecule absorbs light changes the behavior of all the metapixels it touches.

“Importantly, the metapixels are arranged in such a way that different vibrational frequencies are mapped to different areas on the surface,” says Andreas Tittl, lead author of the study.

This creates a pixelated map of light absorption that can be translated into a molecular bar code – all without using a spectrometer.
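A toy model of the barcode idea (all numbers invented for illustration): each metapixel resonates at one mid-infrared wavelength, and the metapixels whose resonances fall inside the molecule's absorption bands lose signal, yielding a binary pattern.

```python
import numpy as np

# 24 metapixels spanning the 4-10 micron mid-IR range mentioned above,
# probed against a fictitious molecule with two absorption bands.
pixel_resonances_um = np.linspace(4.0, 10.0, 24)
absorption_bands_um = [(5.7, 6.1), (8.0, 8.6)]

def barcode(resonances, bands, threshold=0.5):
    signal = np.ones_like(resonances)
    for lo, hi in bands:
        hit = (resonances >= lo) & (resonances <= hi)
        signal[hit] *= 0.2            # strong absorption at resonance
    return (signal < threshold).astype(int)

code = barcode(pixel_resonances_um, absorption_bands_um)
print("".join(map(str, code)))
```

Each distinct set of absorption bands lights up a distinct set of metapixels, which is what makes the pattern usable as a molecular fingerprint.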

The scientists have already used their system to detect polymers, pesticides and organic compounds. What’s more, their system is compatible with CMOS technology.

“Thanks to our sensors’ unique optical properties, we can generate bar codes even with broadband light sources and detectors,” says Aleksandrs Leitis, a coauthor of the study.

There are a number of potential applications for this new system. “For instance, it could be used to make portable medical testing devices that generate bar codes for each of the biomarkers found in a blood sample,” says Dragomir Neshev, another coauthor of the study.

Artificial intelligence could be used in conjunction with this new technology to create and process a whole library of molecular bar codes for compounds ranging from proteins and DNA to pesticides and polymers. That would give researchers a new tool for quickly and accurately spotting minuscule amounts of compounds present in complex samples.

Imec today announced at the International Microwave Symposium (IMS, Philadelphia, USA), the world’s first CMOS 140GHz radar-on-chip system with integrated antennas in standard 28nm technology. The achievement is an important step in the development of radar-based sensors for a myriad of smart intuitive applications, such as building security, remote health monitoring of car drivers, breathing and heart rate of patients, and gesture recognition for man-machine interaction.

Radars are extremely promising as sensors for contactless, non-intrusive interaction in internet-of-things applications such as people detection & classification, vital signs monitoring and gesture interfacing. Wide adoption will only be possible if radars achieve higher resolution, become much smaller and more power-efficient, and get cheaper to produce and to buy. This is what imec’s research on 140GHz radar technology targets.

This low-power 140GHz radar solution comprises an imec proprietary two-antenna SISO (Single Input Single Output) radar transceiver chip with a frequency-modulated continuous-wave phase-locked loop (FMCW PLL), off-the-shelf ADCs and an FPGA, and a Matlab processing chain. The transceiver features on-chip antennas achieving a gain close to 3dBi. Excellent radar link budgets are achieved thanks to a transmitter Effective Isotropic Radiated Power (EIRP) exceeding 9dBm and a receiver noise figure below 6.4dB. The total power consumption for transmitter and receiver remains below 500mW, which can be further reduced by duty cycling. The FMCW PLL enables fast slopes up to 500MHz/ms over a 10GHz bandwidth around 140GHz with a slope linearity error below 0.5%, and has a power consumption below 50mW. The FPGA contains real-time implementations of basic radar processing functions such as FFTs (Fast Fourier Transforms) and filters, and is complemented by a Matlab chain for detection, CFAR (Constant False Alarm Rate) thresholding, direction-of-arrival estimation and other advanced radar processing.
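The core of FMCW range processing is standard textbook mathematics, sketched below with assumed ADC parameters (not imec's implementation): a chirp reflected from a target at range R returns with a beat frequency f_b = 2·R·slope/c, which a range FFT recovers.

```python
import numpy as np

c = 3e8
SLOPE = 500e6 / 1e-3                 # 500 MHz/ms chirp slope (from the article)
BW = 10e9                            # full sweep bandwidth (from the article)
FS, N = 100e3, 200                   # assumed ADC rate and samples per chirp
R_TRUE = 0.9                         # assumed target range, meters

# The de-chirped receive signal is a tone at the beat frequency.
t = np.arange(N) / FS
f_beat = 2 * R_TRUE * SLOPE / c
beat = np.cos(2 * np.pi * f_beat * t)

# Range FFT: the peak bin maps back to target range.
spectrum = np.abs(np.fft.rfft(beat))
r_est = np.argmax(spectrum) * (FS / N) * c / (2 * SLOPE)
print(f"estimated range: {r_est:.2f} m")
print(f"range resolution of the full 10 GHz sweep: {100 * c / (2 * BW):.1f} cm")
```

Note that c/(2·BW) for a 10 GHz sweep is 1.5 cm, matching the resolution figure quoted below.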

“With our prototype radar, we have demonstrated all critical specs for radar technology in 28nm standard CMOS technology,” said Wim Van Thillo, IoT program director at imec. “We are well advanced in incorporating multiple antenna paths in our most recent generation solution, which will enable a fine angular resolution of 1.5cm in a complete MIMO radar form factor of only a few square centimeters. We expect this prototype in the lab by the end of 2018, at which point our partners can start building their application demonstrators. First applications are expected to be person detection and classification for smart buildings, remote car driver vital signs monitoring (as cars evolve towards self-driving vehicles), and gesture recognition for intuitive man-machine interactions. Plenty more innovations will be enabled by this technology, once app developers start working with it.”

This imec 140GHz radar open innovation R&D collaborative program has been endorsed by Panasonic, and imec invites potential interested parties to join.

At this week’s 2018 IEEE International Interconnect Technology Conference (IITC 2018), imec will present 11 papers on advanced interconnects, ranging from extending Cu and Co damascene metallization, all the way to evaluating new alternatives such as Ru and graphene. After careful evaluation of the resistance and reliability behavior, imec takes first steps towards extending conventional metallization into the 3nm technology node.

For almost two decades, Cu-based dual damascene has been the workhorse industrial process flow for building reliable interconnects. But when downscaling logic device technology towards the 5nm and 3nm technology nodes, meeting resistance and reliability requirements for the tightly pitched Cu lines has become increasingly challenging. The industry, however, favors extending the current damascene technology as long as possible, and different solutions have emerged to do so.

To set the limits of scaling, imec has benchmarked the resistance of Cu with respect to Co and Ru in a damascene vehicle with scaled dimensions, demonstrating that Cu still outperforms Co for wire cross sections down to 300nm2 (or linewidths of 12nm), which corresponds to the 3nm technology node. To meet reliability requirements, one option is to use Cu in combination with thin diffusion barriers such as tantalum nitride (TaN) and liners such as Co or Ru. It was found that the TaN diffusion barrier can be scaled to below 2nm while maintaining excellent Cu diffusion barrier properties.
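To see why barrier scaling matters so much at these dimensions, consider the cross-section budget: TaN is far more resistive than Cu, so barrier thickness directly subtracts from the conducting core. A back-of-the-envelope sketch — the 12nm × 25nm trench geometry is our assumption, chosen only to be consistent with the 300nm² cross section above, with the barrier assumed on sidewalls and trench bottom:

```python
def cu_core_area(width_nm, height_nm, barrier_nm):
    """Cu cross-section left after a diffusion barrier lines the
    sidewalls and bottom of the trench (the top is capped after fill)."""
    return (width_nm - 2 * barrier_nm) * (height_nm - barrier_nm)

trench = 12 * 25                    # 300 nm^2 trench, ~3nm-node linewidth
core_2nm = cu_core_area(12, 25, 2)  # 184 nm^2 -> ~39% of the area lost
core_1nm = cu_core_area(12, 25, 1)  # 240 nm^2 -> ~20% of the area lost
```

Under these assumptions, shaving the barrier from 2nm to 1nm recovers roughly a fifth of the trench cross section for Cu, which is why sub-2nm barriers (and barrierless liners) are pursued.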

For Cu linewidths down to 15–12nm, imec also modeled the impact of the interconnect line-edge roughness on the system-level performance. Line-edge roughness is caused by the lithographic and patterning steps of interconnect wires, resulting in small variations in wire width and spacing. At small pitches, these can affect the Cu interconnect resistance and variability. Although there is a significant impact of line-edge roughness on the resistance distribution for short Cu wires, the effect largely averages out at the system level.
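The averaging-out argument is essentially a central-limit effect: a long net behaves like many short, independently rough segments in series, so the relative spread of the total resistance shrinks as 1/√N. A toy Monte Carlo with invented numbers (this is not imec's system-level model) illustrates the trend:

```python
import numpy as np

rng = np.random.default_rng(42)

def relative_spread(n_segments, trials=20000, sigma=0.2):
    """Relative standard deviation of the total resistance of
    n_segments in series, each segment's resistance varying by
    a fraction `sigma` due to line-edge roughness (toy model)."""
    seg = rng.normal(1.0, sigma, size=(trials, n_segments))
    total = seg.sum(axis=1)
    return total.std() / total.mean()

short_wire = relative_spread(4)    # few segments: large relative spread
long_net = relative_spread(400)    # many segments: spread averages out
```

With 100× more segments the relative spread drops by about 10×, matching the observation that line-edge roughness dominates short-wire resistance distributions but largely averages out at the system level.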

An alternative solution to extend the traditional damascene flow is replacing Cu by Co. Today, Co still requires a diffusion barrier – an option that recently gained industrial acceptance. A possible next step is to enable barrierless Co, or at least sub-nm barrier thickness, through careful interface engineering. Co has the clear advantage of lower resistance at smaller wire cross-sections and smaller vias. Based on electromigration and thermal storage experiments, imec presents a detailed study of the mechanisms that impact Co via reliability, showing the absence of voids in barrierless Co vias and demonstrating better scalability of Co towards smaller nodes.

The research is performed in cooperation with imec’s key nano interconnect program partners including GlobalFoundries, Huawei, Intel, Micron, Qualcomm, Samsung, SK Hynix, SanDisk/Western Digital, Sony Semiconductor Solutions, TOSHIBA Memory and TSMC.

Researchers at Seoul National University and Stanford University developed artificial mechanosensory nerves using flexible organic devices to emulate biological sensory afferent nerves. They used the artificial mechanosensory nerves to control a disabled insect leg and distinguish braille characters.

Compared to conventional digital computers, the biological nervous system is powerful for real-world problems such as visual image processing, voice recognition, tactile sensing, and movement control. This has inspired scientists and engineers to work on neuromorphic computing, bioinspired sensors, robot control, and prosthetics. Previous approaches involved implementations at the software level on conventional digital computers, and circuit designs using classical silicon devices, which have shown critical issues related to power consumption, cost, and multifunctionality.

The research describes artificial mechanosensory nerves based on flexible organic devices to emulate biological mechanosensory nerves. “The recently found mechanisms of information processing in biological mechanosensory nerves were adopted in our artificial system,” said Zhenan Bao at Stanford University.

The artificial mechanosensory nerves are composed of three essential components: mechanoreceptors (resistive pressure sensors), neurons (organic ring oscillators), and synapses (organic electrochemical transistors). The pressure information from artificial mechanoreceptors can be converted to action potentials through artificial neurons. Multiple action potentials can be integrated into an artificial synapse to actuate biological muscles and recognize braille characters.
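A minimal functional sketch of this three-stage chain is given below. The transfer functions and all numbers are invented for illustration — the actual devices are analog organic circuits, not software — but the signal flow matches the description: pressure modulates a receptor, the receptor drives an oscillating "neuron" whose firing rate tracks the drive, and a "synapse" integrates the resulting spikes:

```python
import numpy as np

def mechanoreceptor(pressure):
    """Resistive pressure sensor: output rises with pressure (toy model),
    normalized to the range 0..1."""
    return pressure / (1.0 + pressure)

def neuron_spike_train(drive, duration=1.0, f_max=100.0, dt=1e-3):
    """Ring-oscillator 'neuron': emits one spike per oscillation cycle,
    with oscillation frequency proportional to the receptor drive."""
    t = np.arange(0.0, duration, dt)
    phase = f_max * drive * t
    return np.diff(np.floor(phase), prepend=0.0) > 0  # True at each cycle

def synapse(spikes, tau=0.05, weight=0.1, dt=1e-3):
    """Electrochemical-transistor 'synapse': leaky integration of spikes
    (simple Euler step of dV/dt = -V/tau + weight * spike_train)."""
    out, trace = 0.0, []
    for s in spikes:
        out += (-out / tau) * dt + (weight if s else 0.0)
        trace.append(out)
    return np.array(trace)

# Higher pressure -> faster firing -> larger integrated synaptic output
low = synapse(neuron_spike_train(mechanoreceptor(0.2)))
high = synapse(neuron_spike_train(mechanoreceptor(2.0)))
```

The last two lines mimic the distinguishing step: two pressure levels produce clearly separable synaptic outputs, which is the property the researchers exploit to discriminate textures and braille patterns.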

Devices that mimic the signal processing and functionality of biological systems can simplify the design of bioinspired systems or reduce power consumption. The researchers said organic devices are advantageous because their functional properties can be tuned, they can be printed over a large area at low cost, and they are flexible like soft biological systems.

Wentao Xu, a researcher at Seoul National University, and Yeongin Kim and Alex Chortos, graduate students at Stanford University, used their artificial mechanosensory nerves to detect large-scale textures and object movements and to distinguish braille characters. They also connected the artificial mechanosensory nerves to motor nerves in a detached insect leg to control its muscles.

Tae-Woo Lee, a professor at Seoul National University, said, “Our artificial mechanosensory nerves can be used for bioinspired robots and prosthetics compatible with and comfortable for humans.” Lee added, “The development of human-like robots and prosthetics that help people with neurological disabilities can benefit from our work.”