
Kirigami (also called “paper-cuts” or “jianzhi”) is one of the most traditional Chinese folk arts. It is widely used in window decorations, gift cards, festivals, and ceremonies. Kirigami involves cutting and folding flat sheets into 3D shapes. Recently, the techniques of this ancient art have been applied in various scientific and technological fields, including designs for solar arrays, biomedical devices, and micro-/nanoelectromechanical systems (MEMS/NEMS).

Macroscopic paper-cuts in a paper sheet and nano-kirigami in an 80-nm thick gold film. Credit: Institute of Physics

Dr. LI Jiafang, from the Institute of Physics (IOP), Chinese Academy of Sciences, has recently formed an international team to apply kirigami techniques to advanced 3D nanofabrication.

Inspired by a traditional Chinese kirigami design called “pulling flower,” the team developed a direct nano-kirigami method to work with flat films at the nanoscale. They utilized a focused ion beam (FIB) instead of knives/scissors to cut a precise pattern in a free-standing gold nanofilm, then used the same FIB, instead of hands, to gradually “pull” the nanopattern into a complex 3D shape.

The “pulling” forces were induced by heterogeneous vacancies (introducing tensile stress) and the implanted ions (introducing compressive stress) within the gold nanofilm during FIB irradiation.

By utilizing the topography-guided stress equilibrium within the nanofilm, versatile 3D shape transformations such as upward buckling, downward bending, complex rotation and twisting of nanostructures were precisely achieved.

While previous attempts to create functional kirigami devices have used complicated sequential procedures and have been primarily aimed at realizing mechanical rather than optical functions, this new nano-kirigami method, in contrast, can be implemented in a single fabrication step and could be used to perform a number of optical functions.

For a proof-of-concept demonstration, the team produced a 3D pinwheel-like structure with giant optical chirality. The nanodevice achieved efficient manipulation of “left-handed” and “right-handed” circularly polarized light and exhibited strong uniaxial optical rotation effects in telecommunication wavelengths.
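
The chirality figures of merit mentioned here – circular dichroism and optical rotation – follow directly from a structure’s complex transmission coefficients for the two circular polarizations. A minimal sketch (the coefficient values are invented for illustration and are not taken from the paper):

```python
import cmath

def chirality_metrics(t_lcp: complex, t_rcp: complex):
    """Circular dichroism (transmittance difference) and optical rotation
    derived from complex transmission coefficients for left- and
    right-handed circularly polarized light."""
    cd = abs(t_lcp) ** 2 - abs(t_rcp) ** 2                     # circular dichroism
    rotation = (cmath.phase(t_lcp) - cmath.phase(t_rcp)) / 2   # rotation angle, radians
    return cd, rotation

# Hypothetical transmission coefficients for a chiral pinwheel-like structure
cd, rot = chirality_metrics(0.9 * cmath.exp(0.2j), 0.5 * cmath.exp(-0.1j))
```

Circular dichroism here is the transmittance difference between the two handednesses; the rotation of a linearly polarized input is half the phase difference between them.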

In this way, the team demonstrated a multidisciplinary connection between the two fields of nanomechanics and nanophotonics. This may represent a brand new direction for emerging kirigami research.

The team also developed a theoretical model to elucidate the dynamics during the nano-kirigami fabrication. This is of great significance since it allows researchers to design 3D nanogeometries based on desired optical functionalities. In contrast, previous studies relied heavily on intuitive designs.

In other words, in terms of geometric design, nano-kirigami offers an intelligent 3D nanofabrication method beyond traditional bottom-up, top-down and self-assembly nanofabrication techniques.

Its concept can be extended to broad nanofabrication platforms and could lead to the realization of complex optical nanostructures for sensing, computation, micro-/nanoelectromechanical systems, or biomedical devices.

This work, entitled “Nano-kirigami with giant optical chirality,” was published in Science Advances on July 6, 2018.

By Pete Singer

Many new innovations were discussed at imec’s U.S. International Technology Forum (ITF) on Monday at the Grand Hyatt in San Francisco, including quantum computing, artificial intelligence, sub-3nm logic, memory computing, solid-state batteries, EUV, RF and photonics, but perhaps the most interesting was new technology that enables human cells, tissues and organs to be grown and analyzed on-chip.

After an introduction by SEMI President Ajit Manocha – who said he believes the semiconductor industry will reach $1 trillion in market size by 2030 (“there’s no shortage of killer applications,” he said) – Luc Van den hove, president and CEO of imec, kicked off the afternoon session, speaking about the many projects underway that bring leading microelectronics technologies to bear on today’s looming healthcare crisis. “We all live longer than ever before and that’s fantastic,” he said. “But by living longer we also spend a longer part of our life being ill. What we need is a shift from extending lifespan to extending healthspan. What we need is to find ways to cure and prevent some of these diseases like cancer, like heart diseases and especially dementia.”

Drug development today is time-consuming and costly largely because existing methodologies for drug-screening assays are insufficient. Current assays are based on poor cell models that limit the quality of the resulting data and lack biological relevance. They also lack spatial resolution, making it impossible to screen single cells within a cell culture. “It is rather slow, it is quite labor intensive and it provides limited information,” Van den hove said. “With our semiconductor platform we have developed recently a multi-electrode array (MEA) chip on which we can grow cells, in which we can grow tissue and organs. We can monitor processes that are happening within the cells or between the cells during massive drug testing.”

The MEA (see Figure) packs 16,384 electrodes, distributed over 16 wells, and offers multiparametric analysis. Each of the 1,024 electrodes in a well can detect intracellular action potentials in addition to the traditional extracellular signals. Further, imec’s chip is patterned with microstructures to allow for structured cell growth mimicking a specific organ.

A novel organ-on-chip platform for pharmacological studies with unprecedented signal quality. It fuses imec’s high-density multi-electrode array (MEA)-chip with a microfluidic well plate, developed in collaboration with Micronit Microtechnologies, in which cells can be cultured, providing an environment that mimics human physiology.

Earlier this year, in May at imec’s ITF forum in Europe, Veerle Reumers, project leader at imec, explained how the MEA works: “By using grooves, heart cells can, for example, grow into a more heart-like tissue. In this way, we fabricate miniature hearts-on-a-chip, making it possible to test the effect of drugs in a more biologically relevant context. Imec’s organ-on-chip platform is the first system that enables on-chip multi-well assays, which means that you can perform different experiments or – in other words – analyze different compounds, in parallel on a single chip,” Reumers explained. “This is a considerable increase in throughput compared to current single-well MEAs and we aim to further increase the throughput by adding more wells in a system.”

Van den hove said they have been testing the chip. “The beauty of the semiconductor platform is that we can, because of the miniaturization capability, parallelize an enormous amount of this testing and accelerate drug testing. We can measure what we never measured before, at speeds that you couldn’t think of before.”

He added that imec recently embarked on a new initiative, called Mission Lucidity, aimed at curing dementia. “Together with some of our clinical biomedical research teams, we are on a mission to decode dementia, to develop a cure to prevent this disease,” he said.

The MEA will be one tool used in the initiative, but also coming into play will be the group’s neuroprobes – which Van den hove said are among the world’s most advanced and are being used by nearly all the leading neuroscience research teams – along with next-generation wearables. “By combining these tools, we want to better understand the processes that are happening in the brain. We can measure those processes with much higher resolution than what could be done before. This may allow us to detect the onset of the disease earlier. By administering the right medication earlier, we hope to be able to prevent the disease from further progressing,” he said.

By Paul van der Heide, director of materials and components analysis, imec, Leuven, Belgium

To keep up with Moore’s Law, the semiconductor industry continues to push the envelope in developing new device architectures containing novel materials. This in turn pushes the need for new solid-state analytical capabilities, whether for materials characterization or inline metrology. Aside from basic R&D, these capabilities are established at critical points of the semiconductor device manufacturing line – to measure, for example, the thickness and composition of a thin film, dopant profiles of a transistor’s source/drain regions, or the nature of defects on a wafer’s surface. This approach is used to reduce “time to data”: we cannot wait until the end of the manufacturing line to know whether a device will be functional. Every process step costs money, and a fully functional device can take months to fabricate. Recent advances in instrumentation and computational power have opened the door to many new, exciting analytical possibilities.

One example that comes to mind concerns the development of coherent sources. So far, coherent photon sources have been used for probing the atomic and electronic structure of materials, but only within large, dedicated synchrotron radiation facilities. Through recent developments, tabletop coherent photon sources have been introduced that could soon see demand in the semiconductor lab/fab environment.

The increased computational power now at our fingertips is also allowing us to make the most of these and other sources through imaging techniques such as ptychography. Ptychography allows the complex patterns resulting from coherent electron or photon interaction with a sample to be processed into recognizable images, at a resolution close to the source’s wavelength, without the need for lenses (lenses tend to introduce aberrations). Potential application areas extend from non-destructive imaging of surface and subsurface structures to probing chemical reactions at sub-femtosecond timescales.

Detector developments are also benefiting many analytical techniques presently in use. As an example, transmission electron microscopy (TEM) and scanning transmission electron microscopy (STEM) can now image, with atomic resolution, heavy as well as light elements. Combining this with increased computational power allows for further development of imaging approaches such as tomography, holography, ptychography, and differential phase contrast imaging. All of these allow TEM/STEM not only to look at atoms in, e.g., 2D materials such as MoS2 in far greater detail, but also open the possibility of mapping electric fields and magnetic domains at unprecedented resolution.

The semiconductor industry is evolving at a very rapid pace. Since the beginning of the 21st century, we have seen numerous disruptive technologies emerge – technologies that must serve an increasingly fragmented application space. It’s no longer solely about the central processing unit (CPU). Other applications, ranging from the internet of things to autonomous vehicles and wearable human-electronics interfaces, are being pursued, each coming with unique requirements and analytical needs.

Looking ten to fifteen years ahead, we will witness a different landscape. I’m sure that existing techniques such as TEM/STEM will still be heavily used – probably more so than we realize now (we are already seeing TEM/STEM being extended into the fab). But we will also see developments that push the boundaries of what is possible, ranging from the increased use of hybrid metrology (combining results from multiple different analytical techniques and process steps) to the development of new, innovative approaches.

To illustrate the latter, take the example of secondary ion mass spectrometry (SIMS). In SIMS, an energetic ion beam is directed at the solid sample of interest, causing atoms in the near-surface region to leave the surface. A small percentage of them are ionized and pass through a mass spectrometer, which separates the ions according to their mass-to-charge ratio. When this is done in dynamic-SIMS mode, a depth profile of the sample’s composition can be derived. Today, with this technique, we cannot focus the incoming energetic ion beam into a confined volume, i.e., onto a spot that approaches the size of a transistor. But at imec, novel concepts were introduced, resulting in what are called 1.5D SIMS and self-focusing SIMS (SF-SIMS). These approaches are based on the detection of constituents within repeatable array structures, giving averaged and statistically significant information. In this way, the spatial resolution limit of SIMS was overcome.
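
The dynamic-SIMS depth profiling described above is commonly quantified with a relative sensitivity factor (RSF): the impurity concentration is the RSF times the ratio of impurity to matrix ion counts, while depth is accumulated from the sputter rate. A minimal sketch with invented numbers (the RSF, sputter rate, and counts are assumptions for illustration, not imec data):

```python
def sims_depth_profile(impurity_counts, matrix_counts, rsf, sputter_rate, cycle_time):
    """Convert raw dynamic-SIMS count rates into a depth profile using the
    standard relative-sensitivity-factor quantification:
        concentration = RSF * (I_impurity / I_matrix)
    Depth is accumulated assuming a constant sputter rate."""
    profile = []
    for cycle, (i_imp, i_mat) in enumerate(zip(impurity_counts, matrix_counts)):
        depth_nm = cycle * sputter_rate * cycle_time  # depth at this cycle, nm
        conc = rsf * i_imp / i_mat                    # concentration, atoms/cm^3
        profile.append((depth_nm, conc))
    return profile

# Hypothetical dopant-in-silicon profile; all parameter values are assumed
profile = sims_depth_profile(
    impurity_counts=[1000, 800, 400, 100],
    matrix_counts=[1e6] * 4,
    rsf=1e22,          # atoms/cm^3 per count ratio (assumed)
    sputter_rate=0.5,  # nm/s (assumed)
    cycle_time=2.0,    # s per measurement cycle (assumed)
)
```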

And there are exciting developments occurring here at imec in other analytical fields such as atom probe tomography (APT), photoelectron spectroscopy (PES), Raman spectroscopy, Rutherford backscattering spectrometry (RBS), and scanning probe microscopy (SPM). One important milestone has been the development of Fast Fourier Transform-SSRM (FFT-SSRM) at imec, which allows one to measure carrier distributions in FinFETs with unparalleled sensitivity.

Yet, probably the biggest challenge materials characterization and inline metrology face over the next ten to fifteen years will be how to keep costs down. Today, we make use of highly specialized techniques developed on mutually exclusive and costly platforms. But why not make use of micro-electro-mechanical systems (MEMS) that could simultaneously perform analysis in a highly parallel fashion, and perhaps even in situ? One can imagine scenarios in which an army of such units could scan an entire wafer in the fraction of the time it takes now, or alternatively, the incorporation of such units into wafer test structure regions.

By Dave Lammers

The semiconductor industry is collecting massive amounts of data from fab equipment and other sources. But is the trend toward using that data in a Smart Manufacturing or Industry 4.0 approach happening fast enough in what Mike Plisinski, CEO of Rudolph Technologies, calls a “very conservative” chip manufacturing sector?

“There are a lot of buzzwords being thrown around now, and much of it has existed for a long time with APC, FDC, and other existing capabilities. What was inhibiting the industry in the past was the ability to align this huge volume of data,” Plisinski said.

While the industry became successful at adding sensors to tools and collecting data, the ability to track that data and make use of it in predictive maintenance or other analytics thus far “has had minimal success,” he said. With fab processes and manufacturing supply chains getting more complex, customers are trying to figure out how to move beyond implementing statistical process control (SPC) on data streams.

What is the next step? Plisinski said now that individual processes are well understood, the next phase is data alignment across the fab’s systems. As control of leading-edge processes becomes more challenging, customers realize that the interactions between the process steps must be understood more deeply.

“Understanding these interactions requires aligning these digital threads and data streams. A customer needs to understand that when a chamber’s temperature changes by 0.1 degrees Celsius, it impacts the critical dimensions of the lithography process by X, Y, and Z. Understanding those interactions has been a significant challenge and is an area that we have focused on from a variety of angles over the last five years,” Plisinski said.

Rudolph engineers have worked to integrate multiple data threads (see Figure), aligning various forms of data into one database for analysis by Rudolph’s Yield Management System (YMS). “For a number of years we’ve been able to align data. The limitation was in the database: the data storage, the speed of retrieval and analysis were limitations. Recently new types of databases have come out, so that instead of relational, columnar-type databases, the new databases have been perfect for factory data analysis, for streaming data. That’s been a huge enabler for the industry,” he said.
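
The data alignment Plisinski describes boils down to “as-of” joins between asynchronous time-stamped streams: for each reading in one thread, find the most recent reading in another. A minimal standard-library sketch (the stream names and values are hypothetical, not Rudolph’s YMS):

```python
from bisect import bisect_right

def align_nearest_before(stream_a, stream_b):
    """For each (timestamp, value) reading in stream_a, attach the most
    recent stream_b reading at or before it -- a minimal 'merge as-of'
    join, the basic step in lining up, e.g., chamber-temperature and
    CD-measurement threads. Both streams must be sorted by timestamp."""
    b_times = [t for t, _ in stream_b]
    aligned = []
    for t, value_a in stream_a:
        i = bisect_right(b_times, t) - 1          # index of last b at or before t
        value_b = stream_b[i][1] if i >= 0 else None
        aligned.append((t, value_a, value_b))
    return aligned

# Hypothetical threads: litho CD measurements and etch-chamber temperatures
cd_stream = [(10, 45.1), (20, 45.4), (30, 45.0)]
temp_stream = [(8, 60.0), (18, 60.1), (28, 59.9)]
aligned = align_nearest_before(cd_stream, temp_stream)
```

Production systems do this at scale inside streaming-oriented databases, but the join semantics are the same.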

Rudolph engineers have worked to integrate multiple data threads into one database.

Leveraging AI’s capabilities

A decade ago, Rudolph launched an early neural-network based system designed to help customers optimize yields. The software analyzed data from across a fab to learn from variations in the data.

“The problem back then was that neural networks of this kind used non-linear math that was too new for our conservative industry, an industry accustomed to first-principles analytics. As artificial intelligence has been used in other industries, AI is becoming more accepted worldwide, and our industry is also looking at ways to leverage some of the capabilities of artificial intelligence,” he said.

Collecting and making use of data within a fab is “no small feat,” Plisinski said, but that leads to sharing and aligning data across the value chain: the wafer fab, packaging and assembly, and others.

“The goal is to gain increased insights from the data streams or digital threads – to bring these threads all together and make sense of all of it. It is what I call weaving a fabric of knowledge: taking individual data threads, bringing them together, and weaving a much clearer picture of what’s going on.”

Security concerns run deep

One of the biggest challenges is how to securely transfer data between the different factories that make up the supply chain. “Even if they are owned by one entity, transferring that large volume of data, even if it’s over a private dedicated network, is a big challenge. If you start to pick and choose to summarize the data, you are losing some of the benefit. Finding that balance is important.”

The semiconductor industry is gaining insights from companies analyzing, for instance, streaming video. The network infrastructures, compression algorithms, transfers of information from mobile wireless devices, and other technologies are making it easier to connect semiconductor fabs.

“Security is perhaps the biggest challenge. It’s a mental challenge as much as a technical one, and by that I mean there is more than reluctance, there’s a fundamental disdain for letting the data out of a factory, for even letting data into the factory,” he said.

Within fabs, there is a tug of war between equipment vendors which want to own the data and provide value-add services, and customers who argue that since they own the tools they own the data. The contentious debate grows more intense when vendors talk about taking data out of the fab. “That’s one of the challenges that the industry has to work on — the concerns around security and competitive information getting leaked out.” Developing a front-end process is “a multibillion dollar bet, and if that data leaks out it can be devastating to market-share leadership,” Plisinski said.

Early adopter stories

The challenge facing Rudolph and other companies is to convince their customers of the value of sharing data; that “the benefits will outweigh their concerns. Thus far, the proof of the benefit has been somewhat limited.”

“At least from a Rudolph perspective, we’ve had some early adopters that have seen some significant benefits. And I think as those stories get out there and as we start to highlight what some of these early adopters have seen, others at the executive level in these companies will start to question their teams about some of their assumptions and concerns. Eventually I think we’ll find a way forward. But right now that’s a significant challenge,” Plisinski said.

It is a classic chicken-and-egg problem, making it harder to get beyond theories to case-study benefits. “What helped us is that some of the early adopters had complete control of their entire value chain. They were fully integrated. And so we were able to get over the concerns about data sharing and focus on the technical challenges of transferring all that data and centralizing it in one place for analytical purposes. From there we got to see the benefits and document them in a way that we could share with others, while protecting IP.”

Aggregating data, buying databases and analytical software, building algorithms – all cost money, in most cases adding up to millions of dollars. But if yields improve by 0.25 or 0.5 percent, the payback comes in six to eight months, he said.
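
The payback arithmetic behind that claim is straightforward. A sketch with assumed figures (the investment and revenue numbers below are illustrative, not from Rudolph):

```python
def payback_months(investment, monthly_wafer_revenue, yield_gain):
    """Months to recoup an analytics investment from a small yield
    improvement. All inputs are assumed, illustrative values."""
    monthly_benefit = monthly_wafer_revenue * yield_gain
    return investment / monthly_benefit

# Assumed: $3M total cost, $100M/month wafer revenue, 0.5% yield improvement
months = payback_months(3_000_000, 100_000_000, 0.005)  # 6.0 months
```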

“It’s a very conservative industry, an applied science type of industry. Trying to prove the value of software — a kind of black magic exercise — has always been difficult. But as the industry’s problems have become so complex, it is requiring these sophisticated software solutions.”

“We will have examples of successful case studies in our booth during SEMICON West. Anyone wanting further information is invited to stop by and talk to our experts,” adds Plisinski.

By integrating the design of antenna and electronics, researchers have boosted the energy and spectrum efficiency for a new class of millimeter wave transmitters, allowing improved modulation and reduced generation of waste heat. The result could be longer talk time and higher data rates in millimeter wave wireless communication devices for future 5G applications.

The new co-design technique allows simultaneous optimization of the millimeter wave antennas and electronics. The hybrid devices use conventional materials and integrated circuit (IC) technology, meaning no changes would be required to manufacture and package them. The co-design scheme allows fabrication of multiple transmitters and receivers on the same IC chip or the same package, potentially enabling multiple-input-multiple-output (MIMO) systems as well as boosting data rates and link diversity.

Researchers from the Georgia Institute of Technology presented their proof-of-concept antenna-based outphasing transmitter on June 11 at the 2018 Radio Frequency Integrated Circuits Symposium (RFIC) in Philadelphia. Their other antenna-electronics co-design work was published at the 2017 and 2018 IEEE International Solid-State Circuits Conference (ISSCC) and multiple peer-reviewed IEEE journals. The Intel Corporation and U.S. Army Research Office sponsored the research.

Georgia Tech researchers are shown with electronics equipment and antenna setup used to measure far-field radiated output signal from millimeter wave transmitters. Shown are Graduate Research Assistant Huy Thong Nguyen, Graduate Research Assistant Sensen Li, and Assistant Professor Hua Wang. (Credit: Allison Carter, Georgia Tech)

“In this proof-of-concept, our electronics and antenna were designed so that they can work together to achieve a unique on-antenna outphasing active load modulation capability that significantly enhances the efficiency of the entire transmitter,” said Hua Wang, an assistant professor in Georgia Tech’s School of Electrical and Computer Engineering. “This system could replace many types of transmitters in wireless mobile devices, base stations and infrastructure links in data centers.”

Key to the new design is maintaining high energy efficiency regardless of whether the device is operating at its peak or average output power. The efficiency of most conventional transmitters is high only at the peak power but drops substantially at low power levels, resulting in low efficiency when amplifying complex, spectrally efficient modulations. Moreover, conventional transmitters often add the outputs from multiple electronics using lossy power combiner circuits, exacerbating the efficiency degradation.

“We are combining the output power through a dual-feed loop antenna, and by doing so with our innovation in the antenna and electronics, we can substantially improve the energy efficiency,” said Wang, who is the Demetrius T. Paris Professor in the School of Electrical and Computer Engineering. “The innovation in this particular design is to merge the antenna and electronics to achieve the so-called outphasing operation that dynamically modulates and optimizes the output voltages and currents of power transistors, so that the millimeter wave transmitter maintains a high energy efficiency both at the peak and average power.”
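
Outphasing itself is a classic idea: any amplitude- and phase-modulated signal can be split into two constant-envelope components that sum back to the original, letting each branch run through a saturated, high-efficiency amplifier. A minimal baseband sketch of that decomposition (not the Georgia Tech implementation, which performs the combining on the antenna itself):

```python
import cmath
import math

def outphase(sample: complex):
    """Decompose a modulated baseband sample (|sample| <= 2) into two
    unit-envelope components whose sum reconstructs it:
        s = exp(j*(phi + theta)) + exp(j*(phi - theta)),
        theta = acos(|s| / 2)
    Each constant-envelope branch can then be amplified efficiently."""
    amp, phi = abs(sample), cmath.phase(sample)
    theta = math.acos(amp / 2)
    s1 = cmath.exp(1j * (phi + theta))
    s2 = cmath.exp(1j * (phi - theta))
    return s1, s2

s = 1.2 * cmath.exp(0.4j)   # an arbitrary amplitude/phase-modulated sample
s1, s2 = outphase(s)        # two constant-envelope components
```

The identity holds because the sum is 2·cos(theta)·exp(j·phi) = |s|·exp(j·phi).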

Beyond energy efficiency, the co-design also improves spectrum efficiency by allowing more complex modulation protocols. That will enable transmission of higher data rates within fixed spectrum allocations – a significant challenge for 5G systems.

“Within the same channel bandwidth, the proposed transmitter can transmit six to ten times higher data rate,” Wang said. “Integrating the antenna gives us more degrees of freedom to explore design innovation, something that could not be done before.”

Sensen Li, a Georgia Tech graduate research assistant who received the Best Student Paper Award at the 2018 RFIC symposium, said the innovation resulted from bringing together two disciplines that have traditionally worked separately.

“We are merging the technologies of electronics and antennas, bringing these two disciplines together to break through limits,” he said. “These improvements could not be achieved by working on them independently. By taking advantage of this new co-design concept, we can further improve the performance of future wireless transmitters.”

The new designs have been implemented in 45-nanometer CMOS SOI IC devices and flip-chip packaged on high-frequency laminate boards, where testing has confirmed a minimum two-fold increase in energy efficiency, Wang said.

The antenna electronics co-design is enabled by exploring the unique nature of multi-feed antennas.

“An antenna structure with multiple feeds allows us to use multiple electronics to drive the antenna concurrently. Different from conventional single-feed antennas, multi-feed antennas can serve not only as radiating elements, but they can also function as signal processing units that interface among multiple electronic circuits,” Wang explained. “This opens a completely new design paradigm to have different electronic circuits driving the antenna collectively with different but optimized signal conditions, achieving unprecedented energy efficiency, spectral efficiency and reconfigurability.”

The cross-disciplinary co-design could also facilitate fabrication and operation of multiple transmitters and receivers on the same chip, allowing hundreds or even thousands of elements to work together as a whole system. “In massive MIMO systems, we need to have a lot of transmitters and receivers, so energy efficiency will become even more important,” Wang noted.

Having large numbers of elements working together becomes more practical at millimeter wave frequencies because the wavelength reduction means elements can be placed closer together to achieve compact systems, he pointed out. These factors could pave the way for new types of beamforming that are essential in future millimeter wave 5G systems.
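
The wavelength argument is easy to make concrete: array elements are typically pitched at half a wavelength, which shrinks quickly at millimeter wave frequencies. A quick calculation (28 and 60 GHz are common mmWave bands, used here only as examples):

```python
C = 299_792_458.0  # speed of light, m/s

def half_wavelength_spacing_mm(freq_ghz):
    """Typical array element pitch (lambda/2) at a given carrier frequency."""
    wavelength_m = C / (freq_ghz * 1e9)
    return wavelength_m / 2 * 1e3  # convert m to mm

spacing_28 = half_wavelength_spacing_mm(28)  # roughly 5.4 mm
spacing_60 = half_wavelength_spacing_mm(60)  # roughly 2.5 mm
```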

Power demands could drive adoption of the technology for battery-powered devices, but Wang says the technology could also be useful for grid-powered systems such as base stations or wireless connections to replace cables in large data centers. In those applications, expanding data rates and reducing cooling needs could make the new devices attractive.

“Higher energy efficiency also means less energy will be converted to heat that must be removed to satisfy the thermal management,” he said. “In large data centers, even a small reduction in thermal load per device can add up. We hope to simplify the thermal requirements of these electronic devices.”

In addition to those already mentioned, the research team included Taiyun Chi, Huy Thong Nguyen and Tzu-Yuan Huang, all from Georgia Tech.

There are limits to how accurately you can measure things. Think of an X-ray image: it is likely quite blurry and something only an expert physician can interpret properly. The contrast between different tissues is rather poor but could be improved by longer exposure times, higher intensity, or by taking several images and overlapping them. But there are considerable limitations: humans can safely be exposed to only so much radiation, and imaging takes time and resources.

A well-established rule of thumb is the so-called standard quantum limit: the precision of the measurement scales inversely with the square root of available resources. In other words, the more resources – time, radiation power, number of images, etc. – you throw in, the more accurate your measurement will be. This will, however, only get you so far: extreme precision also means using excessive resources.
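
The scaling rule can be made concrete by counting repetitions: at the standard quantum limit, tightening the precision tenfold costs a hundredfold in resources, whereas a Heisenberg-limited scheme pays only tenfold. A sketch (sigma here stands for the single-shot noise level, an assumed parameter):

```python
import math

def resources_needed(sigma, target_precision):
    """Repetitions needed to reach a target precision when precision
    scales as sigma/sqrt(N) (standard quantum limit, SQL) versus
    sigma/N (Heisenberg limit)."""
    n_sql = math.ceil((sigma / target_precision) ** 2)
    n_heisenberg = math.ceil(sigma / target_precision)
    return n_sql, n_heisenberg

# Reaching 1% of the single-shot noise: 10,000 shots at the SQL, 100 beyond it
n_sql, n_h = resources_needed(sigma=1.0, target_precision=0.01)
```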

A team of researchers from Aalto University, ETH Zurich, and MIPT and the Landau Institute in Moscow has pushed the envelope and come up with a way to measure magnetic fields using a quantum system – with accuracy beyond the standard quantum limit.

An artificial atom realised from superconducting strips of aluminum on a silicon chip can be employed for the detection of magnetic fields. Credit: Babi Brasileiro / Aalto University

The detection of magnetic fields is important in a variety of fields, from geological prospecting to imaging brain activity. The researchers believe that their work is a first step towards using quantum-enhanced methods for sensor technology.

‘We wanted to design a highly efficient but minimally invasive measurement technique. Imagine, for example, extremely sensitive samples: we have to either use as low intensities as possible to observe the samples or push the measurement time to a minimum,’ explains Sorin Paraoanu, leader of the Kvantti research group at Aalto University.

Their paper, published in the prestigious journal npj Quantum Information, shows how to improve the accuracy of magnetic field measurements by exploiting the coherence of a superconducting artificial atom – a qubit. It is a tiny device made of overlapping strips of aluminium evaporated on a silicon chip – a technology similar to the one used to fabricate the processors of mobile phones and computers.

When the device is cooled to a very low temperature, magic happens: the electrical current flows in it without any resistance and starts to display quantum mechanical properties similar to those of real atoms. When irradiated with a microwave pulse – not unlike the ones in household microwave ovens – the state of the artificial atom changes. It turns out that this change depends on the external magnetic field applied: measure the atom and you will figure out the magnetic field.

But to surpass the standard quantum limit, yet another trick had to be performed using a technique similar to a widely-applied branch of machine learning, pattern recognition.

‘We use an adaptive technique: first, we perform a measurement, and then, depending on the result, we let our pattern recognition algorithm decide how to change a control parameter in the next step in order to achieve the fastest estimation of the magnetic field,’ explains Andrey Lebedev, corresponding author from ETH Zurich, now at MIPT in Moscow.
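
The adaptive loop can be caricatured as Bayesian estimation on a grid: measure, update a posterior over candidate field values, then choose the next control setting. A deliberately simplified sketch (this is not the authors’ algorithm; the readout probability model is a generic Ramsey form, and each random outcome is replaced by its most likely value to keep the example deterministic):

```python
import math

def estimate_field(b_true, times=(1, 2, 4, 8), n_points=101):
    """Toy Ramsey-style field estimation with a Bayesian grid posterior.
    After evolving for time t, the qubit reads |1> with probability
    sin^2(B*t/2). Here each readout is replaced by its most likely
    outcome (a deterministic simplification of real sampling), and the
    posterior over candidate B values in [0, 1] is updated after each shot."""
    grid = [i / (n_points - 1) for i in range(n_points)]  # candidate B values
    posterior = [1.0] * n_points
    for t in times:
        p1_true = math.sin(b_true * t / 2) ** 2
        outcome = 1 if p1_true > 0.5 else 0               # most likely readout
        for i, b in enumerate(grid):                      # Bayesian update
            p1 = math.sin(b * t / 2) ** 2
            posterior[i] *= p1 if outcome == 1 else (1.0 - p1)
    return grid[max(range(n_points), key=posterior.__getitem__)]

estimate = estimate_field(b_true=0.7)  # converges near the true value
```

In the real experiment, the next pulse parameters are chosen adaptively from the posterior rather than from a fixed schedule, which is where the machine-learning flavor comes in.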

‘This is a nice example of quantum technology at work: by combining a quantum phenomenon with a measurement technique based on supervised machine learning, we can enhance the sensitivity of magnetic field detectors to a realm that clearly breaks the standard quantum limit,’ Lebedev says.

Leti, a research institute at CEA Tech, Transdev, a global provider of mobility services, and IRT Nanoelec, an R&D center focused on information and communication technologies (ICT) using micro- and nanoelectronics, today announced a pilot program to characterize and assess LiDAR sensors to improve performance and safety of autonomous vehicles.

Transdev’s latest transportation technologies already allow it to operate fleets of autonomous vehicles for shared mobility. Perceiving the environment through sensors is essential to offering the best client experience in terms of comfort and operating speed while guaranteeing the required level of safety and security. Evaluating sensor effectiveness and robustness is critical to developing Transdev’s Autonomous Transport System, which will allow fleets of autonomous vehicles to operate safely and securely in the widest possible range of environmental conditions.

In the pilot program, Leti teams will focus on perception requirements and challenges from a LiDAR system perspective and evaluate the sensors in real-world conditions. Vehicles will be exposed to objects with varying reflectivity, such as tires and street signs, as well as environmental conditions, such as weather, available light and fog. In addition to evaluating the sensors’ performance, the project will produce a list of criteria and objective parameters by which various commercial LiDAR systems could be evaluated.

“As an innovative supplier of autonomous transportation vehicles for smart cities, Transdev is leading the procession toward responsive, efficient and safe services with buses and shuttles,” said Leti CEO Emmanuel Sabonnadière. “This project will build on Leti’s sensor-fusion knowhow and sensor development expertise to strengthen Transdev’s testing and evaluation of sensors for its vehicles.”

Yann Leriche, Transdev’s CEO for North America, said: “Providing the best client experience with the guarantee of safety, security and quality of service will confirm Transdev as a pioneer in integrating autonomous transport systems into global mobility networks.”

As smart functionality makes its way into homes and businesses, two devices are gaining a foothold in broader ecosystems to maximize growth and revenue opportunities: smart speakers and smart meters. No longer simply intelligent appliances in the home, these devices are becoming key entry points into the massive Internet of Things (IoT) value chain. According to business information provider IHS Markit (Nasdaq: INFO), by the end of 2021, there will be an installed base of 328 million smart speakers and more than 1.13 billion smart electricity, water and gas meters.

“No matter the type of ‘smart’ device, device makers face the same challenge: keep costs down while increasing functionality,” said Paul Erickson, senior analyst for connected device research at IHS Markit. “The IoT is transformational for connected devices, and vendors large and small are vying to be part of the market. Many, like Google and Amazon, are selling their devices at or below margin because they understand the long-term opportunity lies in the applications and services these devices make possible.”

Smart speakers: growth, growth, growth ahead

Smart speakers enable voice-based media playback, smart home control, telephony, messaging, e-commerce and informational queries. They use a range of connectivity options and leverage artificial intelligence (AI) and cloud capabilities to tie together an ever-increasing range of IoT devices.

By 2021, smart speaker revenue is expected to reach $11.2 billion, up from $6.3 billion in 2018, IHS Markit says. “While many options are available to device makers to enter the home ecosystem, the cost and convenience advantages of smart speakers will ensure that demand remains strong for years to come,” Erickson said.

“The smart speaker concept is most powerful when it leverages large, established ecosystems where there is broad app and development support across devices and platforms,” Erickson said. “These ecosystems allow the speakers to access diverse information and e-commerce resources and to receive support from other smart home devices.”

Smart meters: bridging the gap between utilities and their customers

Basic utility meters only monitor power usage, limiting the ability of utility companies to interact with end consumers. Smart meters expand the capabilities of utility companies by providing more regular and informative data, allowing better usage analysis, time-of-use rates and subsidies, leakage warnings and more.
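As a hypothetical illustration of the point above, the interval data a smart meter provides is exactly what makes time-of-use rates possible, since a basic cumulative meter only reports a single usage total. The rates, hours and readings below are invented for the example:

```python
# Hypothetical time-of-use billing from smart-meter interval data.
# All rates and readings are invented for illustration.

PEAK_RATE = 0.30      # $/kWh, applied 17:00-20:59
OFF_PEAK_RATE = 0.12  # $/kWh, applied at all other hours

def time_of_use_bill(readings):
    """readings: list of (hour_of_day, kwh) interval measurements."""
    bill = 0.0
    for hour, kwh in readings:
        rate = PEAK_RATE if 17 <= hour <= 20 else OFF_PEAK_RATE
        bill += kwh * rate
    return round(bill, 2)

# One day of half-coarse interval readings: most usage falls in the
# evening peak window, so it is billed at the higher rate.
readings = [(8, 1.2), (13, 0.8), (18, 2.5), (19, 2.0), (23, 0.5)]
print(time_of_use_bill(readings))  # 2.5 kWh off-peak + 4.5 kWh peak -> 1.65
```

A basic meter would report only the 7.0 kWh total, which is why it supports nothing beyond flat-rate “meter to cash” billing.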

“Smart meters are revolutionizing the way utilities and consumers interact, enhancing capabilities beyond the ‘meter to cash’ process,” said David Green, research manager for smart utilities infrastructure at IHS Markit. “Smart meters will be an increasingly critical entry point into utility ecosystems aiming to create more intelligent, efficient and cleaner electricity networks.”

Like smart speakers, smart meters are anticipated to enjoy considerable growth in the years ahead. Over 188 million smart meters will be shipped in 2023, generating $9.5 billion in hardware revenues, IHS Markit says. In 2023, the installed base of smart electricity, water and gas meters will exceed 1.35 billion. “Smart meters form the backbone of the data collection system for utilities, paving the way for entirely new categories of value-added revenue,” Green said.

There are limits to how accurately you can measure things. Think of an X-ray image: it is likely quite blurry and something only an expert physician can interpret properly. The contrast between different tissues is rather poor but could be improved by longer exposure times, higher intensity, or by taking several images and overlapping them. But there are considerable limitations: humans can safely be exposed to only so much radiation, and imaging takes time and resources.

A well-established rule of thumb is the so-called standard quantum limit: the measurement uncertainty scales inversely with the square root of the available resources. In other words, the more resources – time, radiation power, number of images, etc. – you throw in, the more accurate your measurement will be. This will, however, only get you so far: extreme precision also means using excessive resources.
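The square-root scaling behind this rule can be seen with ordinary statistics. The sketch below (simulated values only, unrelated to the paper’s data) averages N noisy readings and shows the error shrinking only as 1/√N, so quadrupling the resources merely halves the error:

```python
# Classical illustration of the standard quantum limit's 1/sqrt(N) scaling:
# averaging N independent noisy measurements shrinks the statistical error
# only as 1/sqrt(N). Simulated values, not data from the paper.
import random
import statistics

random.seed(0)
TRUE_VALUE = 1.0  # the quantity being measured (arbitrary units)

def estimate_error(n_measurements, trials=2000):
    """Standard deviation of the mean of n noisy measurements."""
    estimates = [
        statistics.fmean(random.gauss(TRUE_VALUE, 1.0)
                         for _ in range(n_measurements))
        for _ in range(trials)
    ]
    return statistics.pstdev(estimates)

for n in (10, 40, 160):
    print(n, round(estimate_error(n), 3))
# Each 4x increase in measurements only halves the error (1/sqrt(N));
# quantum-enhanced schemes can do better, approaching 1/N.
```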

A team of researchers from Aalto University, ETH Zurich, and MIPT and the Landau Institute in Moscow has pushed the envelope and come up with a way to measure magnetic fields using a quantum system – with accuracy beyond the standard quantum limit.

The detection of magnetic fields is important in a variety of fields, from geological prospecting to imaging brain activity. The researchers believe that their work is a first step towards using quantum-enhanced methods for sensor technology.

‘We wanted to design a highly efficient but minimally invasive measurement technique. Imagine, for example, extremely sensitive samples: we have to either use as low intensities as possible to observe the samples or push the measurement time to a minimum,’ explains Sorin Paraoanu, leader of the Kvantti research group at Aalto University.

Their paper, published in the prestigious journal npj Quantum Information, shows how to improve the accuracy of magnetic field measurements by exploiting the coherence of a superconducting artificial atom, a qubit. It is a tiny device made of overlapping strips of aluminium evaporated on a silicon chip – a technology similar to the one used to fabricate the processors of mobile phones and computers.

When the device is cooled to a very low temperature, magic happens: the electrical current flows in it without any resistance and starts to display quantum mechanical properties similar to those of real atoms. When irradiated with a microwave pulse – not unlike the ones in household microwave ovens – the state of the artificial atom changes. It turns out that this change depends on the external magnetic field applied: measure the atom and you will figure out the magnetic field.

But to surpass the standard quantum limit, yet another trick had to be performed, using a technique similar to pattern recognition, a widely applied branch of machine learning.

‘We use an adaptive technique: first, we perform a measurement, and then, depending on the result, we let our pattern recognition algorithm decide how to change a control parameter in the next step in order to achieve the fastest estimation of the magnetic field,’ explains Andrey Lebedev, corresponding author from ETH Zurich, now at MIPT in Moscow.

‘This is a nice example of quantum technology at work: by combining a quantum phenomenon with a measurement technique based on supervised machine learning, we can enhance the sensitivity of magnetic field detectors to a realm that clearly breaks the standard quantum limit,’ Lebedev says.
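The measure-then-adapt loop Lebedev describes can be sketched in simplified form. The toy model below is not the authors’ actual algorithm: it assumes a Ramsey-type fringe p = (1 + cos(B·t))/2 for the qubit’s excitation probability and uses a simple Bayesian update to pick the next probe time, probing longer as the field estimate sharpens:

```python
# Toy sketch of an adaptive estimation loop -- NOT the authors' algorithm.
# The qubit's excitation probability is assumed to follow a Ramsey-type
# fringe p = (1 + cos(B * t)) / 2; after each simulated measurement, a
# Bayesian update over candidate fields chooses the next probe time t.
import math
import random

random.seed(1)
TRUE_B = 0.7  # the magnetic field to estimate (arbitrary units)

grid = [i / 1000 for i in range(1001)]        # candidate fields in [0, 1]
posterior = [1.0 / len(grid)] * len(grid)     # uniform prior

def prob_excited(b, t):
    return (1 + math.cos(b * t)) / 2

t = 1.0
for step in range(60):
    # simulate one projective measurement of the qubit at probe time t
    outcome = random.random() < prob_excited(TRUE_B, t)
    # Bayesian update of the posterior over candidate fields
    for i, b in enumerate(grid):
        p = prob_excited(b, t)
        posterior[i] *= p if outcome else (1 - p)
    norm = sum(posterior)
    posterior = [w / norm for w in posterior]
    # adaptive control: as the posterior narrows, probe for longer,
    # which makes each measurement more sensitive to small field changes
    mean = sum(b * w for b, w in zip(grid, posterior))
    var = sum((b - mean) ** 2 * w for b, w in zip(grid, posterior))
    t = min(100.0, 0.5 / max(math.sqrt(var), 1e-6))

print(round(mean, 2))  # posterior-mean estimate of the field
```

The key adaptive step is the last line of the loop: the outcome of each measurement reshapes the posterior, which in turn sets the control parameter (here, the probe time) for the next measurement.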

Alta Devices has today announced that its most recent single junction solar cell has been certified by NREL (National Renewable Energy Laboratory) as being 28.9% efficient. This certification confirms that Alta has set a new record and continues to hold the world record efficiency for this type of solar cell. This breakthrough, combined with the unique thinness and flexibility of Alta’s cells, redefines how solar technology can be used to empower autonomy in many applications.

“Alta Devices’ goal is to continue to lead the industry in solar technology and to enable a broad range of autonomous systems. We believe this is the best way to support the innovations of our customers,” said Jian Ding, Alta Devices CEO.

Autonomous systems are predicted to become a part of daily life – often operating without human intervention. However, every time an autonomous system or vehicle has to stop to refuel or recharge, it requires intervention and is no longer truly autonomous. Alta focuses on developing the world’s best solar technology specifically for autonomous power, allowing vehicles to seamlessly recharge while in motion.

Alta Devices has held continuous world records for solar efficiency for most of the last decade. Alta Devices’ founders, Professor Harry Atwater of Caltech and Professor Eli Yablonovitch of the University of California, Berkeley, explained the significance of this record:

Prof. Atwater said, “Achieving a new record for this class of devices is a landmark because a 1-sun, 1-junction cell is the archetypal solar cell. The fact that Alta is breaking its own record is also significant since many other teams have been actively attempting to break this record.”

Elaborating on the fundamental technical understanding that has driven this achievement, Professor Yablonovitch said, “Alta has the first solar cell based on Internal Luminescence Extraction, which has enabled Alta to remain ahead of others. This scientific principle will be in all future high efficiency solar cells.”

The company has recently launched its Gen4 AnyLight™ commercial technology, which demonstrates a significant weight reduction from the previous version and improves the power-to-weight ratio by 160 percent. This is critical for tomorrow’s autonomous UAVs (unmanned aerial vehicles), electric vehicles, and sensors. It can be used to generate substantial power over small surfaces without compromising design criteria.