Category Archives: MEMS

There are limits to how accurately you can measure things. Think of an X-ray image: it is likely quite blurry and something only an expert physician can interpret properly. The contrast between different tissues is rather poor but could be improved by longer exposure times, higher intensity, or by taking several images and overlapping them. But there are considerable limitations: humans can safely be exposed to only so much radiation, and imaging takes time and resources.

A well-established rule of thumb is the so-called standard quantum limit: the uncertainty of a measurement scales inversely with the square root of the available resources. In other words, the more resources – time, radiation power, number of images, and so on – you throw in, the more accurate your measurement will be. This will, however, only get you so far: extreme precision also means using excessive resources.
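
The 1/√N scaling behind the standard quantum limit is easy to reproduce numerically. The sketch below is plain Python with made-up Gaussian noise standing in for any measurement, not the quantum experiment itself: quadrupling the number of readings only halves the spread of the averaged estimate.

```python
import random
import statistics

def estimate_spread(n_readings, n_trials=2000, seed=1):
    """Spread (std. dev.) of the average of n_readings noisy measurements."""
    rng = random.Random(seed)
    averages = [
        statistics.fmean(rng.gauss(0.0, 1.0) for _ in range(n_readings))
        for _ in range(n_trials)
    ]
    return statistics.pstdev(averages)

# Classical averaging: error falls only as 1/sqrt(N), so 4x the
# resources buys just 2x the precision.
print(round(estimate_spread(25) / estimate_spread(100), 1))
```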

A team of researchers from Aalto University, ETH Zurich, and MIPT and the Landau Institute in Moscow has pushed the envelope and come up with a way to measure magnetic fields using a quantum system – with accuracy beyond the standard quantum limit.

An artificial atom realised from superconducting strips of aluminum on a silicon chip can be employed for the detection of magnetic fields. Credit: Babi Brasileiro / Aalto University

The detection of magnetic fields is important in a variety of fields, from geological prospecting to imaging brain activity. The researchers believe that their work is a first step toward using quantum-enhanced methods in sensor technology.

‘We wanted to design a highly efficient but minimally invasive measurement technique. Imagine, for example, extremely sensitive samples: we have to either use as low intensities as possible to observe the samples or push the measurement time to a minimum,’ explains Sorin Paraoanu, leader of the Kvantti research group at Aalto University.

Their paper, published in the prestigious journal npj Quantum Information, shows how to improve the accuracy of magnetic field measurements by exploiting the coherence of a superconducting artificial atom, a qubit. It is a tiny device made of overlapping strips of aluminium evaporated on a silicon chip – a technology similar to the one used to fabricate the processors of mobile phones and computers.

When the device is cooled to a very low temperature, magic happens: the electrical current flows in it without any resistance and starts to display quantum mechanical properties similar to those of real atoms. When irradiated with a microwave pulse – not unlike the ones in household microwave ovens – the state of the artificial atom changes. It turns out that this change depends on the external magnetic field applied: measure the atom and you will figure out the magnetic field.

But to surpass the standard quantum limit, yet another trick had to be performed using a technique similar to a widely-applied branch of machine learning, pattern recognition.

‘We use an adaptive technique: first, we perform a measurement, and then, depending on the result, we let our pattern recognition algorithm decide how to change a control parameter in the next step in order to achieve the fastest estimation of the magnetic field,’ explains Andrey Lebedev, corresponding author from ETH Zurich, now at MIPT in Moscow.
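
A minimal sketch of such a measure-then-adapt loop, assuming a toy qubit model: the sin² response, the grid prior and the schedule for lengthening the evolution time are all illustrative inventions, not the published protocol.

```python
import math
import random

def adaptive_field_estimate(true_field=0.37, shots=40, seed=7):
    """Grid-based Bayesian estimation of a field B in (0, 0.5).

    Hypothetical model: a shot with evolution time t reads out 1 with
    probability sin^2(pi * B * t). After each shot the posterior is
    updated, and the evolution time is lengthened so later shots probe
    ever-finer fringes -- a crude stand-in for the pattern-recognition
    controller described in the article.
    """
    rng = random.Random(seed)
    grid = [0.5 * (i + 0.5) / 200 for i in range(200)]  # candidate fields
    post = [1.0 / 200] * 200                            # flat prior
    t = 1.0
    for _ in range(shots):
        outcome = rng.random() < math.sin(math.pi * true_field * t) ** 2
        post = [w * (math.sin(math.pi * b * t) ** 2 if outcome
                     else math.cos(math.pi * b * t) ** 2)
                for w, b in zip(post, grid)]
        norm = sum(post)
        post = [w / norm for w in post]
        t = min(t * 1.15, 5.0)  # adapt: longer evolution as confidence grows
    return sum(w * b for w, b in zip(post, grid))  # posterior mean

print(round(adaptive_field_estimate(), 2))
```

The gain over plain averaging comes from the adaptation step: once the posterior narrows, longer evolution times make each shot more informative than a repeat of the first one.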

‘This is a nice example of quantum technology at work: by combining a quantum phenomenon with a measurement technique based on supervised machine learning, we can enhance the sensitivity of magnetic field detectors to a realm that clearly breaks the standard quantum limit,’ Lebedev says.

Leti, a research institute at CEA Tech, Transdev, a global provider of mobility services, and IRT Nanoelec, an R&D center focused on information and communication technologies (ICT) using micro- and nanoelectronics, today announced a pilot program to characterize and assess LiDAR sensors to improve performance and safety of autonomous vehicles.

Transdev’s latest innovative transportation technologies already allow it to operate fleets of autonomous vehicles for shared mobility. Perceiving the environment through sensors is essential to offering the best client experience in comfort and operating speed while guaranteeing the required level of safety and security. Evaluating sensor effectiveness and robustness is critical to developing Transdev’s Autonomous Transport System, which will allow fleets of autonomous vehicles to operate safely and securely in the widest possible range of environmental conditions.

In the pilot program, Leti teams will focus on perception requirements and challenges from a LiDAR system perspective and evaluate the sensors in real-world conditions. Vehicles will be exposed to objects with varying reflectivity, such as tires and street signs, as well as environmental conditions, such as weather, available light and fog. In addition to evaluating the sensors’ performance, the project will produce a list of criteria and objective parameters by which various commercial LiDAR systems could be evaluated.

“As an innovative supplier of autonomous transportation vehicles for smart cities, Transdev is leading the procession toward responsive, efficient and safe services with buses and shuttles,” said Leti CEO Emmanuel Sabonnadière. “This project will build on Leti’s sensor-fusion knowhow and sensor development expertise to strengthen Transdev’s testing and evaluation of sensors for its vehicles.”

Yann Leriche, Transdev’s CEO North America, said: “Providing the best client experience with the guarantee of safety, security and quality of service, will confirm Transdev as a pioneer in integrating autonomous transport systems into global mobility networks”.

As smart functionality makes its way into homes and businesses, two devices are gaining a foothold in broader ecosystems to maximize growth and revenue opportunities: smart speakers and smart meters. No longer simply intelligent appliances in the home, these devices are becoming key entry points into the massive Internet of Things (IoT) value chain. According to business information provider IHS Markit (Nasdaq: INFO), by the end of 2021, there will be an installed base of 328 million smart speakers and more than 1.13 billion smart electricity, water and gas meters.

“No matter the type of ‘smart’ device, device makers face the same challenge: keep costs down while increasing functionality,” said Paul Erickson, senior analyst for connected device research at IHS Markit. “The IoT is transformational for connected devices, and vendors large and small are vying to be part of the market. Many, like Google and Amazon, are selling their devices at or below margin because they understand the long-term opportunity lies in the applications and services these devices make possible.”

Smart speakers: growth, growth, growth ahead

Smart speakers, which enable voice-based media playback, smart home control, telephony, messaging, e-commerce and informational queries, use a range of connectivity options to leverage artificial intelligence (AI) and Cloud capabilities to enable an ever-increasing range of IoT devices.

By 2021, smart speaker revenue is expected to reach $11.2 billion, up from $6.3 billion in 2018, IHS Markit says. “While many options are available to device makers to enter the home ecosystem, the cost and convenience advantages of smart speakers will ensure that demand remains strong for years to come,” Erickson said.
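
For context, the forecast above implies roughly 21% compound annual growth over the three years:

```python
# Implied compound annual growth rate (CAGR) of the smart speaker
# revenue forecast: $6.3 B in 2018 to $11.2 B in 2021.
start, end, years = 6.3, 11.2, 3
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 21.1%
```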

“The smart speaker concept is most powerful when it leverages large, established ecosystems where there is broad app and development support across devices and platforms,” Erickson said. “These ecosystems allow the speakers to access diverse information and e-commerce resources and to receive support from other smart home devices.”

Smart meters: bridging the gap between utilities and their customers

Basic utility meters only monitor power usage, limiting the ability of utility companies to interact with end consumers. Smart meters expand the capabilities of utility companies by providing more regular and informative data, allowing better usage analysis, time-of-use rates and subsidies, leakage warnings and more.
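
A toy illustration of why finer-grained data matters for time-of-use rates; the tariff numbers and the load profile below are invented for the example, not real utility rates.

```python
PEAK_HOURS = range(17, 21)  # 5 pm - 9 pm, a hypothetical peak window

def daily_bill(hourly_kwh, flat=0.15, off_peak=0.10, peak=0.30):
    """Compare a flat tariff with a time-of-use tariff ($/kWh)."""
    flat_cost = flat * sum(hourly_kwh)
    tou_cost = sum(kwh * (peak if hour in PEAK_HOURS else off_peak)
                   for hour, kwh in enumerate(hourly_kwh))
    return flat_cost, tou_cost

# A household drawing 1 kWh per hour, spiking to 3 kWh each evening hour.
# Only a meter reporting hourly data can bill (or reward) this profile.
usage = [3.0 if h in PEAK_HOURS else 1.0 for h in range(24)]
flat_cost, tou_cost = daily_bill(usage)
print(flat_cost, tou_cost)
```

A basic meter sees only the 32 kWh monthly-style total; the hourly profile is what lets the utility price the evening peak differently.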

“Smart meters are revolutionizing the way utilities and consumers interact, enhancing capabilities beyond the ‘meter to cash’ process,” said David Green, research manager for smart utilities infrastructure at IHS Markit. “Smart meters will be an increasingly critical entry point into utility ecosystems aiming to create more intelligent, efficient and cleaner electricity networks.”

Like smart speakers, smart meters are anticipated to enjoy considerable growth in the years ahead. Over 188 million smart meters will be shipped in 2023, generating $9.5 billion in hardware revenues, IHS Markit says. In 2023, the installed base of smart electricity, water and gas meters will exceed 1.35 billion. “Smart meters form the backbone of the data collection system for utilities, paving the way for entirely new categories of value-added revenue,” Green said.

By Paula Doe, SEMI

For medtech applications to flourish, sensors need a supporting infrastructure that translates the data they harvest into actionable insights, says Qualcomm Life director of business development Gene Dantsker, who will speak about the future of digital healthcare in the Medtech program at SEMICON West. “Rarely can one device give a complete diagnosis,” he notes. “What’s missing is the integration of all the sensor data into prescriptive information.”

The maturing medtech sector has developed to the point where sensors can now capture massive amounts of data, conveniently collected from people via mobile devices. The sector now has higher compute capacity to process the data, and improving software can produce actionable insight from the information. The next challenge is to seamlessly integrate these components into legacy medical systems without disrupting existing workflow. “Doctors and nurses don’t have time for disruptive technology – a new system has to be invisible and frictionless to use, with one or fewer buttons, no training and truly automatic Bluetooth-like pairing,” he says. “So device makers need to pack all system intelligence into the circuits and software.”

Getting actionable healthcare information from sensors requires integration into the existing medical infrastructure. Source: Qualcomm Life

One interesting example is United Healthcare’s use of the Qualcomm Life infrastructure to collect data from the fitness trackers of 350,000 patients. The insurance company then pays users $4 a day, or ~$1500 a year, for standing, walking six times a day and other behaviors that clinical evidence shows will both improve patient health and reduce healthcare costs. “It’s a perfect storm of motivations for all stakeholders,” he says.
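
The incentive arithmetic quoted above checks out:

```python
# Sanity check of the figures quoted above: $4 a day for a full year.
daily_reward = 4
annual_reward = daily_reward * 365
print(annual_reward)  # 1460, i.e. roughly the ~$1,500 cited
```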

Next hot MEMS topics: Piezoelectric devices, environmental sensors, near-zero power standby

With sensor technology continuing to evolve, look for coming innovations in MEMS in piezoelectric devices, environmental sensors and near zero-power standby devices, says Alissa Fitzgerald, Founder and Managing Member of A.M. Fitzgerald and Associates, who will provide an update on emerging sensor technologies in the MEMS program at SEMICON West.

Piezoelectric devices can potentially be more stable and perhaps even easier to ramp to volume than capacitive ones, with AlN devices for microphones and ultrasonic sensors finding quick success. Now the maturing infrastructure for lead zirconate titanate (PZT) is enabling the scaling of production of higher-performing piezo material, with thin-film deposition equipment from suppliers like Ulvac Technologies and Solmates and foundry processes at Silex and STMicroelectronics, she notes.

In academic research, where most new MEMS emerge, market interest is driving development of environmental sensors and zero-power standby devices. With demand for environmental monitoring growing, much work focuses on technologies that improve the sensitivity, selectivity and response time of gas and particulate sensors. Research and funding are also focusing on zero or near-zero power standby sensors, which use open circuits that draw no power until a physical stimulus such as vibration or heat wakes them up.

MEMS, however, likely won’t find as much of a market in autonomous vehicles as once thought. “While the automotive sensor market will need many optical sensors, MEMS players are competing with other optical and mechanical solutions,” says Fitzgerald. “And here the usual MEMS advantage of small size may not matter much, and the devices will have to meet the challenging automotive requirements for extreme ruggedness.”

By Paula Doe, SEMI

With artificial intelligence (AI) rapidly evolving, look for applications like voice recognition and image recognition to get more efficient, more affordable, and far more common in a variety of products over the next few years. This growth in applications will drive demand for new architectures that deliver the higher performance and lower power consumption required for widespread AI adoption.

“The challenge for AI at the edge is to optimize the whole system-on-a-chip architecture and its components, all the way to semiconductor technology IP blocks, to process complex AI workloads quickly and at low power,” says Qualcomm Technologies Senior Director of Engineering Evgeni Gousev, who will provide an update on the progress of AI at the edge in a Data and AI program at SEMICON West, July 10-12 in San Francisco.

Qualcomm Snapdragon 845 uses heterogeneous computing across the CPU, GPU, and DSP for power-efficient processing for constantly evolving AI models. Source: Qualcomm

A system approach that optimizes across hardware, software, and algorithms is necessary to deliver the ultra-low power – at a sub-1-milliwatt level, low enough to enable always-on machine vision processing – required for the usually energy-intensive AI computing. From the chip architecture perspective, processing AI workloads on the most appropriate engine, such as the CPU, GPU, or DSP with dedicated hardware acceleration, provides the best power efficiency – and the flexibility to deal with rapidly changing AI models and a growing diversity of applications.
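
The engine-selection idea can be sketched as a cost-table lookup. Everything here is a made-up illustration – the energy numbers and workload names are not Qualcomm figures – and real schedulers also weigh latency, thermals and engine availability; the point is only that routing each workload to its cheapest engine minimizes energy.

```python
# Hypothetical energy cost (millijoules per inference) of running each
# workload type on each engine.
COST_MJ = {
    "conv_net": {"cpu": 20.0, "gpu": 5.0, "dsp": 2.0},
    "rnn":      {"cpu": 10.0, "gpu": 6.0, "dsp": 4.0},
    "control":  {"cpu": 0.5,  "gpu": 3.0, "dsp": 2.5},
}

def pick_engine(workload):
    """Route the workload to its most power-efficient engine."""
    costs = COST_MJ[workload]
    return min(costs, key=costs.get)

print(pick_engine("conv_net"))  # dsp
print(pick_engine("control"))   # cpu
```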

“But we’re going to run out of brute force options, so the future opportunity is more innovations with new architectures, dedicated hardware, new algorithms, and new software.” – Evgeni Gousev, Qualcomm Technologies

“So far it’s been largely a brute force approach using conventional architectures and cloud-based infrastructure,” says Evgeni. “But we’re going to run out of brute force options, so future opportunities lie in developing innovative architectures, dedicated hardware, new algorithms, and new software. Innovation will be especially important for AI at the edge and applications requiring always-on functionality. Training is mostly in the cloud now, but in the near future it will start migrating to the device as the algorithms and hardware improve. AI at the edge will also remove some privacy concerns, an increasingly important issue for data collection and management.”

Practical AI applications at the edge where resources are constrained run the gamut, spanning smartphones, drones, autonomous vehicles, virtual reality, augmented reality and smart home solutions such as connected cameras. “More AI on the edge will create a huge opportunity for the whole ecosystem – chip designers, semiconductor and device manufacturers, applications developers, and data and service providers. And it’s going to make a significant impact on the way we work, live, and interact with the world around us,” Evgeni said.

Future generations of chips may need more disruptive systems-level change to handle high data volumes with low power

A next-generation solution for handling the massive proliferation of AI data could be a nanotechnology system, such as the collaborative N3XT (Nano-Engineered Computing Systems Technology) project, led by H.S. Philip Wong and Subhasish Mitra at Stanford. “Even with next-generation scaling of transistors and new memory chips, the bottlenecks in moving data in and out of memory for processing will remain,” says Mitra, another speaker in the SEMICON West program. “The true benefits of nanotechnology will only come from new architectures enabled by nanosystems. One thing we are certain of is that massively more capable and more energy-efficient systems will be necessary for almost any future application, so we will need to think about system-level improvements.”

Major improvements in handling high volumes of data with low energy use will require system-level changes, such as monolithic 3D integration of carbon nanotube transistors in the multi-campus N3XT chip research effort. Source: Stanford University

That means carbon nanotube transistors for logic, high density non-volatile MRAM and ReRAM for memory, fine-grained monolithic 3D for integration, new architectures for computation immersed in memory, and new materials for heat removal. “The N3XT approach is key for the 1000X energy efficiency needed,” says Mitra.

“One thing we are certain of is that massively more capable and more energy efficient systems will be necessary for almost any future application, so we will need to think about system-level improvements.” – Subhasish Mitra, Stanford University

Researchers have demonstrated improvements in all these areas, including multiple hardware nanosystem prototypes targeting AI applications. They have transferred multiple layers of as-grown carbon nanotubes to the target wafer to significantly improve CNT density, and they have developed a low-power TiN/HfOx/Pt ReRAM. The low process temperatures of both the CNT and ReRAM steps allow multiple vertical layers to be fabricated on top of one another for ultra-dense, fine-grained monolithic 3D integration.

Other speakers at the Data and AI TechXpot include Fram Akiki, VP Electronics, Siemens; Hariharan Ananthanarayanan, motion planning engineer, Osaro; and David Haynes, Sr. director, strategic marketing, Lam Research.  See SEMICONWest.org.

Optimum Semiconductor Technologies, Inc., a fabless semiconductor company providing highly-integrated Systems on Chips (SoCs) for China’s thriving electronics markets, announced the GP8300 SoC. The GP8300 dramatically reduces chip cost, area, and power consumption for image recognition and object detection in a broad range of products such as self-driving cars, autonomous vehicles, smart cameras and other IoT edge devices.

Created in 28nm technology, the GP8300 includes four 2GHz ‘Unity’ CPU cores from General Processor Technologies (GPT) interconnected with a cache coherent memory supporting Heterogeneous Systems Architecture (HSA) processing for a common programming framework. The GP8300 also integrates four of GPT’s new 2GHz Variable Length Vector DSP (VLVm1) cores for signal processing applications. Within the chip, the out-of-order CPUs execute control code while very long vectors process data. In addition to these generalized compute units, the chip also integrates two 1GHz AI accelerators from GPT.

“The GP8300 brings together several of GPT’s innovative IP cores with underlying embedded artificial intelligence (eAI) algorithms in a highly-integrated design targeting a wide range of exciting applications,” said Gary Nacer, President and COO of Optimum. “The new SoC is one of the first CNN accelerators in China, and it provides the right combination of high performance, low power consumption, and the cost efficiency that our customers need as they create innovative new products.”

Building on the success of OST’s innovative SB3500 multithreaded heterogeneous computing platform for low-power software defined radio (SDR), the GP8300 represents a new architecture that achieves deep integration of eAI, edge computing, and communications on a single chip. OST provides support for CaffeNet-based training and tools for automatic fixed-point conversion and compression for inference.

A team headed by the TUM physicists Alexander Holleitner and Reinhard Kienberger has succeeded for the first time in generating ultrashort electric pulses on a chip using metal antennas only a few nanometers in size, then running the signals a few millimeters above the surface and reading them in again a controlled manner.

Classical electronics allows frequencies up to around 100 gigahertz. Optoelectronics uses electromagnetic phenomena starting at 10 terahertz. This range in between is referred to as the terahertz gap, since components for signal generation, conversion and detection have been extremely difficult to implement.

The TUM physicists Alexander Holleitner and Reinhard Kienberger succeeded in generating electric pulses at frequencies up to 10 terahertz using tiny, so-called plasmonic antennas and running them over a chip. Antennas are called plasmonic if, because of their shape, they amplify the light intensity at the metal surfaces.

Asymmetric antennas

The shape of the antennas is important. They are asymmetrical: one side of the nanometer-sized metal structures is more pointed than the other. When a lens-focused laser pulse excites the antennas, they emit more electrons from their pointed side than from the opposite, flatter side. An electric current flows between the contacts – but only as long as the antennas are excited with the laser light.

“In photoemission, the light pulse causes electrons to be emitted from the metal into the vacuum,” explains Christoph Karnetzky, lead author of the Nature work. “All the lighting effects are stronger on the sharp side, including the photoemission that we use to generate a small amount of current.”

Ultrashort terahertz signals

The light pulses lasted only a few femtoseconds, and the electrical pulses in the antennas were correspondingly short. Technically, the structure is particularly interesting because the nano-antennas can be integrated into terahertz circuits just a few millimeters across.

In this way, a femtosecond laser pulse with a frequency of 200 terahertz could generate an ultra-short terahertz signal with a frequency of up to 10 terahertz in the circuits on the chip, according to Karnetzky.

The researchers used sapphire as the chip material because it cannot be stimulated optically and thus causes no interference. With an eye on future applications, they used lasers at the 1.5-micron wavelength deployed in traditional internet fiber-optic cables.
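
The wavelength and frequency figures above are mutually consistent, as a one-line check (f = c/λ) shows:

```python
# A 1.5-micron telecom laser corresponds to a ~200 THz optical carrier,
# which the antennas down-convert to sub-10 THz signals on the chip.
c = 299_792_458          # speed of light, m/s
wavelength_m = 1.5e-6    # 1.5 microns
f_thz = c / wavelength_m / 1e12
print(round(f_thz))  # 200
```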

An amazing discovery

Holleitner and his colleagues made yet another amazing discovery: Both the electrical and the terahertz pulses were non-linearly dependent on the excitation power of the laser used. This indicates that the photoemission in the antennas is triggered by the absorption of multiple photons per light pulse.
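
This diagnostic – current growing nonlinearly with laser power – is usually read off as the slope of a log-log plot. A sketch with synthetic data, assuming for illustration a three-photon process:

```python
import math

# If each emission event needs n photons, the current scales as I ~ P^n,
# so log(I) vs log(P) is a line of slope n. Synthetic three-photon data:
powers = [1.0, 2.0, 4.0, 8.0]          # laser powers (arbitrary units)
currents = [p ** 3 for p in powers]    # I proportional to P^3

xs = [math.log(p) for p in powers]
ys = [math.log(i) for i in currents]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(round(slope, 2))  # 3.0 -> a three-photon process
```

A slope of 1 would indicate ordinary single-photon photoemission; the nonlinear dependence the team observed points to multiphoton absorption.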

“Such fast, nonlinear on-chip pulses did not exist hitherto,” says Alexander Holleitner. Utilizing this effect he hopes to discover even faster tunnel emission effects in the antennas and to use them for chip applications.

Leti, a research institute of CEA Tech, today announced that field trials of its new Low Power Wide Area (LPWA) technology, a waveform tailored for Internet of Things (IoT) applications, showed significant performance gains in coverage, data-rate flexibility and power consumption compared to leading LPWA technologies.

Leti’s LPWA approach includes its patented Turbo-FSK waveform, a flexible approach to the physical layer. It also relies on channel bonding, the ability to aggregate non-contiguous communication channels to increase coverage and data rates. The field trials confirmed the benefits of Leti’s LPWA approach in comparison to LoRa™ and NB-IoT, two leading LPWA technologies that enable wide-area communications at low cost and long battery life.

The results indicate the new technology is especially suitable for long-range massive machine-type communication (mMTC) systems. These systems, in which tens of billions of machine-type terminals communicate wirelessly, are expected to proliferate after 5G networks are deployed, beginning in 2020. Cellular systems designed for humans do not adequately transmit the very short data packets that define mMTC systems.

Figure 1: Performance chart comparison

The field trials were designed to demonstrate the performance and flexibility of the new waveform, and the results stem primarily from the system’s flexible approach to the physical layer. That flexibility allows data rates to scale from 3 Mbit/s down to 4 kbit/s when transmission conditions are not particularly favorable and/or a long transmission range is required.

Under favorable transmission conditions, e.g. a shorter range and line of sight, the Leti system can select high data rates using widely deployed single-carrier frequency-division multiplexing (SC-FDM) physical layers to take advantage of the low power consumption of the transmission mode. Under more severe transmission conditions, the system switches to more resilient high-performance orthogonal frequency division multiplexing (OFDM). When both very long-range transmission and power efficiency are required, the system selects Turbo-FSK, which combines an orthogonal modulation with a parallel concatenation of convolutional codes and makes the waveform suitable for turbo processing. The selection is made automatically via a medium access control (MAC) approach optimized for IoT applications.
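
In pseudocode terms, the MAC’s choice reduces to a decision rule like the sketch below. The thresholds and inputs are invented for illustration; the actual MAC weighs more parameters (mobility, congestion, link quality).

```python
def select_waveform(range_km, line_of_sight, battery_critical):
    """Illustrative stand-in for the MAC-layer mode selection."""
    if battery_critical or range_km > 5.0:
        return "Turbo-FSK"   # longest range, best energy per bit
    if line_of_sight and range_km < 1.0:
        return "SC-FDM"      # favorable link: high rate, low power
    return "OFDM"            # severe conditions but still high rate

print(select_waveform(0.5, True, False))    # SC-FDM
print(select_waveform(3.0, False, False))   # OFDM
print(select_waveform(12.0, False, False))  # Turbo-FSK
```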

“Leti’s Turbo-FSK receiver performs close to the Shannon limit, which is the maximum rate that data can be transmitted over a given noisy channel without error, and is geared for low spectral efficiency,” said Vincent Berg, head of Leti’s Smart Object Communication Laboratory. “Moreover, the waveform exhibits a constant envelope, i.e. it has a peak-to-average-power ratio (PAPR) equal to 0dB, which is especially beneficial for power consumption. Turbo-FSK is therefore well adapted to future LPWA systems, especially in 5G cellular systems.”
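
Both quantities in the quote can be made concrete with a few lines; the bandwidth and SNR below are arbitrary example values, not Leti’s figures.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Maximum error-free rate over an AWGN channel, in bits/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

def papr_db(samples):
    """Peak-to-average power ratio of a sampled waveform, in dB."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

# A constant-envelope, FSK-like tone: every sample has magnitude 1,
# so its PAPR is 0 dB -- the property credited to Turbo-FSK above.
tone = [complex(math.cos(0.3 * k), math.sin(0.3 * k)) for k in range(64)]
print(round(abs(papr_db(tone)), 6))           # 0.0
print(round(shannon_capacity(10_000, 1.0)))   # 10000 bit/s at 0 dB SNR
```

A 0 dB PAPR matters because the power amplifier never has to back off for peaks, which is where much of the transmit-energy saving comes from.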

In the new system, the MAC layer exploits the advantages of the different waveforms and is designed to self-adapt to context, i.e. the usage scenario and application. It optimally selects the most appropriate configuration according to the application requirements, such as device mobility, high data rate, energy efficiency or when the network becomes crowded, and is coupled with a decision module that adapts the communication depending on the radio environment. The optimization of the application transmission requirements is realized by the dynamic adaptation of the MAC protocol, and the decision module controls link quality.

IC Insights recently released its Update to its 2018 IC Market Drivers Report.  The Update includes IC Insights’ latest outlooks on the smartphone, automotive, PC/tablet and Internet of Things (IoT) markets.

The Update shows a final 2017 ranking of the top smartphone leaders in terms of unit shipments.  As shown in Figure 1, 9 of the top 12 smartphone suppliers were headquartered in China.  Two South Korean companies (Samsung and LG) and one U.S. supplier (Apple) were the other leaders.

Figure 1

Samsung and Apple dominated the smartphone market from 2015 through 2017.  In total, these two companies shipped 526 million smartphones and held a combined 35% share of the total smartphone market in 2016. Moreover, these two companies shipped over one-half billion smartphones (533 million) in 2017 with their combined smartphone unit marketshare increasing one point to 36%.

Samsung’s total smartphone unit sales were up by 2% in 2017 to 317 million units, slightly outpacing the total smartphone market, which grew by 1%.  Meanwhile, shipments of Apple iPhones fell 7% in 2016, much worse than the 4% growth rate of the worldwide smartphone market.  However, Apple rebounded somewhat in 2017, with its total smartphone unit shipments flat for the year.

It appears that up-and-coming Chinese producers like Huawei, OPPO, Vivo, and Xiaomi are mounting a serious challenge to Samsung and Apple for smartphone marketshare.  It should be noted, however, that Samsung and Apple still hold a commanding share of the high-end smartphone segment—that is, smartphones priced above $200.

The number four and five ranked smartphone suppliers on the list are owned by the same China-based parent company—BBK Electronics.  Combined handset unit shipments from these two companies were 213.1 million in 2017, just 2.7 million less than second-ranked Apple.

Overall, there was very little middle ground in smartphone shipment growth rates among the top 12 suppliers in 2017.  As shown, four of the top 12 companies registered double-digit unit growth, while the other eight logged growth of 2% or less, four of them posting double-digit declines.  Three Chinese smartphone suppliers (Xiaomi, OPPO, and Vivo) saw their shipments surge at least 24% in 2017.  Xiaomi displayed the highest growth rate of any of the top-12 smartphone suppliers (73%). Meanwhile, another three Chinese suppliers (LeEco/Coolpad, ZTE, and TCL) saw their smartphone shipments fall by more than 20% last year.

Combined, the nine leading smartphone suppliers based in China shipped 626 million smartphones in 2017, an 11% increase from 565 million smartphones that these nine companies shipped in 2016. The top nine Chinese smartphone suppliers together held a 42% share of the worldwide smartphone market in 2017, up four points from the 38% share these companies held in 2016 and eight points better than the 34% combined share these companies held in 2015.
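
The shipment figures above are internally consistent, as a quick check shows:

```python
# Chinese suppliers' combined shipments (million units) and market share.
shipped_2017, shipped_2016 = 626, 565
growth = (shipped_2017 - shipped_2016) / shipped_2016
print(f"{growth:.0%}")  # 11%

# Totals implied by the 42% (2017) and 38% (2016) share figures:
print(round(shipped_2017 / 0.42), round(shipped_2016 / 0.38))
# -> both near 1.49 billion units, consistent with the ~1.5 billion
#    unit market size projected for 2018
```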

IC Insights projects smartphone shipments in 2018 will rise 2%, to 1.53 billion units.  Moreover, smartphone unit shipments are forecast to grow at low single-digit annual rates through 2021.