The future of MEMS and sensors: Beyond human senses

By Dr. Eric Mounier

2017 was a good year for the MEMS and sensors business, and that upward trend should continue. We forecast continued strong growth for the sensors and actuators market, reaching more than $100 billion in 2023 for a total of 185 billion units. Optical sensors, especially CMOS image sensors, will have the lion’s share with almost 40 percent of market value. MEMS will also play an important role in that growth: during 2018–2023, the MEMS market will experience 17.5 percent growth in value and 26.7 percent growth in units, with the consumer market accounting for more than 50 percent share overall.

Evolution of sensors

Sensors were first developed and used for physical sensing: shock, pressure, then acceleration and rotation. Greater investment in R&D spurred MEMS’ expansion from physical sensing to light management (e.g., micromirrors) and then to uncooled infrared sensing (e.g., microbolometers). From sensing light to sensing sound, MEMS microphones formed the next wave of MEMS development. MEMS and sensors are entering a new and exciting phase of evolution as they transcend human perception, progressing toward ultrasonic, infrared and hyperspectral sensing.

Sensors can help us to compensate when our physical or emotional sensing is limited in some way. Higher-performance MEMS microphones are already helping the hearing-impaired. Researchers at Arizona State University are among those developing cochlear implants — featuring piezoelectric MEMS sensors — which may one day restore hearing to those with significant hearing loss.

The visually impaired may take heart in knowing that researchers at Stanford University are collaborating on silicon retinal implants; Pixium Vision began human clinical trials of its silicon retinal implants in 2017.

It’s not science fiction to think that we will use future generations of sensors for emotion/empathy sensing. Augmenting our reality, such sensing could have many uses, perhaps even aiding the ability of people on the autism spectrum to more easily interpret the emotions of others.

Through my years in the MEMS industry, I have identified three distinct eras in MEMS’ evolution:

  1. The “detection era” in the very first years, when we used simple sensors to detect a shock.
  2. The “measuring era,” when sensors could not only sense and detect but also measure (e.g., a rotation).
  3. The “global-perception awareness era,” when we increasingly use sensors to map the environment. We conduct 3D imaging with Lidar for autonomous vehicles. We monitor air quality using environmental sensors. We recognize gestures using accelerometers and/or ultrasonics. We implement biometry with fingerprint and facial recognition sensors. This is possible thanks to sensor fusion of multiple parameters, together with artificial intelligence (a minimal fusion sketch follows this list).
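
As an illustration of what sensor fusion of multiple parameters can look like at the lowest level, here is a minimal sketch that blends accelerometer and gyroscope readings with a complementary filter to estimate device tilt, the kind of fusion that underpins gesture and motion recognition. The function name, the 0.98 weighting and the sample values are illustrative assumptions, not taken from any particular product.

```python
import math

def complementary_filter(pitch_prev, accel, gyro_rate_x, dt, alpha=0.98):
    """Fuse accelerometer and gyroscope data into one pitch estimate (degrees).

    accel: (ax, ay, az) in g; gyro_rate_x: angular rate around X in deg/s.
    alpha near 1 trusts the gyro short-term and the accelerometer long-term.
    """
    ax, ay, az = accel
    # Pitch implied by the gravity vector: noisy but drift-free.
    pitch_accel = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    # Pitch implied by integrating the gyro: smooth but drifts over time.
    pitch_gyro = pitch_prev + gyro_rate_x * dt
    # Complementary filter: blend the two estimates.
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel

# One fusion step with made-up 100 Hz samples.
pitch = complementary_filter(0.0, accel=(0.05, 0.0, 0.99), gyro_rate_x=1.2, dt=0.01)
print(f"Estimated pitch: {pitch:.2f} deg")
```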

Numerous technological breakthroughs are responsible for this steady stream of advancements: new sensor design, new processes and materials, new integration approaches, new packaging, sensor fusion, and new detection principles.

Global Awareness Sensing

The era of global awareness sensing is upon us. We can view global awareness either as an extension of human sensing capabilities (e.g., adding infrared imaging to visible imaging) or as a beyond-human sensing capability (e.g., machines with superior environmental perception, such as Lidar in a robotic vehicle). Think about Professor X in Marvel’s universe, and you can imagine how human perception could evolve in the future!

Some companies envisioned global awareness from the start. Movea (now part of TDK InvenSense), for example, began its development with inertial MEMS. Others implemented global awareness by combining optical sensors such as Lidar and night-vision sensors for robotic cars. A third contingent grouped environmental sensors (gas, particle, pressure, temperature) to check air quality. The newest entrant in this group, the particle sensor, could play an especially important role in air-quality sensing, particularly in wearable devices.

Air pollution has become a major societal concern, driven by mounting evidence of global air-quality deterioration. Studies show that there is no safe level of particulates: for every increase in the concentration of inhalable PM10 or PM2.5 particles in the air, the lung cancer rate rises proportionately. Combining a particle sensor with a mapping application in a wearable could allow us to identify the most polluted urban zones.
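
To make that pairing concrete, here is a minimal sketch of how a wearable could tag particulate readings with location to build a personal pollution map. The read_pm25() and read_gps() helpers, the sample coordinates and the alert threshold are hypothetical stand-ins for whatever sensor driver and positioning service a real device would expose.

```python
import random
import time

def read_pm25():
    """Stand-in for a wearable particle-sensor driver: PM2.5 in µg/m³."""
    return random.uniform(5.0, 80.0)

def read_gps():
    """Stand-in for the device's positioning service: (latitude, longitude)."""
    return (45.19 + random.uniform(-0.01, 0.01),
            5.72 + random.uniform(-0.01, 0.01))

def log_air_quality(samples=5, interval_s=1.0, alert_threshold=25.0):
    """Pair each particulate reading with a location so polluted zones can be mapped.

    The alert threshold is illustrative; a real application would set its own policy.
    """
    pollution_map = []
    for _ in range(samples):
        pm25 = read_pm25()
        lat, lon = read_gps()
        pollution_map.append({"lat": lat, "lon": lon, "pm2_5": pm25})
        if pm25 > alert_threshold:
            print(f"High PM2.5 ({pm25:.1f} µg/m³) near ({lat:.4f}, {lon:.4f})")
        time.sleep(interval_s)
    return pollution_map

if __name__ == "__main__":
    print(log_air_quality(samples=3, interval_s=0.1))
```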

The Need for Artificial Intelligence

To realize global awareness, we also need artificial intelligence (AI), but first we have challenges to solve. Activity tracking, for example, requires accurate, live classification of sensor data. Relegating all AI processing to a main processor, however, would consume significant CPU resources, reducing available processing power. Likewise, storing all AI data on the device would push up storage costs. To marry AI with MEMS, we must do the following (a rough sketch of this split follows the list):

  1. Decouple feature processing from the classification engine, offloading classification to a more powerful external processor.
  2. Reduce storage and processing demands by deploying only the features required for accurate activity recognition.
  3. Install low-power MEMS sensors that can incorporate data from multiple sensors (sensor fusion) and enable pre-processing for always-on execution.
  4. Retrain the model with system-supported data that can accurately identify the user’s activities.

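As a rough sketch of items 1 to 3, the code below keeps only lightweight, always-on feature extraction close to the MEMS sensor and hands a small feature vector to a classification step that would run on a more powerful external processor. The window length, the feature set and the nearest-centroid classifier are illustrative assumptions, not a description of any particular product.

```python
import math

def extract_features(accel_window):
    """On-sensor pre-processing: reduce a window of (ax, ay, az) samples in g
    to a small feature vector, so only these few values leave the sensor."""
    mags = [math.sqrt(ax**2 + ay**2 + az**2) for ax, ay, az in accel_window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return [mean, var, max(mags)]

def classify(features, centroids):
    """Classification engine, decoupled from feature extraction; in a real
    system this step would run on the host application processor."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Illustrative centroids that a retraining step (item 4) might produce.
ACTIVITY_CENTROIDS = {
    "still":   [1.00, 0.0005, 1.02],
    "walking": [1.05, 0.0400, 1.60],
    "running": [1.20, 0.2500, 2.50],
}

# Example: a short window of made-up accelerometer samples.
window = [(0.02, 0.01, 1.00), (0.03, 0.00, 1.01), (0.01, 0.02, 0.99)]
features = extract_features(window)            # stays on the low-power sensor side
print(classify(features, ACTIVITY_CENTROIDS))  # runs on the external processor
```
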
There are two ways to add AI and software in mobile and automotive applications. The first is a centralized approach, where sensor data is processed in the application processor unit (APU) that contains the software. The second is a decentralized approach, where the sensor chip is located in the same package, close to the software and the AI (in the DSP of a CMOS image sensor, for example). Whatever the approach, MEMS and sensors manufacturers need to understand AI, although they are unlikely to gain much value at the sensor-chip level.

Heading to an Augmented World

We have achieved massive progress in sensor development over the years and are now reaching the point where sensors can mimic or augment most of our perception: vision, hearing, touch, smell and even emotion/empathy, as well as some aesthetic senses. We should realize that humans are not the only ones to benefit from these developments. Enhanced perception will also allow robots to help us in our daily lives (through smart transportation, better medical care, contextually aware environments and more). We need to couple smart sensors’ development with AI to further enhance our experiences with the people, places and things in our lives.

About the author

With almost 20 years’ experience in MEMS, sensors and photonics applications, markets, and technology analyses, Dr. Eric Mounier provides in-depth industry insight into current and future trends. As a Principal Analyst, Technology & Markets, MEMS & Photonics, in the Photonics, Sensing & Display Division, he contributes daily to the development of MEMS and photonics activities at Yole Développement (Yole). He is involved with a large collection of market and technology reports, as well as multiple custom consulting projects: business strategy, identification of investment or acquisition targets, due diligence (buy/sell side), market and technology analyses, cost modeling, technology scouting, and more.

Previously, Mounier held R&D and marketing positions at CEA Leti (France). He has spoken at numerous international conferences and has authored or co-authored more than 100 papers. Mounier has a semiconductor engineering degree and a PhD in optoelectronics from the National Polytechnic Institute of Grenoble (France).

Mounier is a featured speaker at SEMI-MSIG European MEMS & Sensors Summit, September 20, 2018 in Grenoble, France.

Originally published on the SEMI blog.
