Category Archives: Applications

Leti, a research institute of CEA Tech, will demonstrate at CES 2018 its new wristband that measures physical indicators of a range of conditions, including sleep apnea, dehydration and dialysis-treatment response. 

APNEAband provides accurate, real-time detection of sleep-apnea events caused by pauses in breathing or shallow breaths during sleep. The wristband measures heart rate, variation in the time interval between heartbeats, oxygen saturation levels in the blood and stress level. The combination of these four indicators helps physicians make a complete and reliable medical diagnosis of sleep apnea.
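How such signals might be combined is easiest to see with a toy example. The sketch below is a hypothetical screening rule, not Leti's algorithm: it flags candidate apnea events when blood-oxygen saturation dips below its recent baseline while the heart rate rebounds. The thresholds, sampling rate and signal names are assumptions chosen purely for illustration.

```python
# Hypothetical sketch of multi-signal sleep-apnea screening (not Leti's algorithm).
# Assumes per-second samples of SpO2 (%) and heart rate (bpm) from a wrist sensor.

def flag_apnea_candidates(spo2, heart_rate, desat_drop=3.0, window=30):
    """Flag time indices where SpO2 falls at least desat_drop (%) below the best
    value seen in the previous `window` seconds while heart rate rebounds,
    a pattern often associated with obstructive apnea events."""
    events = []
    for t in range(window, len(spo2)):
        baseline = max(spo2[t - window:t])               # recent best saturation
        desaturated = baseline - spo2[t] >= desat_drop
        hr_rebound = heart_rate[t] > 1.1 * min(heart_rate[t - window:t])
        if desaturated and hr_rebound:
            events.append(t)
    return events

# Example with toy data: a brief desaturation with a heart-rate rebound around t = 35 s.
spo2 = [97.0] * 35 + [93.0] * 10 + [97.0] * 15
hr = [60.0] * 35 + [70.0] * 10 + [60.0] * 15
print(flag_apnea_candidates(spo2, hr))
```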

“This small wristband eliminates the need to spend the night in a medical lab hooked up to sensors and equipment that measure these key indicators,” said Alexandre Thermet, Leti healthcare industrial partnership manager in the U.S. “APNEAband brings a safe, easy-to-use, affordable and non-invasive solution to detect sleep apnea at home.”

Working with Prof. Jean-Louis Pepin and his team at Grenoble Alpes University and INSERM, based in the Physiology Laboratory of the Grenoble CHU hospital, Leti designed, developed and validated an advanced software technology that efficiently extracts and screens health parameters relevant to sleep apnea. Prof. Pepin, a principal clinical-trial investigator at Grenoble CHU Hospital, and his team provided medical guidelines to support this sleep-apnea project.

APNEAband’s embedded technology can be applied to detect and track various other health conditions, such as acute mountain sickness, dehydration, dialysis-treatment response, chronic pain, epileptic seizures, phobia and panic disorder. The wristband’s cardiac-coherence biofeedback also helps people who want to achieve total relaxation with simple breathing exercises. Possible applications also include detecting work-related stress, hot flashes, or stress while playing video games.

A new technique developed by researchers at Technische Universität München, Forschungszentrum Jülich, and RWTH Aachen University, published in Elsevier’s Materials Today, provides a unique insight into how the charging rate of lithium ion batteries can be a factor limiting their lifetime and safety.

State-of-the-art lithium ion batteries are powering a revolution in clean transport and high-end consumer electronics, but there is still plenty of scope for improving charging time. Currently, reducing charging time by increasing the charging current compromises battery lifetime and safety.

“The rate at which lithium ions can be reversibly intercalated into the graphite anode, just before lithium plating sets in, limits the charging current,” explains Johannes Wandt, PhD, of Technische Universität München (TUM).

Lithium ion batteries consist of a positively charged transition metal oxide cathode and a negatively charged graphite anode in a liquid electrolyte. During charging, lithium ions move from the cathode (deintercalate) to the anode (intercalate). However, if the charging rate is too high, lithium ions deposit as a metallic layer on the surface of the anode rather than inserting themselves into the graphite. “This undesired lithium plating side reaction causes rapid cell degradation and poses a safety hazard,” Dr. Wandt added.

Dr. Wandt and Dr. Hubert A. Gasteiger, Chair of Technical Electrochemistry at TUM, along with colleagues from Forschungszentrum Jülich and RWTH Aachen University, set out to develop a new tool to detect the actual amount of lithium plating on a graphite anode in real time. The result is a technique the researchers call operando electron paramagnetic resonance (EPR).

“The easiest way to observe lithium metal plating is by opening a cell at the end of its lifetime and checking visually by eye or microscope,” said Dr. Wandt. “There are also nondestructive electrochemical techniques that give information on whether lithium plating has occurred during battery charging.”

Neither approach, however, provides much if any information about the onset of lithium metal plating or the amount of lithium metal present during charging. EPR, by contrast, detects the magnetic moment associated with unpaired conduction electrons in metallic lithium with very high sensitivity and time resolution on the order of a few minutes or even seconds.
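The practical value of that time resolution is that the onset of plating can be read directly from when the metallic-lithium signal begins to grow during a charge. The sketch below illustrates the idea generically, assuming an EPR intensity already calibrated to be proportional to the mass of plated lithium; the calibration factor, threshold and toy data are assumptions for illustration, not values from the study.

```python
# Generic illustration of onset detection from a time-resolved signal that is
# assumed proportional to the amount of plated metallic lithium (hypothetical
# calibration; not the authors' actual data processing).

def plating_onset(times_min, epr_intensity, sensitivity_per_ug=1.0, threshold_ug=0.5):
    """Return (onset_time, plated_mass_ug) at the first sample where the
    inferred plated-lithium mass exceeds threshold_ug, else (None, 0.0)."""
    for t, signal in zip(times_min, epr_intensity):
        mass_ug = signal / sensitivity_per_ug      # convert intensity to mass
        if mass_ug >= threshold_ug:
            return t, mass_ug
    return None, 0.0

# Toy charge: no plating for the first 20 minutes, then a growing metallic signal.
times = list(range(0, 40, 2))                      # one point every 2 minutes
signal = [0.0] * 10 + [0.2 * k for k in range(1, 11)]
print(plating_onset(times, signal))                # -> (24, 0.6) with these toy numbers
```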

“In its present form, this technique is mainly limited to laboratory-scale cells, but there are a number of possible applications,” explains Dr. Josef Granwehr of Forschungzentrum Jülich and RWTH Aachen University. “So far, the development of advanced fast charging procedures has been based mainly on simulations but an analytical technique to experimentally validate these results has been missing. The technique will also be very interesting for testing battery materials and their influence on lithium metal plating. In particular, electrolyte additives that could suppress or reduce lithium metal plating.”

Dr. Wandt highlights that fast charging of electric vehicles could be a key application to benefit from this work.

Until now, there has been no analytical technique available that can directly determine the maximum charging rate, which is a function of the state of charge, temperature, electrode geometry, and other factors, before lithium metal plating starts. The new technique could provide a much-needed experimental validation of frequently used computational models, as well as a means of investigating the effect of new battery materials and additives on lithium metal plating.

The researchers are now working with other collaborators to benchmark their experimental results against numerical simulations of the plating process in simple model systems.

“Our goal is to develop a toolset that facilitates a practical understanding of lithium metal plating for different battery designs and cycling protocols,” explains Dr. Rüdiger-A. Eichel of Forschungzentrum Jülich and RWTH Aachen University.

Accelerometers and gyroscopes are fueling the robotic revolution, especially the drone market segment. However, these MEMS devices are no longer the only ones in the marketplace, with environmental sensors penetrating this industry too.

InvenSense, now part of TDK, has combined them: the US-based company, an IMU leader and for many years an Apple supplier, last month released the world’s first 7-axis motion-tracking device combining an accelerometer, a gyroscope and a pressure sensor. InvenSense’s ICM-20789 7-axis combo sensor is aimed mainly at drones and flying toys, as well as smartwatches, wearables, activity monitoring, and floor and stair counting.

The reverse-costing company System Plus Consulting has investigated the 7-axis component and the technologies selected by InvenSense. The aim of this analysis was to identify the technologies chosen by the leading company and to understand their impact on manufacturing costs.

What are the technical choices made by InvenSense? What are the benefits for the device in terms of performance? What is the impact on the manufacturing process flow?

System Plus Consulting’s team today offers a comprehensive technology and cost analysis, including a detailed comparison with InvenSense’s previous generation of combo sensors.

The consumer drone market segment confirms its attractiveness with a 23% CAGR between 2016 and 2021. According to Yole Développement, sister company of System Plus Consulting, the market should reach almost US$3.4 billion in 2023 (source: “Sensors for Drones and Robots: Market Opportunities and Technology Revolution” report, Yole Développement, 2016). In this dynamic context, System Plus Consulting’s experts are following the technical advances and the evolution of the manufacturing costs of combo devices. InvenSense’s device is a good example of this technology breakthrough: for the first time, a company presents a 7-axis component combining an accelerometer, a gyroscope and a barometric pressure sensor in the same package. The innovation, the reverse engineering & costing company comments, clearly lies not in the selection of the components but in the smart combination of the three devices in a single package.

Stéphane Elisabeth, RF and Advanced Packaging Cost Engineer at System Plus Consulting, explains: “Using single-package integration, the US company merged a 6-axis inertial sensor already identified in the iPhone 6 with a barometric pressure sensor based on a design from Sensirion’s barometric division. InvenSense thus took advantage of the partial acquisition of Sensirion, which took place in 2016, developing a specific approach that eliminates a package and minimizes board-area requirements.”

InvenSense was able to integrate its own barometric pressure sensor thanks to the expertise gained through the acquisition of Sensirion’s barometric division. The device is shipped in a 4 mm x 4 mm x 1.37 mm land grid array (LGA) package.

InvenSense acquired from Sensirion Holding AG and its affiliates the pressure-sensor business used to develop its capacitive-type monolithic digital pressure-sensor technology platform.

InvenSense’s financial report highlights the details of this acquisition: the purchase price associated with the acquisition was approximately US$9.8 million, of which US$5.7 million was allocated to developed technology with an estimated useful life of six years and US$4.1 million was allocated to goodwill.

Faced with this simple but impressive technical innovation, how will other MEMS and sensor manufacturers respond? Will this combination of an IMU with a barometric pressure sensor be followed by competitors? The selling prices of IMUs have fallen in recent years, and adding new functions is a way to maintain a profitable ASP.

Researchers at Aalto University, Finland, have developed a biosensor that enables the creation of a range of new, easy-to-use health tests similar to home pregnancy tests. The plasmonic biosensor can detect diseased exosomes even with the naked eye. Exosomes, important indicators of health conditions, are cell-derived vesicles present in blood and urine.

Rapid analysis with such biosensors helps recognize inflammatory bowel disease, cancer and other diseases quickly and start relevant treatments in time. In addition to biomedicine, the discovery may find advanced applications in the energy industry.

Researchers created a new biosensor by depositing plasmonic metaparticles on a black, physical body that absorbs all incident electromagnetic radiation. A plasmon is a quantum of plasma oscillation. Plasmonic materials have been used for making objects invisible in scientific tests. They efficiently reflect and absorb light. Plasmonic materials are based on the effective polarizabilities of metallic nanostructures.

The carriers containing Ag nanoparticles are covered with various dielectrics of AlN, SiO2 and the composites thereof that are placed on a black background to enhance the reflectivity contrast of various colours at a normal angle of incidence. Credit: Aalto University

“It is extraordinary that we can detect diseased exosomes by the naked eye. The conventional plasmonic biosensors are able to detect analytes solely at a molecular level. So far, the naked-eye detection of biosamples has been either rarely considered or unsuccessful”, says Professor Mady Elbahri from Aalto University.

Plasmonic dipoles are famous for their strong scattering and absorption. Dr. Shahin Homaeigohar and Moheb Abdealziz from Aalto University explain that the research group has succeeded in demonstrating the as-yet unknown specular reflection and the Brewster effect of ultrafine plasmonic dipoles on a black body host.

“We exploited it as the basis of new design rules to differentiate diseased human serum exosomes from healthy ones in a simple manner, with no need for any specialized equipment”, says Dr. Abdou Elsharawy from the University of Kiel.

The novel approach enables a simple and cost-effective design of a perfect colored absorber and creation of vivid interference plasmonic colors.

According to Elbahri, there is no need to use sophisticated fabrication and patterning methods. The approach enables naked-eye environmental and bulk biodetection of samples with a change in molecular polarizability as small as 0.001%.

Think keeping your coffee warm is important? Try satellites. If a satellite’s temperature is not maintained within its optimal range, its performance can suffer, which could make it harder to track wildfires or other natural disasters, your Google Maps might not work, and your Netflix binge might be interrupted. This might be prevented with a new material recently developed by engineers at the USC Viterbi School of Engineering.

When satellites travel behind the Earth, the Earth can block the sun’s rays from reaching the satellites—cooling them down. In space, a satellite can face extreme temperature variation of as much as 190 to 260 degrees Fahrenheit. It’s long been a challenge for engineers to keep satellite temperatures from fluctuating wildly. Satellites have conventionally used one of two mechanisms: physical “shutters” or heat pipes to regulate heat. Both solutions can deplete on-board power reserves. Even with solar power, the output is limited. Furthermore, both solutions add mass, weight and design complexity to satellites, which are already quite expensive to launch.

Taking cues from humans who have a self-contained system to manage internal temperature through homeostasis, a team of researchers including Michelle L. Povinelli, a Professor in the Ming Hsieh Department of Electrical Engineering at the USC Viterbi School of Engineering, and USC Viterbi students Shao-Hua Wu and Mingkun Chen, along with Michael T. Barako, Vladan Jankovic, Philip W.C. Hon and Luke A. Sweatlock of Northrop Grumman, developed a new material to self-regulate the temperature of the satellite. The team of engineers with expertise in optics, photonics, and thermal engineering developed a hybrid structure of silicon and vanadium dioxide with a conical design to better control the radiation from the body of the satellite. It’s like a textured skin or coating.

Vanadium dioxide functions as what is known as a “phase-change” material. It acts in two distinct ways: as an insulator at low temperatures and a conductor at high temperatures, which affects how it radiates heat. Above 134 degrees Fahrenheit (330 kelvin), it radiates as much heat as possible to cool the satellite down. At about two degrees below this, the material shuts off the heat radiation to warm the satellite up. The material’s conical structure (almost like a prickly skin) is invisible to the human eye, at less than about half the thickness of a single human hair, but it has a distinct purpose: helping the satellite switch its radiation on and off very effectively.
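In effect the coating behaves like a passive thermostat: emissivity is high above the transition temperature and low below it. The toy radiative-balance model below illustrates how such switching damps temperature swings through sun and eclipse; the emissivity values, absorbed power, heat capacity and orbit timing are illustrative assumptions, not properties of the USC/Northrop Grumman structure.

```python
# Toy radiative-balance model of a passive "phase-change" thermal switch.
# Emissivity jumps from low to high near the VO2 transition (~330 K); all
# numbers (area, heat capacity, absorbed power, orbit) are assumed for illustration.

SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_SWITCH = 330.0          # transition temperature, K (about 134 F)
EPS_HOT, EPS_COLD = 0.8, 0.1

def emissivity(temp_k):
    """High emissivity (dump heat) when hot, low emissivity (retain heat) when cold."""
    return EPS_HOT if temp_k >= T_SWITCH else EPS_COLD

def simulate(hours=6.0, dt_s=10.0, area_m2=1.0, heat_capacity_j_per_k=5e4):
    temp = 300.0
    history = []
    steps = int(hours * 3600 / dt_s)
    for step in range(steps):
        in_sun = (step * dt_s) % 5400 < 3600        # 60 min of sun, 30 min of eclipse
        absorbed_w = 800.0 if in_sun else 0.0       # assumed absorbed solar power
        radiated_w = emissivity(temp) * SIGMA * area_m2 * temp ** 4
        temp += (absorbed_w - radiated_w) * dt_s / heat_capacity_j_per_k
        history.append(temp)
    return min(history), max(history)

print("temperature range over the run (K):", simulate())
```

With the switching emissivity the simulated temperature settles into a narrow band around the transition point, whereas a fixed emissivity would let it drift much further in either direction.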

Results

The hybrid material developed by USC and Northrop Grumman is twenty times better at maintaining temperature than silicon alone. Importantly, passively regulating heat and temperature of satellites could increase the life span of the satellites by reducing the need to expend on-board power.

Applications on Earth

Besides use on a satellite, the material could also be used on Earth for thermal management. It could be applied to a building over a large area to more efficiently maintain a building’s temperature.

The study, “Thermal homeostasis using microstructured phase-change materials,” is published in Optica. The research was funded by Northrop Grumman and the National Science Foundation. This development is part of a thematic research effort between Northrop Grumman, NG Next Basic Research and USC known as the Northrop Grumman Institute of Optical Nanomaterials and Nanophotonics (NG-ION2).

The researchers are now working on developing the material in the USC microfabrication facility and will likely benefit from the new capabilities in the recently-dedicated John D. O’Brien Nanofabrication Laboratory in the USC Michelson Center for Convergent Bioscience.

A team of University of Alberta engineers developed a new way to produce electrical power that can charge handheld devices or sensors that monitor anything from pipelines to medical implants. The discovery sets a new world standard in devices called triboelectric nanogenerators by producing a high-density DC current–a vast improvement over low-quality AC currents produced by other research teams.

Jun Liu, a PhD student working under the supervision of chemical engineering professor Thomas Thundat, was conducting research unrelated to these tiny generators, using a device called an atomic force microscope. It provides images at the atomic level using a tiny cantilever to “feel” an object, the same way you might learn about an object by running a finger over it. Liu forgot to press a button that would apply electricity to the sample–but he still saw a current coming from the material.

“I didn’t know why I was seeing a current,” he recalled.

One theory was that it was an anomaly or a technical problem, or interference. But Liu wanted to get to the bottom of it. He eventually pinned the cause on the friction of the microscope’s probe on the material. It’s like shuffling across a carpet then touching someone and giving them a shock.

It turns out that the mechanical energy of the microscope’s cantilever moving across a surface can generate a flow of electricity. But instead of releasing all the energy in one burst, the U of A team generated a steady current.

“Many other researchers are trying to generate power at the prototype stages but their performances are limited by the current density they’re getting–that is the problem we solved,” said Liu.

“This is big,” said Thundat. “So far, what other teams have been able to do is to generate very high voltages, but not the current. What Jun has discovered is a new way to get continuous flow of high current.”

The discovery means that nanoscale generators have the potential to harvest power for electrical devices based on nanoscale movement and vibration: an engine, traffic on a roadway–even a heartbeat. It could lead to technology with applications in everything from sensors that monitor the structural health of bridges or pipelines to devices that track engine performance or power wearable electronics.

Liu said the applications are limited only by imagination.

IBM’s Khare on A.I.


December 7, 2017

BY PETE SINGER, Editor-in-Chief

Mukesh Khare, VP of IBM Research, talked about the impact artificial intelligence (AI) is going to have on the semiconductor industry during a recent panel session hosted by Applied Materials. He said that today most artificial intelligence is too complex: it requires training, building models and then doing inferencing using those models. “The reason there is so much good in artificial intelligence is because of the exponential increase in data, and cheap compute. But, keep in mind that, the compute that we are using right now is the old compute. That compute was built to do spreadsheets, databases, the traditional compute.

“Since that compute is cheap and available, we are making use of it. Even with the cheap and available compute in cloud, it takes months to generate those models. So right now, most of the training is still being done in cloud. Whereas, inferencing, making use from that model is done at the edge. However, going forward, it is not possible because the devices at the edge are continuously generating so much data that you cannot send all the data back to the cloud, generate models, and come back on the edge.

“Eventually, a lot of training needs to move to the edge as well,” Khare said. This will require some innovation so that the compute, which is being done right now in cloud, can be transferred over to edge with low-power devices, cheap devices. Applied Materials’ CIO Jay Kerley added that innovation has to happen not only at the edge, but in the data center and at the network layer, as well as in the software frameworks. “Not only the AI frameworks, but what’s driving compression, de-duplication at the storage layer is absolutely critical as well,” he said.

Khare also weighed in on how transistors and memory will need to evolve to meet the demands of new AI computer architectures. “For artificial intelligence in our world, we have to think very differently. This is an inflection, but this is the kind of inflection that world has not seen for last 60 years.” He said the world has gone from the tabulating system era (1900 to 1940) to the programmable system era of the 1950s, which we are still using. “We are entering the era of what we call cognitive computing, which we believe started in 2011, when IBM first demonstrated artificial intelligence through our Watson System, which played Jeopardy,” he said.

Khare said, “We are still using the technology of programmable systems, such as logic, memory, the traditional way of thinking, and applying it to AI, because that’s the best we’ve got.”

AI needs more innovation at all levels, Khare said. “You have to think about systems level optimization, chip design level optimization, device level optimization, and eventually materials level optimization,” he said. “The artificial intelligence workloads that are coming out are very different. They do not require the traditional way of thinking — they require the way the brain thinks. These are the brain-inspired systems that will start to evolve.”

Khare believes analog compute might hold the answer. “Analog compute is where compute started many, many years ago. It was never adopted because the precision was not high enough, so there were a lot of errors. But the brain doesn’t think in 32 bits, our brain thinks analog, right? So we have to bring those technologies to the forefront,” he said. “In research at IBM we can see that there could be several orders of magnitude reduction in power, or improvement in efficiency that’s possible by introducing some of those concepts, which are more brain inspired.”

Christos Georgiopoulos (former Intel VP and professor, who was also on the panel) said a new compute model is required for A.I. “It’s important to understand that the traditional workloads that we all knew and loved for the last forty years don’t apply with A.I. They are completely new workloads that require very different types of capabilities from the machines that you build,” he said. “With these new kinds of workloads, you’re going to require not only new architectures, you’re going to require new system level design. And you’re going to require new capabilities like frameworks.” He said TensorFlow, which is an open-source software library for machine intelligence originally developed by researchers and engineers working on the Google Brain Team, seems to be the biggest framework right now. “Google made it public for only one very good reason. The TPU that they have created runs TensorFlow better than any other hardware around. Well, guess what? If you write something on TensorFlow, you want to go to the Google backend to run it, because you know you’re going to get great results. These kinds of architectures are getting created right now, and we’re going to see a lot more of them,” he said.

Integrated circuit sales for automotive systems and the Internet of Things are forecast to grow 70% faster than total IC revenues between 2016 and 2021, according to IC Insights’ new 2018 Integrated Circuit Market Drivers Report.  ICs used in automobiles and other vehicles are forecast to generate worldwide sales of $42.9 billion in 2021 compared to $22.9 billion in 2016, while integrated circuit revenues for Internet of Things (IoT) functionality in a wide range of systems, sensors, and objects are expected to reach $34.2 billion in four years compared to $18.4 billion last year, says the new 358-page report.

Between 2016 and 2021, automotive and IoT IC sales are projected to rise by compound annual growth rates (CAGRs) of 13.4% and 13.2%, respectively, compared to 7.9% for the entire IC market, which is projected to reach $434.5 billion in four years versus $297.7 billion last year.  As shown in Figure 1, strong five-year IC sales growth rates are also expected in medical electronics (a CAGR of 9.7% to $7.8 billion in 2021) and wearable systems (a CAGR of 9.0% to $4.9 billion).
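These growth rates follow directly from the endpoint revenues via the usual compound-annual-growth formula, CAGR = (end/start)^(1/years) - 1. A quick arithmetic check of the figures quoted above:

```python
# Quick arithmetic check of the report's CAGR figures over 2016-2021 (5 years),
# using the revenue endpoints quoted in the text (US$ billions).
def cagr(start_billion, end_billion, years=5):
    return (end_billion / start_billion) ** (1 / years) - 1

for name, start, end in [
    ("Automotive ICs", 22.9, 42.9),
    ("IoT ICs", 18.4, 34.2),
    ("Total IC market", 297.7, 434.5),
]:
    print(f"{name}: {cagr(start, end):.1%}")
```

Running this reproduces the report's 13.4%, 13.2% and 7.9% figures for automotive ICs, IoT ICs and the total IC market, respectively.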

Figure 1

Cellphone IC sales—the biggest end-use market application for integrated circuits, accounting for about 25% of the IC market’s total revenues—are expected to grow by a CAGR of 7.8% in the 2016-2021 period, reaching $105.6 billion in the final year of the new report’s forecast. Meanwhile, weak and negative IC sales growth rates are expected to continue in video game consoles (a CAGR of -1.9% to $9.7 billion in 2021) and tablet computers (a CAGR of -2.3% to $10.7 billion), according to the 2018 IC Market Drivers report.

Sharply higher average selling prices (ASPs) for DRAMs and NAND flash are playing a significant role in driving up dollar-sales volumes for ICs in cellphones and PCs (both desktop and notebook computers) in 2017. Cellphone IC sales are on pace to surge 24% this year to an estimated $89.7 billion, while PC integrated circuit dollar volume is expected to climb 17.6% to $69.0 billion. For both the cellphone and PC market segments, 2017 will mark the strongest increase in IC sales since the 2010 recovery from the 2009 downturn. The 2018 IC Market Drivers report’s forecast shows cellphone integrated circuit sales rising 8% to $97.3 billion next year and PC IC revenues growing 5% to $72.6 billion in 2018.

The new report estimates that automotive IC sales will rise 22% in 2017 to about $28.0 billion after increasing 11% in 2016. Automotive IC sales are forecast to increase 16% in 2018 to $32.4 billion. Meanwhile, IoT-related integrated circuit sales are on pace to grow 14% in 2017 to an estimated $14.5 billion after increasing about 18% in 2016.  In 2018, integrated circuit sales for Internet of Things end-use applications are expected to rise 16% to about $16.8 billion, according to the 2018 edition of the IC Market Drivers report.

Quantenna Communications, Inc. (Nasdaq:QTNA), a developer of high performance Wi-Fi solutions, today announced that Dr. Nambi Seshadri, Quantenna’s chief technologist, has been selected as the 2018 IEEE Alexander Graham Bell Medal recipient for exceptional contributions to wireless, networking and engineering. The prize accompanying this honor consists of a gold medal, a bronze replica, a certificate, and an honorarium.

“The innovations by Nambi form the basis for some of today’s Wi-Fi and other wireless networking standards and systems, now in use by billions of Wi-Fi users,” said Dr. Sam Heidari, Chairman and Chief Executive Officer, Quantenna. “We are honored to have such a distinguished and accomplished chief technologist on our team. The process is extraordinarily competitive; this is a great lifetime accomplishment and one of the most prestigious honors that one may receive in our field.”

Every year, the IEEE Board of Directors selects a single individual to receive the IEEE Alexander Graham Bell Medal. The selection criteria include the value of the individual’s contribution to communication among people as well as to communication sciences and engineering, and an evaluation of the contributor, nominator and references. The timeliness of the recognition and the quality of the nomination are also considered.

The IEEE Alexander Graham Bell Medal was established in 1976 by the IEEE Board of Directors, in commemoration of the centennial of the telephone’s invention, to provide recognition for outstanding contributions to telecommunications. The invention of the telephone by Alexander Graham Bell in 1876 was a major event in electrotechnology. It was instrumental in stimulating the broad telecommunications industry that has dramatically improved life throughout the world. As an individual, Bell himself exemplified the contributions that scientists and engineers have made to the betterment of mankind.

In addition to serving as chief technologist at Quantenna, Seshadri is a Professor of Electrical and Computer Engineering (ECE) at the University of California, San Diego. Prior to Quantenna, Seshadri held multiple senior positions at Broadcom Corporation, where he helped guide Broadcom’s wireless initiatives, including its foray into cellular, mobile multimedia, low-power wireless connectivity, GPS and others. From 2011 to 2014, he also served as General Manager of the Mobile Platforms Business Unit. Prior to joining Broadcom Corporation, he was a Member of Technical Staff at AT&T Bell Laboratories and Head of Communications Research at AT&T Shannon Labs, where he contributed to fundamental advances in wireless communication theory and practice.

Seshadri was elected a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) in 2000, was elected to the National Academy of Engineering (USA) in 2012, and became a Foreign Member of the Indian National Academy of Engineering in 2013. He holds approximately 200 patents. He was a co-recipient of the IEEE Information Theory Paper Award in 1999 for his paper with Tarokh and Calderbank on space-time codes, and his IEEE Journal on Selected Areas in Communications (JSAC) paper on space-time coding modems with Naguib, Tarokh, and Calderbank was selected by the IEEE Communications Society for publication in “The Best of the Best: Fifty Years of Communications and Networking Research” in 2003.

The ConFab 2018, to be held at The Cosmopolitan of Las Vegas on May 21-23, is thrilled to announce its newest opening-day keynote speaker, Professor John M. Martinis. John is a research scientist who heads up Google’s Quantum AI Lab and holds the Worster Chair of Experimental Physics at the University of California, Santa Barbara. The lab is particularly interested in applying quantum computing to artificial intelligence and machine learning, and as one of Google’s quantum computing gurus, John has shared the company’s “stretch goal”: to build and test a 49-qubit (“quantum bit”) quantum computer by the end of this year. The test will be a milestone in quantum computing technology.

The conference team is also very excited to have IBM Distinguished Engineer Rama Divakaruni, who is responsible for IBM’s advanced process technology research, present his keynote address, “How Artificial Intelligence is Driving the ‘New’ Semiconductor Era.” Both keynotes, set for May 21, promise to be outstanding presentations.

Additional outstanding speakers at The ConFab 2018 include:

  • Dan Armbrust, CEO and Co-founder of Silicon Catalyst, will present “Enabling a Startup Ecosystem for Semiconductors,” describing the current environment for semiconductor startups.
  • George Gomba, GLOBALFOUNDRIES VP of Technology Research, will discuss the EUV lithography project with SUNY Polytechnic Institute, now finding its way into advanced semiconductor manufacturing.
  • John Hu, Director of Advanced Technology for Nvidia, heads up R&D of advanced IC process technologies and programs, design for manufacturing, test chips, and new technology/IC products.
  • Tom Sonderman, President of SkyWater Technology Foundry, will focus on smart manufacturing ecosystems based on big-data platforms, predictive analytics and IoT.
  • Kou Kuo Suu of ULVAC Japan will delve into manufacturing various types of non-volatile memory (NVM) chips, including phase-change memory (PCRAM).

More industry experts will be added to the conference program and announced soon. Further event details are available at: www.theconfab.com.