Category Archives: Device Architecture

ArterisIP, the supplier of silicon-proven commercial system-on-chip (SoC) interconnect IP, today announced it has joined the FDXcelerator Partner Program. The program enables SoC designers to integrate ArterisIP interconnect IP into their projects while accelerating timing closure for FDX-based designs. The partnership speeds the development of pioneering products in applications from automotive ADAS and machine learning to small IoT processors.

ArterisIP offerings participating in the FDXcelerator program include:

  • The Ncore Cache Coherent Interconnect IP with Ncore Resilience Package, which has been chosen by the industry’s leading automotive ADAS, autonomous driving, and machine learning SoC vendors for its power, performance, and area advantages and ISO 26262 functional safety features.
  • The FlexNoC Interconnect IP with FlexNoC Resilience Package, which is the backbone interconnect for most mobility and consumer electronics SoC designs where power consumption, performance, and cost are key design metrics.
  • The PIANO Timing Closure Package, which assists back-end timing closure with technology that works earlier in the SoC design flow, thereby reducing schedule risk.

“The addition of ArterisIP to the FDXcelerator Partnership Program has already realized benefits with the implementation of an FD-SOI automotive ADAS multi-processor SoC with fellow FDXcelerator partner Dream Chip Technologies,” said Alain Mutricy, senior vice president of product management at GF. “ArterisIP’s commitment to GF’s FDX technology enables a scalable on-chip interconnect IP technology that will help our customers meet stringent automotive safety requirements.”

“GF’s FDXcelerator program plays an important role for ArterisIP, enabling us to gain access to FD-SOI technology process and design information to enable improved automation of our interconnect timing closure assistance technology,” said K. Charles Janac, President and CEO of ArterisIP. “Interconnect timing closure assistance is becoming imperative as technologies like FD-SOI shrink feature sizes and allow ever-increasing transistor and wire densities.”

With the prospect of large 450mm wafers going nowhere, IC manufacturers are increasing efforts to maximize fabrication plants using 300mm and 200mm diameter silicon substrates. The number of 300mm wafer production-class fabs in operation worldwide is expected to increase each year between now and 2021, reaching 123, compared with 98 in 2016, according to the forecast in IC Insights’ Global Wafer Capacity 2017-2021 report.

As shown in Figure 1, 300mm wafers represented 63.6% of worldwide IC fab capacity at the end of 2016 and are projected to reach 71.2% by the end of 2021, which translates into a compound annual growth rate (CAGR) of 8.1% in terms of silicon area for processing by plant equipment in the five-year period.

Figure 1. Installed IC wafer capacity by wafer size (Source: IC Insights)
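
Growth figures like these reduce to the standard CAGR formula. A quick sketch using the fab-count numbers quoted above (the report’s 8.1% figure applies to silicon area, whose underlying capacity numbers are not reproduced here):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate: (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

# Fab count from the IC Insights forecast: 98 fabs (2016) -> 123 fabs (2021).
print(f"300mm fab-count CAGR: {cagr(98, 123, 5):.1%}")  # ~4.6% per year
```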

The report’s count of 98 production-class 300mm fabs in use worldwide at the end of 2016 excludes numerous R&D front-end lines and a few high-volume 300mm plants that make non-IC semiconductors (such as power transistors). Eight 300mm wafer fabs have opened or are scheduled to open in 2017, the highest one-year total since 2014, when seven were added, says the Global Wafer Capacity report. Another nine are scheduled to open in 2018. Virtually all of these new fabs will be for DRAM, flash memory, or foundry capacity, according to the report.

Even though 300mm wafers are now the majority wafer size in use, both in terms of total surface area and in actual quantity of wafers, there is still much life remaining in 200mm fabs, the capacity report concludes.  IC production capacity on 200mm wafers is expected to increase every year through 2021, growing at a CAGR of 1.1% in terms of total available silicon area. However, the share of the IC industry’s monthly wafer capacity represented by 200mm wafers is forecast to drop from 28.4% in 2016 to 22.8% in 2021.

IC Insights believes there is still much life left in 200mm fabs because not all semiconductor devices are able to take advantage of the cost savings 300mm wafers can provide.  Fabs running 200mm wafers will continue to be profitable for many more years for the fabrication of numerous types of ICs, such as specialty memories, display drivers, microcontrollers, and RF and analog products.  In addition, 200mm fabs are also used for manufacturing MEMS-based “non-IC” products such as accelerometers, pressure sensors, and actuators, including acoustic-wave RF filtering devices and micro-mirror chips for digital projectors and displays, as well as power discrete semiconductors and some high-brightness LEDs.

By Zvi Or-Bach, President & CEO, MonolithIC 3D Inc.

Next week, as part of the IEEE S3S 2017 program, we will present a paper (18.3) titled “A 1,000x Improvement in Computer Systems by Bridging the Processor Memory Gap”. The paper details a monolithic 3D technology that is low-cost and ready to be rapidly deployed using current transistor processes. In that talk, we will also describe how such an integration technology could be used to improve performance and reduce the power and cost of most computer systems, suggesting a 1,000x total system benefit. This game-changing technology will also be presented at the CoolCube open workshop, a free satellite event of the conference’s 3DI program.

In an interesting coincidence, DARPA has just come out with a call for a >50x improvement in SoC performance.

The 3DSoC DARPA solicitation reads: “As noted above, the 3DSoC technology demonstrated at the end of the program (3.5 Years) should also have the following characteristics:

  • Capability of > 50X the performance at power when compared with 7nm 2D CMOS technology.”

The 3DSoC program goal of 50x is set to allow proposals suggesting US-built devices at the 90nm node versus 7nm computer chips built using conventional 2D technologies. Looking at the table below, we can see that if 7nm technology were used, the benefit would be over 300x.

(Table: projected 3DSoC system benefit by technology node. Source: DARPA)

This represents a paradigm shift for the computer industry and the high-tech world, as normal scaling would provide a 3x improvement at best. The emergence of AI and deep learning systems makes memory access a key challenge for future systems, and indicates the far larger benefits offered by monolithic 3D integration.
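
The arithmetic behind that claim is worth making explicit: when most of a workload’s time is spent waiting on memory, speeding up compute alone buys almost nothing, while attacking memory latency and bandwidth scales the whole system. A minimal Amdahl-style sketch (the 90% memory fraction and speedup factors below are hypothetical illustrations, not figures from the paper):

```python
def system_speedup(mem_fraction: float, mem_speedup: float,
                   compute_speedup: float = 1.0) -> float:
    """Amdahl-style speedup when memory stalls take mem_fraction of runtime."""
    compute_fraction = 1.0 - mem_fraction
    return 1.0 / (mem_fraction / mem_speedup + compute_fraction / compute_speedup)

# Hypothetical workload spending 90% of its time stalled on memory:
print(system_speedup(0.9, mem_speedup=1.0, compute_speedup=10.0))     # ~1.1x
print(system_speedup(0.9, mem_speedup=100.0))                         # ~9.2x
print(system_speedup(0.9, mem_speedup=1000.0, compute_speedup=10.0))  # ~92x
```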

The following charts were presented by the 3DSoC program manager Linton Salmon at the 3DSoC proposers day. The program calls for the use of monolithic 3D to overcome the current weakest link in computers – the memory wall.

(Charts: 3DSoC program overview presented by program manager Linton Salmon at the proposers day. Source: DARPA)

The 3DSoC solicitation builds on prior work by Stanford, MIT, Berkeley and Carnegie Mellon.

(Chart: prior monolithic 3D research by Stanford, MIT, Berkeley and Carnegie Mellon. Source: DARPA)

Proposals are due by Nov 6.

There is a unique opportunity to hear the 3DSoC DARPA Program Manager, Dr. Linton Salmon, articulate the program and what DARPA is looking for during his invited talk at the S3S 2017 conference next week.

Scientists have long searched for the next generation of materials that can catalyze a revolution in renewable energy harvesting and storage.

One candidate appears to be metal-organic frameworks. Scientists have used these very small, flexible, ultra-thin, super-porous crystalline structures to do everything from capturing and converting carbon into fuels to storing hydrogen and other gases. Their biggest drawback has been their lack of conductivity.

Now, according to USC scientists, it turns out that metal-organic frameworks can conduct electricity in the same way metals do.

This opens the door for metal-organic frameworks to one day efficiently store renewable energy at a very large, almost unthinkable scale.

The cobalt-based metal-organic framework used by the USC scientists, with purple representing cobalt, yellow representing sulfur and gray representing carbon. Credit: Smaranda Marinescu

“For the first time ever, we have demonstrated a metal-organic framework that exhibits conductivity like that of a metal. The natural porosity of the metal-organic framework makes it ideal for reducing the mass of material, allowing for lighter, more compact devices,” said Brent Melot, assistant professor of chemistry at the USC Dornsife College of Letters, Arts & Sciences.

“Metallic conductivity in tandem with other catalytic properties would add to its potential for renewable energy production and storage,” said Smaranda Marinescu, assistant professor of chemistry at the USC Dornsife College.

Their findings were published July 13 in the Journal of the American Chemical Society.

An emerging catalyst for long-term renewable energy storage

Metal-organic frameworks are so porous that they are well-suited for absorbing and storing gases like hydrogen and carbon dioxide. Their storage is highly concentrated: a single gram of the material can offer a surface area equivalent to thousands of square feet.

Solar has not yet been maximized as an energy source. The earth receives more energy from one hour of sunlight than the entire planet consumes in a year, but there is currently no way to make full use of this energy because there is no way to store all of it. Intermittency is intrinsic to nearly all renewable power sources, making it impossible to harvest and store their energy except when, say, the sun is shining or the wind is blowing.

If scientists and industries could one day regularly reproduce the capability demonstrated by Marinescu, it would go a long way to reducing intermittency, allowing us to finally make solar energy an enduring and more permanent resource.

Metal or semiconductor: why not both?

Metal-organic frameworks are two-dimensional structures that contain cobalt, sulfur, and carbon atoms. In many ways they broadly resemble graphene, another very thin, two-dimensional, transparent material.

As temperature goes down, metals become more conductive. Conversely, as the temperature goes up, it is semiconductors that become more conductive.

In the experiments run by Marinescu’s group, they used a cobalt-based metal-organic framework that mimicked the conductivity of both a metal and semiconductor at different temperatures. The metal-organic framework designed by the scientists demonstrated its greatest conductivity at both very low and very high temperatures.
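
The opposing trends described above follow textbook models: a metal’s resistivity rises roughly linearly with temperature, while an intrinsic semiconductor’s conductivity is thermally activated. A schematic comparison, consistent with a material that conducts like a metal at low temperature and like a semiconductor at high temperature (all parameter values are generic illustrations, not measurements from the USC study):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def metal_conductivity(T, sigma_300=1.0e7, alpha=4e-3):
    """Metal: resistivity grows ~linearly with T, so conductivity falls."""
    return sigma_300 / (1.0 + alpha * (T - 300.0))

def semiconductor_conductivity(T, sigma_inf=1.0e5, gap_ev=1.1):
    """Intrinsic semiconductor: conductivity is thermally activated."""
    return sigma_inf * math.exp(-gap_ev / (2.0 * K_B * T))

for T in (100.0, 300.0, 500.0):  # kelvin
    print(f"{T:5.0f} K   metal: {metal_conductivity(T):9.2e} S/m   "
          f"semiconductor: {semiconductor_conductivity(T):9.2e} S/m")
```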

Today, Intel announced the delivery of a 17-qubit superconducting test chip for quantum computing to QuTech, Intel’s quantum research partner in the Netherlands. The new chip was fabricated by Intel and features a unique design to achieve improved yield and performance.

The delivery of this chip demonstrates the fast progress Intel and QuTech are making in researching and developing a working quantum computing system. It also underscores the importance of material science and semiconductor manufacturing in realizing the promise of quantum computing.

Intel’s director of quantum hardware, Jim Clarke, holds the new 17-qubit superconducting test chip. (Credit: Intel Corporation)

Quantum computing, in essence, is the ultimate in parallel computing, with the potential to tackle problems conventional computers can’t handle. For example, quantum computers may simulate nature to advance research in chemistry, materials science and molecular modeling – like helping to create a new catalyst to sequester carbon dioxide, or create a room temperature superconductor or discover new drugs.

However, despite much experimental progress and speculation, there are inherent challenges to building viable, large-scale quantum systems that produce accurate outputs. Making qubits (the building blocks of quantum computing) uniform and stable is one such obstacle.

Qubits are tremendously fragile: Any noise or unintended observation of them can cause data loss. This fragility requires them to operate at about 20 millikelvin – 250 times colder than deep space. This extreme operating environment makes the packaging of qubits key to their performance and function. Intel’s Components Research Group (CR) in Oregon and Assembly Test and Technology Development (ATTD) teams in Arizona are pushing the limits of chip design and packaging technology to address quantum computing’s unique challenges.

About the size of a quarter (in a package about the size of a half-dollar coin), the new 17-qubit test chip’s improved design features include:

  • New architecture allowing improved reliability, thermal performance and reduced radio frequency (RF) interference between qubits.
  • A scalable interconnect scheme that allows for 10 to 100 times more signals into and out of the chip as compared to wirebonded chips.
  • Advanced processes, materials and designs that enable Intel’s packaging to scale for quantum integrated circuits, which are much larger than conventional silicon chips.

“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Dr. Michael Mayberry, corporate vice president and managing director of Intel Labs. “Intel’s expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing.”

Intel’s collaborative relationship with QuTech to accelerate advancements in quantum computing began in 2015. Since that time, the collaboration has achieved many milestones – from demonstrating key circuit blocks for an integrated cryogenic-CMOS control system to developing a spin qubit fabrication flow on Intel’s 300mm process technology and developing this unique packaging solution for superconducting qubits. Through this partnership, the time from design and fabrication to test has been greatly accelerated.

“With this test chip, we’ll focus on connecting, controlling and measuring multiple, entangled qubits towards an error correction scheme and a logical qubit,” said professor Leo DiCarlo of QuTech. “This work will allow us to uncover new insights in quantum computing that will shape the next stage of development.”
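
For readers unfamiliar with the terminology, two entangled qubits can be illustrated with a four-amplitude statevector: a Hadamard gate followed by a CNOT yields the Bell state, in which measuring one qubit fixes the other. A generic textbook sketch, not Intel’s or QuTech’s tooling:

```python
import numpy as np

# Two-qubit statevector: amplitudes for |00>, |01>, |10>, |11>.
state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(H, I) @ state   # put the first qubit in superposition
state = CNOT @ state            # entangle: flip qubit 2 iff qubit 1 is |1>

print(np.round(state, 3))   # [0.707 0 0 0.707] -> (|00> + |11>)/sqrt(2)
print(np.abs(state) ** 2)   # measurement yields 00 or 11, each with p = 0.5
```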

Advancing the quantum computing system

Intel and QuTech’s work in quantum computing goes beyond the development and testing of superconducting qubit devices. The collaboration spans the entire quantum system – or “stack” – from qubit devices to the hardware and software architecture required to control these devices as well as quantum applications. All of these elements are essential to advancing quantum computing from research to reality.

Also, unlike others, Intel is investigating multiple qubit types. These include the superconducting qubits incorporated into this newest test chip, and an alternative type called spin qubits in silicon. Spin qubits resemble single-electron transistors, similar in many ways to conventional transistors and potentially manufacturable with comparable processes.

While quantum computers promise greater efficiency and performance to handle certain problems, they won’t replace the need for conventional computing or other emerging technologies like neuromorphic computing. We’ll need the technical advances that Moore’s law delivers in order to invent and scale these emerging technologies.

Intel is investing not only to invent new ways of computing, but also to advance the foundation of Moore’s Law, which makes this future possible.

The semiconductor IP market is expected to be valued at USD 6.22 billion by 2023, growing at a CAGR of 4.87% between 2017 and 2023, according to the new research report “Semiconductor IP Market by Design IP (processor IP, interface IP, memory IP), Source (royalty and licensing), vertical (consumer electronics, telecom, industrial, automotive, commercial), and Geography – Global Forecast to 2023,” published by MarketsandMarkets. The major factors driving this market include advances in multicore technology for the consumer electronics sector, increasing demand for modern SoC designs, and growing demand for connected devices.
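
As a sanity check on the headline numbers, the USD 6.22 billion 2023 forecast can be discounted back at the stated 4.87% CAGR to recover the implied base-year market size (a back-of-the-envelope calculation, not a figure from the report):

```python
# Discount the 2023 forecast back to 2017 at the stated CAGR.
end_value = 6.22        # USD billion, 2023 forecast
cagr = 0.0487
years = 2023 - 2017     # 6 years

implied_2017 = end_value / (1 + cagr) ** years
print(f"Implied 2017 market size: USD {implied_2017:.2f} billion")  # ~4.68
```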

Consumer electronics to hold largest share of semiconductor IP market during forecast period

The increase in the use of consumer electronics across all regions is boosting the growth of the semiconductor IP market for the consumer electronics vertical. Moreover, the consumer electronics markets in APAC and RoW are expected to provide further growth opportunities for market players, as these regions are still in a growth phase. In addition, APAC holds the dominant share of the consumer electronics market.

Processor IP to hold largest share of semiconductor IP market during forecast period

Owing to increased demand for microprocessors, microcontrollers, digital signal processors, and graphics processing units across various verticals, the processor IP segment held the largest share of the semiconductor IP market in 2016, and it is expected to retain that position during the forecast period. The segment’s growth is attributed to the increasing use of processors in the telecom industry for 5G and in high-end cars. The processor IP market for the automotive vertical is expected to grow at the highest CAGR between 2017 and 2023 due to the increasing use of processors in advanced driver assistance systems (ADAS) and infotainment systems.

APAC to hold largest share of semiconductor IP market during forecast period

APAC held the largest share of the market in 2016 and is likely to dominate the semiconductor IP market during the forecast period as well. APAC is a major market for the consumer electronics, telecom, and automotive verticals. The region has also become a global focal point for large investments and business expansion opportunities. Moreover, developments in electric vehicles are expected to provide an opportunity for growth of the semiconductor IP market in China.

A research collaboration between Osaka University and the Nara Institute of Science and Technology for the first time used scanning tunneling microscopy (STM) to create images of atomically flat side-surfaces of 3D silicon crystals. This work helps semiconductor manufacturers continue to innovate while producing smaller, faster, and more energy-efficient computer chips for computers and smartphones.

Spatial-derivative STM images with 200×200 nm^2 at Vs = +1.5 V. Flat terraces become brighter and edges darker. The downstairs direction runs from left ((110) top-surface) to right ((-1-10) back-surface). Credit: Osaka University

Our computers and smartphones are each loaded with millions of tiny transistors. The processing speed of these devices has increased dramatically over time as the number of transistors that can fit on a single computer chip continues to increase. Based on Moore’s Law, the number of transistors per chip doubles about every two years, and so far the trend has held up. To keep up this pace of rapid innovation, computer manufacturers are continually on the lookout for new methods to make each transistor ever smaller.

Current microprocessors are made by adding patterns of circuits to flat silicon wafers. A novel way to cram more transistors into the same space is to fabricate 3D structures. Fin-type field effect transistors (FETs) are so named because they have fin-like silicon structures that extend into the air, off the surface of the chip. However, this approach requires a silicon crystal with perfectly flat top and side-surfaces, instead of just a flat top surface, as in current devices. Designing the next generation of chips will require new knowledge of the atomic structures of the side-surfaces.

Now, researchers at Osaka University and the Nara Institute of Science and Technology report that they have used STM to image the side-surface of a silicon crystal for the first time. STM is a powerful technique that reveals the locations of individual silicon atoms. By passing a sharp tip very close to the sample, electrons can jump across the gap and create an electrical current. The microscope monitors this current to determine the positions of the atoms in the sample.
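
STM’s atomic resolution comes from the exponential sensitivity of the tunneling current to the tip-sample gap, roughly I ∝ exp(−2κd), with κ set by the tunneling barrier. A schematic calculation (the 4.5 eV barrier is a typical textbook value, not one from this study):

```python
import math

def tunneling_current_ratio(delta_d_angstrom: float, phi_ev: float = 4.5) -> float:
    """Relative change in STM current: I(d + delta) / I(d) ~ exp(-2 * kappa * delta).

    kappa = sqrt(2 * m * phi) / hbar, which works out to about
    0.513 * sqrt(phi[eV]) in units of 1/Angstrom.
    """
    kappa = 0.513 * math.sqrt(phi_ev)  # 1/Angstrom
    return math.exp(-2.0 * kappa * delta_d_angstrom)

# Retracting the tip by just 1 Angstrom cuts the current by roughly 10x,
# which is why the feedback loop can resolve individual atoms.
print(tunneling_current_ratio(1.0))  # ~0.11
```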

“Our study is a big first step toward the atomically resolved evaluation of transistors designed to have 3D-shapes,” study coauthor Azusa Hattori says.

To make the side-surfaces as smooth as possible, the researchers first treated the crystals with a process called reactive ion etching. Coauthor Hidekazu Tanaka says, “Our ability to directly look at the side-surfaces using STM proves that we can make artificial 3D structures with near-perfect atomic surface ordering.”

The same electrostatic charge that can make hair stand on end and attach balloons to clothing could be an efficient way to drive atomically thin electronic memory devices of the future, according to a new study led by researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

In a study published today in the journal Nature, scientists have found a way to reversibly change the atomic structure of a 2-D material by injecting, or “doping,” it with electrons. The process uses far less energy than current methods for changing the configuration of a material’s structure.

Schematic shows the configuration for structural phase transition on a molybdenum ditelluride monolayer (MoTe2, shown as yellow and blue spheres), which is anchored by metal electrodes (top gate and ground). The ionic liquid covering the monolayer and electrodes enables a high density of electrons to populate the monolayer, leading to changes in the structural lattice from a hexagonal (2H) to monoclinic (1T’) pattern. Credit: Ying Wang/Berkeley Lab

“We show, for the first time, that it is possible to inject electrons to drive structural phase changes in materials,” said study principal investigator Xiang Zhang, senior faculty scientist at Berkeley Lab’s Materials Sciences Division and a professor at UC Berkeley. “By adding electrons into a material, the overall energy goes up and will tip off the balance, resulting in the atomic structure re-arranging to a new pattern that is more stable. Such electron doping-driven structural phase transitions at the 2-D limit is not only important in fundamental physics; it also opens the door for new electronic memory and low-power switching in the next generation of ultra-thin devices.”

Switching a material’s structural configuration from one phase to another is the fundamental, binary characteristic that underlies today’s digital circuitry. Electronic components capable of this phase transition have shrunk down to paper-thin sizes, but scientists still consider them bulk, 3-D layers. By comparison, 2-D monolayer materials are composed of a single layer of atoms or molecules roughly 100,000 times thinner than a human hair.

“The idea of electron doping to alter a material’s atomic structure is unique to 2-D materials, which are much more electrically tunable compared with 3-D bulk materials,” said study co-lead author Jun Xiao, a graduate student in Zhang’s lab.

The classic approach to driving the structural transition of materials involves heating to above 500 degrees Celsius. Such methods are energy-intensive and not feasible for practical applications. In addition, the excess heat can significantly reduce the life span of components in integrated circuits.

A number of research groups have also investigated the use of chemicals to alter the configuration of atoms in semiconductor materials, but that process is still difficult to control and has not been widely adopted by industry.

“Here we use electrostatic doping to control the atomic configuration of a two-dimensional material,” said study co-lead author Ying Wang, another graduate student in Zhang’s lab. “Compared to the use of chemicals, our method is reversible and free of impurities. It has greater potential for integration into the manufacturing of cell phones, computers and other electronic devices.”

The researchers used molybdenum ditelluride (MoTe2), a typical 2-D semiconductor, and coated it with an ionic liquid (DEME-TFSI), which has an ultra-high capacitance, or ability to store electric charges. The layer of ionic liquid allowed the researchers to inject the semiconductor with electrons at a density of a hundred trillion to a quadrillion per square centimeter, an electron density one to two orders of magnitude higher than what can be achieved in 3-D bulk materials, the researchers said.

Through spectroscopic analysis, the researchers determined that the injection of electrons changed the arrangement of the molybdenum ditelluride’s atoms from a hexagonal pattern to a monoclinic one, which has more of a slanted, cuboid shape. Once the electrons were retracted, the crystal structure returned to its original hexagonal pattern, showing that the phase transition is reversible. Moreover, the two arrangements have very different symmetries, providing a large contrast for applications in optical components.
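
The “tipping the balance” mechanism quoted earlier can be captured in a toy free-energy model: the 2H phase starts lower in energy, but if doping raises its energy faster than that of the 1T’ phase, the two curves cross at a critical electron density, and removing the electrons restores the original phase. All numbers below are invented for illustration; they are not values from the Nature paper:

```python
# Toy model: phase energy vs. electron doping n (arbitrary units).
# 2H starts more stable, but its energy rises faster with doping than 1T'.
E0_2H, slope_2H = 0.0, 1.0   # illustrative values only
E0_1T, slope_1T = 0.5, 0.2

def energy(n, e0, slope):
    return e0 + slope * n

# Critical density where the monoclinic (1T') phase becomes favorable:
n_crit = (E0_1T - E0_2H) / (slope_2H - slope_1T)
print(f"phase boundary at n = {n_crit:.3f}")  # 0.625 in these units

for n in (0.0, 0.5, 1.0):
    stable = "2H" if energy(n, E0_2H, slope_2H) < energy(n, E0_1T, slope_1T) else "1T'"
    print(f"n = {n:.1f}: stable phase {stable}")  # reverts to 2H as n -> 0
```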

“Such an atomically thin device could have dual functions, serving simultaneously as optical or electrical transistors, and hence broaden the functionalities of the electronics used in our daily lives,” said Wang.

Fujitsu Semiconductor Limited and ON Semiconductor (Nasdaq: ON) today announced an agreement that ON Semiconductor will purchase a 30 percent incremental share of Fujitsu’s 8-inch wafer fab in Aizu-Wakamatsu, resulting in 40 percent ownership when the purchase is completed. The purchase is scheduled to be completed on April 1, 2018, subject to certain regulatory approvals and other closing conditions.

The two companies entered into an agreement in 2014, under which ON Semiconductor obtained a 10 percent ownership interest in Fujitsu’s Aizu 8-inch fab. Initial transfers began in 2014, and successful production and ramp-up of wafers began in June 2015. ON Semiconductor continues to increase demand at the Aizu 8-inch fab, and both companies determined that further strategic partnership will maximize the value both companies provide.

ON Semiconductor plans to increase ownership to 60 percent by the second half of 2018 and to 100 percent in the first half of 2020, allowing ON Semiconductor to add capacity to their global footprint. This additional capacity will allow ON Semiconductor to continue scaling its business based on demand and enable increased supply chain flexibility.

“We believe that transforming into a globally competitive company is the key for the continuous growth of the Aizu 8-inch fab. Furthering our strategic partnership with ON Semiconductor, who provides a broad product portfolio, will enable the Aizu 8-inch fab to secure future growth,” said Kagemasa Magaribuchi, president of Fujitsu Semiconductor Limited. “We believe that the growth of the Aizu 8-inch fab will contribute to maintaining and expanding a strong workforce and assist with the development of the regions.”

“We have had a strong and successful partnership with Fujitsu since announcing our investment in 2014,” said Keith Jackson, president and CEO of ON Semiconductor. “We believe furthering our partnership with Fujitsu Semiconductor will enable us to maintain our industry-leading manufacturing cost structure and also help us optimize our capital spending in coming years. This is a strategic investment for ON Semiconductor to secure additional manufacturing capacity, in support of our accelerated production needs and for revenue growth in coming years.”

Manufacturing is a core competency for ON Semiconductor, and approximately 75 percent of manufacturing operations are done internally through the company’s industry-leading cost structure.

Gartner, Inc. this week highlighted the top strategic technology trends that will impact most organizations in 2018. Analysts presented their findings during Gartner Symposium/ITxpo, which ran through Thursday.

Gartner defines a strategic technology trend as one with substantial disruptive potential that is beginning to break out of an emerging state into broader impact and use, or a rapidly growing trend with a high degree of volatility that will reach a tipping point over the next five years.

“Gartner’s top 10 strategic technology trends for 2018 tie into the Intelligent Digital Mesh. The intelligent digital mesh is a foundation for future digital business and ecosystems,” said David Cearley, vice president and Gartner Fellow. “IT leaders must factor these technology trends into their innovation strategies or risk losing ground to those that do.”

The first three strategic technology trends explore how artificial intelligence (AI) and machine learning are seeping into virtually everything and represent a major battleground for technology providers over the next five years. The next four trends focus on blending the digital and physical worlds to create an immersive, digitally enhanced environment. The last three refer to exploiting connections between an expanding set of people and businesses, as well as devices, content and services to deliver digital business outcomes.

The top 10 strategic technology trends for 2018 are:

AI Foundation
Creating systems that learn, adapt and potentially act autonomously will be a major battleground for technology vendors through at least 2020. The ability to use AI to enhance decision making, reinvent business models and ecosystems, and remake the customer experience will drive the payoff for digital initiatives through 2025.

“AI techniques are evolving rapidly and organizations will need to invest significantly in skills, processes and tools to successfully exploit these techniques and build AI-enhanced systems,” said Mr. Cearley. “Investment areas can include data preparation, integration, algorithm and training methodology selection, and model creation. Multiple constituencies including data scientists, developers and business process owners will need to work together.”

Intelligent Apps and Analytics
Over the next few years, virtually every app, application and service will incorporate some level of AI. Some of these apps will be obvious intelligent apps that could not exist without AI and machine learning. Others will be unobtrusive users of AI that provide intelligence behind the scenes. Intelligent apps create a new intelligent intermediary layer between people and systems and have the potential to transform the nature of work and the structure of the workplace.

“Explore intelligent apps as a way of augmenting human activity and not simply as a way of replacing people,” said Mr. Cearley. “Augmented analytics is a particularly strategic growing area which uses machine learning to automate data preparation, insight discovery and insight sharing for a broad range of business users, operational workers and citizen data scientists.”

AI has become the next major battleground in a wide range of software and service markets, including aspects of enterprise resource planning (ERP). Packaged software and service providers should outline how they’ll be using AI to add business value in new versions in the form of advanced analytics, intelligent processes and advanced user experiences.

Intelligent Things
Intelligent things are physical things that go beyond the execution of rigid programming models to exploit AI to deliver advanced behaviors and interact more naturally with their surroundings and with people. AI is driving advances for new intelligent things (such as autonomous vehicles, robots and drones) and delivering enhanced capability to many existing things (such as Internet of Things [IoT] connected consumer and industrial systems).

“Currently, the use of autonomous vehicles in controlled settings (for example, in farming and mining) is a rapidly growing area of intelligent things. We are likely to see examples of autonomous vehicles on limited, well-defined and controlled roadways by 2022, but general use of autonomous cars will likely require a person in the driver’s seat in case the technology should unexpectedly fail,” said Mr. Cearley. “For at least the next five years, we expect that semiautonomous scenarios requiring a driver will dominate. During this time, manufacturers will test the technology more rigorously, and the nontechnology issues such as regulations, legal issues and cultural acceptance will be addressed.” 

Digital Twin
A digital twin refers to the digital representation of a real-world entity or system. Digital twins in the context of IoT projects are particularly promising over the next three to five years and are leading the interest in digital twins today. Well-designed digital twins of assets have the potential to significantly improve enterprise decision making. These digital twins are linked to their real-world counterparts and are used to understand the state of the thing or system, respond to changes, improve operations and add value. Organizations will implement digital twins simply at first, then evolve them over time, improving their ability to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.
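
In implementation terms, a digital twin is often little more than a state object kept in sync with telemetry from its physical counterpart, plus rules that react to state changes. A minimal sketch of that pattern (the class, fields and threshold are illustrative, not from Gartner):

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Digital twin of a (hypothetical) industrial pump."""
    asset_id: str
    temperature_c: float = 20.0
    rpm: float = 0.0
    alerts: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        """Sync twin state from a telemetry reading, then apply rules."""
        self.temperature_c = reading.get("temperature_c", self.temperature_c)
        self.rpm = reading.get("rpm", self.rpm)
        if self.temperature_c > 80.0:  # illustrative threshold rule
            self.alerts.append(f"{self.asset_id}: overheating at {self.temperature_c} C")

twin = PumpTwin(asset_id="pump-001")
twin.ingest({"temperature_c": 85.5, "rpm": 1750})
print(twin.alerts)  # ['pump-001: overheating at 85.5 C']
```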

“Over time, digital representations of virtually every aspect of our world will be connected dynamically with their real-world counterpart and with one another and infused with AI-based capabilities to enable advanced simulation, operation and analysis,” said Mr. Cearley. “City planners, digital marketers, healthcare professionals and industrial planners will all benefit from this long-term shift to the integrated digital twin world.”

Cloud to the Edge
Edge computing describes a computing topology in which information processing, and content collection and delivery, are placed closer to the sources of this information. Connectivity and latency challenges, bandwidth constraints and greater functionality embedded at the edge all favor distributed models. Enterprises should begin using edge design patterns in their infrastructure architectures — particularly for those with significant IoT elements.

While many view cloud and edge as competing approaches, cloud is a style of computing in which elastically scalable technology capabilities are delivered as a service, and it does not inherently mandate a centralized model.

“When used as complementary concepts, cloud can be the style of computing used to create a service-oriented model and a centralized control and coordination structure with edge being used as a delivery style allowing for disconnected or distributed process execution of aspects of the cloud service,” said Mr. Cearley.

Conversational Platforms
Conversational platforms will drive the next big paradigm shift in how humans interact with the digital world. The burden of translating intent shifts from user to computer. The platform takes a question or command from the user and then responds by executing some function, presenting some content or asking for additional input. Over the next few years, conversational interfaces will become a primary design goal for user interaction and be delivered in dedicated hardware, core OS features, platforms and applications.
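
At its core, the loop described here is an intent classifier in front of a dispatch table: parse the utterance, pick a handler, then execute it or ask for more input. A deliberately crude sketch (keyword matching stands in for the trained language-understanding models real platforms use; all names are hypothetical):

```python
def get_weather(city: str) -> str:
    return f"(weather lookup for {city} would go here)"

def set_alarm(time: str) -> str:
    return f"alarm set for {time}"

# Map recognized intents to handlers; real platforms use trained NLU
# models and third-party API orchestration instead of keyword rules.
def handle(utterance: str) -> str:
    text = utterance.lower()
    if "weather" in text:
        return get_weather(city="Amsterdam")   # entity extraction elided
    if "alarm" in text:
        return set_alarm(time="7:00")
    return "Sorry, can you rephrase that?"     # ask for additional input

print(handle("What's the weather like?"))
print(handle("Wake me up tomorrow"))  # falls through: no 'alarm' keyword
```

The second call shows exactly the frustration described in the quote below: an intent any human would understand falls through because it was not phrased the way the platform expects.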

“Conversational platforms have reached a tipping point in terms of understanding language and basic user intent, but they still fall short,” said Mr. Cearley. “The challenge that conversational platforms face is that users must communicate in a very structured way, and this is often a frustrating experience. A primary differentiator among conversational platforms will be the robustness of their conversational models and the application programming interface (API) and event models used to access, invoke and orchestrate third-party services to deliver complex outcomes.” 

Immersive Experience
While conversational interfaces are changing how people control the digital world, virtual, augmented and mixed reality are changing the way that people perceive and interact with the digital world. The virtual reality (VR) and augmented reality (AR) market is currently adolescent and fragmented. Interest is high, resulting in many novelty VR applications that deliver little real business value outside of advanced entertainment, such as video games and 360-degree spherical videos. To drive real tangible business benefit, enterprises must examine specific real-life scenarios where VR and AR can be applied to make employees more productive and enhance the design, training and visualization processes.

Mixed reality, a type of immersion that merges and extends the technical functionality of both AR and VR, is emerging as the immersive experience of choice providing a compelling technology that optimizes its interface to better match how people view and interact with their world. Mixed reality exists along a spectrum and includes head-mounted displays (HMDs) for augmented or virtual reality as well as smartphone and tablet-based AR and use of environmental sensors. Mixed reality represents the span of how people perceive and interact with the digital world.

Blockchain
Blockchain is evolving from a digital currency infrastructure into a platform for digital transformation. Blockchain technologies offer a radical departure from the current centralized transaction and record-keeping mechanisms and can serve as a foundation of disruptive digital business for both established enterprises and startups. Although the hype surrounding blockchains originally focused on the financial services industry, blockchains have many potential applications, including government, healthcare, manufacturing, media distribution, identity verification, title registry and supply chain. Although it holds long-term promise and will undoubtedly create disruption, blockchain promise outstrips blockchain reality, and many of the associated technologies will remain immature for the next two to three years.

Event Driven
Central to digital business is the idea that the business is always sensing and ready to exploit new digital business moments. Business events could be anything that is noted digitally, reflecting the discovery of notable states or state changes, for example, completion of a purchase order, or an aircraft landing. With the use of event brokers, IoT, cloud computing, blockchain, in-memory data management and AI, business events can be detected faster and analyzed in greater detail. But technology alone without cultural and leadership change does not deliver the full value of the event-driven model. Digital business drives the need for IT leaders, planners and architects to embrace event thinking.
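
The event-broker pattern underpinning this is plain publish/subscribe: producers emit business events to a broker, and any number of consumers react independently. A minimal in-process sketch (a real deployment would use a broker such as Kafka or MQTT rather than an in-memory dict):

```python
from collections import defaultdict

class EventBroker:
    """Toy in-process event broker: topic -> list of subscriber callbacks."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
broker.subscribe("purchase_order.completed",
                 lambda e: print(f"analytics: order {e['id']} recorded"))
broker.subscribe("purchase_order.completed",
                 lambda e: print(f"fulfillment: shipping order {e['id']}"))

# One business event, multiple independent reactions:
broker.publish("purchase_order.completed", {"id": "PO-1047"})
```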

Continuous Adaptive Risk and Trust
To securely enable digital business initiatives in a world of advanced, targeted attacks, security and risk management leaders must adopt a continuous adaptive risk and trust assessment (CARTA) approach to allow real-time, risk- and trust-based decision making with adaptive responses. Security infrastructure must be adaptive everywhere, to embrace the opportunity — and manage the risks — that come with delivering security that moves at the speed of digital business.

As part of a CARTA approach, organizations must overcome the barriers between security teams and application teams, much as DevOps tools and processes overcome the divide between development and operations. Information security architects must integrate security testing at multiple points into DevOps workflows in a collaborative way that is largely transparent to developers, preserving the teamwork, agility and speed of DevOps and agile development environments and delivering “DevSecOps.” CARTA can also be applied at runtime with approaches such as deception technologies. Advances in technologies such as virtualization and software-defined networking have made it easier to deploy, manage and monitor “adaptive honeypots” — the basic component of network-based deception.

Gartner clients can learn more in the Gartner Special Report “Top Strategic Technology Trends for 2018.” Additional detailed analysis on each tech trend can be found in the Smarter With Gartner article “Gartner Top 10 Strategic Technology Trends for 2018.”