
In a recent study published in Science, researchers at ICFO – The Institute of Photonic Sciences in Barcelona, Spain, along with other members of the Graphene Flagship, reached the ultimate level of light confinement: they were able to confine light down to a space one atom thick, the smallest possible. This paves the way to ultra-small optical switches, detectors and sensors.

Light can function as an ultra-fast communication channel, for example between different sections of a computer chip, but it can also be used for ultra-sensitive sensors or on-chip nanoscale lasers. There is currently much research into how to further shrink devices that control and guide light.

Techniques that confine light into extremely tiny spaces, much smaller than current ones, have been on the rise. Researchers had previously found that metals can compress light below the wavelength scale (the diffraction limit), but stronger confinement always came at the cost of greater energy loss. This fundamental trade-off has now been overcome.

“Graphene keeps surprising us: nobody thought that confining light to the one-atom limit would be possible. It will open a completely new set of applications, such as optical communications and sensing at a scale below one nanometer,” said ICREA Professor Frank Koppens at ICFO – The Institute of Photonic Sciences in Barcelona, Spain, who led the research.

This team of researchers, including those from ICFO (Spain), the University of Minho (Portugal) and MIT (USA), used stacks of two-dimensional materials, called heterostructures, to build a new nano-optical device. They took a graphene monolayer (which acts as a semi-metal), stacked onto it a hexagonal boron nitride (hBN) monolayer (an insulator), and on top of this deposited an array of metallic rods. They used graphene because it can guide light in the form of plasmons – oscillations of the electrons that interact strongly with light.

“At first we were looking for a new way to excite graphene plasmons. On the way, we found that the confinement was stronger than before and the additional losses minimal. So we decided to go to the one atom limit with surprising results,” said David Alcaraz Iranzo, the lead author from ICFO.

By sending infra-red light through their devices, the researchers observed how the plasmons propagated between the metal and the graphene. To reach the smallest space conceivable, they reduced the gap between the metal and the graphene as much as possible to see whether the confinement of light remained efficient, i.e. without additional energy losses. Strikingly, they saw that even when a monolayer of hBN was used as a spacer, the plasmons were still excited and could propagate freely while being confined to a channel just one atom thick. They managed to switch this plasmon propagation on and off simply by applying an electrical voltage, demonstrating control of light guided in channels smaller than one nanometer.

This enables new opto-electronic devices that are just one nanometer thick, such as ultra-small optical switches, detectors and sensors. Due to the paradigm shift in optical field confinement, extreme light-matter interactions can now be explored that were not accessible before. The atom-scale toolbox of two-dimensional materials has now also proven applicable for many types of new devices where both light and electrons can be controlled even down to the scale of a nanometer.

Professor Andrea C. Ferrari, Science and Technology Officer of the Graphene Flagship and Chair of its Management Panel, added: “While the Flagship is driving the development of novel applications, in particular in the field of photonics and optoelectronics, we do not lose sight of fundamental research. The impressive results reported in this paper are a testimony to the relevance of the Flagship’s work for cutting-edge science. Having reached the ultimate limit of light confinement could lead to new devices with unprecedentedly small dimensions.”

Solar cells have great potential as a source of clean electrical energy, but so far they have not been cheap, light, and flexible enough for widespread use. Now a team of researchers led by Tandon Associate Professor André D. Taylor of the Chemical and Biomolecular Engineering Department has found an innovative and promising way to improve solar cells and make their use in many applications more likely.

Most organic solar cells use fullerenes, spherical molecules of carbon. The problem, explains Taylor, is that fullerenes are expensive and don’t absorb enough light. Over the last 10 years he has made significant progress in improving organic solar cells, and he has recently focused on using non-fullerenes, which until now have been inefficient. However, he says, “the non-fullerenes are improving enough to give fullerenes a run for their money.”

Think of a solar cell as a sandwich, Taylor says. The “meat” or active layer – made of electron donors and acceptors – is in the middle, absorbing sunlight and transforming it into electricity (electrons and holes), while the “bread,” or outside layers, consist of electrodes that transport that electricity. His team’s goal was to have the cell absorb light across as large a spectrum as possible using a variety of materials, yet at the same time allow these materials to work together well. “My group works on key parts of the ‘sandwich,’ such as the electron and hole transporting layers of the ‘bread,’ while other groups may work only on the ‘meat’ or interlayer materials. The question is: How do you get them to play together? The right blend of these disparate materials is extremely difficult to achieve.”

Using a squaraine molecule in a new way – as a crystallizing agent – did the trick. “We added a small molecule that functions as an electron donor by itself and enhances the absorption of the active layer,” Taylor explains. “By adding this small molecule, it facilitates the orientation of the donor-acceptor polymer (called PBDB-T) with the non-fullerene acceptor, ITIC, in a favorable arrangement.”

This solar architecture also uses another design mechanism that the Taylor group pioneered, known as a FRET-based solar cell. FRET, or Förster resonance energy transfer, is an energy transfer mechanism first observed in photosynthesis, the process by which plants harvest sunlight. Using a new polymer and non-fullerene blend with squaraine, the team converted more than 10 percent of solar energy into power. Just a few years ago this was considered too lofty a goal for single-junction polymer solar cells. “There are now newer polymer non-fullerene systems that can perform above 13 percent, so we view our contribution as a viable strategy for improving these systems,” Taylor says.
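The efficiency figures quoted above follow from the standard definition of power conversion efficiency: open-circuit voltage times short-circuit current density times fill factor, divided by the incident solar power. A minimal sketch in Python; the device parameters below are hypothetical, chosen only to land near the roughly 10 percent regime described here, and are not values from the Taylor group's paper:

```python
def pce(v_oc, j_sc, ff, p_in=100.0):
    """Power conversion efficiency in percent.

    v_oc: open-circuit voltage (V)
    j_sc: short-circuit current density (mA/cm^2)
    ff:   fill factor (dimensionless, 0-1)
    p_in: incident power (mW/cm^2); ~100 for standard AM1.5G sunlight
    """
    return 100.0 * (v_oc * j_sc * ff) / p_in

# Hypothetical device parameters in the ballpark of a ~10% cell:
print(pce(v_oc=0.85, j_sc=17.0, ff=0.70))  # ~10.1
```

Raising any of the three device parameters, for instance by broadening absorption with an additive such as the squaraine crystallizing agent, pushes the efficiency up proportionally.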

The organic solar cells developed by his team are flexible and could one day be used in applications supporting electric vehicles, wearable electronics, or backpacks to charge cell phones. Eventually, they could contribute significantly to the supply of electric power. “We expect that this crystallizing-agent method will attract attention from chemists and materials scientists affiliated with organic electronics,” says Yifan Zheng, Taylor’s former research student and lead author of the article about the work in the journal Materials Today.

Next for the research team? They are working on a type of solar cell called a perovskite as well as continuing to improve non-fullerene organic solar cells.

A new smart and responsive material can stiffen up like a worked-out muscle, say the Iowa State University engineers who developed it.

Stress a muscle and it gets stronger. Mechanically stress the rubbery material – say with a twist or a bend – and the material automatically stiffens by up to 300 percent, the engineers said. In lab tests, mechanical stresses transformed a flexible strip of the material into a hard composite that can support 50 times its own weight.

Examples of the new smart material, left to right: A flexible strip; a flexible strip that stiffened when twisted; a flexible strip transformed into a hard composite that can hold up a weight. Credit: Christopher Gannon/Iowa State University


This new composite material doesn’t need outside energy sources such as heat, light or electricity to change its properties. And it could be used in a variety of ways, including applications in medicine and industry.

The material is described in a paper recently published online by the scientific journal Materials Horizons. The lead authors are Martin Thuo and Michael Bartlett, Iowa State assistant professors of materials science and engineering. First authors are Boyce Chang and Ravi Tutika, Iowa State doctoral students in materials science and engineering. Chang is also a student associate of the U.S. Department of Energy’s Ames Laboratory.

Iowa State startup funds for Thuo and Bartlett supported development of the new material. Thuo’s Black & Veatch faculty fellowship also helped support the project.

Development of the material combined Thuo’s expertise in micro-sized, liquid-metal particles with Bartlett’s expertise in soft materials such as rubbers, plastics and gels.

It’s a powerful combination.

The researchers found a simple, low-cost way to produce particles of undercooled metal – that’s metal that remains liquid even below its melting temperature. The tiny particles (they’re just 1 to 20 millionths of a meter across) are created by exposing droplets of melted metal to oxygen, creating an oxidation layer that coats the droplets and stops the liquid metal from turning solid. They also found ways to mix the liquid-metal particles with a rubbery elastomer material without breaking the particles.

When this hybrid material is subjected to mechanical stresses – pushing, twisting, bending, squeezing – the liquid-metal particles break open. The liquid metal flows out of the oxide shell, fuses together and solidifies.

“You can squeeze these particles just like a balloon,” Thuo said. “When they pop, that’s what makes the metal flow and solidify.”

The result, Bartlett said, is a “metal mesh that forms inside the material.”

Thuo and Bartlett said the popping point can be tuned to make the liquid metal flow after varying amounts of mechanical stress. Tuning could involve changing the metal used, changing the particle sizes or changing the soft material.

In this case, the liquid-metal particles contain Field’s metal, an alloy of bismuth, indium and tin. But Thuo said other metals will work, too.

“The idea is that no matter what metal you can get to undercool, you’ll get the same behavior,” he said.

The engineers say the new material could be used in medicine to support delicate tissues or in industry to protect valuable sensors. There could also be uses in soft and bio-inspired robotics or reconfigurable and wearable electronics. The Iowa State University Research Foundation is working to patent the material and it is available for licensing.

“A device with this material can flex up to a certain amount of load,” Bartlett said. “But if you continue stressing it, the elastomer will stiffen and stop or slow down these forces.”

And that, the engineers say, is how they’re putting some muscle in their new smart material.


Researchers at the University of Illinois at Chicago describe a new technique for precisely measuring the temperature and behavior of new two-dimensional materials that will allow engineers to design smaller and faster microprocessors. Their findings are reported in the journal Physical Review Letters.

Newly developed two-dimensional materials, such as graphene — which consists of a single layer of carbon atoms — have the potential to replace traditional silicon-based microprocessing chips, which have reached the limit of how small they can get. But engineers have been stymied by the inability to measure how temperature will affect these new materials, among them a class known as transition metal dichalcogenides, or TMDs.

Using scanning transmission electron microscopy combined with spectroscopy, researchers at UIC were able to measure the temperature of several two-dimensional materials at the atomic level, paving the way for much smaller and faster microprocessors. They were also able to use their technique to measure how the two-dimensional materials would expand when heated.

“Microprocessing chips in computers and other electronics get very hot, and we need to be able to measure not only how hot they can get, but how much the material will expand when heated,” said Robert Klie, professor of physics at UIC and corresponding author of the paper. “Knowing how a material will expand is important because if a material expands too much, connections with other materials, such as metal wires, can break and the chip is useless.”

Traditional ways of measuring temperature don’t work on the tiny flakes of two-dimensional material that would be used in microprocessors, because the flakes are just too small. Optical temperature measurements, which use reflected laser light to gauge temperature, can’t be used on TMD chips because the chips don’t have enough surface area to accommodate the laser beam.

“We need to understand how heat builds up and how it is transmitted at the interface between two materials in order to build efficient microprocessors that work,” said Klie.

Klie and his colleagues devised a way to take temperature measurements of TMDs at the atomic level using scanning transmission electron microscopy, which uses a beam of electrons transmitted through a specimen to form an image.

“Using this technique, we can zero in on and measure the vibration of atoms and electrons, which is essentially the temperature of a single atom in a two-dimensional material,” said Klie. Temperature is a measure of the average kinetic energy of the random motions of the particles, or atoms, that make up a material. As a material gets hotter, the frequency of the atomic vibration gets higher. At absolute zero, the lowest theoretical temperature, all atomic motion stops.
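That picture of temperature as average kinetic energy can be made concrete with the equipartition theorem, which assigns each atom a mean kinetic energy of (3/2)·k_B·T. The sketch below is this textbook relation only, not the UIC team's method, which infers temperature from measured vibration spectra:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

def mean_kinetic_energy(temperature_k):
    """Average kinetic energy per atom, in joules, at the given
    temperature in kelvin, from the equipartition theorem."""
    return 1.5 * K_B * temperature_k

# At absolute zero all classical motion stops; at room temperature
# each atom carries on the order of 1e-21 J of kinetic energy.
print(mean_kinetic_energy(0))    # 0.0
print(mean_kinetic_energy(300))  # ~6.2e-21
```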

Klie and his colleagues heated microscopic “flakes” of various TMDs inside the chamber of a scanning transmission electron microscope to different temperatures and then aimed the microscope’s electron beam at the material. Using a technique called electron energy-loss spectroscopy, they measured the energy the beam electrons lost as they scattered off the two-dimensional materials. The scattering patterns were entered into a computer model that translated them into measurements of the vibrations of the atoms in the material – in other words, the temperature of the material at the atomic level.

“With this new technique, we can measure the temperature of a material with a resolution that is nearly 10 times better than conventional methods,” said Klie. “With this new approach, we can design better electronic devices that will be less prone to overheating and consume less power.”

The technique can also be used to predict how much materials will expand when heated and contract when cooled, which will help engineers build chips that are less prone to breaking at points where one material touches another, such as when a two-dimensional material chip makes contact with a wire.

“No other method can measure this effect at the spatial resolution we report,” said Klie. “This will allow engineers to design devices that can manage temperature changes between two different materials at the nano-scale level.”

Engineers at the University of California, Riverside, have reported advances in so-called “spintronic” devices that will help lead to a new technology for computing and data storage. They have developed methods to detect signals from spintronic components made of low-cost metals and silicon, which overcomes a major barrier to wide application of spintronics. Previously such devices depended on complex structures that used rare and expensive metals such as platinum. The researchers were led by Sandeep Kumar, an assistant professor of mechanical engineering.

UCR researchers have developed methods to detect signals from spintronic components made of low-cost metals and silicon. Credit: UC Riverside


Spintronic devices promise to solve major problems in today’s electronic computers, which use massive amounts of electricity and generate heat that requires expending even more energy for cooling. By contrast, spintronic devices generate little heat and use relatively minuscule amounts of electricity. Spintronic computers would require no energy to maintain data in memory. They would also start instantly and have the potential to be far more powerful than today’s computers.

While electronics depends on the charge of electrons to generate the binary ones or zeroes of computer data, spintronics depends on the property of electrons called spin. Spintronic materials register binary data via the “up” or “down” spin orientation of electrons–like the north and south of bar magnets–in the materials. A major barrier to development of spintronics devices is generating and detecting the infinitesimal electric spin signals in spintronic materials.

In one paper, published in the January issue of the journal Applied Physics Letters, Kumar and colleagues reported an efficient technique for detecting spin currents in a simple two-layer sandwich of silicon and a nickel-iron alloy called Permalloy. All of the components are inexpensive and abundant and could provide the basis for commercial spintronic devices that operate at room temperature. The layers were created with sputtering, a widely used electronics manufacturing process. Co-authors of the paper were graduate students Ravindra Bhardwaj and Paul Lou.

In their experiments, the researchers heated one side of the Permalloy-silicon bi-layer sandwich to create a temperature gradient, which generated an electrical voltage in the bi-layer. The voltage was due to a phenomenon known as the spin-Seebeck effect. The engineers found that they could detect the resulting “spin current” in the bi-layer through another phenomenon known as the “inverse spin-Hall effect.”

The researchers said their findings will have application to efficient magnetic switching in computer memories, and “these scientific breakthroughs may give impetus” to development of such devices. More broadly, they concluded, “These results bring the ubiquitous Si (silicon) to forefront of spintronics research and will lay the foundation of energy efficient Si spintronics and Si spin caloritronics devices.”

In two other scientific papers, the researchers demonstrated that they could generate a key property for spintronics materials, called antiferromagnetism, in silicon. The achievement opens an important pathway to commercial spintronics, said the researchers, given that silicon is inexpensive and can be manufactured using a mature technology with a long history of application in electronics.

Ferromagnetism is the property of magnetic materials in which the magnetic poles of the atoms are aligned in the same direction. In contrast, antiferromagnetism is a property in which neighboring atoms are magnetically oriented in opposite directions. These “magnetic moments” are due to the spin of electrons in the atoms and are central to the application of the materials in spintronics.

In the two papers, Kumar and Lou reported detecting antiferromagnetism in the two types of silicon–called n-type and p-type–used in transistors and other electronic components. N-type semiconductor silicon is “doped” with substances that cause it to have an abundance of negatively-charged electrons; and p-type silicon is doped to have a large concentration of positively charged “holes.” Combining the two types enables switching of current in such devices as transistors used in computer memories and other electronics.

In the paper in the Journal of Magnetism and Magnetic Materials, Lou and Kumar reported detecting the spin-Hall effect and antiferromagnetism in n-silicon. Their experiments used a multilayer thin film comprising palladium, nickel-iron Permalloy, manganese oxide and n-silicon.

And in the second paper, in the scientific journal physica status solidi, they reported detecting in p-silicon spin-driven antiferromagnetism and a transition of silicon between metal and insulator properties. Those experiments used a thin film similar to those with the n-silicon.

The researchers wrote in the latter paper that “The observed emergent antiferromagnetic behavior may lay the foundation of Si (silicon) spintronics and may change every field involving Si thin films. These experiments also present potential electric control of magnetic behavior using simple semiconductor electronics physics. The observed large change in resistance and doping dependence of phase transformation encourages the development of antiferromagnetic and phase change spintronics devices.”

In further studies, Kumar and his colleagues are developing technology to switch spin currents on and off in the materials, with the ultimate goal of creating a spin transistor. They are also working to generate larger, higher-voltage spintronic chips. The result of their work could be extremely low-power, compact transmitters and sensors, as well as energy-efficient data storage and computer memories, said Kumar.

When it comes to processing power, the human brain just can’t be beat.

Packed within the squishy, football-sized organ are somewhere around 100 billion neurons. At any given moment, a single neuron can relay instructions to thousands of other neurons via synapses — the spaces between neurons, across which neurotransmitters are exchanged. There are more than 100 trillion synapses that mediate neuron signaling in the brain, strengthening some connections while pruning others, in a process that enables the brain to recognize patterns, remember facts, and carry out other learning tasks, at lightning speeds.

Researchers in the emerging field of “neuromorphic computing” have attempted to design computer chips that work like the human brain. Instead of carrying out computations based on binary, on/off signaling, like digital chips do today, the elements of a “brain on a chip” would work in an analog fashion, exchanging a gradient of signals, or “weights,” much like neurons that activate in various ways depending on the type and number of ions that flow across a synapse.

In this way, small neuromorphic chips could, like the brain, efficiently process millions of streams of parallel computations that are currently only possible with large banks of supercomputers. But one significant hangup on the way to such portable artificial intelligence has been the neural synapse, which has been particularly tricky to reproduce in hardware.

Now engineers at MIT have designed an artificial synapse in such a way that they can precisely control the strength of an electric current flowing across it, similar to the way ions flow between neurons. The team has built a small chip with artificial synapses, made from silicon germanium. In simulations, the researchers found that the chip and its synapses could be used to recognize samples of handwriting, with 95 percent accuracy.

The design, published today in the journal Nature Materials, is a major step toward building portable, low-power neuromorphic chips for use in pattern recognition and other learning tasks.

The research was led by Jeehwan Kim, the Class of 1947 Career Development Assistant Professor in the departments of Mechanical Engineering and Materials Science and Engineering, and a principal investigator in MIT’s Research Laboratory of Electronics and Microsystems Technology Laboratories. His co-authors are Shinhyun Choi (first author), Scott Tan (co-first author), Zefan Li, Yunjo Kim, Chanyeol Choi, and Hanwool Yeon of MIT, along with Pai-Yu Chen and Shimeng Yu of Arizona State University.

Too many paths

Most neuromorphic chip designs attempt to emulate the synaptic connection between neurons using two conductive layers separated by a “switching medium,” or synapse-like space. When a voltage is applied, ions should move in the switching medium to create conductive filaments, similarly to how the “weight” of a synapse changes.

But it’s been difficult to control the flow of ions in existing designs. Kim says that’s because most switching mediums, made of amorphous materials, have unlimited possible paths through which ions can travel — a bit like Pachinko, a mechanical arcade game that funnels small steel balls down through a series of pins and levers, which act to either divert or direct the balls out of the machine.

Like Pachinko, existing switching mediums contain multiple paths that make it difficult to predict where ions will make it through. Kim says that can create unwanted nonuniformity in a synapse’s performance.

“Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way,” Kim says. “But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects. This stream is changing, and it’s hard to control. That’s the biggest problem — nonuniformity of the artificial synapse.”

A perfect mismatch

Instead of using amorphous materials as an artificial synapse, Kim and his colleagues looked to single-crystalline silicon, a defect-free semiconducting material whose atoms are arranged in a continuously ordered alignment. The team sought to create a precise, one-dimensional line defect, or dislocation, through the silicon, through which ions could predictably flow.

To do so, the researchers started with a wafer of silicon, resembling, at microscopic resolution, a chicken-wire pattern. They then grew a similar pattern of silicon germanium — a material also used commonly in transistors — on top of the silicon wafer. Silicon germanium’s lattice is slightly larger than that of silicon, and Kim found that together, the two perfectly mismatched materials can form a funnel-like dislocation, creating a single path through which ions can flow.

The researchers fabricated a neuromorphic chip consisting of artificial synapses made from silicon germanium, each synapse measuring about 25 nanometers across. They applied voltage to each synapse and found that all synapses exhibited more or less the same current, or flow of ions, with about a 4 percent variation between synapses — a much more uniform performance compared with synapses made from amorphous material.

They also tested a single synapse over multiple trials, applying the same voltage over 700 cycles, and found the synapse exhibited the same current, with just 1 percent variation from cycle to cycle.

“This is the most uniform device we could achieve, which is the key to demonstrating artificial neural networks,” Kim says.

Writing, recognized

As a final test, Kim’s team explored how its device would perform if it were to carry out actual learning tasks — specifically, recognizing samples of handwriting, which researchers consider to be a first practical test for neuromorphic chips. Such chips would consist of “input/hidden/output neurons,” each connected to other “neurons” via filament-based artificial synapses.

Scientists believe such stacks of neural nets can be made to “learn.” For instance, when fed an input that is a handwritten ‘1,’ with an output that labels it as ‘1,’ certain output neurons will be activated by input neurons and weights from an artificial synapse. When more examples of handwritten ‘1s’ are fed into the same chip, the same output neurons may be activated when they sense similar features between different samples of the same letter, thus “learning” in a fashion similar to what the brain does.

Kim and his colleagues ran a computer simulation of an artificial neural network consisting of three sheets of neural layers connected via two layers of artificial synapses, whose properties they based on measurements from their actual neuromorphic chip. They fed the simulation tens of thousands of samples from a handwriting-recognition dataset commonly used by neuromorphic designers, and found that their neural network hardware recognized handwritten samples 95 percent of the time, compared with the 97 percent accuracy of existing software algorithms.
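The impact of device variation on such a network can be illustrated with a toy model: a single analog "neuron" whose synapse weights are scaled by a few percent of random spread, mimicking the device-to-device variation reported for the chip. This is purely illustrative, not the authors' simulation; the weights and inputs below are made up:

```python
import random

def neuron(inputs, weights):
    """One analog 'neuron': a weighted sum of inputs followed by a
    rectifying activation, as in a simple artificial neural network."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return max(0.0, total)

def perturb(weights, rel_std=0.04, rng=None):
    """Scale each synapse weight by a Gaussian factor with the given
    relative standard deviation (0.04 mimics the ~4 percent
    synapse-to-synapse variation reported above)."""
    rng = rng or random.Random(0)
    return [w * (1.0 + rng.gauss(0.0, rel_std)) for w in weights]

ideal_weights = [0.5, -0.3, 0.8]   # made-up synapse weights
x = [1.0, 2.0, 0.5]                # made-up input activations

print(neuron(x, ideal_weights))           # ideal device response
print(neuron(x, perturb(ideal_weights)))  # response with ~4% variation
```

With only a few percent of spread, the perturbed response stays close to the ideal one; with the large, unpredictable spread of amorphous-material synapses, the outputs would scatter widely, which is why uniformity matters for recognition accuracy.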

The team is in the process of fabricating a working neuromorphic chip that can carry out handwriting-recognition tasks, not in simulation but in reality. Looking beyond handwriting, Kim says the team’s artificial synapse design will enable much smaller, portable neural network devices that can perform complex computations that currently are only possible with large supercomputers.

“Ultimately we want a chip as big as a fingernail to replace one big supercomputer,” Kim says. “This opens a stepping stone to produce real artificial hardware.”

This research was supported in part by the National Science Foundation.

Conventional electronics rely on controlling electric charge. Recently, researchers have been exploring the potential for a new technology, called spintronics, that relies on detecting and controlling a particle’s spin. This technology could lead to new types of more efficient and powerful devices.

In a paper published in Applied Physics Letters, from AIP Publishing, researchers measured how strongly a charge carrier’s spin interacts with a magnetic field in diamond. This crucial property marks diamond as a promising material for spintronic devices.

Diamond is attractive because it would be easier to process and fabricate into spintronic devices than typical semiconductor materials, said Golrokh Akhgar, a physicist at La Trobe University in Australia. Conventional quantum devices are based on multiple thin layers of semiconductors, which require an elaborate fabrication process in an ultrahigh vacuum.

“Diamond is normally an extremely good insulator,” Akhgar said. But, when exposed to hydrogen plasma, the diamond incorporates hydrogen atoms into its surface. When a hydrogenated diamond is introduced to moist air, it becomes electrically conductive because a thin layer of water forms on its surface, pulling electrons from the diamond. The missing electrons at the diamond surface behave like positively charged particles, called holes, making the surface conductive.

Researchers found that these holes have many of the right properties for spintronics. The most important property is a relativistic effect called spin-orbit coupling, where the spin of a charge carrier interacts with its orbital motion. A strong coupling enables researchers to control the particle’s spin with an electric field.

In previous work, the researchers measured the strength of the holes’ spin-orbit coupling and showed that an external electric field could tune it.

In recent experiments, the researchers measured how strongly a hole’s spin interacts with a magnetic field. For this measurement, the researchers applied constant magnetic fields of different strengths parallel to the diamond surface at temperatures below 4 Kelvin. They also simultaneously applied a steadily varying perpendicular field. By monitoring how the electrical resistance of the diamond changed, they determined the g-factor. This quantity could help researchers control spin in future devices using a magnetic field.
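The g-factor quantifies how strongly a spin couples to a magnetic field: the Zeeman energy splitting between spin-up and spin-down states is ΔE = g·μ_B·B, where μ_B is the Bohr magneton. A minimal sketch of this textbook relation, using the free-electron g-factor for illustration rather than the value measured for holes on diamond:

```python
MU_B = 9.2740100783e-24  # Bohr magneton, J/T

def zeeman_splitting(g_factor, b_field_t):
    """Energy splitting (J) between spin-up and spin-down states of a
    carrier with the given g-factor in a field of b_field_t tesla."""
    return g_factor * MU_B * b_field_t

# A free electron (g ~ 2.002) in a 1 T field:
print(zeeman_splitting(2.002, 1.0))  # ~1.86e-23 J
```

Measuring the g-factor therefore tells researchers how large a magnetic field is needed to shift a spin state by a given energy, which is exactly the control knob needed for magnetic spin manipulation.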

“The coupling strength of carrier spins to electric and magnetic fields lies at the heart of spintronics,” Akhgar said. “We now have the two crucial parameters for the manipulation of spins in the conductive surface layer of diamond by either electric or magnetic fields.”

Additionally, diamond is transparent, so it can be incorporated into optical devices that operate with visible or ultraviolet light. Nitrogen-vacancy diamonds — which contain nitrogen atoms paired with missing carbon atoms in their crystal structure — show promise as a quantum bit, or qubit, the basis for quantum information technology. Being able to manipulate spin and use it as a qubit could lead to yet more devices with untapped potential, Akhgar said.

Researchers have identified a mechanism that triggers shape-memory phenomena in the organic crystals used in plastic electronics. Shape-shifting structural materials are typically made with metal alloys, but the new generation of economical printable plastic electronics is poised to benefit from this phenomenon, too. Merging shape-memory materials science with plastic electronics technology could open the door to advancements in low-power electronics, medical electronic devices and multifunctional shape-memory materials.

The findings are published in the journal Nature Communications and confirm the shape-memory phenomenon in two organic semiconductor materials.

Illinois chemistry and biomolecular engineering professor Ying Diao, right, and graduate student Hyunjoong Chung are part of a team that has identified a mechanism that triggers shape-memory in organic crystals used in plastic electronics. Credit: L. Brian Stauffer

Devices like the expandable stents that open clogged human blood vessels use shape-memory technology. Heat, light, electrical signals or mechanical force pass information through the devices, telling them to expand, contract, bend and morph back into their original form, and they can do so repeatedly, like a snake constricting to swallow its dinner. This effect works well with metals but remains elusive in synthetic organic materials because of the complexity of the molecules used to create them.

“The shape-memory phenomenon is common in nature, but we are not really sure about nature’s design rules at the molecular level,” said professor of chemical and biomolecular engineering and co-author of the study, Ying Diao. “Nature uses organic compounds that are very different from the metal alloys used in shape-memory materials on the market today,” Diao said. “In naturally occurring shape-memory materials, the molecules transform cooperatively, meaning that they all move together during shape change. Otherwise, these materials would shatter and the shape change would not be reversible and ultrafast.”

The discovery of the shape-memory mechanism in a synthetic organic material was quite serendipitous, Diao said. The team accidentally created large organic crystals and was curious to find out how they would transform when heated.

“We looked at the single crystals under a microscope and found that the transformation process is dramatically different than we expected,” said graduate student and co-author Hyunjoong Chung. “We saw concerted movement of a whole layer of molecules sweeping through the crystal that seems to drive the shape-memory effect – something that is rarely observed in organic crystals and is therefore largely unexplored.”

This unexpected observation led the team to want to explore the merger between shape-memory materials science and the field of organic electronics, the researchers said. “Today’s electronics are dependent on transistors to switch on and off, which is a very energy-intensive process,” Diao said. “If we can use the shape-memory effect in plastic semiconductors to modulate electronic properties in a cooperative manner, it would require very low energy input, potentially contributing to advancements in low-power and more efficient electronics.”

The team is currently using heat to demonstrate the shape-memory effect, but is experimenting with light waves, electric fields and mechanical force for future demonstrations. The researchers are also exploring the molecular origin of the shape-memory mechanism by tweaking the molecular structure of their materials. “We have already found that changing just one atom in a molecule can significantly alter the phenomenon,” Chung said.

The researchers are very excited about the molecular cooperativity aspect discovered with this research and its potential application to the recent Nobel Prize-winning concept of molecular machines, Diao said. “These molecules can change conformation cooperatively at the molecular level, and the small molecular structure change is amplified over millions of molecules to actuate large motion at the macroscopic scale.”

One of the big challenges in computer architecture is integrating storage, memory and processing in one unit. This would make computers faster and more energy efficient. University of Groningen physicists have taken a big step towards this goal by combining a niobium doped strontium titanate (SrTiO3) semiconductor with ferromagnetic cobalt. At the interface, this creates a spin-memristor with storage abilities, paving the way for neuromorphic computing architectures. The results were published on 22 January in Scientific Reports.

The device developed by the physicists combines the memristor effect of semiconductors with a spin-based phenomenon called tunnelling anisotropic magnetoresistance (TAMR) and works at room temperature. The SrTiO3 semiconductor has a non-volatile variable resistance when interfaced with cobalt: an electric field can be used to change it from low to high resistance and back. This is known as the electroresistance effect.

Tunability

Furthermore, when a magnetic field was applied across the same interface, in and out of the plane of the cobalt, it revealed a tunability of the TAMR spin voltage of 1.2 mV. This coexistence of both a large change in the TAMR and electroresistance across the same device at room temperature has not previously been demonstrated in other material systems.

‘This means we can store additional information in a non-volatile way in the memristor, thus creating a very simple and elegant integrated spin-memristor device that operates at room temperature’, explains Professor of Spintronics of Functional Materials Tamalika Banerjee. She works at the Zernike Institute for Advanced Materials at the University of Groningen. So far, attempts to combine spin-based storage, memory and computing have been hampered by a complex architecture in addition to other factors.

Brain

The key to the success of the Banerjee group’s device is the interface between cobalt and the semiconductor. ‘We have shown that a one-nanometre thick insulating layer of aluminium oxide makes the TAMR effect disappear’, says Banerjee. It took quite some work to engineer the interface. They did so by adjusting the niobium doping of the semiconductor and thus the potential landscape at the interface. The same coexistence can’t be realized with silicon as a semiconductor: ‘You need the heavy atoms in SrTiO3 for the spin-orbit coupling at the interface that is responsible for the large TAMR effect at room temperature.’

These devices could be used in a brain-like computer architecture. They would act like the synapses that connect the neurons. The synapse responds to an external stimulus, but this response also depends on the synapse’s memory of previous stimuli. ‘We are now considering how to create a bio-inspired computer architecture based on our discovery.’ Such a system would move away from the classical Von Neumann architecture. The big advantage is that it is expected to use less energy and thus produce less heat. ‘This will be useful for the “Internet of Things”, where connecting different devices and networks generates unsustainable amounts of heat.’
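The synapse analogy can be made concrete with a toy model: a resistance whose state persists between voltage pulses, so the device’s response depends on its history. This is a minimal illustrative sketch, not the actual Co/SrTiO3 device physics, and all names and parameter values in it are hypothetical.

```python
# Toy memristive-synapse model (illustrative only). The internal
# state is a number between 0 (fully low-resistance) and 1 (fully
# high-resistance) that persists between pulses, mimicking the
# non-volatile electroresistance described in the article.

class ToySpinMemristor:
    def __init__(self, r_low=1e3, r_high=1e5):
        self.r_low = r_low    # ohms, low-resistance limit (hypothetical value)
        self.r_high = r_high  # ohms, high-resistance limit (hypothetical value)
        self.state = 0.0      # starts in the low-resistance state

    def apply_pulse(self, voltage, strength=0.2):
        # Positive pulses push the state toward high resistance,
        # negative pulses toward low resistance; it saturates at the limits.
        if voltage > 0:
            self.state = min(1.0, self.state + strength)
        elif voltage < 0:
            self.state = max(0.0, self.state - strength)

    def resistance(self):
        # Interpolate logarithmically between the two resistance limits.
        return self.r_low * (self.r_high / self.r_low) ** self.state

m = ToySpinMemristor()
r0 = m.resistance()       # low-resistance state before any pulses
for _ in range(5):
    m.apply_pulse(+1.0)   # repeated stimuli "potentiate" the synapse
r1 = m.resistance()       # state now saturated at high resistance
```

The point of the sketch is the synapse-like behaviour: the resistance read out today depends on the pulses applied yesterday, without any power needed to hold the state.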

Energy efficiency

The physics of what exactly happens at the interface of cobalt and the strontium semiconductor is complicated, and more work needs to be done to understand this. Banerjee: ‘Once we understand it better, we will be able to improve the performance of the system. We are currently working on that. But it works well as it is, so we are also thinking of building a more complex system with such spin-memristors to test actual algorithms for specific cognition capabilities of the human brain.’ Banerjee’s device is relatively simple. Scaling it up to a full computing architecture is the next big step.

How to integrate these devices in a parallel computing architecture that mimics the working of the brain is a question that fascinates Banerjee. ‘Our brain is a fantastic computer, in the sense that it can process vast amounts of information in parallel with an energy efficiency that is far superior to that of a supercomputer.’ Banerjee’s team’s findings could lead to new architectures for brain-inspired computing.

Sometimes it pays to be two-dimensional. The merits of graphene, a 2D sheet of carbon atoms, are well established. In its wake have followed a host of “post-graphene materials” – structural analogues of graphene made of other elements like silicon or germanium.

Now, an international research team led by Nagoya University (Japan), involving Aix-Marseille University (France), the Max Planck Institute in Hamburg (Germany) and the University of the Basque Country (Spain), has unveiled the first truly planar sample of stanene: single sheets of tin (Sn) atoms. Planar stanene is hotly tipped as an extraordinary electrical conductor for high technology.

High-resolution STM image of stanene prepared on a Ag2Sn surface alloy. The honeycomb stanene structure model is superimposed. Credit: Junji Yuhara

Just as graphene differs from ordinary graphite, so does stanene behave very differently to humble tin in bulk form. Because of relatively strong spin-orbit interactions for electrons in heavy elements, single-layer tin is predicted to be a “topological insulator,” also known as a quantum spin Hall (QSH) insulator. Materials in this remarkable class are electrically insulating in their interiors, but have highly conductive surfaces/edges. This, in theory, makes a single-layered topological insulator an ideal wiring material for nanoelectronics. Moreover, the highly conductive channels at the edge of these materials can carry special chiral currents with spins locked with transport directions, which makes them also very appealing for spintronics applications.

In previous studies, where stanene was grown on substrates of bismuth telluride or antimony, the tin layers turned out to be highly buckled and relatively inhomogeneous. The Nagoya team instead chose silver (Ag) as their host – specifically, the Ag(111) crystal facet, whose lattice constant is slightly larger than that of freestanding stanene, leading to the formation of a flattened tin monolayer over a large area – a step closer to scalable industrial applications.
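The effect of that slight lattice mismatch can be quantified with the usual misfit-strain formula, epsilon = (a_substrate − a_film) / a_film. The sketch below uses illustrative placeholder lattice constants, not values reported by the team.

```python
# Misfit strain a film must accommodate to match its substrate's
# in-plane lattice. Positive values mean the film is stretched
# (tensile strain), as for stanene on a slightly larger lattice.

def misfit_strain(a_film_nm: float, a_substrate_nm: float) -> float:
    """Fractional in-plane strain: (a_sub - a_film) / a_film."""
    return (a_substrate_nm - a_film_nm) / a_film_nm

# Illustrative placeholder numbers: a substrate lattice 1% larger
# than the freestanding film gives ~1% tensile strain.
eps = misfit_strain(0.460, 0.4646)
```

A strain of this size is small enough for the film to flatten and stay bonded to the substrate rather than buckle, which is the trade-off the article describes.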

Individual tin atoms were slowly deposited onto the silver in a process known as epitaxial growth. Crucially, the stanene layer did not form directly on top of the silver surface. Instead, as shown by core-level spectroscopy, the first step was the formation of a surface alloy (Ag2Sn) between the two species. Then, another round of tin deposition produced a layer of pure, highly crystalline stanene atop the alloy. Tunneling microscopy shows striking images of a honeycomb lattice of tin atoms, illustrating the hexagonal structure of stanene.

The alloy guaranteed the flatness of the tin layer, as confirmed by density-functional theory calculations. Junji Yuhara, lead author of an article by the team published in 2D Materials, explains: “Stanene follows the crystalline periodicity of the Ag2Sn surface alloy. Therefore, instead of buckling as it would in isolation, the stanene layer flattens out – at the cost of a slight strain – to maximize contact with the alloy beneath.” This mutual stabilization between stanene and host not only keeps the stanene layers impeccably flat, but lets them grow to impressive sizes of around 5,000 square nanometers.

Planar stanene has exciting prospects in electronics and computing. “The QSH effect is rather delicate, and most topological insulators only show it at low temperatures”, according to project team leader Guy Le Lay at Aix-Marseille University. “However, stanene is predicted to adopt a QSH state even at room temperature and above, especially when functionalized with other elements. In the future, we hope to see stanene partnered up with silicene in computer circuitry. That combination could drastically speed up computational efficiency, even compared with the current cutting-edge technology.”