
The new Galaxy S4 from Samsung Electronics joins a growing trend of premium smartphones featuring enhanced active-matrix organic light-emitting diode (AMOLED) panels, spurring the market for these high-quality displays to more than double by 2017.

AMOLED display shipments for mobile handset applications are expected to grow to 447.7 million units in 2017, up from 195.1 million units in 2013, according to the IHS iSuppli Emerging Displays Service at information and analytics provider IHS. Within the mobile handset display market, the market share for AMOLED displays is forecast to grow from 7.9 percent in 2013 to 15.2 percent in 2017, as presented in the figure below. AMOLED’s market share for 4-inch or larger handset displays employed in smartphones is set to increase to 24.4 percent in 2017, up from 23.0 percent in 2013.

Figure: AMOLED display shipments for mobile handsets

“Because of their use in marquee products like the Galaxy S4, high-quality AMOLEDs are growing in popularity and gaining share at the expense of liquid crystal display (LCD) screens,” said Vinita Jakhanwal, director for mobile & emerging displays and technology at IHS. “These attractive AMOLEDs are part of a growing trend of large-sized, high-resolution displays used in mobile devices. With the S4 representing the first time that a full high-definition (HD) AMOLED has been used in mobile handsets, Samsung continues to raise the profile of this display technology.”

AMOLED on display in the S4

For its new premium smartphone, Samsung Display—the AMOLED display supplier for Samsung Electronics—increased the AMOLED pixel format to 1920 x 1080 Full High Definition (Full HD), up from the 1280 x 720 HD format in the Galaxy S III. Part of Samsung’s popular Galaxy line, the S4 joins several high-end smartphone models from other manufacturers that also feature 1920 x 1080 resolution, but it is distinguished by an important difference: the other handsets use thin-film transistor LCD (TFT-LCD) displays, while the Galaxy S4 is the first Full HD smartphone utilizing an AMOLED display.

Among the handsets with 1920 by 1080 TFT-LCD panels are the 4.8-inch HTC One, the 5.0-inch Sony Xperia Z, the 5.0-inch ZTE Grand S, the 5.0-inch OPPO Find, the 5.5-inch LG Optimus G Pro and the 5.5-inch Lenovo Ideaphone K900.

Samsung tackles technical issues

The high-resolution mobile handset display market is currently dominated by Low-Temperature Polysilicon (LTPS) TFT-LCDs, which account for the entire Full HD mobile handset display market.

Reaching the high-resolution point with true pixel densities greater than 300 pixels per inch (ppi) has been a challenge for AMOLED displays, as it is difficult to achieve dense pixel arrangements using the conventional Fine Metal Mask process while still securing enough display brightness and not compromising power consumption.

Samsung Display, however, was able to enhance AMOLED display performance by implementing two new technologies in addition to its existing Fine Metal Mask process. The maker succeeded in increasing the light-emitting area in AMOLED panels with a new PenTile matrix structure, and it used a phosphorescent material for the green subpixels, allowing better light management and lower energy consumption.

As a result, the AMOLED display was able to achieve a denser pixel arrangement, boosting its pixel density to greater than 400 ppi and yielding the 1920 x 1080 Full HD display in the Galaxy S4. This compares to 1280 x 720 HD in the Galaxy S III, 800 x 480 WVGA in the Galaxy S II and 1280 x 800 WXGA in the Galaxy Note. The higher pixel density provides sharper and more defined images, while being able to display more content in a smaller display area.
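The pixel-density figures quoted here follow from simple geometry: ppi is the diagonal resolution in pixels divided by the diagonal size in inches. A minimal sketch in Python, assuming a roughly 5-inch diagonal for the S4 panel and 4.8 inches for the Galaxy S III (panel sizes used here for illustration, not taken from the IHS report):

    import math

    def pixels_per_inch(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixel density = diagonal resolution in pixels / diagonal size in inches."""
        diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
        return diagonal_px / diagonal_in

    print(round(pixels_per_inch(1920, 1080, 5.0)))  # ~441 ppi, consistent with "greater than 400 ppi"
    print(round(pixels_per_inch(1280, 720, 4.8)))   # ~306 ppi for a Galaxy S III-class panel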

Samsung also implemented the Floating Touch system in the Galaxy S4, allowing users to interact with the touch screen by letting their fingers hover a short distance above the display. By combining mutual capacitance (conventional on-surface touch sensing) with self-capacitance, the Floating Touch in the Galaxy S4 expands the user experience of the display. It also detects touch inputs from gloved hands, a feature first introduced on Nokia’s Lumia 920 in 2012.

The Galaxy S4 will be the first Full HD AMOLED display offering in the market. However, material lifetime, color balance and a limited supplier base still need to be addressed for a larger market presence of OLEDs and stronger competitiveness against LTPS TFT-LCDs.

North America-based manufacturers of semiconductor equipment posted $1.07 billion in orders worldwide in February 2013 (three-month average basis) and a book-to-bill ratio of 1.10, according to the February Book-to-Bill Report published today by SEMI.  A book-to-bill of 1.10 means that $110 worth of orders were received for every $100 of product billed for the month.
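For illustration, the ratio is simply the three-month average of bookings divided by the three-month average of billings; a quick sketch using the figures reported above (this is arithmetic on published numbers, not SEMI’s full methodology):

    def book_to_bill(bookings_3mo_avg_musd: float, billings_3mo_avg_musd: float) -> float:
        """Ratio of three-month average worldwide bookings to billings, both in millions of USD."""
        return bookings_3mo_avg_musd / billings_3mo_avg_musd

    # February 2013: $1.07 billion in orders, $975.3 million in billings
    print(f"book-to-bill = {book_to_bill(1070.0, 975.3):.2f}")  # 1.10, i.e. $110 of orders per $100 billed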

The three-month average of worldwide billings in February 2013 was $975.3 million. The billings figure is 0.8 percent higher than the final January 2013 level of $968.0 million, and is 26.3 percent lower than the February 2012 billings level of $1.32 billion.

Figure: North American semiconductor equipment book-to-bill ratio, February 2013

“Three-month average bookings and billings posted by North American semiconductor equipment providers remain above parity and consistent with prior-month levels,” said Denny McGuirk, president and CEO of SEMI. “We expect modest investment by semiconductor makers in the first half of the year with foundry and advanced packaging technology among the near-term spending drivers.”

The SEMI book-to-bill is a ratio of three-month moving averages of worldwide bookings and billings for North American-based semiconductor equipment manufacturers. Billings and bookings figures are in millions of U.S. dollars.

The data was compiled by David Powell, Inc., an independent financial services firm, without audit, from data submitted directly by the participants. SEMI and David Powell, Inc. assume no responsibility for the accuracy of the underlying data.

The data is contained in a monthly Book-to-Bill Report published by SEMI. The report tracks billings and bookings worldwide of North American-headquartered manufacturers of equipment used to manufacture semiconductor devices, not billings and bookings of the chips themselves. The Book-to-Bill report is one of three reports included with the Equipment Market Data Subscription (EMDS).

SEMI is the global industry association serving the nano- and micro-electronic manufacturing supply chains. SEMI maintains offices in Bangalore, Beijing, Berlin, Brussels, Grenoble, Hsinchu, Moscow, San Jose, Seoul, Shanghai, Singapore, Tokyo, and Washington, D.C.

 

With the introduction of the Galaxy S4, Samsung Electronics continues to lead the market in the adoption of pressure sensors in smartphones, paving the way for massive growth in the market for these devices in the coming years.

Global shipments of microelectromechanical system (MEMS) pressure sensors in cellphones are set to rise to 681 million units in 2016, up more than eightfold from 82 million in 2012, according to the IHS iSuppli MEMS & Sensors Service at information and analytics provider IHS (NYSE: IHS). Shipments this year are expected to double to 162 million units, as presented in the attached figure, primarily due to Samsung’s usage of pressure sensors in the Galaxy S4 and other smartphone models.

“Samsung is the only major original equipment manufacturer (OEM) now using pressure sensors in all its flagship smartphone models,” said Jérémie Bouchaud, director and senior principal analyst for MEMS and sensors at IHS. “The company appears to be slightly ahead of its time in its adoption of pressure sensors, even though the most compelling application—indoor navigation—is still not ready for deployment. However, Samsung seems to want to anticipate the start of this market and get a jump on the competition for pressure sensors. The pressure device represents just one component among a wealth of different sensors used in the S4.”

Pressure’s rising

Besides Samsung, few other OEMs have been using pressure sensors in smartphones: Sony Mobile used them in a couple of models in 2012, as have a few Chinese vendors, such as Xiaomi.

Apple Inc., which pioneered the use of MEMS sensors in smartphones, does not currently employ pressure sensors in the iPhone. However, IHS expects Apple will start using them in 2014, which will contribute to another doubling of the market in 2014, to 325 million units.

Applying pressure

Although pressure sensors aren’t very useful currently in smartphones, they hold strong potential for the future.

The most interesting application now is the fast Global Positioning System (GPS) lock, wherein the GPS chipset can lock on to a satellite signal and calculate positions more quickly by using the pressure sensor to determine the smartphone’s altitude.

However, the most exciting use for pressure sensors in the future will be indoor navigation, an area with massive potential growth in retail and travel applications. Pressure sensors will provide the floor accuracy required to determine which level a user is on within a structure.
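A pressure sensor resolves altitude because atmospheric pressure falls predictably with height; the standard-atmosphere barometric formula below is one common approximation (a generic sketch with hypothetical readings, not the algorithm used in any particular handset):

    def pressure_to_altitude_m(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
        """Approximate altitude in meters from barometric pressure,
        using the international barometric formula for a standard atmosphere."""
        return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

    # Near sea level, one floor (~3 m) corresponds to only ~0.35 hPa, which is why
    # a fine-resolution MEMS pressure sensor is needed to resolve floors.
    ground = pressure_to_altitude_m(1013.25)      # reference reading
    upstairs = pressure_to_altitude_m(1012.20)    # hypothetical reading a few floors up
    print(round(upstairs - ground, 1))            # ~8.7 m, i.e. roughly three floors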

While the ecosystem is not yet fully in place for indoor location/navigation, IHS anticipates this market will reach a breakthrough in growth during the next 12 to 18 months.

By this time, Samsung will have a considerable lead over Apple and other competitors in the installed base of pressure sensors in smartphones.

Samsung takes lead in smartphone MEMS sensors

Although Apple pioneered the usage of MEMS sensors in smartphones and was the top consumer of these devices for many years, Samsung in 2012 took the lead from Apple for the first time. With Samsung expected to maintain its hegemony in smartphone shipments in 2013, and with the company loading up on the number of MEMS and other sensors in each smartphone that it ships, its lead in this area is likely to continue to grow.

Given its emphasis on detecting and adapting to consumer lifestyles, the Galaxy S4 integrates a wealth of different sensors, including the accelerometer, RGB light, geomagnetic, proximity, gyroscope, barometer, gesture and even temperature and humidity varieties.

Sensor suppliers

While IHS has not yet conducted a physical teardown of the Galaxy S4, the IHS iSuppli MEMS and Sensors Service is able to anticipate the likely suppliers of these devices for the smartphone.

The pressure sensor in the S4 is made either by STMicroelectronics, as it was in the Galaxy S III, or by Bosch, as in the Galaxy Note 1 and 2. These two companies are the only mass producers of these devices for handsets today.

And just as in the Samsung Galaxy S III, STMicroelectronics and yet another supplier, InvenSense, are expected to share the supply of the S4’s inertial measurement unit (IMU), which combines the accelerometer and gyroscope.

Meanwhile, the S4’s compass could be supplied by any one of three companies: AKM, the supplier for the Galaxy S III; Yamaha, which supplied a previous member of the Galaxy smartphone line; or Alps, an up-and-coming manufacturer in this area.

Maximum RGB

IHS expects that Samsung will continue to use an RGB sensor in the S4, as part of a combo device that aggregates an RGB sensor, a proximity sensor and an IR LED emitter, as it did in the Galaxy Note 2 and the Galaxy S III. Samsung was the only user of such combo sensors in smartphones in 2012.

If the RGB sensor is installed beside the S4 display, it will be used to sense the color temperature of the ambient light and adapt the contrast and colors on the display to enhance the viewing experience. Such RGB sensors are useful for high-end displays. Since the Galaxy S4 is expected to have a full high-definition display—unlike the S III—the added value of having an RGB sensor might be more obvious and noticeable in the S4.

The RGB sensor also could be installed on the back of the Galaxy S4 in conjunction with the camera module. This can help in taking better pictures by correcting the white balance.
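As a generic illustration of how ambient color data can feed white-balance correction (a sketch of the simple gray-world method, not Samsung’s actual image pipeline):

    def gray_world_gains(avg_r: float, avg_g: float, avg_b: float):
        """Per-channel gains that equalize the average channel responses
        (the gray-world assumption: the scene should average to neutral gray)."""
        mean = (avg_r + avg_g + avg_b) / 3.0
        return mean / avg_r, mean / avg_g, mean / avg_b

    def apply_white_balance(pixel, gains):
        """Scale an (R, G, B) pixel by the gains, clipping to 8-bit range."""
        return tuple(min(255, round(c * g)) for c, g in zip(pixel, gains))

    # Hypothetical ambient-light readings under warm (red-heavy) indoor lighting:
    gains = gray_world_gains(avg_r=180.0, avg_g=150.0, avg_b=110.0)
    print(apply_white_balance((200, 160, 120), gains))  # roughly neutral after correction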

Capella Microsystems is likely to be the RGB supplier, just as in the Galaxy S III. Other potential suppliers are ams-TAOS, Maxim and Hamamatsu.

Cisco Systems is preparing for a major shift in the industry, as the Internet of Things starts to become a reality. At an annual press event in San Jose, California this week, Cisco officials claimed that the much-anticipated IoT industry could be a $14 trillion opportunity, and they are ready to embrace the change.

Rob Lloyd, president of sales and development at Cisco, told the press that he believes as many as 50 billion devices will be connected to the Internet by 2020, from which, he believes, the $14 trillion business opportunity will stem. The trend will create business opportunities initially in manufacturing, but extend into government, energy and health care, he said, as sensors will become part of traffic systems, hospitals, refineries and other civil and business infrastructures. These opportunities will extend far beyond today’s budgets for computer and communication systems.

It is an ambitious plan for Cisco, though some might recall that Cisco’s CEO has announced similar plans before. Last year, John Chambers, Cisco’s chairman and chief executive, told the press that he expects the company to experience a shift in customers, handling projects for governments and large businesses such as designing and managing systems for clean water or efficient traffic.

“The first 10 years (of the commercial Internet) were really about transactions, and the last 10 were about interactions,” Padmasree Warrior, Cisco’s chief technology and strategy officer, told the press this week. “The next 10 is about processes being more efficient.”

However, the IoT space already presents plenty of challenges. Cisco is working with utilities worldwide in the hope that 10 million smart meters supporting IoT protocols will be deployed by the end of the year. Cisco has already deployed about $180 billion worth of network equipment into the world, Warrior said, and will build hardware and software that interacts efficiently with the legacy gear, so new kinds of intelligent systems can be quickly deployed.

What do you know about the Internet of Things? Do you think it’s all hype or a real opportunity? Let us know what you think in the comment section below.

 

Seven O-S-D product categories and device groups reached record-high sales in 2012, compared to 14 new records set in 2011, according to data shown in the 2013 edition of IC Insights’ O-S-D Report, A Market Analysis and Forecast for Optoelectronics, Sensors/Actuators, and Discretes. Figure 1 shows that in 2012, two sales records were achieved in optoelectronics, four in sensors/actuators (including total sensor sales), and one in discretes. Ten new sales records are expected to be set in the O-S-D markets in 2013. All the products shown in Figure 1 are forecast to grow by moderate percentages in 2013, which will lift them again to new record-high levels. Total sales of MEMS-based products are expected to rise 9% in 2013 and reach a new annual record of $7.6 billion, surpassing the current peak of $7.1 billion set in 2011.

Figure 1. O-S-D products with record sales in 2012

With sales in the much larger IC segment falling 4% in 2012, O-S-D’s share of total semiconductor revenues grew to 19% in 2012 versus 18% in 2011 and 14% in 2002.  O-S-D’s marketshare of total semiconductor sales in 2012 was the highest it’s been since 1991.

Key findings and forecasts in the 2013 O-S-D Report include:

CMOS image sensors were the fastest growing O-S-D product category in 2012 with sales rising 22% to a new record-high $7.1 billion, blowing past the previous peak of $5.8 billion set in 2011. Since the 2009 downturn year, CMOS image sensor sales have climbed 85% due to the strong growth of embedded cameras used in smartphones and portable computers (including tablets) and the expansion of digital imaging into more systems applications. CMOS designs are now grabbing large chunks of marketshare from CCD image sensors, which are forecast to see revenues decline by a CAGR of 2.4% between 2012 and 2017.  Sales of CMOS imaging devices are projected to grow by a CAGR of about 12.0% in the forecast period and account for 85% of the total image sensor market versus 15% for CCDs in 2017.  This compares to a 60/40 split in 2009.

High-brightness LED revenues climbed 20% in 2012 to nearly $9.5 billion and are expected to hit the $20.0 billion level in 2017, with annual sales growing by a CAGR of 16% in the next five years. That’s the good news, but of immediate concern is whether new solid-state lighting applications are growing fast enough to consume the large amounts of production capacity being added worldwide in LED wafer fabs—especially in China.  Solid-state lighting’s main growth engine in recent years—backlighting in LCD televisions and computer screens—is slowing, and the multi-billion dollar question is whether the next wave of applications (e.g., LED light bulbs, new interior and exterior lighting systems, digital signs and billboards, automotive headlamps, long-lasting street lights, and other uses) can keep the industry ahead of a potential glut in high-brightness lamp devices.

About 81% of the sensor/actuator market’s sales in 2012 came from semiconductor products built with MEMS technology.  Sensors accounted for 52% of MEMS-based device sales in 2012, while actuators were 48% of the total.   A 10% drop in actuator sales in 2012 lowered total revenues for MEMS-based devices to $7.0 billion from the current peak of $7.1 billion in 2011.  By 2017, MEMS-based sensors and actuators are projected to reach $13.5 billion in sales, which will be a CAGR increase of 14.0% from 2012, and unit shipments are expected to grow by a CAGR of 17.4% in the next five years to 9.7 billion devices.  MEMS manufacturing continues to move into the mainstream IC foundry segment, which will open more capacity to fabless companies and larger suppliers. TSMC, GlobalFoundries, UMC, and SMIC all have increased investments to expand their presence in MEMS production using 200mm wafers.

Among the strongest growth drivers covered in the O-S-D Report are: high-brightness LEDs for solid-state lighting applications; laser transmitters for high-speed optical networks; MEMS-based acceleration/yaw sensors for highly adaptive embedded control in cellphones, tablet computers, and consumer products; CMOS imaging devices for automobiles, machine vision, medical, and new human-recognition interfaces; and a range of power transistors for energy-saving electronics and battery management.

 Now in its eighth annual edition, the 2013 O-S-D Report contains a detailed forecast of sales, unit shipments, and selling prices for more than 30 individual product types and categories through 2017.
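The compound annual growth rates quoted in the findings above can be reproduced directly from the start and end values; a quick check in Python using the report’s own figures:

    def cagr(start: float, end: float, years: int) -> float:
        """Compound annual growth rate between two values over a number of years."""
        return (end / start) ** (1.0 / years) - 1.0

    # MEMS-based sensors and actuators: $7.0 billion (2012) to $13.5 billion (2017)
    print(f"{cagr(7.0, 13.5, 5):.1%}")   # ~14.0%
    # High-brightness LEDs: ~$9.5 billion (2012) to ~$20.0 billion (2017)
    print(f"{cagr(9.5, 20.0, 5):.1%}")   # ~16.1%, in line with the 16% cited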

 

In the 1970s, the semiconductor industry was vertically integrated. Most companies were IDMs with manufacturing, design, intellectual property (IP) and marketing activities. During this period, manufacturing technology evolved rapidly and required new fabs, which directly increased capital expenditures. Companies desired an attractive return on invested capital (ROIC), and to obtain it some developed a new, lower-investment business model by offering manufacturing services as part of their portfolio.

In the 1980s, the first pure-play foundries emerged — and ten years later, the fabless business model was born. The rest, as they say, is history; according to GSA, in 2011 there were 1,800 fabless companies worldwide, covering a variety of sectors.

In the power electronics field, the fabless business model is not as common as it is in the MEMS industry; most power electronics players have their own manufacturing capabilities and fabs, dedicated mainly to silicon wafer manufacturing. According to Yole Développement, $4B was generated by MEMS fabless companies in 2012, against less than $300K in the power electronics area. Is power electronics a world apart?

“Less than 10 companies have been clearly identified in the Power Electronics industry. This trend is clearly linked to the introduction of new materials like GaN and SiC wafers in manufacturing technologies. New products already commercialized, for example photovoltaic inverters, use these new materials. Yole Développement is currently analyzing the Power Electronics industry in order to understand what the next step will be,” explains Alexandre Avron, Power Electronics Technology & Market Analyst at Yole Développement.

Indeed, for a long time the power electronics field and its key players only considered silicon wafers. Today, however, the power semiconductor industry is entering a new era: for the first time, power electronics companies are developing new solutions based on “non-silicon” manufacturing technologies. This evolution is not without big investments, though, and in order to limit them, some companies have decided to become fabless and collaborate with large fabs to produce the necessary components.

The truth is that power electronics is not a world apart, and that the fabless business model has just become a reality in the field. It represents a real opportunity for power electronics companies to introduce new components and embrace the technology evolution.

For the second straight year, Yole Développement and Serma have joined forces to organize the Successful Semiconductor Fabless conference, a unique European event dedicated to the fabless business model. This event takes place in Paris, from April 10 to 12.

The steady increase in PC capabilities that has justified the upgrade cycle and fueled the long-term growth of the PC market is undergoing a historical deceleration, as evidenced by the slowing increase in dynamic random access memory (DRAM) content in notebooks and desktops since 2007.

Annual growth in the average DRAM usage per shipped PC has been slowing dramatically since peaking in 2007, according to an IHS iSuppli DRAM Dynamics Market Brief from information and analytics provider IHS. Following a 21.4% increase in 2012, the average growth of DRAM content per PC will decline to a record low of 17.4% this year, as presented in the attached figure. This compares to the high point of 56.1% in 2007, and 49.9% in 2008.

“For a generation, PCs have steadily improved their hardware performance and capabilities every year, with faster microprocessors, rising storage capacities and major increases in DRAM content,” said Clifford Leimbach, memory analyst at IHS. “These improvements—largely driven by rising performance demands of new operating system software—have justified the replacement cycle for PCs, compelling consumers and businesses to buy new machines to keep pace. However, on the DRAM front, the velocity of the increase has slackened. This slowdown reflects the maturity of the PC platform as well as a change in the nature of notebook computers as OEMs adjust to the rise of alternative systems—namely smartphones and media tablets.”

The growth in DRAM loading in PCs is expected to remain in a low range in the coming years, rising by 21.3% in 2014 and then continuing in the 20.0% range until at least 2016.

Notebooks slim down on DRAM

Notebooks increasingly are adopting ultrathin form factors and striving to increase battery life in order to become more competitive with popular media tablets. Because of this, DRAM chips must share limited space on the PC motherboard with other semiconductors that control the notebook’s other functions. Incorporating more DRAM bits can limit other notebook capabilities.

Notebook makers have shown a willingness to limit the increase in DRAM in their systems rather than sacrifice the thin form factor or eschew other features.

Desktops feel their age

For desktops, the slowing in DRAM bit growth reflects the maturity of PC hardware and operating system software.

DRAM has become less of a bottleneck in PC performance, tempering the need to increase DRAM bits in each system to ostensibly improve system speed.

Moreover, a change in PC operating system requirements has had the effect of limiting growth in DRAM loading. The latest version of Windows, in particular, has not required a step up in DRAM content, unlike previous Windows versions, where increased DRAM loading was explicitly required for desktops to attain the optimal performance that came with a new OS.

Post-PC era realities

“All told, PCs no longer need to add DRAM content as much as they did in the previous times, when failure to increase memory content in either desktops or laptops could have resulted in a direct impediment to performance,” Leimbach said. “The new normal now calls for a different state of affairs, in which DRAM PC loading won’t be growing at the same rates seen in past years.”

PCs historically have dominated DRAM consumption. However, starting in the second quarter of 2012, PCs accounted for less than half of all DRAM shipments—the first time in a generation that they didn’t consume 50 percent or more of the leading type of semiconductor memory. This is partly due to slowing shipment growth for PCs, combined with the deceleration in DRAM loading growth.

The development also illustrates the diminishing dominion of PCs in the electronics supply chain—and represented another sign of the post-PC era.

“The arrival of the post-PC era doesn’t mean that people will stop using personal computers, or even necessarily that the PC market will stop expanding,” Leimbach said. “What the post-PC era does mean is that personal computers are not at the center of the technology universe anymore—and are seeing their hegemony over the electronics supply chain erode. PCs are no longer generating the kind of growth and overwhelming market size that can single-handedly drive demand, pricing and technology trends in DRAM and many other major technology businesses.”

The first experimental observation of a quantum mechanical phenomenon that was predicted nearly 70 years ago holds important implications for the future of graphene-based electronic devices. Working with microscopic artificial atomic nuclei fabricated on graphene, a collaboration of researchers led by scientists with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley has imaged the “atomic collapse” states theorized to occur around super-large atomic nuclei.

“Atomic collapse is one of the holy grails of graphene research, as well as a holy grail of atomic and nuclear physics,” says Michael Crommie, a physicist who holds joint appointments with Berkeley Lab’s Materials Sciences Division and UC Berkeley’s Physics Department. “While this work represents a very nice confirmation of basic relativistic quantum mechanics predictions made many decades ago, it is also highly relevant for future nanoscale devices where electrical charge is concentrated into very small areas.”

Crommie is the corresponding author of a paper describing this work in the journal Science. The paper is titled “Observing Atomic Collapse Resonances in Artificial Nuclei on Graphene.”  Co-authors are Yang Wang, Dillon Wong, Andrey Shytov, Victor Brar, Sangkook Choi, Qiong Wu, Hsin-Zon Tsai, William Regan, Alex Zettl, Roland Kawakami, Steven Louie, and Leonid Levitov.

Originating from the ideas of quantum mechanics pioneer Paul Dirac, atomic collapse theory holds that when the positive electrical charge of a super-heavy atomic nucleus surpasses a critical threshold, the resulting strong Coulomb field causes a negatively charged electron to populate a state where the electron spirals down to the nucleus and then spirals away again, emitting a positron (a positively charged electron) in the process. This highly unusual electronic state is a significant departure from what happens in a typical atom, where electrons occupy stable circular orbits around the nucleus.

 “Nuclear physicists have tried to observe atomic collapse for many decades, but they never unambiguously saw the effect because it is so hard to make and maintain the necessary super-large nuclei,” Crommie says. “Graphene has given us the opportunity to see a condensed matter analog of this behavior, since the extraordinary relativistic nature of electrons in graphene yields a much smaller nuclear charge threshold for creating the special supercritical nuclei that will exhibit atomic collapse behavior.”

Perhaps no other material is currently generating as much excitement for new electronic technologies as graphene, sheets of pure carbon just one atom thick through which electrons can freely race 100 times faster than they move through silicon. Electrons moving through graphene’s two-dimensional layer of carbon atoms, which are arranged in a hexagonally patterned honeycomb lattice, perfectly mimic the behavior of highly relativistic charged particles with no mass. Superthin, superstrong, superflexible, and superfast as an electrical conductor, graphene has been touted as a potential wonder material for a host of electronic applications, starting with ultrafast transistors.

In recent years scientists predicted that highly-charged impurities in graphene should exhibit a unique electronic resonance – a build-up of electrons partially localized in space and energy – corresponding to the atomic collapse state of super-large atomic nuclei. Last summer Crommie’s team set the stage for experimentally verifying this prediction by confirming that graphene’s electrons in the vicinity of charged atoms follow the rules of relativistic quantum mechanics. However, the charge on the atoms in that study was not yet large enough to see the elusive atomic collapse.

“Those results, however, were encouraging and indicated that we should be able to see the same atomic physics with highly charged impurities in graphene as the atomic collapse physics predicted for isolated atoms with highly charged nuclei,” Crommie says. “That is to say, we should see an electron exhibiting a semiclassical inward spiral trajectory and a novel quantum mechanical state that is partially electron-like near the nucleus and partially hole-like far from the nucleus. For graphene we talk about ‘holes’ instead of the positrons discussed by nuclear physicists.”

Non-relativistic electrons orbiting a subcritical nucleus exhibit the traditional circular Bohr orbit of atomic physics. But when the charge on a nucleus exceeds the critical value, Zc, the semiclassical electron trajectory is predicted to spiral in toward the nucleus, then spiral away, a novel electronic state known as “atomic collapse.” Artificial nuclei composed of three or more calcium dimers on graphene exhibit this behavior as graphene’s electrons move in the supercritical Coulomb potential.
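For readers who want the criterion in symbols, the supercriticality condition can be sketched as follows (standard results from the graphene literature, stated here as background rather than as figures from the Science paper):

    % In QED, a point nucleus of charge Ze becomes supercritical roughly when
    %   Z \alpha \gtrsim 1  (pushed to Z \approx 170 for realistic, finite-size nuclei).
    % In graphene, carriers are massless Dirac fermions moving at the Fermi velocity
    % v_F \approx c/300, so the dimensionless coupling of a charged impurity,
    % screened by the substrate dielectric constant \kappa, is
    \[
      \beta = \frac{Z e^{2}}{\kappa\,\hbar v_{F}}
            = Z\,\alpha\,\frac{c}{v_{F}}\,\frac{1}{\kappa},
    \]
    % and atomic-collapse resonances are expected once \beta exceeds the critical value
    \[
      \beta_{c} = \tfrac{1}{2}.
    \]
    % Because \alpha\,(c/v_F) \approx 2, a total charge of only a few electron charges,
    % such as a small cluster of Ca dimers, can already be supercritical in graphene.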

To test this idea, Crommie and his research group used a specially equipped scanning tunneling microscope (STM) in ultra-high vacuum to construct, via atomic manipulation, artificial  nuclei on the surface of a gated graphene device. The “nuclei” were actually clusters made up of pairs, or dimers, of calcium ions. With the STM, the researchers pushed calcium dimers together into a cluster, one by one, until the total charge in the cluster became supercritical. STM spectroscopy was then used to measure the spatial and energetic characteristics of the resulting atomic collapse electronic state around the supercritical impurity.

“The positively charged calcium dimers at the surface of graphene in our artificial nuclei played the same role that protons play in regular atomic nuclei,” Crommie says. “By squeezing enough positive charge into a sufficiently small area, we were able to directly image how electrons behave around a nucleus as the nuclear charge is methodically increased from below the supercritical charge limit, where there is no atomic collapse, to above the supercritical charge limit, where atomic collapse occurs.”

Observing atomic collapse physics in a condensed matter system is very different from observing it in a particle collider, Crommie says. Whereas in a particle collider the “smoking gun” evidence of atomic collapse is the emission of a positron from the supercritical nucleus, in a condensed matter system the smoking gun is the onset of a signature electronic state in the region nearby the supercritical nucleus. Crommie and his group observed this signature electronic state with artificial nuclei of three or more calcium dimers.

“The way in which we observe the atomic collapse state in condensed matter and think about it is quite different from how the nuclear and high-energy physicists think about it and how they have tried to observe it, but the heart of the physics is essentially the same,” says Crommie.

If the immense promise of graphene-based electronic devices is to be fully realized, scientists and engineers will need to achieve a better understanding of phenomena such as this that involve the interactions of electrons with each other and with impurities in the material.

“Just as donor and acceptor states play a crucial role in understanding the behavior of conventional semiconductors, so too should atomic collapse states play a similar role in understanding the properties of defects and dopants in future graphene devices,” Crommie says. “Because atomic collapse states are the most highly localized electronic states possible in pristine graphene, they also present completely new opportunities for directly exploring and understanding electronic behavior in graphene.”

In addition to Berkeley Lab and UC Berkeley, other institutions represented in this work include UC Riverside, MIT, and the University of Exeter.

Berkeley Lab’s work was supported by DOE’s Office of Science.  Other members of the research team received support from the Office of Naval Research and the National Science Foundation. Computational resources were provided by DOE at Berkeley Lab’s NERSC facility.

The photonics industry gathered in Washington, D.C., to engage in a discussion about a national photonics initiative.

More than 100 representatives from government and the photonics industry convened in Washington, D.C., on February 28 to identify focus areas for a national photonics initiative (NPI), engaging academia, industry, and government in a collaboration to address barriers to continued U.S. leadership in photonics.

Titled “Optics & Photonics: Lighting A Path for the Future,” the event was organized by SPIE, the international society for optics and photonics, in partnership with four other technical organizations. The meeting included briefings by subcommittees and industry representatives on future needs, and perspectives of technology experts from the five key optics and photonics sectors — communication, defense, health and medicine, manufacturing, and energy –on how focus ideas for the NPI.

Recommendations are expected to be released later this month.

Establishment of the NPI was a key recommendation of the groundbreaking National Academy of Sciences report “Optics and Photonics: Essential Technologies for Our Nation,” released in August 2012.

Last week’s event was attended by representatives of numerous government labs and agencies, such as the Department of Energy, the National Institute of Standards and Technology, DARPA, the National Science Foundation, the Office of Naval Research, and NASA. Industry representatives included attendees from Corning, Agilent, Northrop Grumman, Alcatel-Lucent, and IBM.

Speakers touched on issues such as decreasing numbers of U.S. STEM (science, technology, engineering, and mathematics) graduates for the next generation of the workforce, the increased investment by other national governments in science and technology, and the lack of a cohesive photonics R&D direction in the U.S. in the face of well-defined initiatives in several other countries.

Read more on Europe’s plans for a single semiconductor strategy

Without a cohesive policy in support of photonics advances, speakers warned, the U.S. will slip from its place of technology leadership, manufacturing will continue to shift outside the U.S., and forward progress in photonics-enabled applications in medicine, cybersecurity, broadband, bridge and highway infrastructure safety, and other areas will be impaired.

“Photonics is a critical enabler for our high-tech economy,” said Paul McManamon, one of several members of the committee that produced the report who attended last week’s event. “The Internet, MRIs and CAT scans, and space mission spin-offs such as optical blood diagnostic instruments and infrared cameras that indicate hot spots in a fire are just a few examples of photonics-enabled applications. If the U.S. wants to retain high-tech leadership and jobs, we need the National Photonics Initiative.”

Committee members Alan Willner, Tom Baer, and Edward White also attended and participated in a panel discussion.

Along with SPIE, sponsoring organizations included the Optical Society (OSA), IEEE Photonics Society, American Physical Society, and the Laser Institute of America.

SPIE is the international society for optics and photonics, a not-for-profit organization founded in 1955 to advance light-based technologies. The Society serves nearly 225,000 constituents from approximately 150 countries, offering conferences, continuing education, books, journals, and a digital library in support of interdisciplinary information exchange, professional networking, and patent precedent. SPIE provided over $3.2 million in support of education and outreach programs in 2012.

Following a healthy expansion in 2012, the growth of the global automotive semiconductor market will decelerate slightly this year because of a slowdown in the aftermarket and portable navigation device (PND) segments.

Total semiconductor revenue in 2013 derived from automotive infotainment will reach $6.67 billion, up 3% from $6.48 billion, according to an IHS Automotive Market Tracker Report from information and analytics provider IHS. Growth this year will be lower than last year’s approximately 4% increase, but an acceleration is expected next year and beyond, with revenue growth of 3 to 7% each year during the next five years. By 2018, automotive infotainment semiconductor revenue worldwide will amount to $8.54 billion, as shown in the figure below.

Driving into the future

“Despite relatively soft growth this year, the automotive infotainment semiconductor market is set for continued expansion well into the future—fueled by major technology improvements that not only increase the functionality of cars but also improve the overall driving experience,” said Luca DeAmbroggi, senior analyst for automotive infotainment at IHS. “The muted growth this year is the result of decreased revenue in the aftermarket sector, where sales are depressed because cars are being sold with more complete infotainment features and systems, reducing the need for consumers to make upgrades. The progress of the market also is being slowed by a continuing decline in the PND segment, as motorists increasingly turn away from dedicated navigation devices and toward smartphone-based solutions.”

The drop in aftermarket and PND sales will eat into gains made in semiconductor sales to OEMs, which will rise a projected 6% from 2012 to $4.5 billion this year.

However, signals of inventory burnout that started in the fourth quarter of 2012 are also expected to dampen semiconductor production revenue in the first half of 2013.

No dip in the road for automotive infotainment

Despite the reduced speed ahead, the automotive infotainment market overall remains immune to a downturn, unlike other markets that have been negatively affected by global economic uncertainties. The importance of automotive infotainment continues to increase as consumers clamor for built-in connectivity and telematics in cars, which now have become a major selling point of new vehicles. Used either alone or with mobile devices like smartphones and tablets, the infotainment systems in cars then allow occupants to access information, safety features and entertainment options at will, paving the way for a more seamless interaction with the outside world.

In the long run, however, technology changes in cars will not just be associated with new features and hardware integration into the vehicle, but will also be influenced by new hardware strategies.

For instance, automotive infotainment systems are quickly developing toward a PC-like architectural approach in which more functionality is dependent on a powerful main central unit, IHS Automotive believes. This means that software will acquire greater importance as a differentiator among brands seeking to make their infotainment products and features stand out. Applications previously implemented via hardware will be reconfigured instead into simpler programs reliant on a heavily centralized unit marked by strong processing power and memory capabilities.

Infotainment for the masses

On a semiconductor level, growth will be fostered not just by the implementation of more infotainment features in a vehicle, but also by broader technology diffusion among various vehicle segments—trickling from high-end luxury vehicles all the way down to entry-level models. Government regulations and mandates, including those relating to electronic stability control or tire-pressure monitoring, will also help boost semiconductor growth.

Healthy growth to occur in various infotainment segments

Within the automotive infotainment market, PNDs will be the only segment to decline in the coming years. Shipments of the once-popular devices will fall from 33.6 million units last year to 24.0 million by 2018. Meanwhile, the combined market this year for PND-related analog and logic application-specific standard product (ASSP) integrated circuits will be down 18 percent on the year to less than $330 million.

In contrast to PNDs, growth is forecast to take place in various other automotive infotainment segments, including in-dash navigation systems, connectivity in head units, telematics, and both satellite and terrestrial digital radio.

In-dash navigation systems, for instance, will enjoy increased penetration worldwide in vehicle head units, rising from 19 percent last year to more than 32 percent in 2018. Total in-dash silicon revenue in 2013 will reach $290 million, up from $274 million in 2012.

For connectivity systems in head units—a major trend in infotainment—Bluetooth and USB remain the de facto standards for wireless and wired connectivity, respectively, given a 35 percent attach rate for each in 2012.

Increased momentum will likewise be found in other technologies aiming to cover high-definition applications, such as High-Definition Multimedia Interface (HDMI) and Mobile High-Definition Link (MHL).

Telematics on the rise

In telematics, General Motors’ OnStar and other similar systems continue to have the most mature and widespread market presence. OnStar-type embedded systems hauled in revenue of $480 million last year, with takings by 2018 expected to reach $1.8 billion. Telematics will grow quickly in Europe in the next couple of years as regulations become effective, making features like eCall mandatory in vehicles for summoning help during emergencies.

Automotive OEMs will also lend increasing support to satellite and terrestrial digital radio systems, such as HD Radio in North America and Digital Audio Broadcasting in Europe. In particular, automotive silicon revenue from terrestrial digital radio formats will rise sharply within a span of six years, climbing from $55 million in 2012 to more than $140 million by 2018.