
Stan Williams and his colleague Greg Snider at HP Labs in Palo Alto, Calif., have completed research that could lead to making field-programmable gate arrays (FPGAs) up to eight times denser, while using less energy for a given computation, than those currently being produced. The work was featured in the January 24 issue of Nanotechnology, published by the UK's Institute of Physics. It uses an idea developed by Dmitri Strukov and Konstantin Likharev of Stony Brook University in New York for connecting a nanowire crossbar to CMOS.
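As a rough, illustrative calculation (not HP's published analysis), a density multiple like the one quoted above is roughly what falls out if the crossbar overlay absorbs the routing and configuration resources that dominate a conventional FPGA's footprint; the fractions in the sketch below are assumptions chosen only to show the arithmetic.

```python
# Illustrative only: a back-of-the-envelope estimate of how lifting an FPGA's
# routing and configuration fabric into a nanowire crossbar stacked above the
# CMOS plane could multiply logic density. The fractions are assumptions for
# this example, not figures from the HP/Stony Brook work.

def density_gain(routing_area_fraction, interface_overhead_fraction):
    """Approximate density multiple if a fraction of die area (interconnect
    plus configuration memory) moves off the CMOS plane, leaving only a small
    per-cell overhead for the vias that reach the crossbar."""
    remaining = (1.0 - routing_area_fraction) + interface_overhead_fraction
    return 1.0 / remaining

# If ~90% of a conventional FPGA cell were interconnect and configuration SRAM,
# and the CMOS-to-crossbar interface cost ~2.5% of the original area, the logic
# would pack about eight times tighter.
print(round(density_gain(0.90, 0.025), 1))  # -> 8.0
```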

Williams, a chemical physicist, and his research group are also working on ways to grow nano-sized switches and wires via chemical reactions and have them assemble themselves into electronic circuits. He discusses the challenges and delights of directing research in this field with Small Times’ Jo McIntyre.


Stan Williams

Q: Is this recently announced FPGA breakthrough the most important nano project you are working on now?

In our lab there are at least 30 distinct projects going on. I love all my children. I’m not going to say any one is better than any other. It’s a very broad palette covering mechanics, electronics, photonics, and metamaterials. In a metamaterial, you are creating an ordered structure; with this you can make new types of materials that have never existed in nature and that have very specific technical applications. Our electronics efforts are on wires and switches and manufacturing and memories and logic devices. This FPGA is one example. I am not bored.

Q: When did you get involved in this FPGA research?

This is a long story. It is at least 10 years of evolution. It goes all the way back to work we did trying to understand how to build switches at the molecular scale. In about the mid-1990s we had been looking at means other than transistors to do information management tasks (e.g., to perform logic operations). We started with some fairly interesting ideas as to how we would have something in place that would be needed when Moore’s law reaches the end of the line.

We were inspired, or informed, by work going on at Stony Brook. They had been reading our papers and came up with a clever idea for connecting these nanowires to underlying CMOS. But because we had already been thinking about this for several years, we immediately recognized its importance. This is the way scientific research is supposed to work.

Q: Did that idea lead to the paper published in Nanotechnology?

Yes, this recent paper that came out is the result of a continuous phase of research over 10 years. The paper is a realistic, near-term road map for introducing a new technique into silicon CMOS that should not be much more difficult than many of the other changes in materials that have been done.

The most important thing we’re learning is how to keep Moore’s law going as long as possible without hiccups or interruptions. We have some new fabrication techniques, so we asked how we could make hybrid circuits, nanowires plus larger transistors, with the idea of extending the technology.

Q: What were your goals for this research?

It looks like a tremendous opportunity to improve one type of circuit, the FPGA, dramatically without having to scale down transistors, but also, with an ulterior motive, to get people to incorporate it into an existing circuit.

Before any industry is going to start using our switches and wires, it has to be in the manufacturing infrastructure. The issue is to broaden the technology portfolio of the entire industry that makes integrated circuits.

We viewed the FPGA as the lowest-hanging fruit. It was the one we felt could be done. The problem four or five years ago was trying to figure out how to put switches into the interconnect of the surface and make it better.

Q: How did you solve that switch interconnect problem?

Through an interesting series of circumstances, we finally figured out how to do it. We had some ideas we had been batting around. The group at Stony Brook had an idea for connecting a crossbar structure, or network of nanowires and switches, to CMOS. A light bulb went off for us. Ideas have continued to flow. We’re now getting more ideas for other circuit types.

Q: That problem of having tiny nanowires connect to comparatively large transistors has existed for years. How did you manage to do it?

It depends on the size of the circuits. It’s not such a stretch anymore. And we have developed a new lithography process, imprint lithography, that can fabricate features as small as 15 nanometers, sizes that have never been used in a fabrication environment before. What we’ve put together is not so totally crazy. We can put this in the fab process.

Q: Is this work defined as research or development?

My group is defined as research, but we have now spanned the range from more than 10 years out to looking at tomorrow.

The fact that we have been pushing to do this has elevated our standing within the eyes of our management. We could have just sat back and done wonderful research all that time, but we volunteered to step up and help our research find its way into the industry so HP can benefit from it.

Q: What has been the reaction to the announcement of the breakthrough?

Very favorable. I’m hearing from people all over the world about it. It was a demonstration of technical prowess on one hand, showing we can do significant things here at HP, but beyond that, it really is something useful. Reading about this in The Wall Street Journal increased internal enthusiasm for the project.

Q: Is anybody else close to doing this?

This takes time. It’s not a sprint; it’s a marathon. My group has existed for 12 years now. Anyone who is going to get into this is going to have to go on a significant journey.

There are a lot of other somewhat similar types of things going on at IBM and Infineon Technologies, and at several Japanese companies. Lots of people would like to use carbon nanotubes, but they’re not thinking of using active switches.

Q: How realistic is the goal of having a laboratory prototype completed within the year?

I wouldn’t have said it if I didn’t think it would happen. Would I guarantee it will happen? No. There are a lot of things that can go wrong.

The fundamental research is over, and now the hard work begins. Lots of unanticipated things can happen. That’s why we wanted to get things into a fab. If everything works out as we truly hope it will, in a year or so we should have a nice prototype. But it could take much longer.

Q: How involved are you going to be in this as it gets handed off to the fab people in Corvallis, Ore.?

The handoff never works very well. This is a very complex thing we are trying to do. It’s easy for us to make an assumption that something is easily doable, but when it meets the test in the fab environment, there’s a big distance between the lab and the fab. That will be a major accomplishment. If we can do it, then we start to have a platform, a hybrid-type chip. Now you can think of having it go off in a dozen different pathways.

We respect our colleagues in Corvallis. We have a strong collaborative effort now. There are 10 people there who specifically work in areas close to ours. There is a lot of e-mailing, phone calls, and physical visits back and forth, which assist in technology transfer. We will be on-call and willing to help out whenever they need some assistance.

Q: What do you consider the most promising direction nanotechnology research is taking these days: electronics, medicine, optics, cosmetics …?

There is an explosion in all of these areas. It’s becoming real a lot faster than I was anticipating. I was one of the people who was cautioning against over-hyping the area, especially in the materials area. Look at all these incredible composite materials that have been made. There’s a huge, multi-billion dollar market for nanomaterials. That’s taking off like a rocket. More and more people are admitting they are doing it.

Q: Admitting?

There’s the whole issue of dealing with activist-type fallout. I’ve actually given talks at various conferences and had people stand up and start screaming and yelling at me. Fringe groups can be very vocal and you don’t know whether people will get physical. You kind of make a decision about whether to stand and be counted. Early on, HP established a policy that we weren’t going to hide anything.

Q: What accomplishments at HP Labs are you most proud of?

I’m mostly proud of the fact that we have brought together a multi-disciplinary team of really brilliant people to discover and invent. The most difficult part was inventing a new language or dialect so we could all understand each other. People were using the same words for entirely different concepts. Finally, things started to click. Now we have nearly 60 people working together amazingly well, people who [otherwise] would have had no reason to talk to each other.

They had the same boss who kept telling them to “just talk to each other.” It was difficult to get some buy-in to this. It was not preordained that this could work. Finally having all of us being able to work with each other so well was an interesting journey in and of itself.


The Williams File

Stan Williams, 55, joined HP Labs in 1995 as principal lab scientist. He is now an HP senior fellow and director of a group he founded as Quantum Structures Research Initiative, now called Quantum Science Research. The group’s purpose is to explore nanometer-scale electronics.

Williams received the Feynman Prize for Nanotechnology and the Julius Springer Award for Applied Physics. He co-authored and edited Nanotechnology Research Directions, which proposed the National Nanotechnology Initiative that Congress created in 2000 with $485 million in initial funding.

As Goliath Ltd. prepares to debut its nanotech-enabled product, the David Co. brandishes its slingshot full of patents

By Richard Acello

Each January, the Consumer Electronics Show in Las Vegas plays host to the latest in high-tech wizardry. This year, SED Inc., a joint venture of Canon Inc. and Toshiba Corp., was said to have its 55-inch television set ready for display. But the SED TV never made it to Vegas.

The IP factor

Toshiba issued a press statement that read, in part, “After many months of planning for CES 2007, it is with deep regret that we inform you that Toshiba is forced to cancel the 55-inch panel exhibition. The reason is neither [a] technical nor [a] business issue but we are not allowed to disclose details due to [a] confidentiality obligation.”

At the same time, Canon found itself involved in legal action with Nano Proprietary Inc., an Austin, Texas-based technology company. Nano Proprietary is not a rival electronics manufacturer, but describes itself as “first and foremost a research and development company.” The company explains, “We have an extensive portfolio of intellectual property that we have developed over the years and our goal is to develop a portfolio of recurring revenue streams by licensing our intellectual property to others.”


The advent of intellectual property (IP) litigation indicates the maturation of the nanotech industry, says Jim Peterson, partner at the law firm Jones Day.

Nano Proprietary has an agreement with Canon to license its technology; the agreement extends to Canon’s subsidiaries, but not, Nano Proprietary contends, to the Canon-Toshiba joint venture SED.

In April 2005, Nano Proprietary filed suit against Canon, and its wholly owned subsidiary Canon USA, in the U.S. District Court for the Western District of Texas, seeking a declaratory judgment that SED’s products were not covered by Nano Proprietary’s licensing agreement with Canon. “We allege that SED Inc. is not covered under a license we gave to Canon in 1999, a license that extended to Canon subsidiaries, but prohibited Canon from sub-licensing the patents to others,” says Tom Gilbertsen, a partner with New York firm Kelley, Drye and Warren. SED Inc., says Gilbertsen, is not covered under the 1999 license and Canon breached the license contract by sublicensing SED Inc., entitling Nano Proprietary to damages that may range into the hundreds of millions of dollars.

In November 2006, an Austin District Court judge denied Canon’s motion for summary judgment in the case. Canon argued that SED was a subsidiary of Canon, but Judge Sam Sparks disagreed. “To put it bluntly, Canon’s characterization of SED as a subsidiary simply can’t pass the smell test,” Sparks wrote. “Canon has bargained away its voting rights in SED. Dead fish don’t swim, dead dogs don’t hunt, and Canon’s dead voting rights don’t give it a majority of the shares entitled to vote in SED. This court declines to recognize a corporate fiction designed for the sole purpose of evading Canon’s contractual obligations.”

Patton Lochridge of the Austin firm of McGinnis, Lochridge and Kilgore, who responded in court to the Nano Proprietary complaint on behalf of Canon, referred questions about the case to New York attorney Nicholas Cannella of Fitzpatrick, Cella, Harper and Scinto, who did not return calls about the matter.

Gilbertsen said the case was headed to trial, but in the meantime, Nano Proprietary was open to a settlement. “We’ve had dialogue with them, so Canon knows what we want them to do,” he said. “We’re open to Canon and Toshiba obtaining a license with us.”

Then on January 12, Toshiba said it had reached an agreement to have Canon purchase from Toshiba all of its outstanding shares in SED, so that SED would become a wholly owned subsidiary of Canon.

Another battlefield

Nano Proprietary is not the only nanotech “David” to trot out its IP portfolio before a well-known manufacturer. In December 2006, DA Nanomaterials, a Tempe, Ariz.-based joint venture of chemical giant DuPont and Lehigh Valley-based Air Products, filed suit in Arizona federal district court against Cabot Microelectronics, based in Aurora, Ill. DA Nanomaterials sought a judgment that it is not infringing on Cabot’s intellectual property.

Cabot says DA Nanomaterials is infringing on the processes Cabot uses to make and sell its polishing slurry and pad products used in the manufacture of semiconductor chips. The chips go into electronics products from cell phones to servers.

Unlike Nano Proprietary, Cabot is a manufacturer of the products in dispute and does not necessarily want to negotiate a license with DA Nanomaterials. In fact, Cabot says the suit results from Cabot’s refusal to grant DA Nanomaterials a license to the disputed technology. “We don’t think it’s appropriate to ask or demand that they be given a license,” said H. Carol Bernstein, Cabot’s general counsel.

In June 2006, Cabot was successful in an action before the U.S. International Trade Commission against Korea-based Cheil Industries involving some of the same technology at issue in the DA Nanomaterials case, said Bernstein.

Natural progression

What the Nano Proprietary and Cabot cases have in common is the willingness of companies with extensive nanotech portfolios to assess their intellectual property rights as nanotechnology products move from drawing boards to reality.

“Nanotech intellectual property exists in thousands of patents that have been out there for a while,” says Jim Peterson, who heads up the nanotechnology practice at the Bay Area law firm Jones Day. “It’s an indication of the maturation of the nanotech industry.”

Peterson says the trend of IP-rich firms using their portfolios to win licensing agreements is reminiscent of what happened in the development of biotechnology. “First comes the innovation, then comes the ‘so what’ component, or how you go from innovation to something that is useful, and that involves wrapping your arms around it and creating intellectual property, so you have something to license,” he adds.

Companies with extensive IP portfolios will set out to try and license their technologies, and while legal action may be part of the scenario, their main goal will be to get paid for their IP.

“You’ll see people trying to license their technology to larger companies, set out to be licensing companies, and developing strategies toward that,” Peterson says. “But I think litigation is an expensive proposition, and the substance behind it is the licensing because nobody makes money on litigation except the lawyers.”


Editor’s note: As this issue was going to press, Judge Samuel Sparks ruled in the Canon/Nano Proprietary case, saying that Nano Proprietary has the right to terminate its licensing agreement with Canon, keep its original licensing fee of $5.56 million, and seek damages for material breach of contract.

A new breed of sophisticated, low-cost microscopes is enabling new vision for industry and academe

BY CHARLES CHOI

Electron and scanning probe microscopes, those wonderful machines that allow us to peer at objects on the atomic and nano scales, have traditionally been large and expensive. Increasingly, however, toolmakers are introducing versions of these devices that are smaller (sometimes even portable) and less expensive than ever before as they begin to serve a potentially enormous market.

“It’s part of the same trend you see with all electronics. Instruments are now getting smaller and getting more capabilities squeezed into them,” says Mark Flowers, co-founder of Nanoscience Instruments in Phoenix, Ariz., U.S. distributor of Nanosurf’s portable and tabletop AFMs.

Students, start-ups, and industry

With this new generation of instruments, toolmakers are targeting primarily the education niche that optical microscopes typically fill. “The optical microscope market is about three to four times larger than the electron microscope market worldwide,” says Robert Gordon, vice president of nanotechnology business development at Hitachi High Technologies America. Hitachi entered the desktop imaging market in 2006 with its TM-1000 tabletop scanning electron microscope (SEM).

“Before, a lot of these instruments were really off-limits in the lab, only for Ph.D. students and professors and postdocs to use, and often only at the universities with research facilities-the MITs and Stanfords,” says Agilent AFM operations manager Jeff Jones. “One trend we now see is these instruments finding use at state universities that want students to have broader capabilities, teaching universities that aren’t necessarily research-oriented but want to apply for nanotechnology research grants. They want to include AFMs in courses at the undergrad and grad level and want more-advanced instrumentation.”

Hitachi has “over the past one-year period concentrated heavily on the high school, small college, and university markets, and we are a major player and supporter of the government’s initiative to further develop our workforce and enhance our science and technology programs,” says Gordon. “We had always focused on pretty expensive electron microscopes, but now we’re looking into smaller microscopes with higher volumes to compete with optical microscopes and with a very big audience of people.”


Hitachi’s TM-1000 is allowing more schools to compete for nanotech research grants.

Entering the academic market is also a way of training generations of students in the use of advanced instruments they might use later in their careers. “We’re helping spread the infrastructure,” says Jones.

One key consideration behind entering the academic market, besides price, is ease of use. “Ease of use is not a consideration when you have a Ph.D. student who will spend five years on an instrument, but it is when you have a student for three or six weeks,” adds Jones. “You need something to get students up to speed quickly. You want something like an optical microscope, where anyone can walk up to it and see what he or she is looking for without being an expert in the technique.”

“We’ve also included a curriculum with our instruments that professors can use as part of a microscopy course,” Jones says.

Ease of use is also important in the industrial environment, “where technicians are running samples over and over for quality metrics,” continues Jones.

That’s one reason why start-ups comprise another potentially ripe area for these instruments. About 15 years ago, “AFMs were complicated and expensive. There was no market for ones that were easier to use, because people who used them were experts in the area,” says Paul West, CTO and founder of Pacific Nanotechnology in Santa Clara, Calif. “People now want microscopes to solve problems, as opposed to the microscopes being the project itself.”

“Many start-ups cannot afford the traditional electron microscopes, which cost at least $150,000. So a tabletop microscope is a good entry-level microscope for them to start using,” says Gordon.

“We’re looking into new areas for us, servicing companies in the fields of nanomaterials, the biosciences, such as pharma, and in cosmetics. There’s a lot of business we can gather there,” adds Gordon. “We’ve even found that major semiconductor customers are embracing it, for improving throughput in their failure analysis labs, for instance.”

Pioneers paved the way

One of the earliest companies to produce lower-cost, smaller-size advanced microscopes is Nanosurf in Liestal, Switzerland, spun off from the University of Basel in 1997. Nanosurf’s latest entry-level device, the easyScan 2 AFM, is portable. Its scanner fits in the palm of your hand and has a resolution of 150 x 27 picometers.

“At the time we started, there were only big, complex instruments on the market, operated by very experienced scientists,” says Nanosurf CEO Robert Sum. “We had been approached by high school teachers who had the need for small, easy-to-use microscopes, and we saw the potential there.”

The easyScan can be used on samples other microscopes might find it hard to access, “such as an airplane wing,” Sum says. “When it’s difficult to get the sample to the microscope, you bring the microscope to the sample.”

The easyScan is upgradeable with a variety of different scan heads and software packages, as well as vibration isolation attachments. Other microscopes Nanosurf offers are the portable Mobile S AFM, which includes multiple modes, and the tabletop Nanite, an automated system.

“The easyScan 2 is modular and can start below $15,000, and the Mobile S and Nanite can get closer to $100,000, depending on how they’re configured,” says Nanoscience Instruments’ Flowers.


Nanosurf’s easyScan can work on samples inaccessible to other microscopes.

NASA researchers have approached Nanosurf to adapt the company’s AFMs for the Phoenix mission to Mars, scheduled for launch in August 2007. “The goal is a 300-gram microscope and 8-watt power consumption to analyze soil and ice samples,” Sum says.

So far Nanosurf has sold more than 1,000 of its STMs and 300 of its AFMs. “We’ve mainly looked at academic customers for the last 10 years,” Sum says.

Flowers adds that more and more industrial customers are emerging. “Time is money, and having an instrument that’s easy to use and accessible at different skill levels is very attractive for industrial quality control and industrial research.”

Another company that entered early into the advanced, lower-cost desktop imaging field is Pacific Nanotechnology. Its Nano-R2, the second generation of the Nano-R SPM for research and educational purposes, can accommodate samples up to a square inch. The Nano-I is a more-specialized tabletop system that uses the same scanner and software, but can accommodate samples as large as 12 inches for industrial applications, such as inspection of wafers, disks, and technical samples.

Pacific Nanotechnology’s systems have a vertical resolution of 0.5 angstroms and a horizontal resolution of 0.5nm. The Nano-R2 sells for $80,000, while the Nano-I retails at $120,000.


Veeco introduced its Caliber to attract a growing market it was not reaching.

“Our systems are about 50 percent smaller than conventional AFMs and SPMs, at 12 inches x 12 inches x 12 inches,” says West. “They weigh maybe 100 lbs.”

One of the main requirements for a desktop AFM or SPM is vibration isolation. “One of the things we incorporate into our devices is a heavy granite block, which helps protect the microscope against vibrations,” West says. Other vibration engineering safeguards remain proprietary information.

Pacific Nanotechnology also provides attachments to help users scan in gaseous environments, as well as heating stages and modes that allow scanning for electrical conductivity, magnetic fields, and other physical properties of samples. These can cost $1,000 to $5,000.

Here come the big guns

Increasingly, microscope giants are entering the field.

For instance, Veeco introduced a new handheld AFM in 2006. “We were seeing a market where we were losing, and now we have the Caliber AFM, which literally sits in the palm of your hand,” says David Rossi, vice president of Veeco’s Nano-Bio Business Unit marketing and business development. The Caliber has near-atomic-level resolution, he adds.

The list price for the Caliber is $57,000 in the U.S. “We’ve already shipped 25 to universities and materials development companies since the September timeframe,” says Rossi. “Our customers at this point are probably about two-thirds academia and one-third industry. It’s interesting: We’re seeing education institutions that would not have purchased a $200,000 instrument buying four or five Calibers for their teaching labs.”

Tabletop AFMs from Veeco include the CP2 and the MultiMode, “the highest-resolution AFM commercially available,” according to Rossi. The MultiMode costs $125,000 to $175,000, depending on the modes it comes with and what size scanner it has, while the CP2 costs from $80,000 to $110,000, also depending on the configuration.

Another industry leader, Agilent, announced its own tabletop imaging system, the Agilent 5400 AFM/SPM, in December 2006, “which looks about the size of a classic optical microscope,” says Agilent’s Jones.

“We’ve targeted the market for instruments from $50,000 to $100,000, for universities that want more functionality than microscopes costing less than $50,000 but don’t need the bells and whistles of the $100,000 to $200,000 microscopes,” says Jones. “We saw this trend of people getting grants under $100,000, and it seemed like a good market. We’re also shooting for industry, where pretty high performance is required in testing the same things over and over again but [where they are] not conducting hero experiments.”


Agilent designed its 5400 AFM/SPM for ease of use.

When compared to the more-expensive Agilent 5500, the 5400 “doesn’t have some environmental control aspects, such as full environmental control or being able to do some of the more-difficult temperature changes, but we do have the ability to upgrade the 5400 to the 5500,” Jones says. “We’ve also completely rewritten the software to make it easier to use and so it can get set up easier.”

“We’re happy with sales. We’ve almost reached double the initial targets for the first quarter,” Jones says.

The 5500 imaging modes, such as the magnetic AC mode, which allows better imaging in fluids, are compatible with the 5400. These modes range in price from a few thousand dollars to $30,000.

Electronics giant Hitachi introduced its TM-1000 last year at a price of $60,000. “That’s similar to an expensive imaging analysis optical microscope from a price standpoint,” says Hitachi’s Gordon. “At the same time, our microscope offers imaging at up to 10,000X, about 10 times past the point after which optical microscopes run into difficulties. It also provides much better depth of field and surface topography detail than optical does.”

Hitachi achieved its shrinking trick by miniaturizing the electron column and optics at the heart of an SEM using computer-aided design (CAD). Also, the company used smaller pumps to provide the same quality vacuum as other SEMs. A laptop that contains the software to run the microscope is included.

Intriguingly, unlike conventional SEMs, the TM-1000 does not require metal coatings to observe nonconductive samples. Cutting out this elaborate preparatory step is part of Hitachi’s strategy to make the microscope as simple to use as a digital camera.

“When electron beam-sensitive samples start to charge, which could damage them, the variable pressure mode bleeds a little air into the system to minimize the charge put on any noncoated materials,” Gordon explains.

The TM-1000 does have much lower magnification than more-expensive SEMs. It also has fewer software capabilities, “keeping with the idea that it should be simple to use,” Gordon says. Moreover, unlike other SEMs, the TM-1000 operates only at a fixed voltage of 15kV.

Total production volume for the TM-1000 on a worldwide basis is approaching about 400 units, Gordon says. “We’re interested in going to much higher volumes-possibly 200 units per month,” he adds. In the future, Hitachi will consider attachments that add functionality, such as energy dispersive x-ray analysis or software packages allowing image archiving and database management, while not making the microscope too complicated to use.

Hot on Hitachi’s heels is FEI with its new Phenom, a tabletop electron microscope with magnification capability up to 20,000X. Designed to be easy to operate, the Phenom offers a touch-screen monitor. It is currently being sold in the Netherlands, Belgium, Germany, and Luxembourg, with other countries to be added soon. Suggested applications include pharmaceuticals, metallurgy, manufacturing processes, quality control testing, and basic research.

Rossi expects the market to grow as the semiconductor and other industries experience increasing metrology demands and the life sciences sector approaches the nano scale.

The devices will also continue to trend toward improved performance, greater capability, and ease of use, West adds.

“I’m very excited about how the market’s growing and about other companies entering it,” Sum says. “In the beginning, there was no competition, and so one wondered whether or not there was a big market here. If other companies are entering, that shows these technologies represent a good direction to go in. Competition’s good.”

Feb. 28, 2007 - In Washington, D.C., the Project on Emerging Nanotechnologies delivered a presentation titled “Nanotechnology: A Progress Report on Understanding Occupational Safety and Health Issues.” Dr. Paul A. Schulte of the National Institute for Occupational Safety and Health (NIOSH) discussed progress made in understanding and preventing work-related injuries and illnesses potentially caused by nanoparticles and nanomaterials, drawing on the new NIOSH Nanotechnology Research Center Progress Report, released in January.

Feb. 28, 2007 - Diagnostic products developer Nanogen Inc. has been issued four patents by the U.S. Patent and Trademark Office for inventions related to diabetes and Alzheimer’s disease biomarkers. The current patents are the most recent in a series describing biomarkers associated with these diseases.

U.S. Patent No. 7,179,610, “Complement C3 precursor biopolymer markers indicative of insulin resistance,” and U.S. Patent No. 7,132,244, “Betaine/GABA transport protein biopolymer marker indicative of insulin resistance,” describe the use of mass spectrometry and time-of-flight detection techniques to identify biopolymers that could potentially be used in the diagnosis of or development of therapeutics for the metabolic conditions Syndrome X and type II diabetes.

U.S. Patent No. 7,179,605, “Fibronectin precursor biopolymer markers indicative of Alzheimer’s disease,” and U.S. Patent No. 7,179,606, “IG heavy chain, IG kappa, IF lambda biopolymer markers indicative of Alzheimer’s disease,” relate to the identification of protein biomarkers for Alzheimer’s disease.
Nanogen’s ten years of nanotechnology research have been supported for their potential in diagnostic and biodefense applications.

Innovation Society and TÜV SÜD develop “world’s first” certifiable nanosafety label

Feb. 27, 2007 - The Innovation Society, a Switzerland-based nanotechnology consulting firm, and TÜV SÜD (Munich, Germany), a global technology and certification company, have developed what they call the first certifiable nanospecific risk management and monitoring system.

Called CENARIOS, the product provides hazard and risk assessment, integrated risk monitoring, and customizable issue-management and communication tools. The system is currently being implemented by Buhler PARTE, which will make it operational by this summer.

CENARIOS was developed in 2006 in cooperation with industry in order to meet the particular safety requirements of companies that are producing or handling nanomaterials. The system can be applied in all “nanorelated” industry sectors, e.g., textiles, cosmetics, energy, packaging, food, chemicals, pharmaceuticals, automotive, and electronics. The certificate is intended to improve safety standards for processes and products over the entire supply chain.

The system is audited, and the certificate is awarded by an independent TÜV SÜD authority and renewed periodically. The certificate attests to a state-of-the-art standard for risk-management processes; it recognizes a company’s commitment to process and product safety and its engagement in protecting human health, the environment, and workplace safety. It is therefore intended to build confidence among customers, authorities, investors, insurance companies, and the public.


Discera hopes to shake up the quartz crystal oscillator market with its MEMS replacement product line. (Photo: Discera)

Feb. 27, 2007 - In addition to announcing the shipment of a family of long-anticipated silicon oscillators, Discera Inc. boasts an agreement to help push those MEMS devices into OEM products. Discera hopes that the moves will shake up the $3.5 billion market for traditional quartz crystal oscillators.

Discera says its line of MOS1 MEMS-based oscillators has been tested and endorsed by one of the leading quartz timing vendors, Vectron International. Vectron, a designer, manufacturer, and marketer of frequency control, sensor, and hybrid products, is sampling the VMC2 (Discera’s MOS1) as a direct replacement for quartz crystal oscillators in high-volume timing applications.

Quartz solutions have been pushed to the limits in terms of scalability, cost and processing, says Discera. The company hopes its MEMS solution will disrupt the traditional quartz crystal timing market. MOS1 can be used as a direct replacement for quartz crystal oscillators as a timing device in a market estimated at $3.5 billion.

“MEMS timing devices have been anticipated for a long time,” said Ed Grant, vice president of North American products and operations at Vectron. “We are very familiar with the challenges of delivering a MEMS solution…These first products will enable us to expand our market presence in the high volume commercial and consumer electronics space.”

Discera says its MOS1 is poised to enable electronics companies to overcome the challenges posed by quartz crystal technology. The promise of reduced costs and a smaller product footprint offered by MEMS has, until now, been offset by concerns about silicon resonator reliability, including frequency stability over temperature cycling, aging, vibration, and shock resistance.

Discera’s technology consists of a silicon MEMS resonator and an ASIC embedded within a conventional QFN or ceramic package. The MOS1 oscillator family generates frequencies from 1 to 125 MHz with excellent temperature and jitter performance and is packaged in a tiny, industry-standard IC package, providing a significant cost advantage. The MOS1 family of oscillators has passed industry-standard qualification tests, as well as extreme reliability tests, demonstrating superior mechanical reliability compared to standard bulk acoustic wave quartz solutions, and is ready to ship in production volumes.

Discera’s product has passed all XO requirements with the following features:

Lower costs: Discera sees a path of 15 percent cost reduction every year, while quartz crystal companies are running at a 10-20 percent margin and are unable to offer significant cost reductions to their customers.

Shorter lead time: Provides a unique way to define the oscillator frequency anywhere between 1 MHz and 125 MHz with a resolution of 2 ppm (illustrated in the sketch after this list). Unlike most programmable oscillators, which provide only a limited number of frequencies, Discera can provide almost any frequency.

Better reliability: CMOS MEMS oscillators can meet performance requirements at costs at least 100 times lower than quartz crystal oscillators in special applications for extreme environments, and testing is more robust with industry-standard packaging techniques.
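To make the 2 ppm figure concrete, here is a minimal arithmetic sketch of what that setting resolution means at different output frequencies. It is illustrative only and assumes nothing about Discera’s actual MOS1 programming model.

```python
# Illustrative arithmetic: what a 2 ppm frequency-setting resolution means for
# a programmable oscillator covering 1-125 MHz. Generic math, not Discera's
# actual MOS1 programming interface.

PPM = 2e-6  # claimed setting resolution: 2 parts per million

def step_size_hz(target_hz):
    """Smallest frequency increment near a given target at 2 ppm resolution."""
    return target_hz * PPM

for f_hz in (1e6, 25e6, 125e6):
    print(f"{f_hz / 1e6:>6.1f} MHz: step = {step_size_hz(f_hz):.1f} Hz")
# ->   1.0 MHz: step = 2.0 Hz
#     25.0 MHz: step = 50.0 Hz
#    125.0 MHz: step = 250.0 Hz
```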

Discera’s CTO, Wan-Thai Hsu, was named an “Innovator of the Year” finalist for 2006 in the Third Annual Creativity in Electronics (ACE) Awards for his work in bringing MEMS-based oscillators to reality. Discera recently demonstrated its technology as a plug-and-play direct replacement for quartz crystal in a camcorder at the Electronica show. This was the first such MEMS demonstration in a consumer device. Key target applications for MEMS-based timing devices are DVD players, gaming consoles, set-top boxes, camcorders, PDAs, and cameras. Since the oscillator is the heartbeat of all electronic systems, the MEMS oscillator affects not only the timing market but essentially every system that uses one.

The jointly available Discera-Vectron product consists of a silicon MEMS resonator and an ASIC embedded within a conventional QFN or ceramic package. The product is now available in production quantities and has been successfully tested under centrifuge acceleration of 25,000G and 50G vibration with no measurable deviation.

By Tom Cheyney
Small Times Contributing Editor

Feb. 26, 2007 - For the emerging flexible thin-film, organic, and printed electronics markets to flourish, most industry professionals agree that roll-to-roll (R2R) processing must be implemented on the factory floor. Indeed, thin-film photovoltaics, OLEDs, and RFIDs are already in pilot or volume production using R2R techniques.

But for the manufacturing of large-area and conformable displays, paperlike e-books, and high-performance solid-state lighting to take place, it remains an open question whether inkjet, thermal laser imaging, or other printing technologies; optical, imprint, or digital lithography; adapted semiconductor and LCD processing methods; inorganic or organic materials; or a combination of the various methodologies and chemistries will be leveraged into successful, scalable R2R approaches.

Why R2R? The main reason is cost, which must be cut by at least 50% compared with batch-processed components, according to Display Search’s Barry Young during his presentation at the recent Flexible Display & Microelectronics Conference. Applied Materials’ Hans Maidhof echoed the cost-cutting sentiment, saying that “flex will only succeed if it’s cheaper.”

“Thin films reduce the cost of semiconductor materials, continuous fabrication increases utilization and reduces production costs, and application of industrial processes simplifies production while providing high manufacturing rates… R2R [also] leverages form factor to lower overall customer costs,” noted Maidhof.

The challenges facing those trying to commercialize continuous processing are cultural as well as technical. “R2R and web converting folks are not used to semiconductor requirements and vice versa,” Hewlett-Packard Labs’ Carl Taussig told Small Times. “We pretty much build everything, and building [your own] tools slows development.”

Included among the equipment that Taussig and his team have built is the self-aligned imprint lithography (SAIL) system. This high-resolution tool, along with stamping, mastering, and other proprietary technologies, has allowed HP (with its partner PowerFilm Solar) to build what he claims are “the first flexible TFTs and [active-matrix] backplanes made fully with R2R processes,” adding that the first integrated display “will come out any day now.”

Eran Elizur of Kodak’s graphic communications group described a seventh-generation dry-process, maskless production tool for printing thermal color filters, which resides “at a customer site in Asia for evaluation.” The system handles 2250 x 2250 mm substrates, employs five 5-micron-resolution laser heads, has 3-micron imaging accuracy with an imaging speed of up to 2 meters per second, and supposedly decreases manufacturing costs per panel by 30%.

For R2R processes to be consistent, efficient, and high yielding, there must be reduced contamination and defectivity levels, precise endpoint control, very high uniformity, subnanometer-level surface roughness, and assured reliability.

Applied’s Maidhof admitted that, although his company has a cleanroom-compatible vacuum web coater system, the “particle issue is most important” and “not totally solved in our tools.”

“With an endless process, where’s the endpoint?” quipped Taussig of HP Labs. For their amorphous-silicon TFT process, his group uses interferometry to assess the endpoint of back-channel etching and fluorescence techniques to monitor and control the thickness of the polymer mask etch.
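For readers unfamiliar with the approach, the sketch below is a generic, greatly simplified illustration of interferometric endpoint detection, not HP Labs’ actual monitoring code: while a film is being etched, thin-film interference makes the reflected intensity oscillate, and when the film clears the oscillation collapses, which can serve as the endpoint signal.

```python
# Generic, simplified illustration of interferometric endpoint detection on a
# moving web: while a film is etching, thin-film interference makes the
# reflected intensity oscillate; once the film clears, the oscillation
# collapses. Toy model only, not HP Labs' actual process-control code.

import math

def endpoint_index(reflectance, window=20, threshold=0.05):
    """Return the first sample index at which the peak-to-peak swing of the
    reflectance signal over a sliding window drops below the threshold."""
    for i in range(window, len(reflectance)):
        segment = reflectance[i - window:i]
        if max(segment) - min(segment) < threshold:
            return i
    return None

# Synthetic signal: steady oscillation while the film etches (samples 0-299),
# then a flat trace once the film has cleared (samples 300-399).
signal = [0.5 + 0.3 * math.cos(t / 8) for t in range(300)] + [0.5] * 100
print(endpoint_index(signal))  # -> 320: the first full window after clearing
```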

“Surface roughness is still an issue with flex and is not good enough for making transistors,” explained PARC’s Bob Street. He told Small Times that the roughness average needs to be a few angstroms, but that plastic substrates remain “5 to 10 times rougher than glass.” Pointing out the susceptibility of flexible substrates to scratching, Street said that the plastics people “need to learn how to improve quality.”

Dan Gamota of Motorola, which has produced “more than 60 miles of printed electronics” and is “close to getting dielectric layers 1 to 5 microns thick,” believes that the key challenge facing printed electronics is “how to [perform] quality control and characterization [tests] on rolls many hundreds or thousands of feet long. How do you check individual transistor device mobility on 2000 feet of film?”

Display Search’s Young zeroed in on the limitations of current R2R technology. “Do you have a process where a particular display is going to stay constant from roll to roll? There’s not a lot of flexibility in roll-to-roll manufacturing.”

Feb. 26, 2007 - The Environmental Protection Agency has issued its Nanotechnology White Paper, EPA 100/B-07/001, to inform EPA’s management of the science issues and needs associated with nanotechnology, to support related EPA program office needs, and to communicate the identified nanotechnology science issues to stakeholders and the public.

The white paper provides background information regarding nanotechnology and various environmental issues, and discusses the risk assessment of nanomaterials, the environmentally responsible development of nanoscale materials, and the EPA’s research needs regarding nanomaterials.


Bosch’s EBS integrates with a car battery’s pole for space savings (Photo: Bosch)

Feb. 23, 2007 - Bosch has begun manufacture of its Electronic Battery Sensor (EBS), which promises to help motorists avoid flat batteries, the most common cause of breakdowns. With its integrated evaluation electronics, the MEMS sensor determines the battery’s condition by measuring voltage, current, and temperature. The energy management system incorporated in modern cars uses these values to continuously guarantee sufficient battery energy, so that the vehicle can be started reliably even after a long stationary period. In 2007, more vehicles equipped with the sensor will go into series production.

The Bosch battery sensor consists of a chip that contains all the electronics, and a shunt for current measurement. These two components, along with the pole terminal, form a unit that can be connected directly to the car battery, fitting into the pole niche of standard automotive batteries. This yields a significant saving in both space and costs over previous solutions.

Bosch developed the associated software for battery-state detection in collaboration with Varta and integrated the algorithms into the EBS chip. The sensor determines the battery’s capacity, state of charge, and expected performance, and sends this information via a LIN interface to the vehicle’s superordinate energy management system, which uses the data to optimize the state of charge.

If over a period of time more electric charge is used than the alternator can provide, the level of charge in the battery will fall. The energy management system compensates for this before a critical battery state is reached by reducing the power consumption of comfort items such as the seat heating, and may even switch them off altogether for short periods. It can also increase the combustion engine’s idle speed, and thus the alternator speed, if the vehicle is stuck in a traffic jam for a longer period. This improves the battery’s state of charge, which means that the period in which a vehicle can be re-started reliably is now much longer, even if a large number of electrical consumers drew on the battery on the vehicle’s previous journey or if the vehicle has been left standing for a considerable period.
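The sketch below is a purely illustrative rendering of the kind of logic described above: a simple coulomb-counting state-of-charge estimate driven by the shunt’s current measurement, plus a crude load-shedding rule. The capacity, thresholds, and function names are assumptions; Bosch’s and Varta’s actual EBS algorithms are proprietary.

```python
# Purely illustrative: a coulomb-counting state-of-charge (SoC) estimate fed by
# the shunt's current measurement, plus a crude energy-management rule. The
# capacity and thresholds are assumptions, not Bosch/Varta parameters.

BATTERY_CAPACITY_AH = 70.0   # assumed nominal battery capacity
CRITICAL_SOC = 0.60          # below this, start protecting the battery

def update_soc(soc, current_a, dt_s, capacity_ah=BATTERY_CAPACITY_AH):
    """Integrate measured current (positive = charging) over dt_s seconds."""
    soc += (current_a * dt_s / 3600.0) / capacity_ah
    return min(max(soc, 0.0), 1.0)

def energy_management(soc, idling):
    """Return (shed_comfort_loads, raise_idle_speed) for the current SoC."""
    if soc < CRITICAL_SOC:
        # Cut consumers such as seat heating; raise idle speed only when the
        # engine is already idling (e.g., stuck in a traffic jam).
        return True, idling
    return False, False

# Example: 30 minutes idling in traffic with a 15 A net discharge (loads exceed
# alternator output), sampled every 60 seconds.
soc = 0.70
for _ in range(30):
    soc = update_soc(soc, current_a=-15.0, dt_s=60.0)
print(round(soc, 3), energy_management(soc, idling=True))
# -> 0.593 (True, True): shed comfort loads and raise the idle speed
```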

Besides the present state of charge, the software can also forecast the battery’s future charging condition. And, it will be possible to control power generation by the alternator more precisely. This promises to reduce fuel consumption, and thus also emissions of pollutants.

Accurate information about the battery is also needed in vehicles with stop-start systems. For example, the engine will only be switched off if there is sufficient power available to restart it subsequently without any difficulty. And even when the vehicle is being manufactured, a quiescent current test can be done, allowing any problems to be detected. The sensor opens up greater diagnostic possibilities for garages - for example, when a customer is having recurring problems with a flat battery.