
For embryonic technologies such as nanotechnology, intellectual property (IP) has become as intricate as building at the nanoscale itself. Nanotechnology patent filings continue to proliferate and are intensely competitive, which adds to the complexity. One business commentator will complain that patents have become too powerful; the next will express dismay that the system is too weak. We hear concern, if not angst, about nanotechnology patent thickets, although patent thickets have been around for a long time.

Despite all the rigmarole over patents, the patent system has become central to nanotechnology. This is particularly true for small, emerging companies, for which IP assets make up a large share of corporate value, especially if they can establish IP dominance. U.S. leadership in nanotechnology is directly linked to its complex but effective Bayh-Dole system, which transfers technology to the marketplace through patent licensing to small companies. In this complex environment, businesses need answers on patent, licensing and deal issues. To help, we provide some practical tips on patent strategy and execution, as well as some updates.

Patent strategy. At least two strategic themes emerge for any patent strategy. First, aggressively generate base corporate value by regular buildup of the nanotechnology portfolio, which will provide the company with assets valued by the market. Establish legally sound IP domination. Second, do not forget to add seasoning; generate additional value with targeted, strategic filings. For example, file patent applications before beginning work on a joint development agreement to protect the company as others gain access to the technology.

If your company is based on patent licensing under the Bayh-Dole system, become knowledgeable about Bayh-Dole. The system provides an IP base for companies to exploit, but companies must know the limits. For example, understand domestic manufacturing requirements, reporting compliance and government rights in the invention. Companies working with universities should assume control of the patent prosecution as early as possible.

Prioritize the portfolio broadly. For example, rank patents roughly by corporate value (like the latest top 20 sports poll). Do not blindly file and maintain patent applications; instead, proactively map applications against product development plans (and competitors’ products) on a country-by-country basis so that unnecessary filings can be pruned.

Monitor competitors’ patents to avoid being blindsided. Valuable technology insights can be gleaned from competitive patents, and licensing and joint development opportunities can be uncovered. Another compelling reason: Investors want it.

Execution. The best patent strategy may not mean much without practical execution. Managing a patent portfolio is not easy, as the task is filled with details and deadlines. Recognize that patenting is a highly specialized process and requires help from outside counsel.

Companies should prepare all patent applications, including provisional applications, with solid attorney input so that each filing captures the essence of the invention. Weak provisional filings represent lost opportunities to generate corporate value; they can damage later attempts to protect the technology and undermine the IP’s value. Also, companies should deal honestly with the U.S. Patent and Trademark Office (PTO) to avoid later charges of inequitable conduct.

Patent updates. The PTO continues to develop a more rigorous, quality-based examination system. This includes creation of a nanotechnology classification system, 977. The number of patents in 977 now stands at more than 2,600. The first PTO-classified nanotechnology patent to issue was filed in 1974 (No. 4,107,288). Looking to the future, the PTO granted Zyvex a 977 patent on self-replicating manufacturing stations (No. 6,510,359). IP figured prominently in recent transitions at NaturalNano and Quantum Dot.

Patenting requires hard work to reap the rewards of the investment. When Nobel Laureate Richard Feynman kicked off nanotechnology in 1959 with his “Plenty of Room at the Bottom” speech, he certainly did not have in mind the work required by the modern U.S. patent system – he talked more about having fun. Feynman envisioned that scientists would compete to build nanostructures motivated by human qualities other than money. However, he also recognized the motivating role of money and established a financial award for building the first small-scale motor.

In the same way, patents provide an economic incentive to build at the nanoscale. Nanotechnology’s creative genius is a fountainhead of the U.S. economy. Through patents, companies should protect the genius behind these nanotechnology miracles.


Stephen Maebius is a partner and Steven Rutt is an associate at Foley & Lardner LLP. They can be reached at [email protected] and at [email protected].

By Candace Stuart

It only takes the first three pages of “The Eye for Innovation” to understand the reasons for the successes behind Control Data and Robert Price, its former chairman, president and chief executive officer. Technology, Price writes, equals know-how. Innovation stands for problem solving.

So much for gizmos, gadgets and the fuzzy, long-winded descriptions that litter corporate board rooms and business school classrooms. Price distills business concepts to their essence in a fascinating analysis of a company that was founded nearly 50 years ago to provide high-end data processing and equipment primarily for scientific and defense users. It metamorphosed from computers to peripherals and finally services. It now exists as Ceridian Corp.

While Control Data may no longer be a well-known name in the world of high tech, one of its founders remains an icon. Seymour Cray, architect of supercomputers and the engineering genius behind several of Control Data’s early blockbuster systems, was among the dozen pioneers who helped grow the Minnesota startup. Within 12 years, the company grew from no products and only $600,000 in capital into a corporation with a global presence and revenues in excess of $1 billion.


“The Eye for Innovation”
By Robert M. Price
(nonfiction, 329 pages, published in 2005 by Yale University Press, $30 in hardback)

Price, who rose up the executive ranks from general manager in the 1960s to president by 1981, credits a company culture that encouraged staff to use their know-how and problem-solving skills to beat competitors like IBM. Know-how led to a diverse set of products, from Cray’s powerful supercomputers to educational software and shareware. Problem-solving abilities helped managers recognize and create business opportunities.

Control Data added peripherals to its portfolio to provide a low-cost solution to support its computer business. Services took center stage when it became clear that clients needed more than hardware and software. Price takes some pleasure in noting that IBM, Control Data’s nemesis for most of its existence, has shifted its focus to services.

In many ways, “The Eye for Innovation” resembles the standard how-to books found in the business section of a bookstore. Price uses Control Data’s decades-long history to illustrate industrial, academic and governmental partnerships, employee relations and even civic responsibilities. He weaves stories of Control Data’s novel products, services and workplace concepts to guide managers.

If you want innovation, for instance, accept failure. Cray continued to head up a laboratory in Wisconsin despite some expensive design duds. That sent a strong signal to innovators that Control Data was sincere in its mission. But by 1985, Control Data’s enthusiasm for all good ideas created an unfocused and unprofitable business.

“There was too much attention to what might be the next opportunity and too little to the problems of the opportunities already in hand,” he writes. “… Every new services opportunity was exciting – it clearly had potential, even if poorly defined, and it was a great challenge to our capacity for innovation. What could be more enticing? And now the price had to be paid, and it was: in 1986, thirteen business units were sold or shut down.”

The cautionary tale reminds me of companies involved in emerging and platform technologies such as microsystems and nanotechnology. Innovation can become a siren song if it is isolated from customers’ needs, Price argues. Control Data survived its crisis, but not without scars.

I often wondered as I read the book which of today’s micro and nanotech companies will prove to be their generation’s Control Data. Are nanomaterials specialists such as Oxonica, Carbon Nanotechnologies Inc. or Nanophase Technologies poised to become leaders, like Control Data circa 1960? Are mature companies such as BASF, 3M, GE – and of course, IBM – breeding innovations, as Price claims Control Data did for its first three decades, that are based on micro and nanotechnology? The Control Data saga offers a compass for them all.

Groups worldwide rally to bring consistency to nano

By Matt Kelly

A little bit of the mystery has gone out of nanotechnology. And that, everyone agrees, is a good thing.

Late last year engineers for the first time endorsed a standard specifically for nanotechnology: P1650, a method of describing the electrical properties of carbon nanotubes. The announcement came on the heels of another standards initiative: in November, the International Organization for Standardization (ISO) created a committee to forge nanotech standards.

Two years in the making, P1650 was ratified by the Institute of Electrical and Electronics Engineers (IEEE) in December. More are coming, and researchers insist such specifications cannot arrive soon enough.

“We need this big time,” said Jonathan Tucker, an industry consultant with Keithley Instruments Inc. in Cleveland, which makes testing equipment. “If I buy a jar of carbon nanotubes, to the naked eye it just looks like carbon black. I have no clue what I really have there.”

Determining what other people have has vexed nanotech researchers for years. Without standard means of testing nanoscale devices, or even standard terms to define what those devices are, researchers cannot reliably reproduce other scientists’ results, and manufacturers cannot scale up production of a prototype. Such obstacles keep basic nanotech research from moving toward commercialization.

“This is definitely important to the industry,” said Michael Holman, an analyst with Lux Research. Some of Lux’s large corporate clients, he said, are hesitant to pursue nanotech vigorously because of the lack of standards. “They’ve found that the materials advertised on the Web site are one thing, but what they’re actually able to deliver is often another. … It’s held them back in some cases.”

P1650 represents a first step toward remedying the situation. The standard directs nanotube manufacturers to describe the tubes’ length, diameter and number of walls, along with other basic characteristics. While compliance is voluntary, IEEE officials hope nanotube manufacturers will adopt the standard so their products are more attractive to prospective customers.

From here, however, the remedy only gets more difficult. P1650 only addresses electrical engineering concerns about nanotubes. According to Daniel Gamota, a researcher at Motorola Corp. who led the IEEE’s P1650 working group, that focus made the standard “pretty simple” to define. Future standards that tackle more complicated subjects will be more challenging because nanotechnology cuts across so many disciplines.

Already, for example, the IEEE is developing another standard: P1690, to describe the properties of nanotubes when they are additives to bulk materials. That idea cuts across chemical, thermal and mechanical engineering, so more people must sit at the table to hash out the details. Gamota admitted “this one could be tougher.”

Circumstances are much the same for nanotech standards in life sciences. ASTM International has taken the lead on that front with a working group led by the Nanotechnology Characterization Laboratory (NCL). The NCL has proposed 12 protocols to measure and describe nanoparticles’ effects on living tissue, and has already sent four of them to an ASTM subcommittee so private sector participants can give input.

NCL director Scott McNeil said nanoparticles are tricky to characterize because many are naturally fluorescent or interact with enzymes. Fluorescence and enzymes are two common tools to describe microbes, so the NCL must devise a whole new “characterization kit” for nanoparticles rather than use a pre-existing one.

The four standards that have gone to ASTM for review so far address how a particle reacts to blood cells; cell death; cytotoxicity; and how nanoparticles affect samples of bone marrow. McNeil expects decisions on the standards within the next six months.


NNCO director Clayton Teague spearheads U.S. efforts to create standards in nanotechnology. Photo courtesy of the NNCO

Globally, the ISO brought together representatives from 22 nations in London in November to discuss what standards to address first. The top candidates were metrology, health and environmental safety, and terminology. Working groups were created for each. Canada now leads the terminology group, Japan the metrology, and the United States the health and environmental.

Clayton Teague, director of the National Nanotechnology Coordination Office and the U.S. point man on standards, now chairs a technical advisory group to develop ISO standards and to cooperate with other groups like IEEE and ASTM on their work. The ISO’s technical committee, Teague said, will reconvene in June and “there’s full expectation that we’ll have quite a number of work items to put on the table for consideration, and action will formally be taken by then.”

Expect to see some ISO technical reports or public specifications on nanotech about one year from now, Teague said. Such documents don’t have the force of a standard, but they are good indicators of where standard-setting bodies want to go. A fully ratified standard – which would be called ISO 229, with various sub-standards tacked on – could take three years to adopt.

And of all needed standards, Teague said, the most important is simple terminology: “a major area of unmet need.” Even basic wording to describe nanoscale items remains imprecise, and without that language the technology will never be able to mature.

John Miller, vice president of intellectual property at Arrowhead Research Corp., notes that the United States alone has granted patents regarding carbon “nanotubes,” “nanostructures,” and “nanofibers” when all the applications sought to patent essentially the same thing.

“This is gradually being solved at the patent office, but it does create broad and overlapping patents… with different examiners looking at similar applications with differing claim language,” he said. “The problem will emerge when products come to market and people start suing each other.”

That’s one standard procedure nanotech researchers hope to avoid.

December 22, 2005 – Toshiba Corp. and Sony Corp. have developed promising semiconductor technology that could help realize next-generation large-scale ICs with 45nm line-widths, reported the Nikkei English News. The new technology makes it easier for current to flow in transistor devices by bending silicon crystals.

The channels where current flows are bent, making it easier for charge carriers to move. This advance has led to transistors with a film of the ideal thickness of 30nm. When used in LSI chips with 45nm line-widths, the new technology can boost the current-driving capacity of the device by about 40%.

The two companies have also created technology that can reduce the volume of wiring in next-generation LSI chips. With the 45nm generation of chips expected to be commercialized from 2007, the two companies are looking at ways to best adopt the new technologies.

NUCRYST Pharma prices IPO


December 22, 2005

(Update: Trading in NUCRYST Pharmaceuticals shares began at about 12 p.m. on the Nasdaq. The symbol was changed from NCST to NCSTV.)

Dec. 22, 2005 — NUCRYST Pharmaceuticals Corp., a Wakefield, Mass., maker of medical products based on nanocrystalline silver technology that fight infection and inflammation, announced an initial public offering of 4.5 million common shares at a price of $10 per share.

The company said its common shares are slated to trade on the Nasdaq National Market under the symbol NCST and on the Toronto Stock Exchange (in Canadian dollars) under the trading symbol NCS. It did not say when public trading on either exchange was expected to begin.

The shares are being offered by an underwriting syndicate led by Jefferies & Company Inc. and co-managed by Adams Harkness Inc., GMP Securities L.P., and SunTrust Robinson Humphrey.

NUCRYST has granted the underwriters a 30-day option to purchase up to 675,000 additional shares to cover over-allotments, if any.

Net proceeds from the offering are expected to be approximately $39.9 million (or $46.1 million if the underwriters exercise the over-allotment) after deducting underwriting discounts and commissions and estimated offering expenses.

The company said it plans to use approximately $35 million of the net proceeds for capital expenditures, research and development and other general corporate purposes.

It will use the remaining proceeds of $4.9 million to $11.1 million to repay part of a $46.5 million debt to The Westaim Corp., NUCRYST’s parent company.

Westaim has agreed to allow NUCRYST to pay off the additional debt remaining after the cash payment with NUCRYST shares. After the offering, Westaim will continue to own a majority equity position in NUCRYST.
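The offering figures above can be sanity-checked with some back-of-envelope arithmetic. The sketch below is illustrative only; the fee figures are inferred from the gap between gross and reported net proceeds, not from any disclosed breakdown.

```python
# Back-of-envelope check of the NUCRYST offering figures reported above.
# Underwriting discounts, commissions and expenses are inferred from the
# difference between gross and reported net proceeds (an assumption, not
# a disclosed line item).

SHARE_PRICE = 10.00          # $ per common share
BASE_SHARES = 4_500_000      # shares in the base offering
OVERALLOT_SHARES = 675_000   # 30-day over-allotment option

gross_base = BASE_SHARES * SHARE_PRICE                       # $45.00M
gross_full = (BASE_SHARES + OVERALLOT_SHARES) * SHARE_PRICE  # $51.75M

# Reported net proceeds: ~$39.9M base, ~$46.1M with over-allotment
implied_fees_base = gross_base - 39_900_000   # ~$5.1M
implied_fees_full = gross_full - 46_100_000   # ~$5.65M

print(f"Gross (base): ${gross_base/1e6:.2f}M; implied fees ${implied_fees_base/1e6:.2f}M")
print(f"Gross (full): ${gross_full/1e6:.2f}M; implied fees ${implied_fees_full/1e6:.2f}M")
```

The implied fee load of roughly 11% of gross proceeds is in the normal range for an offering of this size, which suggests the reported figures are internally consistent.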

– David Forman

December 15, 2005 – A research group from Idemitsu Kosan Co. and Keio University has developed a way of producing YAG (yttrium aluminum garnet) phosphor in the form of a powder with particles measuring just 10nm dia., reports the Nikkei English News.

YAG phosphor is coated on blue LEDs to create light-emitting diodes that emit white light. The YAG phosphor material is typically processed into a powder with crystal grains measuring in the micron range. Those particles are large enough to scatter light, reducing the brightness of the white LED.

The new nanopowder particles are smaller than the wavelength of the light, so there is almost no scattering. The result is a white LED that in principle can emit light that is 256 times as bright as a conventional white LED.

The YAG phosphor nanopowder is made by applying pressure to a liquid heated to 300°C. The process not only yields smaller grains but is also less costly than conventional methods, which bake the starting material at a temperature of 1,000°C.

Dec. 9, 2005 – A consortium of companies has announced its first research grants under the Semiconductor Industry Association’s new Nanoelectronics Research Initiative (NRI).

The grants will fund the creation of two new university-based nanoelectronics research centers — one in California and the other centered in New York — as well as support additional research at five National Science Foundation nanoscience centers and at a research group in Texas.

The two new research centers are The Western Institute of Nanoelectronics (WIN) in California and The Institute for Nanoelectronics Discovery and Exploration (INDEX) in Albany, N.Y.

WIN will be headquartered at the UCLA Henry Samueli School of Engineering and Applied Science. Participants will come from three University of California campuses (Los Angeles, Berkeley, and Santa Barbara) and Stanford University. The institute will focus on novel spintronics and plasmonic devices. In addition to its NRI funding, it will also receive additional direct support from Intel and the UC Discovery program.

INDEX will be headquartered at the College of Nanoscale Science and Engineering of the State University of New York-Albany. It will also include the Georgia Institute of Technology, Harvard University, the Massachusetts Institute of Technology, Purdue University, Rensselaer Polytechnic Institute and Yale University.

The New York-based institute will focus on the development of nanomaterial systems; atomic-scale fabrication technologies; predictive modeling protocols for devices, subsystems and systems; power dissipation management designs; and realistic architectural integration schemes for realizing novel magnetic and molecular quantum devices. It will also receive additional direct funding from IBM, and the Semiconductor Industry Association says support from New York State is expected.

The industry consortium — known as Nanoelectronics Research Corp., or NERC — and NSF also announced supplemental grants for nanoelectronics research during fiscal year 2006 at five existing NSF nanoscience centers: Network for Computational Nanotechnology at Purdue University, Center for Nanoscopic Materials at the University of Virginia, Materials Research Science and Engineering Center at the University of California, Santa Barbara, Center for Electronic Transport in Molecular Nanostructures at Columbia University, and Center for Nanoscale Systems and their Device Applications at Harvard University.

In addition, NERC announced an individual grant to the research team led by Professor Banerjee at the University of Texas at Austin for exploratory work in spintronics, and NSF announced an additional supplemental grant for nanoelectronics research to the Center for Semiconductor Physics in Nanostructures at the University of Oklahoma/University of Arkansas.

The companies participating in NRI (Advanced Micro Devices, Inc.; Freescale Semiconductor, Inc.; International Business Machines Corp.; Intel Corp.; Micron Technology, Inc.; and Texas Instruments, Inc.) will assign researchers to collaborate with the university teams.

– David Forman

December 2, 2005 – Total utilization rates for IC production rose slightly to 90.1% in 3Q05, up from 89.1% in 2Q05 and 84.8% in 1Q05, with run rates for leading-edge technologies (<0.12 micron and 0.12-0.16 micron) pushing to 96%-97% by the end of September, according to data from the Semiconductor Industry Capacity Statistics (SICAS) program.

More mature technologies also showed strength in 3Q05 — utilization rates for 0.16-0.2 micron and 0.2-0.3 micron process technologies bounced up to 87.2% and 85.3% respectively, after each bottomed out at three-year lows just two quarters earlier. Overall IC capacity rose slightly from the prior quarter (3.3%) and from a year ago (8.7%) to 1.58 million wafer starts/week (200mm-equivalent wafers).

Chipmakers increased capacity for 300mm operations by 15% quarter-on-quarter to 259,400 wafer starts/week and bumped up actual usage by 17% to 240,500 wafer starts/week, pushing 300mm utilization rates to 92.7%.

Foundry utilization rates increased to 92.3% in 3Q05, maintaining solid growth from 83.0% in 2Q05 and 75.2% in 1Q05.

Foundry capacity declined 2.9% sequentially to 245,200 wafer starts/week (up 18.1% year-on-year), while wafer starts increased 8%. — James Montgomery, News Editor
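For readers unfamiliar with how these percentages are derived, utilization is simply actual wafer starts divided by installed capacity. A minimal sketch using the 300mm figures quoted above (the formula is standard; the numbers are the reported 3Q05 values):

```python
# Utilization rate = actual wafer starts / installed capacity.
# Numbers below are the 300mm figures reported above for 3Q05.

capacity_wspw = 259_400   # 300mm capacity, wafer starts/week
actual_wspw = 240_500     # 300mm actual usage, wafer starts/week

utilization_pct = actual_wspw / capacity_wspw * 100
print(f"300mm utilization: {utilization_pct:.1f}%")  # matches the reported 92.7%
```

The same ratio applied to any of the technology-node or foundry figures above reproduces the corresponding published rates.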

Spaces of innovation


December 1, 2005

Cleanroom environments are being found in more diverse settings than ever before

By Jerry Kinkade, AIA, NCARB, and Josh Rownd, AIA, NCARB

Cleanrooms were once a specialty space relegated to corporate high-tech, computer-chip research and manufacturing. In recent years, however, cleanrooms have been built in university settings for teaching and basic research. The academic uses for cleanrooms have expanded from the traditional into biology and other life sciences. These sophisticated, multifunctional, and interdisciplinary cleanrooms provide critical solutions to modern research and education.

Cleanroom design has also responded to the scientific community’s need to collaborate. Large, ballroom-type cleanrooms and as-needed space assignments have replaced the traditional bay-and-chase design. Additionally, the interdisciplinary scientists using the space expect cleanrooms to provide not only a clean environment, but also vibration-free floors and benches, space free of electromagnetic fields, and temperature-controlled air. To best guard against these disturbances, many large cleanrooms are being designed for below-grade locations.

Cleanrooms now take into consideration the need to protect the work within as well as the outside natural environment. This means designating the cleanroom as a barrier facility inside a containment facility, with all the required differences. These include different pressures, types of materials and spaces, and procedures, such as those pertaining to gowning.

These cleanrooms are, however, expensive to build and operate. To offset the high costs, public-private partnerships are proliferating. Such partnerships engage both public and private entities’ unique strengths and direct joint endeavors toward achievements that optimize public needs, funds, and services. Furthermore, pairing higher education with corporate financing and resources helps overcome hardship posed by government cuts to science and research facilities. As a result, these combined efforts bear the fruit of technically advanced laboratories and cleanrooms.

Gearing up for higher education

Cleanrooms are likely the most expensive research spaces to build and operate, but ways exist to cut their costs. Some larger universities, for instance, opt to build technology parks on their campuses, where a private manufacturer might co-locate its facility and share resources with the university.

Public-private technology parks often incorporate “incubator” facilities that include pilot plants and mini-manufacturing research facilities. Used to iron out a manufacturing process, they produce small numbers of market-ready products without conducting large-scale manufacturing operations. Such facilities generally need to be highly flexible, since the manufactured products inevitably change. Good design will therefore furnish a pilot plant with more utilities and types of power than any single product may require; this abundance allows the flexibility to adapt to rapid changes from R&D to product development to pilot plants and back again. These utilities include clean air for heating and cooling, clean power, independent systems and, in many cases, back-up electrical generators and manifold gas farms.

The university’s faculty and students, as well as the corporation, benefit. Academic cleanrooms create opportunities for new business and train tomorrow’s engineers and scientists at both graduate and undergraduate levels. This sets up a revolving door of opportunity in that it provides a corporate funding stream to the university, thus enabling the university to offer its students real-world experience. This, in turn, makes the students more valuable, and over time will attract better students and faculty members to the institution.

Broader science

At Purdue University’s Birck Nanotechnology Center, designers worked with the university early on in the project to define the school’s needs (see Fig. 1). Before long, user groups began to expand. Soon, every science and engineering department in the university laid claim to space in the new facility. The ballroom-type cleanrooms and laboratories thus resulted in a far more multifunctional space than originally planned. Ballroom cleanrooms, unlike bay and chase designs, are more flexible in that they allow the research to be broader in nature. When research is completed in one lab, for example, it may be used for another project. This makes the space more attractive to prospective researchers. As a result, space at the Birck Center is not permanently assigned.


Figure 1. The Birck Nanotechnology Center, nearing completion in West Lafayette, Indiana, will provide approximately 215,000 gross square feet of interactive, interdisciplinary laboratory, cleanroom, office, teaching laboratories and seminar space to pursue research in nanoscale applications. Image courtesy of HDR Architecture.

Laboratories within the same facility can vary greatly. Three-quarters of the Birck labs have been designated for nanotechnology research. They are built as a positive-pressure barrier facility so as to protect research inside the room from outside contaminants. The remaining quarter of the labs are intended for bio research. They were built as a containment facility, which uses negative pressure to protect the outside world from indoor contaminants.

Not only is the pressurization different, but the finishes are different as well. For example, flooring varies between barrier and containment facilities. Nanotech labs require a raised floor where water cannot be introduced, while biology labs require a seamless floor to facilitate cleaning and disinfection. Additionally, separate gowning areas are needed because the gowning process, including showers and sinks, differs between barrier and containment labs.

Budget constraints have led to the combining of teaching and research laboratories, which have historically been separate. Undergraduate teaching labs are designed for greater supervision, while graduate students use research-style labs. Teaching labs are generally less complicated. Undergraduate and graduate students may be able to use the same cleanroom to take advantage of economies of scale, but may need separate support areas. Examples of duplicated support space include instrument rooms, set-up, or storage rooms.

Considerations for sensitive environments

To protect sensitive tools and experiments, extreme measures must sometimes be designed into a cleanroom. This is especially true for electromagnetic interference (EMI) and vibration. Buildings are often surrounded by electrified systems, such as light rail, subways, and trains, that produce electromagnetic fields (EMF). These EMF emissions can interfere with electrically sensitive equipment (e.g., computer monitors, diagnostic tools, electron microscopes).

Some magnetic fields can be shielded using ferromagnetic and/or thick, highly conductive materials. Examples include carbon steel, silicon-iron steel, nickel-iron, cobalt-iron, and copper or aluminum. Magnetic fields from static DC and AC light rail sources, however, are difficult to shield. They easily permeate nearly all materials including people, trees, buildings, concrete, and most metals. Here, designers mitigate damage from these fields by increasing the distance between the EMI source and susceptible electronics.

Vibration interference comes from a wide range of sources. Passing vehicles (cars, buses, trains, etc.) cause vibrations, but so can a light wind against a building’s side. Below-grade labs aren’t immune: forces on the building’s above-grade portion can transfer through its structural system. Even if a lab is completely below grade, wind blowing against a tree, or even someone mowing the lawn, can vibrate the earth.


Figure 2. The NIST Nanofabrication Facility and ultraclean wing of the Advanced Measurement Laboratory (Gaithersburg, Md.) will be operated as a user facility. It will provide NIST’s collaborators with access to expensive nanofabrication tools and specialized expertise in a shared-cost environment. Image courtesy of HDR Architecture. Photography by Steve Hall © Hedrich Blessing.

The most sophisticated labs, such as the National Institute of Standards and Technology’s (NIST) Advanced Measurement Laboratory (AML) in Gaithersburg, Maryland, have done just this (see Fig. 2). The AML placed its most sensitive laboratories well below grade surrounded by the aboveground facilities, thus minimizing external negative impacts. Dr. Eric Steel of NIST’s Surface and Microanalysis Science division was quoted as saying, “Our new laboratory provides a dramatically better environment. It took us over three years to get a $1 million surface analysis instrument in our laboratory to meet its designed resolution. Within a week of moving into AML, the same instrument met spec.” (See Fig. 3)


Figure 3. NIST’s experimental Molecular Measuring Machine (M3) for ultraprecise, two-dimensional measurements was recently installed in an ultraclean laboratory in one of the Advanced Measurement Laboratory’s underground metrology wings. The instrument is designed to be operated by remote control from the adjoining room to minimize environmental disturbances. Image courtesy of HDR Architecture. Photography by Steve Hall © Hedrich Blessing.


Forward-thinking features

Clean environments can be disturbed easily. Many facilities are therefore being designed with a visitor corridor alongside the lab's exterior. This allows visitors a view into the heart of the facility without disrupting research experiments or requiring them to gown up. Typically, such corridors have windows with a mechanism to close off the view when necessary.

For the greater good

Cleanroom use is expanding. No longer limited to corporate, high-tech chip research and manufacturing, these highly sophisticated, interdisciplinary spaces provide critical solutions for today's research and education. However, the enormous cost of building and staffing these labs, combined with current economic constraints such as rising interest and inflation rates, forces universities to consider different ways to pursue their goals; the need to seek new paradigms in funding and design has never been greater. Public-private partnering, collaboration among diverse university departments, and multifunctional spaces can all help overcome these barriers. Without collaboration and flexibility, these spaces of innovation will not be built.


Jerry Kinkade, AIA, is a laboratory programmer/planner with HDR Architecture, Inc. in Omaha, Nebraska. Mr. Kinkade focuses his expertise on programming and planning for higher-education research facilities and has led professional teams providing facility analyses, master planning, and programming for both academic and corporate high-tech facilities. He can be reached via e-mail at: [email protected].


Josh Rownd, AIA, is a science and technology principal with HDR Architecture, Inc. in Omaha, Nebraska. With twenty years in the industry, Mr. Rownd brings experience spanning microelectronics, biomedical, medical-device, and pharmaceutical projects, as well as R&D laboratories and cleanrooms. He can be reached via e-mail at: [email protected].

Nov. 30, 2005 — Adnavance Technologies Inc., a Vancouver, British Columbia nanobiotechnology company, announced that it has completed a Series A round of financing totaling $3.85 million.

The company is developing product applications that use the electrical conductivity properties of both DNA and novel metallic forms of DNA for healthcare and nanotechnology applications.

The round was led by the Working Opportunity Fund, with additional participation from BC Medical Innovations Fund, Canadian Medical Discoveries Fund and the Business Development Bank of Canada.

The proceeds will be used to advance the company's core M-DNA technology for a number of commercial applications, with a primary focus on the development of ultrahigh-sensitivity biosensors: molecular detection devices used to improve medical diagnosis and predict disease outcome. The company's technology is designed to enable the detection of a variety of infectious diseases, both viral and bacterial, in less than five minutes. It is also working on other applications.

The company has also secured substantial funding from The National Research Council’s Industrial Research Assistance Program and from The Natural Sciences and Engineering Research Council of Canada to support its research and development programs.