Meet the Nimble-Fingered Interface of the Future

Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's.....

Electronic Implant Dissolves in the Body

Researchers at the University of Illinois at Urbana-Champaign, Tufts University, and ......

Sorting Chip May Lead to Cell Phone-Sized Medical Labs

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a ....

TDK sees hard drive breakthrough in areal density

Perpendicular magnetic recording was an idea that languished for many years, says a TDK technology backgrounder, because the ....

Engineers invent new device that could increase Internet

The device uses the force generated by light to flip a mechanical optical switch on and off at a very high speed........


How Personal Genomics Could Change Health Care

Several months after deciphering his genetic code last year, Stanford bioengineer Stephen Quake approached a cardiologist colleague. Early analysis of his DNA had flagged a rare genetic variant as potentially linked to heart problems. The variant, in fact, was located in a gene linked to sudden cardiac death in athletes, so physician Euan Ashley suggested Quake visit his office for some follow-up screening. Inspired by that meeting, the scientists spent the next year figuring out how to examine his genome in a way that would be meaningful to both Quake and his doctor.

The result--published today in The Lancet--is the most comprehensive clinical analysis of a human genome to date, highlighting both the medical potential of genomics and the hurdles that remain. "We wanted to try to answer the question of what a physician should do when a patient walks into the office with a copy of his genome and says 'treat me,' " says Quake, who was named one of Technology Review's top young innovators in 2002.

As the cost of sequencing has plummeted in the last few years--from about $3 billion for the Human Genome Project to less than $5,000 today--the number of complete human genomes has grown rapidly. Hundreds have now been sequenced, though only about 13 have been made public. Scientists are moving their focus from the technical hurdles of sequencing itself to what they say will be a much more difficult task: analyzing the content of genomes to better understand human disease and individual health risks.

Quake and 13 other "genome pioneers"--a select group who have had their entire genomes sequenced--described their efforts to use their genomes to better manage their health at a conference in Cambridge, MA, this week. The early adopters included James Watson, co-discoverer of the structure of DNA, Harvard professor Henry Louis Gates, Jr., entrepreneur Esther Dyson, 17-year-old Anne West, and a handful of genomics executives.

Despite the complexity and remaining mystery of the human genome--scientists still don't know the function of 90 to 95 percent of human genes--many of the pioneers at the event described using their genomic information to make medical decisions. John West, former CEO of Solexa, a sequencing company that was acquired by genomics giant Illumina, recently had his genome sequenced along with that of his wife and two children. West and his wife discovered they have a higher risk of a certain type of glaucoma, which sent them to the ophthalmologist for screening. "Now we know there is something to look for, and the test is easy and relatively inexpensive," he said at the conference. (Daughter Anne presented the results of her analysis of her family's genomes to the illustrious audience.)

Seong-Jin Kim, director of the Lee Gilya Cancer and Diabetes Institute at Gachon University of Medicine and Science, in South Korea, discovered after genome sequencing that he has a tenfold increased risk of macular degeneration, the leading cause of blindness in people over age 60. "I am diligently taking preemptive steps in everyday life to prevent it," he said at the conference. He takes high doses of antioxidants, which have been shown to slow progression of the disease, has regular eye exams, and avoids activities that tend to overexert the eyes. (The scientist is also trying to convince his wife to switch to an LED television, because LED screens may be less damaging to the eyes than LCDs.)

While individual genetic tests could have been performed for each of the medical conditions in these cases, the cost of genome sequencing is dropping so quickly that it will soon be cheaper to sequence the whole genome rather than various parts. Quake, who published his own genome sequence, without interpretation, in 2009, can now go back to his genome anytime a new publication describes the possible implications of one of his variations.

In the Lancet paper, Quake and his collaborators undertook a comprehensive analysis of his genome. The researchers focused on variants that had been linked to risk of disease in previous studies, and those thought to play a role in a patient's response to drugs. Beginning with the average risk of a particular disease for someone of Quake's age and background, they added or subtracted risk using the genetic information. "That's the most challenging thing, there is no accepted method for how to do that," says Ashley. "We tried to prioritize what would be the most important thing to discuss with a doctor."
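
The paper describes no standard recipe for this step, but the arithmetic--start from a baseline probability, then push it up or down per variant--can be sketched with a common epidemiological device: convert the pre-test probability to odds, multiply by a likelihood ratio for each variant, and convert back. The function and numbers below are purely illustrative, not the method or data from the Lancet study:

```python
# Illustrative only: combine a baseline disease risk with per-variant
# likelihood ratios by working in odds space (a standard way to stack
# independent risk factors). The LR values here are made up.

def update_risk(baseline_risk, likelihood_ratios):
    """Return the post-test probability after applying each variant's LR."""
    odds = baseline_risk / (1 - baseline_risk)  # probability -> odds
    for lr in likelihood_ratios:
        odds *= lr                              # each variant scales the odds
    return odds / (1 + odds)                    # odds -> probability

# Hypothetical patient: 5% baseline risk, two risk-raising variants
# (LR > 1) and one protective variant (LR < 1).
print(round(update_risk(0.05, [1.4, 1.2, 0.8]), 3))  # about 0.066
```

Real analyses must also handle non-independent variants and the wide confidence intervals on published risk estimates, which is part of why Ashley calls this the most challenging step.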

Highlighting just how difficult it is to analyze a genome, Quake points out that it took just a few weeks to sequence his genome, which was published in a scientific paper with three authors. Analyzing the genome for its clinical relevance took a year, and the resulting paper has 20 authors.

The rare variant that led Quake to Ashley's office in the first place provides an illustrative example of the state of genome interpretation. The variant is located in a gene that is well-known to be linked to sudden death. But it turns out that Quake's particular variation is fairly common and present in some healthy people. Combining that knowledge with the results of screening tests--all perfectly normal--led the team to conclude it isn't dangerous. The team also found a completely new variant in that gene, which they haven't yet been able to interpret. "At the moment our tools are relatively limited," says Ashley. 

In addition, researchers found variants in other genes linked to cardiomyopathy, a disease that weakens and enlarges the heart, which may help explain a history of sudden death in the family. "Maybe I could have guessed it based on family history," says Quake. "But it's one thing to know there is a family history and another to know I had the allele. That sent me to a cardiologist."

Quake says he hasn't listened to all the medical advice derived from his genome. While he learned that he has a higher-than-average genetic risk for other types of heart disease, he found, using the traditional method physicians use to calculate risk--which doesn't include genetic information--that he's just under the threshold for statin use. Ashley felt the genetic risk factors were enough to put him over the line and suggested he start taking the drugs. "I haven't followed that yet," says Quake. "I'm still thinking about it."

Some experts are already concerned that widespread genetic testing will lead to unnecessary medical follow-ups, driving high medical costs even higher. But Quake hopes that genome sequencing will ultimately lower costs. "I think this will provide a way to ration health care so that people at risk can get [screening tests] more frequently and those who aren't get it less frequently," he says. 

By Emily Singer 
From Technology Review

Sculpting a Nano 'World'

IBM researchers have invented a low-cost and relatively simple fabrication tool capable of reliably creating features as small as 15 nanometers. To show off the tool, the researchers at IBM's Zurich lab made a three-dimensional map of the Earth so small that 1,000 of them would fit onto a single grain of salt.

Nano-cartography: IBM Zurich researchers have created a tiny map of the world; 1,000 such maps could fit on a grain of salt. The map was drawn using a new nano-fabrication tool capable of creating features as small as 15 nanometers.

Existing nano-fabrication techniques like electron beam lithography have difficulty making features much smaller than 30 nanometers, and the instruments involved are expensive and complex. In contrast, the IBM researchers say their new fabrication tool sits on a tabletop at one-fifth to one-tenth the cost.

The new instrument is a descendant of the scanning tunneling microscope (STM) invented by IBM Zurich scientists in the early 1980s. That microscope made it possible, for the first time, to image and manipulate atoms. The new instrument uses an extremely small silicon tip that is rapidly scanned across the surface of the substrate. The tip is cantilevered like those used in atomic force microscopy (or AFM: an offshoot of STM that was invented in 1986), enabling it to apply nanonewtons of force to the surface. But unlike AFM, the tip is heated.

Where it touches the substrate, the thermal energy at the tip is sufficient to break weak bonds within the material. "We provide enough thermal energy so these molecules become mobile, crawl along the hot tip and evaporate," says Urs Duerig, a scientist at IBM's Zurich Research Laboratory, in Switzerland. Together with colleague Armin Knoll and others, Duerig developed the new technique. What's remarkable about this, he says, is that it removes exactly the same amount of the material each time.

The advantage of the new instrument, compared to techniques such as e-beam lithography that involve removing material by bombarding it with particles, is that the effect is more localized. Although e-beam lithography can create features as small as 15 nanometers, at resolutions below 30 nanometers stray electrons tend to cause interactions with parts of the material neighboring the target area.

One advantage of the new technique is that it can bore down into the substrate at different depths, again at very high resolution. This was demonstrated by etching a 25-nanometer-high topographical representation of the Matterhorn, at a scale of 1:5 billion, into a molecular glass substrate. The 3-D image was made by selectively removing material in 120 different layers.
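
Those numbers imply per-layer depth control of roughly a fifth of a nanometer--on the order of a single atomic step. A quick back-of-envelope check:

```python
# Depth resolution implied by the Matterhorn demo: a 25-nanometer-high
# relief built up from 120 distinct removal layers.
relief_height_nm = 25
layers = 120
depth_per_layer_nm = relief_height_nm / layers
print(f"{depth_per_layer_nm:.2f} nm per layer")  # 0.21 nm per layer
```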

Nano-patterning: At the heart of the new tool is a tiny silicon tip. It is able to carve out features as small as 15 nanometers through heating and the application of nanonewtons of pressure.

This ability to create 3-D structures is intriguing, says Zahid Durrani at Imperial College London. "It's completely novel," he says. "I've never seen anything like this before." However, as with other probe technologies, extending the process to large numbers of tips operating in parallel is likely to prove challenging, says Durrani.

Karl Berggren, co-director of MIT's Nanostructures Laboratory, says IBM's instrument is an incredibly "clever and elegant" solution. "They've done something quite creative here," he says. Researchers have long struggled with thermal methods of probe lithography, but the approach was slow and resolutions were mediocre, says Berggren. "IBM has changed that," he says. "So making sub-20-nanometer-scale lithography available to labs that need it at reasonable cost may be the long-term legacy of this work. And it is a very important one."

In contrast, e-beam lithography requires several steps and tends to be very expensive, with systems costing up to $5 million, says Berggren. The IBM instrument is small enough to sit on a desktop and should cost around $100,000.

It is also relatively fast, says Duerig. Because the tip can write each "pixel" in microseconds, it can be scanned across the substrate very rapidly. The world map, for example, which consists of 500,000 pixels, took just two minutes to draw.
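
Those two figures are consistent: spreading two minutes over 500,000 pixels gives a few hundred microseconds per pixel on average, which leaves room for microsecond-scale pixel writes plus the time spent scanning the tip between them:

```python
# Average time budget per pixel for the world-map demo.
pixels = 500_000
total_seconds = 120            # "just two minutes"
avg_us_per_pixel = total_seconds / pixels * 1e6
print(f"{avg_us_per_pixel:.0f} microseconds per pixel on average")  # 240
```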

A crucial step in developing this technique involved finding suitable organic substrates. To this end, colleagues at IBM Research-Almaden, in California, were brought in to help find hard organic substrates that could be used as so-called "resists," a sort of mask used in chip fabrication.

The challenge was to find materials that were tough enough to be used as substrates, but which could be thermally decomposed easily, evaporating into nonreactive chunks when brought into contact with the hot tip. In the case of the world map, a polymer called polyphthalaldehyde was found suitable, and for the Matterhorn, the IBM scientists used a form of molecular glass.


By Duncan Graham-Rowe
From Technology Review

Self-Powered Flexible Electronics

Touch-screen computing is all the rage, appearing in countless smart phones, laptops, and tablet computers.

Now researchers at Samsung and Sungkyunkwan University in Korea have come up with a way to capture power when a touch screen flexes under a user's touch. The researchers have integrated flexible, transparent electrodes with an energy-scavenging material to make a film that could provide supplementary power for portable electronics. The film can be printed over large areas using roll-to-roll processes, but it is at least five years from market.

 On a bender: This machine is testing the electrical properties of a graphene sheet. Korean researchers have incorporated these stretchy electrodes with thin-film nano-generators to make an energy-harvesting screen.

 The screens take advantage of the piezoelectric effect--the tendency of some materials to generate an electrical potential when they're mechanically stressed. Materials scientists are developing devices that use nanoscale piezoelectronics to scavenge mechanical energy, such as the vibrations caused by footsteps. But the field is young, and some major challenges remain. The power output of a single piezoelectric nanowire is quite small (around a picowatt), so harvesting significant power requires integrating many wires into a large array; materials scientists are still experimenting with how to engineer these screens to make larger devices.
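
The picowatt figure makes clear why integration is the crux: even a microwatt of useful power calls for on the order of a million nanowires working together. As an order-of-magnitude check:

```python
# Order-of-magnitude estimate: how many ~1 pW nanowires are needed to
# reach a modest power target.
power_per_wire_watts = 1e-12   # roughly a picowatt per nanowire
target_watts = 1e-6            # one microwatt
wires_needed = target_watts / power_per_wire_watts
print(f"{wires_needed:.0e} wires")  # 1e+06 wires
```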

Samsung's experimental device sandwiches piezoelectric nanorods between highly conductive graphene electrodes on top of flexible plastic sheets. The group's aim is to replace the rigid and power-consuming electrodes and sensors used on the front of today's touch-screen displays with a flexible touch-sensor system that powers itself. Ultimately, this setup might generate enough power to help run the display and other parts of the device. Rolling up such a screen, for instance, could help recharge its batteries.

"The flexibility and rollability of the nano-generators gives us unique application areas such as wireless power sources for future foldable, stretchable, and wearable electronics systems," says Sang-Woo Kim, professor of materials science and engineering at Sungkyunkwan University. Kim led the research with Jae-Young Choi, a researcher at Samsung Advanced Institute of Technology.

The same group previously put nano-generators on indium tin oxide electrodes. This transparent, conductive material is used to make the electrodes on today's displays, but it is inflexible.

To make the new nano-generators, the researchers start by growing graphene--a single-atom-thick carbon material that's highly conductive, transparent, and stretchy--on top of a silicon substrate, using chemical vapor deposition. Next, through an etching process developed by the group last year, the graphene is released from the silicon and transferred by rolling a sheet of plastic over the surface. The graphene-plastic substrate is then submerged in a chemical bath containing a zinc reactant and heated, causing a dense lawn of zinc-oxide nanorods to grow on its surface. Finally, the device is topped off with another sheet of graphene on plastic.

In a paper published this month in the journal Advanced Materials, the Samsung researchers describe several small prototype devices made this way. Pressing the screen induces a local change in electrical potential across the nanowires that can be used to sense the location of, for example, a finger, as in a conventional touch screen. The material can generate about 20 nanowatts per square centimeter. Kim says the group has subsequently made more powerful devices of about 200 square centimeters. These produce about a microwatt per square centimeter. Kim says this is enough for a self-powered touch sensor and "indicates we can realize self-powered flexible portable devices without any help of additional power sources such as batteries in the near future."
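
Scaling the reported power densities to a device-sized area is straightforward arithmetic. The densities below come from the article; the prototype's area is not given, so one square centimeter is assumed here purely for illustration:

```python
# Total harvested power = power density x active area, using the
# article's reported density figures.
def total_power_microwatts(density_uw_per_cm2, area_cm2):
    return density_uw_per_cm2 * area_cm2

prototype = total_power_microwatts(0.02, 1.0)  # 20 nW/cm^2; 1 cm^2 assumed
large = total_power_microwatts(1.0, 200.0)     # ~1 uW/cm^2 over 200 cm^2
print(prototype, large)  # 0.02 200.0
```

A 200-microwatt trickle won't run a display, but it is plausible for the self-powered touch sensing Kim describes.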

"It's pretty impressive to integrate all these things in a foldable, macroscale device," says Michael McAlpine, professor of mechanical engineering at Princeton University. He notes that the potential of zinc oxide nanowires as a piezoelectric sensing material and nanoscale power source was previously demonstrated by Georgia Tech materials scientist Zhong Lin Wang. But integrating these materials over a large area with a flexible, transparent electrode opens up new applications, says McAlpine.

The methods used to make the nano-generators are compatible with large-scale manufacturing, according to Kim. His group is working to boost the power output of the films--the main obstacle is the quality of the electrodes. One possible solution is to improve the connection between the nanowires and the electrodes by eliminating flaws in the structure of the graphene. The Korean group is also experimenting with adding small amounts of impurities to the material, a process called doping, to improve its conductivity.

By Katherine Bourzac
From Technology Review

New chip stores a billion pages in one square inch

A North Carolina State University professor has developed a computer chip that can store an entire library’s worth of information. The chip exploits a new development in the use of nanodots, or nanoscale magnets, and represents a significant advance in computer-memory technology.


“We have created magnetic nanodots that store one bit of information on each nanodot, allowing us to store over one billion pages of information in a chip that is one square inch,” says Dr Jay Narayan, professor of Materials Science and Engineering at NC State.

The breakthrough is that the nanodots are made of single, defect-free crystals, creating magnetic sensors that are integrated directly into a silicon electronic chip. 

These nanodots, which can be made uniformly as small as six nanometers in diameter, are all precisely oriented in the same way – allowing programmers to reliably read and write data to the chips.
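
The billion-page figure can be sanity-checked, with two assumptions the article does not state: the dots sit on a pitch equal to their 6-nanometer minimum diameter, and a page of text takes roughly 2 kilobytes:

```python
# Sanity check of "a billion pages per square inch" at one bit per dot.
INCH_NM = 2.54e7                       # one inch in nanometers
dot_pitch_nm = 6                       # assumed: dots packed at minimum size
bits = (INCH_NM / dot_pitch_nm) ** 2   # one bit per nanodot
pages = bits / 8 / 2048                # bytes, then ~2 KB per text page
print(f"{bits:.1e} bits, {pages:.1e} pages")  # about 1.8e+13 bits, 1.1e+09 pages
```

Real media would need spacing between dots and error-correction overhead, so the billion-page claim sits at the optimistic, tightly packed end of this estimate.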

The chips themselves can be manufactured cost-effectively, but the next step is to develop magnetic packaging that will enable users to take advantage of the chips--using something such as laser technology that can effectively interact with the nanodots.

From tgdaily.com

Asteroid may show how life came to Earth

For the first time, water and organic molecules have been detected on an asteroid. The discovery lends plausibility to the theory that both water and the building blocks of life arrived on Earth through asteroid strikes. University of Central Florida researchers found a thin layer of water ice and organic molecules on the surface of 24 Themis, the largest in a family of asteroids orbiting between Mars and Jupiter.

"What we've found suggests that an asteroid like this one may have hit Earth and brought our planet its water," said UCF physics professor Humberto Campins, the study's lead author.

Using NASA's Infrared Telescope Facility in Hawaii, Campins and his team of researchers measured the intensity of the reflected sunlight as 24 Themis rotated, indicating the makeup of the asteroid's surface.

They were surprised to find ice and carbon-based compounds evenly distributed on 24 Themis. The discovery of ice was particularly unexpected, because asteroids at this distance from the sun were believed to be too warm to sustain it.

The researchers plan to check out several hypotheses to explain the presence of ice. Perhaps most promising, they say, is the possibility that 24 Themis might have preserved the ice in its subsoil, just below the surface.

From tgdaily.com

Innovative digital technologies assist specialists in anatomical reconstruction

"The common thread of digital technology in fields from prosthetics to surgery to anthropology is its ability to enhance outcomes," said Suzanne N. Verma, MAMS, Assistant Professor and Anaplastologist, Oral and Maxillofacial Surgery at the Texas A&M Health Science Center Baylor College of Dentistry in Dallas, who will co-chair the symposium. "Technology is the palette and the specialist's creativity is the brush."

Kenneth E. Salyer, MD, FACS, FAAP, of the World Craniofacial Foundation in Dallas will discuss how he used technology in planning the surgery performed to separate Egyptian conjoined twins Mohamed and Ahmed Ibrahim in 2003. The twins were joined at the top of their heads. Lessons learned from the successful separation and reconstruction of the twins are opening up new opportunities for future work in tissue engineering and regenerative medicine.

Douglas Owsley, PhD, Curator and Head of the Division of Physical Anthropology at the Smithsonian's National Museum of Natural History in Washington, D.C., will discuss scientists' perspective of Kennewick Man, one of the earliest skeletons ever found in the Americas. Kennewick Man is more than 9,000 years old, and Dr. Owsley used digital technology to scan the specimen's skull and help determine what the man would have looked like with facial muscles and skin.

Ms. Verma will speak about how digital technology assists her in planning surgery and designing facial prostheses. "For example, we can use radiographic data to virtually create a 3D model of our patients, allowing us to preoperatively plan where to place an implant, plan the surgical approach for removing a tumor, or use the data to create a physical model of the missing anatomy," she said.

Andy Christensen, President of Medical Modeling Inc. in Golden, Colo., and co-chair of the symposium, will discuss tactile medical modeling and the digital reconstruction process. In tactile medical modeling, specialists use data from digital imaging processes such as computed tomography and magnetic resonance imaging to create accurate plastic models.

Other topics to be presented at the symposium include the assessment of hard tissue structure and mechanics using digital models, and synchronizing sound, spatial positioning and anatomic visualizations in real time.
Provided by Federation of American Societies for Experimental Biology

From physorg.com

HIV Patients Hold Clues to Salmonella Vaccine Development

Nontyphoidal strains of Salmonella (NTS) usually cause vomiting and diarrhoea in high-income countries and are mainly contracted by consuming infected foods, such as uncooked meat and eggs. NTS can also cause fatal bloodstream infections in people with compromised immunity, such as HIV-infected individuals, and children under two years of age or with malaria, anaemia or malnutrition.

This is a particular problem in Africa where Salmonellae are the most common bacteria to infect the blood. Such bloodstream infections can be treated with antibiotics, but drug resistance is on the increase and there is currently no vaccine available.

Cells of Salmonella isolated from macrophages. 


"The association between HIV infection and fatal cases of nontyphoidal Salmonella disease has been known since the onset of the AIDS pandemic 26 years ago, but this is the first time we've been able to offer a scientific explanation why," said Dr Cal MacLennan from the University of Birmingham, who led the research.

In an earlier study of African children, the team of researchers working at the Malawi-Liverpool-Wellcome Trust Clinical Research Programme, at the University of Birmingham and the University of Malawi's College of Medicine, had shown that protective Salmonella-specific antibodies generated in the first two years of life are critical for controlling the infection.

In the new study, the researchers turned their attention to immunity in African adults. While blood samples from HIV-uninfected adults killed Salmonella without difficulty, those from many HIV-infected Africans could not kill Salmonella. Since HIV causes significant defects in the immune system, the team examined whether a lack of these antibodies might account for the absence of killing and explain why HIV-infected adults are particularly susceptible to Salmonella infections.

Contrary to expectations, the team found that blood from HIV-infected adults harboured high levels of antibodies to Salmonella, molecules that normally help the immune system to fight infections. However, unlike the antibodies in healthy adults, these antibodies were unable to kill Salmonella. In fact, antibodies from these people actually stopped the antibodies from healthy adults from killing Salmonella.

The team went on to show that this difference in ability to kill Salmonella is due to the part of the Salmonella that the antibodies bind to. The protective 'killing' antibodies bind to structures on the surface of the bacteria known as outer membrane proteins. This then allows the immune system to destroy the Salmonella bacteria.

On the other hand, large numbers of antibodies in HIV-infected Africans bind to a structure that sticks out from the surface of the Salmonella, known as LPS (lipopolysaccharide). These 'blocking' antibodies appear to divert the immune system away from the surface of the bacteria and stop the killing antibodies from doing their job.

When the researchers specifically removed the blocking antibodies from HIV-infected blood samples, they found killing antibodies present in the blood that could once again kill the bacteria. This shows that people infected with HIV still have the protective killing antibodies generated in the first two years of life that can control Salmonella infection, but the excess of blocking antibodies stops the killing antibodies from working.

"We normally think of HIV patients as being more susceptible to bacterial infections because of deficiencies in their immune systems, and often they have problems making antibodies when given vaccinations. In the present study, we found that it's actually an excess of antibodies that causes the problem," explained Dr MacLennan.

"The findings are important because LPS is currently being investigated as a potential target for a vaccine. Our observations that antibodies targeting LPS can actually impede the protective immune response to Salmonella would caution against this, suggesting that such a vaccine could do more harm than good."

A vaccine that protects both young children and HIV-infected adults from fatal cases of NTS is urgently needed in Africa. The findings from this study suggest that the outer membrane proteins could potentially serve as alternative vaccine targets and this is an area that the team is currently investigating.

From sciencedaily.com

Concentrated Solar Set to Shine

A California-based startup, Amonix, has received $129 million in venture-capital investments to further its commercialization of concentrated photovoltaic technology. The company's product combines powerful lenses, a tracking system, and solar cells for large, highly efficient solar-power installations. The funding could give the company, and the emerging field of concentrated photovoltaics, the boost it needs for widespread utility-scale deployments.

Concentrated power: Amonix's solar power generator converts 25 percent of sunlight into AC power.

"We've looked at 100 solar companies in the last 18 months, and Amonix is the one that stood out to us as having breakout potential," says Ben Kortlang, a partner at venture capital firm Kleiner Perkins Caufield & Byers, which led the recent investment. 

Amonix recently launched its newest solar concentrator, which converts one fourth of the sunlight that falls on it into AC electricity. That's compared with the approximately 18 percent system efficiency--including inverters that convert solar's DC power to usable AC power--of the most efficient photovoltaic systems that don't use special optics or track the sun.

To collect sunlight as efficiently as possible, Amonix starts with a massive 23.5-meter-by-15-meter array. The array is covered with thin, plastic Fresnel lenses, each measuring 350 square centimeters, that focus sunlight to an area that's 0.7 square centimeters. The sunlight, concentrated to 500 times its normal intensity, hits an ultra-efficient multi-junction solar cell that converts 39 percent of the light into electricity. The cell, made by Spectrolab, is the most efficient in the world, demonstrating more than 41 percent efficiency in lab tests. To further enhance performance, Amonix uses a tracking system that keeps the lenses pointed within 0.8 degrees of the angle of the sun throughout the day.
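
Two of the quoted figures can be checked against each other: the lens-to-spot area ratio reproduces the 500x concentration, and comparing the 39 percent cell with the 25 percent system shows the optics, tracking error, and inverter together pass roughly 64 percent of the cell's output through to AC power:

```python
# Cross-checks on the article's numbers.
lens_area_cm2 = 350.0
spot_area_cm2 = 0.7
concentration = lens_area_cm2 / spot_area_cm2
print(f"{concentration:.0f}x concentration")      # 500x

cell_eff = 0.39     # multi-junction cell efficiency
system_eff = 0.25   # full AC system efficiency
derate = system_eff / cell_eff
print(f"{derate:.0%} of cell output reaches AC")  # 64%
```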

Utility companies, however, have been reluctant to invest in any concentrated photovoltaic systems due in part to the device's high level of complexity. Proper functioning of each component is crucial because the lenses require very precise alignment with the sun in order to focus light on the solar cells. "The difference between being in alignment and being one degree off is the entire system works or it doesn't," says Johanna Schmidtke, an analyst with Lux Research. 

Amonix's technology already accounts for some 13 megawatts of installed capacity, which represents more than half of all installed concentrated photovoltaic capacity in the world. And so long as Amonix can prove its reliability, its technology offers several distinct environmental advantages over other types of utility-scale solar. 

Concentrated solar power--solar thermal systems that use highly concentrated sunlight to create steam that drives electric turbines--has begun to run afoul of environmental regulations because such systems typically require vast amounts of water. In contrast, concentrated photovoltaics don't require water to generate electricity and, because of their high efficiency, they don't blanket large swaths of land.

The recent venture capital funding will allow Amonix to scale up manufacturing, and perhaps more importantly, will strengthen the company's balance sheet. "If you are going to be a supplier to the utility industry, you have to be a well-capitalized company that can stand behind its deployments," Kortlang says. 

By Phil McKenna
From Technology Review

Pigs Offer Cystic Fibrosis Clues

Researchers at the University of Iowa and the University of Missouri have developed a better animal model of cystic fibrosis. Newborn pigs that have been bred with a genetic mutation for the disease are the first animals to exhibit clinical symptoms similar to those in humans with the condition. The results point to a more effective way of studying cystic fibrosis and finding drugs to treat it.

In the muck: Pigs bred with a mutation for cystic fibrosis exhibit classic symptoms found in humans with the disease, including a thickened layer of mucus (top, in magenta) lining the membranes of organs like the lung and pancreas.


Cystic fibrosis is one of the most common life-shortening hereditary diseases. It affects 30,000 people in the United States and 70,000 worldwide. For years, scientists have tried to track the disease in mice engineered with the genetic mutation for cystic fibrosis, but the mice have not developed the trademark symptoms, including chronic lung disease. The pigs, whose organs are a closer match with humans, developed lung disease several months after birth. The researchers have published the results in the current issue of Science Translational Medicine.

Part of the motivation for the research is that, while the genetic root of cystic fibrosis has long been known, it's still unclear how this leads to lung disease.

The team found that newborn pigs with the disease had elevated levels of bacteria in their lungs. This finding may help resolve a long-standing debate: whether people with the disease are born with a hypersensitive inflammatory response that leads to lung disease, or whether inflammation occurs only when bacteria are present. 

"One question looming in the field is this chicken or egg concept of whether infection comes first, or whether inflammation precedes," says David Stoltz, assistant professor of pulmonary, critical care and occupational medicine at the University of Iowa, who was involved in the project. "It's important to answer that question because if you now know the sequence of events of lung disease in cystic fibrosis, that could dictate how you treat it."

Currently, physicians may choose to aggressively treat both infection and inflammation to slow the progression of lung disease. However, if researchers identify which of the two comes first in the disease, doctors could focus therapies to treat it much earlier, potentially extending lung function. 

In order to identify the early effects of cystic fibrosis, Stoltz and his colleagues observed the genetically engineered pigs from birth. Within months, the pigs developed signs of lung disease, including airway inflammation, mucus accumulation, and bacterial infection. 

The researchers took lung tissue cultures from both modified and normal piglets, and performed multiple tests to look for early signs of lung inflammation, including elevated white blood cell counts. They also did gene expression profiles related to inflammation. The group found no major differences between the disease and control groups before the disease group began showing symptoms, suggesting that people born with cystic fibrosis are not necessarily born with a hypersensitive immune response. 

Going a step further, the team tested the newborns' ability to fight off infection by introducing Staphylococcus aureus, a bacterium commonly found in infants and children with cystic fibrosis, into their lungs. After four hours, researchers found the bacteria lingered in the lungs of the modified pigs, whereas their healthy counterparts were able to clear it completely. 

"This study says that it seems to be the bacteria triggering the immune response," says William Guggino, director of the Cystic Fibrosis Research Development Program at Johns Hopkins University. "To have an animal model where you could apply drugs and see if they work to correct this bacterial infection is a pretty big advance."

Alice Prince, professor of pediatrics in pharmacology at Columbia University, who was not involved in the study, says studying cystic fibrosis in pigs makes sense physiologically. "I think it's a very promising model," she says. "The immunology of pigs is more like people, and the airway cells in the pig lung are more like a human than in a mouse, so you could try a lot more therapies than you can with mice." 

While the pigs reproduce clear clinical hallmarks of the human disease, the model has drawbacks. Like humans with cystic fibrosis, the piglets are born with a bowel obstruction that, without surgery, is 100 percent fatal. "These are incredibly expensive animals," says Craig Gerard, chief of the division for respiratory services at Children's Hospital in Boston. 

Stoltz and his team plan to test a variety of drugs in the modified pigs, including compounds that counteract the effects of the relevant gene mutation. In cystic fibrosis, the mutated gene, CFTR, alters the activity of a key ion channel in the membranes of organs like the lungs and pancreas, causing thick mucus to build up, which impairs lung function. Scientists have recently identified drug compounds that improve the activity of the ion channel, which could potentially restore ion transport and lung function. 

By Jennifer Chu
From Technology Review

Solar Metamaterials

In an advance that could lead to solar cells that more fully utilize sunlight, researchers at Caltech have designed materials that can bend visible light at unusual but precise angles, no matter its polarization. The scientists hope the materials are a step toward perfectly transparent solar-cell coatings that would direct all the sun's rays into the active area to improve solar power output.




Solar material: Caltech researcher Stanley Burgos uses a focused ion beam microscope to examine a new metamaterial. The material’s microscopic structure, visible on the computer screen, can be tuned to interact with light in unusual ways.


Many groups are working on novel antireflective solar cell coatings in the hopes of getting more light into solar cells. The Caltech group, which includes Harry Atwater, professor of applied physics and materials science, and researcher Stanley Burgos, is addressing the problem by precisely tailoring the structure of materials at the nano and micro scales, creating "metamaterials" that exhibit optical properties that are not found in naturally occurring materials. In the most recent work, Atwater and his coworkers demonstrated a material that precisely controls the path of visible light regardless of the polarization of the light--a first for metamaterials. 

The Caltech metamaterial is a metal film several hundred nanometers thick. The films are patterned with circular cavities, each of which surrounds a wirelike column made of the same material. The space between the wire and the cavity wall is filled with a second metal. Depending on the dimensions of the patterns, the material bends, or refracts, light of different colors to a different degree. Atwater says the goal of his project is to make films with a refractive index exactly equal to that of air. Such a material would not bend light at all but would transmit it perfectly, with no reflection. When light passes from one medium to another, part of it reflects and the rest bends--the mismatch between the refractive indices of water and air is why a straw in a glass of water appears broken. A solar cell coated with a material whose refractive index is identical to that of air would reflect no light at all. 
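The index-matching argument can be made concrete with the normal-incidence Fresnel formula. This quick sketch (ours, not the researchers') shows why reflection vanishes when a coating's refractive index equals air's:

```python
# Fresnel reflectance at normal incidence: the fraction of light
# reflected at the boundary between media with indices n1 and n2.
def reflectance(n1, n2):
    """Reflected fraction of incident light (normal incidence)."""
    return ((n1 - n2) / (n1 + n2)) ** 2

n_air = 1.0
# Index-matched coating, glass-like material, silicon-like material:
for n_coating in (1.0, 1.5, 3.5):
    r = reflectance(n_air, n_coating)
    print(f"n = {n_coating}: {100 * r:.1f}% of incident light reflected")
```

With a perfectly index-matched coating (n = 1.0) the reflected fraction is exactly zero, while a bare silicon-like surface (n ≈ 3.5) reflects roughly a third of incoming light, which is the loss such a coating would recover.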

The films that Atwater's group is making are metallic conductors, and could also serve as the top electrode on a solar cell. Atwater says that while some metamaterial designs have been complex to make and involve multilayered structures, these single-layer films can be made using lithography and etching techniques commonplace in the chip-making industry.

The ability of the material to work with both polarizations of light is exciting, says Nicholas Fang, professor of materials science and engineering at the University of Illinois at Urbana-Champaign. But, he says, one of the major remaining challenges in engineering metamaterials is loss. As these metal structures interact with light, they lose energy to heat. This heat loss is so great in Atwater's current materials that just 40 percent of incident light passes through them.

For solar applications, Atwater says his goal is a metamaterial film that passes 90 percent of the light. To that end, his group and others in the field are developing ways to amplify light as it passes through metamaterials. Optical amplifiers are used in lasers and in telecommunications; incorporating them into thin films like Atwater's could enable metamaterials to find their way into practical applications in devices like solar cells.

By Katherine Bourzac 
From Technology Review

Home Sensor Startup Snapped Up

If you knew how much electricity your plasma television used or how much water your dishwasher drank at different times of day, would you change your habits to conserve more and spend less on utilities? Researchers at the University of Washington, Duke University, and Georgia Tech believe that you might. Several years ago they invented sensors that could track the electricity consumption and water usage throughout an entire building via a single point on each system. In 2008, the researchers founded a company called Zensi to commercialize the technology, and last week, they sold that company to Belkin, an electronics hardware manufacturer.

A line of easy-to-install sensors for homes could be commercially available within the next year, says Shwetak Patel, professor of computer science and engineering at the University of Washington, and co-inventor of Zensi's sensors. Data from such sensors could lead to itemized utility bills--and customers who are more aware of the energy sinks in their homes, he says. 

Right now it's impossible for a consumer to get an accurate gauge of energy use without deploying numerous expensive sensors. But cost reductions in key technologies have made the concept of watching every device in a home more feasible, says Ivo Steklac, executive vice president of sales and strategy at Tendril, a Boulder, CO-based energy-monitoring startup. The key technologies are high-speed analog-to-digital conversion devices, digital signal processing algorithms, low-power communications, and ubiquitous Internet access and connectivity, Steklac says. 

The concept behind Zensi's technology is simple: a single sensor is plugged into a wall outlet, where it "listens" to the high-frequency electrical noise produced in the wiring when different devices are turned on. Each electrical device has a signature that is unique to the kind of device it is, its brand, and its location within a house. This information, in turn, reveals its energy consumption. MIT professor Fred Schweppe and others tested a similar idea more than a decade ago. In the case of plumbing, a sensor is connected to the hose spigot on the side of a house. When a toilet is flushed or a sink is turned on, the sensor detects the characteristic change in pressure. 
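As a toy illustration of the signature-matching idea (the device names and spectral fingerprints below are invented; Zensi's actual algorithms are not public), a classifier could compare an observed noise spectrum against stored per-device signatures and pick the nearest match:

```python
# Toy nearest-neighbor matcher: each known device has a "fingerprint"
# of noise energy in a few frequency bands; an observed spectrum is
# assigned to the device whose fingerprint it most resembles.
import math

# Hypothetical fingerprints (band energies, normalized 0..1).
signatures = {
    "plasma TV":  [0.9, 0.4, 0.1],
    "dishwasher": [0.2, 0.8, 0.3],
    "CFL lamp":   [0.1, 0.2, 0.9],
}

def identify(observed):
    """Return the device whose signature is closest in Euclidean distance."""
    return min(signatures, key=lambda d: math.dist(signatures[d], observed))

print(identify([0.85, 0.45, 0.15]))  # prints "plasma TV"
```

A real system would extract such band energies from the outlet sensor's digitized waveform and would also have to handle several devices switching at once, but the core classification step can be this simple.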

Data from the electricity and water sensors is sent via the Internet to a base station for analysis. The algorithms differentiate between different devices and calculate electricity and water usage.

The technology will face competition, says Harvey Michaels, an energy-efficiency scientist and lecturer at MIT. Cisco and MIT researchers, not including Michaels, are collaborating on a chip that can be built into every energy-consuming object in a house. The chip will track energy use and communicate with a smart meter to automatically conserve energy and lower bills.

Belkin, a privately held company, has not disclosed the acquisition price for Zensi. The startup's CEO, Kevin Ashton, will serve as general manager of Belkin's Conserve business unit, which will manage the startup's intellectual property. The acquisition came before Zensi closed a first round of funding.

Belkin is still working out the product's details. Patel says the interface might be on a panel within the home, available via a website, or sent directly to a person's phone. 

By Kate Greene 
From Technology Review

Nanotube Fibers

In a Rice University lab, a black fiber the diameter of a human hair spools into a beaker of ether. Made up of pure nanotubes, the strand is the culmination of nearly a decade of experimentation. Chemical engineer Matteo Pasquali and his colleagues have spun nanotubes into fibers several hundred meters long, proving that commercially useful manufacturing techniques can be developed to produce macroscale materials from these cylindrical molecules of pure carbon. 

Making carbon nanotubes into fibers was a particular dream of the late Rice professor Richard Smalley, who shared the 1996 Nobel Prize in chemistry for his discovery of the spherical carbon molecules called buckyballs. Individual nanotubes have remarkable properties: they're lightweight, they're strong, and they can be electrically conductive. But assembling them into large structures with these properties has been difficult.

In 2001, Smalley began trying to use liquid processing to spin carbon nanotubes into fibers that retained the tubes' electrical and mechanical properties over kilometer lengths--an idea that, he admitted, was "really lunatic extreme" (see "Wires of Wonder," March 2001). Such fibers would be stronger than steel and more conductive than copper. Smalley imagined them woven into cables that could efficiently carry electricity from remote wind and solar farms to populated areas--without losing energy to heat. Pasquali, who was part of the project from the beginning and took over after Smalley's death in 2005, acknowledges that he started out as a skeptic. "I thought that it was complete lunacy, because carbon nanotubes are not soluble in fluid--and I'm a fluid guy," he says.




Chemical engineer Matteo Pasquali, who spins carbon nanotubes into fibers in his lab at Rice University in Houston.  


Other researchers have made macroscale fibers from dry nanotubes, pulling them from vertical arrays or spinning them like wool as they emerge from a reactor. But the individual nanotubes in these fibers don't line up, and proper alignment is critical: tangled masses of the molecules don't carry electricity well, and they're not strong. Pasquali knew that nanotubes brought into solution would line up like logs floating down a river, resulting in well-ordered fibers. 

The group had a breakthrough in 2004, when they reasoned that the methods used to manufacture Kevlar fibers, a component of bulletproof vests, might also work with nanotubes. Like nanotubes, the Kevlar polymer is long, thin, and difficult to dissolve in solution; the fibers are made by mixing the polymer with sulfuric acid and then shooting the solution through needles grouped like the holes in a showerhead.

The Rice researchers managed to dissolve only small amounts of nanotubes using sulfuric acid. But when they used chlorosulfonic acid--a so-called superacid--they could get high concentrations of nanotubes into solution. The tubes form a liquid crystal, in which they are already aligned--a tremendous advantage in making them into fibers.

Spinning a Line

Pasquali's group starts its spinning process with single-walled nanotubes made in a nearby lab using a process originally developed by Smalley. In a high-pressure reactor where temperatures reach 1,000 °C, carbon monoxide alights on droplets of pure iron catalyst and decomposes. The carbon atoms build up into hollow cylinders about a nanometer in diameter and a few hundred nanometers long. These nanotubes emerge from the reactor in fluffy black drifts; they're kept in five-gallon buckets stacked to the ceiling, each holding just 200 grams.

Nanotubes made in this reactor contain traces of iron that must be removed before the tubes can be turned into fibers. Graduate student Colin Young fills a glass chamber with nanotubes that have been treated with oxygen in a furnace to oxidize the iron, making it soluble. Inside a fume hood, he fastens the chamber over a flask of hydrochloric acid. He turns on a heating block under the acid to boil it. As it condenses and drips down onto the nanotubes, the acid dissolves the iron; the tubes are left untouched.

After their acid shower, graduate student Natnael Behabtu loads the nanotubes and chlorosulfonic acid into a stainless-steel tube fitted with pistons that rub the nanotubes uniformly in a single direction to encourage them to line up. The resulting viscous solution is 8 percent liquid-crystal nanotubes by weight. 

He then detaches half of the chamber, and one of the pistons with it, and replaces it with a part that's been fitted with a spinning needle. The piston pushes the liquid through a glass filter (which prevents clogging), into the needle, and out into a waiting bath of diethyl ether. The acid is soluble in the ether, but the nanotubes aren't, so the result is a pure nanotube fiber, 50 to 100 micrometers in diameter and many meters long.

Measuring Up

To measure the fibers' tensile strength, Young uses glue to tack a short length of fiber onto a cardboard frame. He clamps this into the metal vises of a stress tester, cuts the frame, and pulls the fiber from either end until it breaks. The fibers can currently withstand about 350 megapascals of stress before failing--slightly less than a human hair, which is considered fairly strong for its diameter. 

The fibers' strength depends on the friction generated where nanotube surfaces interact. Longer nanotubes generate more friction and, thus, stronger fibers. The Rice nanotubes--which Pasquali is using for the sake of convenience--are relatively short. But he's exploring partnerships with fiber-spinning companies and carbon-nanotube manufacturers who can provide additional spinning expertise and longer nanotubes. Pasquali hopes to ultimately increase the fibers' tensile strength more than tenfold. 

There is still one major obstacle to realizing Smalley's dream of using nanotubes to remake the electrical grid. Pasquali's fibers have an electrical resistance of 120 microohms per centimeter, about eight times greater than that of copper wires. The reason is that every method for growing nanotubes results in a mix of conducting and semiconducting versions. For nanotube fibers to carry enough current to displace copper, they'd need to be made up entirely of conducting nanotubes. The Rice group plans to make fibers from conducting nanotubes separated from the nonconducting tubes to determine whether such conductivities are possible. But today's sorting process makes the nanotubes too expensive for use in electrical transmission.

Pasquali remains optimistic, however, that this second challenge will be overcome, just as he solved the problem of spinning nanotubes into long fibers. And he's sure that when it is, strong, lightweight nanotube wires can at last replace the heavy and inefficient steel-reinforced aluminum cables used in today's power grid, just as Smalley imagined.

 
By Katherine Bourzac 

Wind Turbines Shed Their Gears

Wind turbine manufacturers are turning away from the industry-standard gearboxes and generators in a bid to boost the reliability and reduce the cost of wind power. 

Siemens has begun selling a three-megawatt turbine using a so-called direct-drive system that replaces the conventional high-speed generator with a low-speed generator that eliminates the need for a gearbox. And last month, General Electric announced an investment of 340 million euros in manufacturing facilities to build its own four-megawatt direct-drive turbines for offshore wind farms. 


Power ring: This three-megawatt wind turbine uses permanent magnets and a design that makes it significantly lighter than a conventional geared turbine.  

Most observers say the industry's shift to direct-drive is a response to highly publicized gearbox failures. But Henrik Stiesdal, chief technology officer of Siemens's wind power unit, says that gearbox problems are overblown. He says Siemens is adopting direct-drive as a means of generating more energy at lower cost. "Turbines can be made more competitive through direct-drive," says Stiesdal. 

Siemens's plans hinge on a new design that reduces the weight of the system's generator. In conventional wind turbines, the gearbox increases the speed of the wind-driven rotor several hundredfold, which radically reduces the size of the generator required. Direct-drive generators operate at the same speed as the turbine's blades and must therefore be much bigger--over four meters in diameter for Siemens's three-megawatt turbine. Yet Siemens claims that the turbine's entire nacelle weighs just 73 metric tons--12 tons less than that of its less powerful, gear-driven 2.3-megawatt turbines.
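The size trade-off follows from basic mechanics: power is torque times rotational speed, so at fixed power a slower generator must handle proportionally more torque, and torque roughly sets a machine's size. A back-of-the-envelope sketch (the speeds are illustrative round numbers, not Siemens figures):

```python
# Shaft torque at rated power: P = tau * omega, so tau = P / omega.
import math

power_w = 3e6  # a 3-megawatt turbine

def torque(rpm):
    """Shaft torque in newton-meters at rated power for a given speed."""
    omega = rpm * 2 * math.pi / 60  # convert rpm to rad/s
    return power_w / omega

direct_drive = torque(15)    # generator turns with the blades, ~15 rpm
geared = torque(1500)        # gearbox steps the speed up ~100-fold
print(f"direct drive: {direct_drive/1e6:.1f} MN*m, geared: {geared/1e3:.0f} kN*m")
```

With a 100:1 gearbox the generator sees one hundredth the torque, which is why a geared generator can be compact while a direct-drive machine grows to several meters across.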

Much of the weight reduction comes from the use of permanent magnets in the generators' rotor--a trick that GE is also using. Conventional turbine generators use electromagnets--copper coils fed with electricity from the generator itself. Henk Polinder, an expert in permanent-magnet generators at Holland's Delft University of Technology, says that a 15-millimeter-thick segment of permanent magnets can generate the same magnetic field as a 10- to 15-centimeter section of copper coils.

Stiesdal says Siemens reduced weight further by inverting its generator's design. Rather than having a steel rotor covered with permanent magnets spin inside a stationary doughnut-shaped stator (the design GE is using in its four-megawatt direct-drive turbine), Siemens's rotor is a steel cylinder with permanent magnets on the inside, and it spins around a column-like stator. 

Siemens erected a prototype of its machine in Brande, Denmark, in December and plans to install 10 more this year, primarily in Denmark, before beginning mass production in 2011. GE's technology, which it acquired with the purchase of Norwegian turbine producer ScanWind last year, is being demonstrated at a test site in Norway; commercialization of its four-megawatt machine is slated for 2012. 

More competition is on the way. Venture capital firm New Enterprise Associates is backing a Boulder, CO-based startup called Boulder Wind Power, which is developing a 1.5-megawatt direct-drive turbine. The firm was founded in December by Sandy Butterfield, who was chief engineer for the U.S. National Renewable Energy Laboratory's (NREL) wind technology center, where he led a major study of the gearbox design process. 

Whether gearbox failures are an industrywide problem remains a matter of some contention. NREL initiated its study in 2007, when there were several failures: a U.S.-based company, Clipper Windpower, experienced serious gearbox problems within months of installing the first of its 2.5-megawatt turbines at a wind farm in Lackawanna, NY, while gearboxes in the 30 Vestas Wind Systems turbines forming the U.K.'s offshore Kentish Flats wind farm had to be replaced after just two years of operation. NREL concluded that most wind turbine gearboxes would fail "well before" their 20-year design life. 

Stiesdal says Siemens's own studies show that gearboxes are quite reliable, overall. A 2008 analysis of Siemens machines installed from 1983 to 1989 in the U.S. found that the "vast majority" were still operating with their original gearboxes. But he does expect increased reliability from the direct-drive system, which has about half as many parts as a conventional turbine.

Direct-drive systems do introduce one potential problem, however. There are ongoing concerns regarding the future supply of the rare earth metals used to make permanent magnets. "That's a serious issue," says Stiesdal. 

By Peter Fairley
From Technology Review

What's in a Tweet?

Researchers at the Palo Alto Research Center (PARC) are developing new ways to deal with the torrent of information flowing from social media sites like Twitter. They have developed a Twitter "topic browser" that extracts meaning from the posts in a user's timeline. This could help users scan through thousands of tweets quickly, and the underlying technology could also offer novel ways of mining Twitter for information or for creating targeted advertising.




Information flow: Software developed at PARC categorizes and prioritizes the information in a user’s Twitter stream.



The researchers' idea was to provide a way for users to deal with a large number of Twitter messages quickly. They found that many users wanted to be able to quickly catch up on what's been going on, without having to go through every single tweet in their timeline.

Ed Chi, area manager and principal scientist for the Augmented Social Cognition Research Group at PARC, says that the information coming through Twitter resembles a stream--users will dip into it from time to time, but they don't want to consume it all at once. His group's work is called the "Eddi Project" in reference to the idea of eddies in a stream. 

The researchers developed two main ways of filtering Twitter content. The first, presented recently at the ACM Conference on Human Factors in Computing Systems in Atlanta, is a recommendation system that ranks which posts in a Twitter stream a user is likely to find most interesting, based on factors such as the contents of posts as well as his interactions with other Twitter users. The second tool, the Twitter topic browser, summarizes the contents of a user's timeline so that the user can quickly survey what information has come through Twitter without having to read through every post.

To create this second tool, the researchers focused on identifying the topic of each tweet. Michael Bernstein, a researcher at the Computer Science and Artificial Intelligence Lab at MIT who is involved with the project, says the group found that Twitter users were interested in filtering posts relating to specific topics, and said they found existing methods lacking. "Hashtags"--user-generated annotations that categorize tweets--are perhaps the best current option, but most tweets don't have these tags. Bernstein notes that Twitter, Google, and other companies are developing ways to identify and categorize the most popular topics of discussion on Twitter--such as the Icelandic volcano. The sheer volume of tweets on such topics gives algorithms plenty of information to work with; it's much harder, he says, to figure out the topic of more unusual tweets.

A key challenge of extracting meaning from a tweet is its length: no more than 140 characters. Chi says that most natural language processing technology relies on having a larger sample of text to work with. For example, some methods rely on people writing out associations between terms, which requires a lot of work to maintain, and is not the best way to interpret real-time information.

The researchers realized, however, that search engines have been dealing with extracting meaning from a small number of words--in the form of search queries--for years. 

"The essence of the approach is to coerce a tweet to look more like a search query and then get a search engine to tell us more," Bernstein says. The researchers first clean up a tweet by pulling common terms, like the Twitter slang "RT," which means "retweet." Once their algorithms have focused on likely significant terms, they feed those into the Yahoo's Build your Own Search Service interface--a Web service that can be used to tap directly into Yahoo's search result. 


Back in time: The software tools can produce a visualization showing Twitter topics over time. The lower half of this image shows recommended tweets.  

The Web is the most up-to-date source of data, Bernstein says, and the pages that come up in search results give enough information for the researchers' algorithms to produce a list of topics related to the original tweet.

A similar approach could be used with any repository of information, Chi notes, pointing out that companies could use the technology on an intranet to classify bits of information related to more specialized topics.

"Boosting the signal of a tweet by piping it through web search is an application of a well-established information-retrieval technique," says Daniel Tunkelang, an engineer at Google who is an expert on information retrieval. He compares it to using a thesaurus to set a word in a broader context. 

However, Tunkelang says the PARC researchers will have to make sure that the tweet-as-search-query approach doesn't collide with search engines' increasing efforts to index tweets. It wouldn't be good for a tweet to return itself as a result. 

Chi says that his team is working on a platform for managing various kinds of information streams. This summer, they plan to increase the scale of the Eddi Project so it can be placed on the live Web for testing. The longer-term goal, Chi says, is to build tools that can be optimized for enterprise customers. 

By Erica Naone 
From Technology Review

US military loses contact with experimental hypersonic vehicle


The US military has apparently lost contact with an experimental hypersonic vehicle over the Pacific Ocean.   
According to Turner Brinton of Space News, the Falcon Hypersonic Technology Vehicle (HTV)-2 was the "first in a series of flight experiments" planned to demonstrate technology that could be deployed in future long-range conventional missiles.  "[The vehicle] was launched from Vandenberg Air Force Base, Calif., atop a Minotaur 4 rocket. Built by Lockheed Martin Corp., the HTV-2 craft was supposed to glide over the Pacific Ocean at speeds exceeding 20,000 kilometers per hour for as long as 30 minutes," explained Brinton.
"Nine minutes after launch, however, DARPA (Defense Advanced Research Projects Agency) lost contact with the craft, and the cause of the failure is still unknown. There is [only] one remaining HTV-2 craft."
However, Frank James of NPR noted that the early days of most defense-related, high-tech military projects were "typically marked" by some sort of failure.

"The infant US space program in the late 1950s saw the failures of the Vanguard rocket program. So it's no surprise that the military's test last week of its Falcon space glider meant to test the concept of a hypersonic craft that could travel up to 20 times the speed of sound, more than 15,000 miles an hour, was a bust," opined James.

"The idea is that such a craft could eventually allow US aircraft to reach hotspots anywhere on Earth within minutes."

 By Aharon Etengoff
From tgdaily.com 

In Fast-Tracked Trial, Nanopatch Flu Vaccine Found Effective

In a successful test of a prototype nanotech vaccine patch, Australian researchers at the University of Queensland used a patch smaller than a postage stamp to deliver vaccine through the skin without needles, evoking a comparable protective immune response with 100 times less vaccine, according to Pharmacy News.

We noted previously that the nanopatch's efficiency could help limited stocks of vaccine go further during epidemics. Its ability to be self-administered also means that ordinary people in the developing world could more easily get vaccinated without the presence of physicians or nurses. The nanopatch has thousands of densely packed projections that administer the vaccine through the skin over a period of just two minutes. The Australian scientists used the nanopatch to target a narrow layer just beneath the skin that holds a high density of antigen-presenting cells (APCs). Such cells are essential to creating a protective immune response. 


Only dry vaccine was needed, as opposed to refrigerated vaccine -- removing yet another limiting factor for many vaccination programs. And it almost goes without saying that people afraid of needles can also find some relief from this approach.

If the nanopatch performs just as well in human clinical trials, it could hit the market within five years. Just sit tight on that needle phobia until then.

From Popular Science

Is There a Micro-Supercapacitor in Your Future? Don't Bet Against It

“Just think how often your fancy new mobile phone or computer has become little more than a paperweight because the battery lost its zeal for doing its job,” says John Chmiola, a chemist with the Lawrence Berkeley National Laboratory (Berkeley Lab). “At a time when cellphones can do more than computers could do at the beginning of the Clinton presidency, it would be an understatement to say that batteries have not been holding up their end of the mobile device bargain.”

 


Chmiola is a staff scientist in the Advanced Energy Technologies Department of Berkeley Lab’s Environmental Energy Technologies Division. His research is aimed at addressing this problem of relatively short-lived portable energy storage devices. Chmiola believes he has found a solution in electrochemical capacitors, which are commonly referred to as “supercapacitors” because they store more energy than conventional dielectric capacitors and tolerate abuse better than batteries.

In a paper published in the April 23, 2010 issue of the journal Science, titled “Monolithic Carbide-Derived Carbon Films for Micro-Supercapacitors,” Chmiola and Yury Gogotsi of Drexel University, along with other co-authors, describe a unique new technique for integrating high performance micro-sized supercapacitors into a variety of portable electronic devices through common microfabrication techniques.

By etching electrodes made of monolithic carbon film into a conducting substrate of titanium carbide, Chmiola and Gogotsi were able to create micro-supercapacitors featuring an energy storage density that was at least double that of the best supercapacitors now available. When used in combination with microbatteries, the power densities and rapid-fire cycle times of these micro-supercapacitors should substantially boost the performance and longevity of portable electric energy storage devices.

“The prospect of integrating batteries and supercapacitors with the micro-electromechanical systems (MEMS) they power represents a conceptual leap forward over existing methods for powering such devices,” Chmiola says. “Furthermore, since the same fabrication processes that produced the devices needing the electrical energy also produced the devices storing that energy, we provide a framework for potentially increasing the density of microelectronic devices and allowing improved functionality, reduced complexity, and enhanced redundancy.”  
 
The two principal systems today for storing electrical energy are batteries and supercapacitors. Batteries store electrical energy in the form of chemical reactants and generally display even higher energy storage densities than supercapacitors. However, the charging and discharging of a battery exact a physical toll on electrodes that eventually ends the battery’s life after several thousand charge-discharge cycles. In supercapacitors, energy is stored as electrical charge, which does not impact electrodes during operation. This allows supercapacitors to be charged and discharged millions of times.
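The trade-off described above can be made concrete with the textbook energy relation for a capacitor, E = ½CV². The sketch below is purely illustrative: the 10 F capacitance, 2.7 V charging voltage, and cycle counts are assumptions drawn from typical supercapacitor specs and the figures quoted in this article, not measurements from the Science paper.

```python
# Toy comparison of supercapacitor energy per charge vs. lifetime throughput.
# E = 0.5 * C * V^2 is the standard energy stored in an ideal capacitor;
# all numeric figures here are illustrative assumptions.

def capacitor_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    """Energy stored in an ideal capacitor: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A hypothetical 10 F supercapacitor charged to 2.7 V (a common electrolyte limit):
energy = capacitor_energy_joules(10.0, 2.7)  # about 36.45 J per full charge

# Lifetime throughput, assuming full charge-discharge cycles each time:
battery_cycles = 3_000        # "several thousand" cycles, per the article
supercap_cycles = 1_000_000   # "millions of times", per the article

print(f"Energy per charge: {energy:.2f} J")
print(f"Cycle-life advantage: {supercap_cycles / battery_cycles:.0f}x")
```

Even though a battery of the same size would store far more energy per charge, the cycle-life ratio is what makes supercapacitors attractive for devices expected to outlive many thousands of recharges.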

“We have known for some time that supercapacitors are faster and longer-lasting alternatives to conventional batteries,” Gogotsi says, “so we decided to see if it would be possible to incorporate them into microelectronic devices and if there would be any advantage to doing so.”

Chmiola and Gogotsi chose titanium carbide as the substrate in this study because while all metal carbides can be selectively etched with halogens so that a monolithic carbon film is left behind, titanium carbide is readily available, relatively inexpensive and can be used at the same temperatures as other microfabrication processes.

“Plus, we have a body of work on titanium carbide precursor carbons that provided us with a lot of data to draw from for understanding the underlying science,” Chmiola says.

The process started with titanium carbide ceramic plates being cut to size and polished to a thickness of approximately 300 micrometers. The titanium was then selectively etched from one face of the plate using chlorine at elevated temperatures, a process similar to current dry-etching techniques for MEMS and microchip fabrication.

Chlorinating the titanium removed the metal atoms and left in place a monolithic carbon film, a material with a proven track record in supercapacitors produced via the traditional “sandwich construction” technique.

“By using microfabrication techniques to produce our supercapacitors we avoided many of the pitfalls of the traditional method,” says Chmiola, “namely poor contact between electro-active particles in the electrode, large void spaces between particles that don’t store charge, and poor contact between the electro-active materials and the external circuitry.”

The electrical charge storage densities of the micro-supercapacitors were measured in two common electrolytes. As promising as the results were, Chmiola notes the impressive figures were achieved without the “decades of optimization” that other electronic devices have undergone. This, he says, “hints at the possibility that the energy density ceiling for microfabricated supercapacitors is, indeed, quite high.”

Adds Gogotsi, “Given their practically infinite cycle life, micro-supercapacitors seem ideal for capturing and storing energy from renewable resources and for on-chip operations.”

The next step of the work is to scale down the size of the electrodes and improve the dry etching procedure for removing metal atoms from metal carbides to make the process even more compatible with commercial microfabrication technology. At Berkeley Lab, Chmiola is working on the development of new electrolytes that can help increase the energy storage densities of his micro-supercapacitors. He is also investigating the factors that control the usable voltage window of different electrolytes at a carbon electrode.

“My ultimate goals are to increase energy stored to levels closer to batteries, and preserve both the million-plus charge-discharge cycles and recharge times of less than five minutes of these devices,” says Chmiola. “I think this is what the end users of portable energy storage devices really desire.”

From physorg.com

Scientists discover 'traitor' human DNA helps viruses cause cancer

University College London scientists have discovered that stretches of human DNA act as a traitor to the body’s defences by helping viruses infect people and trigger cancer-causing diseases.

The research, undertaken at the UCL Cancer Institute, funded by Cancer Research UK, and published in Nature Cell Biology today, revealed that viruses can exploit the body’s DNA - dampening its antiviral immune response and allowing infection to take hold more easily.

The UCL Cancer Institute scientists showed that this happened with the Kaposi sarcoma herpesvirus, which causes the cancer Kaposi sarcoma, and also with the herpes simplex virus, which causes cold sores.

Our immune system uses multiple ways to prevent or clear infection. In parallel, viruses have evolved highly sophisticated counter-measures to evade the human immune defence.

The team has discovered that viruses exploit tiny molecules derived from human DNA, called microRNAs, to make cells more susceptible to viral infection. MicroRNAs are mostly found in parts of the human genome that do not encode proteins - regions initially dismissed as ‘junk DNA’.

Lead author Chris Boshoff, Director of the UCL Cancer Institute and Cancer Research UK’s Professor of Cancer Medicine, said: “We are investigating microRNAs as future therapeutic targets, and targeting cellular microRNAs could be a potential way to prevent or treat cancer-causing infection from viruses.”

Dr Dimitris Lagos, study author, based at the Cancer Research UK Viral Oncology laboratory, UCL Cancer Institute, said: “The viruses we tested have evolved with humans for millions of years and use a variety of biological tricks to establish life-long and mostly harmless infections. We discovered that it is likely that other viruses - which can cause diseases including cancer - exploit the tiny molecules present in everyone’s DNA, called microRNAs, to turn cells into a viral ‘hotel’ which they can check into, to cause infection and spread.”

From physorg.com