Meet the Nimble-Fingered Interface of the Future

Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's.....

Electronic Implant Dissolves in the Body

Researchers at the University of Illinois at Urbana-Champaign, Tufts University, and ......

Sorting Chip May Lead to Cell Phone-Sized Medical Labs

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a ....

TDK sees hard drive breakthrough in areal density

Perpendicular magnetic recording was an idea that languished for many years, says a TDK technology backgrounder, because the ....

Engineers invent new device that could increase Internet

The device uses the force generated by light to flop a mechanical switch of light on and off at a very high speed........


Quantum Computers? Physicists 'Entangle' Two Atoms Using Microwaves for the First Time

Microwaves, the carrier of wireless communications, have been used in past experiments to manipulate single ions. But the NIST group is the first to position microwave sources close enough to the ions -- just 30 micrometers away -- and create the conditions enabling entanglement, a quantum phenomenon expected to be crucial for transporting information and correcting errors in quantum computers.

 Composite photo of microwave apparatus used in NIST quantum computing experiments. A pair of ions (electrically charged atoms) are trapped by electric fields and manipulated with microwaves inside a glass chamber at the center of the apparatus. The chamber is illuminated by a green light-emitting diode for visual effect. An ultraviolet laser beam used to cool the ions and detect their quantum state is colorized to appear blue.

Described in the August 11 issue of Nature, the experiments integrate wiring for microwave sources directly on a chip-sized ion trap and use a desktop-scale table of lasers, mirrors, and lenses that is only about one-tenth of the size previously required. Low-power ultraviolet lasers are still needed to cool the ions and observe experimental results but might eventually be made as small as those in portable DVD players. Compared to complex, expensive laser sources, microwave components could be expanded and upgraded more easily to build practical systems of thousands of ions for quantum computing and simulations.

"It's conceivable a modest-sized quantum computer could eventually look like a smart phone combined with a laser pointer-like device, while sophisticated machines might have an overall footprint comparable to a regular desktop PC," says NIST physicist Dietrich Leibfried, a co-author of the new paper.

"Although quantum computers are not thought of as convenience devices that everybody wants to carry around, they could use microwave electronics similar to what is used in smart phones. These components are well developed for a mass market to support innovation and reduce costs. The prospect excites us."

Quantum computers would harness the unusual rules of quantum physics to solve certain problems -- such as breaking today's most widely used data encryption codes -- that are currently intractable even with supercomputers. A nearer-term goal is to design quantum simulations of important scientific problems, to explore quantum mysteries such as high-temperature superconductivity, the disappearance of electrical resistance in certain materials when sufficiently chilled.

Ions are a leading candidate for use as quantum bits (qubits) to hold information in a quantum computer. Although other promising candidates for qubits -- notably superconducting circuits, or "artificial atoms" -- are manipulated on chips with microwaves, ion qubits are at a more advanced stage experimentally in that more ions can be controlled with better accuracy and less loss of information.

The same NIST research group previously used ions and lasers to demonstrate many basic components and processes for a quantum computer. In the latest experiments, the NIST team used microwaves to rotate the "spins" of individual magnesium ions and entangle the spins of a pair of ions. This is a "universal" set of quantum logic operations because rotations and entanglement can be combined in sequence to perform any calculation allowed by quantum mechanics, Leibfried says.
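
For readers who want to see what such a spin rotation looks like on paper, the standard textbook form of a resonantly driven single-qubit rotation is sketched below. This is a generic illustration, not the specific pulse sequence NIST used: the rotation angle θ is set by the microwave pulse's duration and amplitude, and the axis angle φ by its phase. Combined with one entangling two-qubit operation, such rotations form the universal set the article describes.

```latex
% Generic single-qubit rotation driven by a resonant field (textbook form,
% not NIST's specific gate). \theta is set by pulse duration and amplitude,
% \phi by the microwave phase.
R(\theta,\phi) =
\begin{pmatrix}
\cos\tfrac{\theta}{2} & -i e^{-i\phi}\sin\tfrac{\theta}{2} \\[4pt]
-i e^{i\phi}\sin\tfrac{\theta}{2} & \cos\tfrac{\theta}{2}
\end{pmatrix}
```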

In the experiments, the two ions were held by electromagnetic fields, hovering above an ion trap chip consisting of gold electrodes electroplated onto an aluminum nitride backing. Some of the electrodes were activated to create pulses of oscillating microwave radiation around the ions. Radiation frequencies are in the 1 to 2 gigahertz range. The microwaves produce magnetic fields used to rotate the ions' spins, which can be thought of as tiny bar magnets pointing in different directions. The orientation of these tiny bar magnets is one of the quantum properties used to represent information.

Scientists entangled the ions by adapting a technique they first developed with lasers. If the microwaves' magnetic fields gradually increase across the ions in just the right way, the ions' motion can be excited depending on the spin orientations, and the spins can become entangled in the process. Scientists had to find the right combination of settings in the three electrodes that provided the optimal change in the oscillating magnetic fields across the extent of the ions' motion while minimizing other, unwanted effects. The properties of the entangled ions are linked, such that a measurement of one ion would reveal the state of the other.

The use of microwaves reduces errors introduced by instabilities in laser beam pointing and power as well as laser-induced spontaneous emissions by the ions. However, microwave operations need to be improved to enable practical quantum computations or simulations. The NIST researchers achieved entanglement 76 percent of the time, well above the minimum threshold of 50 percent defining the onset of quantum properties, but not yet competitive with the best laser-controlled operations at 99.3 percent.

In addition to improving microwave operations by reducing unwanted ion motion, the NIST team also plans to study how to suppress cross-talk between different information processing zones on the same chip. Different frequencies could be used for logic operations and control of other nearby qubits, for instance. Smaller traps could enable faster operations if unwanted heating can be suppressed, according to the paper.

From sciencedaily

Multiple Sclerosis Research Doubles Number of Genes Associated With the Disease, Increasing the Number to Over 50

The research, involving an international team of investigators led by the Universities of Cambridge and Oxford, and funded by the Wellcome Trust, was published August 10 in the journal Nature. This is the largest MS genetics study ever undertaken and includes contributions from almost 250 researchers as members of the International Multiple Sclerosis Genetics Consortium and the Wellcome Trust Case Control Consortium.

 Scientists have identified 29 new genetic variants linked to multiple sclerosis, providing key insights into the biology of a very debilitating neurological disease. Many of the genes implicated in the study are relevant to the immune system, shedding light onto the immunological pathways that underlie the development of multiple sclerosis.

Multiple sclerosis is one of the most common neurological conditions among young adults, affecting around 2.5 million individuals worldwide. The disease results from damage to nerve fibres and their protective insulation, the myelin sheath, in the brain and spinal cord. The affected pathways -- responsible in health for everyday activities such as seeing, walking, feeling, thinking and controlling the bowel and bladder -- are prevented from 'firing' properly and eventually are destroyed. The new findings focus attention on the pivotal role of the immune system in causing the damage and help to explain the nature of the immune attack on the brain and spinal cord.

In this multi-population study, researchers studied the DNA from 9,772 individuals with multiple sclerosis and 17,376 unrelated healthy controls. They were able to confirm 23 previously known genetic associations and identified a further 29 new genetic variants (and an additional five that are strongly suspected) conferring susceptibility to the disease.

A large number of the genes implicated by these findings play pivotal roles in the workings of the immune system, specifically in the function of T-cells (one type of white blood cell responsible for mounting an immune response against foreign substances in the body but also involved in autoimmunity) as well as the activation of 'interleukins' (chemicals that ensure interactions between different types of immune cell). Interestingly, one third of the genes identified in this research have previously been implicated in playing a role in other autoimmune diseases (such as Crohn's Disease and Type 1 diabetes) indicating that, perhaps as expected, the same general processes occur in more than one type of autoimmune disease.

Previous research has suggested a link between Vitamin D deficiency and an increased risk of multiple sclerosis. Along with the many genes which play a direct role in the immune system, the researchers identified two involved in the metabolism of Vitamin D, providing additional insight into a possible link between genetic and environmental risk factors.

Dr. Alastair Compston from the University of Cambridge, who led the study on behalf of the International Multiple Sclerosis Genetics Consortium jointly with Dr. Peter Donnelly from the Wellcome Trust Centre for Human Genetics, University of Oxford, said: "Identifying the basis for genetic susceptibility to any medical condition provides reliable insights into the disease mechanisms. Our research settles a longstanding debate on what happens first in the complex sequence of events that leads to disability in multiple sclerosis. It is now clear that multiple sclerosis is primarily an immunological disease. This has important implications for future treatment strategies."

Dr. Donnelly added: "Our findings highlight the value of large genetic studies in uncovering key biological mechanisms underlying common human diseases. This would simply not have been possible without a large international network of collaborators, and the participation of many thousands of patients suffering from this debilitating disease."

Dr. John Rioux, holder of the Canada Research Chair in Genetics and Genomic Medicine, furthermore stated that "the integration of the genetic information emerging from studies of this and other chronic inflammatory diseases such as Crohn's disease, ulcerative colitis, arthritis and many others is revealing what is shared across these diseases and what is disease-specific. This is but one of the key bits of information emerging from these studies that will guide the research of disease biology for years to come and be the basis for the development of a more personalized approach to medicine."

From sciencedaily

Inexpensive catalyst that makes hydrogen gas 10 times faster than natural enzyme

This step is just one part of a series of reactions to split water and make hydrogen gas, but the researchers say the result shows they can learn from nature how to control those reactions to make durable synthetic catalysts for energy storage, such as in fuel cells.

 The part of the catalyst that cranks out 100,000 molecules of hydrogen gas a second packs electrons into chemical bonds between hydrogen atoms, possibly hijacked from water.

In addition, the natural protein, an enzyme, uses inexpensive, abundant metals in its design, which the team copied. Currently, these materials -- called catalysts, because they spur reactions along -- rely on expensive metals such as platinum.

"This nickel-based catalyst is really very fast," said coauthor Morris Bullock of the Department of Energy's Pacific Northwest National Laboratory. "It's about a hundred times faster than the previous catalyst record holder. And from nature, we knew it could be done with abundant and inexpensive nickel or iron."

Stuffing Bonds
Electrical energy is nothing more than electrons. These same electrons are what tie atoms together when they are chemically bound to each other in molecules such as hydrogen gas. Stuffing electrons into chemical bonds is one way to store electrical energy, which is particularly important for renewable, sustainable energy sources like solar or wind power. Converting the chemical bonds back into flowing electricity when the sun isn't shining or the wind isn't blowing allows the use of the stored energy, such as in a fuel cell that runs on hydrogen.

Electrons are often stored in batteries, but Bullock and his colleagues want to take advantage of the closer packing available in chemicals.

"We want to store energy as densely as possible. Chemical bonds can store a huge amount of energy in a small amount of physical space," said Bullock, director of the Center for Molecular Electrocatalysis at PNNL, one of DOE's Energy Frontier Research Centers. The team also included visiting researcher Monte Helm from Fort Lewis College in Durango, Colo. 

Biology stores energy densely all the time. Plants use photosynthesis to store the sun's energy in chemical bonds, which people use when they eat food. And a common microbe stores energy in the bonds of hydrogen gas with the help of a protein called a hydrogenase.

Because the hydrogenases found in nature don't last as long as ones that are built out of tougher chemicals (think paper versus plastic), the researchers wanted to pull out the active portion of the biological hydrogenase and redesign it with a stable chemical backbone.

Two Plus Two Equals One
In this study, the researchers looked at only one small part of splitting water into hydrogen gas, like fast-forwarding to the end of a movie. Of the many steps, there's a part at the end when the catalyst has a hold of two hydrogen atoms that it has stolen from water and snaps the two together.

The catalyst does this by completely dismantling some hydrogen atoms from a source such as water and moving the pieces around. Due to the simplicity of hydrogen atoms, those pieces are positively charged protons and negatively charged electrons. The catalyst arranges those pieces into just the right position so they can be put together correctly. "Two protons plus two electrons equals one molecule of hydrogen gas," says Bullock.
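
Spelled out as the net half-reaction (standard electrochemistry implied by Bullock's description, not a quotation from the paper):

```latex
% Hydrogen evolution: two protons plus two electrons make one H2 molecule.
2\,\mathrm{H}^{+} + 2\,e^{-} \;\longrightarrow\; \mathrm{H}_{2}
```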

In real life, the protons would come from water, but since the team only examined a portion of the reaction, the researchers used water stand-ins such as acids to test their catalyst.

"We looked at the hydrogenase and asked what is the important part of this?" said Bullock. "The hydrogenase moves the protons around in what we call a proton relay. Where the protons go, the electrons will follow."

A Bauble for Energy
Based on the hydrogenase's proton relay, the experimental catalyst contained regions that dangled off the main structure and attracted protons, called "pendant amines." A pendant amine moves a proton into position on the edge of the catalyst, while a nickel atom in the middle of the catalyst offers a hydrogen atom with an extra electron (that's a proton and two electrons for those counting).

The pendant amine's proton is positive, while the nickel atom is holding on to a negatively charged hydrogen. Positioned close to each other, the opposite charges attract and the two combine into a molecule of hydrogen gas.

With that plan in mind, the team built potential catalysts and tested them. On their first try, they put a bunch of pendant amines around the nickel center, thinking more would be better. Testing their catalyst, they found it didn't work very fast. An analysis of how the catalyst was moving protons and electrons around suggested too many pendant amines got in the way of the perfect reaction. An overabundance of protons made for a sticky catalyst, which pinched it and slowed the hydrogen-gas-forming reaction down.

Like good gardeners, the team trimmed a few pendant amines off their catalyst, leaving only enough to make the protons stand out, ready to accept a negatively charged hydrogen atom.

Fastest Cat in the West
Testing the trimmed catalyst, the team found it performed much better than anticipated. At first they used conditions in which no water was present (remember, they used water stand-ins), and the catalyst could create hydrogen gas at a rate of about 33,000 molecules per second. That's much faster than their natural inspiration, which clocks in at around 10,000 per second.

However, most real-life applications will have water around, so they added water to the reaction to see how it would perform. The catalyst ran three times as fast, creating more than 100,000 hydrogen molecules every second. The researchers think the water might help by moving protons to a more advantageous spot on the pendant amine, but they are still studying the details.

Their catalyst has a drawback, however. It's fast, but it's not efficient. The catalyst runs on electricity -- after all, it needs those electrons to stuff into the chemical bonds -- but it requires more electrical energy than is practical; the extra energy required is known as the overpotential.

Bullock says the team has some ideas on how to reduce the inefficiency. Also, future work will require assembling a catalyst that splits water in addition to making hydrogen gas. Even with a high overpotential, the researchers see high potential for this catalyst.
 

Hidden soil fungus, now revealed, is in a class all its own

 Culture stained to make cell walls visible. Hyphae (long, branching filamentous structures) and swellings (chlamydospores) are visible.

Researchers say it represents an entirely new class of fungi: the Archaeorhizomycetes. Like the discovery of a weird type of aquatic fungus that made headlines a few months ago, this finding offers a glimpse at the rich diversity of microorganisms that share our world but remain hidden from view.

The fungal phenomenon, brought to light by researchers at the University of Michigan, the Swedish University of Agricultural Sciences, Imperial College London, the Royal Botanic Gardens and the University of Aberdeen, is described in the Aug. 12 issue of the journal Science.

Although unseen until recently, the fungus was known to be extremely common in soil. Its presence was detected in studies of environmental DNA---genetic material from a living organism that is detected in bulk environmental samples, such as samples of the soil or water in which the organism lives.

"You couldn't really sample the soil without finding evidence of it," said Timothy James, a U-M assistant professor of ecology and evolutionary biology and an assistant curator at the university's herbarium. "So people really wanted to know what it looks like."

That became possible thanks to the work of the Swedish researchers, led by mycologist Anna Rosling. The researchers were studying mycorrhizae---fungi that colonize plant roots---when they discovered that some root tips harbored not only the mycorrhizae they were interested in, but also an unfamiliar fungus.

"When culturing mycorrhizal fungi from coniferous roots we were exited to find that one of the cultures represented this unfamiliar fungus," said Anna Rosling.

Later the culture was identified as a member of Soil Clone Group 1 (SCG1), a ubiquitous but enigmatic lineage known only from environmental DNA. It's not especially impressive to look at, James concedes: "It doesn't make some crazy structure that nobody's ever seen." But simply seeing and photographing a form of life that's been invisible until now is cause for excitement.  
 
 Pure culture of Archaeorhizomyces finlayi grown for six months on solid media in a petri dish (9 cm diameter).
 
Having in hand a member of the elusive fungal group, the Swedish scientists and their collaborators have been able to study the group in more detail than ever before possible, using electron microscopy, DNA sequencing and in vitro growth studies to characterize it. The fungus they cultured is a slow-growing form that produces none of the aerial or aquatically dispersed spores with which most fungi typically reproduce, suggesting it seldom if ever sees the light of day.
 
"By finding that it is slow growing and only produces spores in the soil, we can provide an explanation for why it has taken so long to be cultured," James said. The researchers also performed experiments aimed at understanding how the fungus, dubbed Archaeorhizomyces finlayi, interacts with the environment and with other organisms.

"We don't have any evidence that it's pathogenic; we don't have any evidence that it's mutualistic and doing anything beneficial for the plant," James said. "It's a little bit of a boring fungus." It may, however, help break down and recycle dead plants, a common---and extremely important---job for fungi. Hints of this role come from the observation that A. finlayi grows in the lab if provided with food in the form of glucose or cellulose (the main structural component of plant cell walls).

"Because it is so common in the soil, it must be very successful at what it does, and that role must be ecologically relevant," Rosling said. Now that the researchers have ruled out some typical fungal roles---such as pathogen, benign endophyte, and member of a mycorrhizal association---they hope to find out through additional experiments exactly what role the fungus does play in nature and how it interacts with plants and other fungi.

"At this point we're still in the early stages of understanding what it's doing out there," James said.
Whether A. finlayi turns out to be beneficial or detrimental to the plants or microbes it interacts with, it's sure to contribute to understanding the diverse array of fungi in the world.

Though environmental DNA of SCG1 had been collected and reported in more than 50 previous studies, the type of DNA collected in the past didn't lend itself to analyses that would definitively pinpoint the group's position on the tree of life.

 Archaeorhizomyces finlayi was isolated from a coniferous root tip with mycorrhizae (fungi that colonize plant roots) but the researchers have not been able to demonstrate that A. finlayi forms ectomycorrhizal structures on roots in the lab.

"Now that we have the culture, we can sequence almost any gene we want, so that's what we've done," James said.

The resulting information, combined with DNA data from the previous studies, revealed that A. finlayi belongs in an eclectic subphylum known as Taphrinomycotina, other members of which include the yeast Schizosaccharomyces, often used in studies of cell biology and evolution, and Pneumocystis, which can cause pneumonia in people with weakened immune systems, such as those who have cancer or HIV/AIDS or are undergoing treatment with immune-suppressing drugs.

From physorg

Robot-Finger Spinoff Can Model Microscopic Details in 3-D


Researchers from MIT have created a handheld device called GelSight that provides ultra-high resolution 3D scans of microscopic surface structure. The main section of the system is a small slab of transparent, synthetic rubber that’s coated on one side with a paint containing tiny flecks of metal. If you push the rubber against an object, the paint-coated side morphs to closely conform to the object’s texture.



In the video, we see a guy squidge his finger into the rubber, instantly amplifying the microscopic structure of his skin, and revealing a bold indentation of his fingerprint on the other side.

Once this rubbery solution is connected to a series of lights and cameras, it can be used to create 3D models of the underlying structure. The models can register physical features less than a micrometer in depth and about two micrometers across — enough to capture the raised ink patterns on a $20 bill.
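
The article doesn't spell out GelSight's reconstruction algorithm, but recovering shape from a painted elastomer lit from several known directions is classic photometric stereo, so a minimal sketch of that step may help. All array names and the light directions below are purely illustrative assumptions, not values from the MIT system.

```python
import numpy as np

# Photometric-stereo sketch (illustrative, not GelSight's actual code):
# given K images of the same gel surface lit from K known directions,
# recover a surface normal per pixel; depth can then be obtained by
# integrating the normal field.

# K x 3 matrix of unit light directions (hypothetical calibration values);
# K must match the number of images passed to estimate_normals().
L = np.array([[ 0.0,  0.7, 0.7],
              [ 0.7,  0.0, 0.7],
              [-0.7,  0.0, 0.7],
              [ 0.0, -0.7, 0.7]])

def estimate_normals(images):
    """images: array of shape (K, H, W) holding grayscale intensities."""
    K, H, W = images.shape
    I = images.reshape(K, -1)                 # stack pixels: (K, H*W)
    # Lambertian model: I = L @ (albedo * normal); solve per pixel by least squares.
    G, *_ = np.linalg.lstsq(L, I, rcond=None)  # (3, H*W)
    albedo = np.linalg.norm(G, axis=0) + 1e-8
    normals = (G / albedo).reshape(3, H, W)
    return normals, albedo.reshape(H, W)

# Usage sketch: normals, albedo = estimate_normals(np.stack(frames))
```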

GelSight grew out of a project to create tactile sensors for robots. But MIT researchers Edward Adelson and Micah Kimo Johnson quickly realised that their system provided much higher resolution than robotic sensing required.

The scanner has plenty of real-world applications. The team is already in discussion with aerospace companies and equipment manufacturers who are interested in using GelSight to check the microscopic integrity of their products. It could have applications in medicine, forensics, ballistics and biometrics. 

By Wired UK 
From Wired

Stick-On Electronic Tattoos

Researchers have made stretchable, ultrathin electronics that cling to skin like a temporary tattoo and can measure electrical activity from the body. These electronic tattoos could allow doctors to diagnose and monitor conditions like heart arrhythmia or sleep disorders noninvasively.

 Pinch me: These microelectronics are able to wrinkle, bend, and twist along with skin, even as it is being pinched, without breaking or coming loose.

John A. Rogers, a professor of materials science at the University of Illinois at Urbana-Champaign, has developed a prototype that can replicate the monitoring abilities of bulky electrocardiograms and other medical devices that are normally restricted to a clinical or laboratory setting. This work was presented today in Science.

To achieve flexible, stretchable electronics, Rogers employed a principle he had already used to achieve flexibility in substrates. He made the components—all composed of traditional, high-performance materials like silicon—not only incredibly thin, but also "structured into a serpentine shape" that allows them to deform without breaking. The result, says Rogers, is that "the whole system takes on this kind of spiderweb layout."
In the past, says Rogers, he was able to create devices that were either flexible but not stretchable, or stretchable but not flexible. In particular, his previous work was limited by the fact that the electronics portions of his designs couldn't flex and stretch as much as the substrate they were mounted on.

The electronic tattoo achieves the mechanical properties of skin, which can stand up to twisting, poking, and pulling without breaking. Rogers's tattoo can also conform to the topography of the skin as well as stretch and shift with it. It can be worn for extended periods without producing the irritation that often results from adhesive tapes and rigid electronics. Although Rogers's preliminary tests involved a custom-made substrate, he also demonstrated that the electronics could be mounted onto a commercially available temporary tattoo. 

The prototype was equipped with electrodes to measure electric signals produced by muscle and brain activity. This could be useful for noninvasive diagnosis of sleep apnea or monitoring of premature babies' heart activity. It also might be possible, Rogers says, to use the tattoos to stimulate the muscles of physical rehabilitation patients, although this use wasn't demonstrated in the paper.

To demonstrate the device's potential as a human-computer interface, Rogers mounted one of the tattoos on a person's throat and used measurements of the electrical activity in the throat muscles to control a computer game. The signal from the device contained enough information for software to distinguish among the spoken words "left," "right," "up," and "down" to control a cursor on the screen.

The device included sensors for temperature, strain, and electric signals from the body. It also housed LEDs to provide visual feedback; photodetectors to measure light exposure; and tiny radio transmitters and receivers. The device is small enough that it requires only minuscule amounts of power, which it can harvest via tiny solar cells and via a wireless coil that receives energy from a nearby transmitter. Rogers hopes to build in some sort of energy-storage ability, like a tiny battery, in the near future. The researchers are also working on making the device wireless.

Ultimately, Rogers says, "we want to have a much more intimate integration" with the body, beyond simply mounting something very closely to the skin. He hopes that his devices will eventually be able to use chemical information from the skin in addition to electrical information.

By Kenrick Vezina
From Technology Review

Choosing the Good Eggs

By watching the tiny, pulsing motions of a newly fertilized mouse egg, researchers in a new study could determine which eggs stood the best chance of producing healthy mice. The same procedure should also work with human eggs, the researchers said yesterday, opening up the possibility of dramatically improving the success rates of in-vitro fertilization.

 Small actions: Researchers noticed that the cellular innards of an egg pulsed more rapidly immediately after fertilization, and that the movement was directed toward the spot where the sperm entered. The arrows indicate the direction of the pulses.

In a paper in the current issue of Nature Communications, the researchers found that the insides of a newly fertilized egg slowly vibrate, and that the speed and direction of these movements were associated with the egg's likelihood of success.

Using high-speed photography, the researchers could predict within hours of fertilization whether an egg would be viable—the fastest and earliest method ever devised, says the paper's senior author, Magdalena Zernicka-Goetz, of the University of Cambridge. The method is potentially safer than current techniques, which require removing cells from the developing embryo to determine their viability. 

The researchers have not tried this approach yet with human eggs, but say there's no reason it wouldn't work. "The same type of movements do happen in human eggs upon fertilization—we checked it," Zernicka-Goetz said at a news conference. "We are in the process of discussing with an in-vitro fertilization clinic in the U.K. to initiate such tests on human embryos."

Doctors have long sought ways to judge whether an embryo is worth implanting. This is particularly important in the United States because insurance often doesn't cover in-vitro fertilization, and many prospective parents can't afford the costly procedure more than once or twice, says Andrew La Barbera, scientific director for the American Society for Reproductive Medicine.

Doctors often implant multiple embryos to increase a woman's chances of a successful pregnancy. But this also increases the chances of twins and other multiple births, which are riskier for the babies. 

By using a high-speed camera, the researchers could see that the cytoplasm—the liquid surrounding the cell's organelles—began to move more rapidly after fertilization than before. That pulsing continued for about four hours, though it waxed and waned in three distinct stages. The direction of the movement also changed during these periods. The vibrations slowed once a cell nucleus was formed.

La Barbera agrees that human embryos likely show similar movements, and says he found the paper elegant and novel. But he doubts that this kind of test could be as predictive in human eggs as the Cambridge researchers suggest it is in mice. Human couples have far more genetic diversity than the inbred mice used in the research, La Barbera says, and older women can have defects in their eggs that might not be visible with this technique.  

"This study has great promise," he says. "It remains to be seen how well these results can be translated into humans."

Janice Evans, an associate professor at the Johns Hopkins Bloomberg School of Public Health, shares La Barbera's concerns about the applicability of the work to human embryos. The researchers' insight is profound, Evans says, but it is still too soon to tell if their method will work with human embryos.

By Karen Weintraub
From Technology Review

Energy Storage for Solar Power

BrightSource Energy has become the latest solar thermal power company to develop a system for generating power when the sun isn't shining. The company says the technology can lower the cost of solar power and make it more reliable, helping it compete with conventional sources of electricity.

 Stored sunlight: A rendering shows BrightSource’s new thermal storage design. The two large tanks will store molten salt, which can be used to generate steam to drive a turbine.

The company, based in Oakland, California, is building one of the world's largest solar thermal power plants. The 392-megawatt solar plant in Ivanpah, California, however, will not include the storage technology. Instead, BrightSource is working with utilities to determine which future projects could best benefit from storage. 

Solar thermal systems use mirrors to focus sunlight, generating temperatures high enough to produce steam to drive a turbine. One of the advantages of the solar thermal approach, versus conventional photovoltaics that convert sunlight directly into electricity, is that heat can be stored cheaply and used when needed to generate electricity. In all solar thermal plants, some heat is stored in the fluids circulating through the system. This evens out any short fluctuations in sunlight and lets the plant generate electricity for some time after the sun goes down. But adding storage systems would let the plant ride out longer periods of cloud cover and generate power well into, or even throughout, the night. Such long-term storage could be needed if solar is to provide a large share of the total power supply. 

BrightSource is using a variation on an approach to storage that's a decade old: heating up a molten salt—typically, a combination of sodium and potassium nitrate—and then storing it in a tank. To generate electricity, the molten salt is pumped through a heat exchanger to generate steam. BrightSource CEO John Woolard says one big factor in making this technology economically attractive is the use of power towers—in which mirrors focus sunlight on a central tower—that generate higher temperatures than other solar thermal designs. That higher temperature makes it possible to store more energy using a smaller amount of molten salt. "It's a much more efficient system and much more cost effective, overall," he says.

Storage allows a thermal power plant to run more hours in the day, so it can more quickly recover the cost of expensive steam turbines and generators. Woolard says that while a solar thermal plant without storage can generate electricity about 2,700 hours a year, BrightSource's storage system increases that to 4,300 hours. The increased output more than offsets the added cost of storage. A study from the National Renewable Energy Laboratory (NREL) in Golden, Colorado, estimates that storage in a power tower system could cut costs per kilowatt hour by 25 to 30 percent.
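
As a rough check on what those operating hours mean, they can be converted into capacity factors. The arithmetic below is my own illustration based on the figures quoted above, not a calculation from NREL or BrightSource.

```python
# Back-of-envelope capacity factors from the operating hours quoted above
# (illustrative arithmetic only; ignores part-load operation and downtime).
HOURS_PER_YEAR = 8760

for label, hours in [("without storage", 2700), ("with storage", 4300)]:
    print(f"{label}: {hours / HOURS_PER_YEAR:.0%} capacity factor")

# without storage: ~31%; with storage: ~49% -- the same turbine and
# generator earn revenue over roughly 1.6x as many hours per year.
```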

At least two other companies are pairing power tower technologies with molten salt storage. Torresol Energy has built such a system at a 19.9-megawatt solar thermal power plant near Seville, Spain, and demonstrated that it can run the power plant through the night using stored heat. In the United States, Solar Reserve plans to build a power tower with molten salt storage in Riverside County, California.  

In addition to lowering costs, the storage system also improves the economics of solar thermal power by increasing the price that utilities are willing to pay for the electricity. Storage decreases the need for utilities to invest in backup power for smoothing out variations in power. Utilities will also pay a higher price for power they can count on at any given moment to make up for increases in demand. And the storage system lets the plant sell power into the evening, when power prices are higher in some locations. 

Storage technology may be essential if solar thermal technology is to compete with photovoltaic solar panels, which have been coming down in price, says Mark Mehos, an NREL researcher. "BrightSource plants that don't have energy storage probably generate electricity at about the same price as a plant that uses photovoltaics," he says. "So all things being equal, they would like to be able to deliver that at a higher value."

By Kevin Bullis
From Technology Review

New X-ray microscopy technique images magnetic nanostructure

 Shown are the magnetic polarities of a cobalt alloy imaged by the new X-ray microscopy technique pioneered by SLAC's Joshua Turner. The red and green areas indicate regions of opposite magnetic polarity. Credit: Joshua Turner


The new method combines aspects of two existing X-ray techniques: one that determines 3-D molecular arrangements with another that is sensitive to magnetic structures.

“Biologists can use coherent X-ray techniques to calculate the electronic structure of complicated molecules from the pattern of X-rays diffracted off their electrons,” Turner said.  “But it turns out that the formula they use actually contains more parts that relate to the electron spin of magnetic atoms, such as iron and cobalt. Since biologists rarely work with magnetic molecules, they usually ignore the other parts of the equation.  But because electron spin is the source of magnetism, we can use the full formula to determine both the location and magnetic orientations of groups of atoms.”

Turner came to that realization three years ago, when he was nearly finished with graduate school at the University of Oregon. To pursue his idea, he took a postdoctoral research position with Chris Jacobsen at Stony Brook University in New York, where in 1999 scientists had first demonstrated the “lensless” X-ray diffraction technique that biologists use. This technique requires a coherent X-ray beam, but uses calculations instead of lenses to determine molecular structures from diffraction patterns. Among Turner’s mentors was emeritus faculty member Richard Gambino, a magnetic materials expert and National Medal of Technology winner. The key step in Turner’s new technique is interpreting the complex resonant X-ray scattering information in those calculations, which is what reveals the magnetic information.

After three years of trials, failures and improvements, Turner and colleagues succeeded earlier this year. They imaged a cobalt alloy sample with and without an external magnetic field that was more than 600 times stronger than Earth’s. The difference between the two views revealed in great detail the magnetic polarities throughout the alloy. Turner performed his experiments at Lawrence Berkeley National Laboratory’s Advanced Light Source. The results were published last month in the physics journal, Physical Review Letters.

This method can be used at any source of coherent X-rays, such as synchrotrons or free-electron X-ray lasers like the Linac Coherent Light Source. Ultra-short pulses freeze-frame magnetic changes, offering the potential for imaging in unprecedented detail the structure and motion of the 5-nanometer-thick boundaries between regions with different magnetic orientation.

According to Turner, the new technique has significant advantages over other ways of imaging magnetic nanostructures. “In particular, it is complementary to popular methods such as magnetic holography, but will be able to reach finer spatial resolution and doesn’t require circularly polarized X-rays,” he said, adding that a simple change will improve the resolution of the technique in future experiments from about 75 nanometers to less than 30 nanometers. The ultimate precision depends on the X-ray wavelength set by the magnetic element being imaged. For cobalt, it’s 1.5 nanometers. But for iridium, it’s 15 times smaller: 1 angstrom, which is less than the diameter of most atoms. “This will be the best way to get to the ultimate diffraction-limit resolution in imaging magnetic atoms on an ultrafast time scale,” Turner said.
 
From  physorg

Engineers Solve Longstanding Problem in Photonic Chip Technology: Findings Help Pave Way for Next Generation of Computer Chips

Now, researchers led by engineers at the California Institute of Technology (Caltech) are paving the way for the next generation of computer-chip technology: photonic chips. With integrated circuits that use light instead of electricity, photonic chips will allow for faster computers and less data loss when connected to the global fiber-optic network.

 Caltech engineers have developed a new way to isolate light on a photonic chip, allowing light to travel in only one direction. This finding can lead to the next generation of computer-chip technology: photonic chips that allow for faster computers and less data loss.

"We want to take everything on an electronic chip and reproduce it on a photonic chip," says Liang Feng, a postdoctoral scholar in electrical engineering and the lead author on a paper to be published in the August 5 issue of the journal Science. Feng is part of Caltech's nanofabrication group, led by Axel Scherer, Bernard A. Neches Professor of Electrical Engineering, Applied Physics, and Physics, and co-director of the Kavli Nanoscience Institute at Caltech.

In that paper, the researchers describe a new technique to isolate light signals on a silicon chip, solving a longstanding problem in engineering photonic chips.

An isolated light signal can only travel in one direction. If light weren't isolated, signals sent and received between different components on a photonic circuit could interfere with one another, causing the chip to become unstable. In an electrical circuit, a device called a diode isolates electrical signals by allowing current to travel in one direction but not the other. The goal, then, is to create the photonic analog of a diode, a device called an optical isolator. "This is something scientists have been pursuing for 20 years," Feng says.

Normally, a light beam has exactly the same properties when it moves forward as when it's reflected backward. "If you can see me, then I can see you," he says. In order to isolate light, its properties need to somehow change when going in the opposite direction. An optical isolator can then block light that has these changed properties, which allows light signals to travel only in one direction between devices on a chip.

"We want to build something where you can see me, but I can't see you," Feng explains. "That means there's no signal from your side to me. The device on my side is isolated; it won't be affected by my surroundings, so the functionality of my device will be stable."

To isolate light, Feng and his colleagues designed a new type of optical waveguide, a 0.8-micron-wide silicon device that channels light. The waveguide allows light to go in one direction but changes the mode of the light when it travels in the opposite direction.

A light wave's mode corresponds to the pattern of the electromagnetic field lines that make up the wave. In the researchers' new waveguide, the light travels in a symmetric mode in one direction, but changes to an asymmetric mode in the other. Because different light modes can't interact with one another, the two beams of light thus pass through each other.
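
The statement that different modes can't interact reflects the standard orthogonality of waveguide modes; the relation below is general waveguide theory, written in a simplified transverse-field form, not a result specific to this device.

```latex
% For two guided modes of opposite symmetry (a symmetric mode E_1 and an
% antisymmetric mode E_2 of the same waveguide), the transverse-field
% overlap across the cross-section A vanishes by parity, so power does not
% couple between them in an ideal, unperturbed guide:
\int_{A} \mathbf{E}_1(x,y) \cdot \mathbf{E}_2^{*}(x,y)\, dA = 0
```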

Previously, there were two main ways to achieve this kind of optical isolation. The first way -- developed almost a century ago -- is to use a magnetic field. The magnetic field changes the polarization of light -- the orientation of the light's electric-field lines -- when it travels in the opposite direction, so that the light going one way can't interfere with the light going the other way. "The problem is, you can't put a large magnetic field next to a computer," Feng says. "It's not healthy."

The second conventional method requires so-called nonlinear optical materials, which change light's frequency rather than its polarization. This technique was developed about 50 years ago, but is problematic because silicon, the material that's the basis for the integrated circuit, is a linear material. If computers were to use optical isolators made out of nonlinear materials, silicon would have to be replaced, which would require revamping all of computer technology. But with their new silicon waveguides, the researchers have become the first to isolate light with a linear material.

Although this work is just a proof-of-principle experiment, the researchers are already building an optical isolator that can be integrated onto a silicon chip. An optical isolator is essential for building the integrated, nanoscale photonic devices and components that will enable future integrated information systems on a chip. Current, state-of-the-art photonic chips operate at 10 gigabits per second (Gbps) -- hundreds of times the data-transfer rates of today's personal computers -- with the next generation expected to soon hit 40 Gbps. But without built-in optical isolators, those chips are much simpler than their electronic counterparts and are not yet ready for the market. Optical isolators like those based on the researchers' designs will therefore be crucial for commercially viable photonic chips.

In addition to Feng and Scherer, the other authors on the Science paper, "Non-reciprocal light propagation in a silicon photonic circuit," are Jingqing Huang, a Caltech graduate student; Maurice Ayache of UC San Diego and Yeshaiahu Fainman, Cymer Professor in Advanced Optical Technologies at UC San Diego; and Ye-Long Xu, Ming-Hui Lu, and Yan-Feng Chen of the Nanjing National Laboratory of Microstructures in China. This research was done as part of the Center for Integrated Access Networks (CIAN), one of the National Science Foundation's Engineering Research Centers. Fainman is also the deputy director of CIAN. Funding was provided by the National Science Foundation, and the Defense Advanced Research Projects Agency.

Northern Humans Had Bigger Brains, to Cope With the Low Light Levels, Study Finds

Scientists have found that people living in countries with dull, grey, cloudy skies and long winters have evolved bigger eyes and brains so they can visually process what they see, reports the journal Biology Letters.

 Skulls from the 1800s used in the study.

The researchers measured the eye socket and brain volumes of 55 skulls, dating from the 1800s, from museum collections. The skulls represented 12 different populations from across the globe. The volume of the eye sockets and brain cavities were then plotted against the latitude of the central point of each individual's country of origin. The researchers found that the size of both the brain and the eyes could be directly linked to the latitude of the country from which the individual came.
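
The analysis described above amounts to regressing orbital and cranial volumes on latitude. Below is a minimal sketch of that computation; the file name and column names are placeholders of my own, not the study's actual dataset or results.

```python
import csv
import numpy as np

# Sketch of the latitude-versus-volume analysis described above; the CSV
# path and column names are hypothetical, not the published data.
latitudes, orbit_vols = [], []
with open("skull_measurements.csv") as f:
    for row in csv.DictReader(f):
        latitudes.append(abs(float(row["latitude_deg"])))   # distance from equator
        orbit_vols.append(float(row["orbital_volume_ml"]))

slope, intercept = np.polyfit(latitudes, orbit_vols, deg=1)
r = np.corrcoef(latitudes, orbit_vols)[0, 1]
print(f"orbital volume ~ {slope:.3f} ml per degree of latitude, r = {r:.2f}")
```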

Lead author Eiluned Pearce, from the Institute of Cognitive and Evolutionary Anthropology in the School of Anthropology, said: 'As you move away from the equator, there's less and less light available, so humans have had to evolve bigger and bigger eyes. Their brains also need to be bigger to deal with the extra visual input. Having bigger brains doesn't mean that higher latitude humans are smarter, it just means they need bigger brains to be able to see well where they live.'

Co-author Professor Robin Dunbar, Director of the Institute of Cognitive and Evolutionary Anthropology, said: 'Humans have only lived at high latitudes in Europe and Asia for a few tens of thousands of years, yet they seem to have adapted their visual systems surprisingly rapidly to the cloudy skies, dull weather and long winters we experience at these latitudes.'

That the explanation is the need to compensate for low light levels at high latitudes is indicated by the fact that visual sharpness, measured under natural daylight conditions, is constant across latitudes. This suggests that the visual processing system has adapted to ambient light conditions as human populations have moved across the globe.

The study takes into account a number of potentially confounding effects, including the effect of phylogeny (the evolutionary links between different lineages of modern humans), the fact that humans living in the higher latitudes are physically bigger overall, and the possibility that eye socket volume was linked to cold weather (and the need to have more fat around the eyeball by way of insulation).

The skulls used in the study were from the indigenous populations of England, Australia, the Canary Islands, China, France, India, Kenya, Micronesia, Scandinavia, Somalia, Uganda and the United States. Measurements of the brain cavity suggest that the biggest brains belonged to populations living in Scandinavia, while the smallest belonged to Micronesians.

This study adds weight to other research that has looked at the links between eye size and light levels. Other studies have already shown that birds with relatively bigger eyes are the first to sing at dawn in low light. The eyeball size across all primates has been found to be associated with when they choose to eat and forage -- with species with the largest eyes being those that are active at night.

A Guiding Light for Silicon Photonics

A new way of controlling the path that light takes as it passes through silicon could help overcome one of the big obstacles to making an optical, rather than electronic, computer circuit. Researchers at Caltech and the University of California, San Diego, have taken a step toward a device that prevents light signals from reflecting back and causing errors in optical circuits.

 Light bouncer: Light entering a metallic silicon waveguide from the left flows freely (top). Light entering from the right has its path disrupted.

Chips that compute with light instead of electrons promise to be not only faster, but also less expensive and more energy-efficient than their conventional counterparts. But to be made economically, many believe, photonic chips must be made from silicon, using equipment already being used to build electronic microchips.
Researchers have made many of the necessary elements for a silicon photonic circuit already, including superfast modulators for encoding information onto beams of light, and detectors to read these beams. 
 
But the way light travels through silicon remains a big problem. Light doesn't just go in one direction—it bounces around and even reflects backward, which is disastrous in a circuit. If an optical device were designed to receive two inputs and a third input reflected back in, that would cause an error. As a circuit became more complex, error-causing reflections would overwhelm it.

The Caltech and UCSD researchers have developed a silicon waveguide that causes light to behave differently depending on the direction it's traveling. The researchers, led by Caltech electrical engineering professor Axel Scherer, created a waveguide out of a long, narrow strip of silicon about 800 nanometers wide, with metal spots along the sides like bumpers. Light travels freely in one direction down the waveguide, but is bent as it travels in the opposite direction.

"This is an important breakthrough in a field where we really need a few," says Marin Soljačić, a physics professor at MIT. Soljačić was not involved with the work. The lack of this kind of component, he says, has been "the single biggest obstacle to the large-scale integration of optics at a similar scale to electronics."

Physicists have been wrestling with the unruly behavior of light in silicon for a long time. The new design is the result of years of theoretical work by the California researchers, as well as Soljačić, Shanhui Fan at Stanford University, and others. Previously, researchers had only been able to get light to behave this way in magnetic materials that cannot be incorporated into silicon circuitry, says Michelle Povinelli, assistant professor of electrical engineering at the University of Southern California. 

Soljačić says the new waveguide is particularly significant because it was fabricated using methods used by the semiconductor industry. "This is a very important step toward large-scale optics integration," he says.
Caltech researcher Liang Feng says the team is now working on engineering a full isolator—a component that only lets light travel in one direction, instead of just bending it as it tries to travel the wrong way. He says the current work "is just the first step." 

"Now it's about engineering around this fundamental discovery," says Keren Bergman, professor of electrical engineering at Columbia University. Bergman was not involved with the work. 

Even after that engineering is finished, Bergman says, there's a big looming problem for silicon photonics: there's no good way to make the light sources that are needed for silicon optical processors. Soljačić adds that a full optical computer will also need optical memory, which hasn't been made, either. However, the current work overcomes the "biggest uncertainty" that had been troubling engineers, he says. "Now, with this work, I'm feeling much better."

By Katherine Bourzac
From Technology Review

Nanoscale Pillars Could Have a Big Role in Future Batteries

Tin, silicon, and a few other elements have long been languishing on chemists' list of electrode materials that could, in theory, help lithium-ion batteries hold more energy. A new way of structuring these materials could at last allow them to be used in this way. 

Researchers at the Lawrence Berkeley National Laboratory made tin electrodes by using layers of graphene to protect the normally fragile tin. These first tin electrodes are a sign that materials scientists have made a great deal of progress in using nanoscale structures to improve batteries.

 Power pillars: This battery electrode, shown in cross section under an electron microscope, consists of nanoscale tin pillars sandwiched between sheets of graphene.

Making battery electrodes from tin or silicon can boost the battery's overall energy storage. That's because such materials can take in more lithium during charging and recharging than carbon, which is normally used. But silicon and tin tend to be unstable as electrodes. Tin takes up so much lithium that it expands in volume by a factor of two to three during charging. "This forms cracks, and the tin leaks into the electrolyte and is lost," says Yuegang Zhang, a scientist at Lawrence Berkeley.
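
A back-of-envelope way to see why tin (and silicon) can store so much more lithium than the graphite normally used is to compute theoretical specific capacity from Faraday's law and the fully lithiated phase. The values below are standard textbook figures used as my own illustration, not numbers from the Berkeley paper.

```python
# Theoretical specific capacity (mAh per gram of host material) from
# Faraday's law: capacity = n * F / (3.6 * M), where n is the number of
# lithium atoms stored per host formula unit, F is Faraday's constant in
# C/mol, and M is the host formula mass in g/mol. Textbook illustration,
# not data from the Berkeley study.
F = 96485.0  # C/mol

def capacity_mah_per_g(n_li, molar_mass):
    return n_li * F / (3.6 * molar_mass)

print(f"graphite (LiC6): {capacity_mah_per_g(1.0, 6 * 12.011):.0f} mAh/g")   # ~372
print(f"tin (Li4.4Sn):   {capacity_mah_per_g(4.4, 118.71):.0f} mAh/g")       # ~994
```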

Zhang's clever solution is to layer the tin between sheets of graphene, single-atom-thick sheets of carbon mesh. Graphene is highly conductive, and while it's flexible, it's also the strongest material ever tested. 

The tin-graphene electrode consists of two layers of tin nanopillars sandwiched between three sheets of graphene. The pillars help the electrode remain stable: instead of fracturing, the tin expands and contracts during charging without breaking. The space between the pillars means there's plenty of room for the battery's electrolyte to move around, which ensures fast charging speeds. 

Zhang's group has made prototype batteries featuring these electrodes. The prototype tin-graphene batteries can charge up in about 10 minutes and store about 700 milliamp-hours per gram of charge. This storage capacity is maintained over 30 charge cycles. The batteries will ultimately need to hold their performance for hundreds of charge cycles. "The performance they have is quite reasonable, and this has a pretty clear application in existing batteries," says Yi Cui, associate professor of materials science at Stanford University. Cui was not involved with the work.

Several other research groups are working on promising battery materials that include nanoscale structures. Cui has founded a company called Amprius to commercialize another kind of battery anode that features silicon nanowires. The nanostructure of these wires also helps the fragile material remain stable as it takes up and releases lithium. Another group, led by Pulickel Ajayan at Rice, recently built a nanostructured battery that incorporates a tin electrode, in this case integrating the electrodes and the electrolyte on individual nanowires. Arrayed together, these nanowires could make long-lasting microbatteries for small devices such as sensors. 

Zhang is working to demonstrate the use of the nanopillar structure with other fragile electrode materials, including silicon. The process may add to the cost of battery production, but the performance gains could offset the potential additional cost. "People typically assume that a fancy nanoscale structure will cost more, but it may not," says Zhang.

By Katherine Bourzac
From Technology Review

New Process Could Make Canadian Oil Cheaper, Cleaner

New technology for extracting oil from oil sands could more than double the amount of oil that can be extracted from these abundant deposits. It could also reduce greenhouse-gas emissions from the process by up to 85 percent. The technology was developed by N-Solv, an Alberta-based consortium that recently received $10 million from the Canadian government to develop the technology.

 Oil cleanup: A solvent allows oil to flow out of the sand at the top of this simulated oil sands reservoir.

Canada's oil sands are a huge resource. They contain enough oil to supply the U.S. for decades. But they are made up of a tarry substance called bitumen, which requires large amounts of energy to extract from the ground and prepare for transport to a refinery. This fact has raised concerns about the impact of oil sands on climate change. The concerns have been heightened by plans to build a new pipeline for transporting crude oil from the sands to refineries in the United States. 

Most oil sands production currently involves digging up oily sand deposits near the surface and processing the sludgy material with heat and chemicals to free the oil and reduce its viscosity so it can flow through a pipeline. But 80 percent of oil sands are too deep for this approach. Getting at the deeper oil requires treating the bitumen underground so it can be pumped out through an oil well. The most common technique in new projects involves injecting steam into the bitumen underground. But producing the steam means burning natural gas, which emits carbon dioxide. And the oil that's pumped out is still too thick to flow through a pipeline, so it has to be partially refined, which emits still more greenhouse gases.

N-Solv's process requires less energy because it uses a solvent rather than steam to free the oil, says Murray Smith, a member of N-Solv's board of directors. The solvent, such as propane, is heated to a relatively low temperature (about 50 °C) and injected into a bitumen deposit. The solvent breaks down the bitumen, allowing it to be pumped out along with the propane, which can be reused. The solvent approach requires less energy than heating, pumping, and recycling water for steam. And because the heaviest components of the bitumen remain underground, the oil that results from the solvent process needs to be refined less before it can be transported in a pipeline. 

Because the new process requires less energy, it should also be cheaper. Smith adds that the equipment needed for heating and reusing the propane is less expensive than technology for managing the large volumes of water used in the steam process. With conventional techniques, oil prices have to be above $50 to $60 per barrel—as they have been for several years—for oil sands to be economical. Smith says that with the solvent process, oil sands are still economical even if oil is $30 to $40 per barrel, close to what it was in the 1990s and early 2000s (in inflation-adjusted dollars). N-Solv says the lower costs will make it possible to economically extract more than twice as much oil from the oil sands compared to conventional technologies. 

The idea of using solvents to get at oil sands was proposed in the 1970s, but early experiments showed that the process couldn't produce oil quickly enough. Two things changed that, according to N-Solv. First, horizontal drilling technologies now make it possible to run a solvent injection well along the length of an oil sands deposit, increasing the area in contact with the solvent, thus increasing production. Second, N-Solv determined that even small amounts of methane—a by-product of using a solvent—could contaminate the propane and degrade its performance. So N-Solv introduced purification equipment to separate methane from the propane before it is reused. The separated methane can also be used to heat the propane, further reducing energy costs. 

Although N-Solv's technology could reduce carbon-dioxide emissions from production, most of the emissions associated with oil sands—as with any source of oil—come not from producing the oil, but from burning it in vehicles and furnaces. The technology's impact on climate change will depend on whether the process leads to increased oil production—if it does, it may actually result in increased net greenhouse-gas emissions, says David Keith, a chemical and petroleum engineering professor at the University of Calgary. 
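
A rough calculation shows why. Assume, purely for illustration, that extraction and upgrading account for about a quarter of a barrel's well-to-wheels emissions (a commonly cited ballpark that is not given in the article); then even an 85 percent cut at the production stage trims total emissions per barrel by only about a fifth.

    # Illustrative well-to-wheels arithmetic. The 25% production share is an assumed
    # ballpark for oil-sands-derived fuel, NOT a figure from the article.
    production_share = 0.25
    combustion_share = 1.0 - production_share
    production_cut = 0.85  # N-Solv's claimed reduction in production emissions

    remaining = production_share * (1.0 - production_cut) + combustion_share
    print(f"per-barrel life-cycle emissions fall by ~{(1.0 - remaining) * 100:.0f}%")
    # ~21% -- so the net climate impact hinges on how much extra oil the cheaper process unlocks.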

So far, the process has been tested only in a lab. Now N-Solv will begin a pilot project that could produce 500 barrels of oil a day. The $60 million project, which is mostly funded by private sources, will determine whether the process can work on a larger scale. 

By Kevin Bullis
From Technology Review

Nanofiber Regenerates Blood Vessels

Regenerating blood vessels is important for combating the aftereffects of a heart attack or peripheral arterial disease, and for ensuring that transplanted organs receive a sufficient supply of blood. Now researchers at Northwestern University have created a nanomaterial that could help the body to grow new blood vessels.

 Capillary action: The transparent circle in the center of this image is a nanomaterial designed to mimic the protein VEGF. Here, it has enhanced the growth of blood vessels in the membrane from a chicken egg after three days.

Samuel Stupp and his colleagues developed a liquid that, when injected into patients, forms a matrix of loosely tangled nanofibers. Each of these fibers is covered in microscopic protuberances that mimic vascular endothelial growth factor, or VEGF—a protein that occurs naturally in the body and causes chemical reactions that result in the growth of new blood vessels. By mimicking VEGF, the nanofiber has the same biological effect.

Jeff Karp, director of the Laboratory for Advanced Biomaterials and Stem-Cell-Based Therapeutics at Brigham & Women's Hospital, says, "This is an elegant approach to rationally design engineered materials to stimulate specific biological pathways." Karp was not involved with the project.

Ali Khademhosseini, an associate professor at the Harvard-MIT Division of Health Sciences and Technology, adds that "the ability to induce blood vessel formation is one of the major problems in tissue engineering." 

Tissue engineers have tried using VEGF itself to stimulate the growth of blood vessels, but clinical trials with the protein were unsuccessful, says Stupp, director of the Institute for BioNanotechnology in Medicine at Northwestern. This is because VEGF tends to diffuse out of the target tissue before it can do its job. Maintaining a therapeutic concentration in the target tissue would require a series of expensive, invasive injections. 

The new nanomaterial has a similar effect but lasts much longer in the target tissue, and it is completely biodegradable once its job is finished. Stem cells could also be used to regenerate blood vessels, but their use is expensive and controversial.

The researchers tested the material in mice whose hind legs had their blood supply restricted; left untreated, the limbs would die. The nanofiber treatment rescued the limbs and resulted in better motor function and blood circulation than other treatments, including VEGF itself.

Stupp says there could be more uses for nanofibers that mimic proteins from the body. For example, they could be used to stimulate the formation of connective tissues such as bone and cartilage, or to regenerate neurons in the brain.

"The next step is to proceed with extensive toxicological testing," says Stupp. "The long view would be to produce a cell-free, growth-factor-free therapy for the treatment of ischemic disease and heart attacks."

Khademhosseini also sees a lot of potential in nanomaterials that mimic natural proteins. "Such materials could have a great future application in regenerative medicine, as they will enable the body's own regenerative response to heal," he says.

By Kenrick Vezina
From Technology Review