Meet the Nimble-Fingered Interface of the Future

Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's.....

Electronic Implant Dissolves in the Body

Researchers at the University of Illinois at Urbana-Champaign, Tufts University, and ......

Sorting Chip May Lead to Cell Phone-Sized Medical Labs

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a ....

TDK sees hard drive breakthrough in areal density

Perpendicular magnetic recording was an idea that languished for many years, says a TDK technology backgrounder, because the ....

Engineers invent new device that could increase Internet

The device uses the force generated by light to flip a mechanical switch of light on and off at a very high speed........


Virus Enzymes Could Promote Human, Animal Health

ScienceDaily (Aug. 31, 2009) — Could viruses be good for you? Scientists with the Agricultural Research Service (ARS) have shown that enzymes from bacteria-infecting viruses known as phages could have beneficial applications for human and animal health.

Phage enzymes called endolysins attack bacteria by breaking down their cell walls. Unlike antibiotics, which tend to have a broad range, endolysins are comparatively specific, targeting unique bonds in the cell walls of their hosts. This is significant because it means non-target bacteria could be less likely to develop resistance to endolysins.

Researchers at the ARS Animal Biosciences and Biotechnology Laboratory in Beltsville, Md., in collaboration with federal, university and industry scientists, have developed and are patenting technology to create powerful antimicrobials by fusing genetic material from multiple cell-wall-degrading endolysins. Now the researchers are collaborating with biopharmaceutical companies to evaluate and further develop the technology.

Studies led by ARS biologist David M. Donovan show that phage enzymes could be used to wipe out multi-drug-resistant pathogens that affect both animals and humans, such as methicillin resistant Staphylococcus aureus, also known as MRSA.

The scientists showed that the enzymes can knock out pathogens in biofilms, which are matrices of microorganisms that can attach to a variety of surfaces. Biofilms are resistant to antibiotics and contribute to many human infections.

In a related study, the scientists showed that using the endolysins lysostaphin and LysK in concert inhibited the growth of staphylococcal strains that cause mastitis in cattle and staph infections in humans.

This research was published recently in the journal Biotech International.

This work is supported in part by the U.S. Department of Agriculture's Cooperative State Research, Education and Extension Service, the National Institutes of Health, and the U.S. Department of State.


New Type of Disappearing Ink

Top-secret maps and messages that fade away to keep unwanted eyes from seeing them could be made with a new nanoparticle ink. Researchers at Northwestern University, led by chemical and biological engineering professor Bartosz Grzybowski, have used gold and silver nanoparticles embedded in a thin, flexible organic gel film to make the new type of self-erasing medium.

Shining ultraviolet light on the film through a patterned mask or moving an ultraviolet "pen" over it records an image on the film. In visible light, the image slowly vanishes. Writing on the medium takes a few tens of milliseconds, but the researchers can speed up the process by using brighter light. They can also tweak the nanoparticles to control how quickly the images disappear, anywhere from hours to a few days. The images vanish in a few seconds when they are exposed to bright light or heat.

The film can be erased and rewritten hundreds of times without any change in quality. It can be bent and twisted.

Timely disappearance: Metal nanoparticles that clump together and change color under ultraviolet light are used as an ink to create images. In visible light, the clumps break apart and the image fades away in nine hours.

The technology, described in an online Angewandte Chemie paper, would be ideal for making secure messages, Grzybowski says. He also envisions self-expiring bus and train tickets. "It self-erases and there's no way of tracing it back," Grzybowski says. "Also this material self-erases when exposed to intense light, so putting it on a copier is not possible."

There have been previous reports of self-erasing media. In 2006, Xerox announced a paper that erases itself in 16 to 24 hours. These materials use photochromic molecules that rearrange their internal chemical structure when exposed to light, which changes their color. Typically, these molecules can only switch between two colors and they lose their ability to switch after a few cycles. Besides, says Grzybowski, the molecules are not bright so you need a large number to see any color change. "You have to put a kilogram of this into paper before you see something," he says.

Grzybowski and his colleagues make the self-erasing ink with 5-nanometer-wide gold or silver particles. They attach molecules to the nanoparticles' surfaces that change shape under ultraviolet (UV) light and attract one another. "They're like a molecular glue that you can regulate using light," he says. The unwritten films are red if they contain gold particles and yellow if they contain silver. The films can also be made in other colors, ranging from red to blue, by choosing nanoparticles of a different size. Particles exposed to light form clusters of a different color--the red film changes to blue and yellow changes to violet.

In the absence of light the clusters fall apart. How quickly they fall apart, erasing the writing, depends on the amount of the gluelike molecules on their surfaces.

Different colors can be written depending on how much light is applied: more UV light makes the particles form tighter clusters, which have a different color than looser ones. The researchers were also able to write two images, one over the other, on the same film, because not all of the nanoparticles are used up in writing the first image; the remainder can record the second.

"The concept of using photostimulated reversible aggregation of gold or silver particles for self-erasing images is quite interesting and new," says Masahiro Irie, a chemistry professor at Rikkyo University in Tokyo who studies photochromic molecules. However, he believes that photochromic molecules might be better for practical self-erasing systems. Images or text written with the new inks might not have a high resolution because they require clusters of nanoparticles. Plus, the unwritten film is colored because of the nanoparticles, and it would be more desirable to have a colorless or white original film, he says.

But the flexibility and control that the new material offers make it attractive. It is easy to control the speed of writing and erasure, as well as the color, Grzybowski says. He adds that the technology has drawn interest from a United Kingdom-based security firm.


By Prachi Patel

Star-birth Myth 'Busted'

ScienceDaily (Aug. 30, 2009) — An international team of researchers has debunked one of astronomy's long held beliefs about how stars are formed, using a set of galaxies found with CSIRO’s Parkes radio telescope.

When a cloud of interstellar gas collapses to form stars, the stars range from massive to minute.

Since the 1950s astronomers have thought that in a family of new-born stars the ratio of massive stars to lighter ones was always pretty much the same — for instance, that for every star 20 times more massive than the Sun or larger, you’d get 500 stars the mass of the Sun or less.

“This was a really useful idea. Unfortunately it seems not to be true,” said team research leader Dr Gerhardt Meurer of Johns Hopkins University in Baltimore.

The relative numbers of stars of different masses at birth are described by the ‘initial mass function’ (IMF).

Most of the light we see from galaxies comes from the highest mass stars, while the total mass in stars is dominated by the lower mass stars.

False-colour images of two galaxies, NGC 1566 (left) and NGC 6902 (right), showing their different proportions of very massive stars. Regions with massive O stars show up as white or pink, while less massive B stars appear in blue. NGC 1566 is much richer in O stars than is NGC 6902. The images combine observations of UV emission by NASA's Galaxy Evolution Explorer spacecraft and H-alpha observations made with the Cerro Tololo Inter-American Observatory (CTIO) telescope in Chile. NGC 1566 is 68 million light years away in the southern constellation of Dorado. NGC 6902 is about 33 million light years away in the constellation Sagittarius. (Credit: NASA/JPL-Caltech/JHU)

By measuring the amount of light from a population of stars, and making some corrections for the stars’ ages, astronomers can use the IMF to estimate the total mass of that population of stars.
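
In practice the IMF is usually written as a power law in stellar mass. As a rough illustration of the bookkeeping involved (using the classic Salpeter slope and arbitrary mass limits, neither of which comes from this study), the sketch below integrates such a power law to compare the number of massive stars with the number of roughly solar-mass stars.

```python
# Illustrative sketch only: integrate a power-law IMF, dN/dm proportional to
# m**-alpha, to compare the number of massive stars (> 20 solar masses) with
# the number of low-mass stars (0.1-1 solar mass). The Salpeter slope
# alpha = 2.35 and the mass limits are assumptions chosen for illustration;
# the exact ratio depends strongly on both.

def imf_count(m_lo, m_hi, alpha=2.35):
    """Relative number of stars between m_lo and m_hi for dN/dm = m**-alpha."""
    p = 1.0 - alpha
    return (m_hi**p - m_lo**p) / p

n_massive = imf_count(20.0, 100.0)   # O stars, more than 20 solar masses
n_low = imf_count(0.1, 1.0)          # stars of roughly solar mass or less

# With these assumed limits the ratio comes out at roughly a thousand low-mass
# stars per massive star; the article's ~500:1 figure corresponds to a
# slightly different IMF slope and different mass limits.
print(f"low-mass stars per massive star: {n_low / n_massive:.0f}")
```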

Results for different galaxies can be compared only if the IMF is the same everywhere, but Dr Meurer’s team has shown that this ratio of high-mass to low-mass newborn stars differs between galaxies.

For instance, small 'dwarf' galaxies form many more low-mass stars than expected.

To arrive at this finding, Dr Meurer’s team used galaxies from the HIPASS Survey (HI Parkes All Sky Survey) done with CSIRO’s Parkes radio telescope.

The astronomers measured two tracers of star formation, ultraviolet and H-alpha emissions, in 103 galaxies using NASA’s GALEX satellite and the 1.5-m CTIO optical telescope in Chile.

“All of these galaxies were detected with the Parkes telescope because they contain substantial amounts of neutral hydrogen gas, the raw material for forming stars, and this emits radio waves,” said CSIRO’s Dr Baerbel Koribalski, a member of Dr Meurer’s team.

Selecting galaxies on the basis of their neutral hydrogen gave a sample of galaxies of many different shapes and sizes, unbiased by their star formation history.

H-alpha emission traces the presence of very massive stars called O stars, which are born with masses more than 20 times that of the Sun.

The UV emission traces both O stars and the less massive B stars — overall, stars more than three times the mass of the Sun.

Dr Meurer’s team found that this ratio of H-alpha to UV emission varied from galaxy to galaxy, implying that the IMF also varies, at least at its upper end.

Their work confirms tentative suggestions made first by Veronique Buat and collaborators in France in 1987, and then a more substantial study last year by Eric Hoversteen and Karl Glazebrook working out of Johns Hopkins and Swinburne Universities that suggested the same result.

“This is complicated work, and we’ve necessarily had to take into account many factors that affect the ratio of H-alpha to UV emission, such as the fact that B stars live much longer than O stars,” Dr Meurer said.

Dr Meurer’s team suggests the IMF seems to be sensitive to the physical conditions of the star-forming region, particularly gas pressure.

For instance, massive stars are most likely to form in high-pressure environments such as tightly bound star clusters.

The team’s results allow a better understanding of other recently observed phenomena that have been puzzling astronomers, such as the variation of the ratio of H-alpha to ultraviolet light with radius within some galaxies. This now makes sense: the stellar mix varies as the pressure drops with radius, just as pressure varies with altitude on Earth.

Importantly, the team also found that essentially all galaxies rich in neutral hydrogen seem to form stars.

“That means surveys for neutral hydrogen with radio telescopes will find star-forming galaxies of all kinds,” Dr Meurer said.

The Australian SKA Pathfinder, the next-generation radio telescope now being developed by CSIRO, will find neutral hydrogen gas in half a million galaxies, allowing a comprehensive examination of star-formation in the nearby universe.


A More Sensitive Cancer Breathalyzer

Lung cancer is a brutal disease, often not caught until it's too late for treatment to do much good. Now researchers are building an electronic nose that could help physicians detect the disease during its initial stages. Using gold nanoparticles, scientists at the Israel Institute of Technology in Haifa have created sensors with an unprecedented sensitivity for sniffing out compounds present in the breath of lung-cancer patients.

Other attempts to do this have yielded promising results (see Lung-Cancer Breathalyzer and Cancer Breathalyzer), but those devices require a higher concentration of the telltale biomarker chemicals than the Israeli device. The chemicals, called volatile organic compounds (VOCs), are metabolic products present in the vapors that we breathe out, but they occur in such small amounts that researchers have had to find ways to increase their concentrations before testing. Now, Hossam Haick and his colleagues have built sensors using an array of gold nanoparticles that can detect these VOCs in their natural concentrations and under the humid conditions characteristic of human breath. Their research was recently published online in the journal Nature Nanotechnology.

Other devices used for the same kinds of tests depend on expensive means of VOC detection, such as optical sensors, mass spectrometry, and acoustic sensors. These systems aren't always portable, either. Gold-nanoparticle sensors, however, have the potential to be small and inexpensive--the only problem has been getting the VOCs to stick to the gold. "It was quite a lot of work to get them to stick," says Haick, a 2008 TR35 winner. "We're the first to do so, as far as I know." Because of an impending patent, Haick declined to explain how he achieved the desired stickiness.

Using breath samples from 40 healthy volunteers and 56 lung-cancer patients, the group used the sensors to identify which biomarkers would collectively act as an accurate lung-cancer signature. After training the sensors to recognize the signature and then testing them again, Haick and his colleagues found that their device could reliably differentiate between cancerous and healthy breath. They're now testing the device on a larger group of people in various stages of the disease and believe they'll be ready to start clinical trials within two or three years.
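
The paper does not spell out its pattern-recognition step, but the general approach is to treat each breath sample's nine chemiresistor readings as a feature vector and train a classifier on labelled samples. The sketch below illustrates that idea with synthetic stand-in data and an off-the-shelf logistic-regression model; both the data and the choice of classifier are assumptions for illustration, not details from the study.

```python
# Minimal sketch of the general "electronic nose" approach: train a classifier
# on chemiresistor-array readings to separate cancerous from healthy breath.
# The data are synthetic stand-ins and logistic regression is an assumed,
# generic choice; the study's actual pattern-recognition method is not given.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_sensors = 9                      # the prototype described uses nine chemiresistors

# Synthetic resistance changes: healthy vs. cancer samples with shifted means.
healthy = rng.normal(loc=0.0, scale=1.0, size=(40, n_sensors))
cancer = rng.normal(loc=0.8, scale=1.0, size=(56, n_sensors))

X = np.vstack([healthy, cancer])
y = np.array([0] * len(healthy) + [1] * len(cancer))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```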

"Any advance in the area of developing sensors for breath research is exciting to me, and I think this is certainly an advance," says Peter Mazzone, a lung cancer specialist and breath-analysis researcher at the Cleveland Clinic in Ohio. "This was a very well-done, very promising study. I don't know if it is accurate enough to use in clinical practice, yet it's very exciting to see another promising sensor system."

Small sensor: When exposed to particular gases, chemiresistors, such as the one shown above, change their resistance. Researchers are using an array of nine chemiresistors to detect lung cancer on the breath of human subjects.

Preliminary tests indicate that the gold-nanoparticle sensors can not only differentiate among stages of lung cancer but also detect distinct signatures for other ailments, such as liver failure. Haick's group has even tested the electronic nose above colonies of cells grown in culture. This study found that while the sensor was able to sniff out compounds already known to be in breath, other lung-cancer-associated VOCs weren't detected. "Obviously, something is going on in the body to metabolize and create additional VOCs," Haick says. He's now working to figure out precisely what that is, in the hopes that it could provide new insight into lung cancer and how to treat it.



By Lauren Gravitz



Army of Flea-Sized Robots to Do Our Bidding in the Future

The researchers, from institutes in Sweden, Spain, Germany, Italy, and Switzerland, explain that their building approach marks a new paradigm of robot development in microrobotics. The technique involves integrating an entire robot - with communication, locomotion, energy storage, and electronics - in different modules on a single circuit board. In the past, the single-chip robot concept has presented significant limitations in design and manufacturing. However, instead of using solder to mount electrical components on a printed circuit board as in the conventional method, the researchers use conductive adhesive to attach the components to a double-sided flexible printed circuit board using surface mount technology. The circuit board is then folded to create a three-dimensional robot.

The resulting robots are very small, with their length, width, and height each measuring less than 4 mm. The robots are powered by a solar cell on top, and move by three vibrating legs. A fourth vibrating leg is used as a touch sensor. As the researchers explain, a single microrobot by itself is a physically simple individual. But many robots communicating with each other using infrared sensors and interacting with their environment can form a group that is capable of establishing swarm intelligence to generate more complex behavior. The framework for this project, called I-SWARM (intelligent small-world autonomous robots for micro-manipulation) is inspired by the behavior of biological insects.

Researchers are working towards mass-producing a bunch of tiny, flea-sized robots to do our bidding. Bring on the minuscule robot helpers!

I-SWARM robots could be used for everything from cleaning to surveillance to medicine to manufacturing. The trick to building useful little robots is to put the entire robot on a single circuit board.


By Adam Frucci

source : http://gizmodo.com/5347909/army-of-flea+sized-robots-to-do-our-bidding-in-the-future



A Touch of Ingenuity

Now that more and more smart phones and MP3 players have touch-screen interfaces, people have grown accustomed to interacting with gadgets using only taps and swipes of their fingers. But on the 11th floor of a downtown Manhattan building, New York University researchers Ilya Rosenberg and Ken Perlin are developing an interface that goes even further. It's a thin pad that responds precisely to pressure from not only a finger but a range of objects, such as a foot, a stylus, or a drumstick. And it can sense multiple inputs at once.
Ken Perlin (left), a professor of computer science at NYU, and Ilya Rosenberg, an NYU graduate student, show off the plastic sheets that are the starting point for their pressure-sensitive touch pads.

The idea for the pad occurred to Rosenberg, a graduate student at NYU, a few years ago when he was working with a conductive polymer called force-sensing resistor ink, which is often used in electronic music keyboards. When pressure is applied to the ink, its molecules reorient themselves in a way that alters its electrical resistance, which is easy to measure. Rosenberg originally used the ink to create sensors that could be embedded under tennis-court boundaries to automate line calls, but he wondered if it might be the basis of a good multitouch interface for computers. He began collaborating with Perlin, a professor in NYU's Media Research Laboratory, to make a pressure-sensitive touch pad to replace a computer mouse.

Pressure-sensitive pads have existed for years, but most have been limited to simple applications, such as sensing when a car seat is occupied. Devices like the Palm Pilot, which use a stylus to input data, typically detect touch by measuring changes in electrical resistance when an object taps the screen. But these screens can register only a single touch at a time. Touch screens on smart phones, meanwhile, use a sensor that detects changes in capacitance, or the material's ability to hold an electric charge; capacitance changes when objects containing water--including fingers--move across the screen. Such screens can sense multiple touches, but they can't detect pressure.

Rosenberg and Perlin's touch pad, by contrast, combines some advantages of all these technologies. It can simultaneously register the pressure and location of several touches, and it can be simply and inexpensively shrunk to the size of a pendant or scaled up to cover a tabletop.

Painted Plastic
To build a pressure-sensitive touch pad, Rosenberg starts with sheets of plastic slightly thicker than a piece of paper. He uses a special program to design a pattern of lines that will be printed on each sheet, tailoring the pattern to the device's intended use. The lines are laid down on the plastic in metal to make them electrically conductive; the sheet is then covered with an even coat of the black pressure-sensitive ink. In bulk, the printed sensors would cost about $100 per square meter, but since these letter-sized prototypes are one-offs, each one is about $100.

Rosenberg places two of the prepared sheets against each other with the polymer ink side facing in, orienting them so that the conductive lines create a grid. Then he sticks the sheets together with double-sided tape. Every sixth metal line terminates at one edge of the plastic sheets in a short, flexible tail that is connected to a rigid circuit board by a clamp. Though the rest of the wires are not connected to electronics, they influence the electrical characteristics of the active lines, which helps software infer where a touch is coming from.

The circuit board itself contains a microchip programmed to scan the sensor pad, supplying power to each active wire in quick succession. The chip also converts the pressure data from a continuous analog signal into a digital format that a computer can interpret. Finally, it compresses the data and sends it to a computer via a USB connection or (for musical applications) a MIDI port.

Software on the computer calculates both the position of objects that contact the pad and the amount of pressure they exert. If an object touches at the inter­section of two conductive lines, the electronics register a strong current there; but the farther away from the intersection it touches, the weaker the current, owing to the resistivity of the ink. Prototypes already have resolution high enough to accurately sense finger and stylus input for tablet PCs. For a single touch, it can record forces from five grams to five kilograms with a 2.5 percent margin of error--enough range to interpret the light tap of a stylus or a strike on a digital drum. Perlin says that because so few of the wires need to be powered, larger versions of the pad can achieve similar sensitivity without much more complexity or cost.
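
As a rough sketch of how such a read-out might work (an illustration of the weighted-centroid idea, not Touchco's actual firmware), the snippet below scans a small grid of made-up current readings and estimates the touch location and total pressure; the grid size and values are hypothetical.

```python
# Illustrative read-out sketch: currents measured at row/column crossings of a
# resistive grid are combined into a pressure-weighted centroid that estimates
# where the touch landed. The 6x6 grid and the readings are made up.
import numpy as np

readings = np.zeros((6, 6))          # current measured at each crossing
readings[2, 3] = 4.0                 # strong current near the touch point
readings[2, 4] = 1.5                 # weaker current farther from the crossing
readings[3, 3] = 1.2

total = readings.sum()               # total current serves as a pressure proxy
rows, cols = np.indices(readings.shape)
col = (cols * readings).sum() / total   # pressure-weighted centroid, column axis
row = (rows * readings).sum() / total   # pressure-weighted centroid, row axis
print(f"touch near column {col:.2f}, row {row:.2f}; pressure proxy {total:.1f}")
```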

Market Pressure
Today's prototypes are an opaque black, so they're unsuitable as touch-screen interfaces for cell phones and other electronic gadgets. But such a precise and inexpensive pressure-sensitive interface still has many potential uses, Perlin says.

For instance, Rosenberg and Perlin have collaborated with other researchers on several medical and scientific applications. Perlin says the pad could be added to shoes to monitor gait and to hospital beds to alert nurses when a patient has been still for too long, increasing the risk of pressure sores. The pad is even sensitive enough to measure pressure waves in water and air; this could lead to better fluid-dynamics models that might help with designing airplanes and boats. Today, researchers use arrays of individual sensors to collect such data, but they are too expensive to use over a large area.

The technology is also useful in multitouch interfaces for electronic devices. Patrick Baudisch, a researcher at the Hasso Plattner Institute in Germany, has integrated the pad onto the back of a small gaming gadget, effectively adding an ergonomic touch input: users can control the game without having their fingers block the screen. And Rosenberg believes that by using a different type of pressure-sensitive ink and making the lines thinner, he and his colleagues can build a transparent sensor usable in touch screens on mobile phones and tablet PCs.

Rosenberg and Perlin's touch pad is much more sensitive than other resistance-sensing devices, says Andy Wilson, a Microsoft researcher who developed Surface, a commercially available multitouch table. "Many of the applications focus on using the pressure sensor in interesting ways," he says. He adds, however, that the technology is still in its early stages, and it's difficult to say how much cheaper it will be than today's touch interfaces.

In April, Rosenberg and Perlin launched Touchco, a startup that will license the technology and provide design assistance to companies that want to build it into devices such as mobile phones and e-readers. The company's engineers are exploring additional applications--such as the first electronic hand drum, which would be impossible without a sensor capable of such fine resolution.

Eventually, these thin, unobtrusive touch pads could be built into virtually any surface, opening up a new dimension of multitouch interaction.


By Kate Greene

Human Cloning: The Controversy

Human cloning means creating a person who is identical in appearance, physique, brain, voice, and so on. It is the creation of a genetically identical copy of an existing or previously existing human. The term generally refers to artificial human cloning; human clones in the form of identical twins are commonplace, their cloning occurring during the natural process of reproduction.

Types of Human Cloning

Two types of human cloning are commonly discussed, with a third sometimes added:

1. Therapeutic cloning involves cloning cells from an adult for use in medicine and is an active area of research.

2. Reproductive cloning would involve making cloned human beings. Such reproductive cloning has not been performed and is illegal in many countries.

3. Replacement cloning is a theoretical third type, a combination of therapeutic and reproductive cloning. It would entail replacing an extensively damaged, failed, or failing body through cloning, followed by a whole or partial brain transplant.

Controversy over Human Cloning

The various forms of human cloning are controversial, and there have been numerous demands for all progress in the field to be halted. Some people and groups oppose therapeutic cloning, but most scientific, governmental and religious organizations oppose reproductive cloning. The American Association for the Advancement of Science (AAAS) and other scientific organizations have made public statements suggesting that human reproductive cloning be banned until safety issues are resolved. Serious ethical concerns have been raised by the idea that it might be possible in the future to harvest organs from clones. Some people have considered growing organs separately from a human organism; in this way, a new organ supply could be established without the moral implications of harvesting them from humans.

The First Human Hybrid Clone

The first human hybrid clone was created in November 1998 by Advanced Cell Technology. It was made from a man's leg cell and a cow's egg whose DNA had been removed, and it was destroyed after 12 days.


By Parmod Gusain

Quantum Cryptography for the Masses

Quantum cryptography could finally hit the mainstream thanks to a deal that will allow customers to adopt the technology without having to install dedicated optical fibers.

Quantum cryptography--a means of keeping secrets safe by using light particles to help scramble data--has been commercially available for several years. But it has been practical only for government agencies and large private-sector organizations that can afford the dedicated point-to-point optical fiber the technology requires. Under the new deal, struck between Siemens IT Solutions and Services in the Netherlands and Geneva, Switzerland-based id Quantique, any organization or individual wanting state-of-the-art data security will be able to buy the complete package of quantum cryptography and cable.

For the commercial development of quantum cryptography it's a significant step, says Seth Lloyd, an expert in the subject and a professor at MIT. "It makes it a lot more commercially viable. The fiber is by far the most expensive part," he says.

Light box: id Quantique's Cerberis quantum key distribution system (bottom) with two link encryption units (above) is now widely available over dark fiber networks.
Credit: id Quantique

Quantum cryptography is a method that seeks to solve the problem of how to securely send cryptographic keys between two parties by encoding them within light particles, or photons. It allows the parties to share a random--and so almost unbreakable--key without fear of third-party interception. If anyone does try to eavesdrop on the key exchange, the mere act of observing the photons changes them, making the attack detectable.
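
Commercial QKD products such as id Quantique's are typically based on protocols in the BB84 family. The toy simulation below shows the core idea (random bits sent in randomly chosen bases, with only the matching-basis positions kept); it is a simplification for intuition, not the commercial implementation.

```python
# Toy BB84-style key exchange: Alice sends random bits in random bases, Bob
# measures in random bases, and they keep only the positions where the bases
# happened to match. A simplification for intuition only.
import random

random.seed(1)
n = 32
alice_bits = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # two polarisation bases
bob_bases = [random.choice("+x") for _ in range(n)]

# When Bob guesses the wrong basis his result is random; an eavesdropper would
# introduce detectable errors even in the positions where the bases match.
bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted_alice = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]

print("shared key:", "".join(map(str, sifted_alice)))
print("keys match:", sifted_alice == sifted_bob)        # True without eavesdropping
```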

But for this quantum key distribution (QKD) to work, the same photons transmitted by one party have to be received by the other. This means that unlike most optical fiber data signals, which are periodically amplified by repeaters to boost the signal, quantum keys can only be sent through dedicated, unamplified, point-to-point fibers.

Telecom companies have spent the last few years installing precisely this kind of fiber, but for entirely different reasons, says Lloyd. Known as dark fiber, this is essentially extra capacity that has been laid in bulk to accommodate future growth.

Some companies lease this dark fiber for their own secure data connections, but for the most part it's just lying there waiting for deployment, says Andrew Shields, head of Toshiba Research Europe's Quantum Information Group in Cambridge, U.K. "For quantum key distribution, this is a godsend. There is all this dark fiber in the ground right now that's not being used."

In the new deal, Siemens SIS will offer id Quantique's QKD system over Siemens' existing dark fiber. "It's important from a commercial point of view that companies like Siemens, a global player, are showing an interest in this technology," says Grégoire Ribordy, co-founder and CEO of id Quantique. "There's potential to really accelerate commercial development."

Initially it will only be made available to Dutch customers, says Feike van der Werf, sales director of Siemens SIS, but in time may be deployed more widely. "I see this as the first step in the switch to quantum-based security," says Charlotte Rugers, a security consultant with Siemens SIS.

In essence, this deal means that for the first time QKD will be commercialized and marketed like standard IT services, says Ribordy. Dark fiber has become so prevalent that in some countries you have fiber direct to your home, he says. At the moment QKD is still not widely used, being deployed mainly by organizations that really care about security. But in theory this new deal means that even individuals could adopt the technology, "if you were really paranoid," he says.

This is an important step that should help bring QKD into the mainstream, says Shields. Previously customers were forced to source their own dark fiber, either through laying it themselves or getting a telecom to provide it, but this new deal allows them to buy the complete, scalable package. Although some bigger companies may have their own dark fiber, for smaller companies it would make it easier to adopt the technology, he says. "There are people out there using it but mostly it's to assess the capability, rather than using it to hide their secrets."

It will still be expensive. Besides the $82,000 price tag for a pair of id Quantique's QKD boxes, the cost of dark fiber remains high, because the customer will have to bear the cost of at least two fibers--one for the QKD and the other with which to send the encrypted data once keys have been exchanged. Normally, the cost of each fiber is offset by having dozens of customers share it, says Shields. But QKD customers will be unlikely to want to share their cables. "I think in the longer term we will need to see QKD integrated with normal telecom fibers." But for now this isn't possible, he says. Quantum signals are very weak and classical data signals are very strong, so there is a danger they will be drowned out. Once this problem has been solved, QKD should become even more attractive, he says.


By Duncan Graham-Rowe


Small Fluctuations In Solar Activity, Large Influence On Climate

ScienceDaily (Aug. 28, 2009) — Subtle connections between the 11-year solar cycle, the stratosphere, and the tropical Pacific Ocean work in sync to generate periodic weather patterns that affect much of the globe, according to research appearing this week in the journal Science. The study can help scientists get an edge on eventually predicting the intensity of certain climate phenomena, such as the Indian monsoon and tropical Pacific rainfall, years in advance.

An international team of scientists led by the National Center for Atmospheric Research (NCAR) used more than a century of weather observations and three powerful computer models to tackle one of the more difficult questions in meteorology: if the total energy that reaches Earth from the Sun varies by only 0.1 percent across the approximately 11-year solar cycle, how can such a small variation drive major changes in weather patterns on Earth?

The answer, according to the new study, has to do with the Sun's impact on two seemingly unrelated regions. Chemicals in the stratosphere and sea surface temperatures in the Pacific Ocean respond during solar maximum in a way that amplifies the Sun's influence on some aspects of air movement. This can intensify winds and rainfall, change sea surface temperatures and cloud cover over certain tropical and subtropical regions, and ultimately influence global weather.

"The Sun, the stratosphere, and the oceans are connected in ways that can influence events such as winter rainfall in North America," says NCAR scientist Gerald Meehl, the lead author. "Understanding the role of the solar cycle can provide added insight as scientists work toward predicting regional weather patterns for the next couple of decades."

The study was funded by the National Science Foundation, NCAR's sponsor, and by the Department of Energy. It builds on several recent papers by Meehl and colleagues exploring the link between the peaks in the solar cycle and events on Earth that resemble some aspects of La Nina events, but are distinct from them. The larger amplitude La Nina and El Nino patterns are associated with changes in surface pressure that together are known as the Southern Oscillation.

The connection between peaks in solar energy and cooler water in the equatorial Pacific was first discovered by Harry Van Loon of NCAR and Colorado Research Associates, who is a co-author of the new paper.

Top down and bottom up

The new contribution by Meehl and his colleagues establishes how two mechanisms that physically connect changes in solar output to fluctuations in the Earth's climate can work together to amplify the response in the tropical Pacific.

The team first confirmed a theory that the slight increase in solar energy during the peak production of sunspots is absorbed by stratospheric ozone. The energy warms the air in the stratosphere over the tropics, where sunlight is most intense, while also stimulating the production of additional ozone there that absorbs even more solar energy. Since the stratosphere warms unevenly, with the most pronounced warming occurring at lower latitudes, stratospheric winds are altered and, through a chain of interconnected processes, end up strengthening tropical precipitation.

At the same time, the increased sunlight at solar maximum causes a slight warming of ocean surface waters across the subtropical Pacific, where Sun-blocking clouds are normally scarce. That small amount of extra heat leads to more evaporation, producing additional water vapor. In turn, the moisture is carried by trade winds to the normally rainy areas of the western tropical Pacific, fueling heavier rains and reinforcing the effects of the stratospheric mechanism.

The top-down influence of the stratosphere and the bottom-up influence of the ocean work together to intensify this loop and strengthen the trade winds. As more sunshine hits drier areas, these changes reinforce each other, leading to fewer clouds in the subtropics, allowing even more sunlight to reach the surface, and producing a positive feedback loop that further magnifies the climate response.

These stratospheric and ocean responses during solar maximum keep the equatorial eastern Pacific even cooler and drier than usual, producing conditions similar to a La Nina event. However, the cooling of about 1-2 degrees Fahrenheit is focused farther east than in a typical La Nina, is only about half as strong, and is associated with different wind patterns in the stratosphere.

Earth's response to the solar cycle continues for a year or two following peak sunspot activity. The La Nina-like pattern triggered by the solar maximum tends to evolve into a pattern similar to El Nino as slow-moving currents replace the cool water over the eastern tropical Pacific with warmer water. The ocean response is only about half as strong as with El Nino and the lagged warmth is not as consistent as the La Nina-like pattern that occurs during peaks in the solar cycle.

Enhancing ocean cooling

Solar maximum could potentially enhance a true La Nina event or dampen a true El Nino event. The La Nina of 1988-89 occurred near the peak of solar maximum. That La Nina became unusually strong and was associated with significant changes in weather patterns, such as an unusually mild and dry winter in the southwestern United States.

The Indian monsoon, Pacific sea surface temperatures and precipitation, and other regional climate patterns are largely driven by rising and sinking air in Earth's tropics and subtropics. Therefore the new study could help scientists use solar-cycle predictions to estimate how that circulation, and the regional climate patterns related to it, might vary over the next decade or two.

Three views, one answer

To tease out the elusive mechanisms that connect the Sun and Earth, the study team needed three computer models that provided overlapping views of the climate system.

One model, which analyzed the interactions between sea surface temperatures and the lower atmosphere, produced a small cooling in the equatorial Pacific during solar maximum years. The second model, which simulated the stratospheric ozone response mechanism, produced some increases in tropical precipitation, but on a much smaller scale than the observed patterns.

The third model contained ocean-atmosphere interactions as well as ozone. It showed, for the first time, that the two combined to produce a response in the tropical Pacific during peak solar years that was close to actual observations.

"With the help of increased computing power and improved models, as well as observational discoveries, we are uncovering more of how the mechanisms combine to connect solar variability to our weather and climate," Meehl says.

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation.

First Complete Image of a Molecule, Atom by Atom

Researchers at IBM have used an atomic-force microscope to resolve the chemical structure of pentacene.

This image of pentacene, a molecule made up of five carbon rings, was made using an atomic-force microscope. Credit: Science/AAAS

Using an atomic-force microscope, scientists at IBM Research in Zurich have for the first time made an atomic-scale resolution image of a single molecule, the hydrocarbon pentacene.

Atomic-force microscopy works by scanning a surface with a tiny cantilever whose tip comes to a sharp nanoscale point. As it scans, the cantilever bounces up and down, and data from these movements is compiled to generate a picture of that surface. These microscopes can be used to "see" features much smaller than those visible under light microscopes, whose resolution is limited by the properties of light itself. Atomic-force microscopy literally has atom-scale resolution.

Still, until now, it hasn't been possible to use it to look with atomic resolution at single molecules. On such a scale, the electrical properties of the molecule under investigation normally interfere with the activity of the scanning tip. Researchers at IBM Research in Zurich overcame this problem by first using the microscope tip to pick up a single molecule of carbon monoxide. This drastically improved the resolution of the microscope, which the IBM scientists used to make an image of pentacene. They arrived at carbon monoxide as a contrast-enhancing addition after trying many chemicals.

The researchers hope that looking this closely at single molecules will give them a better understanding of chemical reactions and catalysis at an unprecedented level of detail.


source : http://www.technologyreview.com/blog/editors/24040/

Nitrous oxide fingered as monster ozone slayer

Most people know nitrous oxide as the laughing gas that dentists reserve for drill-phobic patients. But once it enters the atmosphere, N2O is no laughing matter. New calculations indicate that it has risen to become the leading threat to the future integrity of stratospheric ozone, Earth’s protective shield against the sun’s harmful ultraviolet rays.

Currently, Freon and other chlorofluorocarbons — or CFCs — are the leading source of ozone thinning, especially in the hole that forms annually over Antarctica. The surprise is not that N2O is also ozone-toxic. That’s been known for decades. What’s new is a measure of how its ozone-destroying potency compares to CFCs, specifically to one known as CFC-11.

Calculations by a trio of scientists from the National Oceanic and Atmospheric Administration in Boulder, Colo., now indicate that each molecule of N2O is almost one-fiftieth as effective at depleting ozone as is CFC-11.

Which may not sound like much — except it is, the NOAA scientists emphasize. Owing to its roughly 100-year survival time in the atmosphere (a lifespan comparable to CFCs) and the huge quantities released each year, N2O stands poised to become a potent player in the thinning of global stratospheric ozone. Indeed, “We found that if you look ahead, N2O will remain the largest ozone-depleting emissions for the rest of the century,” notes team leader A.R. Ravishankara.
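
A back-of-envelope comparison shows why per-molecule potency is not the whole story: what matters is the emission rate weighted by each gas's ozone-depleting potential (ODP). In the sketch below, only the roughly 1/50 potency ratio comes from the article; the emission tonnages are rough, illustrative round numbers.

```python
# Back-of-envelope ODP-weighted emissions. The ODP of N2O relative to CFC-11
# follows the article's roughly-one-fiftieth figure; the annual emission
# numbers are illustrative round values, not data from the paper.
odp = {"CFC-11": 1.0, "N2O": 1.0 / 50}

emissions_kt_per_yr = {
    "CFC-11": 50,      # CFC emissions have collapsed under the Montreal Protocol
    "N2O": 10_000,     # anthropogenic N2O emissions remain large and growing
}

for gas, kt in emissions_kt_per_yr.items():
    weighted = kt * odp[gas]
    print(f"{gas}: ~{weighted:.0f} kt CFC-11-equivalent per year")

# Even at ~1/50 the per-molecule potency, the sheer volume of N2O makes its
# ODP-weighted emissions the largest of any ozone-depleting gas emitted today.
```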

A paper describing the new analyses was posted online today in Science.

CFCs not only are very potent agents of ozone destruction, but also have been released in huge quantities and are long-lived. The bottom line: Once these pollutants enter Earth’s upper atmosphere, they linger, catalyzing damage for decades.

As such, they deserve most of the blame for the overall five- to six-percent thinning in stratospheric ozone that has developed in the past half-century or so, the NOAA scientists say. But owing to the 1987 Montreal Protocol, a United Nations treaty that has restricted or banned use of the most ozone-toxic chemicals, stratospheric ozone thinning has peaked and now appears to be falling.

NOAA calculations now suggest that gains made under the Montreal Protocol will slow or halt, owing to the huge and rising contributions of a pollutant that also imperils ozone — but remains ignored by the powerful treaty. Owing to a twist of fate, the treaty’s success in limiting CFC emissions will also begin intensifying N2O’s potency.

To understand why, Ravishankara says, it helps to know how CFCs and N2O damage ozone. Solar ultraviolet radiation breaks CFC molecules apart, creating chlorine and chlorine oxides. "These are what destroy ozone," he says, not the parent CFCs. Similarly, N2O doesn't directly damage ozone. Chemical reactions in the stratosphere must first strip away one of that molecule's nitrogen atoms — forming nitric oxide, or NO. This stripped down molecule, he explains, is what actually wreaks havoc with ozone.

“Nitrogen oxides and chlorine oxides kind of oppose each other in destroying stratospheric ozone,” the scientist explains. “In other words, N2O offsets the ability of chlorine oxides to destroy ozone. And vice versa.” In the new paper, he says, “We have calculated the ozone-depleting potential of N2O to be roughly 50 percent larger when chlorine levels return to the year-1960 level.”

As N2O pollution goes, dentists are bit players. Deforestation, animal wastes and bacterial decomposition of plant material in soils and streams emit up to two-thirds of atmospheric N2O.

However, emissions from such natural sources appear fairly static, Ravishankara said at a briefing yesterday. That’s in stark contrast to N2O releases from processes fostered by human activity, such as the nitrogen fertilization of agricultural soils and fossil-fuel combustion. These anthropogenic emissions of the pollutant have been growing steadily, he says, to where they now boost atmospheric concentrations of N2O by roughly one percent every four years.

What all this means, the NOAA scientists say, is that N2O is now a bigger threat to future stratospheric ozone destruction than are CFCs. And if N2O emissions don’t diminish substantially, Ravishankara says, within a century they could eventually slay 40 percent as much stratospheric ozone each year as CFCs did at their peak.

Reporters asked the NOAA team what can and should be done, but the scientists simply argued that finding answers was the responsibility of policymakers. However, Ravishankara observed that because most N2O releases are so diffuse, limiting them will prove much more challenging than simply mandating controls on tailpipes or smokestacks.

Yet success would yield a double whammy, notes coauthor John Daniel. The reason: N2O is also a greenhouse gas.


Web edition



'Plasmobot': Scientists To Design First Robot Using Mould

ScienceDaily (Aug. 27, 2009) — Scientists at the University of the West of England are to design the first ever biological robot using mould.

Researchers have received a Leverhulme Trust grant worth £228,000 to develop the amorphous non-silicon biological robot, plasmobot, using plasmodium, the vegetative stage of the slime mould Physarum polycephalum, a commonly occurring mould which lives in forests, gardens and most damp places in the UK. The Leverhulme Trust-funded research project aims to design the first ever fully biological (no silicon components) amorphous massively parallel robot.

This project is at the forefront of research into unconventional computing. Professor Andy Adamatzky, who is leading the project, says their previous research has already demonstrated the mould's computational abilities.

Professor Adamatzky explains, “Most people’s idea of a computer is a piece of hardware with software designed to carry out specific tasks. This mould, or plasmodium, is a naturally occurring substance with its own embedded intelligence. It propagates and searches for sources of nutrients and when it finds such sources it branches out in a series of veins of protoplasm. The plasmodium is capable of solving complex computational tasks, such as the shortest path between points and other logical calculations. Through previous experiments we have already demonstrated the ability of this mould to transport objects. By feeding it oat flakes, it grows tubes which oscillate and make it move in a certain direction carrying objects with it. We can also use light or chemical stimuli to make it grow in a certain direction.

“This new plasmodium robot, called plasmobot, will sense objects, span them in the shortest and best way possible, and transport tiny objects along pre-programmed directions. The robots will have parallel inputs and outputs, a network of sensors and the number crunching power of super computers. The plasmobot will be controlled by spatial gradients of light, electro-magnetic fields and the characteristics of the substrate on which it is placed. It will be a fully controllable and programmable amorphous intelligent robot with an embedded massively parallel computer.”
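
The shortest-path task mentioned above is a classic graph problem. For comparison, a conventional computer solves it with an algorithm such as Dijkstra's, sketched below on a small hypothetical network; this is context on the computational task, not a model of the plasmodium itself.

```python
# Dijkstra's shortest-path algorithm on a small hypothetical weighted graph,
# shown only as the conventional-computing counterpart of the task the
# plasmodium solves by growing protoplasmic tubes between nutrient sources.
import heapq

graph = {                             # hypothetical weighted adjacency list
    "A": {"B": 2, "C": 5},
    "B": {"C": 1, "D": 4},
    "C": {"D": 1},
    "D": {},
}

def dijkstra(graph, start):
    dist = {node: float("inf") for node in graph}
    dist[start] = 0
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist[node]:
            continue                  # stale queue entry, already improved
        for neighbour, weight in graph[node].items():
            if d + weight < dist[neighbour]:
                dist[neighbour] = d + weight
                heapq.heappush(queue, (d + weight, neighbour))
    return dist

print(dijkstra(graph, "A"))           # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
```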

This research will lay the groundwork for further investigations into the ways in which this mould can be harnessed for its powerful computational abilities.

Professor Adamatzky says that there are long term potential benefits from harnessing this power, “We are at the very early stages of our understanding of how the potential of the plasmodium can be applied, but in years to come we may be able to use the ability of the mould for example to deliver a small quantity of a chemical substance to a target, using light to help to propel it, or the movement could be used to help assemble micro-components of machines. In the very distant future we may be able to harness the power of plasmodia within the human body, for example to enable drugs to be delivered to certain parts of the human body. It might also be possible for thousands of tiny computers made of plasmodia to live on our skin and carry out routine tasks freeing up our brain for other things. Many scientists see this as a potential development of amorphous computing, but it is purely theoretical at the moment.”

Professor Adamatzky recently edited ‘Artificial Life Models in Hardware’, published by Springer and aimed at students and researchers of robotics. The book focuses on the design and real-world implementation of artificial life robotic devices and covers a range of hopping, climbing and swimming robots, neural networks, and slime mould and chemical brains.


Mitochondrial DNA replacement successful in Rhesus monkeys

Scientists may have found a way to prevent the transfer of serious inherited mitochondrial diseases from mother to child. By shuttling DNA from an egg cell to a donor cell, the technique enabled the birth of four healthy Rhesus monkey males, researchers report online August 26 in Nature.

“We consider this a big achievement,” study coauthor Shoukhrat Mitalipov of the Oregon National Primate Research Center in Beaverton said in a news briefing August 25. “We believe that the technique can be applied very quickly to humans, and we believe it will work.”

Mitochondria, power-producing organelles in cells, carry their own DNA, distinct from the DNA held in cells’ nuclei. Healthy or otherwise, mitochondrial DNA is passed from mother to child. In recent years, researchers have identified more than 150 harmful mutations in mitochondrial DNA, some of which can cause serious and debilitating diseases (SN: 2/28/09, p. 20).

“This whole field of mitochondria medicine is very new,” says Douglas Wallace, a mitochondria expert at the University of California, Irvine. “It is a very important problem that’s been pretty much ignored. It affects lots of people, but we have very little to offer them.” Some estimates report that 1 in 6,000 people may have inherited a mitochondrial DNA disorder. Other estimates put the number higher, Wallace says.

A single cell can have thousands of copies of mitochondrial DNA. Usually, all of these copies are the same, healthy type, but occasionally a cell can have a mixture of normal and mutant mitochondrial DNA, a condition called heteroplasmy.

Heteroplasmy in an egg cell makes it nearly impossible to determine if a baby is going to inherit a severe mitochondrial disease, says Jo Poulton of the University of Oxford in England. “You can get quite a big range of how much mitochondrial DNA is transferred to children. There’s been a lot of debate on whether you can or can’t do genetic counseling” for these women, she says.

To get around the guesswork surrounding inherited mitochondrial diseases, the researchers took the mother’s mitochondrial DNA completely out of the picture. In the new work, researchers identified nuclear DNA in a mother’s egg cell by the DNA’s attachment to structures called spindles. Researchers removed the nuclear DNA (leaving the original mitochondrial DNA behind) and then put it into different egg cells lacking nuclear DNA but replete with healthy donor mitochondrial DNA. With the help of an inactive virus, the nuclear DNA fused into the donor cells.

Next, these modified egg cells were fertilized with donor sperm and implanted into Rhesus females to develop. Male twins, named Mito and Tracker, were born healthy, followed later by two more individual males, named Spindler and Spindy, from different mothers.

Researchers found no traces of the original egg cell’s mitochondrial DNA in the offspring, indicating that the process successfully prevented its transfer.

“I’m very pleased to see that, in nonhuman primates, this paper shows conceptually that this system might work,” Wallace comments.

The researchers used Rhesus monkey mothers with no mitochondrial DNA mutations because there are no established primate groups with such mutations, Mitalipov says. However, genetic signatures from the two cell groups—the mother and the egg donor—were distinct enough to tell apart easily, he adds.

Mitalipov and colleagues plan to monitor the monkeys as they age. “We’d like to see the growth and development of the offspring, to see if they have any abnormalities,” he says. Seeing how the mitochondrial DNA fares in subsequent generations will also be important, he adds. Doing that, though, will require female offspring, since mitochondria are passed on maternally. Mitalipov says they are now trying for a girl.

Many of the techniques used in the experiment are already in use at human fertility clinics, Mitalipov says, although the procedure as a whole will need to gain approval from the Food and Drug Administration before being used. The procedure, he says, “is offering real treatments to real patients.”

Others remain cautious. Wallace says that the virus used to integrate the nuclear DNA into a donor cell would need to be scrutinized for safety. The virus is “something FDA would look at very carefully,” before approving the procedure, he says. What’s more, no one knows what will happen as the monkeys get older. “They look normal now, but who knows what they’ll look like as they age,” Wallace says.

Poulton points out that techniques like this performed in animals might not reveal subtle defects, such as the mild hearing loss associated with some mitochondrial DNA mutations. “Doing it in an animal is quite a long way off from doing it in humans,” she says.


Web edition

First Gene-encoded Amphibian Toxin Isolated

ScienceDaily (Aug. 26, 2009) — Researchers in China have discovered the first protein-based toxin in an amphibian – a 60-amino-acid neurotoxin found in the skin of a Chinese tree frog. This finding may help shed more light on both the evolution of amphibians and the evolution of poison.

While gene-encoded protein toxins have been identified in many vertebrate animals, including fish, reptiles and mammals, none have yet been found in amphibians or birds. In the case of poisonous amphibians, like the tropical poison dart frogs, their toxins are usually small chemicals like alkaloids that are extracted from insects and secreted onto the animal's skin.

Therefore, Ren Lai and colleagues were surprised to find a protein toxin while examining the secretions of the tree frog Hyla annectans. They then purified and characterized this new toxin, which they called anntoxin.

In protein sequence and structure, anntoxin was very similar to dendrotoxins (the venoms found in mambas and related snakes) and cone snail toxins, though anntoxin has only two disulfide bridges (a strong link that helps keep proteins folded) compared to three in the other types. The slight differences may account for why anntoxin does not block potassium channels as the other venoms do, but rather sodium channels important for signaling in sensory nerves.

Like these other venoms, though, anntoxin is fast-acting and potent; the researchers found it could produce rapid convulsions, paralysis and respiratory distress in several would-be predators like snakes and birds.

The similarities and differences make anntoxin a very valuable protein for further study, considering amphibians' special niche as the animals bridging the evolutionary land-water gap.


Kamikaze Planet

Astronomers have found a giant planet orbiting so close to its parent star that it's bound to spiral inward to its doom or else be ripped to shreds by the star's gravity. Either way, the planet, called WASP-18b, should provide astronomers with a mother lode of data about the delicate gravitational balancing act that affects all solar systems.

Stars and their planets can influence one another in subtle but inevitable ways. Even though Jupiter orbits roughly 780 million kilometers from our sun, for example, the planet's huge mass can shift the sun's center of gravity by more than 10,000 kilometers, enough to possibly reveal the planet's presence to alien astronomers. The moon's pull on Earth raises and lowers sea levels in the form of tides. And conversely, Earth's gravity has slowed the moon's rotation so that it always presents the same face to its parent planet.
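
As a rough sanity check on that center-of-gravity figure, the short Python sketch below works through the two-body barycenter arithmetic. The orbital distance and Sun-to-Jupiter mass ratio are approximate textbook values assumed for illustration, not numbers taken from the article.

# Minimal sketch: how far Jupiter pulls the Sun's center of gravity
# away from the Sun's own center. Approximate values, for illustration only.
a_jupiter_km = 7.8e8        # Jupiter's mean distance from the Sun, ~780 million km
mass_ratio = 1.0 / 1047.0   # Jupiter's mass as a fraction of the Sun's

# Two-body barycenter: offset = a * m_J / (M_sun + m_J)
offset_km = a_jupiter_km * mass_ratio / (1.0 + mass_ratio)
print(f"Barycenter offset from the Sun's center: about {offset_km:,.0f} km")
# Roughly 740,000 km: comfortably "more than 10,000 kilometers," and in fact
# slightly larger than the Sun's ~696,000 km radius.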

Now researchers have found an extreme version of tidal forces at work. WASP-18b is roughly 10 times the mass of Jupiter, and it orbits only about 3 million kilometers from its star. WASP-18b, which was discovered with twin telescopes located in the Canary Islands and South Africa, is located about 400 light-years away in the constellation Phoenix.

WASP-18b and its star are so close to each other that it takes the planet, which is described in tomorrow's issue of Nature, less than a day to complete a revolution. As it orbits, WASP-18b drags bulges tens of kilometers high around the star's equator, making it spin faster. Those same forces also cost the planet its angular momentum, pulling it inward, eventually to the point where it will smash into the star's surface. Or perhaps it will be shredded to form a Saturn-like ring system. In either case, WASP-18b appears doomed to die within half a million years.
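
The orbital period quoted above can likewise be checked against Kepler's third law. The Python sketch below assumes a star of roughly one solar mass and takes the 3-million-kilometer separation given in the article; WASP-18 itself is somewhat more massive than the Sun, which would shorten the computed period a little.

import math

# Minimal Kepler's-third-law check for WASP-18b. The stellar mass here
# (~1 solar mass) is an assumption for illustration.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_star = 1.99e30  # assumed stellar mass, kg
a = 3.0e9         # orbital separation from the article, ~3 million km in meters

# P = 2 * pi * sqrt(a^3 / (G * M))
period_days = 2.0 * math.pi * math.sqrt(a**3 / (G * M_star)) / 86400.0
print(f"Orbital period: about {period_days:.2f} days")
# Comes out near one day; with the star's true, larger mass the period drops
# to just under a day, consistent with the article.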

Or maybe not. Astronomer and co-author Andrew Collier Cameron of the University of St. Andrews in Fife, U.K., says there could be a third alternative: Some as-yet-unknown manifestation of tidal forces could prolong the planet's life by as much as 500 million years. That's because the close and fierce interaction between two large and essentially gaseous bodies could produce turbulence or other effects that could somehow cancel out WASP-18b's orbital deterioration. Astronomers may know whether that's happening within a decade or so, Collier Cameron says. By then, they will have collected enough data to determine whether WASP-18b's orbit has degraded and by how much. If WASP-18b somehow escapes imminent destruction by tidal forces, then astronomers will know that "our understanding of orbital dynamics, particularly tidal interactions, needs revision," says lead author Coel Hellier of Keele University in the United Kingdom.

"It's the find of a lifetime," says astronomer Douglas Hamilton of the University of Maryland, College Park. "The beauty of this discovery is that we will be able to know [how WASP-18b will meet its end] within 5 to 10 years," Hamilton says. "I think that it could go either way--which makes it very exciting."


By Phil Berardelli
ScienceNOW Daily News
26 August 2009


Lower-cost Solar Cells To Be Printed Like Newspaper, Painted On Rooftops

ScienceDaily (Aug. 25, 2009) — Solar cells could soon be produced more cheaply using nanoparticle “inks” that allow them to be printed like newspaper or painted onto building sides or rooftops to absorb sunlight and generate electricity.



Brian Korgel, a University of Texas at Austin chemical engineer, is hoping to cut costs to one-tenth of their current price by replacing the standard manufacturing process for solar cells – gas-phase deposition in a vacuum chamber, which requires high temperatures and is relatively expensive.

“That’s essentially what’s needed to make solar-cell technology and photovoltaics widely adopted,” Korgel said. “The sun provides a nearly unlimited energy resource, but existing solar energy harvesting technologies are prohibitively expensive and cannot compete with fossil fuels.”

For the past two years, Korgel and his team have been working on this low-cost, nanomaterials-based approach to photovoltaic (solar-cell) manufacturing. Korgel is collaborating with professors Al Bard and Paul Barbara, both of the Department of Chemistry and Biochemistry, and Professor Ananth Dodabalapur of the Electrical and Computer Engineering Department. They recently demonstrated proof of concept in the Journal of the American Chemical Society.

The inks could be printed in a roll-to-roll process on a plastic or stainless-steel substrate. And the prospect of being able to paint the “inks” onto a rooftop or building is not far-fetched.

“You’d have to paint the light-absorbing material and a few other layers as well,” Korgel said. “This is one step in the direction towards paintable solar cells.”

Korgel uses light-absorbing nanomaterials, which are 10,000 times thinner than a strand of hair, because their microscopic size gives rise to new physical properties that can help enable higher-efficiency devices.

In 2002, he co-founded a company called Innovalight, based in California, which produces silicon-based inks. This time, Korgel and his team are using copper indium gallium selenide, or CIGS, which is both cheaper and environmentally benign.

“CIGS has some potential advantages over silicon,” Korgel said. “It’s a direct band gap semiconductor, which means that you need much less material to make a solar cell, and that’s one of the biggest potential advantages.”

His team has developed solar-cell prototypes with efficiencies of about one percent; however, they need to reach about 10 percent.

“If we get to 10 percent, then there’s real potential for commercialization,” Korgel said. “If it works, I think you could see it being used in three to five years.”

He also said that the inks, which are semi-transparent, could help realize the prospect of having windows that double as solar cells. Korgel said his work has attracted the interest of industrial partners.

Funding for the research comes from the National Science Foundation, the Welch Foundation and the Air Force Research Laboratory.

Vitamin D may be heart protective

Vitamin D deficiency may exacerbate the excess heart disease risk that people with type 2 diabetes face, a new study in the Aug. 25 Circulation suggests. In lab tests, researchers demonstrate that immune cells with very low vitamin D levels turn into soggy, cholesterol-filled baggage that can become building blocks of arterial plaques.

Carlos Bernal-Mizrachi, an endocrinologist at Washington University School of Medicine in St. Louis, and his colleagues found that people with diabetes seem more susceptible than nondiabetics to the negative cardiovascular effects attributable to a vitamin D shortage. Larger studies may clarify whether the shortage’s effects extend to nondiabetics, Bernal-Mizrachi says.

Previous studies have tied vitamin D deficiency to cardiovascular disease risk, but the cell biology underpinning this link has been gauzy. “Now we’re figuring out the mechanisms behind how this works,” Bernal-Mizrachi says.

The team tested blood samples from 76 obese people, average age 55, who had high blood pressure, type 2 diabetes and low vitamin D levels. As a comparison, the scientists tested blood from 15 similar people who had normal vitamin D levels and another 45 people with normal blood pressure.

From these blood samples, the scientists cultured immune cells called macrophages and exposed the cells to an oxidized form of LDL cholesterol (the bad kind). Because macrophages are immune cleanup crews that normally snag and engulf LDL molecules, this test mimicked the goings-on in the walls of a blood vessel.

But in these tests, macrophages from the type 2 diabetes patients were more likely to absorb LDL cholesterol in excess when they were cultured without vitamin D than when they were bathed in vitamin D.

Macrophages low on vitamin D become indiscriminate devourers, gobbling up too much LDL and transforming into foam cells, core constituents of arterial plaques, Bernal-Mizrachi says.

Foam cells cluster with other debris along the sides of blood vessels, eventually forming a fibrous, collagen cap. If that cap becomes unstable, the plaque ruptures and a blood clot forms — a recipe for a heart attack or stroke.

Macrophages from people without diabetes were much less affected by the presence of vitamin D in this study.

In further experiments, the researchers demonstrated that vitamin D protects the endoplasmic reticulum — an organelle that governs a host of cell functions — from stress in diabetic people. Less stress means the macrophages will display fewer receptor proteins that scavenge oxidized LDL cholesterol. “Vitamin D regulates these receptors,” Bernal-Mizrachi says, and that limits how much LDL cholesterol the macrophage can gobble up.

“The central observation here is that, with vitamin D deficiency, diabetic people have more endoplasmic reticulum stress,” says Alan Tall, an internist at Columbia University. That stress results in the snaring of more LDL cholesterol.

Bernal-Mizrachi and his colleagues also show that this stress triggers an inflammatory response. In an artery that has already formed plaques, this can be deadly, says Tall, because some inflammatory proteins degrade collagen. “Inflammation is linked to cap breakdown,” he says.

Until more data are available, people with diabetes might think about their vitamin D levels, Bernal-Mizrachi says. Vitamin D can be built up from supplements and short sun exposures, he says, but avoiding too much is “a matter of moderation.”


Web edition


Multitasking Muddles the Mind?

Bad news for people who like to text their BFFs while surfing the Web for some new shoes and watching the latest episode of Project Runway. Scientists who've conducted what they say is the first-ever study of chronic multitaskers found that cognitive performance declines when people try to pay attention to many media channels at once.

Although media multitasking has become more and more prevalent, no one knows how chronic media immersion affects cognitive functioning. So a team headed by psychologist Eyal Ophir of Stanford University in Palo Alto, California, identified 19 "heavy media multitaskers" (HMMs) and 22 "light media multitaskers" (LMMs) among a group of students based on how often they reported simultaneously using media such as television, cell phones, computer games, and videos.

The researchers then gave subjects in the two groups tests to see how well they could sift relevant information from the environment, filter out irrelevant information in their memories, and quickly switch between cognitive tasks. One filtering test, for example, required viewers to note changes in red rectangles while ignoring changes in blue rectangles in the same pictures. In the task-switching experiment, participants were presented with images of paired numbers and letters and had to switch back and forth between classifying the numbers as even or odd and classifying the letters as vowels or consonants.

The HMMs did worse than the LMMs across the board. Surprisingly, says co-author and sociologist Clifford Nass, "They're bad at every cognitive control task necessary for multitasking." They were more easily distracted by irrelevant stimuli, and although their memories were no worse than those of the LMMs, they had more difficulty selecting stored information that was relevant to the task at hand. In one filtering test, for example, the LMMs took 323 milliseconds to discern the correct answer, whereas the HMMs averaged 400 milliseconds.

Nass says the study has a disturbing implication in an age when more and more people are simultaneously working on a computer, listening to music, surfing the Web, texting, or talking on the phone: Access to more information tools is not necessarily making people more efficient in their intellectual chores. Also disconcerting, he notes, is that "people who chronically multitask believe they're good at it." The findings are reported this week in the Proceedings of the National Academy of Sciences.

Daphne Bavelier, a cognitive scientist at the University of Rochester in New York state, says the research presents a puzzle about the brain's ability to learn from experience. Bavelier has discovered that people who play action video games get better at the kind of task-switching those games require. In contrast, Bavelier says, the poor performance of multitaskers in the new study suggests that more experience doesn't always translate to improved performance.

It's still not clear, however, that multitasking really scrambles the brain. It's also possible that people with poor filtering and attentional abilities are more prone to multitasking to begin with. Anthony Wagner, a psychologist in the Stanford group, says he suspects that constant jumping among different media offers instant rewards that reinforce "exploratory" behavior at the expense of the ability to concentrate on a particular task.

By Constance Holden
ScienceNOW Daily News
25 August 2009


Biotech Bacteria Could Help Diabetics

Friendly gut microbes that have been engineered to make a specific protein can help regulate blood sugar in diabetic mice, according to preliminary research presented last week at the American Chemical Society conference in Washington, D.C. While the research is still in the very early stages, the microbes, which could be grown in yogurt, might one day provide an alternative treatment for people with diabetes.

The research represents a new take on probiotics: age-old supplements composed of nonharmful bacteria, such as those found in yogurt, that are ingested to promote health. Thanks to a growing understanding of these microbes, a handful of scientists are attempting to engineer them to alleviate specific ailments. "The concept of using bacteria to help perform (or fix) human disorders is extremely creative and interesting," wrote Kelvin Lee, a chemical engineer at the University of Delaware, in an e-mail. "Even if it does not directly lead to a solution to the question of diabetes, it opens up new avenues of thought in a more general sense," says Lee, who was not involved in the research.

Engineering edible bacteria: Researchers engineered friendly bacteria (dots in the bottom half of the image) to produce a protein that triggers intestinal epithelial cells (top, highlighted in blue) to produce insulin.

People with type 1 diabetes lack the ability to make insulin, a hormone that triggers muscle and liver cells to take up glucose and store it for energy. John March, a biochemical engineer at Cornell University, in Ithaca, NY, and his collaborators decided to re-create this essential circuit using the existing signaling system between the epithelial cells lining the intestine and the millions of healthy bacteria that normally reside in the gut. These epithelial cells absorb nutrients from food, protect tissue from harmful bacteria, and listen for molecular signals from helpful bacteria. "If they are already signaling to one another, why not signal something we want?" asks March.

The researchers created a strain of nonpathogenic E. coli bacteria that produce a protein called GLP-1. In healthy people, this protein triggers cells in the pancreas to make insulin. Last year, March and his collaborators showed that engineered bacterial cells secreting the protein could trigger human intestinal cells in a dish to produce insulin in response to glucose. (It's not yet clear why the protein has this effect.)

In the new research, researchers fed the bacteria to diabetic mice. "After 80 days, the mice [went] from being diabetic to having normal glucose blood levels," says March. Diabetic mice that were not fed the engineered bacteria still had high blood sugar levels. "The promise, in short, is that a diabetic could eat yogurt or drink a smoothie as glucose-responsive insulin therapy rather than relying on insulin injections," says Kristala Jones Prather, a biochemical engineer at MIT, who was not involved in the research.

Creating bacteria that produce the protein has a number of advantages over using the protein itself as the treatment. "The bacteria can secrete just the right amount of the protein in response to conditions in the host," says March. That could ultimately "minimize the need for self-monitoring and allow the patient's own cells (or the cells of the commensal E. coli) to provide the appropriate amount of insulin when needed," says Cynthia Collins, a bioengineer at Rensselaer Polytechnic Institute, in Troy, NY, who was not involved in the research.

In addition, producing the protein where it's needed overcomes some of the problems with protein-based drugs, which can be expensive to make and often degrade during digestion. "Purifying the protein and then getting past the gut is very expensive," says March. "Probiotics are cheap--less than a dollar per dose." In underprivileged settings, they could be cultured in yogurt and distributed around a village.

The researchers haven't yet studied the animals' guts, so they don't know exactly how or where the diabetic mice are producing insulin. It's also not yet clear if the treatment, which is presumably triggering intestinal cells to produce insulin, has any harmful effects, such as an overproduction of the hormone or perhaps an inhibition of the normal function of the epithelial cells. "The mice seem to have normal blood glucose levels at this point, and their weight is normal," says March. "If they stopped eating, we would be concerned."

March's microbes are one of a number of new strains being developed to treat disease, including bacteria designed to fight cavities, produce vitamins and treat lactose intolerance. March's group is also engineering a strain of E. coli designed to prevent cholera. Cholera prevention "needs to be something cheap and easy and readily passed from village to village, so why not use something that can be mixed in with food and grown for free?" says March.

However, the work is still in its early stages; using living organisms as therapies is likely to present unique challenges. More research is needed to determine how long these bacteria can persist in the gut, as well as whether altering the gut flora has harmful effects, says MIT's Prather.

In addition, recent research shows that different people harbor different colonies of gut bacteria, and it's unclear how these variations might affect bacterial treatments. "This may be particularly challenging when it comes to determining the appropriate dose of the therapeutic microbe," says Collins at Rensselaer. "The size of the population of therapeutic bacteria and how long it persists will likely depend on the microbes in an individual's gut."


By Emily Singer

An Operating System for the Cloud

From early in their company's history, Google's founders, Larry Page and Sergey Brin, wanted to develop a computer operating system and browser.

They believed it would help make personal computing less expensive, because Google would give away the software free of charge. They wanted to shrug off 20 years of accumulated software history (what the information technology industry calls the "legacy") by building an OS and browser from scratch. Finally, they hoped the combined technology would be an alternative to Microsoft Windows and Internet Explorer, providing a new platform for developers to write Web applications and unleashing the creativity of programmers for the benefit of the masses.

But despite the sublimity of their aspirations, Eric Schmidt, Google's chief executive, said no for six years. Google's main source of revenue, which reached $5.5 billion in its most recent quarter, is advertising. How would the project they envisioned support the company's advertising business? The question wasn't whether Google could afford it. The company is wonderfully profitable and is on track to net more than $5 billion in its current fiscal year. But Schmidt, a 20-year veteran of the IT industry, wasn't keen on shouldering the considerable costs of creating and maintaining an OS and browser for no obvious return.

Finally, two years ago, Schmidt said yes to the browser. The rationale was that quicker and more frequent Web access would mean more searches, which would translate into more revenue from ads. Then, in July of this year, Schmidt announced Google's intention to launch an operating system as well. The idea is that an OS developed with the Internet in mind will also increase the volume of Web activity, and support the browser.

Google's browser and OS both bear the name Chrome. At a year old, the browser holds a mere 2 to 3 percent share of a contested global market, in which Microsoft's Internet Explorer has a majority share and Firefox comes in second. The Chrome operating system will be released next year. Today, Windows enjoys around 90 percent of the global market for operating systems, followed by Apple's Mac OS and the freeware Linux. Does Google know what it's doing?

Ritualized Suicide
Going after Microsoft's operating system used to be hopeless. When I covered the company for the Wall Street Journal in the 1990s, I chronicled one failed attempt after another by software innovators to wrest control of the field from Bill Gates. IBM failed. Sun failed. Borland. Everybody. By the end of the 1990s, the quest had become a kind of ritualized suicide for software companies. Irresistible forces seemed to compel Gates's rivals, driving them toward self-destruction.

The networking company Novell, which Schmidt once ran, could have been one of these casualties. Perhaps Schmidt's managerial experience and intellectual engagement with computer code immunized him against the OS bug. In any case, he knew that the task of dislodging Microsoft was bigger than creating a better OS. While others misguidedly focused on the many engineering shortcomings of Windows, Schmidt knew that Microsoft was the leader not for technical reasons but for business ones, such as pricing practices and synergies between its popular office applications and Windows.

So for Schmidt to finally agree to develop an OS suggests less a technological shift than a business revolution. Google's new ventures "are game changers," he now says.

What has changed? Google has challenged the Microsoft franchise, further diminishing a declining force. The latest quarter gave Microsoft the worst year in its history. Revenue from its various Windows PC programs, including operating systems, fell 29 percent in the fiscal quarter that ended in June. Some of the decline stems from the global economic slowdown. But broad shifts in information technology are also reducing the importance of the personal computer and its central piece of software, the OS. In many parts of the world, including the two most populous countries, China and India, mobile phones are increasingly the most common means of reaching the Web. And in the rich world, netbooks, which are ideal for Web surfing, e-mailing, and Twittering, account for one in every 10 computers sold.

Another powerful trend that undercuts Microsoft is toward programs that look and function the same way in any operating system. "Over the past five years there's been a steady move away from Windows-specific to applications being OS-neutral," says Michael Silver, a software analyst at the research firm Gartner.

One example would be Adobe Flash. Such popular social applications as Facebook and Twitter are also indifferent to operating systems, offering users much the same experience no matter what personal computer or handheld device they use. Since so many people live in their social-media sites, the look and feel of these sites has become at least as important as the user interface of the OS. The effect is to shrink the role of the OS, from conductor of the orchestra to merely one of its soloists. "The traditional operating system is becoming less and less important," says Paul Maritz, chief executive of VMware, who was once the Microsoft executive in charge of the operating system. By and large, he has noted, "people are no longer writing traditional Windows applications."

Microsoft's troubles make the company's OS doubly vulnerable. Vista, its current version, has been roundly criticized, and it has never caught on as widely as the company anticipated; many Microsoft customers continue to use the previous version of Windows, XP. A new version being released this fall, Windows 7, promises to remedy the worst problems of Vista. But even 7 may not address a set of technical issues that both galvanize Microsoft's critics and stoke the appetites of Brin and Page to create a more pleasing alternative. In their view, the Microsoft OS takes too long to boot up, and it slows down even the newest hardware. It is too prone to viral attacks and too complicated.

Exactly how Google plans to solve these problems is still something of a mystery. Technical details aren't available. Google has said so little about the innards of its forthcoming OS that it qualifies as "a textbook example of vaporware," wrote John Gruber on his blog Daring Fireball. Information is scarce about even such basic things as whether it will have a new user interface or rely on an existing open-source one, and whether it will support the drivers that make printers and other peripherals routinely work with Windows PCs.

The mere announcement of Chrome already threatens Microsoft, however. The imminence of Google's entry into the market--following the delivery of its Android OS for mobile phones--gives Microsoft's corporate customers a reason to ask for lower prices. After all, Google's OS will be free, and the buyers of Windows are chiefly PC makers, whose profit margins are already ultra-slim.

"It's all upside for Google and no downside," says Mitchell Kapor, a software investor and the founder of Lotus, a pioneer supplier of PC applications that was bloodied by Microsoft in the 1990s.

Legacy Code
Fifteen years ago, I wrote a book on the making of Windows NT--still the foundation of Microsoft's OS family. At the time, I wrongly concluded that developing the dominant operating system was proof of technological power, akin to building the greatest fleet of battleships in the early 20th century, or the pyramids long ago. Windows NT required hundreds of engineers, tens of millions of development dollars, and a huge marketing effort. By the mid-1990s, Microsoft was emphasizing features over function, complexity over simplicity.

In doing so, Microsoft and its cofounder, Bill Gates, seemed to be fulfilling the company's historical destiny. The operating system as a technological showpiece goes back to OS/360, a program designed by IBM that was immortalized in The Mythical Man-Month, a book by the engineer Frederick Brooks. The historian Thomas Haigh explains, "That was a huge scaling up of ambition of what the OS was for."

IBM's 360 mainframe was the first computer to gain widespread acceptance in business, and the popularity of the machine, first sold in 1965, depended as much on its software as its hardware. When IBM used Microsoft's DOS as the operating system for its first PC, introduced in 1981, it was the first time Big Blue had gone outside its own walls for a central piece of code. Soon, technologists (including, belatedly, IBM) realized that control of the OS had given Microsoft control of the PC. IBM tried and failed to regain that control with a program called OS/2. But Microsoft triumphed with Windows in the 1990s--and became the most profitable company on earth, turning Gates into the world's richest person. Thus, the OS came to be viewed as the ultimate technological product, a platform seemingly protean enough to incorporate and control every future software innovation and at the same time robust enough to drag outdated PC machines and programs into the present.

It couldn't last. The main reason why control of the OS no longer guarantees technological power, of course, is the ascent of the Internet. Gates made few references to the Internet in the first edition of his book The Road Ahead, published in November 1995. Neither Windows NT nor its mass-market incarnation, Windows 95, was intimately connected to the Web. With the spread of Netscape's browser, though, Gates began to realize that the individual PC and its operating system would have to coöperate with the public information network. By bringing a browser into the OS and thus giving it away, Microsoft recovered its momentum (and killed off a new generation of competitors). Then, preoccupied once again with control of the OS, Microsoft missed the sudden, spectacular rise of search engines. When Google's popularity persisted, Microsoft was unable to do with the search engine what it had done with the browser.

In one sense, this failure to adapt to a networked world reflected the integrity of Gates's vision of the PC as a tool of individual empowerment. In the mid-1970s, when the news of the first inexpensive microprocessor-based computers reached Gates at Harvard, he instantly understood the implications. Until then, computers had been instruments of organizations and agents of bureaucratization. The PC brought about a revolution, offering the little guy a chance to harness computing power for his personal ends.

Technology is now moving away from the individualistic and toward the communal--toward the "cloud" (see our Briefing on cloud computing, July/August 2009). Ray Ozzie, Microsoft's chief software architect, who has been the most influential engineer at the company since Gates retired from executive management, describes the process under way as a return to the computing experience of his youth, in the 1970s, when folks shared time on computers and the network reigned supreme. Cloud technologies "have happened before," he said in June. "In essence, this pendulum is swinging." Similarly, Schmidt recalls how, in the early 1980s, Sun Microsystems' OS was developed for a computer that lacked local storage.

The return to the network has big implications for the business of operating systems. Computer networks used to be closed, private: in the 1960s and '70s they revolved around IBM mainframe operating systems and, later, linked Windows machines on desktops and in back rooms. Today's computer networks are more like public utilities, akin to the electricity and telephone systems. The operating system is less important. Why does Google want to build one?

Successful operating-system designs continue to pay off big, though increasingly in cases where the system is well integrated with hardware. Apple's experience is illustrative. For years, people advised Steve Jobs, Apple's cofounder and chief, to decouple the Mac OS from the company's hardware. Jobs never did. Indeed, he moved in the opposite direction. With the iPod and then the iPhone, he built new operating systems ever more integrated with hardware--and these products have been even more successful than the Macintosh. "For Apple, software is a means to an end," says Jean-Louis Gassée, who once served as the company's chief of product development and who has since founded his own OS and hardware company, Be. "They write a good OS so they can have nice margins on their aluminum laptop."

The effort to create a good OS carries risks. The biggest one for Google is that expectations will outstrip results. Even though the company plans to use a number of freely available pieces of computer code--most notably the Linux "kernel," which delivers basic instructions to hardware--its new system can't be assembled, like a Lego plaything, out of existing pieces. Some pieces don't exist, and some existing ones are deficient. There is the real chance that Google might tarnish its reputation with an OS that disappoints.

Then there is the risk that cloud computing won't deliver on its promise. Privacy breaches could spoil the dream of cheap and easy access to personal data anywhere, anytime. And applications that demand efficient performance may founder if they are drawn from the cloud alone, especially if broadband speeds fail to improve. These unknowns all present substantial threats.

Magic Blends
David Gelernter, a computer scientist at Yale University, has described the chief goal of the personal-computer OS as providing a " 'documentary history' of your life." Information technology, he argues, must answer the question "Where's my stuff?" That stuff includes not only words but also photos, videos, and music.

For a variety of good reasons--technical, social, and economic--the cloud will probably never store and deliver enough of that "stuff" to render the OS completely irrelevant. You and I will always want to store and process some information on our local systems. Therefore, the next normal in operating systems will probably be a hybrid system--a "magic" blend, to quote Adobe's chief technology officer, Kevin Lynch. Predicting just how Microsoft and Google will pursue the magic blend isn't possible. "We hope we are in the process of a redefinition of the OS," Eric Schmidt told me in an e-mail. But one thing is certain: the new competition in operating systems benefits computer users. Microsoft will do more to make Windows friendlier to the new networked reality. No longer a monopoly, the company will adapt or die. It's worth remembering that in the 1970s, AT&T, then the most powerful force in the information economy, "made a set of decisions that doomed it to slow-motion extinction," says Louis Galambos, a historian of business and economics at Johns Hopkins. "Microsoft is not immune to 'creative destruction.' "

Neither is Google. To completely ignore operating systems in favor of the cloud might be an efficient route to failure. And there is much to admire in the very attempt to create a new one. For Brin and Page, it is as much an aesthetic and ethical act as it is an engineering feat.


By G. Pascal Zachary