Meet the Nimble-Fingered Interface of the Future

Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's.....

Electronic Implant Dissolves in the Body

Researchers at the University of Illinois at Urbana-Champaign, Tufts University, and ......

Sorting Chip May Lead to Cell Phone-Sized Medical Labs

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a ....

TDK sees hard drive breakthrough in areal density

Perpendicular magnetic recording was an idea that languished for many years, says a TDK technology backgrounder, because the ....

Engineers invent new device that could increase Internet

The device uses the force generated by light to flip a mechanical switch of light on and off at a very high speed........


Showing posts with label SPACE AND ASTRONOMY. Show all posts

The Jets of the Future

 Box Wing Jet: Nick Kaloterakis 

NASA asked the world’s top aircraft engineers to solve the hardest problem in commercial aviation: how to fly cleaner and quieter while burning less fuel. The prototypes they imagined may set a new standard for the next two decades of flight.
BOX WING JET, LOCKHEED MARTIN

Target Date: 2025
Passenger jets consume a lot of fuel. A Boeing 747 burns five gallons of it every nautical mile, and as the price of that fuel rises, so do fares. Lockheed Martin engineers developed their Box Wing concept to find new ways to reduce fuel burn without abandoning the basic shape of current aircraft. Adapting the lightweight materials found in the F-22 and F-35 fighter jets, they designed a looped-wing configuration that would increase the lift-to-drag ratio by 16 percent, making it possible to fly farther using less fuel while still fitting into airport gates.

They also ditched conventional turbofan engines in favor of two ultrahigh-bypass turbofan engines. Like all turbofans, they generate thrust by pulling air through a fan on the front of the engine and by burning a fuel-air mixture in the engine’s core. With fans 40 percent wider than those used now, the Box Wing’s engines bypass the core at several times the rate of current engines. At subsonic speeds, this arrangement improves efficiency by 22 percent. Add to that the fuel-saving boost of the box-wing configuration, and the plane is 50 percent more efficient than the average airliner. The additional wing lift also lets pilots make steeper descents over populated areas while running the engines at lower power. Those changes could reduce noise by 35 decibels and shorten approaches by up to 50 percent.—Andrew Rosenblum
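For a rough sense of how a better lift-to-drag ratio translates into fuel savings, here's a back-of-the-envelope sketch using the Breguet range equation. The cruise speed, fuel-consumption rate, and baseline L/D below are illustrative round numbers for a modern airliner, not Lockheed's figures.

```python
import math

def fuel_fraction(range_nm, lift_to_drag, tsfc_per_hr=0.55, speed_kts=490):
    """Fraction of takeoff weight burned as fuel over a cruise leg, from the
    Breguet range equation R = (V/c)(L/D) ln(Wi/Wf).

    tsfc_per_hr (engine fuel consumption) and speed_kts are assumed
    illustrative values, not Lockheed figures."""
    return 1.0 - math.exp(-range_nm * tsfc_per_hr / (speed_kts * lift_to_drag))

baseline = fuel_fraction(3000, lift_to_drag=17)         # assumed baseline L/D
box_wing = fuel_fraction(3000, lift_to_drag=17 * 1.16)  # 16% better L/D

print(f"fuel saved: {100 * (1 - box_wing / baseline):.1f}%")  # roughly 12-13%
```

Even under these toy assumptions, a 16 percent lift-to-drag gain alone trims cruise fuel burn by over a tenth, which is why the engine improvements are needed to reach the 50 percent figure.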


Supersonic Green Machine:  Nick Kaloterakis

SUPERSONIC GREEN MACHINE, LOCKHEED MARTIN

Target Date: 2030
The first era of commercial supersonic transportation ended on November 26, 2003, with the final flight of the Concorde, a noisy, inefficient and highly polluting aircraft. But the dream of a sub-three-hour cross-country flight lingered, and in 2010, designers at Lockheed Martin presented the Mach 1.6 Supersonic Green Machine. The plane’s variable-cycle engines would improve efficiency by switching to conventional turbofan mode during takeoff and landing. Combustors built into the engine would reduce nitrogen oxide pollution by 75 percent. And the plane’s inverted-V tail and underwing engine placement would nearly eliminate the sonic booms that led to a ban on overland Concorde flights.

The configuration mitigates the waves of air pressure (created when a plane traveling faster than Mach 1 collides with the air ahead of it) that combine into the enormous shock waves that produce sonic booms. “The whole idea of low-boom design is to control the strength, position and interaction of shock waves,” says Peter Coen, the principal investigator for supersonic projects at NASA. Instead of generating a continuous loop of loud booms, the plane would issue a dull roar that, from the ground, would be about as loud as a vacuum cleaner.—Andrew Rosenblum

 Sugar Volt:  Nick Kaloterakis

SUGAR VOLT, BOEING

Target Date: 2035
The best way to conserve jet fuel is to turn off the gas engines. That’s only possible with an alternative power source, like the battery packs and electric motors in the Boeing SUGAR Volt’s hybrid propulsion system. The 737-size, 3,500-nautical-mile-range plane would draw energy from both jet fuel and batteries during takeoff, but once at cruising altitude, pilots could switch to all-electric mode [see Volta Volare GT4]. At the same time Boeing engineers were rethinking propulsion, they also rethought wing design. “By making the wing thinner and the span greater, you can produce more lift with less drag,” says Marty Bradley, Boeing’s principal investigator on the project. The oversize wings would fold up so pilots could access standard boarding gates. Together, the high-lift wings, the hybrid powertrain and the efficient open-rotor engines would make the SUGAR Volt 55 percent more efficient than the average airliner. The plane would emit 60 percent less carbon dioxide and 80 percent less nitrogen oxide. Additionally, the extra boost the hybrid system provides at takeoff would enable pilots to use runways as short as 4,000 feet. (For most planes, landing requires less space than takeoff.) A 737 needs a minimum of 5,000 feet for takeoff, so the SUGAR Volt could bring cross-country flights to smaller airports.—Rose Pastore

By Andrew Rosenblum and Rose Pastore
From popsci

In Space and On Earth, Why Build It, When a Robot Can Build It for You?

That's just one thing researchers in Hod Lipson's Creative Machines Lab envision with their latest robot prototype. It can autonomously traverse and manipulate a 3-D truss structure, using specially designed gears and joints to assemble and disassemble the structure as it climbs. Lipson is an associate professor of mechanical and aerospace engineering, and of computing and information science at Cornell University.

 Jeremy Blum '12 holds one version of a prototype robot that can autonomously climb, assemble and disassemble truss structures.

The robot's design is detailed in a paper accepted by IEEE Robotics and Automation, to appear soon online and in print. Its co-authors include former visiting scientist Franz Nigl, former visiting Ph.D. student Shuguang Li, and undergraduate Jeremy Blum.

"What gets me most excited is this idea of safety," said Blum, a student researcher working on the project. Having a robot able to climb and reconfigure building structures, even just to deliver materials, would be a step toward making construction zones safer for humans, he said.

The researchers also point to space-exploration applications. Instead of sending astronauts out on a dangerous spacewalk at the International Space Station, a robot could be deployed to repair a damaged truss.

The robot is equipped with an onboard power system, as well as reflectivity sensors so it can identify where it is on the structure. This allows it to maneuver accurately without explicit commands, Blum added.

Lipson said he envisions transforming the built environment with the help of these kinds of technologies. Instead of making buildings out of concrete or other non-recyclable materials, components designed specifically for robots could be used to build or reconfigure structures more efficiently -- for example, after an earthquake, or if an outdated building needed to be torn down in favor of something better.

"Right now, we are very bad at recycling construction materials," Lipson said. "We are exploring a smarter way to allow the assembly, disassembly and reconfiguration of structures."

The project is part of a National Science Foundation Emerging Frontiers in Research and Innovation grant jointly awarded to Lipson at Cornell, Daniela Rus of the Massachusetts Institute of Technology, Mark Yim of the University of Pennsylvania, and Eric Klavins of the University of Washington.

From sciencedaily

Powerful pixels: Mapping the 'Apollo Zone'

 Mosaic of the near side of the moon as taken by the Clementine star trackers. The images were taken on March 15, 1994

For NASA researchers, pixels are much more than picture elements – they are precious data that help us understand where we came from, where we've been, and where we're going. 

At NASA's Ames Research Center, Moffett Field, Calif., computer scientists have made a giant leap forward in pulling as much information as possible from imperfect static images. With their advancement in image-processing algorithms, the legacy data from the Apollo Metric Camera onboard Apollo 15, 16 and 17 can be transformed into an informative and immersive 3D mosaic map of a large and scientifically interesting part of the moon. 

The "Apollo Zone" Digital Image Mosaic (DIM) and Digital Terrain Model (DTM) maps cover about 18 percent of the lunar surface at a resolution of 98 feet (30 meters) per pixel. The maps are the result of three years of work by the Intelligent Robotics Group (IRG) at NASA Ames, and are available to view through the NASA Lunar Mapping and Modeling Portal (LMMP) and Google Moon feature in Google Earth. 

"The main challenge of the Apollo Zone project was that we had very old data – scans, not captured in digital format," said Ara Nefian, a senior scientist with the IRG and Carnegie Mellon University-Silicon Valley. "They were taken with the technology we had over 40 years ago with imprecise camera positions, orientations and exposure time by today’s standards."

The researchers overcame the challenge by developing new computer vision algorithms to automatically generate the 2D and 3D maps. Algorithms are step-by-step procedures, written as computer code, for carrying out particular tasks. For example, part of the 2D imaging algorithm aligns many images taken from various positions with various exposure times into one seamless image mosaic. In the mosaic, areas in shadow, which show up as patches of dark or black pixels, are automatically replaced by lighter gray pixels. These show more well-lit detail from other images of the same area, creating a more detailed map. 
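A minimal sketch of that shadow-filling idea, assuming two already-aligned brightness images of the same terrain. The darkness threshold and the simple per-pixel maximum are illustrative choices, not the actual Ames pipeline.

```python
import numpy as np

def fill_shadows(mosaic, other, dark_threshold=0.15):
    """Replace shadowed pixels in one image with better-lit pixels from a
    second, already-aligned image of the same area.

    Brightness is assumed normalized to [0, 1]; the threshold and the
    per-pixel maximum are illustrative, not the Ames implementation."""
    shadowed = mosaic < dark_threshold            # dark/black patches
    filled = mosaic.copy()
    filled[shadowed] = np.maximum(mosaic[shadowed], other[shadowed])
    return filled

a = np.array([[0.8, 0.05], [0.6, 0.02]])  # mosaic with two shadowed pixels
b = np.array([[0.7, 0.50], [0.1, 0.45]])  # second exposure of the same area
print(fill_shadows(a, b))  # the 0.05 and 0.02 pixels become 0.50 and 0.45
```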

 Left: A normal one-camera image of the lunar surface. Right: A composite Apollo Zone image showing the best details from multiple photographs.

"The key innovation that we made was to create a fully automatic image mosaicking and terrain modeling software system for orbital imagery," said Terry Fong, director of IRG. "We have since released this software in several open-source libraries including Ames Stereo Pipeline, Neo-Geography Toolkit and NASA Vision Workbench." 

Lunar imagery of varying coverage and resolution has been released for general use for some time. In 2009, the IRG helped Google develop "Moon in Google Earth", an interactive, 3D atlas of the moon. With "Moon in Google Earth", users can explore a virtual moonscape, including imagery captured by the Apollo, Clementine and Lunar Orbiter missions. 

The Apollo Zone project uses imagery recently scanned at NASA's Johnson Space Center in Houston, Texas, by a team from Arizona State University. The source images themselves are large – 20,000 pixels by 20,000 pixels, and the IRG aligned and processed more than 4,000 of them. To process the maps, they used Ames' Pleiades supercomputer. 

 The color on this map represents the terrain elevation in the Apollo Zone mapped area.

The initial goal of the project was to build large-scale image mosaics and terrain maps to support future lunar exploration. However, the project's progress will have long-lasting technological impacts on many targets of future exploration. "The algorithms are very complex, so they don't yet necessarily apply to things like real-time robotics, but they are extremely precise and accurate," said Nefian. "It's a robust technological solution to deal with insufficient data, and qualities like this make it superb for future exploration, such as a reconnaissance or mapping mission to a Near Earth Object." 

Near Earth Objects, or "NEOs," are comets and asteroids that have been pulled by the gravity of nearby planets into orbits in Earth's neighborhood. NEOs are often small and irregular, which makes their paths hard to predict. With these algorithms, even imperfect imagery of a NEO could be transformed into detailed 3D maps to help researchers better understand its shape and how it might travel while in our neighborhood.

In the future, the team plans to expand the use of their algorithms to include imagery taken at angles, rather than just straight down at the surface. A technique called photoclinometry – or "shape from shading" – allows 3D terrain to be reconstructed from a single 2D image by comparing how surfaces sloping toward the sun appear brighter than areas that slope away from it. Also, the team will study imagery not just as pictures, but as physical models that give information about all the factors that affect how the final image is depicted.
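Here is a toy 1-D version of that shape-from-shading idea: forward-model brightness from known slopes under a small-slope Lambertian lighting model, then invert it and integrate the recovered slopes into heights. The sun elevation and slope values are invented; real pipelines must also handle albedo, noise, and full 2-D integration.

```python
import numpy as np

def slopes_from_brightness(brightness, sun_elevation_deg=45.0):
    """Linearized 1-D shape-from-shading: pixels brighter than flat ground
    slope toward the sun, darker pixels slope away.

    Uses the small-slope Lambertian model I ~= sin(e) - slope*cos(e); the
    sun elevation here is an assumed illustrative value."""
    e = np.radians(sun_elevation_deg)
    return (np.sin(e) - np.asarray(brightness)) / np.cos(e)

def heights_from_slopes(slopes, dx=1.0):
    """Integrate slopes into a height profile relative to the first pixel."""
    return np.concatenate([[0.0], np.cumsum(slopes) * dx])

# Forward-model brightness from known slopes, then invert it.
e = np.radians(45.0)
true_slopes = np.array([0.10, 0.10, -0.05, 0.0])
brightness = np.sin(e) - true_slopes * np.cos(e)
print(heights_from_slopes(slopes_from_brightness(brightness)))
```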

"As NASA continues to build technologies that will enable future robotic and human exploration, our researchers are looking for new and clever ways to get more out of the data we capture," said Victoria Friedensen, Joint Robotic Precursor Activities manager of the Human Exploration Operations Mission Directorate at NASA Headquarters. "This technology is going to have great benefit for us as we take the next steps."

From physorg

NASA developing comet harpoon for sample return

This is an artist's concept of a comet harpoon embedded in a comet. The harpoon tip has been rendered semi-transparent so the sample collection chamber inside can be seen.

Scientists at NASA's Goddard Space Flight Center in Greenbelt, Md. are in the early stages of working out the best design for a sample-collecting comet harpoon. In a lab the size of a large closet stands a metal ballista (large crossbow) nearly six feet tall, with a bow made from a pair of truck leaf springs and a bow string made of steel cable 1/2 inch thick. The ballista is positioned to fire vertically downward into a bucket of target material. For safety, it's pointed at the floor, because it could potentially launch test harpoon tips about a mile if it were angled upward. An electric winch mechanically pulls the bow string back to generate a precise level of force, up to 1,000 pounds, firing projectiles to velocities upwards of 100 feet per second.

Donald Wegel of NASA Goddard, lead engineer on the project, places a test harpoon in the bolt carrier assembly, steps outside the lab and moves a heavy wooden safety door with a thick plexiglass window over the entrance. After dialing in the desired level of force, he flips a switch and, after a few-second delay, the crossbow fires, launching the projectile into a 55-gallon drum full of cometary simulant -- sand, salt, pebbles or a mixture of the three. The ballista produces a uniquely impressive thud upon firing, somewhere between a rifle and a cannon blast.

"We had to bolt it to the floor, because the recoil made the whole testbed jump after every shot," said Wegel. "We're not sure what we'll encounter on the comet – the surface could be soft and fluffy, mostly made up of dust, or it could be ice mixed with pebbles, or even solid rock. Most likely, there will be areas with different compositions, so we need to design a harpoon that's capable of penetrating a reasonable range of materials. The immediate goal though, is to correlate how much energy is required to penetrate different depths in different materials. What harpoon tip geometries penetrate specific materials best? How does the harpoon mass and cross section affect penetration? The ballista allows us to safely collect this data and use it to size the cannon that will be used on the actual mission." 

 This is a demonstration of the sample collection chamber.

Comets are frozen chunks of ice and dust left over from our solar system's formation. As such, scientists want a closer look at them for clues to the origin of planets and ultimately, ourselves. "One of the most inspiring reasons to go through the trouble and expense of collecting a comet sample is to get a look at the 'primordial ooze' – biomolecules in comets that may have assisted the origin of life," says Wegel.

Scientists at the Goddard Astrobiology Analytical Laboratory have found amino acids in samples of comet Wild 2 from NASA's Stardust mission, and in various carbon-rich meteorites. Amino acids are the building blocks of proteins, the workhorse molecules of life, used in everything from structures like hair to enzymes, the catalysts that speed up or regulate chemical reactions. The research gives support to the theory that a "kit" of ready-made parts created in space and delivered to Earth by meteorite and comet impacts gave a boost to the origin of life.

Although ancient comet impacts could have helped create life, a present-day hit near a populated region would be highly destructive, as a comet's large mass and high velocity would make it explode with many times the force of a typical nuclear bomb. One plan to deal with a comet headed towards Earth is to deflect it with a large – probably nuclear – explosion. However, that might turn out to be a really bad idea. Depending on the comet's composition, such an explosion might just fragment it into many smaller pieces, with most still headed our way. It would be like getting hit with a shotgun blast instead of a rifle bullet. So the second major reason to sample comets is to characterize the impact threat, according to Wegel. We need to understand how they're made so we can come up with the best way to deflect them should any have their sights on us.

"Bringing back a comet sample will also let us analyze it with advanced instruments that won't fit on a spacecraft or haven't been invented yet," adds Dr. Joseph Nuth, a comet expert at NASA Goddard and lead scientist on the project.

This is a photo of the ballista testbed preparing to fire a prototype harpoon into a bucket of material that simulates a comet.
 
Of course, there are other ways to gather a sample, like using a drill. However, any mission to a comet has to overcome the challenge of operating in very low gravity. Comets are small compared to planets, typically just a few miles across, so their gravity is correspondingly weak, maybe a millionth that of Earth, according to Nuth. "A spacecraft wouldn't actually land on a comet; it would have to attach itself somehow, probably with some kind of harpoon. So we figured if you have to use a harpoon anyway, you might as well get it to collect your sample," says Nuth. Right now, the team is working out the best tip design, cross-section, and explosive powder charge for the harpoon, using the crossbow to fire tips at various speeds into different materials like sand, ice, and rock salt. They are also developing a sample collection chamber to fit inside the hollow tip. "It has to remain reliably open as the tip penetrates the comet's surface, but then it has to close tightly and detach from the tip so the sample can be pulled back into the spacecraft," says Wegel. "Finding the best design that will package into a very small cross section and successfully collect a sample from the range of possible materials we may encounter is an enormous challenge."

"You can't do this by crunching numbers in a computer, because nobody has done it before -- the data doesn't exist yet," says Nuth. "We need to get data from experiments like this before we can build a computer model. We're working on answers to the most basic questions, like how much powder charge do you need so your harpoon doesn't bounce off or go all the way through the comet. We want to prove the harpoon can penetrate deep enough, collect a sample, decouple from the tip, and retract the sample collection device."

The spacecraft will probably have multiple sample collection harpoons with a variety of powder charges to handle areas on a comet with different compositions, according to the team. After they have finished their proof-of-concept work, they plan to apply for funding to develop an actual instrument. "Since instrument development is more expensive, we need to show it works first," says Nuth.

Currently, the European Space Agency is sending a mission called Rosetta that will use a harpoon to grapple a probe named Philae to the surface of comet "67P/Churyumov-Gerasimenko" in 2014 so that a suite of instruments can analyze the regolith. "The Rosetta harpoon is an ingenious design, but it does not collect a sample," says Wegel. "We will piggyback on their work and take it a step further to include a sample-collecting cartridge. It's important to understand the complex internal friction encountered by a hollow, core-sampling harpoon."

NASA's recently-funded mission to return a sample from an asteroid, called OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, Security -- Regolith Explorer), will gather surface material using a specialized collector. However, the surface can be altered by the harsh environment of space. "The next step is to return a sample from the subsurface because it contains the most primitive and pristine material," said Wegel.

Both Rosetta and OSIRIS-REx will significantly increase our ability to navigate to, rendezvous with, and locate specific interesting regions on these foreign bodies. The fundamental research on harpoon-based sample retrieval by Wegel and his team is necessary so the technology is available in time for a subsurface sample return mission.


From physorg

Particles can be quantum entangled through time as well as space

Quantum entanglement says that two particles can become intertwined so that they always share the same properties, even if they're separated in space. Now it seems particles can be entangled in time, too. Who's ready for some serious quantum weirdness?


Of all the ideas in modern physics, quantum entanglement is a serious contender for the absolute strangest. Basically, entangled particles share all their quantum properties, even if they are separated by massive distances in space. The really odd part is that any changes made to the properties of one particle will instantly occur in the other particle. There are some subtle reasons why this doesn't actually violate the speed of light, but here's the short version: this is all very, very bizarre.

But all experiments in quantum entanglement have focused exclusively on spatial entanglement, because seriously...isn't this already weird enough? Apparently not for physicists S. Jay Olson and Timothy C. Ralph of Australia's University of Queensland, who have figured out a series of thought experiments about how to entangle particles across time.

Now, what the hell does that mean? Well, Olson explains:

"Essentially, a detector in the past is able to ‘capture' some information on the state of the quantum field in the past, and carry it forward in time to the future — this is information that would ordinarily escape to a distant region of spacetime at the speed of light. When another detector then captures information on the state of the field in the future at the same spatial location, the two detectors can then be compared side-by-side to see if their state has become entangled in the usual sense that people are familiar with — and we find that indeed they should be entangled. This process thus takes a seemingly exotic, new concept (timelike entanglement in the field) and converts it into a familiar one (standard entanglement of two detectors at a given time in the future)."

That may still be a bit confusing, so think about it this way. The detectors are basically taking on the properties of their particles - if they share the same properties, then the particles themselves are entangled. The first, "past" detector stores one set of quantum properties, and then the second, "future" detector measures a new set of properties at the same location as the first. The two sets of quantum properties are affecting each other just like spatially entangled particles share the same properties, but now it's happening across time instead. Once the two detectors are brought together in time, the entanglement becomes the more normal (well, relatively speaking) sort of spatial entanglement.

This may seem difficult to comprehend - I know I'm struggling with it - but that's because we're accustomed to temporal events always being completely independent of one another. Both types of entanglement are counter-intuitive, to be sure, but it's easier for us to imagine particles sharing properties across different parts of space than across different parts of time, because we ourselves move through space so easily. And yet, from a physics perspective, there isn't all that much of a difference between space and time, and certainly not enough to rule out temporal entanglement.

Now, this is all still just hypothetical for the time being, but there is a theoretical basis for this and it may soon be possible to probe these ideas further with some experiments. Still, if you're up for a bit of extra credit weirdness, here's Olson and Ralph's thought experiment for teleportation through time. Let's say you want to move a quantum state, or qubit, through time. You'll need one detector coupled to a field in the "past" and another coupled to the same field in the "future." The first detector stores the information on the qubit and generates some data on how the qubit can be found again. The qubit is then teleported through time, effectively skipping the period in between the past and future detectors.

The first detector is removed and the second detector is put in precisely the same place, keeping the spatial symmetry intact. The second detector eventually receives the necessary information from the first, and then it uses this to bring the qubit back, reconstructing it in the future. There's a weird time symmetry to all this - let's say the qubit is teleported at 12:00 and the first detector gathers its information at 11:45. That fifteen-minute gap must exist in both directions, and it's impossible to reconstruct the qubit until 12:15 rolls around.

Obviously, these are all deeply strange, epically counter-intuitive ideas right at the bleeding edge of what modern physics can conceptualize. But it's also very awesome. And as soon as I even begin to understand it, I'm sure it'll get even more awesome.


New Study Finds No Sign of ‘First Habitable Exoplanet’

Things don’t look good for Gliese 581g, the first planet found orbiting in the habitable zone of another star. The first official challenge to the small, hospitable world examines the exact same data — and finds no significant sign of the planet.



“For the time being, the world does not have data that’s good enough to claim the planet,” said astro-statistics expert Philip Gregory of the University of British Columbia, author of the new study.

The “first habitable exoplanet” already has a checkered history. When it was announced last September, Gliese 581g was heralded as the first known planet that could harbor alien life. The planet orbits its dim parent star once every 36.6 days, placing it smack in the middle of the star’s habitable zone, the not-too-hot, not-too-cold region where liquid water could be stable.

Planet G was the sixth planet found circling Gliese 581, a red dwarf star 20 light-years from Earth. A team of astronomers from the Geneva Observatory in Switzerland found the first four planets using the HARPS spectrograph on a telescope in Chile. The team carefully measured the star’s subtle wobbles as the planets tugged it back and forth.

Two more planets, including the supposedly habitable 581g, appeared when astronomers Steve Vogt of the University of California, Santa Cruz and Paul Butler of the Carnegie Institution of Washington added data from the HIRES spectrograph on the Keck Telescope in Hawaii. They announced their discovery Sept. 29.
Just two weeks later, the HARPS team announced they found no trace of the planet in their data, even when they added two more years’ worth of observations. But it was still possible that the planet was only visible using both sets of data.

Now, the first re-analysis of the combined data from both telescopes is out, and the planet is still missing.
“I don’t find anything,” Gregory said. “My analysis does not want to lock on to anything around 36 days. I find there’s just no feature there.”

Unlike earlier studies, Gregory used a branch of statistics called Bayesian analysis. Classical methods are narrow, testing only a single hypothesis, but Bayesian methods can evaluate a whole set of scenarios and figure out which is the most likely.

Gregory wrote a program that analyzed the likelihood that a given planetary configuration would produce the observed astronomical data, then ran it for various possible configurations.
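To make the period-search idea concrete, here's a toy sketch on invented radial-velocity data: least-squares sinusoid fits over a grid of candidate periods, standing in for Gregory's far more thorough Bayesian machinery (which marginalizes over whole families of models rather than just minimizing). All the numbers – periods, amplitudes, noise levels, sampling – are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented radial-velocity series: one planet with a 13-day period plus
# noise, sampled irregularly over 400 days (a toy stand-in for real data).
t = np.sort(rng.uniform(0, 400, 120))
rv = 3.0 * np.sin(2 * np.pi * t / 13.0) + rng.normal(0, 1.0, t.size)

def best_fit_rss(period):
    """Least-squares fit of a sinusoid with the given period to the RV data;
    returns the residual sum of squares (lower = better fit)."""
    X = np.column_stack([np.sin(2 * np.pi * t / period),
                         np.cos(2 * np.pi * t / period)])
    coef, *_ = np.linalg.lstsq(X, rv, rcond=None)
    return float(np.sum((rv - X @ coef) ** 2))

# Scan a grid of candidate periods; the best fit lands near 13 days.
periods = np.linspace(3, 50, 2000)
best = periods[np.argmin([best_fit_rss(p) for p in periods])]
print(f"best-fit period: {best:.1f} days")
```

A real analysis must also contend with irregular sampling, instrument systematics, and multiple overlapping signals, which is exactly where the claimed 36-day signal becomes ambiguous.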

For the HARPS data set, he found that the best solution was a star with five planets, which orbit the star once every three, five, 13, 67 and 400 days. The 36-day habitable world wasn’t there.

When he looked at the HIRES and the combined data sets, the best solution was a star with two planets. Only when he included an extra term in the HIRES data did Gregory find more, which he suspects means the HIRES instrument isn’t as accurate as thought.

“There may be something in the telescope…that’s contributing to the error,” he said.
Gregory’s model finds the probability that the six-planet model is a false alarm is 99.9978 percent. None of the planets Gregory’s analysis turned up are in the habitable zone. The results are in a paper submitted to the Monthly Notices of the Royal Astronomical Society and published on the physics preprint website arxiv.org.

Other astronomers seem impressed with Gregory’s analysis.
“That’s the right way to do it,” said exoplanet expert Daniel Fabrycky of the University of California, Santa Cruz. “I think everyone would agree that that is the most sophisticated analysis that you can do, and as much as you could hope to do.”

“The Gregory paper is by far the most complete statistical analysis to date that has been made public,” said exoplanet and astro-statistics expert Eric Ford of the University of Florida. “It’s by far the most rigorous analysis.”

But most astronomers are not yet ready to close the book on Gliese 581g.
“I’m not going to admit that it’s a dead planet yet,” said exoplanet expert Sara Seager of MIT. “No one will be able to sort this out today … it will take some time.”

Vogt still firmly believes the planet is there. “I’m standing by our data,” he told Wired.com.
He said there are two ways to interpret the signals from Gliese 581. Sometimes a single planet with an elongated, or elliptical orbit can look the same as two planets that trace perfect circles around their stars. One of Gliese 581’s planets, planet D, could be one of these “eccentric impostors,” hiding an extra planet within its signal.

Part of the reason it’s so difficult to tell these two scenarios apart is that spotty observations make fake signals in the data. These signals, which show up because the telescope can’t watch the star continuously, look like they could actually be planets, but they would disappear if we could observe round the clock.
In a paper that’s still in preparation, astronomer Guillem Anglada-Escudé and Harvard graduate student Rebekah Dawson tackle these issues, and conclude that the habitable planet still has a chance. “With the data we have, the most likely explanation is that this planet is still there,” Anglada-Escudé said.

Everyone agrees that the problem can only be resolved with more data. In particular, astronomers are anxious to see the extra data that the HARPS group used to conclude Gliese 581g is a mirage.

“I don’t think anything will change significantly until the Swiss publish their data,” Anglada-Escudé said. “Nobody else has seen their data. We’re waiting to see that, just to settle down the problem.”

By Lisa Grossman 
From wired.com

Hubble Helps Build Most-Detailed Dark Matter Map Yet

Using the Hubble Space Telescope and a cosmic magnifying glass effect, astronomers have put together one of the most detailed maps yet of dark matter in a giant galaxy cluster.

Dark matter is the stubborn, invisible stuff that makes up nearly a quarter of the mass and energy of the universe, but refuses to interact with ordinary matter except through gravity. The only way to know dark matter is there at all is by observing how its mass warps and tugs at visible matter.

When a lot of dark matter clumps together, as in massive galaxy clusters that contain hundreds or thousands of galaxies, it can act as an enormous magnifying glass for even more distant galaxies. The cluster’s gravity stretches and distorts the light from galaxies behind it like a fun house mirror. Astronomers on Earth see multiple warped images of each galaxy, a phenomenon called gravitational lensing.

Gravitational lensing can give a good idea of how much dark matter is in a cluster, but up until now astronomers had to guess at where exactly the dark matter was.

Now, using an image from Hubble’s Advanced Camera for Surveys, astronomers have built a high-resolution map of exactly where the dark stuff lurks in a galaxy cluster called Abell 1689.

“Other methods are based on making a series of guesses as to what the mass map is, and then astronomers find the one that best fits the data,” said astronomer Dan Coe of NASA’s Jet Propulsion Laboratory in a press release. “Using our method, we can obtain, directly from the data, a mass map that gives a perfect fit.”
Abell 1689 lies 2.2 billion light-years away and contains about 1,000 galaxies and trillions of stars. By combining the Hubble image with earlier observations, astronomers picked out 135 multiple images of 42 background galaxies.

“The lensed images are like a big puzzle,” Coe said. “Here we have figured out, for the first time, a way to arrange the mass of Abell 1689 such that it lenses all of these background galaxies to their observed positions.”

Coe and colleagues superimposed the locations of dark matter in the cluster (shown in blue, above) onto the Hubble image. The results, which appear in the Nov. 10 Astrophysical Journal, confirmed that Abell 1689 has more dark matter packed closer together than astronomers expected for a cluster its size.

That extra bulk could indicate that galaxy clusters formed earlier in the history of the universe than astronomers thought. Dark matter’s gravity pulls matter together, but it’s countered by another, even more mysterious force called dark energy, which pushes matter apart. Once dark energy became an important player in the early universe, galaxy clusters would have had a hard time sticking together.

“Galaxy clusters, therefore, would have had to start forming billions of years earlier in order to build up to the numbers we see today,” Coe said. “At earlier times, the universe was smaller and more densely packed with dark matter. Abell 1689 appears to have been well fed at birth by the dense matter surrounding it in the early universe. The cluster has carried this bulk with it through its adult life to appear as we observe it today.”

More data is still to come from a project called CLASH (Cluster Lensing And Supernova survey with Hubble), which will aim Hubble at 25 galaxy clusters for a total of one month over the next three years.
Image: NASA, ESA, D. Coe (NASA Jet Propulsion Laboratory/California Institute of Technology, and Space Telescope Science Institute), N. Benitez (Institute of Astrophysics of Andalusia, Spain), T. Broadhurst (University of the Basque Country, Spain), and H. Ford (Johns Hopkins University)

By Lisa Grossman
From wired.com

The Moon Hides Ice Where the Sun Don’t Shine

The moon is pockmarked with cold, wet oases that could contain enough water ice to be useful to manned missions.

A year after NASA’s Lunar Crater Observation and Sensing Satellite (LCROSS) smashed into the surface of the moon, astronomers have confirmed that lunar craters can be rich reservoirs of water ice, plus a pharmacopoeia of other surprising substances.


The debris plume about 20 seconds after LCROSS impact.

On Oct. 9, 2009, the LCROSS mission sent a spent Centaur rocket crashing into Cabeus crater near the moon’s south pole, a spot previous observations had shown to be loaded with hydrogen. A second spacecraft flew through the cloud of debris kicked up by the explosion to search for signs of water and other ingredients of lunar soil.

And water appeared in buckets. The first LCROSS results reported that about 200 pounds of water appeared in the plume. A new paper in the Oct. 22 Science ups the total amount of water vapor and water ice to 341 pounds, plus or minus 26 pounds.

Given the total amount of soil blown out of the crater, astronomers estimate that 5.6 percent of the soil in the LCROSS impact site is water ice. Earlier studies suggested that soils containing just 1 percent water would be useful for any future space explorers trying to build a permanent lunar base.

“The number of 1 percent was generally agreed to as what was needed to be a net profit, a net return on the effort to extract it out of the dark shadows,” said NASA planetary scientist Anthony Colaprete in a press conference Oct. 21. “We saw 5 percent, which means that indeed where we impacted would be a net benefit to somebody looking for that resource.”

Water could lurk not just in the moon’s deep dark craters, but also as permafrost beneath the sunlit surface. Based on the impact data, water is probably mixed in to the soil as loose ice grains, rather than spread out in a concentrated skating rink. This distribution could make the water easier to harvest.

“The water ice is in this rather malleable, dig-able kind of substrate, which is good,” Colaprete said. “At least some of the water ice, you could go in and literally just scoop it up if you needed to.”

But the plume wasn’t just wet. A series of papers in Science report observations from both LCROSS and LRO that show a laundry list of other compounds were also blown off the face of the moon, including hydroxyl, carbon monoxide, carbon dioxide, ammonia, free sodium, hydrogen, methane, sulfur dioxide and, surprisingly, silver.


Temperature map of the lunar south pole from the LRO Diviner Lunar Radiometer Experiment, showing several intensely cold impact craters. UCLA/NASA/Jet Propulsion Laboratory, Pasadena, Calif./Goddard

The impact carved out a crater 80 to 100 feet wide, and kicked between 8,818 pounds and 13,228 pounds of debris more than 6 miles out of the dark crater and into the sunlight where LCROSS could see it. Astronomers, as well as space enthusiasts watching online, expected to see a bright flash the instant the rocket hit, but none appeared.

The wimpy explosion indicates that the soil the rocket plowed into was “fluffy, snow-covered dirt,” said NASA chief lunar scientist Michael Wargo.

The soil is also full of volatile compounds that evaporate easily at room temperature, suggests planetary scientist Peter Schultz of Brown University, lead author of one of the new papers. The loose soil shielded the view of the impact from above.

Data from an instrument called LAMP (Lyman Alpha Mapping Project) on LRO shows that the vapor cloud contained about 1,256 pounds of carbon monoxide, 300 pounds of molecular hydrogen, 350 pounds of calcium, 265 pounds of mercury and 88 pounds of magnesium. Some of these compounds, called “super-volatile” for their low boiling points, are known to be important building blocks of planetary atmospheres and the precursors of life on Earth, says astronomer David Paige of the University of California, Los Angeles.
Relative to the amount of water in the crater, these materials were far more abundant than is typical of comets, the interstellar medium, or what is predicted from reactions in the protoplanetary disk.

“It’s like a little treasure trove of stuff,” said planetary scientist Greg Delory of the University of California, Berkeley, who was not involved in the new studies.

Astronomers picked Cabeus crater partly because its floor has been in constant shadow for billions of years. Without direct sunlight, temperatures in polar craters on the moon can drop as low as -400 degrees Fahrenheit, cold enough for compounds to stick to grains of soil the way your tongue sticks to an ice cube.
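
For reference, the quoted crater temperature converted to metric units:

```python
# Converting the quoted crater temperature to Celsius and kelvin.
f = -400.0                      # degrees Fahrenheit, as quoted
c = (f - 32) * 5 / 9            # Celsius
kelvin = c + 273.15             # kelvin
print(round(c), round(kelvin))  # -240 33
```

At roughly 33 kelvin, even highly volatile gases freeze solid onto soil grains.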

Other factors, like micrometeorite impacts and ultraviolet photons that carry little heat but significant amounts of energy, can release these molecules from the moon’s cold traps. The composition of the lunar surface represents a balancing act between what sticks and what is released.

The fact that so many different materials, most of which are usually gaseous at room temperature and react easily with other chemicals, remain stuck to the moon gives astronomers clues as to how they got there.

“Perhaps the moon is presently active and there’s all kinds of chemistry going on and stuff being produced, continually collecting in these polar regions,” Delory said. “Maybe it’ll tell us the moon is in fact a much more active and dynamic system than we thought, and there’s water being concentrated at the poles by present-day ongoing processes.”

Another possibility is that these materials hitched a ride on comets or asteroids, Schultz suggests. Compounds deposited all over the moon could have migrated to the poles over the course of billions of years, where they were trapped by the cold or buried under the soil.

There’s only one sure way to find out.
“We need to go there,” Delory said. Whether the water will be a useful resource for future astronauts or not, the ice itself is a rich stockpile of potential scientific information, he said. “That’s as much a reason to go there, for the story that this water tells.”

By Lisa Grossman
From wired.com

Universal, Primordial Magnetic Fields Discovered in Deep Space

Caltech physicist Shin'ichiro Ando and Alexander Kusenko, a professor of physics and astronomy at UCLA, report the discovery in a paper to be published in an upcoming issue of Astrophysical Journal Letters.

Ando and Kusenko studied images of the most powerful objects in the universe -- supermassive black holes that emit high-energy radiation as they devour stars in distant galaxies -- obtained by NASA's Fermi Gamma-ray Space Telescope.

 An artist's conception of an "active galactic nucleus." In some galaxies, the nucleus, or central core, produces more radiation than the entire rest of the galaxy.

"We found the signs of primordial magnetic fields in deep space between galaxies," Ando said.
Physicists have hypothesized for many years that a universal magnetic field should permeate deep space between galaxies, but there was no way to observe it or measure it until now.
The physicists produced a composite image of 170 giant black holes and discovered that the images were not as sharp as expected.

"Because space is filled with background radiation left over from the Big Bang, as well as emitted from galaxies, high-energy photons emitted by a distant source can interact with the background photons and convert into electron-positron pairs, which interact in their turn and convert back into a group of photons somewhat later," said Kusenko, who is also a senior scientist at the University of Tokyo's Institute for Physics and Mathematics of the Universe.

"While this process by itself does not blur the image significantly, even a small magnetic field along the way can deflect the electrons and positrons, making the image fuzzy," he said.

From such blurred images, the researchers found that the average magnetic field had a "femto-Gauss" strength, just one-quadrillionth of the Earth's magnetic field. The universal magnetic fields may have formed in the early universe shortly after the Big Bang, long before stars and galaxies formed, Ando and Kusenko said.
The research was funded by NASA, the U.S. Department of Energy and Japan's Society for the Promotion of Science.
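
The quoted numbers are self-consistent, as a quick check shows; the Earth surface field of about 0.5 gauss used below is an assumed typical value, not a figure from the article.

```python
# Sanity check on the reported "femto-Gauss" field strength.
earth_field_gauss = 0.5              # assumed typical Earth surface field
femto_gauss = 1e-15                  # one femtogauss, in gauss

ratio = femto_gauss / earth_field_gauss
print(f"{ratio:.0e}")                # 2e-15: of order one-quadrillionth
```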

From sciencedaily.com

Astronomers catch moment of star's birth

What could be the youngest known star has been photographed in the process of being born. Not yet a true star, the object is in the earliest stages of formation and has just begun pulling in matter from a surrounding envelope of gas and dust, according to a new study in the Astrophysical Journal.

The study’s authors found the object using the Submillimeter Array in Hawaii and the Spitzer Space Telescope. Known as L1448-IRS2E, it’s located in the Perseus star-forming region of our Milky Way galaxy, about 800 light years away.

The team reckons it's in between the prestellar phase, when a particularly dense region of a molecular cloud first begins to clump together, and the protostar phase, when gravity has pulled enough material together to form a dense, hot core out of the surrounding envelope.

"It’s very difficult to detect objects in this phase of star formation, because they are very short-lived and they emit very little light," said Xuepeng Chen, a postdoctoral associate at Yale and lead author of the paper.
Most protostars are at least as luminous as the sun, with large dust envelopes that glow at infrared wavelengths. Because L1448-IRS2E is less than one tenth as bright as this, the team believes it's too dim to be considered a true protostar. 

Yet they also discovered that the object is ejecting streams of high-velocity gas from its center, confirming that some sort of preliminary mass has already formed and the object has developed beyond the prestellar phase. This kind of outflow is seen in protostars as a result of the magnetic field surrounding the forming star, but has never before been seen at such an early stage.

"Stars are defined by their mass, but we still don’t know at what stage of the formation process a star acquires most of its mass," said Héctor Arce, assistant professor of astronomy at Yale and an author of the paper. "This is one of the big questions driving our work."

Emma Woollacott
From tgdaily.com

Astronomers glimpse birth of mega-stars

Using a CSIRO radio telescope, astronomers have caught an enormous cloud of cosmic gas and dust in the process of collapsing in on itself. They hope the discovery could help establish how massive stars form.

Dr Peter Barnes from the University of Florida says astronomers have a good grasp of how stars such as our sun form from clouds of gas and dust. But for heavier stars – ten times the mass of the sun or more – they are still largely in the dark, despite years of work.

"Astronomers are still debating the physical processes that can generate these big stars," says Barnes.
"Massive stars are rare, making up only a few per cent of all stars, and they will only form in significant numbers when really massive clouds of gas collapse, creating hundreds of stars of different masses. Smaller gas clouds are not likely to make big stars."

Most regions in space where massive stars are forming are well over 1,000 light-years away, making them hard to spot.

But using CSIRO’s ‘Mopra’ radio telescope – a 22m dish near Coonabarabran, New South Wales – the team discovered a massive cloud made mostly of hydrogen gas and dust, three or more light-years across, that is collapsing in on itself and will probably form a huge cluster of stars.
Dr Stuart Ryder of the Anglo-Australian Observatory said the discovery was made during a survey of more than 200 gas clouds. Called BYF73, it's about 8,000 light years away, in the constellation of Carina in the Southern sky.

"With clouds like this we can test theories of massive star cluster formation in great detail," says Ryder.
Evidence for ‘infalling’ gas came from the radio telescope’s detection of two kinds of molecules in the cloud – HCO+ and H13CO+. The spectral lines from the HCO+ molecules in particular showed the gas had a velocity and temperature pattern that indicated collapse. 

The research team calculates that the gas is falling in at the rate of about three per cent of the Sun’s mass every year – one of the highest rates known.
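
In SI units that infall rate is enormous, as a quick conversion shows (assuming a solar mass of 1.989×10^30 kg):

```python
# The quoted infall rate -- three per cent of a solar mass per year --
# expressed in kilograms per second.
M_SUN = 1.989e30             # kg
YEAR = 3.156e7               # s

rate = 0.03 * M_SUN / YEAR   # kg/s
print(f"{rate:.1e} kg/s")    # 1.9e+21 kg/s
```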

Follow-up infrared observations made with the 3.9-m Anglo-Australian Telescope showed signs of massive young stars that have already formed right at the centre of the gas clump, and new stars forming.

Star-formation in the cloud also showed up in archival data from the Spitzer and MSX spacecraft, which observe in the mid-infrared.

From tgdaily.com

US military loses contact with experimental hypersonic vehicle

The US military has apparently lost contact with an experimental hypersonic vehicle over the Pacific Ocean.   
According to Turner Brinton of Space News, the Falcon Hypersonic Technology Vehicle (HTV)-2 was the "first in a series of flight experiments" planned to demonstrate technology that could be deployed in future long-range conventional missiles. "[The vehicle] was launched from Vandenberg Air Force Base, Calif., atop a Minotaur 4 rocket. Built by Lockheed Martin Corp., the HTV-2 craft was supposed to glide over the Pacific Ocean at speeds exceeding 20,000 kilometers per hour for as long as 30 minutes," explained Brinton.
"Nine minutes after launch, however, DARPA (Defense Advanced Research Projects Agency) lost contact with the craft, and the cause of the failure is still unknown. There is [only] one remaining HTV-2 craft."
However, Frank James of NPR noted that the early days of most defense-related, high-tech military projects were "typically marked" by some sort of failure.

"The infant US space program in the late 1950s saw the failures of the Vanguard rocket program. So it's no surprise that the military's test last week of its Falcon space glider meant to test the concept of a hypersonic craft that could travel up to 20 times the speed of sound, more than 15,000 miles an hour, was a bust," opined James.

"The idea is that such a craft could eventually allow US aircraft to reach hotspots anywhere on Earth within minutes."

 By Aharon Etengoff
From tgdaily.com 

Youngest Extra-Solar Planet Discovered Around Solar-Type Star

The giant planet, six times the mass of Jupiter, is only 35 million years old. It orbits a young, active central star at a distance closer than Mercury orbits the Sun. Young stars are usually excluded from planet searches because their intense magnetic fields generate a range of phenomena known collectively as stellar activity, including flares and spots. This activity can mimic the presence of a companion, making it extremely difficult to disentangle the signals of planets from those of activity.

Artistic impression of BD+20 1790b.

University of Hertfordshire astronomers, Dr Maria Cruz Gálvez-Ortiz and Dr John Barnes, are part of the international collaboration that made the discovery.

Dr Maria Cruz Gálvez-Ortiz, describing how the planet was discovered, said: "The planet was detected by searching for very small variations in the velocity of the host star, caused by the gravitational tug of the planet as it orbits -- the so-called 'Doppler wobble technique.' Overcoming the interference caused by the activity was a major challenge for the team, but with enough data from an array of large telescopes the planet's signature was revealed."
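
The scale of the wobble can be sketched with the standard radial-velocity formula. The orbital period and stellar mass below are illustrative assumptions, not the published parameters of BD+20 1790; only the six-Jupiter-mass figure comes from the article.

```python
import math

G = 6.674e-11                # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30             # kg
M_JUP = 1.898e27             # kg
DAY = 86400.0                # s

def rv_semi_amplitude(m_planet, m_star, period):
    """K in m/s for a circular, edge-on orbit with m_planet << m_star."""
    return (2 * math.pi * G / period) ** (1 / 3) * m_planet / m_star ** (2 / 3)

# A 6-Jupiter-mass planet on an assumed ~8-day orbit around an assumed
# 0.8-solar-mass star tugs its star by hundreds of metres per second --
# far larger than the metre-per-second wobbles of typical exoplanets:
k = rv_semi_amplitude(6 * M_JUP, 0.8 * M_SUN, 7.8 * DAY)
print(f"{k:.0f} m/s")
```

A signal this large is easy to measure; the difficulty, as Gálvez-Ortiz notes, is that stellar activity on a young star can produce velocity jitter of comparable size.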

There is currently a severe lack of knowledge about the early stages of planet evolution. Most planet-search surveys tend to target much older stars, with ages in excess of a billion years. Only one other young planet, with an age of 100 million years, was previously known; at just 35 million years, BD+20 1790b is roughly a third that age. The detection of young planets will allow astronomers to test formation scenarios and investigate the early stages of planetary evolution.

BD+20 1790b was discovered using observations made at different telescopes, including the Observatorio de Calar Alto (Almería, Spain) and the Observatorio del Roque de los Muchachos (La Palma, Spain) over the last five years. The discovery team is an international collaboration including: M.M. Hernán Obispo, E. De Castro and M. Cornide (Universidad Complutense de Madrid, Spain), M.C. Gálvez-Ortiz and J.R. Barnes, (University of Hertfordshire, U.K.), G. Anglada-Escudé (Carnegie Institution of Washington, USA) and S.R. Kane (NASA Exoplanet Institute, Caltech, USA).

From sciencedaily.com

Hot Space Shuttle Images

Researchers at NASA are using a novel thermal-imaging system on board a Navy aircraft to capture images of heat patterns that light up the surface of the space shuttle as it returns through the Earth's atmosphere. The researchers have thus far imaged three shuttle missions and are processing the data to create 3-D surface-temperature maps. The data will enable engineers to design systems to protect future spacecraft from the searing heat--up to 5,500 degrees Celsius--seen during reentry.

"We want to understand peak temperatures, when they happen and where, because that determines the type of material for, and size of, a protection system," says Thomas Horvath, principal investigator of the project, called Hypersonic Thermodynamic InfraRed Measurements (HYTHIRM), at Langley Research Center in Hampton, VA.

Hot body: These thermal images were taken of space shuttle Discovery on September 11. Temperature data was used to make the color images (middle and bottom), blue being the lowest temperatures and red the highest.

NASA has become more concerned with safety and developing tools for inspecting and protecting the shuttle since the 2003 space shuttle Columbia disaster, when damage to the shuttle's wing compromised its heat-resistant shield, causing it to lose structural integrity and break apart during reentry, killing all seven astronauts aboard. Horvath, also a support team member of the Columbia Accident Investigation Board (CAIB), says the HYTHIRM project was developed in response to the Columbia accident.

"I certainly think [the researchers] can learn something about what causes the heating," says Douglas Osheroff, a professor of physics and applied physics at Stanford University and a member of the CAIB. He adds that the thermal images could also be used as a diagnostic tool to check the integrity of the shuttle's tiles during reentry. Currently, engineers must manually inspect the tiles upon the shuttle's return.

To image the shuttle, the researchers used a novel optical system called Cast Glance on board a Navy P-3 Orion aircraft. The system is used mostly by the Department of Defense for missile defense missions and so had to be slightly modified for the NASA project. The Navy researchers added a high-resolution, off-the-shelf video camera and adjusted it to filter infrared light. They then calibrated Cast Glance's optical sensors so that, by measuring the infrared radiation from the shuttle, they could calculate the surface temperatures.
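
One common way to turn a calibrated infrared radiance into a surface temperature is to invert Planck's law at the sensing wavelength. The sketch below shows only that core step, not the HYTHIRM team's actual pipeline, which must also handle emissivity, atmospheric absorption and multi-band data.

```python
import math

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m/s
K_B = 1.381e-23  # Boltzmann constant, J/K

def planck(temperature, wavelength):
    """Blackbody spectral radiance, W m^-2 sr^-1 m^-1."""
    c1 = 2 * H * C ** 2 / wavelength ** 5
    return c1 / (math.exp(H * C / (wavelength * K_B * temperature)) - 1)

def brightness_temperature(radiance, wavelength):
    """Invert Planck's law: radiance at `wavelength` (m) -> kelvin."""
    c1 = 2 * H * C ** 2 / wavelength ** 5
    return H * C / (wavelength * K_B * math.log(c1 / radiance + 1))

# Round trip at 1000 K in the near infrared (1.6 micrometres):
wl = 1.6e-6
print(round(brightness_temperature(planck(1000.0, wl), wl)))  # 1000
```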

The Navy jet flies to within 37 kilometers of the space shuttle when the latter is traveling at speeds of between two and three miles per second, acquiring eight uninterrupted minutes of data: approximately 10,000 to 15,000 images for each mission.

The researchers' focus was the underbelly of the shuttle, which is covered by about 10,000 thermal protective tiles; the highest-heating areas, near the nose and along the leading edge of each wing, are shielded by a material called reinforced carbon-carbon (RCC). As the shuttle pushes air molecules out of the way, says Deborah Tomak, project manager of HYTHIRM, a boundary layer, a protective region similar to insulation, forms around the shuttle where temperatures are between 1,093 and 1,649 degrees Celsius. Just outside that boundary layer, temperatures can rise to a sweltering 5,500 degrees.

Any damage to the tiles, or a protrusion or bump on the underbelly of the shuttle, can cause a break in the boundary layer and allow in extreme heat. Of particular concern are the gap fillers, pieces of ceramic-coated fabric the thickness of a sheet of paper that fit between the tiles to provide cushioning, which have been known to protrude. (NASA, however, says the fillers do not pose a safety concern.)

The Langley researchers imaged three shuttle missions: Discovery on March 28 (STS-119); Atlantis on May 24 (STS-125); and Discovery again on September 11 (STS-128). They also conducted two small flight-research experiments. "We added a tiny bump to Discovery's wing, approximately a quarter of an inch, to better understand what is called a boundary layer transition or trip in the flow fields," says Tomak. The researchers also coated two of the tiles with a material that is being developed for the heat shield of the Orion crew exploration vehicle.

The researchers are just beginning to process all the collected data into 3-D surface temperature maps, which they will compare with measurements from thermal sensors on the shuttle's underbelly and with computational fluid dynamic models. Horvath says they will present their results at a conference in January 2010.

However, the researchers have already seen some unexpected results. A small imperfection, possibly as small as a tenth of an inch, on the opposite side from the bump purposely placed on Discovery's wing, created high temperatures in a much larger area than what you normally see, says Horvath.

Osheroff says he is interested to see if the analysis finds different results for different orbiters. For example, Columbia, the first shuttle built, was 20,000 pounds heavier than the other orbiters. "Heating patterns depend on the attitude or orientation of the orbiter during reentry, so it would be beneficial to conduct tests for at least two flights of each orbiter."

There are only six remaining space shuttle flights before the orbiters are scheduled to retire. Horvath says the researchers hope they can continue to image the remaining missions, but final approval is still pending. "Our ability to accurately predict thermal data will have a profound impact on designs for new vehicles," he says.

By Brittany Sauser

Scientists See Water Ice In Fresh Meteorite Craters On Mars

ScienceDaily (Sep. 25, 2009) — NASA's Mars Reconnaissance Orbiter has revealed frozen water hiding just below the surface of mid-latitude Mars. The spacecraft's observations were obtained from orbit after meteorites excavated fresh craters on the Red Planet.

Scientists controlling instruments on the orbiter found bright ice exposed at five Martian sites with new craters that range in depth from approximately half a meter to 2.5 meters (1.5 feet to 8 feet). The craters did not exist in earlier images of the same sites. Some of the craters show a thin layer of bright ice atop darker underlying material. The bright patches darkened in the weeks following initial observations, as the freshly exposed ice vaporized into the thin Martian atmosphere. One of the new craters had a bright patch of material large enough for one of the orbiter's instruments to confirm it is water-ice.

The finds indicate that water ice occurs beneath Mars' surface halfway between the north pole and the equator, a lower latitude than expected in the Martian climate.

Earlier and later HiRISE images of a fresh meteorite crater 12 meters, or 40 feet, across located within Arcadia Planitia on Mars show how water ice excavated at the crater faded with time. The images, each 35 meters, or 115 feet across, were taken in November 2008 and January 2009.

"This ice is a relic of a more humid climate from perhaps just several thousand years ago," said Shane Byrne of the University of Arizona, Tucson.

Byrne is a member of the team operating the orbiter's High Resolution Imaging Science Experiment, or HiRISE camera, which captured the unprecedented images. Byrne and 17 co-authors report the findings in the Sept. 25 edition of the journal Science.

"We now know we can use new impact sites as probes to look for ice in the shallow subsurface," said Megan Kennedy of Malin Space Science Systems in San Diego, a co-author of the paper and member of the team operating the orbiter's Context Camera.

During a typical week, the Context Camera returns more than 200 images of Mars that cover a total area greater than California. The camera team examines each image, sometimes finding dark spots that fresh, small craters make in terrain covered with dust. Checking earlier photos of the same areas can confirm a feature is new. The team has found more than 100 fresh impact sites, mostly closer to the equator than the ones that revealed ice.

An image from the camera on Aug. 10, 2008, showed apparent cratering that occurred after an image of the same ground was taken 67 days earlier. The opportunity to study such a fresh impact site prompted a look by the orbiter's higher resolution camera on Sept. 12, 2008, confirming a cluster of small craters.

"Something unusual jumped out," Byrne said. "We observed bright material at the bottoms of the craters with a very distinct color. It looked a lot like ice."

The bright material at that site did not cover enough area for a spectrometer instrument on the orbiter to determine its composition. However, a Sept. 18, 2008, image of a different mid-latitude site showed a crater that had not existed eight months earlier. This crater had a larger area of bright material.

"We were excited about it, so we did a quick-turnaround observation," said co-author Kim Seelos of Johns Hopkins University Applied Physics Laboratory in Laurel, Md. "Everyone thought it was water-ice, but it was important to get the spectrum for confirmation."

Mars Reconnaissance Orbiter Project Scientist Rich Zurek, of NASA's Jet Propulsion Laboratory, Pasadena, Calif., said, "This mission is designed to facilitate coordination and quick response by the science teams. That makes it possible to detect and understand rapidly changing features."

The ice exposed by fresh impacts suggests that NASA's Viking Lander 2, digging into mid-latitude Mars in 1976, might have struck ice if it had dug 10 centimeters (4 inches) deeper. The Viking 2 mission, which consisted of an orbiter and a lander, launched in September 1975 and became one of the first two space probes to land successfully on the Martian surface. The Viking 1 and 2 landers characterized the structure and composition of the atmosphere and surface. They also conducted on-the-spot biological tests for life on another planet.

Twin Keck Telescopes Probe Dual Dust Disks

ScienceDaily (Sep. 25, 2009) — Astronomers using the twin 10-meter telescopes at the W. M. Keck Observatory in Hawaii have explored one of the most compact dust disks ever resolved around another star. If placed in our own solar system, the disk would span about four times Earth’s distance from the sun, reaching nearly to Jupiter’s orbit. The compact inner disk is accompanied by an outer disk that extends hundreds of times farther.

The centerpiece of the study is the Keck Interferometer Nuller (KIN), a device that combines light captured by both of the giant telescopes in a way that allows researchers to study faint objects otherwise lost in a star’s brilliant glare. "This is the first compact disk detected by the KIN, and a demonstration of its ability to detect dust clouds a hundred times smaller than a conventional telescope can see," said Christopher Stark, an astronomer at NASA’s Goddard Space Flight Center in Greenbelt, Md., who led the research team.

By merging the beams from both telescopes in a particular way, the KIN essentially creates a precise blind spot that blocks unwanted starlight but allows faint adjacent signals – such as the light from dusty disks surrounding the star – to pass through.

This diagram compares 51 Ophiuchi and its dust disks to the sun, planets and zodiacal dust in the solar system. Zones with larger dust grains are red; those with smaller grains are blue. Planet sizes are not to scale.

In April 2007, the team targeted 51 Ophiuchi, a young, hot, B-type star about 410 light-years away in the constellation Ophiuchus. Astronomers suspect the star and its disks represent a rare, nearby example of a young planetary system just entering the last phase of planet formation, although it is not yet known whether planets have formed there.

"Our new observations suggest 51 Ophiuchi is a beautiful protoplanetary system with a cloud of dust from comets and asteroids extremely close to its parent star," said Marc Kuchner, an astronomer at Goddard and a member of the research team.

Planetary systems are surprisingly dusty places. Much of the dust in our solar system forms inward of Jupiter's orbit, as comets crumble near the sun and asteroids of all sizes collide. This dust reflects sunlight and sometimes can be seen as a wedge-shaped sky glow – called the zodiacal light – before sunrise or after sunset.

Dusty disks around other stars that arise through the same processes are called "exozodiacal" clouds. "Our study shows that 51 Ophiuchi’s disk is more than 100,000 times denser than the zodiacal dust in the solar system," explained Stark. "This suggests that the system is still relatively young, with many colliding bodies producing vast amounts of dust."

To decipher the structure and make-up of the star’s dust clouds, the team combined KIN observations at multiple wavelengths with previous studies from NASA’s Spitzer Space Telescope and the European Southern Observatory’s Very Large Telescope Interferometer in Chile.

The inner disk extends about 4 astronomical units (AU) from the star and rapidly tapers off. (One AU is Earth’s average distance from the sun, or 93 million miles.) The disk’s infrared color indicates that it mainly harbors particles with sizes of 10 micrometers – smaller than a grain of fine sand – and larger.

The outer disk begins roughly where the inner disk ends and reaches about 1,200 AU. Its infrared signature shows that it mainly holds grains just one percent the size of those in the inner disk – similar in size to the particles in smoke. Another difference: The outer disk appears more puffed up, extending farther away from its orbital plane than the inner disk.

"We suspect that the inner disk gives rise to the outer disk," explained Kuchner. As asteroid and comet collisions produce dust, the larger particles naturally spiral toward the star. But pressure from the star’s light pushes smaller particles out of the system. This process, which occurs in our own solar system, likely operates even better around 51 Ophiuchi, a star 260 times more luminous than the sun.

Radar Map Of Buried Mars Layers Matches Climate Cycles

ScienceDaily (Sep. 23, 2009) — New, three-dimensional imaging of Martian north-polar ice layers by a radar instrument on NASA's Mars Reconnaissance Orbiter is consistent with theoretical models of Martian climate swings during the past few million years.

Alignment of the layering patterns with the modeled climate cycles provides insight about how the layers accumulated. These ice-rich, layered deposits cover an area one-third larger than Texas and form a stack up to 2 kilometers (1.2 miles) thick atop a basal deposit with additional ice.

"Contrast in electrical properties between layers is what provides the reflectivity we observe with the radar," said Nathaniel Putzig of Southwest Research Institute, Boulder, Colo., a member of the science team for the Shallow Radar instrument on the orbiter. "The pattern of reflectivity tells us about the pattern of material variations within the layers."

A radar-generated map of the thickness of the layered deposits.


Earlier radar observations indicated that the Martian north-polar layered deposits are mostly ice. Radar contrasts between different layers in the deposits are interpreted as differences in the concentration of rock material, in the form of dust, mixed with the ice. These deposits on Mars hold about one-third as much water as Earth's Greenland ice sheet.

Putzig and nine co-authors report findings from 358 radar observations in a paper accepted for publication by the journal Icarus and currently available online.

Their radar results provide a cross-sectional view of the north-polar layered deposits of Mars, showing that high-reflectivity zones, with multiple contrasting layers, alternate with more-homogenous zones of lower reflectivity. Patterns of how these two types of zones alternate can be correlated to models of how changes in Mars' tilt on its axis have produced changes in the planet's climate in the past 4 million years or so, but only if some possibilities for how the layers form are ruled out.

"We're not doing the climate modeling here; we are comparing others' modeling results to what we observe with the radar, and using that comparison to constrain the possible explanations for how the layers form," Putzig said.

The most recent 300,000 years of Martian history are a period of less dramatic swings in the planet's tilt than during the preceding 600,000 years. Since the top zone of the north-polar layered deposits -- the most recently deposited portion -- is strongly radar-reflective, the researchers propose that such sections of high-contrast layering correspond to periods of relatively small swings in the planet's tilt.

They also propose a mechanism for how those contrasting layers would form. The observed pattern does not fit well with an earlier interpretation that the dustier layers in those zones are formed during high-tilt periods when sunshine on the polar region sublimates some of the top layer's ice and concentrates the dust left behind. Rather, it fits an alternative interpretation that the dustier layers are simply deposited during periods when the atmosphere is dustier.

The new radar mapping of the extent and depth of five stacked units in the north-polar layered deposits reveals that the geographical center of ice deposition probably shifted by 400 kilometers (250 miles) or more at least once during the past few million years.

"The radar has been giving us spectacular results," said Jeffrey Plaut of NASA's Jet Propulsion Laboratory, Pasadena, Calif., a co-author of the paper. "We have mapped continuous underground layers in three dimensions across a vast area."

The Italian Space Agency operates the Shallow Radar instrument, which it provided for NASA's Mars Reconnaissance Orbiter. The orbiter has been studying Mars with six advanced instruments since 2006. It has returned more data from the planet than all other past and current missions to Mars combined. For more information about the mission, visit http://www.nasa.gov/mro.