Meet the Nimble-Fingered Interface of the Future

Microsoft's Kinect, a 3-D camera and software for gaming, has made a big impact since its launch in 2010. Eight million devices were sold in the product's.....

Electronic Implant Dissolves in the Body

Researchers at the University of Illinois at Urbana-Champaign, Tufts University, and ......

Sorting Chip May Lead to Cell Phone-Sized Medical Labs

The device uses two beams of acoustic -- or sound -- waves to act as acoustic tweezers and sort a continuous flow of cells on a ....

TDK sees hard drive breakthrough in areal density

Perpendicular magnetic recording was an idea that languished for many years, says a TDK technology backgrounder, because the ....

Engineers invent new device that could increase Internet

The device uses the force generated by light to flip a mechanical switch of light on and off at a very high speed…



'Matrix'-Style Effortless Learning? Vision Scientists Demonstrate Innovative Learning Method

Experiments conducted at Boston University (BU) and ATR Computational Neuroscience Laboratories in Kyoto, Japan, recently demonstrated that researchers could use decoded functional magnetic resonance imaging (fMRI) to induce activity patterns in a person's visual cortex that match a previously known target state, thereby improving performance on visual tasks.

In the future, a person may be able to watch a computer screen and have his or her brain patterns modified to match those of a high-performing athlete, or to recuperate from an accident or disease. Though preliminary, researchers say such possibilities may exist: an innovative learning method that uses decoded functional magnetic resonance imaging could modify brain activity to help people recover from an accident or disease, learn a new language or even fly a plane.

"Adult early visual areas are sufficiently plastic to cause visual perceptual learning," said lead author and BU neuroscientist Takeo Watanabe of the part of the brain analyzed in the study.

Neuroscientists have found that pictures gradually build up inside a person's brain, appearing first as lines, edges, shapes, colors and motion in early visual areas. The brain then fills in greater detail to make a red ball appear as a red ball, for example.

Researchers studied the early visual areas for their ability to cause improvements in visual performance and learning.

"Some previous research confirmed a correlation between improving visual performance and changes in early visual areas, while other researchers found correlations in higher visual and decision areas," said Watanabe, director of BU's Visual Science Laboratory. "However, none of these studies directly addressed the question of whether early visual areas are sufficiently plastic to cause visual perceptual learning." Until now.

Boston University post-doctoral fellow Kazuhisa Shibata designed and implemented a method using decoded fMRI neurofeedback to induce a particular activation pattern in targeted early visual areas that corresponded to a pattern evoked by a specific visual feature in a brain region of interest. The researchers then tested whether repetitions of the activation pattern caused visual performance improvement on that visual feature.
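The closed loop Shibata built can be thought of in three steps: read out the current activity pattern, score how closely it matches the target pattern, and feed that score back so the subject can gradually steer toward the target. Here is a toy sketch of that loop, using a correlation-based score and a simple nudge toward the target as a stand-in for the subject's learning; all names and numbers are illustrative assumptions, not the study's actual pipeline:

```python
import random
import statistics

random.seed(1)
N_VOXELS = 50

def similarity(pattern, target):
    """Pearson correlation between the current activity pattern and the target."""
    mp, mt = statistics.fmean(pattern), statistics.fmean(target)
    num = sum((p - mp) * (t - mt) for p, t in zip(pattern, target))
    den = (sum((p - mp) ** 2 for p in pattern) *
           sum((t - mt) ** 2 for t in target)) ** 0.5
    return num / den

# Illustrative "target" pattern and an unrelated starting pattern
target = [random.gauss(0, 1) for _ in range(N_VOXELS)]
pattern = [random.gauss(0, 1) for _ in range(N_VOXELS)]

for trial in range(100):
    score = similarity(pattern, target)
    # Feedback step: the subject sees only the score (e.g., the size of a disc)
    # and, by trial and error, shifts activity toward the target. Here a small
    # nudge toward the target stands in for that unconscious learning.
    pattern = [p + 0.05 * (t - p) for p, t in zip(pattern, target)]

print(round(similarity(pattern, target), 2))  # climbs toward 1.0 over training
```

The key property the study exploits is visible even in this sketch: the feedback signal never reveals what is being trained, yet repeated scoring and adjustment is enough to pull the pattern toward the target.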

The result, say researchers, is a novel learning approach sufficient to cause long-lasting improvement in tasks that require visual performance.

What's more, the approach worked even when test subjects were not aware of what they were learning.

"The most surprising thing in this study is that mere inductions of neural activation patterns corresponding to a specific visual feature led to visual performance improvement on the visual feature, without presenting the feature or subjects' awareness of what was to be learned," said Watanabe, who developed the idea for the research project along with Mitsuo Kawato, director of the ATR lab, and Yuka Sasaki, an assistant in neuroscience at Massachusetts General Hospital.

"We found that subjects were not aware of what was to be learned while behavioral data obtained before and after the neurofeedback training showed that subjects' visual performance improved specifically for the target orientation, which was used in the neurofeedback training," he said.

The finding brings up an inevitable question. Is hypnosis or a type of automated learning a potential outcome of the research?

"In theory, hypnosis or a type of automated learning is a potential outcome," said Kawato. "However, in this study we confirmed the validity of our method only in visual perceptual learning. So we have to test if the method works in other types of learning in the future. At the same time, we have to be careful so that this method is not used in an unethical way."

At present, the decoded neurofeedback method might be used for various types of learning, including memory, motor and rehabilitation.

The National Science Foundation, the National Institutes of Health and the Ministry of Education, Culture, Sports, Science and Technology in Japan supported the research.

From sciencedaily

No Need to Shrink Guts to Have a Larger Brain

Brain tissue is a major consumer of energy in the body. If an animal species evolves a larger brain than its ancestors, the increased need for energy can be met by either obtaining additional sources of food or by a trade-off with other functions in the body. In humans, the brain is three times larger and thus requires a lot more energy than that of our closest relatives, the great apes. Until now, the generally accepted theory for this condition was that early humans were able to redirect energy to their brains thanks to a reduced digestive tract. Zurich primatologists, however, have now disproved this theory, demonstrating that mammals with relatively large brains actually tend to have a somewhat bigger digestive tract. Ana Navarrete, the first author on the study recently published in Nature, has studied hundreds of carcasses from zoos and museums.


"The data set contains a hundred species, from the stag to the shrew," explains the PhD student. The scientists involved in the study then compared the size of the brain with the fat-free body mass. Senior author Karin Isler stresses that, "it is extremely important to take an animal's adipose deposits into consideration as, in some species, these constitute up to half of the body mass in autumn." But even compared with fat-free body mass, the size of the brain does not correlate negatively with the mass of other organs.

More fat, smaller brain
Nevertheless, the storage of fat plays a key role in brain size evolution. The researchers discovered another rather surprising correlation: the more fat an animal species can store, the smaller its brain. Although adipose tissue itself does not use much energy, fat animals need a lot of energy to carry extra weight, especially when climbing or running. This energy is then lacking for potential brain expansion. "It seems that large adipose deposits often come at the expense of mental flexibility," says Karin Isler. "We humans are an exception, along with whales and seals -- probably because, like swimming, our bipedalism doesn't require much more energy even when we are a bit heavier."

Interplay of energetic factors
The rapid increase in brain size and the associated increase in energy intake began about two million years ago in the genus Homo. Based on their extensive studies of animals, the Zurich researchers propose a scenario in which several energetic factors are involved: "In order to stabilize the brain's energy supply on a higher level, prehistoric man needed an all-year, high-quality source of food, such as underground tubers or meat. As they no longer climbed every day, they perfected the art of walking upright. Even more important, however, is communal child care," says Karin Isler. Because ape mothers do not receive any help, they can only raise an offspring every five to eight years. Thanks to communal care for mothers and children, humans can afford both: a huge brain and more frequent offspring.

From sciencedaily

Stem cells created by cloning human eggs

Scientists have for the first time derived embryonic stem cells from individual patients, raising the possibility of personalised genetic treatments. A team of scientists at The New York Stem Cell Foundation (NYSCF) Lab in New York created the cells through a cloning process, by adding the nuclei of adult skin cells from patients with type 1 diabetes to unfertilized donor eggs.



Such patient-specific cells could potentially be used to replace damaged or diseased cells without fear of rejection by the patient's immune system. They could help treat diseases such as diabetes, Parkinson's, and Alzheimer's.

"The specialized cells of the adult human body have an insufficient ability to regenerate missing or damaged cells caused by many diseases and injuries," says Dr Dieter Egli, NYSCF senior scientist.
"But if we can reprogram cells to a pluripotent state, they can give rise to the very cell types affected by disease, providing great potential to effectively treat and even cure these diseases."
There's a lot more work to be done. In this initial study, the stem cells produced were abnormal, meaning they couldn't be safely used: they contained genetic material from two people, rather than just the patient, and had 69 chromosomes rather than the usual 46. However, the team believes it can solve this problem.
"In this three-year study, we successfully reprogrammed skin cells to the pluripotent state," says Egli.
"Our hope is that we can eventually overcome the remaining hurdles and use patient-specific stem cells to treat and cure people who have diabetes and other diseases."

By Emma Woollacott 
From tgdaily

Gastric Bacterium Helicobacter Pylori Protects Against Asthma

Allergy-induced asthma has been on the increase in the industrialized world for decades and has virtually taken on epidemic proportions. The rapid rise in allergic airway disease is attributed to air pollution, smoking, the hygiene hypothesis and the widespread use of antibiotics. The hygiene hypothesis states that modern hygiene measures have led to a lack of exposure to infectious agents, which is important for the normal maturation of the immune system. In an article published in the Journal of Clinical Investigation, scientists from the University of Zurich and the University Medical Center of the Johannes Gutenberg University Mainz now reveal that the increase in asthma could be put down to the specific disappearance of the gastric bacterium Helicobacter pylori (H. pylori) from Western societies.

 Electron micrograph of H. pylori.


H. pylori is resistant to gastric acid. According to estimates, around half of the world's population may be infected with the bacterium. The infection often causes no symptoms, but under certain conditions it can lead to gastritis, gastric and duodenal ulcers, and stomach cancer. Consequently, H. pylori is often killed off with antibiotics as a precaution, even if the patient has no complaints.

Early infection with H. pylori protects against asthma
For their study, the researchers infected mice with H. pylori bacteria. If the mice were infected at the age of a few days old, they developed immunological tolerance to the bacterium and even reacted insignificantly -- if at all -- to strong, asthma-inducing allergens. Mice that were not infected with H. pylori until they had reached adulthood, however, had a much weaker defense. "Early infection impairs the maturation of the dendritic cells and triggers the accumulation of regulatory T-cells that are crucial for the suppression of asthma," says Anne Müller, a professor of molecular cancer research at the University of Zurich, explaining the protective mechanism.

If regulatory T-cells were transferred from infected to uninfected mice, the recipients, too, enjoyed effective protection against allergy-induced asthma. However, mice that had been infected early lost their resistance to asthma-inducing allergens if H. pylori was killed off with antibiotics after the sensitization phase. According to lung and allergy specialist Christian Taube, a senior physician at the III. Medical Clinic of the Johannes Gutenberg University Mainz, the new results confirm the hypothesis that the rise in allergic asthma in industrial nations is linked to the widespread use of antibiotics and the subsequent disappearance of micro-organisms that permanently populate the human body: "The study of these fundamental mechanisms is extremely important for us to understand asthma and be able to develop preventative and therapeutic strategies later on."

From sciencedaily

Source of Key Brain Function Located: How to Comprehend a Scene in Less Than a Second

The key is that the brain processes the interacting objects that comprise a scene more quickly than unrelated objects, according to corresponding author Irving Biederman, professor of psychology and computer science in the USC Dornsife College and the Harold W. Dornsife Chair in Neuroscience.
The study appears in the June 1 issue of The Journal of Neuroscience.

The intraparietal sulcus (IPS), a groove in the brain closer to the top of the head, is engaged with implementing visual attention.
 Lateral surface of the left cerebral hemisphere, viewed from the side; the intraparietal sulcus is visible at upper right, running horizontally.

The brain's ability to understand a whole scene on the fly "gives us an enormous edge on an organism that would have to look at objects one by one and slowly add them up," Biederman said. What's more, the interaction of objects in a scene actually allows the brain to identify those objects faster than if they were not interacting.

While previous research had already established the existence of this "scene-facilitation effect," the location of the part of the brain responsible for the effect remained a mystery. That's what Biederman and lead author Jiye G. Kim, a doctoral student in Biederman's lab, set out to uncover with Chi-Hung Juan of the Institute of Cognitive Neuroscience at the National Central University in Taiwan.

"The 'where' in the brain gives us clues as to the 'how,'" Biederman said. This study is the latest in an ongoing effort by Biederman and Kim to unlock the complex way in which the brain processes visual experience. The goal, as Biederman puts it, is to understand "how we get mind from brain."

To find out the "where" of the scene-facilitation effect, the researchers flashed drawings of pairs of objects for just 1/20 of a second. Some of these objects were depicted as interacting, such as a hand grasping for a pen, and some were not, with the hand reaching away from the pen. The test subjects were asked to press a button if a label on the screen matched either one of the two objects, which it did on half of the presentations.
A recent study by Kim and Biederman suggested that the source of the scene-facilitation effect was the lateral occipital cortex, or LO, which is a portion of the brain's visual processing center located between the ear and the back of the skull. However, the possibility existed that the LO was receiving help from the intraparietal sulcus, or IPS, which is a groove in the brain closer to the top of the head.

The IPS is engaged with implementing visual attention, and the fact that interacting objects may attract more attention left open the possibility that perhaps it was providing the LO with assistance.

While participants took the test, electromagnetic currents were used to alternately zap subjects' LO or IPS, temporarily numbing each region in turn and preventing it from providing assistance with the task.
All of the participants were pre-screened to ensure they could safely receive the treatment, known as transcranial magnetic stimulation (TMS), which produces minimal discomfort.

By measuring how accurate participants were in detecting objects shown as interacting or not interacting when either the LO or IPS were zapped, researchers could see how much help that part of the brain was providing. The results were clear: zapping the LO eliminated the scene-facilitation effect. Zapping the IPS, however, did nothing.

When it comes to providing a competitive edge in identifying objects that are part of an interaction, the lateral occipital cortex appears to be working alone. Or, at least, without help from the intraparietal sulcus.

The research was funded through Biederman's National Science Foundation grants as well as a competitive grant awarded to Kim by the National Science Foundation designed to allow US students to collaborate with scientists in East Asia. Kim worked with Chi-Hung Juan, an expert in transcranial magnetic stimulation.

From sciencedaily

A single gene may have shaped human cerebral cortex

The size and shape of the human cerebral cortex - responsible for all conscious thought - are largely determined by mutations in a single gene. The findings, from the Yale School of Medicine and two other universities, are based on a genetic analysis of one Turkish family and two Pakistani families whose children were born with the most severe form of microcephaly.



The children's brains are just 10 percent of the normal size and lack the normal human cortical architecture. The researchers found that the deformity was caused by mutations in the same gene, centrosomal NDE1, which is involved in cell division.

They say that the combination of an undersized brain and simplified architecture has not been linked to any other gene associated with brain development, implying that this gene was critical to the development of the complex cerebral cortex in humans.

"The degree of reduction in the size of the cerebral cortex and the effects on brain morphology suggest this gene plays a key role in the evolution of the human brain," said professor Murat Gunel, co-senior author of the paper.

"These findings demonstrate how single molecules have influenced the expansion of the human cerebral cortex in the last five million years. We are now a little closer to understanding just how this miracle happens."

By Kate Taylor 
From tgdaily

Blood analysis chip detects diseases in minutes

A big step forward in microfluidics has helped researchers develop stand-alone, self-powered chips that can diagnose diseases within minutes. The Self-powered Integrated Microfluidic Blood Analysis System (SIMBAS) can process whole blood samples without the use of external tubing and extra components.


"This is a very important development for global healthcare diagnostics," says UC Berkeley professor of bioengineering Luke Lee. “Field workers would be able to use this device to detect diseases such as HIV or tuberculosis in a matter of minutes." 

The SIMBAS biochip uses trenches patterned underneath nanoscale microfluidic channels. When whole blood is dropped onto the chip’s inlets, the relatively heavy red and white blood cells settle down into the trenches, separating from the clear blood plasma. The blood moves through the chip in a process called degas-driven flow.

In experiments, the researchers were able to capture more than 99 percent of the blood cells in the trenches and selectively separate plasma using this method.

The team demonstrated the proof-of-concept of SIMBAS by testing a five-microliter sample of whole blood that contained biotin (vitamin B7) at a concentration of about 1 part per 40 billion.
The chip provided a readout of the biotin levels in 10 minutes.

"The fact that we reduced the complexity of the biochip and used plastic components makes it much easier to manufacture in high volume at low cost," says Lee.

"Our goal is to address global health care needs with diagnostic devices that are functional, cheap and truly portable."

By Kate Taylor 
From tgdaily

Giftedness Linked to Prenatal Exposure to Higher Levels of Testosterone

Mrazik, a professor in the Faculty of Education's educational psychology department, and a colleague from Rider University in the U.S., have published a paper in Roeper Review linking giftedness (having an IQ score of 130 or higher) to prenatal exposure to higher levels of testosterone. Mrazik hypothesizes that, in the same way that physical and cognitive deficiencies can develop in utero, exposure to this naturally occurring chemical could likewise result in giftedness.

 A longstanding debate as to whether genius is a byproduct of good genes or good environment has an upstart challenger that may take the discussion in an entirely new direction. University of Alberta researcher Marty Mrazik says being bright may be due to an excess level of a natural hormone.

"There seems to be some evidence that excessive prenatal exposure to testosterone facilitates increased connections in the brain, especially in the right prefrontal cortex," said Mrazik. "That's why we see some intellectually gifted people with distinct personality characteristics that you don't see in the normal population."
Mrazik's notion came from observations made during clinical assessments of gifted individuals. He and his fellow researcher observed some specific traits among the subjects. This finding stimulated a conversation on the role of early development in setting the foundation for giftedness.

"It gave us some interesting ideas that there could be more to this notion of genius being predetermined from a biological perspective than maybe people gave it credit for," said Mrazik. "It seemed that the bulk of evidence from new technologies (such as Functional MRI scans) tell us that there's a little bit more going on than a genetic versus environmental interaction."

Based on their observations, the researchers made the hypothesis that this hormonal "glitch" in the in-utero neurobiological development means that gifted children are born with an affinity for certain areas such as the arts, math or science. Mrazik cautions that more research is needed to determine what exact processes may cause the development of the gifted brain.

He notes that more is known about what derails the brain's normal development, thus charting what makes gifted people gifted is very much a new frontier. Mrazik hopes that devices such as the Functional MRI scanner will give them a deeper understanding of the role of neurobiology in the development of the gifted brain.

"It's really hard to say what does put the brain in a pathway where it's going to be much more precocious," he said. "The next steps in this research lay in finding out what exact stimuli causes this atypical brain development."

From sciencedaily.com

The Weirdest Indicators of Serious Medical Risks

Today's computer-powered studies allow researchers to look beyond obvious health risks of the past. New analyses show, for example, that finger length, grip strength and even height may be reliable predictors of cancer, longevity and heart disease.

But not all statistically based findings are created equal, said Rebecca Goldin, a mathematician at George Mason University and volunteer for STATS.org.



"It's easy to get results that look impressive by trying a whole bunch of things on large databases of information. Things pop out, but they can be completely spurious because of chance," Goldin said. "It's now a fairly common thing to see something published and have someone say that it's not true."
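Goldin's point about chance is easy to demonstrate with a small simulation (pure Python; all numbers here are illustrative): test 200 completely random "risk factors" against a random "outcome," and several will look statistically significant at the conventional p < 0.05 level purely by luck.

```python
import random
import statistics

random.seed(0)
n_people, n_factors = 500, 200

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) *
           sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# A random "health outcome" and 200 random, unrelated "risk factors"
outcome = [random.gauss(0, 1) for _ in range(n_people)]
factors = [[random.gauss(0, 1) for _ in range(n_people)]
           for _ in range(n_factors)]

# For large samples, |r| > ~1.96/sqrt(n) corresponds to p < 0.05
threshold = 1.96 / n_people ** 0.5
false_hits = sum(1 for f in factors
                 if abs(pearson_r(f, outcome)) > threshold)
print(false_hits)  # expect around 0.05 * 200 = 10 spurious "findings"
```

Every factor here is pure noise, yet roughly one in twenty clears the significance bar, which is exactly why single "database pop-outs" deserve skepticism until replicated.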

Although the ease of mining medical databases can outpace scientists' ability to review the results (roughly 75 in-depth clinical-trial studies are published every day, but only about 11 reviews of them), some findings do stand up to statistical and cause-and-effect scrutiny.
We recap here some of the weirdest, yet credible, indicators of medical risks ever discovered.

Finger Length

At least two genes - HOXA and HOXD - control testicle development in the womb, and the testicles in turn produce testosterone. But these same genes also shape hand development, especially the index and ring fingers.

The discovery has spawned odd testosterone-based hypotheses about what the ratio of the two fingers means, from sexual fitness and exam performance to personality and sporting ability.

Most of the proposals have fallen short of any meaningful significance, but an upcoming study in the British Journal of Cancer suggests there is a significant link to prostate cancer: If the index finger is longer than the ring finger, a man is less likely to develop the cancer.

"It seems strange, but this isn't guesswork," said Rosalind Eeles, a cancer geneticist at the Institute for Cancer Research (ICR) in London and co-author of the study.

Eeles and her team compared more than 1,500 men with prostate cancer against more than 3,000 random men. Ignoring family history and other factors, men older than 60 years with a longer index finger were 33 percent less likely (on average) to develop prostate cancer. Younger men with a longer index finger fared even better, with an 87 percent average reduction in risk.

The association still needs to be tried against other populations to be a meaningful assessment of prostate cancer risk, but Goldin said "its speed and non-invasiveness does have something going for it."

"It's way too early to say how much hand screening could help," said Elizabeth Rapley, a molecular geneticist and spokesperson for ICR. "If anything, it gives us more of a handle on how prostate cancer starts - that testosterone may have a big role in the development of the disease."

Grip Strength

According to a 25-year study of more than 6,000 men aged 45 through 68, grip strength was the best predictor of how well they would avoid disability later in life. The weakest-gripping men suffered twice the disabilities of the strongest grippers. And in a separate study of older men and women, good grip strength was correlated with a longer lifespan.

But correlation is not causation. The best bet to living a long life, according to a plethora of research, is eating well, exercising regularly and avoiding harmful habits like smoking.

Flossing

The crud between your teeth may seem innocuous, but study after study has shown chronic infections of the mouth (also called periodontal diseases) increase the risk of circulatory woes, including coronary heart disease.

Mouth bacteria sneaking into the blood via the gums, the thinking goes, may lead to more heart-clogging arterial plaques. Inflammation caused by such a persistent infection may also prime the body for heart attacks.

Travel

If you're close to an airport and can fly cheap, you may get to see more of the world, but this could also increase your risk of developing skin cancer.

During a British economic rebound in the 1970s, Rapley said, people enjoyed the jump in their money's value by traveling abroad.

"Many of them went to the beaches of Spain and spent a lot of time in the sun," she said. "We now see an increase in the rate of melanoma in that population."

Birth Order

First-born boys may be more likely to develop testicular cancer later in life.
"Lots of studies suggest the first child is exposed to higher levels of estrogen, which gives greater risk of testicular cancer. But this has never been definitively proven," Rapley said.

Chemicals similar to estrogen are one major suspect for the doubling of testicular cancer in the past 40 years (an increase not from improved screening, she said). Estrogen analogs may get into food and water supplies, for example, via the pesticides they're found in.

Perhaps the strongest medical risk tied to birth order is childhood leukemia. It develops more often in older siblings and seems to be tied to socioeconomic status. Rapley suspects immune system training may also be part of the explanation.

"There are suggestions that it may have to do with exposure to viruses and colds and bacteria," she said. Siblings aren't around to give them as much exposure, she said, so "kids who go to child care at an early age are less likely to develop leukemia than kids kept at home."

In assessing any database-powered medical study, Goldin said it's important to look for large sample sizes, proposed causes, accounting for chance and extraneous effects, and acknowledgment of other hypotheses. But putting a health risk into perspective is perhaps the most important thing of all.

"There's a lot of medicine where it's just not clear how helpful it is to know something," Goldin says. "If you're doubling your risk of one in a million, for example, that's still two in a million. Unless it's got some significant impact on the way we evaluate treatment, it's hard to see any benefit."
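Goldin's closing caveat is simple arithmetic about relative versus absolute risk; a quick sketch using her one-in-a-million example:

```python
baseline = 1 / 1_000_000      # absolute baseline risk: 1 in a million
relative_risk = 2.0           # a scary-sounding "doubling" of risk

new_risk = baseline * relative_risk
absolute_increase = new_risk - baseline

print(new_risk)           # 2e-06 -> still just 2 in a million
print(absolute_increase)  # 1e-06 -> one extra case per million people
```

A "100 percent increase in risk" headline and a "one extra case per million people" headline can describe the same data; only the absolute numbers tell you whether the finding matters in practice.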


From gizmodo.com

MRI Scans Reveal Brain Changes in People at Genetic Risk for Alzheimer's

Researchers at Washington University School of Medicine in St. Louis report in the Dec. 15 issue of The Journal of Neuroscience that cognitively normal people carrying a particular form of the apolipoprotein E (APOE) gene, called APOE4, already show differences in brain function. The findings suggest that the gene variant affects brain function long before the brain begins accumulating the amyloid that will eventually lead to dementia.


 Researchers identified functional differences in the brains of APOE4-positive and APOE4-negative people. Red indicates increased connectivity among regions at rest while blue shows decreased connectivity.

"We looked at a group of structures in the brain that make up what's called the default mode network," says lead author Yvette I. Sheline, MD. "In particular, we are interested in a part of the brain called the precuneus, which may be important in Alzheimer's disease and in pre-Alzheimer's because it is one of the first regions to develop amyloid deposits. Another factor is that when you look at all of the structural and functional connections in the brain, the most connected structure is the precuneus. It links many other key brain structures together."

The research team conducted functional MRI scans on 100 people whose average age was 62. Just under half of them carried the APOE4 variant, which is a genetic risk factor for late-onset Alzheimer's disease. Earlier PET scans of the study subjects had demonstrated that they did not have amyloid deposits in the brain. Amyloid is the protein that makes up the senile plaques that dot the brains of Alzheimer's patients and interfere with cognitive function.

Participants in the study also underwent spinal puncture tests that revealed they had normal amyloid levels in their cerebrospinal fluid.

"Their brains were 'clean as a whistle,' " says Sheline, a professor of psychiatry, of radiology and of neurology and director of Washington University's Center for Depression, Stress and Neuroimaging. "As far as their brain amyloid burden and their cerebrospinal fluid levels, these individuals were completely normal. But the people who had the APOE4 variant had significant differences in the way various brain regions connected with one another."

Sheline's team focused on the brain's default mode network. Typically, the default network is active when the mind rests. Its activity slows down when an individual concentrates.

Subjects don't need to perform any particular tasks for researchers to study the default mode network. They simply relax in the MRI scanner and reflect or daydream while the machine measures oxygen levels and blood flow in the brain.

"We make sure they don't go to sleep," Sheline says. "But other than not sleeping, study participants had no instructions. They were just lying there at rest, and we looked at what their brains were doing."

This is the latest in a series of studies in which Sheline and her colleagues have looked at brain function in people at risk for Alzheimer's disease. Initially, her team compared the default mode networks in the brains of people with mild Alzheimer's disease to the same structures in the brains of those who were cognitively normal. In that study, her team found significant differences in how the network functioned.

In a second study, the team used PET imaging to compare cognitively normal people whose scans showed amyloid deposits in their brains with others whose scans showed no evidence of amyloid. Again, the default mode network operated differently in those with amyloid deposits.

In the current study, there was no evidence of dementia or amyloid deposits, yet those with the APOE4 variant still showed irregular functioning in the default mode network.

APOE4 is the major genetic risk factor for sporadic cases of Alzheimer's disease. Other genes that pass on inherited, early-onset forms of the disease have been identified, but APOE4 is the most important genetic marker of the disease identified so far, Sheline says.

The study subjects, all of whom participate in studies through the university's Charles F. and Joanne Knight Alzheimer's Disease Research Center, will be followed to see whether they eventually develop amyloid deposits. Sheline anticipates many will.

"I think a significant number of them eventually will be positive for amyloid," she says. "We hope that if some people begin to accumulate amyloid, we'll be able to look back at our data and identify particular patterns of brain function that might eventually be used to predict who is developing Alzheimer's disease."

The goal is to identify those with the highest risk of Alzheimer's and to develop treatments that interfere with the progression of the disease, keeping it from advancing to the stage when amyloid begins to build up in the brain and, eventually, dementia sets in.

"The current belief is that from the time excess amyloid begins to collect in the brain, it takes about 10 years for a person to develop dementia," Sheline says. "But this new study would suggest we might be able to intervene even before amyloid plaques begin to form. That could give us an even longer time window to intervene once an effective treatment can be developed."

This work was supported by grants from the National Institute of Mental Health and the National Institute on Aging of the National Institutes of Health.

From sciencedaily.com

Your Genome in Minutes: New Technology Could Slash Sequencing Time

The researchers have patented an early prototype technology that they believe could lead to an ultrafast commercial DNA sequencing tool within ten years. Their work is described in a study published this month in the journal Nano Letters.

 Dr Joshua Edel shows the prototype chip, and an array of the chips prior to use.

The research suggests that scientists could eventually sequence an entire genome in a single lab procedure, whereas at present it can only be sequenced after being broken into pieces in a highly complex and time-consuming process. Fast and inexpensive genome sequencing could allow ordinary people to unlock the secrets of their own DNA, revealing their personal susceptibility to diseases such as Alzheimer's, diabetes and cancer. Medical professionals are already using genome sequencing to understand population-wide health issues and research ways to tailor individualised treatments or preventions.

Dr Joshua Edel, one of the authors on the study from the Department of Chemistry at Imperial College London, said: "Compared with current technology, this device could lead to much cheaper sequencing: just a few dollars, compared with $1m to sequence an entire genome in 2007. We haven't tried it on a whole genome yet but our initial experiments suggest that you could theoretically do a complete scan of the 3,165 million bases in the human genome within minutes, providing huge benefits for medical tests, or DNA profiles for police and security work. It should be significantly faster and more reliable, and would be easy to scale up to create a device with the capacity to read up to 10 million bases per second, versus the typical 10 bases per second you get with the present day single molecule real-time techniques."
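The throughput figures quoted above can be checked with simple arithmetic, using only the numbers in the quote:

```python
# Back-of-the-envelope check of the throughput figures quoted above.
genome_bases = 3_165_000_000   # ~3,165 million bases in the human genome
current_rate = 10              # bases/second, present-day single-molecule techniques
proposed_rate = 10_000_000     # bases/second, the scaled-up device

# Time to read one genome at each rate
current_years = genome_bases / current_rate / (3600 * 24 * 365)
proposed_minutes = genome_bases / proposed_rate / 60

print(f"current rate:  ~{current_years:.0f} years per genome")    # ~10 years
print(f"proposed rate: ~{proposed_minutes:.1f} minutes per genome")  # ~5.3 minutes
```

This is why the researchers can speak of minutes rather than years: the million-fold rate increase turns a decade-scale read into a coffee-break one.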

In the new study, the researchers demonstrated that it is possible to propel a DNA strand at high speed through a tiny 50 nanometre (nm) hole -- or nanopore -- cut in a silicon chip, using an electrical charge. As the strand emerges from the back of the chip, its coding sequence (bases A, C, T or G) is read by a 'tunnelling electrode junction'. This 2 nm gap between two wires supports an electrical current that interacts with the distinct electrical signal from each base. A powerful computer then interprets each base's signal to construct the genome sequence. The study combines these well-documented techniques in a single device for the first time.
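The computer's job in that last step is essentially signal classification. As a purely illustrative sketch -- the signature values below are invented for illustration, not taken from the paper -- base-calling amounts to matching each measured current level against a known signature for each base:

```python
# Toy base-caller: classify each measured tunnelling-current level by the
# nearest known base signature. All signature values are hypothetical.
SIGNATURES = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}  # arbitrary units

def call_bases(trace):
    """Map each current reading to the base whose signature is closest."""
    return "".join(
        min(SIGNATURES, key=lambda base: abs(SIGNATURES[base] - level))
        for level in trace
    )

print(call_bases([1.1, 3.9, 2.2, 2.8]))  # -> ATCG
```

A real base-caller must also cope with noise, overlapping signals, and the speed of the strand through the pore, but the nearest-signature idea is the core of it.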

Sequencing using nanopores has long been considered the next big development for DNA technology, thanks to its potential for high speed and high-capacity sequencing. However, designs for an accurate and fast reader have not been demonstrated until now.

Co-author Dr Emanuele Instuli, from the Department of Chemistry at Imperial College London, explained the challenges they faced in this research: "Getting the DNA strand through the nanopore is a bit like sucking up spaghetti. Until now it has been difficult to precisely align the junction and the nanopore. Furthermore, engineering the electrode wires with such dimensions approaches the atomic scale and is effectively at the limit of existing instrumentation. However in this experiment we were able to make two tiny platinum wires into an electrode junction with a gap sufficiently small to allow the electron current to flow between them."

This technology would have several distinct advantages over current techniques, according to co-author, Aleksandar Ivanov from the Department of Chemistry at Imperial College London: "Nanopore sequencing would be a fast, simple procedure, unlike available commercial methods, which require time-consuming and destructive chemical processes to break down and replicate small sections of the DNA molecules to determine their sequence. Additionally, these silicon chips are incredibly durable compared with some of the more delicate materials currently used. They can be handled, washed and reused many times over without degrading their performance."

Dr Tim Albrecht, another author on the study, from the Department of Chemistry at Imperial College London, says: "The next step will be to differentiate between different DNA samples and, ultimately, between individual bases within the DNA strand (ie true sequencing). I think we know the way forward, but it is a challenging project and we have to make many more incremental steps before our vision can be realised."

From sciencedaily.com

Brain System Behind General Intelligence Discovered

The study, to be published the week of February 22 in the early edition of the Proceedings of the National Academy of Sciences, adds new insight to a highly controversial question: What is intelligence, and how can we measure it?

The research team included Jan Gläscher, first author on the paper and a postdoctoral fellow at Caltech, and Ralph Adolphs, the Bren Professor of Psychology and Neuroscience and professor of biology. The Caltech scientists teamed up with researchers at the University of Iowa and USC to examine a uniquely large data set of 241 brain-lesion patients who had all taken IQ tests. The researchers mapped the location of each patient's lesion and correlated it with the patient's IQ score to produce a map of the brain regions that influence intelligence.

The brain regions important for general intelligence are found in several specific places (orange regions shown on the brain on the left). Looking inside the brain reveals the connections between these regions, which are particularly important to general intelligence. In the image on the right, the brain has been made partly transparent. The big orange regions in the right image are connections (like cables) that connect the specific brain regions in the image on the left.


"General intelligence, often referred to as Spearman's g-factor, has been a highly contentious concept," says Adolphs. "But the basic idea underlying it is undisputed: on average, people's scores across many different kinds of tests are correlated. Some people just get generally high scores, whereas others get generally low scores. So it is an obvious next question to ask whether such a general ability might depend on specific brain regions."

The researchers found that, rather than residing in a single structure, general intelligence is determined by a network of regions across both sides of the brain.

"One of the main findings that really struck us was that there was a distributed system here. Several brain regions, and the connections between them, were what was most important to general intelligence," explains Gläscher.

"It might have turned out that general intelligence doesn't depend on specific brain areas at all, and just has to do with how the whole brain functions," adds Adolphs. "But that's not what we found. In fact, the particular regions and connections we found are quite in line with an existing theory about intelligence called the 'parieto-frontal integration theory.' It says that general intelligence depends on the brain's ability to integrate -- to pull together -- several different kinds of processing, such as working memory."

The researchers say the findings will open the door to further investigations about how the brain, intelligence, and environment all interact.

The work at Caltech was funded by the National Institutes of Health, the Simons Foundation, the Deutsche Akademie der Naturforscher Leopoldina, and a Global Center of Excellence grant from the Japanese government.

From sciencedaily.com

'Closed Heart Surgery': Scientists Jump-start The Heart By Gene Transfer

ScienceDaily (Oct. 7, 2009) — Scientists from the Universities of Michigan and Minnesota show in a research report published online in the FASEB Journal that gene therapy may be used to improve an ailing heart's ability to contract properly. In addition to showing gene therapy's potential for reversing the course of heart failure, it also offers a tantalizing glimpse of a day when "closed heart surgery" via gene therapy is as commonly prescribed as today's cocktail of drugs.

"We hope that our study will lead some day to the development of new genetic-based therapies for heart failure patients," said Todd J. Herron, Ph.D., one of the researchers involved in the study and research assistant professor of molecular and integrative physiology at the University of Michigan. "The advent of molecular motor-based gene transfer for the failing heart will hopefully improve cardiac function and quality of life for heart failure patients."

To make this advance, Herron and colleagues treated heart muscle cells from the failing hearts of rabbits and humans with an adenovirus modified to carry one of two genes: one producing a protein that enables heart cells to contract normally (a fast molecular motor), and one that becomes active in failing hearts and is believed to be part of the body's way of coping with its perilous situation (a slow molecular motor). Heart cells treated with the fast-molecular-motor gene contracted better, while those treated with the slow-molecular-motor gene were unaffected.

"Helping hearts heal themselves, rather than prescribing yet another drug to sustain a failing organ, would be a major advance for doctors and patients alike," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. "Equally important, it shows that gene therapy remains one of the most promising approaches to treating the world's most common and deadliest diseases."

According to the U.S. Centers for Disease Control and Prevention, heart failure is a condition where the heart cannot pump enough blood and oxygen to meet the needs of other body organs. Approximately 5 million people in the United States have heart failure, about 550,000 new cases are diagnosed each year, and more than 287,000 people in the United States die each year of heart failure. The most common causes of heart failure are coronary artery disease, hypertension or high blood pressure, and diabetes. Current treatments usually involve three to four medicines: ACE inhibitors, diuretics, digoxin, and beta blockers.

Current clinical agents and treatments focus on the amount of calcium available for contraction; these can provide short-term cardiac benefits but are associated with increased mortality in the long term. Because fast molecular motor gene transfer improved the contractions of human heart muscle cells, the results suggest that calcium-independent treatments could have implications for heart diseases associated with depressed heart function.

'Junk' DNA Cut-and-paste Protein: Discovery May Prove Invaluable In Quest For Gene Therapies

ScienceDaily (Sep. 23, 2009) — Scientists have identified how a protein enables sections of so-called junk DNA to be cut and pasted within genetic code – a finding which could speed development of gene therapies.

The study by researchers at the University of Edinburgh sheds light on the process, known as DNA transposition, in which shifted genes have a significant effect on the behaviour of neighbouring genes. In the human genome, rearrangement of antibody genes can enable the immune system to target infection more effectively.

The research identifies how the enzyme is able to cut out a section of DNA and reinsert it elsewhere in the genome. The study, published in the journal Cell, was funded by the Wellcome Trust and the Medical Research Council.

The cut-and-paste property of shifted DNA is now being used to develop tools for scientific research and medical applications. Learning more about transposition could help scientists understand how to control the process and speed the development of gene therapies – which introduce into cells genes with beneficial properties that, for example, can fight hereditary diseases or cancer.

Junk DNA, which accounts for almost half of the human genome, was originally believed to have no purpose. However, it is now emerging that movement of junk DNA, in a cut-and-paste mechanism, can lead to beneficial changes in cells.

Dr Julia Richardson of the University's School of Biological Sciences, who led the study, said: "By forming a picture of the enzyme that causes DNA to shift, and discovering how this works, we understand more about how these proteins could be adapted and controlled. This may one day enable genes to be pasted into cells exactly where they are needed – which could be of enormous benefit in developing gene therapies."


Experimental Approach May Reverse Rheumatoid Arthritis And Osteoporosis

ScienceDaily (Sep. 22, 2009) — Researchers have identified a mechanism that may keep a well known signaling molecule from eroding bone and inflaming joints, according to an early study published online today in the Journal of Clinical Investigation.

Bone is continually recycled to maintain its strength through the competing action of osteoclasts, cells that break down aging bone, and osteoblasts, which build new bone. Osteoclasts also play a central role in common diseases that erode bone, where two signaling molecules, TNFα and RANKL, cause too much bone breakdown. Both are known to turn on the nuclear factor kappa B complex (NF-κB), which turns on genes that cause the stem cell precursors of osteoclasts to mature and start eating bone. While both TNFα and RANKL encourage bone loss, the current study argues that TNFα and RANKL have different effects on levels of a key inhibitory protein within the NF-κB pathway called NF-κB p100, with important consequences for drug design.

The NF-κB pathway as a whole signals for more active osteoclasts, but NF-κB p100 (p100) interferes with the ability of that same pathway to pass on the bone loss signal. While both TNFα and RANKL activate NF-κB signaling, RANKL efficiently converts p100 into a form that no longer blocks NF-κB pathway signaling and that leads to bone loss. In contrast, the current study is the first to show that TNFα lets p100 build up. Thus, TNFα both causes bone loss through NF-κB signaling and limits it via p100 accumulation.

Experiments further found that mice genetically engineered to lack NF-κB p100 suffered more severe joint erosion and inflammation than their normal littermates when exposed to TNFα. TNFα, but not RANKL, also increased levels of a protein in osteoclast precursors called TNF receptor-associated factor 3 (TRAF3), which may help NF-κB p100 block osteoclast formation and inflammation.

"While further studies will be required to confirm and detail this mechanism, our results argue strongly that increasing levels of either TRAF3 or NF-κB p100 could represent a powerful new way to limit bone destruction and inflammation-induced bone loss seen in osteoporosis and rheumatoid arthritis," said Brendan Boyce, M.D., professor within the Department of Pathology and Laboratory Medicine at the University of Rochester Medical Center, and the study's corresponding author. "NF-κB p100 levels may vary with each person's genes, making some more susceptible to TNFα-driven disease. Future solutions may be local delivery of p100 into diseased joints via gene therapy, or to target with a drug the enzyme, NIK, which otherwise limits the p100 supply."

At the Center of Bone Loss and Inflammation

Drugs that block the function of TNFα are blockbusters (e.g. Enbrel, Humira and Remicade) because they effectively prevent bone loss and inflammation in most patients with rheumatoid arthritis. They have also been shown to reduce bone loss in women early after menopause.

Other studies, however, have suggested that TNFα cannot cause precursor cells to become osteoclasts unless RANKL first "primes" them. The debate has been spirited because it bears on which molecule should be targeted in near-future attempts to design more precise drugs.

The current results show that TNFα can signal for bone loss without RANKL, provided NF-κB p100 is also absent. By engineering mice with neither RANKL nor NF-κB p100, Boyce and colleagues found that TNFα had a greatly increased ability to signal for osteoclast maturation and bone loss in this scenario.

Another unexpected result was measured in changes in gene expression, the process by which information encoded in DNA chains is used to build the proteins that make up the body's structures and carry its messages. The team found that mice engineered to over-express TNFα, but also to lack NF-κB p100, had significantly more inflammation in their joints than mice with high TNFα levels but with p100 present to counter it.

Along with Boyce, the study was led by Zhenqiang Yao and Lianping Xing in the Department of Pathology and Laboratory Medicine at the University of Rochester Medical Center. The study was funded in part by the National Institutes of Health.

"We believe NF-κB p100 limits not only osteoclast maturation, but also the number of inflammatory cells attracted to the joints in response to TNFα," Boyce said. "If confirmed, it would mean that p100 has more than one role in more than one major bone disease, and thus would create new opportunities to reverse disease by manipulating p100 levels."

You Can't Trust A Tortured Brain: Neuroscience Discredits Coercive Interrogation

ScienceDaily (Sep. 22, 2009) — According to a new review of neuroscientific research, coercive interrogation techniques used during the Bush administration to extract information from terrorist suspects are likely to have been unsuccessful and may have had many unintended negative effects on the suspect's memory and brain functions.


A new article, published in the journal, Trends in Cognitive Science, reviews scientific evidence demonstrating that repeated and extreme stress and anxiety have a detrimental influence on brain functions related to memory.

Memos released by the US Department of Justice in April of 2009 detailing coercive interrogation techniques suggest that prolonged periods of shock, stress, anxiety, disorientation and lack of control are more effective than standard interrogatory techniques in making subjects reveal truthful information from memory. "This is based on the assumption that subjects will be motivated to reveal veridical information to end interrogation, and that extreme stress, shock and anxiety do not impact memory," says review author, Professor Shane O'Mara from the Institute of Neuroscience at Trinity College in Dublin, Ireland. "However, this model of the impact of extreme stress on memory and the brain is utterly unsupported by scientific evidence."

Psychological studies suggest that under extreme stress and anxiety, the captive becomes conditioned to associate speaking with periods of safety; for the captor, the captive's speech appears to achieve the objective of gaining information and brings relief from the unsavory task of administering these conditions of stress. It is therefore difficult or impossible to determine during the interrogation whether the captive is revealing truthful information or simply talking to escape the torture. Research has also shown that extreme stress has a deleterious effect on the frontal lobe and is associated with the production of false memories.

Neurochemical studies have revealed that the hippocampus and prefrontal cortex, brain regions integral to the process of memory, are rich in receptors for hormones that are activated by stress and sleep deprivation and which have been shown to have deleterious effects on memory. "To briefly summarize a vast, complex literature, prolonged and extreme stress inhibits the biological processes believed to support memory in the brain," says O'Mara. "For example, studies of extreme stress with Special Forces Soldiers have found that recall of previously-learned information was impaired after stress occurred." Waterboarding in particular is an extreme stressor and has the potential to elicit widespread stress-induced changes in the brain.

"Given our current cognitive neurobiological knowledge, it is unlikely that coercive interrogations involving extreme stress will facilitate release of truthful information from long term memory," concludes Professor O'Mara. "On the contrary, these techniques cause severe, repeated and prolonged stress, which compromises brain tissue supporting both memory and decision making."


New Species Discovered On Whale Skeletons

ScienceDaily (Sep. 21, 2009) — When a whale dies, it sinks to the seafloor and becomes food for an entire ecosystem. Researchers at the University of Gothenburg, Sweden, have discovered previously unknown species that feed only on dead whales -- and have used DNA technology to show that the species diversity in our oceans may be higher than previously thought.

Dead whales constitute an unpredictable food source: it is impossible to know when and where a whale will die, and when one does, the food source does not last forever. Nevertheless, some marine species have specialised in feeding on whale cadavers.

Big source of nutrients

This is shown by researchers at the University of Gothenburg who have studied the ecosystem around dead whales using underwater cameras. A dead whale is an enormous source of nutrients. In fact, one cadaver offers as much nutrition as normally sinks from the surface to the seafloor in 2,000 years, and innumerable species benefit: first the meat is eaten by scavengers such as sharks and hagfish, then tremendous numbers of organisms come to feast on the skeleton.

Specialised worms

One group of animals commonly found on whale skeletons is bristleworms, which are related to the earthworm. Some bristleworm species are so specialised in eating dead whales they would have problems surviving elsewhere. One example is Osedax, which uses its root system to penetrate the whale bones when searching for food. Other species specialise in eating the thick layers of bacteria that quickly form around the bones.

Nine new species

A dissertation from the Department of Zoology at the University of Gothenburg describes no fewer than nine previously unknown species of these bacteria-grazing bristleworms.

Cryptic species

Four of the new species were found on whale cadavers placed at a depth of 125 metres in the new national park Kosterhavet off the coast of Strömstad, Sweden. The other five species feed on whale bones in the deep waters off the coast of California, USA. The family tree of bristleworms was explored using molecular data. The DNA analyses show that there are several so-called cryptic bristleworm species, meaning species that, despite looking identical, differ greatly genetically.

The analyses show that the adaptation to a life on whale cadavers has occurred in species from different evolutionary paths and at several points in time. The study also shows that some species that are assumed to inhabit many different areas globally, so-called cosmopolitan species, may in fact be cryptic species. This finding may be very significant for our understanding of how animals spread around the world and of how many different species dwell on our planet.

Simpler Colon Cancer Screening

A blood test designed to enable simple screening for colon cancer has been hailed by experts as a major advance. The test detects cancer by picking up a chemical change called methylation that occurs disproportionately in two key genes in colorectal tumor cells.

Since more people should be willing to have a simple blood test, the screening method could help identify those patients who need a more invasive, more diagnostically rigorous colonoscopy.

The U.S. death toll from colon cancer is around 50,000 a year. The American Cancer Society recommends that men over the age of 50 have a colonoscopy every 10 years, and that those at higher risk be screened earlier and more often. Yet until now, only invasive colonoscopies and stool tests have been available, and compliance among those deemed in need of screening is disappointingly low, at less than 50 percent. Screening programs have been shown to cut deaths by allowing more patients to receive earlier, curative treatment, so a simpler test could save lives by encouraging more people to get screened.

Tissue test: Healthy colon tissue (shown top), with surface cells dyed green and internal stromal cells dyed red. In cancerous colon tissue (bottom), the tissue structure is broken down, and surface and internal cells are mixed together.



The developers of the new test, OncoMethylome Sciences, based in Liège, Belgium, say their method, which relies on one three-milliliter sample of blood, has the potential to boost compliance rates and conserve precious health service resources.

The test identifies the presence of methylated SYNE1 and FOXE1 genes, which mark out colorectal cancer cells. The researchers compared test results from 686 healthy control patients with 193 patients already diagnosed with the disease. The test was able to detect colorectal cancer in 77 percent of those subjects with the disease, according to data presented at the Congress of the European Cancer Organization in Berlin, Germany, on Monday. It correctly identified healthy, noncancerous patients in 91 percent of cases.

"This test has potential to provide a better balance of performance, cost-effectiveness, and patient compliance than other options currently available for colorectal cancer screening," says Joost Louwagie, vice president of product development at OncoMethylome.

Louwagie hopes that with further testing and refinements the test will become more sensitive and provide fewer false-positive results. But he says that even a 77 percent sensitivity would be "very useful" if it were applied to the large numbers of people who decide not to have screening using more-intrusive methods. He stresses, however, that colonoscopies remain the gold standard for diagnosing the disease.

Ernst Kuipers, head of the colorectal screening program and a professor of medicine at Erasmus University Medical Center in Rotterdam, praises the results. "This is an excellent new method, technically very well done," he says. "It represents a major advance on what we have now."

Kuipers says the 77 percent accuracy in detecting cancer-containing samples is "a good result." In comparison, the fecal-immunological screening method that he has been researching is around 60 percent accurate. He notes, however, that the specificity of the blood test--its ability to correctly identify healthy patients--will need to improve. "In practice, everyone over 55 would be screened, perhaps every two years," he says. "That's millions of people. So, if you had more than 5 percent false-positive rates, the number of follow-up colonoscopies you'd need to do would become too great."

Kuipers says that the specificity of the test needs to be at least 95 percent for it to be used in colorectal screening and that a large-scale evaluation will be vital.
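Kuipers' concern about specificity follows directly from the arithmetic of population screening: at low disease prevalence, almost everyone screened is healthy, so false positives scale with (1 − specificity) times nearly the whole population. A minimal sketch using the study's reported figures (the prevalence and population size below are illustrative assumptions, not from the study):

```python
# Why specificity dominates in population screening.
sensitivity = 0.77    # reported: 77% of cancers detected
specificity = 0.91    # reported: 91% of healthy subjects correctly cleared
prevalence = 0.005    # assumed: ~0.5% of the screened population has cancer
screened = 1_000_000  # assumed screening population

sick = screened * prevalence
healthy = screened - sick
true_pos = sick * sensitivity             # cancers the test catches
false_pos = healthy * (1 - specificity)   # healthy people flagged anyway

print(f"cancers detected: {true_pos:,.0f}")    # 3,850
print(f"false positives:  {false_pos:,.0f}")   # 89,550
print(f"follow-up colonoscopies per true case: {false_pos / true_pos:.0f}")
```

Under these assumptions, 91 percent specificity sends roughly 90,000 healthy people per million to follow-up colonoscopy, which is exactly the capacity problem Kuipers describes; raising specificity to 95 percent would nearly halve that number.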

With this in mind, Louwagie and colleagues are enrolling people in a prospective colorectal screening study at several German colonoscopy centers. "We plan to complete enrollment of 7,000 people by the end of 2009," he says.

The trials should also shed more light on how effective the test is at detecting the very earliest stages of colorectal cancer. Such a gene test will not be able to spot precancerous polyps. But it could be particularly effective if it can detect stage-one and stage-two colorectal cancers, which are almost always curable with surgery.

A new paper by Kuipers, due to appear in the Journal of the National Cancer Institute, will provide new evidence that colorectal screening can ultimately save health services money, he says. But he believes that the most important measure will be a reduction in the number of colon cancer deaths. Kuipers notes that even older repeat stool testing, though not very sensitive, has been shown to cut colorectal cancer deaths by 15 percent. He says that a reasonably sensitive and simple test with higher compliance levels could prevent many more colon cancer deaths.



By Michael Day



Direct Evidence Of Role Of Sleep In Memory Formation Is Uncovered

ScienceDaily (Sep. 16, 2009) — A research team from Rutgers University, Newark, and the Collège de France, Paris, has pinpointed for the first time the mechanism that operates during sleep to produce learning and memory formation.

It has been known for more than a century that sleep is somehow important for learning and memory. Sigmund Freud went further, suspecting that what we learned during the day was “rehearsed” by the brain during dreaming, allowing memories to form. And while much recent research has focused on the correlative links between the hippocampus and memory consolidation, what had not been identified were the specific processes that cause long-term memories to form.




In a study published online September 11, 2009, in Nature Neuroscience, György Buzsaki, professor at the Center for Molecular and Behavioral Neuroscience at Rutgers University, Newark, and his co-researchers, Gabrielle Girardeau, Karim Benchenane, Sidney I. Wiener and Michaël B. Zugaro of the Collège de France, determined that short transient brain events, called “sharp wave ripples,” are responsible for consolidating memory and transferring learned information from the hippocampus to the neocortex, where long-term memories are stored.

Sharp wave ripples are intense, compressed oscillations that occur in the hippocampus when the hippocampus is working “off-line,” most often during stage four sleep, which, along with stage three, is the deepest level of sleep.

During stage four sleep, Buzsaki explains, “it’s as if many instruments and members of the orchestra come together to generate a loud sound, a sound so loud that it is heard by wide areas of the neocortex. These sharp, ‘loud’ transient events occur hundreds to thousands of times during sleep and ‘teach’ the neocortex to form a long-term form of the memory, a process referred to as memory consolidation.” The intensity and multiple occurrence of those ripples also explain why certain events may only take place once in the waking state and yet can be remembered for a lifetime, adds Buzsaki.

The researchers were able to pinpoint sharp wave ripples as the cause of memory formation by eliminating those ripple events in rats during sleep. The rats were trained in a spatial navigation task and then allowed to sleep after each session. Rats in which all ripple events were selectively eliminated by electrical stimulation were impaired in their ability to learn from the training, because the compressed information was unable to leave the hippocampus and transfer to the neocortex.

Identification of a specific brain pattern responsible for strengthening learned information could facilitate applied research for more effective treatment of memory disorders.

“This is the first example that if a well-defined pattern of activity in the brain is reliably and selectively eliminated, it results in a memory deficit: a demonstration that this specific brain pattern is the cause behind long-term memory formation,” says Buzsaki.

The research also represents a move toward a new direction in neuroscience. While previous research has largely focused on correlating behavior with specific brain events through electroencephalography, neuronal spiking and functional magnetic resonance imaging studies, researchers are increasingly challenging those correlations as they seek to identify the specific processes that cause particular events and behaviors to take place.

The research was performed at the Collège de France in Paris, where Buzsaki worked as a distinguished visiting professor in 2008.

Neurons Found To Be Similar To U.S. Electoral College

ScienceDaily (Sep. 15, 2009) — A tiny neuron is a very complicated structure. Its complex network of dendrites, axons and synapses is constantly processing information and deciding whether or not to fire a nerve impulse that drives a particular action.

It turns out that neurons, at one level, operate like another complicated structure -- the United States, particularly its system of electing a president, through the Electoral College.

A new Northwestern University study provides evidence that supports the "two-layer integration model," one of several competing models attempting to explain how neurons integrate synaptic inputs. The findings are published in the journal Neuron.

Artist's rendering of neurons.



In this model, each dendritic branch of a neuron receives and integrates thousands of electrical inputs, deciding on just one signal to send to the axon. The axon then receives signals from all the dendrites, much like electoral votes coming in from state elections, and a final decision is made. The result could be an output in the form of an impulse, or action potential, or no action at all.
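The two-layer scheme described above can be sketched in a few lines of Python. Everything here is illustrative: the sigmoid nonlinearity, the thresholds and the synaptic input values are assumptions chosen to make the idea concrete, not parameters from the Northwestern study. Each dendritic branch first integrates its own synapses into a single local output (a state casting its electoral votes); the axon then tallies the branch outputs and makes the final spike-or-no-spike decision:

```python
import math

def sigmoid(x):
    """Smooth saturating nonlinearity used for each branch's local decision."""
    return 1.0 / (1.0 + math.exp(-x))

def two_layer_neuron(branch_inputs, branch_threshold=1.0, axon_threshold=2.0):
    """Layer 1: each dendritic branch sums its synaptic inputs and produces
    one local output. Layer 2: the axon sums the branch outputs and decides
    whether to fire an action potential (True) or stay silent (False)."""
    branch_outputs = [sigmoid(sum(synapses) - branch_threshold)
                      for synapses in branch_inputs]
    return sum(branch_outputs) >= axon_threshold

# Three branches, each with its own (arbitrary) synaptic input strengths.
branches = [[0.5, 0.9, 0.4], [1.2, 0.3], [0.2, 0.1]]
print(two_layer_neuron(branches))
```

The key design point the model captures is that a few very strong synapses on one branch cannot dominate the cell: each branch's contribution saturates before reaching the axon, so the final decision reflects agreement across branches rather than raw synaptic totals — much as a landslide in one state yields no more electoral votes than a narrow win.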

"There are more than 100 billion neurons in the human brain, so detailed knowledge of individual neurons will lead to a better understanding of how the brain works, including the processes of learning and memory," said Nelson Spruston, who led the research team. He is professor of neurobiology and physiology in the Weinberg College of Arts and Sciences at Northwestern.

Using electron microscopy, the researchers made a three-dimensional reconstruction of individual dendritic branches of mammalian hippocampal neurons with all their synapses. They found that the synapses get progressively smaller, or weaker, between the origin of the dendrite's branch and its end. This distribution supports the two-layer integration model.

Output from each branch, rather than each synapse, is sent to the axon. This design of the neuron implies that local integration is very important to the cell. After information is integrated locally within a branch, there is a global integration within the axon.

"Each of these neurons is a complicated network in and of itself," said William Kath, an author of the study. He is professor of engineering and applied science in the McCormick School of Engineering and Applied Science and is co-director of the Northwestern Institute on Complex Systems.

In addition to Spruston and Kath, other authors of the paper are Yael Katz, Vilas Menon, Daniel A. Nicholson and Yuri Geinisman, all from Northwestern.