Press the backs of your hands against the inside of a door frame for 30 seconds—as if you’re trying to widen the frame—and then let your arms down; you’ll feel something odd. Your arms will float up from your sides, as if lifted by an external force. Scientists call this the Kohnstamm phenomenon, but you may know it as the floating arm trick. Now, researchers have studied what happens in a person’s brain and nerve cells when they suppress this involuntary movement, holding their arms tightly by their sides instead of letting them float up. Two theories existed as to how this suppression worked: The brain could send a positive “push down” signal to the arm muscles to cancel out the involuntary “lift up” signal as it was being transmitted, or the brain could block the involuntary signal entirely at the root of the nerves. The new study, which analyzed brain scans and muscle activity recordings from 39 volunteers, found that the latter was true—when a person stifles the Kohnstamm phenomenon, the involuntary “lift” signal is blocked before it reaches the muscle. The difference between the two mechanisms may seem subtle, but understanding it could help people suppress other involuntary movements—including the tremors associated with Parkinson’s disease and the tics associated with Tourette syndrome, the team reports online today in the Proceedings of the Royal Society B.
Why wet feels wet: Understanding the illusion of wetness
Human sensitivity to wetness plays a role in many aspects of daily life. Whether feeling humidity, sweat or a damp towel, we often encounter stimuli that feel wet. Though it seems simple, feeling that something is wet is quite a feat because our skin does not have receptors that sense wetness. The concept of wetness, in fact, may be more of a “perceptual illusion” that our brain evokes based on our prior experiences with stimuli that we have learned are wet.
So how would a person know if he has sat on a wet seat or walked through a puddle? Researchers at Loughborough University and Oxylane Research proposed that wetness perception is intertwined with our ability to sense cold temperature and tactile sensations such as pressure and texture. They also examined the role of A-nerve fibers — sensory nerves that carry temperature and tactile information from the skin to the brain — and the effect of reduced nerve activity on wetness perception. Lastly, they hypothesized that because hairy skin is more sensitive to thermal stimuli, it would be more sensitive to wetness than glabrous skin (e.g., palms of the hands, soles of the feet), which is more sensitive to tactile stimuli.
Davide Filingeri et al. exposed 13 healthy male college students to warm, neutral and cold wet stimuli. They tested sites on the subjects’ forearms (hairy skin) and fingertips (glabrous skin). The researchers also performed the wet stimulus test with and without a nerve block. The nerve block was achieved by using an inflatable compression (blood pressure) cuff to attain enough pressure to dampen A-nerve sensitivity.
They found that wet perception increased as temperature decreased, meaning subjects were much more likely to sense cold wet stimuli than warm or neutral wet stimuli. The research team also found that the subjects were less sensitive to wetness when the A-nerve activity was blocked and that hairy skin is more sensitive to wetness than glabrous skin. These results contribute to the understanding of how humans interpret wetness and present a new model for how the brain processes this sensation.
“Based on a concept of perceptual learning and Bayesian perceptual inference, we developed the first neurophysiological model of cutaneous wetness sensitivity centered on the multisensory integration of cold-sensitive and mechanosensitive skin afferents,” the research team wrote. “Our results provide evidence for the existence of a specific information processing model that underpins the neural representation of a typical wet stimulus.”
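The “Bayesian perceptual inference” the team describes can be made concrete with a toy sketch. This is purely illustrative: the likelihood values and the assumption that the two cues are conditionally independent are mine, not the paper’s; only the general principle (combining a cold cue and a mechanical cue into a wetness judgment) comes from the study.

```python
def wetness_posterior(p_cold_given_wet, p_cold_given_dry,
                      p_mech_given_wet, p_mech_given_dry,
                      prior_wet=0.5):
    """Bayes' rule over two cues (assumed conditionally independent):
    returns the posterior probability that the skin is wet."""
    evidence_wet = p_cold_given_wet * p_mech_given_wet * prior_wet
    evidence_dry = p_cold_given_dry * p_mech_given_dry * (1.0 - prior_wet)
    return evidence_wet / (evidence_wet + evidence_dry)

# A cold stimulus with a wet-like tactile feel: both cues favour "wet".
cold_touch = wetness_posterior(0.8, 0.3, 0.7, 0.4)

# The same tactile cue at a neutral temperature: the cold cue is
# uninformative, so perceived wetness drops -- mirroring the finding that
# wet perception increased as temperature decreased.
neutral_touch = wetness_posterior(0.5, 0.5, 0.7, 0.4)
```

In this toy model, removing the cold cue pulls the posterior back toward the prior, which is one way to read the subjects’ weaker wetness perception for warm and neutral stimuli.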
The article “Why wet feels wet? A neurophysiological model of human cutaneous wetness sensitivity” is published in the Journal of Neurophysiology. It is highlighted as one of this month’s “best of the best” as part of the American Physiological Society’s APSselect program.
Curiosity improves memory by tapping into the brain’s reward system
Brain scans of college students have shed light on why people learn more effectively when their curiosity is piqued than when they are bored stiff.
Researchers in the US found evidence that curiosity ramped up the activity of a brain chemical called dopamine, which in turn seemed to strengthen people’s memories.
Students who took part in the study were better at remembering answers to trivia questions when they were curious, but their memories also improved for unrelated information they were shown at the same time.
The findings suggest that while grades may have their place in motivating students, stimulating their natural curiosity could help them even more.
Charan Ranganath, a neuroscientist at the University of California, Davis, said curiosity seemed to be piqued when people had some knowledge of a subject but were then faced with a gap in their understanding. “We think curiosity is the drive to fill that gap. It’s like an itch you just have to scratch,” he said.
Matthias Gruber, a colleague of Ranganath’s who led the study, asked students to work through a series of trivia questions and rate both how confident they were that they knew the correct answer and how curious they were to find out. He then created bespoke lists of questions for each student, leaving out those they already knew the answers to. The remaining questions ranged from ones the students were highly curious about to others they found totally boring.
Gruber then used an fMRI scanner to monitor each student’s brain while their list of questions appeared one after another on a screen. After each question they faced a 14-second wait during which a random face flashed up for two seconds. The answer to the trivia question then appeared on the screen before the next question flashed up.
The scans revealed that when people were more curious, brain activity rose in regions that transmit dopamine signals. The neurotransmitter is intimately linked to the brain’s reward circuitry, suggesting that curiosity taps into the same neural pathways that make people yearn for chocolate, nicotine and a win at the races.
“When we compare trials where people are highly curious to know an answer with trials where they are not, and look at the differences in brain activity, it beautifully follows the pathways in the brain that are involved in transmitting dopamine signals,” said Ranganath. “The activity ramps up and the amount it ramps up is highly correlated with how curious they are.”
In memory tests an hour later, the students were better at remembering the answers to questions they were curious about. On average, they remembered 35 of 50 answers when they were curious, compared with 27 out of 50 when they were not.
The students also did better at recognising the faces that had flashed up on the screen when they were waiting for the answer to a question that made them curious. The improvement was slight, at 42% versus 38% for faces that flashed up before questions the students found boring.
The study showed that – as expected – students had better memories when their curiosity was piqued. To find out whether the effect was brief or longer-lasting, the researchers ran another series of tests.
Gruber invited a different group of students into the lab and put them through the same regime of reading trivia questions, watching faces flash up, and seeing the answers. This time Gruber tested their memories a full day later. The students still fared better when they had been curious, suggesting that the improvement in memory was more than momentary.
“There are times when people feel they can take in a lot of new information, and other times when they feel their memories are terrible,” said Ranganath. “This work suggests that once you light that fire of curiosity, you put the brain in a state that’s more conducive to learning. Once you get this ramp-up of dopamine, the brain becomes more like a sponge that’s ready to soak up whatever is happening.”
Ranganath said the findings are in line with theories that give dopamine a key role in stabilising or consolidating memories. The research is published in the journal Neuron.
Guillén Fernández at the Donders Centre for Cognitive Neuroimaging in the Netherlands said: “Understanding the mechanistic underpinning of how we learn is of utmost importance if we want to optimise knowledge acquisition in education.
“The brain is the most individual organ we have. The authors of this report show nicely that individual differences in curiosity are associated with differential abilities to learn new information.”
How your brain actually makes decisions while you sleep
The idea that during sleep our minds shut out the outside world is ancient, and it remains deeply anchored in our view of sleep today, despite everyday experiences and recent scientific discoveries suggesting that our brains don’t completely switch off from our environment.
On the contrary, our brains can keep the gate slightly open. For example, we wake up more easily when we hear our own name or a particularly salient sound such as an alarm clock or a fire alarm compared to equally loud but less relevant sounds.
In research published in Current Biology, we went one step further to show that complex stimuli can not only be processed while we sleep but that this information can be used to make decisions, much as it is when we’re awake.
Our approach was simple: We built on knowledge about how the brain quickly automates complex chores. Driving a car, for example, requires integrating a lot of information at the same time, making rapid decisions and putting them into action through complex motor sequences. And you can drive all the way home without remembering anything, as we do when we say we’re on “automatic pilot.”
When we’re asleep, the brain regions critical for paying attention to or implementing instructions are deactivated, of course, which makes it impossible to start performing a task. But we wanted to see whether any processes continued in the brain after sleep onset if participants in an experiment were given an automatized task just before.
To do this, we carried out experiments in which we got participants to sort spoken words into two categories: words that referred to animals or objects (for example, “cat” or “hat”) in a first experiment; then real words like “hammer” versus pseudo-words (words that can be pronounced but are found nowhere in the dictionary) like “fabu” in a second one.
Participants were asked to indicate the category of the word that they heard by pressing a left or right button. Once the task became more automatic, we asked them to continue to respond to the words, but they were also allowed to fall asleep. Since they were lying down in a dark room, most of them fell asleep while words were being played.
At the same time, we monitored their state of vigilance with EEG electrodes placed on their heads. Once they were asleep, and without disturbing the flow of words they were hearing, we gave our participants new items from the same categories. The idea here was to force them to extract the meaning of the word (in the first experiment) or to check whether a word was part of the lexicon (in the second experiment) in order to respond.
Of course, when asleep, participants stopped pressing buttons. So in order to check whether their brains were still responding to the words, we looked at the activity in the motor areas of the brain. Planning to press a button on your left involves your right hemisphere and vice-versa. By looking at the lateralization of brain activity in motor areas, it is possible to see whether someone is preparing a response and toward which side. Applying this method to our sleepers allowed us to show that even during sleep, their brains continued to routinely prepare for right and left responses according to the meaning of the words they were hearing.
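The lateralization logic described above can be sketched in a few lines. This is an illustrative toy with made-up activity values and a deliberately simplified contrast index, not the study’s actual EEG analysis: the point is only that stronger right-hemisphere motor activity signals a left-hand response in preparation, and vice versa.

```python
def lateralization_index(left_hemisphere, right_hemisphere):
    """Difference of mean motor-area activity between hemispheres.
    Positive -> right hemisphere more active -> left-hand response
    being prepared; negative -> right-hand response."""
    return (sum(right_hemisphere) / len(right_hemisphere)
            - sum(left_hemisphere) / len(left_hemisphere))

def predicted_response(index, threshold=0.1):
    """Read a prepared response off the index, with a small dead zone."""
    if index > threshold:
        return "left"   # the right hemisphere plans the left hand
    if index < -threshold:
        return "right"
    return "undecided"

# Hypothetical sleeper hearing a word whose category maps to the left button:
idx = lateralization_index([1.0, 1.1, 0.9], [1.6, 1.5, 1.7])
side = predicted_response(idx)
```

Reading out response preparation from motor-cortex lateralization, rather than from button presses, is what let the researchers track decisions after the presses stopped.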
Even more interestingly, at the end of the experiment and after they woke up, participants had no memory of the words they heard during their sleep, though they recalled the words heard while they were awake very well. So not only did they process complex information while completely asleep, but they did it unconsciously. Our work sheds new light on the brain’s ability to process information not only while asleep but also while unconscious.
This study is just the beginning. Important questions have yet to be answered. If we are able to prepare for actions during sleep, why is it that we do not perform them? What kind of processing can or cannot be achieved by the sleeping brain? Can sentences or series of sentences be processed? What happens when we dream? Would these sounds be incorporated into the dream scenery?
But most importantly, our work revives that age-old fantasy of learning during our sleep. It is well known that sleep is important to consolidate previously learned information or that some basic form of learning like conditioning can take place while we are asleep. But can more complex forms of learning take place and what would be the cost in terms of what sacrifices the brain would make to do this?
Sleep is important for the brain, and total sleep deprivation leads to death after about two to four weeks. Indeed, it should be borne in mind that sleep is a crucial phenomenon, universal to all animals. What we showed here is that sleep is not an all-or-none state, not that forcing our brains to learn and do things during the night would ultimately be beneficial in the long run.
Constructing the self
How does our acting, sensing and feeling body shape our mind? Dr Katerina Fotopoulou’s ERC-funded project is an ambitious exploration of the relationship between the body and the mind which spans philosophy, psychology and clinical neuroscience. She will be presenting her work at the World Economic Forum Annual Meeting of the New Champions in Tianjin, China (10-12 September). In preparation for her presentation, Dr Fotopoulou is concentrating on one particular aspect of her research: the ramifications of body image.
What we see in the mirror
As part of the ERC’s IdeasLab session in Tianjin (China) on Wednesday 10 September, Dr Fotopoulou will be addressing the question of the embodied self. Her presentation will focus on the relationship between how we see our bodies and how we protect ourselves against an uncritical internalisation of these images. “By giving so much significance to outside images, we forget about what happens inside – how we process these images, how we filter these perceptions and what this does to our sense of self”, she says.
Dr Fotopoulou’s ERC project ranges beyond questions of body image into the role of primary body signals. Signals from the body are known to be processed in hierarchically organised re-mappings in the brain. However, it remains unknown how the brain integrates them to give rise to our awareness of ourselves as embodied beings. These signals can be roughly divided into three areas – signals from inside the body, from outside and those we receive from others. They are, perhaps inevitably, interrelated. How the inside of the body makes us feel, for example when our heart is racing, is inextricably linked to what we see in the mirror as well as to the perceptions we have absorbed from others.
Bodily signals continuously condition our sense of self, but we are only really aware of them when something goes wrong: “when you are walking somewhere, you are concentrating not on your sense of the bodily self moving through space but on reaching your destination. But if you trip, then you are suddenly jolted into a sense of your self failing to negotiate a pavement”, Dr Fotopoulou says.
Processing pain
One particularly interesting aspect of this research is the group’s investigation of the experience of pain. “The link between stimulus and damage is not fixed when we feel pain; the perception of pain is not a category in the brain. Instead, our response to pain is based on our previous experience of it,” Dr Fotopoulou explains. “When a child falls over, there is a delay. The child stops and watches its mother. If the mother reacts dramatically, the child will start to cry. If the mother’s response is more practical, the child is much more likely to pick themselves up and carry on. In other words, the child’s experience of pain is conditioned by their mother’s sense of how much danger they are in.”
How mind–body processes can affect healing
The awareness of the relationship between the body and the self is significant when studying the experiences of brain-damaged patients. Dr Fotopoulou and her team are particularly interested in patients who deny their conditions, or who are unaware of them (believing, for example, that they can still move a side paralysed by a stroke). This kind of self-deception can inhibit treatment: it is difficult to treat a patient who does not believe that there is anything wrong with them.
“Brain damaged patients are traditionally treated as broken-down machines in neurological terms,” Dr Fotopoulou explains. “But their problems are psychological as well as physical. By applying cognitive neuroscience methods when treating a small number of patients we have demonstrated that disorders that were previously thought to be intractable can be treated. Studies of this kind are vital: working with patients whose sense of self is fractured can teach us not only about their disorders but also tell us something about how these mechanisms function in healthy individuals.”
The hope is that these findings can be fed into future policy decisions about the treatment of brain damaged patients: particularly in terms of the importance of psychotherapy as part of the rehabilitation process.
ERC funding has enabled Dr Fotopoulou and her team to set up a truly interdisciplinary project. “We have been given the luxury of time to apply a wide range of methods and tools from disciplines as diverse as philosophy and psychology. We have the freedom to pursue the best science without any external pressures – to develop ideas and to publish only when the science is ready.” Dr Fotopoulou and her team are based at University College London (UCL), UK.
Study gauges humor by age
TV sitcoms in which characters make jokes at someone else’s expense are no laughing matter for older adults, according to a University of Akron researcher.
Jennifer Tehan Stanley, an assistant professor of psychology, studied how young, middle-aged and older adults reacted to so-called “aggressive humor” — the kind that is a staple on shows like The Office.
By showing clips from The Office and other sitcoms (Golden Girls, Mr. Bean, Curb Your Enthusiasm) to adults of varying ages, she and colleagues at two other universities found that young and middle-aged adults considered aggressive humor to be funny while older adults did not. The older adults preferred “affiliative humor,” in which a number of characters share and navigate an awkward situation.
Stanley and her co-authors, Monika Lohani of Brandeis University and Derek M. Isaacowitz of Northeastern University, published their findings in the journal Psychology and Aging.
The study raises some intriguing questions about our concept of what is funny. Is that concept based on factors peculiar to generations, or does it evolve over time as we age and, perhaps, mellow? Those possibilities will need to be explored in a future episode of humor research. Stay tuned.
Sounds you can’t hear can still hurt your ears
A wind turbine, a roaring crowd at a football game, a jet engine running full throttle: Each of these things produces sound waves with components well below the frequencies humans can hear. But just because you can’t hear the low-frequency components of these sounds doesn’t mean they have no effect on your ears. Listening to just 90 seconds of low-frequency sound can change the way your inner ear works for minutes after the noise ends, a new study shows.
“Low-frequency sound exposure has long been thought to be innocuous, and this study suggests that it’s not,” says audiology researcher Jeffery Lichtenhan of the Washington University School of Medicine in St. Louis, who was not involved in the new work.
Humans can generally sense sounds at frequencies between 20 and 20,000 cycles per second, or hertz (Hz)—although this range shrinks as a person ages. Prolonged exposure to loud noises within the audible range has long been known to cause hearing loss over time. But establishing the effect of sounds with frequencies under about 250 Hz has been harder. Even though they’re above the lower limit of 20 Hz, these low-frequency sounds tend to be either inaudible or barely audible, and people don’t always know when they’re exposed to them.
For the new study, neurobiologist Markus Drexl and colleagues at the Ludwig Maximilian University in Munich, Germany, asked 21 volunteers with normal hearing to sit inside soundproof booths and then played a 30-Hz sound for 90 seconds. The deep, vibrating noise, Drexl says, is about what you might hear “if you open your car windows while you’re driving fast down a highway.” Then, they used probes to record the natural activity of the ear after the noise ended, taking advantage of a phenomenon dubbed spontaneous otoacoustic emissions (SOAEs) in which the healthy human ear itself emits faint whistling sounds. “Usually they’re too faint to be heard, but with a microphone that’s more sensitive than the human ear, we can detect them,” Drexl says. Researchers know that SOAEs change when a person’s hearing changes and disappear in conjunction with hearing loss.
People’s SOAEs are normally stable over short time periods. But in the study, after 90 seconds of the low-frequency sound, participants’ SOAEs started oscillating, becoming alternately stronger and weaker. The fluctuations lasted about 3 minutes, the team reports today in Royal Society Open Science. The changes aren’t directly indicative of hearing loss, but they do mean that the ear may be temporarily more prone to damage after being exposed to low-frequency sounds, Drexl explains. “Even though we haven’t shown it yet, there’s a definite possibility that if you’re exposed to low-frequency sounds for a longer time, it might have a permanent effect,” Drexl adds.
“The unfortunate thing about our ears is that we can be doing terrible things to them with sounds that aren’t necessarily painful,” says hearing loss researcher M. Charles Liberman of Harvard Medical School in Boston. To explore the potential harm of specific sounds, such as the hotly debated question of the effect of wind turbines on hearing, Liberman says the same experiment could be repeated with conditions mimicking wind turbine noise. He’d also like to see the study expanded to look at how the ears react to noises—rather than silence—in the minutes after low-frequency sound exposure.
Decreased ability to identify odors can predict death: Olfactory dysfunction is a harbinger of mortality
For older adults, being unable to identify scents is a strong predictor of death within five years, according to a study published October 1, 2014, in the journal PLOS ONE. Thirty-nine percent of study subjects who failed a simple smelling test died during that period, compared to 19 percent of those with moderate smell loss and just 10 percent of those with a healthy sense of smell.
The hazards of smell loss were “strikingly robust,” the researchers note, above and beyond most chronic diseases. Olfactory dysfunction was better at predicting mortality than a diagnosis of heart failure, cancer or lung disease. Only severe liver damage was a more powerful predictor of death. For those already at high risk, lacking a sense of smell more than doubled the probability of death.
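The raw percentages quoted above translate directly into simple risk ratios. This sketch only reproduces the unadjusted contrast from the reported numbers; the study’s “strikingly robust” claim rests on covariate-adjusted models, which this does not attempt.

```python
# Five-year death rates by smell-test group, as reported in the text.
death_rate = {"anosmic": 0.39, "hyposmic": 0.19, "normosmic": 0.10}

def risk_ratio(group, reference="normosmic"):
    """Unadjusted relative risk of death versus the healthy-smell group."""
    return death_rate[group] / death_rate[reference]

anosmic_rr = risk_ratio("anosmic")    # failed the test outright
hyposmic_rr = risk_ratio("hyposmic")  # moderate smell loss
```

Even in this crude form, subjects who failed the smell test died at almost four times the rate of those with a healthy sense of smell.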
“We think loss of the sense of smell is like the canary in the coal mine,” said the study’s lead author Jayant M. Pinto, MD, an associate professor of surgery at the University of Chicago who specializes in the genetics and treatment of olfactory and sinus disease. “It doesn’t directly cause death, but it’s a harbinger, an early warning that something has gone badly wrong, that damage has been done. Our findings could provide a useful clinical test, a quick and inexpensive way to identify patients most at risk.”
The study was part of the National Social Life, Health and Aging Project (NSHAP), the first in-home study of social relationships and health in a large, nationally representative sample of men and women ages 57 to 85.
In the first wave of NSHAP, conducted in 2005-06, professional survey teams from the National Opinion Research Center at the University of Chicago used a well-validated test — adapted by Martha K. McClintock, PhD, the study’s senior author — for this field survey of 3,005 participants. It measured their ability to identify five distinct common odors.
The modified smell tests used “Sniffin’ Sticks,” odor-dispensing devices that resemble a felt-tip pen but are loaded with aromas rather than ink. Subjects were asked to identify each smell, one at a time, from a set of four choices. The five odors, in order of increasing difficulty, were peppermint, fish, orange, rose and leather.
Measuring smell with this test, they learned that:
• Almost 78 percent of those tested were classified as “normosmic,” having normal smelling; 45.5 percent correctly identified five out of five odors and 29 percent identified four out of five.
• Almost 20 percent were considered “hyposmic.” They got two or three out of five correct.
• The remaining 3.5 percent were labelled “anosmic.” They could identify just one of the five scents (2.4%), or none (1.1%).
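The three groups amount to a simple scoring rule. Assuming the cutoffs implied by the reported breakdown (4–5 correct = normosmic, 2–3 = hyposmic, 0–1 = anosmic; the labels are the study’s, the explicit thresholds are my reading of the percentages), it can be sketched as:

```python
def classify_smell_score(correct, total=5):
    """Map the number of correctly identified odors (out of 5)
    to the study's three olfactory categories."""
    if not 0 <= correct <= total:
        raise ValueError("score out of range")
    if correct >= 4:
        return "normosmic"   # normal sense of smell
    if correct >= 2:
        return "hyposmic"    # moderate smell loss
    return "anosmic"         # severe smell loss or no sense of smell

labels = [classify_smell_score(n) for n in range(6)]
```

A cheap, unambiguous rule like this is part of why the authors suggest the test could serve as a quick clinical screen.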
The interviewers also assessed participants’ age, physical and mental health, social and financial resources, education, and alcohol or substance abuse through structured interviews, testing and questionnaires. As expected, performance on the scent test declined steadily with age; 64 percent of 57-year-olds correctly identified all five smells. That fell to 25 percent of 85-year-olds.
In the second wave, during 2010-11, the survey team carefully confirmed which participants were still alive. During that five-year gap, 430 (12.5%) of the original 3,005 study subjects had died; 2,565 were still alive.
When the researchers adjusted for demographic variables such as age, gender, socioeconomic status (as measured by education or assets), overall health, and race, those with greater smell loss when first tested were substantially more likely to have died five years later. Even mild smell loss was associated with greater risk.
“This evolutionarily ancient special sense may signal a key mechanism that affects human longevity,” noted McClintock, the David Lee Shillinglaw Distinguished Service Professor of Psychology, who has studied olfactory and pheromonal communication throughout her career.
Age-related smell loss can have a substantial impact on lifestyle and wellbeing, according to Pinto, a member of the university’s otolaryngology-head and neck surgery team. “Smells impact how foods taste. Many people with smell deficits lose the joy of eating. They make poor food choices, get less nutrition. They can’t tell when foods have spoiled or detect odors that signal danger, like a gas leak or smoke. They may not notice lapses in personal hygiene.”
“Of all human senses,” Pinto said, “smell is the most undervalued and underappreciated — until it’s gone.”
Precisely how smell loss contributes to mortality is unclear. “Obviously, people don’t die just because their olfactory system is damaged,” McClintock said.
The research team, which includes biopsychologists, physicians, sociologists and statisticians, is considering several hypotheses. The olfactory nerve, the only cranial nerve directly exposed to the environment, may serve as a conduit, they suggest, exposing the central nervous system to pollution, airborne toxins, pathogens or particulate matter.
McClintock noted that the olfactory system also has stem cells which self-regenerate, so “a decrease in the ability to smell may signal a decrease in the body’s ability to rebuild key components that are declining with age and lead to all-cause mortality.”
2014 Nobel Prize in Physiology or Medicine
at Nobelprize.org
The Nobel Assembly at Karolinska Institutet has today decided to award
The 2014 Nobel Prize in Physiology or Medicine
with one half to
John O'Keefe
and the other half jointly to
May-Britt Moser and Edvard I. Moser
for their discoveries of cells that constitute a positioning
system in the brain
How do we know where we are? How can we find the way from one place to another? And how can we store this information in such a way that we can immediately find the way the next time we trace the same path? This year's Nobel Laureates have discovered a positioning system, an “inner GPS” in the brain that makes it possible to orient ourselves in space, demonstrating a cellular basis for higher cognitive function.
In 1971, John O'Keefe discovered the first component of this positioning system. He found that a type of nerve cell in an area of the brain called the hippocampus was always activated when a rat was at a certain place in a room. Other nerve cells were activated when the rat was at other places. O'Keefe concluded that these “place cells” formed a map of the room.
More than three decades later, in 2005, May-Britt and Edvard Moser discovered another key component of the brain’s positioning system. They identified another type of nerve cell, which they called “grid cells”, that generate a coordinate system and allow for precise positioning and pathfinding. Their subsequent research showed how place and grid cells make it possible to determine position and to navigate.
The discoveries of John O'Keefe, May-Britt Moser and Edvard Moser have solved a problem that has occupied philosophers and scientists for centuries – how does the brain create a map of the space surrounding us and how can we navigate our way through a complex environment?
How do we experience our environment?
The sense of place and the ability to navigate are fundamental to our existence. The sense of place gives a perception of position in the environment. During navigation, it is interlinked with a sense of distance that is based on motion and knowledge of previous positions.
Questions about place and navigation have engaged philosophers and scientists for a long time. More than 200 years ago, the German philosopher Immanuel Kant argued that some mental abilities exist as a priori knowledge, independent of experience. He considered the concept of space as an inbuilt principle of the mind, one through which the world is and must be perceived. With the advent of behavioural psychology in the mid-20th century, these questions could be addressed experimentally. When Edward Tolman examined rats moving through labyrinths, he found that they could learn how to navigate, and proposed that a “cognitive map” formed in the brain allowed them to find their way. But questions still lingered – how would such a map be represented in the brain?
John O'Keefe and the place in space
John O'Keefe was fascinated by the problem of how the brain controls behaviour and decided, in the late 1960s, to attack this question with neurophysiological methods. When recording signals from individual nerve cells in a part of the brain called the hippocampus, in rats moving freely in a room, O'Keefe discovered that certain nerve cells were activated when the animal assumed a particular place in the environment (Figure 1). He could demonstrate that these “place cells” were not merely registering visual input, but were building up an inner map of the environment. O'Keefe concluded that the hippocampus generates numerous maps, represented by the collective activity of place cells that are activated in different environments. Therefore, the memory of an environment can be stored as a specific combination of place cell activities in the hippocampus.
May-Britt and Edvard Moser find the coordinates
May-Britt and Edvard Moser were mapping the connections to the hippocampus in rats moving in a room when they discovered an astonishing pattern of activity in a nearby part of the brain called the entorhinal cortex. Here, certain cells were activated when the rat passed multiple locations arranged in a hexagonal grid (Figure 2). Each of these cells was activated in a unique spatial pattern and collectively these “grid cells” constitute a coordinate system that allows for spatial navigation. Together with other cells of the entorhinal cortex that recognize the direction of the head and the border of the room, they form circuits with the place cells in the hippocampus. This circuitry constitutes a comprehensive positioning system, an inner GPS, in the brain (Figure 3).
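A common textbook idealization of a grid cell (not part of the Nobel announcement itself) models its firing rate as the sum of three cosine gratings oriented 60 degrees apart, which peaks on exactly the kind of hexagonal lattice described above. The spacing and orientation values here are illustrative parameters, not measured ones.

```python
import math

def grid_rate(x, y, spacing=1.0, orientation=0.0):
    """Idealized grid-cell firing rate at location (x, y), scaled to [0, 1]."""
    k = 4 * math.pi / (math.sqrt(3) * spacing)  # wave number matching the grid spacing
    total = 0.0
    for i in range(3):
        theta = orientation + i * math.pi / 3   # three grating axes, 60 degrees apart
        total += math.cos(k * (x * math.cos(theta) + y * math.sin(theta)))
    return (total + 1.5) / 4.5                  # raw sum ranges from -1.5 to 3

# Firing peaks at a grid vertex (here, the origin) and again one grid
# spacing away along a lattice axis, tracing out the hexagonal pattern.
peak = grid_rate(0.0, 0.0)
neighbour = grid_rate(0.0, 1.0)
between = grid_rate(0.5, 0.5)
```

Sampling this function over a room reproduces the repeating hexagonal firing fields; place cells, by contrast, would fire at only one such location.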
Whole organ ‘grown’ in world first
A whole functional organ has been grown from scratch inside an animal for the first time, say researchers in Scotland.
A group of cells developed into a thymus – a critical part of the immune system – when transplanted into mice.
The findings, published in Nature Cell Biology, could pave the way to alternatives to organ transplantation.
Experts said the research was promising, but still years away from human therapies.
The thymus is found near the heart and produces a component of the immune system, called T-cells, which fight infection.
Grow your own
Scientists at the Medical Research Council centre for regenerative medicine at the University of Edinburgh started with cells from a mouse embryo.
These cells were genetically “reprogrammed” and started to transform into a type of cell found in the thymus.
These were mixed with other support-role cells and placed inside mice.
Once inside, the bunch of cells developed into a functional thymus.
It is similar to a feat last year, when lab-grown human brains reached the same level of development as a nine-week-old foetus.
The thymus is a much simpler organ and in these experiments became fully functional.
Structurally it contained the two main regions – the cortex and medulla – and it also produced T-cells.

Prof Clare Blackburn, part of the research team, said it was “tremendously exciting” when the team realised what they had achieved.