
The first and most distinct consequence of daily mild stress is an increase in rapid-eye-movement (REM) sleep, a new study in the journal PNAS reports. The research also demonstrated that this increase is associated with genes involved in cell death and survival.


Monitoring the animals’ sleeping patterns, researchers identified an increase in the duration and continuity of REM sleep, along with specific brain oscillations characteristic of REM sleep, whereas ‘deep’ sleep, or non-REM sleep, did not change. The changes in REM sleep were tightly linked to deficient regulation of the stress hormone corticosterone. Mild stress also caused changes in gene expression in the brain.

REM sleep, also known as paradoxical sleep, is the sleep state during which we have most of our dreams and is involved in the regulation of emotions and memory consolidation. REM sleep disturbances are common in mood disorders, such as depression. However, little was known about how sleep changes are linked to molecular changes in the brain.

During this 9-week study, conducted by researchers from the Surrey Sleep Research Centre at the University of Surrey in collaboration with Eli Lilly, mice were intermittently exposed to a variety of mild stressors, such as the odour of a predator. Mice exposed to mild stressors developed signs of depression: they engaged less in self-care activities, were less likely to pursue pleasurable activities such as eating appetising food, and became less social, showing less interest in mice they hadn’t encountered before.

Read more

Decoder, developed in collaboration with a games developer, gets users to assume the role of an intelligence officer tasked with breaking up global criminal gangs (users are able to select a character and their backstory).

To meet the objective, users have to identify different combinations of number strings in missions littered with distraction.

Winning each mission means users unlock letters of the next criminal location (the higher the score, the more letters revealed).
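The scoring rule described above (a higher score reveals more letters of the next criminal location) can be sketched in a few lines. This is a hypothetical illustration, not Decoder's actual implementation; the function name, the proportional-reveal rule, and the minimum of one letter per win are all assumptions.

```python
def letters_revealed(score, max_score, location):
    """Reveal letters of the next location in proportion to the score.

    Hypothetical rule: a fraction of the location name equal to
    score / max_score is unlocked, with at least one letter for any win.
    Remaining letters are shown as underscores.
    """
    fraction = max(0.0, min(1.0, score / max_score))
    n = max(1, round(fraction * len(location)))
    return location[:n] + "_" * (len(location) - n)
```

For example, a half-maximum score on a six-letter location would reveal its first three letters, while a perfect score reveals the whole name.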

Read more

The adult brain has learned to calculate an image of its environment from sensory information. If the input signals change, however, even the adult brain is able to adapt and, ideally, to return to its original activity patterns once the perturbation has ceased. Scientists at the Max Planck Institute of Neurobiology in Martinsried have now shown in mice that this ability is due to the properties of individual neurons. Their findings demonstrate that individual cells adjust strongly to changes in the environment, but after the environment returns to its original state it is again the individual neurons that return to their initial response properties. This could explain why, despite substantial plasticity, perception in the adult brain is rather stable, and why the brain does not have to continuously relearn everything.

Everything we know about our environment is based on calculations in our brain. Whereas a child’s brain first has to learn the rules that govern the environment, the adult brain knows what to expect and, for the most part, processes environmental stimuli in a stable manner. Yet even the adult brain is able to respond to changes, to form new memories and to learn. Research in recent years has shown that changes to the connections between neurons form the basis of this plasticity. But, how can the brain continually change its connections and learn new things without jeopardizing its stable representation of the environment? Neurobiologists in the Department of Tobias Bonhoeffer in Martinsried have now addressed this fundamental question and looked at the interplay between plasticity and stability.

The scientists studied the stability of the processing of sensations in the visual cortex of the mouse. It has been known for about 50 years that when one eye is temporarily closed, the region of the brain responsive to that eye increasingly becomes responsive to signals from the other eye that is still open. This insight has been important to optimize the use of eye patches in children with a squint. “Thanks to new genetically encoded indicators, it has recently become possible to observe reliably the activity of individual neurons over long periods of time,” says Tobias Rose, the lead author of the study. “With a few additional improvements, we were able to show for the first time what happens in the brain on the single-cell level when such environmental changes occur.”

Read more

From the article:


In recent years, a growing number of scientific studies have backed an alarming hypothesis: Alzheimer’s disease isn’t just a disease; it’s an infection.

While the exact mechanisms of this infection are something researchers are still trying to isolate, a litany of papers argue the deadly spread of Alzheimer’s goes way beyond what we used to think.

Now, scientists are saying they’ve got one of the most definitive leads yet for a bacterial culprit behind Alzheimer’s, and it comes from a somewhat unexpected quarter: gum disease.

Read more

In May 2016 I stumbled upon a highly controversial Aeon article titled “The Empty Brain: Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer” by psychologist Rob Epstein. This article demonstrated to me once again just how wide the range of professional opinions can be when it comes to brain and mind in general. Unsurprisingly, the article drew outrage from the reading audience. I myself disagree with the author on most fronts, but one thing I actually agree with him on is that, yes, our brains are not “digital computers.” They are, rather, neural networks where each neuron might function somewhat like a quantum computer. The author never offered his own account of what human brains are like; he only criticized IT metaphors. It’s my impression that at the time of writing the psychologist hadn’t even come across such terms as neuromorphic computing, quantum computing, cognitive computing, deep learning, evolutionary computing, computational neuroscience, and deep neural networks. All these IT concepts clearly indicate that today’s AI research and computer science derive their inspiration from human brain information processing — notably neuromorphic neural networks aspiring to incorporate quantum computing into AI cognitive architecture. Deep neural networks learn by doing, just like children.
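The "learning by doing" the article alludes to can be made concrete with a toy example: a single artificial neuron that adjusts its weights from repeated examples rather than being programmed with the rule. This is a minimal sketch in plain Python; the target function, learning rate, and epoch count are arbitrary choices for illustration, not anything from the article.

```python
def train_neuron(examples, epochs=200, lr=0.05):
    """Fit a single linear neuron (y = w*x + b) by gradient descent.

    Each pass over the examples nudges the weight and bias to reduce
    the prediction error -- learning by repeated trial and correction.
    """
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in examples:
            err = (w * x + b) - y
            w -= lr * err * x   # adjust weight in proportion to input
            b -= lr * err       # adjust bias by the raw error
    return w, b

# Examples drawn from the rule y = 2x + 1; the neuron is never told
# the rule, yet recovers it from the data alone.
examples = [(x, 2 * x + 1) for x in range(-3, 4)]
w, b = train_neuron(examples)
```

After training, w and b converge close to 2 and 1, the coefficients of the rule behind the data, without that rule ever being stated explicitly.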


By Alex Vikoulov.


“I have always been convinced that the only way to get artificial intelligence to work is to do the computation in a way similar to the human brain. That is the goal I have been pursuing. We are making progress, though we still have lots to learn about how the brain actually works.”

Read more

For the tenth consecutive year, Deloitte, a global leader in audit and consulting, lists the technological trends that will transform the processes, products, and services of the most innovative companies in the world this year.

These technologies include advanced network architectures, serverless computing, and intelligent interfaces, as well as increased development of digital, cognitive and cloud experiences.


Yes, uncertainty is disconcerting. But much of the tech-driven disruption today—and, likely, going forward—is both understandable and knowable.

Read more