Thursday, November 12, 2015

Neurotechnology and Brain Health


The fusion of applied technology and neuroscience has created a new field in recent years called neurotechnology. The history of technology in neuroscience began with the inventions of the EEG (electroencephalogram), which detects electrical activity, and MRI/fMRI, which measures changes in blood flow. EEGs are particularly useful when researchers are trying to pinpoint the relationship between behavior and brain activity over short time scales, whereas fMRI is more useful for locating which brain areas are associated with a behavior.

Both EEG and fMRI have been very beneficial for the fields of neuroscience and brain health, but they often come at a cost. This is why technologists have continued to create new, affordable, and efficient ways to collect and use brain health data. An article written by SharpBrains' Alvaro Fernandez explores 10 technologies that innovators have proposed in recent years that contribute to the growing field of neurotechnology. The technologies from the list that I'm going to talk about are either already in use for brain health or soon will be.

The first neurotechnology on the list is Big Data-enhanced diagnostics and treatments. Because of the level of computing power now generally available, computers can integrate thousands of brain data points from clinical trials and research. This allows brain health researchers to compare an individual's data with population data for a specific age group, geographic location, or intelligence level to determine where on the distribution the individual falls. That information can be used to help predict future neurodegenerative disease or behavioral changes in ways that benefit the individual. With more information being processed by big-data institutions, healthcare information can be organized to help providers identify trends. Companies like CNS Response and Advanced Brain Monitoring are using these big data systems for health analytics.
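To make the idea of placing an individual on a population distribution concrete, here is a minimal sketch in Python. The function name and the peer scores are hypothetical, invented purely for illustration; real analytics platforms like those mentioned above work with far richer clinical datasets.

```python
import statistics

def percentile_in_population(individual_value, population_values):
    """Place one individual's measurement on a population distribution.

    Returns the individual's z-score and the fraction of the population
    falling below them (an empirical percentile).
    """
    mean = statistics.mean(population_values)
    stdev = statistics.stdev(population_values)
    z_score = (individual_value - mean) / stdev
    percentile = sum(v < individual_value for v in population_values) / len(population_values)
    return z_score, percentile

# Hypothetical example: a memory-test score compared against peers in the same age group
peer_scores = [52, 61, 58, 47, 66, 59, 55, 63, 50, 60]
z, pct = percentile_in_population(57, peer_scores)
print(f"z-score: {z:.2f}, percentile: {pct:.0%}")
```

The same placement logic scales up when the "population" is thousands of records pooled from trials and clinics rather than ten invented scores.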

Another neurotechnology on the list is real-time neuromonitoring. Real-time neuromonitoring is exactly what it sounds like: a system that monitors an individual's brain activity in real time while they perform cognitive tasks, sleep, or anything in between. As a brain health example, these systems can detect when brain activity is building toward a seizure and deliver treatment in real time that ultimately prevents the epileptic activity.
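As a rough illustration of what "real-time" monitoring means computationally, here is a minimal sketch that scans a stream of EEG-like samples and flags windows whose signal power jumps far above a running baseline. The window size, threshold, and baseline-update rule are all assumptions made up for this example; actual seizure-prediction systems use far more sophisticated signal processing.

```python
from collections import deque

def monitor_stream(samples, window_size=256, threshold=3.0):
    """Toy real-time monitor: yield the sample index whenever the power of the
    most recent window rises above `threshold` times a running baseline
    (a crude stand-in for detecting a seizure precursor)."""
    window = deque(maxlen=window_size)
    baseline_power = None
    for i, amplitude in enumerate(samples):
        window.append(amplitude)
        if len(window) < window_size:
            continue  # wait until the first full window is available
        power = sum(a * a for a in window) / window_size
        if baseline_power is None:
            baseline_power = power  # first full window defines the baseline
        elif power > threshold * baseline_power:
            yield i  # index where an alert (and, in principle, a treatment) would trigger
        else:
            # slowly fold the latest window into the baseline
            baseline_power = 0.95 * baseline_power + 0.05 * power
```

In use, a live amplitude stream would be fed in and each yielded index acted on immediately, which is the essential difference between real-time monitoring and after-the-fact analysis.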

Virtual reality combined with EEG and tDCS (transcranial direct-current stimulation) represents another neurotechnology being used for brain health. tDCS is a form of brain stimulation that uses a low electrical current to alter the electrical environment of a particular brain area. The capacity of virtual reality to depict real-world events, combined with the measuring capabilities of EEG and the enhancement capabilities of tDCS, gives brain health researchers the ability to create behavioral trainings that can combat PTSD and phobias. Companies like Medtronic, Brainlab, and Nielsen are filing patents in the virtual reality space to develop these technologies.

Electrical and magnetic brain stimulation today typically comes in the form of TMS (transcranial magnetic stimulation) and tDCS. TMS uses magnetic fields to stimulate neurons in a specific location in the brain. Research on tDCS and TMS has shown that they can be useful in combating symptoms of depression. Many hospitals are continuing to run clinical trials and research studies to explore the positive transfer effects of TMS and tDCS.

Neurotechnology in brain health is projected to grow considerably over the next few decades. With healthcare being such an important part of society and policy, there is a continuing demand for new technology that makes the system more efficient, and a huge part of that system is mental and brain health. But neurotechnologies don't stop there. Read about many more neurotechnologies relating to everyday work performance, attention, and mindfulness in the SharpBrains article here.


Eli






Tuesday, August 4, 2015

Canine Cognition and Face Processing


The study of face processing is huge in the worlds of neuroscience and cognitive science. There is now compelling evidence that there are specialized areas and mechanisms for face processing in both humans and other primates. This evidence comes from studies exploring the neural differences between looking at objects versus faces, looking at faces upside-down versus right-side up, and looking at different features of faces.

One study that explores whether face processing is innate or learned is Sugita's 2008 study. Monkeys were reared either in an environment in which faces were present or in one in which they were not. The monkeys were then tested to see whether they preferred faces to objects and monkey faces to human faces. Results showed that even without any prior exposure to faces, monkeys preferred faces over objects, but they showed no preference between monkey and human faces. These results suggest that monkeys do in fact have an innate mechanism for processing human and monkey faces.

A more recent study explores this same question of innate face processing in dogs. Gregory Berns, a neuroscientist at Emory University, is heading the Dog Project and researching evolutionary questions about man's best friend. Berns' project is the first to train dogs to sit still in an fMRI machine without restraint or sedation, a pretty impressive feat. Inside the fMRI machine the dogs looked at video images of faces (both human and dog) and objects on a screen.

Results showed increased activity in the canines' temporal lobe when they looked at videos of both human and dog faces. As Berns states, "If the dogs' responses to faces was learned — by associating a human face with food, for example — you would expect to see a response in the reward system of their brains, but that was not the case." In humans, the face processing area is known to neuroscientists as the FFA, or fusiform face area. Berns and colleagues at Emory have named the corresponding region in the dogs' temporal lobe the dog face area, or DFA.

Why dogs? Humans have interacted with dogs longer than with any other animal, dogs are incredibly social with us, and, hey, it's cool to know how our pets' brains work, right? This kind of study is also important for understanding social animals in general, including ourselves. Canine cognition labs at Duke University, Yale, the University of Florida, and many other institutions are exploring more aspects of dogs' cognitive behavior.


Eli

Thursday, July 16, 2015

Mind-Wandering and Its Cognitive Benefits


Everybody knows the kid in the back corner of the classroom daydreaming out at the playground or the clouds. The teacher knows this student is not paying attention: his mind is in a distant land, and he's not getting any information from the lesson. While the kid certainly is not absorbing anything coming from the teacher, perhaps this mind-wandering has other cognitive benefits that might help him in the future.

It has long been known that mind-wandering activates the dorsolateral prefrontal cortex (dlPFC), a recently evolved brain region found only in mammals. This finding surprises many because the dlPFC has also been associated with creative problem solving, working memory, and decision making: very cognitively demanding behaviors, unlike mind-wandering. Also well established in brain research is that the mind commonly wanders during tasks; in fact, we spend about half our time on tasks mind-wandering.

A recent study by Vadim Axelrod and colleagues found that when given a repetitive task, participants' minds wander a whole lot. They established this by asking the participants what they were thinking about throughout the task. But that was already known; what is important from this study is that the more the participants' minds wandered, the better they did at the task! In addition, the researchers used transcranial direct current stimulation (tDCS), in which electrodes attached to the scalp deliver a weak electrical current that modulates the activity of neurons in a particular brain region. Here, the researchers applied tDCS to the dorsolateral prefrontal cortex and found that it increased mind-wandering!
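The key claim is a positive relationship between mind-wandering reports and task performance. Below is a minimal sketch of how such a relationship could be quantified with a simple correlation; the participant numbers are invented for illustration and are not the study's data.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of numbers."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: per-participant count of "off-task" thought-probe responses
# and task accuracy (%). Values are made up purely for illustration.
wandering_reports = [2, 5, 3, 8, 6, 1, 7, 4]
task_accuracy     = [88, 92, 90, 97, 93, 85, 96, 91]
print(f"r = {pearson_r(wandering_reports, task_accuracy):.2f}")
```

A positive r here would mirror the study's surprising pattern: more reported wandering going hand in hand with better performance on the repetitive task.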

So how did this happen? How does mind-wandering actually improve our ability to do repetitive tasks? Some suggest that when doing a repetitive task you have a propensity to want to stop, called "temporal discounting", and this temporal discounting is inhibited by the mind's ability to daydream or wander. So mind-wandering suppresses the mechanism by which we want to stop doing a repetitive task, therefore making us do better at it… hmm.

In addition, mind-wandering benefits us in ways that aren't directly tied to task performance. Mind-wandering also increases creative problem solving and allows us to "run future-oriented simulations." Not only is mind-wandering useful for thinking about possible outcomes to situations, but it's also useful for imagining how we would feel given those future outcomes, improving our future planning abilities.

What do you think? Is the kid in the back-corner of the class increasing his abilities to plan for the future, solve creative problems and complete repetitive tasks?

Eli

Thursday, May 28, 2015

Perceived Loneliness and Social Stimuli

While research on general social processes like empathy and cooperation is growing steadily, little research has been conducted on how perceived loneliness affects our perception of social stimuli. Evolutionary psychology suggests that humans built their social structures for survival and protection, to sustain the next generation long enough to reproduce. Behavioral and neuroimaging studies have suggested that individuals who build social connections find doing so fundamentally rewarding. This may be intuitive, but what about those who choose not to be as proactive about building social relationships?

Recent research out of the University of Chicago (Cacioppo, Norris, Decety, Monteleone, and Nusbaum) in the Journal of Cognitive Neuroscience shows that perceived loneliness has an impact on how we process social stimuli. Their fMRI study of 23 female participants from the University of Chicago shows that the ventral striatum, a brain region critical to social reward processing and social learning, was less activated in individuals who perceived themselves as lonely than in individuals who perceived themselves as non-lonely.

Perceived loneliness was established with the UCLA Loneliness Scale, which consists of "20 items measuring general loneliness and degrees of satisfaction within one's social relationships." Activity in the ventral striatum and surrounding brain regions was measured while participants viewed various pictorial scenes ranging from pleasant pictures of people to pleasant pictures of objects. Individuals who did not perceive themselves as lonely found the pleasant pictures of people more rewarding overall, and more rewarding than the pictures of objects. These results suggest that individuals who perceive themselves as lonely do not find as much reward in social stimuli as those who perceive themselves as non-lonely; in fact, they appear to find social and non-social stimuli similarly rewarding.

Social interactions create opportunities for trust, support, and cooperation, but they can also bring opportunities for betrayal and conflict. This research gives insight into how lonely and non-lonely individuals interpret these social interactions, and what kinds of repercussions can follow from those interpretations. Moreover, it raises new questions about the roles of the ventral striatum and surrounding brain regions in social interaction for lonely and non-lonely individuals.


Cacioppo, John T., Catherine J. Norris, Jean Decety, George Monteleone, and Howard Nusbaum. "In the Eye of the Beholder: Individual Differences in Perceived Social Isolation Predict Regional Brain Activation to Social Stimuli." Journal of Cognitive Neuroscience. U.S. National Library of Medicine. Web. 28 May 2015.

Eli

Friday, January 30, 2015

Football and Long Term Brain Health

This topic in particular is one that interests me and also one that I'm conflicted about. Because I played football for 16 years and I still love watching the game, it's difficult for me to accept evidence that the players I'm watching on ESPN could have shorter life expectancies because of the game they're playing. As former players get older, cases of mental health repercussions are growing steadily, and much of the blame falls on the lack of player safety and proper equipment. There is overwhelming evidence that former NFL players who have reported concussions suffer damage to the long-term health of their brains. Recently, an article in the journal Neurology showed that players who started playing at a younger age in childhood experience greater declines in mental abilities later in life than those who started later. Additionally, another study published in Neurology in 2012 showed that rates of possible neurodegenerative disease were higher in former football players than in the general U.S. male population.

In the recent study, 42 former NFL players aged 40-69 were split into two groups: those who started playing after the age of 12 and those who started playing before the age of 12. The groups were comparable in their current ages and in the number of years they had played in the NFL. Interestingly, the group who started playing before the age of 12 showed worse performance in strategic planning, verbal memory, word pronunciation, flexibility in decision making, and problem solving than the group who started playing after age 12. With participation in youth football declining 29% from 2008 to 2013, it may be evidence like this that reminds parents why they are keeping their kids off the football field.
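For readers curious how a two-group comparison like this is typically tested statistically, here is a minimal sketch using Welch's t-test. The scores below are hypothetical stand-ins invented for illustration, not the study's actual data or its actual cognitive measures.

```python
from scipy import stats

# Hypothetical composite cognitive-test scores, one per former player,
# invented purely to illustrate the group comparison.
started_before_12 = [41, 38, 45, 36, 40, 39, 43, 37]
started_after_12  = [48, 52, 46, 50, 47, 51, 49, 45]

# Welch's t-test does not assume the two groups have equal variance.
t_stat, p_value = stats.ttest_ind(started_after_12, started_before_12, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
```

A large t and small p would indicate that the gap between the two starting-age groups is unlikely to be due to chance alone, which is the kind of evidence the study reports.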

While this study suggests that it is the age at which players started that makes the difference in their mental abilities later in life, many have argued that it is the prolonged exposure to head trauma that does the severe damage. Given the 2012 finding that NFL players, across a range of starting ages, show signs of neurodegenerative disease at younger ages than the general population, the two studies seem to complement each other in the overall picture of concussions and head trauma at any level of football.

It is difficult to form a take-home message after looking at these studies. Some medical professionals have urged parents to keep their kids out of tackle football until the age of 14, or out of it altogether. Others say football is ingrained in our society and provides kids with a means to learn important values like teamwork and discipline. Many see football as a tool of opportunity, a way to get high school students to the college of their choice or out of a rough neighborhood.

While the choice for parents is a difficult one, there is overwhelming evidence that playing tackle football at any age is a risk to your mental health. Even as the game becomes safer, with penalties being thrown for head-to-head contact and helmet technology improving, one cannot deny that in a game like football, where head trauma is so frequent, the sport is dangerous for long-term mental health.

http://www.neurology.org/content/early/2012/09/05/WNL.0b013e31826daf50.full.pdf+html
http://www.espn.go.com/pdf/2015/0128/otlBUfootballstudy.pdf
http://espn.go.com/espn/otl/story/_/id/12243012/ex-nfl-players-played-tackle-football-youth-more-likely-thinking-memory-problems
http://www.bloomberg.com/news/articles/2015-01-28/nfl-players-tackled-young-show-weaker-brain-function-later


Eli

Tuesday, January 20, 2015

Stress and Empathy

Would you ever think that being around strangers makes you generally less empathetic? What if just a little bit of positive interaction with a stranger made you generally more empathetic? 

President Obama has discussed the so-called 'empathy deficit' in the United States, citing the many times politicians have called for welfare cuts affecting the poor. Others say that empathy is the social glue that holds our societies together and keeps us from clawing for our own fortune all the time.


A new study published in the journal Current Biology says that empathy has a lot to do with how stressed you are around strangers. Student participants were placed in various scenarios: alone, with a friend, with a stranger, in pairs of strangers given a stress-blocking drug, and in pairs of strangers who had played Rock Band together for 15 minutes. They were then asked to submerge an arm in ice water and rate their pain. This is a common technique for studying how different stimuli and behaviors affect how one perceives pain. For the purposes of this study, an increase in reported pain above an initial baseline was taken as an indication of feeling another's pain, that is, empathy.
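Since the study's empathy measure boils down to "pain reported above a solo baseline," here is a minimal sketch of how such an index might be computed per condition. The ratings and condition names below are invented purely for illustration, not the study's data.

```python
def empathy_index(condition_ratings, baseline_rating):
    """Average increase in reported pain over the solo baseline.
    In the study's logic, a positive value is read as shared
    ('contagious') pain, i.e., empathy."""
    return sum(r - baseline_rating for r in condition_ratings) / len(condition_ratings)

# Hypothetical pain ratings on a 0-10 scale, invented for illustration
baseline_alone = 4.0
with_friend    = [5.5, 6.0, 5.0, 6.5]
with_stranger  = [4.0, 4.5, 3.5, 4.0]

print(f"friend condition:   {empathy_index(with_friend, baseline_alone):+.2f}")
print(f"stranger condition: {empathy_index(with_stranger, baseline_alone):+.2f}")
```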


Participants who were alone or sitting with a stranger reported experiencing the same level of pain. But when paired with a friend, or with someone they had just played 15 minutes of Rock Band with, participants reported experiencing more pain. "It would seem like more pain in the presence of a friend would be bad news, but it's in fact a sign that there is a strong empathy between individuals - they are indeed feeling each other's pain," said Mogil, one of the authors of the study. The group that was given the stress-blocking drug, which inhibited the fight-or-flight stress response, experienced more pain in the presence of a complete stranger, indicating an increase in empathy. So it would seem stress has a lot to do with our ability to empathize with one another.


Sometimes after a stressful day we contemplate only our own difficult matters rather than the struggles of others. That often narrows our perspective and keeps us from appreciating our own fortunate situations. Being social beings has benefitted our society in many ways over the course of our evolution, and continuing to be empathetic towards others only strengthens our social relationships in everyday life.


http://neurosciencenews.com/empathy-emotion-psychology-1714/

“Reducing Social Stress Elicits Emotional Contagion of Pain in Mouse and Human Strangers” by Loren J. Martin, Georgia Hathaway, Kelsey Isbester, Sara Mirali, Erinn L. Acland, Nils Niederstrasser, Peter M. Slepian, Zina Trost, Jennifer A. Bartz, Robert M. Sapolsky, Wendy F. Sternberg, Daniel J. Levitin, and Jeffrey S. Mogil in Current Biology. Published online January 15, 2015.

Eli

Wednesday, December 24, 2014

The Cognitive Advantages to Call of Duty

In the realms of psychology and cognitive science, video games like Call of Duty and Halo have often been associated with a higher propensity for violent behavior. If you ask any parent, they will probably tell you that they don't enjoy the idea of their children playing video games at all, let alone first-person shooters like COD. Recent polls suggest about 90% of school-age children play video games, but the average video gamer in today's market is in fact 33 years old. One month after the release of Call of Duty: Black Ops, Activision, the company that created the game, recorded a total of 600 million hours (about 68,000 years) of gameplay. Clearly, video games take up a huge proportion of our leisure time. This has allowed researchers to study the effects of action video games on the brain, and the good news is that there are a host of cognitive advantages.

In a 2012 TED Talk, Daphne Bavelier shares her research on the effects of action video game play on the visual and working memory systems of the brain. While the non-gamers in her study had corrected-to-normal vision, action video gamers, who played 5 to 10 hours per week, showed improved vision both on a standard acuity chart (the kind you see at the eye doctor) and in a gray-scale contrast test. Visual attention is another advantage action video game players show over non-gamers and non-action gamers, specifically in our ability to track objects. We use visual attention every day; it's what helps us keep track of cars at an intersection or of players when we're playing sports. In the experiment, action video game players were able to keep track of more moving objects in a visual field than non-action video game players.

In a more recent study, from 2013, Kara Blacker and Kim Curby found an advantage in visual short-term memory among action video game players. Given a memory array, participants were asked to encode the shapes within it and respond as to whether a shape changed between displays. Action video gamers were better able to detect the changed stimuli, indicating superior visual short-term memory. Moreover, a significant finding in their study was that the length of encoding time had no effect on the action gamers' advantage, meaning the action video gamers encoded the visual information in the memory arrays very efficiently.
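To make the task structure concrete, here is a minimal sketch of a change-detection trial of the kind described above. The shape set, array size, and scoring are assumptions for illustration only, not Blacker and Curby's actual stimuli or procedure.

```python
import random

SHAPES = ["circle", "square", "triangle", "star", "cross"]

def make_trial(array_size=4, change_probability=0.5):
    """Build one change-detection trial: a sample array, a test array that
    either matches it or has one shape swapped out, and the correct answer."""
    sample = random.sample(SHAPES, array_size)
    test = list(sample)
    changed = random.random() < change_probability
    if changed:
        idx = random.randrange(array_size)
        test[idx] = random.choice([s for s in SHAPES if s not in sample])
    return sample, test, changed

def accuracy(responses, answers):
    """Proportion of trials where the 'changed / unchanged' response was correct."""
    return sum(r == a for r, a in zip(responses, answers)) / len(answers)

# Example: generate a few trials and score perfect responses
trials = [make_trial() for _ in range(10)]
answers = [changed for _, _, changed in trials]
print(f"accuracy: {accuracy(answers, answers):.0%}")
```

In the study, the gamers' advantage showed up as higher accuracy on trials like these, regardless of how long the sample array stayed on screen.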

The impact of technology on the brain is often seen in a negative light. These studies show that a moderate amount of action video game play can bring significant cognitive advantages that carry over into everyday life. So if you're a parent debating whether to buy your 17+ (Mature-rated) kid that new Call of Duty game for fear that their brain is going to rot: everything is going to be okay; they are going to develop significant cognitive advantages from playing in moderate amounts.

Blacker, Kara J., and Kim M. Curby. "Enhanced Visual Short-term Memory in Action Video Game Players." Attention, Perception, & Psychophysics 75.6 (2013): 1128-136. Web.


Happy Holidays

Eli