Category Archives: Neuroscience

Cognitive Skills

Source: MIT, Mar 2015

different components of fluid intelligence peak at different ages, some as late as age 40.

… each cognitive skill they were testing peaked at a different age. For example, raw speed in processing information appears to peak around age 18 or 19, then immediately starts to decline. Meanwhile, short-term memory continues to improve until around age 25, when it levels off and then begins to drop around age 35.

For the ability to evaluate other people’s emotional states, the peak occurred much later, in the 40s or 50s.

Crystallized intelligence peaks later in life, as previously believed, but the researchers also found something unexpected: while data from the Wechsler IQ tests suggested that vocabulary peaks in the late 40s, the new data showed a later peak, in the late 60s or early 70s.


Mapping the Human Brain

Source: NYTimes, Jan 2015

device that imaged brain tissue with enough resolution to make out the connections between individual neurons. But drawing even a tiny wiring diagram required herculean efforts, as people traced the course of neurons through thousands of blurry black-and-white images. What the field needed, Tank said, was a computer program that could trace them automatically — a way to map the brain’s connections by the millions, opening a new area of scientific discovery. For Seung to tackle the problem, though, it would mean abandoning the work that had propelled him to the top of his discipline in favor of a highly speculative engineering project.

Seung published a paper in the prestigious journal Nature, demonstrating how the brain’s neural connections can be mapped — and discoveries made — using an ingenious mix of artificial intelligence and a competitive online game. Seung has also become the leading proponent of a plan, which he described in a 2012 book, to create a wiring diagram of all 100 trillion connections between the neurons of the human brain, an unimaginably vast and complex network known as the connectome.

With connectome mapping, Seung explained last month, it is possible to start answering questions that theorists have puzzled over for decades, including the ones that prompted him to put aside his own work in frustration. He is planning, among other things, to prove that he can find a specific memory in the brain of a mouse and show how neural connections sustain it. “I am going back to settle old scores,” he said.

The question is not whether a map can be made, but what insights it will bring. Will future generations cherish a cartographer’s work or shake their heads and deliver it up to the inclemencies?

The ur-map of this big science is the one produced by the Human Genome Project, a stem-to-stern accounting of the DNA that provides every cell’s genetic instructions. The genome project was completed faster than anyone expected, thanks to Moore’s Law, and has become an essential scientific tool. In its wake has come a proliferation of projects in the same vein — the proteome (proteins), the foldome (folding of proteins) — each promising a complete description of something or other. (One online listing includes the antiome: “The totality of people who object to the propagation of omes.”)

The Brain Initiative, the United States government’s 12-year, $4.5 billion brain-mapping effort, is a conscious echo of the genome project, but neuroscientists find themselves in a far more tenuous position at the outset. The brain might be mapped in a host of ways, and the initiative is pursuing many at once. In fact, Seung and his colleagues, who are receiving some of the funding, are working at the margins of contemporary neuroscience. Much of the field’s most exciting new technology has sought to track the brain’s activity — like functional M.R.I., with its images of parts of the brain “lighting up” — while the connectome would map the brain’s physical structure.

What makes the connectome’s relationship to our identity so difficult to understand, Seung told me, is that we associate our “self” with motion.

a cross-disciplinary group of researchers, including Seung, hit on a new way of thinking described as connectionism.

The basic idea (which borrows from computer science) is that simple units, connected in the right way, can give rise to surprising abilities (memory, recognition, reasoning). In computer chips, transistors and other basic electronic components are wired together to make powerful processors. In the brain, neurons are wired together — and rewired.

Every time a girl sees her dog (wagging tail, chocolate brown fur), a certain set of neurons fire; this churn of activity is like Seung’s Colorado River. When these neurons fire together, the connections between them grow stronger, forming a memory — a part of Seung’s riverbed, the connectome that shapes thought.
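The “fire together, grow stronger” mechanism described above is the classic Hebbian learning rule. The sketch below is purely illustrative (a toy NumPy model, not Seung’s actual work): repeated co-activation of the same units strengthens only the connections among them.

```python
import numpy as np

# Toy Hebbian-learning sketch: when two units are co-active,
# the connection (weight) between them is strengthened.
n_neurons = 4
weights = np.zeros((n_neurons, n_neurons))  # connection strengths
learning_rate = 0.1

# Each time the "dog" appears, the same set of neurons fires together.
dog_pattern = np.array([1.0, 1.0, 0.0, 1.0])  # which neurons are active

for _ in range(5):  # repeated exposures
    # Hebb's rule: dw_ij is proportional to activity_i * activity_j
    weights += learning_rate * np.outer(dog_pattern, dog_pattern)
np.fill_diagonal(weights, 0.0)  # no self-connections

# Connections among the co-active neurons are now strong;
# connections to the silent neuron remain zero -- the "riverbed"
# carved by repeated activity.
```

After five exposures, the weight between any two co-active units is 0.5, while every weight involving the inactive unit stays at zero.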

A typical human neuron has thousands of connections; a neuron can be as narrow as one ten-thousandth of a millimeter and yet stretch from one side of the head to the other. Only once have scientists ever managed to map the complete wiring diagram of an animal — a transparent worm called C. elegans, one millimeter long with just 302 neurons — and the work required a stunning display of resolve. Beginning in 1970 and led by the South African Nobel laureate Sydney Brenner, it involved painstakingly slicing the worm into thousands of sections, each one-thousandth the width of a human hair, to be photographed under an electron microscope.

That was the easy part. To pull a wiring diagram from the stack of images required identifying each neuron and then following it through the sections, a task akin to tracing the full length of every strand of pasta in a bowl of spaghetti and meatballs, using pens and thousands of blurry black-and-white photos. For C. elegans, this process alone consumed more than a dozen years. When Seung started, he estimated that it would take a single tracer roughly a million years to finish a cubic millimeter of human cortex — meaning that tracing an entire human brain would consume roughly one trillion years of labor. He would need a little help.
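The “roughly one trillion years” figure follows from simple arithmetic. A back-of-envelope check, assuming a total human brain volume of about 1.2 million cubic millimeters (a common textbook estimate, not a figure from the article):

```python
# Back-of-envelope check of the tracing estimate.
# Assumed figures: ~1e6 tracer-years per cubic millimeter (from the text)
# and ~1.2e6 mm^3 of human brain volume (textbook estimate, an assumption).
years_per_mm3 = 1_000_000
brain_volume_mm3 = 1_200_000

total_tracer_years = years_per_mm3 * brain_volume_mm3
print(f"{total_tracer_years:.1e} tracer-years")  # on the order of a trillion
```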

In 2012, Seung started EyeWire, an online game that challenges the public to trace neuronal wiring — now using computers, not pens — in the retina of a mouse’s eye. Seung’s artificial-intelligence algorithms process the raw images, then players earn points as they mark, paint-by-numbers style, the branches of a neuron through a three-dimensional cube. The game has attracted 165,000 players in 164 countries. In effect, Seung is employing artificial intelligence as a force multiplier for a global, all-volunteer army that has included Lorinda, a Missouri grandmother who also paints watercolors, and Iliyan (a.k.a. @crazyman4865), a high-school student in Bulgaria who once played for nearly 24 hours straight. Computers do what they can and then leave the rest to what remains the most potent pattern-recognition technology ever discovered: the human brain.

“Blink” Creativity

Source: Fast Company, Nov 2014

In the 1990s, cognitive scientists John Kounios and Mark Beeman started studying the insightful moment when you’re suddenly able to see things differently, also known as the “aha!” or eureka moment.

… right before the problem is presented, activity in the visual part of an analytical person’s brain would amp up to take in as much information as possible. On the other hand, the visual cortex would shut down for those who don’t solve problems in a methodical way, which allows them to block out the environment, look inward, and “find and retrieve subconscious ideas,” says Kounios.

The Visual Paradox

While more creative people shut down visual information before their “aha!” moment, they tend to take in more visual information than others on a daily basis. Kounios says that when these people walk down the street, they tend to study others, take in information, and may seem scattered about their own agenda. However, the information they take in and synthesize may be the product of unconscious processing for years before ideas emerge.

Those who are more analytical focus their attention more narrowly. When they walk down the street, they concentrate on where they’re going and how they’re going to get there. They tend not to stray into other areas of thought.

What you can do is be receptive and expose yourself to a lot of insight triggers. Positive moods also tend to promote eureka moments; anxiety, by contrast, promotes analytical thought.

Lastly, Kounios advises people who want epiphanies to get more sleep.

TED AI xPrize

Source: xPrize website, 2014

On March 20, from the TED2014 stage, Chris Anderson and Peter Diamandis joined forces to announce the A.I. XPRIZE presented by TED, a modern-day Turing test to be awarded to the first A.I. to walk or roll out on stage and present a TED Talk so compelling that it commands a standing ovation from you, the audience. The detailed rules are yet to be created, because the organizers want the audience’s help in deciding what they should be.

A See-Through Brain

Source: Nature, Apr 2013

A chemical treatment that turns whole organs transparent offers a big boost to the field of ‘connectomics’ — the push to map the brain’s fiendishly complicated wiring. Scientists could use the technique to view large networks of neurons with unprecedented ease and accuracy.

The new method instead allows researchers to see directly into optically transparent whole brains or thick blocks of brain tissue. Called CLARITY, it was devised by Karl Deisseroth and his team at Stanford University in California. “You can get right down to the fine structure of the system while not losing the big picture,” says Deisseroth, who adds that his group is in the process of rendering an entire human brain transparent.

Neuroscience of Learning

Source: KQED Mindshift, Aug 2014

Under a principle the Bjorks call desirable difficulty, when the brain has to work hard to retrieve a half-forgotten memory (such as when reviewing new vocabulary words you learned the day before), it redoubles the strength of that memory.

The brain is a foraging learner. … The human brain evolved to pick up valuable pieces of information here and there, on the fly, all the time, and put them all together, he said. It still does that — absorbing cues from daily life, overheard conversations, its own internal musings.

It keeps things in mind that are important to you (an unfinished project, for instance) and adds to your thoughts about them by subconsciously tuning in to any relevant information you see or hear around you. By foraging in this way, the brain is “building knowledge continually, and it’s not only during study or practice,” Carey said. And we’re not even completely aware of that.

We can be tactical in our schooling.

Students can tailor their preparation with techniques targeting different kinds of content or skills, and manage their schedule to optimize their time.

  • Breaking up and spacing out study time over days or weeks can substantially boost how much of the material students retain, and for longer, compared to lumping everything into a single, nose-to-the-grindstone session.
  • Varying the studying environment — by hitting the books in, say, a cafe or garden rather than only hunkering down in the library, or even by listening to different background music — can help reinforce and sharpen the memory of what you learn.
  • A 15-minute break to go for a walk or trawl on social media isn’t necessarily wasteful procrastination. Distractions and interruptions can allow for mental “incubation” and flashes of insight — but only if you’ve been working at a problem for a while and get stuck, according to a 2009 research meta-analysis.
  • Quizzing oneself on new material, such as by reciting it aloud from memory or trying to tell a friend about it, is a far more powerful way to master information than just re-reading it.
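The spacing idea in the first bullet can be sketched as a toy Leitner-style scheduler. Everything here is an illustrative assumption — the interval ladder and function names are invented for the example, not drawn from the article:

```python
from datetime import date, timedelta

# Toy spaced-repetition (Leitner-style) scheduler. The interval ladder
# below is an illustrative assumption, not a figure from the article.
INTERVALS_DAYS = [1, 3, 7, 14, 30]

def next_review(box: int, last_review: date, recalled: bool) -> tuple[int, date]:
    """Move a card up the ladder on successful recall, back to the
    start on failure, and return its new box plus the next due date."""
    box = min(box + 1, len(INTERVALS_DAYS) - 1) if recalled else 0
    return box, last_review + timedelta(days=INTERVALS_DAYS[box])

# Example: a vocabulary card recalled twice, then forgotten.
box, due = next_review(0, date(2015, 3, 1), recalled=True)   # up to box 1
box, due = next_review(box, due, recalled=True)              # up to box 2
box, due = next_review(box, due, recalled=False)             # back to box 0
```

The point of the design is exactly the desirable-difficulty principle above: successful retrievals push the next review further out, so each one happens when the memory is half-forgotten and the effort of recall strengthens it.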

Carey has fully incorporated the learning techniques into his own life … For example, when reading a difficult scientific journal article, “I realize I’m not going to understand a bunch of the stuff right away, no matter how hard I try or concentrate. I don’t let that slow me down.” He runs through it a few times, puts it aside and, spacing out his learning, tries again later, when the material almost always begins to gel.

Deadline pressure often forces him to start writing his article before he even has all the pieces, which is an “extremely valuable way to efficiently pick up the knowledge,” he said. “In effect, you’re testing yourself on how much you know… and you’re trying to write it clearly so you’re sort of teaching it, too. Those are two very effective study techniques.”

Storytelling Works in Technology

Source: Business Insider, Jul 2014

“I’ll ask the candidate to go through their prior successes and challenges and major responsibilities and tell that story, partially because I want to see how good they are at storytelling.”

“Neurological research has shown that our emotional reaction to stories is deeply rooted in biology,” he says. “In business, creating a compelling narrative is invaluable for motivating a team, explaining strategic priorities in a way that’s easy for others to understand, or communicating complex ideas to customers and prospects. Successful senior-level leaders are good storytellers, and it’s also a very useful skill early on in your career.”

storytelling is especially important in the tech industry because technology can be “very complex, and sometimes people find technical details to be somewhat boring.”

“It’s much easier to pay attention to something when you find it interesting, and you are much more likely to focus on technical details when they’re part of a well-crafted narrative,” he says.

“In the tech industry, storytelling ability is a critical skill for functions as diverse as software development, sales, strategy, or marketing, since a good story provides historical context, puts priorities in perspective, and clearly lays out a vision for the future.”

He says the ability to listen is equally as important.

“One underappreciated aspect of storytelling is the importance of listening. The best storytellers are continuously evolving their story in response to what they’re learning from their audience,” Jaffe explains. “Listening allows you to craft stories that create a strong emotional connection.”


Electronic and Organic Memory

Source: Annie Murphy Paul blog

it’s important to recognize the two types of memory, and the differences between them.

  1. The first is exemplified by young doctors’ use of UpToDate: E-memory is suited for targeted searches, while O-memory is best for building a broad, deep base of knowledge. 

    Research in cognitive science and psychology demonstrates that the ability to make quick and accurate judgments depends on the possession of extensive factual knowledge stored in memory — in internal, organic memory, that is, and not in a device. Browsing, as another physician put it in a commentary on Kassirer’s essay, is “an open-ended exploratory strategy that is driven by curiosity and creates the conditions needed for serendipity.”

  2. E-memory is good for invariant storage, while O-memory is good for elaborated connections. 
  3. E-memory is useful for checking the accuracy of our impressions, while O-memory is valuable for the self-knowledge it can foster.

    E-memory acts as a check on O-memory. But only O-memory endows recollections with meaning.

With our computers, we can search, store, and check.
With our minds, we can browse, elaborate, and reflect.

The intelligent user of memory in our connected world—what the philosopher Andy Clark calls a “canny cognizer”—combines the best of E-memory and O-memory, and knows what she’s up to as she does it.

Polymaths are Creatives

Source: Aeon, date indeterminate

Science … is polymathic. New ideas frequently come from the cross-fertilisation of two separate fields.

In business, cross-fertilisation is the source of all kinds of innovations. … To come up with such ideas, you need to know things outside your field. What’s more, the further afield your knowledge extends, the greater potential you have for innovation.

… the idea that it’s easier to learn when you’re young isn’t completely wrong, or at least it has a real basis in neurology. However, the pessimistic assumption that learning somehow ‘stops’ when you leave school or university or hit thirty is at odds with the evidence. It appears that a great deal depends on the nucleus basalis, located in the basal forebrain.

Among other things, this bit of the brain produces significant amounts of acetylcholine, a neurotransmitter that regulates the rate at which new connections are made between brain cells. This in turn dictates how readily we form memories of various kinds, and how strongly we retain them. When the nucleus basalis is ‘switched on’, acetylcholine flows and new connections occur. When it is switched off, we make far fewer new connections.

From research into the way stroke victims recover lost skills, it has been observed that the nucleus basalis only switches on when one of three conditions occurs: a novel situation, a shock, or intense focus, maintained through repetition or continuous application.

… simply attempting new things seems to offer health benefits to people who aren’t suffering from Alzheimer’s. After only short periods of trying, the ability to make new connections develops. And it isn’t just about doing puzzles and crosswords; you really have to try to learn something new.

by being more polymathic, you develop a better sense of proportion and balance — which gives you a better sense of humour.

Brain Facts

Source: National Geographic, Feb 2014

the brain’s wiring: the network of some 100,000 miles of nerve fibers, called white matter, that connects the various components of the mind, giving rise to everything we think, feel, and perceive.

each neuron is a distinct cell, separate from every other one. A neuron sends signals down tendrils known as axons. A tiny gap separates the ends of axons from the receiving ends of neurons, called dendrites. Scientists would later discover that axons dump a cocktail of chemicals into the gap to trigger a signal in the neighboring neuron.

Each neuron has on average 10,000 synapses. Is there some order to their connections to other neurons, or are they random? Do they prefer linking to one type of neuron over others?

every neuron made nearly all its connections with just one other one, scrupulously avoiding a connection with almost all the other neurons packed tightly around it. “They seem to care who they’re connected to,” Lichtman says.