Category Archives: Genius

The Genius Famine

Source: VDare, Jun 2017

… geniuses are a distinct psychological type.

They have extremely high intelligence, meaning they excel at quickly solving cognitive problems. This strongly predicts socioeconomic, educational and even social success. But geniuses combine this with relatively low conscientiousness and low empathy. They also tend to be uninterested in worldly things—money, sex, power—focused intensely on the intellectual pursuit of solving whatever seemingly unsolvable problem has come to obsess them. New ideas always break established rules and offend vested interests, but the genius couldn’t care less, claim Dutton and Charlton. This is why it is the genius who is able to make original, fantastic breakthroughs.

These kinds of people are fundamental to the growth and survival of civilization, the authors maintain. They are behind all major innovations. But, frighteningly, levels of genius have been in decline during the twentieth century. Measured from 1455 to 2004, macro-inventions—those that really changed the course of history—peaked in the nineteenth century and are now on the slide. So, what has happened? Why is genius dying out?

Based on representative samples, the authors show that reaction times have been getting longer since about 1900. Using reaction time as a proxy, IQ seems to have fallen by about 15 points between 1900 and 2000. That would make the doctors of today roughly the intellectual equals of the high school science teachers of 1900, and it means that, for purely genetic reasons, there would be a far smaller percentage of Turing-types today.

Intelligence is correlated with a trait known as ‘Intellect’: being open to new ideas and fascinated by intellectual pursuit. Until the 1950s, this kind of attitude underpinned the British university and perhaps even the US one—the book focuses on the UK. Academics were under no pressure to regularly publish or obtain grants. They were expected to teach and were given vast amounts of time to think and research based on the hope that some would produce works of genius.

Religion was part of the reason that universities were created along these lines. Their purpose was to reach a greater understanding of God’s Creation. If this involved frittering away money—with most academics not publishing anything—this didn’t matter. Some things are more important than money, such as the glory of God.

Since the 1960s, the authors note, universities have become bureaucratic businesses. This reflects the anti-intellectual, anti-religious attitude that their purpose is to make money. Academics contribute to this by getting funding, publishing frequently, and attending conferences.

All of this is anathema to the genius, who wants to be left alone to solve his problem. He also won’t tick the bureaucratic boxes that get you an academic position—Francis Crick, co-discoverer of the structure of DNA, was rejected from Cambridge, failed to get a top mark in his bachelor’s degree, and dropped out of assorted PhDs. As such, universities are less likely to appoint genius types.

They will appoint what Dutton and Charlton call the ‘head girl’ (at UK schools)—quite intelligent, socially skilled, conscientious; absolutely not a genius.

This person will be excellent at playing the academic game and will make a great colleague. But they won’t innovate; they won’t rock the boat. Once upon a time, the authors note, a ‘country vicar’ had lots of free time to research, but with the shrinking of the Church, the days of the Victorian ‘scholar-rector’ are long gone as well. The genius has no institution to nurture him and his potential will not be fulfilled.

Dutton and Charlton’s book predicts that genius will continue to decline and civilization will collapse because it is ultimately underpinned by intelligence and genius. Technology will reach a peak, stagnate, and go backwards, as there are fewer and fewer people intelligent enough to maintain and eventually even use it.

Life will become harsher and simpler and, eventually, more religious. At the moment, it seems that there’s nothing we can do to stop this short of a horrendous reversion to pre-Industrial levels of child mortality. But if we could better nurture genius then somebody might come up with a solution before it is too late.

So, the authors ask, how can we help geniuses?

Firstly, we need to identify them.

The genius is likely to be highly intelligent, but it will be a lop-sided kind of intelligence. Oxford University philosopher A. J. Ayer, for example, had such poor spatial intelligence that he never learnt to drive. The genius will combine this very narrow intelligence with very narrow interests. Thus, like Francis Crick, he might be rejected from a top university and do brilliantly on only some aspects of his degree. He’ll also be socially awkward and eccentric.

Secondly, we need to give them an environment in which they can flourish.

They tend to be useless at everyday things—Einstein had a tendency to get lost—so these need to be taken care of for them.

Thirdly, they are very fragile people and they are not usually interested in money.

They will work for the minimum they require as long as they are looked after and free to get on with problem solving. The mathematician Paul Erdős, note Dutton and Charlton, lived out of a suitcase and camped out with various math professors. They need long-term security so that they do not have to worry about ordinary things, which they are not interested in and are no good at.

If we can make these changes, insist Dutton and Charlton, then in spite of declining intelligence, it is possible that a genius may be produced who can develop a solution to this problem.

JCR Licklider – Ideas for the Digital Age

Source: Wikipedia, Aug 2019

Licklider became interested in information technology early in his career. His ideas foretold graphical computing, point-and-click interfaces, digital libraries, e-commerce, online banking, and software that would exist on a network and migrate wherever it was needed. Much like Vannevar Bush’s, Licklider’s contribution to the development of the Internet consists of ideas, not inventions. He foresaw the need for networked computers with easy user interfaces.

Licklider was instrumental in conceiving, funding and managing the research that led to modern personal computers and the Internet. In 1960 his seminal paper on “Man-Computer Symbiosis”[23] foreshadowed interactive computing, and he went on to fund early efforts in time-sharing and application development, most notably the work of Douglas Engelbart, who founded the Augmentation Research Center at Stanford Research Institute and created the famous On-Line System, where the computer mouse was invented.

Related Resources:

http://iae-pedia.org/J.C.R._Licklider, date indeterminate

Licklider at DARPA

DARPA (the Defense Advanced Research Projects Agency) has provided leadership and funding in a number of world-changing projects.

Quoting from http://www.nationmaster.com/encyclopedia/J.C.R.-Licklider:

In October 1962 Licklider was appointed head of the DARPA information processing office, part of the United States Department of Defense Advanced Research Projects Agency. He would then convince Ivan Sutherland, Bob Taylor, and Lawrence G. Roberts that an all-encompassing computer network was a very important concept.

When Licklider began his work at ARPA, there were no Ph.D. programs in computer science at American universities. ARPA began handing out grants to promising students, a practice that convinced MIT, Stanford, University of California at Berkeley, and Carnegie Mellon University to start their own graduate programs in computer science in 1965. This is certainly one of Licklider’s most lasting legacies.

Source: SCIHI.org, Mar 2019

During his active years in computer science, Licklider conceived, managed, and researched the fundamentals that led to modern computers and the Internet as we know it today.

His 1960 paper “Man-Computer Symbiosis” was revolutionary and foreshadowed interactive computing. It inspired many other scientists to continue early efforts on time-sharing and application development. One of the scientists funded through Licklider’s efforts was the famous American computer scientist Douglas Engelbart, whose work led to the invention of the computer mouse.[4]

In August 1962, in a series of memos, Licklider described a global computer network that contained almost all the ideas that now characterize the Internet.

With a huge budget at his disposal, he hired the best computer scientists from Stanford University, MIT, UCLA, Berkeley, and selected companies for his ARPA research. He jokingly described this group of roughly a dozen researchers, with whom he had a close exchange, as the “Intergalactic Computer Network”.

About six months after starting his work, he circulated a memo to this unofficial panel criticizing the proliferation of different programming languages, debugging programs, and documentation procedures, and initiating a discussion on standardization, since he saw this fragmentation as a threat to a hypothetical future computer network.

Asking Feynman: What Color is a Shadow?

Source: Nautilus, Jan 2019

I can remember a few times during my freshman year when I screwed up enough courage to say hello to Feynman before a seminar. Anything more would have been unimaginable at the time. But in my junior year, my roommate and I somehow summoned the nerve to knock on his office door to ask if he might consider teaching an unofficial course in which he would meet once a week with undergraduates like us to answer questions about anything we might ask. The whole thing would be informal, we told him. No homework, no tests, no grades, and no course credit. We knew he was an iconoclast with no patience for bureaucracy, and were hoping the lack of structure would appeal to him.

Feynman thought a moment and, much to our surprise, replied “Yes!” So every week for the next two years, my roommate and I joined dozens of other lucky students for a riveting and unforgettable afternoon with Dick Feynman.

Physics X always began with him entering the lecture hall and asking if anyone had any questions. Occasionally, someone wanted to ask about a topic on which Feynman was expert. Naturally, his answers to those questions were masterful.

In other cases, though, it was clear that Feynman had never thought about the question before. I always found those moments especially fascinating because I had the chance to watch how he engaged and struggled with a topic for the first time.

I vividly recall asking him something I considered intriguing, even though I was afraid he might think it trivial. “What color is a shadow?” I wanted to know.

After walking back and forth in front of the lecture room for a minute, Feynman grabbed on to the question with gusto. He launched into a discussion of the subtle gradations and variations in a shadow, then the nature of light, then the perception of color, then shadows on the moon, then earthshine on the moon, then the formation of the moon, and so on, and so on, and so on. I was spellbound.

One of the most important things Feynman ever taught me was that some of the most exciting scientific surprises can be discovered in everyday phenomena. All you need do is take the time to observe things carefully and ask yourself good questions.

He also influenced my belief that there is no reason to succumb to external pressures that try to force you to specialize in a single area of science, as many scientists do. Feynman showed me by example that it is acceptable to explore a diversity of fields if that is where your curiosity leads.

I also learned that “impossible,” when used by Feynman, did not necessarily mean “unachievable” or “ridiculous.” Sometimes it meant, “Wow! Here is something amazing that contradicts what we would normally expect to be true. This is worth understanding!”

Einstein’s Desk @ the Institute for Advanced Study

Source: Princeton University, Nov 2015

Feynman: How A Genius Thinks

Source: Forbes, Jun 2014

The philosopher Arthur Schopenhauer once said that “talent hits a target no one else can hit; genius hits a target no one else can see.” Lots of people are smart, but true genius has always had an element of mystery to it.

Nobody really knows where genius comes from.  While surely there is a genetic component, most child prodigies do not attain outstanding professional success.  Some creativity experts consider genius to be a method as much as it is an ability.

When you read Feynman’s talk, you get the feeling that he is not so much a physicist or an engineer, but an explorer.  Much like the famous biologist E.O. Wilson, he wanders around the nano-ecosystem, picking up objects of interest, examining them, figuring out where they fit and moving on.

Emmy Noether

Source: Science News, Jun 2018

Noether divined a link between two important concepts in physics: conservation laws and symmetries. A conservation law — conservation of energy, for example — states that a particular quantity must remain constant. No matter how hard we try, energy can’t be created or destroyed. The certainty of energy conservation helps physicists solve many problems, from calculating the speed of a ball rolling down a hill to understanding the processes of nuclear fusion.
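
To see how this pays off in the ball example, here is a minimal worked equation (my own sketch, not from the article), assuming the ball is released from rest at height h and ignoring friction and rotation:

% Energy conservation: the ball's total energy at the top equals its total energy at the bottom.
\begin{align*}
  mgh &= \tfrac{1}{2} m v^{2} && \text{(potential energy at the top = kinetic energy at the bottom)} \\
  v   &= \sqrt{2gh}           && \text{(the mass $m$ cancels)}
\end{align*}

For a 5-meter hill this gives v = √(2 × 9.8 × 5) ≈ 9.9 m/s, with no need to track the ball's motion along the slope—that is the kind of shortcut the certainty of conservation buys.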

Symmetries describe changes that can be made without altering how an object looks or acts. A sphere is perfectly symmetric: rotate it in any direction and it appears the same. Likewise, symmetries pervade the laws of physics: the equations are the same at different places and times.

Noether’s theorem proclaims that every such symmetry has an associated conservation law, and vice versa — for every conservation law, there’s an associated symmetry.

Conservation of energy is tied to the fact that physics is the same today as it was yesterday. Likewise, conservation of momentum, the theorem says, is associated with the fact that physics is the same here as it is anywhere else in the universe. These connections reveal a rhyme and reason behind properties of the universe that seemed arbitrary before that relationship was known.
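
For readers who want the statement in symbols, here is a compact sketch in the Lagrangian form in which the theorem is usually taught (the notation is mine, not the article's):

% Noether's theorem for a one-parameter symmetry of a Lagrangian L(q, \dot{q}).
% If L is unchanged by the shift q -> q + \epsilon K(q), the charge J is conserved.
\begin{align*}
  q \to q + \epsilon K(q), \quad \delta L = 0
  \quad\Longrightarrow\quad
  J = \frac{\partial L}{\partial \dot{q}}\, K(q), \qquad \frac{dJ}{dt} = 0 .
\end{align*}
% Spatial translation (K = 1) makes J the ordinary momentum, matching the
% "same here as anywhere else" connection above; time-translation invariance
% (\partial L / \partial t = 0) conserves the energy E = \dot{q}\,\partial L/\partial \dot{q} - L
% by the analogous argument.

Taking K = 1, so that every position shifts by the same amount and the physics is unchanged, is exactly the symmetry that pays out conservation of momentum.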

James Clerk Maxwell

Source: Clerk Maxwell Foundation, Jun 2002

Paul Dirac & Abdus Salam – Thinking Geometrically and Algebraically

Michael Atiyah – Geometry & Algebra