Category Archives: MIT

Harvard: Educate Leaders; MIT: Nurture Geniuses

Source: Marginal Revolution, Sep 2019

Using publicly released reports, we examine the preferences Harvard gives for recruited athletes, legacies, those on the dean’s interest list, and children of faculty and staff (ALDCs). Among white admits, over 43% are ALDC. Among admits who are African American, Asian American, and Hispanic, the share is less than 16% each.

Our model of admissions shows that roughly three quarters of white ALDC admits would have been rejected if they had been treated as white non-ALDCs. Removing preferences for athletes and legacies would significantly alter the racial distribution of admitted students, with the share of white admits falling and all other groups rising or remaining unchanged.

A Totally Black Diamond

Source: MIT News, Sep 2019

… a 16.78-carat natural yellow diamond from LJ West Diamonds, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.

AI Startups & AI Skills Impact on Pay

Source: Gofman.info, Aug 2019

JCR Licklider – Ideas for the Digital Age

Source: Wikipedia, Aug 2019

Licklider became interested in information technology early in his career. His ideas foretold of graphical computing, point-and-click interfaces, digital libraries, e-commerce, online banking, and software that would exist on a network and migrate wherever it was needed. Much like Vannevar Bush’s, Licklider’s contribution to the development of the Internet consists of ideas, not inventions. He foresaw the need for networked computers with easy user interfaces.

Licklider was instrumental in conceiving, funding, and managing the research that led to modern personal computers and the Internet. In 1960 his seminal paper on “Man-Computer Symbiosis”[23] foreshadowed interactive computing, and he went on to fund early efforts in time-sharing and application development, most notably the work of Douglas Engelbart, who founded the Augmentation Research Center at Stanford Research Institute and created the famous On-Line System, where the computer mouse was invented.

Related Resources:

http://iae-pedia.org/J.C.R._Licklider, date indeterminate

Licklider at DARPA

DARPA (the Defense Advanced Research Projects Agency) has provided leadership and funding for a number of world-changing projects.

Quoting from http://www.nationmaster.com/encyclopedia/J.C.R.-Licklider:

In October 1962 Licklider was appointed head of the DARPA information processing office, part of the United States Department of Defense Advanced Research Projects Agency. He would then convince Ivan Sutherland, Bob Taylor, and Lawrence G. Roberts that an all-encompassing computer network was a very important concept.

When Licklider began his work at ARPA, there were no Ph.D. programs in computer science at American universities. ARPA began handing out grants to promising students, a practice that convinced MIT, Stanford, University of California at Berkeley, and Carnegie Mellon University to start their own graduate programs in computer science in 1965. This is certainly one of Licklider’s most lasting legacies.

Source: SciHi.org, Mar 2019

During his active years in computer science, Licklider conceived, managed, and funded the research that led to modern computers and the Internet as we know it today.

His 1960 paper “Man-Computer Symbiosis” was revolutionary and foreshadowed interactive computing, inspiring many other scientists to continue early efforts on time-sharing and application development. One of the scientists funded through Licklider’s efforts was the American computer scientist Douglas Engelbart, whose work led to the invention of the computer mouse.[4]

In August 1962, in a series of memos, Licklider described a global computer network that contained almost all the ideas that now characterize the Internet.

With a huge budget at his disposal, he hired the best computer scientists from Stanford University, MIT, UCLA, Berkeley, and selected companies for his ARPA research. He jokingly described this group of roughly a dozen researchers, with whom he kept up a close exchange, as the “Intergalactic Computer Network.”

About six months after starting his work, he circulated a memo to this informal group criticizing the proliferation of different programming languages, debugging programs, and documentation procedures, which he saw as a threat to a hypothetical future computer network, and initiated a discussion on standardization.

MIT MicroMasters → MIT Degrees

Source: MIT News, Aug 2019

MIT President Rafael Reif

  • “The second person to set foot on the moon was Buzz Aldrin, ScD ’63—the first astronaut to have a doctoral degree.”
  • “Of the 12 humans who have walked on the moon, four graduated from MIT.”

MIT Contributes to Skynet

Source: MIT News, Aug 2018

From Van Gogh to the “Scream”

Source: MIT, Jul 2019

The system takes as input grayscale image examples — such as the flat actuator that displays the Van Gogh portrait but tilts at an exact angle to show “The Scream.” It basically executes a complex form of trial and error that’s somewhat like rearranging a Rubik’s Cube, but in this case around 5.5 million voxels are iteratively reconfigured to match an image and meet a measured angle.
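The article doesn’t give the algorithm’s details, but the “trial and error” it describes can be pictured as a simple hill climb: toggle one voxel at a time and keep only the changes that bring the rendered grayscale pattern closer to the target image. The sketch below is a hypothetical toy version of that idea (the function name, the 64×64 grid, and the mean-squared-error objective are all assumptions, not the MIT team’s actual method):

```python
# Hypothetical sketch only: random single-voxel flips, keeping improvements.
# This is not the MIT team's algorithm; it just illustrates iterative
# reconfiguration of a voxel grid to match a target grayscale image.
import numpy as np

def hill_climb_voxels(target, iterations=20_000, seed=0):
    """target: 2D array of grayscale values in [0, 1]."""
    rng = np.random.default_rng(seed)
    voxels = rng.random(target.shape) < 0.5          # random on/off starting grid

    def error(v):
        return np.mean((v.astype(float) - target) ** 2)

    best = error(voxels)
    for _ in range(iterations):
        i = rng.integers(target.shape[0])
        j = rng.integers(target.shape[1])
        voxels[i, j] = ~voxels[i, j]                 # trial: flip one voxel
        e = error(voxels)
        if e < best:
            best = e                                 # keep the improvement
        else:
            voxels[i, j] = ~voxels[i, j]             # revert the failed trial
    return voxels, best

# Example: approximate a left-to-right gradient with a 64x64 binary grid.
target = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
grid, err = hill_climb_voxels(target)
print(f"final mean-squared error: {err:.4f}")
```

The real system also has to satisfy the measured tilt angle and operates on roughly 5.5 million voxels, so it needs a far more efficient search than this brute-force loop.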

NASA Helped to Birth the Computer Age

Source: Fast Company, Jun 2019

The computer that flew the astronauts to the Moon—the Apollo guidance computer—was a marvel of the 1960s: small, fast, nimble, and designed for the people who were using it, astronauts flying spaceships.

It also represented a huge leap for NASA. The risk was in the cutting-edge technology that MIT used to squeeze as much power and speed as possible into the computer’s slim, briefcase-sized case, one of the boldest and riskiest bets of the whole Moon mission, and one that few knew about or appreciated at the time.

The MIT Instrumentation Lab tried to design the Apollo computer using transistors, which in the early 1960s were well-settled technology—reliable, understandable, relatively inexpensive. But 15 months into the design effort, it became clear that transistors alone couldn’t give the astronauts the computing power they needed to fly to the Moon. In November 1962, MIT’s engineers got NASA’s permission to use a very new technology: integrated circuits.

As part of its early evaluation, MIT bought 64 integrated circuits from Texas Instruments. The price was $1,000 each, or $9,000 apiece in 2019 dollars. Each had six transistors.

But integrated circuits would change what it was possible for the Apollo computer to do. They would increase its speed by 2.5 times while allowing a reduction in space of 40% (the computer didn’t get smaller; it just got packed with more capacity).

The Apollo computers were the most sophisticated general-purpose computers of their moment. They took in data from dozens of sensors, from radar, directly from Mission Control. They kept the spaceships oriented and on course. They were, in fact, capable of flying the whole mission on autopilot, while also telling the astronauts what was going on in real time.

MIT did two things to solve the problems of those first integrated circuits. Working with early chip companies—Fairchild Semiconductor, Texas Instruments, Philco—it drove the manufacturing quality of computer chips up by a factor of 1,000. MIT had a battery of a dozen acceptance tests for the computer chips it bought, and if even one chip in a lot of 1,000 failed one test, MIT packed up the whole lot and sent it back.

And MIT, on behalf of NASA, bought so many of the early chips that it drove the price down dramatically: from $1,000 a chip in that first order to $15 a chip in 1963, when MIT was ordering lots of 3,000. By 1969, those basic chips cost $1.58 each, except they had significantly more capability, and a lot more reliability, than the 1963 version.
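A quick back-of-the-envelope calculation, using only the prices quoted above, shows just how steep that decline was (the year spans are rough, taken from the article’s timeline):

```python
# Chip prices quoted above: ~$1,000 (1962), ~$15 (1963), ~$1.58 (1969).
prices = {1962: 1000.00, 1963: 15.00, 1969: 1.58}

print(f"1962 -> 1969: ~{prices[1962] / prices[1969]:.0f}x cheaper")   # ~633x
print(f"1963 -> 1969: ~{prices[1963] / prices[1969]:.1f}x cheaper")   # ~9.5x

# Implied average yearly decline over the six years from 1963 to 1969:
yearly = (prices[1969] / prices[1963]) ** (1 / 6)
print(f"on average, each year's price was ~{yearly:.2f}x the previous year's")
```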

MIT and NASA were able to do all that because, year after year, Apollo was the No. 1 customer for computer chips in the world.

  • In 1962, the U.S. government bought 100% of integrated circuit production.
  • In 1963, the U.S. government bought 85%.
  • In 1964, 85%.
  • In 1965, 72%.

Even as the share dropped, total purchasing soared. The 1965 volume was 20 times what it had been just three years earlier.

Inside the government, the only buyers were NASA and the Air Force’s Minuteman missile program, a relatively small project compared with the Apollo computers.

Without knowing it, the world was witnessing the birth of “Moore’s Law,” the driving idea of the computer world that the capability of computer chips would double every two years, even as the cost came down.
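As a rough sense of what that cadence implies (assuming the two-year doubling period stated above), the compounding looks like this:

```python
# Doubling every two years: after t years, capability grows by 2 ** (t / 2).
for years in (2, 6, 10, 20):
    print(f"{years:2d} years -> {2 ** (years / 2):,.0f}x")
# 2x, 8x, 32x, and 1,024x respectively
```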

In fact, Fairchild Semiconductor’s Gordon Moore wrote the paper outlining Moore’s Law in 1965, when NASA had been the No. 1 buyer of computer chips in the world for four years, and the only user of integrated circuits that Moore cites by name in that paper is “Apollo, for manned Moon flight.” Moore would go on to cofound and lead Intel, and help drive the digital revolution into all elements of society.

What was Moore doing when he conceived of Moore’s Law? He was director of research and development. Fairchild’s most significant customer: MIT’s Apollo computer.

It’s a part of the history of modern computing that Silicon Valley manages to skip over most of the time, but MIT, NASA, and the race to the Moon laid the very foundation of the digital revolution, of the world we all live in.

@MIT: Learning to Recover from Failure

Source: MIT, May 2019

This very human tendency to hide our failures was the inspiration for FAIL!, an event series committed to destigmatizing failure.

At each conference, prominent scholars from MIT and Harvard University share 10-minute stories of personal, academic, and professional failures, followed by a Q&A session with the audience. By learning of the challenges and missteps of highly successful people, the organizers hope to reduce the discouragement and isolation attendees may feel when confronted with their own failures.

MIT professor of computer science Daniel Jackson, who recently published a book on resilience at MIT, opened this April’s FAIL! Conference by reflecting on the different types of failure. “There’s what I call ‘little-f failure’ and ‘big-F failure,’” he said. “Little-f failure is when you do something and you screw it up … Big-F failure is when your whole life comes to nothing.”

Big-F failures, he noted, are relatively rare, although fear of them can lead people to avoid taking worthwhile risks and limit their ability to lead full, meaningful lives. “Talking about fear and failure is the key to changing ourselves and the culture in which we live,” said Jackson, emphasizing the importance of events like FAIL! that create spaces to explore these topics.

Professor of humanities, sociology, and anthropology Susan Silbey, who was recently awarded MIT’s highest faculty honor, the Killian Award, spoke after Jackson. Although Silbey has had a celebrated career with seemingly few little-f failures, she struggled to find direction and mentorship as a graduate student.

“I started my PhD two months after I graduated college,” said Silbey. “In 1962 there weren’t very many women who joined PhD programs at the University of Chicago. That was quite extraordinary in that year. What was more extraordinary is that I did not graduate until 1978. Sixteen years. That is not the career of a star: That is a failure.” Silbey credited her eventual success to her love of learning and research, regardless of the topic she was studying.

Harvard Medical School professor of genetics George Church spoke at the FAIL! Conference held in November 2018. Those who know him as a founding father of synthetic biology would be surprised to learn that he spent six months homeless and failed out of graduate school at Duke University prior to being accepted to a PhD program at Harvard University, where he later graduated.

Church encouraged the audience to not only embrace their own failures, but to learn from the failures of others. “I’ve learned as much from my negative role models as I did from my positive ones,” he said. “They had trouble, and you’re trying to learn from their trouble without personally experiencing it.”

“FAIL! is about being human,” adds Benedetti. “We all need inspiring and realistic role models. By sharing the challenges and vulnerabilities that many people try to hide, our brave speakers are helping to create an environment where students feel comfortable being themselves and expressing their creativity. We believe that FAIL! is providing a model of thoughtfulness and humility, which will inspire attendees to be better leaders.”

Captain America @ MIT

Source: MIT, May 2019

On April 27 the MIT Great Dome was transformed. This time, it was decked out as Captain America’s shield in honor of the release of Avengers: Endgame.

The hack even got the attention of the actor who plays Captain America, Sudbury native Chris Evans, who tweeted the Boston Globe’s article about it.

https://twitter.com/ChrisEvans/status/1122871898545762304
