In a sad but expected follow-up to Chris's post from a little over a month ago, this entry marks the passing of Steve Jobs, co-founder of Apple, who has died at the age of 56.
While many renowned leaders of industry are remembered for one big thing, or can have their accomplishments summarized in a cohesive way, Jobs had a career that can hardly be characterized by one--or even two, or three--major accomplishments. From the Apple II, to the original Macintosh, to Pixar, all the way to the iMac, iPod, iPhone, and iTunes Music Store, Jobs continually reinvented himself and created new technological landmarks. Perhaps the one thing these all have in common is the way they encourage us to take technology for granted, with the goal of elevating the user experience. As many pointed out during his lifetime, and continue to point out after his passing, this approach was a double-edged sword that often cut away useful functionality right along with the cruft. But I think it is also something that will cement Jobs's legacy and importance to the field of computing long after his death. What do you think?
The Washington Post has a very lengthy and insightful obituary of Jobs, and the obituaries in The Guardian and New York Times are also worth a look. It's interesting to compare them with the premature obituary of Jobs that was accidentally published in 2008. You can also read what Jobs himself had to say about his life, and a short but telling piece from an Apple fan on ZDNet.
Recently, the BBC reported that the London Science Museum plans to add to its collection in the history of computing by digitizing Charles Babbage's huge store of design notes on the Analytical Engine. Though the 19th-century Analytical Engine is often pointed to as a machine that presaged the modern computer, a working version was never fully built in Babbage's lifetime (although the notes on the potential machine resulted in the first computer program, written by Ada Byron, Countess of Lovelace). And historians have not been the only ones fascinated with this machine--alternate histories in which the Analytical Engine was successfully built form the bedrock of a significant amount of science fiction, particularly in the steampunk subgenre.
Those of you in or around NYC might be interested in the exhibit series called the Silent Series at the New Museum, which aims to present interactions between contemporary art and technology.
SIGCIS Workshop 2011: Cultures and Communities in the History of Computing
1) The program is now complete. Work-in-progress and dissertation-in-progress session papers are now available to read right here! Or, click on each paper title on the program and then look for the "DOWNLOAD A COPY OF THIS PAPER" link above the abstract.
A recent Wired article on Khan Academy gave me a distinct sense of déjà vu. A pre-programmed set of lessons that are written once, and then can be used by kids anywhere in the country? They allow students to proceed at their own pace, simulating the advantages of one-on-one tutorial instruction? They ensure that a student has mastered a given concept before allowing him or her to move on to more advanced material? Data on student performance is automatically collected for analysis by educators? A claim that all of this is totally new and is going to revolutionize the staid old American education system? Where have I heard all this before...