On Monday, Carnegie Mellon University published “The Untold History of Multitouch” in the summer edition of its School of Computer Science’s LINK newsletter.
The article details how Roger Dannenberg, Paul McAvinney, and Dean Rubine worked together to create one of the world’s first multitouch interfaces, starting back in 1983. As is the case with many of our most important technologies, multitouch was initially conceived to serve a creative purpose: to facilitate generating music on a computer. A single-touch interface wasn’t suitable, so this group of musically inclined computer scientists set out to fix that… and did so by creating a multitouch interface they called the Sensor Frame.
In 1985 the group published a paper titled “The Sensor Frame™: A Device for the Manipulation of Graphic Objects”, detailing many aspects of how the interface worked, including zoom-in and zoom-out gestures quite similar to today’s pinch-to-zoom on iOS. Later that same year, Steve Jobs brought a team from NeXT over to see what the folks at CMU were up to, a visit that included a tour of the Sensor Frame lab. Jobs himself reluctantly signed an NDA before being shown the technology. That visit – and the NDA – have complicated Apple’s pinch-to-zoom patent claims over the past ten years, and continue to cause issues to this day.
In the video below, a young Dean Rubine talks through the specifics of multitouch. It’s pretty amazing to see this technology in action decades before it was introduced to most of us. The multitouch section begins at the 7-minute mark, but the whole thing is fascinating to watch.
You can read the whole piece starting on page 9 of the CMU LINK Summer Edition. We had some trouble getting it to load in Safari, but Firefox had no issues displaying the PDF.