It’s easy to forget that people once marveled at the interactive tablets carried around by characters in Gene Roddenberry’s Star Trek, or the touch screens in Spielberg’s Minority Report – dream futures that have become reality.
In 1983 Apple co-founder Steve Jobs stated that his ultimate ambition was
“to put an incredibly great computer in a book that you can carry around with you and learn how to use in 20 minutes…”.
That vision would take him 27 years to realize.
With the release of the iPad, mobile devices reached a new quality because of its screen size and the ability to operate it with just the fingers, without any peripherals. The iPad initiated a paradigm shift in teaching and learning as well, and from that time ‘people could hold the internet in their hands’.
Historically, the idea first surfaced when Apple’s founder Steve Jobs discussed the future of personal computing during an interview alongside Bill Gates at the fifth All Things Digital conference in 2007. At that time he described
“a category of devices that aren’t as general purpose, that are really more focused on specific functions, whether they’re phones or iPods or Zunes or what have you. And I think that category of devices is going to continue to be very innovative and we’re going to see lots of them.”
It wasn’t the first tablet computer by any means, and the iPad faced significant criticism when it was first announced – most pointedly the accusation that it was trying to fulfill a demand that didn’t exist. But the iPad surprised everyone, instantly spawning a market for a new form of computer that made the internet touchable and found its way into the hands of 170 million customers in its first three-and-a-half years.
There are some quite astonishing engineering feats behind the iPad. The iPad Air (launched in 2013), for example, weighs one-twentieth as much as an Apple Mac from ten years earlier, but packs four times the memory and four times the processing power. The glass that covers the screen is manufactured via a process which makes it harder than sapphire, and is coated with an invisible oil-repellent to reduce fingerprint staining.
But most incredible of all is the fact that Jobs succeeded in coming up with a portable – and connected – computer that even orangutans in the Smithsonian National Zoo have worked out how to use, forever altering the future of digital media in the process.
A tablet …
When Steve Jobs ended years of speculation in 2010 by announcing the iPad tablet device, he helped launch a new era in computer hardware. Though tablet PCs have been around for years, the iPad was the first device to use the form factor successfully in the consumer market.
So what exactly is a tablet?
At its most basic level, a tablet PC is a mobile computing device that’s larger than a smartphone or personal digital assistant. In general, if the computing device relies on an on-screen touch interface as its primary means of input, it’s a tablet.
To confuse matters, some manufacturers produce hybrid devices that are part tablet, part laptop computer. The device might come with an attached keyboard — the screen swivels or folds down to cover the keyboard and voila, you have a tablet!
In 2010, Lenovo introduced a prototype device called the IdeaPad U1 at the Consumer Electronics Show in Las Vegas, Nev. At first glance, it looked like a normal laptop computer. But if you detached the screen from the base, the laptop converted to a tablet computer with its own, independent operating system. Lenovo rebranded the device, naming it the Lenovo LePad and launching it in China in 2011.
The touch screen …
The work of thousands of scientists, their research and inventions, ongoing miniaturization, and improvements in sensor technology made the smartphone and the tablet ready for everyday use.
All tablets have a touch screen interface and an operating system capable of running small programs and interpreting gestures. Additional sensors, such as accelerometers, are used to automatically switch the orientation between portrait and landscape mode.
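As a hedged illustration of that orientation switch, the core decision can be sketched in a few lines. The axis conventions and sample values below are assumptions for the sketch, not any vendor’s actual API:

```python
# Hypothetical sketch: choosing screen orientation from accelerometer
# readings. Axis conventions (x = short edge, y = long edge) are assumed.

def orientation(ax: float, ay: float) -> str:
    """Return 'portrait' or 'landscape' from the gravity components
    (in m/s^2) measured along the device's x and y axes."""
    # Gravity dominates the axis pointing toward the ground:
    # long edge down means portrait, short edge down means landscape.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(orientation(0.1, -9.7))  # held upright -> portrait
print(orientation(9.6, 0.3))   # turned on its side -> landscape
```

A real device adds hysteresis and a dead zone so the screen doesn’t flicker between modes when held flat, but the principle is the same.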
In summary, it was the touch screen first introduced with the iPhone in 2007, together with its multi-touch-capable operating system iOS, that made the iPad a device able to replace computers for an increasing number of people.
The development of touch screen technology started in the 1960s, although it builds on results of basic research in physics dating back hundreds of years.
Historians consider the first touch screen to be a capacitive touch screen invented by E.A. Johnson at the Royal Radar Establishment, Malvern, UK, around 1965–1967. Johnson published a full description of touch screen technology for air traffic control in an article in 1968.
In 1971, a “touch sensor” was developed by Dr. Sam Hurst (founder of Elographics) while he was an instructor at the University of Kentucky. This sensor, called the “Elograph”, was patented by The University of Kentucky Research Foundation. The “Elograph” was not transparent like modern touch screens; however, it was a significant milestone in touch screen technology. The Elograph was selected by Industrial Research as one of the 100 Most Significant New Technical Products of the Year 1973.
In 1974, the first true touch screen incorporating a transparent surface came on the scene developed by Sam Hurst and Elographics. In 1977, Elographics developed and patented a resistive touch screen technology, the most popular touch screen technology in use today.
In 1977, Siemens Corporation financed an effort by Elographics to produce the first curved glass touch sensor interface, which became the first device to have the name “touch screen” attached to it. On February 24, 1994, the company officially changed its name from Elographics to Elo TouchSystems.
In 1983, the computer manufacturer Hewlett-Packard introduced the HP-150, a home computer with touch screen technology. The HP-150 had a built-in grid of infrared beams across the front of the monitor which detected finger movements. However, the infrared sensors would collect dust and require frequent cleaning.
The nineties introduced smart phones and handhelds with touch screen technology. In 1993, Apple released the Newton PDA, equipped with handwriting recognition; and IBM released the first smart phone called Simon, which featured a calendar, note pad, and fax function, and a touch screen interface that allowed users to dial phone numbers. In 1996, Palm entered the PDA market and advanced touch screen technology with its Pilot series.
In 2002, Microsoft introduced Windows XP Tablet PC Edition and started its entry into touch technology.
To this day I use a Siemens T4010 convertible running the XP Tablet edition, with a stylus and an incredible weight of 2.5 kg.
It just works, but it works more like a heater if you place it on your lap, and because it’s running XP you have to take care of its health with an antibiotic therapy each week.
However, you could say that the increase in the popularity of touch screen smart phones defined the 2000s. In 2001 Mitsubishi launched the DiamondTouch (a human interface device that has the capability of allowing multiple people to interact simultaneously while identifying which person is touching where).
Various companies expanded upon these inventions in the beginning of the twenty-first century. The company Fingerworks developed various multi-touch technologies between 1999 and 2005, including Touchstream keyboards and the iGesture Pad. Several studies of this technology were published in the early 2000s by Alan Hedge, professor of human factors and ergonomics at Cornell University. Apple acquired Fingerworks and its multi-touch technology in 2005. In 2007, Apple introduced the iPhone, still seen as the king of smartphones, with nothing but touch screen technology, followed by the iPad in 2010.
There are basically three components used in touch screen technology …
- 1 The touch sensor is a panel with a touch-responsive surface. Systems are built on different types of sensors: resistive (most common), surface acoustic wave, and capacitive (most smartphones). In general, sensors have an electrical current running through them, and touching the screen causes a voltage change. The voltage change signals the location of the finger.
- 2 The controller is the hardware that converts the voltage changes on the sensor into signals the device can receive.
- 3 The software interprets the information coming from the controller and tells the device what is happening on the sensor (who is touching what and where), allowing the operating system and apps to react accordingly.
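The controller’s job for a resistive sensor can be sketched minimally as a linear mapping from measured voltage to screen position. The reference voltage and screen resolution below are made-up example values, not any particular device’s specification:

```python
# Minimal sketch of a resistive touch controller's conversion step:
# the measured voltage divides linearly along each axis of the sensor.
# V_REF and the screen resolution are assumed example values.

V_REF = 3.3                # supply voltage across the sensor layer
WIDTH, HEIGHT = 1024, 768  # screen resolution in pixels

def voltage_to_pixel(vx: float, vy: float) -> tuple[int, int]:
    """Convert two measured voltages (0..V_REF) into pixel coordinates."""
    x = round(vx / V_REF * (WIDTH - 1))
    y = round(vy / V_REF * (HEIGHT - 1))
    return x, y

print(voltage_to_pixel(3.3, 0.0))  # touch in the far corner -> (1023, 0)
```

Real controllers also debounce the signal and average several readings to suppress electrical noise before reporting a position.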
To extend these capabilities, multi-touch was introduced to recognize the presence of two or more points of contact with the surface. This plural-point awareness is used to implement advanced functionality such as pinch-to-zoom.
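Pinch-to-zoom itself reduces to simple geometry: the zoom factor is the ratio of the current distance between two touch points to their distance when the gesture began. A hedged sketch (point coordinates in pixels, names my own):

```python
# Sketch of the pinch-to-zoom computation: scale is the ratio of the
# current finger separation to the separation at gesture start.
import math

Point = tuple[float, float]

def distance(p: Point, q: Point) -> float:
    """Euclidean distance between two touch points."""
    return math.hypot(q[0] - p[0], q[1] - p[1])

def pinch_scale(start_a: Point, start_b: Point,
                now_a: Point, now_b: Point) -> float:
    """Scale factor: > 1 means zoom in, < 1 means zoom out."""
    return distance(now_a, now_b) / distance(start_a, start_b)

# Fingers start 100 px apart and spread to 200 px apart: zoom in 2x.
print(pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)))  # 2.0
```

The operating system reports the touch points per frame; the app simply applies the resulting scale factor to the content being viewed.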
The iPad stands in the line of disruptive innovations. It is becoming a tool supporting all areas of life: household, education, science, health, communication, and more. It’s for primates, but still not for those apes who prefer other activities.
Thanks for stopping by.