PaperPhone and Snaplet provide a glimpse into the future of flexible gadgets

By Damir Beciri

An advanced “thin-film” flexible paper computer has been developed in a collaboration between researchers at Queen’s University in Ontario, Canada, and Arizona State University in the USA. Two variations of the flexible paper computer are going to be unveiled on May 10 at the Association for Computing Machinery’s CHI 2011 (Computer-Human Interaction) conference in Vancouver, Canada.
“This computer looks, feels and operates like a small sheet of interactive paper”, said its inventor, Roel Vertegaal, the director of the Human Media Lab at Queen’s University. “You interact with it by bending it into a cell phone, flipping the corner to turn pages, or writing on it with a pen.”
Hardware for a prototype of the thin-film computer/phone device has been provided by Nicholas Colaneri, director of ASU’s Flexible Display Center, and Jann Kaminski, a display engineering manager at the center. The device is connected to a laptop running a Max 5 patch that processes sensor data, performs gesture recognition and sends images to the display. Pen tracking is fully embedded on the device itself. The researchers plan to exhibit two prototypes based on this technology – a smartphone prototype called PaperPhone and a thin-film wristband computer called Snaplet.
The PaperPhone is capable of performing the usual smartphone functions, such as storing books, playing music or making phone calls. Its display, however, is a 9.5 cm (3.75 inch) diagonal Bloodhound flexible electrophoretic display, augmented with a layer of five Flexpoint bi-directional bend sensors and six pressure sensors.
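As a rough illustration of how readings from a handful of bend sensors might be turned into discrete gestures, here is a minimal Python sketch. The sensor layout, the normalized value ranges and the thresholds are assumptions made purely for demonstration; the actual prototype performs this processing in a Max 5 patch on the attached laptop.

```python
# Minimal sketch of bend-gesture classification from raw sensor readings.
# Sensor names, value ranges and thresholds are illustrative assumptions,
# not the researchers' actual Max 5 implementation.

# Five bi-directional bend sensors: positive = bent up, negative = bent down.
SENSOR_NAMES = ["top_left", "top_right", "side", "bottom_left", "bottom_right"]
BEND_THRESHOLD = 0.3  # normalized reading beyond which a sensor counts as "bent"

def classify_gesture(readings):
    """Map normalized sensor readings (-1.0..1.0) to a named bend gesture."""
    bent = {name: value for name, value in zip(SENSOR_NAMES, readings)
            if abs(value) > BEND_THRESHOLD}
    if not bent:
        return "flat"
    if "top_right" in bent and len(bent) == 1:
        # A single corner flip, e.g. turning a page forward or backward.
        return "page_forward" if bent["top_right"] > 0 else "page_back"
    if "side" in bent:
        # Bending the whole side of the display, e.g. opening a menu.
        return "side_bend_up" if bent["side"] > 0 else "side_bend_down"
    return "unknown"

# Example: the top-right corner flipped upward.
print(classify_gesture([0.0, 0.6, 0.1, 0.0, -0.1]))  # -> "page_forward"
```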


An interactive gesture-recognition system for the PaperPhone has been developed by Byron Lahey, a doctoral student in ASU’s School of Arts, Media and Engineering, and Winslow Burleson, an assistant professor in the School of Computing, Informatics and Decision Systems Engineering, one of ASU’s Ira A. Fulton Schools of Engineering.

Watch video: https://youtu.be/Rl-qygUEE2c

“Using real-time sensing and modeling of dynamic inputs we were able to develop and evaluate an entirely new array of interactions on a first-of-its-kind mobile platform”, said Burleson, who specializes in human-computer interaction and leads the Motivational Environments Research Group. “This allows natural bend gestures and interaction on the PaperPhone display to navigate through maps, contact lists, or music playlists, in ways that resemble how such content appears on paper documents.”
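To make the paper-like navigation concrete, here is a hedged sketch of how recognized gestures could drive movement through a contact list or playlist. The gesture names match the earlier sketch; the dispatch scheme is purely illustrative and is not taken from the researchers’ implementation.

```python
# Illustrative sketch: mapping recognized bend gestures to list navigation.
# The gesture names and this dispatch scheme are assumptions for demonstration.

class ListNavigator:
    """Navigate a contact list (or playlist) with bend gestures."""

    def __init__(self, items):
        self.items = items
        self.index = 0

    def handle_gesture(self, gesture):
        # A corner flip forward or backward moves through the list,
        # much like leafing through pages of a paper document.
        if gesture == "page_forward":
            self.index = min(self.index + 1, len(self.items) - 1)
        elif gesture == "page_back":
            self.index = max(self.index - 1, 0)
        return self.items[self.index]

nav = ListNavigator(["Alice", "Bob", "Carol"])
print(nav.handle_gesture("page_forward"))  # -> "Bob"
print(nav.handle_gesture("page_back"))     # -> "Alice"
```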

Snaplet uses the same display, but it was designed for three wearable application contexts – a watch context, a PDA context, and a mobile phone context. Each context is associated with a limited number of mobile application functions. To use Snaplet as a watch, the user places the flexible screen horizontally along the curvature of their wrist, bending it upwards, and fastens the curved display to their shirt using Velcro. In this context, the user can watch a video or play with a music application. To use it as a PDA, the user removes the device from the wrist and holds it flat in the palm; in this context, the user can read a book, take notes or sketch on the display. Finally, to use it as a phone, the user picks up a call by bending the edge of the display with their fingers and placing the device to their ear.
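Since the article does not describe how the prototype actually distinguishes the three contexts, the following is only a minimal sketch of how a device might infer its context from overall curvature plus an edge bend; the curvature measure and the cutoff values are invented for illustration.

```python
# Hedged sketch: inferring Snaplet's usage context from overall curvature.
# The curvature measure and cutoffs are invented for illustration only.

def detect_context(bend_readings, edge_bent):
    """Pick a wearable context from averaged bend-sensor curvature.

    bend_readings: normalized readings from the bend sensors (-1.0..1.0)
    edge_bent:     True if the user is actively flexing the display edge
    """
    curvature = sum(bend_readings) / len(bend_readings)
    if edge_bent:
        return "phone"   # bending the edge answers an incoming call
    if curvature > 0.5:
        return "watch"   # display wrapped around the wrist
    return "pda"         # display held flat in the palm

print(detect_context([0.7, 0.8, 0.6, 0.7, 0.9], edge_bent=False))   # -> "watch"
print(detect_context([0.0, 0.1, 0.0, -0.1, 0.0], edge_bent=False))  # -> "pda"
```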

Watch video: https://youtu.be/ol_uu5pMmq8

The invention could lead to a new generation of computers that are super lightweight, thin-film and flexible. They use no power when nobody is interacting with them, and when users are reading, they don’t feel like they’re holding a sheet of glass or metal. Aside from the fact that the prototypes still need to be connected to a laptop, the main limitation of the work, according to the developers, resides in the physical engineering of the prototype display, which restricted bending to one side of the display and thus reduced the number of available bend gestures.
