Our first demo for “Internet TV in the Social Web” includes an iPhone app that works as both a TV remote control and a ‘companion device’ for viewing programme recommendations and programme guides, managing programme playlists, and reading background information about a programme from the Web.
From a user experience perspective, there are two particularly interesting aspects to this:
- Usability issues relating to using a touch-screen remote for controlling a TV
- The idea that merging the Web with TV doesn’t have to mean showing the Web on a TV screen: instead, the Web can be integrated into the viewing experience using a ‘second screen’ companion device, leaving the TV screen free of clutter.
The second issue is worthy of a blog post of its own. In the meantime, we’re starting to explore the first issue in more detail.
Our iPhone app includes an interface for a basic TV remote that allows the user to select and browse their programme guide (EPG) on the TV screen, and to select and play programmes. Whilst anecdotal evidence suggests that people want to use their smartphones as TV remotes, the touch-screen interface poses a major design challenge. With traditional remotes we’ve all become used to changing channels or adjusting the volume without necessarily taking our eyes off the TV screen – by feeling for the desired buttons under our fingers. However, without these physical buttons, the touch-screen requires us to take our eyes off the TV screen and pay visual attention to the remote.
When we mocked up the interfaces for our prototypes, we experimented with various layouts for the remote screen. However, none of these deviated much from standard remote designs, and we didn’t have time to try out anything new or to do any rigorous analysis of the usability effects of the screen layout and positioning of the controls.
We also wondered whether, and how, various input and output methods could be used to improve the user experience. These include:
- Gesture control: for example, the Boxee Remote iPhone app offers a ‘gesture’ mode whereby users drag the Boxee logo around to navigate the TV screen, and tap the logo to perform an action (select/play/pause). Similarly, Apple added swipe and tap finger gesture control to its Remote app to control what’s seen on Apple TV. Via its ‘Control’ interface, users can tap to select, play and pause, and flick left or right, or drag and hold, to rewind or fast-forward.
- Sound effects, such as clicks or voiceover
- Voice commands
- ‘Haptics’ (tactile feedback such as vibration)
- Accelerometer control (tilts and shakes)
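To make the gesture-control idea concrete, here is a minimal sketch of how touch gestures might be mapped to remote-control commands, loosely modelled on the Boxee and Apple Remote behaviour described above. The gesture and command names are our own illustrative assumptions, not taken from either app’s actual API:

```typescript
// Hypothetical sketch: mapping touch gestures to remote-control commands.
// Gesture and command names are illustrative only.

type Gesture =
  | { kind: "tap" }
  | { kind: "flick"; direction: "left" | "right" }
  | { kind: "dragHold"; direction: "left" | "right" };

type Command =
  | "selectOrPlayPause"
  | "skipBack"
  | "skipForward"
  | "rewind"
  | "fastForward";

// Tap selects or toggles play/pause; a quick flick skips,
// while dragging and holding scrubs continuously.
function commandFor(gesture: Gesture): Command {
  switch (gesture.kind) {
    case "tap":
      return "selectOrPlayPause";
    case "flick":
      return gesture.direction === "left" ? "skipBack" : "skipForward";
    case "dragHold":
      return gesture.direction === "left" ? "rewind" : "fastForward";
  }
}
```

One appeal of this kind of mapping is that it needs no visual attention to the remote: the whole screen is the control surface, so the user’s eyes can stay on the TV.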
Finally, could there be certain situations in which specific combinations of these ‘modes’ could be optimal, depending on the user’s individual preferences and needs? If so, how much would users want to control these modes?
We thought that this was an interesting area of research, and it also complements the work our colleagues in BBC Research and Development are doing to investigate how multi-touch software could support television viewing in the future.
The challenge of proposing solutions to this design issue has been taken up by some of the students attending Lora Aroyo’s HCI course at VU University Amsterdam. The students are currently mid-way through their assignment, and we’re really looking forward to seeing what they come up with.