Reading ‘The Apple Watch: User-Experience Appraisal’, and it’s somewhat ironic that Apple seems to be forgetting its own well-established guidelines for enabling user interactions with its devices.
This was particularly noticeable in the comments regarding icon sizes and tappable areas – it would be worth finding out just why Apple decided to go small, and not just because ‘it’s a smaller screen’. Not only is that answer blatantly obvious, it’s also blatantly misguided, at least from a UX perspective. Just as the smartphone screen is not a smaller desktop monitor, the watch screen is not a smaller smartphone screen. I’m sure the UX experts at Apple had many a heated discussion on just how to design interactions for the watch screen within current technological capabilities, but it’s surprising how inadequate the end result is. I’d love to get an insider’s view as to why this was the best they could do.
I’m not exactly a technology sceptic – humans can be marvellously unpredictable when it comes to utilising technology – so this isn’t the place for me to argue whether the Apple Watch is going anywhere. However, thinking about the possibilities for future user interaction, it struck me that the Apple Watch might be the prime device for air gestures and 3D interactions rather than the routine tap and swipe. It’s certainly feasible on smartphones, and it seems a pretty nifty avenue to explore. Sure, you have a much more limited 2D surface, but the volume of air above it could easily compensate for that. Using areas of the Watch screen as a light sensor could also offset any limitations imposed by the ultra-tiny camera.
Still, having said all that, it’s important to remember that:
The small screen size forces designers to think hard about (1) what users care about most in their app; (2) how to compress that information so that it fits the tiny screen. Complex interactions do not belong on the watch: Remember that people can always go to their iPhone if they need more.
Anyway, the key thing here is the user, and, as ever, the article highlights the importance of testing not just how a user interacts with a piece of technology but also the context in which they do so. This will only grow in significance as our technological landscape evolves in complexity and scale, and it clearly requires a much more focussed effort.