Jacob Estep
3 min read · Apr 9, 2019


Mobile operating systems are increasingly moving to gesture-based navigation. Palm introduced the concept of a swipe-based gesture nav system in 2009 with webOS, which used a touch-sensitive area below the screen to detect swipe and half-swipe gestures for actions like showing the dock, switching between apps, and going back. Over the years, numerous Android custom ROMs used a sort of hybrid system, with the user swiping on an indicator and releasing on a pop-up button to perform its respective action (back, home, or recent apps). In 2017, Apple brought a few of Palm’s ideas back with the iPhone X, using swipes and half-swipes for similar actions. Now stock Android has a semi-gesture-based system, though it is so bad that most OEMs have designed their own implementations. OnePlus has the best gesture system on Android.

There’s a reason these systems are catching on, though: gesture navigation just feels fluid and natural! Activation happens in the background: a user taps a button and some task gets executed, abstracting the action away from the interaction. Manipulation puts the controls directly in the user’s hands: a user drags a notification off the screen to dismiss it. Of course, activation still works for certain tasks, like opening an app or toggling a switch, that demand precision and specificity. The best gesture systems combine animations with gestures to directly connect a user’s interaction with the system’s response. A swipe isn’t just a trigger for an action, but a direct manipulation of the content on screen. OnePlus uses this methodology almost perfectly. A swipe up shrinks an app down to its icon on the home screen. Pausing midway through that swipe pauses the animation and shows other open apps. Swiping up and to the right pushes the open app over and opens the next-most-recent app in the switcher. The system falls apart with the back gesture, though: swiping up from either bottom corner simply triggers the action, with nothing on screen tracking the finger. It’s activation dressed up as a gesture.
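To make that distinction concrete, here’s a minimal Kotlin sketch of the manipulation model. The class name and numbers are illustrative, not taken from OnePlus’s actual implementation: while the finger is down, the animation fraction tracks the drag directly, and on release the current position plus the fling velocity decide where the element settles.

```kotlin
// Sketch of gesture-driven direct manipulation: the animation fraction
// tracks the finger 1:1 while dragging, so content moves *with* the user
// rather than after them. All names and values here are illustrative.

class GestureDrivenTransition(private val travelPx: Float) {
    /** 0.0 = app full-screen, 1.0 = app shrunk to its home-screen icon. */
    var fraction = 0f
        private set

    // While the finger is down, progress simply tracks displacement.
    fun onDrag(displacementPx: Float) {
        fraction = (displacementPx / travelPx).coerceIn(0f, 1f)
    }

    // On release, settle toward whichever end the position plus fling
    // velocity suggest, continuing from where the finger left off
    // instead of replaying a canned animation.
    fun onRelease(velocityPxPerSec: Float): Float {
        val projected = fraction + velocityPxPerSec / (2 * travelPx)
        fraction = if (projected >= 0.5f) 1f else 0f
        return fraction
    }
}

fun main() {
    val swipeUpToHome = GestureDrivenTransition(travelPx = 800f)
    swipeUpToHome.onDrag(displacementPx = 200f)   // pause mid-swipe
    println(swipeUpToHome.fraction)               // 0.25: app is partly shrunk
    println(swipeUpToHome.onRelease(velocityPxPerSec = 3000f)) // fling finishes: 1.0
}
```

The design choice worth noticing is that release never snaps back to zero and replays an animation; the transition always continues from wherever the finger left it, which is what makes pausing mid-swipe feel natural.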

When designing the HitOS navigation system, I wanted to keep this in mind. Apps and option drawers have a handle, and dragging one controls its respective element. For instance, dragging in one direction on a handle pulls the drawer out from under its paired element, opening it; dragging back the opposite way closes it. These handles are anchored toward the center of the screen to keep one-handed use comfortable. Other manipulable elements are free-floating, with other options on either side. For instance, expanded music controls show the current song’s album art in the middle, with the art for the previous and next songs peeking in from either edge. Leaving most of those covers off-screen keeps focus on the central, current art and signals that a user can drag either one into the center to play the previous or next song.
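HitOS is a design concept rather than shipping code, but a small Kotlin sketch shows one way the album carousel could settle after a drag. The halfway threshold and every name below are my own assumptions, not part of the HitOS spec:

```kotlin
// Sketch of the peeking album-art carousel: neighbors sit mostly
// off-screen, and dragging one past the halfway point of the center
// slot commits it as the current track. Names and the threshold are
// assumptions for illustration only.

class AlbumCarousel(private val tracks: List<String>, start: Int = 0) {
    var currentIndex = start
        private set

    val current get() = tracks[currentIndex]
    val previousPeek get() = tracks.getOrNull(currentIndex - 1)
    val nextPeek get() = tracks.getOrNull(currentIndex + 1)

    // dragPx > 0 pulls the previous cover toward the center slot;
    // dragPx < 0 pulls the next one. A release past the halfway point
    // of the center slot commits the change; anything short springs back.
    fun onRelease(dragPx: Float, slotWidthPx: Float) {
        val halfway = slotWidthPx / 2
        when {
            dragPx > halfway && previousPeek != null -> currentIndex--
            dragPx < -halfway && nextPeek != null -> currentIndex++
        }
    }
}

fun main() {
    val player = AlbumCarousel(listOf("Prev Song", "Now Playing", "Next Song"), start = 1)
    player.onRelease(dragPx = -220f, slotWidthPx = 400f) // drag the next cover past center
    println(player.current) // prints "Next Song"
}
```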

Gestures make content feel more physical by letting a user control it in a physical way. Activation disconnects a user from the system; manipulation gives a sense of flow and direct control over navigation. This tying of digital elements to physical interactions is a sort of ‘neo-skeuomorphism’. At the dawn of mixed reality and spatial computing, the digital truly becomes physical. The way we interact with these holographic elements needs to feel just as physical and natural, mimicking real-world physics and interactions. As this next era of computing begins, it only makes sense for a new skeuomorphism to emerge.
