HD Rumble on the Nintendo Switch may have been a simple pleasure, but it was a riveting one.
Nintendo most prominently showcased the intricate and precise haptic feedback of its handheld/console hybrid with its launch title, the failed Wii Sports successor, 1-2 Switch, which used the Switch’s unique Joy-Con controllers to play a variety of wacky mini-games. One of the strangest, ‘Ball Count’, has you holding a Joy-Con horizontally in your hand and feeling how many ‘balls’ are rolling around ‘inside’.
Despite the Joy-Con being a solid object, its vibration motors can simulate the feeling of balls rolling around using nothing but haptic feedback. It was a legitimate ‘wow’ moment when I first felt it, a unique experience from what was already a pretty weird console.
Since the Switch launched in 2017, almost no third-party game has taken advantage of the Switch’s haptic finesse, and virtually no other Nintendo title has relied on haptic feedback to convey information the way 1-2 Switch did. It stands out now because detailed haptic feedback is everywhere, from our smartphones to our laptops, yet in many ways it has never felt so invisible. Haptic feedback is a sensory element of modern computing that hides in plain sight, and it should play a bigger role in how we use the computers of the future.
In the land of linear resonant actuators
Cell phones and other pocket-sized personal electronics have had vibration motors for decades, but they only became really good relatively recently. In the early days, eccentric rotating mass (ERM) motors were the name of the game. Open an early pager and you’ll see an ERM at work: a DC motor with an off-center weight on one end of its shaft. As the shaft spins, the asymmetrical weight shakes the entire motor, creating a vibration. Rotate the weight at different speeds and you can create vibrations of different types, strengths and patterns for different attention-grabbing purposes.
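The physics here is just centripetal force: the spinning off-center weight tugs on the motor with a force proportional to the square of its rotation speed, which is why an ERM’s vibration strength and frequency are tied together. A back-of-the-envelope sketch (the weight’s mass and offset are illustrative figures, not specs for any particular motor):

```python
import math

def erm_force(mass_kg: float, radius_m: float, rpm: float) -> float:
    """Peak vibration force (newtons) of an eccentric rotating mass:
    centripetal force F = m * r * omega^2."""
    omega = rpm * 2 * math.pi / 60  # shaft speed in rad/s
    return mass_kg * radius_m * omega ** 2

# A tiny pager-style motor: ~0.5 g weight offset ~2 mm from the shaft.
for rpm in (3000, 6000, 9000):
    print(f"{rpm} rpm -> {erm_force(0.0005, 0.002, rpm):.3f} N")
# Doubling the speed quadruples the force, so you can't change the
# vibration frequency without also changing its strength.
```

That coupling, plus the time a physical rotor takes to spin up and down, is what made ERM buzzes feel mushy compared to what came next.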
Many older smartphones used these too, and in early versions of Android, Google even enabled vibration on every tap by default, so literally every touch of your finger on the screen was greeted with a dull buzz. In the years that followed, cooler heads prevailed. ERMs were traded in for LRAs, or linear resonant actuators, and mobile software became much more conservative about how and when your phone vibrates. LRAs, AC-driven motors that oscillate a sprung mass in a straight line and are much more responsive and energy-efficient, became key to Apple’s breakthroughs in haptics through the Taptic Engine, Force Touch, 3D Touch and Haptic Touch.
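The responsiveness and efficiency come from the ‘resonant’ part of the name: an LRA drives its mass at the mechanical resonant frequency of its spring, where a small electrical input produces a large, fast-starting oscillation. A toy model, treating the LRA as a driven damped harmonic oscillator (the 175 Hz resonance and damping ratio are illustrative assumptions, not specs for any real actuator):

```python
import math

def lra_amplitude(drive_hz: float, resonant_hz: float = 175.0,
                  damping_ratio: float = 0.05) -> float:
    """Steady-state displacement amplitude (arbitrary units) of a driven
    damped harmonic oscillator, a standard simplified model for an LRA."""
    w = 2 * math.pi * drive_hz         # drive frequency in rad/s
    w0 = 2 * math.pi * resonant_hz     # resonant frequency in rad/s
    return 1.0 / math.sqrt((w0**2 - w**2)**2 + (2 * damping_ratio * w0 * w)**2)

# Response falls off sharply away from resonance, which is why an LRA is
# driven at one tuned frequency and the amplitude envelope is shaped instead.
for hz in (100, 175, 250):
    print(f"{hz} Hz -> {lra_amplitude(hz) / lra_amplitude(175.0):.2f}x of peak")
```

Shaping that envelope precisely, dozens of times a second, is what lets a Taptic Engine ‘tap’ rather than buzz.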
Apple’s Taptic Engine is a custom LRA that first appeared in the Apple Watch to simulate Force Touch, a hard press that enables hidden interface elements, and the “taps” that make up the smartwatch’s notifications. That could be a buzz for a call, double-tap for messages, or swipes to indicate which way to go during turn-by-turn navigation.
The Taptic Engine was later added to the iPhone 6S alongside a pressure-sensitive layer in the phone’s display to enable 3D Touch, Apple’s version of a right-click on steroids, and a solid-state home button. 3D Touch was short-lived, but it brought useful additions that made using an iPhone easier, provided you were aware of them. It was eventually replaced by Haptic Touch when Apple ditched the pressure-sensitive screen, again made possible by the subtle vibrations of the Taptic Engine.
But the biggest win of Apple’s foray into haptics was, of course, the MacBook trackpad. The 2015 12-inch MacBook was so slim and compact that the company removed the physically clicking trackpad in favor of a sturdy piece of glass that vibrates back at you when you click it; pressure-sensitive Force Touch features sweetened the deal. The change was so successful that all Apple laptops now use this type of trackpad, and many other laptop manufacturers have adopted a similar design to slim down their laptops as well.
A button replacement
While this conservative approach has been hugely successful, it’s disappointing that, by default, haptics are relegated to button replacements. Look no further than the iPhone 15 Pro. The Action Button is a convenient way to replace a switch that was usually ‘off’ with a multi-function button that can trigger shortcuts and other features on the iPhone. The feeling of ‘pressing’ the Action Button is fully enabled by the Taptic Engine.
There are obvious reasons why smartphone and laptop manufacturers aren’t going wild with haptics, the biggest of which is battery life. You don’t have to wait long with haptic feedback enabled on the iOS keyboard to notice it impacting your battery life. Switching to an interface built around vibration can seriously affect whether your phone or laptop can make it through a full day. But as iFixit notes in a blog exploring what makes Apple’s haptics unique, the company has regularly scaled up the Taptic Engine as iPhones have gotten bigger. The satisfying feel of the iPhone or the MacBook Pro trackpad is due to the careful coordination of software and hardware. There’s no reason why Apple or any other manufacturer can’t find other ways to keep battery life in check while leaning on good vibrations.
We should feel more
A new social networking app called ID by Amo stands out for more than a few reasons, not least of which is its out-of-left-field interpretation of social feeds as collaborative collages and mood boards. But what first struck me about the app were the haptics. As you scroll through your friends’ boards, you can zoom in and out of someone’s board by tilting your phone, and it vibrates in time with your rotations. Hop onto a new board and you’ll get a pleasant buzz too. It’s not over the top, but it’s more expressive than your standard text-message vibration.
Not Boring Software, the maker of (Not Boring) Weather and (Not Boring) Calculator, among others, makes similarly playful use of vibrations as you navigate its apps’ menus. Neither is a revolutionary application of haptics, but both are proof that our personal electronics should do more than just recreate physical buttons or accentuate existing software elements. We interact with our computers aurally, visually and, yes, tactilely, but mainstream consumer technology has largely neglected that tactile element.
If the future, as Meta and Apple propose, will be powered by head-mounted, ‘immersive’ spatial computing, shouldn’t we be doing more to engage senses beyond just sight and sound? Why don’t any of these headsets vibrate? Haptics are everywhere in gaming, after all (the PlayStation VR2 headset literally vibrates your face), but their role in traditional computing is mainly to keep up appearances. There are hurdles to overcome, but there’s no reason why we shouldn’t feel our computers as well as see and hear them.