Good Vibrations
A multisensory journey through Material sounds and haptics
Multisensory design is a process that includes sound and haptics to create a holistic user and brand experience — a thoughtful convergence between what the user sees, hears, and feels. At Google, a 12-person multisensory design team composes every sound and haptic across all of the company’s devices, platforms, and verticals.
As part of our celebration of a decade of Material Design, Senior Sound and Haptics Designer Harrison Zafrin and Head of Sound and Haptics Design Conor O’Sullivan share how they orchestrate a uniquely complex, sonic, and tactile symphony — and how the sensory experience of Google devices has evolved along with its design system.
As designers, we think about multisensory design in terms of waveforms. Sound consists of vibrations that you hear, and haptics consist of vibrations that you feel. Attaching thoughtful, nuanced sounds and haptics to even the most micro interactions can make something mundane feel genuinely satisfying. Good multisensory design often disappears — users might not actively notice it, or aren’t conscious that they’re experiencing it — so people would probably be surprised at the amount of time, energy, and care that go into making something that’s less than eight milliseconds long both functional and interesting.
A big thing in Google design, and especially in sound and haptic design, is trying to create a “natural” kind of digital experience — something comfortable, familiar, and for lack of a better word, human. As a team, the best qualities we have are good ears and empathy. We want users to feel like they can express themselves through the sounds and haptics we create, so that their device feels like a reflection of who they are.
Pixel camera shutter sound
(2014 / 2015 / 2016)
Google thought about sound design so differently a decade ago. In keeping with the era, many of the sounds were elaborate and had a lot of personality. For example, the original Pixel camera shutter didn’t sound anything like a camera. It had a melody, which made it very expressive — but dialing every individual sound up to that level often comes at the expense of the overall user experience. Today, people don’t really want to be hearing melodies when they’re taking photos. It can be distracting and confusing.
In 2015, as Material was being developed, we began applying a consistent approach to multisensory design, starting with Android. From there, we had the opportunity to start thinking holistically about multisensory experience across all products. For instance, the Pixel camera shutter has since evolved to increase what we refer to as “learnability” — leveraging preexisting cultural norms and knowledge to improve accessibility. It now sounds a lot like a “real” camera shutter: more minimal, less intrusive. It lets you know it’s working and then stays out of the way, making the whole interaction feel easier.
“Go Off King”
(2021)
There’s a misconception that most people’s phones are on silent all the time, but the reality is that people’s phones oscillate between silent, vibrate, and sound-on modes, based on a bunch of different factors; we want to make sure that when a user’s sound is on, they have a great experience. But multisensory design is so subjective. Users have a lot of opinions and preferences around what they want things to sound and feel like.
Every year, we put out a collection of ringtones, alarms, and notifications, providing a wide range of options that support all different user styles, so that everyone can find something that brings them joy. We’re always trying to push the envelope of what people are expecting from ringtones — to make people receiving a call think, “Why is this tune so good? It shouldn’t be this good!” Not everyone will notice, but for those who do, it can be a delightful moment.
The ringtone “Go Off King” is part of the Material Adventures collection, created in 2021 to celebrate the launch of Material 3 and the Pixel 6. Material 3 challenged everything about how Google presents itself to the world. We wanted to reflect that daring attitude by incorporating textures and styles that we shied away from in the past. The ringtone bursts out of the speakers with unrelenting, upbeat energy. It has booming bass, hip hop–inspired drums, and a high tempo that incorporates elements of trap and dance music. Everyone who listens to it thinks it’s hilarious because they unironically enjoy it as a tune and can’t help but want to dance to it. It’s one of our many ringtones that’s been called an “unexpected banger,” which is high praise.
Welcome animation
(2021)
All devices with haptic capabilities, from phones to tablets to watches, ship with a vibration motor inside. Even ten years ago, these motors were incredibly basic, and they could essentially make one haptic effect: buzz. Since then, there’s been a significant shift toward higher-quality vibration motors, which has allowed us to develop premium haptics.
It’s difficult to discuss haptics without experiencing them, so we often talk about them in terms of texture. Imagine running your finger across a screen and it feels like it’s sliding on ice, dragging through sand, or rubbing along a piece of wood. Being able to articulate these textures — feelings — with vibration is technologically extremely difficult. Our ability to do so has come such a long way as our tools have evolved. Even the basic buzz is more crisp and punchy.
Today, haptics are very, very niche and very, very custom. Our super-smart hardware engineers have created a collection of haptic “primitives” — like click, tick, and low tick — that we then compose into rhythms. When a user drags their finger around a screen, these haptics are triggered in rapid succession at varying amplitudes, which creates that constant textured feeling.
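On Android, primitives like these are exposed publicly through the VibrationEffect.Composition API (PRIMITIVE_CLICK, PRIMITIVE_TICK, PRIMITIVE_LOW_TICK), each added with a relative amplitude scale and a delay. As a platform-independent illustration of the same idea — primitives scheduled back to back at varying amplitudes to build a texture — here is a minimal Python sketch; the durations, delays, and amplitude values are illustrative, not Google's actual tuning:

```python
from dataclasses import dataclass

# Nominal per-primitive durations in milliseconds -- illustrative values only.
PRIMITIVE_DURATION_MS = {"click": 12, "tick": 5, "low_tick": 8}

@dataclass
class PrimitiveEvent:
    name: str       # "click", "tick", or "low_tick"
    scale: float    # relative amplitude, 0.0-1.0
    delay_ms: int   # pause before this primitive fires

def compose(events):
    """Flatten a list of PrimitiveEvents into (start_ms, name, scale)
    triples, the way a composition schedules primitives sequentially."""
    timeline, t = [], 0
    for e in events:
        if not 0.0 <= e.scale <= 1.0:
            raise ValueError(f"scale out of range: {e.scale}")
        t += e.delay_ms
        timeline.append((t, e.name, e.scale))
        t += PRIMITIVE_DURATION_MS[e.name]
    return timeline

# A "dragging through sand" texture: rapid low ticks at varying amplitude.
sand = [PrimitiveEvent("low_tick", 0.3 + 0.1 * (i % 3), delay_ms=15)
        for i in range(6)]
for start, name, scale in compose(sand):
    print(f"{start:4d} ms  {name:<8s} scale={scale:.1f}")
```

In the real API the system, not the app, resolves each primitive into a motor waveform; the sketch only models the rhythm-and-amplitude layer that designers compose.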
Starting with Pixel 6 in 2021, when you power on a Pixel phone, the first thing you see in Setup Wizard is a welcome animation: two Material shapes float around the screen, and every time they collide, there’s a custom haptic. Our design approach was to ascribe imaginary gravity to the smooth and subtle visual movements, then add some imaginary texture with the haptic — like a rubber ball gently hitting a soft membrane, then drifting like a feather.
To achieve the effect, we use a lower-frequency haptic signal that changes in pitch over time. It’s literally called “THUD” — and that’s exactly what it feels like. There’s a weight to it. In the early haptic days, it might have only been possible to go: “Dzzzt.” Here, we’re able to go: “Doompf.”
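A "lower-frequency haptic signal that changes in pitch over time" can be sketched as a sine wave whose frequency glides downward while its amplitude decays. The sample rate, frequencies, and decay constant below are illustrative guesses, not the shipped Pixel effect:

```python
import math

def thud(sample_rate=8000, duration_s=0.08, f_start=150.0, f_end=60.0):
    """Render a 'thud'-style waveform: a low-frequency sine whose pitch
    glides from f_start down to f_end while its amplitude decays."""
    n = int(sample_rate * duration_s)
    samples = []
    for i in range(n):
        t = i / sample_rate
        # Exponential amplitude decay gives the soft, weighted feel.
        amp = math.exp(-t * 40.0)
        # Integrate the linear frequency glide for a continuous phase
        # (a discontinuous phase would add an audible/tactile click).
        phase = 2 * math.pi * (f_start * t
                               + 0.5 * (f_end - f_start) * t * t / duration_s)
        samples.append(amp * math.sin(phase))
    return samples

wave = thud()
```

The constant-frequency version of this envelope is the old "Dzzzt"; the downward glide is what turns it into "Doompf."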
It’s a great moment of haptics and motion working together in a very Googley way — it’s surprising and delightful — and people enjoy it. We’re looking forward to a time when devices have more than one vibration motor so that we can explore the future of stereo haptics.
Guided Frame
(2022)
Guided Frame is an accessibility feature on Pixel cameras that allows blind and low-vision users to take better photos. The frame of the viewfinder is broken up into different zones; as you position a subject closer to the middle, the sounds and haptics build on top of one another to create a classic diatonic musical progression.
The outer zones start on a IV or “subdominant” chord, with two notes. As you get closer to the center, the sound moves upward into a V or “dominant” chord, with three notes, which has the most musical tension, before the sound reaches a resolution. When you reach the sweet spot, you land on a I chord, or “tonic” chord, with four notes, which resolves the tension created by the V chord; it feels like home. When you’re perfectly framed, the sound also has a little bit of sparkly dust on top and feels more magical.
Unlike sound, haptics can’t use melody as a means of communicating information, so rhythm and texture become much more important. With Guided Frame, the rhythm of the haptics increases in intensity with two taps, three taps, and four taps, creating a sense of progress. Then you know it’s time to take your picture.
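The zone-to-feedback mapping described above — IV with two notes at the edges, V with three closer in, I with four at the center, with the haptic tap count tracking the note count — can be sketched as a small lookup. The key (C major), voicings, and zone thresholds here are illustrative assumptions; the feature's actual values aren't public:

```python
# Chord tones in C major, purely as an example key.
CHORDS = {
    "IV": ["F", "A"],            # subdominant, two notes (outer zones)
    "V":  ["G", "B", "D"],       # dominant, three notes (closer in)
    "I":  ["C", "E", "G", "C"],  # tonic, four notes (centered)
}

def guided_frame_feedback(distance_from_center):
    """Map a normalized subject distance (0.0 = centered, 1.0 = at the
    edge) to (chord name, chord tones, haptic tap count). Zone
    boundaries are illustrative, not the shipped thresholds."""
    if distance_from_center < 0.2:
        chord = "I"       # sweet spot: tension resolves
    elif distance_from_center < 0.5:
        chord = "V"       # most musical tension
    else:
        chord = "IV"      # starting point
    tones = CHORDS[chord]
    return chord, tones, len(tones)  # tap count tracks the note count
```

The point of the structure is that both channels climb together: more notes in the ear, more taps under the finger, one shared sense of progress.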
All those sounds and haptics work in isolation, but together they add up to more than the sum of their parts and tell a clear story to the user that supports their interaction. The experience is upbeat and celebratory while remaining intuitive. The feedback throughout is increasingly positive — it’s a fun, clever interaction, like you’re leveling up in a video game.
“Ethereal Vibration”
(2023)
It’s important to us to establish responsible and positive best practices for using AI as one of our creative tools. Pixel Gems is a collection we made in 2023. For each track, we explored how Google’s in-house GenAI tools, like MusicLM — now available for public use as MusicFX — could be used to unlock new avenues for expression and provide an optimistic glimpse into the future of sound creation.
Very early on in our explorations, we realized that simply recreating higher-fidelity versions of ideas that MusicLM would output wasn’t fulfilling or fun. The creative process is all about exploring the ideas in between the spark of inspiration and the finished product. It’s much more rewarding to use generated audio as raw material to be manipulated, or as a mood board to draw inspiration from.
“Ethereal Vibration” is a ringtone that began as a MusicLM prompt: “Solo female voice, vocal choir, Baroque melody.” We were trying to generate novel vocal samples without any instrumentation that could be chopped up and rearranged in new ways. Harrison took the audio clip and rearranged both its pitch and rhythm. The recontextualized piece has a unique and totally new relationship to the source material. This process isn’t so different from how a lot of people make music today, remixing royalty-free samples or audio clips into other tracks.
Silence
A lot of what we do with multisensory design is manage how much attention we want to draw to something. We’re always asking ourselves: Does this action need sound or haptics, and why? It needs to have a good reason from the user’s point of view. What information is being delivered in that moment? Does that information need to be delivered? The more often an interaction happens, the less intrusive that sound or haptic should be.
Because of this, we use silence very intentionally. Not every button press or input between you and your device results in a haptic or sound. Why? Because that would make more meaningful sounds and haptics less prominent.
You can think of silence in terms of what you see on the screen. UX and UI designers use negative space to help guide users’ attention toward a focal point; negative space gives visual information more prominence. We use silence in the same way. Overusing sound and haptics takes away from moments that we want to highlight or mark as more important than others.
Brand sound: Super G
(2016)
The Super G is our North Star. It’s like a sonic logo, an audio representation of Google’s core values, and the Google sound design aesthetic wrapped up in less than a second and a half. It’s typically used for a hero moment — powering on a product for the first time or after a factory reset, or bookending advertising content — and this consistency creates a signature throughline across the Google ecosystem.
The sound itself is two piano notes separated by an octave in the key of G, for Google. The acoustic effect of playing the notes so close together (called “flamming”) is a sense of playfulness. The sound also has a personal touch; Conor came up with it and recorded it in his home studio. At the time he was working with Creative Lab on establishing a cross-product sonic identity that initially showed up in Pixel. The Creative Lab team really pushed our own perspective on what that sound could be; we kept refining it and refining it, until eventually we came up with this two-note idea. It was like, “Oh, you want simple? Here’s simple.”
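Two notes an octave apart in G, flammed: in equal temperament an octave is an exact 2:1 frequency ratio, and a flam is a near-simultaneous onset. A small Python sketch of that structure — the specific octave pair (G3/G4) and the flam offset are assumptions for illustration, since the recording's actual values aren't specified:

```python
A4 = 440.0  # equal-temperament reference pitch

def note_freq(semitones_from_a4):
    """Equal-tempered frequency relative to A4 = 440 Hz."""
    return A4 * 2 ** (semitones_from_a4 / 12)

g3 = note_freq(-14)  # G3, 14 semitones below A4 (~196 Hz)
g4 = note_freq(-2)   # G4, exactly one octave above G3 (~392 Hz)

# A flam: the two notes start almost -- but not exactly -- together.
FLAM_OFFSET_MS = 30  # illustrative; small enough to read as one gesture
onsets = [(0, g3), (FLAM_OFFSET_MS, g4)]
```

The octave keeps the identity to a single pitch class (G, for Google), while the slight offset between the two onsets is what supplies the playfulness.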
It showed up for the first time on the Pixel 1 in 2016, and has shown up on every Google Home–based device since then: Google TV, Google Nest speakers, Google Nest hubs, and more. Now you’ll also hear it on Pixel phones and watches and many other Google surfaces. If you listen closely to the new Super G sound for Gemini, you can still hear the ethos of the original Super G sound in the composition: a G-major chord that swells into a singular G note. We’ve built the rest of the soundscape around those notes and the key of G. Collectively, it has the effect of a color palette: There can be variations, but generally speaking, everything feels like it’s part of the same system.
Every musician has their own style, but no sound designer on the team is working in isolation. If our 12 sound designers sat in a circle, the brand sound would be like the fire pit in the middle that connects everyone. As Material evolves, we’re always asking whether our sounds continue to make sense. It’s an open question, and almost ten years on, you can still hear the sounds morphing and changing. It’s still inspiring.
Sound design by Harrison Zafrin and Conor O’Sullivan. Motion design by Arthur Ribeiro Vergani and Matthew Sienzant.