There’s a time and place for everything. In the privacy of your home, saying “hey” to Google, Alexa, Siri, Meta, and occasionally Bixby is comfortable. But out in public, where others can see and hear you? You’d probably rather not.
This has been one of the significant issues with AI gadgets lately. They all seem to think the best way to interact with an AI assistant is to talk to it, much like in the movie Her. In reality, you probably rarely see your friends or family use their phones’ assistants in private, and never in public. It felt like a small “Aha!” moment when, during last week’s WWDC keynote, Apple mentioned that iOS 18 will let you type to Siri instead.
The new feature in iOS 18
You can already type to Siri through the iPhone’s accessibility settings. (Go to Accessibility > Siri > Type to Siri.) This brings up a basic window and keyboard for typing commands. In iOS 18, however, Apple fully embraces the feature, letting you double-tap the bottom of the screen to bring up a Siri keyboard. You’ll also see quick suggestions you can tap instead of typing (or saying) a complete query.
There are many reasons why this makes sense. Although digital assistants have improved at understanding commands, it’s still hard to talk to them naturally. You might find yourself adjusting your pitch and tone when using a wake word at home, or thinking about how to phrase a query before speaking. Even so, you might occasionally mess up when asking Google to adjust your living room lights. Doing the same thing in public feels even more self-conscious.
The challenges of voice commands in public
Outside, it’s often very noisy. While testing the Ray-Ban Meta smart glasses’ AI features, the AI frequently told me it couldn’t hear properly. Either the environment was too loud, or I was subconsciously embarrassed and spoke too quietly for the device to pick up. This led to frustration, causing me to resort to my phone—the opposite of what AI hardware makers want.
It’s not just about new AI gadgets. Speaking into a smartwatch might seem fantastic if you’re James Bond, but most of us are not. People using voice commands on smartwatches often look confused and frustrated. This might seem vain, but self-consciousness is why people hesitate to use voice-controlled assistants in public. A 2018 PwC survey found that 74% of consumers prefer using voice assistants at home, with participants saying that using them in public “just looks weird.” The survey also identified a lack of trust as another major hurdle to using voice assistants—people didn’t believe a voice assistant would correctly understand commands. If you think an AI assistant won’t understand you, why bother using it where you might be judged? (Also, imagine saying “Hey Siri” and activating other people’s iPhones around you. New nightmare unlocked.)
The benefits of typing to AI assistants
Aside from tech logistics, typing to your AI assistant gives you more privacy. You don’t need others to know what you’re doing on your phone, even if it’s something simple like playing a song or setting a timer. You especially don’t want to dictate texts aloud in public. Typing those queries allows you to keep your activities private, even if it means sacrificing some hands-free convenience.
There are valid reasons to speak to an assistant, even in public. Voice commands are handy if you can’t use your hands or are driving. However, having multiple ways to interact with AI assistants helps them fit more seamlessly into our daily lives instead of forcing everyone to adopt new habits. Maybe one day, talking to a chatbot out loud while walking down the street won’t feel odd. For most people, that day isn’t today. Until then, you might prefer typing to Siri instead.