Voice is becoming a primary search technology

If your business isn't already investing in reaching the top of local search results, it should be thinking about how to do so, because voice-based search queries tend to return local results.

Voice assistants are maturing

Ten years after Siri arrived, the voice assistant market is maturing and voice search is proliferating across multiple devices. Amazon, Google, and Apple all ship voice assistants, and one in four US adults already owns a smart speaker. Juniper Research predicts that voice-based ad revenue will reach $19 billion by 2022, but the best ads are always native search results.

John Stine, executive director of the Open Voice Network, explains: "Voice will soon be a primary way consumers connect to the digital world and a primary way digital marketers will connect with actual consumers.... It's time to get ready."

Vixen Labs and the Open Voice Network surveyed 6,000 people across the US, UK, and Germany to find out how they use voice assistants, and have published the findings in a report.

One of the big findings is ubiquity. More than 30% of respondents now use voice assistants daily, and around 23% use them several times a day. Nearly everyone is aware the technology exists.

Voice users are becoming accustomed to the technology

The report also provides useful demographic data. I was interested to learn that 60% of users aged 18-24 and 36% of those aged 25-34 use Siri more than any other assistant. Alexa is more widely used among older demographics, while Google Assistant is also popular. (Cortana and Bixby are very much minor players in the space.)

This dovetails with a recent claim from Futuresource that Apple's Siri holds 25% of the voice assistant market.

Privacy remains an issue, and it looks as if it will be some time before using these assistants in public is seen as socially acceptable. Just 27% of US voice assistant users feel comfortable using them in public, which means we rely on them at home, in the car, or on an iPhone when out and about.

Among those who don't yet use voice assistants, 42% said privacy concerns have stopped them from doing so, while 32% simply don't trust the assistants.

That's not especially surprising given incidents such as the revelation that Apple had humans listening to some Siri conversations. (Apple subsequently made it possible to opt out of this.)

The knowledge gap

Confusion about what voice assistants can do remains. Most people (76% in the UK) rely on trial and error to find out what their assistant is capable of. In other words, while Apple and others in the space regularly introduce support for new types of search, users are still playing catch-up.

There are some tasks people have become accustomed to. The report states that most of us use Siri and the other assistants to control music (73% of users) and check the weather (80%), and it confirms that 91% of users have searched using voice.

That last statistic is why every enterprise should work on local search, as local results are the ones most likely to surface in response to voice queries. It's also good business, given that 41% of those surveyed already use voice to make purchases.

There are some differences in behavior between nations: 21% of US consumers say 'pay a bill' is their top banking and finance voice-assisted task, compared to 15% in the UK and 17% in Germany. German users, however, are more open to using the technology to find a doctor or specialist than those from the US or UK.

James Poulter, CEO and co-founder of Vixen Labs, said: "Currently there is a lot of white space for [brands] to move into; the customer base is ready and waiting, but in order to tap into this new marketing channel, brands need to optimize, create, and integrate their products and services with voice technology."

So, what's coming up?

Some trends are pretty easy to guess: the way we search will change, and for certain tasks people will get into the habit of using voice rather than any other form of search. We'll also see results become more personalized as voice assistants get better at figuring out who is speaking.

Further out, we know voice assistants will become more empathic and capable of responding to the emotion in a person's voice, and we'll see new form factors such as smart displays and smart glasses emerge. In each case, these devices will use voice as part of the overall interface and will extend the spaces in which using voice makes sense.

To get a sense of that future, the best place to look is Apple's work in accessibility, an area in which the company explores alternative user interfaces. We're also seeing voice assistants become contextually aware, as well as capable of answering questions while offline.

Voice assistants are also seeing increasing use in business. Enterprise voice assistant architectures such as VERA 2.0 let business users build voice controls for their internal business systems, while apps such as Shortcuts let users extend what their existing voice assistants can do.
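
To make that extensibility more concrete, here is a minimal, hypothetical sketch of how an iOS app might expose one of its actions to Siri and the Shortcuts app by donating an NSUserActivity. The activity type, titles, and invocation phrase are invented for illustration; they aren't taken from the report or from any specific app.

```swift
import Foundation
import Intents // provides the suggestedInvocationPhrase addition to NSUserActivity

// Hypothetical example: a coffee-shop app exposing a "reorder" action to Siri/Shortcuts.
func makeOrderActivity() -> NSUserActivity {
    let activity = NSUserActivity(activityType: "com.example.coffeeshop.orderUsual")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true       // discoverable via Spotlight
    activity.isEligibleForPrediction = true   // lets Siri suggest it; Shortcuts can add it to workflows
    activity.suggestedInvocationPhrase = "Usual coffee"
    activity.persistentIdentifier = "order-usual"
    return activity
}

// In a UIViewController, assigning the activity donates it to the system:
//     self.userActivity = makeOrderActivity()
// Once donated, the action can appear as a Siri Suggestion and be chained
// with other steps inside the Shortcuts app.
```

The point is that the assistant doesn't need to understand the app's internals; the app advertises a named action and the voice layer takes it from there.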

JP Morgan & Co and Capital One both make use of Alexa in client-focused roles, and we are certainly seeing voice deployed in call centers globally - proving a B2C component in supporting customer-focused roles.

Please follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.