With 1.5 million apps buried in the app graveyard of disuse and neglect, and hundreds of thousands more teetering on oblivion, the venerable mobile app has seen better days. Even so, the mobile app economy is much better off than the explosive, if forgettable, voice app economy, according to a new VoiceLabs report.
Forgettable, that is, because users don't seem to be able to remember how or when to use the roughly 7,000 Alexa-powered voice apps. In my own family, we pretty much only know how to ask Alexa to set timers, read our Kindle books, and play Hamilton. For the voice-first revolution to really hit its stride, it seems some changes are in order.
Losing track of the apps
As an Amazon Echo owner, I get a weekly email from Amazon telling me "What's new with Alexa?" This past week it was the Ditty skill, which turns simple spoken messages (e.g., "I love you") into musical ditties. The same email told me how to access quotes from the movie Groundhog Day.
Each week new skills (or apps) of varying degrees of (in)utility are announced (CNET touts its picks of 2016), with developers now offering more than 7,000.
Now, if only we could remember them. Any of them.
According to the VoiceLabs report, there's a mere 3% chance that a newly enabled voice app (on Alexa or on Google Home, its Google equivalent) will still be in use a week later. Mobile app retention rates aren't sky-high either (13% for Android and 11% for iOS), but they're far better than that. Indeed, so forgettable are most voice apps that 69% of them earn just one star (or none) in their respective app stores.
Getting to Voice 2.0
Of course, the problem might not be that the apps are bad, but rather that discovering them and enabling them is too hard. Imagine trying to uncover a gem within the millions of apps in Apple's App Store without the benefit of search or a graphical user interface, and you start to see the difficulty inherent in a "headless" voice-controlled app universe.
Though Amazon has invested heavily in developer tooling and documentation to drive third-party experimentation on its Alexa platform, the reality is that consumers don't want to have to enable and remember specific skills. We don't want to think about this or that developer and this or that skill. We just want to speak to an AI-driven device and rely on its intelligence to understand what we want and respond.
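To see why each skill feels like a separate "app" the user must remember, consider what that developer tooling produces: a custom skill is essentially a handler that receives an Alexa request JSON and returns a spoken response. Here's a minimal sketch as a plain Python, Lambda-style function; the intent name `QuoteIntent` and the quote text are hypothetical placeholders, not a real published skill.

```python
# Minimal sketch of an Alexa custom-skill handler (Lambda-style function).
# "QuoteIntent" is a hypothetical intent name for illustration.

def handle_request(event, context=None):
    """Map an incoming Alexa request to a spoken response."""
    request = event.get("request", {})
    if (request.get("type") == "IntentRequest"
            and request.get("intent", {}).get("name") == "QuoteIntent"):
        speech = "Don't drive angry!"  # a Groundhog Day line, per the article
    else:
        speech = "Sorry, I don't know that one."

    # Alexa expects a response envelope containing an outputSpeech object.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

Each skill carries its own invocation name and intents, which is exactly the per-developer, per-skill mental overhead the paragraph above describes.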
In this way, Google might have the edge on Amazon, given that Google has been training AI for many years with petabytes upon petabytes of data, both homegrown (from Google Voice, Google Maps, and other Google products) and indexed (the vast treasure trove of the internet). For now, Amazon's Alexa has the edge according to CNET and other reviewers, but Google's broader and deeper data set may position it better to tune its AI to replace multitudinous apps with the one app we really need: artificial intelligence.
That's not the sort of thing we'll easily forget.
- Amazon Alexa: The smart person's guide (TechRepublic)
- Why an app-focused strategy could lead to mobile failure (TechRepublic)
- 10 Amazon Alexa skills to add to your Echo today (TechRepublic)
- Alexa tricks: From helpful to amusing, here are 25 things to ask your assistant (ZDNet)
- Why mobile app developers may soon be looking for a new job (TechRepublic)
Matt is currently head of the developer ecosystem at Adobe. The views expressed are his own, not those of his employer.
Matt Asay is a veteran technology columnist who has written for CNET, ReadWrite, and other tech media. Asay has also held a variety of executive roles with leading mobile and big data software companies.