Why Alexa skill programming allows for safe library use

Alexa skill programming might just limit your need to hire a nanny or babysitter, with her new security and whispering abilities!

She can tell you a story, call you an Uber and give you a delicious dinner recipe — and do it all in a British accent if you like. Thanks to Alexa skill programming, there seems to be no limit to what our favorite digital assistant can do.

Here’s another one: Alexa can accomplish everything we just listed above while using her “inside voice.” That’s right, Alexa now whispers — and it’s just as weird and fun as you’d expect.

Why you never have to worry about Alexa waking the baby again

Amazon’s new “whisper mode” for Alexa, which was announced in September, officially went live in October. Now, if you whisper your question or command, Alexa will reply in kind with her own whispered response.

While this seems like a smart new feature for parents of young children or people in busy office settings, the development of whisper mode was prompted largely by another goal: the desire to make Alexa more lifelike. By giving her the ability to whisper back at you, Amazon’s engineers have made her sound more natural and human, able to respond to conversational cues more intuitively.

The result is something to behold. Alexa sounds breathy and more than a bit eerie — but also very natural and human.

Whisper mode is off by default, so you’ll need to enable it first — simply saying “Alexa, turn on whisper mode” will do the trick, or you can toggle the setting in the Alexa app.

Alexa: Smarter, faster, better

As The Verge mentions, whisper mode is one more step toward making Alexa more contextually aware by expanding her AI-enhanced capabilities. Alexa uses machine learning to recognize the acoustic patterns characteristic of whispered speech, and responds in a whisper when she detects them.
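Amazon hasn’t published how its whisper detector works, but the acoustic intuition is easy to sketch: whispered speech carries little energy and no vocal-cord vibration, so it looks quiet and noise-like compared to normal voiced speech. Below is a toy, stdlib-only Python sketch of that idea — the thresholds, sample signals, and function names are all invented for illustration and are not Amazon’s actual model:

```python
# Toy sketch only: a crude whispered-vs-voiced classifier based on two
# classic audio features. Real systems use learned models, not thresholds.
import math
import random

def rms(samples):
    """Root-mean-square energy of an audio frame (loudness proxy)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs that change sign (noisiness proxy)."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / len(samples)

def classify(samples, energy_threshold=0.1, zcr_threshold=0.2):
    # Whispers are quiet (low energy) and noise-like (many zero crossings);
    # voiced speech is louder and periodic (few zero crossings).
    if rms(samples) < energy_threshold and zero_crossing_rate(samples) > zcr_threshold:
        return "whispered"
    return "voiced"

# Synthetic stand-ins for 0.1 s audio frames at an assumed 16 kHz rate:
voiced = [0.8 * math.sin(2 * math.pi * 120 * t / 16000) for t in range(1600)]
random.seed(0)
whispered = [random.uniform(-0.05, 0.05) for _ in range(1600)]
```

With these synthetic frames, `classify(voiced)` returns `"voiced"` and `classify(whispered)` returns `"whispered"` — the sine wave is loud and periodic, while the low-amplitude noise is quiet and crosses zero constantly.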

The same machine-learning network lets Alexa recognize smoke and carbon monoxide alarms, as well as the sound of breaking glass — which allows her to serve a home security role. By becoming more contextually aware, Alexa can recognize more than just human speech, and therefore play a much larger role in the home.

Amazon’s Alexa Brain initiative is working to refine the voice assistant’s “memory,” so to speak, allowing her to track and recall information across multiple dialogue sessions. That will make it easier for users to engage with the vast number of third-party Alexa skills currently available. Users can also ask Alexa to remember specific information and retrieve it later.

Amazon has also made progress in making skill launching easier and more user-friendly. Instead of requiring rigid invocation phrases, the goal is for Alexa to activate skills based on natural human language — ultimately becoming so refined that she can recognize a request even if it’s delivered with a stutter, a digression, a long pause — or a whisper.
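As a rough illustration of what “name-free” skill launching means, here is a toy keyword matcher in Python. The intent names and keyword sets below are invented for this sketch — a real Alexa skill instead declares sample utterances in its interaction model and lets Alexa’s own natural-language understanding do the matching:

```python
# Toy sketch only: map a loosely phrased request to a skill intent by
# keyword overlap. Alexa's real NLU is a learned model, not a lookup table.
SKILL_INTENTS = {
    "OrderRideIntent": {"ride", "uber", "car", "taxi"},
    "RecipeIntent": {"recipe", "dinner", "cook"},
    "StoryIntent": {"story", "bedtime"},
}

def match_intent(utterance):
    """Return the intent whose keywords best overlap the utterance, or None."""
    words = set(utterance.lower().replace(",", "").split())
    best, best_score = None, 0
    for intent, keywords in SKILL_INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best
```

For example, `match_intent("Alexa, tell me a bedtime story")` returns `"StoryIntent"` even though the phrasing differs from any fixed command — the point being that the request, not a rigid carrier phrase, selects the skill.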

Is your business taking advantage of Alexa skill programming?

Smart speakers are poised to have a transformational impact on the way we live and do business, much as mobile devices did a decade ago. At BIGEYE, our voice experts specialize in Alexa skill programming — and we’d love to show you how your company can benefit from the right set of skills. Contact our Alexa experts today to take the next step with your brand’s voice and capabilities.
