Google Assistant doesn’t have flash briefings in the Alexa sense; instead, you’ll be publishing your audio content as a podcast. This is a little more technical than Alexa’s process. First, your briefing will need its own homepage. Second, you’ll need to edit the briefing’s RSS feed to include the tags Google Assistant requires before it will list the show in its directory (check out all the requirements here). Google doesn’t require setting up an Assistant action. Once the necessary tags are in your RSS feed, your podcast will show up automatically within search results.
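If you want to sanity-check your feed before submitting it, a small script can confirm the most commonly documented elements are present. The sketch below is only an illustration of what to look for (a show title, a description, show-level artwork, an owner e-mail, and at least one episode with an audio enclosure), based on Google's published podcast guidance; the feed URL is a placeholder, and the authoritative list is in the requirements linked above.

```python
# Minimal sketch (not an official Google checker): fetch a podcast RSS feed and
# report whether the elements Google's podcast directory is generally documented
# to expect are present. The feed URL below is a placeholder for your own feed.
import urllib.request
import xml.etree.ElementTree as ET

ITUNES = "http://www.itunes.com/dtds/podcast-1.0.dtd"
GOOGLEPLAY = "http://www.google.com/schemas/play-podcasts/1.0"

def check_feed(feed_url):
    """Return a list of human-readable problems found in the feed."""
    with urllib.request.urlopen(feed_url) as resp:
        channel = ET.fromstring(resp.read()).find("channel")

    if channel is None:
        return ["feed has no <channel> element"]

    problems = []
    if channel.findtext("title") is None:
        problems.append("missing <title>")
    if channel.findtext("description") is None:
        problems.append("missing <description>")

    # Show artwork can come from <itunes:image href="..."> or a plain <image><url>.
    has_image = (channel.find(f"{{{ITUNES}}}image") is not None
                 or channel.find("image/url") is not None)
    if not has_image:
        problems.append("missing show-level image")

    # Ownership e-mail: <googleplay:email> or <itunes:owner><itunes:email>.
    has_email = (channel.findtext(f"{{{GOOGLEPLAY}}}email") is not None
                 or channel.findtext(f"{{{ITUNES}}}owner/{{{ITUNES}}}email") is not None)
    if not has_email:
        problems.append("missing owner e-mail")

    # At least one episode with an audio enclosure.
    if not any(item.find("enclosure") is not None for item in channel.findall("item")):
        problems.append("no <item> with an <enclosure> audio file")

    return problems

if __name__ == "__main__":
    for issue in check_feed("https://example.com/briefing/feed.xml"):  # placeholder URL
        print("warning:", issue)
```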
Like all AI devices that use Natural Language Processing, or NLP, Alexa does not understand every voice easily, but she learns to understand her people over time. She does, however, require people to speak in simple terms, with appropriate pauses and specific word orders. There are many times when she will say, "I'm sorry, I don't know the answer to that question." More than likely, she didn't recognize the word order or misunderstood the question.
For tracking your food, you can use the Track by Nutritionix skill, which lets you record your food intake using your voice, or ask for caloric values of foods. (Alexa does the latter by default.) Say things like, "Alexa, tell Food Tracker to log a cup of almond milk" or "Alexa, ask Food Tracker how many calories are in two eggs and three slices of bacon."
As of this writing, Amazon offers thousands of sources for flash briefings. Sources can be hyper-local, like your local news station; focused on specific topics, like tech or business; or general news. Many of these sources, like NPR, are podcasts provided through TuneIn, and those briefings are audio files Alexa plays for you. Other sources, like AP news stories, are read aloud in Alexa’s voice. I wish Amazon told you which ones were audio files, because her voice drones on after a while. I hope I eventually get to change her voice the way I can with Siri; right now, you can only change the language to English (UK) or German.