That does not make sense. We need a tool to tame this. Enter the Google Assistant. It's a conversational experience between you and Google to help you get things done in your world. Whatever you need help with, you should be able to ask the Assistant for it, using natural language. We believe conversation is the easiest and most universal way to get things done. But it's not just a question of understanding the words - it's a matter of context. The Assistant needs to know something about you - your situation, your location, your needs - in order to understand you most effectively.
It's exciting because it feels like the right time for it - technology is making it all possible. Voice recognition and machine learning are important developments. Maps, the Knowledge Graph, structured data - all useful for understanding how to get things done. For the Assistant, context and memory are important: "when is my next appointment", "how long will it take to cycle there", "which route", etc. — Glenn Gabe (@glenngabe) March 21, 2017 The Assistant started on Google Allo, then Google Home and Pixel - it's in cars too.
Wherever people need help, the Google Assistant should be there. That's why we think it's important to invest in a robust development platform. This is where Actions on Google comes in - that's what I'm working on. We believe this may be the next big ecosystem, in the tradition of Search and Google Play. Jason Douglas from @Google: "The platform for Google Assistant could become the next big ecosystem for Google, like YouTube, GPlay, etc." — QuanticMind (@TheQuanticMind) March 21, 2017 Three axes for Actions on