Google's new Semantic Experiences uses word vectors to search over 100,000 books for contextually related statements and surfaces the passages that answer a question.
Building a slide deck, pitch, or presentation? Here are the big takeaways:
- Google has launched a new service that uses vector-based language modeling to search books for contextually relevant answers to questions, all without needing to match keywords.
- Word vector language modeling maps synonyms, antonyms, related terms, context clues, and other natural uses of language to find relationships between statements. Google has published a pre-trained TensorFlow model for developers to use in testing the technology.
Semantic Experiences uses vector models to find contextual relationships between words based on related concepts, equivalence, and other relationships. Google first applied the technology to its Gmail Smart Reply feature last year, and launched Semantic Experiences to show two potential applications of its vector modeling technology.
Semantic Experiences includes Semantris, a word-association game, and Talk to Books, which can search over 100,000 books to find contextually relevant answers to almost any question you can think to ask.
How does Talk to Books work?
Talk to Books is built on Google's vector space language models, which the company said "enable algorithms to learn about the relationships between words, based on examples of actual language usage."
"Relatedness, synonymy, antonymy, meronymy, holonymy, and many other types of relationships may all be represented in vector space language models if we train them in the right way and then pose the right 'questions'." For Talk to Books, that means you can pose a question in natural language and the tool can find relevant responses without relying on keyword matching.
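The core idea can be sketched with toy vectors: in a trained vector space, words used in similar contexts end up close together, so cosine similarity serves as a rough stand-in for relatedness. The three-dimensional vectors below are hand-made for illustration only; a real model learns hundreds of dimensions from actual text.

```python
import math

# Toy "word vectors" -- hand-crafted for illustration, not from a real model.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity: near 1.0 for similar directions, near 0.0 for unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Related words score higher than unrelated ones:
print(cosine(vectors["king"], vectors["queen"]))  # high
print(cosine(vectors["king"], vectors["apple"]))  # much lower
```

Relationships like synonymy or antonymy show up as consistent geometric patterns among such vectors, which is what lets a trained model answer "the right questions" about word pairs.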
SEE: IT leader's guide to deep learning (Tech Pro Research)
To get a practical idea of how vector language modeling works, give Semantris a try. In either game mode, players are given a list of words and told to type an association, either a related word or a phrase, and the machine learning model at the heart of the game will match it with one of the words on the screen. It doesn't always work perfectly, especially for more esoteric relations, but it's a great example of how Talk to Books works: context clues and related words work together to find matches.
Vector language modeling: Countless applications
Anyone who has had to search books for relevant passages knows how difficult it can be: researchers can spend countless hours hunting for the right information. With a question in hand, all a Talk to Books user needs to do is ask, and the tool pulls up a list of related passages from books.
Google said that Semantic Experiences shows just two of countless potential applications of vector language modeling. "Other potential applications include classification, semantic similarity, semantic clustering, whitelist applications (selecting the right response from many alternatives), and semantic search (of which Talk to Books is an example)."
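A semantic search like Talk to Books can be sketched in miniature: embed the question and each candidate passage as vectors, then rank passages by similarity instead of keyword overlap. The `embed` function below is a hypothetical stand-in for a real sentence encoder (such as Google's Universal Sentence Encoder); here it just looks up hand-made vectors.

```python
import math

# Hypothetical stand-in for a trained sentence encoder: a fixed lookup
# table of hand-made vectors, for illustration only.
EMBEDDINGS = {
    "Why does it rain?": [0.9, 0.1, 0.2],
    "Water vapor condenses in clouds and falls as precipitation.": [0.85, 0.2, 0.15],
    "The stock market closed higher on Tuesday.": [0.1, 0.9, 0.3],
}

def embed(sentence):
    return EMBEDDINGS[sentence]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def semantic_search(query, passages):
    """Return the passage closest to the query in vector space,
    with no dependence on shared keywords."""
    q = embed(query)
    return max(passages, key=lambda p: cosine(q, embed(p)))

passages = [
    "Water vapor condenses in clouds and falls as precipitation.",
    "The stock market closed higher on Tuesday.",
]
best = semantic_search("Why does it rain?", passages)
```

Note that the winning passage shares no keywords with the question; the match comes entirely from the geometry of the (toy) embedding space, which is the property that keyword search lacks.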
Those interested in finding other applications for Google's vector-based universal sentence encoder can read about it in a related paper and can try out the pre-trained Universal Sentence Encoder TensorFlow model.
- Special report: How to implement AI and machine learning (free PDF) (TechRepublic)
- Machine Learning for the masses (ZDNet)
- Machine learning: A cheat sheet (TechRepublic)
- Blockchain, AI, machine learning: What do CIOs really think are the most exciting tech trends? (ZDNet)
- Understanding the differences between AI, machine learning, and deep learning (TechRepublic)