Recently, the NYT's Cade Metz wrote about Google's BERT, a language representation model that is advancing natural language generation (NLG). The principle behind BERT is one our technology team here at Gravyty is pursuing as well – teaching AI to understand the relationships among words, phrases, and sentence structures.
Teaching technology to model the relationships and context among words is critical for enabling AI to communicate fluently and at scale with our users. NLG and natural language processing (NLP) projects hinge on three things: the right definition of the business problem, access to relevant and ample data, and the ability to run the computations that derive insights.
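BERT learns these relationships with a deep bidirectional transformer, but the underlying intuition – predicting a word from the words around it – can be sketched with a toy model. The corpus and function names below are illustrative only, not Gravyty's actual system:

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the kind of fundraising language a model might learn from.
corpus = [
    "thank you for your generous gift",
    "thank you for your continued support",
    "we appreciate your generous donation",
    "thank you for your generous donation",
]

# Count which word most often follows each word (a simple bigram model).
following = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1

def predict_next(word):
    """Guess the most likely word to follow `word`, based on corpus counts."""
    return following[word].most_common(1)[0][0]

print(predict_next("generous"))  # -> "donation" (follows "generous" most often above)
```

A bigram model only looks one word back; the leap BERT and similar models make is conditioning on the entire surrounding context in both directions, which is what lets generated text stay coherent across whole sentences.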
Google has an advantage in data access and data processing – they're building their own TPU chips, after all. And Google and other researchers are attempting NLG for general-purpose communication, drawing on everything from Wikipedia articles to social media posts.
Gravyty has a different luxury – the ability to hone its research within the fundraising world. This allows us to design our AI to focus on the actions that frontline fundraisers can and should take within a given day, month, quarter, or year. Much of the language we take in is general, but our output is far from it. Because we can constrain our output language to a smaller set of use cases, it's easier for us to construct more meaningful dialogue.
That said, having followed BERT, Fast.ai, and others focused on advancing NLP and NLG, it's validating to see that they are thinking along the same lines as we are and raising the bar for AI-based communication.