Learning Language through Interaction
By Hal Daumé III. Natural language processing systems built using machine learning techniques are amazingly effective when plentiful labeled training data exists for the task and domain of interest. Unfortunately, for broad-coverage language understanding (across tasks, domains, and languages), we're unlikely to ever have sufficient labeled data, and systems must find some other way to learn. I'll describe work we've done building methods that can learn from interaction, applied to two canonical NLP problems: machine translation and question answering. In the former, we develop techniques for collaborating with people; in the latter, for competing with them. This talk highlights joint work with a number of wonderful students and collaborators at UMD, UC Boulder, and MSR.
Related Links
Can Robots be Made Creative Enough to Invent Their Own Language? Professor Luc Steels talks about some of his recent breakthrough experiments, which have seen robots programmed to play language games and come up with novel concepts, words and meanings.
Can a Machine Ever Argue? Francesca Toni is working on models of logic-based argumentation to underpin reasoning in intelligent machines.
Procedural Language and Knowledge Various types of how-to knowledge are encoded in natural language instructions, from setting up a tent to preparing a dish for dinner to executing biology lab experiments.
Natural Language Processing This course is designed to introduce students to the fundamental concepts and ideas in natural language processing (NLP), and to get them up to speed with current research in the area. |
Machine Learning This is a graduate-level course on machine learning, a field that focuses on using automated data analysis for tasks like pattern recognition and prediction. |