Kevin M. Livingston, Ph.D.

Research Associate
Computational Bioscience Program
University of Colorado Anschutz Medical Campus

email: kevin.livingston@ucdenver.edu
office phone: 303.724.0276
office location: Room 6400A
Research Complex 1 South
University of Colorado Denver

Current

In 2014 I will be at both ISMB and AAAI; I hope to see you there.

Education

I have been a postdoc in Larry Hunter's lab for the past five years, and I am now a Research Associate.

I received my Ph.D. from Northwestern University in December 2009. My dissertation is titled "Language Understanding by Reference Resolution in Episodic Memory." My Ph.D. advisor was Chris Riesbeck.

Prior to that, I received a master's degree from Northwestern in June 2004. I have also been employed by several companies of varying size and industry, and I earned a bachelor's degree in Computer Science from the University of Dayton (in Dayton, OH) in December 2000.

Research: Machine Reading and Memory

Goal

I believe that semantic and episodic memory can be (and in all likelihood must be) leveraged early during language processing in order for a machine reader to comprehend text and integrate it with existing knowledge. This hypothesis contrasts with the typical Natural Language Processing (NLP) pipeline model, which defers memory integration to later stages of processing and frequently does not address the issues of scale that come with integrating into a large existing memory.

Memory and language understanding

Although it is generally accepted that memory and context play a crucial role in language comprehension, the question remains as to when this knowledge should be applied in machine language understanding. Current neuroscience indicates that humans access deep semantic and episodic knowledge at very early stages when reading, and that they operate on syntax and semantics at the same time, suggesting that there is no syntactic -> semantic pipeline (Hagoort 2007). My research explores giving machine readers the same capability: the ability to access memory at all stages of language understanding.

Direct Memory Access Parsing (DMAP)

Direct Memory Access Parsing (DMAP) (Martin 1992) is a memory-driven, expectation-based, deep-semantic approach to natural language understanding. DMAP uses phrasal patterns linked directly to knowledge structures in memory to recursively recognize textual references and map them to existing knowledge, or to construct new knowledge with similar structure when appropriate.
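
To make the mechanism concrete, below is a minimal sketch of DMAP-style recognition in Python. The memory contents, phrasal patterns, and naming conventions are invented for illustration, and the prediction handling is deliberately simplified (predictions simply persist until they complete); this is not the ResearchCyc implementation described below.

    # A minimal, illustrative sketch of DMAP-style recognition.
    # Abstraction (isa) links in memory; all names are invented for illustration.
    ISA = {"m-p53": "m-protein", "m-mdm2": "m-protein"}

    # Phrasal patterns linked to concepts in memory.  A pattern element is
    # either a literal token or a reference to a concept (prefixed "m-").
    PATTERNS = {
        "m-p53":           [["p53"]],
        "m-mdm2":          [["mdm2"]],
        "m-inhibit-event": [["m-protein", "inhibits", "m-protein"]],
    }

    def isa(concept, ancestor):
        """True if concept equals ancestor or reaches it via abstraction links."""
        while concept is not None:
            if concept == ancestor:
                return True
            concept = ISA.get(concept)
        return False

    def matches(item, element):
        """An item satisfies a literal element by equality, a concept element by isa."""
        return isa(item, element) if element.startswith("m-") else item == element

    def recognize(tokens):
        """Return the concept references recognized in a token sequence."""
        active = []      # partially matched predictions: (target, pattern, position)
        references = []  # completed concept references, in order
        for token in tokens:
            queue = [token]              # the token, then any concepts it triggers
            while queue:
                item = queue.pop(0)
                fresh = [(t, p, 0) for t, pats in PATTERNS.items() for p in pats]
                still_active = []
                for target, pattern, pos in active + fresh:
                    if matches(item, pattern[pos]):
                        if pos + 1 == len(pattern):
                            references.append(target)   # pattern complete: a reference
                            queue.append(target)        # recursive recognition
                        else:
                            still_active.append((target, pattern, pos + 1))
                    elif pos > 0:
                        still_active.append((target, pattern, pos))
                active = still_active
        return references

    print(recognize("p53 inhibits mdm2".split()))
    # -> ['m-p53', 'm-mdm2', 'm-inhibit-event']

The key property the sketch tries to show is that recognition bottoms out directly in memory: completing a pattern yields a reference to a concept, which can immediately satisfy a concept element in a larger pattern.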

To facilitate my research I have built an implementation of DMAP on top of the ResearchCyc knowledge base contents, driven by the Fire reasoning engine. In addition to my core research questions, several new problems have arisen. One is building an implementation of DMAP that works with predicate logic assertions (as exist in Cyc), as opposed to the frames for which DMAP was originally designed. Another is scale: ResearchCyc is three orders of magnitude larger than any memory previously used with a DMAP system, and memory-based methods for dealing with these problems of scale have themselves become a research question.
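
As a rough illustration of the representational mismatch (the slot and predicate names below are invented stand-ins, not actual Cyc vocabulary): a frame system packages an event as one structure with named slots, while a Cyc-style memory spreads the same content across several predicate logic assertions about a reified event, so the pattern-to-memory links DMAP depends on have to target sets of assertions rather than single frames.

    # Frame-style representation, roughly what DMAP was originally designed around:
    # one structure with named slots.
    frame = {
        "type":      "inhibit-event",
        "inhibitor": "p53",
        "inhibited": "MDM2",
    }

    # Predicate-logic-style representation, roughly how the same content looks as
    # assertions about a reified event.  Predicate names are invented stand-ins,
    # not actual Cyc vocabulary.
    assertions = [
        ("isa",       "event-17", "InhibitionEvent"),
        ("inhibitor", "event-17", "p53"),
        ("inhibited", "event-17", "MDM2"),
    ]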

DMAP is about bringing semantic and episodic memory to bear early and efficiently in the language understanding process. Semantic and episodic memory can be powerful resources for many NLP problems, such as coreference resolution, meaning formulation, and knowledge integration. Furthermore, given the end goal of integrating knowledge gained from reading with existing knowledge, I contend this process is made easier by operating on semantic and episodic structures, rather than lexical or linguistic structures, as early as possible in the parsing and understanding process.
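
As a toy example of the kind of leverage memory can provide for coreference: once earlier references have been resolved into memory, a definite description such as "the protein" can be mapped to the most recent prior reference whose abstractions are compatible. The names and the recency heuristic below are simplifications for illustration only, not my actual approach.

    # Toy illustration: resolve a definite description against prior references
    # using abstraction (isa) links in memory.  Names are invented for illustration.
    ISA = {"m-p53": "m-protein", "m-mdm2": "m-protein"}

    def isa(concept, ancestor):
        """True if concept equals ancestor or reaches it via abstraction links."""
        while concept is not None:
            if concept == ancestor:
                return True
            concept = ISA.get(concept)
        return False

    def resolve_description(prior_references, required_abstraction):
        """Return the most recent prior reference compatible with the description."""
        for concept in reversed(prior_references):
            if isa(concept, required_abstraction):
                return concept
        return None

    # "p53 inhibits MDM2.  The protein is then degraded..."
    prior = ["m-p53", "m-inhibit-event", "m-mdm2"]
    print(resolve_description(prior, "m-protein"))   # -> 'm-mdm2'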

Please contact me; I'd be happy to discuss this or related work with you.