Integrative Training in Health-Assistive Smart Environments

Context-aware Prompting and Executive Function Assistance for Cognitive Impairment using Sensors and Mobile Devices

Start Time: 
Fri, 02/15/2013 - 1:00pm
End Time: 
Fri, 02/15/2013 - 2:00pm
Location: 
EME B46

Cognitive impairment of almost any kind may impact the critical executive functions we need for self-management and independent living. The Planning and Execution Assistant and Trainer (PEAT) is an Android app designed to help users with executive function impairment complete more activities with less prompting from a caregiver. PEAT includes real-time planning methods that reschedule prompts when delays or interruptions occur, or when monitored sensor conditions change. PEAT was originally designed for brain injury rehabilitation but has been used by people with nearly all forms of cognitive impairment, including mild cognitive impairment, early dementia, stroke, autism, ADHD, PTSD, hypoxia, and others.

This talk will explain how PEAT functions, including an overview of the system architecture, the sensors, and the prompt generation methods. We will describe how PEAT has been used by individuals with different forms of cognitive impairment and its applications for aging in place. The talk covers PEAT's roots in NASA's autonomous systems research lab, clinical trials, and more recent R&D projects funded by DARPA and TATRC to integrate PEAT with Smart Home sensors and biosensors, creating a context-aware prompting system. We will present work with collaborator Dr. Henry Kautz (University of Rochester), who developed state estimation and prompt planning modules that plug into PEAT. We will also describe collaboration with a VA neuropsychologist who uses PEAT with approximately 50 veterans per year, many with mild cognitive impairment or early dementia, and the specific cognitive interventions she developed, which plug into PEAT to help both the patient and their caregiver.

An MP4 file of the talk can be found here (right-click and save to download).
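
To make the rescheduling idea above concrete, the sketch below shows one way a prompt planner could react to delays and to changes in monitored sensor conditions. It is a minimal Python illustration under assumed names (Prompt, PromptPlanner, on_delay, on_sensor_condition); it is not PEAT's actual architecture or code.

    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class Prompt:
        task: str
        due: datetime      # when the user should be prompted
        done: bool = False

    class PromptPlanner:
        """Hypothetical sketch (not PEAT's real API) of prompt rescheduling."""

        def __init__(self) -> None:
            self.prompts: list[Prompt] = []

        def add(self, task: str, due: datetime) -> None:
            self.prompts.append(Prompt(task, due))

        def on_delay(self, minutes: int) -> None:
            # A delay or interruption pushes every unfinished prompt back.
            for p in self.prompts:
                if not p.done:
                    p.due += timedelta(minutes=minutes)

        def on_sensor_condition(self, task: str, new_due: datetime) -> None:
            # A monitored sensor condition changes the timing of one task,
            # e.g. a smart-home sensor reports that lunch started late.
            for p in self.prompts:
                if not p.done and p.task == task:
                    p.due = new_due

        def next_prompt(self, now: datetime):
            # Return the earliest unfinished prompt that is now due, if any.
            due = [p for p in self.prompts if not p.done and p.due <= now]
            return min(due, key=lambda p: p.due) if due else None

For example, a medication prompt scheduled for 1:30pm would slide to 1:45pm after planner.on_delay(15), rather than firing while the user is still occupied. In the real system, the state estimation module would supply the sensor-driven events that trigger this kind of replanning.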

Speaker: 
Rich Levinson
Bio: 

Richard Levinson is an artificial intelligence researcher who developed autonomous software systems for nearly 20 years at NASA Ames Research Center. His research focuses on integrated planning and reaction for autonomous systems. He developed software that provides executive function support for NASA's autonomous robots in the form of integrated planning and real-time control components, and he pioneered research into a computer model of human executive function, publishing his work in both neuropsychology and computer science journals. Levinson holds three patents for integrated planning and prompting in cognitive aids and has been PI on federally funded R&D projects sponsored by the Department of Education, DARPA, and TATRC.