Perception

I've spent more than a decade researching human visual perception, focusing on how we use vision to build spatial memory and to guide our actions.

Along the way, I've developed considerable expertise in topics and technologies relevant to design practice and ergonomics, including haptics, eye-tracking systems, motion tracking, machine learning, and mathematical methods for analyzing and predicting patterns of human movement in urban and architectural spaces.

Virtual Reality

I spent six years conducting research with Dr. William H. Warren at Brown University's Virtual Environment Navigation Laboratory (VENLab), one of the world's most advanced VR facilities. While in graduate school, I earned competitive funding from the NASA RI Space Grant to support STEM education at RI schools.

My dissertation research asked people to navigate M.C. Escher-inspired, geometrically impossible mazes in virtual reality. The results suggest that the "maps in our heads" have more in common with subway maps than with Google Maps.

Neuroscience

I've designed interfaces for collecting and analyzing neural data at the University of Connecticut, working with Dr. Etan Markus and Dr. Andrew Moiseff. I've also collaborated with Dr. Rebecca Burwell at Brown University under a Brain Science grant supporting interdisciplinary projects that bridge behavioral science and neuroscience.

My work has contributed to research on the neural control of firefly flashes and on how "place cell" activity in the hippocampus (a brain area involved in memory) changes in response to varying environmental conditions.