Hello! I am a PhD candidate in the Computer Engineering program at Northeastern University, supervised by Kristen Dorsey and Taşkın Padır. My research interests lie at the intersection of remote sensing technologies and robotics. My proposed dissertation focuses on bringing hyperspectral imaging and near-infrared spectroscopy into lighter, cheaper form factors for use in household robots. I am midway through a research thrust on non-contact material estimation methods that infer object properties such as weight and friction. I welcome collaborations on practical applications of hyperspectral imaging, real-time processing, and vision systems that extend beyond the visible wavelengths of light.
Download my current CV.
PhD Computer Engineering, 2023
Northeastern University
MS Computer Science, 2020
Boston University
BSc Computer Engineering, 2019
University of Notre Dame
My current research directions include:
Compact, low-cost hyperspectral imaging and near-infrared spectroscopy for household robots
Non-contact material estimation to infer object properties such as weight and friction
Real-time vision systems that extend beyond the visible wavelengths of light
At any given time, I mentor a handful of talented undergraduate and graduate students. I am our lab’s webmaster and resident tour guide!
Terrain classification is a challenging task for robots operating in unstructured environments. Existing classification methods make simplifying assumptions, such as a reduced number of classes, clearly segmentable roads, or good lighting conditions, and focus primarily on a single sensor type. These assumptions do not translate well to off-road vehicles, which operate in varying terrain conditions. To give mobile robots the capability to identify the terrain being traversed and avoid undesirable surface types, we propose a multimodal sensor suite capable of classifying different terrains. We capture high-resolution macro images of surface texture, spectral reflectance curves, and localization data from a 9-degrees-of-freedom (DOF) inertial measurement unit (IMU) on 11 terrains at different times of day. Using this dataset, we train individual neural networks on each modality and then combine their outputs in a fusion network. The fused network achieved an accuracy of 99.98% on the test set, exceeding the best individual network component by 0.98%. We conclude that a combination of visual, spectral, and IMU data provides a meaningful improvement over state-of-the-art terrain classification approaches. The data created for this research is available at https://github.com/RIVeR-Lab/vast_data.
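For readers curious how such a late-fusion setup can be wired together, here is a minimal sketch in PyTorch. It only illustrates the general idea described in the abstract (per-modality networks whose outputs are concatenated and passed to a fusion classifier over the 11 terrain classes); the encoder architectures, feature sizes, and input dimensions are placeholder assumptions for illustration, not the networks used in the paper.

```python
# Minimal late-fusion sketch. Layer sizes and input dimensions are assumptions;
# only the overall structure (per-modality encoders feeding a fusion head) follows the abstract.
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """Small MLP standing in for a per-modality network (image, spectral, or IMU)."""

    def __init__(self, in_dim: int, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, feat_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)


class FusionClassifier(nn.Module):
    """Concatenates per-modality features and classifies the terrain."""

    def __init__(self, feat_dim: int = 64, n_classes: int = 11):
        super().__init__()
        self.image_enc = ModalityEncoder(in_dim=512, feat_dim=feat_dim)     # e.g. pooled texture-image features
        self.spectral_enc = ModalityEncoder(in_dim=288, feat_dim=feat_dim)  # e.g. sampled reflectance curve
        self.imu_enc = ModalityEncoder(in_dim=9, feat_dim=feat_dim)         # 9-DOF IMU reading
        self.head = nn.Sequential(
            nn.Linear(3 * feat_dim, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, image_feat, spectrum, imu):
        fused = torch.cat(
            [self.image_enc(image_feat), self.spectral_enc(spectrum), self.imu_enc(imu)],
            dim=-1,
        )
        return self.head(fused)


# Example forward pass with random stand-in data for a batch of 4 samples.
model = FusionClassifier()
logits = model(torch.randn(4, 512), torch.randn(4, 288), torch.randn(4, 9))
print(logits.shape)  # torch.Size([4, 11])
```

In practice the fusion head would be trained on held-out features from the already-trained per-modality networks (or end to end), but the concatenate-and-classify structure above is the core of the late-fusion approach.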
I am always looking for collaborators. Please reach out if you would like to work together!