Nathaniel Hanson

I am a Technical Staff Member at MIT Lincoln Laboratory in the Humanitarian Assistance and Disaster Relief (HADR) Technology group. I work on applications of spectral sensing to environmental monitoring and perception in unstructured environments. My PhD advisors were Taşkın Padır and Kristen Dorsey.

I am always excited to foster new collaborations. If you have any questions about my research, please feel free to contact me.

Email  /  GitHub  /  Google Scholar  /  LinkedIn  /  CV

profile photo

Biography

Within the HADR group, I lead a portfolio of research at the intersection of robotics, disasters, and remote sensing. My PhD research in the Institute for Experiential Robotics focused on making hyperspectral imaging and near-infrared spectroscopy lighter and more cost-effective for use in robot-centric terrain and object understanding. I coined the term "material-informed robotics" to describe a novel reframing of robotics with material perception at its core. I am active in collaborations with the MIT campus, including serving as lead instructor for the Beaver Works Summer Institute's Unmanned Aerial Vehicle Racing program and for the Learning Machines Training course with the MIT Media Lab. I have an MS in Computer Science from Boston University and a BS in Computer Engineering with a minor in Theology from the University of Notre Dame, where I worked in Jane Cleland-Huang's lab.

Research

I'm interested in hyperspectral imaging, material identification, multi-modal sensing, remote sensing, and robotics. Representative papers are highlighted.

project image

HASHI: Highly Adaptable Seafood Handling Instrument for Manipulation in Industrial Settings


Austin Allison*, Nathaniel Hanson*, Sebastian Wicke, Taşkın Padır
International Conference on Robotics and Automation (ICRA), 2024
arxiv / code / website /

We propose a novel robot end effector, called HASHI, that employs chopstick-like appendages for precise and dexterous manipulation. This gripper is capable of in-hand manipulation by rotating its two constituent sticks relative to each other and offers control of objects in all three axes of rotation by imitating human use of chopsticks. HASHI delicately positions and orients food through embedded 6-axis force-torque sensors. We derive and validate the kinematic model for HASHI, as well as demonstrate grip force and torque readings from the sensorization of each chopstick.

project image

Hold 'em and Fold 'em: Towards Human-scale, Feedback-Controlled Soft Origami Robots


Immanuel Ampomah Mensah, Jessica Healey, Celina Wu, Andrea Lacunza, Nathaniel Hanson, Kristen L. Dorsey
Soft Robotics Toolkit Competition Runner-Up, 2024
arxiv / website /

In this work, we independently demonstrate these key factors towards controlling and actuating human-scale loads: proprioceptive (embodied) feedback control of a soft, pneumatically-actuated origami robot; and actuation of these origami robots under a person’s weight in an open-loop configuration.

project image

Battery-Swapping Multi-Agent System for Sustained Operation of Large Planetary Fleets


Ethan Holand, Jarrod Homer, Alex Storrer, Musheera Khandeker, Ethan F. Muhlon, Maulik Patel, Ben-oni Vainqueur, David Antaki, Naomi Cooke, Chloe Wilson, Bahram Shafai, Nathaniel Hanson, Taşkın Padır
IEEE Aerospace Conference, 2024
arxiv / code / website /

This work shares an open-source platform developed to demonstrate battery swapping on unknown field terrain. We detail our design methodologies utilized for increasing system reliability, with a focus on optimization, robust mechanical design, and verification. Optimization of the system is discussed, including the design of passive guide rails through simulation-based optimization methods which increase the valid docking configuration space by 258%. The full system was evaluated during integrated testing, where an average servicing time of 98 seconds was achieved on surfaces with a gradient up to 10 degrees. We conclude by briefly proposing flight considerations for advancing the system toward a space-ready design.

project image

Hyper-Drive: Visible-Short Wave Infrared Hyperspectral Imaging Datasets for Robots in Unstructured Environments


Nathaniel Hanson, Benjamin Pyatski, Samuel Hibbard, Charles DiMarzio, Taşkın Padır
IEEE Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), 2023
arxiv / data / website /

We introduce a first-of-its-kind system architecture with snapshot hyperspectral cameras and point spectrometers to efficiently generate composite datacubes from a moving robot base. Our system collects and registers datacubes spanning the visible to shortwave infrared (660-1700 nm) spectrum while simultaneously capturing the ambient solar spectrum reflected off a white reference tile. We collect and disseminate a large dataset of more than 500 labeled datacubes from on-road and off-road terrain, compliant with the ATLAS ontology, to further the integration of hyperspectral imaging (HSI) into robotics.

project image

A Vision for Cleaner Rivers: Harnessing Snapshot Hyperspectral Imaging to Detect Macro-Plastic Litter


Nathaniel Hanson*, Ahmet Demirkaya*, Deniz Erdoğmuş, Aron Stubbins, Taşkın Padır, Tales Imbiriba
IEEE Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), 2023
arxiv / code /

To address the problem of mismanaged plastic waste, we analyze the feasibility of macro-plastic litter detection using computational imaging approaches in river-like scenarios. We enable near-real-time tracking of partially submerged plastics by using snapshot Visible-Shortwave Infrared hyperspectral imaging. Our experiments indicate that imaging strategies coupled with machine learning classification approaches can lead to high detection accuracy even in challenging scenarios, especially when leveraging hyperspectral data and nonlinear classifiers.

project image

Mobile MoCap: Retroreflector Localization On-The-Go


Gary Lvov, Mark Zolotas, Nathaniel Hanson, Austin Allison, Xavier Hubbard, Michael Carvajal, Taşkın Padır
IEEE International Conference on Automation Science and Engineering (CASE), 2023
arxiv / code /

We present a retroreflector feature detector that performs 6-DoF (six degrees-of-freedom) tracking and operates with minimal camera exposure times to reduce motion blur. To evaluate the proposed localization technique while in motion, we mount our Mobile MoCap system, as well as an RGB camera to benchmark against fiducial markers, onto a precision-controlled linear rail and servo. We evaluate the two systems at varying distances, marker viewing angles, and relative velocities. Across all experimental conditions, our stereo-based Mobile MoCap system obtains higher position and orientation accuracy than the fiducial approach.

project image

SLURP! Spectroscopy of Liquids Using Robot Pre-Touch Sensing


Nathaniel Hanson*, Wesley Lewis*, Kavya Puthuveetil*, Donelle Furline, Akhil Padmanabha, Taşkın Padır, Zackory Erickson
International Conference on Robotics and Automation (ICRA), 2023
arxiv / code / website /

We present a state-of-the-art sensing technique for robots to perceive what liquid is inside an unknown container. We do so by integrating Visible to Near Infrared (VNIR) reflectance spectroscopy into a robot’s end effector. We introduce a hierarchical model for inferring the material classes of both containers and internal contents given spectral measurements from two integrated spectrometers.

project image

VAST: Visual and Spectral Terrain Classification in Unstructured Multi-Class Environments


Nathaniel Hanson, Michael Shaham, Deniz Erdoğmuş, Taşkın Padır
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022
paper / data / code /

To provide mobile robots with the capability to identify the terrain being traversed and avoid undesirable surface types, we propose a multimodal sensor suite capable of classifying different terrains. We capture high-resolution macro images of surface texture, spectral reflectance curves, and localization data from a 9 degrees of freedom (DOF) inertial measurement unit (IMU) on 11 different terrains at different times of day. Using this dataset, we train individual neural networks on each of the modalities, and then combine their outputs in a fusion network.
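The per-modality-networks-plus-fusion idea above can be sketched as follows. This is a toy illustration with random weights and hypothetical feature dimensions; the paper's actual per-modality networks are trained on the collected dataset.

```python
import numpy as np

# Toy late-fusion sketch: one linear "network" per modality, then a
# fusion layer over the concatenated per-modality outputs.
# Feature dimensions (image, spectrum, IMU) are hypothetical.
rng = np.random.default_rng(0)
N_TERRAINS = 11  # terrain classes in the dataset

def modality_logits(features, weights):
    """Stand-in for one trained per-modality network."""
    return features @ weights

dims = (128, 256, 9)  # image texture, spectral curve, IMU statistics
weights = [rng.normal(size=(d, N_TERRAINS)) for d in dims]
features = [rng.normal(size=d) for d in dims]

# Fusion network input: concatenated per-modality class scores.
fused_in = np.concatenate([modality_logits(f, w)
                           for f, w in zip(features, weights)])
fusion_w = rng.normal(size=(len(dims) * N_TERRAINS, N_TERRAINS))
scores = fused_in @ fusion_w  # final terrain scores, shape (11,)
```

The design choice here is late fusion: each modality is first mapped to class scores on its own, so a degraded sensor only corrupts its own branch before the fusion layer reweights the evidence.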

project image

Occluded Object Detection and Exposure in Cluttered Environments with Automated Hyperspectral Anomaly Detection


Nathaniel Hanson, Gary Lvov, Taşkın Padır
Frontiers in Robotics and AI, 2022
paper / code /

We propose a new automated method for hyperspectral anomaly detection in cluttered workspaces, with the goal of improving robot manipulation. We first assume the dominance of a single material class and coarsely identify the dominant, non-anomalous class. Our work advances robot perception in cluttered environments by incorporating multi-modal anomaly detection, aided by hyperspectral sensing, to detect fractional object presence without the need for laboriously curated labels.

project image

Hyperbot: A Benchmarking Testbed for Acquisition of Robot-Centric Hyperspectral Scene and In-Hand Object Data


Nathaniel Hanson, Tarik Kelestemur, Joseph Berman, Dominik Ritzenhoff, Taşkın Padır
IEEE Workshop on Hyperspectral Imaging and Signal Processing: Evolution in Remote Sensing (WHISPERS), 2022
paper / code /

We designed a benchmarking testbed to enable a robot manipulator to perceive spectral and spatial characteristics of scene items. Our design includes the use of a push broom Visible to Near Infrared (VNIR) hyperspectral camera, co-aligned with a depth camera. This system enables the robot to process and segment spectral characteristics of items in a larger spatial scene.

project image

In-Hand Object Recognition with Innervated Fiber Optic Spectroscopy for Soft Grippers


Nathaniel Hanson, Hillel Hochsztein, Akshay Vaidya, Joel Willick, Kristen Dorsey, Taşkın Padır
IEEE International Conference on Soft Robotics (RoboSoft), 2022
paper /

We present a novel modular sensing platform integrated into a hybrid-manufactured soft robot gripper to collect and process high-fidelity spectral information. The custom design of the gripper is realized using 3D printing and casting. We embed full-spectrum light sources paired with lensed fiber optic cables within an optically clear gel to collect multi-point spectral reflectivity curves in the Visible to Near Infrared (VNIR) segment of the electromagnetic spectrum.




Other Projects

These include coursework, side projects, and unpublished research.

project image

Material-Informed Robotics: Spectral Perception for Object Identification and Parameter Inference


Nathaniel Hanson*
Dissertation
2023-12-16
paper /

Traditional robot perception has focused on recognizing an object’s semantic purpose as proxy to understanding ideal interaction strategies. However, semantic recognition does not include an implicit understanding of the material composition of these objects. Material recognition aids in understanding properties like weight, friction, and deformability. In this dissertation, the identification of material makeup in objects and scenes composed of heterogeneous materials is achieved through the introduction of near infrared (NIR) spectroscopy into robotics.

project image

Pregrasp Object Material Classification by a Novel Gripper Design with Integrated Spectroscopy


Nathaniel Hanson, Tarik Kelestemur, Deniz Erdoğmuş, Taşkın Padır
Preprint
2021-09-15
arxiv /

We introduce a novel design for a two-fingered gripper with an integrated NIR spectrometer and endoscopic camera to collect VNIR spectral readings and macro surface images from grasped items. We also develop a method based on a nonlinear Support Vector Machine (SVM) to achieve material inference between visually similar items (a smorgasbord of real and artificial fruits) and continually update class estimates with a discrete Bayes filter.
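The discrete Bayes filter update mentioned above can be sketched as follows. This is a minimal illustration with hypothetical class names and hand-picked likelihood values; in the paper, the per-reading likelihoods come from a trained SVM, not from constants.

```python
import numpy as np

# Hypothetical material classes for illustration only.
CLASSES = ["real_apple", "artificial_apple", "real_orange"]

def bayes_update(belief, likelihood):
    """One recursive update: posterior is proportional to likelihood * prior."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

# Start from a uniform prior over the material classes.
belief = np.full(len(CLASSES), 1.0 / len(CLASSES))

# Simulated per-reading class likelihoods (stand-ins for SVM scores).
for likelihood in [np.array([0.7, 0.2, 0.1]),
                   np.array([0.6, 0.3, 0.1])]:
    belief = bayes_update(belief, likelihood)

print(CLASSES[int(belief.argmax())])  # prints "real_apple"
```

Each new spectral reading sharpens the posterior, which is why repeated measurements of the same grasped item drive the class estimate toward a confident answer even when any single reading is ambiguous.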


Design and source code from Jon Barron's website