NRC Research and Fellowship Programs
Fellowships Office
Policy and Global Affairs


RAP opportunity at Naval Research Laboratory (NRL)

Environment Dependent Sensor Fusion for Robotic Perception

Location

Naval Research Laboratory, DC, Information Technology
Opportunity ID: 64.15.18.C0876
Washington, DC 20375-5321

Advisers

Name: Donald A. Sofge
Email: donald.a.sofge.civ@us.navy.mil
Phone: 202.767.0806

Description

Modern autonomous robotic platforms are equipped with a variety of sensors, including cameras [1], infrared sensors [2], lidar [3], and radar [4]. Despite this variety, many robots rely exclusively on lidar for localization [3]. The U.S. Naval Research Laboratory is interested in developing novel perception algorithms that can identify optimal sensing strategies based on mission-level decisions or environmental conditions. These algorithms should be robust to a variety of real-world constraints, such as adapting to visually degraded environments, modulating probability of detection, and operating within very limited size, weight, and power (SWaP) envelopes. A key goal of this research is to explore emerging technologies in multi-modal perception on embodied platforms and to apply data-driven solutions that determine optimal sensor selections for estimating the locations of agents. These solutions should scale to a variety of platforms with varying compute resources and subsets of common sensors.
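To illustrate the kind of environment-dependent sensor selection described above, the following is a minimal sketch using inverse-variance weighted fusion of range estimates. The environments, sensor noise values, and the `fuse` helper are illustrative assumptions for this sketch, not NRL's actual algorithm or data.

```python
# Hypothetical sketch: environment-dependent sensor fusion.
# Each sensor's measurement noise (std dev, meters) is conditioned on the
# environment; fusion weights each sensor by its inverse variance, so the
# "optimal" sensors naturally dominate as conditions change.
# All noise values below are made up for illustration.

NOISE_MODEL = {
    "clear": {"camera": 0.05, "lidar": 0.02, "radar": 0.10, "thermal": 0.20},
    "smoke": {"camera": 5.00, "lidar": 1.50, "radar": 0.12, "thermal": 0.25},
    "night": {"camera": 2.00, "lidar": 0.02, "radar": 0.10, "thermal": 0.15},
}

def fuse(measurements, environment, noise_model=NOISE_MODEL):
    """Inverse-variance weighted fusion of 1-D range estimates.

    measurements: dict mapping sensor name -> estimate (meters).
    Returns (fused_estimate, weights); the weights show which sensors
    dominate in the given environment.
    """
    sigmas = noise_model[environment]
    inv_vars = {s: 1.0 / sigmas[s] ** 2 for s in measurements}
    total = sum(inv_vars.values())
    weights = {s: iv / total for s, iv in inv_vars.items()}
    estimate = sum(weights[s] * z for s, z in measurements.items())
    return estimate, weights

# Same raw measurements, different environments:
z = {"camera": 10.3, "lidar": 10.1, "radar": 9.9, "thermal": 10.4}
est_clear, w_clear = fuse(z, "clear")   # lidar dominates in clear air
est_smoke, w_smoke = fuse(z, "smoke")   # fusion shifts toward radar in smoke
```

In a fielded system the weights would come from learned, data-driven noise models rather than a hand-written table, and the fusion would run inside a full state estimator (e.g., a factor graph), but the selection principle is the same.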

Applicants should have an extensive background in field robotics applications and experience in a subset of the following areas: machine learning, state estimation, robotic perception, sensor fusion, factor graphs, and statistical analysis.

References: 

[1] Sun, K., Mohta, K., Pfrommer, B., Watterson, M., Liu, S., Mulgaonkar, Y., Taylor, C.J. and Kumar, V., 2018. Robust stereo visual inertial odometry for fast autonomous flight. IEEE Robotics and Automation Letters, 3(2), pp.965-972, doi: 10.1109/LRA.2018.2793349.

[2] Shin, Y.S. and Kim, A., 2019. Sparse depth enhanced direct thermal-infrared SLAM beyond the visible spectrum. IEEE Robotics and Automation Letters, 4(3), pp.2918-2925.

[3] Ebadi, K., Bernreiter, L., Biggie, H., Catt, G., Chang, Y., Chatterjee, A., Denniston, C.E., Deschênes, S.P., Harlow, K., Khattak, S. and Nogueira, L., 2023. Present and future of SLAM in extreme environments: The DARPA SubT Challenge. IEEE Transactions on Robotics.

[4] Harlow, K., Jang, H., Barfoot, T.D., Kim, A. and Heckman, C., 2023. A New Wave in Robotics: Survey on Recent mmWave Radar Applications in Robotics. arXiv preprint arXiv:2305.01135.

Keywords
Robotics; Sensor fusion; Localization; Millimeter-wave radar; Visual odometry; Lidar odometry; Infrared odometry

Eligibility

Citizenship: Open to U.S. citizens and permanent residents
Level: Open to postdoctoral applicants

Stipend

Base Stipend: $99,200.00
Travel Allotment: $3,000.00
Supplementation: —
Copyright © 2024. National Academy of Sciences. All rights reserved.