NRC Research and Fellowship Programs
Fellowships Office
Policy and Global Affairs

Participating Agencies

RAP opportunity at the National Institute of Standards and Technology (NIST)

Explainable Artificial Intelligence


Information Technology Laboratory, Software and Systems Division

Opportunity: 50.77.51.C0802
Location: Gaithersburg, MD

NIST only participates in the February and August reviews.


Name: Peter Bajcsy
Phone: 301.975.2958


Cyber-physical instruments that combine physical sensing (e.g., microscopy imaging) with cyber (digital) Artificial Intelligence (AI)-based predictions are now widespread. These instruments raise safety concerns because they rely on black-box AI models and lack guardrails in case the physical and/or digital parts of the instrument fail or are attacked by an adversary. We aim to address these safety concerns by researching a metrology for establishing digital references [2], safety zones (boundaries), validation methods for AI risk management, and baselines for the traceability of the physical and digital parts of AI-enabled instruments [1]. Our research is applied to instruments used in regenerative medicine and cancer research [3].
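As one minimal illustration of the guardrail idea described above, a confidence estimate can be computed for each black-box model prediction and compared against a "safety zone" boundary, with out-of-zone predictions flagged for review. The function names and the entropy threshold below are hypothetical choices for this sketch, not part of the NIST research program:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - np.max(logits, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predictive_entropy(probs):
    # Shannon entropy of the predicted class distribution;
    # higher entropy means a less confident prediction.
    eps = 1e-12
    return -np.sum(probs * np.log(probs + eps), axis=-1)

def guardrail(logits, entropy_threshold=0.5):
    # Accept a prediction only if its entropy lies inside the
    # "safety zone" defined by the (hypothetical) threshold.
    probs = softmax(np.asarray(logits, dtype=float))
    h = predictive_entropy(probs)
    labels = np.argmax(probs, axis=-1)
    accepted = h <= entropy_threshold
    return labels, h, accepted

# Example: one confident and one maximally uncertain prediction.
labels, h, accepted = guardrail([[10.0, 0.0, 0.0],
                                 [1.0, 1.0, 1.0]])
```

Here the confident prediction (logits [10, 0, 0]) falls inside the safety zone, while the uniform prediction (entropy ln 3 ≈ 1.10) is flagged; real instruments would calibrate the boundary against reference data rather than a fixed constant.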

[1] OpenAI Microscope, a collection of visualizations of every significant layer and neuron of 13 important vision models, URL

[2] Peter Bajcsy et al., “AI Model Utilization Measurements For Finding Class Encoding Patterns”, arXiv, Dec. 2022, URL

[3] Nicholas J. Schaub et al., “Deep learning predicts function of live retinal pigment epithelium from quantitative microscopy,” Journal of Clinical Investigation. November 14, 2019. DOI

Key words: Artificial Intelligence; Computer Vision; Confidence Estimation


Citizenship:  Open to U.S. citizens
Level:  Open to Postdoctoral applicants


Base Stipend: $82,764.00
Travel Allotment: $3,000.00
Supplementation:
Copyright © 2024. National Academy of Sciences. All rights reserved.