
What's New
Monthly Newsletter
Don't miss the latest DSI news. Subscribe to our newsletter & read the latest volume.
Upcoming Seminar
Nov. TBD: Stay tuned for the next speaker and topic. Contact DSI-Seminars [at] llnl.gov for a WebEx invitation.
Latest Video
Data Science Meets Fusion (7:39), featuring Jay Thiagarajan, Luc Peterson, and Kelli Humbird.
Data Scientist Spotlight

Giselle Fernández
Data Scientist
Data scientist Giselle Fernández contributes to an “exhilarating and immensely rewarding” breadth of LLNL research. She serves as machine learning (ML) lead for projects involving fusion energy design optimization, material deformation, and post-detonation flow transport. “Machines undoubtedly will play a pivotal role in our future,” she says. “My deep involvement in data science makes me feel connected to that future in an unprecedented way.” After completing an aerospace engineering PhD at the University of Florida and postdoctoral research at Los Alamos National Lab, Fernández came to Livermore in 2020, joining the Atmospheric, Earth, and Energy Division to apply expertise in ML techniques and uncertainty quantification to research utilizing high-performance computing resources. “Being part of the LLNL community affords me the honor of working for the safety and security of our nation—a responsibility I take immense pride in.” Dedicated to outreach and mentorship, Fernández regularly presents workshops and tutorials, authors news articles, and leads interdisciplinary ML discussions. She also hosts students every summer at Livermore. “My mom, who devoted her career to teaching early on, always fascinated me with her passion for education. Now, hearing students say, ‘I learned so much from you,’ I appreciate how deeply rewarding it is.”
New Research in AI
Explainable AI Can Enhance Scientific Workflows

As ML and AI tools become more widespread, a team of researchers in LLNL’s Computing and Physical and Life Sciences directorates is working to provide a reasonable starting place for scientists who want to apply ML/AI but don’t have the appropriate background. The team’s work grew out of a Laboratory Directed Research and Development project on feedstock materials optimization, which led to a pair of papers about the types of questions a materials scientist may encounter when using ML tools, and how these tools behave. Explainable artificial intelligence (XAI) is an emerging field that helps interpret complicated machine learning models, providing an entry point to new applications. XAI may use tools such as visualizations that identify which features a neural network finds important in a dataset, or surrogate models that explain more complex concepts. Read more at LLNL Computing.
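To make the feature-importance idea above concrete, here is a minimal sketch of one common XAI technique, permutation feature importance: shuffle one input feature at a time and measure how much the model's error grows. The "black-box" model and the synthetic dataset below are hypothetical illustrations, not taken from the LLNL team's papers.

```python
import random

random.seed(0)

# Hypothetical black-box model: depends strongly on feature 0, weakly on feature 1.
def model(x):
    return 3.0 * x[0] + 0.5 * x[1]

# Synthetic dataset; targets come from the model itself, so baseline error is zero.
data = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
targets = [model(x) for x in data]

def mse(preds, targets):
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(targets)

def permutation_importance(model, data, targets, feature):
    """Increase in mean squared error when one feature column is shuffled."""
    baseline = mse([model(x) for x in data], targets)
    column = [x[feature] for x in data]
    random.shuffle(column)  # break the feature's relationship to the targets
    permuted = [x[:feature] + [v] + x[feature + 1:] for x, v in zip(data, column)]
    return mse([model(x) for x in permuted], targets) - baseline

scores = [permutation_importance(model, data, targets, f) for f in range(2)]
# Shuffling feature 0 (large coefficient) should hurt far more than feature 1.
```

The same shuffle-and-score loop applies to any predictive model, which is what makes it a useful entry point for scientists evaluating an unfamiliar ML tool.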