Volume 7

May 4, 2021

DSI logo cropped

Our mission at the Data Science Institute (DSI) is to enable excellence in data science research and applications across LLNL. Our newsletter is a compendium of breaking news, the latest research, outreach efforts, and more. Past volumes of our newsletter are available online.

diagram of pruned neural network

New Research in Machine Learning Robustness

LLNL postdoctoral researcher James Diffenderfer and computer scientist Bhavya Kailkhura are co-authors on a paper that offers a novel and unconventional way to train deep neural networks (DNNs). The LLNL team shows both empirically and theoretically that it is possible to learn highly accurate NNs simply by compressing (i.e., pruning and binarizing) randomized NNs without ever updating the weights. This is in sharp contrast to the prevailing weight-training paradigm, in which the values of the weights are learned iteratively by stochastic gradient descent. In the process, Diffenderfer and Kailkhura proposed (and proved) what they call the multi-prize lottery ticket hypothesis:

Hidden within a sufficiently large, randomly weighted NN lie several subnetworks (winning tickets) that (a) have comparable accuracy to a dense target network with learned weights (prize 1); (b) do not require any training to achieve prize 1 (prize 2); and (c) are robust to extreme forms of quantization (i.e., binary weights and/or activations) (prize 3).

In several state-of-the-art DNNs, the LLNL team found subnetworks that are significantly smaller, retaining only 5–10% of the original parameters and keeping only the signs of weights and/or activations. These subnetworks require no training yet match (or exceed) the performance of weight-trained, dense, full-precision DNNs. The paper, “Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning a Randomly Weighted Network,” demonstrated that this counterintuitive learn-by-compressing approach yields subnetworks that are approximately 32x smaller and up to 58x faster than traditional DNNs without sacrificing accuracy.

“This is a completely new way to train neural networks that is computationally inexpensive and reduces the amount of necessary compute storage,” said Diffenderfer. Kailkhura added, “This work unlocks a range of potential areas where DNNs can now be applied, such as edge computing applications in environment/ocean/urban monitoring with cheap sensors.” The research has been accepted to the upcoming International Conference on Learning Representations 2021.
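The core idea — keep a small, high-scoring fraction of frozen random weights and replace each survivor with its sign — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the `prune_and_binarize` helper, the per-weight score array, and the 10% keep fraction are illustrative assumptions (in practice the scores themselves would be learned while the weights stay frozen).

```python
import numpy as np

def prune_and_binarize(weights, scores, keep_fraction=0.1):
    """Keep only the top-scoring fraction of frozen random weights,
    replacing each survivor with its sign (binarization).
    The weights are never updated -- only the mask selects a subnetwork."""
    k = max(1, int(keep_fraction * weights.size))
    # k-th largest score magnitude becomes the pruning threshold
    threshold = np.sort(np.abs(scores), axis=None)[-k]
    mask = (np.abs(scores) >= threshold).astype(weights.dtype)
    return np.sign(weights) * mask

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))   # frozen random weights of one layer
s = rng.standard_normal((64, 32))   # hypothetical per-weight scores
subnet = prune_and_binarize(W, s, keep_fraction=0.1)
```

The resulting `subnet` contains only values in {-1, 0, +1}, which is why such winning tickets are both much smaller to store and cheap to evaluate.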


screen shot of WebEx video chat attendees

WiDS Livermore Goes Virtual

Coinciding with International Women’s Day on March 8, LLNL’s fourth annual Women in Data Science (WiDS) regional event brought women together to discuss successes, opportunities, and challenges of being female in a mostly male field. The Lab’s first-ever virtual WiDS gathering attracted dozens of data scientists from inside and outside the Lab. “If some of you have experienced being literally the only woman at the table, or the only woman on a panel, I’m sure you can agree it’s quite special to have an event centered on women in data science,” WiDS Livermore ambassador Marisa Torres told attendees. “We’re here to celebrate a lot of accomplishments.”

Speakers included statistician Ana Kupresanin, who discussed her path from a high school student in her home country of Croatia to LLNL, where she focuses on the uncertainty quantification of predictive models for the weapons program. Bioinformaticist Nisha Mulakken, who is studying the impact of microbiomes on COVID-19 severity among other pursuits, shared tips for mentoring young women and urged viewers to seek guidance through DSI and the Lab’s Data Science Summer Institute, which she co-directs.

WiDS logo in green

The career panel was moderated by data scientist Amar Saini and featured Brenda Ng, Mary Silva, Juanita Ordóñez, and Hiranmayi Ranganathan. The panelists discussed balancing work and family obligations, imposter syndrome, preventing burnout, and other challenges unique to women in data science. The event also featured a speed mentoring session moderated by computer scientist Stephanie Brink, a collaborative platform for chatting, virtual breakout rooms, and a trivia game. Throughout the day, attendees also tuned into Stanford University’s worldwide livestream. Livermore’s was one of more than 200 WiDS events held in more than 60 countries in conjunction with the main conference. 


ML4I logo with neural network and starburst

Lab Offers Forum on Machine Learning for Industry

LLNL is looking for participants and attendees from industry, research institutions, and academia for the first-ever Machine Learning for Industry Forum (ML4I), a three-day virtual event starting Aug. 10. The event is sponsored by LLNL’s High Performance Computing Innovation Center and the Data Science Institute. The deadline for submitting presentations or industry use cases is June 30. The deadline for attendee registration is July 29.


screen shot of Brian in video chat with a slide describing autoencoder latent space

Multimedia Highlights

  • University of Washington seminar. Physicist Brian Spears (pictured at left) gave a virtual talk for UW’s seminar series. In “Cognitive Simulations: Combining Simulation and Experiment with Artificial Intelligence,” he described advances in machine learning architectures and methods for inertial confinement fusion science and other applications.
  • George Mason University colloquium. Computational scientist Youngsoo Choi spoke about “Where Are We with Data-Driven Surrogate Modeling for Various Physical Simulations?” at a virtual colloquium for GMU’s Center for Mathematics and Artificial Intelligence.
  • The Data Standard podcast. Data Science Summer Institute co-director Nisha Mulakken appeared on The Data Standard podcast to discuss her bioinformatics work at LLNL. Audio and video will be available in the coming weeks.
  • DDPS virtual seminar series. The Data-Driven Physical Simulations (DDPS) reading group has expanded to welcome external speakers. Led by Youngsoo Choi, the group’s virtual format is now available as a video playlist in the Livermore Lab Events YouTube channel.

seminar icon next to Robert's portrait

Virtual Seminar

The DSI invited Dr. Robert Gramacy of Virginia Tech to speak at the March virtual seminar. Gramacy (pictured) recently published a book on surrogate modeling of computer experiments. His talk, “Replication or Exploration? Sequential Design for Stochastic Simulation Experiments,” showed that replication can be beneficial from both design and computational perspectives.


CVPR logo with the Earth inside the C

Premier Computer Vision Conference Papers

Coming up in June, the international Computer Vision and Pattern Recognition (CVPR) conference is the ML/AI community’s premier annual computer vision event. Two LLNL co-authored papers have been accepted this year.


highlights icon with Olivia's portrait

Meet an LLNL Data Scientist

With a B.S. in Mathematics and Computer Science from UC San Diego, Olivia Miano was poised to join LLNL in the spring of 2020 as a software developer. Then she heard about the Data Science Immersion Program and immediately signed up. “I knew next to nothing about data science when I first joined the Lab, so almost everything I know I learned during the program,” she said, acknowledging mentors David Buttler and Juanita Ordóñez. A year later, she works on natural language processing projects for LLNL’s Global Security Computing Applications Division. For Miano, the challenges of applying data science are also what make it exciting. She stated, “I’m always eager to learn and willing to tackle a challenging assignment, especially when the work is meaningful like what we do at the Lab.”