Data Science in the News

Introduction to deep learning for image classification workshop (VIDEO)

July 6, 2022- 
In addition to its annual conference held every March, the global Women in Data Science (WiDS) organization hosts workshops and other activities year-round to inspire and educate data scientists worldwide, regardless of gender, and to support women in the field. On June 29, LLNL’s Cindy Gonzales led a WiDS Workshop titled “Introduction to Deep Learning for Image Classification.” The abstract...

Building confidence in materials modeling using statistics

Oct. 31, 2021- 
LLNL statisticians, computational modelers, and materials scientists have been developing a statistical framework for researchers to better assess the relationship between model uncertainties and experimental data. The Livermore-developed statistical framework is intended to assess sources of uncertainty in strength model input, recommend new experiments to reduce those sources of uncertainty...

Summer scholar develops data-driven approaches to key NIF diagnostics

Oct. 20, 2021- 
Su-Ann Chong's summer project, “A Data-Driven Approach Towards NIF Neutron Time-of-Flight Diagnostics Using Machine Learning and Bayesian Inference,” presents a different take on neutron time-of-flight (nToF) diagnostics. These diagnostics are an essential tool for probing the implosion dynamics of inertial confinement fusion experiments at NIF, the world’s largest and most energetic...

Data Science Challenge welcomes UC Riverside

Oct. 11, 2021- 
Together with LLNL’s Center for Applied Scientific Computing (CASC), the DSI welcomed a new academic partner to the 2021 Data Science Challenge (DSC) internship program: the University of California (UC) Riverside campus. The intensive program has run for three years with UC Merced, and it tasks undergraduate and graduate students with addressing a real-world scientific problem using data...

Laser-driven ion acceleration with deep learning

May 25, 2021- 
While advances in machine learning over the past decade have made significant impacts in applications such as image classification, natural language processing and pattern recognition, scientific endeavors have only just begun to leverage this technology. This is most notable in processing large quantities of data from experiments. Research conducted at LLNL is the first to apply neural...

A winning strategy for deep neural networks

April 29, 2021- 
LLNL continues to make an impact at top machine learning conferences, even as much of the research staff works remotely during the COVID-19 pandemic. Postdoctoral researcher James Diffenderfer and computer scientist Bhavya Kailkhura, both from LLNL’s Center for Applied Scientific Computing, are co-authors on a paper—“Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural...

Winter hackathon highlights data science talks and tutorial

March 24, 2021- 
The Data Science Institute (DSI) sponsored LLNL’s 27th hackathon on February 11–12. Held four times a year, these seasonal events bring the computing community together for a 24-hour period where anything goes: Participants can focus on special projects, learn new programming languages, develop skills, dig into challenging tasks, and more. The winter hackathon was the DSI’s second such...

Novel deep learning framework for symbolic regression

March 18, 2021- 
LLNL computer scientists have developed a new framework and an accompanying visualization tool that leverages deep reinforcement learning for symbolic regression problems, outperforming baseline methods on benchmark problems. The paper was recently accepted as an oral presentation at the International Conference on Learning Representations (ICLR 2021), one of the top machine learning...
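For readers unfamiliar with symbolic regression — searching for a closed-form expression that fits data, rather than fitting fixed coefficients — here is a toy sketch. It uses random search over a tiny expression space as a stand-in for the paper's method, which instead trains a neural network with reinforcement learning to generate candidate expressions; the operator set and target below are illustrative assumptions:

```python
import random
import numpy as np

rng = random.Random(0)
x = np.linspace(-2, 2, 50)
target = x**3 + x          # ground-truth expression to recover

# Small library of unary operators the search can combine.
UNARY = {"sin": np.sin, "sq": np.square,
         "cube": lambda v: v**3, "id": lambda v: v}

def random_expr():
    """Sample a small expression of the form f(x) + g(x)."""
    return rng.choice(list(UNARY)), rng.choice(list(UNARY))

def evaluate(expr):
    """Score a candidate by mean squared error against the data."""
    f, g = expr
    pred = UNARY[f](x) + UNARY[g](x)
    return np.mean((pred - target) ** 2)

best = min((random_expr() for _ in range(500)), key=evaluate)
best_mse = evaluate(best)
```

Random search works here only because the space is tiny; the appeal of a learned generator is that it scales this search to realistic expression spaces.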

Ana Kupresanin featured in FOE alumni spotlight

March 10, 2021- 
LLNL's Ana Kupresanin, deputy director of the Center for Applied Scientific Computing and member of the Data Science Institute council, was recently featured in a Frontiers of Engineering (FOE) alumni spotlight. Kupresanin develops statistical and machine learning models that incorporate real-world variability and probabilistic behavior to quantify uncertainties in engineering and physics...

'Self-trained' deep learning to improve disease diagnosis

March 4, 2021- 
New work by computer scientists at LLNL and IBM Research on deep learning models to accurately diagnose diseases from X-ray images with less labeled data won the Best Paper award for Computer-Aided Diagnosis at the SPIE Medical Imaging Conference on February 19. The technique, which includes novel regularization and “self-training” strategies, addresses some well-known challenges in the...

Lab researchers explore ‘Learn-by-Calibrating’ approach to deep learning to accurately emulate scientific processes

Feb. 10, 2021- 
An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators. Researchers found the approach results in high-quality predictive models that are closer to real-world data and better calibrated than previous state-of-the-art methods. The LbC approach is based on interval...
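To illustrate what "better calibrated" means for interval predictions — not the LbC algorithm itself — the following sketch checks whether a model's predicted intervals actually contain observations at the advertised rate; the constant-width interval and Gaussian data are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.normal(0.0, 1.0, 5000)   # observed outcomes

# A hypothetical model predicts the interval [-w, w] for every point;
# for standard normal data, w = 1.96 targets 95% nominal coverage.
lower, upper = -1.96, 1.96

# Empirical coverage: fraction of observations inside the interval.
# A calibrated model's empirical coverage matches the nominal level.
coverage = np.mean((y_true >= lower) & (y_true <= upper))
```

In a calibrated predictor, this empirical coverage tracks the nominal level across all levels, not just 95%; that agreement is the property calibration-based training targets.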

CASC research in machine learning robustness debuts at AAAI conference

Feb. 10, 2021- 
LLNL’s Center for Applied Scientific Computing (CASC) has steadily grown its reputation in the artificial intelligence (AI)/machine learning (ML) community—a trend continued by three papers accepted at the 35th AAAI Conference on Artificial Intelligence, held virtually on February 2–9, 2021. Computer scientists Jayaraman Thiagarajan, Rushil Anirudh, Bhavya Kailkhura, and Peer-Timo Bremer led...

NeurIPS papers aim to improve understanding and robustness of machine learning algorithms

Dec. 7, 2020- 
The 34th Conference on Neural Information Processing Systems (NeurIPS) features two LLNL papers advancing the reliability of deep learning for mission-critical applications. The most prestigious machine learning conference in the world, NeurIPS began virtually on Dec. 6. The first paper describes a framework for understanding the effect of properties of training data on the...

What put LLNL at the center of U.S. supercomputing in 2020?

Nov. 12, 2020- 
The HPC world is waiting for the next series of transitions to far larger machines with exascale capabilities. By this time next year, the twice-yearly Top500 ranking of the world’s most powerful systems will be refreshed at the top as Frontier, El Capitan, Aurora, and other DOE systems come online. While LLNL was already planning around AI acceleration for its cognitive simulation aims and had a number...

AI gets a boost via LLNL, SambaNova collaboration

Oct. 20, 2020- 
LLNL has installed a state-of-the-art artificial intelligence (AI) accelerator from SambaNova Systems, the National Nuclear Security Administration (NNSA) announced today, allowing researchers to more effectively combine AI and machine learning (ML) with complex scientific workloads. LLNL has begun integrating the new AI hardware, SambaNova Systems DataScale™, into the NNSA’s Corona...

LLNL, ANL and GSK provide early glimpse into Cerebras AI system performance

Oct. 13, 2020- 
AI chip and systems startup Cerebras was one of many AI companies showcased at the AI Hardware Summit, which concluded last week. Cerebras invited collaborators from LLNL, Argonne National Laboratory, and GlaxoSmithKline to talk about their early work on Cerebras machines and future plans. Livermore Computing's CTO Bronis de Supinski said, “We have this vision for performing cognitive...

Machine learning speeds up and enhances physics calculations

Oct. 1, 2020- 
Interpreting data from NIF’s cutting-edge high energy density science experiments relies on physics calculations that are so complex they can challenge LLNL supercomputers, which stand among the best in the world. A collaboration between LLNL and French researchers found a novel way to incorporate machine learning and neural networks to significantly speed up inertial confinement fusion...

DL-based surrogate models outperform simulators and could hasten scientific discoveries

June 17, 2020- 
Surrogate models supported by neural networks can perform as well, and in some ways better, than computationally expensive simulators and could lead to new insights in complicated physics problems such as inertial confinement fusion (ICF), LLNL scientists reported. Read more at LLNL News.
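The surrogate idea can be sketched in a few lines — here with a simple polynomial fit standing in for the neural network, and an ordinary function standing in for the expensive simulator; both stand-ins are illustrative assumptions, not the paper's models:

```python
import numpy as np

def expensive_simulator(x):
    """Stand-in for a costly physics code: pretend each call takes hours."""
    return np.sin(3 * x) * np.exp(-x**2)

# Run the simulator a limited number of times to get training data.
rng = np.random.default_rng(1)
x_train = rng.uniform(-1, 1, 40)
y_train = expensive_simulator(x_train)

# Fit a cheap surrogate to those runs.
coeffs = np.polyfit(x_train, y_train, deg=9)
surrogate = np.poly1d(coeffs)

# The surrogate now predicts outputs without invoking the simulator.
x_test = np.linspace(-1, 1, 100)
max_err = np.max(np.abs(surrogate(x_test) - expensive_simulator(x_test)))
```

A neural network surrogate plays the same role as the polynomial here, but scales to the high-dimensional inputs and outputs of ICF simulations.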

Modeling neuronal cultures on 'brain-on-a-chip' devices

June 12, 2020- 
For the past several years, LLNL scientists and engineers have made significant progress in developing a three-dimensional “brain-on-a-chip” device capable of recording the neural activity of human brain cell cultures grown outside the body. The team has developed a statistical model for analyzing the structures of neuronal networks that form among brain cells seeded on in vitro brain-on-a...

Lab team studies calibrated AI and deep learning models to more reliably diagnose and treat disease

May 29, 2020- 
A team led by LLNL computer scientist Jay Thiagarajan has developed a new approach for improving the reliability of artificial intelligence and deep learning-based models used for critical applications, such as health care. Thiagarajan recently applied the method to study chest X-ray images of patients diagnosed with COVID-19, the disease caused by the novel SARS-CoV-2 coronavirus. Read more at LLNL...