Data Science in the News


Visualization software stands the test of time

Sept. 13, 2021 - 
In the decades since LLNL’s founding, the technology used in pursuit of the Laboratory’s national security mission has evolved. For example, studying scientific phenomena and predicting their behaviors require increasingly robust, high-resolution simulations. These crucial tasks compound the demands on high-performance computing hardware and software, which must continually be...

Laser-driven ion acceleration with deep learning

May 25, 2021 - 
While advances in machine learning over the past decade have made significant impacts in applications such as image classification, natural language processing and pattern recognition, scientific endeavors have only just begun to leverage this technology. This is most notable in processing large quantities of data from experiments. Research conducted at LLNL is the first to apply neural...

A winning strategy for deep neural networks

April 29, 2021 - 
LLNL continues to make an impact at top machine learning conferences, even as much of the research staff works remotely during the COVID-19 pandemic. Postdoctoral researcher James Diffenderfer and computer scientist Bhavya Kailkhura, both from LLNL’s Center for Applied Scientific Computing, are co-authors on a paper—“Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural...

Winter hackathon highlights data science talks and tutorial

March 24, 2021 - 
The Data Science Institute (DSI) sponsored LLNL’s 27th hackathon on February 11–12. Held four times a year, these seasonal events bring the computing community together for a 24-hour period where anything goes: Participants can focus on special projects, learn new programming languages, develop skills, dig into challenging tasks, and more. The winter hackathon was the DSI’s second such...

Novel deep learning framework for symbolic regression

March 18, 2021 - 
LLNL computer scientists have developed a new framework and an accompanying visualization tool that leverages deep reinforcement learning for symbolic regression problems, outperforming baseline methods on benchmark problems. The paper was recently accepted as an oral presentation at the International Conference on Learning Representations (ICLR 2021), one of the top machine learning...

'Self-trained' deep learning to improve disease diagnosis

March 4, 2021 - 
New work by computer scientists at LLNL and IBM Research on deep learning models to accurately diagnose diseases from X-ray images with less labeled data won the Best Paper award for Computer-Aided Diagnosis at the SPIE Medical Imaging Conference on February 19. The technique, which includes novel regularization and “self-training” strategies, addresses some well-known challenges in the...
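
For readers unfamiliar with the term, "self-training" generally means letting a model label its own unlabeled data and keep only its most confident predictions as new training examples. The sketch below illustrates that generic idea with a simple pseudo-labeling loop in Python; it is not the LLNL/IBM method, which adds novel regularization strategies and targets deep chest X-ray classifiers, and all data, models, and thresholds here are made up for illustration.

# Generic pseudo-labeling loop illustrating the "self-training" idea:
# a model trained on a small labeled set assigns labels to unlabeled data,
# and only its most confident predictions are added back as training examples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
labeled = rng.choice(len(X), size=100, replace=False)        # small labeled pool
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)

X_lab, y_lab = X[labeled], y[labeled]
X_unl = X[unlabeled]

for round_ in range(5):
    if len(X_unl) == 0:
        break
    model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)
    proba = model.predict_proba(X_unl)
    confident = proba.max(axis=1) > 0.95                      # confidence threshold
    if not confident.any():
        break
    # Add confidently pseudo-labeled samples to the training set.
    X_lab = np.vstack([X_lab, X_unl[confident]])
    y_lab = np.concatenate([y_lab, proba[confident].argmax(axis=1)])
    X_unl = X_unl[~confident]
    print(f"round {round_}: added {confident.sum()} pseudo-labeled samples")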

Lab researchers explore ‘learn-by-calibrating’ approach to deep learning to accurately emulate scientific processes

Feb. 10, 2021 - 
An LLNL team has developed a “Learn-by-Calibrating” method for creating powerful scientific emulators that could be used as proxies for far more computationally intensive simulators. Researchers found the approach results in high-quality predictive models that are closer to real-world data and better calibrated than previous state-of-the-art methods. The LbC approach is based on interval...
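
As a rough illustration of the interval-based idea the article alludes to, the sketch below trains a generic quantile-regression emulator that outputs prediction intervals for a toy "simulator" and then checks how often those intervals actually cover held-out outputs, a simple measure of calibration. It is not the Learn-by-Calibrating algorithm itself; the toy simulator, model choices, and 90% interval target are assumptions made only for demonstration.

# Generic sketch: an emulator that predicts intervals, plus a coverage check.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)

def toy_simulator(x):
    # Hypothetical stand-in for an expensive scientific simulation.
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * rng.standard_normal(len(x))

X = rng.uniform(-1, 1, size=(3000, 2))
y = toy_simulator(X)
X_train, X_test, y_train, y_test = X[:2000], X[2000:], y[:2000], y[2000:]

# Quantile regressors bracket a nominal 90% prediction interval;
# a median model provides the point prediction.
lo = GradientBoostingRegressor(loss="quantile", alpha=0.05).fit(X_train, y_train)
hi = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X_train, y_train)
med = GradientBoostingRegressor(loss="quantile", alpha=0.50).fit(X_train, y_train)

lower, upper = lo.predict(X_test), hi.predict(X_test)
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage of nominal 90% intervals: {coverage:.2f}")
print(f"median-model RMSE: {np.sqrt(np.mean((med.predict(X_test) - y_test) ** 2)):.3f}")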

CASC research in machine learning robustness debuts at AAAI conference

Feb. 10, 2021 - 
LLNL’s Center for Applied Scientific Computing (CASC) has steadily grown its reputation in the artificial intelligence (AI)/machine learning (ML) community—a trend continued by three papers accepted at the 35th AAAI Conference on Artificial Intelligence, held virtually on February 2–9, 2021. Computer scientists Jayaraman Thiagarajan, Rushil Anirudh, Bhavya Kailkhura, and Peer-Timo Bremer led...

NeurIPS papers aim to improve understanding and robustness of machine learning algorithms

Dec. 7, 2020 - 
The 34th Conference on Neural Information Processing Systems (NeurIPS) is featuring two LLNL papers that advance the reliability of deep learning for mission-critical applications. The most prestigious machine learning conference in the world, NeurIPS began virtually on Dec. 6. The first paper describes a framework for understanding the effect of properties of training data on the...

What put LLNL at the center of U.S. supercomputing in 2020?

Nov. 12, 2020 - 
The HPC world is waiting for the next series of transitions to far larger machines with exascale capabilities. By this time next year, the twice-yearly Top500 ranking of the world's most powerful systems will be refreshed at the top as Frontier, El Capitan, Aurora, and other DOE systems come online. While LLNL was already planning around AI acceleration for its cognitive simulation aims and had a number...

AI gets a boost via LLNL, SambaNova collaboration

Oct. 20, 2020 - 
LLNL has installed a state-of-the-art artificial intelligence (AI) accelerator from SambaNova Systems, the National Nuclear Security Administration (NNSA) announced today, allowing researchers to more effectively combine AI and machine learning (ML) with complex scientific workloads. LLNL has begun integrating the new AI hardware, SambaNova Systems DataScale™, into the NNSA’s Corona...

LLNL, ANL and GSK provide early glimpse into Cerebras AI system performance

Oct. 13, 2020 - 
AI chip and systems startup Cerebras was one of many AI companies showcased at the AI Hardware Summit, which concluded last week. Cerebras invited collaborators from LLNL, Argonne National Laboratory, and GlaxoSmithKline to talk about their early work on Cerebras machines and future plans. Livermore Computing CTO Bronis de Supinski said, “We have this vision for performing cognitive...

Machine learning speeds up and enhances physics calculations

Oct. 1, 2020 - 
Interpreting data from NIF’s cutting-edge high energy density science experiments relies on physics calculations that are so complex they can challenge LLNL supercomputers, which stand among the best in the world. A collaboration between LLNL and French researchers found a novel way to incorporate machine learning and neural networks to significantly speed up inertial confinement fusion...

DL-based surrogate models outperform simulators and could hasten scientific discoveries

June 17, 2020 - 
Surrogate models supported by neural networks can perform as well, and in some ways better, than computationally expensive simulators and could lead to new insights in complicated physics problems such as inertial confinement fusion (ICF), LLNL scientists reported. Read more at LLNL News.
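
To make the surrogate idea concrete, the sketch below fits a small neural network to input/output pairs generated by a deliberately slow toy "simulator" and then uses the network to answer new queries cheaply. It is only a minimal stand-in for the deep learning surrogates described above; the toy simulator, network size, and sample counts are assumptions for illustration.

# Minimal surrogate-model sketch: learn a cheap approximation of a slow simulator.
import time
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def expensive_simulator(x):
    # Hypothetical stand-in for a costly physics code.
    time.sleep(0.001)  # pretend each evaluation is slow
    return np.sin(4 * x[0]) * np.cos(2 * x[1]) + 0.25 * x[2]

# Build a training set by running the simulator on sampled design points.
X_train = rng.uniform(-1, 1, size=(2000, 3))
y_train = np.array([expensive_simulator(x) for x in X_train])

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

# The trained surrogate now answers new queries far faster than the simulator.
X_new = rng.uniform(-1, 1, size=(5, 3))
print(surrogate.predict(X_new))
print([expensive_simulator(x) for x in X_new])  # spot-check against the simulator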

Lab team studies calibrated AI and deep learning models to more reliably diagnose and treat disease

May 29, 2020 - 
A team led by LLNL computer scientist Jay Thiagarajan has developed a new approach for improving the reliability of artificial intelligence and deep learning-based models used for critical applications, such as health care. Thiagarajan recently applied the method to study chest X-ray images of patients diagnosed with COVID-19, the disease caused by the novel SARS-CoV-2 coronavirus. Read more at LLNL...

AI identifies change in microstructure in aging materials

May 26, 2020 - 
LLNL scientists have taken a step forward in the design of future materials with improved performance by using AI to analyze their microstructure. The work recently appeared online in the journal Computational Materials Science. Read more at LLNL News.

AI hardware for future HPC systems (VIDEO)

May 20, 2020 - 
This interview with Brian Spears, who leads cognitive simulation efforts at LLNL, covers the current state of AI chip evaluation and how those chips will mesh with existing and future HPC systems. Watch on YouTube.

Deep learning may provide solution for efficient charging, driving of autonomous electric vehicles

Feb. 4, 2020 - 
LLNL computer scientists and software engineers have developed a deep learning-based strategy to maximize electric vehicle (EV) ride-sharing services while reducing carbon emissions and the impact on the electrical grid, emphasizing autonomous EVs capable of offering 24-hour service. Read more at LLNL News.

Successful simulation and visualization coupling proves the power of Sierra

Oct. 22, 2019 - 
As the first National Nuclear Security Administration (NNSA) production supercomputer backed by GPU- (graphics processing unit) accelerated architecture, Sierra’s acquisition required a fundamental shift in how scientists at Lawrence Livermore National Laboratory (LLNL) program their codes to take advantage of the GPUs. The majority of Sierra’s computational power—95 percent of its 125...

Speech generation: siblings collaborate on machine learning hackathon project

May 28, 2019 - 
The first recording that brothers Sam and Joe Eklund, along with their colleague Travis Chambers, played for the audience was an endorsement. “I endorse Travis as president of the United States of America,” the audio clip played, in a voice resembling Barack Obama’s. The second, in the same voice, was a declaration: “Ice is back, our brand new invention” (from the song “Ice Ice Baby” by...