Views from Intelligent Light
Earl Duque

AIAA AVIATION 2015 – Working with large data of today and tomorrow

[Image/animation from AIAA AVIATION 2015]

This turbine blade simulation result was awarded “Most Quantitatively Descriptive Flow Visualization Animation” in the Visualization Showcase at AIAA Aviation 2015. The achromatic colormap enhances the presentation of the numerical differences.

I attended the AVIATION 2015 meeting in Dallas last month. I had a great time meeting with colleagues, listening to great papers, and presenting my own work. The week started with my presentation in the CFD Visualization Showcase session, where I received the “Most Quantitatively Descriptive Flow Visualization Animation” award for an entry highlighting the animations and images from my paper “EPIC – An Extract Plug-In Components Toolkit for In situ Data Extracts Architecture”. The paper was presented in the “Post-Processing and Model Reduction” session.

In both the animations and the paper, I made use of FieldView’s achromatic colormaps. I’ve found that the “Achromatic Vision 1” colormap, easily selected from the new colormap selector in the Colormap tab (no more hunting around for user-defined colormaps!), does a much better job of highlighting flow features than the default Spectrum colormaps; it revealed features I hadn’t seen before. I now use Achromatic Vision 1 almost exclusively for all my visualizations.

In addition, I took part in a panel discussion, “The Path to CFD Visualization in 2030”, where we discussed our ideas on “Facing the Knowledge Extraction and Visualization Challenges of the NASA CFD 2030 Vision”. During this panel, I described how CFD analysts require the ability to simultaneously compute both very large simulations and large numbers of simulations. Code verification/validation and uncertainty quantification studies also drive the need for unsteady solutions consisting of billions of grid points and for large ensembles of non-deterministic solutions. These studies are enabled by in situ data processing, in which the solver directly outputs FieldView surface extracts, by the FieldView XDB workflow, and by the use of XDBview.

In order to extract actionable knowledge and create visualizations of these extensive datasets, my Applied Research Group is developing new capabilities for CFDers through our DOE-sponsored research with the VisIt code and the Air Force Research Lab EPISODE project (the paper I presented at AVIATION 2015). In the coming months, I will be working with the other panelists on a paper that we’ll present at SciTech 2016.

XDB files and XDBview were critical to this work.

Learn more about in situ post-processing with XDB workflows.

Earl Duque

Unraveling the Mysteries of Turbulence Transport in a Wind Farm

A joint paper with Prof. Sven Schmitz was just published in the “Wind Turbine 2015” special issue of the online journal Energies.

The paper, entitled “Unraveling the Mysteries of Turbulence Transport in a Wind Farm”, is co-authored by Pankaj K. Jha, Earl P. N. Duque, Jessica L. Bashioum, and Sven Schmitz.

For this project, we used FieldView XDB workflows to enable the investigation of “mysteries involved in the recovery process of the wake momentum deficit, downstream of utility-scale wind turbines in the atmosphere.” As the paper notes, “high-resolution surface data extracts provide new insight into the complex recovery process of the wake momentum deficit governed by turbulence transport phenomena.”

Roger Rintala

Reduce Data Three Orders of Magnitude while Retaining Full Fidelity

[Image: wind farm close-up]

Work presented at the AHS 70th Annual Forum demonstrates that extracts are invaluable for both data reduction and quantitative analysis.


In their paper, “Turbulence Transport Phenomena in the Wakes of Wind Turbines”, Jha et al. show that data reduced by three orders of magnitude still retains full fidelity, enabling quantitative analysis that was not possible before.
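To see why a reduction of three orders of magnitude is plausible, consider a rough, illustrative estimate (the numbers below are assumptions for the sake of arithmetic, not figures from the paper): a volume solution on a 100-million-cell grid carrying five double-precision variables occupies roughly 100,000,000 × 5 × 8 bytes ≈ 4 GB per time step, while a surface extract of the same solution with about one million surface nodes and the same five variables occupies roughly 1,000,000 × 5 × 8 bytes ≈ 40 MB, a factor of about 1,000, with the values on the extracted surfaces still stored at full numerical precision.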

See the movie created for this project

Review the paper


Roger Rintala

Breakthrough CFD Scalability at 64,000 cores – In Situ for Extreme Scale CFD

Intelligent Light and Georgia Tech Researchers Achieving Breakthrough CFD Scalability at 64,000 cores

Research Leading Toward Practical Extreme Scale CFD


Rutherford, NJ – June 4, 2015

Intelligent Light, in collaboration with scalable solver developers at Georgia Tech and HPC experts at Lawrence Berkeley National Laboratory, is achieving breakthrough CFD scalability running the AVF-Leslie combustion simulation code on up to 64,000 cores on supercomputers at the Department of Energy’s National Energy Research Scientific Computing Center.  The project is bringing together Intelligent Light’s expertise in computer science, software and hardware architecture, solver integration, data management, and the practical application of CFD to deliver scalable analysis methods and in situ infrastructure for extreme scale knowledge discovery.


High performance computing (HPC) is a necessity for the pursuit of high-fidelity, time-accurate simulations of sophisticated physics. HPC makes it possible to run these highly detailed simulations and deliver results in reasonable timeframes. When running simulations on thousands of cores, however, the time to write, re-read, and post-process the resulting files using traditional volume-based post-processing is impractical or impossible. When results are not reviewed, and desired simulation runs are not performed because of these limitations, the cost is wasted computing resources and lost science. In situ methods enable analysis of data at full spatiotemporal resolution while it is still resident in memory, thereby avoiding the costs associated with writing very large data files to persistent storage for subsequent, post hoc analysis.
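As a concrete illustration of the pattern (a minimal sketch with entirely hypothetical names: solver_advance, write_volume_file, and analyze_in_memory are placeholders, not calls from AVF-Leslie or any particular solver), the difference between the two workflows amounts to where the analysis step runs:

    #include <stdio.h>

    /* Hypothetical stand-ins for a real solver and its output/analysis hooks. */
    static void solver_advance(double *field, int n) { (void)field; (void)n; }

    /* Post hoc: write the full volume to disk for later re-reading. */
    static void write_volume_file(const double *field, int n, int step) {
        (void)field; (void)n;
        printf("step %d: wrote full volume file (large I/O)\n", step);
    }

    /* In situ: analyze the arrays while they are still in memory and
       emit only compact results (e.g., surface extracts). */
    static void analyze_in_memory(const double *field, int n, int step) {
        (void)field; (void)n;
        printf("step %d: wrote compact extract (small I/O)\n", step);
    }

    int main(void) {
        enum { NCELLS = 1000, NSTEPS = 10, INTERVAL = 5 };
        static double field[NCELLS];
        for (int step = 0; step < NSTEPS; ++step) {
            solver_advance(field, NCELLS);
            if (step % INTERVAL == 0) {
                /* The traditional workflow would call write_volume_file() here;
                   the in situ workflow analyzes in place instead. */
                analyze_in_memory(field, NCELLS, step);
            }
        }
        return 0;
    }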

Scalable CFD analysis with extreme scale computing


For the AVF-Leslie code, a derivative of Georgia Tech’s Leslie3D solver, breakthrough scalability is being achieved on up to 64,000 cores, and the code has been instrumented with VisIt/Libsim to enable in situ extraction of surfaces of interest. Previously, the code had been used for combustion simulations running on up to 5,000 cores. The extracted surfaces are output to compact XDB files for secondary processing using FieldView, Intelligent Light’s highly efficient post-processing tool and a mainstay of CFD analysis. XDBs retain full numerical fidelity, enable both automated report generation and interactive exploration, and can be used for archiving. Already, combustion researchers are simulating at this extreme scale and learning about phenomena that were never before possible to explore, either numerically or experimentally.
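For readers unfamiliar with Libsim, instrumentation typically follows the main-loop pattern sketched below. This is a generic skeleton using the standard Libsim V2 control interface, not AVF-Leslie’s actual integration; the simulation name, the run directory path, and the simulate_one_timestep() call are placeholders, and the data-access callbacks that expose the solver’s meshes and variables are omitted.

    #include <VisItControlInterface_V2.h>

    extern void simulate_one_timestep(void);    /* placeholder for the solver */

    void run_instrumented_solver(void)
    {
        VisItSetupEnvironment();                /* locate VisIt's runtime libraries */
        /* Write a .sim2 file that VisIt (or a batch script) uses to connect. */
        VisItInitializeSocketAndDumpSimFile("demo_sim",
            "In situ instrumentation sketch", "/path/to/rundir",
            NULL, NULL, NULL);

        for (;;) {
            /* Poll for VisIt activity without blocking the solver. */
            int state = VisItDetectInput(0 /* non-blocking */, -1);
            if (state == 1) {
                VisItAttemptToCompleteConnection();  /* a client connected */
            } else if (state == 2) {
                if (!VisItProcessEngineCommand())    /* service in situ requests */
                    VisItDisconnect();
            } else if (state == 0) {
                simulate_one_timestep();
                VisItTimeStepChanged();              /* tell VisIt the data advanced */
                VisItUpdatePlots();                  /* refresh any active plots */
            } else {
                break;                               /* error */
            }
        }
    }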

“Intelligent Light has been known for years for our post-processing and visualization technology. As the computing landscape shifts to high-performance clusters, integrating post-processing with CFD solvers presents an opportunity to create a truly scalable workflow,” said Steve M. Legensky, General Manager and Founder at Intelligent Light. “Although we have run the VisIt code on up to 98,000 cores on the LLNL BlueGene/Q systems, we are seeing the challenges that arise when integrating with a sophisticated physics code like AVF-Leslie. This project, as well as others we have executed for DOE and DoD, is helping us to understand the issues that affect both post-processing and solver code performance so that we can help our customers be successful in the HPC world.”

DOE taps Intelligent Light expertise in pursuit of extreme scale coherency and production quality software for real-world science


The Department of Energy (DOE) selected Intelligent Light as part of a team led by Lawrence Berkeley National Laboratory — a team that also includes Kitware, Georgia Tech and Argonne National Laboratory — to address the challenges of extreme-scale computing and integrating CFD solvers with in situ methods.  Software tools for integrating solvers with in situ methods at extreme scale must maintain coherency across tens to hundreds of thousands of processor cores and be production quality to produce useful scientific results.


“Today we see widening gaps between compute performance and I/O capability, and in situ analysis is a key part of the solution. As we move toward the exascale regime, we will see a three-orders-of-magnitude increase in FLOPS performance while seeing only two to three times more I/O performance,” says Wes Bethel, Senior Computer Scientist at Lawrence Berkeley National Laboratory. “Next-generation workflows must address this discrepancy while delivering ultra-scalable performance for applications, and Intelligent Light is among the organizations developing the proven, production-quality software that will be required to produce successful science from these machines.”

DOE has assembled an exclusive team to develop the next generation of methods and tools for in situ workflows to be used in a wide range of HPC-based scientific applications. Libsim is a key interface for in situ applications. Because Libsim is tightly coupled to the solver, Intelligent Light is working directly with solver codes to integrate this interface. Intelligent Light is a leading developer and maintainer of the open-source VisIt application and Libsim, both originally developed by the DOE.
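Tight coupling means the solver registers callbacks through which Libsim reads simulation metadata and data directly from the solver’s own memory. A minimal sketch of that registration follows; it is again a generic example under assumed names (the SimData struct and the mesh name "mesh" are made up for illustration), not code from AVF-Leslie.

    #include <VisItControlInterface_V2.h>
    #include <VisItDataInterface_V2.h>

    /* Hypothetical solver state a real integration would read from. */
    typedef struct { int cycle; double time; } SimData;

    /* Describe the simulation (cycle, time, available meshes) to VisIt. */
    static visit_handle get_metadata(void *cbdata)
    {
        SimData *sim = (SimData *)cbdata;
        visit_handle md = VISIT_INVALID_HANDLE;
        if (VisIt_SimulationMetaData_alloc(&md) == VISIT_OKAY) {
            VisIt_SimulationMetaData_setCycleTime(md, sim->cycle, sim->time);

            visit_handle mmd = VISIT_INVALID_HANDLE;
            if (VisIt_MeshMetaData_alloc(&mmd) == VISIT_OKAY) {
                VisIt_MeshMetaData_setName(mmd, "mesh");
                VisIt_MeshMetaData_setMeshType(mmd, VISIT_MESHTYPE_RECTILINEAR);
                VisIt_MeshMetaData_setTopologicalDimension(mmd, 3);
                VisIt_MeshMetaData_setSpatialDimension(mmd, 3);
                VisIt_SimulationMetaData_addMesh(md, mmd);
            }
        }
        return md;
    }

    /* Hand VisIt a view of the solver's own coordinate arrays. */
    static visit_handle get_mesh(int domain, const char *name, void *cbdata)
    {
        /* A real integration would allocate a mesh handle here and attach
           the solver's coordinate arrays without copying them. */
        (void)domain; (void)name; (void)cbdata;
        return VISIT_INVALID_HANDLE;
    }

    void register_callbacks(SimData *sim)
    {
        VisItSetGetMetaData(get_metadata, sim);
        VisItSetGetMesh(get_mesh, sim);
    }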

XDB – Extract database files provide essential capability for interrogation


The use of extracts permits post hoc interactive exploration using standard tools, without requiring the user to know in advance what they want to see. XDB files are 10 to 1,000 times smaller than solution files and are computationally efficient to create and save. CFD users can employ automation to analyze large volumes of data, apply next-generation techniques to identify important features across vast datasets, and retain the ability to explore solutions interactively using FieldView – the highly efficient, user-centric post-processing product that has long been a favorite tool of CFD practitioners across industry and research around the world.

Leading the way forward with HPC


In working on research and production projects at extreme scale, Intelligent Light is developing leading-edge expertise and experience in solving the challenges that arise when in situ methods are deployed in real-world applications. By understanding and solving the scalability and workflow issues at 64,000 cores and beyond, Intelligent Light is accelerating the adoption of HPC and in situ methods for all of its customers.

To support the development and deployment of in situ methods, an SC15 workshop has been organized. Research results are being presented periodically as Intelligent Light’s research and development progresses. The combustion study results were recently presented at the 27th International Conference on Parallel Computational Fluid Dynamics, and presentations on related research are planned for AIAA SciTech in January 2016.

This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research under Award Number DE-SC0012449.


About Intelligent Light

Winner of multiple IDC HPC Innovation Excellence Awards, Intelligent Light provides industry-leading software and services that unlock the power and value of a highly productive CFD workflow for engineering and research organizations in a variety of industries around the world. The company’s flagship FieldView™ product line is the most widely used CFD post-processing software for engineering and research, encompassing data management, workflow automation, visualization, and more. Intelligent Light’s expert staff provides production-related engineering services, while its Applied Research Group conducts pure research on the cutting edge of CFD science. With customer success its paramount goal, Intelligent Light is driving real-world solutions to the toughest challenges in CFD today. www.ilight.com

About Berkeley Lab

Lawrence Berkeley National Laboratory addresses the world’s most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. The Berkeley Lab Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy’s research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe. Founded in 1931, Berkeley Lab’s scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

###

Roger Rintala

No Compromise CFD with On-Demand HPC

The accessibility of HPC via cloud computing offers tremendous flexibility for CFD users with peak workload demands, as well as for organizations and consultants who do not maintain HPC systems in house.

By designing a CFD workflow that maximizes the use of HPC systems and eliminates the transfer of volume data sets, productivity gains can be tremendous. The ability to run high-resolution, time-dependent simulations and full suites of design points allows every idea to be thoroughly vetted. Intelligent Light-sponsored research used this approach to help a single researcher perform over 60 simulations and evaluate nearly 3 TB of data for the AIAA High Lift Prediction Workshop. Result files were post-processed remotely, and only compact XDB files were transferred to the user’s local workstation.

Learn how this was accomplished and see how this approach can make your CFD workflow more capable and productive.