flowtorch.data

flowtorch.data.dataloader

Definition of a common interface for all dataloaders.

This abstract base class should be used as the parent class when defining new dataloaders, e.g., to support additional file formats.

class flowtorch.data.dataloader.Dataloader[source]

Bases: ABC

Abstract base class to define a common interface for dataloaders.

abstract property field_names: Dict[str, List[str]]

Create a dictionary containing available fields.

Returns

dictionary with write times as keys and field names as values

Return type

Dict[str, List[str]]

abstract load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

abstract property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

abstract property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

abstract property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]
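
To support an additional file format, one subclasses Dataloader and implements the five abstract members listed above. The following minimal sketch is not part of flowTorch; it assumes snapshots stored as NumPy files named <time>_<field>.npy in a single folder, and the class name, file layout, and helper logic are purely illustrative.

from os import listdir
from os.path import join
from typing import Dict, List

import numpy as np
import torch as pt

from flowtorch.data.dataloader import Dataloader


class NumpyDataloader(Dataloader):
    """Hypothetical dataloader for <time>_<field>.npy snapshot files."""

    def __init__(self, path: str, dtype: str = pt.float32):
        self._path = path
        self._dtype = dtype

    @property
    def write_times(self) -> List[str]:
        # derive write times from file names like "0.1_p.npy"
        snapshots = [f for f in listdir(self._path) if "_" in f and f.endswith(".npy")]
        return sorted({f.split("_")[0] for f in snapshots}, key=float)

    @property
    def field_names(self) -> Dict[str, List[str]]:
        # dictionary with write times as keys and field names as values
        return {
            time: [
                f.split("_")[1][:-4]
                for f in listdir(self._path)
                if f.startswith(time + "_")
            ]
            for time in self.write_times
        }

    def load_snapshot(self, field_name, time):
        def load_single(name: str, t: str) -> pt.Tensor:
            return pt.tensor(
                np.load(join(self._path, f"{t}_{name}.npy")), dtype=self._dtype
            )
        if isinstance(field_name, list):
            return [self.load_snapshot(name, time) for name in field_name]
        if isinstance(time, list):
            # stack snapshots of multiple times along a trailing dimension
            return pt.stack([load_single(field_name, t) for t in time], dim=-1)
        return load_single(field_name, time)

    @property
    def vertices(self) -> pt.Tensor:
        return pt.tensor(np.load(join(self._path, "vertices.npy")), dtype=self._dtype)

    @property
    def weights(self) -> pt.Tensor:
        # fall back to uniform weights if no cell volumes are available
        return pt.ones(self.vertices.shape[0], dtype=self._dtype)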

flowtorch.data.foam_dataloader

Classes to work with OpenFOAM cases, meshes, and fields.

The FOAMDataloader class allows loading fields from an OpenFOAM simulation folder. Currently, only the ESI-OpenCFD branch of OpenFOAM is supported (v1912, v2006). The FOAMCase class assembles information about the folder and file structure of a simulation. The FOAMMesh class allows loading and parsing the finite volume mesh.

class flowtorch.data.foam_dataloader.FOAMCase(path: str, distributed: Optional[bool] = None)[source]

Bases: object

Class to access and parse OpenFOAM cases.

Most of the attributes and methods are private because they are typically accessed via a FOAMDataloader instance.

_eval_distributed() bool[source]

Check if the simulation case is distributed (parallel).

Warning

Collated output is currently not supported/not detected.

Returns

True if distributed

Return type

bool

_eval_processors() int[source]

Get number of processor folders.

Returns

number of processor folders or 1 for serial runs

Return type

int

_eval_write_times() List[str][source]

Assemble a list of all write times.

Returns

a list of all time folders

Return type

List[str]

Warning

For distributed simulations, it is assumed that all processor folders contain the same time folders.

_eval_field_names() Dict[str, List[str]][source]

Get a dictionary of all fields and files in all time folders.

Warning

For distributed cases, only processor0 is evaluated. The fields for all other processors are assumed to be the same.

Returns

dictionary with write times as keys and a list of field names for each time as values

Return type

Dict[str, List[str]]

build_file_path(field_name: str, time: str, processor: int = 0) str[source]

Create the path to a file inside the time folder of a simulation.

Parameters
  • field_name (str) – name of the field or file, e.g., “U” or “p”

  • time (str) – name of the time folder, e.g., “0.01”

  • processor (int, optional) – processor folder to load the data from; ignored in serial simulation cases; defaults to 0

Returns

path to file inside a time folder

Return type

str

Examples

>>> from flowtorch.data import FOAMCase
>>> case = FOAMCase("./cavity_binary_parallel/")
>>> case._distributed
True
>>> case._processors
4
>>> case._time_folders
['0', '0.1', '0.2', '0.3', '0.4', '0.5']
>>> case._field_names
{'0': ['U', 'p'], '0.1': ['U', 'p', 'phi'], '0.2': ['U', 'p', 'phi'], '0.3': [
    'U', 'p', 'phi'], '0.4': ['U', 'p', 'phi'], '0.5': ['U', 'p', 'phi']}
>>> case.build_file_path("U", "0.1", 1)
'./cavity_binary_parallel/processor1/0.1/U'
class flowtorch.data.foam_dataloader.FOAMDataloader(path: str, dtype: str = torch.float32, distributed: Optional[bool] = None)[source]

Bases: Dataloader

Load internal fields and mesh properties of OpenFOAM cases.

The project ofpp by Xu Xianghua has been a great help in implementing some of the methods.

Examples

>>> from flowtorch import DATASETS
>>> from flowtorch.data import FOAMDataloader
>>> path = DATASETS["of_cavity_ascii_parallel"]
>>> loader = FOAMDataloader(path)
>>> loader.write_times
['0', '0.1', '0.2', '0.3', '0.4', '0.5']
>>> loader.field_names
{'0': ['p', 'U'], '0.1': ['p', 'phi', 'U'], '0.2': ['p', 'phi', 'U'], '0.3': [
    'p', 'phi', 'U'], '0.4': ['p', 'phi', 'U'], '0.5': ['p', 'phi', 'U']}
>>> vertices = loader.vertices
>>> vertices[:3]
tensor([[0.0025, 0.0025, 0.0050],
        [0.0075, 0.0025, 0.0050],
        [0.0125, 0.0025, 0.0050]])
>>> loader.weights[:3]  # cell volumes
tensor([2.5000e-07, 2.5000e-07, 2.5000e-07])
>>> p = loader.load_snapshot("p", "0.5")
>>> p.shape
torch.Size([400])
>>> p[:3]
tensor([ 4.2993e-06, -5.8226e-03, -1.2960e-02])
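
As in the CSVDataloader and PSPDataloader examples further below, passing a list of write times stacks the snapshots along a trailing dimension, which is convenient for building data matrices. The following continuation of the example above is a sketch; the shape follows from the 400 cells and the five non-zero write times.

>>> times = loader.write_times[1:]
>>> data_matrix = loader.load_snapshot("p", times)
>>> data_matrix.shape
torch.Size([400, 5])
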
property field_names: Dict[str, List[str]]

Access to the available field names for all available write times via FOAMCase._eval_field_names().

Getter

returns names of available fields

Type

Dict[str, List[str]]

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property vertices: Tensor

In OpenFOAM, fields for post-processing are defined at the control volume centers (vol<Type>Fields). Therefore, the vertices property provides access to the cell center locations via FOAMMesh.

Getter

returns control volume centers

Type

pt.Tensor

property weights: Tensor

For results obtained using a finite volume method with a co-located arrangement (OpenFOAM), a sensible weight for a cell-centered value is the cell volume. The cell volumes are available via the FOAMMesh class.

Getter

returns cell volumes

Type

pt.Tensor

property write_times: List[str]

Access to available snapshot/write times via FOAMCase._eval_write_times().

Getter

returns the available write times

Type

List[str]

class flowtorch.data.foam_dataloader.FOAMMesh(case: FOAMCase, dtype: str = torch.float32)[source]

Bases: object

Class to load and process OpenFOAM meshes.

OpenFOAM stores the finite volume mesh as a collection of several files located in constant/polyMesh or in processorXX/constant/polyMesh for serial and distributed cases, respectively. Even though OpenFOAM uses a cell-centered finite volume method, the cell centers and volumes are not stored explicitly. Instead, a so-called face-addressing storage is used. All internal faces have an owner cell and a neighbor cell. Boundary faces only have an owner cell. The mesh attributes are defined in several files:

  • points: list of vertices forming cell faces; the list index of a point is used as label

  • faces: list of all cell faces defined as point labels

  • owner: list of cell labels that are face owners

  • neighbour: list of cell labels that are face neighbors; note the British English spelling of the file name

  • boundary: definition of faces belonging to a patch

Examples

>>> from flowtorch.data import FOAMMesh
>>> mesh = FOAMMesh.from_path("./")
>>> centers = mesh.get_cell_centers()
>>> centers.size()
torch.Size([400, 3])
>>> centers[:2]
tensor([[0.0025, 0.0025, 0.0050],
        [0.0075, 0.0025, 0.0050]])
>>> volumes = mesh.get_cell_volumes()
>>> volumes.size()
torch.Size([400])
>>> volumes[:2]
tensor([2.5000e-07, 2.5000e-07])

Warning

Dynamically changing meshes are currently not supported.

Warning

Distributed meshes may be parsed and concatenated, but the cell centers and volumes won’t have the same ordering as when computed from a reconstructed mesh.

_compute_face_centers_and_areas(points: Tensor, faces: Tensor, n_points_faces: Tensor) Tuple[Tensor, Tensor][source]

Compute face centers and areas.

The implemented algorithm is close to the one in makeFaceCentresAndAreas. The main steps are:

  1. compute an estimate of the face center by averaging all face vertices

  2. decompose the face into triangles

  3. compute the sum over all area-weighted triangle centroids, triangle areas, and face area normal vectors

  4. compute the face centroid and face area normal from the (weighted) sums

Parameters
  • points (pt.Tensor) – list of points

  • faces (pt.Tensor) – list of point labels forming each face

  • n_points_faces (pt.Tensor) – number of points forming each face

Returns

tuple of two tensors; the first one holds the face centers and the second one holds the face areas

Return type

Tuple[pt.Tensor, pt.Tensor]
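
For illustration, steps 1-4 can be sketched for a single polygonal face as follows. The stand-alone helper below is not the internal implementation (which operates on the full face list at once); it merely mirrors the algorithm described above.

import torch as pt

def face_center_and_area(points: pt.Tensor):
    """Centroid and area normal of one planar polygon (points: (n, 3), ordered)."""
    center_est = points.mean(dim=0)                   # step 1: center estimate
    p_next = points.roll(-1, dims=0)                  # step 2: triangle fan
    normals = 0.5 * pt.cross(points - center_est, p_next - center_est, dim=1)
    areas = normals.norm(dim=1)                       # triangle areas
    centroids = (center_est + points + p_next) / 3.0  # triangle centroids
    face_center = (areas.unsqueeze(-1) * centroids).sum(dim=0) / areas.sum()  # step 4
    face_normal = normals.sum(dim=0)                  # face area normal vector
    return face_center, face_normal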

_compute_cell_centers_and_volumes(mesh_path: str) Tuple[Tensor, Tensor][source]

Compute the cell centers and volumes of an OpenFOAM mesh.

The implemented algorithm is close to the one in makeCellCentresAndVols. The following steps are involved:

  1. compute an estimate of the cell center as the average over all face centers

  2. compute centroids and volumes of all pyramids formed by the cell faces and the center estimate

  3. the cell volume equals the sum over all pyramid volumes

  4. the cell center is the volume-weighted average of all pyramid centroids

Parameters

mesh_path (str) – mesh location

Returns

tuple of two tensors; the first one holds the cell centers and the second one holds the cell volumes

Return type

Tuple[pt.Tensor, pt.Tensor]
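
Analogously, the pyramid decomposition for a single cell can be sketched as follows, assuming the face centers and outward-pointing face area normals of that cell are already available (again a simplified stand-alone helper, not the internal implementation):

import torch as pt

def cell_center_and_volume(face_centers: pt.Tensor, face_normals: pt.Tensor):
    """Centroid and volume of one cell; both inputs have shape (n_faces, 3)."""
    center_est = face_centers.mean(dim=0)             # step 1: center estimate
    # step 2: pyramid volumes and centroids (apex at the center estimate)
    volumes = (face_normals * (face_centers - center_est)).sum(dim=1) / 3.0
    centroids = 0.75 * face_centers + 0.25 * center_est
    volume = volumes.sum()                            # step 3: cell volume
    center = (volumes.unsqueeze(-1) * centroids).sum(dim=0) / volume  # step 4
    return center, volume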

classmethod from_path(path: str, dtype: str = torch.float32)[source]

Create a new instance based on the simulation’s path.

Parameters
  • path (str) – path to the OpenFOAM simulation

  • dtype (str, optional) – tensor type, defaults to DEFAULT_DTYPE

Returns

FOAMMesh instance

Return type

FOAMMesh

get_cell_centers() Tensor[source]

Return the control volume centers, computing them first if necessary.

Returns

control volume centers

Return type

pt.Tensor

get_cell_volumes() Tensor[source]

Return the cell volumes, computing them first if necessary.

Returns

cell volumes

Return type

pt.Tensor

flowtorch.data.hdf5_file

Module to read and write the internal flowTorch data format.

The HDF5Writer class allows writing field and mesh data into an HDF5 file. It also creates an XDMF accessor file for postprocessing in ParaView. The XDMF file creation is implemented in the XDMFWriter class. The HDF5Dataloader class is the complementary reader for the flowTorch data format. Moreover, the FOAM2HDF5 class allows conversion of reconstructed OpenFOAM simulation cases into the HDF5-based flowTorch format.

class flowtorch.data.hdf5_file.FOAM2HDF5(path: str, dtype=torch.float32)[source]

Bases: object

Convert reconstructed OpenFOAM cases to flowTorch HDF5 format.

If the simulation case is decomposed into processor folders/domains, an info statement is displayed and no conversion is performed.

Examples

>>> from flowtorch import DATASETS
>>> from flowtorch.data import FOAM2HDF5
>>> converter = FOAM2HDF5(DATASETS["of_cavity_ascii"])
>>> converter.convert("cavity.hdf5", ["U", "p"], ["0.1", "0.2", "0.3"])
convert(filename: str, fields: Optional[List[str]] = None, times: Optional[List[str]] = None)[source]

Convert OpenFOAM case to flowTorch HDF5 file.

Parameters
  • filename (str) – name of the HDF5 file

  • fields (List[str], optional) – list of fields to convert; if None, all available fields are converted

  • times (List[str], optional) – list of times to convert; if None, all available times are converted

class flowtorch.data.hdf5_file.HDF5Dataloader(file_path: str, dtype: str = torch.float32)[source]

Bases: Dataloader

Load HDF5-based flowTorch data.

Examples

>>> from flowtorch import HDF5Dataloader
>>> loader = HDF5Dataloader("flowtorch.hdf5")
>>> times = loader.write_times
>>> p, U = loader.load_snapshot(["p", "U"], times)
>>> vertices = loader.vertices
property connectivity: Tensor
property edge_vertices: Tensor
property field_names: Dict[str, List[str]]

Create a dictionary containing available fields.

Returns

dictionary with write times as keys and field names as values

Return type

Dict[str, List[str]]

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]

class flowtorch.data.hdf5_file.HDF5Writer(file: str)[source]

Bases: object

Class to write flowTorch data to HDF5 file.

Two types of data are supported:

  • variable: (field) data that changes with time, e.g., snapshots

  • constant: constant data like mesh vertices or cell volumes

An XDMF accessor file can be created to support visual post-processing with ParaView and other XDMF-compatible software packages.

Examples

>>> import torch as pt
>>> from flowtorch.data import HDF5Writer
>>> writer = HDF5Writer("test_file.hdf5")
>>> writer.write("ones", (3, 2), pt.ones((3, 2)), "0.01")
>>> int_data = pt.ones((3, 2), dtype=pt.int32)
>>> writer.write("ones_int", (3, 2), int_data, dtype=pt.int32)
write(name: str, size: tuple, data: Optional[Tensor] = None, time: Optional[str] = None, dtype: str = torch.float32)[source]

Write data to HDF5 file.

Parameters
  • name (str) – dataset name

  • size (tuple) – dataset shape

  • data (pt.Tensor, optional) – data to write; if None, dataset is only allocated

  • time (str, optional) – snapshot time; if provided, the dataset is created in VAR_GROUP

  • dtype (str, optional) – data type, defaults to pt.float32

write_xdmf()[source]

Write XDMF wrapper to access flowTorch HDF5 files in ParaView.
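
Continuing the HDF5Writer example above, the accessor file is created with a parameter-free call; note that this sketch assumes the required mesh datasets have already been written to the file.

>>> writer.write_xdmf()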

class flowtorch.data.hdf5_file.XDMFWriter(file_path: str, hdf5_file: File)[source]

Bases: object

Create XDMF file to open flowTorch HDF5 files in ParaView.

Example

>>> from flowtorch.data import XDMFWriter
>>> writer = XDMFWriter("flowtorch.hdf5")
>>> writer.create_xdmf("flowtorch.xdmf")
create_xdmf(filename: Optional[str] = None)[source]

Create XDMF wrapper to access flowTorch HDF5 file in ParaView.

Parameters

filename (str, optional) – name of the XDMF file, defaults to None

classmethod from_filepath(file_path: str)[source]

Create XDMFWriter from file path.

Parameters

file_path (str) – path to HDF5 file

flowtorch.data.hdf5_file.copy_hdf5_mesh(path: str, from_file: str, to_file: str)[source]

Create a copy of a flowTorch HDF5 file containing only the mesh.

Sometimes, it is helpful to create a new copy of an existing HDF5 file that contains only the mesh, e.g., to create a separate file for POD or DMD modes.

Parameters
  • path (str) – location of the flowTorch HDF5 file, excluding the file name

  • from_file (str) – name of the file from which to copy the mesh

  • to_file (str) – name of the file to which to copy the mesh
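
A brief usage sketch with hypothetical file names:

>>> from flowtorch.data.hdf5_file import copy_hdf5_mesh
>>> copy_hdf5_mesh("./run", "flow_data.hdf5", "dmd_modes.hdf5")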

flowtorch.data.psp_dataloader

Module to load FOR2895 iPSP data.

The PSPDataloader class allows loading instationary pressure-sensitive paint (iPSP) data provided by DLR (Deutsches Luft- und Raumfahrtzentrum) within the FOR2895 research group.

class flowtorch.data.psp_dataloader.PSPDataloader(path: str, dtype: str = torch.float32)[source]

Bases: Dataloader

Load iPSP data and metadata.

iPSP data comes as an HDF5 file with datasets organized in different zones. Each zone has additional metadata related to the flow conditions and camera settings. The active zone may be switched by setting the zone attribute.

Examples

>>> from flowtorch import PSPDataloader
>>> loader = PSPDataloader("0226.hdf5")
>>> loader.zone_names
['Zone0000', 'Zone0001']
>>> loader.info.keys()
['AngleAttackAlpha', 'DateOfRecording', 'Mach', ...]
>>> loader.info["Mach"]
(0.903, 'Mach number')
>>> loader.zone_info.keys()
['ExposureTime', 'NumberImages', 'PSPDeviceName', 'SamplingFrequency', 'ZoneName']
>>> loader.zone
Zone0000
>>> loader.zone = "Zone0001"
>>> loader.zone_info["ZoneName"]
HTP
>>> loader.mask_names
['Mask1', 'Mask2']
>>> loader.mask = "Mask2"
>>> cp = loader.load_snapshot("Cp", loader.write_times[:10])
>>> cp.shape
torch.Size([250, 75, 10])
property field_names: Dict[str, List[str]]

Create a dictionary containing available fields.

Returns

dictionary with write times as keys and field names as values

Return type

Dict[str, List[str]]

property info: Dict[str, tuple]

Get iPSP metadata valid for the entire file.

Returns

dictionary of metadata values and descriptions

Return type

Dict[str, tuple]

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property mask: str

Name of the currently active mask.

Returns

name of the activated mask

Return type

str

property mask_names: List[str]

Find available binary masks in the HDF5 file.

Returns

list of mask names

Return type

List[str]

property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]

property zone: str

Get the currently selected zone.

Returns

currently selected zone

Return type

str

property zone_info: Dict[str, tuple]

Get iPSP metadata for the currently selected zone.

Returns

zone metadata

Return type

Dict[str, tuple]

property zone_names: List[str]

Find the zone names available in the HDF5 file.

Raises

ValueError – if no valid zones are found

Returns

list of zone names

Return type

List[str]

flowtorch.data.vtk_dataloader

Class and tools to read Visualization Toolkit (VTK) data.

class flowtorch.data.vtk_dataloader.VTKDataloader(path: str, vtk_reader: Union[vtkUnstructuredGridReader, vtkXMLUnstructuredGridReader], prefix: str = '', suffix: str = '', dtype: str = torch.float32)[source]

Bases: Dataloader

Load unstructured VTK files and time series.

The loader assumes that snapshots are stored in individual VTK files. Currently, only unstructured mesh data are supported.

Examples

>>> from flowtorch import DATASETS
>>> from flowtorch.data import VTKDataloader
>>> path = DATASETS["vtk_cylinder_re200_flexi"]
>>> loader = VTKDataloader.from_flexi(path, "Cylinder_Re200_Solution_")
>>> loader.write_times
["0000000", "0000005", "0000300"]
>>> loader.field_names
{'0000000': ['Density', 'MomentumX', 'MomentumY', 'MomentumZ']}
>>> density = loader.load_snapshot("Density", loader.write_times)
>>> density.shape
torch.Size([729000, 3])
>>> from flowtorch import DATASETS
>>> from flowtorch.data import VTKDataloader
>>> path = DATASETS["vtk_su2_airfoil_2D"]
>>> loader = VTKDataloader.from_su2(path, "flow_")
>>> p, U = loader.load_snapshot(["Pressure", "Velocity"], loader.write_times[0])
>>> U.shape
torch.Size([214403, 3])
property field_names: Dict[str, List[str]]

Create a dictionary containing available fields.

Returns

dictionary with write times as keys and field names as values

Return type

Dict[str, List[str]]

classmethod from_flexi(path: str, prefix: str = '', suffix: str = '.000000000.vtu', dtype: str = torch.float32)[source]

Create loader instance from VTK files generated by Flexi.

Flexi supports the output of field and surface data as unstructured XML-based VTK files.

Parameters
  • path (str) – path to folder containing VTK files

  • prefix (str, optional) – part of file name before time value, defaults to “”

  • suffix (str, optional) – part of file name after time value, defaults to “.000000000.vtu”

  • dtype (str, optional) – tensor type, defaults to DEFAULT_DTYPE

classmethod from_su2(path: str, prefix: str = '', suffix: str = '.vtk', dtype: str = torch.float32)[source]

Create loader instance from VTK files generated by SU2.

Parameters
  • path (str) – path to folder containing VTK files

  • prefix (str, optional) – part of file name before time value, defaults to “”

  • suffix (str, optional) – part of file name after time value, defaults to “.vtk”

  • dtype (str, optional) – tensor type, defaults to DEFAULT_DTYPE

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]

flowtorch.data.csv_dataloader

Dataloader and accompanying tools to work with CSV files.

A lot of scientific data is exchanged as comma separated value (CSV) files. While there are many Python packages available to read such data, one has to understand how the data is organized in the CSV file before being able to use the readers properly. Moreover, time series data sometimes come as individual files in a single folder or as time folders with the respective snapshot data inside that folder. This subpackage simplifies access to common CSV-based time series data by trying to figure out appropriate reader settings automatically.

class flowtorch.data.csv_dataloader.CSVDataloader(path: str, prefix: str, suffix: str, read_options: dict, time_folders: bool, dtype: str = torch.float32)[source]

Bases: Dataloader

Load CSV files from different sources.

This class allows loading generic CSV files based on the Pandas read_csv function. Multiple specific formats are supported via class methods.

Examples

>>> from flowtorch import DATASETS
>>> from flowtorch.data import CSVDataloader
>>> davis_data = DATASETS["csv_aoa8_beta0_xc100_stereopiv"]
>>> loader = CSVDataloader.from_davis(davis_data, "B")
>>> times = loader.write_times
>>> times[:5]
['00001', '00002', '00003', '00004', '00005']
>>> loader.field_names
{'00001': ['Vx', 'Vy', 'Vz']}
>>> Vx, Vy, Vz = loader.load_snapshot(['Vx', 'Vy', 'Vz'], times[:5])
>>> Vx.shape
torch.Size([3741, 5])
>>> foam_data = DATASETS["csv_naca0012_alpha4_surface"]
>>> loader = CSVDataloader.from_foam_surface(foam_data, "total(p)_coeff_airfoil.raw")
>>> times = loader.write_times
>>> times[:5]
['0.001', '0.002', '0.003', '0.004', '0.005']
>>> loader.field_names
{'0.001': ['total(p)_coeff']}
>>> snapshots = loader.load_snapshot("total(p)_coeff", times[:10])
>>> snapshots.shape
torch.Size([28892, 10])
>>> vertices = loader.vertices
>>> vertices.shape
torch.Size([28892, 3])
>>> vertices[0, :]
tensor([0.0000e+00, 0.0000e+00, 4.1706e-18])
property field_names: Dict[str, List[str]]

Create a dictionary containing available fields.

Returns

dictionary with write times as keys and field names as values

Return type

Dict[str, List[str]]

classmethod from_davis(path: str, prefix: str = '', suffix: str = '.dat', dtype: str = torch.float32)[source]

Create CSVDataloader instance for DaVis output files.

Parameters
  • path (str) – path to location of time series data

  • prefix (str) – part of the file name before the time/snapshot number; e.g., if the file name is B00001.dat, the prefix is B; defaults to an empty string

  • suffix (str) – part of the file name after the time/snapshot number; e.g., if the file name is B00001.dat, the suffix is .dat; defaults to .dat

  • dtype (str) – floating point precision; defaults to pt.float32 (single precision)

classmethod from_foam_surface(path: str, file_name: str, header: int = 1, dtype: str = torch.float32)[source]

Create CSVDataloader instance to load OpenFOAM surface sample data.

The class method simplifies loading data generated by OpenFOAM's sampling function object with type set to surfaces and surfaceFormat set to raw. The time series data are stored in individual time folders; the file name is the same in each time folder.

Parameters
  • path (str) – path to location of time folders

  • file_name (str) – file name of individual CSV files, e.g., p_airfoil.raw

  • header (int, optional) – line number in which to find the column names (starting from 0); defaults to 1

  • dtype (str) – floating point precision; defaults to pt.float32 (single precision)

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]

flowtorch.data.tau_dataloader

Direct access to TAU simulation data.

The DLR (Deutsches Luft- und Raumfahrtzentrum) TAU code saves snapshots in the NetCDF format. The TAUDataloader is a wrapper around the NetCDF Python bindings that simplifies access to snapshot data.

class flowtorch.data.tau_dataloader.TAUConfig(file_path: str)[source]

Bases: object

Load and parse TAU parameter files.

The class does not parse the full content of the parameter file but only content that is absolutely needed to load snapshot data.

property config: dict
property path: str
class flowtorch.data.tau_dataloader.TAUDataloader(parameter_file: str, distributed: bool = False, dtype: str = torch.float32)[source]

Bases: Dataloader

Load TAU simulation data.

The loader is currently limited to reading:

  • internal field solutions, serial/reconstructed and distributed

  • mesh vertices, serial and distributed

  • cell volumes, serial (if present) and distributed

Examples

>>> from os.path import join
>>> from flowtorch import DATASETS
>>> from flowtorch.data import TAUDataloader
>>> path = DATASETS["tau_backward_facing_step"]
>>> loader = TAUDataloader(join(path, "simulation.para"))
>>> times = loader.write_times
>>> fields = loader.field_names[times[0]]
>>> fields
['density', 'x_velocity', 'y_velocity', ...]
>>> density = loader.load_snapshot("density", times)

To load distributed simulation data, set distributed=True:

>>> path = DATASETS["tau_cylinder_2D"]
>>> loader = TAUDataloader(join(path, "simulation.para"), distributed=True)
>>> vertices = loader.vertices

property field_names: Dict[str, List[str]]

Find available fields in solution files.

Available fields are determined by matching the number of weights with the length of datasets in the available solution files; for distributed cases, the fields are only determined based on domain_0.

Returns

dictionary with time as key and list of available solution fields as value

Return type

Dict[str, List[str]]

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load one or more snapshots of one or more fields.

Parameters
  • field_name (Union[List[str], str]) – name of the field to load

  • time (Union[List[str], str]) – snapshot time

Returns

field values

Return type

Union[List[Tensor], Tensor]

property vertices: Tensor

Get the vertices at which field values are defined.

Returns

coordinates of vertices

Return type

Tensor

property weights: Tensor

Get the weights for field values.

In a standard finite volume method, the weights are the cell volumes. For other methods, the definition of the weight is described in the Dataloader implementation.

Returns

weight for field values

Return type

Tensor

property write_times: List[str]

Available write times.

Returns

list of available write times

Return type

List[str]

flowtorch.data.tecplot_dataloader

Read Tecplot data via the ParaView module.

class flowtorch.data.tecplot_dataloader.TecplotDataloader(path: str, file_names: Dict[str, str], reader: paraview.simple.CreateObject, dtype: str = torch.float32)[source]

Bases: Dataloader

Dataloader for Tecplot binary format.

The dataloader wraps around the VisItTecplotBinaryReader available in the ParaView Python module. One level of blocks/zones is expected under the root block/zone.

Examples

>>> from flowtorch import DATASETS
>>> from flowtorch.data import TecplotDataloader
>>> path = DATASETS["plt_naca2409_surface"]
>>> loader = TecplotDataloader.from_tau(path, "alfa16.surface.pval.unsteady_")
>>> loader.zone_names
["ls", "te", "us"]
>>> loader.zone
"le"
>>> loader.zone = "us"
>>> times = loader.write_times
>>> density = loader.load_snapshot("density", times)
>>> density.shape
torch.Size([300, 3])
property field_names: Dict[str, List[str]]

List of available field names.

The field names are only determined once for the first available snapshot time.

Returns

available fields at first write time

Return type

Dict[str, List[str]]

classmethod from_tau(path: str, base_name: str = '', suffix: str = '.plt', dtype: str = torch.float32)[source]

Construct TecplotDataloader from TAU snapshots.

Parameters
  • path (str) – path to snapshot location

  • base_name (str, optional) – common basename of all snapshots, defaults to “”

  • suffix (str, optional) – snapshot file suffix, defaults to “.plt”

  • dtype (str, optional) – tensor data type, defaults to DEFAULT_DTYPE

Raises

FileNotFoundError – if no snapshots are found

Returns

Tecplot dataloader object

Return type

TecplotDataloader

load_snapshot(field_name: Union[List[str], str], time: Union[List[str], str]) Union[List[Tensor], Tensor][source]

Load snapshots of single or multiple fields and write times.

Parameters
  • field_name (Union[List[str], str]) – single field name or list of field names

  • time (Union[List[str], str]) – single write time or list of write times

Returns

snapshot(s) of one or multiple fields

Return type

Union[List[pt.Tensor], pt.Tensor]

property vertices: Tensor

Points at which field values are defined.

Returns

list of points

Return type

pt.Tensor

property weights: Tensor

Weights for POD/DMD analysis.

This property currently returns a tensor of ones, since cell areas/volumes are not accessible via the reader.

Returns

tensor of ones

Return type

pt.Tensor

property write_times: List[str]

Available snapshot write times.

Returns

list of available write times

Return type

List[str]

property zone: str

Currently selected block/zone.

Returns

block/zone name

Return type

str

property zone_names: List[str]

Names of available blocks/zones.

Returns

block/zone names

Return type

List[str]

flowtorch.data.selection_tools

Helper tools for building data matrices.

flowtorch.data.selection_tools.mask_box(vertices: Tensor, lower: List[float], upper: List[float]) Tensor[source]

Create a boolean mask to select all vertices in a box.

This function may be used in conjunction with torch.masked_select to select all field values in a box, e.g., when building data matrices.

Parameters
  • vertices (pt.Tensor) – tensor of vertices, where each column corresponds to a coordinate

  • lower (List[float]) – lower bounds of box; one value for each coordinate must be given

  • upper (List[float]) – upper bounds of box; one value for each coordinate must be given

Returns

boolean mask that’s True for every vertex inside the box

Return type

pt.Tensor

flowtorch.data.selection_tools.mask_sphere(vertices: Tensor, center: List[float], radius: float) Tensor[source]

Create a boolean mask to select all vertices in a sphere.

This function may be used in conjunction with torch.masked_select to select all field values within a sphere, e.g., when building data matrices.

Parameters
  • vertices (pt.Tensor) – tensor of vertices, where each column corresponds to a coordinate

  • center (List[float]) – the sphere’s center

  • radius (float) – the sphere’s radius

Returns

boolean mask that’s True for every vertex inside the sphere

Return type

pt.Tensor
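
A minimal usage sketch combining both masks with torch.masked_select; the point cloud and field below are random placeholders:

>>> import torch as pt
>>> from flowtorch.data.selection_tools import mask_box, mask_sphere
>>> vertices = pt.rand((1000, 3))
>>> field = pt.rand(1000)
>>> in_box = mask_box(vertices, lower=[0.0, 0.0, 0.0], upper=[0.5, 0.5, 1.0])
>>> box_values = pt.masked_select(field, in_box)
>>> in_sphere = mask_sphere(vertices, center=[0.5, 0.5, 0.5], radius=0.25)
>>> sphere_values = pt.masked_select(field, in_sphere)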

flowtorch.data.utils

Collection of utilities related to data and dataloaders.

flowtorch.data.utils.check_and_standardize_path(path: str, folder: bool = True)[source]

Check if path exists and remove trailing slash if present.

Parameters
  • path (str) – path to folder or file

  • folder (bool) – True if path points to folder; False if path points to file

Returns

standardized path to file or folder

Return type

str

flowtorch.data.utils.check_list_or_str(arg_value: Union[List[str], str], arg_name: str)[source]

Check if argument is of type list or string.

If the input is a list, an additional check is performed to ensure that the list has at least one entry and that all entries are strings.

Parameters
  • arg_value (Union[List[str], str]) – object to perform the check on

  • arg_name (str) – argument name used to provide an informative error message

flowtorch.data.utils.format_byte_size(size: int) Tuple[float, str][source]

Convert number of bytes into human-readable format.

The function is based on this Stackoverflow question: https://stackoverflow.com/questions/12523586/python-format-size-application-converting-b-to-kb-mb-gb-tb

Parameters

size (int) – size in bytes

Returns

converted size and corresponding unit

Return type

Tuple[float, str]
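
A short usage sketch; the returned unit string and rounding depend on the implementation, so no output is shown:

>>> from flowtorch.data.utils import format_byte_size
>>> size, unit = format_byte_size(2 * 1024**2)
>>> message = f"{size:.1f} {unit}"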