How can I write multiple fields containers to .h5 file?

Mike.Thompson
edited January 23 in Structures

If I have multiple fields containers and I want to write them all to a single .h5 file, how can I do this?

The serialization.serialize_to_hdf5 operator is deprecated,
and I am not sure how to use this operator:
serialization.hdf5dpf_generate_result_file

Answers

  • Pierre Thieffry
    edited January 23

    Hi @Mike.Thompson, you can try this:

    import time
    from ansys.dpf import core as dpf

    dpf.set_default_server_context(dpf.AvailableServerContexts.premium)

    filename = r"your_result_file.rst"  # whichever result file you want to convert
    dataSource = dpf.DataSources()
    dataSource.set_result_file_path(filename)

    model = dpf.Model(dataSource)

    # Get the results to export
    resultInfoProvider = dpf.operators.metadata.result_info_provider()
    resultInfoProvider.inputs.data_sources.connect(dataSource)
    result_info = resultInfoProvider.outputs.result_info()

    time_supp = dpf.operators.metadata.time_freq_provider(data_sources=dataSource).outputs.time_freq_support()

    migrate_op = dpf.operators.serialization.hdf5dpf_generate_result_file()  # operator instantiation
    migrate_op.inputs.h5_native_compression.connect(1)  # gzip compression
    migrate_op.inputs.export_floats.connect(False)  # keep doubles as doubles
    migrate_op.inputs.filename.connect(filename + "_converted.h5")
    migrate_op.inputs.time_freq_support_out.connect(time_supp)
    migrate_op.connect(1, model.metadata.meshed_region)  # pin 1: mesh to embed in the file

    # Each exported result uses a pair of pins starting at pin 4:
    # an identifier string at pin idx and its fields container at pin idx + 1.
    idx = 4
    fc = {}
    for res in result_info.available_results:
        print(res.operator_name)
        op_res = dpf.Operator(res.operator_name)
        op_res.inputs.data_sources.connect(dataSource)
        fc[idx] = op_res.outputs.fields_container()
        migrate_op.connect(idx, res.operator_name)
        migrate_op.connect(idx + 1, fc[idx])
        idx += 2

    tic = time.perf_counter()
    migrate_op.run()
    toc = time.perf_counter()
    print(f"Migration Time (s) : {toc - tic:0.4f}")
    
  • Mike.Thompson

    @Pierre Thieffry, thanks.

    When I run the code below, I see only one field in the .h5 file's structure. In the code I create two distinct fields, but they contain exactly the same information. It seems DPF identifies them as identical and represents them in the file as a single field. Is this expected?

    This leads to the question of getting the data back into its original form from an .h5 file. I was wondering what would happen, since F1 is added to the fields container twice.

    How can I get back to the same Python data structure, where the fields container contains three entries but the first two point to the same field by reference (a change to FC[1] would also modify FC[0])?

    from ansys.dpf import core as dpf

    # Make a field of data
    F1 = dpf.fields_factory.Field(nentities=1, nature=dpf.natures.scalar)
    scoping = dpf.scoping.Scoping(ids=[1], location=dpf.locations.zone)
    F1.scoping = scoping
    F1.unit = 'mm'
    F1.data = [1]

    # Make another field of data. This is identical to the first field,
    # but still a unique entity.
    F2 = dpf.fields_factory.Field(nentities=1, nature=dpf.natures.scalar)
    F2.scoping = scoping
    F2.unit = 'mm'
    F2.data = [1]

    # Add them to a fields container.
    # Add F1 twice at different times.
    FC = dpf.fields_container.FieldsContainer()
    FC.add_label('time')
    FC.add_field({'time': 1}, F1)
    FC.add_field({'time': 2}, F1)
    FC.add_field({'time': 3}, F2)

    # Write the .h5 file
    pin = 4
    migrate_op = dpf.operators.serialization.hdf5dpf_generate_result_file()  # operator instantiation
    migrate_op.inputs.filename.connect(r'C:\Users\mthompso\MyData\DeleteThis\DPF for Mission Loading\SameVals.h5')
    migrate_op.connect(pin, "Something")  # pin 4: result identifier string
    migrate_op.connect(pin + 1, FC)       # pin 5: the fields container to export
    my_data_sources = migrate_op.outputs.data_sources()  # requesting the output runs the operator
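
    On the read-back side, here is a minimal sketch, assuming the serialization.hdf5dpf_custom_read operator is available in your DPF version (the result_name input below is an assumption to check against the operator documentation for your install):

    from ansys.dpf import core as dpf

    # Point a data source at the .h5 file written above
    ds = dpf.DataSources()
    ds.set_result_file_path(r'C:\Users\mthompso\MyData\DeleteThis\DPF for Mission Loading\SameVals.h5')

    read_op = dpf.operators.serialization.hdf5dpf_custom_read()
    read_op.inputs.data_sources.connect(ds)
    read_op.inputs.result_name.connect("Something")  # the string connected at pin 4 when writing
    fc_back = read_op.get_output(0, dpf.types.fields_container)
    print(fc_back)

    Whether the first two entries come back as references to one shared field, as in the original container, is something I have not verified.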
    
    
  • Ramdane

    @Mike.Thompson, yes, DPF identifies that a given entity (field, scoping, ...) has already been saved and avoids saving it again. That being said, your point is valid. Could you please file a defect on this? I will make sure it gets tracked.