PetscSF (Star Forest) - Low-level Interface
The PetscSF (Star Forest) component provides efficient parallel communication patterns for distributed data structures. A star forest is a specialized graph structure optimized for scatter/gather operations in parallel computing.
Overview
PetscSF enables:
- Point-to-point communication: Efficient MPI communication patterns
- Scatter/gather operations: Move data between processors
- Halo exchange: Update ghost/boundary values
- Reduction operations: Parallel sums, max, min across shared data
- Irregular communication: Handle non-uniform data distributions
A star forest consists of:
- Roots: Data owned locally
- Leaves: Data needed from remote processes (or local)
- Communication pattern: Which leaves come from which roots
PetscSF is the underlying communication layer for DM ghost point updates and other parallel operations.
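To make the roots/leaves terminology concrete, here is a sketch of how a two-rank ghost exchange could be described; the variable names are illustrative and assume the setup shown in Basic Usage below:

```julia
# Hypothetical two-rank halo pattern: each rank owns 3 values (roots)
# and needs 1 ghost value (leaf) from the other rank.
rank  = MPI.Comm_rank(MPI.COMM_WORLD)
other = 1 - rank
nroots  = 3                                  # locally owned entries
nleaves = 1                                  # ghost entries needed remotely
ilocal  = [3]                                # store the ghost at local index 3
iremote = [LibPETSc.PetscSFNode(other, 0)]   # fetch the neighbor's root 0
```

The graph is purely descriptive: no data moves until a Bcast/Reduce pair is issued.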
Basic Usage
using PETSc, MPI
# Initialize MPI and PETSc
MPI.Init()
petsclib = PETSc.getlib()
PETSc.initialize(petsclib)
PetscInt = petsclib.PetscInt
# Create a star forest
sf = LibPETSc.PetscSFCreate(petsclib, MPI.COMM_WORLD)
# Define communication pattern
# nleaves: number of leaves (data items we need)
# ilocal: local indices for leaves (can be C_NULL if identity)
# iremote: (rank, index) pairs specifying which process/index to get from
nleaves = 5
# number of roots owned locally (for this simple example set equal to nleaves)
nroots = 5
ilocal = [0, 1, 2, 3, 4] # Local indices where data will be stored
iremote = [
LibPETSc.PetscSFNode(0, 0),
LibPETSc.PetscSFNode(0, 1),
LibPETSc.PetscSFNode(0, 2),
LibPETSc.PetscSFNode(0, 3),
LibPETSc.PetscSFNode(0, 4),
]
LibPETSc.PetscSFSetGraph(petsclib, sf, nroots, nleaves, ilocal, LibPETSc.PETSC_COPY_VALUES,
iremote, LibPETSc.PETSC_COPY_VALUES)
# Setup
LibPETSc.PetscSFSetUp(petsclib, sf)
# Cleanup
LibPETSc.PetscSFDestroy(petsclib, sf)
# Finalize PETSc and MPI
PETSc.finalize(petsclib)
MPI.Finalize()
Communication Operations
Broadcast (Scatter)
Send data from roots to leaves:
# Root data: data we own
root_data = Float64[1.0, 2.0, 3.0, 4.0, 5.0]
# Leaf data: buffer to receive data
leaf_data = zeros(Float64, nleaves)
# Broadcast: send root data to leaves
LibPETSc.PetscSFBcastBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
LibPETSc.MPI_REPLACE)
LibPETSc.PetscSFBcastEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
LibPETSc.MPI_REPLACE)
Reduce
Accumulate data from leaves back to roots:
# Leaf contributions
leaf_data = Float64[0.1, 0.2, 0.3, 0.4, 0.5]
# Root accumulator
root_data = zeros(Float64, nroots)
# Reduce: accumulate leaf data to roots
LibPETSc.PetscSFReduceBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_data,
LibPETSc.MPI_SUM)
LibPETSc.PetscSFReduceEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_data,
LibPETSc.MPI_SUM)
Fetch-and-Op Operations
Atomic operations for concurrent updates:
# Fetch data and apply operation
LibPETSc.PetscSFFetchAndOpBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
leaf_data, leaf_updates, LibPETSc.MPI_SUM)
LibPETSc.PetscSFFetchAndOpEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
leaf_data, leaf_updates, LibPETSc.MPI_SUM)
MPI Operations
Supported MPI operations for reduce:
- MPI_SUM: Sum values
- MPI_MAX: Maximum value
- MPI_MIN: Minimum value
- MPI_REPLACE: Replace (last write wins)
- MPI_PROD: Product
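Swapping the operation in the reduce calls changes how concurrent contributions combine at a root. A sketch using MPI_MAX, assuming the sf, nroots, and leaf_data from the examples above:

```julia
# Sketch: keep only the largest leaf contribution at each root
root_max = fill(-Inf, nroots)
LibPETSc.PetscSFReduceBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_max,
                            LibPETSc.MPI_MAX)
LibPETSc.PetscSFReduceEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_max,
                          LibPETSc.MPI_MAX)
```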
Star Forest Types
Available through PetscSFSetType:
- PETSCSFBASIC: Basic implementation
- PETSCSFNEIGHBOR: MPI neighborhood collectives (efficient for structured patterns)
- PETSCSFALLGATHERV: All-gather based
- PETSCSFALLGATHER: All-gather for small data
- PETSCSFGATHERV: Gather-based
- PETSCSFGATHER: Simple gather
- PETSCSFALLTOALL: All-to-all based
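A type can also be requested programmatically before setup (or via the -sf_type option). A sketch, assuming the wrapper's PetscSFSetType accepts the type-name string used by PETSc:

```julia
# Sketch: request the neighborhood-collective implementation, then set up
LibPETSc.PetscSFSetType(petsclib, sf, "neighbor")
LibPETSc.PetscSFSetUp(petsclib, sf)
```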
Graph Queries
# Get number of roots (locally owned data)
nroots = Ref{PetscInt}()
LibPETSc.PetscSFGetGraph(petsclib, sf, nroots, C_NULL, C_NULL, C_NULL)
# Get number of leaves
nleaves = Ref{PetscInt}()
LibPETSc.PetscSFGetGraph(petsclib, sf, C_NULL, nleaves, C_NULL, C_NULL)
# Get full graph
ilocal_ptr = Ref{Ptr{PetscInt}}()
iremote_ptr = Ref{Ptr{LibPETSc.PetscSFNode}}()
LibPETSc.PetscSFGetGraph(petsclib, sf, nroots, nleaves, ilocal_ptr, iremote_ptr)
Multi-Root Support
Handle communication with multiple root data per point:
# Build an SF restricted to a selected subset of roots
# (here sized for multiple DOFs per point)
nroots_mult = nroots * num_components
multi_sf = LibPETSc.PetscSFCreateEmbeddedRootSF(petsclib, sf, nroots_mult, iroot_indices)
Common Use Cases
1. Ghost Point Updates (Halo Exchange)
# After modifying owned data, update ghost points
# 1. Pack local data
# 2. Broadcast to leaves (ghost points)
LibPETSc.PetscSFBcastBegin(petsclib, sf, datatype, local_data, ghost_data, op)
LibPETSc.PetscSFBcastEnd(petsclib, sf, datatype, local_data, ghost_data, op)
2. Parallel Assembly
# After local assembly, accumulate contributions from other processes
# 1. Each process computes local contributions
# 2. Reduce to accumulate at owners
LibPETSc.PetscSFReduceBegin(petsclib, sf, datatype, local_contrib, global_data, MPI_SUM)
LibPETSc.PetscSFReduceEnd(petsclib, sf, datatype, local_contrib, global_data, MPI_SUM)
3. DM Point Communication
# Get natural SF for a DM (describes point distribution)
dm_sf = Ref{LibPETSc.PetscSF}()
# LibPETSc.DMGetPointSF(petsclib, dm, dm_sf)
# Use to communicate point-based data
Performance Considerations
- Choose appropriate type: PETSCSFNEIGHBOR is often best for structured grids
- Reuse SF objects: Creating the communication pattern is expensive
- Batch communications: Combine multiple small messages when possible
- Alignment: Use properly aligned data types for better performance
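The reuse advice usually amounts to building the graph once and calling only the Bcast/Reduce pairs inside the time loop. A minimal sketch, assuming the sf and buffers from the earlier examples and an illustrative nsteps:

```julia
# Build the communication pattern once...
LibPETSc.PetscSFSetUp(petsclib, sf)
# ...then reuse it every timestep instead of recreating it
for step in 1:nsteps
    # update owned values in root_data here, then refresh ghosts
    LibPETSc.PetscSFBcastBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
                               LibPETSc.MPI_REPLACE)
    LibPETSc.PetscSFBcastEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
                             LibPETSc.MPI_REPLACE)
end
```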
Function Reference
PETSc.LibPETSc.PetscSFCompose — Method
PetscSFCompose(petsclib::PetscLibType, sfA::PetscSF, sfB::PetscSF, sfBA::PetscSF)
Compose a new PetscSF by putting the second PetscSF under the first one in a top (roots) down (leaves) view.
Input Parameters:
- sfA - The first PetscSF
- sfB - The second PetscSF
Output Parameter:
- sfBA - The composite PetscSF
Level: developer
-seealso: PetscSF, PetscSFComposeInverse(), PetscSFGetGraph(), PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFCompose
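A hedged usage sketch of the call above, assuming sfA maps roots to intermediate points and sfB maps those points to leaves; how the wrapper hands back the output sfBA (last positional argument, per the signature) is an assumption:

```julia
# Sketch: compose sfA (roots -> mid points) with sfB (mid points -> leaves);
# the composite sends sfA's roots directly to sfB's leaves
LibPETSc.PetscSFCompose(petsclib, sfA, sfB, sfBA)
```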
PETSc.LibPETSc.PetscSFComposeInverse — Method
PetscSFComposeInverse(petsclib::PetscLibType, sfA::PetscSF, sfB::PetscSF, sfBA::PetscSF)
Compose a new PetscSF by putting the inverse of the second PetscSF under the first one.
Input Parameters:
- sfA - The first PetscSF
- sfB - The second PetscSF
Output Parameter:
- sfBA - The composite PetscSF
Level: developer
-seealso: PetscSF, PetscSFCompose(), PetscSFGetGraph(), PetscSFSetGraph(), PetscSFCreateInverseSF()
External Links
- PETSc Manual:
Vec/PetscSFComposeInverse
PETSc.LibPETSc.PetscSFComputeDegreeBegin — Method
degree::Vector{PetscInt} = PetscSFComputeDegreeBegin(petsclib::PetscLibType, sf::PetscSF)
Begin computation of the degree for each root vertex, to be completed with PetscSFComputeDegreeEnd().
Collective
Input Parameter:
- sf - star forest
Output Parameter:
- degree - degree of each root vertex
Level: advanced
-seealso: PetscSF, PetscSFGatherBegin(), PetscSFComputeDegreeEnd()
External Links
- PETSc Manual:
Vec/PetscSFComputeDegreeBegin
PETSc.LibPETSc.PetscSFComputeDegreeEnd — Method
degree::Vector{PetscInt} = PetscSFComputeDegreeEnd(petsclib::PetscLibType, sf::PetscSF)
Complete computation of the degree for each root vertex, started with PetscSFComputeDegreeBegin().
Collective
Input Parameter:
- sf - star forest
Output Parameter:
- degree - degree of each root vertex
Level: developer
-seealso: PetscSF, PetscSFGatherBegin(), PetscSFComputeDegreeBegin()
External Links
- PETSc Manual:
Vec/PetscSFComputeDegreeEnd
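The Begin/End pair is typically called back to back; a sketch based on the signatures above (both return the degree array, but only the value from End is complete):

```julia
# Sketch: count how many leaves reference each local root
LibPETSc.PetscSFComputeDegreeBegin(petsclib, sf)      # start the reduction
degree = LibPETSc.PetscSFComputeDegreeEnd(petsclib, sf)  # finish it
# degree[i] is the number of leaves attached to local root i
```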
PETSc.LibPETSc.PetscSFComputeMultiRootOriginalNumbering — Method
nMultiRoots::PetscInt,multiRootsOrigNumbering::Vector{PetscInt} = PetscSFComputeMultiRootOriginalNumbering(petsclib::PetscLibType, sf::PetscSF, degree::Vector{PetscInt})
Returns the original numbering of multi-roots. Each multi-root is assigned the index of the corresponding original root.
Collective
Input Parameters:
- sf - star forest
- degree - degree of each root vertex, computed with PetscSFComputeDegreeBegin()/PetscSFComputeDegreeEnd()
Output Parameters:
- nMultiRoots - (optional) number of multi-roots (roots of the multi-PetscSF)
- multiRootsOrigNumbering - original indices of multi-roots; the length of this array is nMultiRoots
Level: developer
-seealso: PetscSF, PetscSFComputeDegreeBegin(), PetscSFComputeDegreeEnd(), PetscSFGetMultiSF()
External Links
- PETSc Manual:
Vec/PetscSFComputeMultiRootOriginalNumbering
PETSc.LibPETSc.PetscSFConcatenate — Method
PetscSFConcatenate(petsclib::PetscLibType, comm::MPI_Comm, nsfs::PetscInt, sfs::Vector{PetscSF}, rootMode::PetscSFConcatenateRootMode, leafOffsets::Vector{PetscInt}, newsf::PetscSF)
Concatenate multiple PetscSF into one.
Input Parameters:
- comm - the communicator
- nsfs - the number of input PetscSF
- sfs - the array of input PetscSF
- rootMode - the root mode specifying how roots are handled
- leafOffsets - the array of local leaf offsets, one for each input PetscSF, or NULL for contiguous storage
Output Parameter:
- newsf - The resulting PetscSF
Level: advanced
-seealso: PetscSF, PetscSFCompose(), PetscSFGetGraph(), PetscSFSetGraph(), PetscSFConcatenateRootMode
External Links
- PETSc Manual:
Vec/PetscSFConcatenate
PETSc.LibPETSc.PetscSFCreate — Method
sf::PetscSF = PetscSFCreate(petsclib::PetscLibType, comm::MPI_Comm)
Create a star forest communication context.
Collective
Input Parameter:
- comm - communicator on which the star forest will operate
Output Parameter:
- sf - new star forest context
Options Database Keys:
- -sf_type basic - Use MPI persistent Isend/Irecv for communication (default)
- -sf_type window - Use MPI-3 one-sided windows for communication
- -sf_type neighbor - Use MPI-3 neighborhood collectives for communication
- -sf_neighbor_persistent <bool> - If true, use MPI-4 persistent neighborhood collectives for communication (used along with -sf_type neighbor)
Level: intermediate
-seealso: PetscSF, PetscSFSetType, PetscSFSetGraph(), PetscSFSetGraphWithPattern(), PetscSFDestroy()
External Links
- PETSc Manual:
Vec/PetscSFCreate
PETSc.LibPETSc.PetscSFCreateByMatchingIndices — Method
sfA::PetscSF,sf::PetscSF = PetscSFCreateByMatchingIndices(petsclib::PetscLibType, layout::PetscLayout, numRootIndices::PetscInt, rootIndices::PetscInt, rootLocalIndices::PetscInt, rootLocalOffset::PetscInt, numLeafIndices::PetscInt, leafIndices::PetscInt, leafLocalIndices::PetscInt, leafLocalOffset::PetscInt)
Create a PetscSF by matching root and leaf indices.
Collective
Input Parameters:
- layout - PetscLayout defining the global index space and the rank that brokers each index
- numRootIndices - size of rootIndices
- rootIndices - PetscInt array of global indices of which this process requests ownership
- rootLocalIndices - root local index permutation (NULL if no permutation)
- rootLocalOffset - offset to be added to root local indices
- numLeafIndices - size of leafIndices
- leafIndices - PetscInt array of global indices with which this process requires associated data
- leafLocalIndices - leaf local index permutation (NULL if no permutation)
- leafLocalOffset - offset to be added to leaf local indices
Output Parameters:
- sfA - star forest representing the communication pattern from the layout space to the leaf space (NULL if not needed)
- sf - star forest representing the communication pattern from the root space to the leaf space
Level: advanced
-seealso: PetscSF, PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFCreateByMatchingIndices
PETSc.LibPETSc.PetscSFCreateEmbeddedLeafSF — Method
newsf::PetscSF = PetscSFCreateEmbeddedLeafSF(petsclib::PetscLibType, sf::PetscSF, nselected::PetscInt, selected::PetscInt)
Removes edges from all but the selected leaves of a PetscSF; does not remap indices.
Collective
Input Parameters:
- sf - original star forest
- nselected - number of selected leaves on this process
- selected - indices of the selected leaves on this process
Output Parameter:
- newsf - new star forest
Level: advanced
-seealso: PetscSF, PetscSFCreateEmbeddedRootSF(), PetscSFSetGraph(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFCreateEmbeddedLeafSF
PETSc.LibPETSc.PetscSFCreateEmbeddedRootSF — Method
esf::PetscSF = PetscSFCreateEmbeddedRootSF(petsclib::PetscLibType, sf::PetscSF, nselected::PetscInt, selected::PetscInt)
Removes edges from all but the selected roots of a PetscSF; does not remap indices.
Collective
Input Parameters:
- sf - original star forest
- nselected - number of selected roots on this process
- selected - indices of the selected roots on this process
Output Parameter:
- esf - new star forest
Level: advanced
-seealso: PetscSF, PetscSFSetGraph(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFCreateEmbeddedRootSF
PETSc.LibPETSc.PetscSFCreateEmbeddedSF — Method
selected::PetscInt,esf::PetscSF = PetscSFCreateEmbeddedSF(petsclib::PetscLibType, sf::PetscSF, nselected::PetscInt)
External Links
- PETSc Manual:
Vec/PetscSFCreateEmbeddedSF
PETSc.LibPETSc.PetscSFCreateFromLayouts — Method
sf::PetscSF = PetscSFCreateFromLayouts(petsclib::PetscLibType, rmap::PetscLayout, lmap::PetscLayout)
Creates a parallel star forest mapping two PetscLayout objects.
Collective
Input Parameters:
- rmap - PetscLayout defining the global root space
- lmap - PetscLayout defining the global leaf space
Output Parameter:
- sf - The parallel star forest
Level: intermediate
-seealso: PetscSF, PetscSFCreate(), PetscLayoutCreate(), PetscSFSetGraphLayout()
External Links
- PETSc Manual:
Vec/PetscSFCreateFromLayouts
PETSc.LibPETSc.PetscSFCreateInverseSF — Method
isf::PetscSF = PetscSFCreateInverseSF(petsclib::PetscLibType, sf::PetscSF)
Given a PetscSF in which all vertices have degree 1, creates the inverse map.
Collective
Input Parameter:
- sf - star forest to invert
Output Parameter:
- isf - inverse of sf
Level: advanced
-seealso: PetscSF, PetscSFType, PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFCreateInverseSF
PETSc.LibPETSc.PetscSFCreateRemoteOffsets — Method
remoteOffsets::Vector{PetscInt} = PetscSFCreateRemoteOffsets(petsclib::PetscLibType, sf::PetscSF, rootSection::PetscSection, leafSection::PetscSection)
Create offsets for point data on remote processes.
Collective
Input Parameters:
- sf - The PetscSF
- rootSection - Data layout of remote points for outgoing data (this is the layout for roots)
- leafSection - Data layout of local points for incoming data (this is the layout for leaves)
Output Parameter:
- remoteOffsets - Offsets for point data on remote processes (these are offsets from the root section), or NULL
Level: developer
-seealso: PetscSF, PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFCreateRemoteOffsets
PETSc.LibPETSc.PetscSFCreateSectionSF — Method
sectionSF::PetscSF = PetscSFCreateSectionSF(petsclib::PetscLibType, sf::PetscSF, rootSection::PetscSection, remoteOffsets::Vector{PetscInt}, leafSection::PetscSection)
Create an expanded PetscSF of dofs, assuming the input PetscSF relates points.
Collective
Input Parameters:
- sf - The PetscSF
- rootSection - Data layout of remote points for outgoing data (this is usually the serial section)
- remoteOffsets - Offsets for point data on remote processes (these are offsets from the root section), or NULL
- leafSection - Data layout of local points for incoming data (this is the distributed section)
Output Parameter:
- sectionSF - The new PetscSF
Level: advanced
-seealso: PetscSF, PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFCreateSectionSF
PETSc.LibPETSc.PetscSFCreateStridedSF — Method
vsf::PetscSF = PetscSFCreateStridedSF(petsclib::PetscLibType, sf::PetscSF, bs::PetscInt, ldr::PetscInt, ldl::PetscInt)
Create a PetscSF to communicate interleaved blocks of data.
Collective
Input Parameters:
- sf - star forest
- bs - stride
- ldr - leading dimension of the root space
- ldl - leading dimension of the leaf space
Output Parameter:
- vsf - the new PetscSF
Level: intermediate
-seealso: PetscSF, PetscSFCreate(), PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFCreateStridedSF
PETSc.LibPETSc.PetscSFDestroy — Method
PetscSFDestroy(petsclib::PetscLibType, sf::PetscSF)
Destroy a star forest.
Collective
Input Parameter:
- sf - address of star forest
Level: intermediate
-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFReset()
External Links
- PETSc Manual:
Vec/PetscSFDestroy
PETSc.LibPETSc.PetscSFDistributeSection — Method
remoteOffsets::Vector{PetscInt} = PetscSFDistributeSection(petsclib::PetscLibType, sf::PetscSF, rootSection::PetscSection, leafSection::PetscSection)
Create a new, reorganized PetscSection, moving from the roots to the leaves of the PetscSF.
Collective
Input Parameters:
- sf - The PetscSF
- rootSection - Section defined on the root space
Output Parameters:
- remoteOffsets - root offsets in leaf storage, or NULL; its length will be the size of the chart of leafSection
- leafSection - Section defined on the leaf space
Level: advanced
-seealso: PetscSF, PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFDistributeSection
PETSc.LibPETSc.PetscSFDuplicate — Method
newsf::PetscSF = PetscSFDuplicate(petsclib::PetscLibType, sf::PetscSF, opt::PetscSFDuplicateOption)
Duplicate a PetscSF, optionally preserving rank connectivity and graph.
Collective
Input Parameters:
- sf - communication object to duplicate
- opt - PETSCSF_DUPLICATE_CONFONLY, PETSCSF_DUPLICATE_RANKS, or PETSCSF_DUPLICATE_GRAPH (see PetscSFDuplicateOption)
Output Parameter:
- newsf - new communication object
Level: beginner
-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFSetType(), PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFDuplicate
PETSc.LibPETSc.PetscSFFinalizePackage — Method
PetscSFFinalizePackage(petsclib::PetscLibType)
Finalize the PetscSF package; it is called from PetscFinalize().
Logically Collective
Level: developer
-seealso: PetscSF, PetscSFInitializePackage()
External Links
- PETSc Manual:
Vec/PetscSFFinalizePackage
PETSc.LibPETSc.PetscSFGetGraph — Method
nroots::PetscInt,nleaves::PetscInt,iloc::Vector{PetscInt} = PetscSFGetGraph(petsclib::PetscLibType, sf::PetscSF, iremote::Vector{PetscSFNode})
Get the graph specifying a parallel star forest.
Not Collective
Input Parameter:
- sf - star forest
Output Parameters:
- nroots - number of root vertices on the current process (these are possible targets for other processes to attach leaves)
- nleaves - number of leaf vertices on the current process, each of these references a root on any process
- ilocal - locations of leaves in leafdata buffers (if the returned value is NULL, leaves are in contiguous storage)
- iremote - remote locations of root vertices for each leaf on the current process
Level: intermediate
-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFGetGraph
PETSc.LibPETSc.PetscSFGetGraphLayout — Method
nleaves::PetscInt,iloc::Vector{PetscInt},gremote::Vector{PetscInt} = PetscSFGetGraphLayout(petsclib::PetscLibType, sf::PetscSF, layout::PetscLayout)
Get the global indices and PetscLayout that describe this star forest.
Collective
Input Parameter:
- sf - star forest
Output Parameters:
- layout - PetscLayout defining the global space for roots
- nleaves - number of leaf vertices on the current process, each of these references a root on any process
- ilocal - locations of leaves in leafdata buffers, or NULL for contiguous storage
- gremote - root vertices in global numbering corresponding to leaves in ilocal
Level: intermediate
-seealso: PetscSF, PetscSFSetGraphLayout(), PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFGetGraphLayout
PETSc.LibPETSc.PetscSFGetGroups — Method
PetscSFGetGroups(petsclib::PetscLibType, sf::PetscSF, incoming::MPI_Group, outgoing::MPI_Group)
Gets incoming and outgoing process groups.
Collective
Input Parameter:
- sf - star forest
Output Parameters:
- incoming - group of origin processes for incoming edges (leaves that reference my roots)
- outgoing - group of destination processes for outgoing edges (roots that I reference)
Level: developer
-seealso: PetscSF, PetscSFGetWindow(), PetscSFRestoreWindow()
External Links
- PETSc Manual:
Vec/PetscSFGetGroups
PETSc.LibPETSc.PetscSFGetLeafRange — Method
minleaf::PetscInt,maxleaf::PetscInt = PetscSFGetLeafRange(petsclib::PetscLibType, sf::PetscSF)
Get the active leaf range.
Not Collective
Input Parameter:
- sf - star forest
Output Parameters:
- minleaf - minimum active leaf on this process; returns 0 if there are no leaves
- maxleaf - maximum active leaf on this process; returns -1 if there are no leaves
Level: developer
-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFGetLeafRange
PETSc.LibPETSc.PetscSFGetLeafRanks — Method
ioffset::Vector{PetscInt},irootloc::Vector{PetscInt} = PetscSFGetLeafRanks(petsclib::PetscLibType, sf::PetscSF, niranks::PetscMPIInt, iranks::Vector{PetscMPIInt})
Get leaf ranks referencing roots on this process.
Not Collective
Input Parameter:
- sf - star forest
Output Parameters:
- niranks - number of leaf ranks referencing roots on this process
- iranks - [niranks] array of ranks
- ioffset - [niranks+1] offset in irootloc for each rank
- irootloc - [ioffset[niranks]] concatenated array holding local indices of roots referenced by each leaf rank
Level: developer
-seealso: PetscSF, PetscSFGetRootRanks()
External Links
- PETSc Manual:
Vec/PetscSFGetLeafRanks
PETSc.LibPETSc.PetscSFGetMultiSF — Method
PetscSFGetMultiSF(petsclib::PetscLibType, sf::PetscSF, multi::PetscSF)
Gets the inner PetscSF implementing gathers and scatters.
Collective
Input Parameter:
- sf - star forest that may contain roots with 0 or with more than 1 vertex
Output Parameter:
- multi - star forest with split roots, such that each root has degree exactly 1
Level: developer
-seealso: PetscSF, PetscSFSetGraph(), PetscSFGatherBegin(), PetscSFScatterBegin(), PetscSFComputeMultiRootOriginalNumbering()
External Links
- PETSc Manual:
Vec/PetscSFGetMultiSF
PETSc.LibPETSc.PetscSFGetRanks — Method
PetscSFGetRanks(petsclib::PetscLibType, sf::PetscSF, nranks::PetscMPIInt, ranks::PetscMPIInt, roffset::PetscInt, rmine::PetscInt, rremote::PetscInt)
External Links
- PETSc Manual:
Vec/PetscSFGetRanks
PETSc.LibPETSc.PetscSFGetRanksSF — Method
PetscSFGetRanksSF(petsclib::PetscLibType, sf::PetscSF, rsf::PetscSF)
Gets the PetscSF used to perform communication with root ranks.
Collective
Input Parameter:
- sf - star forest
Output Parameter:
- rsf - the star forest with a single root per process to perform communications
Level: developer
-seealso: PetscSF, PetscSFSetGraph(), PetscSFGetRootRanks()
External Links
- PETSc Manual:
Vec/PetscSFGetRanksSF
PETSc.LibPETSc.PetscSFGetRootRanks — Method
roffset::Vector{PetscInt},rmine::Vector{PetscInt},rremote::Vector{PetscInt} = PetscSFGetRootRanks(petsclib::PetscLibType, sf::PetscSF, nranks::PetscMPIInt, ranks::Vector{PetscMPIInt})
Get root ranks and the number of vertices referenced by leaves on this process.
Not Collective
Input Parameter:
- sf - star forest
Output Parameters:
- nranks - number of ranks referenced by the local part
- ranks - [nranks] array of ranks
- roffset - [nranks+1] offset in rmine/rremote for each rank
- rmine - [roffset[nranks]] concatenated array holding local indices referencing each remote rank, or NULL
- rremote - [roffset[nranks]] concatenated array holding remote indices referenced for each remote rank, or NULL
Level: developer
-seealso: PetscSF, PetscSFGetLeafRanks()
External Links
- PETSc Manual:
Vec/PetscSFGetRootRanks
PETSc.LibPETSc.PetscSFGetSubSF — Method
PetscSFGetSubSF(petsclib::PetscLibType, mainsf::PetscSF, map::ISLocalToGlobalMapping, subSF::PetscSF)
Returns a PetscSF for a specific subset of points; leaves are renumbered to reflect the new ordering.
Collective
Input Parameters:
- mainsf - PetscSF structure
- map - an ISLocalToGlobalMapping that contains the subset of points
Output Parameter:
- subSF - a subset of the mainSF for the desired subset
Level: intermediate
-seealso: PetscSF
External Links
- PETSc Manual:
DM/PetscSFGetSubSF
PETSc.LibPETSc.PetscSFGetType — Method
type::PetscSFType = PetscSFGetType(petsclib::PetscLibType, sf::PetscSF)
Get the PetscSF communication implementation.
Not Collective
Input Parameter:
- sf - the PetscSF context
Output Parameter:
- type - the PetscSF type name
Level: intermediate
-seealso: PetscSF, PetscSFType, PetscSFSetType(), PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFGetType
PETSc.LibPETSc.PetscSFInitializePackage — Method
PetscSFInitializePackage(petsclib::PetscLibType)
Initialize the PetscSF package.
Logically Collective
Level: developer
-seealso: PetscSF, PetscSFFinalizePackage()
External Links
- PETSc Manual:
Vec/PetscSFInitializePackage
PETSc.LibPETSc.PetscSFMerge — Method
PetscSFMerge(petsclib::PetscLibType, sfa::PetscSF, sfb::PetscSF, merged::PetscSF)
Append/merge indices of sfb into sfa, with preference for sfb.
Collective
Input Parameters:
- sfa - default PetscSF
- sfb - additional edges to add/replace edges in sfa
Output Parameter:
- merged - new PetscSF with combined edges
Level: intermediate
-seealso: PetscSFCompose()
External Links
- PETSc Manual:
Vec/PetscSFMerge
PETSc.LibPETSc.PetscSFRegister — Method
PetscSFRegister(petsclib::PetscLibType, name::String, create::external)
Adds an implementation of the PetscSF communication protocol.
Not Collective, No Fortran Support
Input Parameters:
- name - name of a new user-defined implementation
- create - routine to create the method context
-seealso: PetscSF, PetscSFType, PetscSFRegisterAll(), PetscSFInitializePackage()
External Links
- PETSc Manual:
Vec/PetscSFRegister
PETSc.LibPETSc.PetscSFReset — Method
PetscSFReset(petsclib::PetscLibType, sf::PetscSF)
Reset a star forest so that different sizes or neighbors can be used.
Collective
Input Parameter:
- sf - star forest
Level: advanced
-seealso: PetscSF, PetscSFCreate(), PetscSFSetGraph(), PetscSFDestroy()
External Links
- PETSc Manual:
Vec/PetscSFReset
PETSc.LibPETSc.PetscSFSetFromOptions — Method
PetscSFSetFromOptions(petsclib::PetscLibType, sf::PetscSF)
Set PetscSF options using the options database.
Logically Collective
Input Parameter:
- sf - star forest
Options Database Keys:
- -sf_type - implementation type, see PetscSFSetType()
- -sf_rank_order - sort composite points for gathers and scatters in rank order; gathers are non-deterministic otherwise
- -sf_use_default_stream - Assume callers of PetscSF computed the input root/leafdata with the default CUDA stream. PetscSF will also use the default stream to process data, so no stream synchronization is needed between PetscSF and its caller (default: true). If true, this option only works with -use_gpu_aware_mpi 1.
- -sf_use_stream_aware_mpi - Assume the underlying MPI is CUDA-stream aware and PetscSF won't sync streams for send/recv buffers passed to MPI (default: false). If true, this option only works with -use_gpu_aware_mpi 1.
- -sf_backend <cuda,hip,kokkos> - Select the device backend PetscSF uses. Currently PetscSF has these backends: cuda, hip, and kokkos. On CUDA (HIP) devices, one can choose cuda (hip) or kokkos, with the default being kokkos. On other devices, the only one available is kokkos.
Level: intermediate
-seealso: PetscSF, PetscSFCreate(), PetscSFSetType()
External Links
- PETSc Manual:
Vec/PetscSFSetFromOptions
PETSc.LibPETSc.PetscSFSetGraph — Method
PetscSFSetGraph(petsclib::PetscLibType, sf::PetscSF, nroots::PetscInt, nleaves::PetscInt, iloc::Vector{PetscInt}, locmode::PetscCopyMode, iremote::Vector{PetscSFNode}, remotemode::PetscCopyMode)
Set a parallel star forest.
Collective
Input Parameters:
- sf - star forest
- nroots - number of root vertices on the current process (these are possible targets for other processes to attach leaves)
- nleaves - number of leaf vertices on the current process, each of these references a root on any process
- ilocal - locations of leaves in leafdata buffers; pass NULL for contiguous storage (locations must be >= 0, enforced during setup in debug mode)
- localmode - copy mode for ilocal
- iremote - remote locations of root vertices for each leaf on the current process, length nleaves (locations must be >= 0, enforced during setup in debug mode)
- remotemode - copy mode for iremote
Level: intermediate
-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFSetGraph
PETSc.LibPETSc.PetscSFSetGraphFromCoordinates — Method
PetscSFSetGraphFromCoordinates(petsclib::PetscLibType, sf::PetscSF, nroots::PetscInt, nleaves::PetscInt, dim::PetscInt, tol::PetscReal, rootcoords::PetscReal, leafcoords::PetscReal)
Create an SF by fuzzy matching leaf coordinates to root coordinates.
Collective
Input Parameters:
- sf - PetscSF to set the graph on
- nroots - number of root coordinates
- nleaves - number of leaf coordinates
- dim - spatial dimension of coordinates
- tol - positive tolerance for matching
- rootcoords - array of root coordinates in which root i, component d, is [i*dim+d]
- leafcoords - array of leaf coordinates in which leaf i, component d, is [i*dim+d]
-seealso: PetscSFCreate(), PetscSFSetGraph(), PetscSFCreateByMatchingIndices()
External Links
- PETSc Manual:
Vec/PetscSFSetGraphFromCoordinates
PETSc.LibPETSc.PetscSFSetGraphLayout — Method
iloc::PetscInt = PetscSFSetGraphLayout(petsclib::PetscLibType, sf::PetscSF, layout::PetscLayout, nleaves::PetscInt, locmode::PetscCopyMode, gremote::PetscInt)
Set a parallel star forest via global indices and a PetscLayout.
Collective
Input Parameters:
- sf - star forest
- layout - PetscLayout defining the global space for roots
- nleaves - number of leaf vertices on the current process, each of these references a root on any process
- ilocal - locations of leaves in leafdata buffers; pass NULL for contiguous storage
- localmode - copy mode for ilocal
- gremote - root vertices in global numbering corresponding to leaves in ilocal
Level: intermediate
-seealso: PetscSF, PetscSFGetGraphLayout(), PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFSetGraphLayout
PETSc.LibPETSc.PetscSFSetGraphSection — Method
PetscSFSetGraphSection(petsclib::PetscLibType, sf::PetscSF, locSection::PetscSection, globSection::PetscSection)
Sets the PetscSF graph encoding the parallel dof overlap based upon the PetscSection describing the data layout.
Input Parameters:
- sf - The PetscSF
- localSection - PetscSection describing the local data layout
- globalSection - PetscSection describing the global data layout
Level: developer
-seealso: PetscSF, PetscSFSetGraph(), PetscSFSetGraphLayout()
External Links
- PETSc Manual:
Vec/PetscSFSetGraphSection
PETSc.LibPETSc.PetscSFSetGraphWithPattern — Method
PetscSFSetGraphWithPattern(petsclib::PetscLibType, sf::PetscSF, map::PetscLayout, pattern::PetscSFPattern)
Sets the graph of a PetscSF with a specific pattern.
Collective
Input Parameters:
- sf - The PetscSF
- map - Layout of roots over all processes (insignificant when pattern is PETSCSF_PATTERN_ALLTOALL)
- pattern - One of PETSCSF_PATTERN_ALLGATHER, PETSCSF_PATTERN_GATHER, PETSCSF_PATTERN_ALLTOALL
Level: intermediate
-seealso: PetscSF, PetscSFCreate(), PetscSFView(), PetscSFGetGraph()
External Links
- PETSc Manual:
Vec/PetscSFSetGraphWithPattern
PETSc.LibPETSc.PetscSFSetRankOrder — Method
PetscSFSetRankOrder(petsclib::PetscLibType, sf::PetscSF, flg::PetscBool)
Sort multi-points for gathers and scatters by rank order.
Logically Collective
Input Parameters:
- sf - star forest
- flg - PETSC_TRUE to sort, PETSC_FALSE to skip sorting (lower setup cost, but non-deterministic)
Level: advanced
-seealso: PetscSF, PetscSFType, PetscSFGatherBegin(), PetscSFScatterBegin()
External Links
- PETSc Manual:
Vec/PetscSFSetRankOrder
PETSc.LibPETSc.PetscSFSetType — Method
PetscSFSetType(petsclib::PetscLibType, sf::PetscSF, type::PetscSFType)
Set the PetscSF communication implementation.
Collective
Input Parameters:
- sf - the PetscSF context
- type - a known method
-seealso: PetscSF, PetscSFType, PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFSetType
PETSc.LibPETSc.PetscSFSetUp — Method
PetscSFSetUp(petsclib::PetscLibType, sf::PetscSF)
Set up communication structures for a PetscSF; after this is done it may be used to perform communication.
Collective
Input Parameter:
- sf - star forest communication object
Level: beginner
-seealso: PetscSF, PetscSFType, PetscSFSetFromOptions(), PetscSFSetType()
External Links
- PETSc Manual:
Vec/PetscSFSetUp
PETSc.LibPETSc.PetscSFSetUpRanks — Method
PetscSFSetUpRanks(petsclib::PetscLibType, sf::PetscSF, dgroup::MPI_Group)
Set up data structures associated with ranks; this is for internal use by PetscSF implementations.
Collective
Input Parameters:
- sf - PetscSF to set up; PetscSFSetGraph() must have been called
- dgroup - MPI_Group of ranks to be distinguished (e.g., for self or shared memory exchange)
Level: developer
-seealso: PetscSF, PetscSFGetRootRanks()
External Links
- PETSc Manual:
Vec/PetscSFSetUpRanks
PETSc.LibPETSc.PetscSFView — Method
PetscSFView(petsclib::PetscLibType, sf::PetscSF, viewer::PetscViewer)
View a star forest.
Collective
Input Parameters:
- sf - star forest
- viewer - viewer to display the graph, for example `PETSC_VIEWER_STDOUT_WORLD`
Level: beginner
See also: PetscSF, PetscViewer, PetscSFCreate(), PetscSFSetGraph()
External Links
- PETSc Manual:
Vec/PetscSFView
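A hedged sketch of inspecting a star forest's communication graph. How the wrapper exposes the standard-output viewer is an assumption here (in C it is the `PETSC_VIEWER_STDOUT_WORLD` macro); check the generated bindings for the exact name.

```julia
# Hedged sketch: print the star forest's graph (roots, leaves, remote
# (rank, index) pairs) to stdout on all ranks. The viewer accessor name
# is an assumption about the wrapper's API.
viewer = LibPETSc.PETSC_VIEWER_STDOUT_WORLD(petsclib)
LibPETSc.PetscSFView(petsclib, sf, viewer)
```

Alternatively, `PetscSFViewFromOptions` (below) lets you trigger the same output from the command line without hard-coding a viewer.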
PETSc.LibPETSc.PetscSFViewFromOptions — Method
PetscSFViewFromOptions(petsclib::PetscLibType, A::PetscSF, obj::PetscObject, name::String)
View a PetscSF based on arguments in the options database.
Collective
Input Parameters:
- A - the star forest
- obj - optional object that provides the prefix for the option names
- name - command line option
Level: intermediate
See also: PetscSF, PetscSFView, PetscObjectViewFromOptions(), PetscSFCreate()
External Links
- PETSc Manual:
Vec/PetscSFViewFromOptions
PETSc.LibPETSc.PetscSFWindowGetFlavorType — Method
flavor::PetscSFWindowFlavorType = PetscSFWindowGetFlavorType(petsclib::PetscLibType, sf::PetscSF)
Get the PETSCSFWINDOW flavor type for PetscSF communication.
Logically Collective
Input Parameter:
- sf - star forest for communication of type `PETSCSFWINDOW`
Output Parameter:
- flavor - flavor type
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetFlavorType()
External Links
- PETSc Manual:
Vec/PetscSFWindowGetFlavorType
PETSc.LibPETSc.PetscSFWindowGetInfo — Method
PetscSFWindowGetInfo(petsclib::PetscLibType, sf::PetscSF, info::MPI_Info)
Get the MPI_Info handle used for window allocation.
Logically Collective
Input Parameter:
- sf - star forest for communication
Output Parameter:
- info - `MPI_Info` handle
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetInfo()
External Links
- PETSc Manual:
Vec/PetscSFWindowGetInfo
PETSc.LibPETSc.PetscSFWindowGetSyncType — Method
sync::PetscSFWindowSyncType = PetscSFWindowGetSyncType(petsclib::PetscLibType, sf::PetscSF)
Get the synchronization type for PetscSF communication of type PETSCSFWINDOW.
Logically Collective
Input Parameter:
- sf - star forest for communication
Output Parameter:
- sync - synchronization type
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetSyncType(), PetscSFWindowSyncType
External Links
- PETSc Manual:
Vec/PetscSFWindowGetSyncType
PETSc.LibPETSc.PetscSFWindowSetFlavorType — Method
PetscSFWindowSetFlavorType(petsclib::PetscLibType, sf::PetscSF, flavor::PetscSFWindowFlavorType)
Set the flavor type for MPI_Win creation.
Logically Collective
Input Parameters:
- sf - star forest for communication of type `PETSCSFWINDOW`
- flavor - flavor type
Options Database Key:
- `-sf_window_flavor <flavor>` - sets the flavor type: CREATE, DYNAMIC, ALLOCATE, or SHARED (see `PetscSFWindowFlavorType`)
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetFlavorType()
External Links
- PETSc Manual:
Vec/PetscSFWindowSetFlavorType
PETSc.LibPETSc.PetscSFWindowSetInfo — Method
PetscSFWindowSetInfo(petsclib::PetscLibType, sf::PetscSF, info::MPI_Info)
Set the MPI_Info handle that will be used for subsequent window allocation.
Logically Collective
Input Parameters:
- sf - star forest for communication
- info - `MPI_Info` handle
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetInfo()
External Links
- PETSc Manual:
Vec/PetscSFWindowSetInfo
PETSc.LibPETSc.PetscSFWindowSetSyncType — Method
PetscSFWindowSetSyncType(petsclib::PetscLibType, sf::PetscSF, sync::PetscSFWindowSyncType)
Set the synchronization type for PetscSF communication of type PETSCSFWINDOW.
Logically Collective
Input Parameters:
- sf - star forest for communication
- sync - synchronization type
Options Database Key:
- `-sf_window_sync <sync>` - sets the synchronization type: FENCE, LOCK, or ACTIVE (see `PetscSFWindowSyncType`)
Level: advanced
See also: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetSyncType(), PetscSFWindowSyncType
External Links
- PETSc Manual:
Vec/PetscSFWindowSetSyncType
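The window-specific setters above only take effect on an SF of type PETSCSFWINDOW. A hedged sketch of configuring one before setup, assuming the wrapper exposes the type string and the C enum values (`PETSCSF_WINDOW_FLAVOR_CREATE`, `PETSCSF_WINDOW_SYNC_FENCE`) under these names:

```julia
# Hedged sketch: use MPI one-sided (RMA window) communication for sf.
LibPETSc.PetscSFSetType(petsclib, sf, LibPETSc.PETSCSFWINDOW)
# Choose how the MPI_Win is created and how access epochs are synchronized;
# both calls must precede PetscSFSetUp to take effect.
LibPETSc.PetscSFWindowSetFlavorType(petsclib, sf, LibPETSc.PETSCSF_WINDOW_FLAVOR_CREATE)
LibPETSc.PetscSFWindowSetSyncType(petsclib, sf, LibPETSc.PETSCSF_WINDOW_SYNC_FENCE)
LibPETSc.PetscSFSetUp(petsclib, sf)
```

The same choices can be made without code via the `-sf_window_flavor` and `-sf_window_sync` options database keys when `PetscSFSetFromOptions` is called.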