PetscSF (Star Forest) - Low-level Interface

The PetscSF (Star Forest) component provides efficient parallel communication patterns for distributed data structures. A star forest is a specialized graph structure optimized for scatter/gather operations in parallel computing.

Overview

PetscSF enables:

  • Point-to-point communication: Efficient MPI communication patterns
  • Scatter/gather operations: Move data between processors
  • Halo exchange: Update ghost/boundary values
  • Reduction operations: Parallel sums, max, min across shared data
  • Irregular communication: Handle non-uniform data distributions

A star forest consists of:

  • Roots: Data owned locally
  • Leaves: Data needed from remote processes (or local)
  • Communication pattern: Which leaves come from which roots

PetscSF is the underlying communication layer for DM ghost point updates and other parallel operations.

Basic Usage

using PETSc, MPI

# Initialize MPI and PETSc
MPI.Init()
petsclib = PETSc.getlib()
PETSc.initialize(petsclib)
PetscInt = petsclib.PetscInt

# Create a star forest
sf = LibPETSc.PetscSFCreate(petsclib, MPI.COMM_WORLD)

# Define communication pattern
# nleaves: number of leaves (data items we need)
# ilocal: local indices for leaves (can be C_NULL if identity)
# iremote: (rank, index) pairs specifying which process/index to get from

nleaves = 5
# number of roots owned locally (for this simple example set equal to nleaves)
nroots = 5
ilocal = PetscInt[0, 1, 2, 3, 4]  # Local indices where data will be stored (0-based, C convention)
iremote = [
    LibPETSc.PetscSFNode(0, 0),
    LibPETSc.PetscSFNode(0, 1),
    LibPETSc.PetscSFNode(0, 2),
    LibPETSc.PetscSFNode(0, 3),
    LibPETSc.PetscSFNode(0, 4),
]

LibPETSc.PetscSFSetGraph(petsclib, sf, nroots, nleaves, ilocal, LibPETSc.PETSC_COPY_VALUES,
                         iremote, LibPETSc.PETSC_COPY_VALUES)

# Setup
LibPETSc.PetscSFSetUp(petsclib, sf)

# Cleanup
LibPETSc.PetscSFDestroy(petsclib, sf)

# Finalize PETSc and MPI
PETSc.finalize(petsclib)
MPI.Finalize()

Communication Operations

Broadcast (Scatter)

Send data from roots to leaves:

# Root data: data we own
root_data = Float64[1.0, 2.0, 3.0, 4.0, 5.0]

# Leaf data: buffer to receive data
leaf_data = zeros(Float64, nleaves)

# Broadcast: send root data to leaves
LibPETSc.PetscSFBcastBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
                           LibPETSc.MPI_REPLACE)
LibPETSc.PetscSFBcastEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data, leaf_data,
                         LibPETSc.MPI_REPLACE)

Reduce

Accumulate data from leaves back to roots:

# Leaf contributions
leaf_data = Float64[0.1, 0.2, 0.3, 0.4, 0.5]

# Root accumulator
root_data = zeros(Float64, nroots)

# Reduce: accumulate leaf data to roots
LibPETSc.PetscSFReduceBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_data,
                            LibPETSc.MPI_SUM)
LibPETSc.PetscSFReduceEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, leaf_data, root_data,
                          LibPETSc.MPI_SUM)

Fetch-and-Op Operations

Atomic read-modify-write on root data: fetch the current root values back to the leaves, then apply the operation with the leaf contributions:

# Buffer that receives the pre-update root values for each leaf
leaf_updates = zeros(Float64, nleaves)

# Fetch root values into leaf_updates, then add leaf_data into root_data
LibPETSc.PetscSFFetchAndOpBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
                                leaf_data, leaf_updates, LibPETSc.MPI_SUM)
LibPETSc.PetscSFFetchAndOpEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
                              leaf_data, leaf_updates, LibPETSc.MPI_SUM)

MPI Operations

Supported MPI operations for reduce:

  • MPI_SUM: Sum values
  • MPI_MAX: Maximum value
  • MPI_MIN: Minimum value
  • MPI_REPLACE: Replace (last write wins)
  • MPI_PROD: Product

Star Forest Types

Available through PetscSFSetType:

  • PETSCSFBASIC: MPI point-to-point communication (default)
  • PETSCSFWINDOW: MPI-3 one-sided windows
  • PETSCSFNEIGHBOR: MPI-3 neighborhood collectives (efficient for structured patterns)
  • PETSCSFALLGATHERV: All-gather based
  • PETSCSFALLGATHER: All-gather for small data
  • PETSCSFGATHERV: Gather-based
  • PETSCSFGATHER: Simple gather
  • PETSCSFALLTOALL: All-to-all based

Graph Queries

# Get number of roots (locally owned data)
nroots = Ref{PetscInt}()
LibPETSc.PetscSFGetGraph(petsclib, sf, nroots, C_NULL, C_NULL, C_NULL)

# Get number of leaves
nleaves = Ref{PetscInt}()
LibPETSc.PetscSFGetGraph(petsclib, sf, C_NULL, nleaves, C_NULL, C_NULL)

# Get full graph
ilocal_ptr = Ref{Ptr{PetscInt}}()
iremote_ptr = Ref{Ptr{LibPETSc.PetscSFNode}}()
LibPETSc.PetscSFGetGraph(petsclib, sf, nroots, nleaves, ilocal_ptr, iremote_ptr)

Multi-Root Support

Handle multiple values (DOFs) per point by expanding a point SF into one that communicates interleaved blocks of data. For roots with degree 0 or greater than 1, PetscSFGetMultiSF() (see the reference below) provides an inner SF in which every root has degree exactly 1.

# Expand the point SF so each point carries num_components interleaved values
bs = num_components  # block size: values per point
dof_sf = LibPETSc.PetscSFCreateStridedSF(petsclib, sf, bs, LibPETSc.PETSC_DECIDE,
                                         LibPETSc.PETSC_DECIDE)

Common Use Cases

1. Ghost Point Updates (Halo Exchange)

# After modifying owned data, update ghost points
# 1. Pack local data
# 2. Broadcast to leaves (ghost points)
LibPETSc.PetscSFBcastBegin(petsclib, sf, datatype, local_data, ghost_data, op)
LibPETSc.PetscSFBcastEnd(petsclib, sf, datatype, local_data, ghost_data, op)

2. Parallel Assembly

# After local assembly, accumulate contributions from other processes
# 1. Each process computes local contributions
# 2. Reduce to accumulate at owners
LibPETSc.PetscSFReduceBegin(petsclib, sf, datatype, local_contrib, global_data, MPI_SUM)
LibPETSc.PetscSFReduceEnd(petsclib, sf, datatype, local_contrib, global_data, MPI_SUM)

3. DM Point Communication

# Get natural SF for a DM (describes point distribution)
dm_sf = Ref{LibPETSc.PetscSF}()
# LibPETSc.DMGetPointSF(petsclib, dm, dm_sf)

# Use to communicate point-based data

Performance Considerations

  • Choose appropriate type: PETSCSFNEIGHBOR is often best for structured grids
  • Reuse SF objects: Creating the communication pattern is expensive
  • Batch communications: Combine multiple small messages when possible
  • Alignment: Use properly aligned data types for better performance
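The reuse and batching advice in practice: build the SF once and amortize the setup cost over many halo exchanges, overlapping local work between the Begin/End calls. A sketch; `update_owned_values!` and `nsteps` stand in for hypothetical application code:

```julia
# Setup is the expensive part; do it once and reuse the SF every step
LibPETSc.PetscSFSetUp(petsclib, sf)
for step in 1:nsteps
    update_owned_values!(root_data)  # hypothetical update of locally owned data
    LibPETSc.PetscSFBcastBegin(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
                               leaf_data, LibPETSc.MPI_REPLACE)
    # ... do computation that does not need ghost values here (overlap) ...
    LibPETSc.PetscSFBcastEnd(petsclib, sf, LibPETSc.MPI_DOUBLE, root_data,
                             leaf_data, LibPETSc.MPI_REPLACE)
end
```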

Function Reference

PETSc.LibPETSc.PetscSFComposeMethod
PetscSFCompose(petsclib::PetscLibType,sfA::PetscSF, sfB::PetscSF, sfBA::PetscSF)

Compose a new PetscSF by putting the second PetscSF under the first one in a top (roots) down (leaves) view

Input Parameters:

  • sfA - The first PetscSF
  • sfB - The second PetscSF

Output Parameter:

  • sfBA - The composite PetscSF

Level: developer

-seealso: PetscSF, PetscSFComposeInverse(), PetscSFGetGraph(), PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFComposeInverseMethod
PetscSFComposeInverse(petsclib::PetscLibType,sfA::PetscSF, sfB::PetscSF, sfBA::PetscSF)

Compose a new PetscSF by putting the inverse of the second PetscSF under the first one

Input Parameters:

  • sfA - The first PetscSF
  • sfB - The second PetscSF

Output Parameter:

  • sfBA - The composite PetscSF.

Level: developer

-seealso: PetscSF, PetscSFCompose(), PetscSFGetGraph(), PetscSFSetGraph(), PetscSFCreateInverseSF()

External Links

source
PETSc.LibPETSc.PetscSFComputeDegreeBeginMethod
degree::Vector{PetscInt} = PetscSFComputeDegreeBegin(petsclib::PetscLibType,sf::PetscSF)

begin computation of degree for each root vertex, to be completed with PetscSFComputeDegreeEnd()

Collective

Input Parameter:

  • sf - star forest

Output Parameter:

  • degree - degree of each root vertex

Level: advanced

-seealso: PetscSF, PetscSFGatherBegin(), PetscSFComputeDegreeEnd()

External Links

source
PETSc.LibPETSc.PetscSFComputeDegreeEndMethod
degree::Vector{PetscInt} = PetscSFComputeDegreeEnd(petsclib::PetscLibType,sf::PetscSF)

complete computation of degree for each root vertex, started with PetscSFComputeDegreeBegin()

Collective

Input Parameter:

  • sf - star forest

Output Parameter:

  • degree - degree of each root vertex

Level: developer

-seealso: PetscSF, PetscSFGatherBegin(), PetscSFComputeDegreeBegin()

External Links

source
PETSc.LibPETSc.PetscSFComputeMultiRootOriginalNumberingMethod
nMultiRoots::PetscInt,multiRootsOrigNumbering::Vector{PetscInt} = PetscSFComputeMultiRootOriginalNumbering(petsclib::PetscLibType,sf::PetscSF, degree::Vector{PetscInt})

Returns the original numbering of multi-roots. Each multi-root is assigned the index of its corresponding original root.

Collective

Input Parameters:

  • sf - star forest
  • degree - degree of each root vertex, computed with PetscSFComputeDegreeBegin()/PetscSFComputeDegreeEnd()

Output Parameters:

  • nMultiRoots - (optional) number of multi-roots (roots of multi-PetscSF)
  • multiRootsOrigNumbering - original indices of multi-roots; length of this array is nMultiRoots

Level: developer

-seealso: PetscSF, PetscSFComputeDegreeBegin(), PetscSFComputeDegreeEnd(), PetscSFGetMultiSF()

External Links

source
PETSc.LibPETSc.PetscSFConcatenateMethod
PetscSFConcatenate(petsclib::PetscLibType,comm::MPI_Comm, nsfs::PetscInt, sfs::Vector{PetscSF}, rootMode::PetscSFConcatenateRootMode, leafOffsets::Vector{PetscInt}, newsf::PetscSF)

concatenate multiple PetscSF into one

Input Parameters:

  • comm - the communicator
  • nsfs - the number of input PetscSF
  • sfs - the array of input PetscSF
  • rootMode - the root mode specifying how roots are handled
  • leafOffsets - the array of local leaf offsets, one for each input PetscSF, or NULL for contiguous storage

Output Parameter:

  • newsf - The resulting PetscSF

Level: advanced

-seealso: PetscSF, PetscSFCompose(), PetscSFGetGraph(), PetscSFSetGraph(), PetscSFConcatenateRootMode

External Links

source
PETSc.LibPETSc.PetscSFCreateMethod
sf::PetscSF = PetscSFCreate(petsclib::PetscLibType,comm::MPI_Comm)

create a star forest communication context

Collective

Input Parameter:

  • comm - communicator on which the star forest will operate

Output Parameter:

  • sf - new star forest context

Options Database Key:

  • -sf_type basic - Use MPI persistent Isend/Irecv for communication (Default)
  • -sf_type window - Use MPI-3 one-sided window for communication
  • -sf_type neighbor - Use MPI-3 neighborhood collectives for communication
  • -sf_neighbor_persistent <bool> - If true, use MPI-4 persistent neighborhood collectives for communication (used along with -sf_type neighbor)

Level: intermediate

-seealso: PetscSF, PetscSFSetType, PetscSFSetGraph(), PetscSFSetGraphWithPattern(), PetscSFDestroy()

External Links

source
PETSc.LibPETSc.PetscSFCreateByMatchingIndicesMethod
sfA::PetscSF,sf::PetscSF = PetscSFCreateByMatchingIndices(petsclib::PetscLibType,layout::PetscLayout, numRootIndices::PetscInt, rootIndices::PetscInt, rootLocalIndices::PetscInt, rootLocalOffset::PetscInt, numLeafIndices::PetscInt, leafIndices::PetscInt, leafLocalIndices::PetscInt, leafLocalOffset::PetscInt)

Create PetscSF by matching root and leaf indices

Collective

Input Parameters:

  • layout - PetscLayout defining the global index space and the rank that brokers each index
  • numRootIndices - size of rootIndices
  • rootIndices - PetscInt array of global indices of which this process requests ownership
  • rootLocalIndices - root local index permutation (NULL if no permutation)
  • rootLocalOffset - offset to be added to root local indices
  • numLeafIndices - size of leafIndices
  • leafIndices - PetscInt array of global indices with which this process requires data associated
  • leafLocalIndices - leaf local index permutation (NULL if no permutation)
  • leafLocalOffset - offset to be added to leaf local indices

Output Parameters:

  • sfA - star forest representing the communication pattern from the layout space to the leaf space (NULL if not needed)
  • sf - star forest representing the communication pattern from the root space to the leaf space

Level: advanced

-seealso: PetscSF, PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFCreateEmbeddedLeafSFMethod
newsf::PetscSF = PetscSFCreateEmbeddedLeafSF(petsclib::PetscLibType,sf::PetscSF, nselected::PetscInt, selected::PetscInt)

removes edges from all but the selected leaves of a PetscSF, does not remap indices

Collective

Input Parameters:

  • sf - original star forest
  • nselected - number of selected leaves on this process
  • selected - indices of the selected leaves on this process

Output Parameter:

  • newsf - new star forest

Level: advanced

-seealso: PetscSF, PetscSFCreateEmbeddedRootSF(), PetscSFSetGraph(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFCreateEmbeddedRootSFMethod
esf::PetscSF = PetscSFCreateEmbeddedRootSF(petsclib::PetscLibType,sf::PetscSF, nselected::PetscInt, selected::PetscInt)

removes edges from all but the selected roots of a PetscSF, does not remap indices

Collective

Input Parameters:

  • sf - original star forest
  • nselected - number of selected roots on this process
  • selected - indices of the selected roots on this process

Output Parameter:

  • esf - new star forest

Level: advanced

-seealso: PetscSF, PetscSFSetGraph(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFCreateFromLayoutsMethod
sf::PetscSF = PetscSFCreateFromLayouts(petsclib::PetscLibType,rmap::PetscLayout, lmap::PetscLayout)

Creates a parallel star forest mapping two PetscLayout objects

Collective

Input Parameters:

  • rmap - PetscLayout defining the global root space
  • lmap - PetscLayout defining the global leaf space

Output Parameter:

  • sf - The parallel star forest

Level: intermediate

-seealso: PetscSF, PetscSFCreate(), PetscLayoutCreate(), PetscSFSetGraphLayout()

External Links

source
PETSc.LibPETSc.PetscSFCreateInverseSFMethod
isf::PetscSF = PetscSFCreateInverseSF(petsclib::PetscLibType,sf::PetscSF)

given a PetscSF in which all vertices have degree 1, creates the inverse map

Collective

Input Parameter:

  • sf - star forest to invert

Output Parameter:

  • isf - inverse of sf

Level: advanced

-seealso: PetscSF, PetscSFType, PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFCreateRemoteOffsetsMethod
remoteOffsets::Vector{PetscInt} = PetscSFCreateRemoteOffsets(petsclib::PetscLibType,sf::PetscSF, rootSection::PetscSection, leafSection::PetscSection)

Create offsets for point data on remote processes

Collective

Input Parameters:

  • sf - The PetscSF
  • rootSection - Data layout of remote points for outgoing data (this is layout for roots)
  • leafSection - Data layout of local points for incoming data (this is layout for leaves)

Output Parameter:

  • remoteOffsets - Offsets for point data on remote processes (these are offsets from the root section), or NULL

Level: developer

-seealso: PetscSF, PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFCreateSectionSFMethod
sectionSF::PetscSF = PetscSFCreateSectionSF(petsclib::PetscLibType,sf::PetscSF, rootSection::PetscSection, remoteOffsets::Vector{PetscInt}, leafSection::PetscSection)

Create an expanded PetscSF of dofs, assuming the input PetscSF relates points

Collective

Input Parameters:

  • sf - The PetscSF
  • rootSection - Data layout of remote points for outgoing data (this is usually the serial section)
  • remoteOffsets - Offsets for point data on remote processes (these are offsets from the root section), or NULL
  • leafSection - Data layout of local points for incoming data (this is the distributed section)

Output Parameter:

  • sectionSF - The new PetscSF

Level: advanced

-seealso: PetscSF, PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFCreateStridedSFMethod
vsf::PetscSF = PetscSFCreateStridedSF(petsclib::PetscLibType,sf::PetscSF, bs::PetscInt, ldr::PetscInt, ldl::PetscInt)

Create a PetscSF to communicate interleaved blocks of data

Collective

Input Parameters:

  • sf - star forest
  • bs - stride
  • ldr - leading dimension of root space
  • ldl - leading dimension of leaf space

Output Parameter:

  • vsf - the new PetscSF

Level: intermediate

-seealso: PetscSF, PetscSFCreate(), PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFDestroyMethod
PetscSFDestroy(petsclib::PetscLibType,sf::PetscSF)

destroy a star forest

Collective

Input Parameter:

  • sf - address of star forest

Level: intermediate

-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFReset()

External Links

source
PETSc.LibPETSc.PetscSFDistributeSectionMethod
remoteOffsets::Vector{PetscInt} = PetscSFDistributeSection(petsclib::PetscLibType,sf::PetscSF, rootSection::PetscSection, leafSection::PetscSection)

Create a new PetscSection reorganized, moving from the root to the leaves of the PetscSF

Collective

Input Parameters:

  • sf - The PetscSF
  • rootSection - Section defined on root space

Output Parameters:

  • remoteOffsets - root offsets in leaf storage, or NULL, its length will be the size of the chart of leafSection
  • leafSection - Section defined on the leaf space

Level: advanced

-seealso: PetscSF, PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFDuplicateMethod
newsf::PetscSF = PetscSFDuplicate(petsclib::PetscLibType,sf::PetscSF, opt::PetscSFDuplicateOption)

duplicate a PetscSF, optionally preserving rank connectivity and graph

Collective

Input Parameters:

  • sf - communication object to duplicate
  • opt - PETSCSF_DUPLICATE_CONFONLY, PETSCSF_DUPLICATE_RANKS, or PETSCSF_DUPLICATE_GRAPH (see PetscSFDuplicateOption)

Output Parameter:

  • newsf - new communication object

Level: beginner

-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFSetType(), PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFGetGraphMethod
nroots::PetscInt,nleaves::PetscInt,iloc::Vector{PetscInt} = PetscSFGetGraph(petsclib::PetscLibType,sf::PetscSF, iremote::Vector{PetscSFNode})

Get the graph specifying a parallel star forest

Not Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • nroots - number of root vertices on the current process (these are possible targets for other process to attach leaves)
  • nleaves - number of leaf vertices on the current process, each of these references a root on any process
  • ilocal - locations of leaves in leafdata buffers (if returned value is NULL, it means leaves are in contiguous storage)
  • iremote - remote locations of root vertices for each leaf on the current process

Level: intermediate

-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFGetGraphLayoutMethod
nleaves::PetscInt,iloc::Vector{PetscInt},gremote::Vector{PetscInt} = PetscSFGetGraphLayout(petsclib::PetscLibType,sf::PetscSF, layout::PetscLayout)

Get the global indices and PetscLayout that describe this star forest

Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • layout - PetscLayout defining the global space for roots
  • nleaves - number of leaf vertices on the current process, each of these references a root on any process
  • ilocal - locations of leaves in leafdata buffers, or NULL for contiguous storage
  • gremote - root vertices in global numbering corresponding to leaves in ilocal

Level: intermediate

-seealso: PetscSF, PetscSFSetGraphLayout(), PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFGetGroupsMethod
PetscSFGetGroups(petsclib::PetscLibType,sf::PetscSF, incoming::MPI_Group, outgoing::MPI_Group)

gets incoming and outgoing process groups

Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • incoming - group of origin processes for incoming edges (leaves that reference my roots)
  • outgoing - group of destination processes for outgoing edges (roots that I reference)

Level: developer

-seealso: PetscSF, PetscSFGetWindow(), PetscSFRestoreWindow()

External Links

source
PETSc.LibPETSc.PetscSFGetLeafRangeMethod
minleaf::PetscInt,maxleaf::PetscInt = PetscSFGetLeafRange(petsclib::PetscLibType,sf::PetscSF)

Get the active leaf ranges

Not Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • minleaf - minimum active leaf on this process. Returns 0 if there are no leaves.
  • maxleaf - maximum active leaf on this process. Returns -1 if there are no leaves.

Level: developer

-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFGetLeafRanksMethod
ioffset::Vector{PetscInt},irootloc::Vector{PetscInt} = PetscSFGetLeafRanks(petsclib::PetscLibType,sf::PetscSF, niranks::PetscMPIInt, iranks::Vector{PetscMPIInt})

Get leaf ranks referencing roots on this process

Not Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • niranks - number of leaf ranks referencing roots on this process
  • iranks - [niranks] array of ranks
  • ioffset - [niranks+1] offset in irootloc for each rank
  • irootloc - [ioffset[niranks]] concatenated array holding local indices of roots referenced by each leaf rank

Level: developer

-seealso: PetscSF, PetscSFGetRootRanks()

External Links

source
PETSc.LibPETSc.PetscSFGetMultiSFMethod
PetscSFGetMultiSF(petsclib::PetscLibType,sf::PetscSF, multi::PetscSF)

gets the inner PetscSF implementing gathers and scatters

Collective

Input Parameter:

  • sf - star forest that may contain roots with 0 or with more than 1 vertex

Output Parameter:

  • multi - star forest with split roots, such that each root has degree exactly 1

Level: developer

-seealso: PetscSF, PetscSFSetGraph(), PetscSFGatherBegin(), PetscSFScatterBegin(), PetscSFComputeMultiRootOriginalNumbering()

External Links

source
PETSc.LibPETSc.PetscSFGetRanksSFMethod
PetscSFGetRanksSF(petsclib::PetscLibType,sf::PetscSF, rsf::PetscSF)

gets the PetscSF to perform communications with root ranks

Collective

Input Parameter:

  • sf - star forest

Output Parameter:

  • rsf - the star forest with a single root per process to perform communications

Level: developer

-seealso: PetscSF, PetscSFSetGraph(), PetscSFGetRootRanks()

External Links

source
PETSc.LibPETSc.PetscSFGetRootRanksMethod
roffset::Vector{PetscInt},rmine::Vector{PetscInt},rremote::Vector{PetscInt} = PetscSFGetRootRanks(petsclib::PetscLibType,sf::PetscSF, nranks::PetscMPIInt, ranks::Vector{PetscMPIInt})

Get root ranks and number of vertices referenced by leaves on this process

Not Collective

Input Parameter:

  • sf - star forest

Output Parameters:

  • nranks - number of ranks referenced by local part
  • ranks - [nranks] array of ranks
  • roffset - [nranks+1] offset in rmine/rremote for each rank
  • rmine - [roffset[nranks]] concatenated array holding local indices referencing each remote rank, or NULL
  • rremote - [roffset[nranks]] concatenated array holding remote indices referenced for each remote rank, or NULL

Level: developer

-seealso: PetscSF, PetscSFGetLeafRanks()

External Links

source
PETSc.LibPETSc.PetscSFGetSubSFMethod
PetscSFGetSubSF(petsclib::PetscLibType,mainsf::PetscSF, map::ISLocalToGlobalMapping, subSF::PetscSF)

Returns a PetscSF for a specific subset of points; leaves are renumbered to reflect the new ordering

Collective

Input Parameters:

  • mainsf - PetscSF structure
  • map - a ISLocalToGlobalMapping that contains the subset of points

Output Parameter:

  • subSF - the sub-PetscSF restricted to the desired subset of points

Level: intermediate

-seealso: PetscSF

External Links

source
PETSc.LibPETSc.PetscSFGetTypeMethod
type::PetscSFType = PetscSFGetType(petsclib::PetscLibType,sf::PetscSF)

Get the PetscSF communication implementation

Not Collective

Input Parameter:

  • sf - the PetscSF context

Output Parameter:

  • type - the PetscSF type name

Level: intermediate

-seealso: PetscSF, PetscSFType, PetscSFSetType(), PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFMergeMethod
PetscSFMerge(petsclib::PetscLibType,sfa::PetscSF, sfb::PetscSF, merged::PetscSF)

append/merge indices of sfb into sfa, with preference for sfb

Collective

Input Parameters:

  • sfa - default PetscSF
  • sfb - additional edges to add/replace edges in sfa

Output Parameter:

  • merged - new PetscSF with combined edges

Level: intermediate

-seealso: PetscSFCompose()

External Links

source
PETSc.LibPETSc.PetscSFRegisterMethod
PetscSFRegister(petsclib::PetscLibType,name::String, create::external)

Adds an implementation of the PetscSF communication protocol.

Not Collective, No Fortran Support

Input Parameters:

  • name - name of a new user-defined implementation
  • create - routine to create method context

-seealso: PetscSF, PetscSFType, PetscSFRegisterAll(), PetscSFInitializePackage()

External Links

source
PETSc.LibPETSc.PetscSFResetMethod
PetscSFReset(petsclib::PetscLibType,sf::PetscSF)

Reset a star forest so that different sizes or neighbors can be used

Collective

Input Parameter:

  • sf - star forest

Level: advanced

-seealso: PetscSF, PetscSFCreate(), PetscSFSetGraph(), PetscSFDestroy()

External Links

source
PETSc.LibPETSc.PetscSFSetFromOptionsMethod
PetscSFSetFromOptions(petsclib::PetscLibType,sf::PetscSF)

set PetscSF options using the options database

Logically Collective

Input Parameter:

  • sf - star forest

Options Database Keys:

  • -sf_type - implementation type, see PetscSFSetType()
  • -sf_rank_order - sort composite points for gathers and scatters in rank order, gathers are non-deterministic otherwise
  • -sf_use_default_stream <bool> - Assume callers of PetscSF computed the input root/leaf data on the default CUDA stream, so PetscSF also uses the default stream to process data and no stream synchronization is needed between PetscSF and its caller (default: true). Only takes effect together with -use_gpu_aware_mpi 1.
  • -sf_use_stream_aware_mpi <bool> - Assume the underlying MPI is CUDA-stream aware, so PetscSF will not synchronize streams for send/recv buffers passed to MPI (default: false). Only takes effect together with -use_gpu_aware_mpi 1.

  • -sf_backend <cuda,hip,kokkos> - Select the device backend PetscSF uses. On CUDA (HIP) devices one can choose cuda (hip) or kokkos, with kokkos the default; on other devices kokkos is the only available backend.

Level: intermediate

-seealso: PetscSF, PetscSFCreate(), PetscSFSetType()

External Links

source
PETSc.LibPETSc.PetscSFSetGraphMethod
PetscSFSetGraph(petsclib::PetscLibType,sf::PetscSF, nroots::PetscInt, nleaves::PetscInt, iloc::Vector{PetscInt}, locmode::PetscCopyMode, iremote::Vector{PetscSFNode}, remotemode::PetscCopyMode)

Set a parallel star forest

Collective

Input Parameters:

  • sf - star forest
  • nroots - number of root vertices on the current process (these are possible targets for other process to attach leaves)
  • nleaves - number of leaf vertices on the current process, each of these references a root on any process
  • ilocal - locations of leaves in leafdata buffers; pass NULL for contiguous storage (locations must be >= 0, enforced during setup in debug mode)
  • localmode - copy mode for ilocal
  • iremote - remote locations of root vertices for each leaf on the current process, one entry per leaf (locations must be >= 0, enforced during setup in debug mode)
  • remotemode - copy mode for iremote

Level: intermediate

-seealso: PetscSF, PetscSFType, PetscSFCreate(), PetscSFView(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFSetGraphFromCoordinatesMethod
PetscSFSetGraphFromCoordinates(petsclib::PetscLibType,sf::PetscSF, nroots::PetscInt, nleaves::PetscInt, dim::PetscInt, tol::PetscReal, rootcoords::PetscReal, leafcoords::PetscReal)

Create SF by fuzzy matching leaf coordinates to root coordinates

Collective

Input Parameters:

  • sf - PetscSF to set graph on
  • nroots - number of root coordinates
  • nleaves - number of leaf coordinates
  • dim - spatial dimension of coordinates
  • tol - positive tolerance for matching
  • rootcoords - array of root coordinates in which root i component d is [i*dim+d]
  • leafcoords - array of leaf coordinates in which leaf i component d is [i*dim+d]

-seealso: PetscSFCreate(), PetscSFSetGraph(), PetscSFCreateByMatchingIndices()

External Links

source
PETSc.LibPETSc.PetscSFSetGraphLayoutMethod
iloc::PetscInt = PetscSFSetGraphLayout(petsclib::PetscLibType,sf::PetscSF, layout::PetscLayout, nleaves::PetscInt, locmode::PetscCopyMode, gremote::PetscInt)

Set a parallel star forest via global indices and a PetscLayout

Collective

Input Parameters:

  • sf - star forest
  • layout - PetscLayout defining the global space for roots
  • nleaves - number of leaf vertices on the current process, each of these references a root on any process
  • ilocal - locations of leaves in leafdata buffers, pass NULL for contiguous storage
  • localmode - copy mode for ilocal
  • gremote - root vertices in global numbering corresponding to leaves in ilocal

Level: intermediate

-seealso: PetscSF, PetscSFGetGraphLayout(), PetscSFCreate(), PetscSFView(), PetscSFSetGraph(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFSetGraphSectionMethod
PetscSFSetGraphSection(petsclib::PetscLibType,sf::PetscSF, locSection::PetscSection, globSection::PetscSection)

Sets the PetscSF graph encoding the parallel dof overlap based upon the PetscSection describing the data layout.

Input Parameters:

  • sf - The PetscSF
  • localSection - PetscSection describing the local data layout
  • globalSection - PetscSection describing the global data layout

Level: developer

-seealso: PetscSF, PetscSFSetGraph(), PetscSFSetGraphLayout()

External Links

source
PETSc.LibPETSc.PetscSFSetGraphWithPatternMethod
PetscSFSetGraphWithPattern(petsclib::PetscLibType,sf::PetscSF, map::PetscLayout, pattern::PetscSFPattern)

Sets the graph of a PetscSF with a specific pattern

Collective

Input Parameters:

  • sf - The PetscSF
  • map - Layout of roots over all processes (insignificant when pattern is PETSCSF_PATTERN_ALLTOALL)
  • pattern - One of PETSCSF_PATTERN_ALLGATHER, PETSCSF_PATTERN_GATHER, PETSCSF_PATTERN_ALLTOALL

Level: intermediate

-seealso: PetscSF, PetscSFCreate(), PetscSFView(), PetscSFGetGraph()

External Links

source
PETSc.LibPETSc.PetscSFSetRankOrderMethod
PetscSFSetRankOrder(petsclib::PetscLibType,sf::PetscSF, flg::PetscBool)

sort multi-points for gathers and scatters in rank order

Logically Collective

Input Parameters:

  • sf - star forest
  • flg - PETSC_TRUE to sort, PETSC_FALSE to skip sorting (lower setup cost, but non-deterministic)

Level: advanced

-seealso: PetscSF, PetscSFType, PetscSFGatherBegin(), PetscSFScatterBegin()

External Links

source
PETSc.LibPETSc.PetscSFSetTypeMethod
PetscSFSetType(petsclib::PetscLibType,sf::PetscSF, type::PetscSFType)

Set the PetscSF communication implementation

Collective

Input Parameters:

  • sf - the PetscSF context
  • type - a known method

-seealso: PetscSF, PetscSFType, PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFSetUpMethod
PetscSFSetUp(petsclib::PetscLibType,sf::PetscSF)

set up communication structures for a PetscSF, after this is done it may be used to perform communication

Collective

Input Parameter:

  • sf - star forest communication object

Level: beginner

-seealso: PetscSF, PetscSFType, PetscSFSetFromOptions(), PetscSFSetType()

External Links

source
PETSc.LibPETSc.PetscSFSetUpRanksMethod
PetscSFSetUpRanks(petsclib::PetscLibType,sf::PetscSF, dgroup::MPI_Group)

Set up data structures associated with ranks; this is for internal use by PetscSF implementations.

Collective

Input Parameters:

  • sf - PetscSF to set up; PetscSFSetGraph() must have been called
  • dgroup - MPI_Group of ranks to be distinguished (e.g., for self or shared memory exchange)

Level: developer

-seealso: PetscSF, PetscSFGetRootRanks()

External Links

source
PETSc.LibPETSc.PetscSFViewMethod
PetscSFView(petsclib::PetscLibType,sf::PetscSF, viewer::PetscViewer)

view a star forest

Collective

Input Parameters:

  • sf - star forest
  • viewer - viewer to display graph, for example PETSC_VIEWER_STDOUT_WORLD

Level: beginner

-seealso: PetscSF, PetscViewer, PetscSFCreate(), PetscSFSetGraph()

External Links

source
PETSc.LibPETSc.PetscSFViewFromOptionsMethod
PetscSFViewFromOptions(petsclib::PetscLibType,A::PetscSF, obj::PetscObject, name::String)

View a PetscSF based on arguments in the options database

Collective

Input Parameters:

  • A - the star forest
  • obj - Optional object that provides the prefix for the option names
  • name - command line option

Level: intermediate

-seealso: PetscSF, PetscSFView, PetscObjectViewFromOptions(), PetscSFCreate()

External Links

source
PETSc.LibPETSc.PetscSFWindowGetFlavorTypeMethod
flavor::PetscSFWindowFlavorType = PetscSFWindowGetFlavorType(petsclib::PetscLibType,sf::PetscSF)

Get PETSCSFWINDOW flavor type for PetscSF communication

Logically Collective

Input Parameter:

  • sf - star forest for communication of type PETSCSFWINDOW

Output Parameter:

  • flavor - flavor type

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetFlavorType()

External Links

source
PETSc.LibPETSc.PetscSFWindowGetInfoMethod
PetscSFWindowGetInfo(petsclib::PetscLibType,sf::PetscSF, info::MPI_Info)

Get the MPI_Info handle used for windows allocation

Logically Collective

Input Parameter:

  • sf - star forest for communication

Output Parameter:

  • info - MPI_Info handle

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetInfo()

External Links

source
PETSc.LibPETSc.PetscSFWindowGetSyncTypeMethod
sync::PetscSFWindowSyncType = PetscSFWindowGetSyncType(petsclib::PetscLibType,sf::PetscSF)

Get synchronization type for PetscSF communication of type PETSCSFWINDOW

Logically Collective

Input Parameter:

  • sf - star forest for communication

Output Parameter:

  • sync - synchronization type

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowSetSyncType(), PetscSFWindowSyncType

External Links

source
PETSc.LibPETSc.PetscSFWindowSetFlavorTypeMethod
PetscSFWindowSetFlavorType(petsclib::PetscLibType,sf::PetscSF, flavor::PetscSFWindowFlavorType)

Set flavor type for MPI_Win creation

Logically Collective

Input Parameters:

  • sf - star forest for communication of type PETSCSFWINDOW
  • flavor - flavor type

Options Database Key:

  • -sf_window_flavor <flavor> - sets the flavor type CREATE, DYNAMIC, ALLOCATE or SHARED (see PetscSFWindowFlavorType)

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetFlavorType()

External Links

source
PETSc.LibPETSc.PetscSFWindowSetInfoMethod
PetscSFWindowSetInfo(petsclib::PetscLibType,sf::PetscSF, info::MPI_Info)

Set the MPI_Info handle that will be used for subsequent windows allocation

Logically Collective

Input Parameters:

  • sf - star forest for communication
  • info - MPI_Info handle

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetInfo()

External Links

source
PETSc.LibPETSc.PetscSFWindowSetSyncTypeMethod
PetscSFWindowSetSyncType(petsclib::PetscLibType,sf::PetscSF, sync::PetscSFWindowSyncType)

Set synchronization type for PetscSF communication of type PETSCSFWINDOW

Logically Collective

Input Parameters:

  • sf - star forest for communication
  • sync - synchronization type

Options Database Key:

  • -sf_window_sync <sync> - sets the synchronization type FENCE, LOCK, or ACTIVE (see PetscSFWindowSyncType)

Level: advanced

-seealso: PetscSF, PETSCSFWINDOW, PetscSFSetFromOptions(), PetscSFWindowGetSyncType(), PetscSFWindowSyncType

External Links

source