*** Note that this commit depends on a corresponding change in
ThirdParty-dev. Ensure that both repositories are up to date before
re-building OpenFOAM.
New environment variables have been added to explicitly control the
installation type of the third-party decomposition libraries and of the
ParaView visualisation software. These are set in etc/bashrc and can be
overridden in a ~/.OpenFOAM/<version>/prefs.sh file or similar.
The variables relating to the decomposition libraries are SCOTCH_TYPE,
METIS_TYPE, PARMETIS_TYPE and ZOLTAN_TYPE, and they can take values of
none, system, or ThirdParty. In the case of ThirdParty, a
<library>_VERSION variable can also be specified. If the version is not
specified then the configuration will search for a source directory, and
if multiple such directories are found then the one with the highest
version number will be used.
The variable relating to ParaView is ParaView_TYPE, which can similarly
be set to none, system, or ThirdParty. ParaView_VERSION can also be
specified when the type is ThirdParty. If the version is not
specified then the installation with the highest version number will be
used.
An example ~/.OpenFOAM/dev/prefs.sh file, in which all decomposition
libraries are enabled, and the Scotch and ParaView versions are
explicitly set, is as follows:
export SCOTCH_TYPE=ThirdParty
export SCOTCH_VERSION=7.0.3
export METIS_TYPE=ThirdParty
export PARMETIS_TYPE=ThirdParty
export ZOLTAN_TYPE=ThirdParty
export ParaView_TYPE=ThirdParty
export ParaView_VERSION=5.11.2
*** Note that if version numbers are not set then the configuration will
search for a decomposition library source directory, whereas for ParaView
it will search for an installation directory. This is because the
decomposition libraries are built as part of OpenFOAM's ./Allwmake, but
ParaView is not. This distinction remains. If a local compilation of
ParaView is needed, then './makeParaView -version X.XX.X' should be called
explicitly in the third-party directory prior to building OpenFOAM.
The name of the third-party directory can now also be set independently.
This simplifies some packaging processes in that it permits the third-party
directory to be located within the OpenFOAM installation directory and
therefore bundled into the same binary package.
With the new -rm option each processor time directory is removed after it
has been reconstructed. For multi-region cases run with the -region and -rm
options, only the processor time data for the reconstructed region is
removed, whereas with the -allRegions option the entire processor time
directory is removed after all of the regions have been reconstructed.
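For example, assuming these options belong to reconstructPar (the utility
that reconstructs processor time directories) and using a hypothetical
region name, typical invocations might be:

reconstructPar -allRegions -rm
reconstructPar -region fluid -rm

The first call removes each processor time directory once all of its
regions have been reconstructed; the second removes only the fluid region's
data from the processor time directories.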
The majority of input parameters now support automatic unit conversion.
Units are specified within square brackets, either before or after the
value. Primitive parameters (e.g., scalars, vectors, tensors, ...),
dimensioned types, fields, Function1-s and Function2-s all support unit
conversion in this way.
Unit conversion occurs on input only. OpenFOAM writes out all fields and
parameters in standard units. It is recommended to use '.orig' files in
the 0 directory to preserve user-readable input if those files are being
modified by pre-processing applications (e.g., setFields).
For example, to specify a volumetric flow rate inlet boundary in litres
per second [l/s], rather than metres-cubed per second [m^3/s], in 0/U:
boundaryField
{
    inlet
    {
        type                flowRateInletVelocity;
        volumetricFlowRate  0.1 [l/s];
        value               $internalField;
    }
    ...
}
Or, to specify the pressure field in bar, in 0/p:
internalField uniform 1 [bar];
Or, to convert the parameters of an Arrhenius reaction rate from a
cm-mol-kcal unit system, in constant/chemistryProperties:
reactions
{
    methaneReaction
    {
        type        irreversibleArrhenius;
        reaction    "CH4^0.2 + 2O2^1.3 = CO2 + 2H2O";
        A           6.7e12 [(mol/cm^3)^-0.5/s];
        beta        0;
        Ea          48.4 [kcal/mol];
    }
}
Or, to define a time-varying outlet pressure using a CSV file in which
the pressure column is in mega-pascals [MPa], in 0/p:
boundaryField
{
    outlet
    {
        type    uniformFixedValue;
        value
        {
            type                    table;
            format                  csv;
            nHeaderLine             1;
            units                   ([s] [MPa]); // <-- new units entry
            columns                 (0 1);
            mergeSeparators         no;
            file                    "data/pressure.csv";
            outOfBounds             clamp;
            interpolationScheme     linear;
        }
    }
    ...
}
(Note also that a new 'columns' entry replaces the old 'refColumn' and
'componentColumns'. This is considered to be more intuitive, and has a
consistent syntax with the new 'units' entry. 'refColumn' and
'componentColumns' have been retained for backwards compatibility and will
continue to work for the time being.)
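For reference, a hypothetical data/pressure.csv consistent with the entry
above (one header line, column 0 holding the time in seconds and column 1
holding the pressure in mega-pascals) might look like:

time [s], pressure [MPa]
0, 0.10
10, 0.15
20, 0.20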
Unit definitions can be added in the global or case controlDict files.
See UnitConversions in $WM_PROJECT_DIR/etc/controlDict for examples.
Currently available units include:
Standard: kg m s K kmol A Cd
Derived: Hz N Pa J W g um mm cm km l ml us ms min hr mol
rpm bar atm kPa MPa cal kcal cSt cP % rad rot deg
A user-time unit is also provided if user-time is in operation. This
allows it to be specified locally whether a parameter relates to
real-time or to user-time. For example, to define a mass source that
ramps up from a given engine-time (in crank-angle-degrees [CAD]) over a
duration in real-time, in constant/fvModels:
massSource1
{
    type            massSource;
    points          ((1 2 3));

    massFlowRate
    {
        type        scale;
        scale       linearRamp;
        start       20 [CAD];
        duration    50 [ms];
        value       0.1 [g/s];
    }
}
Specified units will be checked against the parameter's dimensions where
possible, and an error generated if they are not consistent. For the
dimensions to be available for this check, the code requires
modification, and work propagating this change across OpenFOAM is
ongoing. Unit conversions are still possible without these changes, but
the validity of such conversions will not be checked.
Units are no longer permitted in 'dimensions' entries in field files.
These 'dimensions' entries can now, instead, take the names of
dimensions. The names of the available dimensions are:
Standard: mass length time temperature
moles current luminousIntensity
Derived: area volume rate velocity momentum acceleration density
force energy power pressure kinematicPressure
compressibility gasConstant specificHeatCapacity
kinematicViscosity dynamicViscosity thermalConductivity
volumetricFlux massFlux
So, for example, a 0/epsilon file might specify the dimensions as
follows:
dimensions [energy/mass/time];
And a 0/alphat file might have:
dimensions [thermalConductivity/specificHeatCapacity];
*** Development Notes ***
A unit conversion can be constructed trivially from a dimension set,
resulting in a "standard" unit with a conversion factor of one. This
means the functions which perform unit conversion on read can be
provided dimension sets or unit conversion objects interchangeably.
A basic `dict.lookup<vector>("Umean")` call will do unit conversion, but
it does not know the parameter's dimensions, so it cannot check the
validity of the supplied units. A corresponding lookup function has been
added in which the dimensions or units can be provided; in this case the
corresponding call would be `dict.lookup<vector>("Umean", dimVelocity)`.
This function enables additional checking and should be used wherever
possible.
Function1-s and Function2-s have had their constructors and selectors
changed so that dimensions/units must be specified by calling code. In
the case of Function1, two unit arguments must be given; one for the
x-axis and one for the value-axis. For Function2-s, three must be
provided.
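As a rough sketch (the exact selector signature and argument order are an
assumption here, and the entry name is hypothetical), selecting a Function1
for a mass flow rate could look like:

// Sketch only: the x-axis carries time units and the values carry units
// of mass per time; check Function1.H for the actual selector signature
autoPtr<Function1<scalar>> massFlowRate
(
    Function1<scalar>::New
    (
        "massFlowRate",     // hypothetical entry name
        dimTime,            // x-axis dimensions/units
        dimMass/dimTime,    // value dimensions/units
        dict
    )
);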
In some cases, it is desirable (or at least established practice), that
a given non-standard unit be used in the absence of specific
user-defined units. Commonly this includes reading angles in degrees
(rather than radians) and reading times in user-time (rather than
real-time). The primitive lookup functions and the Function1 and Function2
selectors all support specifying a non-standard default unit. For
example, `theta_ = dict.lookup<scalar>("theta", unitDegrees)` will read
an angle in degrees by default. If this is done within a model which
also supports writing then the write call must be modified accordingly
so that the data is also written out in degrees. Overloads of writeEntry
have been created for this purpose. In this case, the angle theta should
be written out with `writeEntry(os, "theta", unitDegrees, theta_)`.
Function1-s and Function2-s behave similarly, but with greater numbers
of dimensions/units arguments as before.
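Putting the two calls above together, a hypothetical model that reads and
writes an angle in degrees might be structured as follows (class and member
names are illustrative only):

myModel::myModel(const dictionary& dict)
:
    theta_(dict.lookup<scalar>("theta", unitDegrees))
{}

void myModel::write(Ostream& os) const
{
    writeEntry(os, "theta", unitDegrees, theta_);
}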
The non-standard user-time unit can be accessed by a `userUnits()`
method that has been added to Time. Use of this user-time unit in the
construction of Function1-s should avoid the need for explicit user-time
conversion in boundary conditions, sub-models and the like.
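For instance, a sub-model could pass the user-time unit as the x-axis unit
when selecting a Function1 (the same signature assumption as in the sketch
above applies, and the entry name is hypothetical):

// Sketch only: the table's x-axis is interpreted in user-time, e.g. in
// crank-angle-degrees when an engine time is in operation; 'time' is
// assumed to be a reference to the Time object
autoPtr<Function1<scalar>> profile
(
    Function1<scalar>::New
    (
        "profile",              // hypothetical entry name
        time.userUnits(),       // x-axis: user-time
        dimVelocity,            // value dimensions/units
        dict
    )
);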
Some models might contain non-typed stream-based lookups of the form
`dict.lookup("p0") >> p0_` (e.g., in a re-read method), or
`Umean_(dict.lookup("Umean"))` (e.g., in an initialiser list). These
calls cannot facilitate unit conversion and are therefore discouraged.
They should be replaced with
`p0_ = dict.lookup<scalar>("p0", dimPressure)` and
`Umean_(dict.lookup<vector>("Umean", dimVelocity))` and similar whenever
they are found.
Description
Utility to reorder the patches of a case corresponding to a given order,
another case or another region.
The new patch order may be specified directly as a list of patch names
following the -patchOrder option or from the boundary file of a reference
case specified using the -referenceCase option with or without the
-referenceRegion option.
This utility runs either in serial or in parallel, but either way the
reference case boundary file is read from the constant directory.
Usage
\b reorderPatches
Options:
- \par -patchOrder \<patch names\>
Specify the list of patch names in the new order.
- \par -referenceCase \<case path\>
Specify the reference case path
- \par -referenceRegion \<name\>
Specify an alternative mesh region for the reference case.
If -referenceCase is not specified the current case is used.
- \par -overwrite \n
Replace the old mesh with the new one, rather than writing the new one
into a separate time directory
- \par -region \<name\>
Specify an alternative mesh region.
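Hypothetical invocations (the patch, case and region names, and the quoting
of the patch-name list, are examples only) might be:

reorderPatches -patchOrder "(inlet outlet walls)" -overwrite
reorderPatches -referenceCase ../baseCase -referenceRegion fluid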
The HashTable underlying PtrListDictionary is now used for zone lookup by
name, which is much faster than the linear search method used previously
when there is a large number of zones.
Now faceZones are handled directly by the applications and the new
faceZone::topoChange function so that any face can now be in any number of
zones, significantly increasing the flexibility and usefulness of faceZones.
This completes the generalisation of cellZone, faceZone and pointZone to support
multiple zones for each cell, face or point respectively. The next step will
be to make zones polymorphic and run-time selectable so that they can alter
during the run and adapt to moving meshes, for example.
For a case extruding patches left and right:
sourcePatches (left right);
if the optional zoneNames entry is also specified in extrudeMeshDict
zoneNames (leftCells rightCells);
the cells extruded from the left patch are added to the leftCells cellZone and
the cells extruded from the right patch are added to the rightCells cellZone.
Alternatively if a single zone name is specified, e.g.
zoneNames (extrudedCells);
then cells extruded from the left and right patches are added to the
extrudedCells cellZone.
If the number of patches to extrude is large, it might be more convenient
for the cells extruded from each patch to be added to a cellZone named after
the corresponding patch; this option is selected by setting zoneNames to
patchNames:
zoneNames (patchNames);
Now cellZones are handled directly by the applications and the new
cellZone::topoChange function so that any cell can now be in any number of
zones, significantly increasing the flexibility and usefulness of cellZones.
The same rationalisation and generalisation will be applied to faceZones in the
future.
Now pointZones are handled directly by the applications and the new
pointZone::topoChange function so that any point can now be in any number of
zones, significantly increasing the flexibility and usefulness of pointZones.
The same rationalisation and generalisation will be applied to cellZones and
faceZones in the future.
The surface mesh setPoints function resets the points without caching the
old points or swept areas, so it is the equivalent of polyMesh::setPoints
rather than movePoints.
This utility is superseded by the much more general transformPoints. A
rotation between vectors (0 1 0) and (0.707107 0.707107 0), and a
corresponding transformation of all vector and tensor fields, can be
achieved with the following call to transformPoints:
transformPoints "rotate=((0 1 0) (0.707107 0.707107 0))" -rotateFields
The concept of cell and face inflation proved unworkable in general and has been
replaced by the more flexible and robust cell-splitting combined with
conservative interpolative mapping and mesh morphing as appropriate.
The concept of cell inflation from faces or points proved unworkable in general
and has been replaced by the more flexible and robust cell-splitting combined
with conservative interpolative mapping and mesh morphing as appropriate.
Constructing the fields in patch order is logical, and preferable to
using the potentially arbitrary order in which the fields are specified
in the field dictionary. It also resolves the issue that the
construction of jump cyclics can fail if the patch fields are not
specified in the same order as the patches.
This utility is used as a pre-processing step for the multiValveEngine
fvMeshMover and provides two options:
-cylinderHead to generate the pointZone within the cylinder head
-pistonBowl to generate the pointZone within the piston bowl
The updated tutorials/XiFluid/kivaTest case demonstrates the application of this
utility.
Either single entries can be renamed using the -entry option with -rename:
-entry <entryName> -rename <newName>
or a list of entries can be renamed using the -rename <newNames> option:
-rename "<entryName0>=<newName0>, <entryName1>=<newName1>..."
Zones are now completely dynamic, i.e. the number of zones of each type can
change during the run, e.g. by run-time mesh-to-mesh mapping onto meshes
with different zones used to control mesh motion. This means that the index
of each zone may change during the run, so it is better that zones do not
cache their own index; instead it is looked up from the zone list using
findIndex when required.
A new nonConformalMappedWall patch type has been added which can couple
between different regions of a multi-region simulation. This patch type
uses the same intersection algorithm as the nonConformalCyclic patch,
which is used for coupling sections of a mesh within the same region.
The nonConformalMappedWall provides some advantages over the existing
mappedWall patches:
- The connection it creates is not interpolative. It creates a pair of
coupled finite-volume faces wherever two opposing faces overlap.
There is therefore no interpolation error associated with mapping
values across the coupling.
- Faces (or parts of faces) which do not overlap are not normalised
away by an interpolation or averaging process. Instead, they are
assigned an alternative boundary condition; e.g., an external
constraint, or even another non-conformal cyclic or mapped wall.
This makes the system able to construct partially-overlapping
couplings.
- The direct non-interpolative transfer of values between the patches
makes the method equivalent to a conformal coupling. Properties of
the solution algorithm, such as conservation and boundedness, are
retained regardless of the non-conformance of the boundary meshes.
- All constructed finite volume faces have accurate centre points.
This makes the method second order accurate in space.
Usage:
Non-conformal mapped wall couplings are constructed as the last stage of
a multi-region meshing process. First, a multi-region mesh is
constructed in one of the usual ways, but with the boundaries specified
as standard non-coupled walls instead of a special mapped type. Then,
createNonConformalCouples is called to construct non-conformal mapped
patches that couple overlapping parts of these non-coupled walls. This
process is very similar to the construction of non-conformal cyclics.
createNonConformalCouples requires a
system/createNonConformalCouplesDict in order to construct non-conformal
mapped walls. Each coupling is specified in its own sub-dictionary, and
a "regions" entry is used to specify the pair of regions that the
non-conformal mapped wall will couple. Non-conformal cyclics can also be
created using the same dictionary, and will be assumed if the two
specified regions are the same, or if a single "region" entry is
specified. For example:
// Do not modify the fields
fields no;

// List of non-conformal couplings
nonConformalCouples
{
    // Non-conformal cyclic interface. Only one region is specified.
    fluidFluid
    {
        region          fluid;
        originalPatches (nonCoupleRotating nonCoupleStationary);
    }

    // Non-conformal mapped wall interface. Two different regions
    // have been specified.
    fluidSolid
    {
        regions         (fluid solid);
        originalPatches (nonCoupleRotating nonCoupleStationary);
    }
}
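With this dictionary in place, the utility is then run in the case
directory; the -overwrite option shown here is an assumption, used if the
modified meshes are to replace the existing ones rather than be written to
a new time directory:

createNonConformalCouples -overwrite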
After this step, a case should execute with foamMultiRun, and decompose,
reconstruct and post-process normally.
One additional restriction for parallelised workflows is that
decomposition and reconstruction must be done with the -allRegions option,
so that both sides of the coupling are available to the
decomposition/reconstruction algorithm.
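A sketch of such a parallel workflow (the process count is arbitrary and
assumes a matching decomposeParDict) might be:

decomposePar -allRegions
mpirun -np 4 foamMultiRun -parallel
reconstructPar -allRegions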
Tutorials:
Two tutorials have been added to demonstrate use of this new
functionality:
- The multiRegion/CHT/misalignedDuct case provides a simple visual
confirmation that the patches are working (the exposed corners of
the solid will be hot if the non-conformal mapped walls are active),
and it demonstrates createNonConformalCouples's ability to add
boundary conditions to existing fields.
- The multiRegion/CHT/notchedRoller case demonstrates use of
non-conformal mapped walls with a moving mesh, and also provides an
example of parallelised usage.
Notes for Developers:
A coupled boundary condition now uses a new class,
mappedFvPatchBaseBase, in order to perform a transfer of values to or
from the neighbouring patch. For example:
// Cast the patch type to its underlying mapping engine
const mappedFvPatchBaseBase& mapper =
    mappedFvPatchBaseBase::getMap(patch());

// Lookup a field on the neighbouring patch
const fvPatchScalarField& nbrTn =
    mapper.nbrFvPatch().lookupPatchField<volScalarField, scalar>("T");

// Map the values to this patch
const scalarField Tn(mapper.fromNeighbour(nbrTn));
For this to work, the fvPatch should be of an appropriate mapped type
which derives from mappedFvPatchBaseBase. This mappedFvPatchBaseBase
class provides an interface to both conformal/interpolative and
non-conformal mapping procedures. This means that a coupled boundary
condition implemented in the manner above will work with either
conformal/interpolative or non-conformal mapped patch types.
Previously, coupled boundary conditions would access a mappedPatchBase
base class of the associated polyPatch, and use that to transfer values
between the patches. This direct dependence on the polyPatch's mapping
engine meant that only conformal/interpolative fvPatch fields that
corresponded to the polyPatch's geometry could be mapped.