The majority of input parameters now support automatic unit conversion.
Units are specified within square brackets, either before or after the
value. Primitive parameters (e.g., scalars, vectors, tensors, ...),
dimensioned types, fields, Function1-s and Function2-s all support unit
conversion in this way.
Unit conversion occurs on input only. OpenFOAM writes out all fields and
parameters in standard units. It is recommended to use '.orig' files in
the 0 directory to preserve user-readable input if those files are being
modified by pre-processing applications (e.g., setFields).
For example, to specify a volumetric flow rate inlet boundary in litres
per second [l/s], rather than metres-cubed per second [m^3/s], in 0/U:
boundaryField
{
inlet
{
type flowRateInletVelocity;
volumetricFlowRate 0.1 [l/s];
value $internalField;
}
...
}
Or, to specify the pressure field in bar, in 0/p:
internalField uniform 1 [bar];
Or, to convert the parameters of an Arrhenius reaction rate from a
cm-mol-kcal unit system, in constant/chemistryProperties:
reactions
{
methaneReaction
{
type irreversibleArrhenius;
reaction "CH4^0.2 + 2O2^1.3 = CO2 + 2H2O";
A 6.7e12 [(mol/cm^3)^-0.5/s];
beta 0;
Ea 48.4 [kcal/mol];
}
}
Or, to define a time-varying outlet pressure using a CSV file in which
the pressure column is in mega-pascals [MPa], in 0/p:
boundaryField
{
outlet
{
type uniformFixedValue;
value
{
type table;
format csv;
nHeaderLine 1;
units ([s] [MPa]); // <-- new units entry
columns (0 1);
mergeSeparators no;
file "data/pressure.csv";
outOfBounds clamp;
interpolationScheme linear;
}
}
...
}
(Note also that a new 'columns' entry replaces the old 'refColumn' and
'componentColumns'. This is is considered to be more intuitive, and has
a consistent syntax with the new 'units' entry. 'columns' and
'componentColumns' have been retained for backwards compatibility and
will continue to work for the time being.)
Unit definitions can be added in the global or case controlDict files.
See UnitConversions in $WM_PROJECT_DIR/etc/controlDict for examples.
Currently available units include:
Standard: kg m s K kmol A Cd
Derived: Hz N Pa J W g um mm cm km l ml us ms min hr mol
rpm bar atm kPa MPa cal kcal cSt cP % rad rot deg
A user-time unit is also provided if user-time is in operation. This
allows it to be specified locally whether a parameter relates to
real-time or to user-time. For example, to define a mass source that
ramps up from a given engine-time (in crank-angle-degrees [CAD]) over a
duration in real-time, in constant/fvModels:
massSource1
{
type massSource;
points ((1 2 3));
massFlowRate
{
type scale;
scale linearRamp;
start 20 [CAD];
duration 50 [ms];
value 0.1 [g/s];
}
}
Specified units will be checked against the parameter's dimensions where
possible, and an error generated if they are not consistent. For the
dimensions to be available for this check, the code requires
modification, and work propagating this change across OpenFOAM is
ongoing. Unit conversions are still possible without these changes, but
the validity of such conversions will not be checked.
Units are no longer permitted in 'dimensions' entries in field files.
These 'dimensions' entries can now, instead, take the names of
dimensions. The names of the available dimensions are:
Standard: mass length time temperature
moles current luminousIntensity
Derived: area volume rate velocity momentum acceleration density
force energy power pressure kinematicPressure
compressibility gasConstant specificHeatCapacity
kinematicViscosity dynamicViscosity thermalConductivity
volumetricFlux massFlux
So, for example, a 0/epsilon file might specify the dimensions as
follows:
dimensions [energy/mass/time];
And a 0/alphat file might have:
dimensions [thermalConductivity/specificHeatCapacity];
*** Development Notes ***
A unit conversion can construct trivially from a dimension set,
resulting in a "standard" unit with a conversion factor of one. This
means the functions which perform unit conversion on read can be
provided dimension sets or unit conversion objects interchangeably.
A basic `dict.lookup<vector>("Umean")` call will do unit conversion, but
it does not know the parameter's dimensions, so it cannot check the
validity of the supplied units. A corresponding lookup function has been
added in which the dimensions or units can be provided; in this case the
corresponding call would be `dict.lookup<vector>("Umean", dimVelocity)`.
This function enables additional checking and should be used wherever
possible.
Function1-s and Function2-s have had their constructors and selectors
changed so that dimensions/units must be specified by calling code. In
the case of Function1, two unit arguments must be given; one for the
x-axis and one for the value-axis. For Function2-s, three must be
provided.
In some cases, it is desirable (or at least established practice), that
a given non-standard unit be used in the absence of specific
user-defined units. Commonly this includes reading angles in degrees
(rather than radians) and reading times in user-time (rather than
real-time). The primitive lookup functions and Function1 and Function2
selectors both support specifying a non-standard default unit. For
example, `theta_ = dict.lookup<scalar>("theta", unitDegrees)` will read
an angle in degrees by default. If this is done within a model which
also supports writing then the write call must be modified accordingly
so that the data is also written out in degrees. Overloads of writeEntry
have been created for this purpose. In this case, the angle theta should
be written out with `writeEntry(os, "theta", unitDegrees, theta_)`.
Function1-s and Function2-s behave similarly, but with greater numbers
of dimensions/units arguments as before.
The non-standard user-time unit can be accessed by a `userUnits()`
method that has been added to Time. Use of this user-time unit in the
construction of Function1-s should prevent the need for explicit
user-time conversion in boundary conditions and sub-models and similar.
Some models might contain non-typed stream-based lookups of the form
`dict.lookup("p0") >> p0_` (e.g., in a re-read method), or
`Umean_(dict.lookup("Umean"))` (e.g., in an initialiser list). These
calls cannot facilitate unit conversion and are therefore discouraged.
They should be replaced with
`p0_ = dict.lookup<scalar>("p0", dimPressure)` and
`Umean_(dict.lookup<vector>("Umean", dimVelocity))` and similar whenever
they are found.
780 lines
23 KiB
C++
780 lines
23 KiB
C++
/*---------------------------------------------------------------------------*\
|
|
========= |
|
|
\\ / F ield | OpenFOAM: The Open Source CFD Toolbox
|
|
\\ / O peration | Website: https://openfoam.org
|
|
\\ / A nd | Copyright (C) 2018-2024 OpenFOAM Foundation
|
|
\\/ M anipulation |
|
|
-------------------------------------------------------------------------------
|
|
License
|
|
This file is part of OpenFOAM.
|
|
|
|
OpenFOAM is free software: you can redistribute it and/or modify it
|
|
under the terms of the GNU General Public License as published by
|
|
the Free Software Foundation, either version 3 of the License, or
|
|
(at your option) any later version.
|
|
|
|
OpenFOAM is distributed in the hope that it will be useful, but WITHOUT
|
|
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
|
|
FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
|
|
for more details.
|
|
|
|
You should have received a copy of the GNU General Public License
|
|
along with OpenFOAM. If not, see <http://www.gnu.org/licenses/>.
|
|
|
|
Description
|
|
Identifies features in a surface geometry and writes them to file,
|
|
based on control parameters specified by the user.
|
|
|
|
\*---------------------------------------------------------------------------*/
|
|
|
|
#include "argList.H"
|
|
#include "Time.H"
|
|
#include "triSurfaceMesh.H"
|
|
#include "featureEdgeMesh.H"
|
|
#include "extendedFeatureEdgeMesh.H"
|
|
#include "surfaceFeatures.H"
|
|
#include "triSurfaceFields.H"
|
|
#include "vtkWritePolyData.H"
|
|
#include "systemDict.H"
|
|
|
|
using namespace Foam;
|
|
|
|
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
|
|
|
|
namespace Foam
|
|
{
|
|
autoPtr<surfaceFeatures> extractFromFile
|
|
(
|
|
const fileName& featureEdgeFile,
|
|
const triSurface& surf,
|
|
const Switch& geometricTestOnly
|
|
)
|
|
{
|
|
edgeMesh eMesh(featureEdgeFile);
|
|
|
|
// Sometimes duplicate edges are present. Remove them.
|
|
eMesh.mergeEdges();
|
|
|
|
Info<< nl << "Reading existing feature edges from file "
|
|
<< featureEdgeFile << nl
|
|
<< "Selecting edges purely based on geometric tests: "
|
|
<< geometricTestOnly.asText() << endl;
|
|
|
|
return autoPtr<surfaceFeatures>
|
|
(
|
|
new surfaceFeatures
|
|
(
|
|
surf,
|
|
eMesh.points(),
|
|
eMesh.edges(),
|
|
1e-6,
|
|
geometricTestOnly
|
|
)
|
|
);
|
|
}
|
|
|
|
|
|
autoPtr<surfaceFeatures> extractFromSurface
|
|
(
|
|
const triSurface& surf,
|
|
const Switch& geometricTestOnly,
|
|
const scalar includedAngle
|
|
)
|
|
{
|
|
Info<< nl
|
|
<< "Constructing feature set from included angle "
|
|
<< includedAngle << nl
|
|
<< "Selecting edges purely based on geometric tests: "
|
|
<< geometricTestOnly.asText() << endl;
|
|
|
|
return autoPtr<surfaceFeatures>
|
|
(
|
|
new surfaceFeatures
|
|
(
|
|
surf,
|
|
includedAngle,
|
|
0,
|
|
0,
|
|
geometricTestOnly
|
|
)
|
|
);
|
|
}
|
|
|
|
|
|
autoPtr<surfaceFeatures> surfaceFeatureSet
|
|
(
|
|
const fileName& surfaceFileName,
|
|
const triSurface& surf,
|
|
const dictionary& dict,
|
|
const scalar includedAngle
|
|
)
|
|
{
|
|
const Switch geometricTestOnly = dict.lookupOrDefault<Switch>
|
|
(
|
|
"geometricTestOnly",
|
|
"no"
|
|
);
|
|
|
|
if (dict.found("files"))
|
|
{
|
|
HashTable<fileName, fileName> fileNames(dict.lookup("files"));
|
|
|
|
if (fileNames.found(surfaceFileName))
|
|
{
|
|
return extractFromFile
|
|
(
|
|
fileNames[surfaceFileName],
|
|
surf,
|
|
geometricTestOnly
|
|
);
|
|
}
|
|
else
|
|
{
|
|
return extractFromSurface
|
|
(
|
|
surf,
|
|
geometricTestOnly,
|
|
includedAngle
|
|
);
|
|
}
|
|
}
|
|
else
|
|
{
|
|
return extractFromSurface
|
|
(
|
|
surf,
|
|
geometricTestOnly,
|
|
includedAngle
|
|
);
|
|
}
|
|
}
|
|
|
|
|
|
void extractFeatures
|
|
(
|
|
const fileName& surfaceFileName,
|
|
const Time& runTime,
|
|
const dictionary& dict
|
|
)
|
|
{
|
|
const fileName sFeatFileName = surfaceFileName.lessExt().name();
|
|
|
|
Info<< "Surface : " << surfaceFileName << nl << endl;
|
|
|
|
const Switch writeVTK =
|
|
dict.lookupOrDefault<Switch>("writeVTK", "off");
|
|
const Switch writeObj =
|
|
dict.lookupOrDefault<Switch>("writeObj", "off");
|
|
const Switch verboseObj =
|
|
dict.lookupOrDefault<Switch>("verboseObj", "off");
|
|
|
|
const Switch curvature =
|
|
dict.lookupOrDefault<Switch>("curvature", "off");
|
|
const Switch featureProximity =
|
|
dict.lookupOrDefault<Switch>("featureProximity", "off");
|
|
|
|
|
|
Info<< nl << "Feature line extraction is only valid on closed manifold "
|
|
<< "surfaces." << endl;
|
|
|
|
// Read
|
|
// ~~~~
|
|
|
|
triSurface surf
|
|
(
|
|
runTime.path()
|
|
/runTime.constant()
|
|
/searchableSurface::geometryDir(runTime)
|
|
/surfaceFileName
|
|
);
|
|
|
|
Info<< "Statistics:" << endl;
|
|
surf.writeStats(Info);
|
|
Info<< endl;
|
|
|
|
const faceList faces(surf.faces());
|
|
|
|
// Either construct features from surface & featureAngle or read set.
|
|
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
|
|
|
|
const scalar includedAngle =
|
|
dict.lookup<scalar>("includedAngle", unitDegrees);
|
|
|
|
autoPtr<surfaceFeatures> set
|
|
(
|
|
surfaceFeatureSet
|
|
(
|
|
surfaceFileName,
|
|
surf,
|
|
dict,
|
|
includedAngle
|
|
)
|
|
);
|
|
|
|
// Trim set
|
|
// ~~~~~~~~
|
|
|
|
if (dict.isDict("trimFeatures"))
|
|
{
|
|
dictionary trimDict = dict.subDict("trimFeatures");
|
|
|
|
scalar minLen =
|
|
trimDict.lookupOrAddDefault<scalar>("minLen", -great);
|
|
|
|
label minElem = trimDict.lookupOrAddDefault<label>("minElem", 0);
|
|
|
|
// Trim away small groups of features
|
|
if (minElem > 0 || minLen > 0)
|
|
{
|
|
Info<< "Removing features of length < "
|
|
<< minLen << endl;
|
|
Info<< "Removing features with number of edges < "
|
|
<< minElem << endl;
|
|
|
|
set().trimFeatures(minLen, minElem, includedAngle);
|
|
}
|
|
}
|
|
|
|
|
|
// Subset
|
|
// ~~~~~~
|
|
|
|
// Convert to marked edges, points
|
|
List<surfaceFeatures::edgeStatus> edgeStat(set().toStatus());
|
|
|
|
if (dict.isDict("subsetFeatures"))
|
|
{
|
|
const dictionary& subsetDict = dict.subDict
|
|
(
|
|
"subsetFeatures"
|
|
);
|
|
|
|
if (subsetDict.found("insideBox"))
|
|
{
|
|
treeBoundBox bb(subsetDict.lookup("insideBox")());
|
|
|
|
Info<< "Selecting edges inside bb " << bb;
|
|
if (writeObj)
|
|
{
|
|
Info << " see insideBox.obj";
|
|
bb.writeOBJ("insideBox.obj");
|
|
}
|
|
Info<< endl;
|
|
|
|
selectBox(surf, bb, true, edgeStat);
|
|
}
|
|
else if (subsetDict.found("outsideBox"))
|
|
{
|
|
treeBoundBox bb(subsetDict.lookup("outsideBox")());
|
|
|
|
Info<< "Removing all edges inside bb " << bb;
|
|
if (writeObj)
|
|
{
|
|
Info<< " see outsideBox.obj" << endl;
|
|
bb.writeOBJ("outsideBox.obj");
|
|
}
|
|
Info<< endl;
|
|
|
|
selectBox(surf, bb, false, edgeStat);
|
|
}
|
|
|
|
const Switch nonManifoldEdges =
|
|
subsetDict.lookupOrDefault<Switch>("nonManifoldEdges", "yes");
|
|
|
|
if (!nonManifoldEdges)
|
|
{
|
|
Info<< "Removing all non-manifold edges"
|
|
<< " (edges with > 2 connected faces) unless they"
|
|
<< " cross multiple regions" << endl;
|
|
|
|
selectManifoldEdges(surf, 1e-5, includedAngle, edgeStat);
|
|
}
|
|
|
|
const Switch openEdges =
|
|
subsetDict.lookupOrDefault<Switch>("openEdges", "yes");
|
|
|
|
if (!openEdges)
|
|
{
|
|
Info<< "Removing all open edges"
|
|
<< " (edges with 1 connected face)" << endl;
|
|
|
|
forAll(edgeStat, edgei)
|
|
{
|
|
if (surf.edgeFaces()[edgei].size() == 1)
|
|
{
|
|
edgeStat[edgei] = surfaceFeatures::NONE;
|
|
}
|
|
}
|
|
}
|
|
|
|
if (subsetDict.found("plane"))
|
|
{
|
|
const plane cutPlane(subsetDict.subDict("plane"));
|
|
|
|
selectCutEdges(surf, cutPlane, edgeStat);
|
|
|
|
Info<< "Only edges that intersect the plane with normal "
|
|
<< cutPlane.normal()
|
|
<< " and base point " << cutPlane.refPoint()
|
|
<< " will be included as feature edges."<< endl;
|
|
}
|
|
}
|
|
|
|
|
|
surfaceFeatures newSet(surf);
|
|
newSet.setFromStatus(edgeStat, includedAngle);
|
|
|
|
Info<< nl
|
|
<< "Initial feature set:" << nl
|
|
<< " feature points : " << newSet.featurePoints().size() << nl
|
|
<< " feature edges : " << newSet.featureEdges().size() << nl
|
|
<< " of which" << nl
|
|
<< " region edges : " << newSet.nRegionEdges() << nl
|
|
<< " external edges : " << newSet.nExternalEdges() << nl
|
|
<< " internal edges : " << newSet.nInternalEdges() << nl
|
|
<< endl;
|
|
|
|
boolList surfBaffleRegions(surf.patches().size(), false);
|
|
|
|
wordList surfBaffleNames;
|
|
dict.readIfPresent("baffles", surfBaffleNames);
|
|
|
|
forAll(surf.patches(), pI)
|
|
{
|
|
const word& name = surf.patches()[pI].name();
|
|
|
|
if (findIndex(surfBaffleNames, name) != -1)
|
|
{
|
|
Info<< "Adding baffle region " << name << endl;
|
|
surfBaffleRegions[pI] = true;
|
|
}
|
|
}
|
|
|
|
// Extracting and writing a extendedFeatureEdgeMesh
|
|
extendedFeatureEdgeMesh feMesh
|
|
(
|
|
newSet,
|
|
runTime,
|
|
sFeatFileName + ".extendedFeatureEdgeMesh",
|
|
surfBaffleRegions
|
|
);
|
|
|
|
|
|
if (dict.isDict("addFeatures"))
|
|
{
|
|
const word addFeName = dict.subDict("addFeatures")["name"];
|
|
Info<< "Adding (without merging) features from " << addFeName
|
|
<< nl << endl;
|
|
|
|
extendedFeatureEdgeMesh addFeMesh
|
|
(
|
|
IOobject
|
|
(
|
|
addFeName,
|
|
runTime.time().constant(),
|
|
"extendedFeatureEdgeMesh",
|
|
runTime.time(),
|
|
IOobject::MUST_READ,
|
|
IOobject::NO_WRITE
|
|
)
|
|
);
|
|
Info<< "Read " << addFeMesh.name() << nl;
|
|
addFeMesh.writeStats(Info);
|
|
|
|
feMesh.add(addFeMesh);
|
|
}
|
|
|
|
|
|
Info<< nl
|
|
<< "Final feature set:" << nl;
|
|
feMesh.writeStats(Info);
|
|
|
|
Info<< nl << "Writing extendedFeatureEdgeMesh to "
|
|
<< feMesh.relativeObjectPath() << endl;
|
|
|
|
mkDir(feMesh.path());
|
|
|
|
if (writeObj)
|
|
{
|
|
feMesh.writeObj
|
|
(
|
|
feMesh.path()/surfaceFileName.lessExt().name(),
|
|
verboseObj
|
|
);
|
|
}
|
|
|
|
feMesh.write();
|
|
|
|
// Write a featureEdgeMesh for backwards compatibility
|
|
featureEdgeMesh bfeMesh
|
|
(
|
|
IOobject
|
|
(
|
|
surfaceFileName.lessExt().name() + ".eMesh",
|
|
runTime.constant(),
|
|
searchableSurface::geometryDir(runTime),
|
|
runTime,
|
|
IOobject::NO_READ,
|
|
IOobject::AUTO_WRITE,
|
|
false
|
|
),
|
|
feMesh.points(),
|
|
feMesh.edges()
|
|
);
|
|
|
|
Info<< nl << "Writing featureEdgeMesh to "
|
|
<< bfeMesh.relativeObjectPath() << endl;
|
|
|
|
bfeMesh.regIOobject::write();
|
|
|
|
// Data to write out in VTK format
|
|
wordList writeVTKFieldNames;
|
|
boolList writeVTKFieldIsPointValues;
|
|
#define DeclareWriteVTKFieldTypeValues(Type, nullArg) \
|
|
PtrList<const Field<Type>> writeVTKField##Type##Values;
|
|
FOR_ALL_FIELD_TYPES(DeclareWriteVTKFieldTypeValues);
|
|
#undef DeclareWriteVTKFieldTypeValues
|
|
|
|
// Find distance between close features
|
|
if (dict.isDict("closeness"))
|
|
{
|
|
Info<< nl << "Extracting internal and external closeness of "
|
|
<< "surface." << endl;
|
|
|
|
const dictionary& closenessDict = dict.subDict("closeness");
|
|
|
|
const Switch faceCloseness =
|
|
closenessDict.lookupOrDefault<Switch>("faceCloseness", "off");
|
|
const Switch pointCloseness =
|
|
closenessDict.lookupOrDefault<Switch>("pointCloseness", "off");
|
|
|
|
const scalar internalAngleTolerance
|
|
(
|
|
closenessDict.lookupOrDefault<scalar>
|
|
(
|
|
"internalAngleTolerance",
|
|
unitDegrees,
|
|
80
|
|
)
|
|
);
|
|
|
|
const scalar externalAngleTolerance
|
|
(
|
|
closenessDict.lookupOrDefault<scalar>
|
|
(
|
|
"externalAngleTolerance",
|
|
unitDegrees,
|
|
80
|
|
)
|
|
);
|
|
|
|
// Searchable triSurface
|
|
const triSurfaceMesh searchSurf
|
|
(
|
|
IOobject
|
|
(
|
|
sFeatFileName + ".closeness",
|
|
runTime.constant(),
|
|
searchableSurface::geometryDir(runTime),
|
|
runTime
|
|
),
|
|
surf
|
|
);
|
|
|
|
if (faceCloseness)
|
|
{
|
|
Pair<tmp<triSurfaceScalarField>> closenessFields
|
|
(
|
|
searchSurf.extractCloseness
|
|
(
|
|
internalAngleTolerance,
|
|
externalAngleTolerance
|
|
)
|
|
);
|
|
|
|
Info<< " writing "
|
|
<< closenessFields.first()->name() << endl;
|
|
closenessFields.first()->write();
|
|
|
|
Info<< " writing "
|
|
<< closenessFields.second()->name() << endl;
|
|
closenessFields.second()->write();
|
|
|
|
if (writeVTK)
|
|
{
|
|
writeVTKFieldNames.append("internalCloseness");
|
|
writeVTKFieldIsPointValues.append(false);
|
|
writeVTKFieldscalarValues.append
|
|
(
|
|
new scalarField(closenessFields.first())
|
|
);
|
|
|
|
writeVTKFieldNames.append("externalCloseness");
|
|
writeVTKFieldIsPointValues.append(false);
|
|
writeVTKFieldscalarValues.append
|
|
(
|
|
new scalarField(closenessFields.second())
|
|
);
|
|
}
|
|
}
|
|
|
|
if (pointCloseness)
|
|
{
|
|
Pair<tmp<triSurfacePointScalarField >> closenessFields
|
|
(
|
|
searchSurf.extractPointCloseness
|
|
(
|
|
internalAngleTolerance,
|
|
externalAngleTolerance
|
|
)
|
|
);
|
|
|
|
Info<< " writing "
|
|
<< closenessFields.first()->name() << endl;
|
|
closenessFields.first()->write();
|
|
|
|
Info<< " writing "
|
|
<< closenessFields.second()->name() << endl;
|
|
closenessFields.second()->write();
|
|
|
|
if (writeVTK)
|
|
{
|
|
const faceList faces(searchSurf.faces());
|
|
const Map<label>& meshPointMap = searchSurf.meshPointMap();
|
|
|
|
const triSurfacePointScalarField&
|
|
internalClosenessPointField = closenessFields.first();
|
|
|
|
const triSurfacePointScalarField&
|
|
externalClosenessPointField = closenessFields.second();
|
|
|
|
scalarField internalCloseness(searchSurf.nPoints(), great);
|
|
scalarField externalCloseness(searchSurf.nPoints(), great);
|
|
|
|
forAll(meshPointMap, pi)
|
|
{
|
|
internalCloseness[pi] =
|
|
internalClosenessPointField[meshPointMap[pi]];
|
|
|
|
externalCloseness[pi] =
|
|
externalClosenessPointField[meshPointMap[pi]];
|
|
}
|
|
|
|
writeVTKFieldNames.append("internalPointCloseness");
|
|
writeVTKFieldIsPointValues.append(true);
|
|
writeVTKFieldscalarValues.append
|
|
(
|
|
new scalarField(internalCloseness)
|
|
);
|
|
|
|
writeVTKFieldNames.append("externalPointCloseness");
|
|
writeVTKFieldIsPointValues.append(true);
|
|
writeVTKFieldscalarValues.append
|
|
(
|
|
new scalarField(externalCloseness)
|
|
);
|
|
}
|
|
}
|
|
}
|
|
|
|
|
|
if (curvature)
|
|
{
|
|
Info<< nl << "Extracting curvature of surface at the points."
|
|
<< endl;
|
|
|
|
triSurfacePointScalarField k
|
|
(
|
|
IOobject
|
|
(
|
|
sFeatFileName + ".curvature",
|
|
runTime.constant(),
|
|
searchableSurface::geometryDir(runTime),
|
|
runTime
|
|
),
|
|
surf,
|
|
dimLength,
|
|
surf.curvature()
|
|
);
|
|
|
|
k.write();
|
|
|
|
if (writeVTK)
|
|
{
|
|
writeVTKFieldNames.append("curvature");
|
|
writeVTKFieldIsPointValues.append(true);
|
|
writeVTKFieldscalarValues.append(new scalarField(k));
|
|
}
|
|
}
|
|
|
|
|
|
if (featureProximity)
|
|
{
|
|
Info<< nl << "Extracting proximity of close feature points and "
|
|
<< "edges to the surface" << endl;
|
|
|
|
const scalar searchDistance =
|
|
dict.lookup<scalar>("maxFeatureProximity");
|
|
|
|
scalarField featureProximity(surf.size(), searchDistance);
|
|
|
|
forAll(surf, fi)
|
|
{
|
|
const triPointRef& tri = surf[fi].tri(surf.points());
|
|
|
|
const Tuple2<point, scalar> circle = tri.circumCircle();
|
|
const point& c = circle.first();
|
|
const scalar rSqr =
|
|
min(sqr(4*circle.second()), sqr(searchDistance));
|
|
|
|
pointIndexHitList hitList;
|
|
|
|
feMesh.allNearestFeatureEdges(c, rSqr, hitList);
|
|
featureProximity[fi] = min
|
|
(
|
|
feMesh.minDisconnectedDist(hitList),
|
|
featureProximity[fi]
|
|
);
|
|
|
|
feMesh.allNearestFeaturePoints(c, rSqr, hitList);
|
|
featureProximity[fi] = min
|
|
(
|
|
minDist(hitList),
|
|
featureProximity[fi]
|
|
);
|
|
}
|
|
|
|
triSurfaceScalarField featureProximityField
|
|
(
|
|
IOobject
|
|
(
|
|
sFeatFileName + ".featureProximity",
|
|
runTime.constant(),
|
|
searchableSurface::geometryDir(runTime),
|
|
runTime,
|
|
IOobject::NO_READ,
|
|
IOobject::NO_WRITE
|
|
),
|
|
surf,
|
|
dimLength,
|
|
featureProximity
|
|
);
|
|
|
|
featureProximityField.write();
|
|
|
|
if (writeVTK)
|
|
{
|
|
writeVTKFieldNames.append("featureProximity");
|
|
writeVTKFieldIsPointValues.append(false);
|
|
writeVTKFieldscalarValues.append
|
|
(
|
|
new scalarField(featureProximity)
|
|
);
|
|
}
|
|
}
|
|
|
|
if (writeVTK)
|
|
{
|
|
#define WriteVTKResizeFieldTypeValues(Type, nullArg) \
|
|
writeVTKField##Type##Values.resize(writeVTKFieldNames.size());
|
|
FOR_ALL_FIELD_TYPES(WriteVTKResizeFieldTypeValues)
|
|
#undef WriteVTKResizeFieldTypeValues
|
|
|
|
vtkWritePolyData::write
|
|
(
|
|
runTime.path()
|
|
/runTime.constant()
|
|
/searchableSurface::geometryDir(runTime)
|
|
/sFeatFileName + "Features.vtk",
|
|
sFeatFileName,
|
|
runTime.writeFormat() == IOstream::BINARY,
|
|
surf.points(),
|
|
labelList(),
|
|
labelListList(),
|
|
faces,
|
|
writeVTKFieldNames,
|
|
writeVTKFieldIsPointValues,
|
|
UPtrList<const Field<label>>(writeVTKFieldNames.size())
|
|
#define WriteVTKFieldTypeValuesParameter(Type, nullArg) \
|
|
, UPtrList<const Field<Type>>(writeVTKField##Type##Values)
|
|
FOR_ALL_FIELD_TYPES(WriteVTKFieldTypeValuesParameter)
|
|
#undef WriteVTKFieldTypeValuesParameter
|
|
);
|
|
}
|
|
|
|
Info<< endl;
|
|
}
|
|
|
|
|
|
void extractFeatures
|
|
(
|
|
const fileNameList& surfaceFileNames,
|
|
const Time& runTime,
|
|
const dictionary& dict
|
|
)
|
|
{
|
|
forAll(surfaceFileNames, i)
|
|
{
|
|
extractFeatures(surfaceFileNames[i], runTime, dict);
|
|
}
|
|
}
|
|
}
|
|
|
|
|
|
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * //
|
|
|
|
int main(int argc, char *argv[])
|
|
{
|
|
argList::addNote
|
|
(
|
|
"extract and write surface features to file"
|
|
);
|
|
argList::noParallel();
|
|
|
|
#include "addDictOption.H"
|
|
|
|
#include "setRootCase.H"
|
|
#include "createTime.H"
|
|
|
|
const dictionary dict(systemDict("surfaceFeaturesDict", args, runTime));
|
|
|
|
if (dict.found("surfaces"))
|
|
{
|
|
extractFeatures
|
|
(
|
|
fileNameList(dict.lookup("surfaces")),
|
|
runTime,
|
|
dict
|
|
);
|
|
}
|
|
else
|
|
{
|
|
forAllConstIter(dictionary, dict, iter)
|
|
{
|
|
if (!iter().isDict())
|
|
{
|
|
continue;
|
|
}
|
|
|
|
extractFeatures
|
|
(
|
|
fileNameList(iter().dict().lookup("surfaces")),
|
|
runTime,
|
|
iter().dict()
|
|
);
|
|
}
|
|
}
|
|
|
|
Info<< "ExecutionTime = " << runTime.elapsedCpuTime() << " s"
|
|
<< " ClockTime = " << runTime.elapsedClockTime() << " s"
|
|
<< nl << endl;
|
|
|
|
Info<< "End\n" << endl;
|
|
|
|
return 0;
|
|
}
|
|
|
|
|
|
|
|
// ************************************************************************* //
|