Edits part 2
JennaPaikowsky committed Jan 8, 2025
1 parent 060de8e commit 8487312
Showing 10 changed files with 67 additions and 71 deletions.
6 changes: 2 additions & 4 deletions doc/source/user_guide/concepts/stepbystep.rst
@@ -155,8 +155,7 @@ Use operators
You use operators to import, export, transform, and analyze data.

An operator is analogous to an integrated circuit in electronics. It
has a set of input and output pins. Pins provide for passing data to
and from operators.
has a set of input and output pins. Pins pass data to and from operators.

An operator takes input from a field, field container, or scoping using
an input pin. Based on what it is designed to do, the operator computes
@@ -165,8 +164,7 @@ an output that it passes to a field or field container using an output pin.
.. image:: ../../images/drawings/circuit.png

Comprehensive information on operators is available in :ref:`ref_dpf_operators_reference`.
In the **Available Operators** area for either the **Entry** or **Premium** operators,
you can either type a keyword in the **Search** option
In the **Available Operators** area, you can either type a keyword in the **Search** option
or browse by operator categories:

.. image:: ../../images/drawings/help-operators.png
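As an illustration, here is a minimal PyDPF-Core sketch of the pin workflow. It assumes the example files shipped with PyDPF-Core and the ``operators.result`` namespace:

.. code-block:: python

    from ansys.dpf import core as dpf
    from ansys.dpf.core import examples

    # Load an example result file shipped with PyDPF-Core
    model = dpf.Model(examples.find_simple_bar())

    # The displacement operator takes a data source on an input pin and
    # returns a fields container on an output pin
    disp_op = dpf.operators.result.displacement()
    disp_op.inputs.data_sources.connect(model.metadata.data_sources)
    fields = disp_op.outputs.fields_container()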
24 changes: 12 additions & 12 deletions doc/source/user_guide/concepts/waysofusing.rst
@@ -3,36 +3,36 @@
========================================
DPF capabilities and scripting languages
========================================
DPF is a framework that provides data computation capabilities.

DPF as a Framework enabling data computation capabilities
---------------------------------------------------------
DPF as a Framework
------------------

DPF application: kernel and operator's libraries
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
DPF application: kernel and operator libraries
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

DPF is a framework that provides data computation capabilities. These capabilities are provided
through libraries of operators. To learn more about the computed data and the operator concepts, see :ref:`user_guide_concepts`.
DPF capabilities are provided through libraries of operators.
To learn more about the computed data and the operator concepts, see :ref:`user_guide_concepts`.

A DPF application is always composed of a kernel (the DataProcessingCore and DPFClientAPI binaries)
that enables capabilities by loading libraries of operators (for example, the mapdlOperatorsCore library
is a basic library enabled by DPF).
This application is also called a **DPF Server application**.

When starting a DPF application, you can customize the list of operator's libraries that the kernel loads.
When starting a DPF application, you can customize the list of the libraries that the kernel loads.
To learn more on how to customize the initialization of a DPF application, see :ref:`user_guide_xmlfiles`.
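Operator libraries can also be loaded at runtime from a client with the :func:`ansys.dpf.core.core.load_library` method. A minimal sketch, with a placeholder path and name:

.. code-block:: python

    from ansys.dpf.core.core import load_library

    # Placeholder path and name: load an additional operators library
    # into the running DPF application
    load_library(r"path/to/operators_library", "my_operators")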

DPF client: available APIs and languages
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

DPF is a framework that provides data computation capabilities. These capabilities are
enabled using the DPF Server application.
DPF capabilities are enabled using the DPF Server application.
These capabilities can be accessed through client APIs, as shown here:


.. image:: ../../images/drawings/apis_2.png


1. DPF server application can be accessed using Ansys Inc product, or DPF Server package (see :ref:`ref_dpf_server`) available on the Customer portal.
1. The DPF server application can be accessed using an Ansys product or the DPF Server package (see :ref:`ref_dpf_server`) available on the Customer portal.

2. Several client APIs are available (CPython, IronPython, C++, and so on).

@@ -43,12 +43,12 @@ Note that **IronPython and CPython APIs are different**, each has specific synta
The **list of available operators when using DPF is independent of the language or API used**; it only depends
on how the DPF application has been initialized.

Most of the DPF capabilities can be accessed using the operators. For more information about the existing operators, see the **Operators** tab.
Most of the DPF capabilities can be accessed using the operators. For more information about the existing operators, see :ref:`ref_dpf_operators_reference`.
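For example, from the CPython client any operator loaded by the server can be instantiated by name. A sketch, assuming a local server and the ``norm_fc`` math operator:

.. code-block:: python

    from ansys.dpf import core as dpf

    server = dpf.start_local_server()

    # Which names are available depends only on the libraries this server loaded
    norm_op = dpf.Operator("norm_fc", server=server)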

Enhance DPF capabilities
~~~~~~~~~~~~~~~~~~~~~~~~

The available DPF capabilities loaded in a DPF application can be enhanced by creating new operator's libraries.
The available DPF capabilities loaded in a DPF application can be enhanced by creating new operator libraries.
DPF offers multiple development APIs depending on your environment. These plugins can be:

- CPython based (see :ref:`user_guide_custom_operators`)
36 changes: 18 additions & 18 deletions doc/source/user_guide/custom_operators.rst
@@ -11,7 +11,7 @@ PyDPF-Core or in any supported client API.

With support for custom operators, PyDPF-Core becomes a development tool offering:

- **Accessibility:** A simple script can define a basic operator plugin.
- **Accessibility:** A simple script can define a basic operator plug-in.

- **Componentization:** Operators with similar applications can be grouped in Python plug-in packages.

@@ -29,10 +29,10 @@ For more information, see :ref:`ref_user_guide_operators`.
Install module
--------------

Once an Ansys-unified installation is complete, you must install the ``ansys-dpf-core`` module in the Ansys
Once an Ansys unified installation is complete, you must install the ``ansys-dpf-core`` module in the Ansys
installer's Python interpreter.

#. Download the script for you operating system:
#. Download the script for your operating system:

- For Windows, download this :download:`PowerShell script </user_guide/install_ansys_dpf_core_in_ansys.ps1>`.
- For Linux, download this :download:`Shell script </user_guide/install_ansys_dpf_core_in_ansys.sh>`.
@@ -44,7 +44,7 @@ installer's Python interpreter.
- ``-pip_args``: Optional arguments to add to the ``pip`` command. For example, ``--extra-index-url`` or
``--trusted-host``.

If you ever want to uninstall the ``ansys-dpf-core`` module from the Ansys installation, you can do so.
To uninstall the ``ansys-dpf-core`` module from the Ansys installation:

#. Download the script for your operating system:

@@ -59,15 +59,15 @@ If you ever want to uninstall the ``ansys-dpf-core`` module from the Ansys insta

Create operators
----------------
You can create a basic operator plugin or a plug-in package with multiple operators.
You can create a basic operator plugin or a plugin package with multiple operators.

Basic operator plugin
~~~~~~~~~~~~~~~~~~~~~
To create a basic operator plugin, you write a simple Python script. An operator implementation
Basic operator plug-in
~~~~~~~~~~~~~~~~~~~~~~
To create a basic operator plug-in, write a simple Python script. An operator implementation
derives from the :class:`ansys.dpf.core.custom_operator.CustomOperatorBase` class and a call to
the :func:`ansys.dpf.core.custom_operator.record_operator` method.

This example script shows how you create a basic operator plugin:
This example script shows how you create a basic operator plug-in:

.. literalinclude:: custom_operator_example.py

@@ -78,32 +78,32 @@ This example script shows how you create a basic operator plugin:
record_operator(CustomOperator, *args)
In the various properties for the class, you specify the following:
In the various properties for the class, specify the following:

- Name for the custom operator
- Description of what the operator does
- Dictionary for each input and output pin, which includes the name, a list of supported types, a description,
- Dictionary for each input and output pin. This dictionary includes the name, a list of supported types, a description,
and whether it is optional and/or ellipsis (meaning that the specification is valid for pins going from pin
number *x* to infinity)
- List for operator properties, including name to use in the documentation and code generation and the
operator category. The optional ``license`` property allows to define a required license to check out
operator category. The optional ``license`` property allows you to define a required license to check out
when running the operator. Set it equal to ``any_dpf_supported_increments`` to allow any license
currently accepted by DPF (see :ref:`here<target_to_ansys_license_increments_list>`)

For comprehensive examples on writing operator plugins, see :ref:`python_operators`.
For comprehensive examples on writing operator plug-ins, see :ref:`python_operators`.
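A condensed sketch of such a script follows. The operator name, pins, and category are illustrative only:

.. code-block:: python

    from ansys.dpf import core as dpf
    from ansys.dpf.core.custom_operator import CustomOperatorBase, record_operator
    from ansys.dpf.core.operator_specification import (
        CustomSpecification,
        PinSpecification,
        SpecificationProperties,
    )


    class MyOperator(CustomOperatorBase):
        @property
        def name(self):
            return "my_custom_operator"

        @property
        def specification(self):
            spec = CustomSpecification("Forwards the input field unchanged.")
            spec.inputs = {0: PinSpecification("field", [dpf.Field], "Input field.")}
            spec.outputs = {0: PinSpecification("field", [dpf.Field], "Output field.")}
            spec.properties = SpecificationProperties("my custom operator", "math")
            return spec

        def run(self):
            field = self.get_input(0, dpf.Field)
            # Transform the field data here before writing it to the output pin
            self.set_output(0, field)
            self.set_succeeded()


    def load_operators(*args):
        record_operator(MyOperator, *args)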


Plug-in package with multiple operators
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
To create a plug-in package with multiple operators or with complex routines, you write a
To create a plug-in package with multiple operators or with complex routines, write a
Python package. The benefits of writing packages rather than simple scripts are:

- **Componentization:** You can split the code into several Python modules or files.
- **Distribution:** You can use standard Python tools to upload and download packages.
- **Documentation:** You can add README files, documentation, tests, and examples to the package.

A plug-in package with dependencies consists of a folder with the necessary files. Assume
that the name of your plug-in package is ``custom_plugin``. A folder with this name would
A plugin package with dependencies consists of a folder with the necessary files. Assume
that the name of your plugin package is ``custom_plugin``. A folder with this name would
contain four files:

- ``__init__.py``
@@ -223,7 +223,7 @@ Once a custom operator is created, you can use the :func:`ansys.dpf.core.core.lo
The first argument is the path to the directory with the plugin. The second argument is ``py_`` plus any name
identifying the plugin. The last argument is the function name for recording operators.

For a plugin that is a single script, the second argument should be ``py_`` plus the name of the Python file:
For a plug-in that is a single script, the second argument should be ``py_`` plus the name of the Python file:

.. code::
@@ -241,7 +241,7 @@ For a plug-in package, the second argument should be ``py_`` plus any name:
"py_my_custom_plugin", #if the load_operators function is defined in path/to/plugins/custom_plugin/__init__.py
"load_operators")
Once the plugin is loaded, you can instantiate the custom operator:
Once the plug-in is loaded, you can instantiate the custom operator:

.. code::
6 changes: 3 additions & 3 deletions doc/source/user_guide/custom_operators_deps.rst
@@ -1,4 +1,4 @@
To add third-party modules as dependencies to a plug-in package, you should create
To add third-party modules as dependencies to a plug-in package, create
and reference a folder or ZIP file with the sites of the dependencies in an XML file
located next to the folder for the plug-in package. The XML file must have the same
name as the plug-in package plus an ``.xml`` extension.
@@ -7,10 +7,10 @@ When the :py:func:`ansys.dpf.core.core.load_library` method is called, PyDPF-Cor
``site`` Python module to add custom sites to the path for the Python interpreter.


To create these custom sites, do the following:
To create these custom sites:

#. Install the requirements of the plug-in package in a Python virtual environment.
#. Remove unnecessary folders from the site packages and compress them to a ZIP file.
#. Remove unnecessary folders from the site packages and compress them into a ZIP file.
#. Place the ZIP file in the plug-in package.
#. Reference the path to the ZIP file in the XML file as indicated above.

29 changes: 14 additions & 15 deletions doc/source/user_guide/fields_container.rst
@@ -1,23 +1,22 @@
.. _ref_user_guide_fields_container:

===========================
Fields container and fields
Field containers and fields
===========================
While DPF uses operators to load and operate on data, it uses field containers
and fields to store and return data. In other words, operators are like verbs,
acting on the data, while field containers and fields are like nouns, objects
that hold data.
and fields to store and return data. Operators are like verbs, acting on the data,
while field containers and fields are like nouns, objects that hold data.

Access a fields container or field
Access a field container or field
-----------------------------------
The outputs from operators can be either a
:class:`ansys.dpf.core.fields_container.FieldsContainer` or
:class:`ansys.dpf.core.field.Field` class.

A fields container is the DPF equivalent of a list of fields. It holds a
A field container is the DPF equivalent of a list of fields. It holds a
vector of fields.

This example uses the ``elastic_strain`` operator to access a fields container:
This example uses the ``elastic_strain`` operator to access a field container:

.. code-block::
@@ -59,11 +58,11 @@ This example uses the ``elastic_strain`` operator to access a fields container:
- field 19 {time: 20} with ElementalNodal location, 6 components and 40 entities.
Access fields within a fields container
---------------------------------------
Many methods are available for accessing a field in a fields
Access fields within a field container
--------------------------------------
Many methods are available for accessing a field in a field
container. The preceding results contain a transient
result, which means that the fields container has one field
result, which means that the field container has one field
by time set.

Access the field:
@@ -127,7 +126,7 @@ To access fields for more complex requests, you can use the
...
Here is a more real-word example:
Here is a more real-world example:

.. code-block::
@@ -190,7 +189,7 @@ Here is a more real-word example:
The following example references the available time frequency support to determine which
time complex IDs are available in the fields container:
time complex IDs are available in the field container:

.. code-block::
@@ -409,7 +408,7 @@ Note that this array is a genuine, local, numpy array (overloaded by the DPFArra
<class 'ansys.dpf.gate.dpf_array.DPFArray'>
If you need to access an individual node or element, request it
To access an individual node or element, request it
using either the ``get_entity_data()`` or ``get_entity_data_by_id()`` method:

Get the data from the first element in the field.
@@ -480,7 +479,7 @@ get the index of element 29 in the field with:
29
Here the data for the element with ID 10 is made of 8 symmetrical tensors.
Here the data for the element with ID 10 is made of eight symmetrical tensors.
The elastic strain has one tensor value per node and per element (ElementalNodal location).
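As a short sketch, assuming ``field`` holds the ElementalNodal elastic strain shown above:

.. code-block:: python

    # Eight nodes for this element, six symmetrical tensor components each
    data = field.get_entity_data_by_id(10)
    print(data.shape)  # (8, 6)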

To get the displacement on node 3, you would use:
12 changes: 6 additions & 6 deletions doc/source/user_guide/operators.rst
@@ -147,7 +147,7 @@ Because several other examples use the ``Model`` class, this example uses the
result key: rst and path: path\...\ansys\dpf\core\examples\model_with_ns.rst
Secondary files:
This code shows how to connect the data source to the displacement operator:
This code demonstrates how to connect the data source to the displacement operator:

.. code-block:: python
@@ -306,11 +306,11 @@ DPF provides three main types of operators:
Operators for importing or reading data
***************************************

These operators provide for reading data from solver files or from standard file types
These operators read data from solver files or from standard file types
such as .RST (MAPDL), .D3Plot (LS-DYNA), .CAS.H5/.DAT.H5 (Fluent), or .CAS.CFF/.DAT.CFF (CFX).

To read these files, different readers are implemented as plugins.
Plugins can be loaded on demand in any DPF scripting language with "load library" methods.
To read these files, different readers are implemented as plug-ins.
Plug-ins can be loaded on demand in any DPF scripting language with "load library" methods.
File readers can be used generically thanks to the DPF result providers, which means that the same operators can be used for any file type.

This example shows how to read a displacement and a stress for any file:
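A minimal sketch of such a read (the file path is a placeholder; the operators come from the ``operators.result`` namespace):

.. code-block:: python

    from ansys.dpf import core as dpf

    # The same result operators work for any supported file type
    data_src = dpf.DataSources(r"path/to/file.rst")
    disp = dpf.operators.result.displacement(data_sources=data_src)
    stress = dpf.operators.result.stress(data_sources=data_src)

    disp_fields = disp.outputs.fields_container()
    stress_fields = stress.outputs.fields_container()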
@@ -334,8 +334,8 @@ Operators for transforming data
*******************************

A field is the main data container in DPF. Most of the operators that transform
data take a field or a fields container as input and return a transformed
field or fields container as output. You can perform analytic, averaging,
data take a field or a field container as input and return a transformed
field or field container as output. You can perform analytic, averaging,
or filtering operations on simulation data.

For example, after creation of a field, you can use scaling and filtering
9 changes: 5 additions & 4 deletions doc/source/user_guide/plotting.rst
@@ -1,8 +1,9 @@
.. _user_guide_plotting:

====
Plot
====
========
Plotting
========

DPF-Core has a variety of plotting methods for generating 3D plots of
Ansys models directly from Python. These methods use VTK and leverage
the `PyVista <https://github.com/pyvista/pyvista>`_ library to
@@ -74,7 +75,7 @@ First, extract the X component strain
Data:1 components and 69120 elementary data
This ElementalNodal strain must be converted to nodal strain for it to be plotted.
This ElementalNodal strain must be converted to a nodal strain for it to be plotted.

.. code-block::
8 changes: 3 additions & 5 deletions doc/source/user_guide/server_context.rst
@@ -23,9 +23,7 @@ Two main licensing context type capabilities are available:
- **Entry:** This context does not allow DPF to perform any license checkout,
meaning that licensed DPF operators fail.

For the operator list for each licensing context type, see :ref:`ref_dpf_operators_reference`.
The **Premium** operators reference includes licensed DPF operators.
The **Entry** operators reference only includes unlicensed DPF operators.
For the operator list, see :ref:`ref_dpf_operators_reference`.

Change server context from Entry to Premium
-------------------------------------------
@@ -66,7 +64,7 @@ Change the default server context

The default context for the server is **Premium**. You can change the context using
the ``ANSYS_DPF_SERVER_CONTEXT`` environment variable. For more information, see
the :module: `<ansys.dpf.core.server_context>` module). You can also change the server context
the :mod:`ansys.dpf.core.server_context` module. You can also change the server context
with this code:

.. code-block::
@@ -83,7 +81,7 @@ with this code:
.. warning::
As starting an ``InProcess`` server means linking the DPF binaries to your current Python
process, you cannot start a new ``InProcess`` server. Thus, if your local ``InProcess`` server
process, you cannot start a new ``InProcess`` server. If your local ``InProcess`` server
is already **Premium**, you cannot set it back as **Entry**.
Because ``InProcess`` is the default server type, the commands to work as **Entry** should be
set at the start of your script.
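A sketch of that order, using the ``ansys.dpf.core.server_context`` module mentioned above:

.. code-block:: python

    from ansys.dpf import core as dpf
    from ansys.dpf.core import server_context

    # Set the Entry context before any InProcess server is started
    server_context.set_default_server_context(
        server_context.AvailableServerContexts.entry
    )
    server = dpf.start_local_server()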
2 changes: 1 addition & 1 deletion doc/source/user_guide/server_types.rst
@@ -9,7 +9,7 @@ Terminology

DPF is based on a **client-server** architecture.

A DPF Server is a set of files that enables DPF capabilities.
A DPF Server is a set of files that enable DPF capabilities.

PyDPF-Core is a Python client API communicating with a DPF Server, either
directly **in the same process** or through the network using **gRPC**.
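As a sketch of the two modes, assuming the default PyDPF-Core entry points (the host and port are placeholders):

.. code-block:: python

    from ansys.dpf import core as dpf

    # In the same process: no network communication involved
    in_process = dpf.start_local_server(
        config=dpf.AvailableServerConfigs.InProcessServer, as_global=False
    )

    # Through gRPC: connect to a DPF Server already running locally or remotely
    remote = dpf.connect_to_server(ip="127.0.0.1", port=50054)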