.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/503_dag_with_partitions.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_503_dag_with_partitions.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_503_dag_with_partitions.py:


DAG with Partitions
===================

In this example, partitions are used to divide a stream into two streams based on particle size.
The partitions are defined using the napier_munn function, which is partially applied to set the
d50 and ep parameters.  The dim argument selects the dimension to partition on.

The partitions are then used in a Directed Acyclic Graph (DAG) that defines the relationships
between streams resulting from transformations (operations) on those streams.  Each node in the
DAG represents an operation on one or more streams, and the edges represent the flow of data from
one operation to the next.

The DAG, together with the defined partitions, can be used to simulate the network of operations
and produce the final results.  This approach allows complex relationships between streams in a
processing network to be managed.

.. GENERATED FROM PYTHON SOURCE LINES 17-32

.. code-block:: default


    import logging
    from copy import deepcopy
    from functools import partial

    import plotly

    from elphick.mass_composition import MassComposition, Stream
    from elphick.mass_composition.dag import DAG
    from elphick.mass_composition.datasets.sample_data import size_by_assay
    from elphick.mass_composition.flowsheet import Flowsheet
    from elphick.mass_composition.utils.partition import napier_munn

    logging.basicConfig(level=logging.INFO,
                        format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    logger = logging.getLogger(__name__)

.. GENERATED FROM PYTHON SOURCE LINES 33-39

Define the Partitions
---------------------

The partitions are based on the `napier_munn` function, which is partially applied to set d50 and ep.
The `dim` argument is used to select the dimension to partition on.
These values have no basis in reality and are for illustrative purposes only.

.. GENERATED FROM PYTHON SOURCE LINES 39-45

.. code-block:: default


    part_screen = partial(napier_munn, d50=0.5, ep=0.2, dim='size')
    part_rgr_cyclone = partial(napier_munn, d50=0.045, ep=0.1, dim='size')
    part_clr_cyclone = partial(napier_munn, d50=0.038, ep=0.1, dim='size')
    part_scav_cyclone = partial(napier_munn, d50=0.045, ep=0.1, dim='size')
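
To make the partially applied partition more concrete, the short sketch below evaluates a
logistic (Napier-Munn style) partition curve over a few sizes.  The functional form
``P(d) = 1 / (1 + exp(ln(3) * (d50 - d) / ep))`` is a common approximation assumed here purely
for illustration; the library's `napier_munn` implementation may differ in detail.

.. code-block:: python


    import numpy as np
    from functools import partial


    def partition_curve(d, d50: float, ep: float):
        """Assumed logistic partition curve: fraction of size d reporting to the
        coarse product (oversize for a screen, underflow for a cyclone)."""
        return 1.0 / (1.0 + np.exp(np.log(3) * (d50 - d) / ep))


    # Partially apply the parameters, mirroring part_screen above (d50=0.5, ep=0.2).
    demo_screen = partial(partition_curve, d50=0.5, ep=0.2)

    sizes = np.array([0.1, 0.3, 0.5, 0.7, 1.0])
    print(demo_screen(sizes))  # approx [0.10, 0.25, 0.50, 0.75, 0.94]

With this assumed form the curve passes through 0.25 and 0.75 at d50 - ep and d50 + ep
respectively, which is consistent with the usual definition of the Ecart probable (ep).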
.. GENERATED FROM PYTHON SOURCE LINES 46-51

Define the DAG
--------------

The DAG is defined by adding nodes to the graph.  Each node is an input, an output or a Stream
operation (e.g. add, split, etc.).  The nodes are connected by the streams they operate on.

.. GENERATED FROM PYTHON SOURCE LINES 51-71

.. code-block:: default


    mc_sample: MassComposition = MassComposition(size_by_assay(), name='sample')

    dag = DAG(n_jobs=1)
    dag.add_input(name='feed')
    dag.add_step(name='screen', operation=Stream.split_by_partition, streams=['feed'],
                 kwargs={'partition_definition': part_screen,
                         'name_1': 'oversize', 'name_2': 'undersize'})
    dag.add_step(name='rougher', operation=Stream.split_by_partition, streams=['undersize'],
                 kwargs={'partition_definition': part_rgr_cyclone,
                         'name_1': 'rgr_uf', 'name_2': 'rgr_of'})
    dag.add_step(name='cleaner', operation=Stream.split_by_partition, streams=['rgr_uf'],
                 kwargs={'partition_definition': part_clr_cyclone,
                         'name_1': 'clr_uf', 'name_2': 'clr_of'})
    dag.add_step(name='scavenger', operation=Stream.split_by_partition, streams=['rgr_of'],
                 kwargs={'partition_definition': part_scav_cyclone,
                         'name_1': 'scav_uf', 'name_2': 'scav_of'})
    dag.add_step(name='overflow', operation=Stream.add, streams=['scav_of', 'clr_of'],
                 kwargs={'name': 'tailings'})
    dag.add_step(name='joiner', operation=Stream.add, streams=['oversize', 'clr_uf', 'scav_uf'],
                 kwargs={'name': 'product'})
    dag.add_output(name='reject', stream='tailings')
    dag.add_output(name='product', stream='product')

.. GENERATED FROM PYTHON SOURCE LINES 72-77

Run the DAG
-----------

The DAG is run by providing MassComposition (or Stream) objects for all inputs.
They must be compatible, i.e. have the same indexes.

.. GENERATED FROM PYTHON SOURCE LINES 77-80

.. code-block:: default


    dag.run({'feed': mc_sample}, progress_bar=True)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Executing nodes: 0%| | 0/9 [00:00
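
A step can only execute once every stream it consumes has been produced by an upstream step.
The following standalone sketch illustrates that scheduling idea, assuming `networkx` for the
graph and reusing the step and stream names defined above; it is not the actual implementation
of the `DAG` class.

.. code-block:: python


    import networkx as nx

    # Simplified model of the flowsheet: each step consumes and produces named streams.
    steps = {
        'screen': (['feed'], ['oversize', 'undersize']),
        'rougher': (['undersize'], ['rgr_uf', 'rgr_of']),
        'cleaner': (['rgr_uf'], ['clr_uf', 'clr_of']),
        'scavenger': (['rgr_of'], ['scav_uf', 'scav_of']),
        'overflow': (['scav_of', 'clr_of'], ['tailings']),
        'joiner': (['oversize', 'clr_uf', 'scav_uf'], ['product']),
    }

    # Build the step dependency graph: an edge a -> b means b consumes a stream produced by a.
    g = nx.DiGraph()
    g.add_nodes_from(steps)
    for a, (_, outs_a) in steps.items():
        for b, (ins_b, _) in steps.items():
            if a != b and set(outs_a) & set(ins_b):
                g.add_edge(a, b)

    # Any topological order of the step graph is a valid execution sequence.
    print(list(nx.topological_sort(g)))

Independent branches, such as the cleaner and scavenger, have no edge between them and could in
principle execute in parallel, which is presumably what the `n_jobs` argument of `DAG` controls.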


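The results are visualised next via `fs.table_plot`, which renders a Sankey diagram coloured by
Fe grade.  As a rough guide to what such a Sankey encodes, the standalone sketch below draws the
same operation-and-stream topology with plain plotly, using hypothetical, mass-balanced flow
values chosen only so the diagram renders; it is not the `table_plot` output.

.. code-block:: python


    import plotly.graph_objects as go

    # Operations as nodes, streams as links, mirroring the DAG defined above.
    nodes = ['feed', 'screen', 'rougher', 'cleaner', 'scavenger', 'overflow', 'joiner',
             'reject', 'product']
    idx = {n: i for i, n in enumerate(nodes)}

    # (from_node, to_node, stream_name, mass) -- masses are hypothetical, for layout only.
    streams = [
        ('feed', 'screen', 'feed', 100),
        ('screen', 'joiner', 'oversize', 40),
        ('screen', 'rougher', 'undersize', 60),
        ('rougher', 'cleaner', 'rgr_uf', 35),
        ('rougher', 'scavenger', 'rgr_of', 25),
        ('cleaner', 'joiner', 'clr_uf', 25),
        ('cleaner', 'overflow', 'clr_of', 10),
        ('scavenger', 'joiner', 'scav_uf', 10),
        ('scavenger', 'overflow', 'scav_of', 15),
        ('overflow', 'reject', 'tailings', 25),
        ('joiner', 'product', 'product', 75),
    ]

    fig = go.Figure(go.Sankey(
        node=dict(label=nodes),
        link=dict(source=[idx[s] for s, t, _, _ in streams],
                  target=[idx[t] for s, t, _, _ in streams],
                  label=[name for _, _, name, _ in streams],
                  value=[m for _, _, _, m in streams]),
    ))
    fig.show()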
.. GENERATED FROM PYTHON SOURCE LINES 89-93

.. code-block:: default


    fig = fs.table_plot(plot_type='sankey', sankey_color_var='Fe', sankey_edge_colormap='copper_r',
                        sankey_vmin=52, sankey_vmax=70)
    plotly.io.show(fig)

.. raw:: html
    :file: images/sphx_glr_503_dag_with_partitions_001.html


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 2.343 seconds)


.. _sphx_glr_download_auto_examples_503_dag_with_partitions.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: 503_dag_with_partitions.py <503_dag_with_partitions.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: 503_dag_with_partitions.ipynb <503_dag_with_partitions.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_