Update text to match ParaView 5.11 where needed, added some images
paulmelis committed Jan 31, 2024
1 parent 70b0436 commit f540323
Showing 8 changed files with 48 additions and 30 deletions.
7 changes: 7 additions & 0 deletions docs/exercise1.md
Right now, your data is visible only as an outline: a box made from a few white lines.

!!! Info "Volume representation 'Are you sure?'"

![](images/volume-warning.png)

When enabling the Volume representation a dialog box may pop up asking if you are sure. For large datasets volume rendering can be problematic if your system does not have much memory and/or has a slow GPU. For the datasets used here, volume rendering should not be a problem on most systems.
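For reference, the same switch can be made from ParaView's built-in Python shell (__View → Python Shell__). A minimal sketch, assuming a dataset has already been loaded and is the active source; the function names come from the `paraview.simple` module:

```python
# Minimal sketch: switch the active source's display to Volume representation.
from paraview.simple import (GetActiveSource, GetActiveViewOrCreate,
                             GetDisplayProperties, Render)

source = GetActiveSource()                        # the loaded dataset
view = GetActiveViewOrCreate('RenderView')
display = GetDisplayProperties(source, view=view)
display.Representation = 'Volume'                 # same as the representation dropdown
Render()
```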

Our next job is to see what this dataset contains.

![](images/contour-filter.png)

!!! Tip "Quick way to add a filter"

When you know the name of the filter you want to add, a quicker way than navigating the Filter menu is to use **Ctrl+Spacebar**. This will show a search dialog where you can type (part of) the filter name and matching filters will get listed. Select the filter you want and press **Enter** to add it.
Of course, make sure to have selected the filter whose output you want to use as input before adding the downstream filter.

▶ Next, we want to see what this filter does: in the __Properties__ tab, set the value of the contour under __Isosurfaces__ to 500 and click __Apply__; you should see a surface more-or-less representing the boy's skin.

![](images/contour500.png)
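The equivalent in ParaView's Python scripting interface might look like the sketch below; the scalar array name (`'ImageFile'`) is an assumption and should be replaced by the array name shown in the __Information__ tab of your dataset:

```python
# Minimal sketch: contour the active source at isosurface value 500.
from paraview.simple import GetActiveSource, Contour, Show, Render

source = GetActiveSource()
contour = Contour(Input=source)
contour.ContourBy = ['POINTS', 'ImageFile']   # assumed array name, adjust to your data
contour.Isosurfaces = [500]
Show(contour)
Render()
```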
35 changes: 20 additions & 15 deletions docs/exercise2.md
These values are:
* __x__, __y__, and __z__ values defining a location on a 3D cartesian grid.
* __vx__, __vy__ and __vz__ values, representing a 3D velocity vector at the (x,y,z) location

▶ Verify that the ParaView spreadsheet view indeed shows the same set of data.

ParaView does not know by itself how these individual variables relate to the points and cells in its data model, so we have to provide that mapping manually.

▶ The first step is to add a __TableToPoints__ filter to our data source. This filter creates a 3D point for each row in the table, based on a set of table values you choose. You can find this filter under __Filters → Alphabetical → TableToPoints__. Set the __X__, __Y__ and __Z__ columns to the table columns representing the point position, check __Keep All Data Arrays__ and hit __Apply__.


![](images/tabletopoints.png)
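In Python the same pipeline step could be sketched as follows (column names as used in the CSV file above; `KeepAllDataArrays` corresponds to the checkbox in the GUI):

```python
# Minimal sketch: read wervel.csv and turn its rows into 3D points.
from paraview.simple import CSVReader, TableToPoints, Show, Render

wervel = CSVReader(FileName=['wervel.csv'])

points = TableToPoints(Input=wervel)
points.XColumn = 'x'
points.YColumn = 'y'
points.ZColumn = 'z'
points.KeepAllDataArrays = 1        # keep vx, vy and vz available downstream

Show(points)                        # the regular grid of white points
Render()
```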

▶ Compare the __Information__ tabs of the __wervel.csv__ source and the __TableToPoints__ filter to see how the output data has changed (type, number of points/cells, etc), and verify that the data arrays produced by the TableToPoints filter match those of the wervel.csv source.

▶ If needed, enable visibility of the TableToPoints output by clicking the eye icon next to it in the pipeline browser. If you have both a RenderView (the 3D panel) and a Spreadsheet view then make sure you have the RenderView selected (by clicking in it) before enabling visibility of the TableToPoints output. You should now see a regular 3D grid of (white) points, indicating that the point position values from the CSV data have correctly been set based on the table input.

## Creating velocity vectors

Next, we need to combine the three separate scalar values __vx__, __vy__ and __vz__ into one 3D vector value. To do this, we use the __Calculator__ filter that is built into ParaView.

!!! Note

This will cause properties to get filtered on the entered text (which there aren't), hiding them all.

You will probably have understood the name __iHat__ to represent the vector __î__, i.e. (1, 0, 0). Using the Calculator filter, fairly complex expressions can be used to augment existing datasets with new and useful values, both scalars and vectors.

▶ Select the Calculator filter in the pipeline and add another Calculator filter that creates an array __VelocityMag__ using the expression `mag(Velocity)`.
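A Python sketch of these two Calculator steps is shown below; the expression for the first Calculator follows the __iHat__/__jHat__/__kHat__ description above, but double-check it against the value you entered in the GUI. It assumes the TableToPoints output is the active source:

```python
# Minimal sketch: build the Velocity vector and its magnitude.
from paraview.simple import GetActiveSource, Calculator

points = GetActiveSource()                      # the TableToPoints output

calc_vel = Calculator(Input=points)
calc_vel.ResultArrayName = 'Velocity'
calc_vel.Function = 'vx*iHat + vy*jHat + vz*kHat'

calc_mag = Calculator(Input=calc_vel)
calc_mag.ResultArrayName = 'VelocityMag'
calc_mag.Function = 'mag(Velocity)'
```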


Now that our data is converted from the CSV input to the ParaView data model we can start looking at our flow field. To do this, we want to use the _stream-tracer filter_. With this filter we simulate injecting particles in our flow field at specific locations. By letting these particles follow the flow, based on the velocity vectors in the grid, and tracing them over time we get an impression of the fluid flow through the model.

The stream-tracer filter is based on tracing a virtual particle through the _cells_ of a dataset, i.e. regions of model space defined using 3D points. The flow direction within a cell, as indicated by the velocity vector value, is then integrated by the filter to determine a particle's position over time, thereby creating a trace of the particle through the dataset.

But as noted above, *there's currently only a single cell in our dataset holding all the points*, which provides no meaningful way to trace particles. So we need to divide up the dataset domain into small cells. Furthermore, the flow vector values are currently associated only with the _points_ of the dataset, while we need those values for each _cell_ for the stream-tracer to work.

We can fix these two issues by applying a 3D Delaunay triangulation. This creates cells from the dataset based on the existing points. The cells created are small and detailed enough so that our Stream Tracer filter can reasonably work. The filter also adds a new flow vector value for each created cell, based on interpolating the existing per-point values of its corners.
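In script form this step is a single filter added on top of the previous sketch; no extra parameters are needed here, and it assumes the second Calculator is the active source:

```python
# Minimal sketch: create cells from the scattered points so the
# stream tracer has a domain to integrate through.
from paraview.simple import GetActiveSource, Delaunay3D

calc_mag = GetActiveSource()          # the second Calculator's output
delaunay = Delaunay3D(Input=calc_mag)
```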


Now let's do some initial particle tracing through the flow field using the stream tracer filter.

▶ In the pipeline, select the __Delaunay3D__ filter and add a __Stream Tracer__ filter. You can find this under __Filters → Common → Stream Tracer__. *Do not press Apply just yet*.

▶ The Stream Tracer filter has quite a few __parameters__. Important ones are integration settings and the seed location for the particles to trace. Change them to, for example, the __values below__ and click __Apply__:
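A scripted version of this step is sketched below. Note that the seed-sphere and integration numbers here are placeholders for illustration only, not the values shown in the course screenshot; the property names follow recent `paraview.simple` traces and may differ slightly between ParaView versions:

```python
# Minimal sketch: seed particles in a small sphere and trace them along
# the Velocity vectors (all numeric values are placeholders).
from paraview.simple import GetActiveSource, StreamTracer, Show, Render

delaunay = GetActiveSource()
tracer = StreamTracer(Input=delaunay, SeedType='Point Cloud')
tracer.Vectors = ['POINTS', 'Velocity']
tracer.MaximumStreamlineLength = 20.0      # placeholder
tracer.SeedType.Center = [0.0, 0.0, 0.0]   # placeholder seed sphere center
tracer.SeedType.Radius = 2.0               # placeholder
tracer.SeedType.NumberOfPoints = 100       # placeholder

Show(tracer)
Render()
```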

Expand Down Expand Up @@ -115,16 +112,24 @@ Finally, we'll add a different representation instead of the streamlines, called

▶ Hide all filter output by clicking the relevant eye icons.

▶ Select the __Calculator2__ filter and add a __Glyph__ filter (__Filters → Common → Glyph__). Set the __Glyph Type__ to __Arrow__, set the __Orientation Array__ to __Velocity__ (i.e. our computed velocity vectors) and __Scale Mode__ to __No scale array__. Click __Apply__.

!!! Info "Point versus cell input"

Note that there is no need to base the Glyph filter on the Delaunay 3D output, as the Glyph filter works on 3D *points*, as in the original data set. This is unlike the Stream Tracer filter, which needs the cells that we added using the Delaunay 3D filter.

ParaView does provide generic `Point Data to Cell Data` and `Cell Data to Point Data` filters, to convert between the two using interpolation.

![](images/glypsettings.png)
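For completeness, a Python sketch of the Glyph setup; `FindSource('Calculator2')` assumes the default filter naming used in this exercise, and the property names are those generated by ParaView 5.x Python traces:

```python
# Minimal sketch: arrow glyphs oriented along the Velocity vectors,
# without scaling by any array yet.
from paraview.simple import FindSource, Glyph, Show, Render

calc_mag = FindSource('Calculator2')            # filter holding Velocity/VelocityMag
glyph = Glyph(Input=calc_mag, GlyphType='Arrow')
glyph.OrientationArray = ['POINTS', 'Velocity']
glyph.ScaleArray = ['POINTS', 'No scale array']

Show(glyph)
Render()
```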

You should now see a large number of arrows nicely distributed over the tornado dataset, indicating the direction of wind flow. Since we set the Scale Mode to off, all arrows are the same size, which obscures the inside of the dataset and gives less of a visual clue to wind speed.

▶ Set the __Scale Array__ to __Velocity__ and the __Scale Factor__ to __0.2__ and press __Apply__.

▶ Make sure the coloring is set to __VelocityMag__ and verify that the size and color of a glyph arrow correspond to its velocity value.

You might wonder about the number of glyphs placed, compared to the 25,000 points in the dataset. There is quite a large number of glyphs, and perhaps still too many to be effective: this doesn't help the overall visual interpretation of the data, but we do need to balance it against getting enough coverage of the full dataset.

▶ The settings under __Masking__ control the number and distribution of the glyphs placed. See what happens to the resulting visualization when you show a glyph for every 10th point, or 500 glyphs uniformly distributed (and why that means you need to choose these kinds of parameters with care).
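The two masking variants mentioned here would look roughly as follows in script form; `FindSource('Glyph1')` assumes the default name of the Glyph filter, and the exact mode strings can differ per ParaView version:

```python
# Minimal sketch: thin out the glyphs via the Masking settings.
from paraview.simple import FindSource

glyph = FindSource('Glyph1')        # assumed default name of the Glyph filter

# One glyph for every 10th input point:
glyph.GlyphMode = 'Every Nth Point'
glyph.Stride = 10

# ...or a fixed number of glyphs, spread uniformly over the dataset bounds:
glyph.GlyphMode = 'Uniform Spatial Distribution (Bounds Based)'
glyph.MaximumNumberOfSamplePoints = 500
```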

▶ A useful variant is to apply glyphs *to the output of the Stream Tracer filter* (by creating a second Stream Tracer filter). This is possible because the generated streamlines are themselves polygonal data, where each streamline consists of a Poly-Line cell that uses a set of 3D points. As a Glyph filter uses point positions to place glyphs, we can add glyphs to streamlines. Experiment with this, using different types of glyphs, like Sphere and Arrow. Also try coloring by IntegrationTime to verify the direction in which the streamlines were "grown".
28 changes: 16 additions & 12 deletions docs/exercise3.md
In this exercise, we are going to look at time-dependent data.
You should now see a contour surface of one single time point in the growth of the coral. However, what we want is to have the contour plot change dynamically, showing us the growth of the coral over time.

▶ To achieve this, we need to open the __Animation View__, which can be enabled from the main menu with __View → Animation View__. In the animation toolbar, set __No. Frames__ to __100__ and the __End time__ to __10__. This will set up the animation to be 10 seconds long, playing a total of 100 frames, and thus 10 frames per second.

▶ Link the contour iso-surface value to the time sequence by using the blue __+__ button left of the __Contour1__ and __Isosurfaces__ dropdown menus.

![](images/animation.png)

▶ Verify that this adds __Contour1__ to the timeline, directly under __TimeKeeper1__ as a second "strip" over the full length of the time bar. The values at the far left and right edges of the strip are the isosurface values used at those time points. These values are based on the min/max values from the input dataset (which you can verify using the __Information__ tab of __ALT_PRPB001A.vtk__).
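For reference, a similar animation setup can be scripted; the keyframe values below are placeholders for the dataset's actual min/max isosurface values, `FindSource('Contour1')` assumes the default filter name, and the animation API details may need adjusting per ParaView version:

```python
# Minimal sketch: 100 frames over 10 seconds, with the Contour filter's
# Isosurfaces value animated between two keyframes (placeholder values).
from paraview.simple import (GetAnimationScene, GetAnimationTrack,
                             CompositeKeyFrame, FindSource)

scene = GetAnimationScene()
scene.PlayMode = 'Sequence'
scene.NumberOfFrames = 100
scene.EndTime = 10.0

contour = FindSource('Contour1')
track = GetAnimationTrack('Isosurfaces', index=0, proxy=contour)

start = CompositeKeyFrame(KeyTime=0.0, KeyValues=[40.0])    # placeholder minimum
end = CompositeKeyFrame(KeyTime=1.0, KeyValues=[250.0])     # placeholder maximum
track.KeyFrames = [start, end]
```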

▶ With the animation set up you can now use the playback buttons in the main toolbar to play through the growth of the simulation. The double arrow button controls if the playback loops back to the start.

![](images/playback.png)

Since the coral is a three-dimensional structure, it is nice to look at all sides of it.

Next, we will add a circular camera motion, a so-called _orbit_.

▶ In the Animation View select __Camera__ in the dropdown next to the blue __+__ button, and __Follow Path__ in the dropdown right of it. Then create a camera animation strip by pressing the blue __+__ button. If you play the animation you will see the scene rotate over time.

The default camera orbit might need some tweaking, as it is set up based on the current view. For example, the camera might be a bit too close to the coral object and we want to move it back a bit.

▶ Bring up the animation parameters by double-clicking on the Camera strip in the Animation View; this will show the __Animation Keyframes__ dialog.

![](images/animation-keyframes.png)

There are currently only two keyframes defined, for time 0 and time 10. We will change the values for time 0 to tweak the camera animation.

▶ Select the row for time 0 in the dialog and click __Create Orbit__. This will show the Create Orbit dialog. The __Center__ value is the point around which the camera is rotated, __Normal__ is the vector used for the rotation and __Origin__ is the initial camera position. Set the __Normal__ value to be -Y up __(0, -1, 0)__, or else there might be some weird rotation during the orbit. Normally you will need to experiment in your own scenes to figure out correct values, but use the ones given below here and press __Ok__.

![](images/orbitparameters.png)

Press __Ok__ in the Animation Keyframes dialog to apply the new values. Play back the animation again and observe a nice rotation of the data as the coral grows.

▶ You can try experimenting with some different orbit parameters, to get visually different animations.

!!! Tip "Saving the animation to a video"

Although not part of this exercise, it is really easy at this point to save the animated view to a movie file from ParaView. Use **File → Save Animation** for this. You can either save to a sequence of images, or directly to a video file.
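A scripted equivalent might look like this (file name and frame rate are just examples; the output format is chosen based on the file extension):

```python
# Minimal sketch: write the current animation to a video file.
from paraview.simple import GetActiveView, SaveAnimation

SaveAnimation('coral_growth.avi', GetActiveView(), FrameRate=10)
```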
Binary file added docs/images/animation-keyframes.png
Binary file modified docs/images/streamtracerparams.png
Binary file added docs/images/volume-warning.png
7 changes: 4 additions & 3 deletions docs/preparation.md
All course materials, data files and ParaView binaries can be found at https://e…

## Install ParaView

In this course we are going to use ParaView. ParaView is an open-source, multi-platform scientific visualization and data analysis application. It is available for Windows, macOS and Linux.

We provide ParaView binaries *for Windows and Linux* for this course in the course materials share (see above). For macOS there are many variations available on https://www.paraview.org/download/.

This guide has been written for ParaView version **5.11**. Between ParaView versions small differences in GUI (and functionality) exist, but most of what is written in these notes should be easy to apply to other versions of ParaView.

!!! Warning "OpenGL compatibility"

Expand Down
1 change: 1 addition & 0 deletions mkdocs.yml
nav:
- exercise2.md
- exercise3.md
- bonus.md
- privacy.md

copyright: Copyright © 2018-2024 SURF
