I am very cautious about photos of myself on the internet, and even more cautious about private information. However, this week there was a nice moment I want to share: if you are quick, you can see me through the web cam of the MS Europa (no, I was not traveling on the ship).

I am a Microsoft fan boy. This, I gladly admit. But one thing is for sure: the guys from Redmond are only human after all. And I have neither the time nor the patience for beta tests.

This week I upgraded my three PCs to Windows 10. All of them were running Windows 8.1 installed from the same image, and I use all three for roughly the same stuff. So, I was rather impressed by how different the results and installation experiences were, especially since I used the same downloaded Windows 10 image on all three computers:

My computer at work decided to forget the icons of most installed programs after the upgrade. This was because I had moved the useless installation cache from my tiny SSD onto a normal hard disk and linked it back. This works fine in every other respect, but the Windows setup decided to throw away the link before doing anything else. Smart move.

On my Surface Pro 3, Windows 10 welcomed me after the upgrade with some information on functions I could use if I had touch input. … Wut? Well, after two additional reboots the Surface remembered its core features, touch screen and pen. Only the quick-note button on the pen itself cannot be persuaded to use OneNote. Instead it stubbornly insists on using the useless OneNote impostor app. Annoying.

And, finally, on my private desktop almost everything worked fine. The only exception was that the setup uninstalled my antivirus software. It was fine with the same antivirus software on the other two computers, so I guess it was about time for it to stumble.

So much for that. Up to now, Windows 10 leaves an OK impression. Some things suck, some things seem improved, some things got worse. It’ll be fine. However, I haven’t seen any real killer feature yet.

P.S.: Microsoft cannot count! cf. Visual Studio 14

Most new data sets for my scientific visualization find their way to my desk in the form of arbitrarily structured text files. This is not really a problem. The first sensible step is converting them into a fast binary format for the visual analysis. With this, however, I face the problem of understanding the structure of 11-gigabyte text files (no exaggeration here!). But such files do have structure. So, only the first few and the last few lines really matter; the bits in between will look roughly the same. What I need are the commands “head” and “tail” known from Linux. However, I am a Windows guy. So? The PowerShell comes to the rescue:

gc log.txt | select -first 10 # head
gc log.txt | select -last 10 # tail

I found these on: http://stackoverflow.com/a/9682594 (where else)

At least the “head” version was fast and sufficient for me. (The “tail” variant still has to stream through the whole file, so do not expect it to be quick on gigabyte-sized input.) I am happy.
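Should the same files ever land on a Linux box, the originals work just as well. For comparison, a quick demo on a throw-away sample file (the path is arbitrary):

```shell
# The Linux originals of the two PowerShell one-liners above,
# demonstrated on a small sample file.
printf 'a\nb\nc\nd\ne\n' > /tmp/log.txt
head -n 2 /tmp/log.txt   # first two lines: a, b
tail -n 2 /tmp/log.txt   # last two lines: d, e
```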


We updated MegaMol to use cmake for building on Linux. This greatly improved the build process. But it also makes some less common scenarios harder to realize. For example, cmake usually detects required dependencies automatically, but in some scenarios you need to override this magic.

In this article I show how to compile a second MegaMol on a system on which one MegaMol has already been compiled and installed. This is useful when working with experimental versions.

VISlib and visglut

First off, you build the visglut the usual way. I assume here that the installed MegaMol uses a different visglut than the one you are building now:

mkdir megamol_x2
cd megamol_x2
svn co https://svn.vis.uni-stuttgart.de/utilities/visglut/tags/forMegaMol11 visglut
cd visglut/build_linux_make
make

If everything worked you can find the following files:

in megamol_x2/visglut/include:


and in megamol_x2/visglut/lib:


If so, let’s continue with the VISlib:

cd megamol_x2
svn co https://svn.vis.uni-stuttgart.de/utilities/vislib/tags/release_2_0 vislib
cd vislib

Now comes the first action which differs from the default build process. As usual, we will use the script cmake_build.sh. By default, however, this script registers the build directories in the cmake package registry, which enables cmake to find this package via its build trees. In this scenario we do not want this special build to be found automatically, because we do not want to get in the way of our system-installed MegaMol. We thus deactivate the package registry.

This command configures and builds the VISlib, both the debug and the release version:

./cmake_build.sh -dcmn

As always, if you encounter build problems due to the multi-job make, reduce the number of compile jobs:

./cmake_build.sh -dcmnj 1

Note that I do not specify an install directory. I do not plan to install this special MegaMol; I just want to build it, for example for a bug hunt.
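For the curious: with plain cmake (no wrapper script), the package-registry opt-out corresponds, as far as I know, to two standard cmake cache variables. A sketch, assuming a generic out-of-source build directory; this is not what cmake_build.sh literally runs:

```shell
# Sketch: deactivating the cmake package registry by hand.
# CMAKE_EXPORT_NO_PACKAGE_REGISTRY keeps export(PACKAGE) from writing
# entries to ~/.cmake/packages; CMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY
# keeps find_package() from reading that registry.
cmake -DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON \
      -DCMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY=ON \
      ..
```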


MegaMol Core

It’s now time for the core.

cd megamol_x2
svn co https://svn.vis.uni-stuttgart.de/projects/megamol/core/branches/v1.1rc core
cd core

We first test the configuration by only configuring release and not building anything:

./cmake_build.sh -cv ../vislib -C -DCMAKE_DISABLE_FIND_PACKAGE_MPI=TRUE

Note that I also disabled MPI-Support here. The system I am building on has MPI installed, but I don’t want this MegaMol to use it.

The output should contain this line:

-- Found vislib: /home/sgrottel/megamol20150726/vislib/build.release/libvislib.a

This points to the right VISlib, the one we specified. So all is well. We can now build MegaMol, again without registering its build trees in the cmake package registry:

./cmake_build.sh -dcmnv ../vislib -C -DCMAKE_DISABLE_FIND_PACKAGE_MPI=TRUE

When everything worked, you have got yourself the binaries:



MegaMol Console

Get yourself a working copy of the console:

cd megamol_x2
svn co https://svn.vis.uni-stuttgart.de/projects/megamol/frontends/console/branches/v1.1rc console
cd console

Again, we test if everything works by only configuring release and not building:

./cmake_build.sh -c -f ../core

The console does not register its build tree by default, since no other project depends on the console. So we are fine here.

The output should contain these lines:

-- Looking for MegaMolCore with hints: ../core;../core/build.release;../core/share/cmake/MegaMolCore
-- Found MegaMolCore: /home/sgrottel/megamol20150726/core/build.release/libMegaMolCore.so
-- MegaMolCore suggests vislib at: /home/sgrottel/megamol20150726/vislib/build.release
-- MegaMolCore suggests install prefix: /usr/local
-- Using MegaMolCore install prefix
-- Found vislib: /home/sgrottel/megamol20150726/vislib/build.release/libvislib.a
-- Found AntTweakBar: /home/sgrottel/AntTweakBar/lib/libAntTweakBar.so
-- Found visglut: /home/sgrottel/megamol20150726/visglut/lib/libvisglut64.a

If the directories for other libraries are wrong, for example for the AntTweakBar or the visglut, use the cmake-typical DIR variables to give search hints. Remember, relative paths might be confusing. Better use absolute paths. I don’t:

./cmake_build.sh -c -f ../core -C -Dvisglut_DIR=~/megamol20150709/visglut -C -DAntTweakBar_DIR=../../AntTweakBar
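One way to play it safe is to expand such hints into absolute paths before passing them, for example with readlink from GNU coreutils. A sketch; the directory here is only a stand-in for your actual checkout:

```shell
# Demo: expand a relative path into an absolute one before handing it
# to cmake as a search hint. readlink -f resolves ".." components and
# symlinks.
mkdir -p /tmp/megamol_demo/visglut   # stand-in for your real checkout
cd /tmp/megamol_demo
VISGLUT_HINT=$(readlink -f ./visglut)
echo "$VISGLUT_HINT"                 # prints /tmp/megamol_demo/visglut
```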

But in my case the cmake magic worked fine in the first place. So, I configure both build types again:

./cmake_build.sh -dc -f ../core

Double-check the output. Make sure the core, the VISlib and the visglut are found in all the right places. If they are, build it:

./cmake_build.sh -dm

At this point you can quickly test your MegaMol. First, open the configuration file megamol.cfg in a text editor and adjust the paths in there to yours. Then run MegaMol:

cd build.release
./MegaMolCon

If this seems ok, and if you have a local graphics card you can run the demo renderer:

./MegaMolCon -i demospheres s

Some MegaMol Plugin

Finally we need a plugin. I go for the mmstd_moldyn:

cd megamol_x2
svn co https://svn.vis.uni-stuttgart.de/projects/megamol/plugins/mmstd_moldyn/branches/v1.1rc mmstd_moldyn
cd mmstd_moldyn

The process is now exactly the same as with the console:

./cmake_build.sh -dcf ../core

Then double-check the directories for the core and the VISlib. If they are good, build the plugin:

./cmake_build.sh -dm

To test this plugin we go back to the console, and adjust the config file to load the plugin:

cd megamol_x2/console/build.release

Include the following lines in the config file. Obviously, adjust the paths to your setup:

<plugin path="/home/dude/megamol_x2/mmstd_moldyn/build.release" name="mmstd_moldyn.mmplg" action="include" />
<shaderdir path="/home/dude/megamol_x2/mmstd_moldyn/Shaders" />

If you now run MegaMol, it will try to load your plugin and will report it. The console output should contain something like:

200|Plugin mmstd_moldyn loaded: 11 Modules, 0 Calls
200|Plugin "mmstd_moldyn" (/home/sgrottel/megamol20150726/mmstd_moldyn/build.release/mmstd_moldyn.mmplg) loaded: 11 Modules, 0 Calls registered

And that’s it.


It is time again: in the coming week I will hold the last lectures for some time. I do like giving lectures; it is fun, and I believe I am good at it. Nevertheless, I am looking forward to getting rid of this part of my work again, at least for a little while. Then I can focus again on all the other unfinished issues on my desk, and there are a lot. For example, the release of the next version of MegaMol should have happened two weeks ago. I have to work on my upcoming paper projects. And, of course, the development of MegaMol needs to go on for future projects.

Yesterday I tried to write yet another “nothing new” post, following my normal posting schedule. However, for reasons I completely fail to understand, I could not log into my admin dashboard. I tried everything on the login troubleshooting guide until I reached “If everything else fails…”. Luckily, by then it was late, I was tired, and I did not want to try anything else.

And today, everything works fine again. … WTF …

Why is software so unreliable, and why does software these days fail so silently and so elusively?


This Monday and Thursday the VII. Annual Meeting of the Boltzmann-Zuse-Society for Computational Molecular Engineering takes place in Kaiserslautern, organized by Martin Horsch. Basically, it is a meeting of the groups from Stuttgart (VISUS, HLRS), Paderborn, Kaiserslautern and Dresden to talk about simulation, analysis and visualization of molecular dynamics data, and to discuss our joint research and development projects. Of course, Joachim and I will be talking about current work with and on MegaMol. By the way: http://megamol.org



This semester, preparing and giving my lecture eats up a lot of time. And there is so much else I need to do, too. Luckily, I get help from my co-workers and student assistants. And if I did not have fun doing all this, I would have changed jobs a long time ago.

In my lecture, by the way, I teach programming in C++. The focus is on the language itself and on using it correctly, with interactive computer graphics as the scenario. In the practical part of the lecture, my students can write a small computer game. I am very satisfied with the base code skeleton we prepared this year, and I am curious what the students will make of it in the final exercise.

It is now two weeks since Joachim Staib and I attended the EuroVis 2015 in Cagliari, Italy. I thought it was about time to write about what we did there. It actually was a very successful conference for our computer graphics and visualization group from the TU Dresden.

First off, Joachim gave a talk on our work on particle visualization with transparency and ambient occlusion. I write “our work”, but he did a wonderful job making this project fly. He truly earned the credit of being first author.

  • [DOI] J. Staib, S. Grottel, and S. Gumhold, “Visualization of Particle-based Data with Transparency and Ambient Occlusion,” Computer Graphics Forum, vol. 34, iss. 3, pp. 151-160, 2015.

DOI: 10.1111/cgf.12627

Then, I gave a talk at one of the smaller workshops co-located with the conference. I presented work on the visualization of flood simulation data. The focus is realistic rendering for interactively “experiencing” the data. The bar for my goal is rightly set by current AAA games. I am ready for the next round.

  • [DOI] S. Grottel, J. Staib, T. Heyer, B. Vetter, and S. Gumhold, “Real-Time Visualization of Urban Flood Simulation Data for Non-Professionals,” in Workshop on Visualisation in Environmental Sciences (EnvirVis), Cagliari, Italy, 2015, pp. 37-41.

DOI: 10.2312/envirvis.20151089

And, finally, my CGF journal publication on continuous-time parallel coordinates was invited to the conference as well. I already wrote about that paper.

  • [DOI] S. Grottel, J. Heinrich, D. Weiskopf, and S. Gumhold, “Visual Analysis of Trajectories in Multi-Dimensional State Spaces,” Computer Graphics Forum, vol. 33, iss. 6, pp. 310-321, 2014.

DOI: 10.1111/cgf.12352

So, altogether, I am perfectly happy with the conference. Now, it is just about not getting worse.