I like working with software. I have fun handling large data sets and producing interactive 3D visualizations which are useful, nice, and cool. To create them I, of course, need a whole lot of functions, libraries, and services. It is a painful fight until all those parts fit nicely together. But I would be lying if I said that this fight is not part of the fun. It is great seeing the parts from different people come together into a working whole. In my current projects, source code from colleagues in Dresden, ex-colleagues in Stuttgart, my student assistants, and the students I advise all combines.

However, the real pain is the interfaces and the stability of those parts. Of course, you could swap the implementation of anything behind a good interface without influencing the rest of the system. But someone still needs to show me a good interface! I haven't seen one without its problems, and I have seen many. The continuous development gets increasingly difficult with each component added, especially if there is no overseeing architect.

MegaMol high-level architecture

As a matter of fact, MegaMol is currently heading exactly down that path. It is actively used in Stuttgart and Dresden, and I, of course, want to foster and extend that use. After all, MegaMol is my project and I am proud of it. But the larger it gets, and the more it is used, the harder it becomes to continue the development. Several fundamental problems can only be solved with fundamental changes, and these will cause displeasure. I am also not sure how to handle the necessary workload. On the one hand, I am reducing my active development to act more like a director of the future development. On the other hand, the actual development work is passed from one Ph.D. student to the next, like a hot potato. This, of course, does not help. I am really not sure how to solve this.

MegaMol Module Graph (doi: 10.1109/TVCG.2012.282)

But I have my plans. “Partitioning”! And I know exactly which parts need my priority and which are critical. The main problem, however, is the interfaces in between. Those are far from clean, and therefore the whole current partitioning is in vain. The first attempts to solve that were … semi-successful. But I don’t even think of quitting!

Every once in a while I face extremely simple tasks for which I don’t see the simple solution. Not long ago I had to concatenate two binary files into a single one. I really didn’t know how to do this without major effort on Windows. I thought of many terrible ideas: copying everything onto a Linux machine, using a PowerShell script, googling for a tool, whipping up a small .NET tool myself. Luckily, I decided to google for the problem and not for a tool. As a result, I learned that Windows can do the job itself. Well, the old DOS commands can, and they are still available on Windows:

copy /y /b file1+file2 dest

Embarrassingly simple.

Important! This post is outdated: here is the update.

Some time ago I stopped using dual-boot systems. My work takes place on Windows and is based on the products of that ecosystem. I will not go into details of the holy war between Windows and Linux, especially since Linux is an integral part of my work as well. The solution currently working best for me is a Linux installation inside VirtualBox. From the multitude of distributions, I boringly selected Ubuntu. As a computer graphics and visualization researcher, I obviously care about the OpenGL support of virtualization solutions. This week I set up a new virtual machine and was greeted by this error, which prevented my OpenGL applications from running:

libGL error: pci id for fd 4: 80ee:beef, driver (null)
OpenGL Warning: glFlushVertexArrayRangeNV not found in mesa table
OpenGL Warning: glVertexArrayRangeNV not found in mesa table
OpenGL Warning: glCombinerInputNV not found in mesa table
OpenGL Warning: glCombinerOutputNV not found in mesa table
OpenGL Warning: glCombinerParameterfNV not found in mesa table
OpenGL Warning: glCombinerParameterfvNV not found in mesa table
OpenGL Warning: glCombinerParameteriNV not found in mesa table
OpenGL Warning: glCombinerParameterivNV not found in mesa table
OpenGL Warning: glFinalCombinerInputNV not found in mesa table
OpenGL Warning: glGetCombinerInputParameterfvNV not found in mesa table
OpenGL Warning: glGetCombinerInputParameterivNV not found in mesa table
OpenGL Warning: glGetCombinerOutputParameterfvNV not found in mesa table
OpenGL Warning: glGetCombinerOutputParameterivNV not found in mesa table
OpenGL Warning: glGetFinalCombinerInputParameterfvNV not found in mesa table
OpenGL Warning: glGetFinalCombinerInputParameterivNV not found in mesa table
OpenGL Warning: glDeleteFencesNV not found in mesa table
OpenGL Warning: glFinishFenceNV not found in mesa table
OpenGL Warning: glGenFencesNV not found in mesa table
OpenGL Warning: glGetFenceivNV not found in mesa table
OpenGL Warning: glIsFenceNV not found in mesa table
OpenGL Warning: glSetFenceNV not found in mesa table
OpenGL Warning: glTestFenceNV not found in mesa table
libGL error: core dri or dri2 extension not found
libGL error: failed to load driver: vboxvideo

Surprisingly, a short Google search did not turn up a good solution. I only found some indications that the error is related to the VirtualBox guest additions and that one should always use the most recent ones. Of course, I did use the most recent ones.

After some more searching I learned that the error is related to rather new OpenGL extensions. These extensions seem to be too new for the graphics drivers of the virtualization: there appears to be a conflict between the extensions the virtual graphics card reports as supported and what the Linux guest graphics driver makes of them.

I was actually able to solve the error by downgrading my guest additions to an older version. With version 4.3.12 everything works fine; with any newer version up to 4.3.18 I get the error. So the ISO image of those guest additions will stay on my hard drive. I will wait and see whether newer versions of the guest additions, or updates of Ubuntu, fix the problem for good.

Inter-process communication has always been a pain, at least when dealing with platform-independent C++ environments. I have tried many things and none was really satisfying. Either the performance, the usability, or the maintainability and extensibility were not acceptable. I never found a method I would want to build on.

Without doubt, web services are a de-facto standard. But from my perspective, the perspective of a visualization and computer graphics guy, the overhead, in terms of runtime performance and development effort, is not acceptable. I am not talking about data rates, architecture, or latencies alone. But take all these things together and apply them to interactive visualization, and you get a solution to 90% of your problems. The remaining 10%, however, are painfully bad. So you are forced to introduce a second communication channel, and you start implementing everything twice, again. Programming in computer graphics and interactive visualization is always so pitiably far away from nice software engineering. But I have chosen this as one of my research fields, and I have chosen to find fitting solutions to such problems. It is extremely fruitless work, but to me it is an important topic and I believe it is worth the effort.

One rather new idea crawling around in my head concerns JSON-RPC. The protocol is simple, small, almost trivial, but still a sufficient solution to all technical problems. That is a charming combination, especially because no concrete transport channel is prescribed. Queries and calls can be transported interleaved in the high-performance communication channel used for the rest of the data, but they do not have to be. I believe it only makes sense to look at this approach as an interpreting and managing layer.

And talking about implementations, here is the catch. I do not want to claim I did an exhaustive search for libraries. But the libraries I have seen so far were of disillusioningly low quality. While, as foundations, jsoncpp and jansson left good impressions, I cannot say the same about JsonRpc-Cpp and libjson-rpc-cpp. I do not want to badmouth these projects, but they do not fit my plans. For example, it seems impossible to compile these as DLLs on Windows without massively altering their source code.

Honestly, I do not know where this path will take me. Well, within the next few weeks it will take me nowhere; I have plenty of other important problems on my desk to keep me occupied. But the idea remains interesting and I will continue on this topic.

My work is nicely picking up speed and my projects evolve as expected. Ok, maybe they evolve a bit slower than I would like, but I can be satisfied.

As part of all this development, I finally updated my research profile here on my own website. Soon there will be news on my research and my current projects.

Last week was full of work. Somehow, I write something like this every week. Well…

Together with two colleagues I worked on a submission for a conference last week. A nice paper about a visualization technique. Of course, I cannot say more about it until it is accepted for publication. We will see. We did a good job and I am confident. Well, I was also confident with most papers that got rejected. Whatever.

Additionally, there was good news last week: a paper by another colleague of mine, in which I was involved, was accepted for publication at MultiMedia Modeling 2015:

  • [DOI] M. Spehr, S. Grottel, and S. Gumhold, “Wifbs: A Web-based Image Feature Benchmark System,” in MultiMedia Modeling – 21st Anniversary International Conference, MMM 2015, Sydney, Australia, January 5-7, 2015, Proceedings, 2015, pp. 159-170.
    [Bibtex]
    @inproceedings{spehr2015mmm,
      author    = {Marcel Spehr and
                   Sebastian Grottel and
                   Stefan Gumhold},
      title     = {Wifbs: A Web-based Image Feature Benchmark System},
      booktitle = {MultiMedia Modeling - 21st Anniversary International Conference, {MMM} 2015, Sydney, Australia, January 5-7, 2015, Proceedings},
      editor    = {Xiangjian He and Suhuai Luo and others},
      year      = {2015},
      pages     = {159--170},
      doi       = {10.1007/978-3-319-04114-8_2},
    }

I don’t want to take credit for others’ achievements. The idea, the implementation, the system, and the publication were mostly the work of my colleague Marcel Spehr. Great work. All I did was help out with some details, point in some directions, and help with writing the paper itself.

I like systems papers. They are work beyond simple software used in research. These systems, the one presented here and my MegaMol, have the potential to stay useful for a long time.

Today, I am only writing a short note on MegaMol.

We have done it! We published MegaMol as a systems paper:

  • [DOI] S. Grottel, M. Krone, C. Müller, G. Reina, and T. Ertl, “MegaMol — A Prototyping Framework for Particle-based Visualization,” Visualization and Computer Graphics, IEEE Transactions on, vol. 21, iss. 2, pp. 201-214, 2015.
    [Bibtex]
    @article{grottel2014megamol,
        author={Grottel, S. and Krone, M. and M\"{u}ller, C. and Reina, G. and Ertl, T.},
        journal={Visualization and Computer Graphics, IEEE Transactions on},
        title={MegaMol -- A Prototyping Framework for Particle-based Visualization},
        year={2015},
        month={2},
        volume={21},
        number={2},
        pages={201--214},
        keywords={Data models;Data visualization;Graphics processing units;Libraries;Rendering (computer graphics);Visualization},
        doi={10.1109/TVCG.2014.2350479},
        ISSN={1077-2626}
    }

DOI: 10.1109/TVCG.2014.2350479

All the hard work really paid off. MegaMol has now been published in the IEEE Journal “Transactions on Visualization and Computer Graphics”, in short TVCG. That is the top journal of the visualization community. I have to admit, I am pretty proud.

And I am curious what will come next. I would like to continue working on MegaMol and to help evolve the software even further. But, of course, this depends on my future employment. MegaMol has such potential. *sigh*

Today I want to talk about one of my most recently published research papers, on the visualization of multi-dimensional trajectories. It is electronically available at the Wiley Online Library (http://onlinelibrary.wiley.com/doi/10.1111/cgf.12352/abstract): Visual Analysis of Trajectories in Multi-Dimensional State Spaces [1].

First off, what is a multi-dimensional trajectory? We were investigating the state of complex systems, like automation systems or robots. Each element of such a system, e.g. a robotic motor or a sensor, holds several state variables, like a sensed temperature or the torque applied by the motor. These variables might even be vectors. But even if they are only scalar values, the system is made up of several dozen such elements. Thus, the state of the whole system is always a vector containing the state variables of all components. For the systems we investigated, these vectors hold several tens of variables. This order of magnitude is referred to by the term multi-dimensional, as opposed to high-dimensional, which refers to data with several hundred or thousand dimensions. The whole system state can be understood as a point in this multi-dimensional state space. Now, our system is not static, but is monitored in real time. Thus the values of the state variables change: temperatures rise and motors move. This can be interpreted as the point of the system state moving through the state space. This movement path is what we call the trajectory.


Our approach to visualizing this trajectory used classical metaphors of multi-dimensional data visualization, namely scatterplot matrices and parallel coordinate plots. We supplemented these plots with additional views, like a temporal heat map. The main aspect of our work is the technique we used to generate the plots. Normally, the sample points of the data are simply drawn into the plots as points or poly-lines. We, however, took the nature of the data into account, namely the temporal continuity of the discretely sampled signal, and constructed a corresponding integration concept for continuous plots. Our work builds on previous work on continuous scatterplots and continuous parallel coordinate plots, which used spatially continuous interpolation; we adapted this concept to continuous-time interpolation.


[1] [doi] S. Grottel, J. Heinrich, D. Weiskopf, and S. Gumhold, “Visual Analysis of Trajectories in Multi-Dimensional State Spaces,” Computer Graphics Forum, vol. 33, iss. 6, pp. 310-321, 2014.
[Bibtex]
@article{Grottel2014HDTraj,
  author  = {Grottel, Sebastian and Heinrich, Julian and Weiskopf, Daniel and Gumhold, Stefan},
  title   = {{Visual Analysis of Trajectories in Multi-Dimensional State Spaces}},
  year    = {2014},
  journal = {Computer Graphics Forum},
  volume  = {33},
  number  = {6},
  pages   = {310--321},
  doi     = {10.1111/cgf.12352}
}