I am back in Dresden. This was by far my most exhausting business trip ever. Siggraph Asia 2014 itself was quite alright, but it completely filled up the days in the Chinese time zone. And the work for the German time zone this week consisted of a submission to EuroVis 2015, the writing of a funding proposal and a final project report, and reading and commenting on a friend's Ph.D. thesis. There was little time left for sleeping and none at all for anything else. And that fun lasted the full seven days. My flights back to Germany were today, Sunday: from 4 am Shenzhen time until now, 8 pm German time. And tomorrow morning I will be giving a lecture. I am so looking forward to that.

Every once in a while I face extremely simple tasks for which I don't see the simple solution. Not long ago I had to concatenate two binary files into a single one. I really didn't know how to do that on Windows without major effort. I thought of many terrible ideas: copying everything onto a Linux machine, using a PowerShell script, googling for a tool, throwing together a small .NET tool myself. Luckily I decided to google for the problem and not for a tool. As a result I learned that Windows can do the job itself. Well, the old DOS commands can, and they are still available on Windows:

copy /y /b file1+file2 dest

Embarrassingly simple.
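For the record, the same thing can be done portably in a few lines of Python; the file names below are just placeholders for the demonstration:

```python
import shutil

def concat_files(dest, *sources):
    # Stream each source file into dest in order; copyfileobj works
    # in chunks, so nothing is read fully into memory.
    with open(dest, "wb") as out:
        for src in sources:
            with open(src, "rb") as f:
                shutil.copyfileobj(f, out)

# Tiny demonstration with throwaway files:
with open("file1.bin", "wb") as f:
    f.write(b"\x00\x01\x02")
with open("file2.bin", "wb") as f:
    f.write(b"\x03\x04")

concat_files("dest.bin", "file1.bin", "file2.bin")
```

But of course, on a Windows box the one-liner above needs no extra tooling at all.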

Important! This post is outdated: here is the update.

Some time ago I stopped using dual-boot systems. My work takes place on Windows and is based on the products of that ecosystem. I will not go into the details of the holy war between Windows and Linux, especially since Linux is an integral part of my work as well. The solution that currently works best for me is a Linux installation in VirtualBox. From the multitude of distributions I boringly selected Ubuntu. As a computer graphics and visualization researcher I obviously care about the OpenGL support in virtualization solutions. This week I set up a new virtual machine and was greeted by this error, which prevented my OpenGL applications from running:

libGL error: pci id for fd 4: 80ee:beef, driver (null)
OpenGL Warning: glFlushVertexArrayRangeNV not found in mesa table
OpenGL Warning: glVertexArrayRangeNV not found in mesa table
OpenGL Warning: glCombinerInputNV not found in mesa table
OpenGL Warning: glCombinerOutputNV not found in mesa table
OpenGL Warning: glCombinerParameterfNV not found in mesa table
OpenGL Warning: glCombinerParameterfvNV not found in mesa table
OpenGL Warning: glCombinerParameteriNV not found in mesa table
OpenGL Warning: glCombinerParameterivNV not found in mesa table
OpenGL Warning: glFinalCombinerInputNV not found in mesa table
OpenGL Warning: glGetCombinerInputParameterfvNV not found in mesa table
OpenGL Warning: glGetCombinerInputParameterivNV not found in mesa table
OpenGL Warning: glGetCombinerOutputParameterfvNV not found in mesa table
OpenGL Warning: glGetCombinerOutputParameterivNV not found in mesa table
OpenGL Warning: glGetFinalCombinerInputParameterfvNV not found in mesa table
OpenGL Warning: glGetFinalCombinerInputParameterivNV not found in mesa table
OpenGL Warning: glDeleteFencesNV not found in mesa table
OpenGL Warning: glFinishFenceNV not found in mesa table
OpenGL Warning: glGenFencesNV not found in mesa table
OpenGL Warning: glGetFenceivNV not found in mesa table
OpenGL Warning: glIsFenceNV not found in mesa table
OpenGL Warning: glSetFenceNV not found in mesa table
OpenGL Warning: glTestFenceNV not found in mesa table
libGL error: core dri or dri2 extension not found
libGL error: failed to load driver: vboxvideo

Surprisingly, a short Google search did not come up with a good solution. I only found some indications that the error is related to the VirtualBox guest additions and that one should always use the most recent ones. Of course I was already using the most recent ones.

After some more searching I learned that the error is related to rather new OpenGL extensions. These extensions seem to be too new for the graphics drivers of the virtualization: there appears to be a conflict between the extensions the virtual graphics card reports as supported and what the virtualized Linux graphics driver makes of them.

I was actually able to solve the error by downgrading my guest additions to an older version. With version 4.3.12 everything works fine; with any newer version up to 4.3.18 I get that error. So the ISO image of those guest additions will stay on my hard drive. I will wait and see whether newer versions of the guest additions or updates of Ubuntu fix the problem for good.

Inter-process communication has always been a pain, at least when dealing with platform-independent C++ environments. I have tried many things and none was really satisfying. Either the performance, the usability, or the maintainability and extensibility were not acceptable. I never found a method I would want to build on.

Without doubt, web services are a de-facto standard. But from my perspective, the perspective of a visualization and computer graphics guy, the overhead, in terms of both runtime performance and development, is not acceptable. I am not talking about data rates, architecture, or latencies alone. But take all these things together, apply them to interactive visualization, and you get a solution to 90% of your problems. The remaining 10%, however, are painfully bad. So you are forced to introduce a second communication channel, and you start implementing everything twice, again. Programming in computer graphics and interactive visualization is always so pitiably far away from nice software engineering. But I have chosen this problem as one of my research fields, and I have chosen to find fitting solutions to such problems. It is extremely thankless work, but to me it is an important topic and I believe it is worth the effort.

One rather new idea crawling around in my head revolves around JSON-RPC. The protocol is simple, small, almost trivial, and yet a sufficient solution to all the technical problems. That is a charming combination, especially because no concrete transport channel is prescribed. Queries and calls can be transported interleaved in the high-performance communication channel used for the rest of the data, but they do not have to be. I believe it only makes sense to look at this approach as an interpreting and managing layer.
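To illustrate how thin such an interpreting layer can be, here is a minimal sketch in Python using the JSON-RPC 2.0 message format; the `add` method and the in-memory "transport" are made up for the example, since the protocol itself does not care how the bytes arrive:

```python
import json

def handle_request(raw, methods):
    # Interpret one JSON-RPC 2.0 request (already received over *some*
    # channel) and produce the matching response string.
    req = json.loads(raw)
    result = methods[req["method"]](*req.get("params", []))
    return json.dumps({"jsonrpc": "2.0", "result": result, "id": req["id"]})

# Hypothetical method table; in a real system these would be application calls.
methods = {"add": lambda a, b: a + b}

request = json.dumps({"jsonrpc": "2.0", "method": "add", "params": [2, 3], "id": 1})
response = handle_request(request, methods)
```

The point is that the layer only interprets and dispatches; the raw strings can travel over a socket, a pipe, or interleaved in an existing data channel.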

And talking about implementation, here we have the catch. I do not want to claim I did an exhaustive search for libraries, but the ones I have seen so far were of disillusioningly low quality. While jsoncpp and jansson left good impressions as foundations, I cannot say the same about JsonRpc-Cpp and libjson-rpc-cpp. I do not want to badmouth these projects, but they do not fit my plans. For example, it seems impossible to compile them as DLLs on Windows without massively altering their source code.

Honestly, I do not know where this path is taking me. Well, within the next few weeks it will take me nowhere. I have plenty of other important problems on my desk to keep me occupied. But the idea remains interesting and I will continue on this topic.

Software is a product. At the same time it is a service. For one, it is the service of providing executable machine code for algorithms, which might even be freely accessible knowledge. In addition, it is the service of maintaining this machine code, meaning both removing errors and adapting to changing environments.

Especially the latter is a recurring issue with open source software. I am not opposing open source software, far from it. But, really, get your act together! Have some ambition and strive for high quality in your open source software, especially regarding non-functional aspects like usability and quality of service. The usual excuse “but it is for free” reduces the whole concept to absurdity.

The reason for this post is me finally switching from qtranslate to mqtranslate.

It is nice when you can rely on technology. I am always willing to pay a “little more” to get stuff that simply does what it should do and does not cause me problems. To carry some private data with me I bought a Kingston DataTraveler USB stick about 1 ½ years ago. I carry it on my key chain to have some important data with me at home and at work. Maybe you cannot easily spot in the picture I took of my stick why I am writing this post.


Yesterday was the Uni-Tag (open day presentation) at the TU Dresden. What a colossal waste of time. Because of “something” our demo was moved into a different building since last year. When I said that this was a bad thing because nobody would find us, I got answers like “No problem. We will put up signs and we will send people in your direction, blablabla.” The result: giving a three-hour demo, we were able to show our stuff to something like 20 people. What a joke. I really would have had better things to do with my Saturday.

It is frightening how little time remains for my private projects. All the more important for me to set my priorities right. Therefore, I decided to abort another one of my private projects: HexDuel.

I have not written anything about HexDuel on my blog. It was meant to be a smartphone game focusing on two-player gameplay. It was turn-based, optimized for touch input, and had a nice and extensible graphical design, at least in my opinion. Nevertheless, it was the project with the lowest priority. Still, it is a pity. It was a good idea.

It’s getting better. Slowly but steadily I am gaining speed. That is, I am making progress, both with my private projects and with my work.

The semester is running and I am having a lot of fun with my lecture.

New projects are starting and are looking good. Old projects come close to their conclusion and they do not look that bad either.

I am confident.