I came across that NuGet package because I was writing WPF UI applications and needed a folder picker. Someone on the internet suggested using the WinForms dialogs. I sort of hate mixing two UI frameworks. And then someone else brought up the NuGet package WindowsAPICodePack.Shell, with its class representation of the Windows Common Dialogs, including the capability of picking folders in the open dialog window.

So, I started using that. Then, at some point, a friend pointed out to me that the NuGet package I was using did not look like an official Microsoft package, but like a repackage someone did. That made me stop and think. I don’t assume any bad intent, but I found it very strange. The official package vanished. And there is a huge load of packages with strange names:

I don’t like this. Even if there is no ill intent from any of the authors, I still don’t like this, as it reeks of fraud, phishing, and vulnerabilities. Sorry, but, no.

So, where is the official package? It seems to have disappeared; that’s why there are the repackages by community members. Why did it disappear? No idea. Maybe it got caught in a semi-automatic cleanup as it was orphaned. Someone suggested it’s replaced by Microsoft.Windows.SDK.Contracts.

In the end, I replaced the code in my projects, either by using the WinForms dialog or by writing a very small P/Invoke wrapper class calling the Win32 API directly. If you are interested, have a look:

https://github.com/sgrottel/open-here/commit/9de68198e35f0f6dec9386372cc71bada54c2f5b
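
The linked commit is the real thing. To give an idea of the general shape of such a small wrapper, here is a minimal sketch built on the classic SHBrowseForFolder API; treat it as a simplified stand-in for illustration, not necessarily the approach taken in the commit:

using System;
using System.Runtime.InteropServices;
using System.Text;

internal static class FolderPicker
{
    [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
    private struct BROWSEINFO
    {
        public IntPtr hwndOwner;
        public IntPtr pidlRoot;
        public IntPtr pszDisplayName;
        public string lpszTitle;
        public uint ulFlags;
        public IntPtr lpfn;
        public IntPtr lParam;
        public int iImage;
    }

    private const uint BIF_RETURNONLYFSDIRS = 0x0001; // only file system folders
    private const uint BIF_NEWDIALOGSTYLE = 0x0040;   // resizable dialog

    [DllImport("shell32.dll", CharSet = CharSet.Unicode)]
    private static extern IntPtr SHBrowseForFolderW(ref BROWSEINFO lpbi);

    [DllImport("shell32.dll", CharSet = CharSet.Unicode)]
    private static extern bool SHGetPathFromIDListW(IntPtr pidl, StringBuilder pszPath);

    [DllImport("ole32.dll")]
    private static extern void CoTaskMemFree(IntPtr pv);

    public static string? PickFolder(IntPtr ownerHwnd, string title)
    {
        BROWSEINFO bi = new()
        {
            hwndOwner = ownerHwnd,
            lpszTitle = title,
            ulFlags = BIF_RETURNONLYFSDIRS | BIF_NEWDIALOGSTYLE
        };
        IntPtr pidl = SHBrowseForFolderW(ref bi);
        if (pidl == IntPtr.Zero) return null; // user canceled
        try
        {
            StringBuilder path = new(260); // MAX_PATH
            return SHGetPathFromIDListW(pidl, path) ? path.ToString() : null;
        }
        finally
        {
            CoTaskMemFree(pidl); // the PIDL is allocated by the shell
        }
    }
}

The modern alternative is the IFileOpenDialog COM interface with the FOS_PICKFOLDERS option, which needs more interop declarations but gives the current dialog style.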

The moral of the story is: a NuGet package is only as good as the people maintaining it. And I mean people, not organizations. Because, in the end, it comes down to whether or not individuals want to give their best.

GLM has been released in version 1.0.0, and I updated the NuGet package accordingly.

With this, the NuGet package source code also moved from Bitbucket to Github.

I am not sure if updating NuGet packages is worth writing about. I’d rather say it’s business as usual. But this instance is special, as the GLM developer decided to go for 1.0.0. I want to explicitly congratulate them on this occasion! It’s rather embarrassing how many projects seem to fear the “sonic barrier” of this step, finally switching from “I don’t know” to “This is sort of what I wanted to do, and it is sort of done in its first version.” I believe many more projects are mature enough and stable enough to take that step. But they don’t. I don’t know why.

This is my first post in 2024, and amazingly, I managed to not post anything for seven and a half months. That might be a sad record. Well, it’s not that I did not do anything publicly available to the community in that time. It does show, however, that my priorities lie somewhere other than this blog. But I don’t want to give this up just yet.

Anyhow, I don’t want to just write down a blown-up, lazy list of what I did, e.g. repeating my Github journal, because that would be just sad. If you are interested in what happened in my Tiny Tools, Everything Search Client, Checkouts Overview, OpenHere, LittleStarter, MyToDoBoard, updated NuGet packages, SimpleLog, etc., I suggest you have a look at my Github profile directly. I guess this is the non-blown-up list.

So, for today, I leave this little sad “I am still alive” message here.

The EverythingSearchClient now has the capability to return the search results sorted. The new release v0.4 is available on Github and as a NuGet package.

The addition was fairly straightforward, as the functionality to return the search results sorted is implemented in Everything itself. I just ported the respective configuration flags into my CSharp client library and extended the code which creates the interprocess communication query (version 2).
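
As a small usage sketch, requesting sorted results looks roughly like this. The type and member names here are written from memory and may not match the library exactly, so treat them as placeholders and check the project README for the real signatures:

using EverythingSearchClient;

class SortedSearchDemo
{
    static void Main()
    {
        SearchClient everything = new();

        // "SortBy.Path" is a placeholder name for one of the new sort flags;
        // the sorting itself is performed by the Everything service.
        Result res = everything.Search("*.png", SearchClient.SortBy.Path);

        foreach (Result.Item item in res.Items)
        {
            System.Console.WriteLine($"{item.Path}\\{item.Name}");
        }
    }
}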

With this, I currently have no further features planned for my EverythingSearchClient. I am already using it in a couple of my own projects. If you use it as well, I would be happy to hear about it. :-)

A lot of stuff has been happening around Lua and its NuGet package.

Lua 5.4.5 has been released. So, it was time for an update of my Lua NuGet package. And for a surprise. And for some pain.

In the end, all was resolved, and the new version of the Lua NuGet package is online.

Although, Lua 5.4.5 was removed again in favor of a hopefully quickly upcoming Lua 5.4.6.

Update 20230514: Lua 5.4.6 is here!

Surprise: The Lua NuGet package was unlisted!?

Some months ago, my notifications from Github stopped working. No idea why. But since I don’t have much traffic there, fixing it was never a high priority for me. For that reason, however, I nearly missed this issue raised by Smurf-IV: “[Question] Why does nuget state that the packages have been witdrawn ?”

What? I have absolutely no idea what happened there. But I have a guess. The NuGet package was originally created by the CoApp organization. However, at some point they stopped updating it. When I needed a newer version of Lua, I decided to create a NuGet package myself. I already had good experience in creating native C++ NuGet packages, so I gave it a try, and it worked nicely. At that point I reached out to the CoApp org and asked if they would let me co-author the NuGet package for Lua. My idea was that I could publish this package using the nice and precise NuGet package id. They agreed, gave me co-authorship, and I have been publishing NuGet packages for every release of Lua ever since.

And now the package was unlisted by “the author” and marked as being deprecated and having critical bugs. What? I can tell you that it was not me!

The “critical bugs” thing worried me a bit. In my previous job, I was also tasked with regularly screening our code base for CVEs. So, I do know a couple of things to look at. I did a quick check for Lua, and yes, there are CVEs, but nothing out of the ordinary. Nothing severe enough to require the package to be locked down that hard.

And another thing became apparent: the CoApp org no longer seems to own any NuGet packages at all. None. Which is quite surprising.

So, my best guess is: the CoApp org on NuGet was maybe disbanded, but surely stopped its commitment to maintaining any of the NuGet packages. As a precaution, they shut down all the NuGets, and maybe automatically marked all of them as deprecated and legacy. I am not sure about the “critical bugs” thing. That is why I assume an automatic “shutdown,” which indiscriminately flagged and unlisted all packages.

Well, my commitment to maintaining the Lua NuGet package stands!

I will continue to release version updates and similar. Therefore, I (re-)listed the 5.4.x versions of the Lua package and removed the warning flags about being legacy, deprecated, or having critical bugs. I could only speculate about the reasoning of the CoApp group. But I won’t.

Pain: Moving from AppVeyor to Github Actions

All that happened in the wake of the release of version 5.4.5 of Lua. One task which had been on my ToDo list for a long time was to migrate and consolidate the different code and CI platforms. Up until now, I built the different flavors of Lua for the NuGet package on AppVeyor. And I had planned to move this task to Github Actions for a long time. So, I decided that now was finally the time to do that. One reason was an upcoming public holiday in Germany, which gave me the extra time to take care of it.

While Github Actions are quite nice for straightforward projects, and while they are quite powerful, I immediately ran into an issue: my NuGet package also contains built flavors from older Visual C++ toolsets, currently down to v120. And the runner VM of the Github Actions only comes with the most recent Visual C++ compilers and toolsets installed.

And so, a long run of trial and error started for me. A lot of pain, headaches, and “why, why, why…” moments. In the end, these are the solutions I came up with:

How To: Adjust Visual Studio 2022 installation on Github Actions Runner

Let’s start easy. The Github runner comes with Visual Studio Enterprise 2022 installed. So, the most recent toolchains should work right away, namely Visual C++ v143, v142, and v141, and they did. Next up is toolchain v140, which can be installed via the Visual Studio Installer. So, that should be easy. Well, in theory, yes. The Visual Studio Installer offers command line arguments to modify an installation and to add components, and that is what I need to do. The only issue is, the Installer is not a console application. When it is started, the calling console immediately returns, so a job in the workflow Yaml won’t wait, at least not automatically. I need to invoke the Installer with `Start-Process` and explicitly tell the command to wait for the spawned process to finish.

When you run the Installer locally within a command prompt, like a Powershell, you will see output messages. When running on Github CI, you don’t. As I said, the Installer is not a console application. As such, it likely performs a manual console host connection to push in its output messages. And on the Github CI, that console host likely does not exist, as the process output is redirected by the CI agent process into a temporary file. As a result, the Installer does not find a console host, and thus does not print any output messages. I am flying blind. To have some info on whether or not the command succeeded, I use `vs_installer.exe export` to dump the list of installed components. That’s not beautiful, but it is good enough for manual debugging.

So, this is what worked:

- name: Modify Visual Studio 2022
  if: matrix.toolConfig.vs2022addMod != ''
  shell: pwsh
  run: |
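    # Note: the VS Installer is not a console application; the call returns
    # immediately, so it is invoked via Start-Process with -Wait below to
    # block this step until the modification actually finishes.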
    $component = "${{matrix.toolConfig.vs2022addMod}}"
    Write-Host "Adding:" $component

    Start-Process -FilePath "C:\Program Files (x86)\Microsoft Visual Studio\Installer\vs_installer.exe" -ArgumentList "modify","--add",$component,"--installPath","`"C:\Program Files\Microsoft Visual Studio\2022\Enterprise`"","--passive","--includeRecommended","--norestart","--noUpdateInstaller" -Wait

    Start-Process -FilePath "C:\Program Files (x86)\Microsoft Visual Studio\Installer\vs_installer.exe" -ArgumentList "export","--config","info-post.txt","--installPath","`"C:\Program Files\Microsoft Visual Studio\2022\Enterprise`"","--passive","--noUpdateInstaller" -Wait
    Write-Host "VS2022 Components after modification:"
    Write-Host (gc info-post.txt)

    Write-Host "Added:"
    Write-Host (gc info-post.txt | Select-String $component)

How To: Visual Studio 2013 on Github Actions Runner

This leaves the build with the oldest still-supported toolchain: v120, aka Visual Studio 2013, and this one is a real pain! I cannot blame anyone for not supporting it, as this is over-ten-year-old software. I would not want to give support for something like that. But, with this NuGet package, I sort of do.

The current Visual Studio does not support this toolchain. So, I actually need to have Visual Studio 2013. Ask the Internet what to do, and it will tell you to install Visual Studio 2013, e.g. the free “Express” Edition, via Chocolatey. I tried, and it failed with the most cryptic error message ever:

“Exit code indicates the following: Generic MSI Error. This is a local environment error, not an issue with a package or the MSI itself […]” bla bla bla.

It gives a lot of wrong hints as to what to do. The actual reason is quite different: the package actually only downloads the bootstrap installer of Visual Studio, around 1.2 MB. The rest of it would have been downloaded on demand during installation. And there is the issue. Microsoft removed those old download sources, as they were vulnerable because they were signed with SHA-1 hashes. And with that, all those Chocolatey packages are now broken. … I might be wrong, but the effects and the sparse info I carved out of the log files really look like that.

Meaning, I need a full Visual Studio 2013. At this time, it is officially available through Microsoft’s download website my.visualstudio.com. After logging in and clicking through a dynamic website, that is. For my automation purposes, not acceptable. So, I rehost the DVD ISO on my OneDrive. I’d say that’s OK, but not ideal. And now I need to install the full Visual Studio 2013. Sounds expensive. Sounds painful. Sounds slow. Sounds like a job for Docker. … Well, yes, but sadly, I would need a Windows Docker image, which would need a Windows Docker host, and that does not seem to be supported on Github yet. So, yes, I need to install Visual Studio 2013 on the runner.

- name: Install VS 2013 tools
  if: matrix.toolConfig.toolset == 'v120'
  shell: pwsh
  working-directory: ${{env.GITHUB_WORKSPACE}}
  run: |
    Write-Host "Downloading VS 2013 Express for Windows Desktop (w Update 5) DVD ISO..."
    Invoke-WebRequest -Uri "${{ secrets.VS2013ISO_URL }}" -OutFile dvd.iso
    gci dvd.iso
    $dvdiso = (Resolve-Path .\dvd.iso).Path

    Write-Host "Mounting Disk..."
    $DiskImage = Mount-DiskImage -ImagePath $dvdiso -StorageType ISO -PassThru

    $volumeInfo = (Get-Volume -DiskImage $DiskImage)
    $volumeInfo

    cd ($volumeInfo.DriveLetter + ":\")
    gci

    Write-Host "Installing Visual Studio 2013 Express..."
    Start-Process -FilePath ($volumeInfo.DriveLetter + ":\wdexpress_full.exe") -ArgumentList "/Q","/NoRestart","/NoRefresh","/NoWeb","/Full" -Wait

    Write-Host "Unmounting Disk..."
    Dismount-DiskImage -DevicePath $DiskImage.DevicePath

And this works. And, as expected, it is painfully slow. To limit the overhead of this installation step, it makes sense to reduce the width of the job matrix in the workflow: have only one job perform this installation and afterwards run all affected build steps sequentially. In the end, this is almost a factor of 4 faster than you might expect, because the massive installation overhead is paid only once. Luckily for me, this workflow is not a pull-request blocker during active development, but only a release automation.

Final Thoughts

In the end, the migration to Github Actions works. Was it worth it? I mean, Lua releases like once every two years. Do I even need CI for that? And, anyway, isn’t native NuGet dead, replaced by other means, like vcpkg, or something?

Well, all of this is true. I did not follow through with this undertaking because I needed CI for my Lua NuGet package. My main reason is “because I can,” or, really, “because I can’t, yet.” It was a good opportunity to improve my Github-Actions-Fu. My recommendation to you: if you see such an opportunity to learn and improve your skills, with the benefit of getting something usable out of it, do it.

And, now I am waiting for Lua 5.4.6 and maybe new surprises…

I added another new tool to my Tiny Tools Collection: ToggleDisplay

Code: https://github.com/sgrottel/tiny-tools-collection/tree/main/ToggleDisplay
Released Binary: https://github.com/sgrottel/tiny-tools-collection/releases/tag/ToggleDisplay-v1.0

It allows you to enable, disable, and toggle a display.

Why? My computer is connected to 2-3 displays: two computer monitors on my desk for work, and a TV on the other side of the room, e.g. to play games from my computer or to watch video files in style.

Often enough, I boot the computer and then my mouse disappears from the desktop, because I forgot the TV was configured “on” before, and the mouse moved beyond the desktop monitors. Annoying. The built-in feature “Windows-Key + P” is understandably limited to two monitors. So, I always had to press “Windows + P,” then “Further Settings,” wait for the dialog to appear, fiddle around, press apply, … you get my point.

So, I researched the net a bit on how to programmatically enable or disable a display. And there are several free tools to do that. I tried two, and neither worked. Then there is a hack using a Windows 10 executable on Windows 11. Yeah, no. Ok. Search on!

It turns out there is an easy API for that: ChangeDisplaySettingsEx. Some experimental code later, I was able to deactivate the display, but not to (re-)activate it. Not good enough. Search on!

Some searching later, it turns out there is a second API, not as simple and with next to no useful documentation: SetDisplayConfig. This one seems to be the API the Windows built-in display settings dialog uses. But … how? I found code by “PuFF1k” on StackOverflow (https://stackoverflow.com/a/62038912/552373) who reverse engineered the API calls of the Windows dialog. I tried his code, and it works. Nice! Thank you, PuFF1k!

The core of the trick is to not provide any modeInfo data to SetDisplayConfig, and to set sourceInfo.modeInfoIdx and targetInfo.modeInfoIdx of all paths to DISPLAYCONFIG_PATH_MODE_IDX_INVALID.
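
To illustrate the idea, here is a heavily condensed C# sketch of that trick. It is an illustration under assumptions (the struct padding and the exact flag combination are distilled from my understanding of PuFF1k’s answer), not the actual ToggleDisplay code:

using System;
using System.Runtime.InteropServices;

internal static class DisplayToggle
{
    [StructLayout(LayoutKind.Sequential)]
    private struct LUID { public uint LowPart; public int HighPart; }

    [StructLayout(LayoutKind.Sequential)]
    private struct DISPLAYCONFIG_RATIONAL { public uint Numerator; public uint Denominator; }

    [StructLayout(LayoutKind.Sequential)]
    private struct DISPLAYCONFIG_PATH_SOURCE_INFO
    {
        public LUID adapterId;
        public uint id;
        public uint modeInfoIdx;
        public uint statusFlags;
    }

    [StructLayout(LayoutKind.Sequential)]
    private struct DISPLAYCONFIG_PATH_TARGET_INFO
    {
        public LUID adapterId;
        public uint id;
        public uint modeInfoIdx;
        public uint outputTechnology;
        public uint rotation;
        public uint scaling;
        public DISPLAYCONFIG_RATIONAL refreshRate;
        public uint scanLineOrdering;
        public int targetAvailable;
        public uint statusFlags;
    }

    [StructLayout(LayoutKind.Sequential)]
    private struct DISPLAYCONFIG_PATH_INFO
    {
        public DISPLAYCONFIG_PATH_SOURCE_INFO sourceInfo;
        public DISPLAYCONFIG_PATH_TARGET_INFO targetInfo;
        public uint flags;
    }

    // The union payload is omitted; Size pads the struct to its native 64 bytes.
    [StructLayout(LayoutKind.Sequential, Size = 64)]
    private struct DISPLAYCONFIG_MODE_INFO
    {
        public uint infoType;
        public uint id;
        public LUID adapterId;
    }

    private const uint QDC_ALL_PATHS = 1;
    private const uint DISPLAYCONFIG_PATH_ACTIVE = 0x1;
    private const uint DISPLAYCONFIG_PATH_MODE_IDX_INVALID = 0xFFFFFFFF;
    private const uint SDC_USE_SUPPLIED_DISPLAY_CONFIG = 0x20;
    private const uint SDC_APPLY = 0x80;
    private const uint SDC_SAVE_TO_DATABASE = 0x200;
    private const uint SDC_ALLOW_CHANGES = 0x400;

    [DllImport("user32.dll")]
    private static extern int GetDisplayConfigBufferSizes(uint flags, out uint numPaths, out uint numModes);

    [DllImport("user32.dll")]
    private static extern int QueryDisplayConfig(uint flags, ref uint numPaths, [Out] DISPLAYCONFIG_PATH_INFO[] paths, ref uint numModes, [Out] DISPLAYCONFIG_MODE_INFO[] modes, IntPtr currentTopologyId);

    [DllImport("user32.dll")]
    private static extern int SetDisplayConfig(uint numPaths, DISPLAYCONFIG_PATH_INFO[] paths, uint numModes, DISPLAYCONFIG_MODE_INFO[] modes, uint flags);

    // pathIndex picks one of the queried paths; real code would first locate
    // the path belonging to the desired monitor.
    public static void SetActive(int pathIndex, bool active)
    {
        GetDisplayConfigBufferSizes(QDC_ALL_PATHS, out uint numPaths, out uint numModes);
        DISPLAYCONFIG_PATH_INFO[] paths = new DISPLAYCONFIG_PATH_INFO[numPaths];
        DISPLAYCONFIG_MODE_INFO[] modes = new DISPLAYCONFIG_MODE_INFO[numModes];
        QueryDisplayConfig(QDC_ALL_PATHS, ref numPaths, paths, ref numModes, modes, IntPtr.Zero);

        for (int i = 0; i < numPaths; ++i)
        {
            // The trick: invalidate all mode indices, so no mode info is referenced.
            paths[i].sourceInfo.modeInfoIdx = DISPLAYCONFIG_PATH_MODE_IDX_INVALID;
            paths[i].targetInfo.modeInfoIdx = DISPLAYCONFIG_PATH_MODE_IDX_INVALID;
        }

        if (active) paths[pathIndex].flags |= DISPLAYCONFIG_PATH_ACTIVE;
        else paths[pathIndex].flags &= ~DISPLAYCONFIG_PATH_ACTIVE;

        // Apply with zero mode infos; Windows computes suitable modes itself.
        SetDisplayConfig(numPaths, paths, 0, null, SDC_APPLY | SDC_USE_SUPPLIED_DISPLAY_CONFIG | SDC_SAVE_TO_DATABASE | SDC_ALLOW_CHANGES);
    }
}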

Some refactoring and some cleanup later, I have ToggleDisplay, ready to be shared with the world.

By the way, I now also included the source code of some of my older tools in this Tiny Tools Collection repository.

I used that opportunity to also update these projects to recent DotNet runtimes. I did not set up any automated build pipelines or releases. Maybe some other time.

This is my third and final article in my series about application icons and logos. This time, I am going to write about icon sizes, and why you should care about them. Granted, it’s a little bit about perfectionism, but it is an easily achievable optimization. Look at those two images:

These show the same icon. The very same icon. Really. And both show the image at 32×32 pixels.

Icon Sizes

Let’s reiterate what icons are for: an icon is an iconic, graphical representation of a logo, especially optimized for small sizes, like favicons, small logos, or software application icons. They are meant to work at very small sizes, traditionally down to 16×16 pixels. With higher-resolution displays, this super-small size might no longer be that relevant. That is why I chose 32×32 for my example above. So, we want icons to work at those small sizes. Yes, we explicitly create icons for those small sizes. And this is my argument: we should go the extra mile and also optimize the graphics for exactly those sizes we aim for: 16×16, 24×24, 32×32, 48×48, and 64×64, traditionally.

So, what is the difference between the two images above? Let me zoom in without additional interpolation to make the difference more clearly visible:

The left image is the reference image I got from the clipart. It does show what I want to show, and might come from an external design source. But the lines do not align with the pixel grid. As a result, anti-aliasing interpolation results in the blurry visual. The right image is the same graphic, but all the line vertices have been slightly moved to align exactly with the pixel grid. The result is a much crisper appearance. And the effort is minimal: just a duplication of the icon and touching and snapping a couple of vertices in your favorite graphics editor. Totally worth it.

Ok, so, can’t we just optimize for 16×16 and we are good?

No. For one, 16×16 is very, very small, and as written above, loses its importance in the age of high-resolution displays. Similar to the abstraction from a logo to an icon, as described in my first article of this series, many icons simplify and remove details when they go down from the 32×32 pixel sized versions to the 16×16 pixel sized versions:

And the second reason is the in-between sizes, infamously the 24×24 pixels. That is a scaling factor of 1.5x from the 16×16 version. Any line might end up between pixels again and blurry if you just scale up.

So, it makes sense to create multiple sizes of the icon, each with optimized placement of the graphics’ vertices. At some point, depending on the complexity of the icon’s graphics, further upscaling is irrelevant and can be done automatically. The 64×64 pixel size is a traditional point for this.

Personally, I usually try to design icons at 32×32 pixels. The 64×64 pixel and 256×256 pixel versions are then automatic upscales, but are always explicitly included in the icon files. The three traditional sizes still missing, 16×16, 24×24, and 48×48 pixels, are then manually optimized for a crisper appearance. Of course, this approach is just a starting point, and sometimes the reference is at a different size.

The Straight Lines

So, is all this only about straight horizontal and vertical lines, since only those can perfectly align with the pixel grid? No. Any shape loses detail and gets increasingly blurry at smaller sizes. I wrote above that a reduction of graphical detail might be needed when going down in size. That is true for all shapes. And it might not only be a _reduction_. Sometimes an alteration or even complete replacement of a shape might make sense, as in the example above. Especially when going down to 16×16 pixels in size, the concepts of pixel art, with their reduction of most detail and their deliberate emphasis on other detail, are worth a thought:

The right image shows the clipart original. The center image shows the vector graphics of the 16×16 pixel image on the left. Look at the strap of the helmet. It is no longer curved at all. Instead, it relies on a couple of full pixels for the overall shape, and a couple of partially filled pixels for explicit and controlled anti-aliasing.

Summary

Icons are meant as very small sized representations of a logo for your application, web page, or similar. As this is their purpose, I argue we should care about optimizing for those sizes as well!

  • Shapes, especially, but not limited to, horizontal and vertical lines, should be placed precisely at pixel grid boundaries to avoid blurriness due to interpolation.
  • The 16×16, 24×24, 32×32, and 48×48 pixel sized versions of an icon benefit most from manual optimization, maybe even graphical detail reduction or shape alteration.
  • Whatever we do, let’s always keep quality in mind.

So, an SVG, which is only one image at one size, could be used as an icon image data source. But if used for all sizes, it will always fall short in visual quality at some sizes, compared to explicit pixel-based graphics optimized for that specific size.

Series

Some time ago, I started a section on my website about tools I use and like. I started that series writing about the Everything search tool by Voidtools, which is lightning fast and awesome.

Since then, I have integrated Everything into several internal tools of mine. Most of the time, I used the Everything command line client and parsed its output. However, I had some trouble with Unicode file names. Then I looked at Dotnet library solutions, namely Everything .Net Client and EverythingNet. Both are basically only P/Invoke wrappers around the Everything SDK, which itself is a wrapper around interprocess calls (IPC) to the Everything service. And so, since I know my stuff around low-level techniques like Windows-message-based IPC, and since I don’t like wrappers of wrappers of functions, I decided to write a library of my own: Everything Search Client

It is a .Net 6.0 library, completely written in CSharp, with some P/Invoke calls to native Windows functions of the operating system, directly talking to the Everything service.
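
For the curious, this is the general shape of Windows-message-based IPC: the client finds the service’s message window and sends it a WM_COPYDATA message carrying the query. A minimal, generic sketch follows; the window class name and payload layout here are placeholders, not Everything’s actual protocol, which is defined by its IPC SDK headers:

using System;
using System.Runtime.InteropServices;

internal static class CopyDataIpc
{
    private const uint WM_COPYDATA = 0x004A;

    [StructLayout(LayoutKind.Sequential)]
    private struct COPYDATASTRUCT
    {
        public IntPtr dwData; // app-defined message id
        public int cbData;    // payload size in bytes
        public IntPtr lpData; // pointer to payload
    }

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    private static extern IntPtr FindWindowW(string? lpClassName, string? lpWindowName);

    [DllImport("user32.dll")]
    private static extern IntPtr SendMessageW(IntPtr hWnd, uint msg, IntPtr wParam, ref COPYDATASTRUCT lParam);

    // Sends a UTF-16 query string to the target window.
    // "MyServiceWindowClass" is a placeholder window class name.
    public static bool SendQuery(string query)
    {
        IntPtr target = FindWindowW("MyServiceWindowClass", null);
        if (target == IntPtr.Zero) return false; // service not running

        byte[] payload = System.Text.Encoding.Unicode.GetBytes(query + '\0');
        IntPtr buffer = Marshal.AllocHGlobal(payload.Length);
        try
        {
            Marshal.Copy(payload, 0, buffer, payload.Length);
            COPYDATASTRUCT cds = new()
            {
                dwData = (IntPtr)1, // placeholder message id
                cbData = payload.Length,
                lpData = buffer,
            };
            // The receiver returns nonzero if it processed the message.
            return SendMessageW(target, WM_COPYDATA, IntPtr.Zero, ref cds) != IntPtr.Zero;
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
        }
    }
}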

The code is available on Github, and the ready-to-use NuGet package is on NuGet.org.

If you find it useful and use it in a tool of your own, I would love to hear about it: Used By, How to Contribute

Git has this cursed feature to fuck up your files by doing unspeakable things to your line endings.

For example, from Github’s documentation on line endings:

On Windows, you simply pass true to the configuration. For example:

$ git config --global core.autocrlf true

Please, never never never never never never never never never never never never do this!

THERE IS NO REASON TO DO IT!

Git is here to keep track of our files, NOT TO CHANGE OUR FILES IN ANY WAY.

So, please, just, never never never never never never never never never never never never do this! Leave my line endings alone!

Just recently, I read this article on Golem about Mouse Without Borders (in German).

Mouse Without Borders (http://www.aka.ms/mm)

My current project at work revolves around network communication. For several reasons, I cannot work with a single computer and simulated networks; I need two physical machines to do my work. And I hate switching keyboards and mice all the time. But I thought, “how many people have such a problem? Surely not many.” So, I accepted it. And now, Mouse Without Borders comes totally unexpectedly to my aid. Awesome! And it works!