
Zabbix: Required parameter $sql_parts follows optional parameter $table_alias

We are running our monitoring solution Zabbix on an Arch Linux system. Arch Linux tends to be "up to date" since its philosophy is "patch the current code base, not legacy".

I've updated the monitoring system and, with that, the PHP version jumped from 7.4.x to 8.0.x.

After that, a lot of "widgets" (boxes) in the Zabbix frontend displayed the following error.

Required parameter $sql_parts follows optional parameter $table_alias [zabbix.php:22 → require_once() → ZBase->run() → CSettingsHelper::getGlobal() → CSettingsHelper::loadParams() → API::getApiService() → CRegistryFactory->getObject() → CApiService->__construct() → CApiService->pk() → CApiService->getTableSchema() → CAutoloader->loadClass() → require() in include/classes/core/CAutoloader.php:77]

I've researched it and found that the error comes from the DB.php class. I've opened the issue ticket ZBX-18984 and created pull request 39 against the GitHub source code. Hopefully, this fix will quickly make it into the code. I've also attached a git patch file to the issue, so you can download and apply it on your system.
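To illustrate what PHP 8.0 complains about, here is a simplified sketch using the parameter names from the error message; the function names are made up and this is not the actual DB.php code:

// deprecated in PHP 8.0: a required parameter follows an optional one
function buildSelect($table_alias = null, array $sql_parts) {
    // ...
}

// one possible fix: drop the default value, or move the optional parameter to the end
function buildSelectFixed(array $sql_parts, $table_alias = null) {
    // ...
}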

Create a list of operating system and build version from Active Directory clients with PowerShell

The task was the result of a short question: "Do you know if each Windows client got its upgrade?".

It turned out that it was easy to ask the Active Directory this question with PowerShell.

Get-ADComputer -Filter { (Enabled -eq $true) -and (OperatingSystem -notlike "*server*") } -Properties Name,OperatingSystem,OperatingSystemVersion,SID | Format-Table

But, as usual, this results in too much information. Quickly, some features were requested. We want to:

  • put this question into a script so it can be asked in a well-defined way over and over again
  • reduce the amount of information
  • be able to filter against the operating system
  • filter against the build version if more than one build version exists
  • write the result to a CSV file

I was able to set all feature requests to "done" and, more or less, you can find the result here.
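A minimal sketch of how such a script could look, assuming the ActiveDirectory module is available; the parameter names, default values and the CSV delimiter are my own choices, not necessarily the ones from the linked script:

param (
    [string] $operatingSystemFilter = '*',
    [string] $buildVersionFilter = '*',
    [string] $pathToCsvFile = '.\list_of_clients.csv'
)

# fetch all enabled, non-server computer objects from the active directory,
# reduce the information, apply the optional filters and write the result to a csv file
Get-ADComputer -Filter { (Enabled -eq $true) -and (OperatingSystem -notlike "*server*") } `
    -Properties Name,OperatingSystem,OperatingSystemVersion |
    Where-Object {
        ($_.OperatingSystem -like $operatingSystemFilter) -and
        ($_.OperatingSystemVersion -like $buildVersionFilter)
    } |
    Select-Object Name,OperatingSystem,OperatingSystemVersion |
    Export-Csv -Path $pathToCsvFile -NoTypeInformation -Delimiter ';'

Called without parameters it lists every enabled, non-server client; called with -operatingSystemFilter '*Windows 10*', for example, it only lists the Windows 10 machines.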

Using thecodingmachine Docker images with Podman

I wanted to quickly develop something. I thought this was the perfect time to mentally migrate from the insecure and almost dead Docker to the alive and secure Podman.

I went to thecodingmachine's page and tried it with the following one-liner:

podman run -p 80:80 --rm --name php-bazzline -v "$PWD":/var/www/html thecodingmachine/php:7.4-v3-apache

Sadly, nothing was working, so I've created a feature request asking for Podman support.

My current workaround is the following:

sudo -c "echo '<username>:100000:65536' >> /etc/subuid"
sudo -c "echo '<username>:100000:65536' >> /etc/subgid"
sudo reboot

And adapt the startup command as follows to prevent port issues (by default, an unprivileged user is not allowed to bind ports below 1024).

podman run -p 8080:80 --rm --name php-bazzline -v "$PWD":/var/www/html thecodingmachine/php:7.4-v3-apache

That's it.

zfs-snap-manager and "Got invalid schema for dataset" or not deleted old snapshots or not created new snapshots

I have been using zfs-snap-manager for such a long time that I have not taken a closer look at it in years.

For a while now, I have just been creating ZFS pools and zfs-snap-manager configurations without thinking about or checking them. But at some point I wanted to check the behaviour and found out that some configured snapshots were not made.

After a while I saw that my assumption was false; the error was sitting in front of the display :-).

My configuration for the schema setting was simplified to 7d because all I wanted for that dataset was to be able to go back the last seven days. After finding the log file (/var/log/zfs-snap-manager.log) and tailing it, I quickly saw the log message Got invalid schema for dataset .... A bit more investigation and I found out that I have to provide a fully defined schema like 7d0w0m0y. After adapting all available configuration files on all hosts, zfs-snap-manager is working as expected.
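For illustration, the relevant part of such a dataset section now looks roughly like this (the dataset name is made up and the remaining keys of the section are omitted):

[tank/documents]
# schema = 7d is rejected with "Got invalid schema for dataset ..."
# keep 7 daily, 0 weekly, 0 monthly and 0 yearly snapshots
schema = 7d0w0m0y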

I found an existing feature request to "relax" the schema interpretation and stumbled over another feature request asking for unit tests. So I've forked it and will try to learn Python by working on the following myself:

  • migrate the code to Python 3
  • write unit tests
  • implement relaxed handling of the schema with default values

Do you want to join? Then contact me.