
uberspace and let's encrypt

Nothing special, I just took the knowledge from the wiki and the blog and packed it into a shell script.

The result is the following cron job, which runs once every 60 days.


#!/bin/bash -l
####
# @see:
#   https://blog.uberspace.de/lets-encrypt-rollt-an/
#   https://wiki.uberspace.de/webserver:https?s[]=lets&s[]=encrypt
# @author: stev leibelt <artodeto@bazzline.net>
# @since: 2015-12-28
####

#begin of local parameters
LOCAL_ROOT_PATH='/home/<user name>'
LOCAL_LOG_PATH=$LOCAL_ROOT_PATH'/<path to your log files>'
LOCAL_ACCOUNT='<your.domain.tld>'
#end of local parameters

#begin of parameters for letsencrypt-renewer
LOCAL_CONFIGURATION_PATH=$LOCAL_ROOT_PATH'/.config/letsencrypt'
LOCAL_LOGGING_PATH=$LOCAL_ROOT_PATH'/.config/letsencrypt/logs'
LOCAL_WORKING_PATH=$LOCAL_ROOT_PATH'/tmp/'
#end of parameters for letsencrypt-renewer

#begin of parameters for uberspace-prepare-certificate
LOCAL_KEY_PATH=$LOCAL_ROOT_PATH'/.config/letsencrypt/live/'$LOCAL_ACCOUNT'/privkey.pem'
LOCAL_CERTIFICATE_PATH=$LOCAL_ROOT_PATH'/.config/letsencrypt/live/'$LOCAL_ACCOUNT'/cert.pem'
#end of parameters for uberspace-prepare-certificate

letsencrypt-renewer --config-dir $LOCAL_CONFIGURATION_PATH --logs-dir $LOCAL_LOGGING_PATH --work-dir $LOCAL_WORKING_PATH &>$LOCAL_LOG_PATH
uberspace-prepare-certificate -k $LOCAL_KEY_PATH -c $LOCAL_CERTIFICATE_PATH &>>$LOCAL_LOG_PATH
A nicer version of the script is also available here.
Many thanks to uberspace and let's encrypt.

For the script to work, you of course have to set up let's encrypt first:


uberspace-letsencrypt 
letsencrypt certonly
I am rather lazy. For this reason, I have the certificates regenerated once a month. To avoid putting too much load on the infrastructure, I picked a day other than the first of the month. The same goes for the time of day.
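Such a schedule could look like the following crontab entry; the day, time, and script path are arbitrary assumptions for illustration:

```shell
# run on the 17th of each month at 04:23 (any day but the first, any odd time)
23 4 17 * * /path/to/your/renew-certificate-script
```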

determine if an apache process is still running via bash to prevent multiple instances running

Given is the fact that you have some processes (like cron jobs) that are executed via a webserver like apache. Furthermore, you have installed and enabled the apache server status. To gain some reusability, we should divide and conquer the problem into shell scripts or shell functions. Side note: whenever I write about the shell, I mean the bash environment. What are the problems we want to tackle?

  • find the correct environment
  • check on all available webservers whether a process is already running
  • specify which process should not run twice, and start it if possible

We can put the first two problems into shell functions like the following ones. I am referencing some self-written shell functions; the reference is indicated by the "net_bazzline_" prefix.

#!/bin/bash
#find the correct environment

if net_bazzline_string_contains "$HOSTNAME" 'production'; then
    NET_BAZZLINE_IS_PRODUCTION_ENVIRONMENT=1
else
    NET_BAZZLINE_IS_PRODUCTION_ENVIRONMENT=0
fi
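The `net_bazzline_string_contains` helper is not shown in the post; a minimal sketch of such a function could look like this (the real implementation may differ):

```shell
#!/bin/bash
####
# checks if a string contains a substring
# (a minimal sketch; the real net_bazzline_string_contains may differ)
#
# @param string <haystack>
# @param string <needle>
# @return int (0 if the needle was found)
####
function net_bazzline_string_contains ()
{
    case "$1" in
        *"$2"*) return 0 ;;
        *) return 1 ;;
    esac
}
```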

And the mighty check.

#!/bin/bash
#check all available webservers if a process is not running
####
# @param string <process name>
# @return int (0 if at least one process was found)
####
function local_is_there_at_least_one_apache_process_running()
{
    if [[ $# -lt 1 ]]; then
       echo 'invalid number of arguments'
       echo '    local_is_there_at_least_one_apache_process_running <process name>'

       return 1
    fi

    if [[ $NET_BAZZLINE_IS_PRODUCTION_ENVIRONMENT -eq 1 ]]; then
        LOCAL_ENVIRONMENT='production'
    else
        LOCAL_ENVIRONMENT='staging'
    fi

    #variables are prefixed with LOCAL_ to prevent overwriting system variables
    LOCAL_PROCESS_NAME="$1"

    #declare the array with all available host names
    declare -a LOCAL_HOSTNAMES=("webserver01" "webserver02" "webserver03");

    for LOCAL_HOSTNAME in "${LOCAL_HOSTNAMES[@]}"; do
        APACHE_STATUS_URL="http://$LOCAL_HOSTNAME.my.domain/server-status"

        OUTPUT=$(curl -s "$APACHE_STATUS_URL" | grep -i "$LOCAL_PROCESS_NAME")
        EXIT_CODE_OF_LAST_PROCESS="$?"

        if [[ $EXIT_CODE_OF_LAST_PROCESS == "0" ]]; then
            echo "$LOCAL_PROCESS_NAME found on $LOCAL_HOSTNAME"
            return 0
        fi
    done

    return 1
}

And here is an example of how to use it.

#!/bin/bash
#specify which process should not run and start it if possible

source /path/to/your/bash/functions

LOCAL_PROCESS_NAME="my_process"

local_is_there_at_least_one_apache_process_running $LOCAL_PROCESS_NAME

EXIT_CODE_OF_LAST_PROCESS="$?"

if [[ $EXIT_CODE_OF_LAST_PROCESS == "0" ]]; then
    echo "$LOCAL_PROCESS_NAME still running"
    exit 0;
else
    #execute your process
    echo 'started at: '$(date +'%Y-%m-%d %H:%M:%S');
    curl "my.domain/$LOCAL_PROCESS_NAME"
    echo 'finished at: '$(date +'%Y-%m-%d %H:%M:%S');
fi

You can run this in a loop by calling it via the cron environment, or use watch if you only need it from time to time:

watch -n 60 'bash /path/to/your/shell/script'
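If the process only ever runs on a single host, a simpler guard against overlapping runs is a lock file. A sketch using `flock`, not the server-status approach above; the lock-file path is a hypothetical choice:

```shell
#!/bin/bash
# alternative single-host guard: flock keeps a second instance from starting
# (the lock-file path /tmp/my_process.lock is a hypothetical choice)
(
    # exit immediately if another instance already holds the lock
    flock -n 9 || { echo 'my_process still running'; exit 1; }
    # execute your process here
    echo 'started at: '$(date +'%Y-%m-%d %H:%M:%S')
) 9>/tmp/my_process.lock
```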

Enjoy your day :-).

bash - compare big xml files - get differences

The following script provides a solution to compare two big xml files. I tried to compare a lot of xml files greater than 500 megabytes with different tools. Each tool ate up my memory and swap and finally crashed. All I want is: "show me what is in file one and not in file two, and vice versa". I reached this goal by using a property my xml files have: each file has nodes, and each node contains a unique identifier. I cut out the unique identifier tag and put it, line by line, into a file. After that, I sort the unique identifiers. Finally, I use diff. To create a more useful output, I separate the "what is only in file one" lines into their own file (and the same for file two).
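The extraction step can be demonstrated on a tiny sample; the tag name "id" is an assumption for illustration:

```shell
#!/bin/bash
# tiny demonstration of the extraction step
# (the tag name "id" and the sample content are assumptions)
printf '<node><id>42</id></node>\n<node><id>7</id></node>\n' > sample.xml

# prints the unique identifiers, one per line: 42 and 7
sed -n -e 's/.*<id>\(.*\)<\/id>.*/\1/p' sample.xml

rm sample.xml
```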

Happy using, and if you find errors, I'm ready to fix them :-).

#!/bin/bash
####
# script to compare two xml files by (unique) tag
####
# @author stev leibelt
# @since 2013-03-13
####

if [[ $# -eq 3 ]]; then
  XML_FILE_ONE="$1"
  XML_FILE_TWO="$2"
  XML_TAG="$3"

  if [[ -f "$XML_FILE_ONE"
        && -f "$XML_FILE_TWO"
        && ! -z "$XML_TAG" ]]; then
    #retrieve the xml tags per file
    #reduce the xml to lines containing the tag
    sed -n -e "s/.*<$XML_TAG>\(.*\)<\/$XML_TAG>.*/\1/p" "$XML_FILE_ONE" > "$XML_FILE_ONE.sed"
    sed -n -e "s/.*<$XML_TAG>\(.*\)<\/$XML_TAG>.*/\1/p" "$XML_FILE_TWO" > "$XML_FILE_TWO.sed"

    #sort and uniq the sed'ed files
    sort "$XML_FILE_ONE.sed" | uniq > "$XML_FILE_ONE.sort"
    sort "$XML_FILE_TWO.sed" | uniq > "$XML_FILE_TWO.sort"

    #output the differences
    diff "$XML_FILE_ONE.sort" "$XML_FILE_TWO.sort" > 'xml_diff_by_tag.diff'
    #diff --side-by-side "$XML_FILE_ONE.sort" "$XML_FILE_TWO.sort" > 'xml_diff_by_tag.diff'
    #comm -3 "$XML_FILE_ONE.sort" "$XML_FILE_TWO.sort" > 'xml_diff_by_tag.comm'

    #show only the differences per file
    sed -n -e 's/^< \(.*\)/\1/p' 'xml_diff_by_tag.diff' > "$XML_FILE_ONE.diff.uniq"
    sed -n -e 's/^> \(.*\)/\1/p' 'xml_diff_by_tag.diff' > "$XML_FILE_TWO.diff.uniq"

    #sed -n -e 's/^<\(.*\)/<\1/p' 'xml_diff_by_tag.comm' > "$XML_FILE_ONE.comm.uniq"
    #sed -n -e 's/\t<\(.*\)/<\1/p' 'xml_diff_by_tag.comm' > "$XML_FILE_TWO.comm.uniq"

    #remove the intermediate files
    rm -f "$XML_FILE_ONE.sed" "$XML_FILE_TWO.sed" "$XML_FILE_ONE.sort" "$XML_FILE_TWO.sort"
  else
    echo 'Invalid arguments provided'
    echo 'try '$0' $xmlFileOne $xmlFileTwo $comparingTag'
  fi
else
  echo 'Invalid number of arguments provided'
  echo 'try '$0' $xmlFileOne $xmlFileTwo $comparingTag'
fi

Available on github.com.

bash - enhanced burn

Since we are dealing more and more with usb devices, good old optical burning is not used that often anymore. I created a simple function with an alias of "burn" to write an iso file as quickly as possible to my optical storage device.

####
# burns given iso file
#
# @author stev leibelt
# @since 2013-02-12
####
function net_bazzline_burn ()
{
  if [[ $# -lt 1 ]]; then
    echo 'No valid argument supplied.'
    echo 'Try net_bazzline_burn $isoFile [$opticalDevice]'

    return 1
  fi

  if [[ $# -eq 1 ]]; then
    sudo wodim -v dev=/dev/cdrom "$1"
  else
    sudo wodim -v dev=/dev/"$2" "$1"
  fi
}
This is available in my shell functions on github.com.
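Usage could look like this; the iso file name and the device name are hypothetical:

```shell
# write an iso to the default drive /dev/cdrom
net_bazzline_burn my_image.iso
# write to a specific device, here /dev/sr0
net_bazzline_burn my_image.iso sr0
```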