DokuWiki fnencode converter from “url” to “utf-8”

I have used this script on my Linux system without any issues. Do not use it on Windows.

The script is very simple; you can adapt it to recode from other encodings by changing a single line of code.

converter.pl:

#!/usr/bin/perl -w
use strict;
use URI::Escape;

# Read the list of paths (one per line) from stdin.
my @files;
while(my $file = <>) {
	chomp $file;
	push @files, $file;
}

# Walk the list in reverse so that entries inside a directory are
# renamed before the directory itself.
foreach my $file (reverse @files) {
	if( $file =~ /%/ ) {
		print "$file\n";
		convert($file);
	}
}

sub convert {
	my $file = shift; # it can also be a directory
	# Split the path into the leading directory part and the last component.
	die "unexpected path: $file" unless $file =~ /^(.*?)([^\/]+)$/;
	my $new_filename = $1 . uri_unescape($2);
	print "$new_filename\n";
	rename($file, $new_filename) or die "cannot rename $file: $!";
}

Usage: save it as “converter.pl” to the DokuWiki data/ directory, make it executable, then:

$ cd dokuwiki/data; find . | ./converter.pl

Kalkun and Nginx, config that actually works

Hello.
First of all, you should have PHP and Nginx up and running.

After many tests, this Nginx config works fine with Kalkun 0.7.1:

server {
    listen 80;
    server_name kalkun.domain.com;

    access_log /var/log/nginx/kalkun.access.log;
    error_log /var/log/nginx/kalkun.error.log;

    root /var/www/localhost/htdocs/kalkun;

    client_max_body_size 10G;
    fastcgi_buffers 8 16k;
    fastcgi_buffer_size 53k;

    location / {
        index index.php;
    }

    # Static files: no access log, long cache lifetime.
    location ~* \.(jpg|jpeg|gif|css|png|js|ico|xml)$ {
        access_log off;
        expires 360d;
    }

    # Deny access to .htaccess and friends.
    location ~ /\.ht {
        deny all;
    }

    # Pass PHP scripts to the FastCGI server listening on 127.0.0.1:9000.
    # $1 is the script path, $2 is the trailing PATH_INFO.
    location ~ ^(.+?\.php)(/.*)?$ {
        try_files $1 =404;

        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$1;
        fastcgi_param PATH_INFO $2;
        fastcgi_param HTTPS on;
        fastcgi_pass 127.0.0.1:9000;
    }
}

Edit Kalkun’s file “application/config/config.php”:

$config['uri_protocol']	= "PATH_INFO";

Done!

Creating and importing self-signed certificate to Android device

These instructions will help you create a new self-signed certificate and import it into your Android device.

Due to a bug in Android’s internal code, you need a few extra steps while generating the certificate; otherwise your self-signed certificate will not show up under “Trusted credentials” in the Android menu.

Create an auxiliary file “android_options.txt” with this line inside:

basicConstraints=CA:true

Create self-signed certificate using these commands:

  • openssl genrsa -out priv_and_pub.key 2048
  • openssl req -new -days 3650 -key priv_and_pub.key -out CA.pem
  • openssl x509 -req -days 3650 -in CA.pem -signkey priv_and_pub.key -extfile ./android_options.txt -out CA.crt

Now our CA.crt certificate is almost ready.
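Before converting it, you may want to check that the basicConstraints extension really ended up inside the certificate; without it, Android will silently hide the certificate from the list:

```shell
# Prints a line containing CA:TRUE if the extfile was picked up.
$ openssl x509 -in CA.crt -noout -text | grep 'CA:TRUE'
```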

Convert certificate to DER format:

  • openssl x509 -inform PEM -outform DER -in CA.crt -out CA.der.crt

Import CA.der.crt:

  • Put CA.der.crt onto the SD card of your Android device (usually the internal one); it should be in the root directory.
  • Go to Settings / Security / Credential storage and select “Install from device storage”.
  • The .crt file will be detected and you will be prompted to enter a certificate name.
  • After importing the certificate, you will find it in Settings / Security / Credential storage / Trusted credentials / User.

Sending one input stream to many commands in Linux

Hello guys!

I was seeking a way to process Apache’s access_log file by several parsers simultaneously.

Usually I do it the straightforward way:

$ webalizer access_log &
$ awstats access_log &

When access_log is small, this is OK. But if access_log is large, say 15 GB, it is better to read it only once.

We can split one stream into many and feed several programs at once using the “tee” command and named pipes:

$ mkfifo /tmp/fifo.1 /tmp/fifo.2
$ cat access_log | tee /tmp/fifo.1 > /tmp/fifo.2 &
$ (webalizer /tmp/fifo.1; rm -f  /tmp/fifo.1) &
$ (awstats /tmp/fifo.2; rm -f  /tmp/fifo.2) &
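If your shell is bash, the same fan-out can be written without creating the FIFOs by hand, using process substitution (bash sets up the pipes for you; webalizer and awstats here stand for any commands that can read a log from stdin):

```shell
# bash-only: each >(cmd) is an anonymous named pipe created by bash,
# so tee feeds both parsers in a single pass over the log.
$ tee >(webalizer /dev/stdin) >(awstats /dev/stdin) < access_log > /dev/null
```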

That’s all!


Duplicity backup with hard-links

Hi there!

The current version of Duplicity doesn’t preserve hard links, so I’ve created a workaround to make it possible.

Why are hard links important? Any Linux distribution has many system hard links. If you want to do a full system backup of Linux, you must back up hard links carefully, or you will run into nasty problems after a restore. For example, my Gentoo Linux has more than 3000 hard links! Imagine what would happen if I ignored that.
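You can count the hard-linked files on your own system with GNU find, by asking for regular files whose link count is above one (an illustrative check, independent of Duplicity):

```shell
# Files with more than one link, prefixed by inode number;
# lines sharing an inode are one file under several names.
$ find / -xdev -type f -links +1 -printf '%i %p\n' 2>/dev/null | sort -n
# Just the total:
$ find / -xdev -type f -links +1 2>/dev/null | wc -l
```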

I have written a small script which can back up and restore the hard-link structure.

You can download it: duplicity-hardlinks.py (modify the Python version in the first line if it fails to run; it requires a Duplicity installation to work correctly).

Usage:

$ duplicity-hardlinks.py [options] dump source_dir dump_file
$ duplicity-hardlinks.py restore dump_file target_dir

So, generally you save the hard-link structure into dump_file, add dump_file to your backup, and use that file after the restore.

The script is not naive: it respects all of Duplicity’s filter options.

Example:

$ duplicity-hardlinks.py --include /home --include /etc --exclude '**' dump / /hardlinks_dump.txt

This will save a dump of all hard links from the /home and /etc directories into the file /hardlinks_dump.txt.
Next, include /hardlinks_dump.txt in the list of files backed up by Duplicity:

$ duplicity full --include /home --include /etc --include /hardlinks_dump.txt --exclude '**' / file:///mnt/backup/

Restore the ordinary Duplicity backup into some directory, for example /mnt/restore/. The file hardlinks_dump.txt should be there:

$ duplicity restore file:///mnt/backup/ /mnt/restore/

Finally, restore the hard-link structure:

$ duplicity-hardlinks.py restore /mnt/restore/hardlinks_dump.txt /mnt/restore

Done!

Duplicity vs Rsync / Boxbackup / Duplicati / Amanda for Linux backup

Hello guys!

First of all, I want to say that Duplicity is excellent backup software for Linux!

I tried Boxbackup / Duplicati / Amanda / Rsync and can compare them.

Let’s look at what Duplicity offers:

  1. Backups are compressed and encrypted, and you can encrypt them with just a pass-phrase. No need for hard-to-hide, easy-to-lose keys: just choose a reliable password, remember it, and you are good.
  2. Backups can be stored on a remote server with only simple access (FTP, etc.); there is no need to install server-side software. You can easily store backups in the cloud (Dropbox, Google Drive, Amazon S3) or on a friend’s server, and you can copy the backup files to any HDD, for example an external one. Even if your backup files are stolen, no harm is done.
  3. Fast incremental backups, not just differential backups; there is a huge difference. If you have large files, such as virtual machine images or large MySQL databases, incremental backups save a lot of disk space, network traffic and time, because only the internal changes of a file are stored. A differential backup saves the whole file even if a single byte changes.
  4. Duplicity does not require heavy dependencies like Mono (which Duplicati needs), only Python, which is installed by default on almost any Linux.

For my Linux servers, desktops and notebooks, I chose Duplicity over the alternatives because it has all these features.

For my tasks Duplicity has only one drawback: it lacks hard-link support. This makes full Linux server backups problematic. But I love this software so much that I decided to add hard-link support to Duplicity myself.


First post

Just testing my shiny new blogging software!

It was fun installing WordPress on Nginx with PHP-FPM.