January 24, 2006 - Filed in Linux HowTos by Felix
Every time you visit a website, your browser also sends the URL of the previous page to the server you are accessing. In its default configuration, Apache saves that information to the log files. Log file analysis tools such as Webalizer can then use it to build statistics on which pages most of your visitors came from.
Yesterday I noticed that someone was hitting our server en masse, submitting referer URLs like
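For reference, the referer ends up in the access log because the stock "combined" log format includes the `Referer` request header; this is Apache's standard directive pair, not this server's exact configuration:

```apache
# The stock "combined" format logs "%{Referer}i", the header in which
# the browser reports the URL of the previous page.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /var/log/apache2/access.log combined
```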
January 23, 2006 - Filed in Linux HowTos by Felix
I have a cron job installed on the office server that uses mysqldump to create a full database backup every day. The script adds a timestamp to the filename and compresses the output with gzip to save some space.
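A minimal sketch of such a cron script; the backup path, filename pattern, and dump options are assumptions, not the author's actual script:

```shell
#!/bin/sh
# Daily MySQL backup sketch: timestamped filename, gzip-compressed output.
STAMP=$(date +%Y-%m-%d)                       # e.g. 2006-01-23
OUTFILE="full-backup-${STAMP}.sql.gz"

# Dump all databases and compress on the fly; guarded so the sketch
# is a no-op on machines without mysqldump installed.
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --all-databases | gzip > "/var/backups/mysql/${OUTFILE}"
fi
```

Run daily from cron, e.g. `0 3 * * * /usr/local/bin/db-backup.sh`; old archives can be pruned with a `find ... -mtime +30 -delete` in the same script.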
February 28, 2004 - Filed in Linux HowTos by Felix
Switching my main office server from SuSE Linux to Debian Linux also meant trying to make the SuSE-centered, proprietary Capi4Linux work on Debian. Since there is no complete HOWTO yet, I decided to write one.
January 16, 2004 - Filed in Linux HowTos by Felix
Setting up a software RAID1 as the root partition of a Linux system has the advantage of not relying on special hardware controllers, and it keeps running costs down. However, the actual process of setting it up can be time-consuming, annoying, and can ultimately end in failure. So here is a quick breakdown of how I got my soft RAID1 array running on Debian.