Jan 14, 2015 Tag: Linux

Using Linux

Ongoing attempt to collect tips and tricks for getting things done on Linux. Will be updated as needed.

Updated on Nov 12, 2018

See also: About Linux


Commandline tools

Ag - The Silver Searcher

ag is like grep or ack but faster.

Search only in files whose names end in ‘js’:

# -G allows for regular expressions
ag searchterm -G js$

Exclude ‘*.js’ files:

# --ignore doesn't take a regex
ag searchterm --ignore="*.js"

Some more:

# case insensitive
ag searchterm -iG js$

# list filenames only
ag searchterm -ilG js$

# 99 levels deep instead of 25
ag searchterm --depth=99 -ilG js$


ls - handy aliases

# dirs first, sort by file size, human readable
alias lh='ls -Shl --group-directories-first'

# subdirs of the current dir
alias ldir='ls -d */'
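A quick check of what the options behind the lh alias do (a self-contained sketch; the temp files are only for illustration):

```shell
# 'ls -S --group-directories-first' lists directories first,
# then files by decreasing size (GNU coreutils ls assumed)
tmp=$(mktemp -d)
mkdir "$tmp/subdir"
printf 'xxxxxxxxxx' > "$tmp/big.txt"   # 10 bytes
printf 'x'          > "$tmp/small.txt" # 1 byte
ls -S --group-directories-first "$tmp" # subdir, big.txt, small.txt
rm -rf "$tmp"
```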


dlocate - view Debian package information

dlocate [OPTIONS] [command] [ package... | PATTERN... ]

dlocate is a fast alternative to dpkg for queries like dpkg -L and dpkg -S.


dlocate -lsbin    $package  # show executables
dlocate -lsconf   $package  # show conf files
dlocate -md5check $package  # show modified files
dlocate -du       $package  # show used disk space
dlocate -i -S "/r?BaSH$"    # find packages, case-insensitive, using regexes
# with -P                   # use Perl regex syntax
dlocate -k                  # show all installed kernels
dlocate -K                  # " plus associated packages


locate - find files via the filename database

Update the database, as root and in the background:

sudo -b updatedb

Search case-insensitively by regex:

locate -ir "down.*book.*ans"
# finds for example:
/home/user/Downloads/Books/AnsibleBooks/Ansible Playbook Essentials.pdf


locate --regex "/bin$" # paths ending in '/bin'
locate --regex "\.local.*\.desktop$" | grep phpstorm


Find (living) machines in the network (192.168.0.0/24 is an example subnet):

sudo nmap -sP 192.168.0.0/24     # ping scan (deprecated alias of -sn)
sudo nmap -sn 192.168.0.0/24     # ping scan, disable port scan
sudo nmap -sL -v 192.168.0.0/24  # list scan: only list targets, send no packets

pv - connect file to pipe and measure progress

sudo apt-get install pv
pv --help
# example:
pv dump.sql | mysql -u user -ppasswd database

sudo - run in background


$ sudo -b updatedb

sudo - without password

From stackoverflow:

sudo visudo
# Then edit that file to add to the very end:
username ALL = NOPASSWD: /fullpath/to/command, /fullpath/to/othercommand

# that is
john ALL = NOPASSWD: /sbin/poweroff, /sbin/start, /sbin/stop

# This will allow user 'john' to sudo poweroff, and
# start and stop without being prompted for password.

Note: visudo is not vi! It opens the sudoers file in the default editor, locks it, and checks the syntax before saving.
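A cleaner variant than editing the main sudoers file is a drop-in under /etc/sudoers.d, created with visudo -f so the syntax check still applies (the filename and username here are examples):

```
# create with: sudo visudo -f /etc/sudoers.d/john-power
john ALL = NOPASSWD: /sbin/poweroff
```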


Directory Listing with long filenames

Apache directives (e.g. in a .htaccess file) so that directory indexes don't truncate long filenames:

Options +Indexes
IndexOptions FancyIndexing NameWidth=* IgnoreCase

Solutions for Tasks

Calculate checksum of files in a directory structure

See also: http://unix.stackexchange.com/questions/35832/how-do-i-get-the-md5-sum-of-a-directorys-contents-as-one-sum

$  find . -type f | LC_ALL=C sort | cpio -o --quiet | md5sum | awk '{ print $1 }'
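Note that the cpio archive embeds metadata (mtimes, inode numbers), so two copies of the same tree can yield different sums. A content-only variant hashes each file and then the sorted hash list; 'treesum' is a made-up helper name:

```shell
# hash every file, sort the per-file hash list, hash that list;
# two trees with identical relative paths and contents compare equal
treesum() {
    (cd "$1" && find . -type f -exec md5sum {} + | LC_ALL=C sort | md5sum | awk '{ print $1 }')
}
```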

Cleanup Boot Partition

Check what we have:

dpkg -l | grep "linux" | awk '{print $2}' | sort

Clean up:

dpkg --list | grep linux-image | awk '{ print $2 }' | sort | sed -n '/'`uname -r`'/q;p' | xargs sudo apt-get -y purge
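The sed part is the trick: it prints every line (p) until the first one matching the running kernel's version, then quits (q), so only the older image names reach xargs. A demo on a made-up package list:

```shell
# sed -n '/PATTERN/q;p' prints lines before the first match, then stops
printf 'linux-image-3.5.0-20\nlinux-image-3.5.0-22\nlinux-image-3.5.0-25\n' \
    | sed -n '/3.5.0-25/q;p'
# -> linux-image-3.5.0-20
#    linux-image-3.5.0-22
```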

Or, maybe better, step by step more manually:

## search
 uname -a
 dpkg -l | grep "linux" | awk '{print $2}' | grep 3.5.0

## remove manually one by one:
 dpkg -P linux-headers-3.5.0-22

As we chatted:

Remove old kernels, from Skype chat 2013-02-04 (translated from German):
[16:56:25] a: clean up old kernels:
[16:56:30] a: uname -a
[16:56:37] a: dpkg -l | grep "linux" | awk '{print $2}' | grep 3.5.0
[16:56:47] b: ok, cool
[16:56:53] a: now remove everything with an older version number in its name, using the following command:
[16:57:02] a: dpkg -P linux-headers-3.5.0-22
[16:57:04] a: and so on.
[16:57:14] a: I always do this manually, to avoid accidentally deleting too much
[16:57:20] a: so, package by package
[16:57:25] a: but it frees up quite a lot of space

# check the result again:
dpkg -l | grep "linux" | awk '{print $2}' | sort

Copy file contents to clipboard

$  xclip -sel clip < ~/.ssh/id_rsa.pub

dbtool, gdbmtool

Ubuntu 18.04 LTS Bionic Beaver knows the gdbmtool package:

# sudo apt install gdbmtool

Ubuntu 14.04 LTS Trusty Tahr has no ready-made gdbmtool package. Install dbtool instead:

git clone https://github.com/TLINDEN/dbtool.git ~/dbtool.git
cd ~/dbtool.git
sudo make install


dbtool --help
man dbtool

Disk usage: List largest files and subdirs

du .  -c -B MB | gawk  '{ printf"%10s %s\n", $1, $2 }' | sort -g | tail -n 20

or install graphical tool Baobab (about the name: compare with baobab.org):

# graphical gnome tool
sudo apt-get install baobab
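The du pipeline above totals directories; to list the 20 largest individual files instead, a find-based sketch (GNU find's -printf assumed):

```shell
# size in bytes, tab, path; the largest file ends up on the last line
find . -type f -printf '%s\t%p\n' | sort -n | tail -n 20
```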

Find all cron jobs

# empty logfile
echo "" >~/cronjobs.log.txt

# collect from location 1
cat /etc/crontab  >> ~/cronjobs.log.txt

# collect from location 2
cat /etc/cron.*/* >> ~/cronjobs.log.txt

# collect from location 3
sudo bash -c 'cat /var/spool/cron/crontabs/*  >>~/cronjobs.log.txt'

# see result
less ~/cronjobs.log.txt

Find and copy files recursively

$  cd /top/level/to/copy
$  find . -name '*.txt' | cpio -pdm /path/to/destdir

cpio options:

-p, --pass-through
 Pass through
-d, --make-directories
 Make directories
-m, --preserve-modification-time
 Preserve modification time
-u, --unconditional
 Unconditional (overwrite).
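A self-contained run on a throwaway tree, showing that only the matching files arrive, with their subdirectories recreated (GNU cpio assumed installed):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src/a" "$tmp/dst"
echo keep > "$tmp/src/a/f.txt"
echo skip > "$tmp/src/a/g.log"
# copy only *.txt, recreating the directory structure under dst
(cd "$tmp/src" && find . -name '*.txt' | cpio -pdm "$tmp/dst" 2>/dev/null)
ls "$tmp/dst/a"        # only f.txt
rm -rf "$tmp"
```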

Find most frequent IPs in logfile

awk '{print $1}' access.log | sort | uniq -c | sort -n
tail --lines=1000 /var/log/apache2/access.log | awk '{print $1}' | sort | uniq -c | sort -n
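On a fabricated log snippet, the most frequent IP ends up on the last line:

```shell
# uniq -c prefixes each unique IP with its count; sort -n puts the
# biggest count last
printf '10.0.0.1 - -\n10.0.0.2 - -\n10.0.0.1 - -\n' \
    | awk '{print $1}' | sort | uniq -c | sort -n
```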

Find my server's IP address

ip addr show eth0 | grep inet | awk '{ print $2; }' | sed 's/\/.*$//'

# e.g. fe80::221:ccff:fed9:ed72
# note: 'inet' also matches inet6 lines; use 'grep -w inet' for IPv4 only
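The awk/sed tail of the pipeline just cuts the address out of an 'inet' line and strips the prefix length; on a sample line:

```shell
echo "    inet 192.168.1.5/24 brd 192.168.1.255 scope global eth0" \
    | grep inet | awk '{ print $2; }' | sed 's/\/.*$//'
# -> 192.168.1.5
```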

Mirror a website

See http://www.httrack.com/ or:

$   wget --mirror --no-parent --convert-links http://www.domain.com

Monitor Data Transfer

An easy way: use ‘pv’. Put it in the middle of a pipeline: a | pv | b streams data from a through pv to b while showing throughput and progress. Installation: sudo apt-get install pv

Example: Fetch precious data from a remote machine and copy it to a local folder tree. Use tar.gz as transport encoding and monitor with ‘pv’:

ssh user@machine-where-precious-data-is \
   "tar czpf - /some/important/data" | pv | tar xzpf - -C /new/root/directory

# tar -C path/to/folder   # -C: switch to folder

Mount Drive

Mount my LinuxData200:

$  sudo mount -t ext4 /dev/sdb1  /home/marble/htdocs/LinuxData200
$  sudo service apache2 restart

Move into TAR archive

Step 1:

$  tar --remove-files -czf new.tar.gz source/

Step 2:

   # repeat the following command until there's nothing found

$  find  source -type d -empty -exec rmdir -p {} +
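Both steps on a throwaway tree (GNU tar's --remove-files assumed):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/source/sub"
echo hi > "$tmp/source/sub/f"
( cd "$tmp"
  tar --remove-files -czf new.tar.gz source/         # step 1: pack and delete files
  find source -type d -empty -exec rmdir -p {} + 2>/dev/null )  # step 2: drop empty dirs
ls "$tmp"     # new.tar.gz only; 'source' is gone
rm -rf "$tmp"
```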

MySQL: Import Gzip File Directly

# zcat will leave database.sql.gz "as is"
zcat database.sql.gz | mysql -u user -ppasswd database
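A quick check that zcat decompresses to stdout and really leaves the archive untouched:

```shell
tmp=$(mktemp -d)
echo "SELECT 1;" > "$tmp/database.sql"
gzip "$tmp/database.sql"              # replaces the file with database.sql.gz
zcat "$tmp/database.sql.gz"           # -> SELECT 1;
ls "$tmp"                             # the .gz file is still there
rm -rf "$tmp"
```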

Port - used?

Examples:

# does localhost answer on port 80?
nc -z localhost 80

function netcat_check_ports {
   local port
   for port in 80 443 3306; do
      if nc -z localhost "${port}"; then
         echo "${port} in use"
      else
         echo "${port} unused"
      fi
   done
}
Transfer Data with TAR and “Wondertar”

TAR over SSH: Buzzword is “wondertar” or “wonder tar”.

# pull: pack on the remote machine, unpack locally
ssh user@machine-where-precious-data-is \
   "tar czpf - /some/important/data" | tar xzpf - -C /new/root/directory

# push: pack locally, unpack on the remote machine
tar cpf - /some/important/data | ssh user@destination-machine "tar xpf - -C /some/directory/"

# push as archive: keep the tarball on the remote machine
tar czf - -C /path/to/source files-and-folders | ssh user@target-host "cat - > /path/to/target/backup.tar.gz"
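Dropping the ssh hop gives a local pipe that copies a tree while preserving permissions; a self-contained run:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src/deep" "$tmp/dst"
echo precious > "$tmp/src/deep/file"
# pack src relative to $tmp, unpack under $tmp/dst
tar czpf - -C "$tmp" src | tar xzpf - -C "$tmp/dst"
cat "$tmp/dst/src/deep/file"   # -> precious
rm -rf "$tmp"
```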
