Jan 14, 2015 Tag: Linux
Ongoing attempt to collect tips and tricks about how to get things done on Linux. Will be updated as needed.
Updated on Nov 12, 2018
See also: About Linux
ag
is like grep or ack but faster.
Search only files with ‘js’ at the end:
# -G allows for regular expressions
ag searchterm -G js$
Exclude ‘*.js’ files:
# --ignore doesn't take a regex
ag searchterm --ignore="*.js"
Some more:
# case insensitive
ag searchterm -iG js$
# list filenames only
ag searchterm -ilG js$
# 99 levels deep instead of 25
ag searchterm --depth=99 -ilG js$
# dirs first, sort by filesize, human readable
alias lh='ls -Shl --group-directories-first'
# subdirs of current dir
alias ldir='ls -d */'
dlocate is a faster replacement for dpkg -L and dpkg -S. Examples:
packet=bash
dlocate -lsbin $packet # show executables
dlocate -lsconf $packet # show conf files
dlocate -md5check $packet # show modified files
dlocate -du $packet # show used disk space
dlocate -i -S "/r?BaSH$" # find packages, case-insensitive, using regexes
# with -P # use Perl regex syntax
dlocate -k # show all installed kernels
dlocate -K # same, plus associated packages
Update the database as root, as a background task:
sudo -b updatedb
Search case-insensitively by regular expression:
locate -ir "down.*book.*ans"
# finds for example:
/home/user/Downloads/Books/AnsibleBooks/Ansible Playbook Essentials.pdf
Examples:
locate --regex "/bin$" # paths ending like '/bin'
locate --regex "\.local.*\.desktop$" | grep phpstorm
Find (living) machines in network:
sudo nmap 192.168.1.0/24 -sP # ping scan (deprecated alias of -sn)
sudo nmap 192.168.1.0/24 -sn # ping scan only, disable port scan
sudo nmap 192.168.1.0/24 -sL -v
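The scan results can be post-processed: nmap's -oG flag writes a grepable format (use "-oG -" for stdout) that is easy to filter with awk. A sketch using a canned sample instead of a live scan, so it runs anywhere; the addresses and the exact sample lines are made up:

```shell
# Hypothetical -oG output of: sudo nmap -sn 192.168.1.0/24 -oG -
sample='Host: 192.168.1.1 () Status: Up
Host: 192.168.1.50 () Status: Down
Host: 192.168.1.203 () Status: Up'

# keep only hosts that answered; the IP is field 2
up_hosts=$(printf '%s\n' "$sample" | awk '/Status: Up/ { print $2 }')
echo "$up_hosts"
```

The same awk filter works on a real scan piped in via `-oG -`.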
sudo apt-get install pv
pv --help
# example:
pv dump.sql | mysql -u user -ppasswd database
From Stack Overflow:
sudo visudo
# Then edit that file to add to the very end:
username ALL = NOPASSWD: /fullpath/to/command, /fullpath/to/othercommand
# that is
john ALL = NOPASSWD: /sbin/poweroff, /sbin/start, /sbin/stop
# This will allow user 'john' to sudo poweroff, and
# start and stop without being prompted for password.
Note: visudo is not vi! It is the safe way to edit /etc/sudoers: it checks the syntax before saving.
Keywords: dialog, shell, gui, zenity, kdialog
YAD is a tool for creating graphical dialogs from shell scripts.
Apache .htaccess: enable fancy directory listings:
Options +Indexes
IndexOptions FancyIndexing NameWidth=* IgnoreCase
Checksum a whole directory tree with one hash:
$ find -type f | LC_ALL=C sort | cpio -o --quiet | md5sum | awk '{ print $1 }'
3aa2ed92e75665d7d099e78f5c4f0ea2
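A quick sanity check of the recipe: the hash is stable across runs as long as file contents and metadata do not change (the cpio archive also stores timestamps, so touching a file changes the hash). The /tmp path is only for this demo:

```shell
# build a tiny demo tree
demo=/tmp/cksum-demo
mkdir -p "$demo/sub"
printf 'hello\n' > "$demo/a.txt"
printf 'world\n' > "$demo/sub/b.txt"
cd "$demo"

# run the recipe twice; both runs must agree
sum1=$(find . -type f | LC_ALL=C sort | cpio -o --quiet | md5sum | awk '{ print $1 }')
sum2=$(find . -type f | LC_ALL=C sort | cpio -o --quiet | md5sum | awk '{ print $1 }')
echo "$sum1"
```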
Check what we have:
dpkg -l | grep "linux" | awk '{print $2}' | sort
Clean up:
dpkg --list | grep linux-image | awk '{ print $2 }' | sort | sed -n '/'`uname -r`'/q;p' | xargs sudo apt-get -y purge
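The sed part of that pipeline, `sed -n '/…/q;p'`, prints each line until it matches the running kernel's version, then quits, so only older entries reach xargs. A demonstration with made-up version numbers standing in for real package names:

```shell
current="5.4.0-100"   # stands in for $(uname -r)

# sed prints lines (p) until the current version matches, then quits (q)
older=$(printf '%s\n' \
  linux-image-5.4.0-90 \
  linux-image-5.4.0-95 \
  linux-image-5.4.0-100 \
  linux-image-5.4.0-105 \
  | sed -n "/$current/q;p")
echo "$older"
```

This only works because dpkg's output is sorted, so everything before the running kernel is older.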
Or, maybe better, step by step more manually:
## search
uname -a
dpkg -l | grep "linux" | awk '{print $2}' | grep 3.5.0
## remove manually one by one:
dpkg -P linux-headers-3.5.0-22
As we chatted:
Remove old kernels, from Skype chat 2013-02-04:
[16:56:25] a: clean up old kernels:
[16:56:30] a: uname -a
[16:56:37] a: dpkg -l | grep "linux" | awk '{print $2}' | grep 3.5.0
[16:56:47] b: ok, cool
[16:56:53] a: now remove everything with an older version number in its name, using the following command:
[16:57:02] a: dpkg -P linux-headers-3.5.0-22
[16:57:04] a: and so on.
[16:57:14] a: I always do this manually so I don't accidentally delete too much
[16:57:20] a: so, package by package
[16:57:25] a: it frees up quite a lot of space, though
Copy the public SSH key to the clipboard:
$ xclip -sel clip < ~/.ssh/id_rsa.pub
Ubuntu 18.04 LTS Bionic Beaver provides the gdbmtool package:
# sudo apt install gdbmtool
Ubuntu 14.04 LTS Trusty Tahr has no ready-made gdbmtool package. Install dbtool instead:
git clone https://github.com/TLINDEN/dbtool.git ~/dbtool.git
cd ~/dbtool.git
./autogen.sh
./configure
make
sudo make install
Run:
cd
dbtool --help
man dbtool
du . -c -B MB | gawk '{ printf"%10s %s\n", $1, $2 }' | sort -g | tail -n 20
or install graphical tool Baobab (about the name: compare with baobab.org):
# graphical gnome tool
sudo apt-get install baobab
# empty logfile
echo "" >~/cronjobs.log.txt
# collect from location 1
cat /etc/crontab >> ~/cronjobs.log.txt
# collect from location 2
cat /etc/cron.*/* >> ~/cronjobs.log.txt
# collect from location 3
sudo bash -c 'cat /var/spool/cron/crontabs/* >>~/cronjobs.log.txt'
# see result
less ~/cronjobs.log.txt
$ cd /top/level/to/copy
$ find . -name '*.txt' | cpio -pdm /path/to/destdir
cpio options:
-p, --pass-through: pass-through (copy) mode
-d, --make-directories: create leading directories where needed
-m, --preserve-modification-time: preserve file modification times
-u, --unconditional: overwrite unconditionally
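A self-contained run of the pass-through copy; the /tmp paths and file names are demo-only:

```shell
src=/tmp/cpio-src; dst=/tmp/cpio-dst
mkdir -p "$src/sub" "$dst"
echo 'hello' > "$src/sub/note.txt"
echo 'skip me' > "$src/sub/note.md"

cd "$src"
# copy only *.txt, recreating the directory structure under $dst
find . -name '*.txt' | cpio -pdm --quiet "$dst"
ls "$dst/sub"
```

Only note.txt arrives; the .md file is filtered out by find before cpio ever sees it.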
Count requests per client IP in an Apache access log:
awk '{print $1}' access.log | sort | uniq -c | sort -n
tail --lines=1000 /var/log/apache2/access.log | awk '{print $1}' | sort | uniq -c | sort -n
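What the pipeline does, shown on a tiny made-up log fragment (addresses and requests are invented):

```shell
# three fake requests from two clients
log='10.0.0.5 - - [14/Jan/2015] "GET / HTTP/1.1" 200
10.0.0.5 - - [14/Jan/2015] "GET /a HTTP/1.1" 200
192.0.2.9 - - [14/Jan/2015] "GET /b HTTP/1.1" 404'

# field 1 is the client IP; uniq -c counts, sort -n puts the busiest last
counts=$(printf '%s\n' "$log" | awk '{print $1}' | sort | uniq -c | sort -n)
echo "$counts"
```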
ip addr show eth0 | grep inet | awk '{ print $2; }' | sed 's/\/.*$//'
# 192.168.1.203
# fe80::221:ccff:fed9:ed72
See http://www.httrack.com/ or:
$ wget --mirror --no-parent --convert-links http://www.domain.com
An easy way: use ‘pv’. Put it between two pipes: a | pv | b pipes a through pv to b.
Installation: sudo apt-get install pv
Example: Fetch precious data from a remote machine and copy it to a local folder tree. Use tar.gz as transport encoding and monitor with ‘pv’:
ssh user@machine-where-precious-data-is \
"tar czpf - /some/important/data" | pv | tar xzpf - -C /new/root/directory
# tar -C path/to/folder # -C: switch to folder
Mount my LinuxData200:
$ sudo mount -t ext4 /dev/sdb1 /home/marble/htdocs/LinuxData200
$ sudo service apache2 restart
Step 1:
$ tar --remove-files -czf new.tar.gz source/
Step 2:
# repeat the following command until there's nothing found
$ find source -type d -empty -exec rmdir -p {} +
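Why repeat? find tests -empty while it walks the tree, so a directory whose only content is an empty subdirectory does not count as empty at visit time; `rmdir -p` softens this by also removing parents that become empty. A demo run (the /tmp paths are made up):

```shell
top=/tmp/empty-demo
mkdir -p "$top/a/b" "$top/c"
touch "$top/c/keep.txt"

# remove empty dirs; rmdir -p also deletes parents that become empty
# (the error from rmdir hitting a non-empty parent is suppressed)
find "$top" -type d -empty -exec rmdir -p {} + 2>/dev/null
ls "$top"
```

After one pass, a/b and a are gone, while c survives because it contains a file.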
# zcat will leave database.sql.gz "as is"
zcat database.sql.gz | mysql -u user -ppasswd database
Example (work in progress): check with nc -z whether a port is listening:
nc -z 127.0.0.1 80
function netcat_check_ports {
    local port
    for port in 80 443 3306
    do
        if nc -z 127.0.0.1 "${port}"; then
            echo "${port} in use"
        else
            echo "${port} unused"
        fi
    done
}
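A variant that needs no nc at all, using bash's /dev/tcp pseudo-device (a bashism, not POSIX); the function name port_open is my own:

```shell
# returns 0 if a TCP connect to $1:$2 succeeds (bash's /dev/tcp device)
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

port=22
if port_open 127.0.0.1 "$port"; then state="in use"; else state="unused"; fi
echo "$port $state"
```

Handy on minimal systems where netcat is not installed but bash is.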
TAR over SSH: Buzzword is “wondertar” or “wonder tar”.
Examples:
ssh user@machine-where-precious-data-is \
"tar czpf - /some/important/data" | tar xzpf - -C /new/root/directory
tar cpf - /some/important/data | ssh user@destination-machine "tar xpf - -C /some/directory/"
tar czf - -C /path/to/source files-and-folders | ssh user@target-host "cat - > /path/to/target/backup.tar.gz"
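The ssh in the middle is optional: the same tar-to-tar pipe also copies a tree locally while preserving permissions. A self-contained demo (the /tmp paths are demo-only):

```shell
src=/tmp/tar-src; dst=/tmp/tar-dst
mkdir -p "$src" "$dst"
echo 'precious' > "$src/data.txt"
chmod 600 "$src/data.txt"

# c = create, p = preserve permissions, f - = archive on stdout/stdin
tar cpf - -C "$src" . | tar xpf - -C "$dst"
cat "$dst/data.txt"
```

Insert ssh on either side of the pipe, as in the examples above, to cross machine boundaries.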