Category Archives: UNIX

Posts regarding UNIX and Linux systems

Script to place in DMZ

So I had to place a small server in my home DMZ, leaving it open to the whole world with the corresponding risks this has. Wondering how to keep it reachable from my home LAN while blocking everything else, I came up with the following iptables script.

#!/bin/bash

IPTABLES=/sbin/iptables
INT=eth0

# Apply rules: accept the home LAN, accept related/established traffic, reject the rest
startiptables() {
	if [ ${UID} -eq 0 ]; then
		${IPTABLES} -A INPUT -i ${INT} -s 192.168.1.0/24 -j ACCEPT
		${IPTABLES} -A INPUT -i ${INT} -m state --state RELATED,ESTABLISHED -j ACCEPT
		${IPTABLES} -A INPUT -i ${INT} -j REJECT
	else
		echo "Your UID is: ${UID}. Execute as superuser please"
	fi
}

# Flush all rules, then list the (now empty) chains
stopiptables() {
	if [ ${UID} -eq 0 ]; then
		${IPTABLES} -F
		${IPTABLES} -L
	else
		echo "Your UID is: ${UID}. Execute as superuser please"
	fi
}

# List the currently loaded rules
statusiptables() {
	if [ ${UID} -eq 0 ]; then
		${IPTABLES} -L
	else
		echo "Your UID is: ${UID}. Execute as superuser please"
	fi
}

case "$1" in
	start)	startiptables ;;
	stop)	stopiptables ;;
	status) statusiptables ;;
	*) echo "usage: $0 start|stop|status" >&2
		exit 1
		;;
esac

Pretty simple, as you can see. It accepts all connections coming from inside the home LAN, accepts related and established traffic, and rejects everything else arriving from the public side. Substitute the classic class C network (192.168.1.0/24) in the script with your corresponding home/work network.
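For reference, a minimal usage sketch, assuming the script is saved under the hypothetical name dmz-fw.sh and made executable:

sudo ./dmz-fw.sh start    # apply the LAN-only rules
sudo ./dmz-fw.sh status   # list the active rules
sudo ./dmz-fw.sh stop     # flush all rules and list the empty chains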

Downgrade Raspberry Pi kernel

So I upgraded my DietPi kernel the other day and noticed there was no driver for my Realtek 8188-based WiFi dongle for that kernel version. After playing around trying to compile the driver for the newest kernel, I decided to downgrade the kernel instead. I thought it would be difficult, but it’s quite easy: just choose the kernel version you want to downgrade to from here, copy the corresponding commit hash and run rpi-update.

rpi-update  48cfa89779408ecd69db4eb793b846fb7fe40c4b

The hash above corresponds to kernel 4.4.11-v7+. With that kernel I was able to download and install the driver for my USB WiFi dongle using the script below:

#!/bin/bash
set -e

TOPIC_URL="http://www.raspberrypi.org/phpBB3/viewtopic.php?p=462982"

# Download and install rpi driver for 8188eu-based wifi dongles
# from MrEngman's dropbox.
#
# Version information is fetched from TOPIC_URL and appears as:
#
#   3.6.11+ #371 up to #520 inclusive    - 8188eu-20130209.tar.gz
#   3.6.11+ #524, #528, #532             - 8188eu-20130815.tar.gz
#   ...
# then is matched against the local kernel release and version numbers
# to select the proper driver tarball.  The kernel build number can be
# overridden with the command line option -k, in case no exact match is found.

fetch_versions() {
	curl -s "$TOPIC_URL" \
	| sed 's:<code>\|</code>\|<br />:\n:g' \
	| sed 's:&nbsp;: :g ; s:gz.*:gz:' \
	| grep -E '^[0-9.]+.*tar\.gz'
}


case "$1" in
	-k|--kernel)
		build=$2
		;;
	-l|--list)
		fetch_versions
		exit 0
		;;
	-h|--help)
		echo "usage: `basename $0`" \
			"[-k|--kernel <kernel build>]" \
			"[-l|--list]"
		exit 0
		;;
	"")
		;; # proceed to install
	*)
		echo "unknown command: $1" >&2
		$0 --help
		exit 1
		;;
esac


kernel=$(uname -r)
build=${build:-$(uname -v | awk '{print $1}' | tr -d '#')}

if [ "$kernel" = "3.6.11+" ] && [ "$build" -gt 370 ] && [ "$build" -lt 521 ] ; then
	tarfile=8188eu-20130209.tar.gz
else
	tarfile=$(fetch_versions \
		| grep -e "^$kernel " \
		| grep -E "#$build[, ]" \
		| awk '{print $NF}')
fi

if [ ! "$tarfile" ] ; then
	echo "cannot match kernel: $kernel #$build"
	echo "please check news at $TOPIC_URL"
	echo "or try closest compatible version with -k <kernel build>"
	exit 1
fi

tmpdir=$(mktemp -d)
trap '\rm -rf "$tmpdir"' EXIT
cd "$tmpdir"

echo "downloading $tarfile (kernel $kernel #$build)"
curl -s https://dl.dropboxusercontent.com/u/80256631/$tarfile | tar xz

module_bin="8188eu.ko"
module_dir="/lib/modules/$kernel/kernel/drivers/net/wireless"
firmware_bin="rtl8188eufw.bin"
firmware_dir="/lib/firmware/rtlwifi"

if [ -f "$firmware_bin" ] ; then
	echo "installing firmware $firmware_bin"
	sudo install -p -m 644 "$firmware_bin" "$firmware_dir"
fi

echo "installing kernel module $module_bin"
sudo install -p -m 644 "$module_bin" "$module_dir"
sudo depmod -a
#sudo modprobe -r 8188eu || true # cannot currently be removed ("permanent")
sudo modprobe -i 8188eu
lsmod | grep -q 8188eu || echo "error: module not loaded"

As per a later update, dl.dropboxusercontent.com is no longer valid and should be substituted with http://www.fars-robotics.net/, but dl.dropboxusercontent.com still worked for me. Now my WiFi is working.
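If the Dropbox URL no longer resolves for you, the download line in the script would change roughly as sketched below; note that the exact directory layout at fars-robotics.net is an assumption on my part, so verify it on the site first:

# hypothetical replacement for the Dropbox download line in the script above;
# the path under fars-robotics.net is an assumption, check the site
curl -s "http://www.fars-robotics.net/$tarfile" | tar xz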

uname -a; ifconfig wlan0; lsmod | grep 8188; lsusb
Linux DietPi 4.4.11-v7+ #886 SMP Thu May 19 15:20:49 BST 2016 armv7l GNU/Linux
wlan0     Link encap:Ethernet  HWaddr 00:e0:4c:81:89:01  
          inet addr:192.168.0.102  Bcast:192.168.0.255  Mask:255.255.255.0
          inet6 addr: fe80::2e0:4cff:fe81:8901/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:2262 errors:0 dropped:10 overruns:0 frame:0
          TX packets:1659 errors:0 dropped:3 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:463049 (452.1 KiB)  TX bytes:274451 (268.0 KiB)
8188eu                859474  0 
cfg80211              427855  1 8188eu
Bus 001 Device 004: ID 0bda:8179 Realtek Semiconductor Corp. 
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp. SMSC9512/9514 Fast Ethernet Adapter
Bus 001 Device 002: ID 0424:9514 Standard Microsystems Corp. 
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

And below are my available kernels.

ls -ltr /lib/modules
total 28
drwxr-xr-x 3 root root 4096 Mar 18  2016 4.1.18-v7+
drwxr-xr-x 3 root root 4096 Nov 17 23:58 4.9.62+
drwxr-xr-x 3 root root 4096 Dec 30 12:36 4.9.35+
drwxr-xr-x 3 root root 4096 Dec 30 12:36 4.9.35-v7+
drwxr-xr-x 3 root root 4096 Dec 30 14:52 4.9.62-v7+
drwxr-xr-x 3 root root 4096 Dec 30 19:49 4.4.11+
drwxr-xr-x 3 root root 4096 Dec 30 20:03 4.4.11-v7+

Run httpd with docker

So below is the script:

#!/bin/bash

echo "Running httpd with docker."

docker run --rm -v "$PWD":/usr/local/apache2/htdocs httpd

We use the following options:
-v, --volume list    Bind mount a volume
--rm                 Automatically remove the container when it exits

Quite simple, right? Execution below.

 bash ~/docker/run_httpd.sh
Running httpd with docker.
AH00558: httpd: Could not reliably determine the server's fully qualified domain name, using 172.17.0.6. Set the 'ServerName' directive globally to suppress this message
AH00558: httpd: Could not reliably determine the server's fully qualified domain name, using 172.17.0.6. Set the 'ServerName' directive globally to suppress this message
[Thu Aug 17 22:22:55.249981 2017] [mpm_event:notice] [pid 1:tid 140029488904064] AH00489: Apache/2.4.27 (Unix) configured -- resuming normal operations
[Thu Aug 17 22:22:55.250079 2017] [core:notice] [pid 1:tid 140029488904064] AH00094: Command line: 'httpd -D FOREGROUND'

And we proceed to test.

 curl   http://172.17.0.6
<HTML>
<HEAD>
First page
</HEAD>
<BODY>
Testing docker httpd


We get the index.html above because we are mapping /tmp/httpd (the current $PWD) to /usr/local/apache2/htdocs. In /tmp/httpd we created the example index.html shown above.
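For completeness, a minimal sketch of the test setup, assuming /tmp/httpd as the working directory (the HTML content is just the illustration used above):

mkdir -p /tmp/httpd
cat > /tmp/httpd/index.html <<'EOF'
<HTML>
<HEAD>
First page
</HEAD>
<BODY>
Testing docker httpd
EOF
cd /tmp/httpd
bash ~/docker/run_httpd.sh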
More info here

Run python script with docker

So I started playing with docker and asked myself whether it would be possible to run a Python script with docker. Well, the answer is yes. An example script is below.

#!/usr/bin/python

import sys
print "Running script!!"
print sys.version_info

Execution below. Besides -v and --rm, seen above, we add -it for an interactive terminal, --name to name the container, and -w to set the working directory inside it:

docker run -it --rm --name pythonscript -v "$PWD":/usr/src/myapp -w /usr/src/myapp python:2 python script.py
Running script!!
sys.version_info(major=2, minor=7, micro=13, releaselevel='final', serial=0)

References:
Docker documentation

Using sqoop to import a DB table into HDFS

In the world of Big Data, to import data from a relational DB into HDFS you use Apache Sqoop.

sqoop import --connect jdbc:mysql://localhost/mysql  --username training  -P --warehouse-dir /home/training/db --table user
17/02/23 10:38:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.2.0
Enter password: 
17/02/23 10:38:24 INFO manager.SqlManager: Using default fetchSize of 1000
17/02/23 10:38:24 INFO tool.CodeGenTool: Beginning code generation
17/02/23 10:38:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user` AS t LIMIT 1
17/02/23 10:38:24 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user` AS t LIMIT 1
17/02/23 10:38:24 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-0.20-mapreduce
Note: /tmp/sqoop-training/compile/7f3a9709c50f58c2c6bb24de91922c6b/user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/02/23 10:38:29 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-training/compile/7f3a9709c50f58c2c6bb24de91922c6b/user.jar
17/02/23 10:38:29 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/02/23 10:38:29 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/02/23 10:38:29 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/02/23 10:38:29 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/02/23 10:38:29 WARN manager.CatalogQueryManager: The table user contains a multi-column primary key. Sqoop will default to the column Host only for this job.
17/02/23 10:38:29 WARN manager.CatalogQueryManager: The table user contains a multi-column primary key. Sqoop will default to the column Host only for this job.
17/02/23 10:38:29 INFO mapreduce.ImportJobBase: Beginning import of user
17/02/23 10:38:31 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
17/02/23 10:38:33 INFO db.DBInputFormat: Using read commited transaction isolation
17/02/23 10:38:33 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`Host`), MAX(`Host`) FROM `user`
17/02/23 10:38:33 WARN db.TextSplitter: Generating splits for a textual index column.
17/02/23 10:38:33 WARN db.TextSplitter: If your database sorts in a case-insensitive order, this may result in a partial import or duplicate records.
17/02/23 10:38:33 WARN db.TextSplitter: You are strongly encouraged to choose an integral split column.
17/02/23 10:38:33 INFO mapred.JobClient: Running job: job_201702221239_0003
17/02/23 10:38:34 INFO mapred.JobClient:  map 0% reduce 0%
17/02/23 10:38:55 INFO mapred.JobClient:  map 17% reduce 0%
17/02/23 10:38:56 INFO mapred.JobClient:  map 33% reduce 0%
17/02/23 10:39:08 INFO mapred.JobClient:  map 50% reduce 0%
17/02/23 10:39:09 INFO mapred.JobClient:  map 67% reduce 0%
17/02/23 10:39:21 INFO mapred.JobClient:  map 83% reduce 0%
17/02/23 10:39:22 INFO mapred.JobClient:  map 100% reduce 0%
17/02/23 10:39:26 INFO mapred.JobClient: Job complete: job_201702221239_0003
17/02/23 10:39:26 INFO mapred.JobClient: Counters: 23
17/02/23 10:39:26 INFO mapred.JobClient:   File System Counters
17/02/23 10:39:26 INFO mapred.JobClient:     FILE: Number of bytes read=0
17/02/23 10:39:26 INFO mapred.JobClient:     FILE: Number of bytes written=1778658
17/02/23 10:39:26 INFO mapred.JobClient:     FILE: Number of read operations=0
17/02/23 10:39:26 INFO mapred.JobClient:     FILE: Number of large read operations=0
17/02/23 10:39:26 INFO mapred.JobClient:     FILE: Number of write operations=0
17/02/23 10:39:26 INFO mapred.JobClient:     HDFS: Number of bytes read=791
17/02/23 10:39:26 INFO mapred.JobClient:     HDFS: Number of bytes written=818
17/02/23 10:39:26 INFO mapred.JobClient:     HDFS: Number of read operations=6
17/02/23 10:39:26 INFO mapred.JobClient:     HDFS: Number of large read operations=0
17/02/23 10:39:26 INFO mapred.JobClient:     HDFS: Number of write operations=6
17/02/23 10:39:26 INFO mapred.JobClient:   Job Counters 
17/02/23 10:39:26 INFO mapred.JobClient:     Launched map tasks=6
17/02/23 10:39:26 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=89702
17/02/23 10:39:26 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
17/02/23 10:39:26 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
17/02/23 10:39:26 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
17/02/23 10:39:26 INFO mapred.JobClient:   Map-Reduce Framework
17/02/23 10:39:26 INFO mapred.JobClient:     Map input records=8
17/02/23 10:39:26 INFO mapred.JobClient:     Map output records=8
17/02/23 10:39:26 INFO mapred.JobClient:     Input split bytes=791
17/02/23 10:39:26 INFO mapred.JobClient:     Spilled Records=0
17/02/23 10:39:26 INFO mapred.JobClient:     CPU time spent (ms)=5490
17/02/23 10:39:26 INFO mapred.JobClient:     Physical memory (bytes) snapshot=666267648
17/02/23 10:39:26 INFO mapred.JobClient:     Virtual memory (bytes) snapshot=4423995392
17/02/23 10:39:26 INFO mapred.JobClient:     Total committed heap usage (bytes)=191102976
17/02/23 10:39:26 INFO mapreduce.ImportJobBase: Transferred 818 bytes in 56.7255 seconds (14.4203 bytes/sec)
17/02/23 10:39:26 INFO mapreduce.ImportJobBase: Retrieved 8 records.

The example above dumps the table user from the mysql DB into Hadoop:
--connect specifies the JDBC connection string to the DB.
--username is the authentication username, and -P asks for the password at the prompt.
--warehouse-dir sets the HDFS parent directory for the table destination, and --table selects the table to import.
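As the warnings in the log above suggest, the MySQL-specific fast path can be enabled with --direct. A sketch of the same import using it, assuming mysqldump is available on the cluster nodes (untested on my side):

sqoop import --connect jdbc:mysql://localhost/mysql --username training -P \
      --warehouse-dir /home/training/db --table user --direct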

The dumped content is shown below.

hdfs dfs -cat   /home/training/db/user/part-m-0000*
127.0.0.1,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0
localhost,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0
localhost.localdomain,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0
localhost,,,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0
localhost.localdomain,,,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0
localhost,training,*27CF0BD18BDADD517165824F8C1FFF667B47D04B,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,N,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0
localhost,hiveuser,*2470C0C06DEE42FD1618BB99005ADCA2EC9D1E19,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0
localhost,hue,*15221DE9A04689C4D312DEAC3B87DDF542AF439E,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0