Bash quick learn

Bash Shell Shortcuts
Special Shell Variables

Common environment variables

PATH – Sets the search path for any executable command. Similar to the PATH variable in MS-DOS.

HOME – Home directory of the user.

MAIL – Contains the path to the location where mail addressed to the user is stored.

IFS – Contains a string of characters which are used as word separators on the command line. The string normally consists of the space, tab and newline characters. Since they are invisible, you will have to do an octal dump to see them:

$ echo "$IFS" | od -bc

PS1 and PS2 – Primary and secondary prompts in bash. PS1 is set to $ by default and PS2 is set to > . To see the secondary prompt, just run the command :

$ ls |

… and press enter.

USER – User login name.

TERM – indicates the terminal type being used. This should be set correctly for editors like Vim to work correctly.

SHELL – Determines the type of shell that the user sees on logging in.

To see the value held by any of the above environment variables, just echo the name of the variable preceded by a $.

For example, if I do the following:

$ echo $USER
ravi

… I get the value (My login name) which is stored in the environment variable USER.

Some bash shell scripting rules

The first line in your script must be #!/bin/bash
… that is a # (Hash) followed by a ! (bang) followed by the path of the shell. This line lets the environment know the file is a shell script and the location of the shell.
Before executing your script, you should make the script executable. You do it by using the following command:

$ chmod ugo+x your_shell_script.sh

The name of your shell script should end with a .sh . This lets the user know that the file is a shell script. This is not compulsory, but it is the norm.
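Putting those rules together, here is a minimal, hypothetical example (the script name and message are just placeholders):

$ cat > hello.sh << 'EOF'
#!/bin/bash
echo "Hello from $0"
EOF
$ chmod ugo+x hello.sh
$ ./hello.sh
Hello from ./hello.sh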
Conditional statements
‘if’ Statement
The ‘if’ statement evaluates a condition which accompanies its command line.
syntax:

if condition_is_true
then
//execute commands
else
//execute commands
fi

The 'if' construct also permits multi-way branching; that is, you can evaluate further conditions if the previous condition fails.

if condition_is_true
then
//execute commands
elif another_condition_is_true
then
//execute commands
else
//execute commands
fi

Example :

if grep "aboutlinux" thisfile.html
then
echo "Found the word in the file"
else
echo "Sorry no luck!"
fi

if’s companion – test

test is an internal feature of the shell. ‘test’ evaluates the condition placed on its right, and returns either a true or false exit status. For this purpose, ‘test’ uses certain operators to evaluate the condition. They are as follows:

Relational Operators

-eq – Equal to
-lt – Less than
-gt – Greater than
-ge – Greater than or Equal to
-le – Less than or Equal to

File related tests

-f file – True if file exists and is a regular file.
-r file – True if file exists and is readable.
-w file – True if file exists and is writable.
-x file – True if file exists and is executable.
-d file – True if file exists and is a directory.
-s file – True if file exists and has a size greater than zero.

String tests

-n str – True if string str is not a null string.
-z str – True if string str is a null string.
str1 == str2 – True if both strings are equal.
str – True if string str is assigned a value and is not null.
str1 != str2 – True if both strings are unequal.
Test also permits the checking of more than one expression in the same line.
-a – Performs the AND function
-o – Performs the OR function

A few Example snippets of using test

test $d -eq 25 && echo $d

… which means, if the value in the variable d is equal to 25, print the value. Otherwise don’t print anything.

test $s -lt 50 && do_something

if [ $d -eq 25 ]
then
echo $d
fi

In the above example, I have used square brackets instead of the keyword test – which is another way of doing the same thing.

if [ "$str1" == "$str2" ]
then
//do something
fi

if [ -n "$str1" -a -n "$str2" ]
then
echo 'Both $str1 and $str2 are not null'
fi

… above, I have checked if both strings are not null then execute the echo command.

Things to remember while using test

If you are using square brackets [ ] instead of test, then care should be taken to insert a space after the [ and before the ].
test is confined to integer values only; comparing decimal values with -eq and friends produces an "integer expression expected" error.
Do not use wildcards for testing string equality – they are expanded by the shell to match the files in your directory rather than the string.
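For instance, quoting both operands side-steps the wildcard expansion and the spacing pitfalls mentioned above (a small sketch; str is just an illustrative variable):

str="*.html"
if [ "$str" = "*.html" ]
then
echo "The string matched literally; no files were expanded"
fi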
Case statement
Case statement is the second conditional offered by the shell.
Syntax:

case expression in
pattern1) //execute commands ;;
pattern2) //execute commands ;;

esac

The keywords here are case, in and esac. The ';;' pair serves as the option terminator. The construct also uses ')' to delimit the pattern from the action.

Example:


echo "Enter your option : "
read i;

case $i in
1) ls -l ;;
2) ps -aux ;;
3) date ;;
4) who ;;
5) exit
esac

The last case option need not have ;; but you can provide them if you want.

Here is another example:

case `date | cut -d" " -f1` in
Mon) commands ;;
Tue) commands ;;
Wed) commands ;;

esac

Case can also match more than one pattern with each option. You can also use shell wildcards for matching patterns.


echo "Do you wish to continue? (y/n)"
read ans

case $ans in
Y|y) ;;
[Yy][Ee][Ss]) ;;
N|n) exit ;;
[Nn][Oo]) exit ;;
*) echo "Invalid command"
esac

In the above case, if you enter YeS, YES, yEs or any other combination of upper and lower case, it will be matched.

This brings us to the end of conditional statements.

Looping Statements

while loop

while loop syntax –

while condition_is_true
do
//execute commands
done

Example:

while [ $num -gt 100 ]
do
sleep 5
done

while :
do
//execute some commands
done

The above code implements an infinite loop. You could also write 'while true' instead of 'while :' .
Here I would like to introduce two keywords with respect to looping conditionals. They are break and continue.

break – This keyword causes control to break out of the loop.

continue – This keyword will suspend the execution of all statements following it and switches control to the top of the loop for the next iteration.
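A small sketch showing both keywords in one while loop (the numbers are arbitrary):

i=0
while [ $i -lt 10 ]
do
i=`expr $i + 1`
if [ $i -eq 3 ]
then
continue    # skip the echo for 3 and start the next iteration
fi
if [ $i -ge 5 ]
then
break       # leave the loop entirely once i reaches 5
fi
echo "i is $i"
done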

until loop

until complements the while construct in the sense that the loop body here is executed repeatedly as long as the condition remains false.

Syntax:

until false
do
//execute commands
done

Example:


until [ -r myfile ]
do
sleep 5
done

The above code is executed repeatedly until the file myfile can be read.

for loop

for loop syntax :

for variable in list
do
//execute commands
done

Example:


for x in 1 2 3 4 5
do
echo "The value of x is $x";
done

Here the list contains 5 numbers 1 to 5. Here is another example:

for var in $PATH $MAIL $HOME
do
echo $var
done

Suppose you have a directory full of java files and you want to compile those. You can write a script like this:


for file in *.java
do
javac $file
done

You can use wildcard expressions in your scripts.
Read Regular Expressions Tutorial to know more.

Special symbols used in BASH scripting

$* – This denotes all the parameters passed to the script at the time of its execution, which includes $1, $2 and so on.
$0 – Name of the shell script being executed.
$# – Number of arguments specified in the command line.
$? – Exit status of the last command.

The above symbols, together with $1, $2 and so on, are known as the shell's special and positional parameters. Let me explain the positional parameters with the aid of an example.

Suppose I have a shell script called my_script.sh . Now I execute this script in the command line as follows :

$ ./my_script.sh linux is a robust OS

… as you can see above, I have passed 5 parameters to the script. In this scenario, the values of the positional parameters are as follows:
$* – will contain the values ‘linux’,’is’,’a’,’robust’,’OS’.
$0 – will contain the value my_script.sh – the name of the script being executed.
$# – contains the value 5 – the total number of parameters.
$$ – contains the process ID of the current shell. You can use this parameter while giving unique names to any temporary files that you create at the time of execution of the shell.
$1 – contains the value ‘linux’
$2 – contains the value ‘is’
… and so on.

The set and shift statements
set – Lets you associate values with these positional parameters.

For example, try this:

$ set `date`
$ echo $1
$ echo $*
$ echo $#
$ echo $2

shift – transfers the contents of each positional parameter to its immediate lower-numbered one. This happens as many times as it is called.

Example :

$ set `date`
$ echo $1 $2 $3
$ shift
$ echo $1 $2 $3
$ shift
$ echo $1 $2 $3

To see the process Id of the current shell, try this:

$ echo $$
2667

Validate that it is the same value by executing the following command:

$ ps -f |grep bash
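As mentioned above, $$ is handy for giving temporary files unique names; a minimal sketch (the file name is arbitrary):

TMPFILE=/tmp/myscript.$$
echo "scratch data" > $TMPFILE
# ... work with the file ...
rm -f $TMPFILE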

Make your BASH shell script interactive

read statement

Make your shell script interactive. read lets the user enter values while the script is being executed. When a program encounters the read statement, the program pauses at that point. Input entered through the keyboard is read into the variables following read, and the program execution continues.

An example –

#!/bin/sh
echo "Enter your name : "
read name
echo "Hello $name , Have a nice day."

Exit status of the last command

Every command returns a value after execution. This value is called the exit status or return value of the command. A command is said to be true if it executes successfully, and false if it fails. This can be checked in the script using the $? special parameter.
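A small sketch of checking $? (the grep target is arbitrary):

grep -q root /etc/passwd
if [ $? -eq 0 ]
then
echo "root is present in /etc/passwd"
else
echo "grep failed or found nothing"
fi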

ZCAT Shell bash

How to display the contents of a gzip/gz file
By Alvin Alexander. Last updated: Aug 6, 2011
Problem: You have a plain text file that has been compressed with the gzip command, and you’d like to display the file contents with the Unix/Linux cat or more commands.

Solution: Instead of using the cat or more commands, use their equivalents for working with gz files, the zcat and zmore commands.

For instance, if you want to display the contents of an Apache log file (which is a plain text file) that has been compressed with gzip, just use the zcat command, like this:

zcat access_log.gz
Of course almost any Apache log file will be large, and will scroll off the screen quickly, so you’ll probably want to use the gzip equivalent of the more command, zmore, like this:

zmore access_log.gz

find . -name "*.gz" | while read -r file; do zcat -f "$file" | head -n 1; done

zcat `man -w manpage` | groff -mandoc -T html - > filename.html

save manpage as html file

zcat log.tar.gz | grep -a -i "string"

grep compressed log files without extracting. Useful in system where log files are compressed for archival purposes

zcat /usr/share/man/man1/man.1.gz | nroff -man | less

As odd as this may be, I know of servers where the man(1) command is not installed, and there is not enough room on / to install it. However, zcat(1), nroff(1) and less(1) are. This is a way to read those documents without the proper tool to do so, as sad as this may seem. 🙂

This command enables the user to append a search pattern on the command line when using less as the PAGER. This is especially convenient (as the example shows) in compressed files and when searching man pages (substituting the zcat command with man, however).

zcat -f $(ls -tr access.log*)

concatenate compressed and uncompressed logs
with zcat force option it’s even simpler.

find /var/log/apache2 -name 'access.log*gz' -exec zcat {} \; -or -name 'access.log*' -exec cat {} \;
functions: cat find zcat
concatenate compressed and uncompressed logs
This command allows you to stream your log files, including gziped files, into one stream which can be piped to awk or some other command for analysis.
Note: if your version of ‘find’ supports it, use:

find /var/log/apache2 -name 'access.log*gz' -exec zcat {} + -or -name 'access.log*' -exec cat {} +
zcat database.sql.gz | mysql -uroot -p'passwd' database
Functions: zcat
Restore mysql database uncompressing on the fly.
This way you keep the file compressed, saving disk space.
Another, less optimal, way is to use a named pipe or process substitution, e.g. mysql -uroot -p'passwd' database < <(zcat database.sql.gz)

( zcat $FILE || gzcat $FILE || bzcat2 $FILE ) | less

Group OR'd commands where you expect only one to work
Something to stuff in an alias when you are working in multiple environments. The double-pipe OR will fall through until one of the commands succeeds, and the rest won't be executed. Any STDERR will fall out, but the STDOUT from the correct command will bubble out of the parentheses to the less command, or some other command you specify.

( last ; ls -t /var/log/wtmp-2* | while read line ; do ( rm /tmp/wtmp-junk ; zcat $line 2>/dev/null || bzcat $line ) > /tmp/junk-wtmp ; last -f /tmp/junk-wtmp ; done ) | less
Functions: last ls read rm zcat
Tags: last command wtmp
See a full last history by expanding logrotated wtmp files
When your wtmp files are being logrotated, here’s an easy way to unpack them all on the fly to see more than a week in the past. The rm is the primitive way to prevent symlink prediction attack.

zcat access_log.*.gz | awk '{print $7}' | sort | uniq -c | sort -n | tail -n 20
Functions: awk sort tail uniq zcat
Tags: log apache zcat analysis
Analyse compressed Apache access logs for the most commonly requested pages

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'
Functions: awk printf sudo zcat
Tags: Security awk brute force
Show the number of failed tries of login per account. If the user does not exist it is marked with *.

zcat a_big_file.gz | sed -ne "$(zcat a_big_file.gz | tr -d "[:print:]" | cat -n | grep -vP "^ *\d+\t$" | cut -f 1 | sed -e "s/\([0-9]\+\)/\1=;\1p;/" | xargs)" | tr -c "[:print:]\n" "?"

Functions: sed tr zcat
Scan a gz file for non-printable characters and display each line number and line that contains them.
Scans the file once to build a list of line numbers that contain non-printable characters
Scans the file again, passing those line numbers to sed as two commands to print the line number and the line itself. Also passes the output through a tr to replace the characters with a ?

zcat /usr/share/doc/vim-common/README.gz | vim -g +23 -
Functions: vim zcat
Pipe a textfile to vim and move the cursor to a certain line
This command is more for demonstrating piping to vim and jumping to a specific line than anything else.
Exit vim with :q!
+23 jumps to line 23
- makes vim read the data from the pipe

zcat /usr/share/man/man1/grep.1.gz | grep "color"

Search gzipped files
This decompresses the file and sends the output to STDOUT so it can be grepped. A good one to put in loops for searching directories of gzipped files, such as man pages.
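For example, such a loop might look like this (a sketch; the directory and search string are placeholders):

for f in /usr/share/man/man1/*.gz
do
zcat "$f" | grep -q "color" && echo "$f"
done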

#!/bin/sh
STAMP=`date '+%Y%m%d-%H:%M'`
REMOTE_MYCNF=/var/log/mysoft/mysoft.log
REMOTE_GZ=/var/log/mysoft/mysoft.log.1.gz
REMOTE_DIR=/var/log/mysoft/
BACKUP_DIR=/home/dev/logs/
NEWLOG="foo-temp.log"
ssh $1 "zcat $REMOTE_GZ >> $REMOTE_DIR$NEWLOG"
ssh $1 "cat $REMOTE_MYCNF >> $REMOTE_DIR$NEWLOG"

Grep, Awk, and Sed in bash

So, let’s consider the following log file:

www.three.com 10.15.101.11 1353280801 TEST 345
www.one.com 10.14.101.11 1353280801 TEST 343
www.three.com 1.10.11.71 1353280801 TEST 323
www.one.com 10.15.11.61 1353280801 TEST 365
www.two.com 10.10.11.51 1353290801 TEST 55
www.two.com 10.20.13.11 1353290801 REST 435
www.one.com 10.20.14.41 1353290801 REST 65
www.two.com 10.10.11.14 1353290801 REST 345
www.three.com 10.10.11.31 1354280801 REST 34
www.one.com 10.10.13.144 1354280801 JSON 65
www.two.com 10.50.11.141 1354280801 JSON 665
www.three.com 120.10.11.11 1354280801 JSON 555
www.two.com 10.144.11.11 1383280801 RAW 33
www.one.com 10.103.141.141 1383280801 RAW 315

Now, here are some things you can do to this log file:
How many files are in a directory: ls | wc -l
Print the file: cat sample.log
Print lines that match a particular word: grep "RAW" sample.log
Print those lines to a file called test.log: grep "RAW" sample.log > test.log
Print particular columns and sort: cat sample.log | awk '{ print $1,$2}' | sort -k 1
Find and Replace using SED and Regex: cat sample.log | sed 's/TEST/JSON/g'
Split a log file into multiple files using a column as name with AWK: awk '{ print >> ($4".log"); close($4".log") }' sample.log
Use substr (removes last character) in AWK to manipulate a string per line: cat sample.log | awk '{ print $1,$2,$3,substr($4,1,length($4)-1),$5}'
Print first line of file with SED: sed q test.log
Print last line of file with SED: sed '$!d' sample.log
Perform a regular expression on last character of entire file using SED: cat sample.log | sed '$ s/5$//'
Add some text to beginning and end of a file with AWK: cat sample.log | awk 'BEGIN{print "START" } { print } END{print "END"}'
Count and print how many unique fields are in all rows using AWK: cat sample.log | awk '{ if (a[$1]++ == 0) print $1 }' | wc -l
Make everything lowercase with AWK: cat sample.log | awk '{print tolower($0)}'
Multiple SED regular expressions: sed '1s/^/START/;$ s/5$/END/' sample.log
Regex with SED on multiple files: for file in *; do sed '1s/^/START/' $file > $file'.json'; done

Run a command on each file in a directory: for i in `ls`; do $i; done
Rename the extension of all files in a folder: for old in *.txt; do mv $old `basename $old .txt`.json; done
Merge all files in a directory with a comma separator: find . -type f -not -name output.txt -exec cat {} \; -exec echo "," \; > output.txt
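One more awk sketch in the same spirit, not from the original list: summing the last column per site in the sample log above (the field positions assume the sample format shown earlier):

awk '{ sum[$1] += $5 } END { for (site in sum) print site, sum[site] }' sample.log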

if condition – integer expression expected

The test command handles integers only, so comparing decimal values like these fails with an "integer expression expected" error:

ADV1=94.3
Quantity=96.3
if [ $Quantity -eq $ADV1 ]; then
echo "quantity is greater"
fi

One workaround is to do the comparison with a tool that understands floating point, such as nawk or bc:

echo "" | nawk -v ADV1=94.3 -v Quantity=96.3 '{ if (ADV1 > Quantity) print ADV1 " is greater"; else print Quantity " is greater" }'

if [ `echo "$Quantity > $ADV1" | bc` -eq 1 ] ; then
echo "$Quantity is greater"
else
echo "$ADV1 is greater"
fi

ADV1=94.3
Quantity=96.3
if [ $Quantity -gt $ADV1 ]; then
echo "quantity is greater"
fi

if [ "$var1" -lt "$var2" ]; then
echo "$var1 is lt $var2"
else
echo "$var2 is lt $var1"
fi

if (( var1 < var2 )); then
echo "$var1 is lt $var2"
fi

# start the long-running command in the background, logging to /tmp/log
cmd > /tmp/log &
# background cmd pid
pid=$!
# loop to monitor running background cmd
while :
do
ps ax | grep $pid | grep -v grep
ret=$?
if test "$ret" != "0"
then
echo "Monitored pid ended"
exit
fi
sleep 5

done

wait $pid
echo $?

#!/bin/bash
CHECK=$0
SERVICE=$1
DATE=`date`
OUTPUT=$(ps aux | grep -v grep | grep -v $CHECK |grep $1)
echo $OUTPUT
if [ "${#OUTPUT}" -gt 0 ] ;
then echo "$DATE: $SERVICE service running, everything is fine"
else echo "$DATE: $SERVICE is not running"
fi

#!/bin/sh

SERVICE="$1"
RESULT=`ps -a | sed -n /${SERVICE}/p`

if [ "${RESULT:-null}" = null ]; then
echo "not running"
else
echo "running"
fi

#!/bin/bash
ps axho comm| grep $1 > /dev/null
result=$?
echo “exit code: ${result}”
if [ "${result}" -eq "0" ] ; then
echo "`date`: $SERVICE service running, everything is fine"
else
echo "`date`: $SERVICE is not running"
/etc/init.d/$1 restart
fi

#!/bin/sh
SERVICE=$1
if ps ax | grep -v grep | grep -v $0 | grep $SERVICE > /dev/null
then
echo "$SERVICE service running, everything is fine"
else
echo "$SERVICE is not running"
fi

#!/bin/sh

PROCESS="$1"
PROCANDARGS=$*

while :
do
RESULT=`pgrep ${PROCESS}`

if [ "${RESULT:-null}" = null ]; then
echo "${PROCESS} not running, starting "$PROCANDARGS
$PROCANDARGS &
else
echo "running"
fi
sleep 10
done

#!/bin/bash
ps_out=`ps -ef | grep $1 | grep -v 'grep' | grep -v $0`
result=$(echo $ps_out | grep "$1")
if [[ "$result" != "" ]];then
echo “Running”
else
echo “Not Running”
fi

# simulate a long process that will have an identifiable exit code
(sleep 15 ; /bin/false) &
my_pid=$!

while ps | grep ” $my_pid ” # might also need | grep -v grep here
do
echo $my_pid is still in the ps output. Must still be running.
sleep 3
done

echo Oh, it looks like the process is done.
wait $my_pid
my_status=$?
echo The exit status of the process was $my_status

#!/bin/sh

cmd() { sleep 5; exit 24; }

cmd & # Run the long running process
pid=$! # Record the pid

# Spawn a process that continually reports that the command is still running
while echo "$(date): $pid is still running"; do sleep 1; done &
echoer=$!

# Set a trap to kill the reporter when the process finishes
trap 'kill $echoer' 0

# Wait for the process to finish
if wait $pid; then
echo "cmd succeeded"
else
echo "cmd FAILED!! (returned $?)"
fi

#!/bin/bash
....
doSomething &
local pid=$!
while [ -d /proc/$pid ]; do # While the directory exists, the process is running
doSomethingElse
....
done
# once the directory is removed from /proc, the process has ended
wait $pid
local exit_status=$?

#!/bin/sh
cd /usr/src/linux
if [ "$?" -eq "0" ]; then
make dep
if [ "$?" -eq "0" ]; then
make bzImage
if [ "$?" -eq "0" ]; then
make modules
if [ "$?" -eq "0" ]; then
make modules_install
if [ "$?" -eq "0" ]; then
cp arch/i386/boot/bzImage /boot/my-new-kernel
if [ "$?" -eq "0" ]; then
cp System.map /boot/
if [ "$?" -eq "0" ]; then
echo "Your new kernel awaits, m'lord."
fi
fi
fi
fi
fi
fi
fi

Parsing – Logs

grep, cat, zgrep and zcat

More on log parsing: I'm taking notes on how to read log files and get the information that I need. In a Linux environment, these tools are perfect: grep, cat, zgrep and zcat.

Extracting patterns with grep

We can extract information from a text file using grep. For example, we can extract the lines of a log file containing patterns like GET /checkout/* where the status code is 500.

grep -E -e 'GET /checkout/.* HTTP/1\.(0|1)" 500' some-log-file.log
Depending on the Apache log format, the above will extract lines whose request is /checkout/* and whose status code is 500, matching either HTTP/1.0 or HTTP/1.1. However, that would extract the whole line. To only extract the matching pattern, use the -o option.

grep -o -E -e 'GET /checkout/.* HTTP/1\.(0|1)" 500' some-log-file.log
And to save the matching patterns to a file, simply redirect the output to file.

grep -E -e 'GET /checkout/.* HTTP/1\.(0|1)" 500' some-log-file.log > checkout-errors.txt
Using cat

cat is usually used to output contents of a file. This is a small but very useful Linux utility. For example, we can combine multiple log files (uncompressed) into a single log file.

cat /path/to/log-files/*.log > /combined/log-file.log
Compressed counterpart

grep and cat have their compressed file counterpart. For grep, there’s zgrep.

zgrep -E -e 'GET /checkout/.* HTTP/1\.(0|1)" 500' some-log-file.gz > checkout-errors.txt
For cat, there’s zcat.

zcat /path/to/log-files/*.gz > /combined/log-file.log
I've done so many combinations last week that I don't remember them all and am not able to include them in this post. Happy log parsing.

# List out successful ssh login attempts
cat secure | grep 'Accepted' | awk '{print $1 " " $2 " " $3 " User: " $9 " " }'
cat secure* | sort | grep 'Accepted' | awk '{print $1 " " $2 " " $3 " User: " $9 " IP:" $11 }'

# List out successful ssh login attempts from sudo users
cat /var/log/secure | grep 'session opened for user root' | awk '{print $1 " " $2 " " $3 " Sudo User: " $13 " " }'

# List out ssh login attempts from non-existing and unauthorized user accounts
cat /var/log/secure | grep 'Invalid user'

# List out ssh login attempts by authorized ssh accounts with failed password
cat /var/log/secure | grep -v invalid | grep 'Failed password'

Indeed, and even grep | awk can be shortened to awk /…/. So you could save a bit of space in the final script. For a typical log file (~200 kb), you might save 1 ms processing it. Or to be exact, 1.8 ms removing the cat and grep, and 0.3 ms using only awk instead of grep | awk.

time for i in `seq 1000`; do cat secure | grep Accepted | awk '{print $1 " " $2 " " $3 " User: " $9 " " }'; done > /tmp/a
time for i in `seq 1000`; do grep Accepted secure | awk '{print $1 " " $2 " " $3 " User: " $9 " " }'; done > /tmp/b
time for i in `seq 1000`; do awk '/Accepted/ {print $1 " " $2 " " $3 " User: " $9 " " }' secure; done > /tmp/c

However, more interestingly, when the size of the log file is increased to 200 MB, it turns out that the cat | grep | awk chain is significantly faster, at 1.096 s over 100 runs. The single awk command will not max out the CPUs, while the pipe chain does.

for i in `seq 1000`; do cat secure >> s; done
time for i in `seq 100`; do cat s | grep Accepted | awk '{print $1 " " $2 " " $3 " User: " $9 " " }'; done > /tmp/a1
time for i in `seq 100`; do awk '/Accepted/ {print $1 " " $2 " " $3 " User: " $9 " " }' s; done > /tmp/c1

My perl script with awk:
Code:
my $OldTime = 'Sep 10, 2012 5:20:41';
my $NewTime = 'Sep 10, 2012 5:49:40';

my $test2 = qx{ssh -o stricthostkeychecking=no $WLS "awk '/$OldTime/,/$NewTime/' $WLSP/logs/CDSServer.* "};
My log file format:
Code:
####<...> <Kernel> <...>
####<...> <[ACTIVE] ExecuteThread: '13' for queue: 'weblogic.kernel.Default (self-tuning)'> <...>
####<12d58f5205084394:4db5aec7:1393b6c3ce7:-8000-00000000> <...>

Reply (atreyu):

Hi, I'd make two suggestions. The first one is to use the Date::Parse Perl module. It is hopefully already packaged for your distro. This module will allow you to easily convert date/time strings to seconds since the epoch (which is an easy way to do date/time math). It will give you the equivalent output to this GNU date command:
Code:
date +%s -d "Sep 10, 2012 5:20:41 PM"
The second suggestion would be to put the script on the server and pass to it 3 arguments:
1. the start date/time range
2. the end date/time range
3. the log file to parse
then you'd do something like this to call it:
Code:
ssh server /tmp/parse-log.pl 'Sep 10, 2012 5:20:41 PM' 'Sep 10, 2012 5:44:42 PM' /path/to/CDSserver.log
and here is the parse-log.pl script:
Code:
#!/usr/bin/perl
use strict;
use warnings;
use Date::Parse;

# get command line arguments (3)
die " Usage: $0 '<start time>' '<stop time>' <log file>
E.g.: $0 'Sep 10, 2012 5:20:41 PM' 'Sep 10, 2012 5:49:40 PM' CDSServer.log\n"
unless($#ARGV == 2);

my $startTime = $ARGV[0];
my $stopTime = $ARGV[1];
my $log = $ARGV[2];

# make sure the log file exists
die "$log: No such file\n" unless(-f $log);

# convert date/time strings to seconds since epoch
my $start_sec = str2time($startTime);
my $stop_sec = str2time($stopTime);

print "Start time: $startTime ($start_sec)\n";
print "Stop time: $stopTime ($stop_sec)\n";

open(LOG,'<',$log) or die "can't read '$log': $!\n";
while(<LOG>){
chomp;
my $line = $_; # save original line
s/[ \t]+/ /; # replace contiguous white spaces w/single space
if(/^####<([a-zA-Z]{3} [0-9]{1,2}, [0-9]{4} [0-9]{1,2}:[0-9]{2}:[0-9]{2} [AP]M) [A-Z]{3}>/){
my $timedate = $1;

# convert date/time string in log entro to epoch seconds
my $seconds = str2time($timedate);

# print line if it falls into the range
print $line,"\n" if(($seconds >= $start_sec)&&($seconds <= $stop_sec));
}
}
close(LOG);

Reply (charith):

Hi atreyu, thank you very much for your great clean reply. The script is working fine and it gives the lines that match the given times, but what I need is all of the lines between that time range. I'm sorry for my bad log file format; the correct log file is attached below (some error lines do not begin with a timestamp):
Code:
####<...> <1345916502340> <[ACTIVE] ExecuteThread: '20' for queue: '...'>
java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:443)
at oracle.jdbc.driver.PhysicalConnection.(PhysicalConnection.java:670)
at oracle.jdbc.driver.T4CConnection.(T4CConnection.java:230)
The java.sql.SQLRecoverableException: IO Error: part above should be retrieved as well.

Reply (atreyu):

Ah, okay. That changes things, but not by too much. Basically, you can just set a marker once the start time string is matched, then set a stop marker once the end time string is matched, and save everything in between to an array. Then print the array once you're done looping through the file.
Code:
#!/usr/bin/perl
use strict;
use warnings;
use Date::Parse;

# get command line arguments (3)
die " Usage: $0 '<start time>' '<stop time>' <log file>
E.g.: $0 'Sep 10, 2012 5:20:41 PM' 'Sep 10, 2012 5:49:40 PM' CDSServer.log\n"
unless($#ARGV == 2);

my $startTime = $ARGV[0];
my $stopTime = $ARGV[1];
my $log = $ARGV[2];

# make sure the log file exists
die "$log: No such file\n" unless(-f $log);

# convert date/time strings to seconds since epoch
my $start_sec = str2time($startTime);
my $stop_sec = str2time($stopTime);

# make sure we got nothing but digits in the variables
die "Failed to convert $startTime to seconds\n" unless($start_sec =~ /^[0-9]*$/);
die "Failed to convert $stopTime to seconds\n" unless($stop_sec =~ /^[0-9]*$/);

print "Start time: $startTime ($start_sec)\n";
print "Stop time: $stopTime ($stop_sec)\n";

my @lines;
my $stop;

open(LOG,'<',$log) or die "can't read '$log': $!\n";
while(<LOG>){
chomp;
my $line = $_; # save original line
s/[ \t]+/ /; # replace contiguous white spaces w/single space
if(/^####<([a-zA-Z]{3} [0-9]{1,2}, [0-9]{4} [0-9]{1,2}:[0-9]{2}:[0-9]{2} [AP]M) [A-Z]{3}>/){
my $timedate = $1;

# convert date/time string in log entro to epoch seconds
my $seconds = str2time($timedate);

# save line w/time string to array if it falls into the range
if($seconds >= $start_sec){
push(@lines,$line) unless($stop);
}elsif($seconds >= $stop_sec){
push(@lines,$line) unless($stop);
$stop = 1;
}
}else{
# save line w/o time string to array if it falls into the range
push(@lines,$line) if(($#lines>=0)&&!($stop));
}
}
close(LOG);

# print the saved lines
print "$_\n" for(@lines);
Reply (charith):

Hi atreyu,
It's working fine, thank you very much.

Did small change:
Code:
# save line w/time string to array if it falls into the range
if($seconds >= $start_sec){
as
Code:
if (($seconds >= $start_sec)&&($seconds <= $stop_sec)){

shell string comparison tests

Here are the operators for performing string comparison tests:

s1 Test if s1 is not the empty string
s1 = s2 Test if s1 equals s2
s1 != s2 Test if s1 is not equal to s2
-n s1 Test if s1 has non-zero size
-z s1 Test if s1 has zero size
Here’s an example of how to see if two strings are equal:

if [ $foo = $bar ]
then
# do something
fi
This script echoes TRUE:

s1=

if [ -n $s1 ]
then
echo "TRUE"
else
echo "FALSE"
fi

(It echoes TRUE because $s1 is unquoted: the empty value vanishes and the test collapses to [ -n ], which is true. Quote the variable, [ -n "$s1" ], and the same script echoes FALSE.)
This script echoes FALSE:

s1=bar

if [ -z "$s1" ]
then
echo "TRUE"
else
echo "FALSE"
fi

Linux shell script math/number equality tests
Here’s how you perform math/number/arithmetic tests using the Bourne and Bash shells:

n1 -eq n2 Test if n1 equals n2
n1 -ne n2 Test if n1 is not equal to n2
n1 -lt n2 Test if n1 is less than n2
n1 -le n2 Test if n1 is less than or equal to n2
n1 -gt n2 Test if n1 is greater than n2
n1 -ge n2 Test if n1 is greater than or equal to n2
Here’s an example of how to test whether two numbers are equal:

if [ $n1 -eq $n2 ]
then
# do something
fi
Linux shell boolean and/or/not operators
The following boolean and/or/not operators can also be used in your tests:

-a and
-o or
! not
Here's an example of how to perform a test using the and operator:

if [ $num -gt 0 -a $num -lt 10 ]
then
# do something here
fi


a=5
b=20

if test \( $a -gt 0 -a $a -lt 10 \) -o \( $b -gt 0 -a $b -lt 20 \)
then
echo "TRUE"
else
echo "FALSE"
fi
That script echoes "TRUE".

Linux Bourne shell if, then, else, else if (elif) syntax
One thing that varies from one programming language to another is the if / then / else / else if / elseif syntax. In the case of the Bourne shell, the “else if” keyword is actually “elif”, so a sample Bourne shell if then else if statement looks like this:

if [ -e 'foo' ]
then
echo "if was true"
elif [ -e 'bar' ]
then
echo "elif was true"
else
echo "came down to else"
fi

Linux Bourne shell arithmetic
In the Bourne shell math/arithmetic is performed using the expr command, like this:

sum=`expr $foo + $bar`
half=`expr $foo / 2`
times=`expr $foo \* 2`

# increment a counter (bash syntax, not original Bourne)
(( count++ ))
Note that you can’t have any spaces before or after the equal sign in those (or any) shell script assignment statements.
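A small sketch contrasting the two forms mentioned above (the values are arbitrary):

foo=10
bar=3
sum=`expr $foo + $bar`      # Bourne-style, external expr command
sum2=$(( foo + bar ))       # bash arithmetic expansion, same result
echo $sum $sum2             # prints: 13 13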

A few other common Linux shell tricks
Here are a few other tricks/techniques you will often see in Unix shell scripts:

cmd1 && cmd2 Run cmd1; if it returns 0 (success), run cmd2
cmd1 || cmd2 Run cmd1; if it returns non-zero, run cmd2
cmd1 & cmd2 Run cmd1 in the background, then run cmd2
(ls -1) Run the command “ls -1” in a subshell
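For example (a sketch; the directory name is a placeholder):

mkdir /tmp/reports && echo "created" || echo "could not create /tmp/reports"
(cd /tmp/reports && ls -1)      # the cd happens only inside the subshell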

ls

I always find it difficult to digest the file size from the ls -al command. For instance, after ls -al, the output gives me the file size in bytes.

Gosh, then I have to start calculating it by taking the last digits and slowly counting upwards: 1K, 10K, 100K, 1MB, 10MB, 100MB and so on.

For instance:

this output:
-rw-r--r-- 1 walrus dba 137207094 Jul 8 23:12 config.2008032519.s

137207094 is how much?

going with my method of counting upwards, it gives me 137MB roughly.

Is it correct? Not quite.

ls reports the size in bytes, and the binary units go up in steps of 1024: 1 KB = 1024 bytes, 1 MB = 1024 KB and so on, so the quick decimal estimate overshoots a little.

[ Source ]
1 bit = a 1 or 0 (b)
4 bits = 1 nybble
8 bits = 1 byte (B)
1024 bytes = 1 Kilobyte (KB)
1024 Kilobytes = 1 Megabyte (MB)
1024 Megabytes = 1 Gigabyte (GB)
1024 Gigabytes = 1 Terabyte (TB)

The correct calculation is 137207094 / 1024 (bytes per KB) / 1024 (KB per MB) = 130.8 MB

Starting in Solaris 10, we have a new option in ls command.

: /u01/apps/WatchMark/FlexPM/classic/vendor/Lucent/ECP/ftpIN/in>uname -a
SunOS lxserver 5.10 Generic_118833-36 sun4u sparc SUNW,Sun-Fire-V445

ANCIENT WAY:
ls -al
total 270388
drwxr-xr-x 2 walrus dba 1024 Jul 8 23:14 .
drwxr-xr-x 11 walrus dba 512 Jun 17 01:49 ..
-rw-r--r-- 1 walrus dba 137207094 Jul 8 23:12 config.2008032519.s
-rw-r--r-- 1 walrus dba 451989 Jul 8 23:12 config.2008032519.split0.bz

cons: hard to read filesize and output distorted

NEW WAY:
ls -alh
total 269060
drwxr-xr-x 2 flexpm dba 1.0K Jul 8 23:12 .
drwxr-xr-x 11 flexpm dba 512 Jun 17 01:49 ..
-rw-r--r-- 1 flexpm dba 131M Jul 8 23:12 config.2008032519.s
-rw-r--r-- 1 flexpm dba 441K Jul 8 23:12 config.2008032519.split0.sm.gz

pros:
– more readable format in terms of file size
– contents are properly aligned.

cons:
– need to type extra ‘h’ at the end of ls command

find shell

Find files larger than 100MB…

find . -size +100000000c -ls
Old Files
Find files last modified over 30 days ago…

find . -type f -mtime +30 -ls
Find files last modified over 365 days ago…

find . -type f -mtime +365 -ls
Find files last accessed over 30 days ago…

find . -type f -atime +30 -ls
Find files last accessed over 365 days ago…

find . -type f -atime +365 -ls
Find Recently Updated Files
There have been instances where a runaway process is seemingly using up any and all space left on a partition. Finding the culprit file is always useful.

If the file is being updated at the current time then we can use find to find files modified in the last day…

find . -type f -mtime -1 -ls
Better still, if we know a file is being written to now, we can touch a file and ask the find command to list any files updated after the timestamp of that file, which will logically then list the rogue file in question.

touch testfile
find . -type f -newer testfile -ls
Finding tar Files
A clean up of redundant tar (backup) files, after completing a piece of work say, is sometimes forgotten. Conversely, if tar files are needed, they can be identified and duly compressed (using compress or gzip) if not already done so, to help save space. Either way, the following lists all tar files for review.

find . -type f -name "*.tar" -ls
find . -type f -name "*.tar.Z" -ls
Large Directories
List, in order, the largest sub-directories (units are in Kb)…

du -sk * | sort -n
Sometimes it is useful to then cd into that suspect directory and re-run the du command until the large files are found.

Removing Files using Find
The above find commands can be edited to remove the files found rather than list them. The "-ls" switch can be changed to "-exec rm {} \;".

e.g.

find . -type f -mtime +365 -exec rm {} \;
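In the same spirit, the tar files found earlier can be compressed rather than removed (a sketch; the 365-day cutoff is arbitrary):

find . -type f -name "*.tar" -mtime +365 -exec gzip {} \;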

Shell good reference

The Bourne shell (/bin/sh) is present on all Unix installations and scripts written in this language are (quite) portable; man 1 sh is a good reference.
Basics

Variables and arguments

Assign with variable=value and get content with $variable
MESSAGE="Hello World" # Assign a string
PI=3.1415 # Assign a decimal number
N=8
TWON=`expr $N \* 2` # Arithmetic expression (only integers)
TWON=$(($N * 2)) # Other syntax
TWOPI=`echo "$PI * 2" | bc -l` # Use bc for floating point operations
ZERO=`echo "c($PI/4)-sqrt(2)/2" | bc -l`
The command line arguments are
$0, $1, $2, … # $0 is the command itself
$# # The number of arguments
$* # All arguments (also $@)
Special Variables

$$ # The current process ID
$? # exit status of last command
command
if [ $? != 0 ]; then
echo “command failed”
fi
mypath=`pwd`
mypath=${mypath}/file.txt
echo ${mypath##*/} # Display the filename only
echo ${mypath%%.*} # Full path without extension
foo=/tmp/my.dir/filename.tar.gz
path=${foo%/*} # Directory part of the path (like dirname)
var2=${var:=string} # Use var if set, otherwise use string
# assign string to var and then to var2.
size=$(stat -c%s "$file") # get file size in bourne script
filesize=${size:=-1}
Constructs

for file in `ls`
do
echo $file
done

count=0
while [ $count -lt 5 ]; do
echo $count
sleep 1
count=$(($count + 1))
done

myfunction() {
find . -type f -name "*.$1" -print # $1 is first argument of the function
}
myfunction "txt"
Generate a file

MYHOME=/home/colin
cat > testhome.sh << _EOF
# All of this goes into the file testhome.sh
if [ -d "$MYHOME" ] ; then
echo $MYHOME exists
else
echo $MYHOME does not exist
fi
_EOF
sh testhome.sh
Bourne script example

As a small example, the script used to create a PDF booklet from this xhtml document:
#!/bin/sh
# This script creates a book in pdf format ready to print on a duplex printer
if [ $# -ne 1 ]; then # Check the argument
echo 1>&2 "Usage: $0 HtmlFile"
exit 1 # non zero exit if error
fi

file=$1 # Assign the filename
fname=${file%.*} # Get the name of the file only
fext=${file#*.} # Get the extension of the file

prince $file -o $fname.pdf # from www.princexml.com
pdftops -paper A4 -noshrink $fname.pdf $fname.ps # create postscript booklet
cat $fname.ps |psbook|psnup -Pa4 -2 |pstops -b "2:0,1U(21cm,29.7cm)" > $fname.book.ps

ps2pdf13 -sPAPERSIZE=a4 -sAutoRotatePages=None $fname.book.ps $fname.book.pdf
# use #a4 and #None on Windows!
exit 0 # exit 0 means successful
Some awk commands

Awk is useful for field stripping, like cut in a more powerful way. Search this document for other examples. See for example gnulamp.com and one-liners for awk for some nice examples.
awk '{ print $2, $1 }' file # Print and inverse first two columns
awk '{printf("%5d : %s\n", NR,$0)}' file # Add line number left aligned
awk '{print FNR "\t" $0}' files # Add line number right aligned
awk NF test.txt # remove blank lines (same as grep '.')
awk 'length > 80' # print lines longer than 80 chars
Some sed commands

Here is the one liner gold mine: http://student.northpark.edu/pemente/sed/sed1line.txt. And a good introduction and tutorial to sed: http://www.grymoire.com/Unix/Sed.html.
sed 's/string1/string2/g' # Replace string1 with string2
sed -i 's/wroong/wrong/g' *.txt # Replace a recurring word with g
sed 's/\(.*\)1/\12/g' # Modify anystring1 to anystring2
sed '/<p>/,/<\/p>/d' t.xhtml # Delete lines that start with <p> and end with </p>
sed '/ *#/d; /^ *$/d' # Remove comments and blank lines
sed 's/[ \t]*$//' # Remove trailing spaces (use tab as \t)
sed 's/^[ \t]*//;s/[ \t]*$//' # Remove leading and trailing spaces
sed 's/[^*]/[&]/' # Enclose first char with [] top->[t]op
sed = file | sed 'N;s/\n/\t/' > file.num # Number lines on a file
Regular Expressions

Some basic regular expressions, useful for sed too. See Basic Regex Syntax (http://www.regular-expressions.info/reference.html) for a good primer.
[\^$.|?*+() # special characters any other will match themselves
\ # escapes special characters and treat as literal
* # repeat the previous item zero or more times
. # single character except line break characters
.* # match zero or more characters
^ # match at the start of a line/string
$ # match at the end of a line/string
.$ # match a single character at the end of line/string
^ $ # match line with a single space
[^A-Z] # match any single character not in the range A to Z
Some useful commands

The following commands are useful to include in a script or as one liners.
sort -t. -k1,1n -k2,2n -k3,3n -k4,4n # Sort IPv4 ip addresses
echo 'Test' | tr '[:lower:]' '[:upper:]' # Case conversion
echo foo.bar | cut -d . -f 1 # Returns foo
PID=$(ps | grep script.sh | grep bin | awk '{print $1}') # PID of a running script
PID=$(ps axww | grep [p]ing | awk '{print $1}') # PID of ping (w/o grep pid)
IP=$(ifconfig $INTERFACE | sed '/.*inet addr:/!d;s///;s/ .*//') # Linux
IP=$(ifconfig $INTERFACE | sed '/.*inet /!d;s///;s/ .*//') # FreeBSD
if [ `diff file1 file2 | wc -l` != 0 ]; then [...] fi # File changed?
cat /etc/master.passwd | grep -v root | grep -v \*: | awk -F":" \ # Create http passwd
'{ printf("%s:%s\n", $1, $2) }' > /usr/local/etc/apache2/passwd

testuser=$(cat /usr/local/etc/apache2/passwd | grep -v \ # Check user in passwd
root | grep -v \*: | awk -F":" '{ printf("%s\n", $1) }' | grep ^user$)
:(){ :|:& };: # bash fork bomb. Will kill your machine
tail +2 file > file2 # remove the first line from file
I use this little trick to change the file extension for many files at once. For example from .cxx to .cpp. Test it first without the | sh at the end. You can also do this with the command rename if installed. Or with bash builtins.
# ls *.cxx | awk -F. '{print "mv "$0" "$1".cpp"}' | sh
# ls *.c | sed "s/.*/cp & &.$(date "+%Y%m%d")/" | sh # e.g. copy *.c to *.c.20080401
# rename .cxx .cpp *.cxx # Rename all .cxx to cpp
# for i in *.cxx; do mv $i ${i%%.cxx}.cpp; done # with bash builtins

Bourne Shell Reference:

This file contains short tables of commonly used items in this shell. In
most cases the information applies to both the Bourne shell (sh) and the
newer bash shell.

Tests (for ifs and loops) are done with [ ] or with the test command.

Checking files:

-r file Check if file is readable.
-w file Check if file is writable.
-x file Check if we have execute access to file.
-f file Check if file is an ordinary file (as opposed to a directory,
a device special file, etc.)
-s file Check if file has size greater than 0.
-d file Check if file is a directory.
-e file Check if file exists. Is true even if file is a directory.

Example:
if [ -s file ]
then
such and such
fi

Checking strings:

s1 = s2 Check if s1 equals s2.
s1 != s2 Check if s1 is not equal to s2.
-z s1 Check if s1 has size 0.
-n s1 Check if s1 has nonzero size.
s1 Check if s1 is not the empty string.

Example:
if [ $myvar = “hello” ]
then
echo “We have a match”
fi

Checking numbers:

Note that a shell variable could contain a string that represents a number.
If you want to check the numerical value use one of the following:

n1 -eq n2 Check to see if n1 equals n2.
n1 -ne n2 Check to see if n1 is not equal to n2.
n1 -lt n2 Check to see if n1 < n2.
n1 -le n2 Check to see if n1 <= n2.
n1 -gt n2 Check to see if n1 > n2.
n1 -ge n2 Check to see if n1 >= n2.

Example:
if [ $# -gt 1 ]
then
echo “ERROR: should have 0 or 1 command-line parameters”
fi

Boolean operators:

! not
-a and
-o or

Example:
if [ $num -lt 10 -o $num -gt 100 ]
then
echo “Number $num is out of range”
elif [ ! -w $filename ]
then
echo “Cannot write to $filename”
fi

Note that ifs can be nested. For example:
if [ $myvar = “y” ]
then
echo “Enter count of number of items”
read num
if [ $num -le 0 ]
then
echo “Invalid count of $num was given”
else
… do whatever …
fi
fi

The above example also illustrates the use of read to read a string from
the keyboard and place it into a shell variable. Also note that most UNIX
commands return an exit status of 0 (success, i.e. true) or nonzero (failure, i.e. false)
in the shell variable $? to indicate whether they succeeded or not. This return value can be checked.
At the command line, echo $?. In a shell script use something like this:

if grep -q shell bshellref
then
echo “true”
else
echo “false”
fi

Note that -q is the quiet version of grep. It just checks whether it is true
that the string shell occurs in the file bshellref. It does not print the
matching lines like grep would otherwise do.

I/O Redirection:

pgm > file Output of pgm is redirected to file.
pgm < file Program pgm reads its input from file.
pgm >> file Output of pgm is appended to file.
pgm1 | pgm2 Output of pgm1 is piped into pgm2 as the input to pgm2.
n > file Output from stream with descriptor n redirected to file.
n >> file Output from stream with descriptor n appended to file.
n >& m Merge output from stream n with stream m.
n <& m Merge input from stream n with stream m.
<< tag Standard input comes from here through next tag at start of line.

Note that file descriptor 0 is normally standard input, 1 is standard output,
and 2 is standard error output.
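For example, to capture both standard output and standard error of a command in one file, and to feed a command from a here-document (a small sketch; the file names are placeholders):

make > build.log 2>&1      # stderr (2) is merged into stdout (1), both go to build.log
sort << END > sorted.txt   # standard input comes from the here-document
banana
apple
END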

Shell Built-in Variables:

$0 Name of this shell script itself.
$1 Value of first command line parameter (similarly $2, $3, etc)
$# In a shell script, the number of command line parameters.
$* All of the command line parameters.
$- Options given to the shell.
$? Return the exit status of the last command.
$$ Process id of script (really id of the shell running the script)

Pattern Matching:

* Matches 0 or more characters.
? Matches 1 character.
[AaBbCc] Example: matches any 1 char from the list.
[^RGB] Example: matches any 1 char not in the list.
[a-g] Example: matches any 1 char from this range.

Quoting:
\c Take character c literally.
`cmd` Run cmd and replace it in the line of code with its output.
"whatever" Take whatever literally, after first interpreting $, `...`, \
'whatever' Take whatever absolutely literally.

Example:
match=`ls *.bak` Puts names of .bak files into shell variable match.
echo \* Echos * to screen, not all filename as in: echo *
echo '$1$2hello' Writes literally $1$2hello on screen.
echo "$1$2hello" Writes value of parameters 1 and 2 and string hello.

Grouping:

Parentheses may be used for grouping, but must be preceded by backslashes
since parentheses normally have a different meaning to the shell (namely
to run a command or commands in a subshell). For example, you might use:

if test \( -r $file1 -a -r $file2 \) -o \( -r $1 -a -r $2 \)
then
do whatever
fi

Case statement:

Here is an example that looks for a match with one of the characters a, b, c.
If $1 fails to match these, it always matches the * case. A case statement
can also use more advanced pattern matching.

case “$1” in
a) cmd1 ;;
b) cmd2 ;;
c) cmd3 ;;
*) cmd4 ;;
esac

Shell Arithmetic:

In the original Bourne shell arithmetic is done using the expr command as in:
result=`expr $1 + 2`
result2=`expr $2 + $1 / 2`
result=`expr $2 \* 5` (note the \ on the * symbol)

With bash, an expression is normally enclosed using [ ] and can use the
following operators, in order of precedence:
* / % (times, divide, remainder)
+ – (add, subtract)
< > <= >= (the obvious comparison operators)
== != (equal to, not equal to)
&& (logical and)
|| (logical or)
= (assignment)
Arithmetic is done using long integers.

Example:
result=$[$1 + 3]

In this example we take the value of the first parameter, add 3, and place
the sum into result.
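The $(( )) form does the same thing and is the more common spelling in current bash; a small sketch:

result=$(( $1 + 3 ))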

Order of Interpretation:

The bash shell carries out its various types of interpretation for each line
in the following order:

brace expansion (see a reference book)
~ expansion (for login ids)
parameters (such as $1)
variables (such as $var)
command substitution (Example: match=`grep DNS *` )
arithmetic (from left to right)
word splitting
pathname expansion (using *, ?, and [abc] )

Other Shell Features:

$var Value of shell variable var.
${var}abc Example: value of shell variable var with string abc appended.
# At start of line, indicates a comment.
var=value Assign the string value to shell variable var.
cmd1 && cmd2 Run cmd1, then if cmd1 successful run cmd2, otherwise skip.
cmd1 || cmd2 Run cmd1, then if cmd1 not successful run cmd2, otherwise skip.
cmd1; cmd2 Do cmd1 and then cmd2.
cmd1 & cmd2 Do cmd1, start cmd2 without waiting for cmd1 to finish.
(cmds) Run cmds (commands) in a subshell.

See a good reference book for information on traps, signals, exporting of
variables, functions, eval, source, etc.
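Traps are not described above; as a minimal sketch of the idea (removing a temporary file whenever the script exits):

tmpfile=/tmp/work.$$
trap 'rm -f $tmpfile' 0     # signal 0 (EXIT) fires when the script ends
echo "intermediate data" > $tmpfile
# ... rest of the script ...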

Date and time calculation functions using AWK

awk has 3 functions to calculate date and time:
systime
strftime
mktime
Let us see in this article how to use these functions:

systime:
This function is equivalent to the Unix date (date +%s) command. It gives the Unix time, total number of seconds elapsed since the epoch(01-01-1970 00:00:00).
$ echo | awk '{print systime();}'
1358146640
Note: systime function does not take any arguments.

strftime:
A very common function used in gawk to format the systime into a calendar format. Using this function, from the systime, the year, month, date, hours, mins and seconds can be separated.

Syntax:
strftime(format specifier, unix time);
1. Printing current date time using strftime:
$ echo | awk '{print strftime("%d-%m-%y %H-%M-%S",systime());}'
14-01-13 12-37-45
strftime takes format specifiers which are same as the format specifiers available with the date command. %d for date, %m for month number (1 to 12), %y for the 2 digit year number, %H for the hour in 24 hour format, %M for minutes and %S for seconds. In this way, strftime converts Unix time into a date string.

2. Display current date time using strftime without systime:
$ echo | awk '{print strftime("%d-%m-%y %H-%M-%S");}'
14-01-13 12-38-08
Both the arguments of strftime are optional. When the timestamp is not provided, it takes the systime by default.

3. strftime with no arguments:
$ echo | awk '{print strftime();}'
Mon Jan 14 12:30:05 IST 2013
strftime without the format specifiers provides the output in the default output format as the Unix date command.

mktime:
mktime function converts any given date time string into a Unix time, which is of the systime format.
Syntax:
mktime(date time string) # where date time string is a string which contains at least 6 components in the following order: YYYY MM DD HH MM SS

1. Printing timestamp for a specific date time :
$ echo | awk '{print mktime("2012 12 21 0 0 0");}'
1356028200
This gives the Unix time for the date 21-Dec-12.

2. Using strftime with mktime:
$ echo | awk '{print strftime("%d-%m-%Y",mktime("2012 12 21 0 0 0"));}'
21-12-2012
The output of mktime can be validated by formatting the mktime output using the strftime function as above.

3. Negative date in mktime:
$ echo | awk '{print strftime("%d-%m-%Y",mktime("2012 12 -1 0 0 0"));}'
29-11-2012
mktime can take negative values as well. -1 in the date position indicates one day before the date specified which in this case leads to 29th Nov 2012.

4. Negative hour value in mktime:
$ echo | awk '{print strftime("%d-%m-%Y %H-%M-%S",mktime("2012 12 3 -2 0 0"));}'
02-12-2012 22-00-00
-2 in the hours position indicates 2 hours before the specified date time which in this case leads to “2-Dec-2012 22” hours.

$ cat tst.awk
BEGIN {
ARGV[ARGC++] = ARGV[ARGC-1]

mths = "JanFebMarAprMayJunJulAugSepOctNovDec"

if (days) { hours = days * 24 }
if (hours) { mins = hours * 60 }
if (mins) { secs = mins * 60 }
deltaSecs = secs
}

NR==FNR {
nr2secs[NR] = mktime($6" "(match(mths,$5)+2)/3" "$4" "gensub(/:/," ","g",$7))
next
}

nr2secs[FNR] >= (nr2secs[NR-FNR] – deltaSecs)

$ awk -v hours=1 -f tst.awk file
157.55.34.99 – – 06 Sep 2013 09:13:10 +0300 “GET /index.php HTTP/1.1” 200 16977 “-” “Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)”
85.163.134.149 – – 06 Sep 2013 09:50:23 +0300 “GET /wap/wapicons/mnrwap.jpg HTTP/1.1” 200 1217 “http://mydomain.com/main.php” “Mozilla/5.0 (Linux; U; Android 4.1.2; en-gb; GT-I9082 Build/JZO54K) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30”
83.113.48.218 – – 06 Sep 2013 10:13:07 +0300 “GET /english/nicons/word.gif HTTP/1.1” 200 803 “http://mydomain.com/french/details.php?eid=127928&cid=18&fromval=1&frid=18” “Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0; Trident/5.0)”

$ gawk -v mins=60 -f tst.awk file
157.55.34.99 – – 06 Sep 2013 09:13:10 +0300 “GET /index.php HTTP/1.1” 200 16977 “-” “Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)”
85.163.134.149 – – 06 Sep 2013 09:50:23 +0300 “GET /wap/wapicons/mnrwap.jpg HTTP/1.1” 200 1217 “http://mydomain.com/main.php” “Mozilla/5.0 (Linux; U; Android 4.1.2; en-gb; GT-I9082 Build/JZO54K) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30”
83.113.48.218 – – 06 Sep 2013 10:13:07 +0300 “GET /english/nicons/word.gif HTTP/1.1” 200 803 “http://mydomain.com/french/details.php?eid=127928&cid=18&fromval=1&frid=18” “Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0; Trident/5.0)”

$ gawk -v mins=20 -f tst.awk file
83.113.48.218 – – 06 Sep 2013 10:13:07 +0300 “GET /english/nicons/word.gif HTTP/1.1” 200 803 “http://mydomain.com/french/details.php?eid=127928&cid=18&fromval=1&frid=18” “Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.0; Trident/5.0)”