Thursday, October 16, 2014

Shell Script - Short Learning for big future.

For a long time, I have wanted to write a condensed overview of the features available in shell scripting. Finally, after reading two books on shell scripting, I am trying to summarize the best features so that a developer who wants a quick cheat-sheet type of page can cover them in a minimum amount of time, find all the daily useful commands in one place, and remember them for a long time. Here is the summary in point form:

  1. cat /etc/shells    -> Lists the shells available.
  1. echo $SHELL   -> Shows the current shell.
  2. ps $$   -> Gives the PID of the shell you are running.
  3. type -a ls -> Tells whether ls is an external or built-in command, along with its path.
  4. Bash Startup Scripts
    a. /etc/profile - The system-wide initialization file, executed for login shells.
    b. $HOME/.bash_profile, $HOME/.bash_login, and $HOME/.profile run next when a user logs in, tried in that order (the first one found is used).
    c. $HOME/.bash_logout is invoked when you log out.
  5. apt-cache search shell -> To find the list of available shell packages under ubuntu
  6. which bash -> Will give you the full path of the shell command (try whereis -> this also gives the man-page location of the command).
  7. vi hello.sh -> Create a new file and write echo "hello world" in it. Execute chmod +x hello.sh and then ./hello.sh to execute the commands in it.
  1. #!/bin/bash -> This is the shebang that should be the first line of every script.
  1. You can also execute a script via "bash hello.sh".
  1. Debug mode: "bash -xv hello.sh"   or use set -x/+x , -v/+v, -n/+n in the script.
  1. env or printenv -> List all the environment variables.
  1. find /home -name "*.c" -> Will find all C files under /home (quote the pattern so the shell does not expand it).
  1. printf "${PATH}\n"   -> printf works much like C's printf; use it extensively.
  1. personName="Pankaj Bhatt"
    echo "Person name is ${personName}"
  1. lastName=${variable:-"bhatt"}    -> If variable is empty or unset it will substitute the value 'bhatt'.
  1. sleep 3  -> Halts execution for 3 seconds.
  1. echo *.conf -> Shows all the .conf files within the current directory.
  2. export varName -> This command is used to make a variable available to all child processes.
  3. export -p -> Lists all variables exported to this shell (export -n varName removes the export property).
  4. unset varName -> Will unset the variable.
  1. read -p "Enter your name : " name   -> Reads a value into a variable (note the space before the variable name).
  1. read also accepts -t 10 for a timeout (aborts if you don't type in time) and -s for silent input such as passwords.
  1. ans=$(( x + y ))  -> Does the mathematical computation for two variables; if x or y can't be converted to an INTEGER, it is treated as ZERO.
  1. declare -i num1=10 -> Declares an integer variable (assigning a non-number results in 0).
  1. readonly varName=value -> Used to create a constant (no spaces around =), OR
    declare -r varName=value
  1. varc=${varName:?Error varName is not defined or is empty} -> Checks whether varName is defined and non-empty; otherwise the script aborts and the remaining string is displayed as the error.
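The default-value and error-check expansions above can be sketched together in a tiny script (the variable names here are just illustrative):

```shell
#!/bin/bash
# ${var:-default} substitutes a default when var is unset or empty (no assignment)
lastName=${missingVar:-"bhatt"}
echo "lastName=$lastName"          # prints: lastName=bhatt

# ${var:=default} substitutes AND assigns the default
: "${city:="Delhi"}"
echo "city=$city"                  # prints: city=Delhi

# ${var:?message} aborts the script with the message when var is unset/empty
definedVar="hello"
echo "${definedVar:?will not trigger since definedVar is set}"
```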
  1. let sum=no1+no2 :- The let command can be used to do mathematical operations without using $ in front of variables. result=$[ no1 + no2 ] also works, though $(( )) is the preferred modern form.
  1. The bc command can be used for decimal math. E.g. echo "4 * 0.56" | bc
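A quick sketch combining the integer-arithmetic forms above with bc for decimals:

```shell
#!/bin/bash
x=7; y=3
echo $(( x + y ))          # preferred form, prints: 10
let sum=x+y                # let: no $ needed on the variables
echo $sum                  # prints: 10
echo $[ x + y ]            # old (deprecated) form, prints: 10
echo "4 * 0.56" | bc       # bc handles decimals, prints: 2.24
```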
  1. export JAVA_HOME=/usr/bin/java/jdk   -> This command will treat JAVA_HOME as an env variable going forward (no space after =).
  1. whatis ls -> This command gives a short description of the command from the man pages.
  1. history -> Used to see the history of commands you typed. CTRL-R searches the history while typing. !N executes command number N from the history.
  1. !! -> This will execute the last command you typed.
  1. A {} is used to expand a pattern (brace expansion). E.g. echo file{1,2,3}.txt will print -> file1.txt file2.txt file3.txt
  1. Wildcards (*, ?, [..]) work together with the above pattern.  E.g. ls *.{c,java}
  1. ls [ab]*.sh -> Lists only the files starting with a or b. [a-d]*.sh   -> Only filenames starting with a through d.
  1. ls -d */ -> Gives you a list of all the directories present in the current working directory.
  1. alias -> Lists all the aliases present in the session.
    1. alias ls='ls -lah' -> To create an alias for any command (no spaces around =).
    1. unalias ls -> To remove the alias.
      Aliases vanish with the session; to store them permanently, put them in the ${HOME}/.bashrc file.
  1. STARTUP Scripts
    1. /etc/profile -> System-wide profile file. Runs first when a user logs in.
    2. /etc/profile.d -> Directory containing the files that run SECOND when a user logs in.
    3. ${HOME}/.bash_profile -> Runs third when a user logs in. Internally this sources .bashrc in the HOME directory; use .bashrc for your own settings. Once done, run "source ~/.bashrc" (or start a new bash) to pick up the changes.
  1. The cat command is used to print file contents on the screen.
  1. PS1 is the variable that holds the command prompt you see. Set it in the .bashrc file, e.g. export PS1="\u@\h:\w$ " (\u user, \h host, \w working directory).
  1. CTRL-D is used to come out of the shell.
  1. Setting Shell options:-
    1. set -o : This will list all the shell options and their current state.
    1. set -o optionName :- This will enable a specific option from the above list.
    2. set +o optionName :- This will disable a specific option.
  1. shopt -> Lists additional shell options configured for the environment. Use "shopt -s name" to set one and "shopt -u name" to unset it.
  2. Setting Env Variables :- Modify ~/.bashrc and include a line like export VARNAME="value"
  3. System-Wide Shell Options: in the /etc/profile.d directory create a file java.sh (with execute permissions) and write the export commands in it. E.g.
    export JAVA_HOME=/usr/lib/java/jdk1.6u50
    export PATH=$PATH:$JAVA_HOME/bin
  4. The test command is used for checking file existence or testing values.
    test -f /bin/bash && echo "File Exists" || echo "File does not exist."      -> It will print "File Exists".
  5. if [ CONDITION ]; then ... else ... fi     (you can also use elif for else-if chains)
    if [ $myAge -gt 0 ]   -> This is also a valid condition (note the spaces inside the brackets).
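The if/elif/else form can be sketched with the numeric test operators:

```shell
#!/bin/bash
# if / elif / else with the numeric comparison operators
myAge=25
if [ "$myAge" -gt 60 ]; then
    echo "senior"
elif [ "$myAge" -gt 18 ]; then
    echo "adult"
else
    echo "minor"
fi
# prints: adult
```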
  1. Exit status of a command -> Use $? to get the status of the previously executed command.
  1. Connecting commands: with && the second command executes only when the first is successful.
    test ! -d /tmp/foo && { read -p "Directory /tmp/foo not found. Hit [Enter] to exit..." enter; exit 1; }
    || operator -> echo "ram here" >> /dev/null || echo "I M Executing"  -> the second executes only on failure of the first.
  1. Numeric comparisons: -ge, -eq, -lt, -le, -gt, -ne
  2. test -z -> True when the string is empty (length zero); test -n is the opposite.
  3. File operations with test :  -e : file exists (-a is a deprecated synonym), -b : block special file, -d : directory, -f : regular file, -s : file exists and is non-empty.
  4. Command line arguments: $0 (script name), $1, $2, $3, $4 etc.   $*/$@ -> Represent all command line arguments.
  5. Parameters set by the shell: $*, $@, $#, $-, $?, $$, $!   ($? -> the status of the last executed command)
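The positional parameters can be demonstrated with set --, which replaces them just as a command line would (normally they come from running e.g. ./script.sh one two three):

```shell
#!/bin/bash
# set -- simulates the command-line arguments
set -- one two three
echo "first arg: $1"    # prints: first arg: one
echo "arg count: $#"    # prints: arg count: 3
echo "all args:  $*"    # prints: all args:  one two three
true
echo "last status: $?"  # prints: last status: 0
```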
  1. exit N -> Where N represents the code that is being returned for the script execution status code. 0 means success
  1. case $val in
       "pankaj") echo "got it" ;;
       "abc") echo "This is the second choice" ;;
       *) echo "Default Case" ;;
     esac
    The pattern to be matched can contain glob pattern characters.
    You can use "shopt -s nocasematch" in the script to ignore case while matching in the case statement.
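The case statement and nocasematch can be sketched together:

```shell
#!/bin/bash
# With nocasematch set, the upper-case value still matches the lower-case pattern
shopt -s nocasematch
val="PANKAJ"
case $val in
    pankaj) echo "got it" ;;
    abc)    echo "second choice" ;;
    *)      echo "default case" ;;
esac
# prints: got it
```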
  2. for variable in LIST (the list can contain strings/numbers etc.) \n do \n cmd1;cmd2;cmd3;  done
    C-like for loops also work here:
    for (( i=1; i<=5; i++ ))
  1. while [ $n -eq 0 ] \n do \n cmd1; cmd2; cmd3 ; \n done
    while : \n do cmd1; cmd2; cmd3 \n done ->
    This is an infinite loop; (while true) behaves the same way.
  1. until [ condition ] \n do \n cmd1; cmd2; cmd3; \n done   -> Reverse of while: it keeps running until the condition becomes true.
  1. select var in list \n do \n cmd1;cmd2;cmd3; \n done -> Presents a numbered menu built from the list.
  1. break is used to exit the innermost loop, while break N exits N levels of loops.
  1. continue skips the rest of the current iteration and moves on to the next one.
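break and continue inside a loop can be sketched like this:

```shell
#!/bin/bash
# continue skips one iteration; break leaves the loop entirely
for i in 1 2 3 4 5; do
    [ "$i" -eq 3 ] && continue   # 3 is skipped
    [ "$i" -eq 5 ] && break      # loop ends before 5 is printed
    echo "i=$i"
done
# prints: i=1, i=2, i=4 (one per line)
```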
  2. $(ls) -> Executes the command enclosed in $(), and the result can be assigned to a variable.
  3. < :- input, > :- output, 2> :- error redirection, &> :- redirects both output and error to the same place, 2>> appends to the error log.
  4. /dev/null :- All data written to it is discarded by the system.
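The redirection operators above in one short sketch (the file paths are throwaway examples):

```shell
#!/bin/bash
# >, >>, 2> and &> in one place
echo "normal output" >  /tmp/redir_demo.log   # stdout, truncating
echo "appended line" >> /tmp/redir_demo.log   # stdout, appending
ls /no/such/path 2> /dev/null                 # the error message is discarded
ls /no/such/path &> /tmp/redir_both.log       # stdout AND stderr go to the file
cat /tmp/redir_demo.log                       # prints the two lines written above
rm -f /tmp/redir_demo.log /tmp/redir_both.log
```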
  1. bash -c "cd /home; ls -la"  -> Executes the inline script passed to the command.
  1. cat abc.txt > /dev/lp0 -> Prints the file to the printer.
  1. A file descriptor can be attached to an input or an output file: exec 3< abc.txt  OR  exec 3> abc.txt
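Reading through a custom descriptor opened with exec can be sketched as follows:

```shell
#!/bin/bash
# Attach descriptor 3 to a file, read through it, then close it
echo "first line" > /tmp/fd_demo.txt
exec 3< /tmp/fd_demo.txt    # fd 3 now reads from the file
read -u 3 line              # read one line via descriptor 3
echo "read: $line"          # prints: read: first line
exec 3<&-                   # close descriptor 3
rm -f /tmp/fd_demo.txt
```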
  2. Enclose multiple commands in a single block : { cd /home; ls -la ; dir ; }
  3. watch -n 5 "ls -la; df -h" -> Re-runs the command every 5 seconds. Press CTRL-C to exit.
  4. command & -> Puts the command in the background.
  1. Pipes are a way to send the output of one command to another. E.g. cat abc.txt | grep "ram"
  1. pstree -> Displays the tree of processes along with their children.
  1. pgrep -u pankaj,ram : Lists the processes of the users pankaj and ram on the system.
  2. pkill is a command to kill the processes belonging to a group/user.
  3. trap : the trap command catches signals (e.g. SIGTERM) and executes code in response.
  4. Functions can be created in shell scripts. As in other languages, arguments are accessible positionally, via $1, $2, ... ($0 remains the script name).
    Eg.
myFun()
{
echo "Hello, how things are" $1
}

echo "start calling the function"
myFun Argument1
A variable defined outside the function will have its value changed if assigned inside it. To declare a local variable within a function use "local nm=$1".
The return command can be used to return a status from the function; if no value is specified, the status of the last command is returned.
We can write all the functions in a single script and then include that file in our main script. E.g. if we have functions.sh in the home directory, include it via ". /home/pankaj/functions.sh". Then you can call every function present in that file. Alternatively, use the source command, e.g. "source /home/pankaj/functions.sh".
You can use the '&' operator to execute a function in the background (just like a normal shell command).
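local, return, and background execution can be sketched in one small function (add is a made-up example name):

```shell
#!/bin/bash
# local variables and return status inside a function
add() {
    local sum=$(( $1 + $2 ))   # local: not visible outside the function
    echo "$sum"                # the "value" comes back via stdout
    return 0                   # return only carries an exit status (0-255)
}
result=$(add 3 4)              # capture the echoed value
echo "result=$result"          # prints: result=7
add 1 1 > /dev/null &          # run the function in the background
wait
```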
  1. Subshells are shells within which parts of your script run inside the parent shell. You need to EXPORT every variable/function defined in the main shell for it to be accessible in the subshell.
  1. The exec command can be used to replace the current shell with the specified program without spawning a new subshell or process.
  1. The . (dot) command is used to run a script in the current shell, so that any parameters defined in the main shell remain accessible; otherwise they won't be visible to a script running in a subshell.
  2. A compound command is a way of encapsulating multiple commands under a single umbrella. All commands are executed and produce their output.
  3. You can also display dialog boxes while running your script. For this, "sudo apt-get install dialog", and then you can use it in a script.
    E.g.
    dialog --title "hello world" --msgbox "Hello World" 6 20
  4. The tty command is used to display the name of the device file attached to the terminal (i.e. connected to standard input).
  1. Colored output can be printed on the console. E.g. echo -e "\e[1;31m This is red text \e[0m"
  2. For every process in the system you can get the list of environment variables via cat /proc/{PID}/environ
  3. Get the length of a variable via "length=${#varName}"
  4. The tee command takes the data from the previous command, stores the output in a file and also shows it on the screen.
    E.g.
    echo "Ram teri ganga maili" | tee out.txt -> stores the contents in out.txt and also prints them to the screen.
  5. /dev/stdout -> refers to standard output, /dev/stdin -> to standard input, /dev/stderr -> to standard error, /dev/null -> discards everything.
  6. You can also create an array in shell scripts via arry_Name=("ram" "shyam") (elements are space-separated, not comma-separated); arry_Name[0]=100; echo ${arry_Name[$index]}
    To print all values of an array use ${arry_Name[*]} or ${arry_Name[@]}.
    An associative array allows a STRING to be used as the INDEX of the array. E.g. declare -A arry_Name
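Indexed and associative arrays can be sketched together (bash 4+ is assumed for declare -A):

```shell
#!/bin/bash
# Indexed arrays: elements are space-separated, indices start at 0
names=("ram" "shyam")
names[2]="mohan"
echo "first: ${names[0]}"     # prints: first: ram
echo "all:   ${names[@]}"     # prints: all:   ram shyam mohan
echo "count: ${#names[@]}"    # prints: count: 3

# Associative arrays: string keys via declare -A
declare -A ages
ages[ram]=30
echo "ram is ${ages[ram]}"    # prints: ram is 30
```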
  1. sleep 30 : a command used to pause execution for the specified number of seconds.
  1. The xargs command can be used to take input from STDIN and convert it into parameters passed to another command.
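xargs can be sketched with echo standing in for the real command, so nothing is actually deleted:

```shell
#!/bin/bash
# xargs turns stdin into arguments; echoing rm instead of running it keeps this a dry run
printf 'a.txt\nb.txt\n' | xargs echo rm -f
# prints: rm -f a.txt b.txt

# -n limits how many arguments go to each invocation
printf '1 2 3 4\n' | xargs -n 2 echo pair:
# prints: pair: 1 2
#         pair: 3 4
```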
  2. tr is the command used for translating (or deleting) characters.
  1. md5sum calculates the MD5 checksum of the files passed to it. sha1sum does the same for SHA-1.
  1. md5deep & sha1deep calculate the MD5 and SHA-1 checksums of a complete directory tree (similarly base64 is also available).
  1. The sort command sorts lines of text files. uniq is used with sort to drop duplicate lines.
  2. mktemp is a command to create a temp file in the /tmp directory. Use (-d) to create a temp directory.
  1. The split command breaks a file into multiple chunks. E.g. split -b 100k data.file : breaks the file into 100 KB chunks.
  1. csplit is an extended version of split, used to split files based upon certain conditions (like the presence of some text).
  2. rename is the command used to rename files from one pattern to another; you can use WILDCARD characters.
  3. The /usr/share/dict directory contains the dictionary. aspell is the command used to check words against it.
  4. dd is a command through which you can generate large files, e.g. 100M in size.
  1. /dev/zero is a special character device, which infinitely returns the zero byte (\0).
  1. comm is a command through which we can compare two sorted files and report their common and unique lines.
  1. chmod is the command used to change the file permissions for user/group/others.
    E.g.
    chmod u=rwx,g=rw,o=r filename
  2. The chown command is used to change the ownership of files. The format is: chown user:group filename
  3. /etc/resolv.conf is the file that contains the list of DNS servers.
  4. chattr is the command used to make a file immutable. E.g. chattr +i fileName (you can't even delete it; use chattr -i to reverse this).
  1. The touch command is used to create a new file or update the timestamp of an existing file to the current time.
  1. Symbolic links are created with "ln -s targetFileName symbolic_link_name".
  1. The file command is used to tell the type of a file. E.g. "file abc.sh"
  1. Loopback files can be used to create a file system out of a single file. You can mount the file at a specific mount point; see the Linux Shell Scripting Cookbook, page 124, for details. You can also create partitions inside loopback images and mount them separately, and you can mount ISO files as loopback files.
  1. The cdrecord command is used to burn an ISO onto a CD-ROM.
  1. The diff command is used to find the difference between two files/directories. A host of options let you customize the output.
  2. The pushd and popd commands push a directory onto a stack and cd into it automatically (popd returns). dirs is the command that lists the directories currently on the stack.
  3. wc is the command used to count the number of lines/words/characters in a text file.
  4. tree is the command used to print the tree of a directory hierarchy. It does not come by default; you have to install it.
  5. grep is the command used to search for text across multiple files. E.g. grep "pattern" file1 file2. egrep is an extended version of grep with extended regular expressions, the same as the "grep -E" option. "-o" prints only the part of the line that matches the pattern. "-c" counts matching lines. "-n" prints line numbers. "-R" searches recursively over a directory. "-i" ignores case. Use "-e" when there are multiple patterns to match. The "--include" option restricts which files are included in the search, e.g. --include *.{c,cpp}; "--exclude" excludes files. "-q" gives quiet output, so you can use the command's return status to check whether there were any matches.
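A few of the grep flags above, exercised on a throwaway file:

```shell
#!/bin/bash
# -c -i: case-insensitive line count; -n: line numbers; -o: matched text only; -q: quiet
printf 'Ram went home\nshyam stayed\nRAM returned\n' > /tmp/grep_demo.txt
grep -c -i "ram" /tmp/grep_demo.txt   # prints: 2
grep -n "shyam" /tmp/grep_demo.txt    # prints: 2:shyam stayed
grep -o "home" /tmp/grep_demo.txt     # prints: home
grep -q "stayed" /tmp/grep_demo.txt && echo "match found"
rm -f /tmp/grep_demo.txt
```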
  6. cut is the command used to extract text in a columnar fashion.
  1. sed is used for text replacement. E.g. sed 's/pattern/replace_string/' file. Use the -i option to save the changes back into the file after replacement. awk is used for advanced text processing.
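The substitution syntax can be sketched on a pipe, with and without the g (global) flag:

```shell
#!/bin/bash
# s/pattern/replacement/ changes the first match on each line; add g for all matches
echo "hello world, hello shell" | sed 's/hello/hi/'
# prints: hi world, hello shell
echo "hello world, hello shell" | sed 's/hello/hi/g'
# prints: hi world, hi shell
```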
  1. wget is a command used to download a file from the internet.
  1. lynx is a command through which you can get the TEXT content of a web page.
  2. The curl command is used to hit a web page/API and get the response. The "--cookie" option is used to specify cookies, e.g. --cookie "user=ram;a=b". -H is used to specify headers: curl -H "Host: www.slynux.org" -H "Accept-Language: en" URL. Use -I/--head to show only the response headers.
  3. tar -cvf output.tar file1 file2 file3 file4 file5  -> Archives all the files into a tar file.
    tar -xf output.tar  -C /path/to/extractdirectory -> Extracts all the files.
  4. gzip is used to zip a file while gunzip is used to unzip it. Similarly zip/unzip (-r is used for recursion).
  5. rsync is the command to keep two files/directories in sync. E.g. rsync -av /opt/jboss/data slynux@192.168.0.6:/home/backups/data
  1. To add a scheduled job in Linux, add a CRON-expression-based entry to the crontab file (crontab -e).
  1. ifconfig is the command used to show the network interfaces. It can also set a static IP address on a specific interface: ifconfig wlan0 192.168.0.80 netmask 255.255.255.0. To automatically get an IP from DHCP use "dhclient eth0".
  2. route, host, nslookup, traceroute (shows all hops to the destination).
  3. lsof -i & netstat are used for port analysis.
  4. df (disk free) / du (disk usage). du -a directory : lists the size of each file and then of the complete directory. du -c : shows the summed total of all files. --exclude is used to exclude some file/directory from the usage calculation.
  1. The time command can be used to measure the time taken by a command. E.g. "time ls -la". In addition, it can report the exit status, number of signals received, number of context switches made, etc.
  1. Users Details: who/w/users/last/

  1. logrotate is the command used to configure log rotation for files in any directory of the file system. The per-process configuration lives in /etc/logrotate.d.
  2. fsck is the command for a file system check. E.g. fsck /dev/sdb0. /etc/fstab is the file listing all the file systems.
  3. wall is a command used to write a message to the terminals of all users logged into the system.
  4. Gathering system information:  hostname / uname -n / uname -a / uname -r / uname -m
  5. cat /proc/cpuinfo ; cat /proc/meminfo ; cat /proc/partitions ; use lshw to get information about the complete hardware.

Thursday, September 25, 2014

Docker Summarized Overview - 5 mins food for Geeks

Docker:
For a long time, I was quite fascinated by the way Docker works, and I couldn't understand why it can't run on Windows directly (without installing boot2docker). This challenged me to learn it from the ground up, and I embarked on a journey into a great Linux component called LXC. Believe me, you will understand DOCKER very easily if you know how LXC works. Finally, I came to see that Docker is essentially a way of managing LXC-style containers in a better manner, with some cherries on the cake like the image repository.

Here are my summarized notes that anyone can learn from. I have tried to keep them crisp and short, just enough to give DOCKER a try and have at least a spoonful of the dish.

Docker installation:
$ sudo apt-get update
$ sudo apt-get install docker.io
$ sudo ln -sf /usr/bin/docker.io /usr/local/bin/docker
$ sudo sed -i '$acomplete -F _docker docker' /etc/bash_completion.d/docker.io


sudo docker search ubuntu  -> Search the registry for the keyword "ubuntu"
sudo docker pull ubuntu -> Pull the ubuntu images from the Docker registry
sudo docker images -> Show the list of all local images
sudo docker run -i -t ubuntu /bin/bash  -> Run an interactive bash shell in a container

# Start a new container
JOB=$(sudo docker run -d ubuntu /bin/sh -c "while true; do echo Hello world; sleep 1; done")
-> The above command starts the container detached (-d) and prints its ID; use "sudo docker ps" to see that it is running.
# Stop the container
docker stop $JOB
# Start the container
docker start $JOB
# Restart the container
docker restart $JOB
# SIGKILL a container
docker kill $JOB
# Remove a container
docker stop $JOB # Container must be stopped to remove it
docker rm $JOB

Installation of Shipyard. ->
Installation of apache/nginx ->
  1. sudo docker run -i -t -p 80:80 ubuntu /bin/bash  
    Now you will be inside the container's shell. Press CTRL-P then CTRL-Q afterwards to detach without stopping it.
  2. sudo docker ps
    This will list the container ID of the process.
  1. sudo docker attach 3edaace90360  
    This will take you back to the container's shell.
  2. sudo apt-get update
    As the image you have downloaded is an older one, you will need to update the package list.
  3. sudo apt-get install apache2
    This will install Apache inside the container.
  4. sudo service apache2 start
  5. sudo apt-get install curl
  6. curl http://localhost 
    It will show the complete Apache page.

 sudo docker commit 3edaace90360 panbhatt/pankaj_ubuntu_apache
This will store all the changes made in that container under this name.
sudo docker images
It will list the image just created under that name.
sudo docker login
This will prompt for the Docker username and password.
sudo docker push panbhatt/ubuntu-apache
This will push the image to the Docker repository.
sudo docker inspect 3edaace90360
This will list the complete settings of the container (e.g. the network ports being exposed).
sudo docker port 3edaace90360 80
This will show which host port, if any, is mapped to port 80 of the container.

You can also establish LINKING between containers; see section 2.3.1 of the Docker documentation.

Networking
sudo apt-get install bridge-utils   (to install brctl on Ubuntu)
sudo brctl show  (see how Docker establishes the docker0 bridge on Ubuntu)
sudo ifconfig docker0 -> To see the details of the bridge and the IP range handed out to containers.
You can also control the list of IP addresses to be given to containers; see section 2.4.
Docker can also allow or forbid communication between containers via the --icc parameter, which can be set true/false in the daemon configuration.

The Pipework project on GitHub is a great way to connect multiple containers.

You can make sure that your containers keep running by using process managers like systemd, upstart & supervisor. Check section 2.5 of the Docker documentation for this.

A shared directory, or VOLUME, is the best way to share directories from one container to another (even if the source container is not running at all).
You can also mount a HOST directory into a container. This allows the container to use any configuration or properties directory available on the host system.

DockerFile
  1. touch Dockerfile
  2. Write the following contents; see the documentation for greater detail.
     
# DOCKER-VERSION 1.0.1
FROM ubuntu:14.10

RUN sudo apt-get update

RUN sudo apt-get install -y nginx

RUN sudo apt-get install -y curl

RUN sudo service nginx restart

EXPOSE 80

  1. Build the image by running the following command:
    sudo docker build -t panbhatt/ubuntu_nginx_df .
    If it ends with "Successfully built fc09cc5af0c5",
    use "sudo docker images" to check that the image is listed under this name.
  2. Now run the image:
    sudo docker run -i -p 8000:80  -t panbhatt/ubuntu_nginx_df /bin/bash
    Now attach yourself to it and enjoy.


Wednesday, July 16, 2014

Java Currency Detail via Locale and Currency Code

Sometimes, while developing our APIs, we need to provide the details of a currency when formatting it: which symbol to use, whether there is a separator, the number of decimal places, and whether the symbol should go before or after the formatted number.

This can easily be achieved using the Java APIs rather than storing it in a database. Here is the Java code that generates all these details. It is very helpful when we want to return this information from a currency-related API.

// Source Code

            Locale locale = new Locale("ja", "JP");
            Currency currency = Currency.getInstance("JPY");
            boolean bPre = false;
            int ndx = -1;
            double price = 12345.67;
           
            DecimalFormatSymbols df = DecimalFormatSymbols.getInstance(locale)  ;
            df.setCurrency(currency);
            NumberFormat nF = NumberFormat.getCurrencyInstance(locale);
            nF.setCurrency(currency);
            System.out.println("CURRENCY SYMBOL   = " + df.getCurrencySymbol());
            System.out.println("DECIMAL SEPARATOR = " + df.getDecimalSeparator());
            System.out.println("GROUP SEPARATOR   = " + df.getGroupingSeparator());
            System.out.println("CURRENCY CODE     = " + df.getInternationalCurrencySymbol());
            System.out.println("DECIMAL PLACE     = " + nF.getMaximumFractionDigits());
            String sLP = ((DecimalFormat) nF).toLocalizedPattern();
            ndx = sLP.indexOf('\u00A4');  // currency sign
           
            if (ndx > 0) {
                bPre = false;
            } else {
                bPre = true;
            }
           
            System.out.println("CURRENCY PLACE BEFORE    = " + bPre);
            System.out.println("FORMATTED CURRENCY: " + nF.format(price));

// Output
//For vi_VN Locale and VND currency code.

CURRENCY SYMBOL   = đ
DECIMAL SEPARATOR = ,
GROUP SEPARATOR   = .
CURRENCY CODE     = VND
DECIMAL PLACE     = 0
CURRENCY PLACE BEFORE    = false
FORMATTED CURRENCY: 12.346 đ

// For India Locale    hi_IN and INR Currency Code
CURRENCY SYMBOL   = रू
DECIMAL SEPARATOR = .
GROUP SEPARATOR   = ,
CURRENCY CODE     = INR
DECIMAL PLACE     = 2
CURRENCY PLACE BEFORE    = true
FORMATTED CURRENCY: रू १२,३४५.६७

// For JAPAN Locale    ja_JP and JPY Currency Code
CURRENCY SYMBOL   =
DECIMAL SEPARATOR = .
GROUP SEPARATOR   = ,
CURRENCY CODE     = JPY
DECIMAL PLACE     = 0
CURRENCY PLACE BEFORE    = true

FORMATTED CURRENCY: 12,346


Monday, February 17, 2014

Groovy - The Groove of Java ( Groovy for Impatient)

Finally, after a long period of time, I have started writing again... I decided to give Groovy a chance, and it bounced back with a bang, as always. It is such a great language, yet it does not seem to get the attention it deserves. I would say that, if performance were not the only barometer, it could be a perfect replacement for Java.
Every feature provided by Groovy turns out to be a really nicely crafted addition on top of the solid foundations provided by Java. I am going to write a series of posts on Groovy (especially for impatient Java developers), so that they can add something sharp to their armoury.

I am starting with the Java 5 features and how they are implemented/enhanced in Groovy. I will provide a basic set of examples that will work on your system without any problem, assuming that Groovy has been installed, the GROOVY_HOME environment variable has been created, and the Groovy bin folder is present in the PATH of the operating system.


  1. Autoboxing and Unboxing:
    As Groovy supports metaprogramming, it provides automatic promotion/demotion of objects based upon the kind of usage. E.g. 
     int firstVariable = 100;  
     print firstVariable .getClass().name  
    
    
    This will print "java.lang.Integer": because the int is used like an object, the primitive has been promoted to an object. Prior to Groovy 2.0, primitives were always treated as objects, but from 2.0 onward further optimizations keep true primitives where possible.
  2. For-each loop: The for-each loop introduced in Java 5 requires us to declare the type of the array/collection element; this constraint has been removed in Groovy.
      String[] stringAr = [ " Ram ", "Shyam ", "Manoj" ]  
       for(str in stringAr)  
        print(str)  
    
    This will print " Ram Shyam Manoj" without any indication in the FOR loop of the TYPE of the str variable. Groovy identifies it at runtime, allowing the object's type (collection/array) to vary at runtime.
  3. Enum: As in Java, enums can be used in switch statements, but Groovy additionally allows multiple ENUM values in a single case statement, or a range of ENUM values. 
      enum NUM { ONE, TWO, THREE,FOUR, FIVE, SIX, SEVEN, EIGHT }  
           NUM myNumber = NUM.THREE  
           switch(myNumber) {  
               case [NUM.ONE, NUM.TWO]:  
                    print "You have entered either 1 or 2 "   
                    break  
               case NUM.ONE .. NUM.SEVEN:  
                    print "You entered somewhere between 1 and 7 "   
                    break  
                default :  
                    print "I dint know what you have entered"  
               }       
    
    This will print "You entered somewhere between 1 and 7 ", which is indeed an enhancement over the way Java handles switch cases with ENUMs.
  4. Variable Arguments:  Variable arguments can be provided in the same way as in Java; however, instead of sum(int a, int... b) we can also use sum(int a, int[] b), where b represents an array of integers. 
  5. Annotations: Groovy supports all the annotations provided by Java, however it adds up a number of specific annotations which are very much useful keeping an eye over the Dynamic nature of language e.g. @TypeChecked. I will explore all the supported annotations in the coming posts. 
  6. Static Import: All Static variables/functions can be imported with an advantage that an ALIAS can be created for the long name of function/class.
    E.g import static Math.random as RND
    Now we can use RND() in place of random()
  7. Generics: Groovy fully supports Java's generics, with the added advantage that the TYPE CHECKING of any operation can be deferred to runtime; Groovy takes care of converting the parameter to the declared generic type.
    E.g.
  List<String> list = new ArrayList<String>();  
       list.add("Pankaj");  
       list.add(5);  
       list.add(5.4);  
       print list  
 This code will print "[Pankaj, 5, 5.4]" while Java won't even compile it; Groovy automatically finds the best possible conversion to remove the error.

That's all for now, in the next post, we are going to cover the new set of annotations which are introduced in Groovy.


Sunday, July 29, 2012

Spring MVC WADL generation


As in other REST based frameworks in java, most of them provide an out of box support for genreation of WADL file ( web application description language). It's an XML file that is being composed of the description of all the resources that your REST based API is going to expose. This blog is a collection of the codes , that is being used in order to generate WADL through Spring. (Note: spring MVC does not provide an inbuilt way of generating this file and does not implements JSR-311 fully, so we can expect some mismatch here and there), however other frameworks like JERSEY (which is a full fledged REST implementation) provides complete support for this.

In order to generate application.wadl, we first have to understand the structure of a WADL (http://www.w3.org/Submission/wadl/wadl.xsd). This XSD contains a list of all the XML elements and attributes that can be present in a WADL file.
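As a rough sketch of what that structure looks like in practice (the paths, parameter names, and media type below are hypothetical; the namespace is the one used by the W3C submission), a generated WADL file resembles:

```xml
<application xmlns="http://wadl.dev.java.net/2009/02">
  <doc title="REST Service WADL"/>
  <resources base="http://blah.com/springservlet/">
    <resource path="/login/{id}">
      <method name="GET" id="getLoginData">
        <request>
          <param name="id" style="template" required="true"/>
          <param name="name" style="query" required="true"/>
        </request>
        <response status="200">
          <representation mediaType="application/xml"/>
        </response>
      </method>
    </resource>
  </resources>
</application>
```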

There are simply two steps to achieve the generation:
Step 1: Using the above XSD, generate the classes that will represent all the elements of the WADL XML file. The command is a simple one: download the XSD onto your local machine and run "xjc wadl.xsd", and you will get a number of Java files in the working directory. In case you want a specific package name for the generated files, xjc's "-p" option does that, e.g. "xjc -p com.mine.wadl.artifact wadl.xsd". I have generated the files into the package "com.mine.wadl.artifact" and renamed each file so that its name starts with Wadl.

Step 2: This step is all about writing a Spring controller that maps to the "application.wadl" path and generates the XML. We have to make sure that the JAXB marshaller (or whichever marshaller we are using) is on the classpath, as Spring will make use of it to generate the XML file. Here is the source code.
package com.mine.wadl.generator;

import java.lang.annotation.Annotation;
import java.lang.reflect.Method;
import java.util.Map;
import java.util.Set;

import javax.servlet.http.HttpServletRequest;
import javax.xml.namespace.QName;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.ResponseBody;
import org.springframework.web.method.HandlerMethod;
import org.springframework.web.servlet.mvc.condition.ProducesRequestCondition;
import org.springframework.web.servlet.mvc.method.RequestMappingInfo;
import org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping;

import com.mine.wadl.artifact.WadlApplication;
import com.mine.wadl.artifact.WadlDoc;
import com.mine.wadl.artifact.WadlMethod;
import com.mine.wadl.artifact.WadlParam;
import com.mine.wadl.artifact.WadlParamStyle;
import com.mine.wadl.artifact.WadlRepresentation;
import com.mine.wadl.artifact.WadlRequest;
import com.mine.wadl.artifact.WadlResource;
import com.mine.wadl.artifact.WadlResources;
import com.mine.wadl.artifact.WadlResponse;

/**
 * Type name: WadlController.java
 * Description: This class is responsible for generating the web application
 * description file based upon the request mappings registered with Spring MVC.
 *
 * @author Pankaj Bhatt.
 * @version 1.0, June 2012
 */

@Controller
@RequestMapping("/application.wadl")
public class WadlController {

 // @Autowired
 private RequestMappingHandlerMapping handlerMapping;

 /**
  * Constructor for initializing the Wadl Controller
  * 
  * @param handlerMapping
  */
 @Autowired
 public WadlController(RequestMappingHandlerMapping handlerMapping) {
  this.handlerMapping = handlerMapping;
 }


 /**
  * This is a function which will be responsible for generating the WADL
  * file.
  * 
  * @param request : Represents the Request
  * @return WadlApplication : This object will be converted to the WADL File.
  */
 @RequestMapping(method = RequestMethod.GET, produces = { "application/xml" })
 public @ResponseBody WadlApplication generateWadl(HttpServletRequest request) {
  WadlApplication result = new WadlApplication();
  WadlDoc doc = new WadlDoc();
  doc.setTitle("REST Service WADL");
  result.getDoc().add(doc);
  WadlResources wadResources = new WadlResources();
  wadResources.setBase(getBaseUrl(request));

  Map<RequestMappingInfo, HandlerMethod> handlerMethods = handlerMapping
    .getHandlerMethods();
  for (Map.Entry<RequestMappingInfo, HandlerMethod> entry : handlerMethods
    .entrySet()) {
   WadlResource wadlResource = new WadlResource();

   HandlerMethod handlerMethod = entry.getValue();
   RequestMappingInfo mappingInfo = entry.getKey();

   Set<String> pattern = mappingInfo.getPatternsCondition().getPatterns();
   Set<RequestMethod> httpMethods = mappingInfo.getMethodsCondition().getMethods();
   ProducesRequestCondition producesRequestCondition = mappingInfo
     .getProducesCondition();
   Set<MediaType> mediaTypes = producesRequestCondition
     .getProducibleMediaTypes();

   for (RequestMethod httpMethod : httpMethods) {
    WadlMethod wadlMethod = new WadlMethod();

    for (String uri : pattern) {
     wadlResource.setPath(uri);
    }

    wadlMethod.setName(httpMethod.name());
    Method javaMethod = handlerMethod.getMethod();
    wadlMethod.setId(javaMethod.getName());
    WadlDoc wadlDocMethod = new WadlDoc();
    wadlDocMethod.setTitle(javaMethod.getDeclaringClass().getName()+ "." + javaMethod.getName());
    wadlMethod.getDoc().add(wadlDocMethod);

    // Request
    WadlRequest wadlRequest = new WadlRequest();

    Annotation[][] annotations = javaMethod.getParameterAnnotations();
    Class[] paramTypes = javaMethod.getParameterTypes();
    int parameterCounter = 0;

    for (Annotation[] annotation : annotations) {
     for (Annotation annotation2 : annotation) {
      if (annotation2 instanceof RequestParam) {
       RequestParam param2 = (RequestParam) annotation2;

       WadlParam waldParam = new WadlParam();

       waldParam.setName(param2.value());

       waldParam.setStyle(WadlParamStyle.QUERY);
       waldParam.setRequired(param2.required());

       if (paramTypes != null
         && paramTypes.length > parameterCounter) {
        if (paramTypes.length > parameterCounter
          && (paramTypes[parameterCounter] == javax.servlet.http.HttpServletRequest.class || paramTypes[parameterCounter] == javax.servlet.http.HttpServletResponse.class))
         parameterCounter++;
        if (paramTypes.length > parameterCounter
          && (paramTypes[parameterCounter] == javax.servlet.http.HttpServletRequest.class || paramTypes[parameterCounter] == javax.servlet.http.HttpServletResponse.class))
         parameterCounter++;

        if (paramTypes.length > parameterCounter) {

         waldParam
           .setType(getQNameForType(paramTypes[parameterCounter]));
         parameterCounter++;
        }
       }

       String defaultValue = cleanDefault(param2
         .defaultValue());
       if (!defaultValue.equals("")) {
        waldParam.setDefault(defaultValue);
       }
       wadlRequest.getParam().add(waldParam);
      } else if (annotation2 instanceof PathVariable) {
       PathVariable param2 = (PathVariable) annotation2;

       WadlParam waldParam = new WadlParam();
       waldParam.setName(param2.value());
       waldParam.setStyle(WadlParamStyle.TEMPLATE);
       waldParam.setRequired(true);
       if (paramTypes != null
         && paramTypes.length > parameterCounter) {
        if (paramTypes.length > parameterCounter
          && (paramTypes[parameterCounter] == javax.servlet.http.HttpServletRequest.class || paramTypes[parameterCounter] == javax.servlet.http.HttpServletResponse.class))
         parameterCounter++;
        if (paramTypes.length > parameterCounter
          && (paramTypes[parameterCounter] == javax.servlet.http.HttpServletRequest.class || paramTypes[parameterCounter] == javax.servlet.http.HttpServletResponse.class))
         parameterCounter++;

        if (paramTypes.length > parameterCounter) {

         waldParam
           .setType(getQNameForType(paramTypes[parameterCounter]));
         parameterCounter++;
        }
       }

       wadlRequest.getParam().add(waldParam);
      } else
       parameterCounter++;
     }
    }
    if (!wadlRequest.getParam().isEmpty()) {
     wadlMethod.setRequest(wadlRequest);
    }

    // Response
    if (!mediaTypes.isEmpty()) {
     WadlResponse wadlResponse = new WadlResponse();
     wadlResponse.getStatus().add(200L);
     for (MediaType mediaType : mediaTypes) {
      WadlRepresentation wadlRepresentation = new WadlRepresentation();
      wadlRepresentation.setMediaType(mediaType.toString());
      wadlResponse.getRepresentation()
        .add(wadlRepresentation);
     }
     wadlMethod.getResponse().add(wadlResponse);
    }

    wadlResource.getMethodOrResource().add(wadlMethod);

   }

   wadResources.getResource().add(wadlResource);

  }
  result.getResources().add(wadResources);

  return result;
 }

 private String getBaseUrl(HttpServletRequest request) {

  return request.getScheme() + "://" + request.getServerName() + ":"
    + request.getServerPort() + request.getContextPath() + "/"
    + request.getServletPath().substring(1);
 }

 private String cleanDefault(String value) {
  value = value.replaceAll("\t", "");
  value = value.replaceAll("\n", "");
  // Strip the private-use marker characters that Spring's
  // ValueConstants.DEFAULT_NONE uses to flag "no default value".
  value = value.replaceAll("\uE000", "");
  value = value.replaceAll("\uE001", "");
  value = value.replaceAll("\uE002", "");
  return value;
 }

/**
  * This is an private function, which will return the QName based upon the
  * Java Type.
  * 
  * @param classType
  *            : Represent the type of class
  * @return QName
  */
  private QName getQNameForType(Class classType) {
  QName qName = null;

  /**
   * Check whether the thing that is coming is an Array of a data type or
   * not.
   */
  if (classType.isArray()) {
   classType = classType.getComponentType();
  }

  if (classType == java.lang.Long.class)
   qName = new QName("http://www.w3.org/2001/XMLSchema", "long");
  else if (classType == java.lang.Integer.class)
   qName = new QName("http://www.w3.org/2001/XMLSchema", "integer");
  else if (classType == java.lang.Double.class)
   qName = new QName("http://www.w3.org/2001/XMLSchema", "double");
  else if (classType == java.lang.String.class)
   qName = new QName("http://www.w3.org/2001/XMLSchema", "string");
  else if (classType == java.util.Date.class)
   qName = new QName("http://www.w3.org/2001/XMLSchema", "date");

  return qName;
 }

}

I know this is long stuff, but let me go through it line by line (I will only explain the parts that will help you customize your own implementation).

Line 61-64: This is the most important part of the generation: the initialization of the RequestMappingHandlerMapping object, which is provided by Spring and contains the details of all the URIs we have exposed through Spring MVC. In addition, it also contains details of the methods that carry the Spring MVC REST annotations. Later on we will see how we make use of this to find the information we are interested in.
Line 74: We are simply annotating a function so that it will be invoked once we hit "http://blah.com/springserveletmapping/application.wadl".
Line 75-113: Here we lay the foundation for generating the XML and invoke functions of RequestMappingHandlerMapping to find the set of handler methods that carry Spring MVC REST annotations. We also look for the media types supported by those functions (if any are present in their definitions). This continues till line 113.
Line 114: This is the section we are most interested in. Every function that is mapped to some URI via Spring MVC can have any type of parameters, e.g.:
E.g. public DataToBeReturned getLoginData(@PathVariable int id, HttpServletRequest req, @RequestParam(value="name", required=true) String userName)
However, in the WADL we only want to list the parameters collected from the request itself, i.e. from headers, request parameters, or path variables; any other parameters need not be included. So here we exclude HttpServletRequest and HttpServletResponse from inclusion in the WADL.
     Based on the annotation on a parameter, it is included either as a path variable (style TEMPLATE) or a request parameter (style QUERY). For every @RequestParam it is mandatory to include the value and required attributes, otherwise we won't be able to include the corresponding information in the WADL: value reflects the name of the request parameter, and required tells us whether that parameter is necessary for processing the request (if a required parameter is missing you are bound to get a Bad Request, HTTP 400 error code).
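The parameter inspection described above boils down to plain Java reflection. Here is a minimal standalone sketch of the same idea; the @Param annotation and the getLoginData method below are mine, standing in for Spring's @RequestParam and a real handler method:

```java
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Stand-in for Spring's @RequestParam, just for this sketch.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
@interface Param {
    String value();
}

public class ParamScanDemo {

    // A hypothetical handler method whose parameters we will inspect.
    public String getLoginData(@Param("id") int id, @Param("name") String name) {
        return name + ":" + id;
    }

    public static void main(String[] args) throws Exception {
        Method m = ParamScanDemo.class.getMethod("getLoginData", int.class, String.class);
        Annotation[][] annotations = m.getParameterAnnotations();
        Class<?>[] types = m.getParameterTypes();

        // Walk each parameter, exactly as the WADL controller does,
        // pairing its annotations with its declared type.
        for (int i = 0; i < annotations.length; i++) {
            for (Annotation a : annotations[i]) {
                if (a instanceof Param) {
                    System.out.println(((Param) a).value() + " -> " + types[i].getSimpleName());
                }
            }
        }
    }
}
```

This prints one line per annotated parameter ("id -> int", "name -> String"); the controller does the same walk, but builds WadlParam objects instead of printing.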


Line 212: The function at this line helps us calculate the base URI on which all the resources are mapped. This has to be modified as per your own requirements.
Line 236: Here we define a function that returns the QName for the type of each parameter passed to a URI-mapped function. I am only handling Long, Integer, Double, String, and Date; in case you need to add more, change this function to include the parameter types of your choice.
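If you would rather not grow the if-else chain in getQNameForType, the same mapping can be kept in a map; this is my own refactoring suggestion, not the post's code, and javax.xml.namespace.QName ships with the JDK:

```java
import java.util.HashMap;
import java.util.Map;
import javax.xml.namespace.QName;

// Table-driven alternative to the if-else chain in getQNameForType.
public class XsdTypeMapper {

    private static final String XSD = "http://www.w3.org/2001/XMLSchema";
    private static final Map<Class<?>, QName> TYPES = new HashMap<>();

    static {
        TYPES.put(Long.class, new QName(XSD, "long"));
        TYPES.put(Integer.class, new QName(XSD, "integer"));
        TYPES.put(Double.class, new QName(XSD, "double"));
        TYPES.put(String.class, new QName(XSD, "string"));
        TYPES.put(java.util.Date.class, new QName(XSD, "date"));
        // Adding a new type is now a single line:
        TYPES.put(Boolean.class, new QName(XSD, "boolean"));
    }

    static QName qNameFor(Class<?> classType) {
        // Unwrap arrays, as the original function does.
        if (classType.isArray()) {
            classType = classType.getComponentType();
        }
        return TYPES.get(classType); // null when the type is unmapped
    }

    public static void main(String[] args) {
        System.out.println(qNameFor(String[].class)); // {http://www.w3.org/2001/XMLSchema}string
    }
}
```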

And that's all. Once you hit /application.wadl you will get the XML file describing your resources. I have tested consuming the file with SoapUI and it all blends well.

Here I have specifically used Spring 3.1.0.RELEASE; however, I would suggest going for 3.1.1.RELEASE as it has some nice little improvements.

Lastly, I owe a lot of thanks to Tomasz Nurkiewicz & Grégory OLIVER; it is only because of their direction and help with the code that I was able to do this, so all the appreciation goes directly to them. Thanks Tomasz and Grégory.

Here is Tomasz's GitHub URL for this project: https://github.com/nurkiewicz/spring-rest-wadl
Hope it helps the developer community.
If I get time, or if there is a need, I will upload the Maven POM for this project.

Thanks.