How Do I Know if I’m Running 32-bit or 64-bit Linux?

If you’ve bought a new computer recently, you probably have a 64-bit processor and installed the 64-bit version of your Linux distribution. What if your computer is a bit older and you don’t remember?

There is a simple command-line program called uname that will tell you exactly that.

Open a terminal window (Applications > Accessories > Terminal).
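The key option is -m, which prints the machine hardware name (output will vary by machine):

```shell
# -m prints the machine hardware name: x86_64 means a 64-bit kernel,
# while i386/i586/i686 indicate a 32-bit one.
uname -m
```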

By dbglory Posted in Linux

Run Node.js as a Service on Ubuntu

The core of our new project runs on Node.js. With Node you can write very fast server-side JavaScript programs. It’s pretty easy to install Node, code your program, and run it. But how do you make it run nicely in the background like a true server?

Clever chaps will have noticed you can just use the ‘&’ like so:

node ./yourprogram.js &

and send your program to the background. But that alone has drawbacks.

How to Run Cron Every 5 Minutes, Seconds, Hours, Days, Months

Question: How do I execute a certain shell script at specific intervals in Linux using a cron job? Provide examples using different time periods.

Answer: Crontab can be used to schedule a job that runs at a certain interval. The examples here show how to execute a shell script using different intervals.

Also, don’t forget to read our previous crontab article, which contains 15 practical examples and also explains the @monthly, @daily, and similar tags that you can use in your crontab.

1. Execute a cron job every 5 Minutes

The first field is for minutes. If you specify * in this field, the job runs every minute. If you specify */5 in the first field, it runs every 5 minutes.
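A minimal crontab entry for this case might look like the following (the script path is a placeholder); moving the */5 step to the second (hour) field gives an every-5-hours schedule instead:

```
# m h dom mon dow  command
*/5 * * * * /home/user/backup.sh
0 */5 * * * /home/user/backup.sh
```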


FAQ Linux: How do I Compress a Whole Linux or UNIX Directory?

Q. How can I compress a whole directory under Linux / UNIX using a shell prompt?

A. It is very easy to compress a whole Linux/UNIX directory, which is useful for backing up files, emailing a set of files, or sending software you have created to friends. Technically, the result is called a compressed archive. The GNU tar command is best for this job, and it can be used on a remote Linux or UNIX server. It does two things for you:
=> Create the archive
=> Compress the archive

You need to use the tar command as follows (syntax of the tar command):
tar -zcvf archive-name.tar.gz directory-name

  • -z: Compress the archive using the gzip program
  • -c: Create the archive
  • -v: Verbose, i.e. display progress while creating the archive
  • -f: Archive file name

For example, if you have a directory called /home/jerry/prog and you would like to compress it, you can type the tar command as follows:
$ tar -zcvf prog-1-jan-2005.tar.gz /home/jerry/prog

The above command will create an archive file called prog-1-jan-2005.tar.gz in the current directory. If you wish to restore your archive, use the following command (it will extract all files into the current directory):
$ tar -zxvf prog-1-jan-2005.tar.gz


  • -x: Extract files

If you wish to extract the files into a particular directory, for example /tmp, use the following command:
$ tar -zxvf prog-1-jan-2005.tar.gz -C /tmp
$ cd /tmp
$ ls
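The whole create/extract cycle can be sketched end-to-end with a throwaway directory (all names below are illustrative); as a bonus, the -t flag lists an archive's contents without extracting it:

```shell
# Build a small sample directory to archive
mkdir -p demo/sub
echo "hello" > demo/file.txt

tar -zcvf demo.tar.gz demo   # create (-c) and gzip-compress (-z) the archive
tar -tzf demo.tar.gz         # -t lists the contents without extracting

mkdir -p out
tar -zxvf demo.tar.gz -C out # extract (-x) into the ./out directory
cat out/demo/file.txt        # the original file contents survive the round trip
```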



15 Greatest Open Source Terminal Applications Of 2012

Linux on the desktop is making great progress. However, the real beauty of Linux and Unix-like operating systems lies beneath the surface at the command prompt. nixCraft picks his best open source terminal applications of 2012.

Most of the following tools are packaged by all major Linux distributions and can be installed on *BSD or Apple OS X.

#1: siege – An HTTP/HTTPS stress load tester

Fig.01: siege in action
Siege is a multi-threaded HTTP/HTTPS load testing and benchmarking utility. This tool allows me to measure the performance of web apps under duress. I often use it to test a web server and its apps, and I have had very good results with it. It can stress a single URL or multiple URLs. At the end of each test you get data about the web server's performance: total data transferred, latency, server response time, concurrency, and much more.
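A typical invocation might look like the sketch below; the -c (concurrent users) and -r (repetitions per user) values and the localhost URL are illustrative, and the command is guarded so it degrades gracefully where siege or a local server is absent:

```shell
# Simulate 10 concurrent users, each making 2 requests
if command -v siege >/dev/null 2>&1; then
  siege -c 10 -r 2 http://localhost/ || echo "siege run failed (is a server listening on localhost?)"
else
  echo "siege is not installed; install it via your package manager"
fi
```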


GIT: pushing and pulling

Today we’re going to review another basic yet powerful concept that Git, among other version control systems of its type, offers: distribution! As you may know, your commits are all local, and repositories are simply clones of each other. That means the real work in distributing your projects is in synchronizing the changes via git push and git pull.

If you’re new to Git, you may think that this is too much overhead and leads to a breakdown of control. Look at it this way: with a centralized system, if your central server goes down, you’re usually hosed, prevented from working and collaborating with others. Since all of the work involved in actually creating revisions is done on your own machine, you can keep coding whether or not the network is up, without needing the permission of others or being subject to network issues. Did I mention it’s also a lot faster for most routine operations? Check out some more of the advantages (and disadvantages) of DVCS at Wikipedia.
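The push/pull cycle can be sketched against a local bare repository standing in for the central server (all paths and names below are illustrative):

```shell
# A local bare repository plays the role of the shared server
git init --bare -b main remote.git
git clone remote.git work
cd work
git config user.email "demo@example.com"
git config user.name "Demo"

# Revisions are created entirely on your own machine
echo "hello" > README
git add README
git commit -m "commit created entirely offline"

git push origin HEAD:main      # publish local history to the shared repo
cd ..
git clone remote.git peer      # a collaborator's clone
git -C peer pull origin main   # fetch and merge the latest commits
cat peer/README
```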

ImageMagick convert – PDF to JPG, Partial Image Size Problem

I’ve been using the ImageMagick convert tool to make JPG images from PDF pages. Nothing too fancy, just generating a single 8.5 x 11 inch ratio JPG for each page in a PDF. This worked really well until I started seeing some PDF files generate the same size JPG, but instead of the page taking up the whole image, it was about one quarter size, in the lower left corner.

Inspecting the PDFs showed that they had much larger dimensions. So I tried tweaking several settings (density, scale, resize, and resample), all with no luck. No matter how I adjusted the settings, I couldn’t get the page to use the whole image size. After more searching and trial and error, I finally came across the information I needed to make this work: the define option. Specifically: -define pdf:use-cropbox=true.
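A hypothetical invocation using that option might look like this; the input file name, density value, and output pattern are placeholders, and the command is guarded so the sketch is harmless where ImageMagick or the file is absent:

```shell
# Honor the PDF's crop box and rasterize each page at 150 DPI,
# writing one numbered JPG per page
if command -v convert >/dev/null 2>&1 && [ -f input.pdf ]; then
  convert -define pdf:use-cropbox=true -density 150 input.pdf page-%02d.jpg
else
  echo "skipping: ImageMagick convert or input.pdf not available here"
fi
```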
