Upgrading Postgres on a Qube

I've been busy installing Postgres on my Cobalt Qube, which runs Linux. The first thing I decided was to install a second, newer version of the product. Although I can see that the installed version of Postgres (V6.x) is not running, you can never tell what the OS designers decided to do for database services. Here's how I did it.

I did try to start the installed version, but I got a user permissions error and couldn't be arsed to find out why. So I decided to install a second version of Postgres on the system.

For various reasons, chiefly that the O'Reilly book is written for V7, I decided to install version 7. The current stable version is 8 (at the time of writing), so I may come to regret it. Here's what I did.

Make a user; this is to be the database owner and execution authority. Two things to learn from my experience: if you want to make a group, do so before you make the user, and user names on the Qube need to be less than seven characters, so postgres7 is out; try psql7 or pgsql7. I defaulted the shell to bash.
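The user setup might be sketched roughly as follows; the names pgsql7 (for both group and user) and the home directory are my assumptions, not something the article fixes:

```shell
# Sketch of the group/user creation; run as root.
# Group first, then the user - in that order, as noted above.
groupadd pgsql7
useradd -g pgsql7 -d /home/pgsql7 -m -s /bin/bash pgsql7
```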

I like to have my database installation directory in a place other than the administrator home. I use the administrator ${HOME} to hold the downloads and install directories, and since the Cobalt UNIX likes to do stuff in /usr/local, /usr/share, /usr/lib and /usr/bin, I decided to install the production instance into /opt/postgres7, to ensure that anything I did was well away from the original installation.

I downloaded the files required to ${HOME}/install/, so the tar xvf - creates a directory postgres-${VERSION} there; I downloaded postgres-7.3.13.tar.gz from the postgresql.org downloads page. This contains the other three files and is sufficient for the install.
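The unpacking step, assuming the tarball name above and working as the administrator, looks something like this:

```shell
# Unpack the source under ${HOME}/install; the tarball must have
# been downloaded there first.
cd ${HOME}/install
gunzip -c postgres-7.3.13.tar.gz | tar xvf -    # creates postgres-7.3.13/
cd postgres-7.3.13
```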

I created a configure file with the required options and variables and ran the configure script; that is, I wrote a script consisting of the configure command and the parameters used (see below for my syntax). I chose to include Tcl/Tk but not Perl or Python. I shall probably regret skipping Perl, but I'll fix that another day. One of these parameters set the final destination root to /opt/postgres7, so I could use make to compile and install the software. I did not inspect the /usr/local file system to see whether the Qube designers had installed pgaccess, which ought to be installed as part of the Tcl package. Since the Qube has Tcl installed, it may or may not already have pgaccess; it is certainly there after the V7 install.

I then decided to configure the user. I wrote a .bashrc to prepend the /opt/postgres7 resources to the appropriate paths: $PATH, $MANPATH and $LD_LIBRARY_PATH. I eventually decided to develop some code to check whether this is required before doing it. It's important to prepend the directories to the paths; we want the paths to direct the searching programs to the Postgres 7 resources first. I also created an environment variable, ${DB_HOME}, to point at the installation. Putting this in .bashrc means that the code is executed if you "su" to the user.

Next I ran initdb. This requires the -D flag to point at the "data" directory. The data directory, whose location is configurable at configure time, holds both database files and some configuration data.
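Run as the database user, the step is a one-liner; the data directory location under /opt/postgres7 is my assumption here, consistent with the install prefix above:

```shell
# Initialise the cluster; must be run as the database owner,
# not root. The -D flag names the data directory.
initdb -D /opt/postgres7/data
```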

Within the untarred install directory, in the contrib/start-scripts sub-directory, is a file called linux. This is the System V script. I copied it to /etc/rc.d/init.d and renamed it something appropriate, bearing in mind that the version 6 implementation had taken the name 'postgres'. This script is beautifully documented, with a 'start edits' comment. It uses the su ${PG_USER} ${command} syntax, and this is why I configured a shell read configuration file, not a login read configuration file (aka profile), so that the environment declared applies to this command.
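Copying the script into place might look like this; the name postgres7 is my placeholder choice, and whether chkconfig can register it depends on the script carrying the usual chkconfig header comments:

```shell
# Install the start/stop script under a name that does not clash
# with the V6 'postgres' script; run from the untarred source tree.
cp contrib/start-scripts/linux /etc/rc.d/init.d/postgres7
chmod +x /etc/rc.d/init.d/postgres7
chkconfig --add postgres7   # or create the rc?.d links by hand
```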

Create a database, using a script or SQL. type -p createdb tells you where the program is; it takes the parameter ${dbname}.
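Assuming the server is running and the bin directory is on the PATH, the step is simply (mydb is a placeholder name):

```shell
# Confirm which createdb will run, then create the database.
type -p createdb
createdb mydb
```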

Here are my configure parameters:


./configure --prefix=${MYBASE7} \
    --exec-prefix=${MYBASE7}/system \
    --bindir=${MYBASE7}/bin \
    --datadir=${MYBASE7}/config \
    --sysconfdir=${MYBASE7}/sysconfig \
    --libdir=${MYBASE7}/lib \
    --docdir=${MYBASE7}/doc \
    --mandir=${MYBASE7}/man \
    --with-tcl --with-tclconfig=/usr/lib

Here is my .bashrc. I use a function called contains to discover if the path contains the directory I require.

contains() {
# $1 must be set
    echo "$1" | grep "$2" > /dev/null
    return $?
}
which I then use to determine whether the path requires changing:


PSQL_HOME=/opt/postgres7
export PSQL_HOME

contains "$PATH" "$PSQL_HOME/bin"
if [ $? -ne 0 ]; then
    PATH=$PSQL_HOME/bin:$PATH
fi
export PATH

In the case of MANPATH and LD_LIBRARY_PATH (the latter not illustrated), I use the -z test; the contains function requires two parameters, and setting $2 to an empty value makes it behave as if $2 were unset. I could have put this logic in the function, but didn't. This is partly because I use it three times and, as shown, MANPATH requires explicit construction if being augmented, so I need the if test anyway. See below.

if [ -z "$MANPATH" ]; then
    MANPATH=`cat /etc/man.config | \
        awk '$1==path {print $2;}' path=MANPATH | \
        tr '\n' ':'`
fi
contains "$MANPATH" "$PSQL_HOME/man"
if [ $? -ne 0 ]; then
    MANPATH=$PSQL_HOME/man:$MANPATH
fi
export MANPATH
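The guard logic can be exercised standalone; in this sketch the initial MANPATH default is a stand-in value rather than the result of parsing /etc/man.config:

```shell
# Self-contained sketch of the .bashrc guard logic.
contains() {
# $1 must be set
    echo "$1" | grep "$2" > /dev/null
}

MANPATH=""
if [ -z "$MANPATH" ]; then
    MANPATH="/usr/man:/usr/local/man"   # stand-in for /etc/man.config
fi
contains "$MANPATH" "/opt/postgres7/man"
if [ $? -ne 0 ]; then
    MANPATH="/opt/postgres7/man:$MANPATH"
fi
echo "$MANPATH"   # prints /opt/postgres7/man:/usr/man:/usr/local/man
```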

The only changes to the System V start/stop script are its name and the various file locations. It appends all logs to a single file and hence would benefit from a function that rotates the logs, as Linux does for other daemon logs.

I find the start/stop scripts quite interesting. I am returning to this problem after a number of years; the last time I dealt with it, I was using Pyramid's dualport OSx, and we generally used the BSD rc.local script. The development of both SunCluster and Solaris 10 SMF means that we need to think about how we want to do this, and I have also had the chance to consider classic System V implementations such as Linux, which have influenced my thinking. I really rather like chkconfig, I like the use of external functions, and I like the use of the case construct to offer multiple run modes, or invoke multiple methods (in newspeak). I have written scripts for both Tomcat and Sybase, and with the latter I can see scope for huge argument about the method code.

This article was originally written a while ago, and not uploaded at the time because I wanted to finish describing how to use it remotely using fat clients. I never finished that bit, but when I do I'll post it on the net. 


