
Here is a tip summarized from comp.unix.shell.

The problem:

> For the following TAB-delimited records, I want to count number of 
> records with column-2 == -1   (should be 2) 
> ===== file.txt ====== 
> AAA    -1    2008-07-14 
> BBB    -14   2008-07-15 
> CCC    -20   2008-07-16 
> DDD    -1    2008-07-16 
> =========== 
> I tried: 
>   grep -c -- "-1\t" file.txt 
> which is not working

A solution offered for ksh93/zsh/bash shells:

grep -c -- $'-1\t' file.txt 

I will add another alternative:

grep -c '-1^V^I' file.txt

Here ^V^I means: type Ctrl-V and then Ctrl-I to enter a literal tab character.
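If you would rather match on the field value than embed a literal tab, awk sidesteps the quoting issue entirely. This is my own sketch, not from the original thread:

awk -F'\t' '$2 == -1 { n++ } END { print n+0 }' file.txt

For the sample file this prints 2; because -F'\t' makes the comparison apply strictly to column 2, the -14 record is not counted.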

Related:

Advanced Bash-Scripting Guide Example 34-1
Insert ASCII Control Characters in Text


Oracle is today’s new toy for me. And by toy I mean a frequent source of frustration. One issue I battled this morning was getting dbca to start. It would just hang at the command-line prompt with no feedback about what it was or wasn’t doing. netca and oem would launch just fine, so it didn’t seem to be a Java or X11 configuration issue.

The Oracle forums were of no help, but I did find a workaround here: What to do when dbca does not start?

So, I yanked my cable a bit, then, remembering I’m on a wireless Linux laptop, shut down the network service. Then dbca launched. Of course dbca then couldn’t connect to the databases, so I had to start the network again. Good times.
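For the record, the dance went roughly like this (the service name is an assumption for a Red Hat style box; adjust for your distribution):

$ sudo service network stop     # dbca hangs while the network interface is up
$ dbca                          # now the GUI launches
$ sudo service network start    # restart networking so dbca can reach the databases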

This doesn’t solve the fundamental problem. I certainly couldn’t shut down the network in a production environment, and I doubt it would fly on an Oracle certification exam, but at least I’m able to continue with my Oracle adventures.

I frequently use Perl’s in-place file editing from the command line. What I didn’t consider, until it bit me today, is that the file ownership can change with this method.

Here’s the original file, owned by tomcat_6 and only readable by user and group.

$ ls -l web.xml
-rw-rw---- 1 tomcat_6 tomcat 49384 Jul 10 11:38 web.xml

I belong to the tomcat group, so I have write permission on the file. The enclosing directory is also writable by the tomcat group; the importance of this is noted below.

Using the ‘perl pie’ one-liner to make an in-place edit of the file:

$ perl -p -i -e 's;<session-timeout>\d+</session-timeout>;<session-timeout>1440</session-timeout>;' web.xml

Now the file is owned by me and my default group.

$ ls -l web.xml
-rw-rw---- 1 crashing daily 49384 Jul 10 21:55 web.xml

Most critically, now the file is no longer readable by the tomcat processes. This little change prevented my Tomcat server from starting. Ouch.

sed is a little nicer. It changes the owner but not the group.

$ sed -i 's;<session-timeout>.*</session-timeout>;<session-timeout>1445</session-timeout>;g' web.xml
$ ls -l web.xml 
-rw-rw---- 1 crash tomcat 49385 Jul 10 22:44 web.xml

Neither the Perl nor the sed one-liner works if the directory is not writable, because both tools unlink the original file and replace it with a newly created version.

The winner for both maintaining file ownership and working if the directory is not writable is ed.

$ ed - "web.xml" <<EOF
,s;<session-timeout>[[:digit:]]*</session-timeout>;<session-timeout>1440</session-timeout>;
w
EOF

ed truly does an in-place edit. Nice. If only I could remember the syntax.
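One way I avoid re-remembering the here-document is to feed the same commands to ed from printf. A sketch of the identical edit, just a different way to drive ed:

$ printf '%s\n' ',s;<session-timeout>[[:digit:]]*</session-timeout>;<session-timeout>1440</session-timeout>;' w q | ed -s web.xml

The -s flag suppresses the byte counts ed normally prints after reading and writing the file.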

The Tiger and Leopard releases of Mac OS X include an implementation of BSD’s dummynet. Dummynet is “a system facility that permits the control of traffic going through the various network interfaces”.

There are many uses for this feature. I use it as part of my website development to simulate a slow network connection. Many of the users of our websites are in developing countries with slow, dialup-speed, network connections. By using a couple of quick commands I can throttle my connection to the webserver down to similar speeds. As such, I can feel their pain even though I’m on a snazzy gigabit connection, two hops away from the webserver.

The following series of commands will slow my communications to and from the webserver down to 56K modem speeds. It only affects http connections (to any web server, not just mine). My other network connections – ssh, for example – operate with native network performance.

$ sudo ipfw add pipe 1 src-port http
$ sudo ipfw add pipe 1 dst-port http
$ sudo ipfw pipe 1 config bw 56kbit/s
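Dummynet pipes can also simulate latency and packet loss, and it is all easy to undo once testing is done. A quick sketch (check the ipfw man page for your release; I also run ipfw list to see which rule numbers were assigned):

$ sudo ipfw pipe 1 config bw 56kbit/s delay 250ms plr 0.02   # add 250ms latency and 2% packet loss
$ sudo ipfw list             # show the installed rules and their numbers
$ sudo ipfw -f flush         # remove the filter rules (the default allow rule remains)
$ sudo ipfw pipe delete 1    # remove the pipe itself, restoring full speed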

Adam Knight’s Traffic Shaping in Mac OS X is a good starter tutorial.

Additional articles and documentation.

Wake the media, alert the dog. I installed Windows XP Pro into a Parallels virtual machine (VM) today.

Disabling ‘Acceleration’ in the VM was required for successful installation. With Acceleration enabled, the installation kept hanging at the “installing devices” stage (sometimes it could be nudged along by forcibly restarting the VM) and spontaneously rebooting.

From Problems with installing Windows XP SP0 in VM in Parallels’ knowledge base:

Change the acceleration mode of the VM:

1. Open the VM's configuration page.
2. In the Resources list select Options.
3. Click the Advanced tab.
4. Set the Acceleration level to Normal or Disabled.

After installing Windows XP in the VM, you can change the Acceleration level back to High.

BASH Cures Cancer has yet another interesting post this week. This one, entitled “Command Substitution and Exit Status”, leverages the exit status returned from a BASH sub-shell. That alone is a useful tip, but what tickled me was the author’s trick of shortening the delay in a running loop by starting a newer, shorter sleep loop that kills off the longer-running sleep process. Clever. The thought of dueling shell processes – “I’m going to sleep for a minute.”, “Oh, no you’re not!” – makes me chuckle. Sigh. I am easily amused.
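My loose reconstruction of the trick, for my own memory (not the original post’s code; do_work stands in for whatever the loop really does):

# The long-running loop: checks for work once a minute.
while true; do
    do_work       # hypothetical placeholder for the real task
    sleep 60
done

# Meanwhile, from a second shell: repeatedly kill the 60-second sleep,
# forcing the loop above around once a second instead of once a minute.
while true; do
    pkill -f 'sleep 60'
    sleep 1
done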

When you execute a shell script it inherits all the environment variables present in the parent shell. Sometimes that can cause unintended consequences. For example, I recently ran into a situation where one of my scripts in common use by our group failed for one of the users. I eventually tracked it down to the JAVA_HOME variable that user had set in his bash profile. The JAVA_HOME the script inherited from that user was not set to an appropriate value. Ideally the script should have set all the environment variables it needs. However, in this case, I had failed to explicitly set JAVA_HOME in my script, an oversight masked by the fact that my own JAVA_HOME was set to a valid value, so the script ran normally for me.

To reduce the chance of this type of problem reoccurring with other variables I decided to clear all the inherited environment variables at the start of the script. That way any other environment dependency not defined by the script would immediately reveal itself during the script debugging phase.

I could not find a simple builtin method to run the script without the inherited environment (aside from the impractical “env -i scriptname”) [ edit 8/28/2010: see Gordon’s env - /bin/bash suggestion below; you probably don’t need to read the rest of this posting ], so I put a one-liner in the script to unset each variable (BASH syntax):

unset $(/usr/bin/env | egrep '^(\w+)=(.*)$' | \
  egrep -vw 'PWD|USER|LANG' | /usr/bin/cut -d= -f1);

env prints the environment’s variable=value pairs. The cut at the end splits the pairs and returns the variable name.

The first egrep filters for the variable=value pairs that start at the beginning of the line. This was necessary because GNU screen sets a multiline TERMCAP variable:

TERMCAP=SC|screen|VT 100/ANSI X3.64 virtual terminal:\
        :DO=\E[%dB:LE=\E[%dD:RI=\E[%dC:UP=\E[%dA:bs:bt=\E[Z:\
        :cd=\E[J:ce=\E[K:cl=\E[H\E[J:cm=\E[%i%d;%dH:ct=\E[3g:\

I wanted just the first line and needed to discard the others so the cut would not split them and return bogus variable names, like ‘:DO‘, to unset.

The second egrep is optional but allows me to skip selected variables so they retain their preset values.

With this in place, if I attempt to use an environment variable in my script (or in a program called by my script) it will fail unless I explicitly set the value. This provides me the opportunity to ensure the script will behave consistently for all users on the system.
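For a sanity check, here is roughly how the one-liner sits at the top of a script. The script name and JAVA_HOME path below are made up for illustration:

#!/bin/bash
# Clear the inherited environment, except for a short whitelist.
unset $(/usr/bin/env | egrep '^(\w+)=(.*)$' | \
  egrep -vw 'PWD|USER|LANG' | /usr/bin/cut -d= -f1)

echo "JAVA_HOME after unset: '${JAVA_HOME}'"   # empty, whatever the caller exported

# Every variable the script depends on is now set explicitly.
export JAVA_HOME=/usr/java/default             # assumed path; use your own
echo "JAVA_HOME is now: '${JAVA_HOME}'"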

The diff utility reports differences between two files. If you need to find differences between the outputs of one or two commands, then temporary named pipes are a handy aid.

Here’s a simple example for the BASH shell to illustrate the technique. Say you have two files, A and B:

$ cat A
Tara
Dawn
Anya
Willow

$ cat B
WILLOW
ANYA
DAWN
HARMONY
TARA

The task is to find names that differ between the two lists without creating new files or editing the existing ones. You can do that by sorting and normalizing the letter case, then using diff to compare the output streams on the fly.

$ diff -B <( sort A | tr [:lower:] [:upper:] ) <( sort B | tr [:lower:] [:upper:] )

2a3
> HARMONY

The <( ... ) syntax creates a temporary named pipe which makes the stdout of the sort | tr commands look and behave like a file, allowing diff to operate on the expected type of input.

For fun, you can see the temporary file created by the process:

dir <( sort A | tr [:lower:] [:upper:] ) 
lr-x------  1 crash daily 64 Mar  5 23:17 /dev/fd/63 -> pipe:[21483501]

This technique is not limited to diff. It should work for almost any other command that expects a file for input.
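comm is another natural fit, since both of its inputs need to be sorted anyway. A sketch with the same files, printing only the names unique to B:

$ comm -13 <( sort A | tr '[:lower:]' '[:upper:]' ) <( sort B | tr '[:lower:]' '[:upper:]' )
HARMONY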

Interestingly (frustratingly), it does not work for me in my BASH shell on Mac OS X. Does anyone know why? Update: Indeed someone does know. Unixjunkie has the answer. The following produces the expected output on OS X, but attempting the diff produces no output.


cat <( sort A | tr [:lower:] [:upper:] ) <( sort B | tr [:lower:] [:upper:] )

ANYA
DAWN
TARA
WILLOW

ANYA
DAWN
HARMONY
TARA
WILLOW

Related:

Introduction to Named Pipes
Heads and Tails – an earlier posting that uses temporary named pipes for receiving stdin
Process Substitution, Advanced Bash-Scripting Guide

I can’t say I’m a fan of the Apache Ant build tool, but that’s what our project uses, so I’ve been trying to wrap my small brain around it enough to make some small changes to our configurations.

Today I needed to call an exec task in a build.xml file on the condition that the executable is in the user’s $PATH. This is what I came up with; other suggestions are welcome. In my case I needed to execute svn (because the svn task doesn’t provide an API for the svn info command), but the general conditional will apply to any executable.

<target name="buildInfo" depends="canDoSvn,svnInfo/>

<target name="canDoSvn">
<property environment="env" />
<condition property="can.do.svn">
<and>
<available file="${projectsDir}/${proj}/.svn" type="dir" />
<available file="svn" type="file">
<filepath>
<pathelement path="${env.PATH}"/>
</filepath>
</available>
</and>
</condition>
</target>

<target name="svnInfo" if="can.do.svn">
<exec dir="${projectsDir}"
executable="svn"
outputproperty="svnInfoOut"
failonerror="false"
failifexecutionfails="false">
<arg line="info ${proj}"/>
</exec>
</target>

To kick things off, the buildInfo target is called elsewhere in the build script. This target declares dependencies on two other targets that get called in the listed order.

In the canDoSvn target I test whether the file svn can be found in the user’s $PATH. To do this I set the property task’s environment attribute to env, which gives me a handle on the user’s PATH environment variable: ${env.PATH}. The path attribute of pathelement converts the colon-separated list in env.PATH into a directory search list used by the available task.

I also test whether an .svn directory is available. This verifies that the project being built is from an svn working directory; obviously svn info would fail otherwise.

If these two conditions are met, the can.do.svn property is set to true and allows the svnInfo target to run when called as a dependency of buildInfo. I’m defining canDoSvn as a dependency rather than calling it directly with antcall, because with the latter method the can.do.svn property is not accessible outside the canDoSvn target, so the svnInfo condition is never met.

To be thorough, I still need to add a condition that the svn file is executable. I’m not sure how to do that yet.
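From the shell the equivalent check is trivial; I just haven’t found the Ant-native way to express it yet. A sketch of what I mean (bash, not Ant):

test -x "$(command -v svn)" && echo "svn found and executable"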

In my case, I’m guaranteed the ant build is executing on a Linux system, so I don’t know, or have to worry about, what will happen if this technique is attempted on MS Windows.

Update: From the documentation, it looks like the svn task does indeed interface with the svn info command, using the status element. Anyhoo, I don’t have the svn task installed on our systems, so wrapping the svn command line stands.

This took much longer for me to nail than I care to admit. How to test if a bash script argument is an integer?

test "$1" -ge 0 -o "$1" -lt 0 2>&- && echo "$1" is an integer

This tests if the value of $1 is greater-than-or-equal-to 0 or is less-than 0. The 2>&- closes test‘s stderr (you could also use 2>/dev/null). Squashing stderr is necessary to mask the “integer expression expected” error that occurs when $1 is not an integer.

Or if you prefer the ‘[‘ syntax:

[ "$1" -ge 0 -o "$1" -lt 0 2>&- ] 2>&- && echo "$1" is an integer

Remember ‘[‘ is actually a command – hence the required space following it – so it also has a stderr that can be closed or redirected.
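Wrapped in a throwaway function (the name is_int is mine, not anything standard), the test is easy to exercise:

$ is_int() { test "$1" -ge 0 -o "$1" -lt 0 2>&- ; }
$ is_int 42 && echo integer
integer
$ is_int -7 && echo integer
integer
$ is_int 3.5 || echo "not an integer"
not an integer
$ is_int foo || echo "not an integer"
not an integer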
