I'm having trouble running the bitsortgen.c code below on my Linux machine (you can get it at
http://www.cs.bell-labs.com/cm/cs/pearls/code.html -- it's from the book "Programming Pearls" by Jon Bentley, by the way, which is awesome):
Code:
/* Copyright (C) 1999 Lucent Technologies */
/* From 'Programming Pearls' by Jon Bentley */
/* bitsortgen.c -- gen $1 distinct integers from U[0,$2) */

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define MAXN 2000000
int x[MAXN];

int randint(int a, int b)
{   return a + (RAND_MAX*rand() + rand()) % (b + 1 - a);
}

int main(int argc, char *argv[])
{   int i, k, n, t, p;
    srand((unsigned) time(NULL));
    k = atoi(argv[1]);
    n = atoi(argv[2]);
    for (i = 0; i < n; i++)
        x[i] = i;
    for (i = 0; i < k; i++) {
        p = randint(i, n-1);
        t = x[p]; x[p] = x[i]; x[i] = t;
        printf("%d\n", x[i]);
    }
    return 0;
}
I get 'Segmentation fault' errors quite often when running the program. I ran it under gdb a few times with a breakpoint set where p gets assigned in the second for loop of main, and sometimes p ends up negative -- which would explain the segfaults, I guess, since a negative p indexes outside x[].
The odd thing is that the program runs just fine on SunOS -- no problems whatsoever. I've compared the two setups a bit, and the only difference between them (Linux vs. SunOS) that I'm aware of so far is RAND_MAX: on my Linux machine RAND_MAX = 2147483647, while on the SunOS machine RAND_MAX = 32767. I don't see how that alone could be causing the problem and producing negative p values on Linux, though.
You can compile the program and see for yourself (run it a few times, with a few different sets of values). Just make sure you pass it two numbers as command-line arguments, the first smaller than the second, and the second no larger than 2 million (MAXN).
Any clues?