Memory problem detected by Valgrind for large array - C programming
Hi
I'm writing a program that needs to deal with large multidimensional arrays. The program works fine for matrices of size 125 x 125, and no errors occur when running under Valgrind. For the next size up, 343 x 343, I get a segmentation fault and Valgrind reports "Invalid read". The program runs into trouble when it calls the following Numerical Recipes routine, choldc.c, to do the Cholesky decomposition of matrix a and then invert it to give matrix a_inv:

```c
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include "globals.h"
#include "structs_and_vars.h"

void choldc(int n, double **a, double **a_inv)
{
#include "header.h"
    int i, j, k, g, g_dash;   /* Dummy variables */
    double sum;               /* Temp variable */
    vector p;

    for (i = 1; i <= n; i++) {
        for (j = i; j <= n; j++) {
            /* LINE 18: WHERE THE ERROR OCCURS */
            for (sum = a[i][j], k = i - 1; k >= 1; k--)
                sum -= a[i][k] * a[j][k];
            if (i == j) {
                if (sum <= 0.0)
                    printf("error: choldc failed\n");
                p[i] = sqrt(sum);
            } else
                a[j][i] = sum / p[i];
        }
    }

    /* find inverse */
    for (i = 1; i <= n; i++) {
        for (j = 1; j <= n; j++)
            a_inv[i][j] = a[i][j];
    }
    for (i = 1; i <= n; i++) {
        a_inv[i][i] = 1.0 / p[i];
        for (j = i + 1; j <= n; j++) {
            sum = 0.0;
            for (k = i; k < j; k++)
                sum -= a_inv[j][k] * a_inv[k][i];
            a_inv[j][i] = sum / p[j];
        }
    }
    for (i = 1; i <= n - 1; i++) {
        for (j = i + 1; j <= n; j++) {
            a[i][j] = 0.;
            a_inv[i][j] = 0.;
        }
    }
}
```

The Valgrind error message is:

```
==22212== Invalid read of size 4
==22212==    at 0x804B705: choldc (choldc.c:18)
==22212==    by 0x8049818: main (main.c:332)
==22212==  Address 0x1B228C80 is not stack'd, malloc'd or (recently) free'd
==22212==
==22212== Process terminating with default action of signal 11 (SIGSEGV)
==22212==  Access not within mapped region at address 0x1B228C80
==22212==    at 0x804B705: choldc (choldc.c:18)
==22212==    by 0x8049818: main (main.c:332)
==22212==
==22212== ERROR SUMMARY: 1 errors from 1 contexts (suppressed: 14 from 1)
==22212== malloc/free: in use at exit: 9449752 bytes in 26 blocks.
==22212== malloc/free: 26 allocs, 0 frees, 9449752 bytes allocated.
==22212== For counts of detected errors, rerun with: -v
==22212== searching for pointers to 26 not-freed blocks.
==22212== checked 37667096 bytes.
```

I have marked line 18, where the error occurs, in the code above. I don't understand what it is about this line that can't deal with the increased size. Other parts of my program cope fine. Any help would be greatly appreciated!!
When you are allocating the memory for the array (presumably in main.c), did it work?
The memory is allocated without any problem (in main), and if I take this particular function out and run the matrices through another function instead, again there is no problem.
Next time, please use code blocks; the formatting is terrible.
What is `vector p`? And how is n meaningful to it? The loop conditions are all of the form `x <= n`, so if an array is allocated with n elements (valid indices 0 to n-1), the last iteration accesses index n, one element past the end. Why did you use multidimensional arrays? Just curious, because I wouldn't have thought the compiler could treat them correctly without knowing how big they were. Also, use a normal debugger to check those temporary variables; there is a chance they are being set to invalid values and then used to index an array, which would make the memory lookup invalid and segfault. (It would have to be off by more than 1 for Valgrind to give that particular error message.)