Creating vector image from gridded data
Hi all,
I need to create a vector image from a gridded dataset. The image is a map of the world (or even a limited region) and the data are regularly spaced topography data. I can easily create a raster image using Matlab or Ocean Data View, but my requirement is to create a vector image, and I don't know which Linux software (if any) can read/import gridded data from a file and plot it as a vector graphic. Any ideas? Thank you. |
Assuming you mean contour plots, gnuplot might fit the bill. You can plot to a vector graphics file (SVG is my favourite).
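For instance, gnuplot can draw contour curves from a matrix of samples straight into an SVG file. A minimal sketch (the file name topo.dat and the level range here are placeholders; adjust them to the actual data):

```gnuplot
set terminal svg size 800,600
set output 'contours.svg'
set view map          # look straight down: a 2D map
set contour base      # draw the contour curves...
unset surface         # ...but not the surface mesh itself
set cntrparam levels incremental -8000, 1000, 8000
splot 'topo.dat' matrix with lines notitle
```

The `matrix` modifier reads the file as a plain whitespace-separated grid of elevation values, which matches regularly spaced data with no explicit coordinates.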
Another alternative is to generate the SVG directly from the data, e.g. using an awk, Perl, or Python script. First build a net (no need for e.g. Delaunay triangulation if the data is reasonably regular), then draw the curve(s) within each cell, and follow the net to connect the curves. It is not too difficult if you first work it out with a smaller dataset. Would you like to describe what you are trying to image in a bit more detail? Contours? Projections? Silhouettes? |
Sorry if I have misunderstood your requirement. Cheers, Terry |
1 Attachment(s)
Thank you Nominal Animal for your reply. I will take a look at gnuplot. I like the idea of building the XML code using awk or some other scripting language, but I'm afraid it's far too difficult, since there is a lot of data and usually I don't plot it as contours, but rather as indexed images, where every pixel (representing one cell of my grid) picks a color from the current colormap based on its value.
While GRASS and other GIS packages are obviously the best choice if they have the desired functionality -- no need to reinvent the wheel, and all that -- I wrote up one possible approach in C99 from scratch.
(The reason I did this is twofold: one, it is not that difficult, and I wanted to show a real-life example; two, I wanted an algorithm that could also produce the contour curves in 3D (4D data), for example for the full Earth elevation model.)

One function does the actual work. It takes a triangle mesh and the samples at each vertex as parameters, and returns the contour curve(s). Let's look at the header file first. Code:
#ifndef ISOCURVES_H
Code:
#include <stdlib.h>
In practice, each element in the tri array contains three numbers. They specify the three vertices as indexes into the point array. The triangles must be defined counterclockwise; in 3D this means you must orient the triangles so that their normals point outwards. The samples (for example elevation) are always in the value array. The level parameter defines the contour sample value. The points parameter is just for error checking; we could instead trust the user that the points they refer to in the tri array exist.

Implementation-wise, the code is admittedly quite ugly; in particular, the temporary edge list could be maintained much more efficiently. (Edges are completely defined by the point indices of their two endpoints -- two different nonnegative integers.)

There are many ways the above function can be used (and implemented, too!). If you have random samples, you can do e.g. Delaunay triangulation. With polygon meshes -- and for example ordinary rectangular grids -- you can just split each cell/loop into triangles.

Well, why not try that? Here is a small main program for the above two files. This one takes the number of samples horizontally and vertically as parameters, plus one or more contour specifications: value:HTMLColor or just value#RRGGBB, where value is the desired contour level. The output is almost valid SVG 1.1. Each contour is filled, so you will want to specify smaller values first. The program will read the samples from standard input, and assumes they form a regular rectangular grid. The rectangles are split along alternating diagonals, and the program will even add an outer border of minimum values to get closed contour curves. The code: Code:
#include <stdio.h>
Code:
gcc -Wall -pedantic -std=c99 -O3 -o test main.c isocurves.c
Code:
N=50 |
I do a lot of work using ISIS3 from NASA and the USGS. One good tool for topography data is GDAL (gdal_translate is the conversion tool): http://www.gdal.org/ GRASS uses GDAL and PROJ.4. |
Hi Nominal! A little update on this topic: I launched your code more than two weeks ago and it's still running! I have 64 levels to compute and it's calculating the 41st now! The only inconvenience is that the output file is more than 500M right now, and I'm afraid I can't manage such a huge object with standard graphics software and "only" 8G of RAM. Let's see at the end.
@John_VV: thanks for the suggestion. I will try a GIS software capable of exporting maps in SVG. |
For the whole Earth, the best topo dataset is NASA's "srtm_ramp2":
http://mirrors.arsc.edu/nasa/topogra...0x43200.bin.gz but you have been at it for a week ... so |
Thanks John. I'm using the ETOPO1 dataset from NOAA to generate my maps; its resolution is actually higher than my requirements call for. Anyway, it's good to know about the existence of another one, and I will add it to my references/bookmarks. |
I would like to enhance the program, but unfortunately I'm going almost-offline for two weeks in a couple of days. It certainly is an interesting problem!
I personally have a physics background, and I have an extremely deep-seated need to know (or be able to estimate) the error bounds and the errors introduced by the theories applied and by my implementation. This is why I tend to do this kind of analysis from scratch. I do not know whether the various GIS packages explain in detail the methodology they use when computing the contour curves, and what kind of approximations and optimizations they do. Software that is designed to yield visually pleasing output is often not that accurate, and looking at standard maps, I do not have much confidence in those. (Personally, I'd also prefer to work on the original triangle meshes, not the regular rectangular grids extrapolated from them.)

I've considered using the slope at the contour curve points to determine bounds for simplifying the final polygons without compromising accuracy. Basically, the flatter the slope is, the more error there is in the approximation of the contour curve anyway -- although the contour curve will always be between the two original samples, one being above and the other below the contour level. The nature of the samples (point samples, or averages over a specific area around the point) affects this, and I'd need to brush up on various bits of theory first to get a real grip on that.

I wish I'd known your dataset would take that long to process. For one, it would have made much more sense to use a binary output format, with each polygon of each level in a separate file. Those are trivial to convert to SVG, but you could have experimented with the polygon optimization before converting the data to SVG. Right now, the rounding the text output uses may be a problem. You could also have started that immediately, when the first level (or even the first polygon of the first level) completed. (Perhaps you could have used Inkscape to simplify the polylines, and calculated the actual error that introduces.)
The complete data for each polygon is about eight (or eleven) doubles per polyline point: (x,y) for the next polyline point, (x0,y0,z0) for the sample above the contour, and (x1,y1,z1) for the sample below the contour. The two latter points are of course the original samples that yielded the point (x,y). If you have 3D points with a separate variable being evaluated, then it is eleven doubles per polyline point: (x,y,z,x0,y0,z0,v0,x1,y1,z1,v1). The contour level (elevation) is always the same for the entire polygon, so there is no need to save it for each point.

Although the isocurves() function does not provide that in the result, it would be trivial to modify it to write the output directly to, say, basename-contourlevel.polygonnumber, as well as to write a helper program to convert that to SVG. (The inverse, converting back from SVG, is a bit more work, due to SVG having elements that transform and translate the path data.)

I'm quite interested in what you're doing, so I wouldn't mind exchanging ideas and suggestions about the way the data is processed; if you don't think it is suitable for LQ, you can always e-mail me directly. |