wget is probably the simplest option: just put the URL in quotes and stick "wget" in front of it. By default it writes a file named after the page (in this case a long and complicated mess of CGI parameters), but you can use the -O option to name the output file whatever you like, or use "-O -" to send the output to stdout so you can capture it in a variable, e.g.:
Code:
val=$(wget -O - "http://weather.uwyo.edu/cgi-bin/balloon_traj?TIME=2016112806&FCST=0&POINT=none&LAT=34.65&LON=-117.6&TOP=6000&CALCDROP=on&MASS=1.2&DIAM=2&Cd=0.7&OUTPUT=list&Submit=Submit&.cgifields=POINT&.cgifields=FCST&.cgifields=CALCDROP&.cgifields=TIME&.cgifields=OUTPUT")
You could also use the text-based web browser lynx if you like:
Code:
val=$(lynx --dump "http://weather.uwyo.edu/cgi-bin/balloon_traj?TIME=2016112806&FCST=0&POINT=none&LAT=34.65&LON=-117.6&TOP=6000&CALCDROP=on&MASS=1.2&DIAM=2&Cd=0.7&OUTPUT=list&Submit=Submit&.cgifields=POINT&.cgifields=FCST&.cgifields=CALCDROP&.cgifields=TIME&.cgifields=OUTPUT")
That will format the output much as you'd see it in a web browser.
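Either way, once the page text is sitting in $val you can slice it up with the usual text tools like awk or grep. A minimal sketch (the data layout below is invented for illustration, so adjust the field and line numbers to match the actual page output):
Code:
# Hypothetical two-line output captured in $val (real page layout will differ)
val="STATION  LAT    LON
72293  32.85  -117.12"

# Pull the second field (latitude) from the second line
lat=$(printf '%s\n' "$val" | awk 'NR==2 {print $2}')
echo "$lat"
Quoting "$val" matters here: without the quotes the shell would collapse the newlines into spaces and the line-based tools would only ever see one line.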