
downloading and using grib data 101


Jim Marusak


I have to admit: I know of grib data, I know of software like WINGRIDDS, and I have heard of ways to use the two together to make things like BUFKIT files from models we don't currently get BUFKIT files for, such as the Canadian and other foreign models. But I haven't done it yet in my life (yes, I know I should have in my career, but I never had to, so I didn't learn it).

So, how big are grib files these days from the various model runs? What kind of computer horsepower will I need to process all that data? And will I have to learn/relearn a bunch of programming to get it going? (I knew Fortran from college, and I still have the manual; yes, I'm old enough to have taken my met programming class in Fortran, not C.)

 

thanks,


  • 2 weeks later...


I sometimes use grib data for reanalysis in various projects that my advisor has me do. There is an old, but free, program known as GrADS, which uses gribmap indexes to read grib files. It works well enough, but I think there is better software around today, so hopefully somebody can point you to that.

 

And FWIW, I still have to learn some Fortran even now.


  • 3 weeks later...

What do you want to do?

 

There is a lot of free, and very user-friendly, software available for viewing weather model output in grib format.

 

Mostly it is used by sailors and other mariners; the weather data is often overlaid on PC-based navigation programs, so you can easily see the forecast for where you are and where you think you will be.

 

Most navigation programs have a GFS download feature built in: you drag a box around the area of interest, select the number of days and the variables you want, and download.

 

Some programs have built-in downloads for other models: NAM, GEM, NAVGEM.

 

I download and view gribs every day for this purpose.

 

The only limitation is that these programs will mostly display only the 10-20 variables of interest to mariners: 10m wind, 2m temp, MSLP, 500mb height, 850mb wind, CAPE, precip, wind gust, cloud cover, and a few others.

 

You could start with zyGrib:

 

http://www.zygrib.org/

 

The developers are ocean racing sailors.

 

I use Expedition, which is also aimed at sailboat racing:

 

http://www.expeditionmarine.com/

 

You can use it as a grib viewer without buying it; just download it. You won't be able to use it for navigation, because you can only connect instruments to a licensed version, but you aren't looking to do that.


I would like to be able to manipulate it, say for making my own maps without needing other sources, for looking at archived data, and just to be plain familiar with using it. Some of the met jobs out there want you to be able to look at and/or use the data for different things, and I figured that if I at least learned the basics, I'd have a shot later on when I need to use that knowledge.

 

But the reason I wanted to know grib file sizes, the specs of the PC I'd need, etc., is that I was thinking of taking an older laptop sitting around my pad, installing Linux on it, and trying my hand at getting things working. I wanted to be sure what I had was enough before I attempted it, so that I didn't find out partway in that the hardware wasn't up to the job.


For the 0.25-degree GFS, the forecast for one valid time, for the entire domain and all variables, in GRIB2 format is ~215MB.

 

There are about 45 forecast valid times in each run.
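
So a whole run would come to roughly 45 x 215MB, or about 9.7GB.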

 

That's why the software for sailors lets you draw a box around the area you want, and select only the variables you want.

 

For my purposes, I can often just download a GFS forecast of ~100KB, which is a good thing, because I am often doing it over a satphone data connection at 2.4kb/sec, so even 100KB takes a few minutes.

 

Anyway, another somewhat user-friendly, small-footprint program for viewing grib data is Panoply; it's much more research/academic oriented. Panoply will also display netCDF files.

 

http://www.giss.nasa.gov/tools/panoply/


  • 7 months later...


If you're still looking for information on this (or if others are looking for it), grib files can range up into the hundreds of megabytes. But it's also possible to grab only the parts you need using PHP's cURL functions. If you want to, say, get into downloading your own grib files, the accompanying .idx files will tell you where in the respective grib file each parameter starts, so you can request just those byte ranges.
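
For example, here's a minimal sketch of that byte-range approach (the URL is just a placeholder and error handling is omitted; the full script posted below does the real version):

<?PHP
// Minimal sketch: use a grib2 file's .idx index to download just one field.
// NCEP-style index lines look like: recnum:byte_offset:d=YYYYMMDDHH:PARAM:LEVEL:fcst:
$url = 'http://example.invalid/gfs.t00z.pgrb2.0p25.f006'; // placeholder grib2 URL
$param = 'TMP'; $level = '2 m above ground';

$idxlines = explode("\n", file_get_contents($url.'.idx')); // requires allow_url_fopen
foreach($idxlines as $key => $line){
	$f = explode(':', rtrim($line));
	if(isset($f[4]) && $f[3]==$param && $f[4]==$level){
		$start = $f[1]; // the field starts at this byte offset...
		if(isset($idxlines[$key+1]) && $idxlines[$key+1] !== ''){
			$next = explode(':', $idxlines[$key+1]);
			$stop = $next[1]-1; // ...and ends one byte before the next record starts
		}else{
			$stop = ''; // last record: leave the range open-ended
		}
		$grib = curl_init($url);
		curl_setopt($grib, CURLOPT_RETURNTRANSFER, true);
		curl_setopt($grib, CURLOPT_RANGE, $start.'-'.$stop);
		file_put_contents('tmp2m.grib2', curl_exec($grib));
		curl_close($grib);
		break;
	}
}
?>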

As for processing the files, the free GDAL suite of utilities can be used to manipulate and transform grib files into other formats, such as .tif files and shapefiles, and further render them into things like .png image files. For example: gdalwarp can reproject a grib file onto a desired coordinate system, and can also take a geographic subset of it; gdal_calc can run simple or complex calculations on grib file points; gdal_contour can generate shapefiles from grib files; gdaldem can generate image files; and ogr2ogr can transform shapefiles into other formats for easier script processing, like .kml files.
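
To see how those chain together, here is a quick illustrative sketch (the file names are hypothetical; the flags mirror what the full script below uses):

<?PHP
// Sketch of the GDAL chain described above (hypothetical file names).
exec('gdalwarp -t_srs EPSG:4326 subset.grib2 field.tif');       // reproject the grib onto a lat/lon geotiff
exec('gdal_contour -b 1 -a VALUE -i 2 field.tif contours.shp'); // contour the data every 2 units
exec('ogr2ogr -f KML contours.kml contours.shp');               // convert the shapefile to .kml
exec('gdal_translate -of AAIGrid field.tif field.asc');         // dump the raster to an ASCII grid
?>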

 

I run PHP scripts that check for NAM and HRRR grib file updates every 10 minutes, for the purpose of making GRLevel3 placefiles, and download and process the relevant subsets when they're available. The scripts turn them into intermediate .tif and shapefile products (covering the CONUS), interpolated to every 10 minutes between hourly runs. When a placefile is loaded for a given radar site, further scripts process the area of the .tif and shapefiles around that site into .png files and text placefiles.
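
(For anyone wanting to set up something similar, the scheduling side is just a cron entry along these lines; the path and file name here are hypothetical, and the script itself is posted below:)

# hypothetical crontab line: run the download/processing script every 10 minutes
*/10 * * * * /usr/bin/php /var/www/gribs/hrrr/update_hrrr.php >/dev/null 2>&1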

Those scripts run on a 2.5 GHz dual-core server (an Intel E5200, most certainly NOT cutting edge). It's capable of processing and calculating 16 parameters (.tif plus contour shapefiles) from the grib files, which just about maxes out what it can do in the 10 minutes before new grib files are available. I will likely be upgrading that system sometime soon, but it's a useful reference point.

Also, FORTRAN's not that bad :) But you can do amazing things in Perl and PHP these days.



If it's helpful, here's the script I use for downloading and processing grib data. I make no claims as to the beauty of the program, and at some point I plan on going over it with a fine-toothed comb for efficiency and minor errors. But it works fine as is.

 

Feel free to make use of it in whole or in part. This is my original work.

<?PHP
// This is a php file which, when automatically run (for example, as a cron job every 10 minutes) checks for updated HRRR model data
// covering the current and next forecast hour, and if it's available, downloads and processes it.
// The processing includes the creation of:
// 1) geographically resampled and chronologically interpolated geotiff raster files created for every 10 minutes between forecast hours
// 2) derived products calculated from HRRR products
// 3) shapefiles containing level contour polygons
// 4) ascii raster files containing the geotiff data
// 
// This script relies upon the GDAL library. This library is freely available for most platforms.
// Files are broken out by type (of course), forecast period coverage (every 10 minutes between current hour and next hour),
// and product. One file covers one forecast period and one product.
//
// Additional scripts, called on demand via grlevelx products, create placefiles and associated imagery for given locales
// on-the-fly from these data files.
//
// Needless to say, there are other applications for the downloaded and processed data.

// This script was written to work in Ubuntu Linux, but will run on other platforms with minor modifications.
//
// The subdirectory structure is as follows:
// /var/www/gribs/hrrr/grib: temporary downloaded grib files
// /var/www/gribs/hrrr/grid: derived ascii raster files, in AAIGrid format, average about 30 MB each
// /var/www/gribs/hrrr/img: image files covering a geographic subset of model data, created by further scripts on an 
// 		as-requested basis
// /var/www/gribs/hrrr/shp: shape files containing level contour polygons, average about 2 MB each product
// /var/www/gribs/hrrr/temp: lockfiles
// /var/www/gribs/hrrr/tif: geotiff files, about 20.5 MB each
//
// The script will run fine on older hardware (it's currently running on an Intel E5200 dual-core processor), but the computations
// the GDAL library functions use are intensive; more products processed means more processing time. As-is, it will complete 
// processing of its configured data set in about 8 minutes.
//
// Jonathan Williams, 2015


date_default_timezone_set ( 'Zulu' );

chdir('/var/www/gribs/hrrr');
#chdir('c:\apache24\htdocs\gribs\hrrr');

// Array of grib products to download. File inventories may be found here:
// http://www.nco.ncep.noaa.gov/pmb/products/hrrr/hrrr.t00z.wrfprsf00.grib2.shtml
// Array is of the following format:
// [Parameter]_[Level/Layer]_[Output file name prefix]_[Level Flag] => [Contour Flag]_[$Offset]_[$Interval]_[$Extra]_[$PreCalc]
// where
// [Parameter] & [Level/Layer]: identifying characteristics of the product subset in the grib file
// Level Flag, 1 = Product is not height above surface, 0 = subtract product value from surface geopotential height for height above surface
// Contour Flag: 1 = level contours desired for this product, 0 = do not create contours
// $Offset: value applied in contour creation. From GDAL_Contour documentation: "Offset from zero relative to which to interpret intervals"
// $Interval: value applied in contour creation. GDAL_Contour doc: "elevation interval between contours"
// $Extra: Additional GDAL_Contour parameter (future expansion)
// $PreCalc: uniform multiplier to apply to all grid points prior to other calculations
$ProdArray = array(
"CAPE_surface_CAPE_1" => "1_0_500_ _1",
"CIN_surface_CIN_1" => "1_0_50_ _1",
"LFTX_500-1000 mb_LFTX_1" => "1_0_20_ _10",
"TMP_2 m above ground_TMP_1" => "1_-1.11111111_2.7777778_ _1",
"TMP_surface_STMP_1" => "1_-1.11111111_2.7777778_ _1",
"PRMSL_mean sea level_PRMSL_1" => "1_0_200_ _1",
"DPT_2 m above ground_DPT_1" => "1_-1.11111111_2.7777778_ _1",
"HLCY_3000-0 m above ground_HLCY_1" => "1_0_250_ _1",
"MXUPHL_5000-2000 m above ground_MXUPHL_1" => "1_0_25_ _1",
"PWAT_entire atmosphere (considered as a single layer)_PWAT_1" => "1_0_5_ _1",
);

// Products that will be calculated/derived from downloaded products.
// Array is of the following format:
// [Output file name prefix] = [Operand Prod A]_[Operand Prod B]_[Equation]_[Contour Flag]_[$Offset]_[$Interval]_[$Extra]_[$PreCalc]
$CalcArray = array(
"EHI" => "CAPE_HLCY_(A*B)/160000_1_0_1_ _1"
);

// Reference Surface geopotential height grib product
$SurfaceLevel = "HGT_surface_SLVL";

// Initialize a few variables
date_default_timezone_set("GMT");
$ModelUnixTime = time(); // Current time (Unix time, in seconds). This variable will be used
                         // to find the model run to acquire.

$ForecastOffset = 0; // The model forecast hour. Start at 0 (current hour).
$LastFile = 0;

// File for keeping track of the most recent grib file accessed. Load that into "$LastFile".
$trackhandle = @fopen('track.txt','r');
if($trackhandle){
	if(($buffer = fgets($trackhandle))!==false){
		$LastFile = rtrim($buffer);
	}
	fclose($trackhandle);
}

// Loop from current hour to six hours behind, to find the most recent HRRR model run available
// on the NCEP FTP site with grib files covering both the current forecast hour and the next
// hour. We're interested in acquiring both the data for the current forecast hour and the data
// for the next forecast hour (both from the same, most recent model run), and interpolating
// between the two in 10-minute intervals.
echo "Downloading grib files...\n";
for($i = 0; $i < 6; $i++){
	$url = 'http://www.ftp.ncep.noaa.gov/data/nccf/nonoperational/com/hrrr/prod/hrrr.'.date('Ymd',$ModelUnixTime).'/hrrr.t'.date('H',$ModelUnixTime).'z.wrfprsf'.str_pad($ForecastOffset+1,2,"0",STR_PAD_LEFT).'.grib2';
	$idx = curl_init($url.".idx");
	curl_setopt($idx, CURLOPT_NOBODY, true);
	curl_setopt($idx, CURLOPT_CONNECTTIMEOUT,30);
	curl_setopt($idx, CURLOPT_TIMEOUT,30);
	curl_exec($idx);
	$retcode = curl_getinfo($idx, CURLINFO_HTTP_CODE);
	curl_close($idx);
	// $retcode >= 400 -> not found; $retcode == 200 -> found.
	if($retcode < 350) break; // We've found the most current model run; move on.
	$ModelUnixTime -= 3600; // Current run not yet found, so search for a model run one hour earlier, and...
	$ForecastOffset += 1; // increment the forecast hour so that we're still covering the same forecast time
}

if($i == 6) exit("Couldn't find a recent grib file");
#-------------------------------------------------------

// Set "$FileFound" to the grib file for the forecast hour *before* what was found; this will be
// stored in our "track.txt" tracker file. 
$FileFound = date('YmdH',$ModelUnixTime).str_pad($ForecastOffset,2,"0",STR_PAD_LEFT);

// If we have a new model file, proceed with acquiring the data
if($FileFound != $LastFile){
	// Begin by downloading the grib file index for the next forecast hour's file
	$idx = curl_init($url.".idx");
	curl_setopt($idx, CURLOPT_RETURNTRANSFER, true);
	curl_setopt($idx, CURLOPT_CONNECTTIMEOUT,30);
	curl_setopt($idx, CURLOPT_TIMEOUT,30);
	$idxfile = curl_exec($idx); // Download grib file index
	curl_close($idx);
	if($idxfile===FALSE) exit("Couldn't access index file 1");
	$idxlines = explode("\n",$idxfile); // Store index file contents in array "$idxlines"
	
	// Loop to find in the index file the start and end points of each desired product within the grib file. 
	// As these are listed in the index files, we don't have to download the entire grib file.
	foreach($ProdArray as $ProdLine => $tmp){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		foreach($idxlines as $key => $line){
			$linearray = explode(":",rtrim($line));
			if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
				$start = $linearray[1];
				unset($linearray);
				if(isset($idxlines[$key+1])){
					$linearray = explode(":",rtrim($idxlines[$key+1]));
					$stop = $linearray[1]-1;
				}else{
					$stop = '';
				}
				break;
			}
		}
		if(!isset($start)){
			exit("Product not found!");
		}

		$Range[$FName] = $start."-".$stop; // Range of bytes in the grib file containing the product we want
		unset($start);
		unset($stop);
	}

	// Now down to business -- use the $Range value for each product to download data from the
	// appropriate grib file, and store it in a temporary, local grib file
	foreach($ProdArray as $ProdLine => $tmp){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		$gribhandle = @fopen('./grib/temp_'.$FName."_1.grib2","w+");
		$grib = curl_init($url);
		curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
		curl_setopt($grib, CURLOPT_TIMEOUT,30);
		curl_setopt($grib, CURLOPT_FILE, $gribhandle);
		curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
		curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
		if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
		curl_close($grib);
		fclose($gribhandle);
	}
	// Download the reference surface geopotential height
	list($Prod,$Level,$FName) = explode("_",$SurfaceLevel);
	foreach($idxlines as $key => $line){
		$linearray = explode(":",rtrim($line));
		if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
			$start = $linearray[1];
			unset($linearray);
			if(isset($idxlines[$key+1])){
				$linearray = explode(":",rtrim($idxlines[$key+1]));
				$stop = $linearray[1]-1;
			}else{
				$stop = '';
			}
			break;
		}
	}
	if(!isset($start)){
		exit("Product not found!");
	}

	$Range[$FName] = $start."-".$stop;
	unset($start);
	unset($stop);

	$gribhandle = @fopen('./grib/temp_'.$FName."_1.grib2","w+");
	$grib = curl_init($url);
	curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
	curl_setopt($grib, CURLOPT_TIMEOUT,30);
	curl_setopt($grib, CURLOPT_FILE, $gribhandle);
	curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
	curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
	if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
	curl_close($grib);
	fclose($gribhandle);

	unset($idxlines);

// Now, do the whole thing over again for the previous (current) forecast hour

	$url = 'http://www.ftp.ncep.noaa.gov/data/nccf/nonoperational/com/hrrr/prod/hrrr.'.date('Ymd',$ModelUnixTime).'/hrrr.t'.date('H',$ModelUnixTime).'z.wrfprsf'.str_pad($ForecastOffset,2,"0",STR_PAD_LEFT).'.grib2';
	$idx = curl_init($url.".idx");
	curl_setopt($idx, CURLOPT_RETURNTRANSFER, true);
	curl_setopt($idx, CURLOPT_CONNECTTIMEOUT,30);
	curl_setopt($idx, CURLOPT_TIMEOUT,30);
	$idxfile = curl_exec($idx); // Download grib file index for the current forecast hour
	curl_close($idx);
	if($idxfile===FALSE) exit("Couldn't access index file 0");
	$idxlines = explode("\n",$idxfile);
	foreach($ProdArray as $ProdLine => $tmp){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		foreach($idxlines as $key => $line){
			$linearray = explode(":",rtrim($line));
			if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
				$start = $linearray[1];
				unset($linearray);
				if(isset($idxlines[$key+1])){
					$linearray = explode(":",rtrim($idxlines[$key+1]));
					$stop = $linearray[1]-1;
				}else{
					$stop = '';
				}
				break;
			}
		}
		if(!isset($start)){
			exit("Product not found!");
		}

		$Range[$FName] = $start."-".$stop;
		unset($start);
		unset($stop);
	}

	foreach($ProdArray as $ProdLine => $tmp){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		$gribhandle = @fopen('./grib/temp_'.$FName."_0.grib2","w+");
		$grib = curl_init($url);
		curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
		curl_setopt($grib, CURLOPT_TIMEOUT,30);
		curl_setopt($grib, CURLOPT_FILE, $gribhandle);
		curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
		curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
		if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
		curl_close($grib);
		fclose($gribhandle);
	}

	list($Prod,$Level,$FName) = explode("_",$SurfaceLevel);
	foreach($idxlines as $key => $line){
		$linearray = explode(":",rtrim($line));
		if(($linearray[3]==$Prod)&&($linearray[4]==$Level)){
			$start = $linearray[1];
			unset($linearray);
			if(isset($idxlines[$key+1])){
				$linearray = explode(":",rtrim($idxlines[$key+1]));
				$stop = $linearray[1]-1;
			}else{
				$stop = '';
			}
			break;
		}
	}
	if(!isset($start)){
		exit("Product not found!");
	}

	$Range[$FName] = $start."-".$stop;
	unset($start);
	unset($stop);

	$gribhandle = @fopen('./grib/temp_'.$FName."_0.grib2","w+");
	$grib = curl_init($url);
	curl_setopt($grib, CURLOPT_CONNECTTIMEOUT,30);
	curl_setopt($grib, CURLOPT_TIMEOUT,30);
	curl_setopt($grib, CURLOPT_FILE, $gribhandle);
	curl_setopt($grib, CURLOPT_FOLLOWLOCATION, true);
	curl_setopt($grib, CURLOPT_RANGE,$Range[$FName]);
	if(curl_exec($grib)===FALSE) exit("Couldn't download ".$url);
	curl_close($grib);
	fclose($gribhandle);

	unset($idxlines);

//-------------------------------------------------------
// Convert surface geopotential height grib files into geotiff files, using gdalwarp to map
//  geographical coordinates onto something we can work with
// http://www.gdal.org/gdalwarp.html
	echo "Finding surface level...\n";
	list($Prod,$Level,$LevelBase) = explode("_",$SurfaceLevel);
	$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$LevelBase.'_0.grib2 ./tif/'.$LevelBase."_00m.tif";
	exec($ExecText);
	$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$LevelBase.'_1.grib2 ./tif/'.$LevelBase."_60m.tif";
	exec($ExecText);
	unlink('./grib/temp_'.$LevelBase."_0.grib2");
	unlink('./grib/temp_'.$LevelBase."_1.grib2");

//-------------------------------------------------------
// Convert remaining product grib files into geotiff files, using gdalwarp to map geographical
// coordinates onto something we can work with
	foreach($ProdArray as $ProdLine => $CData){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		echo $FName."\n";
		list($GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
		$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$FName.'_0.grib2 ./tif/temp_'.$FName."_00m.tif";
		exec($ExecText);
		$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -t_srs EPSG:4326 ./grib/temp_'.$FName.'_1.grib2 ./tif/temp_'.$FName."_60m.tif";
		exec($ExecText);
		
		// wipe temporary grib files no longer needed, and delete old geotiff files
		unlink('./grib/temp_'.$FName."_0.grib2");
		unlink('./grib/temp_'.$FName."_1.grib2");
		for ($i = 0; $i < 7; $i++){
			if(file_exists('./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif')){
				unlink('./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif');
			}
		}
		
		// use gdal_calc to apply precalc values to data, and subtract surface geopotential height
		// from products as appropriate
		// http://www.gdal.org/gdal_calc.html
		if($LFlag == 1){
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_00m.tif --A_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_00m.tif --calc="'.$PreCalc.'*A"';
			exec($ExecText);
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_60m.tif --A_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_60m.tif --calc="'.$PreCalc.'*A"';
			exec($ExecText);
		}else{
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_00m.tif --A_band=1 -B ./tif/'.$LevelBase.'_00m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_00m.tif --calc="'.$PreCalc.'*(A-B)"';
			exec($ExecText);
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/temp_'.$FName.'_60m.tif --A_band=1 -B ./tif/'.$LevelBase.'_60m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_60m.tif --calc="'.$PreCalc.'*(A-B)"';
			exec($ExecText);
		}
		
		// wipe temporary geotiff files no longer needed
		unlink('./tif/temp_'.$FName."_00m.tif");
		unlink('./tif/temp_'.$FName."_60m.tif");

		// use gdal_calc to calculate linearly interpolated data between this forecast hour and the
		// next, in 10 minute intervals, and write the output as geotiff files
		for ($i = 1; $i < 6; $i++){
			$j = round($i/6,6);
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/'.$FName.'_00m.tif --A_band=1 -B ./tif/'.$FName.'_60m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_'.str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif --calc="(A+('.$j.'*(B-A)))"';
			exec($ExecText);
		}
	}
//---------------------------------------------------------------
// Produce calculated geotiff files from product files
	foreach($CalcArray as $FName => $CData){
		echo $FName."\n";
		list($FName1,$FName2,$COp,$GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
		for ($i = 0; $i < 7; $i++){
			$ExecText = '/usr/bin/gdal_calc.py -A ./tif/'.$FName1.'_'.$i.'0m.tif --A_band=1 -B ./tif/'.$FName2.'_'.$i.'0m.tif --B_band=1 --NoDataValue=9999 --outfile=./tif/'.$FName.'_'.str_pad($i,2,"0",STR_PAD_RIGHT).'m.tif --calc="'.$COp.'"';
			exec($ExecText);
		}
	}
//---------------------------------------------------------------
// So, now we've downloaded grib data and produced geotiff raster files for all
// of our data. These files cover the entire geographic extent of the model type
// in question, for the current forecast hour to the next, in 10 minute intervals.
// The geotiff files are broken out by product.
// They are found in the following form:
// ./tif/[FName]_XXm.tif
// where [FName] is from the ProdArray/CalcArray lists, and "XX" is the 10 minute
// time period in question (from "00" to "60")
//---------------------------------------------------------------
// Next, we need to produce resampled geotiff files (cubic spline interpolation at 0.03 degree
// points) to improve map appearance at high zoom levels, create level contour shape files
// from the data, and create ascii files from the geotiff files for further script processing.

	foreach($ProdArray as $ProdLine => $CData){
		list($Prod,$Level,$FName,$LFlag) = explode("_",$ProdLine);
		list($GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
		echo "Contouring, Resampling, ASCII Grids...".$FName."\n";
		for ($i = 0; $i < 7; $i++){
			$temphandle = @fopen('./temp/'.$FName.'lockfile','w'); // create a lock file while we're working on the output geotiff files
			if($temphandle){
				if(flock($temphandle, LOCK_EX)){
					if(file_exists('./tif/'.$FName."_".$i."0m_res.tif")){
						unlink('./tif/'.$FName."_".$i."0m_res.tif");
					}
					$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -tr 0.03 0.03 -r cubicspline ./tif/'.$FName."_".$i.'0m.tif ./tif/'.$FName."_".$i."0m_res.tif";
					exec($ExecText);
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle); // close the lock file
			}
			unlink('./tif/'.$FName."_".$i."0m.tif"); // wipe non-resampled geotiff files
			
			// if product calls for contours, calculate here
			if($GetContours == 1){
				$temphandle = @fopen('./temp/'.$FName.'lockfile','w');
				if($temphandle){
					if(flock($temphandle, LOCK_EX)){
						if(file_exists('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp")){ // wipe old shape files
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shx");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.prj");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.dbf");
						}
						
						// use gdal_contour to produce shape (vector) files containing contour level polygons from the geotiff
						// files
						// http://www.gdal.org/gdal_contour.html
						$ExecText = "gdal_contour -b 1 -a ".$FName." -snodata 9999 -i ".$Interval." -off ".$Offset.$Extra.'./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m_res.tif .\/shp\/".$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp";
						exec($ExecText);
					}
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle);
			}
			$temphandle = @fopen('./temp/'.$FName.'lockfile','w');
			if($temphandle){
				if(flock($temphandle, LOCK_EX)){
					if (file_exists('./grid/'.$FName.'_'.$i.'0m.asc')){
						unlink('./grid/'.$FName.'_'.$i.'0m.asc');
						unlink('./grid/'.$FName.'_'.$i.'0m.prj');
						unlink('./grid/'.$FName.'_'.$i.'0m.asc.aux.xml');
					}
					// Use gdal_translate to take the geotiff files and create ascii files of the data.
					// This will allow derivation of, for example, grids of text values for placefiles. 
					// http://www.gdal.org/gdal_translate.html
					// http://www.gdal.org/frmt_various.html
					$ExecText = 'gdal_translate -b 1 -of AAIGrid -ot Float32 ./tif/'.$FName.'_'.$i.'0m_res.tif ./grid/'.$FName.'_'.$i.'0m.asc'; 
					exec($ExecText);
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle);
			}
		}
	}
	
	// Do the same as above for calculated data (could be combined with above)
	foreach($CalcArray as $FName => $CData){
		list($FName1,$FName2,$COp,$GetContours,$Offset,$Interval,$Extra,$PreCalc) = explode("_",$CData);
		echo "Contouring, Resampling, ASCII Grids...".$FName."\n";
		for ($i = 0; $i < 7; $i++){
			$temphandle = @fopen('./temp/'.$FName.'lockfile','w');
			if($temphandle){
				if(flock($temphandle, LOCK_EX)){
					if(file_exists('./tif/'.$FName."_".$i."0m_res.tif")){
						unlink('./tif/'.$FName."_".$i."0m_res.tif");
					}
					$ExecText = 'gdalwarp -srcnodata 9999 -dstnodata 9999 -tr 0.03 0.03 -r cubicspline ./tif/'.$FName."_".$i.'0m.tif ./tif/'.$FName."_".$i."0m_res.tif";
					exec($ExecText);
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle);
			}
			unlink('./tif/'.$FName."_".$i."0m.tif");
			if($GetContours == 1){
				$temphandle = @fopen('./temp/'.$FName.'lockfile','w');
				if($temphandle){
					if(flock($temphandle, LOCK_EX)){
						if(file_exists('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp")){
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shx");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.prj");
							unlink('./shp/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.dbf");
						}
						$ExecText = "gdal_contour -b 1 -a ".$FName." -snodata 9999 -i ".$Interval." -off ".$Offset.$Extra.'./tif/'.$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m_res.tif .\/shp\/".$FName."_".str_pad($i,2,"0",STR_PAD_RIGHT)."m.shp";
						exec($ExecText);
					}
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle);
			}
			$temphandle = @fopen('./temp/'.$FName.'lockfile','w');
			if($temphandle){
				if(flock($temphandle, LOCK_EX)){
					if (file_exists('./grid/'.$FName.'_'.$i.'0m.asc')){
						unlink('./grid/'.$FName.'_'.$i.'0m.asc');
						unlink('./grid/'.$FName.'_'.$i.'0m.prj');
						unlink('./grid/'.$FName.'_'.$i.'0m.asc.aux.xml');
					}
					$ExecText = 'gdal_translate -b 1 -of AAIGrid -ot Float32 ./tif/'.$FName.'_'.$i.'0m_res.tif ./grid/'.$FName.'_'.$i.'0m.asc'; 
					exec($ExecText);
					flock($temphandle, LOCK_UN);
				}
				fclose($temphandle);
			}
		}
	}
	
	// Clean up
	if(file_exists('./tif/'.$LevelBase."_00m.tif")){
		unlink('./tif/'.$LevelBase."_00m.tif");
	}
	if(file_exists('./tif/'.$LevelBase."_60m.tif")){
		unlink('./tif/'.$LevelBase."_60m.tif");
	}
	
	// Update our "track.txt" file with the latest model run
	$trackhandle = @fopen('track.txt','w');
	if($trackhandle){
		fwrite($trackhandle,$FileFound);
		fclose($trackhandle);
	}else{
		exit("Problem creating tracking file");
	}
}
?>

