Commit 3742c965 authored by Geoffrey Cowles's avatar Geoffrey Cowles

transfer from google code

.svn
*.swp
*~
Pierre Cazenave <pica@pml.ac.uk> fvcom-toolbox ChangeLog
fvcom-prepro:
add_obc_nodes_graphic.m:
* Minor fix to make the defined function name match the file name.
add_obc_nodes_list.m:
* Add optional new argument to plot a figure of the open boundary
nodes.
add_sponge_nodes.m:
* Minor code cleanup.
add_sponge_nodes_list.m:
* New function: For a given list of nodes, apply a given sponge
coefficient over a specified sponge radius (both given as arguments
to the function). Follows the same syntax as add_sponge_nodes.m.
add_stations_list.m:
* New function: Add a set of stations at which FVCOM will output time
series. Requires station coordinates and names, and a threshold
distance beyond which the station is skipped.
calc_sponge_radius.m:
* New function: Create a sponge layer with a variable-width radius based
on the proximity of the closest element or 100 km (whichever is smaller).
estimate_ts.m:
* Add arguments to the function so that the estimated current velocity
and tidal range aren't hard-coded in the function.
* Also add a crude conversion from lat/long distances to UTM (metres)
using a great circle approximation.
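The great circle idea mentioned above can be sketched as follows. This is an illustrative Python version (a haversine distance on a spherical Earth), not the actual estimate_ts.m code; the function name and radius value are assumptions:

```python
import math

def deg_to_metres(lat1, lon1, lat2, lon2, radius=6371000.0):
    # Great-circle (haversine) distance in metres between two lat/long
    # points on a sphere of the given mean Earth radius. A crude
    # stand-in for a proper UTM conversion.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = phi2 - phi1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + \
        math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius * math.asin(math.sqrt(a))
```

One degree of latitude comes out at roughly 111 km, which is the scale of error such a crude conversion tolerates.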
example_FVCOM_tsobc.m:
* Convert the method of writing NetCDF files from third-party toolbox
to the MATLAB native version.
* Add arguments to set dynamic file names (rather than hard-coded).
example_FVCOM_wind_ts_speed.m:
* Minor code change to calculate the number of elements in an array.
get_AMM.m:
* New function: Extract boundary forcing information from NOC
Operational Tide Surge Model output.
get_HYCOM_forcing.m:
* New function: INCOMPLETE! Function to extract boundary forcing
information from HYCOM model output through OPeNDAP.
get_NAE2_forcing.m:
* New function: Get the required parameters from NAE2 MetOffice model
data to force FVCOM at the surface.
get_NCEP_forcing.m:
* New function: Get the required parameters from NCEP through OPeNDAP.
Requires the air_sea and OPeNDAP toolboxes.
get_POLCOMS_tsobc.m:
* New function: Extract temperature and salinity data from the PML
POLCOMS NetCDF files and interpolate them to the FVCOM open boundary
nodes.
get_POLPRED_spectide.m:
* New function: Extract POLPRED harmonic amplitude and phases for the
nearest point in the POLPRED grid to the open boundary nodes in the
FVCOM grid.
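A nearest-grid-point lookup of the kind described above can be sketched in Python. This brute-force version is illustrative only: the function name and data layout are assumptions, and the MATLAB code may use a different metric or search strategy:

```python
def nearest_point(points, target):
    # Return the index of the grid point closest to the target,
    # by squared Euclidean distance (no need for the square root
    # when only the ordering matters).
    return min(range(len(points)),
               key=lambda i: (points[i][0] - target[0]) ** 2 +
                             (points[i][1] - target[1]) ** 2)
```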
grid2fvcom.m:
* New function: Interpolate regularly gridded surface forcing data onto
a given FVCOM grid.
read_ERA_wind.m:
* New function: Extract wind data from ERA Interim NetCDF files.
read_NCEP_wind.m:
* New function: Read in pairs of NCEP wind vector files and output
arrays of the required velocity components.
read_sigma.m:
* New function: Extract sigma layer and level information from a given
sigma.dat file.
read_sms_mesh.m:
* Add the ability to extract open boundaries from the defined
nodestrings in the SMS grid file (.2dm).
* Note: the SMS grid name must be a single word (no spaces). If the
script fails to load your grid, chances are its name is "Default
Coverage". If you change it to omit spaces in either the .2dm file or
the SMS project, this script should work.
replace_FVCOM_restart_vars.m:
* New function: For a given FVCOM restart file, replace select variables
with values from a PML POLCOMS model NetCDF output file. POLCOMS data
are interpolated first on the FVCOM vertical grid, then the vertically
interpolated data are interpolated onto the horizontal FVCOM grid.
set_elevtide.m:
* New function: Write out timeseries of surface elevations to a NetCDF
file.
* Requires the Tide Model Driver MATLAB Toolbox from Oregon State
University at http://polaris.esr.org/ptm_index.html.
set_spectide.m:
* Replace hard-coded variables with arguments to function call.
* Add support for adding equilibrium amplitudes and beta Love numbers.
write_FVCOM_elevtide.m:
* New function to accompany set_elevtide.m to output a time series of
surface elevations.
write_FVCOM_forcing.m:
* New function: Write forcing data (u and v winds, heat flux etc.) out
to NetCDF file(s) depending on specified FVCOM version.
write_FVCOM_obs_TS.m:
* Converted to use MATLAB native NetCDF routines.
write_FVCOM_spectide.m:
* Replace use of third-party NetCDF library with MATLAB native tools.
* Add support for writing out equilibrium amplitudes and beta Love
numbers.
write_FVCOM_sponge.m:
* Add support for variable width sponge layers (see
calc_sponge_radius.m).
write_FVCOM_stations.m:
* New function: Output list of stations at which FVCOM will output 1D
time series.
write_FVCOM_tsobc.m:
* Create a new function to output temperature and salinity at the open
boundaries, either spatially uniform or varying (e.g. interpolated from
POLCOMS using get_POLCOMS_tsobc.m). Based on the example_FVCOM_tsobc.m
file.
write_FVCOM_wind_ts_speed.m:
* Create a new function to output spatially uniform but temporally
varying wind field. Based on example_FVCOM_wind_ts_speed.m.
write_FVCOM_z0.m:
* Add support for MATLAB native NetCDF routines.
fvcom-postproc:
example_surface_plot.m:
* Now includes a lot more examples of different types of surface plots
(e.g. vectors at specific layers).
utilities:
centroid.m:
* New function: Calculate the centroid of a given polygon.
connectivity.m:
* New function: From Mesh2D toolbox. Read unstructured grid
connectivity (useful to find grid boundary, for example).
deg2utm.m:
* New function: Convert from lat/long to UTM (but automatically
determine relevant UTM zone). See also wgs2utm (where UTM zone can be
forced).
do_residual.m:
* New function: Calculate the residual vector for a given 2D time
series.
do_residual_plot.m:
* New function: Use the output of do_residual.m to plot residual
vectors.
do_transect_plot.m:
* New function: Pick and plot a transect through model output.
greg2mjulian.m:
* Format help to match better with other utilities.
get_NCEP_year.m:
* Extract the year from an NCEP Reanalysis file name.
show_sigma.m:
* Made the reading in of the sigma.dat file more resilient (notably to
comments and blank lines).
utm2deg.m:
* Function to convert from UTM to lat/long.
* From http://www.mathworks.com/matlabcentral/fileexchange/10914
wgs2utm.m:
* New function: Convert from lat/long to UTM whilst being able to force
the UTM zone (allowing for coordinates which spread over several UTM
zones). See also deg2utm.
fvcom-toolbox
=============
The fvcom-toolbox is a collection of MATLAB and Fortran 90 scripts for preparing and postprocessing data from the Finite Volume Community Ocean Model (FVCOM). These include:
1. scripts for preparing input files for FVCOM (wind forcing, open boundary forcing, river forcing, etc.)
2. scripts for converting meshes from SMS to FVCOM
3. scripts for postprocessing FVCOM data using MATLAB
4. scripts for preparing data for the unstructured SWAN model
Notes:
(1) The html based documentation is generated using m2html and is available with the download (see doc/index.html)
(2) The code was originally maintained in a Google Code repository (http://code.google.com/p/fvcom-toolbox/). This repository was used between September 2010 (initial commit) and July 2013, when the code was moved to GitHub (https://github.com/GeoffCowles/fvcom-toolbox). Commit history was not maintained during the move, as substantial revisions had been made to the code by Plymouth Marine Laboratory members outside of version control. The GitHub trunk includes most of these changes, which are noted in the file headers and in the file PML_ChangeLog.txt.
<?php
/******************************************************************************
*
* $Id:$
*
* Copyright (C) 1997-2003 by Dimitri van Heesch.
*
* Permission to use, copy, modify, and distribute this software and its
* documentation under the terms of the GNU General Public License is hereby
* granted. No representations are made about the suitability of this software
* for any purpose. It is provided "as is" without express or implied warranty.
* See the GNU General Public License for more details.
*
*/
function readInt($file)
{
$b1 = ord(fgetc($file)); $b2 = ord(fgetc($file));
$b3 = ord(fgetc($file)); $b4 = ord(fgetc($file));
return ($b1<<24)|($b2<<16)|($b3<<8)|$b4;
}
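readInt() above assembles a big-endian 32-bit integer from four bytes. An equivalent sketch in Python (names are illustrative; the PHP returns the same value via shift-and-or):

```python
import struct

def read_int(buf, offset=0):
    # Decode an unsigned big-endian 32-bit integer at the given offset.
    return struct.unpack_from(">I", buf, offset)[0]

def read_int_manual(b):
    # The same shift-and-or assembly the PHP performs byte by byte.
    return (b[0] << 24) | (b[1] << 16) | (b[2] << 8) | b[3]
```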
function readString($file)
{
$result="";
while (ord($c=fgetc($file))) $result.=$c;
return $result;
}
function readHeader($file)
{
$header =fgetc($file); $header.=fgetc($file);
$header.=fgetc($file); $header.=fgetc($file);
return $header;
}
function computeIndex($word)
{
if (strlen($word)<2) return -1;
// high char of the index
$hi = ord($word[0]);
if ($hi==0) return -1;
// low char of the index
$lo = ord($word[1]);
if ($lo==0) return -1;
// return index
return $hi*256+$lo;
}
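The two-character index scheme above maps each search word to a slot keyed on its first two bytes. A Python sketch of the same computation (illustrative; the PHP operates on raw bytes, `ord()` here on characters):

```python
def compute_index(word):
    # Index is hi*256 + lo over the word's first two characters;
    # -1 signals "too short" or a NUL in either position.
    if len(word) < 2:
        return -1
    hi, lo = ord(word[0]), ord(word[1])
    if hi == 0 or lo == 0:
        return -1
    return hi * 256 + lo
```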
function search($file,$word,&$statsList)
{
$index = computeIndex($word);
if ($index!=-1) // found a valid index
{
fseek($file,$index*4+4); // 4 bytes per entry, skip header
$index = readInt($file);
if ($index) // found words matching first two characters
{
$start=sizeof($statsList);
$count=$start;
fseek($file,$index);
$w = readString($file);
while ($w)
{
$statIdx = readInt($file);
if ($word==substr($w,0,strlen($word)))
{ // found word that matches (as substring)
$statsList[$count++]=array(
"word"=>$word,
"match"=>$w,
"index"=>$statIdx,
"full"=>strlen($w)==strlen($word),
"docs"=>array()
);
}
$w = readString($file);
}
$totalFreq=0;
for ($count=$start;$count<sizeof($statsList);$count++)
{
$statInfo = &$statsList[$count];
fseek($file,$statInfo["index"]);
$numDocs = readInt($file);
$docInfo = array();
// read docs info + occurrence frequency of the word
for ($i=0;$i<$numDocs;$i++)
{
$idx=readInt($file);
$freq=readInt($file);
$docInfo[$i]=array("idx"=>$idx,"freq"=>$freq,"rank"=>0.0);
$totalFreq+=$freq;
if ($statInfo["full"]) $totalFreq+=$freq;
}
// read name and url info for the doc
for ($i=0;$i<$numDocs;$i++)
{
fseek($file,$docInfo[$i]["idx"]);
$docInfo[$i]["name"]=readString($file);
$docInfo[$i]["url"]=readString($file);
}
$statInfo["docs"]=$docInfo;
}
for ($count=$start;$count<sizeof($statsList);$count++)
{
$statInfo = &$statsList[$count];
for ($i=0;$i<sizeof($statInfo["docs"]);$i++)
{
$docInfo = &$statInfo["docs"];
// compute frequency rank of the word in each doc
$statInfo["docs"][$i]["rank"]=
(float)$docInfo[$i]["freq"]/$totalFreq;
}
}
}
}
return $statsList;
}
function combine_results($results,&$docs)
{
foreach ($results as $wordInfo)
{
$docsList = &$wordInfo["docs"];
foreach ($docsList as $di)
{
$key=$di["url"];
$rank=$di["rank"];
if (in_array($key, array_keys($docs)))
{
$docs[$key]["rank"]+=$rank;
$docs[$key]["rank"]*=2; // multiple matches increase the rank
}
else
{
$docs[$key] = array("url"=>$key,
"name"=>$di["name"],
"rank"=>$rank
);
}
$docs[$key]["words"][] = array(
"word"=>$wordInfo["word"],
"match"=>$wordInfo["match"],
"freq"=>$di["freq"]
);
}
}
return $docs;
}
function normalize_ranking(&$docs)
{
$maxRank = 0.0000001;
// compute maximal rank
foreach ($docs as $doc)
{
if ($doc["rank"]>$maxRank)
{
$maxRank=$doc["rank"];
}
}
// normalize rankings
foreach ($docs as $key => $val)
{
$docs[$key]["rank"]*=100/$maxRank;
}
}
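The normalization step above rescales every rank so the best document scores 100. A Python sketch of the same logic (illustrative, not the PHP itself):

```python
def normalize_ranking(docs):
    # Find the maximum rank; the tiny floor avoids division by zero
    # when every rank is zero, mirroring the PHP above.
    max_rank = 0.0000001
    for doc in docs.values():
        if doc["rank"] > max_rank:
            max_rank = doc["rank"]
    # Scale all ranks so the best document scores 100.
    for doc in docs.values():
        doc["rank"] *= 100 / max_rank
```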
function filter_results($docs,&$requiredWords,&$forbiddenWords)
{
$filteredDocs=array();
foreach ($docs as $key => $val)
{
$words = &$docs[$key]["words"];
$copy=1; // copy entry by default
if (sizeof($requiredWords)>0)
{
foreach ($requiredWords as $reqWord)
{
$found=0;
foreach ($words as $wordInfo)
{
$found = $wordInfo["word"]==$reqWord;
if ($found) break;
}
if (!$found)
{
$copy=0; // document does not contain this required word
break;
}
}
}
if (sizeof($forbiddenWords)>0)
{
foreach ($words as $wordInfo)
{
if (in_array($wordInfo["word"],$forbiddenWords))
{
$copy=0; // document contains a forbidden word
break;
}
}
}
if ($copy) $filteredDocs[$key]=$docs[$key];
}
return $filteredDocs;
}
function compare_rank($a,$b)
{
return ($a["rank"]>$b["rank"]) ? -1 : 1;
}
function sort_results($docs,&$sorted)
{
$sorted = $docs;
usort($sorted,"compare_rank");
return $sorted;
}
function report_results(&$docs)
{
echo "<table cellspacing=\"2\">\n";
echo " <tr>\n";
echo " <td colspan=\"2\"><h2>Search Results</h2></td>\n";
echo " </tr>\n";
$numDocs = sizeof($docs);
if ($numDocs==0)
{
echo " <tr>\n";
echo " <td colspan=\"2\">".matches_text(0)."</td>\n";
echo " </tr>\n";
}
else
{
echo " <tr>\n";
echo " <td colspan=\"2\">".matches_text($numDocs);
echo "\n";
echo " </td>\n";
echo " </tr>\n";
$num=1;
foreach ($docs as $doc)
{
echo " <tr>\n";
echo " <td align=\"right\">$num.</td>";
echo "<td><a class=\"el\" href=\"".$doc["url"]."\">".$doc["name"]."</a></td>\n";
echo " <tr>\n";
echo " <td></td><td class=\"tiny\">Matches: ";
foreach ($doc["words"] as $wordInfo)
{
$word = $wordInfo["word"];
$matchRight = substr($wordInfo["match"],strlen($word));
echo "<b>$word</b>$matchRight(".$wordInfo["freq"].") ";
}
echo " </td>\n";
echo " </tr>\n";
$num++;
}
}
echo "</table>\n";
}
function matches_text($num)
{
if ($num==0)
{
return 'Sorry, no documents matching your query.';
}
else if ($num==1)
{
return 'Found 1 document matching your query.';
}
else // $num>1
{
return 'Found '.$num.' documents matching your query. Showing best matches first.';
}
}
function main($idxfile)
{
if(strcmp('4.1.0', phpversion()) > 0)
{
die("Error: PHP version 4.1.0 or above required!");
}
if (!($file=fopen($idxfile,"rb")))
{
die("Error: Search index file could NOT be opened!");
}
if (readHeader($file)!="DOXS")
{
die("Error: Header of index file is invalid!");
}
$query="";
if (array_key_exists("query", $_GET))
{
$query=$_GET["query"];
}
$results = array();
$requiredWords = array();
$forbiddenWords = array();
$foundWords = array();
$word=strtolower(strtok($query," "));
while ($word) // for each word in the search query
{
if (($word[0]=='+')) { $word=substr($word,1); $requiredWords[]=$word; }
if (($word[0]=='-')) { $word=substr($word,1); $forbiddenWords[]=$word; }
if (!in_array($word,$foundWords))
{
$foundWords[]=$word;
search($file,$word,$results);
}
$word=strtolower(strtok(" "));
}
$docs = array();
combine_results($results,$docs);
// filter out documents with forbidden word or that do not contain