Commit a6a9292b authored by Karen Amoudry

Merge pull request #2 from pwcazenave/karen

Merge in your new functionality with my dev version. This also makes the new functionality optional whilst still maintaining the ability to use it.
parents c3ba2291 b7a8d834
Pierre Cazenave <pica@pml.ac.uk> fvcom-toolbox ChangeLog
20140131
--------
Update to include some fixes to the forcing scripts as well as new scripts to
better handle various river data sources.
README.md
* Add note about the Tidal Model Driver toolbox being an optional
dependency.
fvcom_prepro:
get_EA_river_climatology.m
* New function to use pre-calculated Environment Agency river
temperature climatology data as input to the FVCOM river
temperature time series.
get_EHYPE_rivers.m
* New function to apply E-HYPE river discharge data to a given model
domain.
get_FVCOM_rivers
* Change the unique call to preserve the order by replacing 'first'
with 'stable'. This requires a relatively modern MATLAB (post-2011b).
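    As a rough sketch (variable name hypothetical), 'stable' keeps unique
    values in order of first appearance, which older releases can emulate
    by sorting the 'first' indices:
        [~, di] = unique(names, 'stable');   % post-2011b
        % Pre-2012a equivalent:
        [~, di] = unique(names, 'first');
        di = sort(di);                       % restore input order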
get_HYCOM_forcing.m
* Add sea surface height to the list of variables that can be
downloaded and add the ability to specify particular variables to
download.
get_HYCOM_tsobc.m
* Add support for sea surface height and make the interpolation to the
boundary nodes less likely to fail when too few HYCOM points are found
nearby.
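    A hedged sketch of the kind of fallback described (the actual logic in
    the function may differ; all names are hypothetical):
        ds = sqrt((hlon - flon).^2 + (hlat - flat).^2);  % distances to one boundary node
        radius = 0.1;                         % degrees; starting radius is a guess
        while sum(ds < radius) < 3            % need enough points to interpolate from
            radius = radius * 2;              % widen the search until we do
        end
        nearby = find(ds < radius);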
interp_HYCOM2FVCOM.m
* Fix the identification of the time index in the HYCOM data (use
hycom.time instead of Mobj.ts_times). Also ignore a field name of 'MT'
if supplied in varlist.
read_MetUM_forcing.m
* Fix the way time is handled. Previously a time variable had to be
specified in varlist. Now, each data variable's time is returned as an
array within the MetUM.(variable) struct, giving
MetUM.(variable).time and MetUM.(variable).data. This means if each
data variable uses a different time sampling, that can be accounted for
later (by interpolating to a common time series with interp3, for
example). Currently the code extracts the first 6 hours' worth of data.
The assumption there is that the Met Office do 4 runs per day, so
6 hours of data from each run gives you a day's worth.
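    A hedged sketch of interpolating those per-variable times onto a common
    time base (the variable name, the use of interp1 on a simple time
    series, and time being the first dimension of .data are all assumptions):
        common_time = MetUM.air_temperature.time;   % pick one variable as the reference
        vars = fieldnames(MetUM);
        for v = 1:length(vars)
            MetUM.(vars{v}).data = interp1(MetUM.(vars{v}).time, ...
                MetUM.(vars{v}).data, common_time);
            MetUM.(vars{v}).time = common_time;
        end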
read_sms_map.m
* Minor cosmetic changes.
read_sms_mesh.m
* Minor cosmetic changes.
write_FVCOM_elevtide.m
* Simplify the verbose output stuff.
write_FVCOM_forcing.m
* Add support for writing all the variables needed for
a HEATING_CALCULATED model run. This essentially makes
write_FVCOM_heating redundant, but I'll leave it in as it's a bit
simpler to understand what's going on there. Also update the way the
net surface heat flux is calculated (sum long and short, subtract
latent and sensible). Fix the way the wind variables are handled so
that both the U10/V10 and uwind_speed/vwind_speed netCDF variables are
written if only one of data.u10/data.v10 or data.uwnd/data.vwnd is
given. Change the output of tri' to tri, as tri was being written the
wrong way around (thanks to Rory O'Hara Murray for spotting that one).
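    As a hedged sketch of that heat flux sum (the field names below are
    assumptions modelled on the NCEP-style names; only the sign convention
    is as described above):
        nshf = data.nlwrs.data + data.nswrs.data ...
             - data.lhtfl.data - data.shtfl.data;   % net surface heat flux (W m-2)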
write_FVCOM_heating.m
* Clarify the help a bit.
write_FVCOM_restart.m
* Make it more obvious when new data is being written in place of
existing data.
write_FVCOM_river.m
* Tidy up the way the times are handled.
write_FVCOM_river_nml.m
* Fix the handling of the optional vertical distribution string
argument.
write_FVCOM_stations.m
* Minor cosmetic changes.
write_WRF_forcing.m
* Add full suite of variables so that HEATING_CALCULATED can be used
instead. Fix names of the fields for long and shortwave radiation to
match the NCEP ones (from n{l,s}wrf to n{l,s}wrs i.e. change last
letter from f to s).
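    A minimal sketch of renaming an existing input struct to the new field
    names (struct layout assumed):
        data.nlwrs = data.nlwrf; data = rmfield(data, 'nlwrf');   % net longwave
        data.nswrs = data.nswrf; data = rmfield(data, 'nswrf');   % net shortwave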
utilities:
grid2fvcom.m
* Check for the presence of the input fields being requested in the
input struct to avoid finding out that the last field in vars doesn't
exist in data halfway through a time-consuming loop. Change the way
the alternative coordinate arrays are used to accommodate subtleties in
the parallel code in MATLAB. Also fix some problems which sometimes
arose when interpolating using the Parallel Computing toolbox (they
were not reproducible with the serial version, annoyingly).
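    A minimal sketch of that up-front check (variable names hypothetical):
        for vv = 1:length(vars)
            if ~isfield(data, vars{vv})
                error('Field %s is missing from the input data struct.', vars{vv})
            end
        end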
read_fvcom_mesh.m
* Minor cosmetic changes.
read_netCDF_FVCOM.m
* Add support for missing Itime and Itime2 values in an output file,
falling back on the time variable instead.
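    A hedged sketch of that fallback, assuming the usual FVCOM convention
    of Itime as whole Modified Julian Days and Itime2 as milliseconds into
    the day:
        try
            Itime  = netcdf.getVar(nc, netcdf.inqVarID(nc, 'Itime'));
            Itime2 = netcdf.getVar(nc, netcdf.inqVarID(nc, 'Itime2'));
            mjd = double(Itime) + double(Itime2) / (24 * 60 * 60 * 1000);
        catch
            mjd = netcdf.getVar(nc, netcdf.inqVarID(nc, 'time'));
        end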
20130917
--------
Update to include the ability to use HYCOM and Met Office Unified Model
outputs.
fvcom_prepro:
calc_sponge_radius.m
* Fix typo.
get_FVCOM_rivers.m
* Use the full set of coastal coordinates when finding the appropriate
river node.
get_HYCOM_forcing.m
* Major update to this function. Now it correctly downloads the relevant
HYCOM data from the HYCOM OPeNDAP server. Does not work with the third-
party OPeNDAP toolbox.
get_HYCOM_tsobc.m
* New function: interpolates HYCOM data to the boundary nodes for
temperature and salinity boundary forcing.
get_MetUM_pp.m
* Update to build a list of the downloaded files from the BADC server.
Also add new model outputs to the list of files available for download
(the new files are post-2011 but live in a different directory from the
pre-2012 data). Also download files in parallel to speed up the process
(this may be a bad idea so may be removed if that proves to be the
case).
get_NCEP_forcing.m
* Fairly significant change in that this function can now download any
of the Reanalysis-1, Reanalysis-2 and 20th Century Reanalysis data from
the NCEP server. Where 4D data are requested, only the surface data is
returned. I have not tested this with the third-party OPeNDAP toolbox.
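    A hypothetical call (the argument list is assumed, not checked against
    the function):
        forcing = get_NCEP_forcing(Mobj, [55562, 55592]);   % Modified Julian Day start and end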
interp_HYCOM2FVCOM.m
* Use this to create spatially varying temperature, salinity, u and v data
for use in an FVCOM restart file from HYCOM data.
interp_POLCOMS2FVCOM.m
* Add interpolation of the u and v data in the POLCOMS netCDF files.
Eventually this function will be adjusted to allow the selection of
only specific variables to be interpolated (as is the case with
interp_HYCOM2FVCOM).
read_MetUM_forcing.m
* New function: reads the specified variables from the netCDF files of
the converted Met Office Unified Model PP files.
read_sms_mesh.m
* Fairly major restructuring to decrease the time it takes to read an
SMS file by about 50%. This was mainly done to maintain my sanity.
write_FVCOM_elevtide.m
* Minor change to the verbose output following the rewritten
mjulian2greg function.
write_FVCOM_forcing.m
* Add support for alternative name for the pressure variable. Also tweak
the output of the net surface heat flux to only occur when we are
writing out to a single file.
write_FVCOM_heating.m
* Add failsafe functionality when writing the variables to better handle
NCEP and Met Office data.
write_FVCOM_restart.m
* Add support for writing out u and v data (on the element centres
rather than node positions).
write_FVCOM_river_nml.m
* Add support for an optional string to be written out (e.g. 'uniform')
for the vertical distribution of river discharge to bypass the automated
string generation.
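    A minimal sketch of the optional-argument handling described above (the
    argument position, names and the automatic fallback are all assumptions):
        if nargin >= 4 && ischar(vString)
            vdist = vString;                                      % e.g. 'uniform'
        else
            vdist = sprintf('%f ', ones(1, nsiglay) / nsiglay);   % assumed automatic fallback
        end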
write_WRF_forcing.m
* New function: writes out regularly gridded data in wrf_grid format for
direct use in FVCOM. The interpolation then occurs within FVCOM rather
than here, in MATLAB. Thanks to Dmitry Aleynik for his MATLAB function
which was used as the basis for this function.
utilities:
grid2fvcom.m
* Fix a bug in which the position arrays could be written out transposed
relative to the data arrays (leading to forcing data being in the wrong
place).
grid_vert_interp.m
* Remove any NaN values in the vertical profile to support HYCOM data
(which have a set of fixed vertical levels).
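    A minimal sketch of dropping the NaN levels from a single HYCOM column
    before interpolating it onto the FVCOM sigma depths (names hypothetical):
        good = ~isnan(profile);
        newprofile = interp1(depths(good), profile(good), fvcom_siglayz, ...
            'linear', 'extrap');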
pp2nc.m
* Incorporate changes to better support paths in Windows (in particular
paths with spaces).
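    A purely illustrative sketch of quoting a Windows path containing spaces
    before handing it to the shell (the convsh invocation and file name are
    assumptions, not the function's actual command):
        pp = 'C:\Met Office\data\naam2011010100_00002_00.pp';
        nc = strrep(pp, '.pp', '.nc');
        % Quote both paths so the shell treats each as a single argument.
        status = system(sprintf('convsh pp2nc.tcl -i "%s" -o "%s"', pp, nc));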
pp2nc.tcl
* Add (but leave commented out) functions to interpolate the Met Office
data onto a geographical grid.
pp2nc_subset.m
* New function: wrapper around convsh to interpolate Met
Office Unified Model PP files onto a geographical grid (i.e. lat/long).
subset.tcl
* New TCL script to interpolate the Met Office Unified Model PP files
onto a geographical grid.
20130719
--------
......@@ -12,6 +12,7 @@ Note that html based documentation is generated using m2html and is available wi
Some third-party MATLAB toolboxes are required for some functions:
* The Tidal Model Driver available at http://polaris.esr.org/ptm_index.html.
* The air-sea toolbox available at http://woodshole.er.usgs.gov/operations/sea-mat/air_sea-html/index.html.
* The OPeNDAP toolbox (for versions of MATLAB older than 2011b) available at http://www.opendap.org/pub/contributed/source/ml-toolbox/.
......@@ -43,7 +43,7 @@ end
%--------------------------------------------------------------------------
% Get a unique list and make sure they are in the range of node numbers
%--------------------------------------------------------------------------
% Make this works in versions of MATLAB older than 2012a (newer versions
% Make this work in versions of MATLAB older than 2012a (newer versions
% can just use unique(A, 'stable'), but checking versions is a pain).
[~, Nidx] = unique(Nlist);
Nlist = Nlist(sort(Nidx));
function Mobj = get_EA_river_climatology(Mobj, ea, dist_thresh)
% Read river temperature climatologies from the Environment Agency river
% temperature data. If no data are found within the threshold specified, a
% mean climatology from the nearest 30 sites is provided instead.
%
% function Mobj = get_EA_river_climatology(Mobj, ea, dist_thresh)
%
% DESCRIPTION:
% Load all the river data from the river climatology netCDF file and find
% the most relevant one to the river nodes in Mobj.rivers.positions.
%
% INPUT:
% Mobj : MATLAB mesh structure which must contain:
% - Mobj.river_nodes - river node IDs.
% - Mobj.lon, Mobj.lat - unstructured grid node
% positions.
% - Mobj.river_time - Modified Julian Day array of the
% times for the river discharge data (Mobj.river_flux).
% ea : Full path to the river climatology netCDF file.
% dist_thresh : distance threshold beyond which a river temperature
% climatology data point is considered too far to be valid
% (in degrees).
%
% OUTPUT:
% Mobj : MATLAB structure with a new Mobj.river_temp field which
% contains the climatology for the river nodes.
%
% EXAMPLE USAGE:
% Mobj = get_EA_river_climatology(Mobj, '/path/to/netcdf.nc', 0.05)
%
% Author(s):
% Pierre Cazenave (Plymouth Marine Laboratory)
%
% Revision history
% 2013-11-05 First version.
subname = 'get_EA_river_climatology';
global ftbverbose;
if ftbverbose
fprintf('\nbegin : %s \n', subname)
end
% Load the position (lon/lat), time, climatology and SiteType variables
% only. Not really bothered about the other variables.
nc = netcdf.open(ea, 'NOWRITE');
varid = netcdf.inqVarID(nc, 'climatology');
climatology = netcdf.getVar(nc, varid, 'single');
varid = netcdf.inqVarID(nc, 'time');
time = netcdf.getVar(nc, varid, 'single');
varid = netcdf.inqVarID(nc, 'lon');
lon = netcdf.getVar(nc, varid, 'single');
varid = netcdf.inqVarID(nc, 'lat');
lat = netcdf.getVar(nc, varid, 'single');
varid = netcdf.inqVarID(nc, 'SiteType');
SiteType = netcdf.getVar(nc, varid);
netcdf.close(nc)
clear varid
% Remove any sites which aren't RIVER in SiteType. This is not pretty but
% relatively speedy, so it'll have to do.
good = [];
for i = 1:size(SiteType, 2)
if strcmp(strtrim(SiteType(:, i)'), 'RIVER')
good = [good, i];
end
end
% Clear out the bad sites.
climatology = climatology(:, good);
lon = lon(good);
lat = lat(good);
% Now find the nearest nodes to the river node positions.
nr = length(Mobj.river_nodes);
clim = nan(length(time), nr);
for r = 1:nr
dist = sqrt((lon - Mobj.lon(Mobj.river_nodes(r))).^2 + (lat - Mobj.lat(Mobj.river_nodes(r))).^2);
[howclose, idx] = min(dist);
if howclose > dist_thresh
% Find the 30 closest sites and use their data instead.
[~, idx] = sort(dist);
clim(:, r) = median(climatology(:, idx(1:30)), 2);
else
% Get the relevant climatology.
clim(:, r) = climatology(:, idx);
end
end
% Now we have the climatologies for the relevant river nodes, we need to
% repeat it to fit the length of the Mobj.river_time array. Since the
% climatology data are for a year only, we need to find the correct index
% for the start and end of the Mobj.river_time array so that we don't put
% January temperature in July, for example.
[yyyy, mm, dd, HH, MM, SS] = mjulian2greg(Mobj.river_time);
startday = (datenum(yyyy(1), mm(1), dd(1), HH(1), MM(1), SS(1)) - ...
datenum(min(yyyy), 1, 1, 0, 0, 0)) + 1; % add offset of 1 for MATLAB indexing.
warning('Don''t know what''s going on with this here. Check the code to find the end day for the river climatology.')
%%% FIXME!!! %%%
% endday = (datenum(yyyy(end), mm(end), dd(end), HH(end), MM(end), SS(end)) - ...
% datenum(max(yyyy), 1, 1, 0, 0, 0)) + 1; % add offset of 1 for MATLAB indexing.
endday = (datenum(yyyy(end), mm(end), dd(end), HH(end), MM(end), SS(end)) - ...
datenum(max(yyyy), 1, 1, 0, 0, 0));
%%% FIXME!!! %%%
years = unique(yyyy);
ny = length(years);
if ny == 1
% Subset the river climatology for the right days.
repclim = clim(startday:endday, :);
else
% Otherwise build up the time series using the number of unique years
% we have.
for y = 1:ny
% Find the number of days in this year and only extract that number
% from the climatology.
nd = sum(eomday(years(y), 1:12));
if y == 1
% This is the (possibly partial) first year, so start the
% climatology from startday.
repclim = clim(startday:end, :);
elseif y == ny
repclim = [repclim; clim(1:endday, :)];
else
% We're in the middle years, so just append the whole clim array
% to the end of the previous iteration's.
repclim = [repclim; clim];
end
% We need to add an extra day's data to the end of the array for
% this year.
if nd == 366
repclim = [repclim; repclim(end, :)];
end
end
end
% Add the temperature climatology to Mobj.
Mobj.river_temp = repclim;
if ftbverbose
fprintf('end : %s \n', subname)
end
\ No newline at end of file
......@@ -2,15 +2,16 @@ function Mobj = get_FVCOM_rivers(Mobj, dist_thresh)
% Extract river discharges from the supplied river positions for the FVCOM
% grid in Mobj.
%
% get_FVCOM_rivers(Mobj, rivers, dist_thresh)
% get_FVCOM_rivers(Mobj, dist_thresh)
%
% DESCRIPTION:
% For the positioins in fvcom_xy, find the nearest unstructured grid node
% and extract the river discharge from polcoms_flow. If dist_thresh is
% specified, the river positions must fall within the specified distance.
% If multiple rivers are assigned to the same node, their discharges are
% summed. The resulting river name is generated from the contributing
% rives, separated by a hyphen.
% For the positions in Mobj.rivers.positions, find the nearest
% unstructured grid node and extract the river discharge from
% Mobj.rivers.discharge. If dist_thresh is specified, the river positions
% must fall within the specified distance. If multiple rivers are
% assigned to the same node, their discharges are summed. The resulting
river name is generated from the contributing rivers, separated by a
% hyphen.
%
% INPUT:
% Mobj - MATLAB mesh object containing:
......@@ -40,11 +41,11 @@ function Mobj = get_FVCOM_rivers(Mobj, dist_thresh)
% Mobj.river_time - time series for the river discharge data
%
% EXAMPLE USAGE:
% Mobj = get_FVCOM_rivers(Mobj, Mobj.rivers, 0.025)
% Mobj = get_FVCOM_rivers(Mobj, 0.025)
%
% Author(s):
% Pierre Cazenave (Plymouth Marine Laboratory)
% Karen Amoudry (National Oceanography Centre, Liverpool)
% Karen Amoudry (National Oceanography Centre, Liverpool)
%
% Revision history:
% 2013-03-27 - First version.
......@@ -60,14 +61,17 @@ function Mobj = get_FVCOM_rivers(Mobj, dist_thresh)
% searches for another node in the element which is part of at least two
% elements, thereby avoiding the "element filling" issue. Also updated
% the help to list all the required fields in the Mobj.
% 2013-12-10 - Change the unique call to preserve the order by replacing
% 'first' with 'stable'. This requires a relatively modern MATLAB
% (post-2011b).
%
%==========================================================================
subname = 'get_FVCOM_rivers';
global ftbverbose;
global ftbverbose
if ftbverbose
fprintf(['\nbegin : ' subname '\n'])
fprintf('\nbegin : %s \n', subname)
end
% Check inputs
......@@ -88,7 +92,7 @@ polcoms_flow = Mobj.rivers.discharge;
% For duplicates, we need, therefore, to work out a way to handle them
% elegantly. We will assume that rivers with the same name are close to one
% another. As such, we'll sum their discharges.
[~, di] = unique(fvcom_name, 'first');
[~, di] = unique(fvcom_name, 'stable'); % stable preserves order.
fv_dupes = 1:length(fvcom_name);
fv_dupes(di) = []; % index of duplicates (does this work with more than two?)
......@@ -189,8 +193,8 @@ for ff = 1:fv_nr
% Of the remaining nodes in the element, find the closest one to
% the original river location (in fvcom_xy).
[~, n_idx] = sort(sqrt( ...
(fvcom_xy(ff, 1) - tlon(n_tri)).^2 ...
+ (fvcom_xy(ff, 2) - tlat(n_tri)).^2));
(fvcom_xy(ff, 1) - Mobj.lon(n_tri)).^2 ...
+ (fvcom_xy(ff, 2) - Mobj.lat(n_tri)).^2));
[row_2, ~] = find(Mobj.tri == n_tri(n_idx(1)));
if length(n_idx) > 1
function met_files = get_MetUM_pp(Mobj, modelTime, credentials)
function met_files = get_MetUM_pp(modelTime, credentials)
% Get the required parameters from the Met Office Unified Model (TM)
% (hereafter MetUM) to use in FVCOM surface forcing.
%
% data = get_MetUM_pp(Mobj, modelTime, credentials)
% met_files = get_MetUM_pp(modelTime, credentials)
%
% DESCRIPTION:
% Using FTP access, extract the necessary parameters to create an FVCOM
......@@ -10,9 +10,8 @@ function met_files = get_MetUM_pp(Mobj, modelTime, credentials)
% it). Data are sampled four times daily.
%
% INPUT:
% Mobj - MATLAB mesh object
% modelTime - Modified Julian Date start and end times
% credentials - struct with fields username and password to access the
% modelTime - Modified Julian Date start and end times array
% credentials - cell array of username and password to access the BADC
% FTP server.
%
% OUTPUT:
......@@ -23,6 +22,7 @@ function met_files = get_MetUM_pp(Mobj, modelTime, credentials)
% - surface_downwelling_shortwave_flux_in_air (W m-2)
% - surface_net_downward_longwave_flux (W m-2)
% - surface_downwelling_longwave_flux_in_air (W m-2)
% - surface_upward_latent_heat_flux (W m-2)
% - surface_upward_sensible_heat_flux (W m-2)
% - eastward_wind / x_wind (m s-1)
% - northward_wind / y_wind (m s-1)
......@@ -36,10 +36,9 @@ function met_files = get_MetUM_pp(Mobj, modelTime, credentials)
% Precipitation is converted from kg/m^2/s to m/s. Evaporation is
% calculated from the mean daily latent heat net flux (lhtfl) at the
% surface.
%
%
% EXAMPLE USAGE:
% metum_forcing = get_MetUM_pp(Mobj, [51725, 51757], ...
% {'username', 'password'});
% met_files = get_MetUM_pp([51725, 51757], {'username', 'password'});
%
% TODO:
% Add support for the AP directories on the FTP server.
......@@ -54,8 +53,13 @@ function met_files = get_MetUM_pp(Mobj, modelTime, credentials)
% conversion from PP format to NetCDF to separate functions
% (get_BADC_data.m and pp2nc.m, respectively). Renamed the function from
% get_MetUM_forcing to get_MetUM_pp to better reflect what it does.
% 2013-09-11 Add support for the post-2011 output files. Also make the
% downloads in parallel (this does abuse the BADC somewhat if you have
% loads of threads running and will still only go as fast as they
% throttle individual connections, assuming they do that).
%
%==========================================================================
subname = 'get_MetUM_pp';
global ftbverbose
......@@ -63,28 +67,40 @@ if ftbverbose
fprintf('\nbegin : %s \n', subname)
end
% Run jobs on multiple workers if we have that functionality. Not sure if
% it's necessary, but check we have the Parallel Toolbox first.
wasOpened = false;
if license('test', 'Distrib_Computing_Toolbox')
% We have the Parallel Computing Toolbox, so launch a bunch of workers.
if matlabpool('size') == 0
% Force pool to be local in case we have remote pools available.
matlabpool open local
wasOpened = true;
end
end
nt = ceil(modelTime(end)) - floor(modelTime(1));
if nt > 365
error('Can''t (yet) process more than a year at a time.')
end
yearStart = mjulian2greg(modelTime(1));
yearEnd = mjulian2greg(modelTime(end));
assert(yearStart >= 2006, 'The MetUM repository does not contain data earlier than 2006.')
% Four times daily outputs at 0000, 0600, 1200 and 1800
t = modelTime(1):1/4:modelTime(end);
assert(yearEnd == yearStart, 'Can''t (yet) process across a year boundary.')
assert(yearStart >= 2006 && yearEnd <= 2012, 'The MetUM repository does not contain data earlier than 2006 or later than 2012.')
% For the pre-2010 data, we need to download several files with
% unique names. The names are based on the STASH numbers and the date:
% For the pre-2010 data, we need to download several files with unique
% names. The names are based on the STASH numbers and the date:
% naamYYYYMMDDHH_STASH#_00.pp
% The numbers we're interested in are stored in stash.
stash = [2, 3, 407, 408, 409, 4222, 9229, 16004, ...
1201, 1235, 2207, 2201, 3217, 3225, 3226, 3234, 3236, 3237, 3245, ...
5216, 16222, 20004];
% The stash numbers, and their corresponding forcing type:
% The stash numbers and their corresponding forcing type.
%
% AP = analysis, pressure levels
% AM = analysis, model levels
%
% |---------|-------------------------------------------|
% | stash # | forcing type |
......@@ -109,7 +125,7 @@ stash = [2, 3, 407, 408, 409, 4222, 9229, 16004, ...
% | 3217 | surface_upward_sensible_heat_flux |
% | 3225 | eastward_wind / x_wind |
% | 3226 | northward_wind / y_wind |
% | 3234 | surface_upward_latenet_heat_flux |
% | 3234 | surface_upward_latent_heat_flux |
% | 3236 | air_temperature |
% | 3237 | specific_humidity |
% | 3245 | relative_humidity |
......@@ -118,19 +134,35 @@ stash = [2, 3, 407, 408, 409, 4222, 9229, 16004, ...
% | 20004 | [RIVER OUTFLOW] |
% |---------|-------------------------------------------|
%
% For the post-2010 data, the stash codes are deprecated in favour of
% single files with individual variables. As such, in those instances, the
% four data files are downloaded and then the extraction of the variables
% of interest can be done when the PP files have been converted to netCDF
% and are subsequently read in with read_MetUM_forcing.m.
ns = length(stash);
% From where will we be downloading the data?
site = 'ftp.ceda.ac.uk';
basePath = 'badc/ukmo-um/data/nae/';
% Preallocate the output cell arrays.
tmp_met_files = cell(1, nt * 4);
met_files = cell(1, nt * 4);
% Depending on the year we're extracting, we need to append different
% directories to get the data.
for i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
parfor i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
[year, month, day, hour] = mjulian2greg(t(i));
% Pre-2012 data are in a different directory from the post-2012 data,
% so adjust accordingly here.
if year < 2012
basePath = 'badc/ukmo-um/data/nae/';
else
basePath = 'badc/ukmo-nwp/data/nae/all_years';
end
% Cell array for the files to download.
files = cell(0);
......@@ -158,13 +190,13 @@ for i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
end
else
% Use the mn data
prefix = 'mn';
prefix = 'sn';
filepath = sprintf('%sna/%s/%04d/%02d/%02d', basePath, ...
prefix, ...
year, ...
month, ...
day);
files = sprintf('%s_%04d%02d%02d%02d_s00.pp', ...
files{length(files) + 1} = sprintf('%s_%04d%02d%02d%02d_s00.pp', ...
prefix, ...
year, ...
month, ...
......@@ -197,29 +229,6 @@ for i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
end
end
% Check the 2012 data are from before the 17th January, 2012.
elseif year == 2012
if month > 1
error('The MetUM repository does not contain data later than 17th January, 2012')
elseif month == 1
if day > 17
error('The MetUM repository does not contain data later than 17th January, 2012')
else
prefix = 'mn';
filepath = sprintf('%sna/%s/%04d/%02d/%02d', basePath, ...
prefix, ...
year, ...
month, ...
day);
files = sprintf('%s_%04d%02d%02d%02d_s00.pp', ...
prefix, ...
year, ...
month, ...
day, ...
hour);
end
end
% Pre-2010 files.
elseif year < 2010
% Use the am data.
......@@ -239,16 +248,29 @@ for i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
stash(f));
end
% Post-2010 files.
elseif year > 2010
% Use the mn data.
prefix = 'mn';
filepath = sprintf('%sna/%s/%04d/%02d/%02d', basePath, ...
% 2011-2012 files.
elseif year > 2010 && year < 2012
% Use the sn data (has everything we need but doesn't have 70
% vertical levels!).
prefix = 'sn';
filepath = sprintf('%sna/%s/%04d/%02d/%02d', basePath, ...
prefix, ...
year, ...
month, ...
day);
files = sprintf('%s_%04d%02d%02d%02d_s00.pp', ...
files{length(files) + 1} = sprintf('%s_%04d%02d%02d%02d_s00.pp', ...
prefix, ...
year, ...
month, ...
day, ...
hour);
elseif year >= 2012
% Marginally more straightforward in this case.
prefix = 'prods_op_nae-mn';
filepath = sprintf('%s/%02d/%02d/%02d', basePath, year, month, day);
% prods_op_nae-mn_20120101_00_012.pp
files{length(files) + 1} = sprintf('%s_%04d%02d%02d_%02d_012.pp', ...
prefix, ...
year, ...
month, ...
......@@ -256,8 +278,22 @@ for i = 1:nt * 4 % four files per day (at 0000, 0600, 1200 and 1800).
hour);
end
met_files{i} = get_BADC_data(site, filepath, files, credentials);
tmp_met_files{i} = get_BADC_data(site, filepath, files, credentials);
end
% Clean up the output to be a single cell array of file names.
c = 0;
for i = 1:length(tmp_met_files)
for j = 1:length(tmp_met_files{i})
c = c + 1;
met_files{c} = tmp_met_files{i}{j};
end
end
% Close the MATLAB pool if we opened it.
if wasOpened
matlabpool close
end
if ftbverbose
function Mobj = interp_POLCOMS2FVCOM(Mobj, ts, start_date, varlist)
% Use an FVCOM restart file to seed a model run with spatially varying
% versions of otherwise constant variables (temperature and salinity only
% for the time being).
% versions of otherwise constant variables.
%
% function interp_POLCOMS2FVCOM(Mobj, ts, start_date, fv_restart, varlist)
% function interp_POLCOMS2FVCOM(Mobj, ts, start_date, varlist)
%
% DESCRIPTION:
% FVCOM does not yet support spatially varying temperature and salinity
......@@ -19,23 +18,22 @@ function Mobj = interp_POLCOMS2FVCOM(Mobj, ts, start_date, varlist)
% Mobj = MATLAB mesh structure which must contain:
% - Mobj.siglayz - sigma layer depths for all model
% nodes.
% - Mobj.lon, Mobj.lat - node coordinates (long/lat).
% - Mobj.ts_times - time series for the POLCOMS
% temperature and salinity data.
% - Mobj.lon, Mobj.lat - node coordinates (long/lat)
% - Mobj.lonc, Mobj.latc - element coordinates (long/lat)
% ts = Cell array of POLCOMS AMM NetCDF file(s) in which 4D
% variables of temperature and salinity (called 'ETWD' and 'x1XD') exist.
% Its/their shape should be (y, x, sigma, time).
% start_date = Gregorian start date array (YYYY, MM, DD, hh, mm, ss).
% varlist = cell array of variables to extract from the NetCDF files.
%
%
% OUTPUT:
% Mobj.restart = struct whose field names are the variables which have
% been interpolated (e.g. Mobj.restart.ETWD for POLCOMS daily mean
% temperature).
%
% EXAMPLE USAGE
% interp_POLCOMS2FVCOM(Mobj, '/tmp/ts.nc', '2006-01-01 00:00:00', ...
% {'lon', 'lat', 'ETWD', 'x1XD', 'time'})
% interp_POLCOMS2FVCOM(Mobj, '/tmp/ts.nc', [2006, 01, 01, 00, 00, 00], ...
% {'lon', 'lat', 'ETWD', 'x1XD', 'ucurD', 'vcurD', 'rholocalD', 'time'})
%
% Author(s):
% Pierre Cazenave (Plymouth Marine Laboratory)
......@@ -49,6 +47,8 @@ function Mobj = interp_POLCOMS2FVCOM(Mobj, ts, start_date, varlist)
% surface; its depths are stored surface to seabed; FVCOM stores
% everything surface to seabed. As such, the POLCOMS scalar values need