Summary:\n This workshop consisted of a lot of discussion of definitions, particularly: what is interoperability, a sensor, an instrument, a platform, a system? Pretty much everyone there had a different idea of what these terms mean, and of what the purpose and goals of the workshop were. The stated goals were to summarize existing technologies for sensor interoperability in observing systems and to identify what's needed to realize sensor interoperability, but because nobody was really clear about what sensor interoperability is, little progress was made toward achieving them. There were two fairly clear groups using different approaches: bottom-up and top-down design. There were also folks involved in large observatories, who weren't concerned with power issues (or cost), and those with smaller existing or experimental systems who often did have power or cost limitations. Somehow, out of a lot of discussion, a report will be generated to present the group's findings. \n\nThe industry folk want to help implement interoperability and want guidance about what is important to add, but there was limited consensus about which key elements are required to move toward interoperability.\n\n The few concepts that met with fairly wide agreement are discussed below. Here, consider the sensor to be a transmissometer, and the instrument to be a logger (that may or may not be a node of an observatory).\nTo be interoperable, these conditions need to be met:\n* the sensor must identify itself (I am a transmissometer)\n* the sensor must supply some limited meta-data\n* one of N communication protocols must be used\n* one of N hardware connectors must be used\n \nBeyond this, there were many desirables, and differing opinions about where the smarts that drive self-identification should live and about implementation strategies. Most participants thought a specialist group should be assembled to take the meeting recommendations further and work towards a standard for observatories. 
However, the cyber infrastructure recommendations for ORION are fairly well advanced, and perhaps should simply be adopted.\n\nOne of the issues discussed was which other organizations are working on parts of this puzzle. Everyone from the Open Geospatial Consortium (OGC, with SensorML) to IEEE, MMI, FGDC, and NBII was cited. An interesting connectivity standard that was discussed is IEEE 1451, with the accompanying IEEE 1588. JDDAC (Java Distributed Data Acquisition Code) is open-source software spun off from that effort which may have future applicability for us. There's more about this in some slides on \n\n\n\nPoints of interest:\n Many of the data distribution systems are employing XML, web services and Java. Some are also serving netCDF via OPeNDAP. Most of the web-based data selectors are very granular: you get temperature or salinity in response to a query, not all the variables collected by the CTD.\n\n Data quality was of less interest than getting it on the web for some participants.\n\n The Neptune observatory has a lot of useful information on it - they offer data as images (pdf & jpg), .mat files and csv ascii.\n\n The concept of a "plug-fest" held partway through a development project, getting different groups to work on a thin-slice project together to test how things are working, seems potentially useful.\n\n\n\n\n
Okay, so I'm logged on and see 3 people logged in as root. Who are they? I can type: \n{{{\n$ who \n}}}\nto see the ttyname, then \n{{{\n$ write user [ttyname]\n}}}\nto communicate with that user. You just type; convention is to type "-o" when you are done "writing" and to listen for a response. Type <ctrl-c> to exit.
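If several sessions belong to the same user, you can pull the tty for each out of the `who` listing with awk before deciding which session to `write` to. A minimal sketch (the `who` output below is simulated; the usernames and ttys are made up):

```shell
# Simulated `who` listing; on a live system you would just pipe `who` itself.
printf 'root     tty1   2009-01-05 09:14\nroot     pts/0  2009-01-05 09:20\nrsignell pts/1  2009-01-05 09:31\n' \
  | awk '$1 == "root" {print $2}'   # print the tty of each root session
```

This prints tty1 and pts/0, so `write root tty1` (or `write root pts/0`) reaches the session you want.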
The tinyurl for this page is <>\n\nThis has been superseded by my post on gis stackexchange: \n\n\n!Introduction\nThis is the approach that I've been using to initially set up THREDDS Data Server (TDS) catalogs for regional oceanographic modeling providers to serve their model results. It is not necessarily the best practice, merely a practice that works reasonably well. There are four basic types of catalogs we have been setting up:\n*A top level catalog that points to other catalogs that you want exposed\n*An "all" catalog that automatically scans a directory tree for netcdf (and grib, etc) files\n*Catalogs that aggregate regional model results by concatenating along the time dimension\n*Catalogs that aggregate forecast model results by using the special Forecast Model Run Collection feature of the TDS.\nSo we'll go through each type. But before modifying any catalogs, verify that the TDS is up and running with the test catalog and datasets. Go to http://localhost:8080/thredds and drill down on one of the test data sets to the OpenDAP service to make sure everything looks okay in the OpenDAP Data Access page. \n\n!Top level catalog (catalog.xml)\nI use the top level catalog as a table of contents whose sole purpose is to point to other catalogs that you want to advertise. 
The following example is Ruoying He's catalog.xml, where he is simply pointing to two regional modeling catalogs:\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<catalog xmlns=""\n xmlns:xlink=""\n name="THREDDS Top Catalog, points to other THREDDS catalogs" version="1.0.1">\n\n <dataset name="NCSU MEAS THREDDS catalogs">\n <catalogRef xlink:href="gomtox_catalog.xml" xlink:title="GOMTOX (Gulf of Maine) Ocean Model" name=""/>\n <catalogRef xlink:href="sabgom_catalog.xml"\n xlink:title="SABGOM (South Atlantic Bight and Gulf of Mexico) Ocean Model" name=""/>\n </dataset>\n\n</catalog>\n}}}\n\n!The "All" Catalog\nIt is quite convenient to have a catalog that automatically gives you access to all data files in a particular directory tree via the TDS services. The datasetScan feature in the TDS scans a specified directory tree for files matching certain patterns or file extensions. \nThis could be your whole disk, or just a particular directory. In the following example, the TDS will scan the /data1/models directory for all NetCDF, Grib, or HDF files, sort them alphabetically, and include the file size. 
The data will be served via OpenDAP and HTTP, with HTTP just allowing people to download the existing file in its native format.\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<catalog xmlns=""\n xmlns:xlink=""\n name="THREDDS Catalog for NetCDF Files" version="1.0.1">\n <service name="allServices" serviceType="Compound" base="">\n <service name="ncdods" serviceType="OpenDAP" base="/thredds/dodsC/"/>\n <service name="HTTPServer" serviceType="HTTPServer" base="/thredds/fileServer/"/>\n <service name="wcs" serviceType="WCS" base="/thredds/wcs/"/>\n <service name="ncss" serviceType="NetcdfSubset" base="/thredds/ncss/grid/"/>\n <service name="wms" serviceType="WMS" base="/thredds/wms/"/>\n <service name="iso" serviceType="ISO" base="/thredds/iso/"/>\n <service name="ncml" serviceType="NCML" base="/thredds/ncml/"/>\n <service name="uddc" serviceType="UDDC" base="/thredds/uddc/"/>\n </service>\n \n <datasetScan name="Model Data" ID="models" path="models" location="/data1/models">\n <metadata inherited="true">\n <serviceName>allServices</serviceName>\n <publisher>\n <name vocabulary="DIF">USGS/ER/WHCMSC/Dr. Richard P. Signell</name>\n <contact url="" email=""/>\n </publisher>\n </metadata>\n <filter>\n <include wildcard="*.ncml"/>\n <include wildcard="*.nc"/>\n <include wildcard="*.grd"/>\n <include wildcard="*.nc.gz"/>\n <include wildcard="*.cdf"/>\n <include wildcard="*.grib"/>\n <include wildcard="*.grb"/>\n <include wildcard="*.grb2"/>\n <include wildcard="*.grib2"/>\n </filter>\n <sort>\n <lexigraphicByName increasing="true"/>\n </sort>\n <addDatasetSize/>\n </datasetScan>\n\n</catalog>\n}}}\nYou could reference this catalog in your catalog.xml file, or you might feel that advertising a link to all your data files would be confusing to some users. If you don't put the catalog in catalog.xml, you must add a reference to it in the threddsConfig.xml file in order for it to be read by the TDS. 
So if your catalog is called "all.xml", you would need a line in threddsConfig.xml that looks like this:\n{{{\n<catalogRoot>all.xml</catalogRoot>\n}}}\n\n!Regional model catalogs\n\nI suggest that you use a separate catalog for each model domain so that others can link to your catalogs in their own THREDDS catalogs in a more flexible way (e.g. your catalog for Boston Harbor could be referenced in a regional catalog for the Gulf of Maine). \n\nFor regional model results, there are typically two types of aggregation datasets that are useful. One aggregates along an existing time dimension, and so uses type="joinExisting":\n{{{\n <aggregation dimName="ocean_time" type="joinExisting">\n <scan location="/media/1tb/MABGOM/Jun292008_Feb282009" regExp=".*mabgom_avg_[0-9]{4}\.nc$"/>\n </aggregation>\n}}}\nwhere you can use a regular expression (java style) to match only certain files in a directory. Here we are matching files that look like "mabgom_avg_0001.nc". The "." means any character, so ".*" means any number of any characters, followed by "mabgom_avg_", followed by exactly 4 digits between 0 and 9, followed by exactly ".nc" (written "\.nc" so the dot matches a literal period). So the entire catalog might look like:\n\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<catalog name="MABGOM Catalog"\n xmlns=""\n xmlns:xlink="">\n <service name="allServices" serviceType="Compound" base="">\n <service name="wms" serviceType="WMS" base="/thredds/wms/"/>\n <service name="iso" serviceType="ISO" base="/thredds/iso/"/>\n <service name="ncml" serviceType="NCML" base="/thredds/ncml/"/>\n <service name="uddc" serviceType="UDDC" base="/thredds/uddc/"/>\n </service>\n\n <dataset name="MABGOM Runs">\n\n <metadata inherited="true">\n <serviceName>allServices</serviceName>\n <creator>\n <name vocabulary="DIF">Dr. 
Ruoying He</name>\n <contact url="" email=""/>\n </creator>\n <documentation xlink:href=""\n xlink:title="MABGOM Circulation"/>\n <documentation type="Summary"> Hydrodynamic simulations for the Mid-Atlantic Bight and Gulf of\n Maine </documentation>\n <documentation type="Rights"> This model data was generated as part of an academic research\n project, and the principal investigator, Ruoying He, asks to be informed of\n intent for scientific use and appropriate acknowledgment given in any publications arising\n therefrom. The data is provided free of charge, without warranty of any kind.\n </documentation>\n </metadata>\n\n <dataset name="Tide-Averaged Data">\n <dataset name="Jun292008_Feb282009" ID="MABGOM/Jun292008_Feb282009/avg"\n urlPath="MABGOM/Jun292008_Feb282009/avg">\n <netcdf xmlns="">\n <aggregation dimName="ocean_time" type="joinExisting">\n <scan location="/media/1tb/MABGOM/Jun292008_Feb282009"\n regExp=".*mabgom_avg_[0-9]{4}\.nc$"/>\n </aggregation>\n </netcdf>\n </dataset>\n </dataset>\n\n <dataset name="History Data">\n <dataset name="Jun292008_Feb282009" ID="MABGOM/Jun292008_Feb282009/his"\n urlPath="MABGOM/Jun292008_Feb282009/his">\n <netcdf xmlns="">\n <aggregation dimName="ocean_time" type="joinExisting">\n <scan location="/media/1tb/MABGOM/Jun292008_Feb282009"\n regExp=".*mabgom_his_[0-9]{4}\.nc$"/>\n </aggregation>\n </netcdf>\n </dataset>\n </dataset>\n\n </dataset>\n</catalog>\n}}}\n\n!Forecast model catalogs\nThe other type of very useful catalog is a Forecast Model Run Collection (FMRC), which aggregates forecast files that have overlapping time records (e.g. 3-day forecasts, issued once a day). For this type of catalog, we use the FMRC FeatureCollection, which creates a "best time series" view, using the most recent data from each forecast to construct a continuous aggregated time series. 
The files to be scanned are specified in the `collection` tag, and when the files are scanned is specified by either a `recheckAfter` tag in the collection tag, or in the `update` tag.\n\nHere's a full example:\n{{{\n<catalog xmlns:xsi=""\n xsi:schemaLocation=""\n xmlns=""\n xmlns:xlink="" name="OPeNDAP Data Server" version="1.0.3">\n\n <!-- \n Specify the data and metadata services for this catalog\n -->\n <service name="allServices" serviceType="Compound" base="">\n <service name="ncdods" serviceType="OPENDAP" base="/thredds/dodsC/"/>\n <service name="ncss" serviceType="NetcdfSubset" base="/thredds/ncss/grid/"/>\n <service name="wms" serviceType="WMS" base="/thredds/wms/"/>\n <service name="iso" serviceType="ISO" base="/thredds/iso/"/>\n <service name="ncml" serviceType="NCML" base="/thredds/ncml/"/>\n <service name="uddc" serviceType="UDDC" base="/thredds/uddc/"/>\n </service>\n <!-- \n Create a folder for all the FMRC Feature Collections\n -->\n <dataset name="COAWST Model Runs">\n <metadata inherited="true">\n <serviceName>allServices</serviceName>\n <authority></authority>\n <dataType>Grid</dataType>\n <dataFormat>NetCDF</dataFormat>\n <creator>\n <name vocabulary="DIF">OM/WHSC/USGS</name>\n <contact url="" email=""/>\n </creator>\n <publisher>\n <name vocabulary="DIF">OM/WHSC/USGS</name>\n <contact url="" email=""/>\n </publisher>\n <documentation xlink:href=""\n xlink:title="Carolinas Coastal Change Program"/>\n <documentation xlink:href=""\n xlink:title="ReadMe.txt"/>\n </metadata>\n <!-- \n First FMRC Feature Collection\n -->\n <featureCollection name="coawst_4_use" featureType="FMRC" harvest="true" path="coawst_4/use/fmrc">\n <metadata inherited="true">\n <documentation type="summary">ROMS Output from COAWST</documentation>\n <serviceName>allServices</serviceName>\n </metadata>\n <!-- \n Inside the featureCollection, but outside the protoDataset, we define the NcML that happens\n before the aggregation. 
To get aggregated, we must have grids, so we turn the bed params\n into grids by giving them a pseudo coordinate in Z. If we don't do this, they will not be \n aggregated. \n -->\n <netcdf xmlns="">\n <variable name="Nbed" shape="Nbed" type="double">\n <attribute name="long_name" value="pseudo coordinate at seabed points"/>\n <attribute name="standard_name" value="ocean_sigma_coordinate"/>\n <attribute name="positive" value="up"/>\n <attribute name="formula_terms" value="sigma: Nbed eta: zeta depth: h"/>\n <values start="-1.0" increment="-0.01"/>\n </variable>\n <attribute name="Conventions" value="CF-1.0"/>\n </netcdf>\n\n <!-- \n Specify which files to scan for the collection, and say when to scan them.\n (here we scan at 3:30 and 4:30 every morning. 4:30 is just in case the model\n finishes late)\n -->\n <collection spec="/usgs/vault0/coawst/coawst_4/Output/use/$"\n olderThan="10 min"/>\n <update startup="true" rescan="0 30 3,4 * * ? *" trigger="allow"/>\n\n <!-- \n Specify the dataset to use for non-aggregated variables and \n global attributes. NcML changes here are applied after the data\n has been aggregated. \n -->\n <protoDataset choice="Penultimate">\n <netcdf xmlns="">\n <variable name="temp">\n <attribute name="_FillValue" type="float" value="0.0"/>\n </variable>\n <variable name="salt">\n <attribute name="_FillValue" type="float" value="0.0"/>\n </variable>\n <variable name="Hwave">\n <attribute name="_FillValue" type="float" value="0.0"/>\n </variable>\n <variable name="zeta">\n <attribute name="_FillValue" type="float" value="0.0"/>\n </variable>\n </netcdf>\n </protoDataset>\n <!-- \n Specify what datasets the user will access. Usually we just \n want the "best time series" aggregation. 
\n -->\n <fmrcConfig regularize="false" datasetTypes="Best"/>\n </featureCollection>\n\n </dataset>\n</catalog>\n\n}}}\n\nThe best place to find more information on setting up the TDS is usually the documents linked from the latest TDS tutorial from Unidata.\n\nAs I type this, the most recent is:\n\n\nwhich links to:\n\n\n\n\nSteve Baum has some tutorials also.\n\nseems to have been superseded by\n\nbut this also points to now-out-of-date pages at Unidata.\n
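One quick way to sanity-check a scan regExp like the ones used in the aggregation examples above, before restarting the TDS, is to run it against candidate filenames with `grep -E` (the filenames here are made up; Java regexes and POSIX extended regexes agree for patterns this simple):

```shell
# Only names ending in "mabgom_avg_" plus exactly four digits plus ".nc" should survive
printf '%s\n' mabgom_avg_0001.nc mabgom_his_0001.nc mabgom_avg_12.nc notes.txt \
  | grep -E '.*mabgom_avg_[0-9]{4}\.nc$'
```

Only mabgom_avg_0001.nc should be printed.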
Working on the ADCIRC high-res grid developed for VDATUM in the Gulf of Maine\n\nMade a movie of currents over the tidal cycle from the forecast run\nusing "tide_movie.m" in \nc:\rps\maine\models\adcirc\vdatum\tide_movie.m\n\nExtracted elevation and velocity time series for the MVCO 12-m node location using\nc:\rps\maine\models\adcirc\vdatum\adcirc_tseries.m\n\nCompared this to actual data at MVCO:\nc:\rps\vs\mvco\n\nADCIRC ver. 44.19T was used for the Vdatum GOM simulation. I'm trying to use ADCIRC 46.32. Here's what I had to do to get the new version going:\n\nThe spatially varying friction used to be in fort.21, but now it's in fort.13.\n\n-Rich
You can browse archived NEXRAD Level II and III data using NOAA's Weather and Climate Toolkit\n\nYou use the tool to first "order" data, which generates an NCDC ID number. You then can view the data you ordered by entering that order number into the space provided under NCDC data. You can then animate to KMZ (but watch the memory -- about 80 frames is the max on my PC).
run anaconda.bat and see what it spits back.\nIt says it put something in the path, but it didn't. So you have to add it at the top.\nI added this:\n{{{\nset path=%path%;C:\programs\anaconda\;c:\programs\anaconda\scripts\n}}}\n
config.options.chkHttpReadOnly = false;\n
{{{\n mencoder "mf://*.jpg" -of rawvideo -mpegopts format=mpeg1:tsaf:muxrate=2000 -o output.mpg -oac lavc -lavcopts acodec=mp2:abitrate=224 -ovc lavc\n -lavcopts vcodec=mpeg2video:vbitrate=1152:keyint=15:mbd=2:aspect=4/3\n}}}\n\nThis converts a bunch of jpg files to mpeg1 in a form that will play on Windows.\n\nmencoder is a command-line tool that also works on Windows, and comes with the "mplayer" package at:\n\n
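Note that `mf://*.jpg` picks up the frames in shell glob order, so zero-padded sequential names keep the frames in the right order (frame10.jpg would otherwise sort before frame2.jpg). A small bash sketch for renaming, with hypothetical filenames:

```shell
# Work in a scratch directory with some fake, unpadded frame names
demo=$(mktemp -d)
cd "$demo"
touch frame1.jpg frame2.jpg frame3.jpg frame10.jpg

# Rename to zero-padded 0001.jpg, 0002.jpg, ... in natural numeric order (GNU sort -V)
n=1
for f in $(printf '%s\n' frame*.jpg | sort -V); do
  mv "$f" "$(printf '%04d.jpg' "$n")"
  n=$((n+1))
done
```

After this, frame10.jpg has become 0004.jpg, and the mf:// glob sees the frames in order.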
Chen's group is running a 30 year hindcast of the Gulf of Maine using the gom3 grid, and we want to compare to all available current meter data that has been collected. The run starts in 1978, and is available via OPeNDAP at:\n\n\nWe can use the NCTOOLBOX for Matlab \n\nto easily perform the model/data comparison. \n\nMake sure you have the latest version, which has Alex Crosby's cool new contributions. Either this version:\n\nor the latest distribution obtained via Mercurial (hg)\n{{{\nhg clone nctoolbox\n}}}\nThen get this zip file\n\nthat has three m-files to be run in this order:\n\nStep 1. Run "hindcast_gom3_uv.m". This searches for all the time series data from NMFS, WHOI and USGS in the model space and time domain, reads the observed data via OPeNDAP, and then finds the closest model grid point (z,lon,lat) and interpolates both model and data to an hourly time base. \nStep 2. Run "hindcast_stats.m". This does tidal analysis, low-passed filtering, and complex correlation analysis on the model and data time series.\nStep 3. Run "hindcast_plot.m". This plots up some of the statistics.\n\nYou should be able to modify these to do just about any kind of time series analysis that is desired.\n\n[img[image 1|]]\n[img[image 1|]]\n[img[image 1|]]\n[img[image 1|]]\n
We want to assess the quality of the RADARSAT winds, so we are comparing to winds measured over water at various locations. But how do we know if the wind data is okay? Some of these sensors, especially the AGIP wind sensors might not be maintained very well. So one way, although indirect and using model results, is to compare to a met model. Of course, the MET model might not work very well in some regions either, but just as a check, let's compare the COAMPS model results to the buoy and platform data. All these observations were reduced from recorded anemometer heights to 10 m height using the neutral stability assumption.\n\nFor the period of Jan 20 - Feb 20, 2003, here's how the hourly data compares, with \nSeries 1 = DATA\nSeries 2 = MODEL\n|!Sta |!mean1| !theta| !std1 |!mean2 |!theta |!std2 | !corr |!theta | !transfn |!theta|\n| Ada | 5.29| 264.2| 6.47| 4.33| 221.9| 6.12| 0.70| -39.8| 0.66| -39.8|\n| Amelia | 2.58| 184.8| 5.37| 3.65| 198.7| 6.52| 0.65| 5.9| 0.79| 5.9|\n| Annabella | 4.86| 246.8| 4.23| 4.55| 202.8| 6.89| 0.36| 89.9| 0.58| 89.9|\n| Barbara C | 3.24| 194.8| 5.86| 3.38| 210.8| 6.40| 0.73| 36.7| 0.80| 36.7|\n| Fratello | 3.87| 132.6| 5.13| 4.36| 133.9| 6.16| 0.66| 7.8| 0.79| 7.8|\n| Garibaldi A | 1.79| 169.1| 4.64| 3.04| 194.8| 6.33| 0.60| -0.9| 0.82| -0.9|\n| Giovanna | 3.42| 251.9| 6.34| 4.22| 168.9| 6.81| 0.54| -57.8| 0.58| -57.8|\n| Pennina | 3.06| 163.8| 5.99| 2.91| 165.1| 7.44| 0.64| 7.6| 0.79| 7.6|\n| Acqua Alta | 5.81| 232.7| 6.33| 5.29| 225.0| 6.21| 0.71| -5.4| 0.70| -5.4|\n| Piran | 6.46| 241.5| 6.51| 6.70| 251.1| 6.04| 0.76| 8.8| 0.70| 8.8|\n| Senigallia | 4.02| 166.9| 6.78| 2.93| 151.5| 5.72| 0.64| -5.6| 0.54| -5.6|\n\nIt looks like Annabella is the only one we can clearly throw out. It's interesting that all the model standard deviations except for Ada and Senigallia are larger than observed, suggesting that the modeled winds are too strong. 
It also seems like there might be compass problems at Ada and Barbara C, with the Ada data being rotated 39 degrees to the right (clockwise) of the model, and Barbara C being rotated 37 degrees to the left (counterclockwise) of the model.\n\n\n\n
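The veering angles quoted above and in the table are the phase of the complex correlation between the two vector wind series. For reference, writing each wind record as w = u + iv, a standard definition (my notation; not spelled out in the original notes) is:

```latex
% Complex correlation between wind series w_1 = u_1 + i v_1 (data) and
% w_2 = u_2 + i v_2 (model); angle brackets denote a time average and
% * the complex conjugate.
\rho = \frac{\left\langle w_1^{*} w_2 \right\rangle}
            {\left\langle w_1^{*} w_1 \right\rangle^{1/2}
             \left\langle w_2^{*} w_2 \right\rangle^{1/2}}
     = |\rho|\, e^{i\theta}
```

|ρ| is the rotation-invariant correlation magnitude, and the phase θ is the mean veering angle between the two series.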
__SBE37-IM__\nDownloaded Seabird "Seasoft2" \n\nHad to "run as administrator", installed in c:\Program Files (x86)\Sea-Bird\nwhich contains\nCnv37IMHex.exe\nThis program converts Hex to ASCII (Format=0 to Format=1)
{{{\n#!/bin/bash\n# Convert each compressed BDF font to a compressed PCF font\nfor font in *bdf.Z ;\ndo\n base=`echo $font | sed -e 's/\.bdf\.Z//'`\n zcat $font | dxfc > $base.pcf\n compress $base.pcf\ndone\n}}}
{{{\n1/2 cup mayonnaise\n1 large egg, beaten\n1 tablespoon Dijon mustard\n1 tablespoon Worcestershire sauce\n1/2 teaspoon hot sauce\n1 pound jumbo lump crab meat, picked over\n20 saltine crackers, finely crushed\n1/4 cup canola oil\nLemon wedges, for serving\n}}}\nIn a small bowl, whisk the mayonnaise with the egg, mustard, Worcestershire sauce and hot sauce until smooth.\nIn a medium bowl, lightly toss the crabmeat with the cracker crumbs. Gently fold in the mayonnaise mixture. Cover and refrigerate for at least 1 hour.\nScoop the crab mixture into eight 1/3-cup mounds; lightly pack into 8 patties, about 1 1/2 inches thick. In a large skillet, heat the oil until shimmering. Add the crab cakes and cook over moderately high heat until deeply golden and heated through, about 3 minutes per side. Transfer the crab cakes to plates and serve with lemon wedges.\nMAKE AHEAD\nThe crab cakes can be prepared through Step 2 and refrigerated overnight.
The tinyUrl of this page is <>.\n\nThis is a short tutorial on how to use Matlab to easily extract a geographic range of bathymetry and topography data you want from web services, create nice color-shaded relief maps, and output to a highly interactive 3D visualization tool. \n\nFirst install NCTOOLBOX, the toolbox for Matlab that lets you access netcdf files, OPeNDAP datasets and other gridded data using the same syntax. \nFollow the download instructions at\nThen try this:\n{{{\nncRef =''; % Access an OpenDAP Data URL\n[data,geo]=nj_subsetGrid(ncRef,'topo',[-70.9 -70.1 41.15 41.65]); % Select a lon/lat subset of variable "topo"\nimagesc(geo.lon,,data); axis xy % Plot it up\n}}}\n\nThere are a lot of bathymetry datasets at\n\nYou want to pick a dataset, then choose the OpenDAP service, and then do "select all" and "copy" commands in the OPeNDAP Data URL window to copy the OpenDAP Data URL to the clipboard. Then create a string with this URL in Matlab (I start a string, then "paste" the URL, and close the string).\n\nNow let's try accessing some NOAA tsunami inundation DEM data for Nantucket. \n\nFirst let's get an overall view of what the grid looks like, so we'll subsample every 10 points to make it fast:\n{{{\nurl='';\nnc=ncgeodataset(url);\nz=nc{'topo'}(1:10:end,1:10:end);\ng=nc{'topo'}(1:10:end,1:10:end).grid;\nimagesc(g.lon,,z);axis xy\n}}}\nNow zoom in to the region you really want, and then do:\n{{{\nax=axis;\nax=[-70.5356 -70.2343 41.2162 41.4639]; %for example\n}}}\nto store the map limits in the variable "ax". \n\nTo extract data from a specified geographic bounds with subsetting, you need to first create a "geovariable" and then use the "geosubset" method, which includes the ability to subset by geographic range as well as striding. 
\n{{{\nzvar=nc.geovariable('topo');\ns.lon=ax(1:2);\;\ns.h_stride=[2 2];\nstruc=zvar.geosubset(s); % returns data and grid\ng=struc.grid;\;\nimagesc(g.lon,,z);axis xy\n}}}\nThere is a very nice package for working with bathymetry in Matlab called Mirone: After installing Mirone, go to the Mirone directory in Matlab and type:\n{{{\ngrid2mirone(z,g);\n}}}\nYou can then do sun illumination, do interactive profiles, export to Google Earth (if coordinates are lon/lat) and export to Fledermaus files, and then view with the free Iview4D from\nAn example file exported from Mirone you can load in Iview4D is here:\n\n
Note: Original recipe published in NY Times: May 25, 2010, adapted from a recipe from Zingerman's Bakehouse, Ann Arbor, MI, with further slight modifications by me.\n\nTime: About 2 hours \n*1 tablespoon packed brown sugar (or barley malt syrup)\n*2 tablespoons softened unsalted butter (or lard)\n*2 tablespoons instant yeast \n*5 1/2 - 6 cups bread flour \n*2 cups water\n*1 tablespoon kosher salt \n*2 cups lye solution for dipping (1Tbs NaOH pellets + 2 cups water) \n\nCoarse sea salt or pretzel salt, for sprinkling (do not substitute kosher salt). \n1. In a mixing bowl, stir together sugar, salt, butter, 2 cups 110 deg warm water and 1 cup of flour. Heat mixture if necessary to 100 degrees in microwave. Mix yeast into mixture and let stand for 5 min. Add 3 more cups of flour and stir just until mixture comes together in a shaggy mass, adding as much flour as necessary. \n2. Turn out onto counter (or attach dough hook to mixer) and knead for 8 to 10 minutes, until smooth and supple, adding more of the remaining flour as necessary. Flatten into a fat disk and cut like a pizza into 12 pie shaped wedges. Let rest 5 minutes. \n3. Roll out each piece into a rope about 22 inches long. (For traditional shape, the ends should be thin and the center fat.) An easy way to do this is to separate the top and bottom of the wedge with your fingers, starting from the pointing end, sort of "unfolding" the wedge so that it has two pointy ends and is fat in the middle. Then roll it out! Then lift both ends, twist them around each other once, then bring ends back and press them on either side of fat “belly,” at about 4 o’clock and 8 o’clock. Transfer shaped pretzels to a baking sheet lined with parchment paper.\n4. Let rise at room temperature for 20 minutes, then put in freezer for 20 min to make them easier to handle when you dip them in lye. Instead of freezing, you can also refrigerate for at least one hour and up to overnight. \n5. Heat oven to 425 degrees. \n6. 
In a deep bowl, wearing rubber or latex gloves, make a 3% lye solution by pouring 1 Tbsp (1/2 oz=15 g) NaOH pellets into 2 cups (1/2 liter=500 g) water (pour lye carefully into water to avoid splashing). Dip each pretzel in the solution, turning it over for 10 to 15 seconds, and place back on the baking sheet. \n7. Cut an arc on the fat part of the pretzel about 1/4 inch deep with a razor blade.\n8. Sprinkle pretzels with salt. Bake on a greased baking sheet about 15 minutes or until deep brown. Remove to a rack and serve warm with butter.\nYield: 12 pretzels. \n
The Best-Ever Lentil Salad\n2 ¼ cups (1 lb.) Du Puy lentils, rinsed and drained\n1 medium red onion, diced\n1 cup dried currants (you can also use raisins or other dried fruit)\n1/3 cup capers\nVinaigrette:\n1/3 cup cold-pressed, extra-virgin olive oil\n¼ cup apple cider vinegar\n1 Tbsp maple syrup\n1 Tbsp Dijon mustard\n2 tsp sea salt\n2 tsp freshly ground pepper\n\n3 ½ tsp spices:\n1 tsp ground cumin\n½ tsp ground turmeric\n½ tsp ground coriander\n½ tsp ground cardamom\n¼ tsp cayenne pepper\n¼ tsp ground cloves\n¼ tsp freshly grated nutmeg\n¼ tsp ground cinnamon \n\nOptional add-ins:\nArugula\nWalnuts\nFresh goat cheese\nFresh herbs, such as flat-leaf parsley, cilantro, or basil\nSprouts\nCrispy seasonal veggies\nDirections:\n1. In a pot, bring lentils to a boil, reduce to a simmer and cook until al dente, 15 to 20 minutes. Remove from heat, drain, and run under cold water. Once cooled slightly, place lentils in a large serving bowl.\n2. While the lentils are simmering, make the dressing: Place all ingredients in a jar with a tight-fitting lid and shake vigorously to combine.\n3. Toss lentils with dressing. Add onion, currants, and capers. Add optional ingredients, such as herbs, greens, and cheese, just before serving.\nThis salad can hang out in the fridge for a couple of days.\n
* 2 T olive oil\n* 2 cups chopped onion\n* 3 T finely chopped cilantro stems\n* 4 cloves minced garlic\n* 1 T minced chipotle pepper in adobo\n* 1 T ground cumin\n* 1 lb dried black beans\n* 3 cups water\n* 1 15 oz can tomatoes (whole, diced, crushed, doesn't matter)\n* Juice from 1 lime\n* chicken bouillon cubes (I use chicken "Better Than Bouillon")\n* greek yogurt\n\nSoak beans in water for 7-8 hours (e.g. overnight or during the workday)\nHeat oil in pressure cooker over medium heat, add onion and cumin and cook till softened, 3-4 mins. Turn to medium low, add garlic, cook for 2 mins. Drain the soaked beans and add to cooker with 3 cups water. When up to pressure, time 9 mins. Turn off and run under water to cool. Open and add drained tomatoes. Add water if necessary. Add chicken bouillon until salty enough. Add lime juice and serve with a big dollop of greek yogurt, cilantro leaves, and sriracha if desired.\n
Please join our model interoperability group. Note that although you need a Google Account to access Google Groups, you can use your work e-mail for your Google Account. Unless you use gmail all the time, I recommend that you make a new Google Account associated with your work e-mail. If you already have a Google Account, you just have to "sign out" of your personal Google Account first. This way you will get e-mail from Google Groups delivered to your work e-mail, and you will be able to post or reply to messages from the group from your work e-mail.
According to Briggs' thesis, the median grain size of the 41 ebb-dominated samples (north side of Middle Ground) is 710 µm, while for the 2 symmetric flood-dominated samples it is 340 µm.\n\nCould try a ROMS run with 710 and 340 µm sand, perhaps 1 m of each.
Pound, Brine and Grill Fast:\nPound: discard the tender, pound to 1/2 inch thick\nBrining: To brine the chicken, dissolve 1-1/2 tablespoons of non-iodized table salt (or 1/4 cup of kosher salt) with 1/4 cup of sugar in 8 cups of cold water. This will make enough brine for 4 chicken breasts. If you are making more or less, adjust the amount of brine accordingly. The sugar in the brine will caramelize on the surface of the chicken as it cooks, giving it a nice, grilled coloring. To help dissolve the sugar and salt, simply add them to 1 cup of boiling water, stir until dissolved and add the mixture to the remaining water. Make sure the brine has cooled before adding the chicken. You can brine in a shallow, covered baking dish or a large zip-lock bag. Make sure to brine for at least 30 minutes. It is important that you give the brine enough time to work, but don't overdo it. \nGrill: Directly from brine to grill -> grill on hot, 2 minutes a side.
After searching around for the download link, I realized that you have to send an e-mail to get the source code. I sent mine to "Fulcher, Crystal W" <> and she sent me a file adc51_12.tar. Then, following the ADCIRC developers guide at, I did:\n\n{{{\ncd /peach/data1/rsignell/Adcirc\ntar xvf adc51_12.tar\ncd v51.12\nfind . -name \*.bz2 -exec bunzip2 \{\} \;\ncd work\n\n# add -lnetcdff (fortran lib) ahead of the netcdf lib\nsed 's/-lnetcdf/-lnetcdff -lnetcdf/' makefile > foo\nmv foo makefile\n\nmake adcirc compiler=pgi NETCDF=enable NETCDFHOME=/share/apps/netcdf NETCDF4=enable NETCDF4_COMPRESSION=enable\nmake padcirc compiler=pgi NETCDF=enable NETCDFHOME=/share/apps/netcdf NETCDF4=enable NETCDF4_COMPRESSION=enable\nmake adcprep compiler=pgi NETCDF=enable NETCDFHOME=/share/apps/netcdf NETCDF4=enable NETCDF4_COMPRESSION=enable\n\n}}}\n\nRunning ADCIRC:\n{{{\nadcprep [select option 1]\nadcprep [select option 2]\n}}}\nStep 2 initially bombed because NWS=1 was set in fort.15, which in older versions signified reading the fort.13 file for spatially varying parameters. In the v51.12 version, if NWS=1 you need to specify on the next line a parameter name for the data in the fort.13 file (e.g. mannings n, chezy, etc). Since the fort.13 file was not provided, I changed to NWS=0 and then adcprep ran.\n\nI built only the adcirc, padcirc and adcprep targets, as I didn't need SWAN support.
In the \n{{{\n./esmf/build_config\n}}}\n directory, I made a new sub-directory\n{{{\n./esmf/build_config/CYGWIN_NT-5.1.gfortran.default\n}}}\nand copied into it all the files from \nLinux.g95.default\n\nI then made these modifications to CYGWIN_NT-5.1.gfortran.default/\n{{{\nESMF_F90DEFAULT = gfortran (instead of g95)\nESMF_F90COMPILEOPTS += (removed -fno-second-underscore)\nESMF_F90LINKOPTS += (removed -fno-second-underscore)\n}}}\n\nI then set:\n{{{\n$ env | grep ESMF\nESMF_DIR=/cygdrive/c/rps/models/esmf\nESMF_INSTALL_PREFIX=/usr/local/esmf\nESMF_COMM=mpiuni\nESMF_ABI=32\nESMF_COMPILER=gfortran\n}}}\n\nTyping "make" produced
When I tried building GDAL using ./configure on with NetCDF4, I ran into problems at the linking stage due to unresolved HDF5 libraries. \n\nThe solution I found was to first do:\n{{{\nrsignell@gam:~$ nc-config --all\n\nThis netCDF 4.1.1 has been built with the following features:\n\n --cc -> gcc\n --cflags -> -I/usr/local/netcdf-4.1.1/include\n --libs -> -L/usr/local/netcdf-4.1.1/lib -lnetcdf -lhdf5_hl -lhdf5 -lz -lm -lcurl\n}}}\nwhich showed me what libraries need to be linked when building applications using netCDF4. So I then did this:\n{{{\n export LIBS='-lnetcdf -lhdf5_hl -lhdf5 -lz -lm -lcurl'\n ./configure --prefix=$HOME --with-netcdf=/usr/local/netcdf-4.1.1\nmake install\n}}}\nand it worked.\n
The problem: building mexnc from source with netcdf and opendap libraries seems to work fine with R14 on RHEL4, but then the applications fail.\n\nThe cause:\nIt turns out that the default compiler on RHEL4 is gcc version 3.4.4 (gcc -v), but Matlab R14 only supports 3.2.3, so the shared libraries are incompatible.\n\nThe solution: install the gcc 3.2.3 compatibility libraries from the Redhat network if they are not already installed.\n\nThen do:\n{{{\ncd $MATLAB/sys/os/glnx86\ncp -p\nrm -f\nln -s /usr/lib/ .\n}}}\n\n(1) For non-DAP Mexnc:\nModify the "" file so that in the branch for your architecture (in my case glnx86)\n\nCC="gcc32"\n\nMake sure that the MATLAB environment variable is set correctly for the version you are building, and then just type "make". \n\n(2) For DAP-enabled Mexnc:\n\nmodify so that\nCC="gcc32"\nthen type "make -f makefile_dap"\n\nTESTING:\nAfter these tiny changes, "mexnc/tests/test_mexnc.m" worked with *no errors* for both the regular netcdf version of mexnc (with large file support) and with the opendap version (without large file support), once I deleted the existing .nc files in the tests directory.\nImportant note: when testing using "test_mexnc.m" in the "mexnc/tests" directory, make sure there are no existing .nc files in the "tests" directory before you start the test. Also, when testing the opendap version, if you get "too many connects", try starting Matlab over -- I think some of the tests are forgetting to close the file after opening.
Wow, here is the new easy way to build NCVIEW on cygwin (thanks to Ward Fisher at Unidata!). There are quite a few optional cygwin packages to locate and install using setup.exe, but it beats building packages!\n\n1. Install NetCDF, HDF5, Curl, libXaw, libICE, udunits, libexpat and libpng by using the Cygwin setup.exe, searching for "netcdf", "hdf5", "curl", "Xaw", "libICE", "udunits", "libexpat" and "libpng" and installing these packages:\n{{{\nNetCDF libnetcdf-devel, libnetcdf7, netcdf\nHDF5 1.8.9-1: hdf5, libhdf5-devel, libhdf5_7 \ncurl 7.27.0-1: libcurl4\nlibXaw 1.0.11-1: libXaw-devel, libXaw7\nlibICE 1.0.8-1: libICE-devel, libICE6\nlibpng 1.5.12-1: libpng-devel, libpng15, libpng\nudunits 2.1.24-1: libudunits-devel, libudunits0, udunits\nlibexpat 2.1.0-1: libexpat1, libexpat1-devel\n}}}\n\n2. Build NCVIEW\n{{{\nwget\ntar xvfz ncview-2.1.1.tar.gz\ncd ncview-2.1.1\n./configure --prefix=/home/rsignell\nmake install\n}}}\n\nSuccess: /home/rsignell/bin/ncview works!\n\n\n\nCompare this to the old way:\nFirst set some environment variables and run configure:\n{{{\nexport FC=gfortran\nexport CPPFLAGS='-DNDEBUG -Df2cFortran'\nexport CFLAGS='-O -fno-builtin'\n./configure --prefix=/usr/local\n}}}\nI edited the Makefile and replaced the NETCDFLIB and NETCDFLIBDIR lines with:\n{{{\nNETCDFLIB = -lnetcdf -lhdf5_hl -lhdf5 -lz -lm -lsz\nNETCDFLIBDIR = -L/usr/local/netcdf/lib -L/usr/local/hdf5-1.8.1/lib -L/usr/local/szip-2.1/lib\n}}}\nThen typed "make" and everything worked. I was able to view both "classic" and "NetCDF-4" files!\n\nNOTE: I don't think the FC=gfortran did anything, and I don't even know whether the CPPFLAGS or CFLAGS options were necessary, but that's just what I had set previously to build some other stuff, and the build worked, so I'm reporting them here.
On win32\n{{{\nexport FC=ifort\nexport CPPFLAGS="-fPIC -DpgiFortran"\nexport FFLAGS="-i-static"\n./configure --prefix=/usr/local/netcdf/ifort\nmake \nmake check\nmake install\n}}}
Note: the first time I untarred the {{{HDF5189-win32-cygwin.tar.gz}}} file, I untarred it in my home directory without noticing that it didn't make a directory for itself. So I ended up with a bunch of stuff in my bin,lib, etc that I didn't want there. So I used this slick command to remove all the tar files that got untarred:\n{{{\ntar tfz HDF5189-win32-cygwin.tar.gz | sort -r | xargs rm -f 2\n}}}\nBut here's how not to screw up\n{{{\ncd $HOME\nwget\nmkdir hdf5\ntar xvfz HDF5189-win32-cygwin.tar.gz -C $HOME/hdf5\n}}}\n\n\n
I pretty much just followed this "porting guide":\n\n\n{{{\n$ export FC=gfortran\n$ export F90=gfortran\n$ gfortran --version\nGNU Fortran (GCC)
!!Why rebuild the mex files?\n\nThere are two reasons to build Seagrid mex files -- either you need bigger arrays or you have a Machine/Matlab version for which the Seagrid mex files have not been built.\n\n!!Changing the array sizes\n\nThe maximum size of the grid that MEXSEPELI can handle is set in Because SEAGRID effectively doubles the grid for computational purposes, if you need a final grid that is 400x400, you need to set NX and NY in to something greater than 800.\n\nAlso the maximum size of the boundary that MEXRECT can handle is set in mexrect.c. You need to increase the size of the Z array in the main routine, and also the size of R and T in the RECT subroutine.\n\n!!Building the mex files\n\nThere are two Fortran mex files and a C mex file\n{{{\nmexrect.F\nmexsepeli.F\nmexinside.c\n}}}\nthat need to be built.\n\nCheck which Fortran compilers work with your platform and version of Matlab at\n
Building udunits 1.12.4 on cygwin\n{{{\nexport FC=gfortran\nexport CPPFLAGS='-DNDEBUG -Df2cFortran'\nexport CFLAGS='-O -fno-builtin'\n./configure --prefix=/usr/local\nmake install\n}}}
Building udunits 1.12.4 on linux\n{{{\nexport CC=gcc\nexport FC=ifort\nexport CPPFLAGS='-DNDEBUG -DpgiFortran'\nexport CFLAGS='-O -fno-builtin'\n./configure --prefix=/usr/local\nsudo make install\n}}}
{{{\nexport LDFLAGS="-L/home/tomcat/lib"\nexport CPPFLAGS="-I/home/tomcat/include " \n ./configure --enable-dap --enable-shared --prefix=/home/tomcat --enable-netcdf-4\n}}}
Started with RHEL4 system (kernel 2.6.9-67.ELsmp) with gcc 3.4.6 and all the standard -devel packages installed.\n\nI set these flags: \n{{{\nexport CPPFLAGS='-DNDEBUG -DpgiFortran -Drestrict='\nexport CC=gcc\nexport FC=ifort\nexport CFLAGS='-O -fno-builtin'\n}}}\n\nand then built:\n{{{\n/usr/local/udunits-1.12.4\n/usr/local/libdap-3.7.3\n/usr/local/libnc-dap-3.7.0\n/usr/local/antlr-2.7.7\n/usr/local/nco-3.9.2\n}}}\n\nIn each directory, I built with this sequence: \n{{{\n./configure\nmake\nsudo make install\n}}}\nexcept that I used\n{{{\n./configure --prefix=/usr/local/nco \n}}}\nfor nco since I didn't want it to overwrite my old nco binaries in /usr/local/bin.\n\nNotes: I first built with the most recent libdap (3.7.10), but although libnc-dap-3.7.0 was fine with this, nco-3.9.2 was NOT, and failed the opendap test during configure. So I then cleaned all the *dap* stuff out of /usr/local/lib and removed the /usr/local/include/libdap and /usr/local/include/libnc-dap directories. Also, I used the "-Drestrict" parameter to CPPFLAGS, since NCO said that this is needed if the compiler is not C99 compliant. I didn't know if gcc 3.4.6 was C99 compliant or not, so I defined "-Drestrict". Also, the "-DpgiFortran" is necessary for the Intel Fortran compiler (ifort), and also for the pgi fortran compiler, while "-Df2cFortran" is set for the G77 compiler. This should be automatically detected by the configure scripts, but I just wanted to make sure.\n
To get ncSOS working, you need to get your time series in NetCDF using CF-1.6 conventions.\n\nWe tried for a long time to convert our 4D (ntimes,1,1,1) time series data using NcML, but in the end we had to convert the actual netCDF files.\nThe ipython notebook 'cmg_ts2cf_no_stationdim' on shows a working example. \n\nI learned a few things:\n* you need the global attribute "naming_authority", because that gets used in ncSOS.\n* ncSOS doesn't seem to work with (ntimes, nstations) files, even if nstations=1. When I removed the station dimension, ncSOS worked. \n\nThis retrieves data:\n
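For reference, the shape such a file ends up with can be sketched in CDL. The variable and attribute names here are illustrative, not taken from the actual files, and the naming_authority value is a placeholder; the key points are the timeSeries featureType, the cf_role="timeseries_id" station variable, scalar lat/lon, and no station dimension:

```
netcdf ts_sketch {
dimensions:
    time = UNLIMITED ;
    name_strlen = 16 ;
variables:
    char station_name(name_strlen) ;
        station_name:cf_role = "timeseries_id" ;
    double time(time) ;
        time:standard_name = "time" ;
        time:units = "seconds since 1970-01-01 00:00:00 UTC" ;
    double lon ;
        lon:standard_name = "longitude" ;
        lon:units = "degrees_east" ;
    double lat ;
        lat:standard_name = "latitude" ;
        lat:units = "degrees_north" ;
    double temperature(time) ;
        temperature:standard_name = "sea_water_temperature" ;
        temperature:units = "Celsius" ;
        temperature:coordinates = "time lat lon" ;

// global attributes:
        :Conventions = "CF-1.6" ;
        :featureType = "timeSeries" ;
        :naming_authority = "" ;
}
```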
Source: Nick Napoli\nScott Gallagher needs to associate bottom photographs with sediment type and sidescan for species estimates.
Click Windows "Start" button\n Type "Accounts" in the Start search box, and then click User Accounts under Programs.\n\n In the User Accounts dialog box, click Change my environment variables under Tasks.\n Make the changes that you want to the user environment variables for your user account, and then click OK.
1 quart chocolate Haagen Dazs\n1/2 cup heavy cream\n4 oz semi-sweet chocolate\n1 habanero\n\nPut cream in pyrex measuring cup, cut 4 vertical slits in habanero and immerse in cream. Squeeze habanero against side of cup to get cream inside. Microwave until the cream begins to boil, 1-1.5 minutes. Squeeze the habanero again. You should see some oil floating on top of the cream. Taste a tiny bit of the cream -- it should taste very spicy. If not, let the pepper rest some more, and microwave a bit longer. Mix in 4 oz chopped chocolate until melted and uniform consistency. Cool in freezer until cool. Microwave ice cream for 30 seconds. Mix cream/chocolate/habanero mixture into ice cream and refreeze.
*AJAX (web pages don't need to reload on every action -- can have one element of the page change)\n*Canvas/SVG (images rendered directly in browser) (SVG similar to 2D VRML)\n*jQuery/Prototype (JavaScript stopped sucking in browser)\n*REST (URLs can be data accessors!) -- APIs in a distributed manner (evolution of the Ajax idea)\n*Google/Microsoft/Apple/Mozilla all have very fast javascript compilers (way faster than Python for everything besides number crunching)\n\n*jQuery (used almost everywhere)\n*Backbone (decouple data from the DOM) (DOM = Document Object Model (what drives the "HTML"))\n*D3 (data transformation, primarily for vis)\n\n\none page web server using tornado\n\n
/***\n| Name|CloseOnCancelPlugin|\n| Description|Closes the tiddler if you click new tiddler then cancel. Default behaviour is to leave it open|\n| Version|3.0 ($Rev: 1845 $)|\n| Date|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\n***/\n//{{{\nmerge(config.commands.cancelTiddler,{\n\n handler_orig_closeUnsaved: config.commands.cancelTiddler.handler,\n\n handler: function(event,src,title) {\n this.handler_orig_closeUnsaved(event,src,title);\n if (!store.tiddlerExists(title) && !store.isShadowTiddler(title))\n story.closeTiddler(title,true);\n return false;\n }\n\n});\n\n//}}}\n\n
Notes from reading the Dec 9, 2009 report "Interim Framework for Effective Coastal and Marine Spatial Planning".\n\nMarine Spatial Planning is one of 9 priority objectives of the Interagency Ocean Policy Task Force (under the White House Council on Environmental Quality). The nine are:\n* Ecosystem-based management\n* Coastal and Marine Spatial Planning\n* Inform Decisions and Improve Understanding\n* Coordinate and Support\n* Climate Change and Ocean Acidification\n* Regional Ecosystem Protection and Restoration\n* Water Quality and Sustainable Practices on Land (that affect health of oceans)\n* Arctic\n* Observations and Infrastructure\n\nFor CMSP, goal is to come up with a system that optimizes decisions across all sectors: economic, environmental, cultural and security.\n\nAfter reading, not quite sure how the authority for CMSP planning works. \n\nCMS Plans would be adaptive and flexible, open, based on best scientific information, would be evaluated periodically, and would adopt the Principle 15 of Rio: "Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation".\n\nGeographic scope: high water line to edge of Continental Shelf or EEZ (whichever is further).\n\nConsistent planning scale for CMSP should be the established Large Marine Ecosystem (LME) scales. These are described at 5 in Alaska, California Current, Gulf of Mexico, Caribbean Sea, Southwest US, Northeast US, and Hawaii.\n
From MWRA reports (for example, the 2010 Outfall Monitoring Report): the total nitrogen loading from the outfall (which is secondary with a little primary mixed in during storms) is about 12,000 Metric Tons/Year. The average discharge rate is 350 MGD * (0.043) = 15 m3/s. \n This means an average total nitrogen concentration of 12000/15/(3600*24*365) = 25 ppm. This goes into a water column that is about 30 m deep, tidal currents are about 10 cm/s, and the tidally-averaged currents are about 10 cm/s (10 km/day).\n\nFor Falmouth, from this West Falmouth Mass Estuaries Project report (, the waste water treatment facility (WWTF) discharges about 0.46 MGD (319 gallons/minute), which equals 0.46 MGD * (0.043 cms/mgd) = 0.02 m3/s. The concentration of total nitrogen from the town waste water site: was about 30 ppm in 2004, but is about 3 ppm now. The estimated annual load of total nitrogen from the existing treatment plant, therefore, if discharged to the ocean, would be about 0.02 m3/s * 3e-6 * 3600*24*365 = 1.8 Metric Tons/Year (1 Metric Ton = 1000 kg = 1 m3 water)\n\nCurrently only 3% of Falmouth is sewered, but if we assumed that all of Falmouth was sewered, and multiply this number by 30, we get 30*1.8 = 54 Metric Tons/year. That would mean that the Boston Outfall puts (12,000/54 = 222) more than 200 times as much nitrogen into Mass Bay as the Falmouth Outfall would!\n\nThe population of Falmouth is 30,000, and the population sewered by MWRA is about 2 million (and 5500 businesses); so only 60x more people. So why 200 times as much nitrogen? Partly because Falmouth has better treatment. Plus some error?
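A quick sanity check of the arithmetic above (constants are those from the text; 1 MGD taken as 0.043 m3/s, 1 metric ton = 1000 kg):

```python
SEC_PER_YEAR = 3600 * 24 * 365
MGD_TO_CMS = 0.043  # 1 million gallons/day in m3/s (approximate)

# MWRA outfall: 12,000 metric tons N/year diluted into ~350 MGD of effluent
mwra_q = 350 * MGD_TO_CMS                      # ~15 m3/s
mwra_conc = 12e6 / (mwra_q * SEC_PER_YEAR)     # kg/m3
print(round(mwra_conc * 1000))                 # mg/L (ppm): ~25

# Falmouth WWTF: 0.46 MGD at ~3 ppm (3e-3 kg/m3) total nitrogen
fal_q = 0.46 * MGD_TO_CMS                      # ~0.02 m3/s
fal_load = fal_q * 3e-3 * SEC_PER_YEAR / 1000  # metric tons/year: ~1.9
print(round(fal_load, 1))

# scale to a fully-sewered town (only ~3% sewered now => factor of ~30)
ratio = 12000 / (fal_load * 30)
print(round(ratio))                            # more than 200
```

Carrying full precision gives a ratio near 215 rather than 222, but either way the Boston outfall load is roughly 200 times the hypothetical Falmouth load.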
From the NCO manual:\nIt is possible to use a combination of these operations to compute the variance and standard deviation of a field stored in a single file or across multiple files. The procedure to compute the temporal standard deviation of the surface pressure at all points in a single file involves three steps.\n{{{\n ncwa -O -v prs_sfc -a time\n ncbo -O -v prs_sfc\n ncra -O -y rmssdn\n}}}\nFirst construct the temporal mean of prs_sfc in the file Next overwrite with the anomaly (deviation from the mean). Finally overwrite with the root-mean-square of itself. Note the use of ‘-y rmssdn’ (rather than ‘-y rms’) in the final step. This ensures the standard deviation is correctly normalized by one fewer than the number of time samples. The procedure to compute the variance is identical except for the use of ‘-y var’ instead of ‘-y rmssdn’ in the final step. \n\nHere's what I actually did:\n{{{\ncd /http/www/tomcat/apache-tomcat-7.0.22/data/gom3_hindcast\n}}}\nStep 1. Compute the mean for each month. I ran the script "do_mean":\n{{{\nmore do_mean\n\n#!/bin/bash\n# create the monthly means from the gom3 data\nfor file in /http/www/CODFISH/Data/FVCOM/NECOFS/Archive/gom3_*.nc\ndo\n ext=`echo $file | sed -e 's/.*\s///'`\n outf=`echo $ext | sed -e 's/gom3/gom3_mean/'`\n if [ -f $outf ]\n then\n echo $outf exists\n else\n echo processing $outf\n ncra -O $file $outf\n fi\ndone\n}}}\nThis took 4 days to complete! \n\nStep 2. Compute the standard deviation. In Step 1 I computed the means using "ncra" instead of "ncwa" to be left with a time dimension of 1 that could be easily time aggregated. Unfortunately, for calculating the standard deviation we can't have the time dimension in there because "ncbo" will get upset about differencing files with two different time dimensions. Luckily, it's easy to remove this singleton dimension using "ncwa" as shown below, creating a file "". The "" file, without time dimension, is then used to compute the anomaly. 
\n{{{\nmore do_std\n#!/bin/bash\n# create the monthly standard deviation from the gom3 data\nfor file in /http/www/CODFISH/Data/FVCOM/NECOFS/Archive/gom3_*.nc\ndo\n ext=`echo $file | sed -e 's/.*\s///'`\n meanf=`echo $ext | sed -e 's/gom3/gom3_mean/'`\n outf=`echo $ext | sed -e 's/gom3/gom3_std/'`\n if [ -f $outf ]\n then\n echo $outf exists\n else\n echo processing $outf\n# remove time dimension from $meanf so that ncbo will work\n ncwa -O -a time $meanf\n# compute anomaly (difference from mean)\n ncbo -O --op_typ=subtraction $file $outf\n# compute rmssdn from anomalys\n ncra -O -y rmssdn $outf $outf\n fi\ndone\n}}}\n
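As a sanity check on the recipe: "rmssdn" normalizes by one fewer than the number of samples, so the three steps (mean, anomaly, root-mean-square of the anomaly) reproduce the ordinary sample standard deviation. A pure-Python illustration (the data values are arbitrary):

```python
import math

x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # stand-in for a time series
n = len(x)

# step 1 (ncra/ncwa): the temporal mean
mean = sum(x) / n
# step 2 (ncbo): the anomaly, i.e. deviation from the mean
anom = [v - mean for v in x]
# step 3 (ncra -y rmssdn): root-mean-square, normalized by n-1
rmssdn = math.sqrt(sum(a * a for a in anom) / (n - 1))

# identical to the textbook sample standard deviation
sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
print(rmssdn, sd)
```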
//{{{\nconfig.options.chkHttpReadOnly = false; // means web visitors can experiment with your site by clicking edit\nconfig.options.chkInsertTabs = true; // tab inserts a tab when editing a tiddler\nconfig.views.wikified.defaultText = ""; // don't need message when a tiddler doesn't exist\nconfig.views.editor.defaultText = ""; // don't need message when creating a new tiddler \n//}}}\n
You need to fork the main matplotlib repo, so you have your own copy\nassociated with your github account:\n\n\nOnce you've forked it, clone it and create a branch:\n\ngit clone my-forked-repo-url\ncd matplotlib\ngit checkout -b my_awesome_new_feature\n# ... hack hack hack ...\ngit commit -am "Useful commit message here"\ngit push origin my_awesome_new_feature\n\nOnce you've done that, make a pull request by following the\ninstructions here:\n\n\nOnce you've done that, congratulations!\n
{{{\nC:\sRPS\svs\sgrids>gdal_translate -a_srs "+proj=utm +zone=19 +datum=NAD83" MiddleGround_110706_1m.asc -of GMT Mid_110706_1m.grd\nInput file size is 1781, 1515\n\nC:\sRPS\svs\sgrids>gdal_translate -a_srs "+proj=utm +zone=19 +datum=NAD83" MiddleGround_1.0_rtk.asc -of GMT Mid_092706_1m.grd\nInput file size is 1781, 1515\n\nC:\sRPS\svs\sgrids>gdal_translate -a_srs "+proj=utm +zone=19 +datum=NAD83" MiddleGround_1.0_rtk.asc Mid_092706_1m.tif\nInput file size is 1781, 1515\n0...10...20...30...40...50...60...70...80...90...100 - done.\n}}}\n
The tinyURL of this page is <>\n\nArt deGaetano provided daily weather data (5km grid) stretching back to 1970 for the NE. The data is written in an ASCII file, one file per day, and looks like this:\n{{{\n$ head DEM_NE_5.0_2005_01_01.txt\n dem0 48.00 -84.00 275.8 255.0\n dem1 48.00 -83.96 275.9 255.1\n dem2 48.00 -83.92 275.6 255.2\n dem3 48.00 -83.88 275.9 255.4\n dem4 48.00 -83.83 275.9 255.0\n}}}\nThe first column can be ignored, and then we have lat, lon, tmax, tmin.\n\nI wanted to see what this looked like, so I loaded it into Matlab. First, on the Linux (or cygwin under Windows) command line, I did\n{{{\n$ cut -c11- DEM_NE_5.0_2005_01_01.txt > foo.txt \n}}}\nthen in Matlab\n{{{\n>> load foo.txt\n>> lat=foo(:,1); lon=foo(:,2);\n\n>> min(lon(:))\nans = -84\n>> max(lon(:))\nans = -66.0400\n>> min(lat(:))\nans = 36.0400\n>> max(lat(:))\nans = 48\n>> ind=find(lon>-83.97&lon<=-83.95);\n>> mean(diff(lat(ind)))*60\nans = -2.5003\n>> ind=find(lat>36.05 & lat<=36.10);\n>> mean(diff(lon(ind)))*60\nans = 2.4995\n}}}\nSo I'm pretty sure the dx and dy are supposed to be 2.5 minutes, even though there weren't enough significant digits written in the text file to determine the exact spacing.\n\nArmed with the interval, max and min of lat and lon, we can use the GMT (Generic Mapping Tools) program "xyz2grd" to create the NetCDF grid files from the text files. In the directory where the text files are, I created and ran this bash script "do_dem2nc" that calls xyz2grd to convert each file. We have to do tmin and tmax separately because "xyz2grd" can only handle one variable, unfortunately. But we'll fix this later with NcML. 
So here is "do_dem2nc":\n\n{{{\n#!/bin/bash\n## DO_DEM2NC script convert DEM text files to NetCDF using GMT's "xyz2grd" routine\nfor file in *.txt\ndo \n base=`echo $file | sed -e 's/\s.txt//'`\n echo "creating ${base}, ${base}"\n## cut column 4 (tmax)\n cut -f2,3,4 $file | xyz2grd -: -G${base} -I2.5m -R-84/-66.041667/36.041667/48\n## cut column 5 (tmin)\n cut -f2,3,5 $file | xyz2grd -: -G${base} -I2.5m -R-84/-66.041667/36.041667/48\ndone\n}}}\n\nSo we run it:\n{{{\n./do_dem2nc\n}}}\nwhich produces a bunch of NetCDF files.\n\nThis produces the netcdf files, but to give them time values (extracted from the file names) and make them CF compliant, we use this NcML:\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<netcdf xmlns="">\n <dimension name="lon" orgName="x"/>\n <dimension name="lat" orgName="y"/>\n <variable name="lon" orgName="x">\n <attribute name="long_name" value="longitude"/>\n <attribute name="units" value="degrees_east"/>\n </variable>\n <variable name="lat" orgName="y">\n <attribute name="long_name" value="latitude"/>\n <attribute name="units" value="degrees_north"/>\n </variable>\n <aggregation type="union">\n <netcdf xmlns="">\n <variable name="tmin" orgName="z">\n <attribute name="long_name" value="minimum temperature"/>\n <attribute name="units" value="kelvin"/>\n <attribute name="_FillValue" type="float" value="NaN"/>\n </variable>\n <aggregation dimName="time" type="joinNew">\n <variableAgg name="z"/>\n <scan location="c:/RPS/dem/" regExp=".*DEM.*tmin\$"\n dateFormatMark="DEM_NE_5.0_#yyyy_MM_dd"/>\n </aggregation>\n </netcdf>\n <netcdf xmlns="">\n <variable name="tmax" orgName="z">\n <attribute name="long_name" value="maximum temperature"/>\n <attribute name="units" value="kelvin"/>\n <attribute name="_FillValue" type="float" value="NaN"/>\n </variable>\n <aggregation dimName="time" type="joinNew">\n <variableAgg name="z"/>\n <scan location="c:/RPS/dem/" regExp=".*DEM.*tmax\$"\n dateFormatMark="DEM_NE_5.0_#yyyy_MM_dd"/>\n </aggregation>\n 
</netcdf>\n </aggregation>\n</netcdf>\n}}}\n\nSo to use this on another system, just need:\n1) Install GMT on Linux (or Cygwin on Windows)\n2) Run the do_dem2nc script\n3) Change the "location" in the NcML to point to your local directory.\n\nThen it should be able to be added as a THREDDS dataset in a TDS catalog, just like the PRISM data.\n \n\n\n
{{{\nc:\sRPS\sbathy\sseth>gdalwarp vs_draft.asc -s_srs "+proj=utm +zone=19 +datum=WGS84"\n -t_srs "+proj=latlong +datum=NAD83" vs_geo.tif\n\nc:\sRPS\sbathy\sseth>gdalwarp BB_ALL_DRAFT.asc -s_srs "+proj=utm +zone=19 +datum=WG\nS84" -t_srs "+proj=latlong +datum=NAD83" buzzbay_geo.tif\n}}}\nThen fired up ArcGIS10.1 and ran "RastertoNetCDF" to convert the tif to netCDF (I could use GDAL, but the GDAL netCDF output is not CF compliant, requires making NcML, and this was just 2 datasets)\n
I had some previous COAWST calculation stored as\ndata: hsum\ngrid: g\nHere's how I converted to ArcGIS for Brad to consume:\n\n{{{\n%grid2regular.m\ncd c:\srps\scf\scoawst\nload str_max_hours_0.2.mat\nhsum=hsum*100; % convert [0-1] fraction => [0-100] percent\nax=[ -72.5 -65 39.5 46.0];\n\n% determine model index ranges within lon/lat bounding box\n[jj,ii]=lonlat2ij(g.lon,,ax);\n\nlon=g.lon(jj,ii);\,ii);\nhsum=hsum(jj,ii);\n\n% regular grid at 0.02 degree spacing over the bounding box\ndx=0.02;\ndy=0.02;\nx=ax(1):dx:ax(2);\ny=ax(3):dy:ax(4);\n[xi,yi]=meshgrid(x,y);\nzi=griddata(lon(:),lat(:),hsum(:),xi,yi);\ngz.lon=x;\;\n\n%%\ncd c:/rps/m/mirone200\nmirone\ngrid2mirone(zi,gz);\n}}}\n\nIn Mirone, I did "Save Grid As => ESRI .hdr labeled" and then zipped up the 3 files that were generated.\n\nWhen Brad brought them into ArcGIS, the map was all black because of the 1e36 values where NaN should have been, but he saved as an Arc GRID, and then reloaded and it was okay.\n
Step 1. Convert EGM96 Geoid to WGS84 Ellipsoid using interactive calculator:\n\n\nStep 2. Convert WGS84 Ellipsoid (G1150) to NAD_83 (CORS96) using interactive calculator:\n\n\nStep 3. Convert NAD83 to NAVD88 using GEOID09 interactive calculator:\n\n\nTry low-lying area at Scituate:\nlat
Use windows program "mov2avi" to convert quicktime .mov files from GNOME to an .avi that can be played in PowerPoint under Windows.\n{{{\nc:\srps\sprograms\smov2avi -c19\n}}}\nwill make a file called "gnome_movie.avi" with no compression.\n{{{\nmov2avi -?c\n}}}\nwill show all the CODECS. Unfortunately there isn't a "Microsoft RLE", which is perfect for GNOME movies. But once you've got an uncompressed .avi, you can convert it to RLE using VideoMach.
I used to use "gribtocdl" and "gribtonc" to convert GRIB files to NetCDF. These tools no longer work for GRIB2 files.\n\nBut here are two easy ways to convert GRIB2 files (and GRIB1 for that matter) to NetCDF:\n\n* Use WGRIB2's NetCDF output option: \n{{{\nwgrib2 ds.wspd.bin -netcdf\n}}}\n* Use NetCDF Java: \n{{{\njava -classpath ~/java/jar/toolsUI-2.2.18.jar ucar.nc2.iosp.grib.Grib2Netcdf ds.wspd.bin\n}}}\n\nWGRIB2 writes lon/lat values, while NetCDF java just writes the projected coordinates.\n\nOf course, the best thing would be to ask whoever is producing the GRIB files to put up a GDS server so that they could be accessed as NetCDF to begin with!
Dave Schwab gave me the bathymetry for the Great Lakes as 162 tiles in the form of GEODAS ".g98" format files. These are simple binary files with a 128-byte header (see documentation at <> for more information). After perusing the g98 format doc, I wrote a Matlab program to input g98 files and output NetCDF files. Because I wanted to use "gdal_merge" to merge the NetCDF files into a single tile, and GDAL plots files with decreasing lat upside down, I wrote the NetCDF files "upside down". After using "gdal_merge" (version from FWTOOLS 2.2.8), the resulting single tile 800MB GeoTIFF looked fine.\n\n{{{\nStep 1. In Matlab, run "c:\srps\scf\sglos\sgeodas-great-lakes\sread_g98.m". This produces 161 NetCDF files. I had to skip the 5th .g98 file, "gna42080.g98" as it would not read correctly, despite having the same file size on disk as the other tiles. \n\nStep 2. In the FWTOOLS command shell, run "do_merge.bat", which looks like this:\n\ngdal_merge.bat -o big1.tif ... ^\ ... ^\\n\nI had to do this because *.nc does not work on the PC. The "caret symbols" are continuation symbols for Windows Batch Files.\n\nStep 3. Convert the GeoTIFF back to NetCDF.\n\nC:\sRPS\scf\sglos\sgeodas-great-lakes>gdal_translate big1.tif -of NetCDF\n\nUnfortunately, this creates just a "Band1" variable with no metadata.\n\nStep 4. 
Generate the NcML to put in standard COARDS Compliant Topo form:\n <netcdf xmlns=""\n location="/var/local/glos/thredds/Bathymetry/">\n <dimension name="lon" orgName="x"/>\n <dimension name="lat" orgName="y"/>\n <variable name="lon" shape="lon" type="double">\n <attribute name="units" value="degrees_east"/>\n <values start="-93.00" increment="0.0008333333333333333"/>\n </variable>\n <variable name="lat" shape="lat" type="double">\n <attribute name="units" value="degrees_north"/>\n <values start="50.00" increment="-0.0008333333333333333"/>\n </variable>\n <variable name="topo" orgName="Band1">\n <attribute name="units" value="meters"/>\n <attribute name="long_name" value="Topography"/>\n </variable>\n <attribute name="Conventions" value="COARDS"/>\n </netcdf>\n\n}}}\n
in OSGeo4W shell:\n{{{\ne:\sDEMS_NELiDAR\sdems\n\ -o maine.tif 19_04534856.img 19_04534854.img 19_04534852.img 19_04534851.img 19_04544856.img 19_04544854.img 19_04544852.img 19_04544851.img 19_04564856.img 19_04564854.img 19_04564852.img\n}}}
MassGIS data is all in NAD83 Mass State Plane Coordinates, in meters. So that is EPSG:26986.\n\nThe 1:25000 coastline for Massachusetts is available as an ArcShapefile from MassGIS at (self-extracting zip which has all the components of the shapefile, including PRJ)\n\nSo on Windows, I fire up the "FWTools Shell" and use "ogr2ogr" from the FWTOOLS set to convert from Arc Shapefile to KML in one step!\n\n{{{\nogr2ogr -f KML -s_srs EPSG:26986 -t_srs EPSG:4326 mass_coast_25k.kml OUTLINE25K_ARC.shp\n}}}\n\nYou can convert to lots of other format types too. Just type "ogr2ogr" to see the list!\n\nFWTOOLS is available for Linux and Windows at\n\nHere's how I converted the coastline shapefile to matlab:\nFirst I converted to GMT format:\n{{{\nogr2ogr -f GMT -s_srs EPSG:26986 -t_srs EPSG:4326 coast.gmt OUTLINE25K_ARC.shp\n}}}\nThen I did a global replace: "#" to "%" and ">" to "NaN NaN NaN".\nThen I was able to just load the file into matlab:\n{{{\ncoast=load('coast.gmt');\n% then join to make a continuous coastline:\nnew=join_cst(coast,0.00001); % with 1e-5 degrees tolerance (1 m)\ncoast=new;\ncomment='Converted from MassGIS 25K coast, NAD83';\nsave coast_mass_25k.mat coast comment\n}}}
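The global replace step ("#" to "%" and ">" to "NaN NaN NaN") can also be scripted rather than done in an editor. A small sketch (the gmt_to_matlab helper is mine, not part of GMT or FWTOOLS):

```python
def gmt_to_matlab(lines):
    """Make a GMT-format ASCII table loadable by Matlab's load():
    '#' header lines become '%' comments, and '>' segment separators
    become NaN rows so segments plot as separate line pieces."""
    out = []
    for line in lines:
        if line.startswith('#'):
            out.append('%' + line[1:])
        elif line.startswith('>'):
            out.append('NaN NaN NaN')
        else:
            out.append(line)
    return out

# tiny illustrative input, not real coastline data
src = ['# converted from OUTLINE25K_ARC.shp',
       '>',
       '-70.5123 41.5456 0',
       '-70.5124 41.5460 0']
print('\n'.join(gmt_to_matlab(src)))
```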
Here's a non-ArcGIS method for converting images and grids from the Mass Bay Open-File DS99 ( to self-describing GeoTIFF images and grids, and to Google Earth.\n\nConversion to GeoTIFF:\n\n1. Install FWTOOLS, as we will use the "gdal_translate" program to convert to Mercator GeoTIFF (with the coordinate system fully specified internally so that no additional files are required), and the "gdalwarp" program to warp the Mercator GeoTIFF to the Geographic (EPSG:4326) coordinate system that Google Earth requires. \n\n2. From the command line (On a PC, start the FWTools Shell first) convert to Mercator GeoTIFF using\n{{{\ngdal_translate -a_srs "+proj=merc +lat_ts=41.65 +lon_0=-
Several years ago I wrote some bash scripts that used GDAL to convert the Arc ASCII grids into hundreds of individual NetCDF files (using gdal_translate) and then munged the whole thing together with a union aggregation of three different joinNew aggregations together with a NetCDF file containing time. It was a tour de force of NcML aggregation, but difficult to update, and a bit inefficient. \n\nSo instead, I just wrote some python scripts to download the ARC ASCII .gz files from PRISM, and convert them to NetCDF using Python, creating single NetCDF files for each decade that contain temp_min, temp_max, precip_mean, lon, lat and time. I put these files on, but here they are. They require the GDAL and NetCDF4-Python packages, which are supplied in Python(x,y) [windows 32-bit only] and in the Enthought Python Distribution (EPD) [mac, linux 32, linux 64, win32, win64]. I used EPD for Linux 64 bit (geoport is a 64-bit machine running Ubuntu).\n\nFirst I wrote a function and script to download the files by decade,\n{{{\ndef get_prism_files(decades,vars):\n\n from ftplib import FTP\n import os\n #decades=['1890-1899','2000-2009']\n #vars=['ppt','tmax','tmin','tdmean']\n #vars=['ppt','tmax','tmin']\n ftp=FTP('')\n ftp.login()\n for decade in decades:\n try:\n os.mkdir(decade)\n except:\n print('Directory ' + decade + ' already exists')\n \n for var in vars:\n ftp.cwd('/pub/prism/us/grids/' + var + '/' + decade)\n for file in ftp.nlst():\n print(file)\n ftp.retrbinary('RETR ' + file,open('./'+decade+'/'+file,'wb').write)\n \n\n os.system('gunzip ./'+decade+'/*.gz')\n \n return\n}}}\nwhich I called using "", which creates directories of uncompressed ARC ascii files, one for each decade:\n{{{\nimport get_prism_files as 
rps\n\n#decades=['1890-1899']\n#decades=['1900-1909','1910-1919','1920-1929','1930-1939','1940-1949']\n#decades=['1950-1959','1960-1969','1970-1979','1980-1989']\ndecades=['1990-1999','2000-2009','2010-2019']\n\n#vars=['ppt','tmax','tmin','tdmean']\nvars=['ppt','tmax','tmin']\n\nrps.get_prism_files(decades,vars)\n}}}\n\nI then called a script to read the data and chunk it into NetCDF files:\n{{{\nimport gdal\nimport os\nimport datetime as dt\nimport netCDF4\nimport numpy as np\nimport re\n\ndecades=['1890-1899','1900-1909','1910-1919','1920-1929','1930-1939','1940-1949',\n '1950-1959','1960-1969','1970-1979','1980-1989','1990-1999',\n '2000-2009','2010-2019']\n\n#vars=['ppt','tmax','tmin','tdmean']\nvars=['ppt','tmax','tmin']\n\n#rps.get_prism_files(decades,vars)\n\n# read 1 sample dataset to get lon/lat \ndataset=gdal.Open('us_tmin_1895.01') # sample file\na=dataset.ReadAsArray() #data\nnlat,nlon=np.shape(a)\nb=dataset.GetGeoTransform() #bbox, interval\nlon=np.arange(nlon)*b[1]+b[0]\nlat=np.arange(nlat)*b[5]+b[3]\n\nbasedate=dt.datetime(1858,11,17,0,0,0)\n\nfor decade in decades:\n \n #create netCDF4 file\n nco = netCDF4.Dataset('./chunk/prism_'+decade+'.nc','w',clobber=True)\n chunk_lon=16\n chunk_lat=16\n chunk_time=12\n #sigdigits=4\n nco.createDimension('lon',nlon)\n nco.createDimension('lat',nlat)\n nco.createDimension('time',None)\n timeo=nco.createVariable('time','f4',('time'))\n lono=nco.createVariable('lon','f4',('lon'))\n lato=nco.createVariable('lat','f4',('lat'))\n # 16 MB for one year:\n tmno = nco.createVariable('tmn', 'i2', ('time', 'lat', 'lon'), \n zlib=True,chunksizes=[chunk_time,chunk_lat,chunk_lon],fill_value=-9999)\n tmxo = nco.createVariable('tmx', 'i2', ('time', 'lat', 'lon'), \n zlib=True,chunksizes=[chunk_time,chunk_lat,chunk_lon],fill_value=-9999)\n ppto = nco.createVariable('ppt', 'i4', ('time', 'lat', 'lon'), \n zlib=True,chunksizes=[chunk_time,chunk_lat,chunk_lon],fill_value=-9999)\n #attributes\n timeo.units='days since 1858-11-17 
00:00:00'\n lono.units='degrees_east'\n lato.units='degrees_north'\n \n tmno.units='degC'\n tmno.scale_factor = 0.01\n tmno.add_offset = 0.00\n tmno.long_name='minimum monthly temperature'\n tmno.set_auto_maskandscale(False)\n \n tmxo.units='degC'\n tmxo.scale_factor = 0.01\n tmxo.add_offset = 0.00\n tmxo.long_name='maximum monthly temperature'\n tmxo.set_auto_maskandscale(False)\n \n ppto.units='mm/month'\n ppto.scale_factor = 0.01\n ppto.add_offset = 0.00\n ppto.long_name='mean monthly precipitation'\n ppto.set_auto_maskandscale(False)\n \n nco.Conventions='CF-1.4'\n \n #write lon,lat\n lono[:]=lon\n lato[:]=lat\n \n pat=re.compile('us_tmin_[0-9]{4}\s.[0-9]{2}')\n itime=0\n #step through data, writing time and data to NetCDF\n #for root, dirs, files in os.walk('mintemp'):\n\n for root, dirs, files in os.walk(decade):\n dirs.sort()\n files.sort()\n for f in files:\n if re.match(pat,f):\n year=int(f[8:12])\n mon=int(f[13:15])\n if mon <= 12 :\n date=dt.datetime(year,mon,1,0,0,0)\n print(date)\n dtime=(date-basedate).total_seconds()/86400.\n timeo[itime]=dtime\n # min temp\n tmn_path = os.path.join(root,f)\n print(tmn_path)\n tmn=gdal.Open(tmn_path)\n a=tmn.ReadAsArray() #data\n tmno[itime,:,:]=a\n # max temp\n tmax_path=tmn_path.replace('tmin','tmax')\n print(tmax_path)\n tmx=gdal.Open(tmax_path)\n a=tmx.ReadAsArray() #data\n tmxo[itime,:,:]=a\n # mean precip\n ppt_path=tmn_path.replace('tmin','ppt')\n print(ppt_path)\n ppt=gdal.Open(ppt_path)\n a=ppt.ReadAsArray() #data\n ppto[itime,:,:]=a\n \n itime=itime+1\n\n nco.close()\n}}}\n\n
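A note on the 'i2'/'i4' variables above: they rely on CF-style packing, where a reader reconstructs each value as packed * scale_factor + add_offset (and set_auto_maskandscale(False) lets us write the already-packed integers directly). A minimal sketch of that arithmetic, in plain Python rather than the netCDF4 API:

```python
# Pack/unpack arithmetic behind the scale_factor/add_offset attributes used
# above (scale_factor=0.01, add_offset=0.0): values are stored as integers
# and unpacked by readers as packed * scale_factor + add_offset.
def pack(value, scale_factor=0.01, add_offset=0.0):
    # round to the nearest storable integer
    return int(round((value - add_offset) / scale_factor))

def unpack(stored, scale_factor=0.01, add_offset=0.0):
    return stored * scale_factor + add_offset
```

With scale_factor=0.01, an 'i2' variable spans roughly -327 to +327 degC at 0.01 degC resolution, which is why it is safe for monthly temperatures but precipitation gets the wider 'i4'.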
Convert NOAA XYZ swath bathy text files from Kate McMullen to 32-bit GeoTIFF and then to ArcGIS grid\n\nIssues: the original xyz file was more than 2 GB, so we had to split it, create two .grd files, and then use grdpaste to put them together. I also discovered that specifying WKT was easier than looking up the EPSG codes for UTM. But EPSG:4326 worked better for going to geographic. The WKT string of "+proj +latlong" didn't specify any ellipsoid, even though the UTM has a NAD83 ellipsoid defined.\n\n\n{{{\n#!/bin/bash\n\n# Convert NOAA XYZ swath bathy text files from Kate McMullen to GeoTIFF\n\n# This script uses tools from\n# GMT:\n# and\n# FWTOOLS:\n\n# ------------------------CONVERT THE METER-RESOLUTION GRID \n\n# set GMT with sufficient precision for big UTM numbers\ngmtset D_FORMAT %15.4f\nminmax H11310_1m_UTM19NAD83.txt\n\n# the results of "minmax" are used here:\n\nxyz2grd -R/298076.7/309351.7/4599754.86/4607312.86/ \s\n-Gkate_1m.grd -I1.0 H11310_1m_UTM19NAD83.txt\n\n# Specify 1/2 grid cell (0.5 m) larger on each end (point=>raster)\n# and specify output projection as NAD83(CSRS98) / UTM zone 19N (EPSG:2037)\ngdal_translate -a_ullr 298076.2 4607313.36 309352.2 4599754.36 \s\n-a_srs "+proj=utm +zone=19 +datum=NAD83" kate_1m.grd kate_1m.tif\n\n# convert UTM to Geographic also\n\ngdalwarp kate_1m.tif -rb -t_srs "EPSG:4326" kate_1m_geo.tif\n\n\n# ------------CONVERT THE HALF-METER-RESOLUTION GRID --------------------\n\n# The 0.5 m file is too big for "minmax" to handle, so split\n# in files with
For the 30 year FVCOM/NECOFS archive run at SMAST, we were using aggregation=union to join an aggregation=joinExisting with two additional netcdf files: 1) a new file containing lon/lat coordinates, and 2) an auxiliary file containing z0. \n\nThe problem was that when wrapped in a union aggregation, the joinExisting wasn't automatically updating as new files were added, and the metadata was not being cached.\n\nSo instead, we decided to modify the 1st file of the aggregation to include the correct lon/lat and add the extra auxiliary information. \n\nTo do this, we first used ncks to grab the 1st time step from the union aggregation. This seg faulted on the local smast machine, but worked okay on\n\n{{{\n ncks -d time,0\n}}}\nThen in Python, we did\n{{{\nIn [10]: nci=netCDF4.Dataset('','r+')\nIn [11]: time=nci.variables['time'][:]\nIn [12]: time\nOut[12]: array([ 43509.], dtype=float32)\nIn [13]: nci.variables['time'][0]=time-1/24.\nIn [14]: time=nci.variables['time'][:]\nIn [15]: time\nOut[15]: array([ 43508.95703125], dtype=float32)\nIn [16]: nci.close()\n}}}\n\nto write a new time value that is one hour (1/24 days) earlier. We then renamed this file to be "", since the first real file in "http/www/CODFISH/Data/FVCOM/NECOFS" that is being aggregated is named "".\n\nWe then moved the existing catalog to \n
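The time bookkeeping above can be sketched in plain Python: the aggregation's time axis counts days since 1858-11-17 (the Modified Julian Day epoch), so one hour earlier is simply time - 1/24.

```python
# Sketch of the time shift above: times are "days since 1858-11-17",
# so subtracting 1/24 of a day moves a record one hour earlier.
from datetime import datetime, timedelta

EPOCH = datetime(1858, 11, 17)  # Modified Julian Day epoch

def days_to_datetime(days):
    return EPOCH + timedelta(days=days)

def one_hour_earlier(days):
    return days - 1.0 / 24.0
```

Note that the IPython session shows 43508.95703125 rather than 43508.9583...: the variable is stored as float32, whose resolution near 43509 is 2^-8 days (about 5.6 minutes), so the written value is rounded to the nearest representable float.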
Converting the Mass Bay 10 m bathy from Arc Binary (Mercator) to ASCII XYZ (geographic)\n{{{\ngdal_translate
The old NOAA smooth sheets are available as TIFF or Mr. Sid. This one is 88 MB as a TIFF, and 8 MB as a Mr. Sid. Can Matlab compress the TIFF to JPEG 2000 (which, like Mr. Sid, also uses wavelet compression) with comparable quality? Let's try making an 8MB JPEG 2000 file to find out.\n\nFirst get the Mr. Sid and TIFF image from NOAA:\n{{{\nwget\nwget\n}}}\nAfter ungzipping them, read the tiff into Matlab, convert from an indexed image to a true color image (needed for JPEG 2000), and then compress to an 8MB file:\n{{{\n[a,map] = imread('c:\sdownloads\sh01832.tif'); % a is 144 MB\nb = ind2rgb8(a,map); % b is 432 MB (3 times bigger than a)\noutput_file_size = 8e6; % size in bytes (8MB)\ns = whos('b');\nratio = s.bytes/output_file_size;\n imwrite(b,'c:\sdownloads\sh01832.jp2','compressionratio',ratio); % results in 8MB file that looks as good as Mr. Sid\n}}}\n
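The ratio arithmetic in the Matlab snippet generalizes easily: the compression ratio is just uncompressed bytes over target bytes, where an indexed image expanded to true-color RGB8 is height x width x 3 bytes. A sketch (the function name and example dimensions are hypothetical, for illustration only):

```python
# Sketch of the compression-ratio arithmetic above: a uint8 true-color
# image occupies height * width * channels bytes, and the JPEG 2000
# compression ratio is uncompressed bytes over the target file size.
def jp2_compression_ratio(height, width, target_bytes=8e6, channels=3):
    return (height * width * channels) / target_bytes
```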
Getting ROMS going with rotated and cell-centered vectors:\n\n1. Install F-TDS, following instructions at: <>\n\n2. Create a clean.ncml file that removes most of the variables, leaving only the ones we want to process with FERRET:\n{{{\n<netcdf xmlns="">\n <remove type="variable" name="Akk_bak"/>\n <remove type="variable" name="Akp_bak"/>\n <remove type="variable" name="Akt_bak"/>\n <remove type="variable" name="Akv_bak"/>\n <remove type="variable" name="Cs_r"/>\n <remove type="variable" name="Cs_w"/>\n <remove type="variable" name="FSobc_in"/>\n <remove type="variable" name="FSobc_out"/>\n <remove type="variable" name="Falpha"/>\n <remove type="variable" name="Fbeta"/>\n <remove type="variable" name="Fgamma"/>\n <remove type="variable" name="M2nudg"/>\n <remove type="variable" name="M2obc_in"/>\n <remove type="variable" name="M2obc_out"/>\n <remove type="variable" name="M3nudg"/>\n <remove type="variable" name="M3obc_in"/>\n <remove type="variable" name="M3obc_out"/>\n <remove type="variable" name="Tcline"/>\n <remove type="variable" name="Tnudg"/>\n <remove type="variable" name="Tobc_in"/>\n <remove type="variable" name="Tobc_out"/>\n <remove type="variable" name="Znudg"/>\n <remove type="variable" name="Zob"/>\n <remove type="variable" name="Zos"/>\n <remove type="variable" name="bustr"/>\n <remove type="variable" name="bvstr"/>\n <remove type="variable" name="dstart"/>\n <remove type="variable" name="dt"/>\n <remove type="variable" name="dtfast"/>\n <remove type="variable" name="el"/>\n <remove type="variable" name="f"/>\n <remove type="variable" name="gamma2"/>\n <remove type="variable" name="gls_Kmin"/>\n <remove type="variable" name="gls_Pmin"/>\n <remove type="variable" name="gls_c1"/>\n <remove type="variable" name="gls_c2"/>\n <remove type="variable" name="gls_c3m"/>\n <remove type="variable" name="gls_c3p"/>\n <remove type="variable" name="gls_cmu0"/>\n <remove type="variable" name="gls_m"/>\n <remove type="variable" name="gls_n"/>\n <remove 
type="variable" name="gls_p"/>\n <remove type="variable" name="gls_sigk"/>\n <remove type="variable" name="gls_sigp"/>\n <!-- <remove type="variable" name="h"/>\n <remove type="variable" name="hc"/>-->\n <remove type="variable" name="lat_psi"/>\n <remove type="variable" name="lat_u"/>\n <remove type="variable" name="lat_v"/>\n <remove type="variable" name="lon_psi"/>\n <remove type="variable" name="lon_u"/>\n <remove type="variable" name="lon_v"/>\n <remove type="variable" name="mask_psi"/>\n <remove type="variable" name="pm"/>\n <remove type="variable" name="pn"/>\n <remove type="variable" name="rdrg"/>\n <remove type="variable" name="rdrg2"/>\n <remove type="variable" name="rho"/>\n <remove type="variable" name="rho0"/>\n <remove type="variable" name="s_w"/>\n <remove type="variable" name="shflux"/>\n <remove type="variable" name="spherical"/>\n <remove type="variable" name="sustr"/>\n <remove type="variable" name="svstr"/>\n<!-- <remove type="variable" name="theta_b"/>\n <remove type="variable" name="theta_s"/>-->\n <remove type="variable" name="w"/>\n <remove type="variable" name="xl"/>\n<!-- <remove type="variable" name="zeta"/>-->\n\n <!--<remove type="variable" name="salt"/> -->\n <variable name="salt">\n <attribute name="missing_value" type="float" value="0.0"/>\n </variable>\n\n <!--<remove type="variable" name="temp"/> -->\n <variable name="temp">\n <attribute name="missing_value" type="float" value="0.0"/>\n </variable>\n\n <aggregation dimName="ocean_time" type="joinExisting" timeUnitsChange="true">\n <scan location="/data/ftp/upload/Estuarine_Hypoxia/umces/chesroms/synoptic/output/history_output/" suffix=".nc" subdirs="true"/>\n </aggregation>\n</netcdf>\n}}}\nAn example "clean.ncml" dataset can be seen at <>\n\n2. Create a Ferret "vectors.jnl" file that points to the "clean.ncml" URL. See the FERRET documentation for syntax (\n{{{\n[tomcat@testbedapps dynamic]$ more vectors.jnl\n\n!use ""\n!use "\nnc"\n!use ""\nuse ""\n\n! 
We define a new axis for the rotated data. We are going to use an average of points i and i + 1 to move the data to the center of the cell.\n! This means there will be fewer points in the centered grid than the original so we need a new set of slightly smaller axis.\ndefine axis/x=1:98:1 xrho\ndefine axis/y=1:148:1 yrho\n\n! Define masked variables if mask eq 1 then var\n! Then u_masked will be used in place of u below, etc.\n\nlet u_masked = if mask_u eq 1 then u\nlet v_masked = if mask_v eq 1 then v\nlet ubar_masked = if mask_u eq 1 then ubar\nlet vbar_masked = if mask_v eq 1 then vbar\n\n! These lines produce new variables for the grid using the ferret shf operator which\n! in this case is effectively subsetting the array and eliminating the first item\n! Note the "let/d=1/units syntax. This instructs ferret to create a new variable (let)\n! and store it in dataset 1 (d=1). The units are to come from the existing lon_rho variable.\n! Without the d=1 the new variables would not be visible to the TDS/NCML subsequent processing.\n! The second and fourth lines of this code block are defining new grids.\n! IMPORTANT NOTE: It doesn't appear that these four line are currently used in further calculations\n! so they could presumably be removed.\nlet/d=1/units="`lon_rho,return=units`" lon_rho_p_0 = lon_rho[i=@shf:+1, j=@shf:+1]\nlet/d=1/units="`lon_rho,return=units`" lon_rho_p = lon_rho_p_0[gx=xrho@asn,gy=yrho@asn]\nlet/d=1/units="`lat_rho,return=units`" lat_rho_p_0 = lat_rho[i=@shf:+1, j=@shf:+1]\nlet/d=1/units="`lat_rho,return=units`" lat_rho_p = lat_rho_p_0[gx=xrho@asn,gy=yrho@asn]\n\n\n! These lines use the shift and grid transform operators to produce the needed angles on the new grid.\nlet/d=1 angle_p_0 = angle[i=@shf:+1, j=@shf:+1]\nlet/title="angle centered" angle_p = angle_p_0[gx=xrho@asn,gy=yrho@asn]\n\n! 
These lines use the shift and grid transform operators to produce a mask on the new grid.\nlet/d=1 mask_p_0 = mask_rho[i=@shf:+1, j=@shf:+1]\nlet/d=1 mask_p = mask_p_0[gx=xrho@asn,gy=yrho@asn]\n\n! These lines average the data to the centers of the cells.\nlet/d=1/units="`u,return=units`" u_p_0 = 0.5*(u_masked[j=@shf:+1]+u_masked[i=@shf:+1,j=@shf:+1])\nlet/title="U centered"/units="`u,return=units`" u_p = u_p_0[gx=xrho@asn,gy=yrho@asn]\nlet/d=1/units="`v,return=units`" v_p_0 = 0.5*(v_masked[i=@shf:+1]+v_masked[i=@shf:+1,j=@shf:+1])\nlet/title="V centered"/units="`v,return=units`" v_p = v_p_0[gx=xrho@asn,gy=yrho@asn]\n\n! These lines average the masked data to the centers of the cells.\nlet/d=1/units="`ubar,return=units`" ubar_p_0 = 0.5*(ubar_masked[j=@shf:+1]+ubar_masked[i=@shf:+1,j=@shf:+1])\nlet/title="UBAR centered"/units="`ubar,return=units`" ubar_p = ubar_p_0[gx=xrho@asn,gy=yrho@asn]\nlet/d=1/units="`vbar,return=units`" vbar_p_0 = 0.5*(vbar_masked[i=@shf:+1]+vbar_masked[i=@shf:+1,j=@shf:+1])\nlet/title="VBAR centered"/units="`vbar,return=units`" vbar_p = vbar_p_0[gx=xrho@asn,gy=yrho@asn]\n! ==========\n\n! Finally we use trig to transform the centered data to the new grid\nLET/d=1 urot = u_p*COS(angle_p) - v_p*SIN(angle_p)\nLET/d=1 vrot = u_p*SIN(angle_p) + v_p*COS(angle_p)\n\n! This transforms the masked data.\nLET/d=1 ubarrot = ubar_p*COS(angle_p) - vbar_p*SIN(angle_p)\nLET/d=1 vbarrot = ubar_p*SIN(angle_p) + vbar_p*COS(angle_p)\n}}}\n\n3. In the THREDDS catalog, create a dataset and reference the vectors.jnl as the data location. 
Remove all the FERRET intermediate variables and specify coordinates for the new cell-centered and rotated velocities using NcML:\n{{{\n<dataset name="ChesROMS - Synoptic (Rotated vectors)" ID="estuarine_hypoxia/chesroms/vectors"\n urlPath="estuarine_hypoxia/chesroms/">\n <serviceName>agg</serviceName>\n <netcdf xmlns=""\n location="/var/www/tomcat/content/las/conf/server/data/dynamic/vectors.jnl">\n <!--location="/data/ftp/upload/Estuarine_Hypoxia/umces/chesroms/synoptic_vectors/chesroms_rot_step2.jnl">-->\n <remove type="variable" name="LON_RHO_P_0"/>\n <!--<remove type="variable" name="lon_rho_p"/>-->\n <remove type="variable" name="LAT_RHO_P_0"/>\n <!-- <remove type="variable" name="lat_rho_p"/>-->\n <remove type="variable" name="ANGLE_P_0"/>\n <remove type="variable" name="ANGLE_P"/>\n <remove type="variable" name="MASK_P_0"/>\n <remove type="variable" name="MASK_P"/>\n <remove type="variable" name="U_P_0"/>\n <remove type="variable" name="U_P"/>\n <remove type="variable" name="V_P_0"/>\n <remove type="variable" name="V_P"/>\n <remove type="variable" name="UBAR_P_0"/>\n <remove type="variable" name="UBAR_P"/>\n <remove type="variable" name="VBAR_P_0"/>\n <remove type="variable" name="VBAR_P"/>\n <attribute name="wms-link" value=""/>\n <attribute name="wms-layer-prefix" value="chesroms-vectors"/>\n <attribute name="title" value="ChesROMS (UMCES) - ROMS-2.2"/>\n <attribute name="id" value="eh.umces.chesroms.synoptic_vectors"/>\n <attribute name="naming_authority" value="noaa.ioos.testbed"/>\n <attribute name="summary"\n value="Chesapeake Bay Application of ROMS/TOMS 2.2 with rotated vectors and no hypoxic variables"/>\n <attribute name="creator_name" value="Wen Long"/>\n <attribute name="creator_email" value=""/>\n <attribute name="creator_url" value=""/>\n <attribute name="cdm_data_type" value="Grid"/>\n\n <variable name="ZETA_P">\n <attribute name="coordinates" value="OCEAN_TIME LAT_RHO_P LON_RHO_P"/>\n </variable>\n <variable name="H_P">\n <attribute 
name="coordinates" value="LAT_RHO_P LON_RHO_P"/>\n </variable>\n <variable name="UROT">\n <attribute name="units" value="m/s"/>\n <attribute name="coordinates" value="OCEAN_TIME S_RHO LAT_RHO_P LON_RHO_P"/>\n <attribute name="standard_name" value="eastward_sea_water_velocity"/>\n <attribute name="missing_value" type="float" value="-1.0e34"/>\n <attribute name="wms-layer" value="chesroms-vectors/sea_water_velocity"/>\n </variable>\n <variable name="VROT">\n <attribute name="coordinates" value="OCEAN_TIME S_RHO LAT_RHO_P LON_RHO_P"/>\n <attribute name="standard_name" value="northward_sea_water_velocity"/>\n <attribute name="missing_value" type="float" value="-1.0e34"/>\n <attribute name="units" value="m/s"/>\n <attribute name="wms-layer" value="chesroms-vectors/sea_water_velocity"/>\n </variable>\n <variable name="UBARROT">\n <attribute name="coordinates" value="OCEAN_TIME LAT_RHO_P LON_RHO_P"/>\n <attribute name="missing_value" type="float" value="-1.0e34"/>\n <attribute name="standard_name" value="barotropic_eastward_sea_water_velocity"/>\n <attribute name="units" value="m/s"/>\n <attribute name="wms-layer" value="chesroms-vectors/barotropic_sea_water_velocity"/>\n </variable>\n <variable name="VBARROT">\n <attribute name="coordinates" value="OCEAN_TIME LAT_RHO_P LON_RHO_P"/>\n <attribute name="missing_value" type="float" value="-1.0e34"/>\n <attribute name="standard_name" value="barotropic_northward_sea_water_velocity"/>\n <attribute name="units" value="m/s"/>\n <attribute name="wms-layer" value="chesroms-vectors/barotropic_sea_water_velocity"/>\n </variable>\n <variable name="S_RHO">\n <attribute name="formula_terms"\n value="s: S_RHO eta: ZETA_P depth: H_P a: THETA_S b: THETA_B depth_c: HC"/>\n </variable>\n <attribute name="Conventions" value="CF-1.0"/>\n </netcdf>\n </dataset>\n}}}
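The trig at the end of the Ferret script is ordinary 2-D vector rotation: given grid-relative components (u, v) and the grid angle (the angle of the xi-axis measured from east, as in ROMS), the east/north components follow directly. As a standalone sketch:

```python
# The rotation applied by the Ferret script above: transform cell-centered
# grid-relative velocity (u, v) through the grid angle (radians) to get
# eastward (urot) and northward (vrot) components.
import math

def rotate_to_east_north(u, v, angle):
    urot = u * math.cos(angle) - v * math.sin(angle)
    vrot = u * math.sin(angle) + v * math.cos(angle)
    return urot, vrot
```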
I used the\nc:\srps\sm\sroms\srk4_run2.m\nfile to produce those cool plots. The rk4 works much better than the "stream2" function in Matlab, but it's an interactive thing, so it's tricky to recreate these plots.
Shell script to cut a ROMS file:\n{{{\n# Cut a valid ROMS file out of another valid ROMS file, \n# Can be used on history, grid or averages files, \n# Fixed, and checked against grid file on July 15, 2010\n\n# Usage: do_cut_roms_grid file_in file_out istart istop jstart jstop (1-based indexing)\n\ndeclare -i XI_RHO_START=$3\ndeclare -i XI_RHO_STOP=$4 \ndeclare -i ETA_RHO_START=$5\ndeclare -i ETA_RHO_STOP=$6\n\ndeclare -i ETA_U_START=ETA_RHO_START\ndeclare -i ETA_U_STOP=ETA_RHO_STOP\n\ndeclare -i XI_U_START=XI_RHO_START\ndeclare -i XI_U_STOP=XI_RHO_STOP-1\n\ndeclare -i ETA_V_START=ETA_RHO_START\ndeclare -i ETA_V_STOP=ETA_RHO_STOP-1\n\ndeclare -i XI_V_START=XI_RHO_START\ndeclare -i XI_V_STOP=XI_RHO_STOP\n\ndeclare -i XI_PSI_START=XI_RHO_START\ndeclare -i XI_PSI_STOP=XI_RHO_STOP-1\n\ndeclare -i ETA_PSI_START=ETA_RHO_START\ndeclare -i ETA_PSI_STOP=ETA_RHO_STOP-1\n\nncks -F -d xi_rho,$XI_RHO_START,$XI_RHO_STOP \s\n -d eta_rho,$ETA_RHO_START,$ETA_RHO_STOP \s\n -d eta_u,$ETA_U_START,$ETA_U_STOP \s\n -d xi_u,$XI_U_START,$XI_U_STOP \s\n -d eta_v,$ETA_V_START,$ETA_V_STOP \s\n -d xi_v,$XI_V_START,$XI_V_STOP \s\n -d eta_psi,$ETA_PSI_START,$ETA_PSI_STOP \s\n -d xi_psi,$XI_PSI_START,$XI_PSI_STOP \s\n $1 $2\n}}}
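The index bookkeeping for the staggered ROMS grid can be summarized compactly (1-based inclusive ranges, matching ncks -F): relative to the rho cut, u points lose one point in xi, v points lose one in eta, and psi points lose one in each. A sketch, with the function name being hypothetical:

```python
# Staggered-grid index arithmetic for cutting a ROMS subdomain:
# given the rho-point window (i0..i1, j0..j1), 1-based inclusive,
# derive the matching windows for the u, v and psi grids.
def roms_cut_ranges(i0, i1, j0, j1):
    return {
        'xi_rho':  (i0, i1),     'eta_rho': (j0, j1),
        'xi_u':    (i0, i1 - 1), 'eta_u':   (j0, j1),
        'xi_v':    (i0, i1),     'eta_v':   (j0, j1 - 1),
        'xi_psi':  (i0, i1 - 1), 'eta_psi': (j0, j1 - 1),
    }
```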
If you want to cut out a spatiotemporal chunk of NCOM output, you can\njust do it using the NCO tools. For example, if you compile and build\nNCO with opendap support (easy now that NetCDF includes a native\nopendap) and UDUNITS, you can create a local netcdf file directly from\nyour best time series aggregation. Here's an example of cutting just\na chunk near the BP site for a 4 day period:\n{{{\nrsignell@gam:~$ ncks -d time,"2010-08-01 00:00","2010-08-05 00:00" -d\nlat,27.0,30.0 -d lon,-90.0,-87.0 -v\nwater_temp,salinity,water_u,water_v,surf_el\n''\n\n}}}
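When given coordinate values rather than indices, ncks (built with UDUNITS) maps the bounds to index ranges against the coordinate variable. A hypothetical helper doing the same mapping, assuming an ascending coordinate vector:

```python
# Map coordinate bounds to an inclusive (0-based) index window, the way
# "ncks -d lat,27.0,30.0" selects by coordinate value rather than index.
def coord_window(coords, lo, hi):
    idx = [i for i, c in enumerate(coords) if lo <= c <= hi]
    return (idx[0], idx[-1]) if idx else None
```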
{{{\nncks -O -F -d time,"2010-04-11 00:00","2010-08-04 00:00" -d lon,-88.37,-85.16 -d lat,29.31,30.4\n}}}\n
geoportal 1.2.4 supports DCAT\n\n"Geoportal Server (1.2.4) now support DCAT (Data Catalog Vocabulary) outputs. The DCAT output is in json format and is available through url pattern http://servername:port/geoportal/rest/find/document?f=dcat, it is possible to add additional parameters to the url as well, please refer to REST API Syntax for additional parameters. "
I just had this problem again: I submit a parallel job (8 cpus) and nothing seems to be happening. I try submitting a serial job in debug mode to see what's going on and it works fine.\n\nThe problem: stray processes that didn't get killed by "qdel".\n\nSolution: \n*Check which nodes the job is running on by doing a "qstat -f"\n*Fire up "konqueror" and see if any of the 4 cpu nodes have more than 4 processes running and if any of the 2 cpu nodes have more than 2 processes running.\n* Kill all the "oceanM" jobs via "rcom killall oceanM"\n* Resubmit the job \nHappiness!
From John Caron:\n\nI've also added the "_CoordinateSystemFor" to be able to assign coordinate systems to a class of variables. in case you dont want to (or cant) tag each data variable.\n note this is not a CF convention. docs here:\n\n\n brand new and little tested. banging on it and feedback would be appreciated. Ill let you decide if/when to share with others.\n <variable name="coordSysVar4D" type="int" shape="">\n <attribute name="_CoordinateAxes" value="x y zpos time"/>\n <attribute name="_CoordinateTransforms" value="zpos"/>\n <attribute name="_CoordinateSystemFor" value="xpos ypos zpos time"/>\n </variable>\n\n <variable name="coordSysVar3D" type="int" shape="">\n <attribute name="_CoordinateAxes" value="x y time"/>\n <attribute name="_CoordinateSystemFor" value="xpos ypos time"/>\n </variable>\n\n The names "coordSysVar4D", "coordSysVar3D" are not important.\n NJ just searches for variables that contain "_CoordinateSystemFor"\n attributes and uses those Coordinate axes for those dimensions.\n
[[netcdf]] [[tds]]
Ken ( and interested UAF folks)\n\nYou guys probably know this, but chunking can really speed up the performance of time series extraction at a point from remote sensing data, yet still yield good performance when returning full scenes at specific times.\n\nWe experimented with 450 1km AVHRR SST images in the Gulf of Maine from 2009. The image size is 1222x1183, and we tried chunking at 50x50.\n| Type of File | Size on Disk | Time series (s) | Single Full Scene (s) |\n| NetCDF3 | 7,400M| 0.07| 1.49|\n| NetCDF4+deflation (no chunking) | 420M| 15.63| 0.92|\n| NetCDF4+deflation (chunksize=50) | 400M| 0.31| 0.89|\n\nThese are the median values for 10 extractions of each type. There are a lot of missing values in this dataset because there are a lot of land values, but it would be interesting to try this on the GHRSST data. The script I used was:\n\n{{{\n#!/bin/bash\nfor file in *.nc\ndo \n /usr/bin/ncks -4 -L 1 -O $file netcdf4a/$file\n /usr/bin/ncks -4 -L 1 --cnk_dmn lat,50 --cnk_dmn lon,50 -O $file netcdf4b/$file\ndone \n}}}\n\nIf you want to play with these 3 datasets yourselves, they are here:\n{{{\n\n}}}
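A back-of-envelope sketch of why the 50x50 chunking helps: a point time series touches one small chunk per time step instead of decompressing whole scenes, while a full scene still only reads a modest number of chunks. The model below simply counts chunks touched (it assumes one read+decompress per touched chunk; chunk shape is given as (time, lat, lon)):

```python
# Count how many chunks a read pattern touches for a chunked 3-D array.
import math

def chunks_touched(n_time, n_lat, n_lon, c_time, c_lat, c_lon,
                   point_series=True):
    if point_series:
        # one (lat, lon) point across all times
        return math.ceil(n_time / c_time)
    # one time step, full scene
    return math.ceil(n_lat / c_lat) * math.ceil(n_lon / c_lon)
```

For the 450-scene, 1222x1183 case above with 50x50 spatial chunks, both access patterns touch a few hundred chunks; without spatial chunking, the point time series would have to decompress every full compressed scene, which is what the 15.63 s row reflects.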
Matlab:\n{{{\nx=urlread('')\na=regexp(x,'bare(.*?)tif.bz2','match');\npart1=''\nfor i=1:2:length(a);\n url=[part1 char(a(i))];\n urlwrite(url,char(a(i)));\nend\n}}}
download all the netcdf-files from a directory:\n\nwget -nc -r -l2 -I /thredds/fileServer/,/thredds/catalog/\n'http://dev-vm188/thredds/catalog/osisaf/'\n\nHere I use the existing datasetScan catalog.xml file, and fetch all\nnc-files up to two links away. Besides the nc-file, I get the\ncatalog-file of the nc-file (e.g.\nhttp://dev-vm188/thredds/catalog/osisaf/,\ntoo.\n\nA catalog-file in the fileServer would be safer, since the 2 levels\n(parent and child) might include other information, but at least I can\noffer our users something now.
Here's how I keep up to date with NCTOOLBOX on my Windows box.\n\n1. I downloaded the Windows binary command line tool for Mercurial from:\n\n\n2. I run the following batch script (c:\sRPS\sm_contrib\strunk\sdo_update_nctoolbox.bat) for Windows, which removes the existing toolbox, and then clones the latest one from the repository:\n{{{\nREM Batch script to update NCTOOLBOX\nrmdir /s /q nctoolbox\nhg clone nctoolbox\n}}}\n\nUpdate: once the source has been cloned, it's not necessary to clone it again. To update, cd to the nctoolbox directory and execute these two commands:\n{{{\ncd ~/nctoolbox\nhg pull\nhg update\n}}}\n\n\nNote: this script will only complete successfully if the NCTOOLBOX jar files are not in use, which means you need to close Matlab first if you've been using NCTOOLBOX (or perhaps clear the jar files from memory somehow, but I just close Matlab).
C code: \nPython-wrapper:
A sample CSW query/response\n{{{\nwget --header "Content-Type:text/xml" --post-file=p.xml '' -O response.txt\n}}}\np.xml looks like\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<csw:GetRecords\n xmlns:csw=""\n xmlns:ogc=""\n xmlns:gmd=""\n xmlns:apiso=""\n xmlns:ows=""\n xmlns:xsd=""\n xmlns:gml=""\n xmlns:xsi=""\n service="CSW"\n version="2.0.2"\n resultType="results"\n outputFormat="application/xml"\n xsi:schemaLocation="\n\n\n\n\n "\n outputSchema=""\n startPosition="1"\n maxRecords="5"\n>\n <csw:Query typeNames="gmd:MD_Metadata">\n <csw:ElementSetName typeNames="gmd:MD_Metadata">full</csw:ElementSetName>\n <csw:Constraint version="1.1.0">\n <ogc:Filter>\n <ogc:PropertyIsEqualTo>\n <ogc:PropertyName>gmd:code</ogc:PropertyName>\n <ogc:Literal>imeds/inundation/simulation/</ogc:Literal>\n </ogc:PropertyIsEqualTo>\n </ogc:Filter>\n </csw:Constraint>\n </csw:Query>\n</csw:GetRecords>\n}}}
Search for recent data containing humidity:\n<,0,180,90&rel=&loc=&ts=2011-03-05T00:00:00&te=2011-03-27T00:00:00=&outputFormat=application/atom.xml>
I grabbed "s3 backup" from A bit terse, but seems to work fine. Started backing up my whole C drive, but that was going to take 14 hours, so ditched that idea. Tried just backing up m_cmg/trunk/cf instead. Make sure to hit the refresh button to see the folder you added. Tried the "usage report" and that worked fine too, showing me that so far I've spent $0.04 on Amazon Web Services.\n\nA review in Laptop magazine mentions some other web-backup services, including HP Upline ($59/year for *unlimited storage*!) I wonder what that really means! They also mention Xdrive ( and LinkUp (, but their favorite is SugarSync ( Said it cost $24.99 in the magazine, but it looked expensive if you had 100 GB.
\nI was interested in the new NetCDF capabilities in ArcGIS 9.2, so I asked John O'Malley to install it on my notebook Windows XP computer. Took a couple of hours. \n\nHere's what I found:\n \nTo read NetCDF files, you can't just click on the "add" button like you would if you were dealing with geotiffs or Arc Grids. You have to use the new "multidimensional tools" in the Arc Toolbox. \n\nArc can read COARDS/CF compliant NetCDF grids that are evenly spaced (by clicking on "make NetCDF raster layer", kind of nonintuitive). It can read the "new style" GMT 4.1 NetCDF grid files, for instance. The caveat is that GMT apparently stores coordinate variables (e.g. x,y lon,lat) as floats, and if you have spacing like 3 arc seconds ( 8.3333e-004 degrees), ArcGIS will complain that the spacing is not uniform and refuse to read the data, even though it's a uniformly spaced grid. ESRI should probably be notified that they should interpret these grids as uniform if the deviation in dx and dy is less than 1.0e-5.\n\nArc can read other types of data from NetCDF files (other than uniformly-spaced grids) via the "make NetCDF feature layer". I tried loading one of our "EPIC" style NetCDF time series files, and I was able to select the "temperature" variable and plot a dot on a map, but there were lots of options on the Gui that I didn't fill in or understand, so I don't know what its capabilities are for time series data. \n\nThere is a very nice cookbook procedure for using ArcGIS 9.2 to animate time dependent NetCDF model data at so there certainly is some time series capability. I tried following the procedure. It all worked. \n\nFor NetCDF output, I tried output of an Arc Grid as NetCDF via "Raster to NetCDF". You do this by clicking on the Command Line Window icon (just to the left of the ?) near the right end of the standard toolbar. 
Then you type "RasterToNetCDF <layer name> <netcdf_file_name>\nI discovered that it writes the georeferencing information into a character attribute called "esri_pe_string" thusly:\n\n{{{\n$ ncdump -h\nnetcdf test36_arc {\ndimensions:\n        lon = 151 ;\n        lat = 80 ;\nvariables:\n        double lon(lon) ;\n                lon:long_name = "longitude coordinate" ;\n                lon:standard_name = "longitude" ;\n                lon:units = "degrees_east" ;\n        double lat(lat) ;\n                lat:long_name = "latitude coordinate" ;\n                lat:standard_name = "latitude" ;\n                lat:units = "degrees_north" ;\n        float topo(lat, lon) ;\n                topo:long_name = "topo" ;\n                topo:esri_pe_string = "GEOGCS[\s"GCS_WGS_1984\s",DATUM[\s"D_WGS_1984\s",SPHEROID[\s"WGS_1984\s",6378137.0,298.257223563]],PRIMEM[\s"Greenwich\s",0.0],UNIT[\s"Degree\s",0.0174532925199433]]" ;\n                topo:coordinates = "lon lat" ;\n                topo:units = "Degree" ;\n                topo:missing_value = 0.f ;\n\n// global attributes:\n                :Conventions = "CF-1.0" ;\n                :Source_Software = "ESRI ArcGIS" ;\n}\n}}}\n\nFor non-geographic projections, if known to CF, it adds the "mapping" variable. 
So for UTM, we get:\n\n{{{\n$ ncdump -h\nnetcdf test36_utm_arc {\ndimensions:\n x = 141 ;\n y = 101 ;\nvariables:\n double x(x) ;\n x:long_name = "x coordinate of projection" ;\n x:standard_name = "projection_x_coordinate" ;\n x:units = "Meter" ;\n double y(y) ;\n y:long_name = "y coordinate of projection" ;\n y:standard_name = "projection_y_coordinate" ;\n y:units = "Meter" ;\n float topo(y, x) ;\n topo:long_name = "topo" ;\n topo:esri_pe_string = "PROJCS[\s"NAD_1983_UTM_Zone_19N\s",GEOGCS[\s"GCS_North_American_1983\s",DATUM[\s"D_North_American_1983\s",SPHEROID[\s"GRS_1980\s",6378137.0,298.257222101]],PRIMEM[\s"Greenwich\s",0.0],UNIT[\s"Degree\s",0.0174532925199433]],PROJECTION[\s"Transverse_Mercator\s"],PARAMETER[\s"False_Easting\s",500000.0],PARAMETER[\s"False_Northing\s",0.0],PARAMETER[\s"Central_Meridian\s",-69.0],PARAMETER[\s"Scale_Factor\s",0.9996],PARAMETER[\s"Latitude_Of_Origin\s",0.0],UNIT[\s"Meter\s",1.0]]" ;\n topo:coordinates = "x y" ;\n topo:grid_mapping = "transverse_mercator" ;\n topo:units = "Meter" ;\n topo:missing_value = 0.f ;\n int transverse_mercator ;\n transverse_mercator:grid_mapping_name = "transverse_mercator" ;\n transverse_mercator:longitude_of_central_meridian = -69. ;\n transverse_mercator:latitude_of_projection_origin = 0. ;\n transverse_mercator:scale_factor_at_central_meridian = 0.9996 ;\n transverse_mercator:false_easting = 500000. ;\n transverse_mercator:false_northing = 0. 
;\n\n// global attributes:\n :Conventions = "CF-1.0" ;\n :Source_Software = "ESRI ArcGIS" ;\n}\n}}}\n\nIf we have a projection that isn't Geographic, and isn't defined one of the accepted "grid_mapping" projections in CF, it writes just the esri_pe_string, as in this Miller Projection:\n\n{{{\n$ ncdump -h\nnetcdf test36_miller_arc {\ndimensions:\n x = 144 ;\n y = 92 ;\nvariables:\n double x(x) ;\n x:long_name = "x coordinate of projection" ;\n x:standard_name = "projection_x_coordinate" ;\n x:units = "Meter" ;\n double y(y) ;\n y:long_name = "y coordinate of projection" ;\n y:standard_name = "projection_y_coordinate" ;\n y:units = "Meter" ;\n float topo(y, x) ;\n topo:long_name = "topo" ;\n topo:esri_pe_string = "PROJCS[\s"Miller Cylindrical\s",GEOGCS[\s"GCS_North_American_1983\s",DATUM[\s"D_North_American_1983\s",SPHEROID[\s"GRS_1980\s",6378137.0,298.257222101]],PRIMEM[\s"Greenwich\s",0.0],UNIT[\s"Degree\s",0.0174532925199433]],PROJECTION[\s"Miller_Cylindrical\s"],PARAMETER[\s"False_Easting\s",0.0],PARAMETER[\s"False_Northing\s",0.0],PARAMETER[\s"Central_Meridian\s",-70.0],UNIT[\s"Meter\s",1.0]]" ;\n topo:coordinates = "x y" ;\n topo:units = "Meter" ;\n topo:missing_value = 0.f ;\n\n// global attributes:\n :Conventions = "CF-1.0" ;\n :Source_Software = "ESRI ArcGIS" ;\n}\n}}}\n\n\nWe should find out if other tools (such as the latest version of GDAL/FWTOOLS) can make use of this string to preserve georeferencing info. Okay, I've checked, and apparently gdal is doing something slightly different. This same file if converted to "NetCDF" with gdal_translate\n\n{{{\n gdal_translate test36_miller.tif -of netCDF\n}}}\n\nproduces this:\n\n{{{\nnetcdf test36_miller_gdal {\ndimensions:\n x = 144 ;\n y = 92 ;\nvariables:\n char miller_cylindrical ;\n miller_cylindrical:Northernmost_Northing = 4911333.35025183 ;\n miller_cylindrical:Southernmost_Northing = 4804259.39790056 ;\n miller_cylindrical:Easternmost_Easting =
/***\n| Name:|ExtentTagButtonPlugin|\n| Description:|Adds a New tiddler button in the tag drop down|\n| Version:|3.0 ($Rev: 1845 $)|\n| Date:|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source:||\n| Author:|Simon Baird <>|\n| License||\n***/\n//{{{\n\n// can't hijack a click handler. must redefine this entirely.\n// would be good to refactor in the core...\n// this version copied from 2.1.3 core\n\n// Event handler for clicking on a tiddler tag\nfunction onClickTag(e)\n{\n if (!e) var e = window.event;\n var theTarget = resolveTarget(e);\n var popup = Popup.create(this);\n var tag = this.getAttribute("tag");\n var title = this.getAttribute("tiddler");\n if(popup && tag)\n {\n var tagged = store.getTaggedTiddlers(tag);\n var titles = [];\n var li,r;\n for(r=0;r<tagged.length;r++)\n if(tagged[r].title != title)\n titles.push(tagged[r].title);\n var lingo = config.views.wikified.tag;\n\n wikify("<<newTiddler label:'New tiddler' tag:"+tag+">>",createTiddlyElement(popup,"li")); // <---- the only modification\n\n if(titles.length > 0)\n {\n var openAll = createTiddlyButton(createTiddlyElement(popup,"li"),lingo.openAllText.format([tag]),lingo.openAllTooltip,onClickTagOpenAll);\n openAll.setAttribute("tag",tag);\n createTiddlyElement(createTiddlyElement(popup,"li",null,"listBreak"),"div");\n for(r=0; r<titles.length; r++)\n {\n createTiddlyLink(createTiddlyElement(popup,"li"),titles[r],true);\n }\n }\n else\n createTiddlyText(createTiddlyElement(popup,"li",null,"disabled"),lingo.popupNone.format([tag]));\n createTiddlyElement(createTiddlyElement(popup,"li",null,"listBreak"),"div");\n var h = createTiddlyLink(createTiddlyElement(popup,"li"),tag,false);\n createTiddlyText(h,lingo.openTag.format([tag]));\n }\n,false);\n e.cancelBubble = true;\n if (e.stopPropagation) e.stopPropagation();\n return(false);\n}\n\n//}}}\n\n
For the NCOM IASNFS data, I wanted every 4th time step, since the data were saved at 6 hour intervals but I wanted only the values at 00:00 hours. So I used the command:\n{{{\nncks -O -F -d Time,"2010-05-18","2010-05-28",4 -v Surface_Elevation\n}}}\nThis takes a long time on my windows cygwin ncks: 12 minutes to get a 12MB file! Something must be working really hard, I guess.\n\nFor the HYCOM 1/25 degree NCODA, the output is daily, so I want every time step. So I did:\n{{{\nncks -O -F -d MT,"2010-05-18","2010-05-28" -v ssh\n}}}\nWhich takes about 10 seconds. Go figure.
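The stride logic can be sanity-checked in plain Python (illustration only, not part of NCO; note that ncks with -F uses 1-based Fortran indexing while Python below is 0-based):

```python
from datetime import datetime, timedelta

# 6-hourly time steps for the NCOM IASNFS example period
start = datetime(2010, 5, 18, 0, 0)
times = [start + timedelta(hours=6 * n) for n in range(44)]  # through 2010-05-28 18:00

# every 4th record (stride 4), starting at the first 00:00 record
daily = times[::4]

assert all(t.hour == 0 for t in daily)  # only midnight values survive
assert daily[-1] == datetime(2010, 5, 28, 0, 0)
```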
FVCOM: All work is GOM3 model for MOP (500 m - 1km in coastal area, highest res in Nantucket Sound). No wetting and drying. Estuaries on south side of Cape Cod look very bad (way too big). Looks like poor coastline was used.\n\nOriginally FVCOM-GOM1-GOM2 open boundary had tides and climatology, but no transport. (clamped low-frequency elevation to zero). Thus the surface mean flow was not too bad (because largely wind driven), but deeper mean flow and gulf stream region were very bad.\n\nOriginal GOM1-GOM2 had 300m cutoff.\nExisting NECOFS forecast from GOM3 use 1500 m cutoff.\n\nSo to assess importance of these, check drifter comparison:\n684 Drifting buoy dataset (Globec and NEFSC 1995-2008)\nTried resetting each 6, 12, 24, 72 hours to see drifter forecast error over these time scales\n\nNew plan: use global FVCOM ocean model. Main problem is forcing of the open boundary condition (temperature and salinity structure). Looked at Global HYCOM and NCOM, and density structure was insufficient to drive Scotian Shelf boundary. Global model is 5-50 km, runs faster than regional model (2.5 days to run 1 year). Data assimilation every day. [we've seen this before for John Warner running his own global WW3, Harvard running the North Atlantic, ADCIRC running the Western North Atlantic, etc... Are global models for driving regional models useful? ]\n\nWave model is currently not being run for hindcast period, could do, but wave model takes: \n1 month = 3 days of run time\n\nCoupled current/wave takes 8 times longer\nwave takes 4 times longer than FVCOM\n\nSo maybe run one year of wave data, so we can compare to other wave models?\nWhat year should we pick? 
A year with big storms, or a year with lots of data?\nPerhaps when we had our Scituate and Boston Buoy tripods?\n\nIf model assessment and model/data comparisons could be done with a Matlab toolbox accessing OPeNDAP data, it would facilitate model analysis and utilization by other groups.\n\nGlobal model is running: 1978- present (only 5 years so far...)\nWill run FVCOM-GOM3 for 1978-2009. \nContacts: Global (Lai and Chen), FVCOM-GOM3 (Zhao, Sun & Chen)\nQC/Accuracy: Chen, Sun & Beardsley\nNOP Database on TDS: Xu\nRequests from MOP, MOP Consultants:\n\nCharge to Advisory Group:\n\nDan: Important to CZM:\nHabitat mapping of seafloor. Sediment, currents, temperatures + biotic communities\n\nseamex classification FGDC standard habitat model: seabed, geoforms (deep valleys, sloping, trench)\n\nWRF: Triple nested: need Domain 1 only for hindcasting (western Atlantic), Domain 3 (Mass Bay/Western Gulf of Maine) is 9km\n\nUMASSD has new server with 240TB (production grade, raid system) with TDS.\n\nWant to compare wave model results for the last two months to COAWST and to wave data.\n\n2009 GOM3 without boundary condition is online.\n\nJim will look at bottom temperature and compare with lobster trap data.\n\nFortran program for regridding to regular grid. \nTurn into a web service so that we can request a grid in the resolution and domain we want.
\n\nGOM3, 48149 nodes, 40 levels: 523GB/year (without overlapping time)\nGOM2, 32649 nodes, 31 levels: 264GB/year\nMBAY: 98432 nodes, 11 layers: 250GB/year\n\nGOM3 sample file at:\n\n\nThis file is 4.3GB, with 72 hourly time steps, 48149 nodes, 40 vertical levels\n{{{\nsource: "FVCOM_3.0"\nhistory: "Sun Aug 2 11:08:33 2009: ncrcat -v x,y,lat,lon,xc,yc,lonc,latc,siglay,siglev,nv,nbe,h,temp,salinity,u,v,ww,zeta,Times -d time,3120,3192 -o\n}}}\nand this history shows that the model run since March 2009 is sitting in a single file called that must be (4.3GB/72steps)*3192steps=190GB (for these 4.4 months).\nSo \n\nGOM2 is 2.17GB for 72 hourly data, 32649 nodes, 31 levels.\n{{{\nsource: "FVCOM_2.6"\nhistory: "Thu Feb 18 08:30:26 2010: ncrcat -O -v x,y,lat,lon,xc,yc,lonc,latc,siglay,siglev,nv,nbe,h,temp,salinity,u,v,ww,zeta,Times -d time,744,816 /data01/necofs/FVCOM/RESULTS/ -o /data01/necofs/NECOFS_NC/\n}}}\nand this history shows that the data for the past 30 days or so is in a file\n\nMBAY, 2.07GB file for 72 hourly data, 98432 nodes, 11 layers.\n{{{\nsource: "FVCOM_3.0"\nhistory: "Thu Feb 18 19:42:25 2010: ncrcat -O -v x,y,lat,lon,xc,yc,lonc,latc,siglay,siglev,nv,nbe,h,temp,salinity,u,v,ww,zeta,Times -d time,240,312 /data01/necofs/FVCOM/output_mbn_layer11\n}}}
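The back-of-envelope file-size arithmetic above can be double-checked (pure arithmetic on the numbers quoted from the history attributes):

```python
# GOM3: a 4.3 GB file holds 72 hourly steps; the parent file holds 3192 steps
gb_per_step = 4.3 / 72
total_gb = gb_per_step * 3192
assert 190 < total_gb < 191   # matches the ~190 GB quoted above

# daily volumes implied by the annual figures, for comparison across grids
gom3_per_day = 523 / 365   # ~1.4 GB/day
gom2_per_day = 264 / 365   # ~0.7 GB/day
assert gom3_per_day > gom2_per_day
```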
Started 1978, run for 10 years. Don't have SSH and SST before 1981. Looks better after assimilation. Fresh water on Scotian slope comes from Gulf of St Lawrence, not Arctic. Expect 30 years run complete in 2-3 months. \n\nGOM3 run did not do assimilation right. Results not good. For multi-cpu run, locking up, found problem. \n\nGlobal model\n24 nodes x 8 cpu = 3 days wall clock for one year simulation\n\nGOM3\n24 nodes x 8 cpu = 1 day wall clock for one month simulation\n\nNECOFS\n24 nodes x 4cpu\n\nupgraded system 3 times, keep adding 20-24 nodes, but each set of nodes has different speeds, so really a bunch of mini 20-24 node systems.\n\nChen will ask about the status of the wave run. \n\nRich will ask Aron Roland about whether WWM can be run for GOM and whether WebEx can be shared with Chen.\n\n
1 cup chickpea flour\n1/2 tsp sea salt\n1/2 tsp freshly ground black pepper\n1 ¼ cup lukewarm water\n3 Tbsp extra-virgin olive oil\nGhee or oil, for pan\n1. In a large bowl, sift chickpea flour, salt, and pepper. Whisk in warm water and olive oil. Let sit, covered, for as many hours as possible (making this before you leave the house in the morning is perfect for making socca for dinner), but at least 30 minutes.\n2. Place heavy (preferably cast-iron) skillet in oven and preheat to 450 F.\n3. Remove skillet from oven. Add a small amount of oil or ghee to the hot pan, and pour batter in a steady stream until it reaches the edges of the pan. Bake for 8 to 10 minutes or until the pancake is firm and the edges are set.\n
{{{\nIn [105]: import sys\n\nIn [106]: sys.path\nOut[106]:\n['',\n '',\n 'c:\s\spython27',\n 'C:\s\sPython27\s\',\n 'C:\s\sPython27\s\sDLLs',\n 'C:\s\sPython27\s\slib',\n 'C:\s\sPython27\s\slib\s\splat-win',\n 'C:\s\sPython27\s\slib\s\slib-tk',\n 'C:\s\sPython27\s\slib\s\ssite-packages',\n 'C:\s\sPython27\s\slib\s\ssite-packages\s\sPIL',\n 'C:\s\sPython27\s\slib\s\ssite-packages\s\swin32',\n 'C:\s\sPython27\s\slib\s\ssite-packages\s\swin32\s\slib',\n 'C:\s\sPython27\s\slib\s\ssite-packages\s\sPythonwin',\n 'C:\s\sPython27\s\slib\s\ssite-packages\s\sIPython\s\sextensions']\n}}}
The MEST will work with JRE 1.6. Make sure you have it installed and make sure it is first in the System PATH environment eg. some versions of the 10g ORACLE client have a habit of installing the JRE 1.4.2 into the windows system PATH which causes the MEST to fail on startup. To check which version the MEST will use:\n\n * Start a Command window from the Start menu\n * Type:\n\njava -version\n\n * If the version is 1.4.2 or earlier then you must install a 1.6 JRE or JDK from Sun (eg. and set the system environment PATH variable as follows:\n o Right-click on My Computer and choose Properties\n o Select the "Advanced" tab and press the "Environment Variables" button\n o Choose the Path variable and press "Edit"\n o Add the path to your JRE 1.6 Java install to the front of the PATH variable so it looks something like the following:\n\nC:\sProgram Files\sJava\sJDK1.6.10\sbin; C:\s...\n\n o Save and close, then verify that the correct version of the JRE is being found using the steps described in the first two dot points above.\n
After upgrading Salamander from 2.5 to 2.51, I found that my "User Menu" options (activated via F9) stopped working. I found I had to edit the Visual Basic script c:\srps\ssrc\ssalhotmenu.vbs and replace \n{{{\nbFound = sh.AppActivate("Salamander 2.5")\n}}}\nwith\n{{{\nbFound = sh.AppActivate("Salamander 2.51")\n}}}\n
Worked on Baums TGLO files a bit. Tricky.\nThis is the aggregation that was having problems:\n{{{\n\n}}}\nSteve provided me access to his system, which was handy, because I discovered that one problem was some missing files in the aggregation:\n{{{\n[root@csanady HIS]# pwd\n/data1/TGLO/HIS\n[root@csanady HIS]# ls -s TGLO* | sort | head\n 0\n 0\n 0\n 0\n 4\n71548\n71548\n71548\n71548\n...\n}}}\nso I deleted them. Then things worked better. But not great, because this is an old version of ROMS (2.1) before the CF-compliant part got added. So we had to get time, lon and lat recognized as coordinate variables, and here's what we came up with:\n{{{\n<catalog name="TGLO Catalog" xmlns=""\n xmlns:xlink="">\n <service name="allServices" base="" serviceType="compound">\n <service name="dapService" serviceType="OpenDAP" base="/thredds/dodsC/"/>\n </service>\n <dataset name="TGLO ROMS History Files" ID="models/tglo/roms/his" serviceName="allServices"\n urlPath="models/tglo/roms/his">\n <netcdf xmlns=""> --> <aggregation\n dimName="time" type="joinExisting" recheckEvery="15min">\n <scan location="/data1/TGLO/HIS"\n regExp=".*TGLO-his-[0-9]{2}-[0-9]{2}-[0-9]{2}-[0-9]{2}-[0-9]{2}\$" olderThan="5 min"/>\n <variable name="zeta">\n <attribute name="coordinates" value="lat_rho lon_rho"/>\n </variable>\n <variable name="s_rho" orgName="sc_r">\n <attribute name="positive" value="up"/>\n <attribute name="units" value="1"/>\n <attribute name="standard_name" value="ocean_s_coordinate"/>\n <attribute name="formula_terms"\n value="s: s_rho eta: zeta depth: h a: theta_s b: theta_b depth_c: Tcline"/>\n </variable>\n <variable name="u">\n <attribute name="coordinates" value="s_rho lat_u lon_u time"/>\n </variable>\n <variable name="ubar">\n <attribute name="coordinates" value="lat_u lon_u time"/>\n </variable>\n <variable name="bustr">\n <attribute name="coordinates" value="lat_u lon_u time"/>\n </variable>\n <variable name="v">\n <attribute name="coordinates" value="s_rho lat_v lon_v 
time"/>\n </variable>\n <variable name="vbar">\n <attribute name="coordinates" value="lat_v lon_v time"/>\n </variable>\n <variable name="bvstr">\n <attribute name="coordinates" value="lat_v lon_v time"/>\n </variable>\n <variable name="time" orgName="ocean_time"/>\n </aggregation>\n <attribute name="Conventions" type="String" value="CF-1.0"/>\n </netcdf>\n </dataset>\n</catalog>\n}}}\nBut we had one additional problem: "lon_u, lat_u, lon_v, lat_v" were not in the history files. By looking at the global attributes, I found the ROMS grid file at \n{{{\n/home/baum/TGLO/SCRIPTS/\n}}}\nbut then the problem was how to add these variables in. Because the aggregation needs to look like gridded data, I wanted to add these variables inside the aggregation loop, so I physically modified the 1st file in the aggregation (/data1/TGLO/HIS/, adding in the info from the grid file. I did this by making a tiny union aggregation in NcML:\n{{{\n<netcdf xmlns="">\n <aggregation type="union" >\n <netcdf location="c:/rps/cf/tamu/tglo/"/>\n <netcdf location="c:/rps/cf/tamu/tglo/"/>\n </aggregation>\n <attribute name="Conventions" type="String" value="CF-1.0"/>\n</netcdf>\n}}}\nbringing this up in the ToolsUI-GUI and clicking on the "ring" icon to write the NcML virtual aggregation to a physical NetCDF file. I then moved this file back to /data1/TGLO/HIS. \n\nThe one remaining step that screwed me up was that it turns out that the default behavior in TDS 4.1 is to *randomly* pick a file in the aggregation to use as a prototype. I was going crazy because I didn't know this, and had a test aggregation with two datasets where I had modified the 1st one. I kept reloading and seeing the variables in the aggregation continually switching back and forth! Luckily, there is a way to set the default behavior. 
\n\nSo I added these lines to \n/usr/local/apache-tomcat-6.0.18/content/thredds/threddsConfig.xml\n{{{\n <Aggregation>\n <typicalDataset>first</typicalDataset>\n </Aggregation>\n}}}\nThen everything worked!!!!!\n\n
I installed Matt Wilkie's (and flip_raster.bat) in my FWTOOLS1.3.6 directory on my PC, so it's not necessary to bring GMT stuff into Mirone and save to GeoTIFF (but Mirone adds the EPSG:4326 as well, so that's nice).\n\nUsage is simple:\n{{{\n flip_raster.bat bathy.grd bathy.tif\n}}}
Brian Eaton proposed "grid_description" container for all the specific attributes needed to\ndescribe connections and boundary nodes for a particular type of mesh.\n\n{{{\nnetcdf umesh_cf {\ndimensions:\n node = 9700 ;\n nele = 17925 ;\n nbnd = 1476 ;\n nface = 3 ;\n nbi = 4 ;\n sigma = 1 ;\n time = UNLIMITED ; // (0 currently)\nvariables:\n float time(time) ;\n time:long_name = "Time" ;\n time:units = "days since 2003-01-01 0:00:00 00:00" ;\n time:base_date = 2003, 1, 1, 0 ;\n time:standard_name = "time" ;\n float lon(node) ;\n lon:long_name = "Longitude" ;\n lon:units = "degrees_east" ;\n lon:standard_name = "longitude" ;\n float lat(node) ;\n lat:long_name = "Latitude" ;\n lat:units = "degrees_north" ;\n lat:standard_name = "latitude" ;\n float depth(node) ;\n depth:long_name = "Bathymetry" ;\n depth:units = "meters" ;\n depth:positive = "down" ;\n depth:standard_name = "depth" ;\n depth:grid = "grid_description";\n char grid_description\n mesh:grid_name = "triangular_mesh";\n mesh:Horizontal_Triangular_Element_Incidence_List = "ele";\n mesh:Boundary_Segment_Node_List = "bnd";\n mesh:index_start = 1;\n int ele(nele, nface) ;\n int bnd(nbnd, nbi) ;\n}}}\n\n\nRich thinks this idea would work well for staggered structured grids as well:\n\n{{{\nnetcdf c_grid_example { \n.... \n u:grid_staggering = "roms_stagger"\n v:grid_staggering = "roms_stagger"\n \n\n char roms_stagger\n roms_stagger:grid_name = "Arakawa_C_grid"\n roms_stagger:u_relative_to_pressure = "left"\n roms_stagger:v_relative_to_pressure = "below"\n}\n \n}}}\n where the last two attributes would signify that u(i,j) is "left" of p(i,j) and v(i,j) is "below" p(i,j). \n\nThe idea is that there are only a few common staggering arrangements, which would be described in the CF document, much like the vertical coordinate transformations. 
So "Arakawa_C_grid" would have specific rules that would tell applications how to do things like find the vertical coordinate at U(i,j) points by averaging the Z(i,j) and Z(i-1,j) points.\n\nBTW, The "C" grid is by far the most popular, at least according to Google: We searched A-E grids, and here's the result\n\n\n|Arakawa Type | Number of pages | Percent of total |\n| C | 50,500| 65%|\n| B | 13,500| 17%|\n| E | 7,490| 10%|\n| A | 5,650| 7%|\n| D | 501| 1%|\n\n\n\nSteve Hankin and Karl Taylor say "be careful adding new stuff" and Steve says trying to handle all the permutations of staggered grids gets too complicated. (But I don't agree)\n\nJohn Caron likes the idea of a container variable, but wants one container for the entire "coordinate system":\n\n{{{\nFrom my POV, both are properties of a "coordinate system" object, \nso i prefer a design that attaches the grid mapping and description (and \nwhatever else will eventually be needed) to a coordinate system \n"container variable"; the dependent variables then only have to point to \nthis, and all the information is contained in one place. I think i can \nalso incorporate "dimensionless vertical coordinates" in the same \nframework: rather than having a seperate mechanism for specifying \n"grid_mappings" and "dimensionless vertical coordinates", both are kinds \nof "coordinate system transformations". \n}}}\n\nHowever, taking a look at:\n\nit seems that for now, at least, the "vertical" and "projection" transforms are handled in separate "containers".\n\nJonathan Gregory endorses the container idea, but suggests using "grid_topology" to describe the connections of a grid (e.g. unstructured triangular grid), and wants a different term for describing the relationship between data variables on a grid (e.g. staggered C grid). 
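The averaging rule described above for an "Arakawa_C_grid" is tiny when written out (a sketch in plain Python with hypothetical nested-list arrays, just to make the rule concrete):

```python
def z_at_u(z, i, j):
    """Vertical coordinate at a U point of an Arakawa C grid:
    the average of the cell-center values Z(i,j) and Z(i-1,j),
    since u(i,j) sits 'left' of the pressure point p(i,j)."""
    return 0.5 * (z[i][j] + z[i - 1][j])

# tiny hypothetical field of cell-center (pressure-point) values
z = [[1.0, 2.0],
     [3.0, 4.0]]

assert z_at_u(z, 1, 0) == 2.0   # average of z[1][0]=3 and z[0][0]=1
```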
\n\nBalaji's GridSpec suggested a "staggering" attribute \n\n{{{\ndimensions:\nnx = 46;\nny = 45;\nvariables:\nint nx_u(nx);\nint ny_u(ny);\nfloat u(ny,nx);\nu:standard_name = "grid_eastward_velocity";\nu:staggering = "c_grid_symmetric";\nu:coordinate_indices = "nx_u ny_u";\nGLOBAL ATTRIBUTES:\ngridspec = "/foo/";\nnx_u = 1,3,5,...\nny_u = 2,4,6,...\n}}}\n\nBut instead of a simple attribute, should we point to a grid_staggering container?
Different types of call forwarding for the iPhone and all other AT&T phones:\n\nTo forward all calls:\n\n On your phone's calling screen, dial: *21*xxx-xxx-xxxx# and press send\n Your phone will now provide feedback that all calls are forwarded\n\nTo ring for XX seconds, then forward:\n\n On your phone's calling screen, dial: *004*xxx-xxx-xxxx*11*time# ("time" is your desired ring duration in seconds; must be 5,10,15,20,25 or 30) and press send\n Your phone will now provide feedback that calls are forwarded\n\nTo forward unanswered calls:\n\n Dial EXACTLY: *61*xxx-xxx-xxxx# and press send\n Your phone will now provide feedback that unanswered calls are forwarded\n\nTo forward calls when you are busy or decline a call:\n\n On your phone's calling screen, dial: *67*xxx-xxx-xxxx# and press send\n Your phone will now provide feedback that busy and declined calls are forwarded\n\nTo forward calls when the phone is off or in airplane mode/no service:\n\n On your phone's calling screen, dial: *62*xxx-xxx-xxxx# and press send\n Your phone will now provide feedback that calls are forwarded when your phone is unavailable\n\nTo END all call forwarding:\n\n On your phone's calling screen, dial: #002# and press send\n Your phone will now provide feedback that call forwarding has been deactivated\n\n
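These codes are easy to mis-type; here's a small helper that assembles the dial strings (the codes are taken from the list above, but the helper itself is my own hypothetical convenience, not anything AT&T provides):

```python
# GSM-style forwarding codes from the list above
CODES = {
    "all": "21",          # forward all calls
    "no_answer": "61",    # forward unanswered calls
    "busy": "67",         # forward when busy or declined
    "unreachable": "62",  # forward when off / airplane mode / no service
}

def forward_code(mode, number, seconds=None):
    """Build the dial string for a forwarding mode.
    'timed' uses *004*number*11*seconds# with seconds in {5,10,...,30}."""
    if mode == "timed":
        if seconds not in (5, 10, 15, 20, 25, 30):
            raise ValueError("ring time must be 5-30 s in steps of 5")
        return "*004*%s*11*%d#" % (number, seconds)
    return "*%s*%s#" % (CODES[mode], number)

CANCEL_ALL = "#002#"  # ends all call forwarding

assert forward_code("all", "555-123-4567") == "*21*555-123-4567#"
assert forward_code("timed", "555-123-4567", 20) == "*004*555-123-4567*11*20#"
```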
Here is an ERDDAP RESTful query to get the surface layer velocity for 3 days over the entire domain, but subsampled by 2 in lon/lat dimensions for speed:\n\n[(2011-04-07):1:(2011-04-09)][(0.0):1:(0)][(34.0):1:(
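ERDDAP griddap constraints follow a regular [(start):stride:(stop)] pattern, one bracket per dimension. A small hypothetical helper for composing queries like the one above (the lat/lon stop values here are made up for illustration):

```python
def dim(start, stop, stride=1):
    """One ERDDAP griddap dimension constraint: [(start):stride:(stop)]"""
    return "[(%s):%d:(%s)]" % (start, stride, stop)

# time (3 days), surface layer, then lat/lon subsampled by 2 for speed
constraint = (dim("2011-04-07", "2011-04-09")
              + dim("0.0", "0")
              + dim("34.0", "42.0", 2)      # hypothetical lat range
              + dim("-72.0", "-60.0", 2))   # hypothetical lon range

assert constraint.startswith("[(2011-04-07):1:(2011-04-09)][(0.0):1:(0)]")
```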
GDAL would like to write a CDM-compliant NetCDF file, but it doesn't currently get it quite right. It doesn't create x,y coordinate variables, and it doesn't get the attribute specification quite right. All these are easy to fix in NcML, but also should be easy to fix in the GDAL NetCDF writer.\n\nHere's an example of converting an Arc GRID file to NetCDF\n{{{\ngdal_translate -of netcdf de.grd\n}}}\nand we also get an extra file\n\nThe GDAL produced looks like this in NcML form:\n\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n <netcdf xmlns=""\n location="">\n <dimension name="x" length="7000" />\n <dimension name="y" length="6971" />\n <attribute name="Conventions" value="CF-1.0" />\n <variable name="albers_conical_equal_area" shape="" type="char">\n <attribute name="Northernmost_Northing" type="double" value="1970917.9911730457" />\n <attribute name="Southernmost_Northing" type="double" value="1901207.9911730457" />\n <attribute name="Easternmost_Easting" type="double" value="1800692.5938176773" />\n <attribute name="Westernmost_Easting" type="double" value="1730692.5938176773" />\n <attribute name="spatial_ref" value="PROJCS[&quot;unnamed&quot;,GEOGCS[&quot;NAD83&quot;,DATUM[&quot;North_American_Datum_1983&quot;,SPHEROID[&quot;GRS 1980&quot;,6378137,298.257222101,AUTHORITY[&quot;EPSG&quot;,&quot;7019&quot;]],TOWGS84[0,0,0,0,0,0,0],AUTHORITY[&quot;EPSG&quot;,&quot;6269&quot;]],PRIMEM[&quot;Greenwich&quot;,0,AUTHORITY[&quot;EPSG&quot;,&quot;8901&quot;]],UNIT[&quot;degree&quot;,0.0174532925199433,AUTHORITY[&quot;EPSG&quot;,&quot;9108&quot;]],AUTHORITY[&quot;EPSG&quot;,&quot;4269&quot;]],PROJECTION[&quot;Albers_Conic_Equal_Area&quot;],PARAMETER[&quot;standard_parallel_1&quot;,29.5],PARAMETER[&quot;standard_parallel_2&quot;,45.5],PARAMETER[&quot;latitude_of_center&quot;,23],PARAMETER[&quot;longitude_of_center&quot;,-96],PARAMETER[&quot;false_easting&quot;,0],PARAMETER[&quot;false_northing&quot;,0],UNIT[&quot;METERS&quot;,1]]" />\n <attribute name="GeoTransform" 
value="1.73069e
The issue with "point" and "area" registrations between GMT and GeoTIFF that I reported on here:\n\nno longer exists in FWTOOLS 1.3.4.\n\nBoth point and area registered grids give uniform 5 m resolution now for:\n{{{\ncd /home/rsignell/p/seth\ngdal_translate -a_srs EPSG:32618 point.grd point.tif\ngdal_translate -a_srs EPSG:32618 area.grd area.tif\n}}}\n
Bring up MapSource on PC and click on "receive from device" and select tracks. In MapSource, save as "gpx" format, to the My Documents\sMy Garmin\stracks directory. Fire up Google Earth Plus and change the file type to GIS files and load the track.
{{{\n$ cd c:/programs/tomcat6/webapps/erddap/WEB-INF\n$ ./GenerateDatasetsXml.bat EDDGridFromDap http://localhost:8080/thredds/dodsC/hydro/national/4km > foo.xml\n}}}
GeoTIFF images can be loaded onto the HYPACK nav system on the Rafael and be used as background images. This can be quite handy for cruise planning.\n\nThe HYPACK nav system can, however, only read 8-bit UTM GeoTIFFs.\n\nSo if you have a 24 bit GeoTIFF in Geographic Coords, you have to:\n\n1. Convert from Geographic to UTM:\n{{{\ngdalwarp -t_srs '+proj=utm +zone=19 +datum=NAD83' 24bit_geo.tif 24bit_utm.tif\n}}}\n2. Convert from 24 bit to 8 bit: \n{{{\nrgb2pct.bat 24bit_utm.tif 8bit_utm.tif\n}}}\n\n3. Make background transparent. Bring up the 8 bit image in OpenEV so you can see what the background value is. The value for my sample image was 255, so I then did:\n{{{\ngdal_translate -a_nodata 255 8bit_utm.tif 8bit_utm_transparent.tif\n}}}\n\n
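The three steps above could be chained in a short script (a sketch only; it assumes the FWTools binaries are on the PATH, that the warp tool is spelled gdalwarp, and nothing runs until run_all() is called):

```python
import subprocess

# the three conversion steps from above, as argument lists
CMDS = [
    ["gdalwarp", "-t_srs", "+proj=utm +zone=19 +datum=NAD83",
     "24bit_geo.tif", "24bit_utm.tif"],               # 1. geographic -> UTM
    ["rgb2pct.bat", "24bit_utm.tif", "8bit_utm.tif"], # 2. 24-bit -> 8-bit
    ["gdal_translate", "-a_nodata", "255",            # 3. background -> nodata
     "8bit_utm.tif", "8bit_utm_transparent.tif"],
]

def run_all():
    for cmd in CMDS:
        subprocess.run(cmd, check=True)  # stop on the first failure

assert CMDS[0][0] == "gdalwarp"
```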
How I used FWTOOLS (1.0.5) to transform a regular TIF (a bathymetry smooth sheet from GEODAS) into a georeferenced geotiff:\n\n1. Open up the tif (e.g. mytif.tif) in OpenEV. Under "Preferences", change "Lat/Lon Format" to ddd.ddddd" (decimal degrees).\n\n2. Open an ASCII editor and create a new text file.\n\n3. Map known lon/lat points to pixel coordinates. OpenEV displays pixel locations in lower left corner, so move the tip of the pointer to a known lon,lat location (like the intersection of two lon/lat graticules). Write the pixel location into the text file, then on the next line, enter the lon/lat coordinates. Put a zero on the end of both lines. For example, if your cursor is on the known lon,lat point -130.5, 41.5, and OpenEV is telling you that this point is located at (3000.24P, 1500.25L), then the two lines in the text file should look like:\n{{{\n3000.
The Rutgers THREDDS server is at\n\nIf you want to explore the spatial extents and time extents of the data sets, a convenient way is via the EDC (Environmental Data Connector), built by ASA for NOAA, obtainable from\n\nIn the EDC, just cut-and-paste \n\ninto the spot for "Catalog URL" and then browse through the datasets, eventually selecting the OpenDAP access link. Then you will see the lon/lat range and time extents listed.\n\nThe Meteorology directory contains met model output converted to comply with ROMS forcing file naming conventions.\n\nNavigate the directory structure down to the dataset you want, and then click on the "Access=>OPENDAP" link to open up the OPeNDAP Dataset Access Form. Then cut-n-paste the URL found in the "Data URL" box. For example, the OPeNDAP URL for the 3 hour NAM Uwind field is:\n\n{{{\n\n}}}\n\nArmed with this URL, you can then use the NCO tools "ncks" command to extract just the lon/lat range and time range you are interested in.\n\nOn pikmin, the NCO tools are in /usr/local/nco/bin, so make sure this is in your path. Then make a get_forcing script like this:\n\n{{{\n#!/bin/bash\n\n# get ROMS met forcing from Rutgers\n\nfor var in Pair Uwind Vwind Tair lwrad_down lwrad Qair swrad rain\ndo \n echo $var\n ncks -d lon,-71.5,-68.0 -d lat,41.5,42.5 -d time,"2007-06-15 00:00","2007-11-15 00:00" \s\n"$var" "$var"\ndone\n}}}\n\nMake it executable and run it:\n{{{\nchmod +x get_forcing\n./get_forcing\n}}}
Note: these are now outdated instructions. See instructions for gfortran and ROMS 3.0 at:\n\n\n\nBut for historical interest...\n\nFollowing Sachin Kumar Bhate's instructions:\n\n#Get cygwin from, and select {{{}}} when cygwin_setup asks you to pick a site. Make sure you check 'gcc core module' 3.4.4 under 'Devel' where you get package listings.\n#Go to \n##Get the Cygwin x86 tarball (stable version). \n##Unpack in the root directory. i.e. /cygdrive/c/cygwin. It automatically unpacks into usr/local/bin.\n#Netcdf installation and compilation with g95.\n##Get the netcdf library 3.6.1/3.6.2 from\n##untar.\n##change to the src directory where you have just unpacked the netcdf source.\n##run this command at the prompt: {{{ CC=gcc F90=g95 F90FLAGS='-O -Wno-globals' CPPFLAGS='-Df2cFortran' ./configure}}}\n##type 'make'\n##type 'make check'\n##type 'make install'. \n#Get ROMS.\n##I checked out v2.2.4 from\n##Go to the ROMS installation directory now.\n##Change 'FORT ?= g95' in makefile\n##Open Master/ Uncomment line #11, and comment line #13. Save it.\n##set $NETCDF_INCDIR and $NETCDF_LIBDIR environmental variables with the path where you have just built netcdf.\n##Copy the attached with this email to the Compilers directory. Then open the and change the NETCDF_LIBDIR and NETCDF_INCDIR paths. (you don't need to if you have already defined these as environmental variables).\n##Edit ROMS/Include/cppdefs.h and #define UPWELLING. \n##Try to compile it.\n#Test the resulting executable: {{{ ./OceanS < ROMS/External/ }}}\n\nThe UPWELLING test case with g95 took 570 s on my notebook PC (2.2GHz T2600 ). With the Intel Fortran compiler (ifort) it takes 470 s on my Linux desktop PC (3.0GHz Xeon).\n
{{{\n/data/ftp/upload/Inundation/vims/selfe_tropical/runs/Ike/2D_varied_manning_windstress\n\n <ns0:variable name="elev" shape="time node" type="float">\n <ns0:attribute name="coverage_content_type" value="modelResult" />\n}}}
To get started with this blank TiddlyWiki, you'll need to modify the following tiddlers:\n* SiteTitle & SiteSubtitle: The title and subtitle of the site, as shown above (after saving, they will also appear in the browser title bar)\n* MainMenu: The menu (usually on the left)\n* DefaultTiddlers: Contains the names of the tiddlers that you want to appear when the TiddlyWiki is opened\nYou'll also need to enter your username for signing your edits: <<option txtUserName>>\n\nSee also MonkeyPirateTiddlyWiki.
This will return a subsetted 32-bit Geotiff\n{{{\nwget -O test.tif ",41.10,-70.0,41.70"\n}}}\n\nI believe that curl used to work, but now this returns a complaint about using "POST". So I tried "wget" instead, and it worked. I guess wget uses "GET"!
Christoph originally said:\n\nGetting NetCDF4-Python to work is tricky since ArcGIS10 ships with old netcdf3 and hdf DLLs.\n\nThe only clean solution is to build netcdf4-python against the python, numpy,netcdf, and hdf5 libraries that ship with ArcGIS10, and use the same compiler & C runtime that was used to build ArcGIS10-python.\n\nHere's what I would need to do to get something to work without messing with ArcGIS dlls:\n\n1) First, I would have to update netcdf-4.1.3-msvc9-source with DAP support. This version is not supported by Unidata and requires curl libraries. I don't know what ESRI's policy is on unsupported code.\n\n2) Netcdf.dll needs to be built against a static or custom named version of HDF5-1.8.7 to avoid conflicts with the existing ArcGIS10 HDF DLLs. Still, this might not work depending on whether different versions of HDF5 libraries can be loaded/used at runtime.\n\n3) chances are that the netcdf4-python 0.9.7 source distribution is not compatible with numpy 1.3. This can probably be fixed by invoking Cython to generate new source files with numpy 1.3 installed.\n\n\n\nBut then he did it: \n\n" Please find the updated netCDF4-0.9.7-ArcGIS10.win32-py2.6.?exe at\n<>.\n\nThis version of netcdf4-python was built against numpy 1.3 and uses a\nnetcdf.dll, which was linked to static libraries of hdf5, zlib, szlib,\nlibcurl and libxdr. It should work on ArcGIS10 without upgrading\nnumpy or moving DLLs. Just make sure there is no other netcdf.dll in\nthe Windows DLL search path."\n\nAnd indeed, it does work! My test script worked like a charm:\n\n{{{\nimport netCDF4\nimport numpy as np\nimport arcpy\nurl=''\nnc=netCDF4.Dataset(url);\nlon=nc.variables['lon'][:]\nlat=nc.variables['lat'][:]\nbi=(lon>=-71.2)&(lon<=-70.2)\nbj=(lat>=41)&(lat<=42)\nz=nc.variables['topo'][bj,bi]\ngrid1=arcpy.NumPyArrayToRaster(z)\n}}}
Voucher=> Edit=> Digital Signature=> Submit Completed Document
How well does GridFTP via Globusconnect work in a real modeling testbed data transfer case? We did a few tests, first moving thousands of files totaling 365GB from the testbed to a local server in Woods Hole, and just to test the ability to restart transfers automatically, we restarted the globusconnect service and the testbed server in the middle of the transfer. The transfer proceeded flawlessly, moving the data in about 14 hours, at a rate of 7.32MB/s. About 1/2 of this transfer occurred during working hours on the East Coast. To see if we got faster rates during non-work hours, and to compare against sftp, we did a second small test, moving 600MB from the testbed server to the Woods Hole machine at 7am. We got a data rate of 10.5MB/s for GridFTP, and 4.1MB/s for sftp, a factor of 2.5 speedup.\n\nDetails:\n\nTest
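The quoted rates are easy to cross-check (pure arithmetic on the numbers above, assuming 1 GB = 1024 MB):

```python
# 365 GB at the quoted 7.32 MB/s implies roughly the 14-hour wall time observed
hours = 365 * 1024 / 7.32 / 3600
assert 14.0 < hours < 14.4

# morning test: GridFTP vs sftp, the quoted "factor of 2.5"
speedup = 10.5 / 4.1
assert abs(speedup - 2.5) < 0.1
```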
{{{\n#!/bin/csh\n\ngrdgradient gom15.grd -A315 -Ggom15_grad.grd -N -M -V\ngrdhisteq gom15_grad.grd -Ggom15_grad_eq.grd -V -N\ngrdmath gom15_grad_eq.grd 6.0 / = gom15_intens.grd\ngrdimage gom15.grd -Cgom4.cpt -P -K -V -JM22 -Igom15_intens.grd >\nconvert ppm:- | pnmtotiff -lzw > gom15a.tif\nconvert ppm:- | cjpeg > gom15a.jpg\n}}}\n\nmore gom4.cpt\n{{{\n\n -7000 0 0 255 -2000 0 0 255 \n -2000 0 50 255 -200 0 50 255 \n -200 0 110 255 -100 0 110 255 \n -100 0 170 255 -60 0 170 255 \n -60 0 215 255 -30 0 215 255 \n -30 40 255 255 0 40 255 255 \n 0 50 150 50 500 200 200 50\n 500 200 200 50 2000 200 0 200\n}}}
If I calculate the H-Index for R. Signell at Google Scholar\n\nI get:\n\nCitations for 'R. Signell' : 2761\nCited Publications:135\nH-Index: 25\n\nTurns out the ISI web of knowledge has a better tool for H-Index. You can select only certain subject areas. \n\nUse "search" on the "author" line\nSignell R*\nand then "create citation report" (little link on the right side of the page).\nH-Index using web of knowledge is 20.\n\nAccording to the h-index Wikipedia entry:\n\n\nHirsch, the physicist who came up with the idea of this factor, said that "for physicists, a value for h of about 10–12 might be a useful guideline for tenure decisions at major research universities. A value of about 18 could mean a full professorship, 15–20 could mean a fellowship in the American Physical Society, and 45 or higher could mean membership in the United States National Academy of Sciences."\n\nTo test this out, I tried some physical oceanography National Academy members:\n H-Factor\nJ.C. McWilliams: 61\nCarl Wunsch: 55\nChris Garrett: 48\nRuss Davis: 42\nJ. Pedlosky: 35\n\nSeems to work pretty well!\n\n
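For reference, the h-index itself is simple to compute from a list of per-paper citation counts (a minimal sketch; the citation counts below are made up):

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i          # the i-th ranked paper still has >= i citations
        else:
            break
    return h

# hypothetical citation counts
assert h_index([10, 8, 5, 4, 3]) == 4
assert h_index([25, 8, 5, 3, 3]) == 3
assert h_index([]) == 0
```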
/***\n| Name|HideWhenPlugin|\n| Description|Allows conditional inclusion/exclusion in templates|\n| Version|3.0 ($Rev: 1845 $)|\n| Date|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\nFor use in ViewTemplate and EditTemplate. Example usage:\n{{{<div macro="showWhenTagged Task">[[TaskToolbar]]</div>}}}\n{{{<div macro="showWhen tiddler.modifier == 'BartSimpson'"><img src="bart.gif"/></div>}}}\n***/\n//{{{\n\nwindow.removeElementWhen = function(test,place) {\n if (test) {\n removeChildren(place);\n place.parentNode.removeChild(place);\n }\n};\n\nmerge(config.macros,{\n\n hideWhen: { handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( eval(paramString), place);\n }},\n\n showWhen: { handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( !eval(paramString), place);\n }},\n\n hideWhenTagged: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( tiddler.tags.containsAll(params), place);\n }},\n\n showWhenTagged: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( !tiddler.tags.containsAll(params), place);\n }},\n\n hideWhenTaggedAny: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( tiddler.tags.containsAny(params), place);\n }},\n\n showWhenTaggedAny: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( !tiddler.tags.containsAny(params), place);\n }},\n\n hideWhenTaggedAll: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( tiddler.tags.containsAll(params), place);\n }},\n\n showWhenTaggedAll: { handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( !tiddler.tags.containsAll(params), place);\n }},\n\n hideWhenExists: { handler: 
function(place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( store.tiddlerExists(params[0]) || store.isShadowTiddler(params[0]), place);\n }},\n\n showWhenExists: { handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n removeElementWhen( !(store.tiddlerExists(params[0]) || store.isShadowTiddler(params[0])), place);\n }}\n\n});\n\n//}}}\n\n
The Hilbert transform can be used to get the envelope of a modulated signal. 'hilbert' is part of the Matlab Signal Processing Toolbox. You can use hilb_envel.m in RPSstuff to calculate the envelope.\n
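For those working in Python instead, here is a minimal sketch of the same idea using scipy.signal.hilbert: the magnitude of the analytic signal is the envelope. The signal parameters below are made up for illustration.

```python
import numpy as np
from scipy.signal import hilbert

# Amplitude-modulated test signal: a 50 Hz carrier with a slow 2 Hz envelope.
fs = 1000.0                                   # sample rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
envelope = 1.5 + np.cos(2 * np.pi * 2 * t)    # the "true" modulation envelope
signal = envelope * np.sin(2 * np.pi * 50 * t)

# The magnitude of the analytic signal recovers the envelope.
envelope_est = np.abs(hilbert(signal))
```

Because the envelope varies much more slowly than the carrier, the analytic-signal magnitude tracks it closely.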
Short on disk space? Compress those netcdf3 files using netcdf4 with deflation and chunking instead of gzipping! With ROMS and other ocean model output, my experience has been that deflation at least cuts the file size in half, and if you are willing to sacrifice a bit of precision (say you don't need currents saved to be more accurate than 0.001 m/s, for example), you can save even more. If you have large masked regions (land in ROMS, clouds in remote sensing imagery), you can save still more. We reduced some AVHRR data by a factor of 20!\n\nHere are a number of ways you can do it:\n\n1) nc3tonc4 (Jeff Whitaker):\nThis command line tool comes with NetCDF4-Python.\nI like this tool because it gives you the ability to specify the number of significant digits, which can improve compression by quite a bit. FYI, this tool is part of the Enthought Python Distribution, which is a one-click install on 32/64 bit Mac, PC and Linux. If you are an academic, you can get the EPD for free (, and if not, it's $200 (well worth it, IMHO). \n\nFor example, to keep 3 digits (0.001 m/s) accuracy on velocity, and 4 on temperature, just do:\n{{{\n nc3tonc4 --quantize='u=3,v=3,temp=4'\n}}}\n\n2) nccopy (Unidata):\nWith versions newer than netCDF-4.1.2, you can do deflation and chunking.\n\nFor example, to convert netCDF-3 data to netCDF-4\ndata compressed at deflation level 1 and using 10x20x30 chunks for\nvariables that use (time,lon,lat) dimensions:\n{{{\n nccopy -d1 -c time/10,lon/20,lat/30\n}}}\n\n3) ncks (Charlie Zender):\nAlso lets you specify chunk sizes along any dimension:\n{{{\nncks -4 -L 1 --cnk_dmn lat,50 --cnk_dmn lon,50 -O\n}}}
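The --quantize trick works by rounding each value so it is accurate only to the requested decimal digit; the resulting shared low-order bits then deflate much better. Here is a sketch of the rounding rule (the same power-of-two scale formula netCDF4-python uses for least_significant_digit), applied to made-up sample data:

```python
import numpy as np
import zlib

def quantize(data, least_significant_digit):
    """Round data so it is accurate only to the given decimal digit.
    scale is the smallest power of two holding that precision; rounding
    to multiples of 1/scale zeroes low-order mantissa bits."""
    scale = 2.0 ** np.ceil(np.log2(10.0 ** least_significant_digit))
    return np.around(scale * data) / scale

rng = np.random.default_rng(0)
u = rng.random(10000)          # stand-in for a velocity field
uq = quantize(u, 3)            # keep ~0.001 accuracy, like --quantize='u=3'

# Deflate (zlib) does much better on the quantized bytes.
raw_size = len(zlib.compress(u.tobytes()))
quantized_size = len(zlib.compress(uq.tobytes()))
```

The rounding error never exceeds half of 1/scale, so the stated accuracy is preserved while the compressed size drops substantially.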
{{{\ncat /proc/cpuinfo | grep processor | wc -l\n}}}
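A couple of equivalent one-liners that skip the pipeline entirely (assuming a Linux box with coreutils):

```shell
# Count online processors without grepping /proc/cpuinfo by hand.
nproc
getconf _NPROCESSORS_ONLN
```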
Hit the snack machine today, and the 2.07oz (59g) Snickers Bar was $1 and the 1.75 oz (49.6g) of Tom's peanuts was $0.75. So I was wondering how many oz of peanuts the snickers bar had? And how many peanuts? \n\nWe assume that all the calcium and protein in Snickers comes from skim milk and peanuts. \n\nFrom wikipedia (\n{{{\n100g of peanuts has 62mg calcium and 25g protein.\n}}}\nFrom webmd (\n{{{\n 1 cup of skim milk (244g) has 301mg calcium and 8.4g protein. \n}}}\n\n1 cup of skim milk = 244g\n\nJust as a check that the protein and calcium don't come from other ingredients, let's see if the protein and calcium in Milky Way have a ratio similar to that of milk (because Milky Way is like Snickers without peanuts). From this site:, 100g of Milky Way has 115mg calcium, 4g protein, a ratio of 115/4 = 28.7 calcium(mg)/protein(g), and milk has a ratio of 301/8.4 = 35.83 calcium(mg)/protein(g), so pretty close.\n\nSo onward to Snickers. From\n{{{\n7.53g protein/100g snickers = 4.4g protein/59 g snickers bar\n93mg calcium/100g snickers = 55mg calcium/59 g snickers bar\n}}}\nSo we have two equations with two unknowns:\n{{{\nprotein: x*(25g protein/100g peanut) + y*(8.4g protein/244g milk) = 4.4g protein\ncalcium: x*(62mg calcium/100g peanut) + y*(301mg calcium/244g milk) = 55mg calcium\n}}}\nwhere: \n{{{\nx=mass of peanuts(g)\ny=mass of skim milk(g)\n}}}\nI could solve by substitution, but I have Matlab, and if A is an N-by-N matrix and B is a column vector with N components, or a matrix with several such columns, then X = A\sB is the solution to the equation A*X = B.\n\nSo I have: \n{{{\n>> A=[25/100 8.4/244;62/100 301/244]\nA =\n 0.2500 0.0344\n 0.6200 1.2336\n\n>> B=[4.4; 55]\nB =\n 4.4000\n 55.0000\n\n>> X=A\sB\nX =\n 12.3126\n 38.3965\n}}}\n\nSo solving the two equations yielded:\n{{{\nX(1) = 12.3g peanuts (0.42 oz, or about 20% peanuts by weight)\nX(2) = 38.4g milk (equivalent of 1.25 oz skim milk)\n}}}\nTo convert 12.3g peanuts, we (thanks Michael!) 
took 10 Tom's peanuts and weighed them, which came out to 5.7g. So if Tom's peanuts are the same as Snickers peanuts (and a casual visual inspection supports that they are close), and my other assumptions are correct, there are \n{{{\n12.3g * (10 peanuts/5.7g) = 21.6 peanuts per Snickers Bar\n}}}\nAnd definitely way more peanuts (49.6g/12.3g = 4 times as many) in the bag of peanuts. But I still like Snickers better. ;-)\n\nHere's a picture of 21 peanuts:\n<html><img src="" width="400" /> </html>\n\n
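The same A\B solve can be checked in Python, where numpy.linalg.solve plays the role of Matlab's backslash:

```python
import numpy as np

# Two equations, two unknowns: grams of peanuts (X[0]) and skim milk (X[1])
# that reproduce the protein and calcium in a 59 g Snickers bar.
A = np.array([[25.0 / 100.0, 8.4 / 244.0],     # g protein per g of each
              [62.0 / 100.0, 301.0 / 244.0]])  # mg calcium per g of each
B = np.array([4.4, 55.0])
X = np.linalg.solve(A, B)
# X[0] ~ 12.3 g peanuts, X[1] ~ 38.4 g skim milk
```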
1. Create animation in ncWMS/Godiva2\n2. Right click on the animated gif and select "copy image location". \n3. Drop the URL into TextPad, and replace\n{{{\nFORMAT=image/gif\n}}}\nwith\n{{{\nFORMAT=application/\n}}}\nSample URL:\n,2010-05-29T00:00:00.000Z,2010-05-30T00:00:00.000Z,2010-05-31T00:00:00.000Z,2010-06-01T00:00:00.000Z,2010-06-02T00:00:00.000Z,2010-06-03T00:00:00.000Z,2010-06-04T00:00:00.000Z&TRANSPARENT=true&STYLES=vector%2Frainbow&CRS=EPSG%3A4326&COLORSCALERANGE=0%2C2&NUMCOLORBANDS=254&LOGSCALE=false&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&EXCEPTIONS=XML&FORMAT=application/,23.230544403196,-81.293768882752,31.799870654942&WIDTH=512&HEIGHT=400\n
{{{\n#!/usr/bin/perl\n# Concatenate NDBC stdmet files into a single NetCDF file\n \n# $buoy = 44005; # Gulf of Maine offshore\n# $start_year = 1978;\n# $stop_year = 2006;\n\n$buoy = 44013; # Mass Bay\n$start_year = 1984;\n$stop_year = 2006;\n\n# Use Curl to grab yearly NetCDF files\n$year = $start_year;\nwhile ($year <= $stop_year) {\n system("curl -o ${buoy}_${year}.nc${buoy}/${buoy}h${year}.nc");\n $year++;\n}\n$nyears = $stop_year - $start_year + 1;\n\n# Use NCO "ncrcat" to join yearly NetCDF files :\nsystem("ncrcat -O -n ${nyears},4,1 ${buoy}_${start_year}.nc ${buoy}.nc");\n}}}
Bare earth digital elevation model (DEM) data from the NED, previously available through the Seamless Server Viewer, is accessed in The National Map Viewer with the 'Download Data' tool. Use the following procedures to download DEM data:\n\n*Zoom to your area of interest.\n*Click the Download Data tool near the top right corner of the viewer banner.\n*Use the current map extent, choose a reference area polygon from the Download options dropdown menu, or create your own custom polygon.\n*Select the data theme of Elevation and product format.\n*Select available NED products, such as 2 arc-second (Alaska only), 1 arc-second (conterminous U.S., Hawaii, Puerto Rico, portions of Alaska, western & southern Canada along U.S. border, and all of Mexico), 1/3 arc-second (conterminous U.S., Hawaii, and portions of Alaska), or 1/9 arc-second (in limited areas of U.S. only).\n*Add selected products to the Cart.\n*Checkout and enter your e-mail address twice to place your order.\nNote: you will not initially see the Elevation theme in The National Map Viewer's Overlays table of contents, although we are working on offering more visualization options in the near future. Also, be aware that elevation products are available either as 'staged data' in pre-packaged 1x1 degree cell extents in either ArcGrid or GridFloat formats, or through 'dynamic data extracts' in user defined reference area polygons with additional formats of GeoTIFF or BIL_16INT (besides ArcGrid or GridFloat).
Java 8:\n{{{\nsudo add-apt-repository ppa:webupd8team/java\nsudo apt-get update\nsudo apt-get install oracle-java8-installer\n}}}\ncheck:\n{{{\njava -version\n}}}\n\nTomcat 8:
This bizarre first step seems to be necessary before the aptitude stuff will work:\n{{{\n1. Open console and type: ln -s / /cow\n\nMore info:\n\n}}}\n{{{\nsudo aptitude update\nsudo aptitude install sun-java6-bin sun-java6-plugin sun-java6-font\njava -version\n}}}
To install the OpenCV Python interface on Windows, I downloaded the binary from \n{{{\n\n\n}}}\nand then unzipped it into\n{{{\nC:\sRPS\spython\sepd32\sopencv\n}}}\nthen added a path file called "opencv.pth" that contains the single line:\n{{{\nc:\srps\spython\sepd32\sopencv\sbuild\spython\s2.7\n}}}\n\nI can then import OpenCV as\n{{{\nimport cv2\n}}}\n\nThis binary works on 32-bit Python (it does not work on 64-bit Python).\n-Rich\n
Here's how I installed Java 1.6 on my RHEL4 box:\n\n* Go to\n* Click on 'Download' button for 'JDK6 update 3' (first one).\n* Choose your platform (linux) and download the rpm file.\n* Install by typing\n{{{\nsh jdk-6u3-linux-i586-rpm.bin\n}}}\n* See how many java alternatives exist on your system already \n{{{\n/usr/sbin/alternatives --config java\n}}}\nIn my case there were 5 choices.\n* I want to add my new java as alternative 6, so I type: \n{{{\n/usr/sbin/alternatives --install /usr/bin/java java /usr/java/jre1.6.0_01/bin/java 6\n}}}\n* Now I select my new java from the alternatives list\n{{{\n/usr/sbin/alternatives --config java\n}}}\nand choose number 6.\n* Done!\n\n\n
1. make CGAL 4.0.2\n{{{\ncd /home/epifanio/rps\nwget\nbzip2 -d CGAL-4.0.2.tar.bz2\ntar xvf CGAL-4.0.2.tar\ncd CGAL-4.0.2/\ncmake -DCMAKE_INSTALL_PREFIX=/home/epifanio .\ncd build\nmake install\n}}}\n2. make cgal-bindings\n{{{\ncd /home/epifanio/rps\ngit clone\ncd cgal-bindings\nmkdir -p build/CGAL-4.02_release\ncd build/CGAL-4.02_release\ncmake -DCGAL_DIR=/home/epifanio/rps/CGAL-4.0.2 -DJAVA_OUTDIR_PREFIX=../../examples/java -DPYTHON_OUTDIR_PREFIX=../../examples/python ../..\nmake -j 4\n}}}\nNote: this last step (make -j 4) takes quite a while. Time to make an espresso. The result of all this is:\n{{{\n/home/epifanio/rps/examples/python \n/home/epifanio/rps/examples/python/CGAL \n}}}\nThere is no, so the examples contain "import CGAL" which just imports the functions from the subdirectory below. This is pretty lame, but the examples run. I'll see about generating a file.
Followed these excellent instructions for installing Java 8:\n
1. Install "lxml" (available in the optional EPD packages from Enthought):\n{{{\nenpkg lxml \n}}}\n2. Install "OWSLib":\n{{{\ngit clone\ncd OWSLib\npython build\npython -c "import setuptools;execfile('')" bdist_egg\negginst dist/OWSLib*.egg\n}}}\n3. Install "pyGDP":\n{{{\ngit clone\ncd pyGDP\npython build\npython -c "import setuptools;execfile('')" bdist_egg\negginst dist/pyGDP*.egg\n}}}\n\n
Download new version of Linux Tar file to /usr/local\n\n{{{\ncd /usr/local\ntar xzvf FWTools-linux-1.3.6.tar.gz\nrm fwtools\nln -s FWTools-1.3.6 fwtools\ncd fwtools\n./\n}}}\n
Go to "register and invite" and invite a new reviewer. Let Ocean Dynamics send them the form letter that says they are registered, setting them as a "reviewer" for default role. Then invite them, but click on the "customize" button and remove the stupid intro line, and change "With kind regards" to "Sincerely". There will now be an asterisk by their name which indicates that the invitation letter has been customized, which means it's ok to send.\n
thinking about putting my "projects" (things with 3 or more steps) into Tiddlyspot
/***\n|''Name:''|LegacyStrikeThroughPlugin|\n|''Description:''|Support for legacy (pre 2.1) strike through formatting|\n|''Version:''|1.0.1|\n|''Date:''|Jul 21, 2006|\n|''Source:''||\n|''Author:''|MartinBudden (mjbudden (at) gmail (dot) com)|\n|''License:''|[[BSD open source license]]|\n|''CoreVersion:''|2.1.0|\n|''Browser:''|Firefox 1.0.4+; Firefox 1.5; InternetExplorer 6.0|\n\n***/\n\n//{{{\n\n// Ensure that the LegacyStrikeThrough Plugin is only installed once.\nif(!version.extensions.LegacyStrikeThroughPlugin)\n {\n version.extensions.LegacyStrikeThroughPlugin = true;\n\nconfig.formatters.push(\n{\n name: "legacyStrikeByChar",\n match: "==",\n termRegExp: /(==)/mg,\n element: "strike",\n handler: config.formatterHelpers.createElementAndWikify\n});\n\n} // end of "install only once"\n//}}}\n
I followed this recipe:\n\n\nusing my old Toastmaster Belgian Waffler (model 230). It gets up to 355 degrees, which is just a bit less than the optimal 360 degrees. It melts and lightly caramelizes the pearl sugar without burning, so I don't have to use any of the work-around methods listed in the web site above. I just waited until the light turned off, just like a normal waffle, and it worked fine! Instead of Belgian Pearl Sugar, I used "Nordzucker Hagel Zucker" made by "Sweet Family", which cost 0.99 euro for a 250g pack in Germany. It worked great.
linear algebra\n*PCA using scipy.linalg.svd\n*sklearn.decomposition.PCA\n*or use mdp.pca\n*using NumPy built against MKL makes a huge difference\n\ninterpolation\n*linear\n*tripo\n\nscipy.signal.fftconvolve can give lots of speedups, even for linear operations\n\nscipy.signal.lfilter:\n{{{\nmyfilter = np.arange(1.0, 4.0)\nmyfilter\narray([1., 2., 3.])\nsp.signal.lfilter(np.array([1., 0, 0, 0]), 1., np.array([2, 3, 4, 5]))\n}}}\nThe origin of the filter is the first coefficient of the filter.
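A runnable sketch of the filter-origin point: with lfilter, b[0] weights the current sample, so a filter of [1, 0, 0, 0] is the identity, and moving the unit coefficient one tap later delays the output by one sample.

```python
import numpy as np
from scipy.signal import lfilter

x = np.array([2.0, 3.0, 4.0, 5.0])

# b[0] multiplies the current sample: [1, 0, 0, 0] passes x through unchanged.
y_identity = lfilter(np.array([1.0, 0.0, 0.0, 0.0]), 1.0, x)

# Putting the unit coefficient at b[1] delays the signal by one sample.
y_delayed = lfilter(np.array([0.0, 1.0]), 1.0, x)
```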
To see what hardware is on your RedHat system, you can use "dmesg".\n\nIf you want to see what CPUs your machine has, for example, do\n\n{{{dmesg | grep CPU }}}
| | !N4_amp | !N4_pha | !N5_amp | !N5_pha |\n| !ROMS | 1.09 | 24.6 | 1.10 | 17.9 |\n| !DATA | 1.12 | 26.8 | 0.98 | 18.7 |
Avijit - 15 km grid, going to 5 km grid, currently GFS forcing, open boundaries climatology \n\nForecast once a week, 3-day composite used twice a week (Johns Hopkins)\nRadiation outflow\nFeature model for Gulf Stream inflow, rings, Gulf of Maine to create initial conditions\nOI assimilation\n16 double sigma layers
I always have half-and-half and skim milk around. If I want to make 1% milk or whole milk, how much of each do I need to mix together to make 1 quart?\n\nFrom the web site:\nskim milk: Fat 0g/8oz = 0.0g/oz\n1% milk: Fat 2.5g/8oz = 0.3 g/oz\n2% milk: Fat 5.0g/8oz = 0.6 g/oz\nwhole milk: Fat 8g/8oz = 1.0g/oz\nhalf-and-half: Fat 3.5 g/2T = 3.5g/oz \nlight cream: Fat 3g/T = 6.0g/oz\nheavy cream: Fat 5g/T = 10.0g/oz\n\nFor 32 oz of whole milk, we need 32oz*(1g/oz)=32 g fat, so for half-and-half, we need 32g*(1oz/3.5g) = 9.1oz ~= 1 cup half & half. So 1 cup half & half + 3 cups skim ~= 1 quart of whole milk.\n\nFor 32 oz of 1% milk, we need 32oz*(0.3g/oz)=10 g fat, so 10g*(1oz/3.5g)=2.8oz ~= 1/3 cup half-and-half. So 1/3 cup half and half + 3 2/3 cups skim ~= 1 quart of 1% milk.\n
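The arithmetic above, scripted as a quick check. The fat contents per ounce come from the table; the function and dictionary names are just for illustration.

```python
# Grams of fat per fluid ounce, from the table above.
FAT_G_PER_OZ = {"skim": 0.0, "1%": 0.3, "2%": 0.6, "whole": 1.0, "half-and-half": 3.5}

def half_and_half_oz(target, total_oz=32.0):
    """Ounces of half-and-half needed to supply all the fat in total_oz
    of the target milk (the rest of the quart is skim)."""
    fat_needed = total_oz * FAT_G_PER_OZ[target]        # grams of fat
    return fat_needed / FAT_G_PER_OZ["half-and-half"]   # oz of half-and-half

whole_oz = half_and_half_oz("whole")   # ~9.1 oz, about 1 cup
one_pct_oz = half_and_half_oz("1%")    # ~2.7 oz, about 1/3 cup
```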
The best way to do this is to remove EPD's netCDF4 module like so:\n{{{\n $ enpkg --remove netCDF4\n}}}\nThen build your netCDF4 module into an egg. Since the netCDF4 sources\ndon't natively support setuptools, you have to do a slightly awkward,\nbut totally boilerplate command:\n\n # First build it.\n{{{\n$ python build --whatever --flags --you --need\n}}}\n # Now make the egg.\n{{{\n $ python -c "import setuptools;execfile('')" bdist_egg\n}}}\n\nNow use egginst to install the egg:\n{{{\n $ egginst dist/netCDF4-*.egg\n}}}\nBy building an egg and installing it with egginst, enpkg will have the\nmetadata necessary to uninstall it without any dangling files.\n\nOn Windows, "egginst" is in the "Scripts" directory.
I love the ease of Jing, but it can make only 5-minute-long videos. \nHey, wait, perhaps that has advantages!\n- takes less time to record\n- forces you to be more concise\n- more likely to be watched\n- easier to rerecord if information becomes dated\n\nBecause I want to capture the screen, I want the text to be as crisp as possible. For HD on YouTube, you want 1280x720. You can easily set your browser (IE, Firefox) window to 1280x720 by typing this into the address bar:\n{{{\njavascript:window.resizeTo(1280,720)\n}}}\nYou might as well create a bookmark if you are doing this a lot (I did).\n\nOnce you've uploaded your HD video to YouTube, here's a cool tip. Instead of passing around the usual URL that looks like\n \nand then telling people "pop to full screen for HD", you can just give them this URL\n\nwhich will pop to full screen automagically!\n\n
WRF NetCDF files are not CF-Compliant. They work with NetCDF-Java applications because there is a special "WRF I/O Service Provider" written, but I was curious whether we could also make a WRF NetCDF CF-Compliant by just creating some NcML that adds the missing metadata. This would allow folks with collections of existing WRF files to make them CF-compliant simply by creating some NcML that they could put in a THREDDS catalog and then serve WRF as CF compliant data. \n\nI started with a sample WRF file containing 1 time step from Cindy Bruyere ( What I found was that I needed to do:\n\n1. Change the file name so that it had a ".nc" extension and remove the ":" from the time stamp: (e.g. wrfout_d01_2000-01-25_00:00:00 => I'm not sure this was completely necessary, but without the .nc extension, my THREDDS server was not picking up this file in the datasetScan, and also there seemed to be some issues with the ":". Perhaps the ":" just needed to be escaped, but in any case, it seems easy to rename.\n\n2. Add the "Conventions: CF-1.6" to the global attributes. \n\n3. Add the specification of the vertical coordinate in the "ZNU" coordinate variable using these attributes: \nstandard_name="atmosphere_sigma_coordinate", formula_terms="ptop: P_TOP sigma: ZNU ps: PSFC" and positive="down"\n\n4. Add the specification of the vertical coordinate in the "ZNW" coordinate variable using these attributes: \nstandard_name="atmosphere_sigma_coordinate", formula_terms="ptop: P_TOP sigma: ZNW ps: PSFC" and positive="down"\n\n5. Remove the time dimension from the coordinate variables: ZNU, ZNW, XLONG, XLAT, XLAT_U, XLONG_U, XLAT_V, XLONG_V. This was simple in this single time step file because I could just remove the "Time" dimension in the NcML. For example: XLAT in the NetCDF file has dimensions {"Time south_north west_east"} and I just changed this to "south_north west_east" since Time here is a singleton dimension. But in general we will need another approach. 
If there are 20 time steps, we could clip out the coord vars from the first time step, union in that dataset along with an aggregation that removes the coord vars from the aggregation.\n\n6. Add valid "units" to the time variable "XTIME": units="minutes since 2000-01-24 12:00:00"\n\nHere's the resulting NcML, which seems to work both in ToolsUI and in IDV:\n\n{{{\n<netcdf xmlns:xsi=""\n xsi:schemaLocation=""\n xmlns=""\nlocation="dods://">\n <attribute name="Conventions" value="CF-1.6"/>\n\n <variable name="ZNU" shape="bottom_top" type="float">\n <attribute name="positive" value="down"/>\n <attribute name="standard_name" value="atmosphere_sigma_coordinate"/>\n <attribute name="formula_terms" value="ptop: P_TOP sigma: ZNU ps: PSFC"/>\n <attribute name="units" value="layer"/>\n </variable>\n\n <variable name="ZNW" shape="bottom_top_stag" type="float">\n <attribute name="positive" value="down"/>\n <attribute name="standard_name" value="atmosphere_sigma_coordinate"/>\n <attribute name="formula_terms" value="ptop: P_TOP sigma: ZNW ps: PSFC"/>\n <attribute name="units" value="level"/>\n </variable>\n\n <variable name="U" shape="Time bottom_top south_north west_east_stag" type="float">\n <attribute name="coordinates" value="XLONG_U XLAT_U ZNU XTIME"/>\n </variable>\n\n <variable name="V" shape="Time bottom_top south_north_stag west_east" type="float">\n <attribute name="coordinates" value="XLONG_V XLAT_V ZNU XTIME"/>\n </variable>\n\n <variable name="W" shape="Time bottom_top_stag south_north west_east" type="float">\n <attribute name="coordinates" value="XLONG XLAT ZNW XTIME"/>\n </variable>\n\n <variable name="T" shape="Time bottom_top south_north west_east" type="float">\n <attribute name="coordinates" value="XLONG XLAT ZNU XTIME"/>\n </variable>\n\n <variable name="PSFC" shape="Time south_north west_east" type="float">\n <attribute name="coordinates" value="XLONG XLAT XTIME"/>\n </variable>\n\n <variable name="U10" shape="Time south_north west_east" type="float">\n 
<attribute name="coordinates" value="XLONG XLAT XTIME"/>\n </variable>\n\n <variable name="V10" shape="Time south_north west_east" type="float">\n <attribute name="coordinates" value="XLONG XLAT XTIME"/>\n </variable>\n\n <variable name="XTIME" shape="Time" type="float">\n <attribute name="units" value="minutes since 2000-01-24 12:00:00"/>\n </variable>\n\n <variable name="XLAT" shape="south_north west_east" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n <variable name="XLONG" shape="south_north west_east" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n <variable name="XLAT_U" shape="south_north west_east_stag" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n <variable name="XLONG_U" shape="south_north west_east_stag" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n <variable name="XLAT_V" shape="south_north_stag west_east" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n <variable name="XLONG_V" shape="south_north_stag west_east" type="float">\n <remove type="attribute" name="coordinates"/>\n </variable>\n\n</netcdf>\n}}}
Note: If you do not have a My Network Places icon on your Desktop, right-click on the Desktop, select Properties, then select the Desktop tab. Click the Customize Desktop button and place a checkmark next to My Network Places. Click OK.\n\n 1. Double-click on the My Network Places icon on the Desktop.\n 2. Click on Add Network Place from the Network Tasks menu on the left side of the window.\n 3. The Add a Network Place wizard will open. Click Next.\n 4. Select the Choose another network location option. Click Next.\n 5. In the Internet or network address field enter and click Next.\n 6. Enter your CSTMS/ROMS Username and Password when prompted. Any prompts for a username and password throughout the setup process will require entry of your Username and Password as well.\n 7. Enter a name for the link, such as CSTMS. Click Next.\n 8. Click Finish.\n 9. You may need to re-enter your Username and Password.\n 10. A Windows Explorer window will appear at the top-most level of file space in CSTMS. You may then click on the links to reach your desired area.\n
{{{\n$ cat *.png | ffmpeg -r 24 -y -f image2pipe -c:v png -i - -c:v libx264 \s\n-preset ultrafast -qp 0 -movflags +faststart -pix_fmt yuv420p test.mkv\n}}}\nThen in the IPython notebook, do this:\n{{{\nfrom IPython.core.display import HTML\nvideo = open("test.mkv", "rb").read()\nvideo_encoded = video.encode("base64")\nvideo_tag = ('<video autoplay loop controls alt="test" '\n             'src="data:video/x-m4v;base64,{0}">').format(video_encoded)\nHTML(data=video_tag)\n}}}\n
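On Python 3 the `.encode("base64")` codec is gone, so the embed needs the base64 module instead. A sketch, where the placeholder bytes stand in for open("test.mkv", "rb").read():

```python
import base64

# Placeholder for the real file contents: open("test.mkv", "rb").read()
video = b"\x00\x01fake video bytes"

# base64.b64encode returns bytes; decode to str for the HTML attribute.
video_encoded = base64.b64encode(video).decode("ascii")
video_tag = ('<video autoplay loop controls alt="test" '
             'src="data:video/x-m4v;base64,{0}">'.format(video_encoded))
```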
These are easy to assemble. The only hard part is waiting out the full two-week maceration period.\n\n1 pound ripe, thoroughly washed cherries (stems and pits intact)\n\nRatio:\n2/3 Luxardo Maraschino liqueur\n1/3 Cherry Heering liqueur\n\nPlace the cherries in a glass Mason jar or other container with a lid. Pour the liqueurs in the ratio above over the cherries. The goal is to add enough liqueur to immerse the cherries, but they will bob to the top of the liquid anyway.\n\nRefrigerate for at least 2 weeks. Gently swirl the container every 2 to 3 days to immerse the cherries in the liqueur.\n
\n\nThis workshop included many of the same participants as the preceding ACT workshop, but had a broader scope. I participated in the "learners" section, where an intro to XML and a tool called Oxygen for editing it were presented. Users’ needs and objectives were also discussed. Break-out groups met simultaneously, intending to define, constrain, and build consensus among several emerging standards for metadata. A combined session in the last couple hours of the meeting let everyone hear what the others had been working on. The agenda and results are available at this site:\n\nThe following is my impression of what’s important. I learned that there's a perception that there are two kinds of metadata: one for discovery, and one for sensor specs and calibration. The former allows catalogues like Geospatial One Stop and GCMD to index and search for data matching a query. FGDC and DIF are the big gorillas in this arena for the US, with ISO 19115, MarineML and several others in use internationally. The other type is more for an observatory manager to identify and communicate with sensors installed in their systems. SensorML is the emerging standard for the instrument nuts and bolts. TransducerML is similar, but doesn't have as much support. Neither type is fully defined yet, but the discovery end of things is more mature. The funny thing is that with all the concern about the metadata, the measurements themselves seem to get lost- the best way to serve them wasn't discussed.\n\nAnother thing that was abundantly clear was that the vehicle for communicating metadata of both kinds was XML, and that web services were rapidly out-pacing SOAP for protocols. Easy conversion tools to instantiate oceanographic metadata in XML don't exist yet, but may in the next year or so. 
Eventually the user won't need to know XML to write metadata to fit one of the current models; at the moment, however, it's mandatory.\n\nOur metadata, the global attributes in particular, blends the two kinds of information, so at present we don't fit the existing models. Plus, our main users are the modeling community, so we're not working with most of the constraints of metadata from an observing system monitoring surface waves in the Chesapeake Bay. It seems that having data accessible via OpenDAP is still a good way to go, but it also depends on who the intended users are. There are plenty of GUI web interfaces that allow the user to select time, region, variable and depth, and then extract records from some database. They aren't the best choice for obtaining multiple variables simultaneously, though. \n\nThe largest obstacle to fitting our data into one of the existing metadata models in XML is mapping our terms into what they're called in the "standard". What does "units" mean? Can we estimate measurement accuracy for each of our sensors? What do we do with the metadata fields we think will be helpful but that don't exist in the "standard"? My thought is to keep what we've got, and continue moving towards conversion to CF (we're going to have to rename everything for that effort anyhow). In the future, we'll have to create XML containing a subset of our metadata to submit to FGDC for "discovery" purposes, and if methods to do more in XML exist then, we should implement what seems practical.\n\n\nThe to-do list that developed from this workshop is listed below:\n\n* Follow up with Melanie Meux at NASA/FGDC to re-do our metadata in FGDC to use more dictionary terms to enhance discoverability. I couldn't find our Mass Bay data by starting with a search for “ocean current measurements” and trying to narrow the search region and time, but learned they could be found if one knew they were collected by USGS/CMG/WHSC. 
I'd like to improve on this less-than-good discoverability.\n\n* Look into Unidata's NCO tools that may help with an alternate route for our metadata into XML (Thank you Gerry at TAMU).\n\n* Learn XML (a 10-step program?)\n\n* Communicate with engineers at Satlantic to learn more about how they encapsulate their measurements (data) into XML. I believe data can be defined as a vector, so that each measurement for each sensor does not have to be surrounded by a lot of extra XML code. They've done LEO-15 and several other systems, so they have pertinent experience and seemed willing to help. I don't know if this will buy us much, but working towards an integrated solution is more appealing than dealing with two. \n\n* Incorporate Open Geospatial Consortium (OGC) library terms into our metadata vocabulary.\n\n* Evaluate implementing a web-based GUI chooser interface for our datasets (along with OpenDAP). \n\n* Think about data and metadata management best practices.
On a tip from Dave Ralston, I took a look at the MassGIS 30m bathymetry raster grid:\n\nDownloaded and ran the .exe, then converted .\sbathymetry30m\simg_bathym30m\sw001001.adf from ESRI raster and the existing coordinate system (Mass State Plane Coordinates) to a Lon/Lat grid (EPSG:4326) geotiff using the gdalwarp command from the FWTOOLS command shell:\n{{{\nC:\sRPS\sbathy\sbathymetry30m\simg_bathym30m>gdalwarp w001001.adf -r bilinear -t_srs EPSG:4326 mass30m_geo.tif\nCreating output file that is 9841P x 6604L.\nProcessing input file w001001.adf.\nUsing internal nodata values (eg. -32768) for image w001001.adf.\n0...10...20...30...40...50...60...70...80...90...100 - done.\n}}}\n
"bench" timing (smaller is better) in Matlab 2009a:\n| !LU | !FFT | !ODE | !Sparse | !2-D | !3-D | !Machine |\n| 0.2623 | | | | | | |
This is the tinyurl for the Matlab Interoperability Demo: \n{{{\n\n}}}
In Matlab 2010a or higher, you can increase the Java memory thusly:\nFile -> Preferences -> General -> Java Heap Memory
We wanted to pass data with lon/lat values into Mirone from the Matlab workspace. We got some "undocumented info" from J. Luis about how to do this (which follows) and based on this, I wrote a function called "grid2mirone.m" (svn update m_cmg/trunk/RPSstuff) which accesses the results of "nj_subsetGrid" thusly:\n{{{\n>> uri=''\n>> [d,g]=nj_subsetGrid(uri,'topo',[-71.5 -63 39 46]);\n>> grid2mirone(d,g);\n}}}\n\nHere is J. Luis's info:\n{{{\nOne can invoke Mirone in several ways:\n\n- mirone file_name (this works with many different formats,\nas long as it is possible to find out what is in "file_name")\n\n- mirone(Z) Z is an array - the one that started this thread\n\n- mirone(Z, struc) Z as above and "struc" is a structure with\ncoordinates information\n\nthe "struc" structure has optional and mandatory fields. Below is a\ndescription of those fields and what they should contain depending on the\ncase of use.\n\nHope that it is clear enough.\n\n\n---- OPTIONAL fields\n\n- To inform if coordinates are geographical or other\nstruc.geog = 1 (geog) or = 0 (other coords)\n\n- Color map\nstruc.cmap -> contains a Matlab [Mx3] colormap\n\n- Figure name\ = 'Whatever you like';\n\n\n- If you know the projection in terms of a WKT (Well Known (by who?)\nText)\nstruc.srsWKT = a projection WKT string like the ones used by GDAL\n\n(related note, the ogrproj MEX file lets you convert a Proj4\nprojection string into a WKT string\nexample: strWKT = ogrproj('+proj=merc') )\n\n\n---- MANDATORY fields\nhead\nX\nY\n\nA header [1 x 9] array whose contents are:\nstruc.head = [x_min x_max y_min y_max z_min z_max 0 x_inc y_inc];\nThe zero (7th element) indicates that grid registration is being used\n\nNow we have two cases:\n\n-1 case: the input array is of type uint8 and contains an image\n\nstruc.X = [x_min x_max]\nstruc.Y = [y_min y_max]\n\n-2 case: the input array is of type int16 or float (single or double)\n\nIf [n_rows, n_columns] = size(Z);\nstruc.X = linspace(x_min, x_max, 
n_columns)\nstruc.Y = linspace(y_min, y_max, n_rows)\n\n}}}\n\n
Getting the correct spatial extents for unstructured grid data:\n\nThe problem: ncISO was returning incorrect lon/lat ranges for unstructured grids, because it was using a netcdf-java routine that took shortcuts (e.g. first/last value) instead of reading the entire lon or lat variable, which of course is required for unstructured grids.\n\nDave made a new version of the ncISO for TDS jar file, called threddsIso-2.22.jar, which we placed in:\n/var/www/tomcat-threddsdev/webapps/thredds/WEB-INF/lib/threddsIso-2.22.jar (and removed the old threddsIso jar file)\n\nThis version is a jar that will read from the array of data values only when the cdm_data_type attribute value is not null and not GRID. \n\nSee link below for possible cdm_data_type values:\n\n\nUnstructured grid is not currently an option, so I suggest specifying a global attribute called "cdm_data_type" with value "any" until this is resolved. This will trigger the reading of the lon/lat variables to determine the true extents. \n\nThis is easy for us to do with the unstructured grid model datasets, since we are already modifying the metadata for all datasets via a python script. \n\nThe script \\nis the main script, which calls "", which in turn uses Alex Crosby's "ncml" routine to modify metadata in an NcML file.\n\nWe are using this to modify the ID and title based on the google spreadsheet values, so I just modified the file so that it reads:\n{{{\n ncmlFile = ncml.Dataset.NcmlDataset(ncmlFile)\n ncmlFile.addDatasetAttribute('id',datasetID)\n ncmlFile.addDatasetAttribute('cdm_data_type','any')\n ncmlFile.addDatasetAttribute('title',datasetName)\n}}}\n\nI then reharvested the "testing.xml" catalog with GI-CAT using the THREDDS-NCISO service, and now the unstructured grid models are there!\n\nThe google doc is at:\n\nThis google doc points to NcML files contributed by modelers that specify aggregations, and modifies the metadata.\n\n\n\n
All times UTC\n\n0835 UTC: Met Bill Danforth, Chuck Worley, Barry Irwin at Harbor\n\nBill & Chuck working on Ethernet interface to swath mapping system as opposed to serial port.\n\nTalked to engineer Matt in UK using Skype through The Unwired Village wireless connection using laptop on fantail, \npicked up in harbor (Island Queen point?). Cool!\n \n~1400 Left Bill Danforth behind and headed out toward Middle Ground. Overcast skies. 1-2 ft seas, light wind 5-10 knots from east.\n\n1441 On station at beginning of zig-zag survey, but GPS not receiving internally. Needed to reset on receiver.\n\n~1600 Swath bathy system not removing pitch & roll, nothing seems to help, so start doing zig-zag survey with single beam echosounder, hoping that swath system will work by the time we reach the hi-res survey area.\n\n1700 Switched back to the old ISA Tem's, and that solved the pitch & roll problem. The USB Tem's apparently has a timing problem. Chuck asks: Could it be related to the desktop computer not being USB 2.0? \n\nLine 27. Start mowing the lawn from SE corner on Hypack line 27. Started with swath of 30 m on a side, but then changed to 50 m on a side 100 m into the line or so, hoping to be able to skip a few lines. \n\nDid not skip line 26, however, since we want to avoid a hole at the beginning of line 27 when the width was only 30 m on a side.\n\nSkipped line 25.\n\nLine 24. Went back to 30 m on a side and doing every line, because the lines go from shallow to deep (6 m to 20 m). Should really adjust the survey lines to try to follow contours of the bank.\n\nLine 23\nLine 22\nLine 21\nLine 20 \n\n1809 Beginning line 18. Heading 238.\n\n1904 begin line 11\n\n1939 big rollers (during 30 seconds along line 7)\n\n2019 Water is noticeably rougher on the south side of Middle Ground\n\nBarry suggests running along "the wall" instead of just across it. 
Running along the crest seems like it would be a good idea also.\n\nHeading over to N5 to do a quick bathy and to check the RTK relative to the N5 pressure sensor (after N5 is recovered).\n\n1839 finished with N5 survey\n\nChuck is trying new system again.
[[MonkeyPirateTiddlyWiki|]] is a distribution of [[TiddlyWiki|]] created by Simon Baird. See [[the web site|]] for more information.\n!!Upgrading ~MonkeyPirateTiddlyWiki\nThis "empty" ~MonkeyPirateTiddlyWiki file comes pre-installed with the core ~MonkeyPirateTiddlyWiki plugins. You can upgrade these core plugins to the latest version by doing the following:\n* Click ImportTiddlers\n* Click "Choose..." and select "~MptwUpgradeURL"\n* Click "fetch"\n* Click the checkbox in the first column heading to select all tiddlers\n* Click "More actions..." and select "Import these tiddlers"\n* Click "OK" to confirm you want to overwrite the tiddlers\n* Save and reload\n
{{{\n cd /home/old_dir\n tar cf - . | (cd /usr/new_dir; tar xvf -)\n}}}
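The tar pipe above copies a whole directory tree into another directory while preserving file metadata. A stdlib Python sketch of the same operation (the tar form additionally preserves ownership when run as root, which shutil does not attempt):

```python
# Copy a directory tree into another (possibly existing) directory,
# preserving timestamps and permissions -- a rough equivalent of
# "tar cf - . | (cd /usr/new_dir; tar xvf -)".
import os, shutil, tempfile

def copy_tree(src, dst):
    # dirs_exist_ok (Python 3.8+) lets us copy into an existing
    # target directory, as the tar pipe does
    shutil.copytree(src, dst, dirs_exist_ok=True)

# demo on throwaway directories
src = tempfile.mkdtemp()
with open(os.path.join(src, "file.txt"), "w") as f:
    f.write("hello")
dst = tempfile.mkdtemp()
copy_tree(src, dst)
```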
/***\n| Name|MptwLayoutPlugin|\n| Description|A package containing templates and css for the MonkeyPirateTiddlyWiki layout|\n| Version|3.0 ($Rev: 1845 $)|\n| Source||\n| Author|Simon Baird <>|\n| License||\n!Notes\nPresumes you have TagglyTaggingPlugin installed. To enable this you should have a PageTemplate containing {{{[[MptwPageTemplate]]}}} and similar for ViewTemplate and EditTemplate.\n***/\n//{{{\n// used in MptwViewTemplate\nconfig.mptwDateFormat = 'DD/MM/YY';\nconfig.mptwJournalFormat = 'Journal DD/MM/YY';\n//config.mptwDateFormat = 'MM/0DD/YY';\n//config.mptwJournalFormat = 'Journal MM/0DD/YY';\n\nconfig.shadowTiddlers.GettingStarted += "\sn\snSee also MonkeyPirateTiddlyWiki.";\n\n//}}}\n\n//{{{\nmerge(config.shadowTiddlers,{\n\n'MptwEditTemplate':[\n "<!--{{{-->",\n "<!--- ($Rev: 1829 $) --->",\n "<div class=\s"toolbar\s" macro=\s"toolbar +saveTiddler saveCloseTiddler closeOthers -cancelTiddler cancelCloseTiddler deleteTiddler\s"></div>",\n "<div class=\s"title\s" macro=\s"view title\s"></div>",\n "<div class=\s"editLabel\s">Title</div><div class=\s"editor\s" macro=\s"edit title\s"></div>",\n "<div class=\s"editLabel\s">Tags</div><div class=\s"editor\s" macro=\s"edit tags\s"></div>",\n "<div class=\s"editorFooter\s"><span macro=\s"message views.editor.tagPrompt\s"></span><span macro=\s"tagChooser\s"></span></div>",\n "<div macro=\s"showWhenExists EditPanelTemplate\s">[[EditPanelTemplate]]</div>",\n "<div class=\s"editor\s" macro=\s"edit text\s"></div>",\n "<!--}}}-->"\n].join("\sn"),\n\n'MptwPageTemplate':[\n "<!--{{{-->",\n "<!-- ($Rev: 1829 $) -->",\n "<div class='header' macro='gradient vert [[ColorPalette::PrimaryLight]] [[ColorPalette::PrimaryMid]]'>",\n " <div class='headerShadow'>",\n " <span class='siteTitle' refresh='content' tiddler='SiteTitle'></span>&nbsp;",\n " <span class='siteSubtitle' refresh='content' tiddler='SiteSubtitle'></span>",\n " </div>",\n " <div class='headerForeground'>",\n " <span class='siteTitle' refresh='content' 
tiddler='SiteTitle'></span>&nbsp;",\n " <span class='siteSubtitle' refresh='content' tiddler='SiteSubtitle'></span>",\n " </div>",\n "</div>",\n "<!-- horizontal MainMenu -->",\n "<div id='topMenu' refresh='content' tiddler='MainMenu'></div>",\n "<!-- original MainMenu menu -->",\n "<!-- <div id='mainMenu' refresh='content' tiddler='MainMenu'></div> -->",\n "<div id='sidebar'>",\n " <div id='sidebarOptions' refresh='content' tiddler='SideBarOptions'></div>",\n " <div id='sidebarTabs' refresh='content' force='true' tiddler='SideBarTabs'></div>",\n "</div>",\n "<div id='displayArea'>",\n " <div id='messageArea'></div>",\n " <div id='tiddlerDisplay'></div>",\n "</div>",\n "<!--}}}-->"\n].join("\sn"),\n\n'MptwStyleSheet':[\n "/*{{{*/",\n "/* ($Rev: 1860 $) */",\n "",\n "/* a contrasting background so I can see where one tiddler ends and the other begins */",\n "body {",\n " background: [[ColorPalette::TertiaryLight]];",\n "}",\n "",\n "/* sexy colours and font for the header */",\n ".headerForeground {",\n " color: [[ColorPalette::PrimaryPale]];",\n "}",\n ".headerShadow, .headerShadow a {",\n " color: [[ColorPalette::PrimaryMid]];",\n "}",\n "",\n "/* separate the top menu parts */",\n ".headerForeground, .headerShadow {",\n " padding: 1em 1em 0;",\n "}",\n "",\n ".headerForeground, .headerShadow {",\n " font-family: 'Trebuchet MS' sans-serif;",\n " font-weight:bold;",\n "}",\n ".headerForeground .siteSubtitle {",\n " color: [[ColorPalette::PrimaryLight]];",\n "}",\n ".headerShadow .siteSubtitle {",\n " color: [[ColorPalette::PrimaryMid]];",\n "}",\n "",\n "/* make shadow go and down right instead of up and left */",\n ".headerShadow {",\n " left: 1px;",\n " top: 1px;",\n "}",\n "",\n "/* prefer monospace for editing */",\n ".editor textarea {",\n " font-family: 'Consolas' monospace;",\n "}",\n "",\n "/* sexy tiddler titles */",\n ".title {",\n " font-size: 250%;",\n " color: [[ColorPalette::PrimaryLight]];",\n " font-family: 'Trebuchet MS' sans-serif;",\n "}",\n 
"",\n "/* more subtle tiddler subtitle */",\n ".subtitle {",\n " padding:0px;",\n " margin:0px;",\n " padding-left:0.5em;",\n " font-size: 90%;",\n " color: [[ColorPalette::TertiaryMid]];",\n "}",\n ".subtitle .tiddlyLink {",\n " color: [[ColorPalette::TertiaryMid]];",\n "}",\n "",\n "/* a little bit of extra whitespace */",\n ".viewer {",\n " padding-bottom:3px;",\n "}",\n "",\n "/* don't want any background color for headings */",\n "h1,h2,h3,h4,h5,h6 {",\n " background: [[ColorPalette::Background]];",\n " color: [[ColorPalette::Foreground]];",\n "}",\n "",\n "/* give tiddlers 3d style border and explicit background */",\n ".tiddler {",\n " background: [[ColorPalette::Background]];",\n " border-right: 2px [[ColorPalette::TertiaryMid]] solid;",\n " border-bottom: 2px [[ColorPalette::TertiaryMid]] solid;",\n " margin-bottom: 1em;",\n " padding-bottom: 2em;",\n "}",\n "",\n "/* make options slider look nicer */",\n "#sidebarOptions .sliderPanel {",\n " border:solid 1px [[ColorPalette::PrimaryLight]];",\n "}",\n "",\n "/* the borders look wrong with the body background */",\n "#sidebar .button {",\n " border-style: none;",\n "}",\n "",\n "/* this means you can put line breaks in SidebarOptions for readability */",\n "#sidebarOptions br {",\n " display:none;",\n "}",\n "/* undo the above in OptionsPanel */",\n "#sidebarOptions .sliderPanel br {",\n " display:inline;",\n "}",\n "",\n "/* horizontal main menu stuff */",\n "#displayArea {",\n " margin: 1em 15.7em 0em 1em; /* use the freed up space */",\n "}",\n "#topMenu br {",\n " display: none;",\n "}",\n "#topMenu {",\n " background: [[ColorPalette::PrimaryMid]];",\n " color:[[ColorPalette::PrimaryPale]];",\n "}",\n "#topMenu {",\n " padding:2px;",\n "}",\n "#topMenu .button, #topMenu .tiddlyLink, #topMenu a {",\n " margin-left: 0.5em;",\n " margin-right: 0.5em;",\n " padding-left: 3px;",\n " padding-right: 3px;",\n " color: [[ColorPalette::PrimaryPale]];",\n " font-size: 115%;",\n "}",\n "#topMenu .button:hover, 
#topMenu .tiddlyLink:hover {",\n " background: [[ColorPalette::PrimaryDark]];",\n "}",\n "",\n "/* for Tagger Plugin, thanks sb56637 */",\n ".popup li a {",\n " display:inline;",\n "}",\n "",\n "/* make it print a little cleaner */",\n "@media print {",\n " #topMenu {",\n " display: none ! important;",\n " }",\n " /* not sure if we need all the importants */",\n " .tiddler {",\n " border-style: none ! important;",\n " margin:0px ! important;",\n " padding:0px ! important;",\n " padding-bottom:2em ! important;",\n " }",\n " .tagglyTagging .button, .tagglyTagging .hidebutton {",\n " display: none ! important;",\n " }",\n " .headerShadow {",\n " visibility: hidden ! important;",\n " }",\n " .tagglyTagged .quickopentag, .tagged .quickopentag {",\n " border-style: none ! important;",\n " }",\n " .quickopentag a.button, .miniTag {",\n " display: none ! important;",\n " }",\n "}",\n "/*}}}*/"\n].join("\sn"),\n\n'MptwViewTemplate':[\n "<!--{{{-->",\n "<!--- ($Rev: 1830 $) --->",\n "",\n "<div class='toolbar'>",\n " <span macro=\s"showWhenTagged systemConfig\s">",\n " <span macro=\s"toggleTag systemConfigDisable . 
'[[disable|systemConfigDisable]]'\s"></span>",\n " </span>",\n " <span style=\s"padding:1em;\s"></span>",\n " <span macro='toolbar closeTiddler closeOthers +editTiddler deleteTiddler undoChanges permalink references jump'></span>",\n " <span macro='newHere label:\s"new here\s"'></span>",\n " <span macro='newJournalHere {{config.mptwJournalFormat?config.mptwJournalFormat:\s"MM/0DD/YY\s"}}'></span>",\n "</div>",\n "",\n "<div class=\s"tagglyTagged\s" macro=\s"tags\s"></div>",\n "",\n "<div class='titleContainer'>",\n " <span class='title' macro='view title'></span>",\n " <span macro=\s"miniTag\s"></span>",\n "</div>",\n "",\n "<div class='subtitle'>",\n " <span macro='view modifier link'></span>,",\n " <span macro='view modified date {{config.mptwDateFormat?config.mptwDateFormat:\s"MM/0DD/YY\s"}}'></span>",\n " (<span macro='message views.wikified.createdPrompt'></span>",\n " <span macro='view created date {{config.mptwDateFormat?config.mptwDateFormat:\s"MM/0DD/YY\s"}}'></span>)",\n "</div>",\n "",\n "<div macro=\s"showWhenExists ViewPanelTemplate\s">[[ViewPanelTemplate]]</div>",\n "",\n "<div macro=\s"hideWhen tiddler.tags.containsAny(['css','html','pre','systemConfig']) && !tiddler.text.match('{{'+'{')\s">",\n " <div class='viewer' macro='view text wikified'></div>",\n "</div>",\n "<div macro=\s"showWhen tiddler.tags.containsAny(['css','html','pre','systemConfig']) && !tiddler.text.match('{{'+'{')\s">",\n " <div class='viewer'><pre macro='view text'></pre></div>",\n "</div>",\n "",\n "<div macro=\s"showWhenExists ViewDashboardTemplate\s">[[ViewDashboardTemplate]]</div>",\n "",\n "<div class=\s"tagglyTagging\s" macro=\s"tagglyTagging\s"></div>",\n "",\n "<!--}}}-->"\n].join("\sn")\n\n});\n//}}}\n
For upgrading directly from tiddlyspot. See [[ImportTiddlers]].\nURL: /proxy/\n
For upgrading. See [[ImportTiddlers]].\nURL:\n
\nTDS:\nOpenDAP URL:\n\nThis dataset has over 170,000 time records, spanning from 1979-01-01\nto 2010-07-28 (currently)!!!\n\nThis dataset is being served by TDS Version 4.1.20100520.1554\n\nI was able to load this OpenDAP URL in the ToolsUI\nFeatureTypes tab, but it took over 20 minutes before the datasets\nappeared in the list! Once they appeared, the 2d fields plot rapidly\nand look great (see attached).\n\nThe 3d fields plot okay too, but only the latest 3D field is in the\naggregation (only 1 value instead of 170,000+).\n\nI downloaded one time step from this dataset to gam with \nncks -d time,0\nand it's 163MB! So written as NetCDF, this would be 163MB/step * 173,304 steps * 1TB/1e6MB = 28.2TB!!\n
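The storage arithmetic checks out:

```python
# Back-of-envelope check of the storage estimate above.
mb_per_step = 163        # one time step written as NetCDF, in MB
steps = 173_304          # time records in the aggregation
total_tb = mb_per_step * steps / 1e6   # 1 TB = 1e6 MB (decimal units)
# comes out to about 28.2 TB, matching the note
```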
In my cygwin window, I just did:\n{{{\ncd /usr/local/bin\nwget\n( see for latest version)\ntar xvfz nco-4.0.6.win32.cygwin.tar.gz\n}}}\nand because I installed UDUNITS2 in a non-standard location (not in\n/usr/local/share), I needed to do:\n{{{\nexport UDUNITS2_XML_PATH=/home/rsignell/share/udunits/udunits2.xml\n}}}\nNote that you don't have to install the UDUNITS package -- you can just get the udunits2.xml file, put it somewhere, and then point to it using the environment variable. The UDUNITS package is already built into the cygwin binary for nco.\n\nI had to also install the curl library for cygwin. If you run cygwin's setup.exe, you will find this in the "Web" directory.\n\nOnce this was done, NCO with OpenDAP and UDUNITS2 worked like a champ:\n{{{\n/usr/local/bin/ncks -O -F -d time,"2010-08-31 00:00","2010-08-31\n12:00" -d lon,-88.37,-85.16 -d lat,29.31,30.4\n""\\n}}}\nYou can run "cygcheck -srv" to see the details of your cygwin installation.\n\n
Getting NCO going on Cygwin is now much easier, thanks to the binary tarball provided by Charlie Zender. But there are still a few steps to get NCO fully functional on Cygwin. Here's what they are:\n\n1. Download and unpack the Cygwin binary distribution of NCO from \n\n2. Make sure that "curl" is installed. Type "which curl" in a cygwin shell. If you don't see "/usr/bin/curl", then you need to run Cygwin setup.exe and install the "curl" package from the "Web" section of the installer.\n\n3. For UDUNITS support (which you want so you can extract data based on Gregorian times and such), download and unpack the UDUNITS version 2 or higher distribution from\n\nNCO just needs the XML files defining the units, so you don't need to build the distribution. You can just copy all the XML files from the ./lib subdirectory to a suitable place like /usr/local/share/udunits. Make sure you copy them all, because udunits2.xml references the other XML files.\n\n4. Specify the location of the UDUNITS2.xml file in this environment variable:\n{{{ \nexport UDUNITS2_XML_PATH=/usr/local/share/udunits/udunits2.xml \n}}}\nYou probably want to put this in your .bashrc or .bash_profile so you always have it defined.\n\n5. Try it out. See if this works:
FYI, here's what I found out about wave and met reporting times for NDBC buoys. Met data are reported stamped with the time that the measurements ended (starting in 2004; starting in 1993 for archived data). Wave data are reported to the nearest hour or half-hour following the end of measurements. For example, for 44008, which records data from 20-40 min past the hour, wave measurements are actually centered on the 1/2 hour, but are reported on the hour.\n\nStart time and duration of wave measurements vary by buoy; a table is found here:\n\nmore about acquisition times at:\n\n
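For 44008 specifically, that means a reported top-of-hour wave timestamp corresponds to a measurement centered 30 minutes earlier. A sketch of the correction (the 30-minute offset is an assumption valid only for buoys with a 20-40 min acquisition window; check NDBC's table for other buoys):

```python
# Shift an NDBC 44008 wave timestamp from the reported top-of-hour
# back to the approximate center of the 20-40 min acquisition window.
# The offset is buoy-specific -- consult NDBC's acquisition-time table.
from datetime import datetime, timedelta

def wave_obs_center(reported, offset_minutes=30):
    """Reported hourly stamp -> approximate measurement center time."""
    return reported - timedelta(minutes=offset_minutes)

stamp = datetime(2010, 8, 31, 14, 0)   # reported 1400 UTC
center = wave_obs_center(stamp)        # measurement centered ~1330 UTC
```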
{{{\nGOM3, 48149 nodes, 40 levels: 523GB/year (without overlapping time)\nGOM2, 32649 nodes, 31 levels: 264GB/year\nMBAY: 98432 nodes, 11 layers: 250GB/year\n}}}\nFrom Qichun Xu <>:\n\nThe entire forecast system uses:\n{{{\n WRF hindcast met model: 5*12 processors\n WRF forecast met model: 3*12 processors\n gom2 hindcast :1*12 processors\n gom2 forecast :1*12 processors\n gom3 hindcast :4*12 processors\n gom3 forecast :4*12 processors\n gom3 wave :4*12 processors\n MassBay forecast :4*12 processors\n Scituate forecast: 6*8 processors\n}}}\n\n(26 nodes * 12 CPU/node) + (6 nodes * 8 CPU/node) = 360 CPUs\nrun takes 6 hours \n6 hours * 360 = 2160 CPU hours\n\n\n\n\nCloud Options: Amazon, Rackspace, ProfitBricks (infiniband)\nDNS Failover: Amazon Route 53\n\nRequired: \nExperience running FVCOM\nExperience running ocean models on the CLOUD \nExperience maintaining operational systems\n\nPreferred: Experience running FVCOM in a cloud environment\n\nPerformance comparison between ProfitBricks, Amazon EC2, Rackspace Cloud:\n\n\n
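The CPU accounting above can be reproduced directly from the per-model allocations:

```python
# Reproduce the forecast-system CPU totals listed above.
# Models on 12-CPU nodes: WRF hindcast (5) + WRF forecast (3) +
# gom2 hindcast/forecast (1+1) + gom3 hindcast/forecast/wave (4+4+4) +
# MassBay forecast (4); Scituate runs on 6 nodes of 8 CPUs.
twelve_cpu_nodes = 5 + 3 + 1 + 1 + 4 + 4 + 4 + 4   # = 26 nodes
total_cpus = twelve_cpu_nodes * 12 + 6 * 8          # = 360 CPUs
cpu_hours = total_cpus * 6                          # one 6-hour run
```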
{{{\n>> url='';\n>> nc=ncgeodataset(url);\n>> jd=nc.time('time');\n>> datestr(jd([1 end]))\n31-Dec-1977 22:58:07\n01-Aug-2010 00:00:00\n\n>> url='';\n>> nc=ncgeodataset(url);\n>> jd2=nc.time('time');\n>> datestr(jd2([1 end]))\n01-Apr-2010 00:00:00\n31-Oct-2011 22:58:07\n}}}\n
Here's how to extract data from NOAA's ERDDAP for use in NOAA's GNOME (oil-spill and particle tracking freeware):\n
\nThe NOAA Estuarine Bathymetry \n\ncan easily be viewed and converted into lon/lat grids using Mirone\n\n\nExample: Barnegat Bay\n\nDownloaded the 1 arc second zip file, which unpacks into a DEM file.\n\nThe DEM loaded okay in Mirone 2.0. \n\nIn Matlab, go to the mirone directory, type "mirone" and then "File=>Open Grid/Image=>Try Luck with GDAL", choose "all file types" and then select the DEM file. (M070_39074G2_BIG.dem in this case)\n\nThen to convert this UTM grid to a uniformly spaced geographic grid in Mirone, choose:\n"Projections=>GDAL Project" and choose "EPSG:4326" for output (the EPSG code for uniform lon/lat)\n\nThen save the grid as type "GMT", which is a netcdf file.\n\nYou can also choose "File=>Workspace=>Grid/Image=>Workspace" and then you will find X,Y,Z variables in your matlab workspace.\n\nIn addition to the NOAA Estuarine Bathymetry, there are some very nice merged bathy/topo grids at:\n\n
NOAA's GNOME program can do 2D tracking with structured or unstructured grid NetCDF files, but requires certain conventions. We want to figure out how to get FVCOM results into GNOME.\n\nThe NetCDF file in each package below is the same, but is bundled with other files that are Mac or Windows specific. \n\nWindows:\n\n[[ GNOME for Windows | ]]\n[[ Sample Unstructured Grid w/NetCDF file |]]\n\nMac:\n[[ GNOME for Mac | ]]\n[[ Sample Unstructured Grid w/NetCDF file |]]\n\nSee pages 30 and 31 of this document:\n[[ GNOME Data formats | ]] \n\nComments: the 4-column BND variable is required (I tried deleting it, but GNOME then bombed). It is used to create the GNOME "map" file, which describes the boundary, and the nature of this variable is described on page 31. An issue for FVCOM is that velocities are on element centers, and GNOME expects u,v,lon,lat to be on nodes. GNOME also expects the boundary list to be a list of nodes. \n\n
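One simple way around the element-center vs node mismatch is to average each element-centered velocity onto the nodes of the elements that contain it. A toy sketch (the function is hypothetical; in real FVCOM output the element-to-node connectivity is the "nv" variable, three node indices per triangle, and a real conversion might use inverse-distance weights instead of a plain mean):

```python
# Sketch: average element-centered values (FVCOM u,v) onto nodes,
# since GNOME expects velocities on nodes. Each node gets the mean of
# all elements that reference it.
from collections import defaultdict

def elems_to_nodes(u_elem, nv, n_nodes):
    """u_elem[e]: value on element e; nv[e]: node indices of element e."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for e, nodes in enumerate(nv):
        for n in nodes:
            sums[n] += u_elem[e]
            counts[n] += 1
    return [sums[n] / counts[n] if counts[n] else 0.0
            for n in range(n_nodes)]

# Two triangles sharing the edge between nodes 1 and 2:
nv = [(0, 1, 2), (1, 3, 2)]
u_node = elems_to_nodes([1.0, 3.0], nv, 4)
```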
{{{\n <netcdf xmlns=""\n location="">\n <variable name="theta_s" shape="" type="double">\n <values>5.0</values>\n </variable>\n <variable name="theta_b" shape="" type="double">\n <values>0.0</values>\n </variable>\n <variable name="Tcline" shape="" type="double">\n <values>10.0</values>\n </variable>\n <variable name="s_rho" shape="s_rho" type="double">\n <values start="-0.9875" increment="0.025"/>\n <attribute name="positive" value="up"/>\n <attribute name="standard_name" value="ocean_s_coordinate"/>\n <attribute name="formula_terms"\n value="s: s_rho eta: zeta depth: h a: theta_s b: theta_b depth_c: Tcline"/>\n </variable>\n <variable name="s_w" shape="s_w" type="double">\n <values start="-1" increment="0.025"/>\n <attribute name="standard_name" value="ocean_s_coordinate"/>\n <attribute name="positive" value="up"/>\n <attribute name="formula_terms"\n value="s: s_w eta: zeta depth: h a: theta_s b: theta_b depth_c: Tcline"/>\n </variable>\n <variable name="temp">\n <attribute name="coordinates" value="time s_rho eta_rho xi_rho"/>\n </variable>\n <variable name="u">\n <attribute name="coordinates" value="time s_rho eta_rho xi_u"/>\n </variable>\n <variable name="ubar">\n <attribute name="coordinates" value="time eta_rho xi_u"/>\n </variable>\n <variable name="v">\n <attribute name="coordinates" value="time s_rho eta_v xi_rho"/>\n </variable>\n <variable name="omega">\n <attribute name="coordinates" value="time s_w eta_rho xi_rho"/>\n </variable>\n <variable name="AKv">\n <attribute name="coordinates" value="time s_w eta_rho xi_rho"/>\n </variable>\n <variable name="vbar">\n <attribute name="coordinates" value="time eta_v xi_rho"/>\n </variable>\n <variable name="zeta">\n <attribute name="coordinates" value="time eta_rho xi_rho"/>\n </variable>\n <variable name="time" orgName="scrum_time">\n <attribute name="units" value="seconds since 2002-01-01 00:00 UTC"/>\n </variable>\n <variable name="eta_rho" shape="eta_rho" type="double">\n <attribute name="units" 
value="degrees_north"/>\n <values start="32.45" increment="
I tried using NetCDF java with NCML to extract just two time steps from a remote OpenDAP URL:\n\n{{{\njava -classpath toolsUI-4.0.jar ucar.nc2.dataset.NetcdfDataset -in test.ncml -out\n}}}\n\n$> more test.ncml\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<netcdf xmlns="">\n <aggregation dimName="ocean_time" type="joinExisting">\n <netcdf location=""\n coordValue="
/***\n| Name:|NewHerePlugin|\n| Description:|Creates the new here and new journal macros|\n| Version:|3.0 ($Rev: 1845 $)|\n| Date:|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source:||\n| Author:|Simon Baird <>|\n| License||\n***/\n//{{{\nmerge(config.macros, {\n newHere: {\n handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n wikify("<<newTiddler "+paramString+" tag:[["+tiddler.title+"]]>>",place,null,tiddler);\n }\n },\n newJournalHere: {\n handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n wikify("<<newJournal "+paramString+" tag:[["+tiddler.title+"]]>>",place,null,tiddler);\n }\n }\n});\n\n//}}}\n\n
Pre meeting:\nIdeas for ocean representation on OGC Met Ocean Working Group, which seems dominated by UK Met Office and other met folks. We need some more oceanographers!\n\nFrom Roger Proctor: Richard Lampitt (NOC)\nAdam Leadbetter (BODC)\nLesley Rickards (BODC)\n\nCharlie: \nIOOC (Interagency Ocean Observations Committee) oversees DMAC; IOOC is \nco-chaired by NASA/NOAA/NSF\nDavid Legler\n\nJeff: NOAA Data Architect\nRoger: Australian Integrated Marine Observing System \nJenifer: working with Walter BOEM\n\nKrish: DARWIG ?\nJulie Thomas: SEADIP, executive director of SCOOS\nRu Morrison: NERACOOS, former optical oceanographer\nSam Simmons: IOOC member, marine mammals, elephant seals, tagged\nEoin Howlett: MARACOOS, OOI, IOOS Modeling Testbed\nMark Fornwall: OBIS USA\nKrisa (sub for Ken Casey)\nJanet Fredericks (Data Quality: Cabled observatory MVCO, publishing once)\nCarl Benedict: ESIP (\nMichelle Gierach: Lead Scientist for PO-DACC\n\n
Jet Stream position:\nJet Stream position is critical for conditions on the US East Coast, and climate models are all over the map.\nGISS is terrible\nGFDL is pretty good\n
TIGGE: 48 h delay to 34 weather products from all over the world. Not yet accessible via TDS, but now with authentication, should become possible.\n\nAWIPS II can download data from LDM, but has decoders to convert to HDF5; not sure what its data model is or how it relates to the CDM. \n\nBen mentions a project called F5 that is trying to invent a CDM-like approach to HDF5.\n\nRick Anthes stepping down as head of UCAR after 22 years.\n\nFlight simulator for IDV:\n\n\n
Wednesday\nENKI: get quote to ENKI, ask WHOI and NFRA group if they want a presentation\n\nChris Little: asking for fixed level could be very expensive.\n\nGeorge Percival "Environmental modeling"\n\nUnCertML available as an OGC discussion paper (for uncertainty in models)\nWPS: GetCapabilities,DescribeProcess,Execute\n\nM. Gould has used WPS to "run" a Hydrological Model
\n “In my opinion, tequila is misunderstood and underappreciated, but it’s one of the best spirits for cocktails. I love mezcal, too, and I’ve had some really nice cocktails with mezcal and pineapple, so I started there. Then I thought about what else pairs well with tequila and mezcal, and I just kept adding layers. Drinks aren’t one-dimensional to me; I try to add as many different notes as I can without letting one overpower another. Unfortunately, when some people think punch they think of the trash-can variety they had in college. But this is for graduates of that. There’s something so wonderful about sitting around drinking out of a big, beautiful punch bowl.”\nWho Jeret Peña, tavern keeper\nWhere Esquire Tavern, in San Antonio\n{{{\nIngredients:\n8 ounces tequila\n1 ounce mezcal \n3 ounces tawny port\n2 ounces St-Germain elderflower liqueur\n6 ounces pineapple juice \n4 ounces water \n2 ounces lime juice\n2 ounces simple syrup\n10 drops Bittermens Hellfire Habanero Shrub \ncinnamon stick\nlime wheel\npineapple wedge\n}}}\n Directions:\n{{{\n1. Chill ingredients, then stir together in a punch bowl or pitcher. \n2. Top with grated cinnamon. Garnish with a lime wheel and pineapple wedge. Serves five.\n}}}\n
The PSDEM_2000 data were supplied in Washington State Plane North coordinates NAD83, with elevations in feet relative to the NAVD88 vertical datum. The data was downloaded as "" a zipped ARC ASCII grid format from\n\nWe converted from state plane coordinates in feet with elevations in feet to geographic coordinates (lon,lat) with height in meters. The original grid spacing was 30 ft, and we used the slightly smaller 0.0001 arc degree spacing, using bilinear interpolation to interpolate the height values.\n\nStep 1. Convert Arc/ASCII to GeoTIFF using information in metadata document in the zip file. The key was realizing that the false_easting mentioned in the metadata
omni graffle\ncmapTools COE\nProtege\nSkype + dimdim MediaWiki\ngoogle web toolkit\nJena/TDB joseki triple store and SPARQL endpoint server
Here's a "Polpette" (meatball) recipe that seeks to recreate those served at \n"Osteria Ca D'Oro alla Vedova" in Venice\nSee this page for pics:\n\nIngredients: \n{{{\n1.5 lbs veal/beef/pork mixture\n1/3 lb mortadella\n2 medium russet potatoes\n2 cloves garlic\n1 egg\n1 1/2 t salt\n1/2 cup parmesan or grana cheese\n1/2 cup parsley\n1/2 cup bread crumbs (only if necessary)\n}}}\nBake the potatoes at 400 for 1 hour or until done, and while still hot (or at least warm) put through a ricer, or mash by hand.\nMince the garlic, parsley and mortadella very fine.\nMix all ingredients and form into small balls (about 3/4 inch diameter). \nOnly add bread crumbs if necessary to get the balls to hang together. You want these to be nice and soft.\n\nFry in hot (350 degree) oil until golden brown, or fry in a pan over medium heat until cooked all the way through.
* convert the CEOS to calibrated backscatter using the free "rsat" program from Juha-Petri Kärnä ( This program expects the CEOS files to be named like this: \n dat_01.001\n lea_01.001\n nul_01.001\n tra_01.001\n vdf_01.001\nso I make a subdirectory for each date (e.g. "jan_26") containing these files. Then I make a subdirectory below this called "rs1". Cd to the "rs1" subdirectory and run the "rsat" program: \n{{{\nrsat -d ../ -c -l -o ./\n}}}\nthis will create a bunch of "rs1_XXX" files in the ER Mapper (.ers) format. \n\n * We want to preserve the GCP information in the original CEOS file, which is not maintained by Juha-Petri's program. If we convert the original CEOS file to ER Mapper using gdal_translate, we can use the .ers file generated for the original image for the calibrated image instead, provided we change \n{{{\n CellType = Unsigned8BitInteger\n}}}\nto\n{{{\n CellType = IEEE4ByteReal\n}}}\n\nThen we can convert the ER Mapper images to a uniformly spaced image in any coordinate system we want, for example, UTM zone 33, using gdalwarp.\n
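Since .ers headers are plain text, the CellType swap can be scripted rather than done by hand. A minimal sketch (function name hypothetical; assumes the header contains the CellType line exactly as shown above):

```python
# Sketch: patch the CellType line in an ER Mapper .ers header so the
# original georeferenced header can describe the calibrated
# floating-point image.
def patch_celltype(header_text):
    return header_text.replace("CellType = Unsigned8BitInteger",
                               "CellType = IEEE4ByteReal")

hdr = ("DatasetHeader Begin\n"
       " CellType = Unsigned8BitInteger\n"
       "DatasetHeader End\n")
patched = patch_celltype(hdr)
```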
Step 1: Run this grab & convert script:\n{{{\n#!/bin/bash\n#\n# DO_MERGE_SRTM30+\n#\n# Grab the latest SRTM30+ data (33 tiles in SRTM30 format)\n# from UCSD ftp site and merge into a single 16-bit GeoTIFF.\n# Grabbing the ER Mapper headers from UCSD allows ""\n# from the FWTOOLS ( to merge them\n# into a single GeoTIFF with global extent with a single command.\n#\nwget*.ers\nwget*.srtm\n\n# Merge all 33 tiles into one global GeoTIFF\ -o srtm30plus_v6.tif *.ers\n\n# Convert GeoTIFF to NetCDF for distribution via OpenDAP\ngdal_translate srtm30plus_v6.tif -of NetCDF\n}}}\n\nNote to self: I've been doing this on my linux system in the directory ~/bathy/strm30plus\nwith subdirectories v5.0,v6.0 and script name "do_merge_srtm30plus". The "wget" command takes about 45 minutes to grab all the data.\n\nStep 2: Modify the metadata to make it recognized as a GridDataset by NetCDF-Java\nGDAL's NetCDF driver yields:\n{{{\n netcdf srtm30plus_v5 {\ndimensions:\n x = 43200 ;\n y = 21600 ;\nvariables:\n char GDAL_Geographics ;\n GDAL_Geographics:Northernmost_Northing = 90. ;\n GDAL_Geographics:Southernmost_Northing = -\n}}}\n\ is using an Albers projection on a unit sphere, with variable names following Snyder, repeated here by Wolfram:\n\nThese parameters are set early in the wind-bundle.js, and look like this:\n{{{\n var phi1 = radians(29.5);\n var phi2 = radians(45.5);\n var n = .5 * (phi1 + phi2);\n var C = Math.cos(phi1) * Math.cos(phi1) + 2 * n * Math.sin(phi1);\n var phi0 = radians(38);\n var lambda0 = radians(-98);\n var rho0 = Math.sqrt(C - 2 * n * Math.sin(phi0)) / n;\n}}}\nThese coordinates on a unit sphere are then multiplied by a scale factor to get pixels\n{{{\n var mapProjection = new ScaledAlbers(\n 1111 * 0.85, -75, canvas.height - 100, -130.1, 20.2);\n}}}\nwhere 1111 * 0.85 is the number of pixels/unit distance on the unit radius sphere (pi/2 = distance from pole to equator). 
This distance is measured from the lower left corner (-130.1,20.2), and an offset is included as canvas.height because we are measuring pixels from the top instead of pixels from the bottom. I would get rid of the offsets (-75 and -100).
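The parameters quoted from wind-bundle.js can be transcribed to Python to check the projection. Note the JS computes n as the mean of the latitudes (textbook Albers uses the mean of their sines); this sketch follows the JS exactly, and the forward-projection function is my own addition using the standard Albers formulas:

```python
# Transcription of the Albers setup from wind-bundle.js, plus a
# forward projection on the unit sphere (Snyder's formulas).
import math

phi1, phi2 = math.radians(29.5), math.radians(45.5)
n = 0.5 * (phi1 + phi2)   # as in the JS (mean of latitudes, not sines)
C = math.cos(phi1)**2 + 2 * n * math.sin(phi1)
phi0, lambda0 = math.radians(38), math.radians(-98)
rho0 = math.sqrt(C - 2 * n * math.sin(phi0)) / n

def albers(lon_deg, lat_deg):
    """Project (lon, lat) in degrees to unit-sphere (x, y)."""
    lam, phi = math.radians(lon_deg), math.radians(lat_deg)
    rho = math.sqrt(C - 2 * n * math.sin(phi)) / n
    theta = n * (lam - lambda0)
    return rho * math.sin(theta), rho0 - rho * math.cos(theta)

x, y = albers(-98, 38)   # the projection origin should map to (0, 0)
```

These unit-sphere coordinates would then be scaled by the 1111 * 0.85 pixels-per-unit factor and offset as described above.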
Quick Pasta e Fagioli (serves 4 -- can easily double)\nPreparation time: 25 minutes!\n\nIngredients\n\n 1 tablespoon olive oil\n 1/2 cup chopped onion\n 2 slices of bacon, diced\n 1 stalk celery, chopped \n 2 garlic cloves, minced\n 1/4 teaspoon crushed red pepper\n 1 teaspoon chopped fresh rosemary\n 1 (19-ounce) can cannellini beans with liquid\n 3 cups chicken broth\n 1/2 cup canned diced tomatoes with liquid\n 1/2 cup ditalini (very short tube-shaped macaroni)\n 1/4 cup finely shredded Parmesan cheese\n\nPreparation\n\n1. Cook bacon in a large saucepan over medium heat. Remove bacon and drain fat from pan (but don't clean it).\n Add oil, celery, onion and garlic; cook 5 minutes or until golden, stirring frequently.\n\n2. Stir in pepper, crumbled bacon and everything else except: 1/2 the beans, the tomatoes, the cheese and the pasta.\n Bring to a boil.\n\n3. Reduce heat, add tomatoes and simmer 10 minutes.\n\n4. Puree until smooth with an immersion blender.\n\n5. Add pasta and the rest of the beans, and cook 7 minutes or until done. \n Turn off heat, mix in 1/3 cup parmesan cheese.\n Sprinkle each serving with 1 tablespoon more cheese.
/***\n| Name|QuickOpenTagPlugin|\n| Description|Changes tag links to make it easier to open tags as tiddlers|\n| Version|3.0 ($Rev: 1845 $)|\n| Date|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\n***/\n//{{{\nconfig.quickOpenTag = {\n\n dropdownChar: (document.all ? "\su25bc" : "\su25be"), // the little one doesn't work in IE?\n\n createTagButton: function(place,tag,excludeTiddler) {\n // little hack so we can to <<tag PrettyTagName|RealTagName>>\n var splitTag = tag.split("|");\n var pretty = tag;\n if (splitTag.length == 2) {\n tag = splitTag[1];\n pretty = splitTag[0];\n }\n \n var sp = createTiddlyElement(place,"span",null,"quickopentag");\n createTiddlyText(createTiddlyLink(sp,tag,false),pretty);\n \n var theTag = createTiddlyButton(sp,config.quickOpenTag.dropdownChar,\n config.views.wikified.tag.tooltip.format([tag]),onClickTag);\n theTag.setAttribute("tag",tag);\n if (excludeTiddler)\n theTag.setAttribute("tiddler",excludeTiddler);\n return(theTag);\n },\n\n miniTagHandler: function(place,macroName,params,wikifier,paramString,tiddler) {\n var tagged = store.getTaggedTiddlers(tiddler.title);\n if (tagged.length > 0) {\n var theTag = createTiddlyButton(place,config.quickOpenTag.dropdownChar,\n config.views.wikified.tag.tooltip.format([tiddler.title]),onClickTag);\n theTag.setAttribute("tag",tiddler.title);\n theTag.className = "miniTag";\n }\n },\n\n allTagsHandler: function(place,macroName,params) {\n var tags = store.getTags();\n var theDateList = createTiddlyElement(place,"ul");\n if(tags.length == 0)\n createTiddlyElement(theDateList,"li",null,"listTitle",this.noTags);\n for (var t=0; t<tags.length; t++) {\n var theListItem = createTiddlyElement(theDateList,"li");\n var theLink = createTiddlyLink(theListItem,tags[t][0],true);\n var theCount = " (" + tags[t][1] + ")";\n theLink.appendChild(document.createTextNode(theCount));\n var theDropDownBtn = createTiddlyButton(theListItem," " +\n 
config.quickOpenTag.dropdownChar,this.tooltip.format([tags[t][0]]),onClickTag);\n theDropDownBtn.setAttribute("tag",tags[t][0]);\n }\n },\n\n // todo fix these up a bit\n styles: [\n"/*{{{*/",\n"/* created by QuickOpenTagPlugin */",\n".tagglyTagged .quickopentag, .tagged .quickopentag ",\n" { margin-right:1.2em; border:1px solid #eee; padding:2px; padding-right:0px; padding-left:1px; }",\n".quickopentag .tiddlyLink { padding:2px; padding-left:3px; }",\n".quickopentag a.button { padding:1px; padding-left:2px; padding-right:2px;}",\n"/* extra specificity to make it work right */",\n"#displayArea .viewer .quickopentag a.button, ",\n"#displayArea .viewer .quickopentag a.tiddyLink, ",\n"#mainMenu .quickopentag a.tiddyLink, ",\n"#mainMenu .quickopentag a.tiddyLink ",\n" { border:0px solid black; }",\n"#displayArea .viewer .quickopentag a.button, ",\n"#mainMenu .quickopentag a.button ",\n" { margin-left:0px; padding-left:2px; }",\n"#displayArea .viewer .quickopentag a.tiddlyLink, ",\n"#mainMenu .quickopentag a.tiddlyLink ",\n" { margin-right:0px; padding-right:0px; padding-left:0px; margin-left:0px; }",\n"a.miniTag {font-size:150%;} ",\n"#mainMenu .quickopentag a.button ",\n" /* looks better in right justified main menus */",\n" { margin-left:0px; padding-left:2px; margin-right:0px; padding-right:0px; }", \n"#topMenu .quickopentag { padding:0px; margin:0px; border:0px; }",\n"#topMenu .quickopentag .tiddlyLink { padding-right:1px; margin-right:0px; }",\n"#topMenu .quickopentag .button { padding-left:1px; margin-left:0px; border:0px; }",\n"/*}}}*/",\n ""].join("\sn"),\n\n init: function() {\n // we fully replace these builtins. 
can't hijack them easily\n window.createTagButton = this.createTagButton;\n config.macros.allTags.handler = this.allTagsHandler;\n config.macros.miniTag = { handler: this.miniTagHandler };\n config.shadowTiddlers["QuickOpenTagStyles"] = this.styles;\n store.addNotification("QuickOpenTagStyles",refreshStyles);\n }\n}\n\nconfig.quickOpenTag.init();\n\n//}}}\n
Nobuhito Muri: \nOsaka Bay: Osaka City has warmed by 2 degrees over the last 50 years due to land-use changes.\nWith WRF, using urban land use and measured SST in Osaka Bay gets close, but modeled nighttime cooling is still 1.5 deg too cool.\nSpeculates that heating of Osaka Bay due to sewage (7 deg warmer) results in higher atmospheric temps, but this gives only a 0.07 deg increase.\n\nVHF Radar: 500 m resolution. Measured residual circulation is similar to obs.\n\nTidal residual circulation is sensitive to vertical mixing; says MY2.5 gives a better result than GLS. Why?\nAnd bottom friction not tested?
PV003\n\nIn "Include/cppdefs.h"\n{{{\n#define SOUTHERN_WALL\n#define NORTHERN_WALL\n#define EAST_M3GRADIENT\n#define WEST_M3GRADIENT\n#define EAST_TGRADIENT\n#define WEST_TGRADIENT\n#define ANA_FSOBC\n#define ANA_M2OBC\n#define FSOBC_REDUCED\n\n#define WEST_FSCHAPMAN\n#define WEST_M2REDUCED\n#define EAST_FSCLAMPED\n#define EAST_M2REDUCED\n}}}\n\nForced with Kelvin Wave on Eastern Side:\n{{{\n IF (EASTERN_EDGE) THEN\n! kelvin wave structure across boundary\n DO j=JstrR,JendR\n val=cff1*fac*EXP(-GRID(ng)%f(Iend,j)* &\n & (GRID(ng)%yp(Iend,JendR)-GRID(ng)%yp(Iend,j))/ &\n & SQRT(g*GRID(ng)%h(Iend,j)))\n BOUNDARY(ng)%zeta_east(j)=val*SIN(omega*time(ng))\n END DO\n END IF\n}}}\n\nPV004\n{{{\nSOUTH FSCLAMPED\nSOUTH M2REDUCED\nEAST M2CLAMPED\nEAST FSGRADIENT\n}}}\nPV005\nPV006\n{{{\nWEST_FSRADIATION\nWEST_M2RADIATION\n}}}\nPV007\n{{{\n -.12 amp velocity, 36 hours\n}}}\nPV008\n{{{\n72 hours\n}}}\nPV009\n{{{\n WEST_FSCHAPMAN \n WEST_M2REDUCED\n}}}\nPV010\n{{{\n WEST_FSRADIATION\n WEST_M2RADIATION\n EAST FSCHAPMAN\n}}}
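The Kelvin-wave forcing in the PV003 snippet decays away from the boundary over the barotropic Rossby radius sqrt(g*h)/f. A minimal Python sketch of the same cross-boundary factor as the Fortran loop (the f and h values here are made-up placeholders, not from the actual run; the SIN(omega*time) factor is omitted):

```python
import math

def kelvin_decay(y, y_north, f=1.0e-4, h=100.0, g=9.81):
    """Cross-boundary amplitude factor exp(-f*(y_north - y)/sqrt(g*h)),
    mirroring the BOUNDARY(ng)%zeta_east loop above."""
    return math.exp(-f * (y_north - y) / math.sqrt(g * h))

# the barotropic Rossby radius sqrt(g*h)/f sets the e-folding scale
radius = math.sqrt(9.81 * 100.0) / 1.0e-4
print(round(radius / 1e3))                    # ~313 km for these values
print(kelvin_decay(0.0, 0.0))                 # 1.0 at the wall itself
print(round(kelvin_decay(-radius, 0.0), 3))   # 0.368 one radius away
```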
DEF_HIS - creating history file:\noceanO: string.c:42: NC_check_name: Assertion `name != ((void *)0)' failed.\nAborted\n
First downloaded Padman's TMD Matlab toolbox from\n\nUnzipped this to c:\srps\sm\stides\stxpo\n\nDownloaded the latest TPXO data files via the links found on \n\nUpdate: they now have the data in netCDF, so the rest of this page may no longer be necessary!\n\nUncompressed to \nc:\srps\sm\stides\stxpo\sDATA\n\nMoved the ASCII "pointer" file "Model_tpxo7.1" up a level to\nc:\srps\sm\stides\stxpo\nsince this file contains these 3 lines:\n{{{\nDATA/h_tpxo7.1\nDATA/u_tpxo7.1\nDATA/grid_tpxo7.1\n}}}\nI then wrote two m-files to convert the coefficients in the binary files to NetCDF:\ntpxo_uv_2nc.m\ntpxo_z_2nc.m\n\nBefore you can run these scripts to create the NetCDF, you need to create the empty NetCDF files by editing the CDL templates "z_template.cdl" and "uv_template.cdl" and then doing:\n{{{\n$ ncgen -o < uv_template.cdl\n$ ncgen -o < z_template.cdl\n}}}\n\nThese programs simply read the list of constituents, then extract the amp and phase for each constituent and write them to the NetCDF file. I also switch from 0:360 to -180:+180 just to be consistent with our other tidal databases.\n\nFinally, I wrote a program based on roms_tri_tides.m called roms_tides.m, which reads and interpolates from gridded tidal models (specifically TPXO in this case) to a ROMS grid. I checked it into m_cmg/trunk/adcirc_tides (now a bit of a misnomer, unfortunately).\n\n
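The 0:360 to -180:+180 longitude switch mentioned above is a one-liner; a sketch in Python (an illustration, not the actual tpxo_uv_2nc.m / tpxo_z_2nc.m code):

```python
def lon360_to_180(lon):
    """Map a longitude from [0, 360) to [-180, 180)."""
    return ((lon + 180.0) % 360.0) - 180.0

print(lon360_to_180(350.0))  # -10.0
print(lon360_to_180(10.0))   # 10.0
```

When applying this to a gridded dataset you also need to reorder the longitude columns so the coordinate stays monotonic.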
{{{\nInstructions how to use Makefile to build ROMS/UCLA model.\n============ === == === ======== == ===== ========= ======\nThere are three types of makefiles associated with the\nROMS/UCLA building procedure:\n\n i. Makefile -- a universal machine-independent makefile. This\n file contains the list of source code files which determine the\n particular model configuration to be built. The user is free to\n add or delete files from this configuration list at his/her\n own discretion without any restrictions, depending on the\n physical formulation of the problem.\n\n ii. Makedefs.machine_type (e.g., Makedefs.sgi, Makedefs.Linux):\n These files contain definitions of rules, compilers and compiler\n options, which are generally machine dependent. These files may\n be edited by the user in order to ensure optimal usage of the\n compiler flags for a particular machine type or compiler.\n\niii. Make.depend -- an automatically generated list of dependencies.\n Usually this list contains the names and dependencies of ALL\n source codes in the directory, regardless of whether they are\n actually needed in the present configuration or not. This file\n is practically machine independent. This file should not be\n edited by the user under any circumstances.\n\n\nHow to make Makefile work:\n=== == ==== ======== =====\n\n 1. On a particular machine, for example a Sun, establish the symbolic\n link:\n ln -s Makedefs.sun Makedefs\n\n (If the file for the particular type of machine is not available,\n create it, using one of the existing "Makedefs.machine" files\n as a template. Define appropriate compiler options.)\n\n 2. Check if the file "Make.depend" exists in the present directory.\n If it does not exist, create an EMPTY file and call it\n "Make.depend".\n\n 3. After steps 1 and 2 your Makefile should be able to work.\n Type\n make tools\n\n This will create two auxiliary utility executable files named\n "cross_matrix" and "mpc". The first one, "cross_matrix", is a\n tool to analyze dependencies and build "Make.depend"; the\n second one is an auxiliary multifunctional precompiler designed\n to make .f files generated by CPP more human readable by\n cleaning them of blank lines and comments, as well as to\n perform certain code transformations and optimizations\n (optionally). Read the headers of files "mpc.F" and "cross_matrix.F"\n for more details. Once the tools are built, it is not necessary\n to rebuild them every time the model is compiled, unless\n files "cross_matrix.F" and "mpc.F" were modified.\n\n 4. Type\n make depend\n\n This will update/create the file "Make.depend" consistent with the\n content of all *.F files in the current working directory. All\n source code *.F files will be included in the dependency list,\n regardless of whether they are actually used or not. The user has to\n update "Make.depend" only if\n\n (A) a brand new source code file is introduced into the\n working directory and it participates in the SRCS list\n in the "Makefile" to build the model,\n or\n (B) a source code file gains a new #include statement, which\n includes a file previously not included.\n\n It is not necessary to type make depend every time after\n changing the SRCS list in the "Makefile", say switching from\n "prsgrd.F" to "prsgrd3.F" back and forth, as long as neither\n (A) nor (B) happens.\n\n 5. After step 4 the Makefile becomes fully operational.\n Type\n make\n or\n smake (SGI machines only)\n or\n smake -J 8 (SGI machines only)\n\n to build the model. (Here smake will make individual targets\n in parallel, if multiple processors are available. -J specifies\n the desired number of processors involved, overriding\n the default, for example 8.)\n\nFinal remark:\n===== =======\n\n iv. Once steps 1 and 2 are performed, one can simply type\n\n make all\n\n instead of steps 3, 4, 5. However, doing it in parallel, that\n is "smake all", is not recommended, since the dependency file,\n "Make.depend", is being modified during this procedure.\n\n v. Command "make clean" is recommended when compiler options are\n changed. Otherwise it is unnecessary. "make depend" is\n sufficient most of the time after some of the model source\n codes and .h files were edited.\n}}}\n\n
{{{\nThe strict answer is "yes" and there should be no\nUNEXPLAINED differences.\n\nThis code has the capability of self-verification,\nwhich saves strings from the output of previous\nruns and AUTOMATICALLY compares them with new runs.\n\nThe most typical use of this capability is to verify\nthat there are no parallel bugs. To do so one needs to\n\n 1. execute the code on a single CPU, setting a 1x1\n partition: NSUB_X = NSUB_E = 1 in file param.h\n\n 2. Save some of the lines of the output into the\n file named\n etalon_data.APPLICATION_NAME\n\n [see several actual files which are already\n available with the code for their specific\n format. Also briefly glance at "diag.F" where\n these files are included.]\n\n Typically I save output after the 1st, 2nd,\n 4th, 8th, 16th, 32nd, 64th, 96th, 128th, ....\n etc time step.\n\n 3. Recompile and execute the code again, still\n 1 CPU, 1x1 partition, and observe that lines\n looking like\n\n PASSED_ETALON_CHECK\n\n appear after computation passes the check\n points.\n\n\n 4. Introduce a partition, recompile the whole code\n and execute it again. If everything is correct,\n the PASSED_ETALON_CHECK lines should still be\n present.\n\n Basically this means that global integrals,\n like kinetic energy, are kept the same between\n the control "etalon" run and the test run. The\n accuracy is 11 decimal places, which is close\n to double-precision accuracy.\n\n\n 5. If something goes wrong and the results of the\n two runs differ, the difference is printed using\n a special format in which leading zeros are\n replaced with dots and the decimal point with a\n colon. 
This is done for quick visual reading:\n one can glance at the magnitude of the difference\n without reading it.\n\nMost of the time the difference indicates an inconsistency\nbetween the non-partitioned single-processor run and the\npartitioned one, which is most likely explained by a\nparallel bug, especially at coarse resolution.\n\n\n\nThere are however other reasons; for example, different\ncompiler versions may produce slightly different\nresults,\ndifferent implementations of intrinsic functions and\neven the optimization level may change the roundoff-level\nbehavior.\n\nThe etalon check is extremely sensitive, and actually\nspecial measures are taken to ensure consistent\ncomputation of global sums regardless of the\npartition.\nFor example, a naive code to compute the sum of squares\nof the elements of an array\n\n sum=0\n do i=1,N\n sum=sum+A(i)**2\n enddo\n\nresults in adding a small number to a large number,\nbecause sum may grow and be much larger than the\nindividual contributions. To avoid this problem,\na special algorithm --- summation by pairs --- is\nused in "diag.F" and elsewhere in this code where\nglobal summation takes place. The idea is:\nsuppose one needs to compute the sum of the elements of\nan array A(i) in such a way that only comparable numbers\nare added at every stage. To do so we first add\n\n A(1) + A(2) ---> A(1)\n A(3) + A(4) ---> A(3)\n .....\n A(i) + A(i+1) ---> A(i) for all odd i<N\n\nThen we have an array of A(1) A(3) A(5) ... etc\nwhich is half the size. Do the same thing again:\n\n A(1) + A(3) ---> A(1)\n A(5) + A(7) ---> A(5)\netc\n\nthe result is an array A(1) A(5) A(9) ... 
etc of\none quarter of the original size of A.\nRepeat the above again and again, until it boils\ndown to just a single point.\n\nObviously the above works with the array size being\na power of 2, but actually the algorithm can be\ngeneralized to any number of points ---- there will\nbe just a few "defects" in the reduction tree.\n\ndiag.F contains a 2D version of the reduction\nalgorithm above.\n\nHint: if the dimensions are powers of 2 and the number\nof CPUs used is also a power of 2, the above algorithm\nis ENTIRELY DETERMINISTIC, since the order of\nsummation does not depend on the number of CPUs.\n\n}}}\n
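The pairwise-reduction idea described above can be sketched in a few lines of Python (a sketch, not the diag.F implementation); it handles non-power-of-two lengths via the "defects" mentioned, by carrying an odd element up to the next level:

```python
def pairwise_sum(a):
    """Sum a sequence by repeatedly adding adjacent pairs, so that only
    comparably-sized partial sums are added at every stage."""
    a = list(a)
    while len(a) > 1:
        nxt = [a[i] + a[i + 1] for i in range(0, len(a) - 1, 2)]
        if len(a) % 2:
            nxt.append(a[-1])  # odd element rides along to the next level
        a = nxt
    return a[0] if a else 0.0

print(pairwise_sum([1.0, 2.0, 3.0, 4.0, 5.0]))  # 15.0
```

Because the reduction tree is fixed by the array length alone, the order of summation does not depend on how the domain is partitioned, which is the determinism property the note highlights.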
The ROMS/UCLA code came as roms.tar.\n\nThe steps:\n\n\n 1. Edit "cppdefs.h" and make sure ISWAKE is the only case defined\n 2. Edit "param.h" and set the grid size (360x160x20 or 720x320x20) and the number of tiles (~NP_XI and ~NP_ETA for MPI runs, ~NSUB_X and ~NSUB_Y for ~OpenMP or Scalar runs)\n 3. Edit "ana_grid.F" and define ~GAUSSIAN_SLOPE_ISLAND if desired (if not defined, the default cylinder-shaped island is used). Also set the island radius to either 10 or 5 km.\n 4. Follow the instructions for building ROMS/UCLA below -- basically just edit or create an appropriate file and then edit the Makefile to include that file.\n 5. Run roms: ~OpenMP: {{{roms <}}}; MPI: {{{roms}}}\n\n\nAccording to Sasha:\n\n{{{\n\nI incorporated Charles' contributions related to\nthe Island Wake problem and am able to reproduce\nhis results, while maintaining full computational\nefficiency. (Since I gave my code to Charles back\nin February, a lot of changes took place on both\nends, so that, for example, my code now has\nsmall-size boundary arrays for off-line nesting\ncapability, which it did not have at the time I\ngave it to Charles. This specifically affects the\nIsland Wake problem, because now the inflow--outflow\nboundaries for the Island Wake problem are implemented\nusing this capability.)\n\n...So if you are interested in setting up a problem run\njust to have your dual-Xeon machine running over the\nweekend, we can arrange it now.\n\n\nIndependently of this, I can still work on the code\nfor some time pursuing two goals: making cleaner and\nmore flexible CPP-configuration control (a purely\nlogistical task); looking to improve the downstream\nboundary (I do not think that it will change anything\nsignificantly, but I do see some oscillations in the\nvorticity field near the downstream boundary, which\nI hope I can avoid); also I think I can do something\nin the spirit of partial cells to make a numerically cleaner\nimplementation of masking boundaries. 
Flow around the\ncylinder provides a nice testing problem for this.\n\nPlease let me know.\n\nWith the code I have right now I can fit a 720x320x20\ncomputational grid, running it on a single-processor\n3.2 GHz Pentium 4 with 1 GByte memory. I am getting\nslightly more than 1 model day per 2 CPU hours for this\ngrid. Scientifically interesting runs for this\nproblem are about 30 model days, or 2 days of\ncomputing.\n\n\nSasha\n}}}\n
|!machine | !cpus | !partition | !run | !time | !compiler | !flags |\n|laptop | 1 Xeon | 1x1 | 19200 steps | 13:55 hours| ifort | FFLAGS = -O3 -ip -pc80 -tpp7 -axN -xN -auto -stack_temps -fpp2 -openmp |\n|pikmin | 1 Opteron | 1x1 | 19200 steps |14:14 hours| pgf90 | FFLAGS = -tp k8-64 -Bstatic -fastsse -Munroll=n:4 -Mipa=fast,inline |\n|pikmin | 4 Opteron | 2x2 | 19200 steps |6:16 hours| ifort | FFLAGS = -fpp2 -openmp -xW -auto -stack_temps -O3 -ip |\n|pikmin | 4 Opteron | 2x2 | 19200 steps |xxyy hours| mpif90 | |
On the NCEP RTOFS site\n\nOn the Ocean Nomads RTOFS site:\nIt directs the user to \n\nBut GRIB1 files start off the archive on 2006-05-31:\n\n\nThese early files have lon,lat as 1d vectors with uniform spacing, which is incorrect.\n\nBeginning with 2007-06-06 the file size goes from 15MB to 42MB:\n\nand we find that NetCDF-Java can't read the files, with the error:\nUnknown Grid Type : 204\nucar.grib.NoValidGribException: GDS: Unknown Grid Type : 204) is not supported.\n\n\nGRIB2 files (that NetCDF-Java reads okay) don't start until 2008-01-09:\n\n\n
NATO NURC located all of the original 10 RADARSAT CEOS images, with the exception of the image from Feb 12. Frustrating, since the Feb 12 image is one of the original 3 that we used to showcase the Bora in presentations and in the NATO internal report by Askari & Signell.\n\nLuckily, we have the original (unfiltered) NetCDF file from Feb 12 that Farid processed containing "longitude","latitude", "sigma0" and "Inc_angl". So the question is how to convert these sigma0 and incidence angles to be compatible with the ones that Jochen Horstmann is producing.\n\nFirst we compare the Jan 26 image, since we have both the unfiltered NetCDF from Farid, and the CEOS file.\n\nFor Jan 26, the dimensions of the CEOS image are: 10976, 12064, with a grid spacing of 50 m. There seems to be only 1 variable, which ranges from about 15 (black) to 255 (white).\n\nFor Jan 26, the dimensions of the NetCDF file are: 5000,5000 (darn!). But the grid spacing is 50 m, so this just must be a clipped version. The values of sigma0 range from 0 to 2.0235. The values of incidence_angle range from 23.6 to 41.7 (degrees).\n\nFor Jan 26, Jochen's file is 1799x1666, which is not 10976x12064/6. So I guess he clipped a bit also. The range of "nrcs" is from 0 to 2.2914, which seems pretty close to the NetCDF file, so that is good.\n\nTalked to Jochen finally, and he said it would be hard to believe that Farid's sigma0 is incorrect, because that's very easy to extract and calculate from the CEOS files. We decided that we will block average sigma0 to the 300 m grid (6x6 averaging), even though Jochen averaged before he calculated sigma0. Jochen says this difference should be very small. 
Then Jochen will calculate the satellite look angle and generate a file that looks like the others he produced from CEOS format.\n\n|!Date|!Quality|\n|Jan 23| Good|\n|Jan 26| Excellent (after surgery)|\n|Jan 29| Poor|\n|Jan 30| OK |\n|Feb 2| Excellent (after surgery)|\n|Feb 5| OK |\n|Feb 9| Good, but very complicated to South|\n|Feb 15| Strange|\n|Feb 16| Excellent (after surgery)|\n
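The agreed 6x6 block average of sigma0 (50 m pixels onto the 300 m grid) can be sketched as follows (pure Python on a toy array; the real scenes are ~10976x12064, which would reduce to roughly 1829x2010, so Jochen's 1799x1666 file implies some extra clipping, consistent with the note above):

```python
def block_average(a, n):
    """Average a rows-by-cols nested list over n x n blocks,
    discarding any partial blocks at the edges."""
    rows, cols = len(a) // n, len(a[0]) // n
    out = []
    for i in range(rows):
        row = []
        for j in range(cols):
            block = [a[i * n + di][j * n + dj]
                     for di in range(n) for dj in range(n)]
            row.append(sum(block) / (n * n))
        out.append(row)
    return out

# toy 12x12 "sigma0" field with value i+j at cell (i, j)
a = [[float(i + j) for j in range(12)] for i in range(12)]
avg = block_average(a, 6)
print(len(avg), len(avg[0]))  # 2 2
```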
20,904 entries harvested by "Import THREDDS catalog"\n\n37,000 harvested when I tried creating a scheduled harvester, but metadata didn't come through.\n\nTrying again with a normal harvester:\n 0 harvest started at 8:45 am EST\n13,414 harvested by 9:51 am EST
Becca was having some trouble reading Landsat geotiff data from USGS into Matlab on a Mac, so I took a look. She was hoping to use "read_geotiff.m", but that was failing. So she gave me some files to take a look.\n\nI unzipped the 270MB (which came from into c:/rps/landsat, which made a directory with 9 geotiffs. \n\nUsing the FWTOOLS shell, I took a look at the metadata for one of the images using "gdalinfo". Note: Although FWTOOLS isn't available for MacOS, it appears that binaries for GDAL (which is all one would need) are at: <>\n{{{\nc:\sRPS\slandsat\sLE70100112010188EDC00>gdalinfo L71010011_01120100707_B10.TIF\nDriver: GTiff/GeoTIFF\nFiles: L71010011_01120100707_B10.TIF\nSize is 8851, 8281\nCoordinate System is:\nPROJCS["WGS 84 / UTM zone 22N",\n GEOGCS["WGS 84",\n DATUM["WGS_1984",\n SPHEROID["WGS 84",6378137,298.257223563,\n AUTHORITY["EPSG","7030"]],\n AUTHORITY["EPSG","6326"]],\n PRIMEM["Greenwich",0],\n UNIT["degree",0.0174532925199433],\n AUTHORITY["EPSG","4326"]],\n PROJECTION["Transverse_Mercator"],\n PARAMETER["latitude_of_origin",0],\n PARAMETER["central_meridian",-51],\n PARAMETER["scale_factor",0.9996],\n PARAMETER["false_easting",500000],\n PARAMETER["false_northing",0],\n UNIT["metre",1,\n AUTHORITY["EPSG","9001"]],\n AUTHORITY["EPSG","32622"]]\nOrigin = (369600.000000000000000,7846500.000000000000000)\nPixel Size = (30.000000000000000,-30.000000000000000)\nMetadata:\n AREA_OR_POINT=Point\nImage Structure Metadata:\n INTERLEAVE=BAND\nCorner Coordinates:\nUpper Left ( 369600.000, 7846500.000) ( 54d32'5.33"W, 70d41'20.10"N)\nLower Left ( 369600.000, 7598070.000) ( 54d11'3.95"W, 68d27'53.64"N)\nUpper Right ( 635130.000, 7846500.000) ( 47d20'14.25"W, 70d41'11.06"N)\nLower Right ( 635130.000, 7598070.000) ( 47d42'1.06"W, 68d27'45.61"N)\nCenter ( 502365.000, 7722285.000) ( 50d56'21.05"W, 69d36'32.82"N)\nBand 1 Block=8851x1 Type=Byte, ColorInterp=Gray\n}}}\nSo this is byte data, with 30 m spacing in UTM meters, so no big surprise that it didn't work 
with read_geotiff.m, which requires uniform lat/lon spacing. So let's warp the image to uniform lat/lon spacing (EPSG code 4326), which is also what Google Earth likes, using "gdalwarp":\n{{{\nc:\sRPS\slandsat\sLE70100112010188EDC00>gdalwarp L71010011_01120100707_B10.TIF -t_srs EPSG:4326 foo.tif\n}}}\nBy default, this uses "nearest" interpolation, which is usually what I want, but you can do "gdalwarp --help" to see other options.\n\nThe file "foo.tif" can then be loaded into Matlab using read_geotiff.m:\n{{{\n>> [lon,lat,z]=read_geotiff('foo.tif');\n>> imagesc(lon,lat,z)\n>> set(gca,'DataAspectRatio',[1 cos(mean(lat)*pi/180) 1]); %approx aspect ratio\n}}}\n\n\n[img[Matlab plot|]]\n[img[Google Earth overview|]]\n[img[Google Earth zoom|]]\n\n
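The DataAspectRatio correction in the Matlab snippet follows because a degree of longitude spans cos(lat) times the meters of a degree of latitude; a quick check near the image center (~69.6 N), assuming a spherical Earth:

```python
import math

def meters_per_degree(lat_deg):
    """Approximate meters per degree of (longitude, latitude) on a sphere."""
    r = 6371000.0  # mean Earth radius, m
    per_deg = r * math.pi / 180.0
    return per_deg * math.cos(math.radians(lat_deg)), per_deg

mlon, mlat = meters_per_degree(69.6)
print(round(mlon / mlat, 3))  # cos(69.6 deg) ~ 0.349
```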
If you have the NetCDF Toolbox installed you can run this test as is to grab the topo data from opendap, then write deflated, chunked output to NetCDF4 and read it back in.\n\nIf you want to just load a .mat file instead and test NetCDF4 writing and reading, grab the script and .mat file from \n\nand then type "netcdf4_test".\n\n{{{\n% netcdf4_test.m\n% Test writing and reading NetCDF4 with chunking and compression using\n% Native Matlab routines (tested in Matlab 2010b)\n% Rich Signell \n\nfilename=''\nif 1\n % read data from OpenDAP using NJ Toolbox (\n url='';\n nc=mDataset(url);\n topo=nc{'topo'}(1:12:end,1:12:end);\n g=nc{'topo'}(1:12:end,1:12:end).grid;\n topo(topo<0)=0;\n lon=g.lon;\n;\n save topo.mat topo lon lat\nend\n% or load previously save mat file\nload topo.mat\n[ny,nx]=size(topo);\nsubplot(211);pcolor(lon,lat,double(topo));shading flat;caxis([0 5000])\ntitle('Topo from Mat file');\n%%\n% write NetCDF4 with chunking & compression (deflation)\n\nncid = netcdf.create(filename,'NETCDF4');\nlatdimid = netcdf.defDim(ncid,'lat',ny);\nlondimid = netcdf.defDim(ncid,'lon',nx);\nvarid = netcdf.defVar(ncid,'topo','short',[latdimid londimid]);\nlonid = netcdf.defVar(ncid,'lon','float',[londimid]);\nlatid = netcdf.defVar(ncid,'lat','float',[latdimid]);\n\nnetcdf.defVarChunking(ncid,varid,'CHUNKED',[180 360]);\nnetcdf.defVarDeflate(ncid,varid,true,true,5);\nnetcdf.putAtt(ncid,latid,'units','degrees_north');\nnetcdf.putAtt(ncid,lonid,'units','degrees_east');\nnetcdf.putAtt(ncid,varid,'units','m');\n%netcdf.putAtt(ncid,varid,'missing_value',int16(-32767));\n\nnetcdf.putVar(ncid,lonid,[0],[nx],lon(1:nx));\nnetcdf.putVar(ncid,latid,[0],[ny],lat(1:ny));\nnetcdf.putVar(ncid,varid,[0 0],[ny nx],topo(1:ny,1:nx));\n\nnetcdf.close(ncid);\n\n% read NetCDF4 file\,'nowrite');\nvarid=netcdf.inqVarID(ncid,'topo');\ntopo2=netcdf.getVar(ncid,varid,[0 0],[ny nx]);\nnetcdf.close(ncid);\nsubplot(212);pcolor(lon,lat,double(topo2));shading flat;caxis([0 5000])\ntitle('Topo from NetCDF4 
file');\n}}}\n\n\n
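NetCDF-4 deflation is zlib under the hood, so the effect of the level-5 setting passed to netcdf.defVarChunking/defVarDeflate above can be previewed with Python's zlib on smooth int16 data (a rough illustration; actual ratios depend on the data and the chunk shapes):

```python
import struct
import zlib

# a smooth int16 ramp, loosely like a topo chunk: compresses well
values = [i // 10 for i in range(1000)]
data = struct.pack('<1000h', *values)      # 2000 bytes of little-endian shorts
packed = zlib.compress(data, 5)            # level 5, as in defVarDeflate(...,5)
print(len(data), len(packed))              # packed is far smaller than 2000
```

Deflation is applied per chunk, which is why the chunk shape (here [180 360]) matters: chunks that match typical read patterns avoid decompressing data you don't need.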
WebEx Recording and Playback\n\nPlease feel free to add your knowledge to this page, correct errors, and/or improve clarity.\n\nWebEx's network-based recording feature enables a meeting host to capture WebEx session content, both visual and audio, for later playback. This page describes the simplest (of multiple) techniques to record (at no direct cost and with no additional hardware or software) both the WebEx "shared desktop" activity and teleconference audio by "connecting WebEx" to the audio/phone bridge.\n\nThe recordings are stored on WebEx servers in arf (Advanced Recording Format), a proprietary format. They can be played back by anyone having the URL to the recording. The host is e-mailed a URL to the recording within 30 minutes after the recording is ended. This URL can be forwarded or posted to a Web site or wiki site. The host can access all recordings by logging into WebEx, clicking on the "Meeting Center" tab, and clicking on "My Recorded Meetings."\n\nArf files can be converted to Windows Media File (.wmv) format or to Flash (.swf) format through the WebEx Network Recording Player. The Player is available by logging into WebEx, clicking on the "Meeting Center" tab, and clicking on "Support", "Downloads", and "Recording and Playback".\nRecording\n\nThese instructions should work for both the RestonTalk Audio Bridge and audio bridge services.\n\n 1. Start the WebEx session\n 2. Click on "Record" in the WebEx meeting window, or click on "Meeting" and "Start Recording."\n 3. ...\n 4. On the WebEx Recorder (Record on Server) Setup window, enter the blue responses to the prompts:\n * Dial-in number: enter your audio bridge number, with any preceeding 1, 8, or 9.\n (Signell free conference:
Use to reduce color depth of geotiff and other gdal files, e.g. 24 bit geotiff to 8 bit:\n\nWindows: From the FWTOOLS shell:\n{{{\nrgb2pct.bat vs_bathy_shaded.tif vs_bathy_shaded_8bit.tif\n}}}\n
from Norm Vine:\n\nI can mod a few files to give you my additions for histogram massaging etc\nI can get you set up sometime\n\nbut you can play as is; see the "-align similarity" command\nbut you probably want to play with the affine and perspective options too\n\n\nbut don't add the images together, and always align against the same image\ne.g.\n{{{\nImageStack -load a.jpg -load b.jpg -align similarity -save b_X.jpg\n}}}\n\nyou can use the attached as a guide to how I balanced the 'palette' as\nyou don't have my histoadapt operator, so you can use something like\n \n-gamma X.X \s\n-eval "Y.Y*[c]" \s\n-eval "(0.5/mean())*[c]"\n\nwhere X.X and Y.Y are appropriate scalars\nuse your imagination, imagestack is a cool toy\n\nuse -display instead of -save while debugging\n\n\n\nI use python to generate the scripts that get passed to ImageStack using\nstring templates
!~ImageStack\n\nNorm first pointed me at ImageStack (, which is not python, but a command-line driven executable that can align images via commands like this:\n{{{\nc:\sprograms\sImageStack\sImagestack.exe -load b105\sBTV_TIFF_001.tif \s\n-load b105\sBTV_TIFF_020.tif -align similarity -save .\sfix\sBTV_TIFF_020.tif\n}}}\nThis worked great for the 5 images that Norm tried, but not so great for the 20 images I tried where there was a cable waving around. Plus I'd like a python-based approach where I know what I'm doing.\n\n!ICP\n\nNorm also pointed me toward ICP:\n\nwhich seeks to register two images on disk. This is pure python, but requires loading images from disk, which seems clumsy, and also doesn't seem to output the transformation matrix, which we need because we are going to register images that have all the ripples masked out, so that only the outlines of the images remain, and then use the transformation matrices to affine-transform the original non-masked images.\n\n!ITK\nGoogled this one. The C++ based ITK ( and Python bindings in WrapITK ( are extensive image processing tools by Kitware, with code to do affine registration here:\n\n!IRTK (Image Registration Toolkit)\n\nThis isn't python at all, and has no C bindings. But here the transformation matrix can be obtained as an output, where imagestack didn't seem to do this. The info on the affine transformation to register two images is given here:\n\nIf we use this, we need to mention: "The Image Registration Toolkit was used under Licence from Ixico Ltd."\nIf we use the affine registration, we should reference:\n"C. Studholme, D.L.G. Hill, D.J. Hawkes, An Overlap Invariant Entropy Measure of 3D Medical Image Alignment, Pattern Recognition, Vol. 32(1), Jan 1999, pp 71-86."\n
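Whichever toolkit produces the transformation matrix, applying it to the original non-masked images amounts to mapping each pixel coordinate through a 2x3 affine matrix; a minimal pure-Python sketch (the matrix here is a made-up example, not a real registration result):

```python
def apply_affine(points, m):
    """Apply a 2x3 affine matrix [[a, b, tx], [c, d, ty]] to (x, y) points."""
    (a, b, tx), (c, d, ty) = m
    return [(a * x + b * y + tx, c * x + d * y + ty) for x, y in points]

# hypothetical registration result: a pure translation of (3, -2) pixels
m = [[1.0, 0.0, 3.0],
     [0.0, 1.0, -2.0]]
print(apply_affine([(10.0, 10.0)], m))  # [(13.0, 8.0)]
```

Keeping the matrix separate from the resampling step is exactly why a toolkit that outputs the matrix (like IRTK) fits the masked-outline workflow better than one that only writes the warped image.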
/***\n| Name:|RenameTagsPlugin|\n| Description:|Allows you to easily rename or delete tags across multiple tiddlers|\n| Version:|3.0 ($Rev: 1845 $)|\n| Date:|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source:||\n| Author:|Simon Baird <>|\n| License||\nRename a tag and you will be prompted to rename it in all its tagged tiddlers.\n***/\n//{{{\nconfig.renameTags = {\n\n prompts: {\n rename: "Rename the tag '%0' to '%1' in %2 tidder%3?",\n remove: "Remove the tag '%0' from %1 tidder%2?"\n },\n\n removeTag: function(tag,tiddlers) {\n store.suspendNotifications();\n for (var i=0;i<tiddlers.length;i++) {\n store.setTiddlerTag(tiddlers[i].title,false,tag);\n }\n store.resumeNotifications();\n store.notifyAll();\n },\n\n renameTag: function(oldTag,newTag,tiddlers) {\n store.suspendNotifications();\n for (var i=0;i<tiddlers.length;i++) {\n store.setTiddlerTag(tiddlers[i].title,false,oldTag); // remove old\n store.setTiddlerTag(tiddlers[i].title,true,newTag); // add new\n }\n store.resumeNotifications();\n store.notifyAll();\n },\n\n storeMethods: {\n\n saveTiddler_orig_renameTags: TiddlyWiki.prototype.saveTiddler,\n\n saveTiddler: function(title,newTitle,newBody,modifier,modified,tags,fields) {\n if (title != newTitle) {\n var tagged = this.getTaggedTiddlers(title);\n if (tagged.length > 0) {\n // then we are renaming a tag\n if (confirm(config.renameTags.prompts.rename.format([title,newTitle,tagged.length,tagged.length>1?"s":""])))\n config.renameTags.renameTag(title,newTitle,tagged);\n\n if (!this.tiddlerExists(title) && newBody == "")\n // dont create unwanted tiddler\n return null;\n }\n }\n return this.saveTiddler_orig_renameTags(title,newTitle,newBody,modifier,modified,tags,fields);\n },\n\n removeTiddler_orig_renameTags: TiddlyWiki.prototype.removeTiddler,\n\n removeTiddler: function(title) {\n var tagged = this.getTaggedTiddlers(title);\n if (tagged.length > 0)\n if 
(confirm(config.renameTags.prompts.remove.format([title,tagged.length,tagged.length>1?"s":""])))\n config.renameTags.removeTag(title,tagged);\n return this.removeTiddler_orig_renameTags(title);\n }\n\n },\n\n init: function() {\n merge(TiddlyWiki.prototype,this.storeMethods);\n }\n}\n\nconfig.renameTags.init();\n\n//}}}\n\n
It seems that the time to compile ROMS is getting longer and longer.\nOn a single processor, the default build.bash takes:\nROMS Rutgers SVN version 119: 9.85 minutes; build.log has 1,123 lines\nROMS Rutgers SVN version 72: 6.68 minutes; build.log has 1,567 lines\n
[Note: Roy also told me they do not restart tomcat on a regular schedule. They monitor the server with jconsole and watch the memory and number of threads. When they start approaching 90% of max, they restart tomcat.]\n\nHi Rich:\n\nOn one machine we just use:\n\nJAVA_OPTS='-server -Xms1500m -Xmx1500m'\n\nIt runs rock solid and is used heavily. It however does not have any remote catalogs, in particular ones that at times produce errors. Unidata has never been able to duplicate it, but we find the combination of those two things appears to make TDS start to throw errors, and the garbage collection does not clean those up well. Then issues arise with tomcat being unable to create new threads.\n\nOn another machine which has a lot of external links we have changed settings over time with new versions of TDS; we are presently using\n\nJAVA_OPTS='-server -Xms1000m -Xmx1500m -XX:MaxPermSize=128m -Xmn256m -XX:SurvivorRatio=16'\n\nResults still are not as good as we would like. I used to have other settings for the garbage collection but I can't remember what they are. I am not certain they make a difference - there was a period where we had a lot of instability and then we added the garbage stuff and things were stable for a while, but it does seem to change with each version. Running Java 1.6 and Tomcat 6 seems to matter also. We have noticed that, say, when NCDC becomes unreachable then the errors from that seem to pile up and soon we can not create a new thread and the entire system breaks.\n\nHere is one of the pages I started with (I searched under "tuning java virtual machine") and followed a lot of the links:\n\n\n\nHTH,\n\n-Roy
It looks like the run time for LTRANS is proportional to the size of the input grid, based on a simple test I did.\n\nThe original SABGOM grid and history file are 440x320. I used Dave Robertson's bash script that uses NCO to cut a subset. See: [[Cutting a ROMS file]]\n\nWe cut both the grid and history file to 84x63, limiting the grid to a region about 40 km from the spill location (since we were just playing with short-term simulations of a few days anyway). The actual ncks command that got executed by Dave's script was:\n{{{\nncks -F -d xi_rho,149,211 -d eta_rho,198,281 -d xi_u,149,210 -d eta_u,198,281 -d xi_v,149,211 -d eta_v,198,280 -d xi_psi,149,148 -d eta_psi,281,280 ../\n}}}\n\nThe result was impressive: the run using the whole grid took 24 minutes, while the subset grid run took 40 seconds!\n\ntotal grid cells to subset grid cells: (440*320)/(84*63) = 26\ntotal grid runtime to subset grid runtime: (60*24)/40 = 36\n\nI'm not sure why it was even faster than one would expect from scaling with the number of grid cells - perhaps the smaller sizes fit in cache better.
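The scaling arithmetic above can be double-checked with a quick script (all numbers are taken from the test described above):

```python
# Compare the grid-cell ratio against the observed runtime ratio.
full_cells = 440 * 320      # original SABGOM grid
sub_cells = 84 * 63         # subset grid
cell_ratio = full_cells / sub_cells      # ~26.6

full_runtime_s = 24 * 60    # whole-grid run: 24 minutes
sub_runtime_s = 40          # subset run: 40 seconds
runtime_ratio = full_runtime_s / sub_runtime_s   # 36.0

print(cell_ratio, runtime_ratio)
```

The runtime ratio (36) exceeding the cell-count ratio (~26.6) is what suggests a super-linear effect such as cache fitting.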
Following the WRF Users Guide:\n\n\nGetting MET Forcing: \nWent to\n\nand used the "ftp4u" service to obtain subsetted GRIB files from NAM\nused bounds [-
Here is the query:\n\n<,36,144,40&observedproperty=sea_water_salinity&responseformat=text/csv&eventtime=2011-03-28T00:00Z/2011-04-11T00:00Z>\n\nbut it doesn't work from outside for just gliders and ships; note this is network = all.\n\nCurrently we offer to the outside a single glider or ship by id and by specific time range, etc. You are getting a sneak peek from the url above of what is coming in the near future, which will include gliders and ships.\n\nBTW in our production services a bounding box query as above returns no obs in that bounding box from any platform type.\n
Guns, Germs, and Steel\nSex, Bombs and Burgers...\n\n...Wire, Glue and Grandpa\n\nHow about elevating/redefining "Community Impact" on RGE? Letters from the external community to tell the story about impact. \n\nGet an NRC panel to recommend a strategy. $50K. \n\nKevin:\nGeologic Mapping was worried about being under Core Science Systems, that the science in mapping might be lost.\n\n
\nThe tomcats (one for each app) are at\n/var/www/...\n\nThe tomcat for THREDDS is\n/var/www/apache-tomcat-6.0.29/\n\nThe tomcat for ncWMS is\n/var/www/apache-tomcat-6.0.29-ncwms\n\nThe tomcat for RAMADDA is\n/var/www/apache-tomcat-6.0.29-ramadda\n\nThe RAMADDA repository is at\norig: /home/tomcat/.unidata/repository/derby/repository\nnow: /data/ramadda/derby/repository\n\nThe tomcat for ERDDAP is \n/var/www/apache-tomcat-6.0.29-erddap\n\naccessed by users at: \n\n\n\n\n\nSample datasets for CI are at\n/data/ftp/upload/Catalog\n\nUser uploaded datasets are at:\n/data/ftp/upload/Shelf_Hypoxia\n/data/ftp/upload/Inundation\n/data/ftp/upload/Estuarine_Hypoxia\n\n
Example of ADCP data with extra variables\n\n\nSimple seacat file\n
This URL returns a KMZ animation for just these times from the WMS:\n{{{\n,2009-12-09T00:00:00.000Z,2009-12-10T00:00:00.000Z,2009-12-11T00:00:00.000Z&TRANSPARENT=true&STYLES=BOXFILL%2Frainbow&CRS=EPSG%3A4326&COLORSCALERANGE=0%2C5&NUMCOLORBANDS=254&LOGSCALE=false&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetMap&EXCEPTIONS=XML&FORMAT=application/
Sasha says there are 3 factors that compete in figuring out optimal tiling:\n* vectorization\n* communication\n* fitting in cache\n\nWe want long strips along the "I" direction since this is the fastest varying dimension (inner loop). But very long thin strips require lots of communication. Also we want to fit in cache. These things compete, but as a rough rule of thumb, strips of about 100x6 (100 in I, 6 in J) are often optimal, and tiling should be designed accordingly. So for a 100x30 grid (I=100, J=30), the tiling should be\nNtileI=1\nNtileJ=5\n
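A small helper to apply this rule of thumb (the 100x6 target strip is the rough guideline above; the rounding choices are my own assumption, not Sasha's):

```python
def suggest_tiling(ni, nj, strip_i=100, strip_j=6):
    """Suggest (NtileI, NtileJ) so each tile is roughly strip_i x strip_j.

    Rule of thumb: long strips in I (the fastest-varying dimension),
    about 6 points in J. Always at least 1 tile in each direction.
    """
    ntile_i = max(1, round(ni / strip_i))
    ntile_j = max(1, round(nj / strip_j))
    return ntile_i, ntile_j

print(suggest_tiling(100, 30))  # the 100x30 example above: (1, 5)
```

Actual optimal tiling still depends on the machine, so this is only a starting point for experimentation.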
Looking at vorticity with NCVIEW:\n\n{{{\nGet:\n\nand\n\n}}}\nThe former is a "compile once, use forever" operator function\nwhich computes vorticity of a ROMS solution stored as a sequence of\nnetCDF files. Just untar it, say "make" within that directory, then place the\nexecutable file "vort" into your bin directory. After that you can say\n{{{\n vort wake_his*.nc\n}}}\nand, one minute later,\n{{{\n ncview\n}}}\nThe other file is ncview slightly patched by me to insert color palettes I like into the code,\nand to adjust its defaults, so that I have to do less mouse clicking later. It fully retains all other\nfunctionality of the ncview of David W. Pierce. You may decide to recompile ncview (it is\nas easy as configure -> make -> make install), or just use the palettes:\n{{{\n bright.ncmap\n rainbow.ncmap\n}}}\n[note: in my patched code I also have corresponding C headers colormaps_bright.h and\ncolormaps_rainbow.h along with changes in the main code]\n\nIf you have both of them working, you can get a running movie of vorticity from a ROMS\nsolution within one minute.
/***\n| Name|SaveCloseTiddlerPlugin|\n| Description|Provides two extra toolbar commands, saveCloseTiddler and cancelCloseTiddler|\n| Version|3.0 ($Rev: 2134 $)|\n| Date|$Date: 2007-04-30 16:11:12 +1000 (Mon, 30 Apr 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\nTo use these you must add them to the tool bar in your EditTemplate\n***/\n//{{{\nmerge(config.commands,{\n\n saveCloseTiddler: {\n text: 'done/close',\n tooltip: 'Save changes to this tiddler and close it',\n handler: function(e,src,title) {\n config.commands.saveTiddler.handler(e,src,title);\n config.commands.closeTiddler.handler(e,src,title);\n return false;\n }\n },\n\n cancelCloseTiddler: {\n text: 'cancel/close',\n tooltip: 'Undo changes to this tiddler and close it',\n handler: function(e,src,title) {\n config.commands.cancelTiddler.handler(e,src,title);\n config.commands.closeTiddler.handler(e,src,title);\n return false;\n }\n }\n\n});\n\n//}}}\n\n
From Brian Sibley:\n\nGo to:\n"LAYERS"\n"Manage Seafloor Databases"\n( look now at bottom left )\n"create"\nSet for resolution of 5.6 meters (or greater if you wish - remember GPS accuracy)\nHit "OK" and answer "yes" to shift bottom calculation\n\n\nThe "active" layer is the one currently recording, shown with "active" beside it.\nIt will create new data as if it was the first time being used. Now you can flip back and forth between datasets to view them, but still record on the new one.\n
Okay, you could use RAMADDA, but here's a solution using GI-CAT, which can also harvest from many other sources, not just the TDS. \n\nFor providers:\n1. Make sure you are using TDS version 4.2.6 or higher\n2. Follow the instructions on this page for setting up ncISO and\nadding the ncISO service to your THREDDS catalogs:\n\n3. Install GI-CAT from\n(just download the .war file and deploy it like the TDS)\n4. Watch this youtube video as a quick start for how to set up GI-CAT\nto harvest from THREDDS catalogs:\n\nFor consumers:\n1. MATLAB: opensearch.m file to query GI-CAT and return OPeNDAP URLs\n
All the tiles for the NOAA Coastal Relief Model exist at WHSC on custer in Arc .adf format. Our strategy here is to merge all the tiles for each volume into a single GeoTIFF using gdal_merge from the FWTOOLS, and then convert to NetCDF. The final step is to add lon/lat values via the THREDDS Data Server (TDS) catalog so that they can be served as CF-1.0 compliant data sets. \n\nThe method I've used is a bit convoluted, but it works. \n\nOn a PC at WHSC, first mount \s\scusterdtxp\ssharedata as drive x.\n\nThen fire up a cygwin window and execute this "do_make_bat_files" script, which creates a batch file for each volume that will be used to assemble the tiles. \n{{{\n#!/bin/bash\n#\n# DO_MAKE_BAT_FILES\n#\n# Step 1: mount \s\scusterdtxp\ssharedata\scrmgrd as drive X:\n# find /cygdrive/x/crmgrd/volume1 -name w001001.adf > volume1.list\n# find /cygdrive/x/crmgrd/volume2 -name w001001.adf > volume2.list\nfor num in
The tinyurl for this page is:\n\nPeter Schweitzer pointed me at some magnetic anomaly data for Wyoming at\n\nas a sample of USGS gridded data on the web that might be amenable to delivering via the THREDDS Data Server. So here's a brain dump of what I did to get it going with the TDS.\n\nThe data files are available in ARC ASCII, ERM, GXF and ODDF format. I went with the GXF since it looked like it had the most amount of metadata about the projection. \n\nI used the FWTOOLS tools "gdal_translate" to convert from GXF to NetCDF:\n{{{\ngdal_translate magfinal.gxf\n}}}\n\nBecause the GXF used a non-standard form and because gdal_translate does not produce CF compliant NetCDF, I then created an NcML file (by hand) that allows the netcdf file to be CF compliant:\n{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<netcdf xmlns="" \n location="C:/RPS/cf/peter/WY/">\n <attribute name="Conventions" value="CF-1.0" />\n <variable name="Lambert" shape="" type="char">\n <attribute name="grid_mapping_name" value="lambert_conformal_conic" />\n <attribute name="standard_parallel" type="float" value="33.0 45.0" />\n <attribute name="longitude_of_central_meridian" type="float" value="-
Start:1030EDT\n\nThe PRISM climate data is available from the web site as individual files: one ARC ASCII Grid for each variable for each year. My brother Steve Signell, who does GIS work for the State of New York and the Adirondack Park, had already downloaded most of the files and arranged them in directories by decade. So as a test, he sent me two decades of data (using DropSend!):\n{{{\\\n}}}\nwhich I unzipped into c:/rps/cf/prism. The individual file names are like\n{{{\n/cygdrive/c/rps/cf/prism/ppt/
Installed TDS 4.2.9 on Jordan's nomad3 machine:\n{{{\n[wd23ja@nomad3 bin]$ pwd\n/home/wd23ja/tomcat/apache-tomcat-6.0.32/bin\n}}}\nThis can be accessed at:\n\n\nWe set up a datasetScan using wildcards, but the default filter I usually use didn't work because they don't name the output files with .grib or .grib2 extensions (because traditionally they only had grib files at NCEP).\n\nThe next order of business is to aggregate the grib data. \n\n
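One way to handle the extension-less file names is an explicit filter in the datasetScan. A sketch (the name, path, and location below are made up for illustration, not the actual nomad3 config):\n{{{\n<datasetScan name="NCEP grib output" path="ncep" location="/data/ncep/">\n  <filter>\n    <!-- match everything, since the files have no .grib/.grib2 extension -->\n    <include wildcard="*"/>\n  </filter>\n</datasetScan>\n}}}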
{{{\n<?xml version="1.0" encoding="UTF-8"?>\n<netcdf xmlns="">\n <aggregation dimName="ocean_time" type="joinExisting">\n <scan location="c:/rps/cf/roms" regExp=".*vs_his_[0-9]{4}\$"/> \n </aggregation>\n</netcdf>\n}}}
I bought a bottle of Zubrowka (bison grass vodka) in the US and after one sip, dumped it down the drain -- it was nothing like the real thing. It turns out the real stuff contains small amounts of coumarin, a blood thinner, so it is outlawed by the FDA. The stuff for sale in the US tries to recreate the original taste with artificial flavor and fails miserably. \n\nI found another recipe for homemade Zubrowka on the web, but it seemed more complicated than it needed to be, so I tried my own technique. I think the result is outstanding (naturally). This simple recipe below will make three 750 ml bottles of Zubrowka:\n\nIngredients:\n\n- 15 oz of 190 proof grain alcohol - e.g. Everclear\n- 1 30 inch sweet grass braid - e.g.\n- 3 bottles of 80 proof vodka\n- 1/2 cup simple syrup (boil 1/2 cup water and 1/2 cup sugar for 1 minute)\n\nSet aside 3 bottle-sized pieces of grass to use in the finished product.\n\nCut up the braid into 4 pieces, each 6 inches long, and put in a mason jar with 12 oz of grain alcohol. \n\nAfter 2 days, remove the grass -- the alcohol will be a beautiful green color and smell great. \n\nMake a simple syrup by bringing 1/2 cup water and 1/2 cup sugar to a full boil for 1 minute.\n\nTo make the Zubrowka, mix in this ratio: \n1 shot infused grain alcohol \n1 shot water\n4 shots 80 proof vodka\n2 tsp. simple syrup\n\nTo make a 750 ml bottle, which is just about 24 oz, you want:\n\n3 shots infused grain alcohol\n3 shots water\n12 shots (2 1/4 cups) 80 proof vodka\n2 Tbl. simple syrup\n\nMix together and pour through a coffee filter into a clean vodka bottle. \nAdd one blade of grass into the bottle for show.\n\nEnjoy! I brought this to my Polish friend's house, who brought up a bottle of the real thing from Poland and we tried them side by side. They were definitely a bit different, but both delicious. In fact, my friend said he liked the homemade one better!
a TiddlyWiki approach
Rich Signell's Random Tidbits
We want to be able to reproduce Figure 5 in the Comp GeoSci paper, the migrating trench test case.\n\nLooking at John's folder for the paper\n\s\sJohwardtxp\sdata\spapers\scomm_sed_Comp_Geosci\nwe see that for the Figure 5 files,\n"trench.eps" was created at 10:11 PM on Feb 15, 2007.\n"" was created at 10:29 PM on Feb 15, 2007. This matches Figure 5 in the paper.\nLooking at "trench.eps", there is a title "morph 60" that does not appear in the Comp GeoSci paper. \n\nWe could not find the m-file that generated "trench.eps" by searching for "morph 60", but we did find a "trench.m" m-file that produces a "trench.eps" file. \n\nWe are not completely certain that trench.eps was based on morph 60, but using trench.m with "" found in the directory:\n\s\sJohwardtxp\sdata\smodels\sroms\shelp_cases\strench2\nmatches the figure quite well, provided we add 0.02 m to the observed final bed location when making the plot. The version of "trench.m" already added 0.01 m to the final bed, and contained a comment:\n"added .01 because fig 12.4.8 shows init elev at 0.01."\nOr perhaps the observed data was simply moved or smoothed slightly in Adobe Illustrator.\n\nIn any case, it seems unlikely that the morph 10 run was actually used, as documented in the paper, because the "" file has a date of 1:12 AM on Feb 16, about 3 hours after the figure had already been created.\n\nSo if morph60 was used, what were the parameters for the run? Do they match the paper?\nIn the morph60 file, we find:\n :his_file = "" ;\n :grd_file = "ROMS/External/" ;\n :ini_file = "ROMS/External/" ;\nPerusal of this ini_file shows that it is not a ROMS restart file saved after 1000 steps, as the paper would indicate, but a file generated by Matlab. The 3D field "U" has a logarithmic structure and the initial sediment concentration is a uniform 0.2 kg/m3.\n\nSo if this is the right output file, what parameters were likely used in the run? 
It seems likely that the actual code used is \n\s\sJohwardtxp\sdata\smodels\sroms\shelp_cases\strench2\\nIn this code distribution, cppdefs.h is indeed set to "trench", and \nsediment.F\n{{{\n#include "cppdefs.h"\n#undef NEUMANN\n#undef LINEAR_CONTINUATION\n#undef REMIX_BED\n#undef SLOPE_NEMETH\n#define SLOPE_LESSER\n#define BSTRESS_UPWIND\n}}}\n\nIn cppdefs.h, the TRENCH options are:\n{{{\n# elif defined TRENCH\n\n/*\n** Trench migration suspended sediment test.\n*/\n\n#undef LOG_PROFILE\n#define UV_ADV\n#define UV_LOGDRAG\n#define TS_U3HADVECTION\n#undef SALINITY\n#define SOLVE3D\n#define SEDIMENT\n#ifdef SEDIMENT\n# define SUSPLOAD\n# define BEDLOAD_MPM\n# define SED_MORPH\n# define SED_DENS\n#endif\n#define SPLINES\n#define NORTHERN_WALL\n#define SOUTHERN_WALL\n#define WEST_FSGRADIENT\n#define WEST_M2CLAMPED\n#define WEST_M3GRADIENT\n#define WEST_TCLAMPED\n#define EAST_FSGRADIENT\n#define EAST_M2CLAMPED\n#define EAST_M3GRADIENT\n#define EAST_TGRADIENT\n#undef ANA_VMIX\n#define GLS_MIXING\n#ifdef GLS_MIXING\n# define KANTHA_CLAYSON\n# define N2S2_HORAVG\n#endif\n#define ANA_BPFLUX\n#define ANA_BSFLUX\n#define ANA_BTFLUX\n#define ANA_SMFLUX\n#define ANA_SPFLUX\n#define ANA_SRFLUX\n#define ANA_SSFLUX\n#define ANA_STFLUX\n#define ANA_TOBC\n#define ANA_M2OBC\n\n# elif defined UPWELLING\n}}}\n\nIn the file and file, we find the parameters for a MORPHFAC=1 run, in that there are
I'm trying to see if there is consistency running a simple channel test case with various sediment transport models.\n\nI've got a 10 km long, 10 m deep channel driven by a 0.4 m slope over 10 km (slope = 0.00004 m/m). The bottom roughness is z0 = 0.005 m for 3D runs, or Manning N = 0.030 for 2D runs, which results in a depth-averaged flow of just under 1 m/s at the center of the channel (0.95 m/s to be exact). I run for 6 hours, which is sufficient for equilibrium conditions to be established.\n\nFor 500 micron sediment:\nIf I run Delft3D in 2D mode with Van Rijn, I get sediment concentrations of 0.0103 kg/m^3 and a flow speed of 0.95 m/s, which yields a suspended load transport of 0.0103 kg/m^3 * 0.95 m/s * 10 m = 0.098 kg/m/s. The bedload is 0.0479 kg/m/s. The total transport is 0.1458 kg/m/s. (Delft3D reports transport rates in m3/s/m, so one must multiply by 2650 kg/m3 to get kg/m/s.)\n\n
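The bookkeeping above can be checked with a few lines (numbers copied from the Delft3D run described above):

```python
# Suspended-load transport = depth-averaged concentration * speed * depth
conc = 0.0103    # kg/m^3
speed = 0.95     # m/s
depth = 10.0     # m
suspended = conc * speed * depth   # kg/m/s, ~0.098

bedload = 0.0479                   # kg/m/s
total = suspended + bedload        # ~0.1458 kg/m/s

print(suspended, total)
```

The same arithmetic gives a quick consistency check when comparing against the other transport models.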
Running \n{{{\n$ system-config-soundcard\n}}}\nwas all I needed to get sound going on my linux RHEL4 box
Ingredients:\n*GBP\n*1/4 tsp cream of tartar\n*125 g of sugar\n*Juice from 1/2 lemon\n*150 g of peeled, thinly sliced ginger root\nInstructions: \nMake ginger juice by boiling the ginger root in 2 cups water for 30 minutes. Add sugar and cool to room temp. Add lemon juice, GBP and cream of tartar. Top up to 1 liter with cool water. \n
OMG, I can't believe I waited this long. \nHere's how to change the dark blue directory listings to cyan:\n{{{\nexport LS_COLORS='di=01;36'\n}}}\nThat's all it takes (di is the directory entry; 01;36 is bold cyan).
*Put 1 gallon of raw milk on the stove over medium heat. Bring to 99 degrees F, stirring often with a wooden spoon.\n*Put 1 cup of greek yogurt (any brand containing both L. Bulgaricus and S. Thermophilus) in a 2 cup measuring cup and whisk in 1 cup of the warm milk to make a better consistency for adding to the pot. \n*Add the yogurt back into the milk in the pot. Let rest at 95-100 F for 30 min. Add 4 drops of liquid animal rennet diluted in 1/3 cup cool spring water. Let stand at 100 degrees for 90 min. \n*Cut the curd vertically into large pieces, and ladle into cheesecloth-lined cheese molds. If you don't line the mold with cheesecloth, the cheese fills the holes and doesn't drain as well. \n*Rest the molds on a splatter screen over a 13x9 pan to drain. Let sit draining in the mold for a couple of hours, prying the edges of the cheese away from the cheesecloth as it sinks. \n*After a couple of hours, wrap and flip the cheese. Flip again after a few more hours. Let sit at room temperature for 2 days, rewrapping and flipping every so often. \n*Soak in brine (3 cups cool (50 F) spring water + 3 T salt) in an 8 inch square pan for 1 hour. \n*Put back into the molds, store at room temperature for an additional day. Wring out the water from the cheesecloth, rewrap, and wrap in a paper towel. Put in quart-size baggies in the fridge. \n*Eat after 3 days; good until about 10 days.
Vacation stuff:\nCancel Cape Cod Times: EMAIL \nor call Circulation Services at
To install Sun Java 1.6 on Ubuntu 11.04 (Natty Narwhale)\n{{{\nsudo apt-get update\nsudo apt-get install sun-java6-jdk\n}}}
Adapted from:\nThree types of information heterogeneity are often present: \n* Syntactic heterogeneity: Information resources use different representations and encodings of data (e.g. ASCII files, NetCDF files, HDF files, relational databases).\n* Structural heterogeneity: Different information systems store their data in different data models, data structures and schemas (e.g. Unidata Common Data Model, CSML Feature Types).\n* Semantic heterogeneity: The meaning of the data is expressed in different ways (e.g. sea_surface_temperature_remotely_sensed, "temp", "Temperature").\n\nThe use of Web Services can address the syntactic heterogeneity. XML and XSD (schemas) can address the structural heterogeneity, because an XML file that respects a specific XSD schema has a well-defined structure. Using OWL as a shared ontology resolves the semantic heterogeneity.
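As a toy illustration of the semantic layer, a shared vocabulary can map provider-specific names onto one canonical term (the alias table below is invented for illustration; it is not from any real ontology or the CF standard names list):

```python
# Toy alias table standing in for a shared ontology: many provider-specific
# variable names resolve to one canonical term.
ALIASES = {
    "temp": "sea_water_temperature",
    "Temperature": "sea_water_temperature",
    "sea_surface_temperature_remotely_sensed": "sea_surface_temperature",
}

def canonical(name):
    """Return the shared term for a provider-specific variable name."""
    return ALIASES.get(name, name)

print(canonical("temp"))         # sea_water_temperature
print(canonical("Temperature"))  # sea_water_temperature
```

A real OWL ontology also captures relationships between terms (broader/narrower, equivalence), not just a flat lookup, but the lookup conveys the idea.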
Service Tag: JKPRM51\nQuantity Parts # Part Description\n1 67JDG Cable Assembly, Audio, TRANSFORMER METROPLEX DESKTOP...\n1 N2285 Processor, 80546K, 3.0G, 1M, Xeon Nocona, 800\n1 X0392 Printed Wiring Assy, Planar Tumwater, PWS670\n1 5120P Cord, Power, 125V, 6Feet, SJT..., Unshielded\n1 H2703 Card, Voltage Regulator Module Controller, 10.1\n1 N2285 Processor, 80546K, 3.0G, 1M, Xeon Nocona, 800\n1 7N242 Keyboard, 104 Key, UNITED STATES..., Silitek, LC, MIDNIGHT GRAY...\n2 H2084 Dual In-Line Memory Module 512, 400M, 32X72, 8, 240, 2RX8\n2 U3364 Dual In-Line Memory Module 512, 400M, 64X72, 8, 240, 1RX8\n1 Y3668 Mouse, Universal Serial Bus 2BTN, Wheel, Entry, Primax Electronics Ltd\n1 N4077 Card, Graphics, NVIDIA QUADRO FX1300 128MB MRGA10\n1 5R212 Floppy Drive, 1.44M, 3.5" FORM FACTOR..., 3MD NBZ, NEC CORPORATION..., CHASSIS 2001...\n1 H5102 Hard Drive, 250GB, Serial ATA 8MB, WD-XL80-2\n2 C6355 Assembly, Cable, Serial ATA Transformer Sky Dive Mini Tower, 2.0\n1 H5102 Hard Drive, 250GB, Serial ATA 8MB, WD-XL80-2\n1 J2427 DIGITAL VIDEO DISK DRIVE..., 17G, 16X, I, 5.25" FORM FACTOR..., Liteon Chassis 2001, V5\n1 C3164 Digital Video Disk Drive Read Write, 8X, IDE (INTEGRATED DRIVE ELECTRONICS) ..., Half Height, NEC 01\n1 42964 CABLE..., LIGHT EMITTING DIODE..., HARD DRIVE..., AUXILIARY..., 7\n1 C4272 Card, Controller, U320..., SCSI Precision Workstation Poweredge\n1 C5130 Kit, Documentation on Compact Disk, Red Hat Enterprise Linux 3WS, 1YR, WEST...\n \nParts & Upgrades\n\n\nThis Dell Precision WorkStation 670
* THREDDS is currently available only via http (no SSL) at\n> from the following subnets:\n>\n>
/***\n| Name|TagglyTaggingPlugin|\n| Description|tagglyTagging macro is a replacement for the builtin tagging macro in your ViewTemplate|\n| Version|3.0 ($Rev: 2101 $)|\n| Date|$Date: 2007-04-20 00:24:20 +1000 (Fri, 20 Apr 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\n!Notes\nSee\n***/\n//{{{\nconfig.taggly = {\n\n // for translations\n lingo: {\n labels: {\n asc: "\su2191", // down arrow\n desc: "\su2193", // up arrow\n title: "title",\n modified: "modified",\n created: "created",\n show: "+",\n hide: "-",\n normal: "normal",\n group: "group",\n commas: "commas",\n sitemap: "sitemap",\n numCols: "cols\su00b1", // plus minus sign\n label: "Tagged as '%0':",\n excerpts: "excerpts",\n noexcerpts: "no excerpts"\n },\n\n tooltips: {\n title: "Click to sort by title",\n modified: "Click to sort by modified date",\n created: "Click to sort by created date",\n show: "Click to show tagging list",\n hide: "Click to hide tagging list",\n normal: "Click to show a normal ungrouped list",\n group: "Click to show list grouped by tag",\n sitemap: "Click to show a sitemap style list",\n commas: "Click to show a comma separated list",\n numCols: "Click to change number of columns"\n }\n },\n\n config: {\n showTaggingCounts: true,\n listOpts: {\n // the first one will be the default\n sortBy: ["title","modified","created"],\n sortOrder: ["asc","desc"],\n hideState: ["show","hide"],\n listMode: ["normal","group","sitemap","commas"],\n numCols: ["1","2","3","4","5","6"],\n excerpts: ["noexcerpts","excerpts"]\n },\n valuePrefix: "taggly.",\n excludeTags: ["excludeLists","excludeTagging"],\n excerptSize: 50,\n excerptMarker: "/%"+"%/"\n },\n\n getTagglyOpt: function(title,opt) {\n var val = store.getValue(title,this.config.valuePrefix+opt);\n return val ? 
val : this.config.listOpts[opt][0];\n },\n\n setTagglyOpt: function(title,opt,value) {\n if (!store.tiddlerExists(title))\n // create it silently\n store.saveTiddler(title,title,config.views.editor.defaultText.format([title]),config.options.txtUserName,new Date(),null);\n // if value is default then remove it to save space\n return store.setValue(title,\n this.config.valuePrefix+opt,\n value == this.config.listOpts[opt][0] ? null : value);\n },\n\n getNextValue: function(title,opt) {\n var current = this.getTagglyOpt(title,opt);\n var pos = this.config.listOpts[opt].indexOf(current);\n // a little usability enhancement. actually it doesn't work right for grouped or sitemap\n var limit = (opt == "numCols" ? store.getTaggedTiddlers(title).length : this.config.listOpts[opt].length);\n var newPos = (pos + 1) % limit;\n return this.config.listOpts[opt][newPos];\n },\n\n toggleTagglyOpt: function(title,opt) {\n var newVal = this.getNextValue(title,opt);\n this.setTagglyOpt(title,opt,newVal);\n }, \n\n createListControl: function(place,title,type) {\n var lingo = config.taggly.lingo;\n var label;\n var tooltip;\n var onclick;\n\n if ((type == "title" || type == "modified" || type == "created")) {\n // "special" controls. a little tricky. derived from sortOrder and sortBy\n label = lingo.labels[type];\n tooltip = lingo.tooltips[type];\n\n if (this.getTagglyOpt(title,"sortBy") == type) {\n label += lingo.labels[this.getTagglyOpt(title,"sortOrder")];\n onclick = function() {\n config.taggly.toggleTagglyOpt(title,"sortOrder");\n return false;\n }\n }\n else {\n onclick = function() {\n config.taggly.setTagglyOpt(title,"sortBy",type);\n config.taggly.setTagglyOpt(title,"sortOrder",config.taggly.config.listOpts.sortOrder[0]);\n return false;\n }\n }\n }\n else {\n // "regular" controls, nice and simple\n label = lingo.labels[type == "numCols" ? type : this.getNextValue(title,type)];\n tooltip = lingo.tooltips[type == "numCols" ? 
type : this.getNextValue(title,type)];\n onclick = function() {\n config.taggly.toggleTagglyOpt(title,type);\n return false;\n }\n }\n\n // hide button because commas don't have columns\n if (!(this.getTagglyOpt(title,"listMode") == "commas" && type == "numCols"))\n createTiddlyButton(place,label,tooltip,onclick,type == "hideState" ? "hidebutton" : "button");\n },\n\n makeColumns: function(orig,numCols) {\n var listSize = orig.length;\n var colSize = listSize/numCols;\n var remainder = listSize % numCols;\n\n var upperColsize = colSize;\n var lowerColsize = colSize;\n\n if (colSize != Math.floor(colSize)) {\n // it's not an exact fit so..\n upperColsize = Math.floor(colSize) + 1;\n lowerColsize = Math.floor(colSize);\n }\n\n var output = [];\n var c = 0;\n for (var j=0;j<numCols;j++) {\n var singleCol = [];\n var thisSize = j < remainder ? upperColsize : lowerColsize;\n for (var i=0;i<thisSize;i++) \n singleCol.push(orig[c++]);\n output.push(singleCol);\n }\n\n return output;\n },\n\n drawTable: function(place,columns,theClass) {\n var newTable = createTiddlyElement(place,"table",null,theClass);\n var newTbody = createTiddlyElement(newTable,"tbody");\n var newTr = createTiddlyElement(newTbody,"tr");\n for (var j=0;j<columns.length;j++) {\n var colOutput = "";\n for (var i=0;i<columns[j].length;i++) \n colOutput += columns[j][i];\n var newTd = createTiddlyElement(newTr,"td",null,"tagglyTagging"); // todo should not need this class\n wikify(colOutput,newTd);\n }\n return newTable;\n },\n\n createTagglyList: function(place,title) {\n switch(this.getTagglyOpt(title,"listMode")) {\n case "group": return this.createTagglyListGrouped(place,title); break;\n case "normal": return this.createTagglyListNormal(place,title,false); break;\n case "commas": return this.createTagglyListNormal(place,title,true); break;\n case "sitemap":return this.createTagglyListSiteMap(place,title); break;\n }\n },\n\n getTaggingCount: function(title) {\n // thanks to Doug Edmunds\n if 
(this.config.showTaggingCounts) {\n var tagCount = store.getTaggedTiddlers(title).length;\n if (tagCount > 0)\n return " ("+tagCount+")";\n }\n return "";\n },\n\n getExcerpt: function(inTiddlerTitle,title) {\n if (this.getTagglyOpt(inTiddlerTitle,"excerpts") == "excerpts") {\n var t = store.getTiddler(title);\n if (t) {\n var text = t.text.replace(/\sn/," ");\n var marker = text.indexOf(this.config.excerptMarker);\n if (marker != -1) {\n return " {{excerpt{<nowiki>" + text.substr(0,marker) + "</nowiki>}}}";\n }\n else if (text.length < this.config.excerptSize) {\n return " {{excerpt{<nowiki>" + t.text + "</nowiki>}}}";\n }\n else {\n return " {{excerpt{<nowiki>" + t.text.substr(0,this.config.excerptSize) + "..." + "</nowiki>}}}";\n }\n }\n }\n return "";\n },\n\n notHidden: function(t,inTiddler) {\n if (typeof t == "string") \n t = store.getTiddler(t);\n return (!t || !t.tags.containsAny(this.config.excludeTags) ||\n (inTiddler && this.config.excludeTags.contains(inTiddler)));\n },\n\n // this is for normal and commas mode\n createTagglyListNormal: function(place,title,useCommas) {\n\n var list = store.getTaggedTiddlers(title,this.getTagglyOpt(title,"sortBy"));\n\n if (this.getTagglyOpt(title,"sortOrder") == "desc")\n list = list.reverse();\n\n var output = [];\n var first = true;\n for (var i=0;i<list.length;i++) {\n if (this.notHidden(list[i],title)) {\n var countString = this.getTaggingCount(list[i].title);\n var excerpt = this.getExcerpt(title,list[i].title);\n if (useCommas)\n output.push((first ? "" : ", ") + "[[" + list[i].title + "]]" + countString + excerpt);\n else\n output.push("*[[" + list[i].title + "]]" + countString + excerpt + "\sn");\n\n first = false;\n }\n }\n\n return this.drawTable(place,\n this.makeColumns(output,useCommas ? 1 : parseInt(this.getTagglyOpt(title,"numCols"))),\n useCommas ? 
"commas" : "normal");\n },\n\n // this is for the "grouped" mode\n createTagglyListGrouped: function(place,title) {\n var sortBy = this.getTagglyOpt(title,"sortBy");\n var sortOrder = this.getTagglyOpt(title,"sortOrder");\n\n var list = store.getTaggedTiddlers(title,sortBy);\n\n if (sortOrder == "desc")\n list = list.reverse();\n\n var leftOvers = []\n for (var i=0;i<list.length;i++)\n leftOvers.push(list[i].title);\n\n var allTagsHolder = {};\n for (var i=0;i<list.length;i++) {\n for (var j=0;j<list[i].tags.length;j++) {\n\n if (list[i].tags[j] != title) { // not this tiddler\n\n if (this.notHidden(list[i].tags[j],title)) {\n\n if (!allTagsHolder[list[i].tags[j]])\n allTagsHolder[list[i].tags[j]] = "";\n\n if (this.notHidden(list[i],title)) {\n allTagsHolder[list[i].tags[j]] += "**[["+list[i].title+"]]"\n + this.getTaggingCount(list[i].title) + this.getExcerpt(title,list[i].title) + "\sn";\n\n leftOvers.setItem(list[i].title,-1); // remove from leftovers. at the end it will contain the leftovers\n\n }\n }\n }\n }\n }\n\n var allTags = [];\n for (var t in allTagsHolder)\n allTags.push(t);\n\n var sortHelper = function(a,b) {\n if (a == b) return 0;\n if (a < b) return -1;\n return 1;\n };\n\n allTags.sort(function(a,b) {\n var tidA = store.getTiddler(a);\n var tidB = store.getTiddler(b);\n if (sortBy == "title") return sortHelper(a,b);\n else if (!tidA && !tidB) return 0;\n else if (!tidA) return -1;\n else if (!tidB) return +1;\n else return sortHelper(tidA[sortBy],tidB[sortBy]);\n });\n\n var leftOverOutput = "";\n for (var i=0;i<leftOvers.length;i++)\n if (this.notHidden(leftOvers[i],title))\n leftOverOutput += "*[["+leftOvers[i]+"]]" + this.getTaggingCount(leftOvers[i]) + this.getExcerpt(title,leftOvers[i]) + "\sn";\n\n var output = [];\n\n if (sortOrder == "desc")\n allTags.reverse();\n else if (leftOverOutput != "")\n // leftovers first...\n output.push(leftOverOutput);\n\n for (var i=0;i<allTags.length;i++)\n if (allTagsHolder[allTags[i]] != "")\n 
output.push("*[["+allTags[i]+"]]" + this.getTaggingCount(allTags[i]) + this.getExcerpt(title,allTags[i]) + "\sn" + allTagsHolder[allTags[i]]);\n\n if (sortOrder == "desc" && leftOverOutput != "")\n // leftovers last...\n output.push(leftOverOutput);\n\n return this.drawTable(place,\n this.makeColumns(output,parseInt(this.getTagglyOpt(title,"numCols"))),\n "grouped");\n\n },\n\n // used to build site map\n treeTraverse: function(title,depth,sortBy,sortOrder) {\n\n var list = store.getTaggedTiddlers(title,sortBy);\n if (sortOrder == "desc")\n list.reverse();\n\n var indent = "";\n for (var j=0;j<depth;j++)\n indent += "*"\n\n var childOutput = "";\n for (var i=0;i<list.length;i++)\n if (list[i].title != title)\n if (this.notHidden(list[i].title,this.config.inTiddler))\n childOutput += this.treeTraverse(list[i].title,depth+1,sortBy,sortOrder);\n\n if (depth == 0)\n return childOutput;\n else\n return indent + "[["+title+"]]" + this.getTaggingCount(title) + this.getExcerpt(this.config.inTiddler,title) + "\sn" + childOutput;\n },\n\n // this if for the site map mode\n createTagglyListSiteMap: function(place,title) {\n this.config.inTiddler = title; // nasty. 
should pass it in to traverse probably\n var output = this.treeTraverse(title,0,this.getTagglyOpt(title,"sortBy"),this.getTagglyOpt(title,"sortOrder"));\n return this.drawTable(place,\n this.makeColumns(output.split(/(?=^\s*\s[)/m),parseInt(this.getTagglyOpt(title,"numCols"))), // regexp magic\n "sitemap"\n );\n },\n\n macros: {\n tagglyTagging: {\n handler: function (place,macroName,params,wikifier,paramString,tiddler) {\n var refreshContainer = createTiddlyElement(place,"div");\n // do some refresh magic to make it keep the list fresh - thanks Saq\n refreshContainer.setAttribute("refresh","macro");\n refreshContainer.setAttribute("macroName",macroName);\n refreshContainer.setAttribute("title",tiddler.title);\n this.refresh(refreshContainer);\n },\n\n refresh: function(place) {\n var title = place.getAttribute("title");\n removeChildren(place);\n if (store.getTaggedTiddlers(title).length > 0) {\n var lingo = config.taggly.lingo;\n config.taggly.createListControl(place,title,"hideState");\n if (config.taggly.getTagglyOpt(title,"hideState") == "show") {\n createTiddlyElement(place,"span",null,"tagglyLabel",lingo.labels.label.format([title]));\n config.taggly.createListControl(place,title,"title");\n config.taggly.createListControl(place,title,"modified");\n config.taggly.createListControl(place,title,"created");\n config.taggly.createListControl(place,title,"listMode");\n config.taggly.createListControl(place,title,"excerpts");\n config.taggly.createListControl(place,title,"numCols");\n config.taggly.createTagglyList(place,title);\n }\n }\n }\n }\n },\n\n // todo fix these up a bit\n styles: [\n"/*{{{*/",\n"/* created by TagglyTaggingPlugin */",\n".tagglyTagging { padding-top:0.5em; }",\n".tagglyTagging li.listTitle { display:none; }",\n".tagglyTagging ul {",\n" margin-top:0px; padding-top:0.5em; padding-left:2em;",\n" margin-bottom:0px; padding-bottom:0px;",\n"}",\n".tagglyTagging { vertical-align: top; margin:0px; padding:0px; }",\n".tagglyTagging table { 
margin:0px; padding:0px; }",\n".tagglyTagging .button { visibility:hidden; margin-left:3px; margin-right:3px; }",\n".tagglyTagging .button, .tagglyTagging .hidebutton {",\n" color:[[ColorPalette::TertiaryLight]]; font-size:90%;",\n" border:0px; padding-left:0.3em;padding-right:0.3em;",\n"}",\n".tagglyTagging .button:hover, .hidebutton:hover, ",\n".tagglyTagging .button:active, .hidebutton:active {",\n" border:0px; background:[[ColorPalette::TertiaryPale]]; color:[[ColorPalette::TertiaryDark]];",\n"}",\n".selected .tagglyTagging .button { visibility:visible; }",\n".tagglyTagging .hidebutton { color:[[ColorPalette::Background]]; }",\n".selected .tagglyTagging .hidebutton { color:[[ColorPalette::TertiaryLight]] }",\n".tagglyLabel { color:[[ColorPalette::TertiaryMid]]; font-size:90%; }",\n".tagglyTagging ul {padding-top:0px; padding-bottom:0.5em; margin-left:1em; }",\n".tagglyTagging ul ul {list-style-type:disc; margin-left:-1em;}",\n".tagglyTagging ul ul li {margin-left:0.5em; }",\n".editLabel { font-size:90%; padding-top:0.5em; }",\n".tagglyTagging .commas { padding-left:1.8em; }",\n"/* not technically tagglytagging but will put them here anyway */",\n".tagglyTagged li.listTitle { display:none; }",\n".tagglyTagged li { display: inline; font-size:90%; }",\n".tagglyTagged ul { margin:0px; padding:0px; }",\n".excerpt { color:[[ColorPalette::TertiaryMid]]; }",\n"div.tagglyTagging table,",\n"div.tagglyTagging table tr,",\n"td.tagglyTagging",\n" {border-style:none!important; }",\n"/*}}}*/",\n ""].join("\sn"),\n\n init: function() {\n merge(config.macros,this.macros);\n config.shadowTiddlers["TagglyTaggingStyles"] = this.styles;\n store.addNotification("TagglyTaggingStyles",refreshStyles);\n }\n};\n\nconfig.taggly.init();\n\n//}}}\n\n
I did a recursive global replace on the ROMS directory, changing all values of "nf90_clobber" to "NF90_NETCDF4" so that ROMS would write NetCDF4 files. These occurred in many different routines such as "def_his.F", "def_avg.F", "def_floats.F", etc. A better solution would be to have a variable defined in the input file so that the user could choose what type of files to write.\n\nI first tried to change all variables to be compressed by changing these lines in "def_var.F":\n{{{\n status=nf90_def_var(ncid, TRIM(Vinfo(1)), Vtype, &\n & varid = Vid)\n}}}\nto\n{{{\n status=nf90_def_var(ncid, TRIM(Vinfo(1)), Vtype, &\n & varid = Vid)\n shuffle = 1\n deflate = 1\n deflate_level = 1\n status=nf90_def_var_deflate(ncid, Vid, &\n & shuffle, deflate, deflate_level)\n}}}\n\nThis compiled okay, but at runtime, although the status of nf90_def_var_deflate was okay (status=0), the status of nf90_enddef was NOT okay, returning status=-101.\n\nDid I do something wrong? As it turns out, I did. I didn't look carefully at the code, and I was deflating variables that had a single value! Okay, perhaps that should have crashed NetCDF, but it wasn't too smart, either. So when I only deflated multidimensional data, it worked fine.\n\nI tested on the teignmouth grid, which has a large masked area. The size for 1 time step in the deflated file was 1.7 MB, while the original was 14 MB.\n\nI then was able to read this into Matlab using the exact same syntax as for regular files.\n\n{{{\njavaaddpath('toolsUI-4.0.jar','-end');\nimport ucar.nc2.*\nuri=''\\n}}}\n\nIt all worked as advertised!\n\n-Rich\n
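The big shrink on the masked teignmouth grid (14 MB down to 1.7 MB per time step) is what you'd expect: deflate feeds on the long runs of identical fill values in the masked region. A quick stdlib-only illustration of the effect (plain zlib on a synthetic array, not NetCDF; the fill value and array sizes here are invented):

```python
import struct
import zlib

def deflate_ratio(values, level=1):
    """Pack floats as 32-bit binary and report raw/compressed size ratio."""
    raw = struct.pack("<%df" % len(values), *values)
    return len(raw) / len(zlib.compress(raw, level))

# A field that is 90% fill value (masked) and 10% varying data,
# loosely mimicking a grid with a large land mask.
field = [1.0e35] * 90_000 + [float(i % 97) for i in range(10_000)]
print("compression ratio: %.0fx" % deflate_ratio(field))
```

Even at deflate level 1 (the level used in the nf90_def_var_deflate call above) the constant masked region collapses almost to nothing; the single-value variables that tripped up nf90_enddef would have gained nothing anyway.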
{{{\ncd /oceandata\nmkdir share\ncd share\nwget\ntar xvfz zlib-1.2.6.tar.gz\nwget\ntar xvfz hdf5-1.8.8.tar.gz\ncd hdf5-1.8.8\n./configure --with-zlib=/oceandata --prefix=/oceandata --enable-hl --enable-shared\nmake clean\nmake check install\n\n\nwget\ntar xvfz netcdf-4.2.tar.gz\n\ncd netcdf-4.2\nexport CPPFLAGS=-I/oceandata/include\nexport LDFLAGS=-L/oceandata/lib\n./configure --prefix=/oceandata --enable-netcdf-4 --enable-shared --enable-dap\nmake clean\nmake install\n\nwget\ntar xvfz netcdf-fortran-4.2.tar.gz\ncd netcdf-fortran-4.2\n./configure --prefix=/oceandata\nmake install\nmake check\n\ncd /oceandata/share/roms\nsvn \n\ncd /oceandata/share\n wget\n\n}}}\n\n{{{\nroot@master:~# df\nFilesystem           1K-blocks      Used Available Use% Mounted on\n/dev/xvda1            
\nThe ToolsUI GUI (the NetCDF 4.0 Tools App) is a very useful tool for debugging, quick browsing and mapping of data\nfrom NetCDF, NcML, OpenDAP URLs and more. If you haven't seen it, it's available on the NetCDF Java page:\n<>. \n\nGet the version 4+ toolsUI.jar, which you can use via Java Web Start or, as I prefer, from the command line:\n{{{\nwget\njava -Xmx512m -jar toolsUI-4.1.jar\n}}}\n\nFor those of you who have wondered what all those tabs in the ToolsUI GUI do, I had a web-enabled conversation with John Caron and we captured a WebEx recording of John giving a demo of the features he uses most often. The demo is about 40 minutes, and can be viewed with the WebEx Player on Mac & Windows.\n\nJohn Caron's ToolsUI Demo: <> (46 MB)\n\nWebEx Player (free download):\n\n(note that you don't have to sign up for a free trial of WebEx to just download the player).\n\n
On 04/29/2005 11:14 AM I received this message from David Divins in response to a query about the GMT "surface" routine parameters used for the Coastal Relief Model V1.0:\n\n{{{\nRich,\n\nWe blockmean the data (-I3c), then run a perl script that does most \nprocessing for us. Here is the surface info from that script:\n\n$gRes = "3c"; # 3 second grid resolution\n$gFormat = "12"; # NGDC G98 Format\n\n$Convergence = 0.1;\n$Tension = 0.5;\n$gOptions = "-I$gRes -C$Convergence -T$Tension -N500 -Lu-0.1 -V";\n\n$command = "surface $xyzFile -G$$.grd=$gFormat -R$range $gOptions";\n\n\nDavid \n}}}\n\nI think these settings could be improved in the following ways: \n\n * Change the tension from 0.5 to 0.35 (T=1 is a harmonic surface with no maxima or minima except at control data points, T=0 is the "minimum curvature solution" which can give undesired false maxima and minima). 0.35 is recommended by the GMT folks for "steep topography data" and decreasing the tension from 0.5 to 0.35 will make isolated data points look less like tent poles.\n\n * Include the aspect ratio argument, since we are gridding in geographic (lon,lat) coordinates. Should be set to the cosine of the central latitude (e.g. -A0.75 at 41.5 degrees north)\n\n
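The cosine for the -A argument is easy to sanity-check. A small Python helper (my own, not part of GMT) for any central latitude:

```python
import math

def gmt_aspect(central_lat_deg):
    """Aspect ratio for GMT 'surface' -A when gridding lon/lat data:
    the cosine of the central latitude of the region."""
    return math.cos(math.radians(central_lat_deg))

print("-A%.2f" % gmt_aspect(41.5))  # prints: -A0.75
```

This matches the -A0.75 suggested above for a region centered at 41.5 degrees north.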
/***\n| Name|ToggleTagPlugin|\n| Description|Makes a checkbox which toggles a tag in a tiddler|\n| Version|3.0 ($Rev: 1845 $)|\n| Date|$Date: 2007-03-16 15:19:22 +1000 (Fri, 16 Mar 2007) $|\n| Source||\n| Author|Simon Baird <>|\n| License||\n!Usage\n{{{<<toggleTag }}}//{{{TagName TiddlerName LabelText}}}//{{{>>}}}\n* TagName - the tag to be toggled, default value "checked"\n* TiddlerName - the tiddler to toggle the tag in, default value the current tiddler\n* LabelText - the text (gets wikified) to put next to the check box, default value is '{{{[[TagName]]}}}' or '{{{[[TagName]] [[TiddlerName]]}}}'\n(If a parameter is '.' then the default will be used)\n\nExamples:\n\n|Code|Description|Example|h\n|{{{<<toggleTag>>}}}|Toggles the default tag (checked) in this tiddler|<<toggleTag>>|\n|{{{<<toggleTag TagName>>}}}|Toggles the TagName tag in this tiddler|<<toggleTag TagName>>|\n|{{{<<toggleTag TagName TiddlerName>>}}}|Toggles the TagName tag in the TiddlerName tiddler|<<toggleTag TagName TiddlerName>>|\n|{{{<<toggleTag TagName TiddlerName 'click me'>>}}}|Same but with custom label|<<toggleTag TagName TiddlerName 'click me'>>|\n|{{{<<toggleTag . . 'click me'>>}}}|dot means use default value|<<toggleTag . . 'click me'>>|\nNotes:\n* If TiddlerName doesn't exist it will be silently created\n* Set label to '-' to specify no label\n* See also\n\n!Known issues\n* Doesn't smoothly handle the case where you toggle a tag in a tiddler that is current open for editing\n\n***/\n//{{{\n\nmerge(config.macros,{\n\n toggleTag: {\n\n doRefreshAll: true,\n createIfRequired: true,\n shortLabel: "[[%0]]",\n longLabel: "[[%0]] [[%1]]",\n\n handler: function(place,macroName,params,wikifier,paramString,tiddler) {\n var tag = (params[0] && params[0] != '.') ? params[0] : "checked";\n var title = (params[1] && params[1] != '.') ? params[1] : tiddler.title;\n var defaultLabel = (title == tiddler.title ? this.shortLabel : this.longLabel);\n var label = (params[2] && params[2] != '.') ? 
params[2] : defaultLabel;\n label = (label == '-' ? '' : label);\n var theTiddler = title == tiddler.title ? tiddler : store.getTiddler(title);\n var cb = createTiddlyCheckbox(place, label.format([tag,title]), theTiddler && theTiddler.isTagged(tag), function(e) {\n if (!store.tiddlerExists(title)) {\n if (config.macros.toggleTag.createIfRequired) {\n var content = store.getTiddlerText(title); // just in case it's a shadow\n store.saveTiddler(title,title,content?content:"",config.options.txtUserName,new Date(),null);\n }\n else \n return false;\n }\n store.setTiddlerTag(title,this.checked,tag);\n return true;\n });\n }\n }\n});\n\n//}}}\n\n
TortoiseSVN is a popular Windows SVN client that can be obtained from\n\nMost people will want to download the Windows 32-bit installer.
> I have some geotiff images with a black border. I want to make the border\n> transparent. Is there a way to do this with FWTools?\n\nSimon,\n\nIn part this depends on what software you want it to be transparent in.\nIn some software packages (like MapServer or OpenEV) you can pick a\n"nodata value" such as 0 to be treated as transparent. In some cases\nthe packages can also recognise nodata values as a kind of metadata from\nthe file. The gdal_translate -a_nodata switch can be used to assign this\nnodata value to an output file though it is only preserved in some formats.\n\nFor example:\n{{{\ngdal_translate -a_nodata 0 in.tif out.tif\n}}}\nAnother approach to transparency is to add an alpha channel with\nexplicit transparency. I think you could essentially assign all "zero"\npixels in an input greyscale image to have an alpha of zero in the output\nusing something like:\n\n{{{\ngdalwarp -srcnodata 0 -dstalpha in.tif out.tif\n}}}\nThis whole area though is rife with challenges because:\n\n* many software packages don't automatically support nodata values.\n* some software packages don't recognise alpha bands for transparency\n* some file formats don't support saving nodata values.\n* some formats don't support alpha bands.\n* GDAL's "nodata" data model treats nodata values as independent for\n each band.\n* all black pixels will get treated as transparent, not just border values.
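To make the gdalwarp alpha approach concrete, here is the idea in plain Python (no GDAL; a toy grayscale raster as nested lists, with 0 as the assumed nodata value):

```python
def add_alpha(gray_rows, nodata=0):
    """Turn a grayscale raster into (gray, alpha) pixel pairs:
    alpha 0 (transparent) where the pixel equals the nodata value,
    alpha 255 (opaque) everywhere else."""
    return [[(v, 0 if v == nodata else 255) for v in row] for row in gray_rows]

raster = [[0, 0, 120],
          [0, 200, 255]]
print(add_alpha(raster))
```

Note the caveat from the list above still applies: any interior black pixel (value 0) goes transparent too, not just the border.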
Starting with parameter set exactly as in ROMS sed paper, except that we spin up for 30,000 time steps, which is 25 minutes, which is how long it takes to obtain steady state (for the initial transients to decay).\n\n{{{\nr001 ROMS 3.1, SLOPE_NEMETH, BEDLOAD_COEFF == 0.06, poros=0.3 (8cpu=35min)\nr002 ROMS 3.1, SLOPE_NEMETH, BEDLOAD_COEFF == 0.17, poros=0.3 (bombed!)\nr003 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, poros=0.3 (4cpu=66min)\nr004 ROMS 3.1, SLOPE_NEMETH, BSTRESS_UPWIND, BEDLOAD_COEFF == 0.17, poros=0.4 \nr005 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.0, poros=0.4 \nr006 ROMS 3.1, SLOPE_LESSER, MPM_RIPPLE, BEDLOAD_COEFF == 1.0, poros=0.4 \nr007 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, poros=0.3, d50=160, Wsed=14.3, TauCE=0.1658 \nr008 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BSTRESS_UPWIND, BEDLOAD_COEFF == 1.00, poros=0.3, d50=160, Wsed=14.3, TauCE=0.1658 \nr009 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, poros=0.3, Wsed=13.0, Flux=3.5e-2\nr010 ROMS 3.1, SLOPE_NEMETH, BEDLOAD_COEFF == 0.17, BSTRESS_UPWIND, poros=0.3 \nr011 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.17, BSTRESS_UPWIND, poros=0.3 \nr012 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.08, BSTRESS_UPWIND, poros=0.3 \nr013 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.10, BSTRESS_UPWIND, poros=0.3 \nr014 ROMS 3.1, SLOPE_LESSER, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, BSTRESS_UPWIND, poros=0.3\nr015 ROMS 3.1, SLOPE_LESSER, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, BSTRESS_UPWIND, poros=0.4 \nr016 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.10, BSTRESS_UPWIND, poros=0.4 \nr017 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.10, BSTRESS_UPWIND, poros=0.4, WSED=13.0\nr018 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.10, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2\nr019 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.10, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90\nr020 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.06, BSTRESS_UPWIND, poros=0.4, 
MUD_ERATE=0.35d-2, MORFAC=90\nr021 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90\nr022 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.40d-2, WSED=13.0 MORFAC=90\nr023 ROMS 3.1, SLOPE_LESSER, BEDLOAD_COEFF == 0.06, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.40d-2, WSED=13.0 MORFAC=90\nr024 ROMS 3.1, SLOPE_LESSER, MPM_RIPPLE, BEDLOAD_COEFF == 0.06, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90\nr025 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 0.06, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90\nr026 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90 (exe copied from r003)\nr027 ROMS 3.1, SLOPE_NEMETH, MPM_RIPPLE, BEDLOAD_COEFF == 1.00, poros=0.4, MUD_ERATE=0.25d-2, MORFAC=90 (exe copied from r003)\nr028 ROMS 3.1, SLOPE_NEMETH, BEDLOAD_COEFF == 0.06, BSTRESS_UPWIND, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90 (same as 20, but with nemeth)\nr029 ROMS 3.1, SLOPE_NEMETH, BEDLOAD_COEFF == 0.06, poros=0.4, MUD_ERATE=0.35d-2, MORFAC=90 (same as 28, but without upwind)\n}}}\n\nDelft3d cases:\nSpin up for 30 minutes, use morph fac of 90 for 10 minutes more (40 minutes total)\n{{{\nrps03 10 layer algebraic model (log-stretched with 4% layer at bottom)\nrps04 10 layer k-eps model (log-stretched with 4% layer at bottom)\nrps05 20 layer k-eps model (log-stretched with 2% layer at bottom)\nrps06 10 layer algebraic model (log-stretched with 4% layer at bottom), 160 micron sand\n}}}
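A quick sanity check on the numbers above (my arithmetic, not from the runs themselves): 30,000 steps covering 25 minutes implies a 0.05 s time step, and the extra 10 minutes at a morphological factor of 90 represent 15 hours of bed evolution.

```python
# Implied hydrodynamic time step from the spin-up numbers above
steps = 30_000
spinup_minutes = 25
dt = spinup_minutes * 60 / steps      # seconds per step
print(dt)  # prints: 0.05

# 10 more minutes at MORFAC=90 -> effective morphological time
morph_hours = 10 * 90 / 60
print(morph_hours)  # prints: 15.0
```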
/***\nContains the stuff you need to use Tiddlyspot\nNote you must also have UploadPlugin installed\n***/\n//{{{\n\n// edit this if you are migrating sites or retrofitting an existing TW\nconfig.tiddlyspotSiteId = 'rsignell';\n\n// make it so you can by default see edit controls via http\nconfig.options.chkHttpReadOnly = false;\n\n// disable autosave in d3\nif (window.location.protocol != "file:")\n config.options.chkGTDLazyAutoSave = false;\n\n// tweak shadow tiddlers to add upload button, password entry box etc\nwith (config.shadowTiddlers) {\n SiteUrl = 'http://'+config.tiddlyspotSiteId+'';\n SideBarOptions = SideBarOptions.replace(/(<<saveChanges>>)/,"$1<<tiddler TspotSidebar>>");\n OptionsPanel = OptionsPanel.replace(/^/,"<<tiddler TspotOptions>>");\n DefaultTiddlers = DefaultTiddlers.replace(/^/,"[[Welcome to Tiddlyspot]] ");\n MainMenu = MainMenu.replace(/^/,"[[Welcome to Tiddlyspot]] ");\n}\n\n// create some shadow tiddler content\nmerge(config.shadowTiddlers,{\n\n'Welcome to Tiddlyspot':[\n "This document is a ~TiddlyWiki from A ~TiddlyWiki is an electronic notebook that is great for managing todo lists, personal information, and all sorts of things.",\n "",\n "@@font-weight:bold;font-size:1.3em;color:#444; //What now?// &nbsp;&nbsp;@@ Before you can save any changes, you need to enter your password in the form below. Then configure privacy and other site settings at your [[control panel|http://" + config.tiddlyspotSiteId + "]] (your control panel username is //" + config.tiddlyspotSiteId + "//).",\n "<<tiddler TspotControls>>",\n "See also GettingStarted.",\n "",\n "@@font-weight:bold;font-size:1.3em;color:#444; //Working online// &nbsp;&nbsp;@@ You can edit this ~TiddlyWiki right now, and save your changes using the \s"save to web\s" button in the column on the right.",\n "",\n "@@font-weight:bold;font-size:1.3em;color:#444; //Working offline// &nbsp;&nbsp;@@ A fully functioning copy of this ~TiddlyWiki can be saved onto your hard drive or USB stick. 
You can make changes and save them locally without being connected to the Internet. When you're ready to sync up again, just click \s"upload\s" and your ~TiddlyWiki will be saved back to",\n "",\n "@@font-weight:bold;font-size:1.3em;color:#444; //Help!// &nbsp;&nbsp;@@ Find out more about ~TiddlyWiki at [[|]]. Also visit [[TiddlyWiki Guides|]] for documentation on learning and using ~TiddlyWiki. New users are especially welcome on the [[TiddlyWiki mailing list|]], which is an excellent place to ask questions and get help. If you have a tiddlyspot related problem email [[tiddlyspot support|]].",\n "",\n "@@font-weight:bold;font-size:1.3em;color:#444; //Enjoy :)// &nbsp;&nbsp;@@ We hope you like using your site. Please email [[|]] with any comments or suggestions."\n].join("\sn"),\n\n'TspotControls':[\n "| tiddlyspot password:|<<option pasUploadPassword>>|",\n "| site management:|<<upload http://" + config.tiddlyspotSiteId + " index.html . . " + config.tiddlyspotSiteId + ">>//(requires tiddlyspot password)//<<br>>[[control panel|http://" + config.tiddlyspotSiteId + "]], [[download (go offline)|http://" + config.tiddlyspotSiteId + "]]|",\n "| links:|[[|]], [[FAQs|]], [[announcements|]], [[blog|]], email [[support|]] & [[feedback|]], [[donate|]]|"\n].join("\sn"),\n\n'TspotSidebar':[\n "<<upload http://" + config.tiddlyspotSiteId + " index.html . . " + config.tiddlyspotSiteId + ">><html><a href='http://" + config.tiddlyspotSiteId + "' class='button'>download</a></html>"\n].join("\sn"),\n\n'TspotOptions':[\n "tiddlyspot password:",\n "<<option pasUploadPassword>>",\n ""\n].join("\sn")\n\n});\n//}}}\n
__Ranglum__\n{{{\n2 oz Goslings Black Seal\n1 oz lime juice\n¾ oz falernum\n½ oz Wray & Nephew Overproof\n½ to 1 bar-spoon sugar syrup\n}}}\nShake all ingredients with ice. Strain into an ice-filled old-fashioned glass and garnish with a lime wedge.\n\n__Royal Bermuda Yacht Club__\n{{{\n2 oz gold Barbados rum\n1 oz lime juice\n¾ oz falernum\n¼ oz Cointreau\n}}}\nShake all ingredients with ice. Strain into a cocktail glass and garnish with a lime wedge.
Work with GI-CAT group/Matt to ensure that their THREDDS crawler calculates geospatial and temporal extents\n\nComplete Wiki entry for accessing Web Services from Matlab\nCreate an example CSW request from Matlab\n\nInstall ERDDAP for IOOS Modeling Testbed at SURA\n\nVisit NDBC to fix NetCDF files, TDS configuration and set up ERDDAP\n\nTest Radial FeatureType for Wave Spectra and HF Radar radial data \n\nTest TrajectoryCollection FeatureType for particle-tracking model output.\n
{{{\n\s\sTeramac\str8\sACOE_LIDAR\s2005_2007_NCMP_MA\s\n.\sShapefiles has the list of which box corresponds to which region\n.\s2005_2007_NCMP_MA_1mGrid contains the original DEM geotiffs (with houses, trees, etc)\n.\s2005_2007_NCMP_MA_BareEarth contains the bare earth DEM geotiffs\n}}}\n\nConverted the UTM grids to EPSG:4326:\n{{{\ngdalwarp 2005_2007_NCMP_MA_059_BareEarth.tif -t_srs EPSG:4326 -srcnodata 0 -dstnodata -99 bare_059_geo.tif\n}}}\nThe "dstnodata" puts -99 in the extra region around the borders created by the UTM to lon/lat conversion.\n\nConverted the tif to NetCDF GMT format because Mirone didn't honor the -99 from the tif:\n{{{\ngdal_translate bare_059_geo.tif -of gmt foo.grd\n}}}\nI then read "foo.grd" into Mirone, and used "clip grid" to set any values less than -88 to NaN. I saved this as a GMT file (bare_059_geo.grd), then did a color shaded relief image, setting the color palette to "lidar.cpt" from the c:\srps\slidar\s directory, and setting the color range to be from -2 to 26. I then did the default color shaded relief and saved to Google Earth.\n\nI put the GMT files on coast-enviro/models/bathy for Chen's group to grab.\n
Upgrading to Sun Java: try this first\n{{{\nsudo add-apt-repository ppa:sun-java-community-team/sun-java6\nsudo apt-get update\nsudo apt-get install sun-java6*\n}}}\nIf that doesn't work, try using the GUI first to change the repositories that are checked by apt-get, then try "apt-get install sun-java6*" again:\n\n\nTo see where a package got installed:\ndpkg -L <packagename>\n\nBefore Howard's NCO script would work, I needed to do\n{{{\nsudo apt-get install g++\n}}}\n
Next meeting, April 11-12, 2011\nClimate push from Policy Committee (NASA, NOAA; NSF supports broadening the user base of Unidata Program software)\n\nNew Unidata members:\nJulien Chastang (IDV/RAMADDA)\nDoug Dirks (writer/editor, used to work for IDL)\n\n500 machines at 250+ sites are running LDM-6. NOAA, NASA, the NAVY and companies all use LDM software without IDD to push data within their institutions.\n\nUPC IDD Cluster relays data to more than 650 downstream connections. 5.7TB output/day => 525 Mbps (peak rates exceed 1.1Gbps, and UCAR has a 10Gbps pipe, so 10% of available bandwidth). NEXRAD Level 2 and CONDUIT are the big players.\n\nRAMADDA use continues to grow, even after Jeff McWhirter's departure; moving to a more open-source model.\n\nUnidata working with GFDL on Gridspec in libCF. Ed Hartnett is point man.\n\nNetCDF-CF has been proposed as an OGC standard; comments closed on Oct 7. (how many comments?)\n\nNew NOAAPORT will have 3 times the bandwidth. \n\nNext year 1/4 degree GFS; 1/2 degree is currently the smallest. \nCurrently queue sizes are not large enough. 
Machines are fast enough, but need more memory.\n\nLDM will have \n\n"Leverage the synergies" Steve\n\n\nNSF panels work very differently: some just rubber-stamp the peer reviews, but some actually just fund based on panel recommendations.\n"pack the panels!"\n\nAWIPS II\nJava and Python,\nbasically a new GEMPAK\n\nNew GIS project @ NCEP\nKML, Shapefiles, WMS and WCS\npublication-quality graphs?\nwould encourage Web Services and RESTful URL access for workflow.\n\nWould be cool to export NcML from the database so that the same HDF5 files could be seen as NetCDF-4 files and therefore be useful to load in other software.\n\nAll the custom local weather office stuff will be done as Python.\n\nSome version of OpenGL is required.\n\n\nCONDUIT\n58GB/day\n\nnew WOC in Boulder\nConduit is now immune to outages in Silver Spring\n\nCONDUIT:\nNDFD now in Conduit.\nAnything with a WMO header can now be put in Conduit (upon request).\n\nGFS now 0-192 hours\nRUC extension to 18 hours, all cycles\nnew 2.5 km CONUS RTMA (RealTime Mesoscale Analysis)\n\nFY11 Model Implementations (could be added to CONDUIT)\nFYQ1\nNew grids for hurricane wave model\nMajor upgrade to Climate Forecast System (CFS)\n\nFYQ2\n HIRES window upgrade\n\nFYQ3\nNAM /NEMS/NMMB subgrids within NAM\n\nFY12Q1 \nGFS upgrade with 0.25 degree output\n\nCould do now:\nRTOFS\nWW3 output\nHurricane-driven, GFS multi-grid wave model\n\nBrendan:\nUnidata software on iPhone, Android \nyum install IDV\n\nPower of the CDM in the most common \nScientific Analysis and Visualization Environments: Matlab, Python, R and IDL\n\nWork on toolboxes that have the functionality in Python. \n\nGrADS, GEMPAK, Ferret \n\n
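The IDD throughput quoted above (5.7 TB output/day, stated as 525 Mbps) roughly checks out; assuming decimal terabytes, the arithmetic gives about 528 Mbps:

```python
# Convert 5.7 TB/day (decimal TB assumed) to megabits per second
tb_per_day = 5.7
bits_per_day = tb_per_day * 1e12 * 8   # bytes -> bits
mbps = bits_per_day / 86_400 / 1e6     # per second, in megabits
print(round(mbps))  # prints: 528
```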
Action item for Signell: get a policy committee member on the THREDDS Steering Team\n\nWRF training classes: they asked what users wanted training on for visualizing WRF, and the answers were NCL, Vapor and IDV. \n\n"VAPOR is the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers. VAPOR provides an interactive 3D visualization environment that runs on most UNIX and Windows systems equipped with modern 3D graphics cards."\n\nTom, \nHow are calculations done in IDV (Theta E)? Don has said that there are no guarantees on postprocessing -- so we need to check it out. \n\nNETCDF: Ed Hartnett leaving is going to put a damper on NetCDF development\nRAMADDA: Julien: Unidata depends on RAMADDA because of its tight integration with IDV\n\nBecky Cosgrove:\n4m CONUS (0-60 hours)\nNFCens NCEP/FNMOC Combined Wave ensemble\nSignificant wave height\nGlobal RTOFS 1/12-degree HYCOM with 3-hourly forcing to 6 days. \nNCEP has 1982-present (9-month reforecasts every 6 hours for every 5th day - looking for a home)\nHRRR - NCO is in discussion to GSD (Global Systems Division) (4km ARW 6 hours from RUC \nS\n\nIDD/LDM: 6000 megabytes/hour at peak (3GB/hour average)\n\nSend message about NCTOOLBOX to usercomm folks (Kevin, \n\nOpenDAP 2.45 TB\nWMS 0.27 TB\nADDE on Motherlode 14.4 TB\nADDE GOES-East 12.0 TB\nADDE GOES-West 9.6TB\nADDE GOES-South 6.0TB\nTotal 48.69TB\n\nGlobal output is the most accessed data from Motherlode, perhaps because people who in the past used IDD/LDM to access the global model can now use motherlode to clip out just the part they need.\n\nSEES (NSF project for sustainability education, research)
\nGo to http://localhost:8080/thredds and note the existing version number (including date).\n\nGo to the Tomcat webapps directory c:\sprograms\stomcat6\swebapps and move thredds.war to thredds.war.bak.\nDownload the latest thredds.war file to the webapps directory.\n\nGo to "Administrative Tools=>Services", click on "Apache Tomcat" and then click on "Stop the service" (if it is running). Leave this window open. Delete the c:\sprograms\stomcat6\swebapps\sthredds directory. Click on "Start the service".\n\nGo to <http://localhost:8080/thredds> and make sure the version number has changed.\n
> Regarding NCTOOLBOX, is making the zipfile available via NCTOOLBOX\n> something only you can do?\n\nNope.\n>\n> Do you do that by grabbing Alex's zipfile from github and uploading it?\n\nNope.\n\ncd nctoolbox\nhg pull\nhg update\n\\n\n\nA new date-stamped zip file will be created in the nctoolbox directory.\n>\n> If we are going to do that I noticed that the README file needs updating.\n>\nGo ahead and update the README and push your change.
from windows command prompt (run "cmd") type:\n{{{\ncd c:\sPython27_EPD\sScripts\s\nenpkg epd\nenpkg ipython 0.10.2 # downgrading from ipython-0.12-1.egg to 0.10.2-2.egg\nenpkg spyder # Spyder 2.1\nenpkg PyQt # required by Spyder 2.1\nenpkg rope # used by Spyder for code introspection\n}}}
1. Upgrade to the latest ipython from EPD (which will install any other dependencies, like kernmagic):\n{{{\nenpkg ipython\n}}}\n2. Get and build the latest ipython from github:\n{{{\ngit clone\ncd ipython\npython build\npython -c "import setuptools;execfile('')" bdist_egg\n}}}\n3. Remove the old ipython with "enpkg" and install the new ipython with "egginst". (Note that "egginst" doesn't know anything about dependencies, so it really isn't "managed" in the sense that other EPD packages are, but using "egginst" means you can use {{{enpkg --remove}}} to remove non-EPD packages if they become supported by EPD).\n{{{\nenpkg --remove ipython \negginst dist/\n}}}\n\nThis didn't work on Windows. It appeared to all work fine, but after installing, there was no ipython when I did {{{enpkg -l}}}. And when I did {{{enpkg --remove ipython}}} it told me none was installed. So then I tried reinstalling the EPD ipython using {{{enpkg ipython}}}, but that also had problems. It turned out I needed to update kernmagic, which provides the extra ipython %magic commands available in the EPD IPython:\n{{{\nhg clone\ncd kernmagic\npython build\npython -c "import setuptools;execfile('')" bdist_egg\negginst dist/kernmagic-0.0.0-py2.7.egg\n}}}
Here's how I upgraded my Matlab from 7.2 to 7.5 (r2006a to r2007b):\n\nMake the new directory:\n{{{\nmkdir /usr/local/matlab75\ncd /usr/local/matlab75\n}}}\nCopy over my old license file:\n{{{\ncp /usr/local/matlab7.2/etc/license.dat /usr/local/matlab75 \n}}}\nThen pop the DVD in and type:\n{{{\n/media/cdrom/install &\n}}}\nI changed the MATLAB environment variable in my .bashrc to point to /usr/local/matlab75\n\nThen copied over my startup.m file:\n{{{\ncd /usr/local/matlab75/toolbox/local\ncp /usr/local/matlab7.2/toolbox/local/startup.m .\n}}}\nThen fired up Matlab and everything just worked!\n\nMy license file looks like:\n{{{\n# MATLAB license passcode file for use with FLEXnet.\n# Get FLEX_LM license from Server\nDAEMON MLM /usr/local/matlab75/etc/lm_matlab\nSERVER 0030482298cc 27000\nUSE_SERVER\n}}}
| !date | !user | !location | !storeUrl | !uploadDir | !toFilename | !backupdir | !origin |\n| 15/1/2015 17:4:18 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:14:28 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:15:13 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:16:59 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:18:57 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:19:50 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 17:29:44 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:38:59 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:43:58 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:47:46 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:50:40 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 17:51:35 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 18:9:13 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 18:11:50 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 18:12:36 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 18:16:19 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 18:19:40 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 15/1/2015 18:20:15 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 15/1/2015 18:20:47 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . | Ok |\n| 16/1/2015 4:9:0 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |\n| 16/1/2015 8:47:19 | RichSignell | [[/|]] | [[store.cgi|]] | . | index.html | . |
/***\n|''Name:''|UploadPlugin|\n|''Description:''|Save to web a TiddlyWiki|\n|''Version:''|3.4.5|\n|''Date:''|Oct 15, 2006|\n|''Source:''||\n|''Documentation:''||\n|''Author:''|BidiX (BidiX (at) bidix (dot) info)|\n|''License:''|[[BSD open source license| ]]|\n|''~CoreVersion:''|2.0.0|\n|''Browser:''|Firefox 1.5; InternetExplorer 6.0; Safari|\n|''Include:''|config.lib.file; config.lib.log; config.lib.options; PasswordTweak|\n|''Require:''|[[UploadService|]]|\n***/\n//{{{\nversion.extensions.UploadPlugin = {\n major: 3, minor: 4, revision: 5, \n date: new Date(2006,9,15),\n source: '',\n documentation: '',\n author: 'BidiX (BidiX (at) bidix (dot) info',\n license: '[[BSD open source license|]]',\n coreVersion: '2.0.0',\n browser: 'Firefox 1.5; InternetExplorer 6.0; Safari'\n};\n//}}}\n\n////+++!![config.lib.file]\n\n//{{{\nif (!config.lib) config.lib = {};\nif (!config.lib.file) config.lib.file= {\n author: 'BidiX',\n version: {major: 0, minor: 1, revision: 0}, \n date: new Date(2006,3,9)\n};\nconfig.lib.file.dirname = function (filePath) {\n var lastpos;\n if ((lastpos = filePath.lastIndexOf("/")) != -1) {\n return filePath.substring(0, lastpos);\n } else {\n return filePath.substring(0, filePath.lastIndexOf("\s\s"));\n }\n};\nconfig.lib.file.basename = function (filePath) {\n var lastpos;\n if ((lastpos = filePath.lastIndexOf("#")) != -1) \n filePath = filePath.substring(0, lastpos);\n if ((lastpos = filePath.lastIndexOf("/")) != -1) {\n return filePath.substring(lastpos + 1);\n } else\n return filePath.substring(filePath.lastIndexOf("\s\s")+1);\n};\nwindow.basename = function() {return "@@deprecated@@";};\n//}}}\n////===\n\n////+++!![config.lib.log]\n\n//{{{\nif (!config.lib) config.lib = {};\nif (!config.lib.log) config.lib.log= {\n author: 'BidiX',\n version: {major: 0, minor: 1, revision: 1}, \n date: new Date(2006,8,19)\n};\nconfig.lib.Log = function(tiddlerTitle, logHeader) {\n if (version.major < 2)\n this.tiddler = store.tiddlers[tiddlerTitle];\n else\n 
this.tiddler = store.getTiddler(tiddlerTitle);\n if (!this.tiddler) {\n this.tiddler = new Tiddler();\n this.tiddler.title = tiddlerTitle;\n this.tiddler.text = "| !date | !user | !location |" + logHeader;\n this.tiddler.created = new Date();\n this.tiddler.modifier = config.options.txtUserName;\n this.tiddler.modified = new Date();\n if (version.major < 2)\n store.tiddlers[tiddlerTitle] = this.tiddler;\n else\n store.addTiddler(this.tiddler);\n }\n return this;\n};\n\nconfig.lib.Log.prototype.newLine = function (line) {\n var now = new Date();\n var newText = "| ";\n newText += now.getDate()+"/"+(now.getMonth()+1)+"/"+now.getFullYear() + " ";\n newText += now.getHours()+":"+now.getMinutes()+":"+now.getSeconds()+" | ";\n newText += config.options.txtUserName + " | ";\n var location = document.location.toString();\n var filename = config.lib.file.basename(location);\n if (!filename) filename = '/';\n newText += "[["+filename+"|"+location + "]] |";\n this.tiddler.text = this.tiddler.text + "\sn" + newText;\n this.addToLine(line);\n};\n\nconfig.lib.Log.prototype.addToLine = function (text) {\n this.tiddler.text = this.tiddler.text + text;\n this.tiddler.modifier = config.options.txtUserName;\n this.tiddler.modified = new Date();\n if (version.major < 2)\n store.tiddlers[this.tiddler.tittle] = this.tiddler;\n else {\n store.addTiddler(this.tiddler);\n story.refreshTiddler(this.tiddler.title);\n store.notify(this.tiddler.title, true);\n }\n if (version.major < 2)\n store.notifyAll(); \n};\n//}}}\n////===\n\n////+++!![config.lib.options]\n\n//{{{\nif (!config.lib) config.lib = {};\nif (!config.lib.options) config.lib.options = {\n author: 'BidiX',\n version: {major: 0, minor: 1, revision: 0}, \n date: new Date(2006,3,9)\n};\n\nconfig.lib.options.init = function (name, defaultValue) {\n if (!config.options[name]) {\n config.options[name] = defaultValue;\n saveOptionCookie(name);\n }\n};\n//}}}\n////===\n\n////+++!![PasswordTweak]\n\n//{{{\nversion.extensions.PasswordTweak 
= {\n major: 1, minor: 0, revision: 3, date: new Date(2006,8,30),\n type: 'tweak',\n source: ''\n};\n//}}}\n/***\n!!config.macros.option\n***/\n//{{{\nconfig.macros.option.passwordCheckboxLabel = "Save this password on this computer";\nconfig.macros.option.passwordType = "password"; // password | text\n\nconfig.macros.option.onChangeOption = function(e)\n{\n var opt = this.getAttribute("option");\n var elementType,valueField;\n if(opt) {\n switch(opt.substr(0,3)) {\n case "txt":\n elementType = "input";\n valueField = "value";\n break;\n case "pas":\n elementType = "input";\n valueField = "value";\n break;\n case "chk":\n elementType = "input";\n valueField = "checked";\n break;\n }\n config.options[opt] = this[valueField];\n saveOptionCookie(opt);\n var nodes = document.getElementsByTagName(elementType);\n for(var t=0; t<nodes.length; t++) \n {\n var optNode = nodes[t].getAttribute("option");\n if (opt == optNode) \n nodes[t][valueField] = this[valueField];\n }\n }\n return(true);\n};\n\nconfig.macros.option.handler = function(place,macroName,params)\n{\n var opt = params[0];\n if(config.options[opt] === undefined) {\n return;}\n var c;\n switch(opt.substr(0,3)) {\n case "txt":\n c = document.createElement("input");\n c.onkeyup = this.onChangeOption;\n c.setAttribute ("option",opt);\n c.className = "txtOptionInput "+opt;\n place.appendChild(c);\n c.value = config.options[opt];\n break;\n case "pas":\n // input password\n c = document.createElement ("input");\n c.setAttribute("type",config.macros.option.passwordType);\n c.onkeyup = this.onChangeOption;\n c.setAttribute("option",opt);\n c.className = "pasOptionInput "+opt;\n place.appendChild(c);\n c.value = config.options[opt];\n // checkbox link with this password "save this password on this computer"\n c = document.createElement("input");\n c.setAttribute("type","checkbox");\n c.onclick = this.onChangeOption;\n c.setAttribute("option","chk"+opt);\n c.className = "chkOptionInput "+opt;\n place.appendChild(c);\n 
c.checked = config.options["chk"+opt];\n // text savePasswordCheckboxLabel\n place.appendChild(document.createTextNode(config.macros.option.passwordCheckboxLabel));\n break;\n case "chk":\n c = document.createElement("input");\n c.setAttribute("type","checkbox");\n c.onclick = this.onChangeOption;\n c.setAttribute("option",opt);\n c.className = "chkOptionInput "+opt;\n place.appendChild(c);\n c.checked = config.options[opt];\n break;\n }\n};\n//}}}\n/***\n!! Option cookie stuff\n***/\n//{{{\nwindow.loadOptionsCookie_orig_PasswordTweak = window.loadOptionsCookie;\nwindow.loadOptionsCookie = function()\n{\n var cookies = document.cookie.split(";");\n for(var c=0; c<cookies.length; c++) {\n var p = cookies[c].indexOf("=");\n if(p != -1) {\n var name = cookies[c].substr(0,p).trim();\n var value = cookies[c].substr(p+1).trim();\n switch(name.substr(0,3)) {\n case "txt":\n config.options[name] = unescape(value);\n break;\n case "pas":\n config.options[name] = unescape(value);\n break;\n case "chk":\n config.options[name] = value == "true";\n break;\n }\n }\n }\n};\n\nwindow.saveOptionCookie_orig_PasswordTweak = window.saveOptionCookie;\nwindow.saveOptionCookie = function(name)\n{\n var c = name + "=";\n switch(name.substr(0,3)) {\n case "txt":\n c += escape(config.options[name].toString());\n break;\n case "chk":\n c += config.options[name] ? "true" : "false";\n // is there an option link with this chk ?\n if (config.options[name.substr(3)]) {\n saveOptionCookie(name.substr(3));\n }\n break;\n case "pas":\n if (config.options["chk"+name]) {\n c += escape(config.options[name].toString());\n } else {\n c += "";\n }\n break;\n }\n c += "; expires=Fri, 1 Jan 2038 12:00:00 UTC; path=/";\n document.cookie = c;\n};\n//}}}\n/***\n!! 
Initializations\n***/\n//{{{\n// define config.options.pasPassword\nif (!config.options.pasPassword) {\n config.options.pasPassword = 'defaultPassword';\n window.saveOptionCookie('pasPassword');\n}\n// since loadCookies is first called before password definition\n// we need to reload cookies\nwindow.loadOptionsCookie();\n//}}}\n////===\n\n////+++!![config.macros.upload]\n\n//{{{\nconfig.macros.upload = {\n accessKey: "U",\n formName: "UploadPlugin",\n contentType: "text/html;charset=UTF-8",\n defaultStoreScript: "store.php"\n};\n\n// only these two configs need to be translated\nconfig.macros.upload.messages = {\n aboutToUpload: "About to upload TiddlyWiki to %0",\n backupFileStored: "Previous file backed up in %0",\n crossDomain: "Certainly a cross-domain issue: access to another site isn't allowed",\n errorDownloading: "Error downloading",\n errorUploadingContent: "Error uploading content",\n fileLocked: "File is locked: You are not allowed to Upload",\n fileNotFound: "file to upload not found",\n fileNotUploaded: "File %0 NOT uploaded",\n mainFileUploaded: "Main TiddlyWiki file uploaded to %0",\n passwordEmpty: "Unable to upload, your password is empty",\n urlParamMissing: "url param missing",\n rssFileNotUploaded: "RssFile %0 NOT uploaded",\n rssFileUploaded: "Rss File uploaded to %0"\n};\n\nconfig.macros.upload.label = {\n promptOption: "Save and Upload this TiddlyWiki with UploadOptions",\n promptParamMacro: "Save and Upload this TiddlyWiki in %0",\n saveLabel: "save to web", \n saveToDisk: "save to disk",\n uploadLabel: "upload" \n};\n\nconfig.macros.upload.handler = function(place,macroName,params){\n // parameters initialization\n var storeUrl = params[0];\n var toFilename = params[1];\n var backupDir = params[2];\n var uploadDir = params[3];\n var username = params[4];\n var password; // for security reasons no password as macro parameter\n var label;\n if (document.location.toString().substr(0,4) == "http")\n label = this.label.saveLabel;\n else\n label = 
this.label.uploadLabel;\n var prompt;\n if (storeUrl) {\n prompt = this.label.promptParamMacro.toString().format([this.toDirUrl(storeUrl, uploadDir, username)]);\n }\n else {\n prompt = this.label.promptOption;\n }\n createTiddlyButton(place, label, prompt, \n function () {\n config.macros.upload.upload(storeUrl, toFilename, uploadDir, backupDir, username, password); \n return false;}, \n null, null, this.accessKey);\n};\nconfig.macros.upload.UploadLog = function() {\n return new config.lib.Log('UploadLog', " !storeUrl | !uploadDir | !toFilename | !backupdir | !origin |" );\n};\nconfig.macros.upload.UploadLog.prototype = config.lib.Log.prototype;\nconfig.macros.upload.UploadLog.prototype.startUpload = function(storeUrl, toFilename, uploadDir, backupDir) {\n var line = " [[" + config.lib.file.basename(storeUrl) + "|" + storeUrl + "]] | ";\n line += uploadDir + " | " + toFilename + " | " + backupDir + " |";\n this.newLine(line);\n};\nconfig.macros.upload.UploadLog.prototype.endUpload = function() {\n this.addToLine(" Ok |");\n};\nconfig.macros.upload.basename = config.lib.file.basename;\nconfig.macros.upload.dirname = config.lib.file.dirname;\nconfig.macros.upload.toRootUrl = function (storeUrl, username)\n{\n return root = (this.dirname(storeUrl)?this.dirname(storeUrl):this.dirname(document.location.toString()));\n}\nconfig.macros.upload.toDirUrl = function (storeUrl, uploadDir, username)\n{\n var root = this.toRootUrl(storeUrl, username);\n if (uploadDir && uploadDir != '.')\n root = root + '/' + uploadDir;\n return root;\n}\nconfig.macros.upload.toFileUrl = function (storeUrl, toFilename, uploadDir, username)\n{\n return this.toDirUrl(storeUrl, uploadDir, username) + '/' + toFilename;\n}\nconfig.macros.upload.upload = function(storeUrl, toFilename, uploadDir, backupDir, username, password)\n{\n // parameters initialization\n storeUrl = (storeUrl ? storeUrl : config.options.txtUploadStoreUrl);\n toFilename = (toFilename ? 
toFilename : config.options.txtUploadFilename);\n backupDir = (backupDir ? backupDir : config.options.txtUploadBackupDir);\n uploadDir = (uploadDir ? uploadDir : config.options.txtUploadDir);\n username = (username ? username : config.options.txtUploadUserName);\n password = config.options.pasUploadPassword; // for security reason no password as macro parameter\n if (!password || password === '') {\n alert(config.macros.upload.messages.passwordEmpty);\n return;\n }\n if (storeUrl === '') {\n storeUrl = config.macros.upload.defaultStoreScript;\n }\n if (config.lib.file.dirname(storeUrl) === '') {\n storeUrl = config.lib.file.dirname(document.location.toString())+'/'+storeUrl;\n }\n if (toFilename === '') {\n toFilename = config.lib.file.basename(document.location.toString());\n }\n\n clearMessage();\n // only for forcing the message to display\n if (version.major < 2)\n store.notifyAll();\n if (!storeUrl) {\n alert(config.macros.upload.messages.urlParamMissing);\n return;\n }\n // Check that file is not locked\n if (window.BidiX && BidiX.GroupAuthoring && BidiX.GroupAuthoring.lock) {\n if (BidiX.GroupAuthoring.lock.isLocked() && !BidiX.GroupAuthoring.lock.isMyLock()) {\n alert(config.macros.upload.messages.fileLocked);\n return;\n }\n }\n \n var log = new this.UploadLog();\n log.startUpload(storeUrl, toFilename, uploadDir, backupDir);\n if (document.location.toString().substr(0,5) == "file:") {\n saveChanges();\n }\n var toDir = config.macros.upload.toDirUrl(storeUrl, toFilename, uploadDir, username);\n displayMessage(config.macros.upload.messages.aboutToUpload.format([toDir]), toDir);\n this.uploadChanges(storeUrl, toFilename, uploadDir, backupDir, username, password);\n if(config.options.chkGenerateAnRssFeed) {\n //var rssContent = convertUnicodeToUTF8(generateRss());\n var rssContent = generateRss();\n var rssPath = toFilename.substr(0,toFilename.lastIndexOf(".")) + ".xml";\n this.uploadContent(rssContent, storeUrl, rssPath, uploadDir, '', username, password, \n 
function (responseText) {\n if (responseText.substring(0,1) != '0') {\n displayMessage(config.macros.upload.messages.rssFileNotUploaded.format([rssPath]));\n }\n else {\n var toFileUrl = config.macros.upload.toFileUrl(storeUrl, rssPath, uploadDir, username);\n displayMessage(config.macros.upload.messages.rssFileUploaded.format(\n [toFileUrl]), toFileUrl);\n }\n // for debugging store.php uncomment last line\n //DEBUG alert(responseText);\n });\n }\n return;\n};\n\nconfig.macros.upload.uploadChanges = function(storeUrl, toFilename, uploadDir, backupDir, \n username, password) {\n var original;\n if (document.location.toString().substr(0,4) == "http") {\n // download is asynchronous; it calls uploadChangesFrom itself when done\n, toFilename, uploadDir, backupDir, username, password);\n return;\n }\n else {\n // standard way : Local file\n \n original = loadFile(getLocalPath(document.location.toString()));\n if(window.Components) {\n // it's a mozilla browser\n try {\n"UniversalXPConnect");\n var converter = Components.classes["@mozilla.org/intl/scriptableunicodeconverter"]\n .createInstance(Components.interfaces.nsIScriptableUnicodeConverter);\n converter.charset = "UTF-8";\n original = converter.ConvertToUnicode(original);\n }\n catch(e) {\n }\n }\n }\n //DEBUG alert(original);\n this.uploadChangesFrom(original, storeUrl, toFilename, uploadDir, backupDir, \n username, password);\n};\n\nconfig.macros.upload.uploadChangesFrom = function(original, storeUrl, toFilename, uploadDir, backupDir, \n username, password) {\n var startSaveArea = '<div id="' + 'storeArea">'; // Split up into two so that indexOf() of this source doesn't find it\n var endSaveArea = '</d' + 'iv>';\n // Locate the storeArea div's\n var posOpeningDiv = original.indexOf(startSaveArea);\n var posClosingDiv = original.lastIndexOf(endSaveArea);\n if((posOpeningDiv == -1) || (posClosingDiv == -1))\n {\n alert(config.messages.invalidFileError.format([document.location.toString()]));\n return;\n }\n var revised = original.substr(0,posOpeningDiv + startSaveArea.length) + \n allTiddlersAsHtml() + "\sn\st\st" +\n 
original.substr(posClosingDiv);\n var newSiteTitle;\n if(version.major < 2){\n newSiteTitle = (getElementText("siteTitle") + " - " + getElementText("siteSubtitle")).htmlEncode();\n } else {\n newSiteTitle = (wikifyPlain ("SiteTitle") + " - " + wikifyPlain ("SiteSubtitle")).htmlEncode();\n }\n\n revised = revised.replaceChunk("<title"+">","</title"+">"," " + newSiteTitle + " ");\n revised = revised.replaceChunk("<!--PRE-HEAD-START--"+">","<!--PRE-HEAD-END--"+">","\sn" + store.getTiddlerText("MarkupPreHead","") + "\sn");\n revised = revised.replaceChunk("<!--POST-HEAD-START--"+">","<!--POST-HEAD-END--"+">","\sn" + store.getTiddlerText("MarkupPostHead","") + "\sn");\n revised = revised.replaceChunk("<!--PRE-BODY-START--"+">","<!--PRE-BODY-END--"+">","\sn" + store.getTiddlerText("MarkupPreBody","") + "\sn");\n revised = revised.replaceChunk("<!--POST-BODY-START--"+">","<!--POST-BODY-END--"+">","\sn" + store.getTiddlerText("MarkupPostBody","") + "\sn");\n\n var response = this.uploadContent(revised, storeUrl, toFilename, uploadDir, backupDir, \n username, password, function (responseText) {\n if (responseText.substring(0,1) != '0') {\n alert(responseText);\n displayMessage(config.macros.upload.messages.fileNotUploaded.format([getLocalPath(document.location.toString())]));\n }\n else {\n if (uploadDir !== '') {\n toFilename = uploadDir + "/" + config.macros.upload.basename(toFilename);\n } else {\n toFilename = config.macros.upload.basename(toFilename);\n }\n var toFileUrl = config.macros.upload.toFileUrl(storeUrl, toFilename, uploadDir, username);\n if (responseText.indexOf("destfile:") > 0) {\n var destfile = responseText.substring(responseText.indexOf("destfile:")+9, \n responseText.indexOf("\sn", responseText.indexOf("destfile:")));\n toFileUrl = config.macros.upload.toRootUrl(storeUrl, username) + '/' + destfile;\n }\n else {\n toFileUrl = config.macros.upload.toFileUrl(storeUrl, toFilename, uploadDir, username);\n }\n 
displayMessage(config.macros.upload.messages.mainFileUploaded.format(\n [toFileUrl]), toFileUrl);\n if (backupDir && responseText.indexOf("backupfile:") > 0) {\n var backupFile = responseText.substring(responseText.indexOf("backupfile:")+11, \n responseText.indexOf("\sn", responseText.indexOf("backupfile:")));\n toBackupUrl = config.macros.upload.toRootUrl(storeUrl, username) + '/' + backupFile;\n displayMessage(config.macros.upload.messages.backupFileStored.format(\n [toBackupUrl]), toBackupUrl);\n }\n var log = new config.macros.upload.UploadLog();\n log.endUpload();\n store.setDirty(false);\n // erase local lock\n if (window.BidiX && BidiX.GroupAuthoring && BidiX.GroupAuthoring.lock) {\n BidiX.GroupAuthoring.lock.eraseLock();\n // change mtime with new mtime after upload\n var mtime = responseText.substr(responseText.indexOf("mtime:")+6);\n BidiX.GroupAuthoring.lock.mtime = mtime;\n }\n \n \n }\n // for debugging store.php uncomment last line\n //DEBUG alert(responseText);\n }\n );\n};\n\nconfig.macros.upload.uploadContent = function(content, storeUrl, toFilename, uploadDir, backupDir, \n username, password, callbackFn) {\n var boundary = "---------------------------"+"AaB03x"; \n var request;\n try {\n request = new XMLHttpRequest();\n } \n catch (e) { \n request = new ActiveXObject("Msxml2.XMLHTTP"); \n }\n if (window.netscape){\n try {\n if (document.location.toString().substr(0,4) != "http") {\n'UniversalBrowserRead');}\n }\n catch (e) {}\n } \n //DEBUG alert("user["+config.options.txtUploadUserName+"] password[" + config.options.pasUploadPassword + "]");\n // compose headers data\n var sheader = "";\n sheader += "--" + boundary + "\sr\snContent-disposition: form-data; name=\s"";\n sheader += config.macros.upload.formName +"\s"\sr\sn\sr\sn";\n sheader += "backupDir="+backupDir\n +";user=" + username \n +";password=" + password\n +";uploaddir=" + uploadDir;\n // add lock attributes to sheader\n if (window.BidiX && BidiX.GroupAuthoring && 
BidiX.GroupAuthoring.lock) {\n var l = BidiX.GroupAuthoring.lock.myLock;\n sheader += ";lockuser=" + l.user\n + ";mtime=" + l.mtime\n + ";locktime=" + l.locktime;\n }\n sheader += ";;\sr\sn"; \n sheader += "\sr\sn" + "--" + boundary + "\sr\sn";\n sheader += "Content-disposition: form-data; name=\s"userfile\s"; filename=\s""+toFilename+"\s"\sr\sn";\n sheader += "Content-Type: " + config.macros.upload.contentType + "\sr\sn";\n sheader += "Content-Length: " + content.length + "\sr\sn\sr\sn";\n // compose trailer data\n var strailer = new String();\n strailer = "\sr\sn--" + boundary + "--\sr\sn";\n //strailer = "--" + boundary + "--\sr\sn";\n var data;\n data = sheader + content + strailer;\n //"POST", storeUrl, true, username, password);\n try {\n"POST", storeUrl, true); \n }\n catch(e) {\n alert(config.macros.upload.messages.crossDomain + "\snError:" +e);\n return;\n }\n request.onreadystatechange = function () {\n if (request.readyState == 4) {\n if (request.status == 200)\n callbackFn(request.responseText);\n else\n alert(config.macros.upload.messages.errorUploadingContent + "\snStatus: "+request.statusText);\n }\n };\n request.setRequestHeader("Content-Length",data.length);\n request.setRequestHeader("Content-Type","multipart/form-data; boundary="+boundary);\n request.send(data); \n};\n\n\ = function(uploadUrl, uploadToFilename, uploadDir, uploadBackupDir, \n username, password) {\n var request;\n try {\n request = new XMLHttpRequest();\n } \n catch (e) { \n request = new ActiveXObject("Msxml2.XMLHTTP"); \n }\n try {\n if (uploadUrl.substr(0,4) == "http") {\n"UniversalBrowserRead");\n }\n else {\n"UniversalXPConnect");\n }\n } catch (e) { }\n //"GET", document.location.toString(), true, username, password);\n try {\n"GET", document.location.toString(), true);\n }\n catch(e) {\n alert(config.macros.upload.messages.crossDomain + "\snError:" +e);\n return;\n }\n \n request.onreadystatechange = function () {\n if (request.readyState == 4) {\n if(request.status == 
200) {\n config.macros.upload.uploadChangesFrom(request.responseText, uploadUrl, \n uploadToFilename, uploadDir, uploadBackupDir, username, password);\n }\n else\n alert(config.macros.upload.messages.errorDownloading.format(\n [document.location.toString()]) + "\snStatus: "+request.statusText);\n }\n };\n request.send(null);\n};\n\n//}}}\n////===\n\n////+++!![Initializations]\n\n//{{{\nconfig.lib.options.init('txtUploadStoreUrl','store.php');\nconfig.lib.options.init('txtUploadFilename','');\nconfig.lib.options.init('txtUploadDir','');\nconfig.lib.options.init('txtUploadBackupDir','');\nconfig.lib.options.init('txtUploadUserName',config.options.txtUserName);\nconfig.lib.options.init('pasUploadPassword','');\nsetStylesheet(\n ".pasOptionInput {width: 11em;}\sn"+\n ".txtOptionInput.txtUploadStoreUrl {width: 25em;}\sn"+\n ".txtOptionInput.txtUploadFilename {width: 25em;}\sn"+\n ".txtOptionInput.txtUploadDir {width: 25em;}\sn"+\n ".txtOptionInput.txtUploadBackupDir {width: 25em;}\sn"+\n "",\n "UploadOptionsStyles");\nif (document.location.toString().substr(0,4) == "http") {\n config.options.chkAutoSave = false; \n saveOptionCookie('chkAutoSave');\n}\nconfig.shadowTiddlers.UploadDoc = "[[Full Documentation| ]]\sn"; \n\n//}}}\n////===\n\n////+++!![Core Hijacking]\n\n//{{{\nconfig.macros.saveChanges.label_orig_UploadPlugin = config.macros.saveChanges.label;\nconfig.macros.saveChanges.label = config.macros.upload.label.saveToDisk;\n\nconfig.macros.saveChanges.handler_orig_UploadPlugin = config.macros.saveChanges.handler;\n\nconfig.macros.saveChanges.handler = function(place)\n{\n if ((!readOnly) && (document.location.toString().substr(0,4) != "http"))\n createTiddlyButton(place,this.label,this.prompt,this.onClick,null,null,this.accessKey);\n};\n\n//}}}\n////===\n\n
Here's how to use GMT from cygwin on Windows without compiling anything!\n\nWent to the Mirone site for prebundled GMT\n\nand downloaded the 64 bit version (for my 64 bit Windows 7 machine)\n\nInstalled in the default location:\nc:\sprograms\sGMT64\nThen fired up Cygwin and set these environment variables:\n{{{ \n export GMT_SHAREDIR=c:\s\sprograms\s\smirone\s\sgmt_userdir\n export PATH=$PATH:/cygdrive/c/programs/gmt64/bin\n}}}\nand then I can use all the GMT binaries from Cygwin, for example:\n{{{\ncut -f2,3,4 DEM_NE_5.0_2005_01_01.txt | xyz2grd -: -I2.5m -R-84/-66.041667/36.041667/48\n}}}\nThe output grid can be nicely visualized in Mirone, btw:\n<html><img src=""/></html>\nThis was produced by the standalone version of Mirone, which is handy because my home machine (on which I produced the above image) does not have Matlab. You can get this standalone version of Mirone here:\n\n
I stumbled across this web services page at CO-OPS\n\nwhich contains information on how to get tidal constituent data and water level data using SOAP w/ WSDL.\n\nI had just earlier stumbled upon this Mathworks "Accessing Web Services Using Matlab SOAP functions" page:\n\nand this "Accessing Web Services That Use WSDL Documents" page:\n\nSo I was curious if I could use these tools to read some harmonic constituent data. Turns out it was pretty easy to follow the directions on this Mathworks page, and the tools just worked!\n\n!!Constituent Data\n\nFirst, I did a right mouse click on the "WSDL" link to copy the link location, and then pasted that into Matlab's createClassFromWsdl function: \n{{{\n>> createClassFromWsdl('')\n\n}}}\nThis created a directory called @HarmonicConstituentsService, with methods "getHConstiuentsAndMetadata.m", "getHarmonicConstituents.m", "display.m", and "HarmonicConstituentsService.m"\nI then did \n{{{\n>> help getHarmonicConstituents \n\n>> obj = HarmonicConstituentsService\n>> stationId='8454000'; \n>> unit = 0; % 0 for meters, 1 for feet\n>> time_zone = 0; % 0 for GMT, 1 for local time\n>> data=getHarmonicConstituents(obj,'8454000',unit,time_zone);\n\n>> data\n\ndata = \n\n item: [37x1 struct]\n\n>> data.item\n\nans = \n\n37x1 struct array with fields:\n constNum\n name\n amplitude\n phase\n speed\n}}}\n\n!!Water Level Data\n\nRight clicking on 6-minute data WSDL:\n{{{\n>> createClassFromWsdl('')\nRetrieving document at ''\n\nans =\n\nWaterLevelRawSixMinService\n\n>> help WaterLevelVerifiedSixMinService\nWaterLevelVerifiedSixMinService methods:\n\ngetWLVerifiedSixMinAndMetadata - (obj,stationId,beginDate,endDate,datum,unit,timeZone)\ngetWaterLevelVerifiedSixMin - (obj,stationId,beginDate,endDate,datum,unit,timeZone)\n\n}}}\n\nSo let's try getting the water levels at the Battery for Superstorm Sandy:\n{{{\n>> obj = WaterLevelVerifiedSixMinService\n>> data = 
getWaterLevelVerifiedSixMin(obj,'8518750','20121028','20121031','MLLW','0','0');\n>> for i=1:length(data.item);\n>> z(i)=str2num(data.item(i).WL);\n>> dn(i)=datenum(data.item(i).timeStamp);\n>> end\n\n>> plot(dn,z); datetick\n}}}\n\n\n
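Under the hood, createClassFromWsdl just generates methods that POST a SOAP envelope to the service endpoint. As a rough illustration of what gets sent, here's a minimal Python (stdlib-only) sketch that builds — but doesn't send — a getHarmonicConstituents request body. The element names follow the Matlab method signature above; leaving the request element unqualified (no service namespace) is a simplifying assumption, so treat this as the shape of the message, not the exact wire format.\n{{{\nimport xml.etree.ElementTree as ET\n\nSOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"\nET.register_namespace("soap", SOAP_NS)\n\ndef soap_envelope(station_id, unit=0, time_zone=0):\n    """Build a SOAP 1.1 envelope for a hypothetical getHarmonicConstituents call."""\n    env = ET.Element("{%s}Envelope" % SOAP_NS)\n    body = ET.SubElement(env, "{%s}Body" % SOAP_NS)\n    req = ET.SubElement(body, "getHarmonicConstituents")\n    for tag, val in (("stationId", station_id), ("unit", unit), ("timeZone", time_zone)):\n        ET.SubElement(req, tag).text = str(val)\n    return ET.tostring(env, encoding="unicode")\n\nenvelope = soap_envelope("8454000")\n}}}\nThe real request would then be POSTed to the service URL with a SOAPAction header, which is exactly what the generated Matlab class does for you.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

def soap_envelope(station_id, unit=0, time_zone=0):
    """Build a SOAP 1.1 envelope for a hypothetical getHarmonicConstituents call."""
    env = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(env, "{%s}Body" % SOAP_NS)
    req = ET.SubElement(body, "getHarmonicConstituents")
    for tag, val in (("stationId", station_id), ("unit", unit), ("timeZone", time_zone)):
        ET.SubElement(req, tag).text = str(val)
    return ET.tostring(env, encoding="unicode")

envelope = soap_envelope("8454000")
```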
Install nctoolbox from You want the latest version of nctoolbox from the mercurial code repository, so follow the instructions here:\n\nIn Matlab, make sure you have run "setup_nctoolbox.m" to get your Matlab path and javaclasspath set up properly.\n\nGlobal NCOM Region 5 results are available on the OceanNomads site\n\nas individual forecast datasets (one each day)\n\n\n\nor as an aggregation, which is a "best time series" created from previous forecasts up to and including the latest forecast. The URL for this aggregated dataset never changes.\n\n\n\nSo here's how to access the last 3 days of data from the NCOM region 5 model for just the region around the Taiwan Strait\n\n{{{\n% open as \nurl='';\nnc=ncgeodataset(url);\n}}}\n\n
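A note on what the aggregation is doing: a "best time series" keeps, for every valid time, the value from the most recent forecast run that covers it, so older forecasts only contribute the hours the newer runs don't reach. A minimal Python sketch of that rule (run dates and values are made up for illustration, not real NCOM output):

```python
# Toy "best time series" aggregation: for each valid time, keep the value
# from the newest forecast run that includes it.
forecasts = {  # run date -> {valid time: value}  (made-up numbers)
    "2013-05-01": {"t1": 1.0, "t2": 1.1, "t3": 1.2},
    "2013-05-02": {"t2": 2.1, "t3": 2.2, "t4": 2.3},
}

best = {}
for run in sorted(forecasts):   # iterate oldest to newest; newer runs overwrite
    best.update(forecasts[run])
```

So t1 survives only from the old run, while t2 and t3 come from the newer run — which is why the aggregation URL never changes but its contents roll forward with each forecast.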
Ncdump will print out the netcdf header info and values in ascii. You might find that you can parse it easily enough. ncdump -x will put out NcML but not the data values.\n\nThe java library will put out NcML with the data values using:\n{{{\n java -classpath toolsUI.jar ucar.nc2.NCDumpW <NetCDF-3 filename> -ncml -vall\n}}}\nUnfortunately, I just discovered that previous versions are not doing this correctly, so you'll have to get the latest development release (4.0.25) from Note you should use NCDumpW not NCDump as it says on the web page.
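Since NcML is just XML, pulling variables and their values back out of the NCDumpW output is a small parsing job. Here's a hedged Python (stdlib) sketch against a tiny hand-written fragment — the ncml-2.2 namespace and the <values> layout are what I'd expect from the NcML schema, but check against real NCDumpW output before relying on it:

```python
import xml.etree.ElementTree as ET

# Hand-written NcML fragment standing in for NCDumpW -ncml -vall output.
ncml = """<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <dimension name="time" length="2"/>
  <variable name="temp" shape="time" type="float">
    <values>10.5 11.2</values>
  </variable>
</netcdf>"""

ns = {"n": "http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"}
root = ET.fromstring(ncml)
# map each variable name to its whitespace-separated values string
variables = {v.get("name"): v.findtext("n:values", namespaces=ns)
             for v in root.findall("n:variable", ns)}
```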
1. open a 32-bit Anaconda command prompt\n2. type \n{{{\nconda create -n esri python=2.7 numpy=1.6 pandas\nactivate esri\n}}}\n3. Create a file called {{{conda.pth}}} in your ArcGIS10.1 site-packages folder. On my machine this is {{{\nC:\sPython27\sArcGIS10.1\sLib\ssite-packages\sconda.pth\n}}}\n4. In this file add one line that points to your Anaconda site-packages for the esri environment you created. On my machine this is:\n{{{\nC:\sprograms\sAnaconda\senvs\sesri\sLib\ssite-packages\n}}}\n
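The reason the conda.pth trick works: Python's site module reads each line of a *.pth file found in a site directory and appends that path to sys.path, so ArcGIS's interpreter picks up the Anaconda environment's packages at startup. A self-contained sketch of the mechanism, using temp directories and a made-up module name rather than the real ArcGIS/Anaconda paths:

```python
import os
import site
import sys
import tempfile

# A fake "environment" dir holding a module, standing in for the Anaconda env.
env_dir = tempfile.mkdtemp()
with open(os.path.join(env_dir, "esri_demo.py"), "w") as f:
    f.write("MESSAGE = 'found via .pth'\n")

# A fake site-packages dir holding a conda.pth that points at the env dir.
site_dir = tempfile.mkdtemp()
with open(os.path.join(site_dir, "conda.pth"), "w") as f:
    f.write(env_dir + "\n")

# addsitedir processes *.pth files the same way the interpreter does for its
# real site-packages at startup, appending each listed path to sys.path.
site.addsitedir(site_dir)
import esri_demo
```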
To strip a particular variable out, first check what variables are available by doing \n{{{\nwgrib2 file.grib\n}}}\nThen you can select the variable (like sensible heat net flux) like this:\n{{{\n wgrib2 ofs_atl.t00z.F024.3d.grb.grib2 -s | grep ":SHTFL:surface" | wgrib2 -i ofs_atl.t00z.F024.3d.grb.grib2 -grib shtfl.grib2\n}}}\n
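The pipeline works because wgrib2 -s writes a text inventory (one colon-separated line per GRIB message), grep keeps only the matching lines, and wgrib2 -i reads that filtered inventory back on stdin to extract just those messages. The grep stage looks like this in Python, on a made-up three-line inventory (the layout mimics the usual wgrib2 form, but the offsets and dates here are invented):

```python
# Hypothetical wgrib2 -s inventory lines: msg:byte-offset:date:VAR:level:forecast
inventory = [
    "1:0:d=2012061500:TMP:surface:24 hour fcst:",
    "2:12345:d=2012061500:SHTFL:surface:24 hour fcst:",
    "3:99999:d=2012061500:SHTFL:2 m above ground:24 hour fcst:",
]

# Keep only sensible-heat-flux records at the surface,
# exactly what grep ":SHTFL:surface" does in the shell pipeline.
matches = [line for line in inventory if ":SHTFL:surface" in line]
```

Note the leading colon in the pattern matters: without it, a variable whose name merely ends in SHTFL would also match.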
{{{\n gdaltransform -s_srs EPSG:4326 -t_srs "+proj=lcc +lat_1=30n +lat_2=60n +lon_0=80e +a=6371229.0 +es=0.0" eurostrat.ll\n}}}
*Grabbed BSB version of NOAA Raster Chart 13229 from\n*Brought it up in OpenEV and discovered that "white" was index=1. Want to make that transparent. So converting to UTM and making white transparent is two steps:\n{{{\ncd C:\sRPS\sBSB_ROOT\s13229\ngdalwarp -rcs -t_srs "+proj=utm +zone=19 +datum=WGS84" 13229_4.kap noaa_chart.tif\ngdal_translate -a_nodata 1 noaa_chart.tif noaa_chart_trans.tif\n}}}\n\n
On testbedapps, I followed these instructions for getting the virtual frame buffer going:\n\n\nAs root, I had to first install Xvfb, and then setup display 1:\n{{{\nyum install xorg-x11-server-Xvfb\nXvfb :1 -screen 0 1280x1024x24 -auth localhost\n}}}\nThen as user tomcat, I started the notebook, specifying display 1:\n{{{\nDISPLAY=":1" nohup ipython notebook -pylab inline &\n}}}\nProof that it works:\n{{{\n\n}}}
Stace:\nDrupal 6 is cool.\n\nAmber York: working on passing metadata to OBIS. \n\nAlex: Classifiers. opencalais can read a newspaper article and figure out that it is about Obama, that he has a dog, and where he went. Right now, lots of command-line Java stuff. Trying to use MOA.\n\nRyan: lightning talk\n
type "cmd", right click to run as admin, then type "net start wlansvc"\n\n\n,44,30,54&WIDTH=256&HEIGHT=256\n
For the 360x160x10 homogeneous Island Wake Problem, the Rutgers version takes significantly longer to build and to run than the UCLA version. This run spends nearly 80% of the time in the 2D Kernel, and the UCLA code runs 80% faster than the Rutgers code, thus supporting Sasha's claim that the 2D Kernel in UCLA is about 2 times faster than the 2D Kernel in Rutgers. Of course, this run has only a constant background mixing, just the 3rd order advection and no sediment. In our ADRIATIC runs with 6 sediment classes, MPDATA took 66% of the run time, GLS Mixing took 8% and 2D Kernel took 2-12% of the run.\n\nSasha had 57601 steps, but just for timing, it seems 240 is sufficient.\n \nThe log-based timing for 240 steps (single Xeon cpu):\n| !Model | !Style | !Advection | !Subdomain | !FFLAGS | !2D Kernel (s) | !Total (s) | !Fraction in 2D (%) | !Relative Time | !Build Time |\n| UCLA (run003) | OceanO | 3rd order | 3x40 | -O3 -ip -pc80 -tpp7 -axN -xN -auto -stack_temps -fpp2 -openmp | unknown | 355.4| unknown | 1.00 (base) | 59.0|\n| UCLA (run003) | OceanS | 3rd order | 3x40 | -O3 -ip -pc80 -tpp7 -axN -xN -auto -stack_temps | unknown | 305.7| unknown | 0.85| 77.0|\n| Rutgers 1.9 | OceanO | 3rd order | 2x32 | ifort 9.1, -fpp2 -openmp -pc80 -tpp7 -axW -xW -w95 -align dcommon -auto -stack_temps -c -O3 -IPF_fma -ip| unknown | 418.0| | 1.17| 106.4|\n| Rutgers 1.9 | OceanO | 3rd order | 1x1 | ifort 9.1, -fpp2 -openmp -pc80 -tpp7 -axW -xW -w95 -align dcommon -auto -stack_temps -c -O3 -IPF_fma -ip| unknown | 685.7| | 1.92| 106.4|\n| Rutgers 1.9 | OceanO | 3rd order | 3x40 | ifort 9.1, -fpp2 -openmp -pc80 -tpp7 -axW -xW -w95 -align dcommon -auto -stack_temps -c -O3 -IPF_fma -ip| unknown | 445.7| | 1.28| 106.4|\n| Rutgers 1.9 | OceanO | 3rd order | 2x16 | ifort 9.1, -fpp2 -openmp -pc80 -tpp7 -axW -xW -w95 -align dcommon -auto -stack_temps -c -O3 -IPF_fma -ip| unknown | 431.3| | 1.21| 106.4|\n| Rutgers 3.0 (run009) | OceanO | 3rd order | 3x40 | ifort v8.0, -O3 -ip -pc80 -tpp7 -axN 
-xN -auto -stack_temps| 530.5| 651.9| 79.8| 1.83| |\n| Rutgers 3.0 (run009) | OceanS | 3rd order | 3x40 | ifort v8.0 -O3 -ip -pc80 -tpp7 -axN -xN -auto -stack_temps| 482.6| 589.9| 80.0| 1.65| 446 |\n| Rutgers 3.0 w/o 2D advection (run010)| OceanS | 3rd order | 3x40 | -O3 -ip -pc80 -tpp7 -axN -xN -auto -stack_temps| 415.7| 536.8| 75.8| 1.51| |\n| Rutgers 3.0 w/o 2D advection (run010)| OceanS | 3rd order | 3x40 | -O2 | 478.1| 617.2| 75.9| 1.73| 64.1|\n| Delft3D 1 layer (3 min time steps) | | 3rd order | 1x1 | | unknown | 126.1| | | |\n| Delft3D 1 layer (4.5s time steps) | | 3rd order | 1x1 | | unknown | 4050.0| | 11.39| |\n| Delft3D 10 layer (4.5s time steps) | | 3rd order | 1x1 | | unknown | 34200.0| | 96.23| |
Thursday, July 21: heavy fog all day\nTuesday, Sep 27: fog at 8:21 am ET, 0.2 mile visibility
This document is a ~TiddlyWiki from A ~TiddlyWiki is an electronic notebook that is great for managing todo lists, personal information, and all sorts of things.\n\n@@font-weight:bold;font-size:1.3em;color:#444; //What now?// &nbsp;&nbsp;@@ Before you can save any changes, you need to enter your password in the form below. Then configure privacy and other site settings at your [[control panel|]] (your control panel username is //rsignell2//).\n<<tiddler tiddlyspotControls>>\n@@font-weight:bold;font-size:1.3em;color:#444; //Working online// &nbsp;&nbsp;@@ You can edit this ~TiddlyWiki right now, and save your changes using the "save to web" button in the column on the right.\n\n@@font-weight:bold;font-size:1.3em;color:#444; //Working offline// &nbsp;&nbsp;@@ A fully functioning copy of this ~TiddlyWiki can be saved onto your hard drive or USB stick. You can make changes and save them locally without being connected to the Internet. When you're ready to sync up again, just click "upload" and your ~TiddlyWiki will be saved back to\n\n@@font-weight:bold;font-size:1.3em;color:#444; //Help!// &nbsp;&nbsp;@@ Find out more about ~TiddlyWiki at [[|]]. Also visit [[TiddlyWiki Guides|]] for documentation on learning and using ~TiddlyWiki. New users are especially welcome on the [[TiddlyWiki mailing list|]], which is an excellent place to ask questions and get help. If you have a tiddlyspot related problem email [[tiddlyspot support|]].\n\n@@font-weight:bold;font-size:1.3em;color:#444; //Enjoy :)// &nbsp;&nbsp;@@ We hope you like using your site. Please email [[|]] with any comments or suggestions.
Trying to find out where to put the robots.txt file so that it shows up at\n\n\nLook at the "httpd.conf" file in /usr/local/apache2/conf and see what the "DocumentRoot" is set to\n{{{\n grep DocumentRoot /usr/local/apache2/conf/httpd.conf\n}}}
*Zachary Taylor - 12th U.S. President\n*William Howard Taft - 27th U.S. President\n*Cokie Roberts \n*Bing Crosby \n*Katherine Hepburn \n*Louisa May Alcott\n*Ralph Waldo Emerson\n*Margaret Bingham\n*Elizabeth Morris\n*Rich Signell\n*Katherine Signell\n*Julia Signell\n*Alex Signell
Grrr....\n\nDownloaded WinMorph 3.01 and spent some time indicating features on Middle Ground that I wanted morphed from our Sep 2006 survey to Nov 2006 survey. But the resulting movie was all black! A quick google turned up the fact that a 32-bit TIFF (or any other 32-bit format, for that matter) will produce a black movie if the alpha channel is blank. So the simple thing is to convert the 32 bit TIFF to 24 bit TIFF. But how to do that? I couldn't figure out how to do it with IrfanView, but googling again turned up that I could use ImageMagick (or more correctly the Mogrify tool from ImageMagick) thusly:\n{{{\nmogrify +matte my_tiff_image.tif\n}}}\nA quick gdalinfo confirmed that my 32 bit tiff became a 24 bit tiff by this process, and then WinMorph worked fine.\n
Grabbed some small test files:\n{{{\nrsignell@pikmin:~> ncks -O -x -v T,P,PB,QVAPOR,QCLOUD,QRAIN,U,V,W,PH,PHB,TSLB,SMOIS,SH2O\nrsignell@pikmin:~> ncks -O -x -v T,P,PB,QVAPOR,QCLOUD,QRAIN,U,V,W,PH,PHB,TSLB,SMOIS,SH2O\nrsignell@pikmin:~> ncks -O -x -v T,P,PB,QVAPOR,QCLOUD,QRAIN,U,V,W,PH,PHB,TSLB,SMOIS,SH2O\n}}}
We wanted to get only the "tar" files found in a bunch of subdirectories on the gsod ftp site, so we did\n{{{\n wget -r -A.tar\n}}}\nwhich works on gam and on laptop (wget 1.11.4, circa 2008), didn't work on blackburn (wget 1.10.2, circa 2005)\n\nRetrieved 2.9GB of data in 45 minutes.\n{{{\nrsignell@gam:~/gsod/2010$ ls | wc -l\n10457 \n}}}\nThere are 10,457 files in 2010!\nIf we stored all 12 variables in 10,457 files from
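Quick sanity check on that transfer: 2.9 GB in 45 minutes works out to roughly 1.1 MB/s (treating GB as binary gigabytes here — an assumption, since the original note doesn't say):

```python
# Average download rate for 2.9 GiB over 45 minutes.
gib = 2.9
seconds = 45 * 60
rate_mib_s = gib * 1024 / seconds   # MiB per second
```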
To list all packages with kernels:\n{{{\nrpm -qa | grep kernel\n}}}\nTo remove a specific kernel package:\n{{{\nrpm -e kernel-smp-2.6.9-34.EL\n}}}
Just a few notes here about NOAA's GEODAS site\n\n\n\nwhere you can download old charts as well as hi-res digital bathymetry and sidescan data.\n\nIf you click on "Get NOS Hydro Data" for example, you reach an\ninteractive map where you can select what type of data you are\ninterested in and then zoom in to see what's available. As an\nexample, I picked the layer for coastal multibeam data, zoomed into\nWoods Hole, saw a survey region that looked like what I wanted, then\nclicked on the "i" icon (for information) and was led to this page\nwhich shows all the products available:\n\n\n\nAs you can see, you can get the report from the survey as a pdf,\nimages, the bathy data as XYZ ASCII files or Fledermaus/Iview3D files\nfor instant 3D visualization. I recommend downloading the Iview3D\nfile first, installing the free Iview3D viewer, and taking a look.\nYou can see all sorts of cool features of the sea floor in The Hole -\nat 1/2 m resolution!\n\nWhat I do is convert these big ASCII files to NetCDF and 32 bit\nGeoTIFF images. GeoTIFF is just a tiff file that has special tags\ncontaining the georeferencing information (lon/lat extent, grid\nspacing, projection, datum, etc) and can be read by most mapping\nsoftware. It's an image format, but if you store it as 32 bits, you\ncan interpret the pixel values as real numbers, so it functions as a data format as well.\n\nI use the "xyz2grd" function from GMT (the Generic Mapping Tools) to\nconvert the xyz grid to a NetCDF grid. I then use the "Mirone"\nprogram (which can work stand-alone or with Matlab) to crop, clip,\nhill-shade, convert to other projections, output as GeoTIFF or Iview3D and more. You can get both the pre-compiled GMT tools for PC and Mirone at:\n\n\n\n(see for some\nof the things Mirone can do. I've been using it for several years,\nand though the interface and docs could use some work, it's got\nawesome functionality!)\n
list all conda environments:\n{{{\nconda info -e \n}}}\nremove all packages from an environment and then delete the environment:\n{{{\nconda remove -n my_env --all\n}}}\nto try to automagically build a recipe for a package available from pypi:\n{{{\nconda build pyoos --build-recipe\n}}}
{{{\n [a,b,c]\n a || b || c\n\n [[a,b,c]]\n a && b && c\n\n [[a,b],[c],[d],[e]] or [[a,b],c,d,e]\n (a && b) || c || d || e\n}}}\n
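Reading the notation above as "outer list = OR, inner list = AND", a minimal Python sketch of an evaluator (the function and names are my own, just to make the semantics concrete):

```python
def evaluate(expr, env):
    """Outer list ORs its items; an item that is itself a list ANDs its members.
    Bare names are looked up as booleans in env."""
    def term(item):
        if isinstance(item, list):
            return all(env[name] for name in item)
        return env[item]
    return any(term(item) for item in expr)

env = {"a": True, "b": False, "c": False, "d": False, "e": False}
print(evaluate(["a", "b", "c"], env))              # a || b || c -> True
print(evaluate([["a", "b", "c"]], env))            # a && b && c -> False
print(evaluate([["a", "b"], "c", "d", "e"], env))  # (a && b) || c || d || e -> False
```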
Want to see what someone has installed in their cygwin?\n\n{{{cygcheck -c}}}\n\nCool!
Govtrip => Vouchers=> Edit => Review and Sign=>Digital Signature\nThen proceed and write either "authorized" or "would not take gov't cc" (for cash payments) and then "submit completed document"
Quantity: about 1 pint\n\nIngredients:\n\n1 cup water\n1 1/3 cups pure cane sugar\n1 Tbsp. cinchona bark (available in some herb stores or online)\n2 Tbsp. powdered citric acid (found in the bulk section of most well-stocked grocery stores)\n1 lime, zested and juiced\n1 stalk lemongrass, diced\n\nIn a small saucepan, bring the sugar and water to a boil until the sugar dissolves, then turn the heat down to low. Add the cinchona bark, citric acid, lemongrass, lime zest and lime juice. Stir well and simmer for about 25 minutes. The syrup will be thin and runny. Remove from heat and let cool. Strain out the large chunks through a tea strainer or fine mesh colander into a 1 pint mason jar. Will keep for a month or more in the refrigerator. Use about 1/2 oz of syrup with 3 to 5 oz of sparkling water and 1.5 oz gin for an awesome gin and tonic.
Trick to make NaN values look white in imagesc, just like in pcolor:\n{{{\nA = magic(3);\nAA = A;\nAA(1) = NaN;                 % put a NaN in the data\nh = image(AA); colorbar\ncolormap(jet(9))\nimalpha = ones(size(AA));\nimalpha(isnan(AA)) = 0;      % make the NaN cells fully transparent\nset(h,'alphadata',imalpha)\n}}}
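For comparison, matplotlib gets the same effect almost for free, since imshow leaves masked cells unpainted so they show the (white) figure background (a sketch assuming numpy/matplotlib are installed; not part of the original note):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")              # headless backend
import matplotlib.pyplot as plt

A = np.arange(1.0, 10.0).reshape(3, 3)
A[0, 0] = np.nan
masked = np.ma.masked_invalid(A)   # mask the NaN cell

fig, ax = plt.subplots()
im = ax.imshow(masked, cmap="jet") # masked cell is left unpainted
fig.colorbar(im)
fig.savefig("nan_white.png")
```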
\n1. Fill out this moped (motorized bicycle) registration form:\n2. Send renewal form with stamped, self-addressed envelope and check to:\n\nRegistry of Motor Vehicles\nATTN: Moped Registration Renewal\nP.O. BOX 55889\nBoston, MA 02205
First, NCO 4.0.1 introduced a simpler method to turn fixed dimensions\ninto record dimensions:\n\n{{{\nncks --mk_rec_dmn time\n}}}\nDetails are here:\nThis saves a few lines and significantly reduces disk access.\nThe preferred way to attack your problem is to change time into the\nrecord dimension using this command once per input file.
To get ncview going have to start up xinit with cygwin using:\n{{{\nxinit -- -nolock\ntwm &\n}}}
Adriatic Sea:\n{{{\nm_proj('albers equal-area','parallels',[39 49],'clon',15,'lon',[7 23],'lat',[39 49],'ell','clrk66')\nclf;m_gshhs_f('patch',[.6 .6 .6]);dasp;shg\n}}}\n\nGreat Lakes:\n{{{\nm_proj('albers equal-area','parallels',[41 49],'clon',15,'lon',[-93 -76],'lat',[41 49],'ell','clrk66')\nclf;m_gshhs_f('patch',[.6 .6 .6]);dasp;shg\n}}}\n\n\nItalian Place names:\n{{{\ncurl -o citiesJSON.txt\n# grep -v fcodeName citiesJSON.txt > foo.txt\n# grep -v fcl foo.txt > foo2.txt\n# grep -v name foo2.txt > foo.txt\n# grep -v fcode foo.txt > foo2.txt\n# grep -v geonameId foo2.txt > foo.txt\n# grep -v wikipedia foo.txt > foo2.txt\nsed -e "s/\"/'/g"\n}}}\n\n
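An alternative to the grep/sed chain is to parse the JSON directly; a sketch that assumes the standard GeoNames citiesJSON layout ({"geonames": [...]}) — the sample payload below is made up:

```python
import json

# Stand-in for the downloaded citiesJSON.txt (GeoNames citiesJSON format)
payload = ('{"geonames": ['
           '{"name": "Venezia", "lat": "45.43", "lng": "12.33"}, '
           '{"name": "Trieste", "lat": "45.65", "lng": "13.78"}]}')

# pull out just the fields we care about
places = [(p["name"], float(p["lat"]), float(p["lng"]))
          for p in json.loads(payload)["geonames"]]
for name, lat, lon in places:
    print(f"{name}: {lat:.2f}, {lon:.2f}")
```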
Eric Bridger asked me for the outline of the FVCOM grids, and I didn't know how to do this, so here's what I came up with. \n\nFirst I downloaded Brian Blanton's old OPNML matlab routines from because it had a "plotbnd.m" script which sounds like the right thing. I put this in c:/rps/m_contrib/trunk/opmnl_matlab5. To use this script you need a "FEM grid structure", which is described in "fem_grid_struct.m". In that routine, if the boundary list isn't defined, it's created with "detbndy.m", so I made the following script to first plot the boundary using plotbnd, then extract the x,y positions and use join_cst.m to turn the collection of two point line segments into continuous coastline pieces, from longest to shortest.\n{{{\n%Script plotbnd_fvcom.m\nurl=''\nfout='gom2_bnd.txt';\n\n% Use NJ Toolbox ( for Matlab\nnc=mDataset(url);\n\n% Get Nodal Grid\nlon=nc{'lon'}(:);lat=nc{'lat'}(:);tri=nc{'nv'}(:).';\nh=nc{'h'}(:);\nclose(nc);\n\n% load into FEM grid struct required by OPNML tools\n% help FEM_GRID_STRUCT\'gom2';\nfem.e=double(tri);\nfem.x=double(lon);\nfem.y=double(lat);\nfem.z=double(h);\n% determine boundary segment list\nfem.bnd=detbndy(double(tri));\n% plot up boundary. Looks good?\nh=plotbnd(fem);\n% extract boundary positions from graphics object and join all these\n% two point segments\nx=get(h,'XData');\ny=get(h,'Ydata');\ncoast=[x(:) y(:)];\ncoast=join_cst(coast,.0001);\n% save as ascii\nsaveascii(fout,coast,'%12.6f %12.6f\sn');\n}}}\n\n
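For reference, here is a rough Python analogue of what join_cst.m does — joining two-point segments into polylines by matching endpoints within a tolerance (my own sketch, not a port of the original):

```python
def _close(p, q, tol):
    """True if points p and q coincide within tolerance tol."""
    return abs(p[0] - q[0]) <= tol and abs(p[1] - q[1]) <= tol

def join_segments(segments, tol=1e-4):
    """Join 2-point segments into polylines; longest pieces first."""
    segs = [list(s) for s in segments]
    lines = []
    while segs:
        line = segs.pop()
        grew = True
        while grew and segs:
            grew = False
            for i, s in enumerate(segs):
                # try to attach segment s to either end of the growing line
                if _close(line[-1], s[0], tol):
                    line.append(s[1])
                elif _close(line[-1], s[1], tol):
                    line.append(s[0])
                elif _close(line[0], s[0], tol):
                    line.insert(0, s[1])
                elif _close(line[0], s[1], tol):
                    line.insert(0, s[0])
                else:
                    continue
                segs.pop(i)
                grew = True
                break
        lines.append(line)
    return sorted(lines, key=len, reverse=True)

# Two segments that share the point (1, 0), plus one disconnected piece
pieces = join_segments([[(0, 0), (1, 0)], [(1, 0), (1, 1)], [(5, 5), (6, 5)]])
print([len(p) for p in pieces])  # [3, 2]
```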
open a 32-bit anaconda command prompt and type\n{{{\nactivate starcluster\nstarcluster terminate rps_cluster\n}}}
go to nomads3\nuse directory "tomcat/apache-tomcat-6.0.32", not "tomcat6".\n\n
The problem:\nWe find that <tomcat2>/logs/catalina.out is HUGE, and is full of "too many files open".\n\nWe can try to figure out what files are associated with tomcat by first finding the <PID>\n{{{\nps -ef | grep tomcat2\n}}}\nand then doing\n{{{\nsudo /usr/sbin/lsof -a -p <PID> | more\n}}}\nThis revealed a bunch of umb_massbay files. \n\nI also found that the root file partition "/" was full.\n{{{\ndu -h --max-depth=1\n}}}\nwas super handy.\n\nI also found out that the default limit on open files on a Linux system is 1024. You can check by doing\n{{{\nulimit -n\n}}}\nThen you can increase it by following this article and adding a line to /etc/security/limits.conf:\n\n{{{\n$more /etc/security/limits.conf\n* - nofile 4096\n}}}\nAfter this modification we have not had the "too many files open" problem. \n\n\n
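The same per-process limit can also be read from Python's standard library (Unix only; just a convenience check, not part of the original fix):

```python
import resource

# RLIMIT_NOFILE is the per-process cap on open file descriptors;
# the soft limit is the number `ulimit -n` reports
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft={soft} hard={hard}")
```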
{{{\nproj -I +init="EPSG:32619" -f %12.6f >\n}}}
{{{\nvs005 august 5, 2007\nvs008 sep 25, 2007 30 day run with 1 seds, MORFAC=100 (5 m of 800 micron sediment)\nvs009 sep 23, 2007 30 day run with 4 seds (1 m of 200,400,600,800 sediment)\nvs010 sep 23, 2007 30 day run with 5 seds (1 m of 50,100,200,400,800 sediment)\nvs011 sep 25, 2007 30 day run with 5 seds only on Middle Ground shoal (same as vs010)\n}}}
Currently in WWM, Aron has an\nunstructured grid, an implicit scheme, ultimate quickest for advection, and parallelization (with Joseph Zhang) using domain decomposition. \n\nWork for Hendrik:\n1. unstructured grid in ww3 (done)\n2. implicit scheme in ww3 (testing)\n3. ultimate quickest in ww3 (should be done by end of year)\n\nOther issues: currently WW3 parallelizes by doing each spectral component separately, but there is no domain decomposition. Aron & Joseph want to get some funding before they give that away.\n\nAron thinks the best testbed activities would bring the air, wave, hydro and sediment guys together to work on nearshore waves, currents, met and morphodynamics.