Viz Lecture 9 by Kurt
DRAFT DRAFT DRAFT
Kurt Schwehr
Copyright (C) 2003
kdschwehr _at_ ucsd dot edu
Viz Lecture Series
Lecture 9
Owens Lake Core OL-92
$Revision: 943 $
CONTENTS
INTRODUCTION
ETOPO2/SMITH AND SANDWELL
MISC PAPERS AND WEB SITES
RIVERS AND LAKES
USGS DEM
DLG
SDTS_AL
SRTM30
QUADRANGLE NAMES
TOPO MAP
GEOLOGIC MAP
HYDROLOGY
ESRI SAMPLE DATA
X-CORE'S PALEOIV TOOLS
REGIONS
HISTOGRAMS
EIGEN VALUE PLOTS
LOG LOG PLOTS
TERNARY PLOTS
QUICK ANALYSIS OF THEORY
SHAPE PLOTS
CORE PHOTOS
FIELD LOGS
GRAIN SIZE
OPEN INVENTOR - COIN3D.ORG VERSION
TODO
SAT PHOTOS
* INTRODUCTION
This is the presentation for an on-land drill core (OL-92) collected
by the USGS at Owens Lake in 1992. The idea was that the lake would
be a good paleoclimate recorder for the last several hundred thousand
years.
http://www.es.ucsc.edu/~jglen/playa.html
http://pubs.usgs.gov/of/of93-683/
The OL-92 cores were taken at:
-117.9611 36.38056
The core reaches a depth of 323 meters and covers about 750 kyr.
Smith, George I., 1996, Core OL-92 from Owens Lake, southeast
California: U.S. Geological Survey
* ETOPO2/SMITH AND SANDWELL
First, let's get a quick regional topography. We will again use
ETOPO2 and the Smith and Sandwell topography.
grdcut etopo2.grd -Gol-etopo2.grd -R-119/-117/36/37 -V
img2grd topo_8.2.img -R-119/-117/36/37 -Gol-topo8.2.grd -T1 -V
grd2xyz ol-etopo2.grd > ol-etopo2.xyz
grd2xyz ol-topo8.2.grd > ol-topo8.2.xyz
Now we need a GMT color palette table (ol.cpt). Each line gives the
bottom of a depth/elevation slice with its RGB color, then the top of
the slice and its color:
-200 210 180 140 0 210 180 140
0 255 255 0 200 255 128 0
200 255 128 0 400 255 0 0
400 255 0 0 700 198 226 255
700 0 0 255 1000 173 255 47
1000 0 255 128 2000 0 255 255
2000 165 42 42 3000 165 42 42
3000 255 255 255 15000 255 255 255
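For reference, here is a hypothetical sketch of how a GMT-style cpt
lookup works (an illustration only, not terrain2's actual code):
each row defines a z slice, and the color is linearly interpolated
within the slice.
#!/usr/bin/env python
# Hypothetical sketch of a GMT-style cpt color lookup; NOT terrain2's
# actual code.  Each cpt row is "z0 r0 g0 b0 z1 r1 g1 b1".
def read_cpt(filename):
    slices = []
    for line in open(filename):
        f = line.split()
        if 8 != len(f) or line.startswith('#'):
            continue                  # skip comments and B/F/N lines
        slices.append([float(v) for v in f])
    return slices

def z_to_rgb(slices, z):
    for z0, r0, g0, b0, z1, r1, g1, b1 in slices:
        if z0 <= z <= z1:
            t = (z - z0) / (z1 - z0)  # fraction of the way up the slice
            return (r0 + t*(r1-r0), g0 + t*(g1-g0), b0 + t*(b1-b0))
    return (128, 128, 128)            # gray for out-of-range z

print z_to_rgb(read_cpt('ol.cpt'), 150.0)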
terrain2 --in ol-etopo2.xyz --out ol-etopo2.iv --cpt ol.cpt
terrain2 --in ol-topo8.2.xyz --out ol-topo8.2.iv --cpt ../ol.cpt
* Misc papers and web sites
Has satellite imagery:
http://geochange.er.usgs.gov/sw/impacts/geology/owens/
http://www.es.ucsc.edu/grad/research/groups/paleomag/owrecord.html
http://www.es.ucsc.edu/grad/research/groups/paleomag/magstrat.html
http://geo-nsdi.er.usgs.gov/metadata/open-file/93-683/metadata.html
Glen, J.M. & Coe, R.S. (1997) Paleomagnetism and magnetic
susceptibility of Pleistocene sediments from drill hole OL-92, Owens
Lake, California. In: An 800,000-year paleoclimatic record from Core
OL-92, Owens Lake, Southeast California (eds G.I. Smith and
J.F. Bischoff), pp. 67-78. Geological Society of America (Special
Paper 317), Boulder, CO.
* RIVERS AND LAKES
http://www.246.dk/ciaworld.html
http://www.ks.uiuc.edu/~jim/personal/cartog/
http://www.ks.uiuc.edu/~jim/personal/cartog/download/CIAMap.tar
* USGS DEM - Digital Elevation Models
http://data.geocomm.com/dem/dem2xyzn/dem2xyzn.zip
http://data.geocomm.com/dem/dem2xyzn/
1:250K
http://data.geocomm.com/dem/
http://edcsgs9.cr.usgs.gov/glis/hyper/guide/1_dgr_demfig/states.html
http://edcsgs9.cr.usgs.gov/glis/hyper/guide/1_dgr_demfig/states/CA.html
wget http://edcftp.cr.usgs.gov/pub/data/DEM/250/F/fresno-e.gz
wget http://edcftp.cr.usgs.gov/pub/data/DEM/250/D/death_valley-w.gz
wget http://edcftp.cr.usgs.gov/pub/data/DEM/250/D/death_valley-e.gz
* DLG - Digital Line Graphs
http://edc.usgs.gov/geodata/
http://edc.usgs.gov/geodata/dlg_large/states.html
http://edcftp.cr.usgs.gov/pub/data/DLG/LARGE_SCALE/O/owens_lake_CA/
wget http://edcftp.cr.usgs.gov/pub/data/DLG/LARGE_SCALE/O/owens_lake_CA/hydrography/1654702.HY.sdts.tar.gz
seamless....
http://data.geocomm.com/catalog/US/61069/sublist.html
10 meters? http://mcmcweb.er.usgs.gov/status/
How to read these buggers?
http://g3dgmv.sourceforge.net/
http://www.vterrain.org/Elevation/dem.html DEM intro
* SDTS_AL
We need to be able to read SDTS files from the USGS. The SDTS++
library from the USGS seems overly complicated. It requires the
boost.org package, which in turn requires JAM instead of make. It
didn't build without a large amount of work, so it wouldn't be fun to
get the three into fink. As a result, I am going to use SDTS_AL from
http://gdal.velocet.ca/projects/
Here is how I put together a fink package for it. The default GNU
autoconf setup for this package does not run ranlib on the static
library (libsdts_al.a), and it does not know how to install. This
means that we must create a patch file to be applied.
tar xfz sdts_1_2.tar.gz
mv sdts_1_2 sdts_1_2-patched
tar xfz sdts_1_2.tar.gz
Now edit the sdts_1_2-patched directory to have the changes we need.
Then create the patch in the unified format:
diff -u -r sdts_1_2 sdts_1_2-patched > sdts_1_2-1.patch
The file should be /sw/fink/dists/local/main/finkinfo/sdts-al-1.2-1.patch
Here is the patch file:
diff -u -r sdts_1_2/Makefile.in sdts_1_2-patched/Makefile.in
--- sdts_1_2/Makefile.in 2003-02-09 15:53:26.000000000 -0800
+++ sdts_1_2-patched/Makefile.in 2003-10-06 07:27:24.000000000 -0700
@@ -1,4 +1,51 @@
+# Patterned after flex src/Makefile.in
+
+
+CXXFLAGS = @CXXFLAGS@ @CXX_WFLAGS@
+LIBS = @LIBS@ -lm
+CXX = @CXX@
+
+prefix = @prefix@
+exec_prefix = @exec_prefix@
+bindir = $(exec_prefix)/bin
+libdir = $(exec_prefix)/lib
+includedir = $(prefix)/include
+
+SHELL = /bin/sh
+srcdir = @srcdir@
+VPATH = @srcdir@
+# Should be these, but need a new configure file.
+#RANLIB = @RANLIB@
+#INSTALL = @INSTALL@
+#INSTALL_DATA = @INSTALL_DATA@
+#INSTALL_PROGRAM = @INSTALL_PROGRAM@
+RANLIB = ranlib
+INSTALL = /sw/bin/install -c
+INSTALL_DATA = ${INSTALL} -m 644
+INSTALL_PROGRAM = ${INSTALL}
+
+default: sdts2shp 8211view
+
+install: libsdts_al.a sdts2shp 8211view installdirs
+ $(INSTALL_PROGRAM) sdts2shp $(bindir)/sdts2shp
+ $(INSTALL_PROGRAM) 8211view $(bindir)/8211view
+ $(INSTALL_DATA) libsdts_al.a $(libdir)/libsdts_al.a
+ $(INSTALL_DATA) $(srcdir)/cpl_config.h $(includedir)/cpl_config.h
+ $(INSTALL_DATA) $(srcdir)/cpl_conv.h $(includedir)/cpl_conv.h
+ $(INSTALL_DATA) $(srcdir)/cpl_error.h $(includedir)/cpl_error.h
+ $(INSTALL_DATA) $(srcdir)/cpl_port.h $(includedir)/cpl_port.h
+ $(INSTALL_DATA) $(srcdir)/cpl_string.h $(includedir)/cpl_string.h
+ $(INSTALL_DATA) $(srcdir)/cpl_vsi.h $(includedir)/cpl_vsi.h
+ $(INSTALL_DATA) $(srcdir)/iso8211.h $(includedir)/iso8211.h
+ $(INSTALL_DATA) $(srcdir)/sdts_al.h $(includedir)/sdts_al.h
+ $(INSTALL_DATA) $(srcdir)/shapefil.h $(includedir)/shapefil.h
+
+installdirs:
+ [ -d $(bindir) ] || mkdir -p $(bindir)
+ [ -d $(libdir) ] || mkdir -p $(libdir)
+ [ -d $(includedir) ] || mkdir -p $(includedir)
+
OBJ = sdtsiref.o sdtscatd.o sdtslinereader.o sdtslib.o \
sdtspointreader.o sdtsattrreader.o sdtstransfer.o \
sdtspolygonreader.o sdtsxref.o sdtsrasterreader.o \
@@ -11,16 +58,9 @@
\
cpl_error.o cpl_vsisimple.o cpl_string.o cpl_conv.o cpl_path.o
-CXXFLAGS = @CXXFLAGS@ @CXX_WFLAGS@
-LIBS = @LIBS@ -lm
-CXX = @CXX@
-
-
-default: sdts2shp 8211view
-
libsdts_al.a: $(OBJ)
- ar r libsdts_al.a $(OBJ)
-
+ $(AR) ru $@ $(OBJ)
+ -$(RANLIB) $@
#
# SDTS library
Now we need an info file that controls the fink build. To install it
yourself, put it in /sw/fink/dists/local/main/finkinfo/sdts-al-1.2-1.info
Package: sdts-al
Version: 1.2
Revision: 1
Maintainer: Kurt Schwehr
Source: ftp://gdal.velocet.ca/pub/outgoing/sdts_1_2.tar.gz
#SourceDirectory: sdts_al-%v
SourceDirectory: sdts_1_2
Source-MD5: bcb3a88703c306253c3963830577c559
Patch: %f.patch
BuildDepends: fink ( >= 0.9.9 )
UpdateConfigGuess: true
# Note that prefix is currently ignored
CompileScript: <<
./configure --prefix=%i
make
<<
InstallScript: <<
make install
<<
DocFiles:
Description: SDTS Abstraction Library for reading spatial data (e.g. DEM and DLG)
DescDetail: <<
Spatial Data Transfer Standard (SDTS) library by Frank Warmerdam.
SDTS_AL, the SDTS Abstraction Library, is intended to be a relatively
easy to use library for reading vector data from SDTS TVP (Topological
Vector Profile) files, primarily DLG data from the USGS. It also
includes support for reading raster data such as USGS DEMs in SDTS
format. It consists of open source, easy to compile and integrate C++
code.
See also: http://mcmcweb.er.usgs.gov/sdts/
<<
License: BSD
Homepage: http://gdal.velocet.ca/projects/sdts/
* SRTM30
http://www.jpl.nasa.gov/srtm/cbanddataproducts.html
Shuttle Radar Topography Mission (SRTM)
http://dbwww.essc.psu.edu/
http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.jpg.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/W140N40.DMW.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/W140N40.HDR.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/W140N40.PRJ.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/W140N40.SCH.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.dem.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.dif.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.num.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.src.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.std.zip
wget http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/w140n40/w140n40.stx.zip
http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/GTOPO30_Documentation
http://edcftp.cr.usgs.gov/pub/data/srtm/SRTM30/SRTM30_Documentation
http://edcftp.cr.usgs.gov/pub/data/srtm/United_States_1arcsec/1arcsec/
http://edcftp.cr.usgs.gov/pub/data/srtm/United_States_1arcsec/1arcsec/N32W119.hgt.zip
http://edcftp.cr.usgs.gov/pub/data/srtm/PI_Processor/California/N36W117/1arcsec/
WOOOOHOOOO!!!
http://edcftp.cr.usgs.gov/pub/data/
Bastards! --- I have the GIS Data Depot site
http://edcftp.cr.usgs.gov/pub/data/DLG/100K/F/fresno-e_CA/
http://edcftp.cr.usgs.gov/pub/data/DLG/100K/F/fresno-e_CA/hydrography/1424719.HY.sdts.tar.gz
* QUADRANGLE NAMES
wget http://edcftp.cr.usgs.gov/pub/data/nationalatlas/countyp020.tar.gz
How to get quad names?
http://www.mapmart.com/pap_24main.htm
I clicked on Vector, did a zoom to decimal lat/long, then panned
around and used Excel to create this table:
Kearsarge Peak Independence Bee Springs Canyon Pat Keyes Canyon Lower Warm Springs West of Teakettle Junction Teakettle Junction
Mount Williamson Manzanar Union Walsh New York Butte Craig Canyon West of Ubehebe Peak Ubehebe Peak
Mount Whitney Mount Langley Lone Pine Dolomite Cerro Gordo Peak Nelson Range Jackass Canyon
Johnson Peak Cirque Peak Bartlett Owens Lake Keeler Santa Rosa Flat Lee Wash
Kern Peak Templeton Mountain Olancha Vermillion Canyon Centennial Canyon Talc City Hills Darwin
Casa Vieja Meadows Monache Mountain Haiwee Pass Haiwee Reservoirs Upper Centennial Flat Coso Peak China Gardens
Bonita Meadows Crag Peak Long Canyon Coso Junction Cactus Peak Petroglyph Canyon Louisiana Butte
Then I panned over to the Owens Lake section and grabbed images of
each of the available layers. They are in the Mapmart directory. The
lat/lon range:
-118.190/-117.741/36.277/36.484
img2grd topo_8.2.img -R-118.190/-117.741/36.277/36.484 -V -T1 -Gol-mapmart.grd
grd2xyz ol-mapmart.grd > ol-mapmart.xyz
terrain2 --in ol-mapmart.xyz --out ol-mapmart.vec.iv --tex mapmart.vec.gif
terrain2 --in ol-mapmart.xyz --out ol-mapmart.250k.iv --tex mapmart.250ktopo.gif
terrain2 --in ol-mapmart.xyz --out ol-mapmart.sat.iv --tex mapmart.sat.gif
* Topo Map
At topozone.com, enter "Owens Lake, CA". I saved the result as
ol-topozone.png. Settings: 1:100K series, Large view, 1:250000,
DD MM.MM, NAD27.
Looks like lots of great code!!!!
Another attempt at reading SDTS...
http://gdal.velocet.ca/projects/
Works...
mkdir 1424719.sdts
cd 1424719.sdts
tar xfz ../1424719.HY.sdts.tar.gz
sdts2shp HY01CATD.DDF -v
sdts2shp HY01CATD.DDF -m LE01 -o le01.shp
This produces an Arc shapefile.
What is this external DLG3 dictionary? It is supposed to be here:
http://mcmcweb.er.usgs.gov/sdts/
http://thor-f5.er.usgs.gov/sdts/datasets/tvp/dlg3/
http://thor-f5.er.usgs.gov/sdts/datasets/tvp/dlg3/dlg3sdts.ps
* Geologic Map
One of the things I would like for this area is a good geologic map.
I have a copy of a California geologic map in paper form. Let's get
it into a usable digital georeferenced format.
Charles W. Jennings, 1977, "Map No. 2 Geologic Map of
California," Department of Conservation, CA Division of Mines and
Geology
Following Christie Lindemann's notes from Lecture 7, I used Geomatica
to georeference the scanned map. I started with the paper map on an
Epson flatbed scanner. It is difficult to get a large map on this
small scanner. I used Photoshop to drive the scanner:
File -> Import -> TWAIN32
I chose 240 DPI (I think) and color, then saved the image as a tif
with LZW compression. This produced the following file:
imginfo jennings-ol-1977-scan.tif
Image file: jennings-ol-1977-scan.tif
File format: TIFF image
Dimensions (w,h,z,c): 2081, 1696, 1, 3
Page Size (w,h,z,c): 2081, 1, 1, 3
Data type: unsigned char
Dimension order: interleaved
Color model: RGB
Coordinate space: upper-left
Statistical Min/Max: 0 - 255
Display Min/Max: 0 - 255
Data Compression: Lempel-Ziv & Welch
Resolution unit: Inch
X resolution: 72
Y resolution: 72
I then used Adobe Illustrator to make an 8.5 x 11 inch printout of the
map to mark down information that I needed.
See Lecture 7 for how I georeferenced the image. For this case, I
only used 3 "Geographic Control Points" (GCPs), which is really not
the best. I believe the droids in the USGS BatCave (Johanna and
Geoff) said that 8 or 16 was better; I can't remember. More is
generally better. Here is the report that Geomatica generated... I
trimmed out some stuff.
Uncorrected File : ... Desktop\Kurt\jennings-ol-1977-scan.tif
Channels : 1 2 3
Size : 2081 P x 1696 L
Orthorectified File : ojennings-ol-1977-scan.pix
Upper Left : -118.878495 37.402800
Lower Right : -116.467335 35.746725
Clip Area : Entire Image
Order : 1
X Coefficients : Values
Const : 1.0595824999924771e+005
X : 9.3356249999311274e+002
Y : 1.3499999999840020e+002
Y Coefficients : Values
Const : 5.5127249999455715e+004
X : 1.0406249999493474e+002
Y : -1.1492500000014238e+003
GCP ID Status Image X (P) Image Y (L)
------------------------------------------------------------------
G0001 Active 792.8750 +/- 0.1000 325.6250 +/- 0.1000
G0002 Active 657.8750 +/- 0.1000 1474.8750 +/- 0.1000
G0003 Active 1591.4375 +/- 0.1000 1578.9375 +/- 0.1000
GCP ID Georef Georef X Georef Y
----------------------------------------------------------------
G0001 LONG/LAT D000 -118.0000 +/- 0.0000 37.0000 +/- 0.0000
G0002 LONG/LAT D000 -118.0000 +/- 0.0001 36.0000 +/- 0.0000
G0003 LONG/LAT D000 -117.0000 +/- 0.0001 36.0000 +/- 0.0001
I then exported the image to a number of formats. Here are the files
that I ended up with:
12901888 ojennings-ol-1977-scan.pix
903 ojennings-ol-1977.aux
11104203 ojennings-ol-1977.img
1263544 ojennings-ol-1977.jpg
10562048 ojennings-ol-1977.ppm
2213 ojennings-ol-1977.report.txt
10566526 ojennings-ol-1977.tif
Now we need to create a topography to drape the map image on, using
the orthorectified bounding box:
img2grd topo_8.2.img -R-118.878495/-116.467335/35.746725/37.402800 -Gjennings-topo8.2.grd -T1 -V
Note that the grd file produced is not exactly right:
img2mercgrd expects topo_8.2.img to be 10800 by 6336 pixels spanning 0/360.0/-72.005977/72.005977
img2mercgrd: To fit [averaged] input, your -R-118.878495/-116.467335/35.746725/37.402800 is adjusted to -R-118.9/-116.466666667/35.7300247339/37.4163987419
img2mercgrd: The output will be 73 by 63 pixels.
grd2xyz jennings-topo8.2.grd > jennings-topo8.2.xyz
jpegtopnm ojennings-ol-1977.jpg | ppmquant 255 | ppmtogif > ojennings-ol-1977.gif
terrain2 --in jennings-topo8.2.xyz --out jennings-topo8.2.iv --tex ojennings-ol-1977.gif
Since img2grd produces a grd that is offset from the image, we can
instead use a flat rectangle (jennings-rect.iv):
#Inventor V2.0 ascii
Separator {
Texture2 { filename "ojennings-ol-1977.gif" model DECAL }
TextureCoordinate2 { point [ 0 1, 1 1, 1 0, 0 0 ]}
#Normal { vector [ 0 0 1, 0 1 0, 0 1 0, 0 1 0 ]}
Coordinate3 { point [
-118.878495 37.402800 0,
-116.467335 37.402800 0,
-116.467335 35.746725 0,
-118.878495 35.746725 0,
] }
FaceSet { numVertices 4 }
}
* Hydrology
Since we are looking at the sedimentary input to Owens Lake, we would
like to display the hydrology. Jenn Hill provided two files in Arc
ungenerate/generate format for the US. FIX: source?? Python provides
a good way of bringing in these files and turning them into maps for
just the local area. In Lecture 7 we used the ungenerateLines2iv.py
program. That covers the basics of what we would like to do, but we
need to add some code to keep only the points in our region (sketched
below). There are two files: the first has all the rivers and
drainages, and the second has just the named features.
66461595 hydrog1.gen
17556381 hydrog1_rivers.gen
wc -l *.gen
2916900 hydrog1.gen
767053 hydrog1_rivers.gen
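Here is a minimal sketch of what ungenRegion2Lines.py needs to do,
assuming Arc ungenerate lines format (an ID line, then x,y vertex
lines, with END closing each feature). It simply drops out-of-region
vertices rather than doing true clipping, which is fine for a quick
map:
#!/usr/bin/env python
# Minimal sketch of ungenRegion2Lines.py.  Usage: in.gen out.iv W E S N
# Assumes ungenerate lines format: ID line, "x,y" vertex lines, END.
import sys

infile = open(sys.argv[1])
out = open(sys.argv[2], 'w')
w, e, s, n = [float(a) for a in sys.argv[3:7]]

points = []   # kept vertices as Inventor coordinate strings
counts = []   # vertices per polyline for the LineSet node
cur = 0       # vertices kept so far in the current feature
for line in infile.xreadlines():
    f = line.replace(',', ' ').split()
    if 2 == len(f):                     # a vertex line
        x, y = float(f[0]), float(f[1])
        if w <= x <= e and s <= y <= n:
            points.append('%s %s 0' % (f[0], f[1]))
            cur += 1
        continue
    if cur > 1: counts.append(cur)      # ID or END: close the polyline
    elif 1 == cur: points.pop()         # drop stranded single vertices
    cur = 0
out.write('#Inventor V2.1 ascii\nSeparator {\n Coordinate3 { point [\n')
out.write(',\n'.join(points))
out.write('\n ]}\n LineSet { numVertices [ %s ] }\n}\n'
          % ', '.join([str(c) for c in counts]))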
./ungenRegion2Lines.py hydrog1_rivers.gen hydrog1_rivers.iv -120 -114 34 38
./ungenRegion2Lines.py hydrog1.gen hydrog1.iv -120 -114 34 38
ivview jennings-rect.iv hydrog1_rivers.iv hydrog1.iv
* ESRI Sample Data
ArcView comes with some sample data for the US. This is probably not
the best way to get the data, but it worked.
With MS Windows 2000 or XP:
Find usa.apr in ESRI/ESRIDATA.
Double click usa.apr to start ArcView.
Close the view.
Click the script button.
Click new.
Load text file:
c:/esri/av_gis3-/arcview/samples/scripts/shp2gen.ave
Compile (the check button).
Go to the view of the USA.
Click on the check box and highlight what you want to export.
Window->Script
Click the running man to run the script.
Put your data somewhere.
Select which field you want to have be the label for each polyline.
109934 major-cities.gen
325261 majorhighway-names.gen
321124 majorhighways.gen
33453 majorlakes.gen
152402 majorrivers.gen
320683 us-states.gen
2012333 uscounties.gen
1135348 zipcodes-state.gen
1260562 zipcodes.gen
There are several formats of ungenerate/generate data. Let's start
with the cities.
head major-cities.gen
Bellingham, -122.468184, 48.743880
Havre, -109.679855, 48.543818
Anacortes, -122.630685, 48.492216
Mount Vernon, -122.315768, 48.421557
Oak Harbor, -122.628927, 48.307898
Minot, -101.296537, 48.233721
Kalispell, -114.317965, 48.199319
Williston, -103.631372, 48.162313
Port Angeles, -123.455849, 48.104984
North Marysville, -122.148848, 48.099251
csvRegion2iv.py major-cities.gen major-cities.iv -120 -114 34 38
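Here is a minimal sketch of what csvRegion2iv.py has to do, assuming
"name, lon, lat" lines like those above (the real script may also
emit text labels for the names):
#!/usr/bin/env python
# Minimal sketch of csvRegion2iv.py.  Usage: in.gen out.iv W E S N
# Keeps "name, lon, lat" points inside the region as a PointSet.
import sys
out = open(sys.argv[2], 'w')
w, e, s, n = [float(a) for a in sys.argv[3:7]]
pts = []
for line in open(sys.argv[1]):
    f = [v.strip() for v in line.split(',')]
    if 3 != len(f): continue            # skip names containing commas
    name, lon, lat = f
    if w <= float(lon) <= e and s <= float(lat) <= n:
        pts.append('%s %s 0' % (lon, lat))
out.write('#Inventor V2.1 ascii\nSeparator {\n')
out.write(' Coordinate3 { point [ %s ] }\n' % ',\n '.join(pts))
out.write(' PointSet {}\n}\n')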
Now we can use the same script from the hydrography on the highway map
ungenRegion2Lines.py majorhighways.gen majorhighways.iv -120 -114 34 38
* X-Core's PaleoIV tools
Now that we have some basemaps to give context for the OL-92 core, we
need to take a look at the core itself. Since the original digital
AMS files were lost, I typed in the data from the ANI-21 style
printouts that came from the AMS measurement program used on the
KLY-2 by Rosenbaum et al., 2000. To try to catch all typos, I entered
all of the important fields and then created a program to cross-check
whatever parameters possible. I then added the depths from a table
called OwensLakeAMS.xls or depth-lookup.xls. The end result is a
table called rosenbaum-ams-stripped.dat. This file is a
space-delimited text table. Here is the first line of the table:
12.89 F46 -4.49 54.73 0.0004 448.5 21.2 698.2 1.0075 1.0046 0.9878 0.0003 0.0003 0.0003 12.6 2.2 1.9 1.003 1.017 1.020 1.022 0.706 0.703 0.160 1.014 270 0 108 10 3 80 1.0046 1.0070 0.9885 0.0002 -0.0033 0.0009 07/13/96
The table columns are:
depth, sample name, sample holder, mean susceptibility, Normed
s. err/standard error, F anisotropy test, F12 anisotropy test, F23
anisotropy test, Normed susc (3 values), Normed susc error (3
values), E12, E23, E13, L shape, F shape, P shape, P prime, T
shape, U shape, Q shape, E shape, Geographic dec (3 values),
Geographic inc (3 values), Geographic normed tensor (6 values)
That's a lot in one table. There are a couple things that are
important to note. These are documented by the f6.png image in
src/OL92. Equations marked in this table are from Lisa Tauxe's
Paleomagnetic Principles and Practice book.
Normed s. err. -- sigma in equation 5.15
Tests for anisotropy -- eq 5.22
Normed princ. susc. -- eigenvalues K1-3, which are 3 * tau
95% conf angles -- p. 183, eq 5.21
Geographic dec, inc -- eigenvectors V1-3
Normed-tensor -- s_bar 1-6
ChiB - Bulk Susc -- Xb = (s_bar1 + s_bar2 + s_bar3)/3 (eq 5.23)
Yikes. Well, now that that is done, let's start looking at the data.
There is a C++ program called processOL92 that creates a number of
files that can be used with the PaleoIV tools.
export XCOREROOT=${HOME}/projects/xcore
export OL92DATA=${XCOREROOT}/data/OL92
With the PaleoIV programs, the assumption is that core depths are in
centimeters, so we need to change the depth entries in the table from
meters to centimeters. Note that units may be changed or made
configurable in the future.
cp ${OL92DATA}/rosenbaum-ams-stripped.dat r-ams-meters.dat
depthm2cm.py r-ams-meters.dat r-ams-cm.dat
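depthm2cm.py only has to scale the first column by 100; a minimal
sketch:
#!/usr/bin/env python
# Minimal sketch of depthm2cm.py: scale column 1 from meters to cm.
import sys
out = open(sys.argv[2], 'w')
for line in open(sys.argv[1]):
    f = line.split()
    if not f: continue                 # skip blank lines
    f[0] = '%g' % (100.0 * float(f[0]))
    out.write(' '.join(f) + '\n')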
processOL92 r-ams-cm.dat ol92ams
ls -l
2379 ol92ams-bulkSus.dv
1751 ol92ams-samplename.dt
2320 ol92ams-shape.dt
4575 ol92ams-tau-all.dv3
2379 ol92ams-tau1.dv
2379 ol92ams-tau2.dv
2379 ol92ams-tau3.dv
3729 ol92ams-v1.ddi
3713 ol92ams-v2.ddi
3799 ol92ams-v3.ddi
Now we need to convert each of these data tables to Inventor scene
graph files. I have created a short bash script that converts all of
the files (make-paleoiv.bash):
#!/bin/bash
for file in *.dv; do
    dv2iv ${file} ${file%%.dv}
done
for file in *.dt; do
    dt2iv ${file} ${file%%.dt}
done
for file in *.ddi; do
    ddi2iv ${file} ${file%%.ddi}
done
Now if you take a look at the directory where we are building these
files, you will see:
ol92ams-bulkSus.iv ol92ams-v1-horz.iv ol92ams-v2-say-di.iv
ol92ams-samplename.iv ol92ams-v1-say-depth.iv ol92ams-v2-url.iv
ol92ams-shape.iv ol92ams-v1-say-di.iv ol92ams-v3-ALL.iv
ol92ams-tau1.iv ol92ams-v1-url.iv ol92ams-v3-depth-text.iv
ol92ams-tau2.iv ol92ams-v2-ALL.iv ol92ams-v3-di-text.iv
ol92ams-tau3.iv ol92ams-v2-depth-text.iv ol92ams-v3-di.iv
ol92ams-v1-ALL.iv ol92ams-v2-di-text.iv ol92ams-v3-dmag-iv.iv
ol92ams-v1-depth-text.iv ol92ams-v2-di.iv ol92ams-v3-horz.iv
ol92ams-v1-di-text.iv ol92ams-v2-dmag-iv.iv ol92ams-v3-say-depth.iv
ol92ams-v1-di.iv ol92ams-v2-horz.iv ol92ams-v3-say-di.iv
ol92ams-v1-dmag-iv.iv ol92ams-v2-say-depth.iv ol92ams-v3-url.iv
That is a lot of stuff, to say the least. We can clean this up quite
a bit, since much of it was designed for AF or thermal demagnetization
data. Other files are redundant:
\rm *say* *dmag* *ALL* *url*
\rm ol92ams-v[23]-depth-text.iv
Now we have a more manageable set of files:
ol92ams-bulkSus.iv ol92ams-v1-depth-text.iv ol92ams-v2-horz.iv
ol92ams-samplename.iv ol92ams-v1-di-text.iv ol92ams-v3-di-text.iv
ol92ams-shape.iv ol92ams-v1-di.iv ol92ams-v3-di.iv
ol92ams-tau1.iv ol92ams-v1-horz.iv ol92ams-v3-horz.iv
ol92ams-tau2.iv ol92ams-v2-di-text.iv
ol92ams-tau3.iv ol92ams-v2-di.iv
Now we are ready to take a look!!
ivview *.iv
Now that we see the data, it is obviously in 4 groups. We would like
to view the groups side by side, so in a separate directory we will
split them up and translate each into place.
mkdir GroupPaleoIV && cd GroupPaleoIV
cp ../r-ams-cm.dat .
awk '{print $1}' r-ams-cm.dat | less
We can construct a Python program that splits the AMS file based on
depth (splitdepth.py). The heart of it looks like this (g1 through g4
are open output files; the depth cutoffs are reconstructed from the
gaps between sample clusters):
for line in infile.xreadlines():
    s = line.split()
    d = float(s[0])
    if d < 5000: g1.write(line)
    elif d < 9000: g2.write(line)
    elif d < 11000: g3.write(line)
    else: g4.write(line)
Running it reports the size and depth range of each group:
g2.dat: N = 22 <7815/8513>
g3.dat: N = 30 <9965/10358>
g4.dat: N = 35 <11550/11970>
Mac OS X Note: To find out which package in fink provides
/sw/bin/minmax, we can use the underlying Debian package control
program. All we have to do is ask it about the file:
dpkg -S minmax
gmt: /sw/share/man/manl/minmax.l
gmt: /sw/bin/minmax
Now we need to turn each group into Inventor files. A small bash
script will do the job:
#!/bin/bash
declare -ar groups=( g1.dat g2.dat g3.dat g4.dat )
for g in "${groups[@]}"; do
    processOL92 $g ol92-${g%%.dat}
done
Then run the same make-paleoiv.bash script to generate all the
Inventor files. This generates a whopping 132 iv files, which can be
trimmed down to 64:
/bin/rm -f *say* *dmag* *ALL* *url* *-v[23]-depth-text.iv
Now we need top-level files to put it all together: one for each core
section, and then one overall file. We can use a simple Python script
that puts all of the command-line args together in one file
(toplevel.py):
#!/usr/bin/env python
import sys
print "#Inventor V2.1 ascii"
for name in sys.argv[1:]:
    print "Separator { File { name \"%s\" } }" % name
Now run the program using bash glue:
#!/bin/bash
for file in g?.dat; do
    name=${file%%.dat}
    names=`ls *${name}*.iv | grep -v TOP`
    toplevel.py ${names} > ol92-$name-TOP.iv
done
Now we would like to have a file that puts them all in place:
toplevel.py *TOP*.iv > ol92-groups-ALL.iv
Now edit ol92-groups-ALL.iv and add translations to get them into
place:
#Inventor V2.1 ascii
Separator {
Translation { translation 0 0 +1289 }
File { name "ol92-g1-TOP.iv" }
}
Separator {
Translation { translation 0 100 +7815 }
File { name "ol92-g2-TOP.iv" }
}
Separator {
Translation { translation 0 200 +9965 }
File { name "ol92-g3-TOP.iv" }
}
Separator {
Translation { translation 0 300 +11550 }
File { name "ol92-g4-TOP.iv" }
}
* REGIONS
Rosenbaum 2000 defines a number of regions that we need to mark in the
3D world. From "4. Results and discussion", they have:
19.0-12.5 m -- fluidized - ref for extreme deformation
27.7-20.4 m -- reference for undisturbed sediment
85.2-78.1 m -- disrupted by shear fractures (INC K3 and STD K3 like 19-12)
103.6-99.6 m -- minimal deformation by AMS
Yet in part two, they have (p. 418):
012.5m - 027.7m Mono Lake Excursion
020.4m - 027.7m undeformed
078.1m - 085.2m Blake
099.6m - 103.6m Jamaica/Biwa I
115.5m - 119.7m Pringle Falls
* HISTOGRAMS
cd ShapePlots
processOL92 r-ams-cm.dat ol92ams
Make sure that it is max, int, min as order matters:
awk '{print $8,$9,$10, $8+$9+$10}' ol92ams-eigen.dat
awk '{print $11, $12}' *eigen* | sort -u
Isotropic Triaxial
Oblate Oblate
Oblate Triaxial
Triaxial Oblate
Triaxial Triaxial
awk '{print $11, $12}' *eigen* | egrep -v "Oblate Oblate|Triaxial Triaxial"
Isotropic Triaxial
Oblate Triaxial
Triaxial Oblate
This changes 3 values, at depths:
1289
10349
11681
From now on, use column 12 for the shape name... the original
typed-in value.
awk '{print $1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$12}' ol92ams-eigen.dat > ol92ams-eigen2.dat
awk '{print -$1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11}' ol92ams-eigen2.dat > ol92ams-eigen2-neg.dat
\rm *.{dv,dt,dv3,ddi} ol92ams-eigen.dat
grep Triaxial ol92ams-eigen2.dat | awk '{print $7}' > Triaxial.v3inc
grep Oblate ol92ams-eigen2.dat | awk '{print $7}' > Oblate.v3inc
makeHistogramTable.py Oblate.v3inc 0 90 45 > oblate.tab
makeHistogramTable.py Triaxial.v3inc 0 90 45 > triaxial.tab
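makeHistogramTable.py takes a file of values, a minimum, a maximum,
and a bin count, and writes "bin count" pairs that gnuplot's steps
style can draw. A minimal sketch:
#!/usr/bin/env python
# Minimal sketch of makeHistogramTable.py.  Usage: file min max nbins
# Writes "bin_start count" pairs for gnuplot's steps style.
import sys
vals = [float(l) for l in open(sys.argv[1]) if l.strip()]
lo, hi = float(sys.argv[2]), float(sys.argv[3])
nbins = int(sys.argv[4])
width = (hi - lo) / nbins
counts = [0] * nbins
for v in vals:
    i = int((v - lo) / width)
    if i == nbins: i -= 1              # put v == max in the last bin
    if 0 <= i < nbins: counts[i] += 1
for i in range(nbins):
    print lo + i * width, counts[i]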
gnuplot
set terminal pdf
set output "OL92-shape-histogram.pdf"
set xlabel "V3 inclination"
set ylabel "Samples"
set grid
set title "V3 Inclination Histogram by Shape"
set key left
plot "oblate.tab" with steps, "triaxial.tab" with steps
set output "OL92-shape-histogram.ps"
set terminal postscript landscape enhanced color
plot "oblate.tab" with steps, "triaxial.tab" with steps
----------
Now we need to plot each region.
../getDepthRange.py 1249 1901 ol92ams-eigen2.dat eigs-012m.dat
../getDepthRange.py 2039 2771 ol92ams-eigen2.dat eigs-020m.dat
../getDepthRange.py 7809 8521 ol92ams-eigen2.dat eigs-078m.dat
../getDepthRange.py 9959 10361 ol92ams-eigen2.dat eigs-099m.dat
wc -l ../GroupPaleoIV/??.dat
35 ../GroupPaleoIV/g1.dat
22 ../GroupPaleoIV/g2.dat
30 ../GroupPaleoIV/g3.dat
35 ../GroupPaleoIV/g4.dat
wc -l ei*.dat
12 eigs-012m.dat # fluidized
22 eigs-020m.dat # undisturbed
22 eigs-078m.dat # shear fractures
30 eigs-099m.dat # minimal deformation
86 total
The groups in this section are from section 4.
ln -s eigs-012m.dat fluidized.dat
ln -s eigs-020m.dat undisturbed.dat
ln -s eigs-078m.dat shearfrac.dat
ln -s eigs-099m.dat minimaldef.dat
awk '{print $7}' fluidized.dat > fluidized.v3inc
awk '{print $7}' undisturbed.dat > undisturbed.v3inc
awk '{print $7}' shearfrac.dat > shearfrac.v3inc
awk '{print $7}' minimaldef.dat > minimaldef.v3inc
# 18 makes 5 degree bins
# 30 makes 3 degree bins
../makeHistogramTable.py fluidized.v3inc 0 90 30 > fluidized-v3inc.tab
../makeHistogramTable.py undisturbed.v3inc 0 90 30 > undisturbed-v3inc.tab
../makeHistogramTable.py shearfrac.v3inc 0 90 30 > shearfrac-v3inc.tab
../makeHistogramTable.py minimaldef.v3inc 0 90 30 > minimaldef-v3inc.tab
NOTE: Lisa's pmag package has a cdf program too.
It turns out that making CDFs is a better idea: they more clearly
illustrate what I am trying to see. Now I want to be able to break
things into triaxial and oblate. I wrote a bash script that uses
makeCDF.py to create all of the data files, histograms, and cdf data
files (do-all-shape-hist.bash):
declare -ar names=( fluidized undisturbed shearfrac minimaldef )
for name in "${names[@]}"; do
    grep -i triaxial ${name}.dat > ${name}-tri.dat
    grep -i oblate ${name}.dat > ${name}-obl.dat
    awk '{print $7}' ${name}-tri.dat > ${name}-tri.v3inc
    awk '{print $7}' ${name}-obl.dat > ${name}-obl.v3inc
    ../makeHistogramTable.py ${name}-tri.v3inc 0 90 18 > ${name}-tri-v3inc.tab
    ../makeHistogramTable.py ${name}-obl.v3inc 0 90 18 > ${name}-obl-v3inc.tab
done
for file in *.v3inc; do
    ../makeCDF.py $file > ${file}.cdf
done
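makeCDF.py just sorts the values and emits the cumulative fraction at
each one; a minimal sketch:
#!/usr/bin/env python
# Minimal sketch of makeCDF.py: "value cumulative_fraction" pairs.
import sys
vals = [float(l) for l in open(sys.argv[1]) if l.strip()]
vals.sort()
n = float(len(vals))
for i in range(len(vals)):
    print vals[i], (i + 1) / n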
Then use shape-hist.gnuplot to generate all the pdf plots:
gnuplot shape-hist.gnuplot
* EIGEN VALUE PLOTS
Now we will make some plots to show how properties change with depth
in the core. The first thing to do is to reverse the sign of the
depths: with depth positive down, gnuplot would draw down-core going
up, so we negate the depths to put deeper samples lower on the plots.
cd Shape2
awk '{print -$1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11}' ol92ams-eigen2.dat > ol92ams-eigen2-neg.dat
We can make a quick pdf that contains three plots in one file to see
what looks good:
set terminal pdf
set output "OL92-taus.pdf"
plot "ol92ams-eigen2-neg.dat" using 8:1 with linespoints, \
"ol92ams-eigen2-neg.dat" using 9:1 with linespoints, \
"ol92ams-eigen2-neg.dat" using 10:1 with linespoints
plot "ol92ams-eigen2-neg.dat" using 8:1 with points, \
"ol92ams-eigen2-neg.dat" using 9:1 with points, \
"ol92ams-eigen2-neg.dat" using 10:1 with points
plot "ol92ams-eigen2-neg.dat" using 8:1 with steps, \
"ol92ams-eigen2-neg.dat" using 9:1 with steps, \
"ol92ams-eigen2-neg.dat" using 10:1 with steps
Looking at these plots, it is clear that we want the linespoints
style, but we do not want lines drawn between the groups. Rather than
use the confusing splitdepth.py, this time we will use another small
Python program (getDepthRange.py) to extract each region:
../getDepthRange.py -3000 0 ol92ams-eigen2-neg.dat g0.dat
../getDepthRange.py -9000 -3000 ol92ams-eigen2-neg.dat g1.dat
../getDepthRange.py -11000 -9000 ol92ams-eigen2-neg.dat g2.dat
../getDepthRange.py -22000 -11000 ol92ams-eigen2-neg.dat g3.dat
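getDepthRange.py is about as simple as a filter gets; a minimal
sketch (whether the interval ends are open or closed is an
assumption):
#!/usr/bin/env python
# Minimal sketch of getDepthRange.py.  Usage: min max infile outfile
# Keeps rows whose first column (depth) falls in [min, max).
import sys
dmin, dmax = float(sys.argv[1]), float(sys.argv[2])
out = open(sys.argv[4], 'w')
for line in open(sys.argv[3]):
    f = line.split()
    if f and dmin <= float(f[0]) < dmax:
        out.write(line)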
The file shape.gnuplot plots Tau 1,2, and 3 down core. It also
introduces a number of new gnuplot features. Take special note of
these: nokey, ytics, size, label, and linespoint styles.
set nokey
set size ratio 1.5
set ytics 500
set label 1 "Fluidized" at graph 0.02,0.96
set label 2 "Undisturbed" at graph 0.02,0.9
plot "g0.dat" using 8:1 with lp lt 3 pt 4, \
"g1.dat" using 8:1 with lp lt 3 pt 4, \
"g2.dat" using 8:1 with lp lt 3 pt 4, \
"g3.dat" using 8:1 with lp lt 3 pt 4, \
...
Now, to go along with the plot, we should bring in the standard
paleomagnetic plot of inclination, plus the V3 inclination to go with
it. These are in the depth-lookup table provided by Rosenbaum.
Sample, Depth (meters), Paleomagnetic Inclination (degrees), Mass (g),
Magnetic Susceptibility (m3/kg), mean susceptibility (SI vol=1m3),
SIRM (Am2/kg), SIRM/MS (A/m),
L, F, P, P', T, Q,
K1 Declination (degrees), K1 Inclination (degrees),
K3 Declination (degrees), K3 Inclination (degrees)
6.80 32.2
7.40 32.3
F46 12.89 41.0 5.966 7.34E-08 4.38E-10 4.24E-04 5774 1.003 1.017 1.020 1.022 0.706 0.2 269.7 9.7 108.1 79.8
F47 12.91 45.9 6.100 7.44E-08 4.54E-10 4.54E-04 6107 1.002 1.022 1.023 1.026 0.854 0.1 351.9 4.0 155.6 85.8
We can use a small Python program (getCols.py) to pull out the 2nd
and 3rd columns:
import sys
infile = open(sys.argv[1])
for line in infile.xreadlines():
    s = line.split(' ')
    if 3 > len(s): continue
    depth = "none"
    try:
        depth = -100 * float(s[1])   # meters -> negative centimeters
    except ValueError:
        print "no float: ",
    print depth, s[2]
../getCols.py depth-lookup.txt | grep -iv depth | egrep '[0-9]' > inc.dat
Now we want a way to know whether the fabric is triaxial or oblate.
We can do this by creating two data files that have depth and zero
inclination: one has points for triaxial samples, the other oblate.
grep Oblate ol92ams-eigen2-neg.dat | awk '{print $1,0.0}' > depth-oblate.dat
grep Triaxial ol92ams-eigen2-neg.dat | awk '{print $1,0.0}' > depth-triaxial.dat
gnuplot < inc.gnuplot
plot [-90:90] [-12000:-1000] \
"inc.dat" using 2:1 with lp lt 2,\
"g0.dat" using 7:1 with lp lt 1,\
"g1.dat" using 7:1 with lp lt 1,\
"g2.dat" using 7:1 with lp lt 1,\
"g3.dat" using 7:1 with lp lt 1,\
"depth-triaxial.dat" using 2:1 with p pt 6,\
"depth-oblate.dat" using 2:1 with p lt 3 pt 8
We can take all of these pdfs and combine them into a poster using
Adobe Illustrator.
POSTER SIZE FIX: 36in wide x however long
* LOG LOG PLOTS
Looking through the plots, it appears that the F12 and F23 tests may
hold the key. Since these values get large quickly, a log-log plot is
probably the way to go. We would like to make a table of depth, F,
F12, and F23:
awk '{print -$1, $6, $7, $8}' r-ams-cm.dat > f.dat
Then build a gnuplot file to take a look at the plots.
../getDepthRange.py -1901 0 f.dat f-g0-fluid.dat
../getDepthRange.py -3000 -1901 f.dat f-g1-undef.dat
../getDepthRange.py -9000 -3000 f.dat f-g2-shear.dat
../getDepthRange.py -11000 -9000 f.dat f-g3-minim.dat
../getDepthRange.py -22000 -11000 f.dat f-g4-unknw.dat
Next, I wanted to take a look at F,F12,and F23 along with Tau1, Tau2,
and Tau3 plots in 3D. This is easy to do in gnuplot with the splot
command. For example:
splot "g2.dat" using 8:9:10
Then use the mouse in the window to pan around the plot. Note the
"view" display at the bottom. You can use these values with "set
view" commands. From this I created a mini movie of panning around
the graph:
for file in *.png; do
    pngtopnm $file | ppmquant 255 | ppmtogif > ${file}.gif
done
gifsicle --colors 20 --delay=100 --loop ao-tau-*.gif > anim-ao-tau.gif
Then drag the anim-ao-tau.gif to your web browser.
Now, it looks like the F12 and F23 tests classify about 75% of the
data as triaxial, which by my standards is not good data to use. That
is a pretty bad pass rate. We should plot the Ardath data to see what
its pass rates are. ardath.hext is a flattened hext file where each
sample is on one line.
grep as1 ardath.hext > ardath-as1.hext
grep as2 ardath.hext > ardath-as2.hext
grep as3 ardath.hext > ardath-as3.hext
FIX: finish this plot description.
* TERNARY PLOTS
There are a number of ways to generate ternary plots. They are not
the easiest plots to make, since not many fields outside of
geology/geochemistry seem to use them.
gnuplot cannot directly handle ternary plots (FIX: fix gnuplot), but
there is at least one example of ternary plotting being done with a
trick:
Lisa's pmag program can plot ternaries from the normed-tensor and
sigma values (.s files). First we will do the easiest thing, which is
to plot all of the values:
Recall the column layout of rosenbaum-ams-stripped.dat from above.
Numbering the columns against the first line of data:
1 2 3 4 5 6 7 8 9 10 11
12.89 F46 -4.49 54.73 0.0004 448.5 21.2 698.2 1.0075 1.0046 0.9878
12 13 14 15 16 17 18 19 20 21 22
0.0003 0.0003 0.0003 12.6 2.2 1.9 1.003 1.017 1.020 1.022 0.706
23 24 25 26 27 28 29 30 31 32 33 34 35
0.703 0.160 1.014 270 0 108 10 3 80 1.0046 1.0070 0.9885 0.0002
36 37 38
-0.0033 0.0009 07/13/96
awk '{print $32,$33,$34,$35,$36,$37,$5}' rosenbaum-ams-stripped.dat > all.s
cat all.s | s_tern | plotxy
mv mypost all-tern.ps
cat all.s | s_tern -p | plotxy
mv mypost all-tern-p.ps
Now we try the same using gnuplot. See the above URL for the ternary
bnd file.
awk '{print -$1,log($6),log($7),log($8),$9/3.,$10/3.,$11/3.}' ../r-ams-cm.dat > all.dat
# 1 2 3 4 5 6 7
# depth, F, F12, F23, tau1, tau2, tau3
../getDepthRange.py -1901 0 all.dat g0-fluid.dat
../getDepthRange.py -3000 -1901 all.dat g1-undef.dat
../getDepthRange.py -9000 -3000 all.dat g2-shear.dat
../getDepthRange.py -11000 -9000 all.dat g3-minim.dat
../getDepthRange.py -22000 -11000 all.dat g4-unknw.dat
awk '{print $5, $6, $7}' all.dat > all-tau.dat
awk '{print $2, $3, $4}' all.dat > all-f.dat
# Now we need to get everything to 0..1 scale
for file in g?-*.dat; do
    awk '{print $5, $6, $7}' ${file} > _${file%%.dat}-tau.dat
    ./rescale3.py _${file%%.dat}-tau.dat all-tau.dat > _${file%%.dat}-tau-scl.dat
    awk '{print $2, $3, $4}' ${file} > _${file%%.dat}-f.dat
    ./rescale3.py _${file%%.dat}-f.dat all-f.dat > _${file%%.dat}-f-scl.dat
done
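rescale3.py normalizes the three columns of a group file to 0..1
using the min/max over the full data set, so every group plots on
common axes; a minimal sketch:
#!/usr/bin/env python
# Minimal sketch of rescale3.py.  Usage: subset.dat all.dat
# Rescales each of subset's 3 columns to 0..1 using the min/max of
# the corresponding column over the full data set.
import sys
sub = [[float(v) for v in l.split()] for l in open(sys.argv[1]) if l.strip()]
ref = [[float(v) for v in l.split()] for l in open(sys.argv[2]) if l.strip()]
for c in range(3):
    col = [row[c] for row in ref]
    lo, hi = min(col), max(col)
    for row in sub:
        row[c] = (row[c] - lo) / (hi - lo)
for row in sub:
    print row[0], row[1], row[2]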
* QUICK ANALYSIS OF THEORY
So now we have a theory that the Hext F test will work well, but I'm
not sure what confidence level I should use: 95.0%, 99.0%, 99.5%, or
99.9%. A simple Python program can check how the samples do at each
confidence level:
../numPassTest.py f-g0-fluid.dat 2 > confidences.txt
../numPassTest.py f-g1-undef.dat 2 >> confidences.txt
../numPassTest.py f-g2-shear.dat 2 >> confidences.txt
../numPassTest.py f-g3-minim.dat 2 >> confidences.txt
../numPassTest.py f-g4-unknw.dat 2 >> confidences.txt
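A minimal sketch of what numPassTest.py might look like, assuming the
second argument is the 1-based column holding the F statistic. The
95% critical value matches the finv result below; the 99% value is
from standard F(5,9) tables:
#!/usr/bin/env python
# Minimal sketch of numPassTest.py.  Usage: file column
# Counts values exceeding the F(5,9) critical value at each
# confidence level; critical values are from standard F tables.
import sys
col = int(sys.argv[2]) - 1
vals = [float(l.split()[col]) for l in open(sys.argv[1]) if l.strip()]
print sys.argv[1], 'N =', len(vals)
for conf, fcrit in ((95.0, 3.4817), (99.0, 6.06)):
    npass = len([v for v in vals if v > fcrit])
    print '  %.1f%%: %d of %d pass (F > %g)' % (conf, npass, len(vals), fcrit)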
To calculate the F threshold at a given confidence for F and (F12 or
F23), we can use MATLAB, SciPy, or R. The CRC Math Tables book works,
but only has tables for a couple of percentages. SciPy gives a NaN
result on my Mac, so we will start with MATLAB at the 95% confidence
level:
% F: answer = 3.4817
finv(.95,5,9)
% F12 or F23: answer = 4.2565
finv(.95,2,9)
Now a solution that uses Python and the "R" statistical package. In
fink, you can install "rpy" or, in my case, "rpy-py23", which uses
the "r-base" package.
python
# Thanks go to Robert Kern
from rpy import *
print r.qf(0.95,2,9) # For F12 or F23, gives: 4.256495
print r.pf(4.256495,2,9) # Gives the p for that F of 0.95
The scipy command would be something like this:
python
from scipy.stats import distributions
print distributions.f.ppf(0.95,2,9)
# Here is where it returns NaN for me...
* SHAPE PLOTS
r-ams-cm.dat:
18 L
19 F
20 P
21 P'
22 T
23 U
24 Q
25 E
Flinn: F versus L
Ramsay: F' versus C' (the Flinn plot on a log-log scale)
Jelinek: P' versus T
* CORE PHOTOS
Figure 2 of Rosenbaum has core photos. We should get them referenced
and into the model.
1st photo: ol92-corephoto-15m.png 15.08m to 15.47m
2nd photo: ol92-corephoto-20m.png 20.43m to 20.84m
3rd photo: ol92-corephoto-78m.png 78.87m to 79.25m
ol92-crackdrawing-78m.png
7.3cm + 30.0cm + 1.4cm = 38.7cm
* FIELD LOGS
http://pubs.usgs.gov/of/of93-683/2-field-log/field_log.html
Each log consists of four columns: the top of the described interval
in meters, the bottom of the described interval in meters, the
thickness of the interval in meters, and a textual description of the
sediment.
wget http://pubs.usgs.gov/of/of93-683/2-field-log/table2-1.txt
wget http://pubs.usgs.gov/of/of93-683/2-field-log/table2-2.txt
wget http://pubs.usgs.gov/of/of93-683/2-field-log/table2-3.txt
All listed depths are converted to "drill-pad datum depths," which
are measured from 0.94 m above the lake surface.
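As a quick illustration (a hypothetical helper, not one of the
scripts above), pulling out the intervals whose description mentions
a term is straightforward once the logs are downloaded:
#!/usr/bin/env python
# Hypothetical helper (not an existing script): print field-log
# intervals whose sediment description mentions a search term.
# Usage: fieldlog-grep.py table2-1.txt ash
import sys
term = sys.argv[2].lower()
for line in open(sys.argv[1]):
    f = line.split(None, 3)            # description may contain spaces
    if len(f) < 4: continue
    try:
        top, bot = float(f[0]), float(f[1])
    except ValueError:
        continue                       # skip header or malformed lines
    if term in f[3].lower():
        print '%.2f - %.2f m: %s' % (top, bot, f[3].strip())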
* GRAIN SIZE
http://pubs.usgs.gov/of/of93-683/3-sed-min/1-grain-size/grain-size.html
wget http://pubs.usgs.gov/of/of93-683/3-sed-min/1-grain-size/tbl3-1-1.txt
output from SDSZ program
wget http://pubs.usgs.gov/of/of93-683/3-sed-min/1-grain-size/tbl3-1-3.txt
Clay content
wget http://pubs.usgs.gov/of/of93-683/3-sed-min/2-clay/tbl3-2-2.txt
AMS Radiocarbon:
http://pubs.usgs.gov/of/of93-683/5-chronology/3-radio/tbl5-3-1.txt
http://pubs.usgs.gov/of/of93-683/5-chronology/3-radio/tbl5-3-2.txt
Depth 36m, approximate age 50 kyr before present:
material CAMS Lab 14C age±error
total 6923 B 27470±110
total 6924 B 26580±110
total 7562 B 25370±290
humate 6928 B 29550±190
humin 6926 B 31760±180
Depth 55m, approximate age 80 kyr before present:
material CAMS Lab 14C age±error
total 6921 B 35030±150
total 7563 B 35190±990
total 6922 B 36730±130
humate 6927 B 38430±170
humin 6925 B 36730±240
* Open Inventor - coin3d.org version
I've been getting frustrated by the SGI version of Open Inventor for
its lack of image file type support and the lack of text in the
model. I built Coin with the SoQt Mac version and was not able to get
fonts working. Here is what I did on Mac OS X 10.2.8 with Coin:
http://www.trolltech.com/download/index.html
Install Qt/Mac Free
ftp://ftp.trolltech.com/qt/source/qt-mac-free-3.2.2.sit
Double click the file on your desktop, then from a shell:
cd ~/Desktop/QtMac-3.2.2
tar xfz qt-mac-free-3.2.2.tar.gz
mv qt-mac-free-3.2.2 /Developer/qt
Add these to the end of your .bashrc file:
export QTDIR=/Developer/qt
export PATH=$QTDIR/bin:$PATH
export MANPATH=$QTDIR/doc/man:$MANPATH
export DYLD_LIBRARY_PATH=$QTDIR/lib:$DYLD_LIBRARY_PATH
sudo -s
./configure -platform macx-g++ -shared -debug
yes    (answer yes to accept the license)
make install
For the coin3d simage package, we should be able to get all kinds of
file formats supported. Here is the configure line I used:
./configure --with-jpeg=/sw --with-mpeg2enc --with-ungif=/sw
make
make install
Had to edit src/resize.c to change one of the #include lines.
For Coin3D version 2 itself, here is the configure line I used:
./configure --enable-3ds-import --enable-man
That didn't seem to give man pages, though.
SoQt installed fine. I am having trouble getting SoXt to build:
configure --with-motif=/sw
Which fails to build... This error message goes on for many pages:
ld: multiple definitions of symbol SoXtDevice::invokeHandlers(XAnyEvent*)
devices/.libs/libSoXtDevices.a(SoXtDevice.lo) definition of SoXtDevice::invokeHandlers(XAnyEvent*) in section (__TEXT,__text)
devices/.libs/libSoXtDevices.a(SoXtDevice.lo) definition of SoXtDevice::invokeHandlers(XAnyEvent*) in section (__TEXT,__text)
ld: multiple definitions of symbol SoXtDevice::SoXtDevice[in-charge]()
devices/.libs/libSoXtDevices.a(SoXtDevice.lo) definition of SoXtDevice::SoXtDevice[in-charge]() in section (__TEXT,__text)
devices/.libs/libSoXtDevices.a(SoXtDevice.lo) definition of SoXtDevice::SoXtDevice[in-charge]() in section (__TEXT,__text)
The SOLUTION: replace libtool in SoXt-1.1.0 with a link to
/sw/bin/glibtool
Then
cd src/Inventor/Xt
/bin/sh ../../../libtool --mode=link g++ -g -O2 -fno-exceptions -W -Wall -Wno-unused -Wno-multichar -Woverloaded-virtual -L/sw/lib -L/usr/X11R6/lib -Wl,-framework,Inventor -Wl,-framework,ApplicationServices -Wl,-framework,AGL -Wl,-framework,OpenGL -o libSoXt.la -rpath /usr/local/lib -no-undefined -version-info 0:0:0 SoXtInternal.lo SoXt.lo SoXtComponent.lo SoXtGLWidget.lo SoXtResource.lo SoAny.lo SoXtCursor.lo SoXtObject.lo SoXtCommon.lo SoXtComponentCommon.lo SoXtGLWidgetCommon.lo SoXtRenderArea.lo devices/libSoXtDevices.la editors/libSoXtEditors.la engines/libSoGuiEngines.la nodes/libSoGuiNodes.la viewers/libSoXtViewers.la widgets/libSoXtWidgets.la -lXm -lobjc -lXt -lXp -lXi -lXmu -lXext -lXpm -lSM -lICE -lX11 -lglut
The key here was to add glut to the link line; I don't know what
needed it. Then:
cd ../../..
* TODO
geotips.com
http://hobu.biz/software/pyTerra
Microsoft's TerraServer
Can I extract core images using http://imgseek.sourceforge.net/
http://www.python.org/sigs/image-sig/
Global ecosystems databases
http://www.ngdc.noaa.gov/seg/eco/ged_toc.shtml
http://www.rockware.com/catalog/pages/rockworksgallery.html
http://www.cs.arizona.edu/topovista/sdts2dem/
DEM stds http://rockyweb.cr.usgs.gov/nmpstds/demstds.html
DEMs in SDTS format... http://www.atdi-us.com/
* SAT PHOTOS
With the Southern California fires, there has been a big release of
satellite photos, or at least I'm just discovering them. A good
high-resolution one is:
http://earthobservatory.nasa.gov/Newsroom/NewImages/Images/California.A2003299_lrg.jpg
I think it covers the Owens Lake area, but I'm not 100% sure.
MODIS is the Moderate Resolution Imaging Spectroradiometer on the
Terra satellite. Hmmm... I don't know if the dates are correct...
http://visibleearth.nasa.gov/Sensors/Terra/MODIS.html
Here is a MODIS image of all of California taken 29-Sep-2003:
http://visibleearth.nasa.gov/cgi-bin/viewrecord?25927
http://visibleearth.nasa.gov/data/ev259/ev25927_California.A2003272.1855.250m.jpg
27-Sep-2003 (day A2003270):
http://visibleearth.nasa.gov/cgi-bin/viewrecord?25926
http://visibleearth.nasa.gov/data/ev259/ev25926_California.A2003270.1910.250m.jpg
http://visibleearth.nasa.gov/cgi-bin/viewrecord?25599
http://visibleearth.nasa.gov/data/ev255/ev25599_Baja.A2003187.1840.250m.jpg
These images are free for non-commercial use!