Digitize an Analog Map With GeMS, Part B3


Detailed Description

Digitize an Analog Map With GeMS, Part B3 - How to complete nonspatial tables, validate a GeMS database, and symbolize features.

The Geologic Map Schema (GeMS) defines a standard database schema — a database design — for digital publication of geologic maps. This tutorial is one of six originally presented as part of a short course at the 2021 Northeastern Section Meeting of the Geological Society of America by Ralph Haugerud, USGS, on how to use ArcMap and custom tools to create GeMS-compliant ArcGIS file geodatabases.

GeMS Trainings

  1. Getting Started With GeMS, Part A
  2. Digitizing an Analog Map With GeMS, Part B1
  3. Digitizing an Analog Map With GeMS, Part B2
  4. Digitizing an Analog Map With GeMS, Part B3
  5. Translating a Digital Map With GeMS, Part C1
  6. Translating a Digital Map With GeMS, Part C2

Details

Date Taken:

Length: 00:22:04

Location Taken: US

Video Credits

Video editing, Evan Thoms, USGS, Geologist, ethoms@usgs.gov

Caption editing, Megan James, South Carolina DNR Geological Survey, jamesm@dnr.sc.gov
 

Transcript

We've built an empty database, we've digitized contacts and made polygons, we've digitized strikes and dips, we've digitized key beds, and we've checked topology. The next step is to complete the nonspatial tables in the map. Let's go look at our recipe here. GeMS prescribes three nonspatial tables that are essentially internal metadata: a DataSources table that describes where the data come from, a DescriptionOfMapUnits (DMU) table that describes the map units, and a Glossary that defines values (type values, confidence values) used at many points in the database. We're not going to do the DMU; that comes in the next exercise. But let's fill out the DataSources and Glossary tables for this map.

The first thing to do is add the tables to the map. Then let's open one of them. The table is partly filled, because the GeMS schema uses some terms everywhere that need definition, and those are built into the empty database. However, we know that we have some other sources for the data in this map, in particular John Whetten's Open-File Report. So we need to edit this table and stick in the source. Typing into these fields is a pain, very difficult to do accurately and well, so what I've done is type up some data sources externally. I can copy that text, go into this field, paste, and there it is.

For data sources, especially when we digitize, it's nice to include some notes about who did the digitizing, when, and what, if anything, they added to the data set. In particular, I've attributed a number of confidence values that have very little definition in the source data; they're my judgment, now recorded. So let's copy that and put it in the Notes field. There's an optional, as-needed field for the URL; this report is available on the web, so let's put that in here. We also need something to call this entry, and one thing we could call it is data source number one. It's the primary data source. That works. Any number of tokens are possible: GeMS is set up to use very short text tokens or very long ones that are globally unique identifiers. The only important thing is that within any given database these IDs must be unique.

We're also going to stick in a few other sources, and I'm just going to enter them here for now, because I can. We're going to use the Glossary of Geology as a source for some definitions. We're not going to use it as is; I chose to modify some of the definitions, so this entry is "modified from Glossary of Geology." I think I'm going to stop right there. We can close this and save our edits.
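Typing long citations into attribute-table cells is error prone, so rows like these can also be added with a short script. Below is a minimal sketch using arcpy, assuming the standard GeMS DataSources fields; the geodatabase path, token, citation text, and URL are hypothetical placeholders.

    import arcpy

    gdb = r"C:\work\Chiwaukum4SE.gdb"  # hypothetical path to the GeMS geodatabase
    fields = ["DataSources_ID", "Source", "Notes", "URL"]
    row = (
        "DAS1",                                  # token; must be unique within this database
        "Whetten, J.T., USGS Open-File Report",  # placeholder citation text
        "Digitized in 2021; confidence values are the digitizer's judgment.",
        "https://example.org/placeholder",       # hypothetical URL
    )
    # Insert the row without hand-typing into the attribute table.
    with arcpy.da.InsertCursor(gdb + r"\DataSources", fields) as cursor:
        cursor.insertRow(row)

The same InsertCursor pattern works for the Glossary table, swapping in its Term, Definition, and DefinitionSourceID fields.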
Now let's move on to the Glossary table. Again, it comes prepopulated with a couple of terms, and we want to add some more. We know we have the type "contact" in here, and it needs a definition. We know we have the type "bedding." My first inclination is to go to the Glossary of Geology for definitions, so let's do that. Here's a picture I took on my cellphone yesterday of my trusty Glossary of Geology, open to "contact." Definition (a) is the relevant one, but it's unsatisfactory, because not a single contact on this map, at least the part we digitized, meets that definition: all of the contacts are between rock and deposits, or between deposits. None of them are rock to rock, so we need a different definition. I'm really lazy, so rather than write one from scratch, I went to the last map I worked on and took the definition out of it. There we defined a contact as the surface that separates two map units and is not a fault, and the source for that was USGS SIM 3443, which we're going to put in as a shorthand. We now clearly have a need for another data source entry.

Then we can go to the other term, bedding. I like the Glossary of Geology definition fairly well, but not quite, so I modified it. I also prefer to modify these definitions because I'm concerned about copyright issues if we wholesale copy and paste out of the Glossary of Geology. So there's bedding, and its definition source is "GOG, 5th edition, modified." We now need new DataSources entries: GOG5M and SIM 3443. GOG5M is here already. SIM 3443 gets a short title rather than a long reference; it also has a URL, and we need the token that identifies it.

Now I think we've got this fairly well under control. We've filled out our nonspatial tables and we've got our spatial stuff done. Oh, we left some fields unpopulated. Let's go through those. Open OrientationPoints: there's a bunch of points in here, and none of them have confidence values or data sources. So let's step through the fields. All of these points come off this map, and we decided, I think, that this map is not that well located on its own. If it were a clean, scale-stable 1:24,000 map, then 24 meters would be the right answer for LocationConfidenceMeters; it's not, so let's use 30. For IdentityConfidence, I trusted John Whetten: everything he said was bedding was bedding, no question, or he would have told us so. So that's "certain." My experience, having used a compass for many years and having taught field camp where students test themselves to see how well they can measure orientations, is that a well-wielded compass can routinely achieve two or three degrees of reproducibility; if we go with five, we're certainly including the answer here. PlotAtScale is the denominator of the scale smaller than which you shouldn't plot this feature, because the symbols get too crowded. This map is not crowded, so I'm going to leave this; we'll run a script later to calculate it. StationsID: these points don't have stations. MapUnit we could actually populate right now, since all of these points are inside unit Tcs. More typically they fall inside many units, and we would intersect the OrientationPoints feature class with the MapUnitPolys feature class and fill out the field that way. LocationSourceID is the map we just digitized, and OrientationSourceID is the same. No notes on these. OrientationPoints_ID we're going to calculate with a script in a few minutes, and PTTYPE we're not going to use; it was a convenience for translating data from ALACARTE. So we can say that's done.

We can go to MapUnitLines and do some of the same things: data sources populated, ExistenceConfidence good. Now ContactsAndFaults: let's open its attribute table. Everything's got a Type, its IsConcealed value, an ExistenceConfidence value, IdentityConfidence, and LocationConfidenceMeters. We are good except for the DataSourceID. And I'm going to stop here.
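Attribute fill like this can also be scripted rather than typed row by row. Here is a minimal sketch, assuming the GeMS field names above; the path and the constant values mirror the narration but should be treated as placeholders.

    import arcpy

    fc = r"C:\work\Chiwaukum4SE.gdb\GeologicMap\OrientationPoints"  # hypothetical path
    fields = ["LocationConfidenceMeters", "IdentityConfidence",
              "OrientationConfidenceDegrees", "LocationSourceID",
              "OrientationSourceID", "MapUnit"]
    with arcpy.da.UpdateCursor(fc, fields) as cursor:
        for row in cursor:
            # Values chosen in the narration: 30 m location confidence,
            # 'certain' identity, 5 degrees, the DAS1 token, and unit Tcs.
            cursor.updateRow([30, "certain", 5, "DAS1", "DAS1", "Tcs"])

Where points fall in more than one unit, intersecting OrientationPoints with MapUnitPolys, as described above, is the better way to pick up MapUnit.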
These three lines, the neatline, we did not get out of John Whetten's map, so their source is going to have to be something else. The rest of these all came from the Chiwaukum 4 Southeast quad, so that's that. And we have gone through... oh, MapUnitPolys. That's that. IdentityConfidence: nothing was queried, so these are all "certain." DataSourceID is the same token again. And again, PTYPE is a convenience for translating from ALACARTE, and we're not going to use it here.

At this point, the next step is to calculate the ID values for all these features. Open the GeMS toolbox and go to Set ID Values. Let's make sure the settings are right: we don't want to reset our DataSources IDs, because we like what they look like. And that's done. We can see what happened by opening up, say, one of these tables. This field is populated now, and it will be in all the tables in here.

Now, probably we didn't get everything, and typically we run the Validate Database script repeatedly to find out what needs polishing, what little holes need filling, what needs fixing. It's not something you do just at the end of the process, but often throughout building the database. We're going to point it, again, at our Chiwaukum 4 Southeast database. Let's leave the topology checks on; there's an option to skip them because they can take a long time, and if you're just trying to figure out which tables need filling, that's a nuisance. We're not going to delete any unused rows, although we could, and we don't need to refresh GeoMaterialDict; that's an issue if you're working with an older database being brought up to the current version of GeMS.

This script does a lot of work. It's about a thousand lines of code, and some things it checks twice, at both Level 2 and Level 3. It reports levels of compliance with the GeMS standard. Level 1 is basically that you're in a GIS, with no expectations about field names or feature class names. Level 2 is basically that you've got your MapUnitPolys and your ContactsAndFaults right. Level 3 is that you've crossed all your t's and dotted all your i's. At this point we're not concerned about compliance levels; we're concerned about the error messages and what else needs filling out in the database.

Okay, close this. Go to the directory the outputs are written to, which is the directory our input database is in, and open the validation report. It says we have some problems, which is not surprising: one term missing in Glossary, one term missing in DataSources. We didn't stick in data source two, and we need "neatline." In the interest of saving time, I won't ask you to watch me fix all the errors identified by Validate Database. Know that typically I step through the validation report, fix each of the errors I can see, rerun the validation script, find more errors, and fix those until the report comes back clean.

At that point I'm ready to work on symbolization for the map: pick unit colors and record them in the DMU. I prefer to use the FGDC color set, which is available online. Each of these colors has an identifier, in this case a text string, which can be keyed into the DMU. But you can use other symbol values, too. If you use the FGDC colors, you can use the Symbol to RGB script to get area-fill RGB values; otherwise you may have to go find the eyedropper in Illustrator or something like that, or use the ArcGIS Style Manager, to read the RGB values and key those in.
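The missing-entry errors above are token bookkeeping: every source ID used in a feature class must have a matching row in DataSources. Here is a rough sketch of that one check, with a hypothetical geodatabase path; the real Validate Database script does this and a great deal more.

    import arcpy

    gdb = r"C:\work\Chiwaukum4SE.gdb"  # hypothetical path
    # Tokens defined in the DataSources table.
    defined = {row[0] for row in arcpy.da.SearchCursor(
        gdb + r"\DataSources", ["DataSources_ID"])}
    # Where source tokens are used in the feature classes.
    checks = [(r"\GeologicMap\ContactsAndFaults", "DataSourceID"),
              (r"\GeologicMap\MapUnitPolys", "DataSourceID"),
              (r"\GeologicMap\OrientationPoints", "LocationSourceID"),
              (r"\GeologicMap\OrientationPoints", "OrientationSourceID")]
    used = set()
    for fc, field in checks:
        with arcpy.da.SearchCursor(gdb + fc, [field]) as cursor:
            used.update(row[0] for row in cursor if row[0])
    print("Tokens used but not defined in DataSources:", sorted(used - defined))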
So you can use other symbol style sets, and if you do, you want to provide the style file with the geodatabase when you give it to others, and you want to document what style you used in your report-level metadata.

You can then run the Set Symbol Values script, which works with the FGDC symbology to set line symbols in accord with their attributes, their confidence values, and the map scale. So if you change the map scale, you simply rerun the Set Symbol Values script at the different map scale, and it will change dashed lines to continuous ones if the scale is smaller. It also sets orientation symbols, and for some of those it chooses different symbols depending on the OrientationConfidenceDegrees value. The mapping of Type, IsConcealed, and the confidence values to symbol values is described in the file Type-FGDCSymbol.txt, in the Resources directory of the GeMS toolbox, and this is that file: a list of Type values and corresponding FGDC symbol identifiers. The FGDC symbols are arranged such that, from the identifier for the certain-and-accurate symbol, you can calculate the queried-accurate, certain-approximate, and queried-approximate values. This file can be edited to extend the list of types and symbol numbers; it is not a complete set, just the ones I've had occasion to use.

When you run the Set Symbol Values script, it has an option to set Label values in MapUnitPolys by conflating the Label field in the DMU table with the IdentityConfidence field in the MapUnitPolys feature class. The script runs on a feature dataset and modifies ContactsAndFaults, OrientationPoints, GeologicLines, and MapUnitPolys; you can also run it on a cross-section feature dataset or a correlation-of-map-units diagram feature dataset.

When you've done this and want to make the transition to presentation-quality graphics, you want to move to a new plot file, so save as to a different MXD. You want to change how you symbolize the ContactsAndFaults layer: delete the four-fold symbolization we used for digitizing and add ContactsAndFaults as a single layer. Step through each of the layers and set the symbology, mostly using Categories > Match to symbols in a style; click the Symbol field and match it to the FGDC style. If we want to label the MapUnitPolys feature class, we add a new MapUnitPolys layer, set the symbology to no outline and no fill color (so we won't see anything there), perhaps add a definition query to exclude units we don't want to label, you know, areas that are too small, and then label features with the Label field. If we have map units that are to be patterned, we can again clone the MapUnitPolys layer, join it to the DMU via MapUnit, and symbolize via the AreaFillPatternDescription field. That may require some hunting to find area fills appropriate to the description.

We can get nicely placed dip numbers by first running Set PlotAtScale Values, which goes through the structure symbols, looks for ones that are too close together, picks one, and says don't plot the other. It does this by setting the PlotAtScale number, which is the denominator of the map scale at and larger than which it's okay to plot that symbol. So if you put in 500,000, that symbol plots at any scale larger than 1:500,000; if you put in 10,000, you've got to zoom in to see it.
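That PlotAtScale test reduces to a one-line comparison of scale denominators, remembering that a larger scale means a smaller denominator. A small illustrative sketch:

    def plot_symbol(plot_at_scale, map_scale_denominator):
        """True if the symbol should draw at the current map scale."""
        # Plot when the map is at this scale or larger, i.e. the current
        # denominator is no bigger than the stored PlotAtScale value.
        return map_scale_denominator <= plot_at_scale

    print(plot_symbol(500000, 24000))  # True: 1:24,000 is larger than 1:500,000
    print(plot_symbol(10000, 24000))   # False: zoom in past 1:10,000 to see it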
Once you've done that, run Inclination Numbers, which calculates where the dip or plunge numbers should be placed, makes a point, and writes the dip number on top of it. I think that's it. I have done that for this snip of the Chiwaukum 4 Southeast quadrangle. I've made the polygon fill 50 percent transparent on top of the lidar, and it now looks something like a geologic map.
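For reference, that final transparency setting can also be scripted. A minimal ArcMap-era sketch with arcpy.mapping, assuming the polygon layer is named "MapUnitPolys" (the layer name is a hypothetical placeholder):

    import arcpy

    mxd = arcpy.mapping.MapDocument("CURRENT")
    # Find the map-unit polygons layer by name (hypothetical layer name).
    layer = arcpy.mapping.ListLayers(mxd, "MapUnitPolys")[0]
    if layer.supports("TRANSPARENCY"):
        layer.transparency = 50  # 50 percent transparent over the lidar hillshade
    arcpy.RefreshActiveView()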