Ask A Cartographer

Are there any scripts or tools that I can use to generalize linear features while maintaining integrity?

March 26, 2008
Categories: ArcGIS Methods, Data Modeling

Can the integrity of the man-made features of the National Hydro Network be maintained when they are algorithmically generalized? What tools are available in ArcMap to test this, or are there scripts that I can use?

My data is at a scale of 1:20,000. I wish to generalize it so that it looks like data at these scales:
1:50,000
1:100,000
1:250,000
1:500,000
1:1,000,000
1:2,000,000

I don't want to create more files, though; perhaps a few .lyr files, but no more data.

Mapping Center Answer:

Given your restrictions, the answer is no. However, there is a great deal more to understand about why.

1. The NHD data fundamentally lacks the attributes and integrity needed for it to be effectively generalized into a product with integrity. However, the USGS provides at least two resolutions: high resolution, which works well from 1:20,000 to 1:80,000, and medium resolution, which can be used from 1:80,000 to 1:1,000,000 and beyond. The appropriate scale ranges really depend on your purpose, which means that such guidance is at best vague and intended only to get you started. In other words, depending on where you need data, you may not have a problem in the first place; you may only need to control which resolution draws at which scales, as in the sketch below.
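Controlling layer scale ranges keeps you inside your own "no more data" constraint, since the ranges live in the map or a .lyr file rather than in new feature classes. Here is a minimal sketch, assuming the arcpy.mapping Python API is available in your release of ArcGIS; the map document path and layer name are hypothetical:

```python
import arcpy

# Hypothetical map document and layer name -- substitute your own.
mxd = arcpy.mapping.MapDocument(r"C:\maps\hydro.mxd")

for lyr in arcpy.mapping.ListLayers(mxd, "NHD High Resolution*"):
    # Draw the high-resolution flowlines only between 1:20,000 and
    # 1:80,000. minScale is the zoomed-out limit (larger denominator);
    # maxScale is the zoomed-in limit.
    lyr.minScale = 80000
    lyr.maxScale = 20000

mxd.save()
```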

2. For scales smaller than 1:100,000, consider using the NHDPlus data. A much richer set of attributes is available for this data. It represents a snapshot of the USGS medium resolution NHD data, but with additional attributes, including mean annual flow, which we are finding in ongoing research to be a very effective basis for selecting line features for display at smaller scales; a sketch of that kind of selection follows.
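As a rough illustration of attribute-driven selection, here is a minimal sketch assuming the arcpy API. The field name MAFLOWU (mean annual flow), the 50 cfs threshold, and the paths are all assumptions; verify them against your own copy of the NHDPlus tables:

```python
import arcpy

# Keep only flowlines whose mean annual flow clears a threshold chosen
# for the target scale. Field name, units (cfs), and cutoff are
# assumptions -- check them in your NHDPlus flow attribute table and
# refine the cutoff visually at the target scale.
arcpy.Select_analysis(
    in_features=r"C:\work\hydro.gdb\NHDFlowline",
    out_feature_class=r"C:\work\hydro.gdb\Flowline_250k",
    where_clause='"MAFLOWU" > 50'
)
```

Note that this does create a new feature class; a definition query on a layer would avoid that, at the cost of re-evaluating the query on every draw.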

3. A recent blog entry on Mapping Center about a Python script tool for selecting polygons for maps at smaller scales covers another angle on the larger problem; the sketch below gives the general flavor of that kind of selection.
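This is not the script from that blog entry, just a sketch of the general idea: derive the smallest polygon worth drawing from a minimum mark size on the page, then select on area. The 1 mm rule of thumb, paths, and field name are assumptions:

```python
import arcpy

# Rule of thumb (an assumption -- adjust to your own specification): a
# polygon should cover at least about 1 mm x 1 mm on the printed page
# to stay legible.
scale = 250000
min_mark_mm = 1.0
min_area_sq_m = (min_mark_mm / 1000.0 * scale) ** 2  # 62,500 m2 at 1:250,000

# Shape_Area is in the units of the data's coordinate system; this
# assumes meters.
arcpy.Select_analysis(
    in_features=r"C:\work\hydro.gdb\NHDWaterbody",
    out_feature_class=r"C:\work\hydro.gdb\Waterbody_250k",
    where_clause='"Shape_Area" > {0}'.format(min_area_sq_m)
)
```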

4. Simplification would be the next issue. Here, ArcGIS has tools that are fairly useful and work well for most cases, though not ideal for all. The Simplify Line and Simplify Polygon tools can be used to take care of most of this task, as sketched below.
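A minimal sketch of both tools, with hypothetical paths and tolerances; BEND_SIMPLIFY generally preserves the character of natural lines better than POINT_REMOVE, at some cost in speed:

```python
import arcpy

# Simplify the selected flowlines. The 50-meter tolerance is a guess
# for 1:250,000 -- test tolerances visually at the target scale.
arcpy.cartography.SimplifyLine(
    in_features=r"C:\work\hydro.gdb\Flowline_250k",
    out_feature_class=r"C:\work\hydro.gdb\Flowline_250k_simplified",
    algorithm="BEND_SIMPLIFY",
    tolerance="50 Meters"
)

# Simplify the selected water bodies the same way.
arcpy.cartography.SimplifyPolygon(
    in_features=r"C:\work\hydro.gdb\Waterbody_250k",
    out_feature_class=r"C:\work\hydro.gdb\Waterbody_250k_simplified",
    algorithm="BEND_SIMPLIFY",
    tolerance="50 Meters"
)
```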

5. What is not solved in any fashion is the need to do the "right thing" with features that are too close to one another. Solutions range from aggregating polygons that are near one another (which can be addressed at data capture) to displacing features that are too near other features (streams separated from containment ponds by narrow levees, or other features too near streams or water bodies). I put quotes around "right thing" above because there is no hard guidance about what is cartographically acceptable--that falls into the category of future research. In fact, we don't even know what our own minds are doing when we draw, with a pencil, a simpler version of a polygon; try doing that and then put into words everything you did. I've found that little exercise leads to much writing. The aggregation half, at least, can be automated today, as sketched below.
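A minimal sketch of the aggregation side using the Aggregate Polygons tool; the distances, areas, and paths are assumptions to tune against your own specification, and displacement has no comparable single tool:

```python
import arcpy

# Merge water polygons that fall within 100 meters of one another, and
# drop pieces and holes below the minimum mapped size.
arcpy.cartography.AggregatePolygons(
    in_features=r"C:\work\hydro.gdb\Waterbody_250k_simplified",
    out_feature_class=r"C:\work\hydro.gdb\Waterbody_250k_aggregated",
    aggregation_distance="100 Meters",
    minimum_area="62500 SquareMeters",
    minimum_hole_size="62500 SquareMeters"
)
```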

6. One of the major issues with simplification, displacement, and aggregation is dealing with multipart polygons. The islands in large rivers and lakes are not analyzed. The simple solution is to convert them to single-part features; I typically create a feature class called Islands. That way I can simplify and analyze islands with the same level of logic that I apply to other hydro polygons. A sketch of the conversion follows.
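Here is a minimal sketch of that conversion, assuming the islands exist as separate parts of multipart polygons (islands stored as holes in a water polygon need a different extraction, which this does not cover); paths are hypothetical:

```python
import arcpy

# Explode multipart hydro polygons so each part becomes its own
# single-part feature. The island parts can then be selected out into
# their own feature class (called Islands here, following the text
# above) and run through the same select/simplify/aggregate logic as
# the other hydro polygons.
arcpy.management.MultipartToSinglepart(
    in_features=r"C:\work\hydro.gdb\Waterbody_250k",
    out_feature_class=r"C:\work\hydro.gdb\Waterbody_singlepart"
)
```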

I am not picking on you, but I am curious why you stated, "I don't want to create more files ... no more data." We hear that from a lot of people--and it makes no sense. At this point, there is no "Easy Button" for this task.

As to why I say "it makes no sense": creating additional data is easy, and it allows, at this point, more of the task to be addressed in an automated fashion. Analysis and mapping in GIS are entirely about transforming data for a given purpose, and creating new data, either along the way or as the final result, is often exactly what is supposed to happen. The degree to which we are successful depends on fully understanding the problem and the solution in the first place, and then finding the fastest, most efficient path to that solution. As a requirement, "no more data" has nothing to do with the nature of the problem or its solution; instead, it is something that applies to the evolution of the solution--i.e., it is an optimization exercise. So, as I said, since we don't even fully understand this problem or how to solve it, it could be quite some time before "no more data" is part of the solution.

