
GPS Postprocessing Procedures

20 Posts
9 Users
(@jimmy-cleveland)
Posts: 2812
Topic starter
 

I just completed the GPS portion of a project and ran into some problems with the post processing (http://beerleg.com/index.php?mode=thread&id=44165).

I have two Topcon Hipers and two ProMark 3 units, and I use them in conjunction with each other on almost every job where I use GPS. For post processing I run the latest versions of both GNSS Solutions and the Topcon Tools demo. I had been downloading the Hipers using an older version of PC-CDU; I just updated PC-CDU, and also installed Topcon Link and TRU.

I generally submit to OPUS as well, just as a blunder check. Well, this week that really saved the day, as it picked up and exposed a bad solution. I generally run my data through both GNSS Solutions and Topcon Tools, and use OPUS as a triple check.

I guess this week's project really brought to light the importance of double checking data in more than one way. On this particular project, not having that secondary check would have put my project into a flood zone, and could have really been a bad, bad deal. It just really shows the need for being thorough in your procedures and calculations. The "fear" of not being right always has me double checking and then checking again.

I am old school, and don't use any VRS or RTK networks. My working range is so wide that I think it is better for me to use the base station/rover setup with the two PM3 units as additional site control.

The first clue to the error was the difference in the value for the "base point" onsite between GNSS Solutions and Topcon Tools, and the OPUS confirmed the Tools solution.
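For what it's worth, that kind of blunder check boils down to a few lines of logic: compare each package's base-point value against the independent OPUS check and flag whichever one disagrees beyond a tolerance. Everything in this sketch (names, elevations, the 0.10-unit tolerance) is illustrative, not taken from any of the packages mentioned.

```python
# Illustrative blunder check: flag any processing solution whose
# base-point elevation disagrees with an independent check value
# (e.g. OPUS) by more than a tolerance. All values are made up.

def flag_outliers(solutions, reference, tol=0.10):
    """Return the names of solutions differing from the reference
    by more than tol (same units as the inputs)."""
    return [name for name, elev in solutions.items()
            if abs(elev - reference) > tol]

solutions = {"GNSS Solutions": 512.83, "Topcon Tools": 512.31}
opus_check = 512.29
bad = flag_outliers(solutions, opus_check)  # the solution OPUS exposed
```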

 
Posted : January 15, 2011 6:58 pm
(@dave-karoly)
Posts: 12001
 

I don't trust GPS enough for flood related work. I've seen it go bad at least twice in the hands of private survey contractors. The first time it was about 4 feet; they used a BM seven miles away, because it was in open sky, instead of one right in the project area, which they topo'd all around but didn't bother to at least shoot with the TS. I have no idea why they did that.

The next time, a private survey contractor used OPUS, which came out about 1 foot the wrong way in a flat flood plain.

I found out about both of those because a smart Civil PE was curious enough about something that didn't look right to show me the map and ask me good questions. My former employer had a habit of hiring business degrees to do PM work, and this is the result.

 
Posted : January 15, 2011 9:39 pm
(@jimmy-cleveland)
Posts: 2812
Topic starter
 

Dave,

I should clarify one thing: I actually pulled the flood map after I posted this, and the property is entirely in Zone X. However, the elevation of the property would have been lower than the flood elevation shown on the final plat.

I agree with you on not just blindly trusting GPS, hence the double checking of data.

Thanks

 
Posted : January 15, 2011 11:20 pm
(@moe-shetty)
Posts: 1426
Registered
 

> I don't trust GPS enough for flood related work. I've seen it go bad at least twice in the hands of private survey contractors. The first time it was about 4 feet; they used a BM seven miles away, because it was in open sky, instead of one right in the project area, which they topo'd all around but didn't bother to at least shoot with the TS. I have no idea why they did that.
>
> The next time, a private survey contractor used OPUS, which came out about 1 foot the wrong way in a flat flood plain.
>
> I found out about both of those because a smart Civil PE was curious enough about something that didn't look right to show me the map and ask me good questions. My former employer had a habit of hiring business degrees to do PM work, and this is the result.

But Dave, one of the above examples is just bad judgment, not GPS causing a problem. The other may have been avoided as well.

NGS claims OPUS gives 2 cm vertical (2 sigma) results when following some reasonably easy guidelines with 2.5 hours of static data, and 1 cm vertical with 12 hours of static data.
I'll look up a quote and edit this post soon...
From the NGS OPUS website: http://www.ngs.noaa.gov/OPUS/about.html

" 1. OBSERVE LONGER: A longer-duration session provides PAGES a better opportunity to accurately fix ambiguities and mitigate multipath error. See the graph at right for the correlation between session duration and accuracy.

 
Posted : January 16, 2011 4:53 am
(@merlin)
Posts: 416
Registered
 

You can't be careful enough using OPUS GPS for elevations. Where it is relevant I will tie into an NGS BM and compare it to my OPUS value; in effect I do a GPS level loop so that I have a closure. If the OPUS value is within the usual 2 tenths range (I forget which way, but it is always the same), I consider it a valid check of the BM. In all cases I will ask myself whether the check is appropriate for the project; if not, I recommend to the client that we run levels in from the BM and/or check into numerous NGS monuments with the GPS.
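That GPS level loop amounts to a simple misclosure test. A minimal sketch, assuming the "usual 2 tenths" means a 0.2 ft tolerance (function and variable names are illustrative):

```python
# Sketch of the "GPS level loop" check: compare the published benchmark
# elevation to the GPS/OPUS-derived value and test the misclosure
# against a tolerance. The 0.2 ft default mirrors the "usual 2 tenths"
# mentioned above; names and values are illustrative.

def loop_closure(published_ft, gps_ft, tol_ft=0.2):
    """Return (misclosure, within_tolerance) for a GPS tie to a BM."""
    misclosure = gps_ft - published_ft
    return misclosure, abs(misclosure) <= tol_ft

mis, ok = loop_closure(1023.45, 1023.62)  # 0.17 ft, inside tolerance
```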

 
Posted : January 16, 2011 5:37 am
(@loyal)
Posts: 3735
Registered
 

Bear in mind that the graph attached above by Eddie is expressed as 1 sigma (rms) values (68%).

A more in-depth analysis and procedural outline for both 2 centimeter and 5 centimeter heights (ellipsoid and orthometric) can be found in NGS-58 and NGS-59 (see below).

http://www.ngs.noaa.gov/PUBS_LIB/NGS-58.pdf

http://www.ngs.noaa.gov/PUBS_LIB/NGS592008069FINAL2.pdf

Regardless of the techniques and post-processing software used, you must be VERY careful about the ellipsoid heights used as constraints in your adjustment, which means that when using the NGS CORS you NEED to watch the 60 Day Time Series very closely.

There is also a certain amount of uncertainty in even the latest Geoid Model (GEOID-09), which in some extreme cases can be around a decimeter, and easily in the several centimeter range in many mountainous areas.

I have done a LOT of “repeat” [long duration] OPUS solutions (always using the 60 Day “corrections”) and 1-2 centimeter (ellipsoid height) “REPEATS” are certainly “do-able.” That does NOT however mean that the NAVD88 Orthometric Height that was generated is within 2 centimeters in an absolute, or even local sense (maybe, maybe not).
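The "repeat" check described here can be expressed as a peak-to-peak spread over repeated long-duration ellipsoid heights on the same mark. A minimal sketch with made-up values:

```python
# Sketch of the "repeat" check: peak-to-peak spread of repeated
# long-duration ellipsoid-height determinations on the same mark.
# Heights (metres) are made up for illustration.

def peak_to_peak(heights):
    """Peak-to-peak spread of repeated height determinations."""
    return max(heights) - min(heights)

repeats = [1823.412, 1823.425, 1823.418, 1823.409]
spread = peak_to_peak(repeats)  # within the 1-2 cm repeatability range
```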

Just my 2 bits,
Loyal

 
Posted : January 16, 2011 6:40 am
(@dave-karoly)
Posts: 12001
 

I think flood elevations are more likely to be based on local benchmark control than some wider realization of NAVD88 (e.g. CORS). What really matters is the depth of flow from the river bed or in the case of a natural lake with a control outlet, the height of the outlet plus head. Those things are usually determined using the local benchmarks.

Therefore if you are using GPS to measure site elevations from existing local benchmarks then that is better than relying on CORS many miles away. A tie with levels or total station (using correct procedures) is better.
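The benchmark-based approach boils down to differential heighting: carry the benchmark's published elevation forward using the GPS ellipsoid-height difference, minus the geoid-height difference between the two points. A sketch with illustrative numbers (metres), not tied to any particular project:

```python
# Sketch of a differential GPS height transfer from a local benchmark:
# H_site = H_bm + (h_site - h_bm) - (N_site - N_bm), where H is
# orthometric height, h is ellipsoid height, and N is geoid height.
# All numbers below (metres) are made up for illustration.

def transfer_elevation(H_bm, h_bm, h_site, N_bm, N_site):
    """Orthometric height at the site, carried from a benchmark."""
    return H_bm + (h_site - h_bm) - (N_site - N_bm)

# Benchmark elevation 100.00 m; GPS shows the site 5.00 m higher in
# ellipsoid height; the geoid rises 0.02 m between the points.
H_site = transfer_elevation(100.00, -28.50, -23.50, -30.00, -29.98)
```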

Unfortunately the middle of the Central Valley is subsiding fairly rapidly, so that could be a problem. Of course if the river bed goes down then theoretically the flood goes down, at least until it becomes tidal. The Sacramento River is tidal to roughly downtown Sacramento.

 
Posted : January 16, 2011 6:46 am
(@mightymoe)
Posts: 9920
Registered
 

I'm confused about posts that use OPUS solutions to determine or even "check" elevations for flood determination. Maybe it's the area I'm used to working in; OPUS can't match the flood map system here very well. The flood maps were generated from bench marks on the 29 datum. Even the new maps (still waiting on them to be approved), which are tied to the local first order bench marks using published 88 datum values, don't really check to OPUS very well.

Maybe it's that the CORS sites here are so far away; but even so, doesn't everyone tie to the bench mark system that the maps were generated from?

 
Posted : January 16, 2011 7:10 am
(@dave-karoly)
Posts: 12001
 

I find that OPUS varies. I've seen it come very close to the local benchmarks but it is hit and miss. Mountain areas have more gravity issues and less control so tend to have bigger variations.

 
Posted : January 16, 2011 7:34 am
(@loyal)
Posts: 3735
Registered
 

Dave and Moe make great points, and I would have to agree with them 100%.

I don't do Flood Certificates, but if I did, I would REALLY hesitate to use OPUS (or any “long range” GPS technique) for anything but a rough check.

The USGS has a couple of very good “primers” on the subject of Land Subsidence that should be of interest to all surveyors:

3.98 MB PDF
http://water.usgs.gov/ogw/pubs/fs00165/SubsidenceFS.v7.PDF

And:

http://pubs.usgs.gov/circ/circ1182/

Which provides links to the .pdf files covering Circular 1182.

This is a problem that is NOT limited to the Gulf Coast or the Central Valley of California (although those are some of the worst areas). There is also the post-glacial rebound (isostatic adjustment) situation in the upper Mid-West that can be of some concern as well.

Many of the Bench Marks in the Great Basin haven't been “observed” since the 1920s or 30s (or even earlier), yet they are the ONLY Bench Marks for many tens of miles in any direction.

Here in the West, geoidal gradients can (and do) change FAST, and due to the very sparse [observed] gravity coverage out here, the overall gravity field is not particularly well defined. When GRAV-D is completed, and a new geocentric North American Datum (horizontal and vertical) is released, this situation should improve immensely, but that is some years down the road.

In the mean time, surveyors should be very careful about GPS derived orthometric heights, and cognizant of what the dynamics (horizontal and vertical) are in their area of interest.

Check, check, and double check.
Loyal

 
Posted : January 16, 2011 7:41 am
(@ddsm)
Posts: 2229
 

From the Elevation Certificate instructions:
Item C2. A field survey is required for Items C2.a-h. Most control networks will assign a unique identifier for each benchmark. For example, the National Geodetic Survey uses the Permanent Identifier (PID). For the benchmark utilized, provide the PID or other unique identifier assigned by the maintainer of the benchmark. For GPS survey, indicate the benchmark used for the base station, the Continuously Operating Reference Stations (CORS) sites used for an On-line Positioning User Service (OPUS) solution (also attach the OPUS report), or the name of the Real Time Network used.

😉

 
Posted : January 16, 2011 9:43 am
(@jimmy-cleveland)
Posts: 2812
Topic starter
 

Thanks for all the replies.

This has generated some very good discussion. Please note that I only use OPUS as an additional check. I have only used OPUS for an elevation on a FEMA elevation certificate once, and that was for a Zone A, with no base flood elevations determined.

I used GPS on this particular project to transfer in elevations from two city benchmarks. The OPUS elevations in our area generally check to around 0.10' or so.

Keep the discussion going. This is a great topic in my opinion.

Have a great Sunday afternoon, I'm off to a family get together, and will check in later.

 
Posted : January 16, 2011 11:30 am
(@joe-the-surveyor)
Posts: 1948
Registered
 

I use GPS all the time to establish elevations for flood certificates.
I belong to a network, so RTK allows me to do this rather easily.

Now, I have checked on many monuments in my area with the GPS and everything has checked out very well.

I also check it against the town topography as a check just to make sure there isn't a bust in my results.

Now, I don't push the GPS to the limit as regards field conditions. I have found multipath to be the devil when it comes to GPS.

With the ever-decreasing number of passive monuments (NGS said bye-bye to them), I feel we are going to have to rely more and more on GPS-derived elevations.

 
Posted : January 16, 2011 12:07 pm
(@loyal)
Posts: 3735
Registered
 

I dunno Joe....

Although the days of campaign style surveys (horizontal OR vertical) are over as far as NGS crews go, the Passive Network hasn't really been "abandoned" per se.

Responsibility for the Horizontal Marks HAS pretty much been handed off to the local folks, but the Height Modernization projects currently underway (and planned) are (at least in part) done with monies (and participation) from the NGS.

The Passive Network still plays a pretty big role for many local surveyors, and I don't see that entirely going away anytime soon. I use the Passive Network quite a bit, I just don't necessarily use the published coordinates as constraints (except Elevations in many cases).

The Active Network (CORS) is obviously the backbone of the modern NSRS, and I don't have any problem with that at all. As OPUS (all flavors) evolves, it is going to be up to the local Surveyors and agencies to keep the Passive Network aligned to the NSRS via OPUS DB/Projects AS NEEDED.

There are many OLD Passive Network stations that nobody has visited in decades, and until such time as a modern coordinate estimate is needed thereon, what's the problem with the OLD one? I think that the local chapters of the several Land Surveyor Organizations should ENCOURAGE updating of these stations via OPUS_DB/Projects to the extent practical.

Again...just my 2-bits
Loyal

 
Posted : January 16, 2011 1:06 pm
(@joe-the-surveyor)
Posts: 1948
Registered
 

Loyal,

I see your point.
A lot of local benchmarks have been lost over the years, and with local municipalities strapped for cash, I don't see those coming back.

 
Posted : January 16, 2011 7:39 pm
(@loyal)
Posts: 3735
Registered
 

Joe

Agreed!

Most of my work is out in the boonies, where County Road Maintenance is the biggest concern. The more remote (and impassable) the area is, the more likely that everything is in place (duh). What little work I have done around “civilization” is a whole nuther story. It seems like Bench Marks (no matter how well identified) are the first things to go.

Bummer...

 
Posted : January 16, 2011 7:47 pm
(@dmyhill)
Posts: 3082
Registered
 

Ok, I am going to show my ignorance, mainly because I don't know how else to cure it...

Loyal,
What do you mean by "60 day corrections".

Are you referring to the corrected orbits?

-David

 
Posted : February 2, 2011 10:20 am
(@moe-shetty)
Posts: 1426
Registered
 

> Ok, I am going to show my ignorance, mainly because I don't know how else to cure it...
>
> Loyal,
> What do you mean by "60 day corrections".
>
> Are you referring to the corrected orbits?
>
> -David

I think by "sixty day correction" he means the sixty day time series. Every CORS station is analyzed very often, and the 60 day time series will indicate oscillations of the station coordinate due to various conditions:
http://www.ngs.noaa.gov/cgi-cors/corsage_2.prl
http://www.ngs.noaa.gov/CORS/Presentations/GLHM_Forum2009/Opusgl.pdf


1lsu has a better graphic:
ftp://www.ngs.noaa.gov/cors/Plots/1lsu.gif

 
Posted : February 2, 2011 10:49 am
(@dmyhill)
Posts: 3082
Registered
 

I see, he is not using an OPUS solution, but an independent static solution using his own software. (?)

 
Posted : February 2, 2011 11:02 am
(@loyal)
Posts: 3735
Registered
 

dmyhill

NO (not necessarily)...

GODE (above) is NOT representative of the 60 Day Time Series. GODE (along with ALGO, MDO1, and DRAO) is "HELD" fixed in the NGS daily solutions from which the CORS 60 Day Time Series is derived, so those stations' 60 Day plots are "FLAT."

GTRG is an extreme example of what I was talking about:

ftp://www.ngs.noaa.gov/cors/Plots/gtrg.gif

Dr. Sella will probably strangle me for posting that nasty sucker, but it is what it is for the time being (until the Multi-Year Solution is released). GTRG is NOT representative of the CORS Network as a whole, but there ARE plenty of stinkers out there right now.

If you use GTRG in an OPUS solution, OPUS will use the “predicted” coordinate estimate (the one indicated by the straight red line) and NOT the “best available” estimate (indicated by the wavy blue line).

There are several ways (some easier than others) to use OPUS and make the 60 Day Time Series corrections. This will in MANY cases improve your peak-peak variances, AND your coordinate estimates at your remote station (sometimes a LOT, like in the case of GTRG).
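One way to picture that correction: if OPUS held the constraining CORS at its "predicted" coordinate while the time series shows a different "best available" position, shift the remote-station estimate by the same amount. A sketch with made-up ECEF coordinates, assuming a single constraining CORS (the real workflows vary):

```python
# Sketch of a 60 Day Time Series correction: if OPUS held the
# constraining CORS at its "predicted" coordinate, but the time series
# shows a different "best available" (modeled) position, shift the
# remote-station estimate by the same amount. ECEF (x, y, z) in
# metres; all coordinates are made up for illustration.

def apply_cors_correction(remote, predicted_cors, modeled_cors):
    """Shift a remote-station estimate by the CORS coordinate
    correction (modeled minus predicted), component by component."""
    return tuple(r + (m - p)
                 for r, p, m in zip(remote, predicted_cors, modeled_cors))

remote = (-1283405.123, -4726305.456, 4077985.789)
predicted = (-1288401.000, -4721995.000, 4078510.000)  # value OPUS held
modeled = (-1288401.012, -4721994.985, 4078510.021)    # 60 day estimate
corrected = apply_cors_correction(remote, predicted, modeled)
```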

Whether you use OPUS or a commercial post-processing program, your "remote station" coordinate estimate is ONLY going to be as good as the coordinate estimates input for each CORS that constrains the solution. GIGO!

I have been using a “hybrid-OPUS” method for some years now, where I use the OPUS G-FILE data (vectors and statistical data) combined with either 60 Day Corrected NGS coordinate estimates, or SOPAC/SECTOR estimates on the several CORS. It works VERY well, and I can use ANY CORS, regardless of the variance between the predicted and modeled/60 day estimates.

Loyal

 
Posted : February 2, 2011 12:27 pm