OPUS Projects vs DIY

(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

I am building a height modernization network, mainly to position my pedestal in the NSRS but also to get some good GPS-on-benchmark observations. I plan on bluebooking it so that the new mark will be published. It is not yet complete; I have one more benchmark to add to the network in the SW and repeat observations on the three in the NE. The final network will have all lines duplicated.

The way OPUS Projects works is that it uses the CORS that were used in all of the individual OPUS static solutions, but it also processes the connecting lines (i.e., lines between stations that have common observation time). There are several ways to pick which non-trivial lines are processed (and you can process all lines, both trivial and non-trivial).
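As a rough illustration of the trivial/non-trivial idea (a minimal sketch only, not anything OPUS Projects actually runs; the station names are made up):

    # For one session, list every possible baseline between stations with common
    # observation time, then pick one of the simplest independent sets (a "hub"
    # scheme) of n-1 non-trivial lines; the rest are trivial (redundant).
    from itertools import combinations

    stations = ["CORS1", "BM_A", "BM_B", "BM_C"]              # hypothetical occupations

    all_baselines = list(combinations(stations, 2))           # n*(n-1)/2 candidate lines
    hub = stations[0]                                         # e.g., the CORS as hub
    independent = [(hub, s) for s in stations if s != hub]    # n-1 non-trivial lines
    trivial = [b for b in all_baselines if b not in independent]

    print("all candidates:", all_baselines)
    print("processed (non-trivial):", independent)
    print("trivial (redundant):", trivial)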

I processed in TBC (using one CORS) and adjusted using Geolab, holding only the CORS (the open circle at the top) in latitude, longitude, and ellipsoidal height. I then checked the vertical misclosures (NAVD88), in meters, at the benchmarks:


PID       TBC/Geolab   OPUS Projects
DN6056      0.000        -0.035
KX1261     -0.011         0.052
KX2400      0.017        -0.052
KX2401     -0.002        -0.049
KX2402     -0.006        -0.024
KX2442      0.001        -0.017
KY0376      0.032        -0.072
KY0391      0.003        -0.040
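
For anyone following along, this is all a vertical misclosure check amounts to (a minimal sketch with placeholder numbers, not my actual project values):

    # Vertical misclosure at a bench mark: computed NAVD88 height minus published
    # NAVD88 height, where computed H = adjusted ellipsoid height h minus geoid N.
    # All values are in meters; the numbers below are placeholders, not project data.
    h_adjusted  = 312.456     # ellipsoid height from the adjustment (placeholder)
    geoid_N     = -33.210     # GEOID12A separation at the mark (placeholder)
    H_published = 345.677     # published NAVD88 orthometric height (placeholder)

    H_computed = h_adjusted - geoid_N
    misclosure = H_computed - H_published
    print(f"misclosure = {misclosure:+.3f} m")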

All of the sessions were a minimum of 120 minutes; most were much longer.

I note that all of the BMs have significantly better misclosures from my processing/adjustment than from OPUS Projects (same constraint). That also means that GEOID12A is very good in this area. Once I get all the lines observed and processed, I will do the comparison again, and also compare the final constrained adjustments.

I am not yet ready to start using OPUS Projects. I do not see the advantage at this time (I didn't see it when I went to the class, either).

 
Posted : March 5, 2014 7:45 am
(@paul-in-pa)
Posts: 6044
Registered
 

Does OPUS Projects Work With OPUS-RS ?

I find OPUS-RS much better for elevations. Split your long OPUS observations and resubmit. In OPUS-RS no CORS is ever held fixed; it is a best fit of all the data. OPUS-RS uses twice as many observables as OPUS.
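If you want to try that, one way to carve a long file into shorter pieces is something like the following (a rough sketch that calls teqc from Python; the file names and window start times are examples only, and the -st/+dh windowing options and their time format should be checked against your own teqc documentation):

    # Window a long RINEX observation file into shorter chunks for OPUS-RS.
    # Assumes teqc is installed and on PATH; verify the -st time format for
    # your build before relying on this.
    import subprocess

    src = "site0640.14o"                               # example input file name
    windows = [("2014_03_05:12:00:00", "a"),
               ("2014_03_05:14:00:00", "b")]           # example 2-hour windows
    for start, tag in windows:
        out = f"site0640{tag}.14o"
        with open(out, "w") as f:
            subprocess.run(["teqc", "-st", start, "+dh", "2", src],
                           stdout=f, check=True)
        print("wrote", out)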

Or is OPUS Projects a way to get almost as good as OPUS-RS?

Paul in PA

 
Posted : March 5, 2014 8:28 am
(@glenn-breysacher)
Posts: 775
Registered
 

Does OPUS Projects Work With OPUS-RS ?

I find that 4-hour static sessions result in the best orthometric heights.

 
Posted : March 5, 2014 8:38 am
(@loyal)
Posts: 3735
Registered
 

Does OPUS Projects Work With OPUS-RS ?

OPUS-RS may work fine back East where there are CORS sites every few miles, but out here in the West its ONLY value is quick horizontal positions. The vertical is all over the place, and totally useless on short occupations.

Loyal

 
Posted : March 5, 2014 8:45 am
(@loyal)
Posts: 3735
Registered
 

That's interesting John...

I took the OPUS Projects training two years ago, and sat in on it again last week.

I haven't played around with it much in the last year, but I will probably get back into it this Spring on an upcoming project. As with any "package," there are any number of ways to play the game, and not all of them produce the "best" results.

Let us know how things work out when you are finished.

Loyal

 
Posted : March 5, 2014 8:49 am
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

Does OPUS Projects Work With OPUS-RS ?

No, it uses PAGES as the processor (same as OPUS-S). You need a minimum of two-hour observations to get anything decent; I think it MIGHT process shorter sessions, but the results would definitely be suspect.

The utility definitely has some nice features; I wish they would create a manual (maybe they did?). I did go to the class (an early class a few years ago when it was still in beta), but there are a lot of nuances, and as I get older I forget things more easily. But I was able to get it working fairly well. I probably should have deleted or disabled all of the extraneous CORS; everything is pretty close to the one at the top of the map.

 
Posted : March 5, 2014 11:01 am
(@cliff-mugnier)
Posts: 1223
Registered
 

We just completed a bluebooking project here at LSU for a local surveyor in town. The process was like pulling teeth. The GPS processing was done with TGO; everything else was done with NGS software. Tech support from NGS was glacial.

 
Posted : March 5, 2014 12:16 pm
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

bluebooking

I do quite a bit of bluebooking; I did my first one in 1986. I have developed programs and workflows to do it quite efficiently. If anyone ever needs a project bluebooked, give me a call. Probably 3/4 of the 100 or so projects I have submitted were done by others (i.e., the field work was done by others and I did the office work).

OPUS Projects is supposed to make the process easier, but I don't quite see it that way. It does create G-files and a B-file, but there is more to it than just that. So, to get points published in the NSRS you still need to do the bluebook process, even with OPUS Projects. I think some are under the impression that OPUS Projects replaces bluebooking; not so (at least not now).

 
Posted : March 5, 2014 12:23 pm
(@kevin-samuel)
Posts: 1043
 

Does OPUS Projects hold all CORS as fixed?

I assume you may be seeing "better"-looking results since you appear to be holding only one CORS site in your TBC processing.

Is it possible to force OPUS Projects to use only one CORS?

Have you tried processing data from "all" of the CORS stations that OPUS Projects uses in another TBC adjustment? If so, I would be curious to see how those values compare with the reported values you listed here.

If my assumptions are correct, I would not expect identical results, since you are working with slightly different data sets within a different set of constraints.

 
Posted : March 5, 2014 2:47 pm
(@loyal)
Posts: 3735
Registered
 

OPUS Projects will hold ONLY those CORS that you select as constraints as "fixed."

Loyal

 
Posted : March 5, 2014 5:33 pm
(@kevin-samuel)
Posts: 1043
 

I see. I am quite interested in John's final results.

Thanks Loyal!

 
Posted : March 5, 2014 5:44 pm
(@geeoddmike)
Posts: 1556
Registered
 

bluebooking

FWIW,

I thought the value of OPUS Projects was in the ability of a number of widely separated observers to submit data to a project. The tool would combine the observation files, form sessions, and do the processing. Not likely the approach of most private-sector entities, but well suited to large statewide or regional projects.

The ellipsoidal adjustment (free and fully constrained) can be readily automated, especially when the fixed control stations are CORS. Adding NAVD88 constraints is not something I'd feel confident automating. The dilemma with respect to freeing and fixing constraints is well described in this paper: http://geodesyattamucc.pbworks.com/w/file/52782335/ConstrainedAdjustments1.pdf
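To make the free-versus-constrained distinction concrete, here is a toy one-dimensional height-network adjustment (a minimal sketch with made-up observations; a real ellipsoidal adjustment is three-dimensional and weighted from the vector covariances):

    # Toy height network: observed height differences between stations A, B, C, D.
    # "Free" (minimally constrained) holds one station; "constrained" holds two.
    import numpy as np

    stations = ["A", "B", "C", "D"]
    # (from, to, observed dH in meters) -- made-up numbers for illustration
    obs = [("A", "B", 1.503), ("B", "C", -0.742), ("C", "D", 2.118),
           ("A", "C", 0.755), ("B", "D", 1.381)]

    def adjust(fixed):                        # fixed: dict of station -> held height
        free = [s for s in stations if s not in fixed]
        A = np.zeros((len(obs), len(free)))
        L = np.zeros(len(obs))
        for i, (frm, to, dh) in enumerate(obs):
            L[i] = dh
            for s, sign in ((to, +1.0), (frm, -1.0)):
                if s in fixed:
                    L[i] -= sign * fixed[s]   # move held heights to the right-hand side
                else:
                    A[i, free.index(s)] += sign
        x, *_ = np.linalg.lstsq(A, L, rcond=None)
        heights = dict(fixed)
        heights.update(zip(free, x))
        return heights

    print("minimally constrained:", adjust({"A": 100.000}))
    print("fully constrained:    ", adjust({"A": 100.000, "D": 102.850}))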

On the matter of the relative merits of the PAGES processor: it is a multi-baseline (not single-baseline) processor that also includes modeling not generally part of commercial packages. It had problems correctly fixing integers for short-duration sessions. It should be remembered that short sessions rely on the robustness of the models and not on the randomization of errors over time.

As Mr. Hamilton correctly notes, "blue booking" is not merely processing GPS vectors and combining them with constraints into an adjustment. What is commonly known as the "Blue Book" is a document with the title "Input Formats and Specifications of the National Geodetic Survey (NGS) Data Base." It describes the data and formats needed to satisfy the requirements of the NGS database.

Each point must be uniquely named and numbered in order to access information about its position, heights, characteristics, description, observations at the point, and the vectors relating it to the NSRS. Keeping this information consistent across multiple files is a challenge.
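A trivial illustration of the kind of cross-file bookkeeping involved (a sketch only; the station lists here are stand-ins, not the actual Blue Book record layouts):

    # Verify that every station name referenced in one project file appears in the
    # master station list, and flag anything that does not match.
    master = {"DN6056", "KX1261", "KX2400", "KX2401"}       # names in descriptions
    referenced = {"DN6056", "KX1261", "KX2400", "KX240l"}   # names found in vectors

    missing = referenced - master
    unused = master - referenced
    if missing:
        print("referenced but not described:", sorted(missing))  # catches the typo "KX240l"
    if unused:
        print("described but never observed:", sorted(unused))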

I always told persons wanting to blue book a project that had already been observed that it would be more trouble and time than they would likely be able to justify. NGS did become very strict regarding the requirement that a project intended for submission to NGS get a planning review and approval.

You are lucky if the project was observed to the appropriate standard, adhered to the specifications, and included all the needed data. Most folks do work in-house; informal data capture and documentation do not cut it.

Cheers,

DMM

 
Posted : March 5, 2014 7:35 pm
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

bluebooking

Based on my limited use of OPUS Projects so far, here are some good things about it:

1) It automatically gets the CORS files needed and the precise ephemerides. It will use the CORS that OPUS used, but you can add or delete other CORS. This saves me from downloading, trimming, etc.
2) It creates pages that summarize all of the positions at each station, with plots for N, E, and Up. At a glance you can see how well repeat occupations fit. A very nice analysis tool.
3) When you enter the PID, it gets the published values from the database, so one does not have to worry about errors in entering coordinates, etc. (see the sketch after this list).
4) It automatically sets up sessions and processing schemes.
5) It creates the G-file for each session and, after the adjustment, a B-file.
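
Regarding item 3, pulling a published datasheet yourself only takes a few lines; a rough sketch (the retrieval URL and the PidBox parameter are my best recollection of the NGS datasheet CGI, so treat them as assumptions and check the current NGS site):

    # Fetch the published NGS datasheet for a PID. The URL and the PidBox
    # parameter are assumptions; verify against the current NGS site.
    import urllib.parse
    import urllib.request

    pid = "KX1261"
    url = ("https://www.ngs.noaa.gov/cgi-bin/ds_mark.prl?"
           + urllib.parse.urlencode({"PidBox": pid}))
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    print(text[:500])   # first part of the returned datasheet page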

Once I have completed my project, I will add some more thoughts. Our final observations are tomorrow, but I am going to try to add data from the upcoming "GPS on Bench Marks" campaign in my area. I think submitting that data as a bluebook project will be more useful than the standard OPUS DB.

I have heard for years how PAGES is somehow "better" than the manufacturers' software because it does some modeling that others do not. I feel this is untrue and unfair to the manufacturers. While I only have experience with Trimble, I have processed many tens of thousands of baselines, and I see no significant difference. I have processed lines that were of necessity thousands of kilometers long (India, Egypt), and have made many comparisons against OPUS, PAGES, AUSPOS, SCOUT, and others. In fact, I much prefer TBC (and TGO before that) because of the ability to process shorter-duration data sets. I see no deficiencies at all in the modeling. Yes, there will be differences at the millimeter level, but I can get differences that size out of PAGES by changing a parameter or two.

PAGES is not required for submitting a bluebook project EXCEPT for FAA projects, which I think is idiotic. Actually, the whole FAA process that has to be followed is ridiculous. I can get the same OR BETTER results doing the processing my way. Is landing a plane on a runway so difficult, critical, and dependent on a position that they think they need special procedures? (A rhetorical question; I am a pilot and I know the answer.)

However, ADJUST (from NGS) IS required for bluebooking. I have no issues with ADJUST; it performs as well as any other program. I think their rationale for this requirement is that it gives them a standardized output from the adjustment. In the past it was not required, and I submitted quite a few projects done using Geolab.

 
Posted : March 6, 2014 6:18 am
(@geeoddmike)
Posts: 1556
Registered
 

PAGES models used

Glad to see that at least the GPS data-processing phase is being automated. Maybe these smart folks can do something similar for the later stages of the process.

After looking through my oldest computer, the most recent listing of models and changes to PAGES that I have is from 2004. That said, do Trimble or other commercial packages include the type of modeling shown in the attached list?

On some of the other matters: accepting data from outside users with limited staff requires that rigid bureaucratic requirements be imposed. I thought, however, that the FAA work required users to process with PAGES OR another similarly capable package that can display processing results graphically. Maybe this has changed since I last looked, over five years ago.

After using government/academic packages like OMNI, PAGE-NT, and GAMIT, the only benefit I could see in the commercial packages was enhanced productivity. Debugging and understanding how the processing took place was not as pleasing.

With the advent of excellent orbits, better antennas and receivers, more SVs, etc., perhaps the editing and cleaning is not as needed. It sure was in the '90s.

The fact that just about anyone can use a commercial package does not mean they use it correctly. I always hated it when someone decided that, since they could change the way the tropo parameters were calculated, they should.

I am biased by my work experience. I like to give data close scrutiny. ADJUST, with its rich numeric output, is preferable to seeing the results as either red or green lines.

Cheers,

DMM

 
Posted : March 6, 2014 8:57 am
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

PAGES models used

I first processed GPS data in 1986, when cycle slips had to be fixed manually (visually, by looking at a graph). And we had to pay $3000/month to get an ephemeris (downloaded over a 300 baud modem) to process the data on a special computer that ONLY had BIG (not 5 1/4" or 3 1/2") floppy drives. We have come a long way since then!

I will agree, Mike, that using PAGES gives a lot more control for troubleshooting, and some of those models might be useful for very long lines. I am not (unfortunately) privy to what goes on inside the commercial processing software, but I know it works. I also know it may occasionally give me a false negative, but never a false positive, as long as I have enough data for the length of the line.

 
Posted : March 6, 2014 1:54 pm
(@alan-chyko)
Posts: 155
Registered
 

John - any particular reason why you did your test adjustment in Geolab as opposed to TBC? Just curious.

 
Posted : March 6, 2014 4:21 pm
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

Alan: I have been using Geolab since 1986 and am very comfortable with the results. There are some things the adjustment algorithms in TBC do that I do not like. One is the way it weights geoid separations. Another reason is that Geolab does the computations in the ECEF (Earth-Centered, Earth-Fixed) system; I believe TBC does them in a local horizon system, which has limitations. But mostly the reason is familiarity.
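For anyone who has not run into the distinction: a local horizon adjustment works with baseline components rotated from ECEF into east-north-up at each station. A minimal sketch of that rotation (the latitude/longitude and the vector are arbitrary example values):

    # Rotate an ECEF baseline vector (dX, dY, dZ) into local east-north-up at a
    # station with geodetic latitude/longitude (degrees). Example values only.
    import numpy as np

    lat, lon = np.radians(40.5), np.radians(-79.9)   # arbitrary station
    dXYZ = np.array([123.456, -78.901, 45.678])      # arbitrary ECEF vector, meters

    R = np.array([
        [-np.sin(lon),                np.cos(lon),               0.0],
        [-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)],
        [ np.cos(lat) * np.cos(lon),  np.cos(lat) * np.sin(lon), np.sin(lat)],
    ])
    e, n, u = R @ dXYZ
    print(f"east {e:.3f} m, north {n:.3f} m, up {u:.3f} m")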

Reminds me of some political graffiti I saw in 1972: "Why change richards in the middle of a screw... vote for Nixon in '72" (replace "richards" with the commonly used nickname). To summarize: it may not be perfect, but it is what I know best.

I should add that I always do a free adjustment in TBC after processing everything, and then I reprocess using the updated coordinates (known as coordinate seeding). This improves the results, mainly when the network covers a large area. I also use the adjustment in TBC to troubleshoot and look for bad HIs, etc. But I get better vertical results in Geolab, and I can add conventional observations. Yes, you can include conventional observations in TBC, but only if they are in a DC file, whereas I have much more flexibility to add and edit observations in Geolab.

 
Posted : March 7, 2014 4:47 am
(@mightymoe)
Posts: 9920
Registered
 

Hmmm, looks to me like you are connecting multiple observations to each other and setting up on the same points more than once. I thought we were just supposed to set up an antenna over a point, send it off to OPUS, and we're good!! hehe

And I love the idea that the point is then considered "adjusted," lol.

Thank you for showing the results. I have had similar results myself, and it's interesting to see over 0.3' between the two adjustments; that is something I've seen many times before with respect to the vertical. Have you run the numbers using individual CORS to see what heights are produced? An odd aspect of adjusting verticals using TBC or TGO was that one CORS baseline tie would be, say, 0.2' higher than another, and the adjustment might end up slightly higher than the high one or lower than the low one. That always bothered me a little.

I've found that different CORS sites will produce quite different verticals when run separately.

What I find unusual is that with the TBC/Geolab adjustment the worst number you have is 0.1', which I never see around here. I hope attention is being paid to the column on the right and just what it implies: John is running control the way it's meant to be run, and the vertical results from OPUS are not all that great, while the adjustment in TBC using the close CORS worked pretty well.

 
Posted : March 7, 2014 5:43 am
(@kevin-samuel)
Posts: 1043
 

FWIW

Maybe I am wrong, but I think the preliminary results John posted are in meters.

 
Posted : March 7, 2014 8:00 am
(@john-hamilton)
Posts: 3347
Registered
Topic starter
 

They are in meters.

 
Posted : March 7, 2014 12:17 pm