
Why don't OPUS-S vectors repeat?

(@bill93)
Posts: 9837
Member
Topic starter
 

I ran data from last Friday through OPUS-S on Monday evening using the default CORS. I ran it again this (Wednesday) evening, forcing one of the CORS to be different. All runs list the same ephemeris file (rapid). The two re-used CORS shown in the extended report each had vectors to my point that differed slightly from the first run.

I ran it again forcing the CORS to be the same ones used in the first run. All three runs had slight differences in the number of observations used per satellite, and several mm of difference in each of the vector components. Thus the difference is not due to the influence of a different CORS in the mix.

If I put in the same data to a program, I usually expect to get the same output. Something changed.

a) What information were they using that caused the difference in processing?
b) If it is a regular occurrence, what is the schedule, so I know when the results might change again?

---------
As an aside, I noticed something insignificant that is amusing. The vectors are printed out to 5 decimals, but the last 2 digits don't change in each instance. Obviously something is rounded to the nearest mm and added to something that wasn't rounded.
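A toy illustration of that aside (the numbers are made up, not from any OPUS report): if each printed component is the sum of a value rounded to the nearest mm and an unrounded fixed offset, the digits beyond the mm place come entirely from the offset and so never change between runs.

```python
# Hypothetical sketch of the rounding effect: one addend rounded to the
# nearest mm, one addend left unrounded, result printed to 5 decimals.
def component(run_value_m, fixed_offset_m):
    rounded_mm = round(run_value_m, 3)            # rounded to nearest mm
    return round(rounded_mm + fixed_offset_m, 5)  # printed to 5 decimals

a = component(12.345678, 0.00042)
b = component(12.346901, 0.00042)
# a and b differ at the mm place, but both end in the same sub-mm digits (...42)
```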

 
Posted : February 15, 2017 9:36 pm
(@paul-in-pa)
Posts: 6044
Member
 

Put the same data in to the same three CORS and you will get the same answer, because your answer is a mean of the three independently solved positions. What you see as a vector is a mathematical calculation to that meaned position.

Change one of the CORS and your position is now altered by different data, i.e. the solution from that different CORS, so all three mathematical vectors to that different meaned point can change.
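Paul's description can be sketched numerically (the coordinates below are invented for illustration, not real CORS or OPUS values): solve the rover position once per CORS, mean the three positions, and report each "vector" as the difference from the CORS to that mean. Changing any one input solution moves the mean, so all three reported vectors change.

```python
# Hypothetical coordinates illustrating the mean-and-vectors idea.
cors = {"A": (0.0, 0.0, 0.0), "B": (1000.0, 0.0, 0.0), "C": (0.0, 1000.0, 0.0)}

# Rover position solved independently from each CORS (slightly noisy).
solutions = [
    (500.001, 500.002, 10.000),
    (499.999, 500.000, 10.003),
    (500.003, 499.998, 10.001),
]

# Mean of the three single-baseline positions.
mean_pos = tuple(sum(c) / len(solutions) for c in zip(*solutions))

# Each reported "vector" is CORS -> meaned position.
vectors = {n: tuple(m - x for m, x in zip(mean_pos, xyz)) for n, xyz in cors.items()}
```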

What you see as differences in the number of observations could mean that the CORS data was found to have some bad data that was subsequently filtered out after it was initially posted. BTW, you cannot get back to that original data. There is no schedule; the data testing and possible filtering happens when it happens, usually because someone, somewhere, saw a problem in a solution to a fixed point. Alternatively, the original data sent automatically from the CORS to NGS might have had a transmission problem and cleaner data was resent at a later time.

Another thing to check is which NGS computer and software version solved your position; it is listed in the header of the OPUS report. A software upgrade would not occur on all computers at the exact same time; in fact, an upgrade is probably put on one machine and tested before it is put on all machines. You do not get a choice of which computer gets used.

I will give you some kindergarten advice from my granddaughter. "You get what you get and you don't throw a fit."

Paul in PA

 
Posted : February 16, 2017 5:47 am
(@mark-mayer)
Posts: 3363
Member
 

Least squares is an iterative process. Where you start affects where you end up. Iterations continue until the changes in each iteration slip below a fixed limit, not until you reach a particular result. In an OPUS solution the a priori assumptions - the starting point for the calculations - are chosen at random, therefore the endpoint will vary slightly.
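The iterate-until-threshold idea Mark describes can be shown with a generic sketch (this is not OPUS's actual code, just the general pattern): updates stop once the correction drops below a tolerance, so runs started from different points can end a tiny fraction apart.

```python
# Generic Newton-style iterate-to-convergence loop (illustrative only):
# stop when the correction |dx| falls below a fixed tolerance.
def solve(f, dfdx, x0, tol=1e-9, max_iter=50):
    x = x0
    for _ in range(max_iter):
        dx = -f(x) / dfdx(x)
        x += dx
        if abs(dx) < tol:
            break
    return x

# Solve x^2 - 2 = 0; different x0 values converge to (nearly) the same root.
root = solve(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```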

If you are running the same data over time, there may also be different ephemerides being used, as the ultra-rapid, rapid, and precise ephemerides progressively become available. This can shift the results.

 
Posted : February 16, 2017 7:30 am
(@mightymoe)
Posts: 9936
Supporter
 

If you look at the vector solution from each individual CORS for a point, then at the least squares solution from the same points combined, you will probably be surprised by the outcome...........you may say, how the heck did they come up with that number? :cool:

 
Posted : February 16, 2017 8:26 am
(@a-harris)
Posts: 8761
Member
 

Paul in PA, post: 414350, member: 236 wrote: I will give you some kindergarten advice from my granddaughter. "You get what you get and you don't throw a fit."

That is priceless...........

The CORS stations may have had updated position values during that difference in time, or your values were given different weighting in the two different processing events.

 
Posted : February 16, 2017 11:59 am

(@bill93)
Posts: 9837
Member
Topic starter
 

Paul in PA, post: 414350, member: 236 wrote: Put the same data in to the same three CORS and you will get the same answer

That's what I was expecting, but didn't see happen. Thus I was trying to figure out what input other than mine would have changed.

Paul in PA, post: 414350, member: 236 wrote: What you see as differences in the number of observations could mean that the CORS data was found to have some bad data that was subsequently filtered out after it was initially posted. BTW, you cannot get back to that original data. There is no schedule; the data testing and possible filtering happens when it happens, usually because someone, somewhere, saw a problem in a solution to a fixed point. Alternatively, the original data sent automatically from the CORS to NGS might have had a transmission problem and cleaner data was resent at a later time.

That's helpful to know.

Paul in PA, post: 414350, member: 236 wrote: software version solved your position, it is listed in the header

All identical for these submissions.

Paul in PA, post: 414350, member: 236 wrote: "You get what you get and you don't throw a fit."

I really didn't think trying to understand what I got, and the rules of the game, was throwing a fit.

Mark Mayer, post: 414368, member: 424 wrote: Least squares is an iterative process.

I was under the impression that there was no LS or iteration to convergence going on in OPUS.

Mark Mayer, post: 414368, member: 424 wrote: the starting point for the calculations - is chosen at random

I don't think I'd call it random. It is probably akin to a Garmin solution to get a starting point in the neighborhood, but it is algorithmic with no randomness other than in the sense that any GNSS solution includes random error. I think they pick a starting point and do one run through to get the answers.

If the data fed it is the same (including the ephemeris and CORS data) it gives the same result. I re-ran the file this morning and got the same output all the way through the extended report down to the last digit.

Mark Mayer, post: 414368, member: 424 wrote: different ephemerides

As noted in the OP, all these runs used the same rapid ephemeris file. I'm quite aware of the three types of file, but was too impatient to wait almost three weeks to see results from the precise orbits.

MightyMoe, post: 414384, member: 700 wrote: If you look at the vector solution from individual CORS for a point; then at the least squares solution from the same points combined you will probably be surprised by the outcome...........you may say how the heck did they come up with that number?

I don't think there is any least squares going on beyond an average (which is the LS solution to a simple problem). It isn't easy to sort through the report due to the need to keep track of what's IGS and what's NAD83, so that may contribute to the confusion of understanding their numbers.
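Bill's parenthetical is easy to check numerically (made-up observations): for repeated measurements of a single unknown, the least-squares estimate is exactly the arithmetic mean, since the mean minimizes the sum of squared residuals.

```python
# For repeated measurements of one unknown, the LS estimate is the mean.
obs = [10.02, 9.98, 10.01]
mean = sum(obs) / len(obs)

def ssr(x):
    """Sum of squared residuals for candidate value x."""
    return sum((o - x) ** 2 for o in obs)

# SSR is smallest at the mean; nudging x either way only increases it.
assert ssr(mean) <= ssr(mean + 1e-6)
assert ssr(mean) <= ssr(mean - 1e-6)
```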

A Harris, post: 414421, member: 81 wrote: The CORS stations may have updated position values during that difference in time

The CORS positions are precisely the same in all the reports.

 
Posted : February 16, 2017 2:09 pm
(@loyal)
Posts: 3735
Member
 

Bill,

You might want to check and see if the same "blade server" was used in both solutions (I think that's what they are called).

Here are some OPUS-S report data lines from a couple of years ago (the easiest for me to find today):

07/07/15
SOFTWARE: page5 1209.04 master91.pl 022814
SOFTWARE: page5 1209.04 master51.pl 022814
SOFTWARE: page5 1209.04 master91.pl 022814
07/09/15
SOFTWARE: page5 1209.04 master51.pl 022814
SOFTWARE: page5 1209.04 master90.pl 022814
SOFTWARE: page5 1209.04 master93.pl 022814
SOFTWARE: page5 1209.04 master53.pl 022814

All of these were submissions on the same point (different observations), and as I understand it, the "masterXX" indicates the blade used. I seem to recall some years back seeing something like you describe, and narrowing it down to DIFFERENT blades being used on the SAME observation set (same CORS, same day, etc.). Of course I could be all wet too.

Loyal

 
Posted : February 16, 2017 2:59 pm
(@loyal)
Posts: 3735
Member
 

Okay, I ran a quick (and unscientific) test.

I uploaded 5 identical RINEX files (using different file names) in quick succession, and got back results from 4 different "blades" (Master 90, 91, 92, & 93), which included 2 solutions from Master 93.

All of the G-file Vectors are IDENTICAL, so I doubt that different blades is an issue.

Oh well...

 
Posted : February 16, 2017 6:58 pm
(@geeoddmike)
Posts: 1556
Member
 

FWIW,

I agree that given the same inputs a program should return the same results. Since you state that there are differences in the number of observations used, the inputs are not the same.

Have you also compared the OBS BY SATELLITE VS BASELINE values for differences rather than merely the header tabulation of OBS USED?

The inputs to the OPUS-S algorithms are:

1. Observation data at your site (your submitted file).
2. Antenna model and height of instrument (from user input). BTW, the antenna height and model offsets are transformed from dE dN dU to dX dY dZ and applied to the unknown site's coordinates.
3. A navigation and SP3 ephemeris file matching the time interval.
4. Positions and observations at the CORS. The midpoint of your file's date and time is chosen as the processing epoch, which determines the reference sites' (CORS) positions for the computations, and data at the CORS is extracted to match your observation file's date/times.
5. Pre-defined processing parameters. OPUS was originally created to provide an alternative to the often "unique" approaches to the choice of parameters by those submitting data to the NGS. OPUS also includes a number of models not included in commercial packages. Submitters cannot change these parameters, e.g. elevation mask, tropospheric model and parameters, and more.
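The dE dN dU to dX dY dZ step in item 2 is the standard ENU-to-ECEF rotation at the site's latitude and longitude. A minimal sketch (illustrative only, not OPUS's implementation):

```python
import math

# Rotate a local East/North/Up offset into Earth-centered dX/dY/dZ
# using the standard ENU-to-ECEF rotation at the site's lat/lon.
def enu_to_ecef(de, dn, du, lat_deg, lon_deg):
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    sl, cl = math.sin(lat), math.cos(lat)
    so, co = math.sin(lon), math.cos(lon)
    dx = -so * de - sl * co * dn + cl * co * du
    dy = co * de - sl * so * dn + cl * so * du
    dz = cl * dn + sl * du
    return dx, dy, dz

# A 2 m antenna height (pure Up) at lat 45, lon 0 splits between X and Z.
dx, dy, dz = enu_to_ecef(0.0, 0.0, 2.0, 45.0, 0.0)
```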

As others have noted, CORS site data is sometimes replaced, possibly because it was missing or mangled in transmission. This to me is the most likely explanation for the differences, which you mention are small. BTW, how small?

Computations are performed in ITRF, with the exceptions that the antenna height (UP) and antenna offsets (ENU) are applied to the unknown site's XYZ, and the results are provided in NAD83.

As for your aside, I cannot understand why the vectors would change but not the sub-millimeter portion. How many comparisons have you made?

Cheers,

DMM

 
Posted : February 16, 2017 7:06 pm
(@shawn-billings)
Posts: 2689
Member
 

In the past I've seen holes in data availability that fill in days or weeks later. Not sure why this happens, only that it does. This would be my prime suspect.

 
Posted : February 16, 2017 7:15 pm