Your idea of a simultaneous "session" baseline-processing arrangement between all receivers observing at the same time is incorrect.
I've bluebooked dozens of projects using PAGES/PAGE-NT.
It is in fact a session processor, as @geeoddmike mentioned. I was also a certified TBC trainer. I still run in-house training for field and office software.
OPUS is not the same thing as TBC. Please read the OPUS FAQ - it should help you understand that OPUS is not a network. It's a single-point solution and has little to no bearing on our discussion of session vs. baseline processing.
TBC (and most commercial processing programs) is not a session processor. Developing, processing, and adjusting a network in TBC is not the same thing as jamming a RINEX file into OPUS.
Differential GPS/GNSS positioning works because software processes the single-, double-, and triple-difference solutions of the observations to compute a single vector between any two receivers observing at the same time, against any number of common satellites.
That's the way it works in TBC and other commercial processors. Not in session processors. Session processors evaluate all observations for a given time series simultaneously, resulting in multiple baselines and covariances computed for all points in the session - which is why it is unwise to include the trivial baselines.
It's part of the reason why PAGE-NT utilizes hubs; each point gets connected to the nearest one, making the network plot look like a radial network of "sideshots", yet statistics are computed for all points in the session.
This does not happen in single-baseline processors. In a single-baseline processor, if a post-processed vector between two points is deleted, that point pair will NOT get covariances computed during the network adjustment - unlike in a session processor. There's a loss of data that does not occur with session processors.
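Just to put numbers on the differencing idea described above: this is a toy sketch of my own, not what TBC or PAGE-NT actually does internally, and the station names, satellite IDs, and values in the `phase` dictionary are invented.

```python
# Toy numbers, in metres. Two receivers (A, B) observing the same two
# satellites at the same epoch; all names and values are invented.
clock_A, clock_B = 1.7, -0.9              # receiver clock errors (m)
phase = {                                  # one-way ranges contaminated by receiver clock
    "A": {"G05": 20183412.321 + clock_A, "G12": 21904771.854 + clock_A},
    "B": {"G05": 20183905.118 + clock_B, "G12": 21905240.407 + clock_B},
}

# Single difference (between receivers, same satellite): satellite clock and
# orbit errors common to both receivers cancel; (clock_B - clock_A) remains.
sd = {sv: phase["B"][sv] - phase["A"][sv] for sv in ("G05", "G12")}

# Double difference (between satellites): the receiver clock difference cancels
# too, leaving the relative geometry that ties receiver B to receiver A.
dd = sd["G12"] - sd["G05"]

print(sd)   # each single difference still carries (clock_B - clock_A) = -2.6 m
print(dd)   # about -24.244 m; the receiver clock terms have cancelled
```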
A vector is a magnitude and a direction between two points during a chunk of time.
I'm pretty sure I get what a vector is. I've processed a lot of networks on a lot of different software packages over the years.
For corridor surveys we would set static receivers up on each point at least twice. In those days we had about eight that could collect static, so it was a matter of setting two main control points near the ends of the line and leap-frogging or pinwheeling the remaining receivers. We would use different receivers for each point and measure to different points, using both feet and meters.
When you do that, the resulting vectors are complicated, but you run the processing different ways to check the numbers on each point, and if it all looks good you do a final processing run and report.
Trivial vectors would be vectors between fixed control that get switched off for processing. There is no reason to process vectors between held CORS points, and there may be other points you wish to hold whose vectors also get turned off. It all depends on the layout of the project. After the GPS static is finished, the route would get a level run, and that's what would be included for the Z value. The deliverable is a spreadsheet report with latitude, longitude, height, XYZ in state plane and surface coordinates, type of monument, control monuments used, etc.
The rule was 10 minutes plus one minute per mile for static; that's an old rule, and GPS is way better these days.
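If it helps anyone tabulate it, that old rule of thumb is trivial to encode; the function name below is mine, not from any manual.

```python
def min_static_occupation_minutes(baseline_miles: float) -> float:
    """Old rule of thumb quoted above: 10 minutes plus 1 minute per mile."""
    return 10.0 + baseline_miles

for miles in (2, 10, 25):
    print(f"{miles:>3} mi -> {min_static_occupation_minutes(miles):.0f} min")
```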
@rover83
I'm glad you've bluebooked dozens of projects using PAGES/PAGE-NT. The same goes for your certified TBC trainer status and your in-house training for field/office software. Those accomplishments have no bearing on how well we are (or are not) communicating about this topic.
I never claimed that OPUS and TBC are the same thing. In fact, I'm arguing the same as you: they are different.
And next, you eloquently point out the crux of the perceived disagreement: "understand that OPUS is not a network. It's a single-point solution and has little to no bearing on our discussion of session vs. baseline processing." Exactly!
The literature cited by geeoddmike (and reiterated by me) does not support the claim that OPUS is a session solution or session processor - read the text. It leaves out much discussion that is relevant to our perceived disagreement.
The following paper discusses the actual processing of baselines and their use in session adjustments or session solutions:
https://mcraymer.github.io/geodesy/pubs/gps_ion1992.pdf
According to the paper, the concepts of vector processing, session adjustment (baseline processing), and session solution (session processing) are distinct, and that distinction is likely the source of our confusion. Even the paper is a bit loose with the language.
What are we really disagreeing over? Processing baselines (vector solutions), or the best solution for the location of a measured point? Read the definitions of terms from the paper carefully; they matter (there's a small counting sketch after them):
Baseline: Coordinate vector resulting from any station pair.
Session: An observing period of multiple receivers.
Independent baselines: A set of baselines where no individual baseline is a linear combination of any others.
Linearly dependent (trivial) baselines: Baselines which are linear combinations of others.
Baseline solution: Solution from processing a single baseline.
Session adjustment or baseline processing: 3D least squares adjustment of all possible baseline solutions.
Session solution or session processing: Solution from simultaneous processing of all independent baselines with mathematical correlations between baselines.
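To put rough numbers on those definitions: with r receivers observing in one session there are r(r-1)/2 possible baselines but only r-1 independent ones; the rest are trivial. A small sketch of my own follows - the station names other than 833001 (the paper's reference station) are placeholders.

```python
from itertools import combinations

def baseline_counts(num_receivers: int) -> tuple[int, int, int]:
    """Return (all possible, independent, trivial) baseline counts for one session."""
    total = num_receivers * (num_receivers - 1) // 2   # every station pair
    independent = num_receivers - 1                    # e.g. all pairs to one reference
    return total, independent, total - independent

# The 8-station Ottawa basenet case from the paper: 28 possible baselines,
# 7 independent (station 833001 as reference), 21 trivial.
print(baseline_counts(8))   # -> (28, 7, 21)

# One valid independent set: connect every other station to the reference.
stations = ["833001", "S2", "S3", "S4", "S5", "S6", "S7", "S8"]  # placeholder names
independent_set = [("833001", s) for s in stations[1:]]
all_pairs = list(combinations(stations, 2))
trivial = [p for p in all_pairs if p not in independent_set]
print(len(independent_set), len(trivial))   # 7 21
```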
"In baseline processing, all individual baselines can be either separately processed (baseline solution) and subsequently combined in a 3D session adjustment or processed together as in session processing but without the correlations between baselines."
The "Tests With Simulated Data" portion discusses the methodology:
The equivalence of session and baseline processing was tested empirically using simulated GPS observations of an actual survey of the Ottawa GPS basenet. The relative location of the 8 stations in the network is illustrated in Figure 1. The baseline lengths ranged from 2 to 150 km.
The simulated data for this network were generated using the Bernese GPS Software v3.3. Phase observations were computed for a single session at each of the 8 stations using a standard deviation of 3 mm and exactly the same start/stop times. The duration of the session was 6 hours. A broadcast ephemeris obtained from an actual GPS survey of the same network on day 337 (December 3), 1991 was used for the simulation. No cycle slips or atmospheric effects were introduced.
Using this data, session solutions were obtained using station 833001 as the reference receiver for generating the 7 linearly independent baselines. Individual baseline solutions were then computed for all possible (28) baseline combinations. After removing the variance factors from each baseline solution, they were combined in a 3D adjustment (session adjustment). The variance factors were also removed from the session solution (for comparison purposes only). In both solutions, all double difference phase ambiguities were estimated. Similar session adjustments of baseline solutions were also computed for different numbers of these stations in order to check the variation of the n/2 scale factor with the number of receivers.
Baseline Processing (Session Adjustment (aka LSA)) and Session Processing (Session Solution) are two different methods (algorithms) of using independent baselines (coordinate vectors) to arrive at a coordinate value of a point.
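For what it's worth, here is a bare-bones sketch of the "session adjustment" side: combining already-processed baseline vectors in a 3D least-squares adjustment to get point coordinates. The station names, vectors, and sigmas are invented, and it ignores the full vector covariances and the inter-baseline correlations a true session solution carries - an illustration only, not how any commercial package implements it.

```python
import numpy as np

# One station held fixed; invented GPS vectors in a local frame (metres).
A_fixed = np.array([0.0, 0.0, 0.0])
obs = [
    # (from, to, observed (dX, dY, dZ), per-component sigma in metres)
    ("A", "B", np.array([1000.012, -250.006,   80.003]), 0.005),
    ("A", "C", np.array([ 400.008,  900.011,  -30.002]), 0.005),
    ("B", "C", np.array([-599.998, 1150.020, -110.004]), 0.007),  # trivial if all three shared one session
]

unknowns = {"B": 0, "C": 1}                  # column blocks for the unknown stations
rows, rhs, weights = [], [], []
for frm, to, vec, sigma in obs:
    block = np.zeros((3, 3 * len(unknowns)))
    if to in unknowns:
        block[:, 3 * unknowns[to]: 3 * unknowns[to] + 3] += np.eye(3)
    if frm in unknowns:
        block[:, 3 * unknowns[frm]: 3 * unknowns[frm] + 3] -= np.eye(3)
    # Observation equation: vec = X_to - X_frm; known coordinates move to the right-hand side.
    rhs.append(vec + (A_fixed if frm == "A" else 0.0))
    rows.append(block)
    weights.extend([1.0 / sigma ** 2] * 3)

A_design = np.vstack(rows)
obs_vec = np.hstack(rhs)
W = np.diag(weights)

# Weighted least squares: x = (A^T W A)^-1 A^T W l
x = np.linalg.solve(A_design.T @ W @ A_design, A_design.T @ W @ obs_vec)
print("B:", x[0:3])
print("C:", x[3:6])
```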
Agree to disagree?
I really appreciate the time everyone has taken to write such detailed responses.
So to summarize, do not use the vectors between the control points that I wish to hold?
Holding GPS control points depends on how accurate they are.
Since you occupied them, how do they look?
I would process them both using CORS, and then I would process between them. If all those numbers look good, then yes, hold them, but an iffy control point will mess up your adjustments. It's the small "errors" that are difficult to find; the big ones are usually simple to track down.
Keep them in there for the first minimally constrained adjustment, so you can see how well the published values match up to your adjusted values. Run QC on the minimally constrained network and make sure that the data fits with itself before fitting it to the rest of the control.
After that, if you have decided to hold more than one CP, it's generally good practice to remove or disable the observations in between those fixed stations. Keeps the degrees of freedom, and consequently the chi-squared test values, a bit more realistic.
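A rough sketch of why that matters statistically (the numbers are invented; this is just the standard global variance-factor test, not any particular package's implementation):

```python
from scipy.stats import chi2

def global_test(vtpv: float, n_obs: int, n_unknowns: int, alpha: float = 0.05):
    """Global (variance factor) test on a network adjustment."""
    dof = n_obs - n_unknowns                      # degrees of freedom / redundancy
    variance_factor = vtpv / dof                  # a posteriori reference variance
    lower = chi2.ppf(alpha / 2, dof) / dof        # acceptance interval for the
    upper = chi2.ppf(1 - alpha / 2, dof) / dof    # variance factor at level alpha
    return dof, round(variance_factor, 3), lower <= variance_factor <= upper

# Same weighted sum of squared residuals, with and without the extra
# observations between the held control: redundancy and test bounds shift.
print(global_test(vtpv=31.0, n_obs=60, n_unknowns=30))   # dof = 30
print(global_test(vtpv=31.0, n_obs=54, n_unknowns=30))   # dof = 24
```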
I miss doing this type of work.