Trivial?
Well great, here I was all proud because I had been coached up by a guru to get rid of the trivial baselines. Someone you know well (you and I had dinner with him in November 2012). Now two people I regard as GPS gods have differing opinions? My world is imploding...
I originally piped up about it because I still disagree with the rapidly-multiplying-vectors way of thinking. I don't think you get the redundancy and comfort that appears at face value when you set up six receivers simultaneously and, upon import, see what appear to be 15 baselines.
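To put numbers on it, here is a quick sketch of the session arithmetic (nothing here is specific to any particular processor):

```python
# n receivers observing simultaneously yield n*(n-1)/2 possible
# baselines, but only n-1 of them are independent; the rest are the
# "trivial" lines under discussion.
for n in range(2, 7):
    total = n * (n - 1) // 2
    independent = n - 1
    print(f"{n} receivers: {total} baselines shown, "
          f"{independent} independent, {total - independent} trivial")
# 6 receivers: 15 baselines shown, 5 independent, 10 trivial
```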
Which you seem to have agreed with via your statement about "overstating the accuracy"...
I wouldn't argue with you or Loyal, you both know far more than me.
Another paper to consider
FWIW,
I like Dr. Craymer's paper on the issue, "Session Versus Baseline GPS Processing"; see:
ftp://geod.rncan.gc.ca/pub/GSD/craymer/pubs/gps_ion1992.pdf
For other articles of possible interest see: http://www3.sympatico.ca/craymer/geodesy/craymer-pubs.html
Enjoy,
DMM
Trivial?
Yeah, more receivers are sometimes a "bad" thing. To me it is more important to have double occupations than lots of lines, and having more receivers makes double occupations less likely. So there are times when I just like to use two receivers in a sort of traverse mode, where every point gets occupied twice, plus some cross ties to make the loops smaller.
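Here is a toy sketch of that two-receiver traverse mode (point names are made up; the closing session back to the start is what gets the end points occupied twice):

```python
from collections import Counter

points = ["P1", "P2", "P3", "P4", "P5"]

# Leapfrog the pair along the traverse, closing back on the start so
# the end points get doubled too.
sessions = [(points[i], points[(i + 1) % len(points)])
            for i in range(len(points))]
for s, (a, b) in enumerate(sessions, 1):
    print(f"Session {s}: receivers on {a} and {b}")

occupations = Counter(p for pair in sessions for p in pair)
print(occupations)  # every point occupied in exactly two sessions
```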
That said, there are plenty of times I just do a radial survey, all depends on what the objective is. Photo control? No need to get all fancy. But networks of monuments, or intervisible stations, need direct connections.
And, Adam, I hate to ask this (I am terrible with names), but who are you? I assume you are talking about Vegas in November 2012, but I am drawing a blank.
Trivial?
I just sent you an email.
Another paper to consider
Thanks! I'll digest it this weekend. After skimming the beginning, though, I don't think any "standard" surveying software is capable of using this approach, is it?
Trivial?
Thanks, Adam. Sorry about not remembering. I admit I am terrible with names (but pretty good remembering faces).
Trivial?
Count me in the "nothing's trivial" camp, mostly because I don't believe the effort required to eliminate the trivial lines is justified by the result. On big projects, when we may have 9 or 10 receivers going, I essentially create a TIN out of the connections, disabling anything that crosses the nearest-neighbor lines. That often leaves me with trivial lines, but that's life.
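For anyone curious, a rough sketch of that TIN step (my own stab at it using scipy's Delaunay triangulation; station names and coordinates are invented):

```python
import numpy as np
from scipy.spatial import Delaunay

# Made-up station coordinates for illustration.
stations = {"A": (0, 0), "B": (1000, 200), "C": (400, 900),
            "D": (1500, 1100), "E": (800, 1800)}
names = list(stations)
pts = np.array([stations[n] for n in names])

# The triangulation gives the nearest-neighbor connections; any other
# baseline would cross these edges and gets disabled in the adjustment.
tri = Delaunay(pts)
edges = set()
for simplex in tri.simplices:
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))

for a, b in sorted(edges):
    print(f"enable baseline {names[a]}-{names[b]}")
```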
As far as inflating the statistics goes, I find that the baseline processors are usually overly optimistic, at least on very short lines that I can measure with a total station. Star*Net -- which is what I usually use when mixing GPS and terrestrial observations -- allows an inline option to scale the vector error estimates to bring them in line with the more reliable EDM shots. I often end up with scalar values between 3 and 5 on the GPS observations.
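To show what that scaling actually does to the weights, here is a generic sketch of the arithmetic (this is not Star*Net's file syntax, and the sigmas are invented):

```python
import numpy as np

# A-priori vector sigmas in metres (invented numbers).
cov = np.diag([0.002, 0.002, 0.004]) ** 2
k = 4.0  # a scalar in the 3-5 range mentioned above

# Scaling the standard errors by k scales the covariance by k**2,
# down-weighting the GPS vectors relative to the EDM observations.
cov_scaled = k**2 * cov
print(np.sqrt(np.diag(cov_scaled)))  # [0.008 0.008 0.016]
```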
Trivial?
Looks like there are some different ideas on how to approach this on here.
I will read through the provided links as soon as I can find time and see if I can get a better understanding of the process.
With only two receivers it looks like the loop system is the best way to link everything together?
Another thought...
1. You have one point with known coordinates that you wish to hold.
2. You have done RTK work from this point already.
If you have RTK measurements to suitable existing points for your network, I would consider including them in the network least squares solution. At the very least, you could use the positions you already have as seed coordinates for your network adjustment.
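A minimal sketch of the seed-coordinate idea (point names and coordinates are hypothetical):

```python
# An adjustment linearizes around approximate positions, so existing
# RTK-derived coordinates make good starting values.
seed_coords = {
    "CP1": (531202.415, 4210987.332),  # the held control point
    "CP2": (531844.902, 4211403.118),  # from the earlier RTK work
    "CP3": (530990.377, 4211776.240),  # from the earlier RTK work
}
# The least squares iteration then only has to solve for small
# corrections to these values, which converges quickly and reliably.
```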
Howdy,
I personally consider the venerable "Geometric Geodetic Accuracy Standards" to contain excellent advice on what constitutes a well-conditioned network. It is rather dated with respect to GPS capabilities, and its discussion is centered on now-superseded accuracy standards, but it is good nonetheless.
http://docs.lib.noaa.gov/noaa_documents/NOS/NGS/Geom_Geod_Accu_Standards.pdf
Enjoy,
DMM
GeeOddMike, post: 251077, member: 677 wrote: Another paper to consider
FWIW,
I like Dr. Craymer's paper on the issue, "Session Versus Baseline GPS Processing"; see:
ftp://geod.rncan.gc.ca/pub/GSD/craymer/pubs/gps_ion1992.pdf
For other articles of possible interest see: http://www3.sympatico.ca/craymer/geodesy/craymer-pubs.html
Enjoy,
DMM
Neither of the links in your post from 2014 seems to be available; do you know where I can find this? I read the first paper and thought I saved it, but I can't find it. Any help is appreciated.
When I exclude trivial baselines, my statistics get worse and my results get better. During the brief time I convinced myself to do otherwise, I was dogged by slop that I did not expect.
Some of my habits are nostalgia. Others were developed through pain, embarrassment and occasional hard work...
Lee D, post: 447657, member: 7971 wrote: Neither of the links in your post from 2014 seems to be available; do you know where I can find this? I read the first paper and thought I saved it, but I can't find it. Any help is appreciated.
You might find this guideline from the Government of Alberta, which cites Craymer's paper, informative:
Standards, Specifications & Guidelines For GPS Surveys Of Alberta Survey Control
http://aep.alberta.ca/land/director-of-surveys/documents/StandardsGPSSurveysAlbertaSurvey-Mar2010.pdf
I wasn't able to find Craymer's paper online, but I did download it in 2009. I didn't see any copyright on my copy of the paper, so hopefully it is okay to attach it.
Dr. Craymer's paper mentioned above is available from his new (to me) website at: http://www.craymer.com/
Lots of good stuff.
BTW, briefly reviewing the postings, I note thebionicman mentioned the impact of including "trivial baselines" on adjustment statistics. That is exactly the problem: you inflate the degrees of freedom, and the statistics change...
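A back-of-envelope example of that inflation (receiver, session, and network counts are invented; the point is the ratio):

```python
# Each session of n receivers carries only 3*(n-1) independent vector
# components, but importing every baseline claims 3*n*(n-1)/2
# observations.
n = 6              # receivers per session
sessions = 4
unknowns = 3 * 12  # 12 unknown points, 3 coordinates each

obs_independent = sessions * 3 * (n - 1)             # 60
obs_with_trivial = sessions * 3 * n * (n - 1) // 2   # 180

print("DOF, independent vectors only:", obs_independent - unknowns)  # 24
print("DOF, trivial lines included:", obs_with_trivial - unknowns)   # 144
# No new information was added, but the apparent redundancy jumps,
# so the adjustment statistics look better than they should.
```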