> Perhaps you are unfamiliar with the technical terms generally used, but I often see posts about post-processing RTK work, which would generally include what you speak of.
Actually, PPK isn't RTK at all. Why do you think it is?
> But, to the point, I could (do) produce the same StarNet vectors and balancing with RTK.
Okay, so what you're saying (assuming that by the term "balancing" you're referring to some process of survey adjustment) is that you are one RTK user who does export the vector data and adjust it with other classes of measurements? That would put the count at two, then, including Norman Oklahoma. It obviously isn't a widespread practice. The stereotype stands.
> Well, the four GPS occupations were under an hour, but "how much time did it take" made no particular sense as a question since there was no better alternative that was more time efficient. I mean, if a surveyor doesn't have enough time...
I understand your point (and agree), but the following posts, and the one I keep referencing, seem to indicate that you are in the minority in not having to consider budgets. It seems most private surveyors find a way to make money on small lot surveys, construction surveying with razor thin margins, and NOT on large limitless budgets of jobs no one else can do.
I would imagine that your experiences, capabilities, current marketing success, and general outlook are a very small minority of surveyors. To judge the rest by your circumstances doesn't seem reasonable. I was trying to "sell" the process that you described above by not only highlighting the efficacy, but the small cost involved.
For most surveyors, it always matters how much it costs, and how long it takes. That is just the reality. We can lament it, and tilt at windmills, but it didn't seem to work for the other guy.
> It seems most private surveyors find a way to make money on small lot surveys, construction surveying with razor thin margins, and NOT on large limitless budgets of jobs no one else can do.
Well, the point stands that if the work isn't done properly, it makes no difference whether it was under budget or not. It's basically worthless. The real question is how to do things correctly, not how to cut corners to get a leg up in price competition.
Such absolutes with such a subjective discussion. Your minimally constrained data would suggest that the job was "right" with only two GPS vectors. You'll have to take that debate up with your data. But you're happy and can clearly show that your fence line survey is accurate to a centimeter relative to NAD83(2011) as a result. So I'm happy for you.
Namaste
> Such absolutes with such a subjective discussion. Your minimally constrained data would suggest that the job was "right" with only two GPS vectors.
In what strange world would that be true? The GPS pairs are very loose azimuth control, so basically the main condition that the minimally constrained traverse is meeting is that the distance from GPS endpoint to GPS endpoint is consistent with expectation. So the minimally constrained adjustment isn't even good enough to test something as basic as whether the angle measurements were free of abnormal errors.
All it takes to do that is to add several more GPS control points along the traverse. Problem solved. Quality of the results is better, i.e. more accurate and more reliable.
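To put a rough number on how loose that azimuth control is, here's a back-of-the-envelope sketch (my own rule of thumb, not from the thread): with each endpoint positioned independently, the cross-track uncertainties of the two points combine and divide by the baseline length to give the azimuth uncertainty.

```python
import math

def azimuth_sigma_arcsec(sigma_pos_m: float, baseline_m: float) -> float:
    """Approximate 1-sigma uncertainty of the azimuth between two
    independently positioned GPS points, each with horizontal
    uncertainty sigma_pos_m, separated by baseline_m.
    Small-angle approximation; assumes uncorrelated endpoint errors."""
    # Both endpoints contribute cross-track error, hence sqrt(2).
    sigma_cross_m = math.sqrt(2.0) * sigma_pos_m
    return math.degrees(sigma_cross_m / baseline_m) * 3600.0

# e.g. a pair of RTK points 200 m apart, each good to 1 cm horizontally:
print(round(azimuth_sigma_arcsec(0.01, 200.0), 1))  # about 14.6 arcseconds
```

Fifteen-ish arcseconds over a 200 m pair is far looser than the angular accuracy of a decent total station, which is why the pair alone can't flag a bad angle somewhere along the traverse.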
To me Kent's photo & narrative illustrate conditions where using a minimal GPS pair is unacceptable. The fact that either adjustment may or may not give satisfactory results is, at least to me, irrelevant.
You know, I'd probably agree with you there samurai. My comfort tolerance would probably have been satisfied with one extra GNSS point in between though, first unconstrained and then fully constrained in the final adjustment. If I understand Kent's graphic, he has 9 GNSS points distributed along that route (the blue lines). The practical difference in coordinate values of points along this route between the adjustment with 2 vectors and with 9 vectors was (by most standards) insignificant.
But according to Kent, the only right way to do this job is his way - 9 static vectors, adjusted in StarNet. It's his job and I have no problem with that.
> .... My comfort tolerance would probably have been satisfied with one extra GNSS point in between ...... But according to Kent, the only right way to do this job is his way - 9 static vectors, adjusted in StarNet. It's his job and I have no problem with that.
Once you have set up your base, and cut your line, and have the RTK in hand, it really doesn't take much extra time to get a few more vectors. (See the thread from the other day on occupation time) Yes, 9 is more than strictly necessary. But the right thing to do is to grab extra data that will improve your solution when it can be had with so little extra effort. In any case, I think you were taking Kent a little too literally if you think he was insisting that every single one of those vectors was critical.
Did I misunderstand Kent? I thought his GPS measurements along the traverse were static sessions, not RTK.
I agree with you in principle.
If RTK, just shoot them.
Logging 1 hr static per suitable traverse point? I believe there would be a clear point of diminishing returns (dictated by the accuracy requirements). Maybe 15-minute static sessions on three or four of the traverse points would yield acceptable results.
However, especially if these jobs are in BFE (and require a lot of windshield time)... I would collect more data than I would possibly need.
Just my $0.02.
BTW, there are significant amounts of cadastral work in the PNW completed with the same methodology. There are more than two users of Least Squares, conventional traverse, and GPS vectors (Kent has a flair for hyperbole).
That being said, I know many surveyors reluctant to learn least squares and apply it to their work. I attribute the attitude to the following, in no particular order: apathy, mistrust, the "git 'er dun" attitude, and the "if it ain't broke, don't fix it" attitude.
Sadly many folks have the software available, but are not trained to use it and have no desire to.
Regarding RTK I think running dual base RTK would be a great alternative to add some easy redundancy (assuming one has the available hardware). I have not had an opportunity to try this myself but have heard positive reports.
http://geoline.com/wp-content/uploads/2014/01/HPB_Dual-BaseConfig.pdf
Yes Kent's comment that only he and Norman process GPS vectors and conventional simultaneously in LS was odd, to say the least.
I do it on just about every project.
Maybe Kent thinks only he and Jim Frame drive a Toyota pickup?
> But according to Kent, the only right way to do this job is his way - 9 static vectors, adjusted in StarNet.
No, and that's a puzzlingly mindless misrepresentation of the point that I fairly clearly stated above. The point was that adding GPS-derived positions in addition to the bare minimum of the pairs at the beginning and end of the traverse significantly improves the results, both in terms of accuracy and, more importantly, reliability.
> Yes Kent's comment that only he and Norman process GPS vectors and conventional simultaneously in LS was odd, to say the least.
Well, the "Kent" you apparently have in mind isn't me. My comment was that only a slender minority of RTK users import the data in the form of GPS vectors and adjust those vectors in combination with conventional measurements. Most RTKers appear to be perfectly satisfied just to have coordinates, with no treatment of the uncertainties.
> I do it on just about every project.
If you are importing RTK vectors and adjusting them with conventional data, that makes three users so far.
> Did I misunderstand Kent? I thought his GPS measurements along the traverse were static sessions, not RTK.
Yes, they were Rapid Static and PPK occupations, not RTK, but how the positions were obtained is less relevant than other considerations in the survey design.
> Logging 1 hr static per suitable traverse point?
Okay, this is drifting into nonsense. The longest session on any of the points was about 20 minutes, aside from Pt. 86, which had a receiver parked on it to serve both as a second base and to give a first-rate connection from Pt. 10, the control point about three miles away with an excellent NAD83 position from OPUS solutions spread over seven different days (a base receiver occupied it during each field day). The PPK occupations were about 5 minutes.
So, the total additional investment in field time to add four points beyond the three azimuth conditions (one at the beginning, one at the end, and one en route) was under an hour.
> Okay, this is drifting into nonsense.
Easy there big fella. Nobody is messing with Texas.
I know you bring these topics up for good reasons.
I, for one, enjoy your posts. Thanks for sharing.
> I know you bring these topics up for good reasons.
Well, there have been recurring posts asking about how to adjust a conventional traverse between pairs of control points positioned via RTK (or OPUS-RS, as I recall). The obvious solution is to import the RTK vectors (with their uncertainties) or the OPUS-derived positions into an adjustment by least squares with the conventional traverse measurements.
If you do that, it should be easy to evaluate the uncertainties in the azimuths between the pairs of GPS points and then figure out the most efficient way of correcting that problem. As a rule, that fix ought to be simply adding more GPS-derived positions along the traverse route, but it's a fix that requires being able to export GPS vectors with uncertainties and to adjust them in combination with conventional measurements.
The main obstacle to such a simple fix would obviously be if the RTK user had no idea how to actually adjust RTK vectors (or was using equipment that did not facilitate that).
Count me in too.
I do it my own way, though. My adjustment package doesn't handle GPS vectors, so by way of a spreadsheet I calculate an azimuth and distance with appropriate variances, based on estimated GNSS point quality, and include them in the adjustment as conventional measurements.
I would typically use this only on larger traverses, to keep them under control when observing conditions make total station observations looser than I'd like.
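The spreadsheet workaround described above can be sketched roughly as follows. The function name is mine, and it assumes uncorrelated east/north component errors, which is a simplification of what a full vector covariance would give you.

```python
import math

def vector_to_az_dist(dE, dN, sigma_E, sigma_N):
    """Convert a GNSS vector's horizontal components (dE, dN, in meters)
    into an azimuth/distance pair with propagated 1-sigma errors, so it
    can be entered in an adjustment as a conventional observation.
    Assumes uncorrelated dE/dN errors (a simplification)."""
    dist = math.hypot(dE, dN)
    az_deg = math.degrees(math.atan2(dE, dN)) % 360.0
    # Linear error propagation using the partial derivatives of
    # distance and azimuth with respect to dE and dN:
    sigma_dist = math.sqrt((dE * sigma_E) ** 2 + (dN * sigma_N) ** 2) / dist
    sigma_az_rad = math.sqrt((dN * sigma_E) ** 2 + (dE * sigma_N) ** 2) / dist ** 2
    sigma_az_arcsec = math.degrees(sigma_az_rad) * 3600.0
    return az_deg, dist, sigma_dist, sigma_az_arcsec

# A vector 100 m due east, components good to 1 cm each:
az, dist, sd, sa = vector_to_az_dist(100.0, 0.0, 0.01, 0.01)
print(az, dist, sd, round(sa, 1))  # 90.0  100.0  0.01  ~20.6"
```

Note the azimuth term shrinks with the square of distance scaled back by one length, so short vectors carry very weak azimuth information, which matches the earlier point about GPS pairs being loose azimuth control.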