Processing RTK Data
Posted by cmsurveyor on October 20, 2013 at 7:47 pm

I ran across someone who processes RTK data. I have never done this before and have never heard of anyone doing it; not that I have ever asked.
Does anyone do this? If so, what is the reasoning behind it?
Have a great day!
7 Replies
-
It would depend on what you mean by “process.”
If you mean extract vector data out of the RTK data collector file, and then use in a Least Squares Program, then I would suppose that quite a few of us “do it.”
Loyal
-
Typically RTK observations are stored as either positions or vectors (at least in the Trimble world). I prefer vectors, along with statistical information on the quality of the solution.
I also like to adjust the vectors along with conventional data (total station and leveling) and static data, if available. Or, if nothing else, I can adjust the network of RTK vectors to the control that I have tied in.
-
What’s the benefit?
Flush out errors? Attempt to increase accuracy of the RTK positions?
Do you see a marked improvement after the adjustment, and if so, what were your metrics?
> typically RTK observations are stored as either positions or vectors (at least in the Trimble world). I prefer vectors, along with statistical information on the quality of the solution.
>
> I also like to adjust the vectors along with conventional data (total station and leveling) and static data if available. Or if nothing else I can adjust the network of RTK vectors to control that I have tied in.
-
We were doing a job where we had to locate culvert crossings in some remote areas of the county. The tolerances were pretty lax, as it was more of an inventory/GIS type job. We were using network RTK. There were a couple of locations where the rover couldn’t get an internet connection, so I had to post-process. I post-processed ALL the RTK data from that day (all the fixed solutions and the couple where there was no connectivity) and compared them in an Excel spreadsheet. Most of the points compared as expected, except for a couple of points that disagreed by a foot or two in both the horizontal and the vertical. These points were fixed solutions in the field and gave no indication to the surveyor of a problem in real time.
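A quick way to make that field-vs-post-processed comparison without a spreadsheet is a short script. This is a minimal sketch with entirely hypothetical point names, coordinates, and a half-foot tolerance:

```python
# Compare RTK fixed solutions from the field against post-processed
# coordinates, flagging any point whose horizontal or vertical difference
# exceeds a tolerance. All names and numbers below are made up.
import math

TOL_FT = 0.5  # assumed tolerance, in feet

rtk = {   # field RTK fixed solutions: (northing, easting, elevation) in ft
    "CUL01": (1000.02, 5000.01, 352.10),
    "CUL02": (1250.40, 5120.33, 348.55),
    "CUL03": (1500.75, 5240.90, 345.20),
}
post = {  # post-processed coordinates of the same points
    "CUL01": (1000.03, 5000.00, 352.08),
    "CUL02": (1251.60, 5120.90, 350.10),  # disagrees by over a foot
    "CUL03": (1500.74, 5240.91, 345.22),
}

for name in sorted(rtk):
    n1, e1, z1 = rtk[name]
    n2, e2, z2 = post[name]
    dhz = math.hypot(n2 - n1, e2 - e1)          # horizontal difference
    dv = abs(z2 - z1)                            # vertical difference
    flag = "CHECK" if (dhz > TOL_FT or dv > TOL_FT) else "ok"
    print(f"{name}: dHz={dhz:.2f} ft  dV={dv:.2f} ft  {flag}")
```

With the numbers above, CUL02 gets flagged while the other two compare within tolerance, mirroring the situation described in the post.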
-
> What’s the benefit?
If you had a bust in the position of the base, there can be errors in the resolved vectors, which re-processing the raw data can fix. Reprocessing the raw data with precise orbit data can also improve the quality of marginal vectors. Very rarely necessary, but occasionally beneficial.

If by “processing” we mean least-squares adjusting redundant vectors, the benefits are obvious.
-
Say you are doing a topo with RTK and you start the base with a “here” position.
The base stores a static file all day.
The static file gets submitted to OPUS and returns a good position.
RTK observations should be stored as vectors, not positions, because it is a radial survey.
Now you can reprocess all the RTK observations to radiate out of your OPUS position for the base, not the “here” position.
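The re-radiating step above can be sketched in a few lines. This is a simplified illustration with made-up numbers, using local N/E/elevation components for clarity; real RTK vectors are typically stored as earth-centered (ECEF) ΔX/ΔY/ΔZ components, and the office software handles the conversions:

```python
# Because each rover point was stored as a vector FROM the base, moving the
# base from the autonomous "here" position to the OPUS position just
# re-radiates every vector from the new coordinates. Hypothetical numbers.
here = (1000.000, 5000.000, 300.000)  # autonomous "here" base position (N, E, elev)
opus = (1001.234, 4998.765, 299.870)  # OPUS-derived base position

vectors = {  # rover points stored as (dN, dE, dElev) from the base
    "TOPO001": (12.345, -8.210, 0.550),
    "TOPO002": (-25.600, 14.005, -1.120),
}

for name, (dn, de, dz) in vectors.items():
    # Same vector, new origin: the whole survey shifts with the base.
    n = opus[0] + dn
    e = opus[1] + de
    z = opus[2] + dz
    print(f"{name}: N={n:.3f} E={e:.3f} elev={z:.3f}")
```

The key point is that nothing about the vectors themselves changes; only the origin they radiate from does, which is why storing positions instead of vectors would make this correction impossible.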
-
Reasons to adjust:
Main reason: any redundancy at all should be adjusted by least squares
1) When you have multiple occupations of the same point, whether from the same RTK base or from different bases, you have multiple baselines (aka repeat baselines)
2) When you have multiple control points, some occupied with RTK, others with static, maybe others with total station or by leveling
3) I want to hold some control, see how others fit, then decide which to hold in the final adjustment
4) Why not? Even if I have non-redundant lines, the least squares adjustment is like a “master cogo” that will compute everything in a homogeneous network
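For the simplest redundancy case in 1), repeat occupations of the same point, a least-squares adjustment of a single coordinate component reduces to the inverse-variance weighted mean. A sketch with hypothetical values and sigmas, one coordinate component only:

```python
# Two RTK occupations of the same point (e.g. from two different bases).
# With independent observations of one unknown, least squares reduces to
# the inverse-variance weighted mean. Values below are made up.
obs = [
    # (northing in ft, sigma in ft) for each occupation
    (1000.012, 0.02),  # occupation from base A
    (1000.030, 0.04),  # occupation from base B
]

weights = [1.0 / s**2 for _, s in obs]                 # w_i = 1 / sigma_i^2
n_hat = sum(w * v for (v, _), w in zip(obs, weights)) / sum(weights)
sigma_hat = (1.0 / sum(weights)) ** 0.5                # adjusted sigma

print(f"adjusted N = {n_hat:.4f} ft  (sigma = {sigma_hat:.3f} ft)")
```

The adjusted value lands closer to the tighter observation, and its sigma is smaller than either input, which is the payoff of redundancy the list above describes. A full network adjustment generalizes this to many vectors and unknowns at once.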