Expecting unconstrained coordinates to always fall near the simple mean of the absolute approximate coordinates - particularly when constraining to multiple stations - is to misunderstand LSA.
I don't (usually) expect unconstrained coordinates to stay exactly where they are, but I do expect the adjusted value to take into account all the observations and their weights. Depending on the network configuration, what I am holding fixed, and how the a priori coordinate was derived, they may change a little, a fair amount, or a lot.
All those observations are connected and will thus influence each other, sometimes a lot, sometimes a little. LSA is just mathematics.
Sometimes the results won't line up with expectations. If they move in an unexpected way, the first thing I'm doing is reviewing my procedures and double-checking for blunders, bad HIs, etc.; if that doesn't yield an answer, I'm inspecting my control values for discrepancies to see whether my network is a poor fit to those points.
@rover83
This isn't that complicated: with a simple two-vector coordinate to the same antenna using the same receiver file, the adjustment should fall between the two coordinates; if not, the adjustment is wrong.
if not, the adjustment is wrong.
Well, we're responsible for the quality of our data and for assessing the quality of the control being held. If we don't like the results of the adjustment, either the data are bad, or the observations do not fit the control well.
Or perhaps they do fit well, but the process of fitting those observations to the control moved the unconstrained coordinate more than expected. That's not necessarily unusual, especially with low degrees of freedom (such as a two-vector adjustment, i.e., a single degree of freedom), and it's not indicative of an inherent problem with LSA itself.
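To illustrate the point being argued above: even in the single-degree-of-freedom case, the adjusted value of two observations of the same point does fall between them, but it lands at the inverse-variance weighted mean, not the simple midpoint. A minimal sketch (all coordinates and standard deviations are hypothetical, and this is generic weighted least squares, not any vendor's algorithm):

```python
# Inverse-variance weighting of two GNSS-vector-derived northings for one
# unknown, i.e. a single degree of freedom. The adjusted value is pulled
# toward the more precise observation rather than sitting at the midpoint.

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard deviation."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma_mean = (1.0 / sum(weights)) ** 0.5
    return mean, sigma_mean

# Two northings (m) for the same antenna: one tight vector, one noisy one.
norths = [1000.010, 1000.050]
sigmas = [0.005, 0.020]   # a priori standard deviations (hypothetical)

adj, sd = weighted_mean(norths, sigmas)
# Simple mean would be 1000.030; the weighted result sits much closer to
# the 5 mm observation, and its sigma is tighter than either input.
```

With more observations, control constraints, and correlated vectors the same mathematics can move a point well away from any naive average, which is the behavior being discussed in this thread.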
Question from a non-Trimble user to the Trimble community.
Background:
We typically use a running point and then use Starnet in the office to choose which points get used in the adjustment (i.e., if someone had a poor setup, we would remove that shot on the control point). Starnet, by having us spec our equipment, better calculates the point based upon the limitations of said equipment. I believe it does this better than, for example, SurvNET, based upon my reading of a thread by BowTieSurvyor (I think?) a few years back here on the board. Starnet's point auto-choose works pretty well for this (i.e., spec 2D, 0.1m proximity, and they get added to the list).
Question:
When you use the same point in Access and average the point, does it take into consideration the source of the data? I.e., will it weight total station data differently than GNSS data? Or is it just a straight mean?
After the fact, when the data is back in the office, can you choose different data to hold for a point in TBC?
Question for the Leica crowd:
Does Leica Captivate have an equivalent?
Just generally interested in some of the differences between Carlson and the other brands.
Question:
When you use the same point in Access and average the point, does it take into consideration the source of the data? I.e., will it weight total station data differently than GNSS data? Or is it just a straight mean?
After the fact, when the data is back in the office, can you choose different data to hold for a point in TBC?
Access and Trimble Business Center both give you the option to use a simple or weighted mean. Weighted is the default in Access, and helpful for GNSS because the mean coordinate standard deviations are much more realistic and incorporate the H/V vector precisions.
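To show why a weighted mean with per-occupation precisions matters, here is a rough sketch (not Trimble's actual computation; all values hypothetical) of averaging two occupations of the same point where each carries its own horizontal and vertical precision. The vertical component, which is typically noisier, gets a realistic sigma instead of a straight average that ignores precision entirely:

```python
# Weighted mean per component, with each GNSS occupation carrying its own
# H and V precision. Compare the weighted mean against a straight mean.

def wmean(vals, sigmas):
    """Inverse-variance weighted mean and its standard deviation."""
    w = [1.0 / s**2 for s in sigmas]
    m = sum(wi * v for wi, v in zip(w, vals)) / sum(w)
    return m, (1.0 / sum(w)) ** 0.5

# Two occupations of the same point (hypothetical values, metres).
east = ([500.102, 500.110], [0.008, 0.015])   # H precision per occupation
up   = ([35.250, 35.282],   [0.015, 0.030])   # V precision (worse, as usual)

e_mean, e_sd = wmean(*east)
u_mean, u_sd = wmean(*up)
simple_up = sum(up[0]) / len(up[0])   # straight mean, for contrast
# The weighted elevation leans toward the 15 mm occupation, and u_sd
# reflects the actual vector precisions rather than assuming equal weight.
```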
You can always pick and choose what you want to hold for a point during post-processing. In TBC there are both coordinate records and observation records, and if you want to add or remove things - or just disable them to take them out of the computation process - it's easy.
Coordinate records can come from a CSV, from a raw data file (hand-entered, averaged, autonomous base, etc.) and can have quality levels of control, survey, mapping, and unknown in descending order. Raw observations are always survey grade.
Once you understand the hierarchy of coordinates & observations, it's (I would say stupidly) simple to get the results you want.
You either disable, modify, or delete coordinate records and observation records until the computation engine gives you the right result, or you fix one or more control-class coordinates and run a network adjustment, which will then use coordinate records for a priori positions only and compute the statistically most likely positions using enabled observations only.
Network-adjusted coordinates rule over all, including control-class coordinates (as they should), and can then be pushed back into the field where Access will recognize the hierarchy just like TBC.
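The hierarchy described above can be sketched as a simple ranked lookup. This is a toy illustration of the ordering stated in the post (network-adjusted over control, then survey, mapping, unknown, with disabled records ignored), not TBC's actual engine, and the record fields are made up for the example:

```python
# Pick the governing coordinate record for a point by quality rank.
# Rank order follows the hierarchy described above; lower rank wins.

RANK = {"adjusted": 0, "control": 1, "survey": 2, "mapping": 3, "unknown": 4}

def governing(records):
    """Return the highest-ranked enabled record for a point."""
    enabled = [r for r in records if r.get("enabled", True)]
    return min(enabled, key=lambda r: RANK[r["quality"]])

point = [
    {"quality": "survey",  "north": 1000.012},                    # raw obs
    {"quality": "control", "north": 1000.000},                    # CSV control
    {"quality": "control", "north": 999.990, "enabled": False},   # disabled
]
best = governing(point)   # the enabled control record governs here;
                          # an "adjusted" record would outrank it
```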
Question for the Leica crowd:
Does Leica Captivate have an equivalent?
Not sure about Captivate, but the Infinity post-processing office software doesn't really have an equivalent - it averages ALL observations and shows the average as a separate point but keeps the individual observations as "points" in the database as well.
I never really understood the reasoning behind the "update stations" command. Rather than hold the best coordinate for station setups (control-class, averaged, adjusted, observed, etc., whatever is at the top), it holds the unadjusted field-derived position (even if it was different for different setups over the same point) until you run the command - and then sometimes it won't even hold to what you told it to.
TBC, by contrast, prompts you to merge common points and observations on import, and once everything has been merged it applies the computation hierarchy the exact same way every time, from station setups to GNSS base positions to level runs. It's much easier to figure out what is going on, and there's no guessing as to where the observations are flowing from. I wrangled with Infinity for three years and it was never as intuitive or obvious as TBC about what exactly it was doing.