
Least Squares in Civil 3D

96 Posts
22 Users
0 Reactions
16 Views
(@michigan-left)
Posts: 384
Registered
 
Posted by: @jon-payne

That scenario is frankly quite enough to raise a legitimate question: within the very minimally redundant closed traverse, what does LSA do for me that is objectively better?

Remembering back to learning about closed loop traverses analyzed with LSA, I recall the lecturer stating something to the effect of, "if you don't set up 1, 2, x times in the middle of your closed loop and get non-adjacent point cross-ties, then you really have negated the value of LSA on closed loop traversing." I think that's what you were getting at, yes?

Posted by: @jon-payne

The question of what it brings to the table for a loop traverse has been rooted in those 1990's (and onward) readings, as those papers always seem to jump very quickly to networks instead of a closed traverse.

I think you may have overlooked this aspect: properly designed and conducted field work, with LSA in mind for the analysis, will always yield "networks" with independent/redundant measurements.

Example: 3 legged closed loop traverse (triangle).

Set and occupy a 4th point in the center of the primary closed loop triangle and take angles/distances from/to each of the 3 perimeter points. That's a network of 3 small triangles within the big triangle. Pretty tough to get better than that.

Maybe the discussion in the 1990's papers you read didn't give this last crucial detail, but I got the impression you thought the papers jumped quickly to the discussion of GPS networks and left the traversing incomplete?

The "network" aspect of the LSA is designed by the operator, and a simple closed loop figure is not a stellar example of best practice for LSA, as you were pointing out.

I watched feverishly from the peanut gallery as all the members crunched the data to resolve the traverse issue described above. I have to pick my battles when getting involved with the help; otherwise I will go down a rabbit hole out of sheer curiosity and enjoyment, even when I shouldn't due to time constraints.

 
Posted : 25/08/2022 4:08 pm
(@bill93)
Posts: 9834
 

Star*Net - and probably most LSA programs - will weight the angles using a proper combination of what you give it for instrument accuracy, instrument centering accuracy, and target centering accuracy, along with the nominal length of the line, to form the standard error of the angle measurement. This alone is an improvement over the compass rule.
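
A rough sketch of that kind of error propagation (simplified, and not Star*Net's actual internal formula; the function and numbers are invented for illustration): pointing/reading accuracy combines in quadrature with the centering errors, which subtend larger angles over shorter sights.

```python
import math

def angle_std_error_sec(sigma_inst_sec, sigma_centering_m, d_bs_m, d_fs_m):
    """Approximate a priori standard error of a horizontal angle (arc-sec),
    combining instrument pointing/reading accuracy with instrument and
    target centering errors propagated over the backsight/foresight lengths."""
    rho = 180.0 / math.pi * 3600.0   # radians to arc-seconds
    # A lateral centering miss of sigma_centering_m at distance d subtends
    # roughly sigma_centering_m / d radians at the instrument; both the
    # instrument and each target contribute one such term per pointing.
    centering = 2.0 * ((sigma_centering_m / d_bs_m) ** 2 +
                       (sigma_centering_m / d_fs_m) ** 2)
    return math.sqrt(sigma_inst_sec ** 2 + centering * rho ** 2)

# A 2" gun with 1.5 mm centering: balanced 150 m legs vs. a 30 m/300 m pair.
print(round(angle_std_error_sec(2.0, 0.0015, 150.0, 150.0), 1))  # ~4.6"
print(round(angle_std_error_sec(2.0, 0.0015, 30.0, 300.0), 1))   # ~14.8"
```

Note how the short sight dominates: the choppy short/long geometry earns a much looser weight than balanced legs, which a compass rule adjustment ignores entirely.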

 
Posted : 25/08/2022 6:39 pm
(@half-bubble)
Posts: 941
Customer
 

Closed loop traverses are condition equations.

Multiple cross ties or resections or breaking setup for more observations = observation equations.

With the advent of modern computing power, the condition equations are a step backwards.
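
In textbook terms (standard notation, not from the post itself): a closed loop yields a fixed handful of condition equations no matter how many setups, while the observation-equation (variation of coordinates) form writes one equation per measurement, so every cross-tie adds redundancy.

```latex
% Condition equations for an n-sided closed loop traverse:
\sum_{i=1}^{n} \alpha_i = (n-2)\cdot 180^{\circ}, \qquad
\sum_{i=1}^{n} d_i \sin\beta_i = 0, \qquad
\sum_{i=1}^{n} d_i \cos\beta_i = 0
% alpha_i: interior angles, d_i: leg lengths, beta_i: leg azimuths.
% Observation-equation form instead models every measurement as a function
% of the unknown coordinates and solves  v = A\hat{x} - l  by least squares,
% so each extra observation adds an equation rather than nothing.
```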


 
Posted : 25/08/2022 6:48 pm
(@mightymoe)
Posts: 9920
Registered
 

LSA can be glitchy... or odd.

Doing GPS adjustments, I've always processed each vector before doing the LSA.

I've seen many times the coordinate from the LSA fall outside the individual numbers.

For instance, an ellipsoid height of 100.10 from a CORS point to the NW and 100.06 from the CORS point to the SW, and the adjusted number is 100.11. No big deal for the most part, but I will often override the adjustment and mean the value; same for lats and longs.

 
Posted : 26/08/2022 4:10 am
(@rover83)
Posts: 2346
Registered
 
Posted by: @mightymoe

LSA can be glitchy... or odd.

Doing GPS adjustments, I've always processed each vector before doing the LSA.

I've seen many times the coordinate from the LSA fall outside the individual numbers.

For instance, an ellipsoid height of 100.10 from a CORS point to the NW and 100.06 from the CORS point to the SW, and the adjusted number is 100.11. No big deal for the most part, but I will often override the adjustment and mean the value; same for lats and longs.

That's a function of the network adjustment transforming GNSS vectors to the local datum, i.e., NAD83 ellipsoid heights, local ortho elevations, grid north/east, etc. In TBC, the latitude/longitude deflections option defaults to on - it can be turned off, but that should really only be done in certain cases.


A good example: when doing airport/AGIS work, we had to fix the PACS & 2 SACS that were only a couple hundred meters apart (because they were our only primary control in the area) and clustered at the center of our work.

But our observed points for aerial targets were miles and miles out from the runway/airport itself. In this case turning the transformations off during adjustment kept the network from being warped at the outer edges due to holding 3 control points very close together at the very center.

If you're holding a single point fixed in the vertical, it will just shift the GNSS vectors "up" or "down", so depending on the location of the project and the values being fixed, unfixed (derived/adjusted) point values may not look correct, at least not intuitively.

But it's not a "glitch"; it's exactly how LSA functions. It can only do what the operator tells it to.
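
A toy numeric illustration of why that can happen (made-up numbers, not from either post): with two independent height observations, the weighted least-squares estimate must land between them, but once the observations are correlated - as vectors sharing error sources in a network adjustment effectively are - the estimate can legitimately fall outside the pair, just like the 100.11 above.

```python
import numpy as np

l = np.array([100.10, 100.06])      # two "observed" ellipsoid heights
A = np.array([[1.0], [1.0]])        # both observe the same unknown height

# Independent observations: result is a weighted mean, always in between.
Q_ind = np.diag([0.004**2, 0.006**2])
# Strongly correlated observations (shared error sources in the network):
cov = 0.9 * 0.004 * 0.006
Q_cor = np.array([[0.004**2, cov], [cov, 0.006**2]])

for Q in (Q_ind, Q_cor):
    W = np.linalg.inv(Q)            # weight matrix
    x = np.linalg.solve(A.T @ W @ A, A.T @ W @ l)
    print(round(float(x[0]), 3))    # ~100.088, then ~100.126 (outside!)
```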

 
Posted : 26/08/2022 7:12 am
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @olemanriver

LSA will show systematic errors.

Posted by: @olemanriver

Now the statistics will show where a possible blunder or systematic errors are occurring.

I'm probably too argumentative about this topic, but every time it is brought up, there is language that goes along with it making LSA out to be some cure-all and the only acceptable means of data analysis. We've recently seen an example where a systematic error was small enough to be accepted as statistically good data.

I know I've read authoritative sources that state that blunders and systematic errors need to be removed first or - if they are small enough to sneak in under the radar - they will incorrectly be incorporated as random errors.

I know that I have experimented with purposely editing data just to see what happens, and transposing the tenths and hundredths digits (a blunder) can either be accepted as good or be flagged, depending heavily on the redundancy tied to that particular edit.
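
That experiment is easy to reproduce in miniature (a toy fit of one value to repeated readings with invented numbers, not his actual data): the same transposed reading sails through or gets flagged purely as a function of redundancy.

```python
import numpy as np

def max_std_residual(obs):
    """Largest standardized residual from a least-squares fit of a single
    value to repeated equal-weight readings (a toy stand-in for a network)."""
    obs = np.asarray(obs, dtype=float)
    n = len(obs)
    v = obs.mean() - obs                        # residuals
    s0 = np.sqrt((v @ v) / (n - 1))             # a posteriori unit-weight std
    r = (n - 1) / n                             # redundancy number, equal weights
    return np.max(np.abs(v) / (s0 * np.sqrt(r)))

good = [4.57, 4.56, 4.58, 4.57, 4.57, 4.58, 4.56]
blunder = 4.75                                  # 4.57 with tenths/hundredths swapped

print(max_std_residual([4.57, blunder]))        # ~1.0: cannot be localized at all
print(max_std_residual(good + [blunder]))       # ~2.6: passes a 3-sigma cutoff
print(max_std_residual(good * 2 + [blunder]))   # ~3.7: flagged
```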


Least squares is GREAT stuff! CHEER CHEER CHEER. It definitely has its place. But just like GPS in general, it does not (and probably should not) fit into every surveyor's tool belt. Having a better car (McLaren 720S) will only create problems in some instances. Best does not mean the same thing to everyone. Some people's best would be a reliable economical vehicle instead of 0-60 in a few seconds.


EDIT: Had to add this because it will definitely be read into my statement by someone - I am not suggesting that any other adjustment or analysis method is immune to, or even better at keeping out, those possible small blunders or systematic errors.

 
Posted : 26/08/2022 1:19 pm
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @michigan-left

Remembering back to learning about closed loop traverses analyzed with LSA, I recall the lecturer stating something to the effect of, "if you don't set up 1, 2, x times in the middle of your closed loop and get non-adjacent point cross-ties, then you really have negated the value of LSA on closed loop traversing." I think that's what you were getting at, yes?

I don't think I would say negated. I think I would say it failed to develop the conditions that are most beneficial to LSA.


Those 1990s papers didn't even touch on incorporating GPS, as use of GPS was not as widespread at the time. They would quickly jump from one closed loop to connecting several together to create a network of observations.

 
Posted : 26/08/2022 1:25 pm
(@jon-payne)
Posts: 1595
Registered
 

@bill93 That certainly introduces rigorous analysis, and would explain a portion of the very minimal difference between the two processes when the traverse is planned and executed with an eye to balanced legs, versus when it is necessary to have those choppy alternating short/long lines that are fairly typical of traversing in some terrain.

As someone else already pointed out, there are also times when the angular adjustment should perhaps be related to some other condition of the physical act of observing as well, such as crossing pavement, or being able to sight directly on the dimple of the control versus a string or pole.

 
Posted : 26/08/2022 1:35 pm
(@olemanriver)
Posts: 2432
Registered
 

@jon-payne Oh, no problem here. It's a tool, a method. Just like compass/Bowditch, Crandall, and transit rule. And one most surveyors never use, but GPS orbits do, is the Kalman filter. All have pros and cons. Just like GPS, GNSS, RTK, static, PPK, stadia, total station, chain/tape, EDM.

 
Posted : 26/08/2022 1:44 pm
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @half-bubble

Closed loop traverses are condition equations.

I did not think that was a correct statement. I had to pause and look it up just to see if I was mistaken. I was mistaken in my thought that the statement was incorrect, but it would be more accurately stated that one method by which a closed loop traverse can be processed by least squares uses condition equations, making the sums of the latitudes and departures conditions of the solution.

I had actually not known about that as a means of computing a least squares analysis of a closed loop traverse, as I was only aware of the variation of coordinates method, which is not a condition method.

 
Posted : 26/08/2022 2:04 pm
(@mightymoe)
Posts: 9920
Registered
 

@rover83

Nope, no weighting; CORS held fixed for LLH. The LSA value is replaced with a mean, keeping the numbers between the two vectors - but I don't really care, 1/4" this way or that is meaningless anyway.

It's nothing to do with the local datum (NAD83), since all the free vectors and CORS were held as NAD83. The vectors between CORS points are turned off and not included in the adjustment.

I doubt there are many people who do this that look at the individual numbers; it's something from my early days doing network adjustments. If each individual coordinate isn't acceptable, it's time to look deeper into the survey.


 
Posted : 26/08/2022 2:22 pm
(@michigan-left)
Posts: 384
Registered
 
Posted by: @jon-payne

I know I've read authoritative sources that state that blunders and systematic errors need to be removed first or - if they are small enough to sneak in under the radar - they will incorrectly be incorporated as random errors.

In many LSA reports, you can go to the portion with the histogram of standardized residuals and see how they compare to a normal bell curve. If the peak of the histogram is offset from center in either direction, and/or the histogram is not a nice smooth curve, that usually indicates systematic error(s) of some sort.
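
As a rough sketch of that check (simulated residuals, not from any real adjustment report): a histogram of standardized residuals from a clean adjustment peaks near zero, while a systematic offset drags the peak off center.

```python
import numpy as np

rng = np.random.default_rng(7)
# Simulated standardized residuals: pure noise vs. noise plus a small
# systematic bias (e.g., an uncorrected scale or centering error).
clean = rng.standard_normal(500)
biased = rng.standard_normal(500) + 0.8

for name, r in (("clean ", clean), ("biased", biased)):
    counts, edges = np.histogram(r, bins=np.arange(-4, 4.5, 0.5))
    peak = edges[np.argmax(counts)] + 0.25     # center of the tallest bin
    print(f"{name}: mean {r.mean():+.2f}, histogram peak near {peak:+.2f}")
```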

The trick to making those errors stick out like a sore thumb, and not look like noise (random errors), is to set your instrument errors, setup errors, etc. appropriately (or a bit tight) to exaggerate the warping enough to get the LSA to react.

Personally, if they're even doing it, I think many surveyors set their initial errors up a bit too loose. It really is a bit of an evolution to find the particular sweet spot where your gear performs, and how much effort you are willing to put into keeping things properly adjusted.

Any modern equipment (from the last 20 years) that is properly maintained and employed using "reasonable" field methods should always get you within or around a tenth or better on an LSA. If you're routinely not seeing that, then there is likely something that needs attention.

I get grumpy when I see residuals exceed 0.05', and if I see 0.10' residuals, then there was a bust or my gear has been compromised in some way. Modern equipment is capable of some amazing feats.

I would not have wanted to be a surveyor in the 1960s-1970s.

 
Posted : 26/08/2022 4:21 pm
(@olemanriver)
Posts: 2432
Registered
 

@jon-payne Finally done with making hay. Yes, blunders and systematic errors should be removed. However, the reality is we don't always catch everything prior to adjustment. I usually check for blunders and any other errors prior to adjustment. Then I will run an unconstrained LSA holding nothing - I mean nothing - fixed. It's a sanity check, an extra step to see how well the measurements fit themselves, and I can look at the error ellipses to see if everything is moving the same direction (aka systematic error) and whether any outliers show themselves. At this point I am also checking my weighting and scalar to see if I made a poor judgment, and I don't even care if it fails the chi-square test. I do also look at the Tau criterion. I then cancel the adjustment, clean up any other data, and apply corrected weighting if necessary, or just use the scalar. Then run again unconstrained; at this point I am usually passing chi-square and my scalar is close to 1.

Then I fix lat/long only on one of my primary control points. Run and check the opposite or a different control point to see how it all fits. If good, I then add a height - ellipsoid height, and only 1 - and the lat/long of a 2nd point if necessary. Depends on what I am trying to do. I am checking elevations to points; this gives me a check on geoid model quality. I never fix a northing and easting - well, most of the time I do not; I do check. I hardly ever fix an elevation; you can distort things. When you have two or more lat/longs fixed and a height or two, you can look at the tilt and rotation of the lat/longs/heights (they are one, lol). Especially in static work, this lets you see a lot of how much the vectors are moving, etc. - maybe I have to reprocess a baseline or throw one out. Then I am usually about done. I usually set my control networks up in a fashion that doesn't need a fully constrained adjustment; it's not necessary and just gets into the weeds. There are times one is needed, but a minimally constrained adjustment planned correctly and approached correctly during adjustment is good to go.
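
For reference, the chi-square test he mentions boils down to checking whether the a posteriori variance of unit weight is statistically consistent with 1 (a minimal sketch; the function and numbers are made up, and real packages fold this into the adjustment report):

```python
from scipy.stats import chi2

def chi_square_test(vtwv, dof, alpha=0.05):
    """Two-tailed test of the a posteriori reference variance (v'Wv / dof)
    against its a priori value of 1.0 at significance level alpha."""
    lo = chi2.ppf(alpha / 2, dof) / dof
    hi = chi2.ppf(1 - alpha / 2, dof) / dof
    s0_sq = vtwv / dof
    return lo <= s0_sq <= hi, s0_sq ** 0.5

passed, scalar = chi_square_test(vtwv=14.2, dof=12)
print(passed, round(scalar, 2))   # passes when the "scalar" is close to 1
```

A scalar well above 1 says the weights were too optimistic (or a blunder survived); well below 1 says they were too pessimistic, which matches his practice of re-checking weighting before constraining anything.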

Hey, don't worry about arguing lol. You want to learn, I want to learn; arguments are not bad if we both learn. I hope I didn't come across as argumentative either. I didn't take you as such. I thought it was a great question, and it made me dust out one of the closets in my brain lol.

Why do we do adjustments anyway? Because they give us a way to check our work for blunders and errors. They give us a mathmagical way to support our data. We can run a closed loop traverse and have a tight closure on ourselves, but it could be compensating errors. Like running a level loop to a point and back to ourselves when we blundered by a foot: our closure was good, but all our points are 1 foot off. With a different approach, running from one known BM to another, we would have to blunder twice by the same amount. Redundancy, redundancy, redundancy is our friend. Now to eat soup and rest my weary bones after being sick and getting hay done. I have a man cold, lol. This has been a great topic for sure.

 
Posted : 26/08/2022 5:56 pm
(@rover83)
Posts: 2346
Registered
 
Posted by: @mightymoe

@rover83

Nope, no weighting; CORS held fixed for LLH. The LSA value is replaced with a mean, keeping the numbers between the two vectors - but I don't really care, 1/4" this way or that is meaningless anyway.

It's nothing to do with the local datum (NAD83), since all the free vectors and CORS were held as NAD83. The vectors between CORS points are turned off and not included in the adjustment.

I doubt there are many people who do this that look at the individual numbers; it's something from my early days doing network adjustments. If each individual coordinate isn't acceptable, it's time to look deeper into the survey.


I wasn't referring to weighting, but the fact that prior to an adjustment being run, a priori coordinates are, strictly speaking, based upon an arbitrary computation. In TBC, for example, before running an adjustment, the computation engine looks for coordinates first, running through the hierarchy from control to unknown, then applies observations in the order of levels, GNSS, and total station before returning to look for more coordinates.

It's a pretty good system, but it is still arbitrary, and where a true network is involved, with lots of connections, it's just not telling the true story of how all of those observations relate to each other, especially with varying precisions of observations.

Upon evaluation using LSA, the entirety of the network is taken into account, which is why a simple mean of a couple of observations does not necessarily match the least-squares most likely solution.

And when we do fix station values, we are absolutely transforming GNSS vectors to fit our local datum. That's sort of how any adjustment process works, whether compass rule, LSA, or levelling by turns. We're forcing our data to fit something we want it to fit. The difference is that properly implemented LSA can tell us with high confidence whether our results are realistic or not.


Here's an example. In the not-too-distant past, our crews would take 6 short (5-epoch) RTK observations on control points. In the office, the tech would look for a cluster of two or three shots, take the simple mean, and delete the others. No mathematical or systematic approach, just "these two look good to me". We never included total station observations - just exported those arbitrarily averaged values and locked our total station setups to them. Our conventional checks were never that great, which they chalked up to "RTK slop". Going back to legacy projects, our GNSS checks were pretty sloppy too.

These days, we take longer observations (2x or 3x 1-3 minutes, depending on conditions) and adjust using ALL of the data, including total station and level. Our repeat observations with GNSS, as well as total station backsight checks, are way better than they used to be. Sometimes an order of magnitude better.

A priori blunder and outlier checks are all well and good, but the cutoff really needs to be appropriate for the application. We had a couple of projects where the PMs declared that any independent RTK observations differing by more than 0.05' either horizontally OR vertically had to be tossed and repeated. We must have tossed 30-40% of our shots that were perfectly good, with solid H/V precisions, and that could have been properly weighted and used.


Posted by: @michigan-left

In many LSA reports, you can go to the portion with the histogram of standardized residuals and see how they compare to a normal bell curve. If the peak of the histogram is offset from center in either direction, and/or the histogram is not a nice smooth curve, that usually indicates systematic error(s) of some sort.

The trick to making those errors stick out like a sore thumb, and not look like noise (random errors), is to set your instrument errors, setup errors, etc. appropriately (or a bit tight) to exaggerate the warping enough to get the LSA to react.

Personally, if they're even doing it, I think many surveyors set their initial errors up a bit too loose. It really is a bit of an evolution to find the particular sweet spot where your gear performs, and how much effort you are willing to put into keeping things properly adjusted.

This is the way. Absolute deltas between two observations might not look great, but when weighted appropriately their standardized residuals are nowhere near the cutoff.

On the other side of the spectrum, one shot might fall close to another, but the second observation's residuals are way, way out of whack because the observations were made with different equipment and/or have very different standard errors, and therefore different standardized residuals.
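
A quick illustration of the difference (invented numbers): the same raw delta is noise for one class of observation and a five-sigma outlier for another, once each is standardized by its own expected error.

```python
# Same 0.03 ft misclosure, standardized by two different a priori errors:
for label, delta, sigma in (("RTK pair", 0.030, 0.025),
                            ("TS pair ", 0.030, 0.006)):
    print(f"{label}: {delta / sigma:.1f} sigma  ({delta:.3f} / {sigma:.3f})")
```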

There is far more confusion than is warranted over the process. I've had surveyors tell me with a straight face that "least squares spreads the error around" and is therefore somehow bad, while crowing about their compass rule adjustments where they threw a couple of degrees of error into a small loop for no other reason than they needed it to close... with no concern over whether the closure was even within an appropriate error budget for their equipment.

 
Posted : 28/08/2022 4:02 pm
(@mightymoe)
Posts: 9920
Registered
 

@rover83?ÿ

That isn't the issue.

The coordinates I'm referring to are processed from controlling CORS stations before the adjustment is run. I always expect the resulting adjusted value to fall between the processed values - not higher than the high value, lower than the low value, east of the east value, etc. However, there are times that doesn't happen with LSA, and I find that unacceptable.

I doubt very many take the time to process the points using individual vectors, but I do.

 
Posted : 29/08/2022 6:00 am