
Least Squares in Civil 3D

96 Posts
22 Users
0 Reactions
16 Views
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @jon-payne

Is there some actual benefit to running such a survey through LSA?

In terms of final coordinates, very little - if anything. Especially if you are concerned with 2D only. It does free you up to run your work in different ways, in different order, to combine data from different sources, etc., etc. - quite a lot.

Posted by: @jon-payne

It is just something that doesn't sit well with me to suggest that not using LSA when not required is somehow not exercising due diligence or meeting some personal opinion of "best practices".

That is not a term that I have used in this thread. If your work is amenable to closed loop traversing then that works for you. I object to the blanket suggestion that LS isn't worth the time. Perhaps you might suggest how I handle a control program like this one (showing only a fraction of the whole) using the compass rule, or no adjustment at all? And what about the elevations?

[image: control network diagram]

If my business model was running out lots and acreages for boundary only, I believe that I would still make a habit of running the LS on the work. I'd like to have that report in my files in case issues came up in the future. I'd like to have the data in that format so I could augment it later. I'd like to have the freedom to not run closed loops all the time - and still have analyzable redundancy - if the situation warranted it. So, really, it is a business decision as much as it is a technical one.

 
Posted : 24/08/2022 10:20 am
(@rover83)
Posts: 2346
Registered
 

@jon-payne


I have to check and QC everything after the crew uploads their field data and their notes to the cloud, at minimum to make sure I am doing my due diligence as licensee in responsible charge.

The data files are imported to office software (in my case TBC), where I will check measure-ups, point IDs, feature codes and attributes against the field notes. I can correct any discrepancies that I find, or call the crew to straighten things out. Items out of tolerance are flagged for me to review and check. Mean angle tolerances, vector precisions, point tolerances, etc. Review, and accept or delete/modify.

Once I complete the standard checks, all that data is sitting there. It takes about five clicks and ten seconds to run a minimally constrained adjustment to check if the data is internally consistent. (If I might need to constrain additional control, I get a comparison table at the top of my adjustment report, and can then decide what else to hold or float.)
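
For anyone curious what those ten seconds are actually doing under the hood: at its core, a minimally constrained adjustment is just weighted least squares. Here's a bare-bones sketch in Python/numpy - a made-up three-point level loop with one benchmark held fixed, nothing TBC-specific:

```python
import numpy as np

# Observed height differences (m) and their a priori sigmas (m):
# A->B, B->C, C->A (closing the loop), plus a repeat of A->B.
obs   = np.array([2.513, -1.204, -1.305, 2.509])
sigma = np.array([0.003, 0.003, 0.004, 0.003])

A_fixed = 100.000                       # benchmark A, held fixed

# Design matrix; the unknowns are the heights of B and C.
A = np.array([[ 1.0, 0.0],              # B - A = obs 1
              [-1.0, 1.0],              # C - B = obs 2
              [ 0.0,-1.0],              # A - C = obs 3
              [ 1.0, 0.0]])             # B - A = obs 4 (repeat)
# Move the fixed benchmark to the right-hand side so A @ x = l.
l = obs + np.array([A_fixed, 0.0, -A_fixed, A_fixed])

P = np.diag(1.0 / sigma**2)             # weights from the a priori sigmas
N = A.T @ P @ A
x = np.linalg.solve(N, A.T @ P @ l)     # adjusted heights of B and C
v = A @ x - l                           # residuals

dof = len(obs) - len(x)                 # 4 observations - 2 unknowns
print("adjusted heights:", np.round(x, 4))
print("residuals (mm):", np.round(v * 1000, 1))
print("reference variance:", round(float(v @ P @ v) / dof, 2))  # near 1 is good
```

The reference variance coming out near 1 is the first hint that the data is internally consistent.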

I can run a compass rule adjustment IF there is a simple loop, but only in that case. And even if all I have is a simple loop, I can check for blunders statistically with LSA, as well as adjust in 3D, faster than performing a compass rule adjustment. Plus I'll get error ellipses for each of my points. I can do 2D-only LSA if I want just by changing a project setting.

What I'm saying is that LSA works for pretty much everything - loop, half-loop, mixed loops, link, spur, RTK, no RTK, levels, chaining, static, fixed control, weighted control, minimally constrained, fully constrained, whatever.

What I should have made clear in my previous posts:

Given the opportunity to use either one - which a great deal of survey software allows for - LSA is more rigorous and gives the best results, which is why I was expressing surprise at the number of folks who appear to have it at their fingertips but spurn it because "equipment is better than it used to be".

To me, it's like putting a marked board on a table saw, turning the saw on...and then ignoring it in favor of cutting it by hand with a hacksaw off the side of the table.

Will it get you a decent result that is "correct"? Probably.
Would it have been faster, easier, and more precise to just cut it with the table saw? Undoubtedly.

 
Posted : 24/08/2022 10:48 am
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @norman-oklahoma

It does free you up to run your work in different ways, in different order, to combine data from different sources, etc., etc. - quite a lot.

Yes. And that is great if someone chooses to do so. Many people do not choose to incorporate other methods, equipment, etc. into their practice and still provide quality work.

Posted by: @norman-oklahoma

That is not a term that I have used in this thread.

I did not intend to suggest you personally said that (apologies if you took offense), but it is a phrase that has been thrown out there in this thread as if it was some defined thing.

Posted by: @norman-oklahoma

Perhaps you might suggest how I handle a control program like this one (showing only a fraction of the whole) using the compass rule, or no adjustment at all?

Since I do not know the workflow for what you have shown on screen, I would not suggest how you should handle the data. As a matter of fact, I would assume that you are handling it in the way you find appropriate, and that that way provides results that are acceptable under any statutory requirement, office policy, or client contract that applies to the project.

Posted by: @norman-oklahoma

In terms of final coordinates, very little - if anything.

That is what I have been wondering all along. I could not see how the mathematics of it would do much.

 
Posted : 24/08/2022 12:10 pm
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @rover83

And even if all I have is a simple loop, I can check for blunders statistically with LSA

Okay. Do the blunders always get detected if you use LSA? I thought blunders needed to be removed prior to LSA, as one of the assumptions of least squares is that the data consists of random errors, not blunders.

Posted by: @rover83

LSA is more rigorous and gives the best results

In the context of a closed loop (no cross ties, etc.), what in the processing provides more rigor and better results? Am I mistaken in thinking that the best that can be accomplished, without additional redundancy built into the closed loop, is to calc coordinates forward and back and average them (potentially weighted for line lengths)?


There are many different reasons to cut a board. Sometimes a circular saw is the appropriate tool. Sometimes the table saw is. I doubt many professionals cut a board with a hacksaw. They probably reserve that tool for its appropriate use also.

 
Posted : 24/08/2022 12:32 pm
(@olemanriver)
Posts: 2432
Registered
 

@norman-oklahoma you nailed it. LS gives you the ability to do whatever is needed and not be tied to running a closed loop. I am not opposed to a closed loop adjusted by compass. If good field procedures are used, the compass rule and LS don't vary much in a typical closed loop. But for a network like the one in your image, you would drive yourself nuts doing closed loops all throughout that project for sure. And I like LS even when I do a simple closed traverse, because odds are I will be back, or close enough to the site that I will have to expand or tie in somehow to old control. And being able to take all the old raw data and new data and throw it into the party mix is a valuable resource for sure.

One other thing I like about LS is that after you have reviewed crews and their instruments over time, you can start to identify systematic errors or an instrument needing adjustment. It all starts to show up, and when I see a change from a crew it's easier to identify that information and say, hey, you might want to run the collimation or whatever on the gun, something is getting loose.

 
Posted : 24/08/2022 12:58 pm
(@michigan-left)
Posts: 384
Registered
 
Posted by: @jon-payne

I doubt many professionals cut a board with a hack saw.

I bet more professionals use that hacksaw on a regular basis than would ever admit to doing so.

Want proof?

Look around next time you go to a continuing education event.

Deer in headlights about any topic besides how to use a steel tape.

Or read through some of the questions/responses on this forum.

 
Posted : 24/08/2022 1:02 pm
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @michigan-left

I bet more professionals use that hacksaw on a regular basis than would ever admit to doing so.

LOL. Alright, you might be right on that. I'll have to admit that just today I used an old flathead screwdriver to remove about a 1/2 inch of chip-and-seal road to uncover a pin. So I may resemble that remark.

 
Posted : 24/08/2022 1:10 pm
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @jon-payne

Do the blunders always get detected if you use LSA? I thought blunders needed to be removed prior to LSA, as one of the assumptions of least squares is that the data consists of random errors, not blunders.

The blunders need to be fixed or removed for the adjustment to pass the chi-square test. The blundered data generally shows up as having the biggest residuals. The adjustment will (usually) run with blundered data included, but the statistics will clearly show it. If the blunder is bad enough the adjustment may fail to converge (i.e., not run).
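
To put a toy example behind that: here's a hypothetical level loop in Python with a 10 cm bust injected into one leg (the 7.81 critical value is straight out of a chi-square table for 3 degrees of freedom):

```python
import numpy as np

# Level loop: A->B, B->C, C->A, A->B (repeat), B->C (repeat), with
# benchmark A held at 100.000 m. All numbers hypothetical.
obs   = np.array([2.513, -1.204, -1.305, 2.509, -1.208])
obs[1] += 0.100                         # inject a 10 cm bust on B->C
sigma = np.array([0.003, 0.003, 0.004, 0.003, 0.003])

A = np.array([[ 1.0, 0.0],
              [-1.0, 1.0],
              [ 0.0,-1.0],
              [ 1.0, 0.0],
              [-1.0, 1.0]])
l = obs + np.array([100.0, 0.0, -100.0, 100.0, 0.0])
P = np.diag(1.0 / sigma**2)

N = A.T @ P @ A
x = np.linalg.solve(N, A.T @ P @ l)
v = A @ x - l

# Global test: v'Pv follows chi-square(dof) if only random error remains.
dof = len(obs) - 2
print(f"chi-square statistic: {v @ P @ v:.0f} (95% limit for dof={dof} is 7.81)")

# Local test: standardized (Baarda-type) residuals; the biggest one
# points at the busted B->C observation.
Qvv = np.linalg.inv(P) - A @ np.linalg.solve(N, A.T)
w = np.abs(v) / np.sqrt(np.diag(Qvv))
print("standardized residuals:", np.round(w, 1))
```

Take the bust back out and the statistic drops to well under the limit.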

The most common busts, for me, are point numbering and measure-ups on 3D traverses. I take the opportunity to clean up point descriptors. Relatively few data sets contain actual weak measurement data.

 
Posted : 24/08/2022 2:53 pm
(@rover83)
Posts: 2346
Registered
 
Posted by: @jon-payne

Okay. Do the blunders always get detected if you use LSA? I thought blunders needed to be removed prior to LSA, as one of the assumptions of least squares is that the data consists of random errors, not blunders.

The adjustment process is usually a two- or three-step affair. The first step is a free or minimally-constrained adjustment, and this is what checks the internal consistency of the observations.

If all blunders and outliers have been removed, and we are left with truly random errors, the post-adjustment residuals will fall within a specific range (depending on network redundancy and individual observation weights) based upon the normal distribution.

In the case of blunders that slip through the typical QC process (like cross-checking field notes), it's a very rare case that they don't get caught during the free/minimal adjustment. Their standardized residuals will get flagged, and the operator can discard one or more and re-run the adjustment.

I've caught many, many blunders that were not apparent because the crew wrote the incorrect measure-up or wrong point number both in the field book and in the data collector.

Once the operator is satisfied that blunders and outliers have been removed, the network can be further constrained. At that point any issues with control can be evaluated through reference factor/chi-squared testing (if they weren't already in the free adjustment); the reference factor should not jump significantly when adding control to the constraints.
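
A toy demonstration of that reference factor behavior, reusing the kind of small level network sketched earlier in the thread (all numbers hypothetical, including the deliberately bad published value for C):

```python
import numpy as np

def adjust(A, P, l):
    N = A.T @ P @ A
    x = np.linalg.solve(N, A.T @ P @ l)
    v = A @ x - l
    dof = A.shape[0] - A.shape[1]
    return x, np.sqrt((v @ P @ v) / dof)    # reference factor

# Minimally constrained: benchmark A fixed at 100.000 m.
A_mat = np.array([[1., 0.], [-1., 1.], [0., -1.], [1., 0.]])
l     = np.array([102.513, -1.204, -101.305, 102.509])
P     = np.diag(1.0 / np.array([0.003, 0.003, 0.004, 0.003])**2)
x_min, rf_min = adjust(A_mat, P, l)

# Now constrain further by also holding a published C = 101.320 +/- 2 mm,
# which is ~14 mm off from what the measurements say:
A_con = np.vstack([A_mat, [0., 1.]])
l_con = np.append(l, 101.320)
P_con = np.diag(np.append(np.diag(P), 1.0 / 0.002**2))
x_con, rf_con = adjust(A_con, P_con, l_con)

print(f"reference factor, minimally constrained: {rf_min:.2f}")
print(f"reference factor, constrained to C:      {rf_con:.2f}")  # jumps
```

That jump is the signal that the added control disagrees with the measurements, not that the measurements suddenly got worse.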

It sounds a lot more complicated than it is. With good equipment and procedures I can ram a network adjustment through in a few minutes, even if I have to disable or delete a few observations along the way. Sometimes those observations are not blunders - statistically speaking, there will occasionally be legitimate outliers that blow the network up, outliers that are nearly impossible to detect in a compass rule adjustment.


In the context of a closed loop (no cross ties, etc.), what in the processing provides more rigor and better results? Am I mistaken in thinking that the best that can be accomplished, without additional redundancy built into the closed loop, is to calc coordinates forward and back and average them (potentially weighted for line lengths)?

It's a good question, because as Norman pointed out the absolute values of the final coordinates may not change very much when comparing a LSA to a compass rule adjustment.

Fundamentally, compass rule weighting is arbitrary; using line length is based upon old instrumentation and procedures from the theodolite-and-chaining era. LSA sucks in literally every single possible quantifiable source of error - centering, measure-up, levelling, EDM, atmospheric, curvature corrections, horizontal angle, vertical angle, pointing & reading.

All of those items can be quantified, either pulled from the datasheet specifications or derived from real-world observations. (Once you start pulling in RTK vectors, levelling observations, etc., all of their unique errors can be handled and appropriately weighted too, which is incredibly powerful.)

So rather than hanging adjustments off of a single arbitrary quantity, all of the standard errors I mentioned above go into computing the actual precision of the measured quantities, and how well they fit together, through the multi-step process I outlined above.
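
As a rough sketch of how one of those error sources turns into a weight - using a made-up ±(2 mm + 2 ppm) EDM spec and 1.5 mm per-point centering, not any particular instrument's datasheet:

```python
import math

def distance_sigma_mm(dist_m, edm_const_mm=2.0, edm_ppm=2.0, centering_mm=1.5):
    """Combine a +/-(const + ppm) EDM spec with setup centering errors."""
    edm = edm_const_mm + edm_ppm * dist_m / 1000.0   # EDM part, in mm
    # Independent sources combine in quadrature: the EDM, plus centering
    # at the instrument and at the target (hence the sqrt(2)).
    return math.hypot(edm, centering_mm * math.sqrt(2.0))

for d in (50.0, 150.0, 1000.0):
    s = distance_sigma_mm(d)
    print(f"{d:6.0f} m shot: sigma = {s:.1f} mm -> weight 1/sigma^2 = {1.0 / s**2:.3f}")
```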

Also, the compass rule cannot handle any additional redundancy, so any setups beyond that first traverse loop are just tossed aside. Granted, there may not be many additional setups, but any data beyond the minimum can be used both to tighten up our coordinates and to check more rigorously for possible blunders.
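
For contrast, here is essentially the entire compass rule in a few lines of Python (hypothetical loop data). Note there is simply no place in it to put a cross tie or a repeated setup:

```python
import math

def compass_rule(legs):
    """legs: (dN, dE) per traverse leg, in metres. Returns adjusted legs."""
    lengths = [math.hypot(dn, de) for dn, de in legs]
    total = sum(lengths)
    mis_n = sum(dn for dn, _ in legs)   # north misclosure of the loop
    mis_e = sum(de for _, de in legs)   # east misclosure of the loop
    # The whole method: remove the misclosure in proportion to leg length.
    return [(dn - mis_n * L / total, de - mis_e * L / total)
            for (dn, de), L in zip(legs, lengths)]

# A hypothetical four-leg loop that misses closure by a few centimetres:
legs = [(200.012, 0.008), (-0.006, 150.021), (-200.031, 0.004), (0.009, -150.005)]
for dn, de in compass_rule(legs):
    print(f"adjusted leg: dN = {dn:+.3f}  dE = {de:+.3f}")
```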

Throw some RTK, static or levelling observations in there too, and I can just toss everything together in one full-blown adjustment, taking into account that the level observations are far tighter than the RTK elevation deltas, and that the total station EDM at 150 feet is far better than a fast-static vector between the same two stations.

Then comes the icing on the cake - I know exactly how good those final coordinates are relative to my fixed control, because I get error ellipses at whatever confidence interval I want. I have a mathematically verifiable estimate of the precisions of my final coordinates - whether or not they changed a lot or a little from pre- to post-adjustment.
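
For anyone who hasn't seen where those ellipse numbers come from: eigen-decompose the point's 2x2 covariance block and scale by the confidence factor you want. A sketch with a made-up covariance:

```python
import numpy as np

# Hypothetical 2x2 covariance block for one point (m^2, rows/cols
# ordered north, east), as it would come out of the adjustment:
cov = np.array([[4.0e-6, 1.5e-6],
                [1.5e-6, 2.5e-6]])

vals, vecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
k95 = 2.447                             # sqrt of chi-square 95% quantile, 2 dof
semi_minor, semi_major = np.sqrt(vals) * k95
# Azimuth of the major axis, clockwise from north:
azimuth = np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1])) % 180.0

print(f"95% error ellipse: {semi_major*1e3:.1f} mm x {semi_minor*1e3:.1f} mm, "
      f"major axis at {azimuth:.0f} deg")
```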

So now I not only know how good my results are, I can point to my procedures as evidence that I did my due diligence and utilized every single bit of good data that I had in a simultaneous solution. (As opposed to calculating data from one loop, then another, or holding an RTK azimuth that's no better than 30" based on vector precisions as gospel and hanging an entire traverse off it.)

I can also do it in about 10-20 minutes, then take another 20 to process linework for the topo, maybe another half hour to add some attributes or descriptions, spit out some reports from the adjustment plus linework processing, and go straight to drafting in about an hour.

OK, I should clarify that it's that fast when we have good people in the field and good procedures in place with well maintained instrumentation. But if we don't, then neither compass rule nor LSA is going to fix bad data and unchecked blunders.

There are many different reasons to cut a board. Sometimes a circular saw is the appropriate tool. Sometimes the table saw is. I doubt many professionals cut a board with a hacksaw. They probably reserve that tool for its appropriate use also.

For sure - I don't see a hacksaw as a bad tool per se, but if I already have the power tool with the precision guide and 3-4K RPM, and it's sitting right in front of me, I'm going to use that because I know it'll cut better than me 99% of the time.

Honestly? I like to work smarter and not harder. Maybe that's just code for being lazy. LSA lets me throw a bunch of data from different sources into a project, tweak a few things, run some checks, wipe out any bad data, and come up with a unified solution from a single program. It's not perfection, but it's about as close as I can get.

 
Posted : 24/08/2022 2:55 pm
(@norman-oklahoma)
Posts: 7610
Registered
 

I just want to pay cash for all my purchases. What is a debit card going to do for me? Answer: nothing much, really.

I just want to communicate by land line phone and US Mail. What is a cell phone going to do for me? Answer: nothing much, really.

I just want to run traverses with transit and tape. What is a robotic total station going to do for me? Answer: nothing much, really.

I just want to hand draft all my surveys. What is CAD going to do for me? Answer: nothing much, really.


 
Posted : 24/08/2022 3:26 pm
(@ncsudirtman)
Posts: 391
Registered
 

So after all that vibrant discussion... any takers on how to perform the LSA appropriately in C3D? lol, some of us are curious about the OP's question, and if nothing else I'd rather do that than a compass rule and level notes adjustment if possible haha

 
Posted : 24/08/2022 4:45 pm
(@andy-j)
Posts: 3121
 

@rover83 Excellent post. I haven't done much LSA since the days of static GPS networks, but your workflow is exactly what I recall.


One thing that always seems daunting is "side shots" to things that don't need redundancy - how does LSA software handle that? Do you have to separate them from the control network and then re-adjust, or does it recalculate on the fly? It would be interesting to see an interactive application where you can put in data and a priori errors to see how you can affect the residuals.


Andy

 
Posted : 25/08/2022 5:19 am
(@jim-frame)
Posts: 7277
 
Posted by: @andy-j

One thing that always seems daunting is "side shots" to things that don't need redundancy - how does LSA software handle that?

In Star*Net, sideshots are computed after the control network is done. It's automatic and essentially instantaneous, so it doesn't appear to be a separate process. I expect TBC does the same.

 
Posted : 25/08/2022 6:00 am
(@andy-j)
Posts: 3121
 

@jim-frame thanks... that's what I thought. I'm watching the Star*Net playlist on the MicroSurvey YouTube page now. Good refresher!

 
Posted : 25/08/2022 6:33 am
(@rover83)
Posts: 2346
Registered
 

The default mode in TBC is to only adjust observations with redundancy, and then update any sideshots based on adjusted positions after the adjustment is done. The operator can change that in the project settings to get "adjusted" values plus error ellipse estimates for sideshots.

Of course, like you say, it requires very good a priori estimates of standard errors for those sideshot observations, since there are no residuals to evaluate in the post-adjustment statistics.
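
A rough sketch of what that propagation amounts to (hypothetical numbers, not TBC's actual internals) - the sideshot's covariance is just the adjusted station's covariance plus the observation's own a priori covariance pushed through the geometry:

```python
import numpy as np

stn     = np.array([5000.000, 2000.000])     # adjusted station N, E (m)
stn_cov = np.diag([4.0e-6, 4.0e-6])          # its covariance from the LSA

az, dist = np.radians(45.0), 100.000         # observation to the sideshot
sig_az   = np.radians(5.0 / 3600.0)          # 5" a priori direction sigma
sig_d    = 0.003                             # 3 mm a priori distance sigma

# The sideshot position hangs straight off the adjusted station:
ss = stn + dist * np.array([np.cos(az), np.sin(az)])

# Propagate the a priori sigmas through the Jacobian of (N, E)
# with respect to (azimuth, distance):
J = np.array([[-dist * np.sin(az), np.cos(az)],
              [ dist * np.cos(az), np.sin(az)]])
ss_cov = stn_cov + J @ np.diag([sig_az**2, sig_d**2]) @ J.T

print("sideshot N, E:", np.round(ss, 3))
print("sigma N, E (mm):", np.round(np.sqrt(np.diag(ss_cov)) * 1e3, 1))
```

With no redundancy on that shot, there is no residual to check it against - which is exactly why those a priori estimates have to be honest.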

 
Posted : 25/08/2022 6:41 am