
Least Squares in Civil 3D

96 Posts
22 Users
0 Reactions
16 Views
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @beuckie

For normal, standard stuff I don't get why you guys still use an adjustment routine in the office.

Adjusting is not the primary reason for running data through an adjustment routine. The primary reason is to identify and eliminate blunders. Adjusting is just something that happens incidentally. Doing that on the data collector in the field has its place, but I'd prefer to do it in the office on the big screen.

Few of my control surveys are of the loop type. But all have redundant measurements on everything. If I don't adjust I'm using one measurement to determine a coordinate, and treating subsequent measurements as mere checks. But they aren't simply checks, they are as valid as the first measurement. I prefer to take advantage of all the (correct) data I've worked to collect to determine a final coordinate. The adjusted difference is rarely worth the effort, but the elimination of blunders always is.
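To put a number on "taking advantage of all the data": the simplest case is two redundant determinations of the same value, combined by inverse-variance weighting instead of keeping the first and calling the second a check. A minimal sketch with hypothetical values:

```python
# Sketch: combining two redundant measurements of the same northing instead of
# treating the second one as a mere "check". Weights are inverse variances.
# Values and names are hypothetical, for illustration only.

measurements = [5000.012, 5000.031]   # two independent determinations
sigmas = [0.010, 0.015]               # their estimated standard errors

weights = [1.0 / s**2 for s in sigmas]
combined = sum(w * x for w, x in zip(weights, measurements)) / sum(weights)
combined_sigma = (1.0 / sum(weights)) ** 0.5

print(f"combined value : {combined:.4f}")
print(f"combined sigma : {combined_sigma:.4f}")   # smaller than either input sigma

# A residual (measurement - combined value) that is large relative to its sigma
# is the blunder flag that running the data through an adjustment provides.
for x, s in zip(measurements, sigmas):
    print(f"residual {x - combined:+.4f}  vs sigma {s:.3f}")
```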

 
Posted : 23/08/2022 11:33 am
(@rover83)
Posts: 2346
Registered
 

Paraphrasing Jan Van Sickle from a presentation at a recent conference: all measurements contain error, and therefore proper procedure is to adjust them. Random errors can and should be properly distributed.

It's far more common now to mix observations of varying type and quality. An RTK vector is not the same thing as a total station observation, which is not the same thing as a level observation or a post-processed vector. Any of the above can vary in quality, and thus in how they should be used relative to other observations.

Treating them all as equivalent simply because they came up with approximately the same coordinates or heights/elevations is to make a fundamental mistake. This is especially critical when attempting to sniff out blunders and outliers - or when trying to build a reliable, repeatable network.
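As a rough illustration of why equal treatment is a mistake, the weight an observation carries in an adjustment scales with the inverse of its variance, so different (here, hypothetical) a-priori sigmas produce very different influence:

```python
# Sketch: relative weights of mixed observation types in a weighted least squares
# adjustment, using hypothetical a-priori standard errors for a single quantity.
# Weight = 1 / sigma^2, so a 2x better observation carries 4x the influence.

a_priori_sigma_m = {
    "RTK vector (single epoch)":  0.020,
    "post-processed static":      0.005,
    "total station distance":     0.003,
    "differential level run":     0.002,
}

weights = {name: 1.0 / s**2 for name, s in a_priori_sigma_m.items()}
ref = min(weights.values())

for name, w in weights.items():
    print(f"{name:30s} relative weight {w / ref:6.1f}x")
```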

We do small records of survey and short plats, which are pretty simple and have looser tolerances. We do topographic surveys ranging from a single lot up to a section or more, which can be more complicated and have moderate tolerances. We set targets for manned and unmanned aerial surveys and static and mobile LiDAR - need pretty good coordinates there. We create, maintain, and use tight control for precision construction staking on heavy civil projects. We set geodetic control for large-scale projects like 50+ mile transportation corridors and wind farms. We occasionally do control networks for deformation monitoring, and sometimes the monitoring itself.

All of that work contains random error, and can easily be evaluated for blunders and outliers. Every project gets at minimum the same best-practices treatment for quality control, analysis & adjustment. As @norman-oklahoma mentioned, the primary reason is to identify and eliminate blunders, and sometimes outliers. But once you're done, you have updated coordinates that are the best possible values. Why toss them aside?

I'm not hearing any valid reasons against employing best practices, when the software makes it so easy and it takes minimal time.

 
Posted : 23/08/2022 12:08 pm
(@ncsudirtman)
Posts: 391
Registered
 

For me personally, I wish somebody could actually post up how they use Civil 3D and the .fbk files from a data collector to properly do a least squares adjustment using C3D. I have added the report style generator to my data collector running Trimble Access, exported the .fbk file, and imported it into C3D through the survey database portion. But I have not been able to figure out much from there myself, though I do occasionally see others swear they've used it successfully many times. I know the data collector itself will perform simple adjustments on traverses, but being able to perform an LSA, analyze the data, and report on it without purchasing TBC, Starnet, or Carlson would be nice. I just wish Autodesk would address the need somehow.

 
Posted : 23/08/2022 12:13 pm
(@jon-payne)
Posts: 1595
Registered
 

For those people who run a loop traverse and close it out to find a linear and angular misclosure that is well within their acceptance level for error (then adjust or not based on their determination of the need), what does LSA bring to the table?

To me, in the case of a traditional old fashioned closed loop traverse (without cross ties, solar azimuths along some lines, GPS observations here and there or some form of redundancy other than the closing line), there seems to be too little redundancy for meaningful least squares analysis.

What am I missing in this scenario?
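For context, a simplified textbook count of the degrees of freedom in a bare loop (assuming one angle and one distance per station, a fixed starting point, and a fixed starting azimuth - not any particular software's bookkeeping) shows just how thin that redundancy is:

```python
# Sketch: redundancy (degrees of freedom) in a plain closed-loop traverse,
# counted as observations minus unknowns. Assumes one interior angle and one
# distance per station, one fixed station, and one fixed azimuth.

def loop_redundancy(n_stations: int) -> int:
    observations = 2 * n_stations            # n interior angles + n distances
    # datum (fixed point + fixed azimuth) leaves 2n - 3 free coordinates
    unknowns = 2 * n_stations - 3
    return observations - unknowns

for n in (4, 5, 10):
    print(f"{n}-station loop: redundancy = {loop_redundancy(n)}")
# Prints 3 for every n: one angular and two linear closure conditions are the
# only checks, no matter how many stations are in the loop.
```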

 
Posted : 23/08/2022 12:53 pm
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @jon-payne

For those people who run a loop traverse .... what does LSA bring to the table?

Freedom from having to dogmatically do loop traverses to get some sort of analyzable redundancy into your work.

 
Posted : 23/08/2022 2:10 pm
(@rover83)
Posts: 2346
Registered
 
Posted by: @jon-payne

a traditional old fashioned closed loop traverse (without cross ties, solar azimuths along some lines, GPS observations here and there or some form of redundancy other than the closing line)

I can count on one hand the number of times I have run or processed one of those in the past five years. Maybe two hands will get me to ten years.

But even if that's all I had, I would still use LSA because all else being equal, it is best practices and returns the best results.

https://www.xyht.com/gnsslocation-tech/i-dont-need-no-stinkin-statistics/

 
Posted : 23/08/2022 5:54 pm
(@olemanriver)
Posts: 2432
Registered
 

@beuckie So you can check from point to point and make a decision if it's too large an error. But what about how one point fits to another all the way across a site, or from one property corner to another that's not visible from the same setup, etc.? I agree a lot can be done and checked in the field, especially by someone like yourself with time under the belt. But there is something to be said for having another set of eyes check your work. If I do the calcs and I stake it and no one else checks it, it's only my perspective. I would rather someone else do the calcs and let me check, or vice versa; same with control. If I do the field work I ask someone else to run it through the wringer or at least do a look-over.

 
Posted : 23/08/2022 6:39 pm
(@beuckie)
Posts: 346
Registered
 
Posted by: @olemanriver

@beuckie So you can check from point to point and make a decision if it's too large an error. But what about how one point fits to another all the way across a site, or from one property corner to another that's not visible from the same setup, etc.? I agree a lot can be done and checked in the field, especially by someone like yourself with time under the belt. But there is something to be said for having another set of eyes check your work. If I do the calcs and I stake it and no one else checks it, it's only my perspective. I would rather someone else do the calcs and let me check, or vice versa; same with control. If I do the field work I ask someone else to run it through the wringer or at least do a look-over.

I must say that the area we work in is completely covered by an RTK network we can log in to. This means that all the setups I do, and most surveyors over here, are resections between 2 or more (I use 4) GPS'd points.

A traverse with tribrachs and all the needed gear isn't done anymore, except for critical work like railroads. I used to traverse and use least squares in the beginning, until about 5 years ago.

My field software gives the errors from my setups. I can check it all in the field for standard topos.


For construction I use the TS of course, and set out multiple points and resect from multiple free setups for optimal accuracy.
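For illustration only, a distance-only free-station (resection) solved by iterative least squares shows how the extra GPS'd points feed redundancy into the setup. A real total-station resection also uses directions; the control coordinates and distances below are hypothetical:

```python
# Sketch: a free-station position from distances to known control points,
# solved by Gauss-Newton iteration. Extra control points beyond the minimum
# give residuals that flag a bad distance or a bad control coordinate.
import numpy as np

control = np.array([[100.0, 200.0],     # known points the instrument sights
                    [350.0, 180.0],
                    [300.0, 450.0],
                    [ 90.0, 420.0]])
measured = np.array([141.426, 192.091, 180.283, 162.785])  # reduced distances

xy = control.mean(axis=0)               # crude initial guess: centroid of control
for _ in range(10):
    diffs = control - xy                # vectors from the setup to each control point
    dists = np.linalg.norm(diffs, axis=1)
    A = -diffs / dists[:, None]         # Jacobian of distance w.r.t. setup coordinates
    w = measured - dists                # misclosure vector (observed - computed)
    dx, *_ = np.linalg.lstsq(A, w, rcond=None)
    xy = xy + dx
    if np.linalg.norm(dx) < 1e-6:
        break

residuals = measured - np.linalg.norm(control - xy, axis=1)
print("setup position:", xy)
print("residuals     :", residuals)     # one bad distance shows up here as an outlier
```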

 
Posted : 23/08/2022 10:55 pm
(@beuckie)
Posts: 346
Registered
 
Posted by: @norman-oklahoma
Posted by: @beuckie

For normal, standard stuff I don't get why you guys still use an adjustment routine in the office.

Adjusting is not the primary reason for running data through an adjustment routine. The primary reason is to identify and eliminate blunders. Adjusting is just something that happens incidentally. Doing that on the data collector in the field has its place, but I'd prefer to do it in the office on the big screen.

Few of my control surveys are of the loop type. But all have redundant measurements on everything. If I don't adjust I'm using one measurement to determine a coordinate, and treating subsequent measurements as mere checks. But they aren't simply checks, they are as valid as the first measurement. I prefer to take advantage of all the (correct) data I've worked to collect to determine a final coordinate. The adjusted difference is rarely worth the effort, but the elimination of blunders always is.

When I started out we would measure and code but couldn't check setups until we were in the office. Of course checking was needed then, but with the continuous evolution of field software and the use of hybrid setups with 100% RTK coverage, we don't need the same procedures as before.

It depends on the task at hand, of course, but measuring a light pole with an offset will give different results every time you remeasure it, no matter how much control you have on the setups.

But that's my workflow, of course. Everyone does it the way they think best.

 
Posted : 23/08/2022 11:05 pm
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @jon-payne

For those people who run a loop traverse .... what does LSA bring to the table?

Compass rule is a 2D function in a 3D world. LS will also handle the blunder detection and adjustment of elevations.
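A bare-bones compass (Bowditch) rule pass over latitudes and departures (hypothetical numbers) makes the 2D-only point concrete; nothing in it touches elevations:

```python
# Sketch: compass (Bowditch) rule on a closed loop -- corrections to each leg's
# latitude/departure are proportional to that leg's length. It is purely 2D:
# nothing here detects or distributes an elevation bust. Numbers are hypothetical.

legs = [  # (length, latitude, departure) for each traverse leg
    (310.0,  305.10,   55.20),
    (295.0,  -60.30,  288.90),
    (330.0, -301.20, -134.00),
    (218.0,   56.43, -210.12),
]

total_len = sum(l for l, _, _ in legs)
mis_lat = sum(lat for _, lat, _ in legs)     # linear misclosure, north component
mis_dep = sum(dep for _, _, dep in legs)     # linear misclosure, east component

adjusted = [(lat - mis_lat * l / total_len,
             dep - mis_dep * l / total_len) for l, lat, dep in legs]

print(f"misclosure: lat {mis_lat:+.3f}, dep {mis_dep:+.3f}")
print(f"closure ratio ~ 1 : {total_len / (mis_lat**2 + mis_dep**2) ** 0.5:,.0f}")
for lat, dep in adjusted:
    print(f"adjusted lat {lat:9.3f}  dep {dep:9.3f}")
```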

 
Posted : 24/08/2022 7:33 am
(@jon-payne)
Posts: 1595
Registered
 

@norman-oklahoma Dogma. That old chestnut.

I would suggest that it is dogma to say least squares is "best practices" when a closed loop is a perfectly fine method AND is certainly easily applicable to MANY jobs. Perhaps not the work you are doing, but there is no need to perform a static observation on two control points, then NRTK several other points, followed by shooting some additional points with a total station, on a one-acre lot covered with canopy, when a 3- or 4-point closed traverse with 1 in 80-120K closure fits the bill perfectly well.
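For scale, the arithmetic behind a 1 in 80-120K closure on a small loop (hypothetical perimeter) is:

```python
# Sketch: what a 1:80,000 to 1:120,000 closure means in absolute terms on a
# small loop. Perimeter below is hypothetical.
perimeter_ft = 1200.0
for ratio in (80_000, 120_000):
    print(f"1:{ratio:,} closure on {perimeter_ft:.0f} ft  ->  "
          f"{perimeter_ft / ratio:.4f} ft misclosure")
# ~0.015 ft and ~0.010 ft respectively -- at or below the noise floor of the
# measurements themselves, which is the point about such a loop fitting the bill.
```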

 
Posted : 24/08/2022 9:07 am
(@jon-payne)
Posts: 1595
Registered
 
Posted by: @rover83

I can count on one hand the number of times I have run or processed one of those in the past five years

The recurrent theme was "I". As was discussed regarding the national scope of the exams, practice is not the same in every area. Not all surveying companies do the same type of work, and not all surveyors choose to use the same equipment. As long as the procedures and equipment used meet the statutory requirements and any standard of care for the area of practice, then they are using best practices.


That is an excellent series of articles, which I've read several times. Just looking at the sketch of the proposed network easily excludes it from my question.

 
Posted : 24/08/2022 9:11 am
(@norman-oklahoma)
Posts: 7610
Registered
 
Posted by: @jon-payne

I would suggest that it is dogma suggesting least squares is "best practices" when a closed loop ... with 1 in 80-120K closure fits the bill perfectly fine.

I agree that if you can close all your loops to that degree of precision you don't ever need to adjust. Personally, I'm not that good. Sometimes I make mistakes.

Perhaps you are overestimating the level of effort involved here. I could adjust a 4-legged closed loop traverse, with side ties and without errors, in 5 minutes from beginning of download to completed report. On a laptop, on the tailgate of the truck. If you own Carlson Survey, you already own the necessary program. Where is the downside?

 
Posted : 24/08/2022 9:50 am
(@jon-payne)
Posts: 1595
Registered
 

By the way - my original question about the non-redundant closed traverse and what LSA would bring to it was an actual inquiry seeking knowledge. Is there some actual benefit to running such a survey through LSA?

I'm far from claiming to be the leading expert in such a discussion, but as far as I can tell there is no particular benefit (especially when you consider that many times there is no measure-up occurring in those types of traverses, so there isn't even a valid 3D dataset to deal with).

 
Posted : 24/08/2022 9:53 am
(@jon-payne)
Posts: 1595
Registered
 

@norman-oklahoma I understand that it is very easily accomplished with clean data. I've got Carlson, I use GPS combined with TS as needed, I've cross-tied a traverse, etc. I've run data through SURVNET (although I do it infrequently enough that it is not a 5-minute exercise for me).

It is just something that doesn't sit well with me to suggest that not using LSA when not required is somehow not exercising due diligence or meeting some personal opinion of "best practices". I expect there are a vast number of surveys conducted each day that are closed traverses with such minimal misclosure that choosing to not adjust or use LSA is perfectly fine.

I also expect there are many cases where LSA should be used (or is even required to be used) but is not being applied.

 
Posted : 24/08/2022 10:01 am