I am curious how least squares is being used amongst surveyors out there. I've always thought of it as a tool to adjust systematic errors out of one's survey, but am wondering to what extent it is being used to determine final boundary placement. From time to time I've picked up maps and been perplexed about how a specific corner was reestablished; record bearings weren't held, neither were record distances, and proration wasn't used. There seems to be no rationale behind anything. So I'm wondering, are there folks out there using least squares to actually create final boundary positions?
roger_LS, post: 415848, member: 11550 wrote: I am curious how least squares is being used amongst surveyors out there. I've always thought of it as a tool to adjust systematic errors out of one's survey, but am wondering to what extent it is being used to determine final boundary placement. From time to time I've picked up maps and been perplexed about how a specific corner was reestablished; record bearings weren't held, neither were record distances, and proration wasn't used. There seems to be no rationale behind anything. So I'm wondering, are there folks out there using least squares to actually create final boundary positions?
Least squares is definitely not a method "to adjust systematic errors."
Least squares is a general method of dealing with random errors in multiple observations that, taken alone, give different results. The idea is that the least squares criterion, when used as the test of the "best" value that reconciles what would otherwise be discrepancies, is likely to give a realistic result.
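(To put the criterion in symbols for readers who haven't worked with it: in the simplest case of $n$ repeated observations $\ell_1, \dots, \ell_n$ of a single quantity with weights $w_i$, the least squares estimate is the value $\hat{x}$ that minimizes the weighted sum of squared residuals,

$$\hat{x} \;=\; \arg\min_{x} \sum_{i=1}^{n} w_i\,(\ell_i - x)^2 \;=\; \frac{\sum_{i=1}^{n} w_i\,\ell_i}{\sum_{i=1}^{n} w_i},$$

which is just the weighted mean. Adjusting a traverse or network applies the same criterion to many interrelated angle and distance observations at once.)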
Survey retracement problems vary, but where the survey was a figure surveyed on the ground by methods such as transit and tape, the errors can usually be characterized as either falling within ordinary, expected limits or lying so far outside those limits as to strongly suggest that a blunder was made. I've used least squares methods to reconstruct traverses run by transit and tape in the 1930s where I had a copy of the actual field measurements and where a sufficient number of the marks connected by the traverse remained that the adjustment could be well controlled. I consider the results to be reliable within the uncertainties that are generated from the estimated errors in angles and distances used in the adjustment.
Probably the oldest survey that I've applied least squares methods to for the purposes of retracing it was a survey made in 1889 along the Rio Grande. The remaining marks of the survey were vary sparse, but conditions such as bearings to identifiable peaks from various points along the traverse provided conditions for the adjustment that significantly improved the reconstructed positions of corners.
Kent McMillan, post: 415850, member: 3 wrote: The remaining marks of the survey were vary sparse ...
They were also very sparse.
Applying it when you have direct access to direct measurements, as in Kent's example, might make sense if you don't have much to go by, but I don't think it would ever be the best way to go if your record is just a metes and bounds description or a plat.
Least squares is great to apply to your raw measurement data, but you are on much less stable ground when you apply it to scattered remaining evidence of the results of someone else's raw measurements. It is very rare for the law to endorse what a layman would consider a complicated math solution. Retracing a boundary is a legal exercise, not an engineering exercise.
One math solution that can be useful and has been endorsed by the courts is the method the BLM calls Miscellaneous Control.
aliquot, post: 415868, member: 2486 wrote: Applying it when you have direct access to direct measurements, as in Kent's example, might make sense if you don't have much to go by, but I don't think it would ever be the best way to go if your record is just a metes and bounds description or a plat.
Least squares is great to apply to your raw measurement data, but you are on much less stable ground when you apply it to scattered remaining evidence of the results of someone else's raw measurements. It is very rare for the law to endorse a math solution to a boundary that requires more than a couple of sentences to explain to the average non-surveyor.
Least squares; "To reconcile discrepancies, I made small adjustments to the courses and distances shown upon the plat that I consider to be quite reasonable based upon my experience and knowledge of earlier surveying methods. This construction was consistent with the following evidence:______"
Rigid rule of construction: "I considered the distances to be less liable to error than the bearings of lines shown upon the plat and marked the corner in a position that I consider to be most consistent with the calls for distance to the corner as given on the plat. While it is true that the results look wacky and don't fit any of the other evidence that would suggest the corner was elsewhere, I used the rule of constructin since I thought it would be easier to convince a judge that it was the only acceptable way to resolve the discrepancies that the plat presents."
In my former life (prior to retirement) I used best-fit software; I think it was Triad Boundary Analysis. It worked very well, but I do not know if this software is still available. The software would compare and fit coordinates from two different systems (record vs. found).
I don't think of my least squares analysis as seeking out those last few millimeters. I use it to test my error estimates and detect blunders. If the conditions were right to apply it to the actual lines of a boundary, my map would say so. My guess is, if you run into a map that doesn't explain why it appears to randomly deviate from record, then the preparer doesn't understand least squares either.
BillRoberts, post: 415876, member: 1759 wrote: In my former life (prior to retirement) I used best-fit software; I think it was Triad Boundary Analysis. It worked very well, but I do not know if this software is still available. The software would compare and fit coordinates from two different systems (record vs. found).
I still use Triad, and it is available:
http://www.mcgee-gps-triad.com/win_triad_main_new.htm
It is a tool, not a magic wand to fix things. Just like Star*Net - a very powerful tool.
Ken
BillRoberts, post: 415876, member: 1759 wrote: In my former life (prior to retirement) I used best-fit software; I think it was Triad Boundary Analysis. It worked very well, but I do not know if this software is still available. The software would compare and fit coordinates from two different systems (record vs. found).
StarNet can be fooled into doing this same thing. The process is:
- Compute coordinates for the record boundary (StarNet can be used to do this also*) and set those coordinates as held (! ! !).
- Copy those coordinates and change their point numbers to the as-tied numbers you used in the field. Now you have 2 copies of each record coordinate, each with a different number.
- Give the record points a standard error (e.g. 0.01 0.01 0.01).
- Run your adjustment with any coordinates other than those mentioned above floated (* * *). The adjustment will best fit the record.
- Amend standard errors and throw out outliers as appropriate.
*You can also use the MAPMODE inline function to compute record boundaries and eliminate those pesky 0.01' misclosures by distributing.
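The net effect of that procedure, with the field network sliding onto the record figure, can be sketched outside of StarNet as well. Below is a rough Python illustration of a weighted 2D best fit (rotation, scale, and translation) of found-monument coordinates to record coordinates, with residuals reported so outliers can be spotted and thrown out. It is not what StarNet actually does internally (StarNet adjusts the observations themselves), and the coordinates and standard errors here are made up for illustration:

import numpy as np

# Made-up record coordinates (N, E) and the corresponding found-monument
# coordinates in the field system, with an assumed standard error for each tie.
record = np.array([[5000.00, 5000.00],
                   [5100.00, 5000.00],
                   [5100.00, 5150.00],
                   [5000.00, 5150.00]])
field = np.array([[1012.31, 2045.77],
                  [1112.28, 2045.90],
                  [1112.45, 2195.88],
                  [1012.40, 2195.70]])
w = 1.0 / np.array([0.01, 0.01, 0.01, 0.01])**2   # weights = 1/sigma^2

# Remove weighted centroids so the translation drops out of the fit.
rc = np.average(record, axis=0, weights=w)
fc = np.average(field, axis=0, weights=w)
R, F = record - rc, field - fc

# Weighted least squares for a = s*cos(theta), b = s*sin(theta) in
# record ~= [[a, -b], [b, a]] @ field + translation
den = np.sum(w * (F**2).sum(axis=1))
a = np.sum(w * (F[:, 0]*R[:, 0] + F[:, 1]*R[:, 1])) / den
b = np.sum(w * (F[:, 0]*R[:, 1] - F[:, 1]*R[:, 0])) / den

fitted = F @ np.array([[a, -b], [b, a]]).T + rc   # field points mapped to record system
residuals = record - fitted
print("scale =", np.hypot(a, b), " rotation (deg) =", np.degrees(np.arctan2(b, a)))
for i, (rn, re) in enumerate(residuals):
    print(f"point {i}: dN = {rn:+.3f}  dE = {re:+.3f}")   # big residuals flag outliers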
Mark Mayer, post: 415880, member: 424 wrote: 3. Give the record points a standard error (e.g. 0.01 0.01 0.01).
I should have said ..."give the copies of the record points a standard error.."
Mark Mayer, post: 415880, member: 424 wrote: StarNet can be fooled into doing this same thing. The process is:
- Compute coordinates for the record boundary (StarNet can be used to do this also*) and set those coordinates as held (! ! !).
- Copy those coordinates and change their point numbers to the as-tied numbers you used in the field. Now you have 2 copies of each record coordinate, each with a different number.
- Give the record points a standard error (e.g. 0.01 0.01 0.01).
- Run your adjustment with any coordinates other than those mentioned above floated (* * *). The adjustment will best fit the record.
- Amend standard errors and throw out outliers as appropriate.
*You can also use the MAPMODE inline function to compute record boundaries and eliminate those pesky 0.01' misclosures by distributing.
So, I'm showing my ignorance here, but StarNet is a program that performs Least Squares Adjustments, right? My question relates to a basic situation: for example, you've got a monument at a corner, then the line goes North 100' to the next corner, where the monument is missing, then it goes East 100' to a corner where you have another monument. Without any other evidence, one option I might use would be a bearing-bearing intersection to re-establish the position. Is StarNet or Least Squares being used for this type of thing?
roger_LS, post: 415885, member: 11550 wrote: ....Without any other evidence, one option I might use would be a bearing-bearing intersection to re-establish the position. Is StarNet or Least Squares being used for this type of thing?
No, not really. You could fool it into calc'ing such a point but it wouldn't be the easiest way to get it done, IMO.
The process I described above is for cases where a good, accurate boundary survey has been done, with dimensions known and monuments in place, and the need is to establish control relative to that boundary, as perhaps in preparation for construction staking.
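For completeness, the bearing-bearing intersection Roger mentions is a direct computation rather than an adjustment, so least squares isn't really needed for it. A minimal Python sketch of his example (the coordinates and found-monument positions are made up; North is bearing 0 degrees, East is 90):

import math

def bearing_bearing_intersection(p1, brg1, p2, brg2):
    """Intersect the line through p1 at bearing brg1 with the line through p2 at
    bearing brg2.  Points are (N, E); bearings are decimal degrees from north."""
    t1, t2 = math.radians(brg1), math.radians(brg2)
    d1 = (math.cos(t1), math.sin(t1))   # direction (dN, dE) of line 1
    d2 = (math.cos(t2), math.sin(t2))   # direction (dN, dE) of line 2
    denom = d1[0]*d2[1] - d1[1]*d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("lines are parallel; no unique intersection")
    dn, de = p2[0] - p1[0], p2[1] - p1[1]
    s = (dn*d2[1] - de*d2[0]) / denom   # signed distance along line 1
    return (p1[0] + s*d1[0], p1[1] + s*d1[1])

# Roger's example: found corner, record calls North 100' to a missing corner,
# then East 100' to another found corner.
sw = (1000.00, 1000.00)                # found monument
ne = (1100.12, 1099.95)                # found monument (as measured, not quite record)
missing = bearing_bearing_intersection(sw, 0.0, ne, 270.0)   # North from sw, West from ne
print(f"missing corner: N {missing[0]:.2f}  E {missing[1]:.2f}")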
Software like Triad can be used to do what Kent was supporting over in the DOT Boundaries post, establishing a R/W centerline by best fitting the combined center and sideline geometry to a number of found sideline monuments. Triad is also handy to "get onto" the bearings of a previous survey. Rather than just using 2 monuments from a prior survey, you'd be using a weighted best fit to all found monuments from the previous survey. You'd have to do a little explaining in your basis of bearings statement, and the result may not be substantially different from just using 2 monuments, but it seems to be a technically better result. Outliers, of course, should be excluded from the solution.
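A rough sketch of that "getting onto the bearings" idea, in Python rather than Triad (the bearings, lengths, and weighting below are made up, and weighting by line length is just one simple choice): a single rotation is estimated as the weighted mean of the differences between record and newly measured bearings over all found monument pairs, and the residuals point at suspect monuments.

# Record vs. newly measured bearings (decimal degrees) between pairs of found
# monuments from the prior survey, weighted here by line length in feet.
record_brg   = [45.1250, 120.3361, 200.5042, 310.2139]
measured_brg = [45.1694, 120.3828, 200.5458, 310.2572]
weight       = [650.0, 1210.0, 480.0, 980.0]

# Wrap each difference to +/-180 degrees, then take the weighted mean; for a
# single unknown rotation that weighted mean is the least squares estimate.
diffs = [((m - r + 180.0) % 360.0) - 180.0 for r, m in zip(record_brg, measured_brg)]
rotation = sum(w*d for w, d in zip(weight, diffs)) / sum(weight)
residuals = [d - rotation for d in diffs]        # large residuals flag outliers

print(f"rotation, record basis to measured basis: {rotation*3600:.0f} seconds of arc")
print("residuals (seconds):", [f"{r*3600:+.0f}" for r in residuals])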
My experience is that there are at Least a few Squares everywhere.
But Cal Tech, or Berkeley, probably have fewer squares than BJU or DTS.
I'm sure somebody has done a Survey of how many Squares there are per square mile in various geographical areas.
Most squares are not prone to smoke Colorado...
Until recently, surveys were all performed one line at a time. This left a natural connection between the manner of establishment, order of calls and rights and the surveyor following the footsteps. I see very few applications where a least squares adjustment will provide a supportable solution to a boundary problem. A retracement that ignores the manner lines were established is not a retracement.
The tools we have today provide us options that seem efficient by several orders of magnitude. All too often they just get us to an incorrect answer faster.
thebionicman, post: 415895, member: 8136 wrote: Until recently, surveys were all performed one line at a time. This left a natural connection between the manner of establishment, order of calls and rights and the surveyor following the footsteps. I see very few applications where a least squares adjustment will provide a supportable solution to a boundary problem.
I believe that there are at least two different methods that use least squares under discussion here. The cruder method uses a Helmert transformation and takes the theoretical figure of some survey as expressed in some convenient coordinate system and computes the translation, rotation, and scale parameters that will transform those coordinates into some other coordinate system. The transformation process does nothing to the angular relationships of lines, treating their relative bearings as being absolutely correct. All it does is rotate the bearings to best agree with the coordinates of certain points on the boundary that are known in the second coordinate system. Similarly, it applies the same scale factor to all distances to systematically shrink or enlarge the whole figure to fit.
On the other hand, an alternate solution would treat both the lengths of lines and the relative angles between successive courses as observations and would adjust them, using some specified uncertainties in both, to fit the known coordinates of certain points, as would be the case where only some subset of the boundary markers of a tract are found in place and positioned by a new survey in its own coordinate system. The result isn't a simple transformation, but one that models the errors in each individual line and the angles between successive lines, and derives better estimates of those values than the record provides.
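As a toy illustration of that second approach (not Kent's actual computation; the record calls, uncertainties, and found-corner coordinates below are made up), the sketch treats the record bearings and distances of a four-sided tract as weighted observations and solves for two missing corners while two found corners are held fixed, using scipy's generic least squares solver:

import numpy as np
from scipy.optimize import least_squares

# Record calls for a four-sided tract, corners 1-2-3-4 (bearings in decimal
# degrees from north; distances in whatever unit the coordinates use).
courses = [          # (from, to, record bearing, record distance)
    (1, 2,   0.00, 100.0),
    (2, 3,  90.00, 150.0),
    (3, 4, 180.00, 100.0),
    (4, 1, 270.00, 150.0),
]
sigma_brg = np.radians(0.25)     # assumed uncertainty of a record bearing
sigma_dist = 0.30                # assumed uncertainty of a record distance

# Corners 1 and 3 were found and positioned by the new survey; 2 and 4 are missing.
fixed = {1: np.array([1000.00, 1000.00]), 3: np.array([1100.35, 1149.82])}
unknown = [2, 4]

def coords(x):
    pts = dict(fixed)
    for k, pid in enumerate(unknown):
        pts[pid] = x[2*k:2*k + 2]       # (N, E) of each unknown corner
    return pts

def residuals(x):
    res = []
    pts = coords(x)
    for f, t, brg, dist in courses:
        d = pts[t] - pts[f]                                  # (dN, dE)
        res.append((np.hypot(d[0], d[1]) - dist) / sigma_dist)
        dbrg = np.arctan2(d[1], d[0]) - np.radians(brg)      # computed minus record
        res.append(((dbrg + np.pi) % (2*np.pi) - np.pi) / sigma_brg)
    return res

# Start the unknowns at their record positions relative to corner 1, then adjust.
x0 = np.array([1100.0, 1000.0,    # corner 2, per record: 100.0 north of corner 1
               1000.0, 1150.0])   # corner 4, per record: 150.0 east of corner 1
sol = least_squares(residuals, x0)
pts = coords(sol.x)
for pid in unknown:
    print(f"adjusted corner {pid}:  N {pts[pid][0]:.2f}  E {pts[pid][1]:.2f}")

The assumed sigmas are doing the real work here; with nothing but the record and two found corners, the result is only as defensible as those error estimates.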
Ok. So what I'm hearing is that the only time you'd be using it to determine final boundary coordinates might be when you had a monument, then a bunch of missing courses, then another monument, and could then force the record onto these two monuments, either by holding record angular relationships or by letting both angles and distances float. But this could also be used to solve for just one corner. Maybe this is what I'm seeing on these maps, but they forgot to include a note and left it unintelligible.
Kent McMillan, post: 415896, member: 3 wrote: I believe that there are at least two different methods that use least squares under discussion here. The cruder method uses a Helmert transformation and takes the theoretical figure of some survey as expressed in some convenient coordinate system and computes the translation, rotation, and scale parameters that will transform those coordinates into some other coordinate system. The transformation process does nothing to the angular relationships of lines, treating their relative bearings as being absolutely correct. All it does is rotate the bearings to best agree with the coordinates of certain points on the boundary that are known in the second coordinate system. Similarly, it applies the same scale factor to all distances to systematically shrink or enlarge the whole figure to fit.
On the other hand, an alternate solution would treat both the lengths of lines and the relative angles between successive courses as observations and would adjust them, using some specified uncertainties in both, to fit the known coordinates of certain points, as would be the case where only some subset of the boundary markers of a tract are found in place and positioned by a new survey in its own coordinate system. The result isn't a simple transformation, but one that models the errors in each individual line and the angles between successive lines, and derives better estimates of those values than the record provides.
I don't disagree that there are numerous methods, and I'm not saying toss the tool. I am saying that replacing monuments purely on a recreation of a prior survey as a standalone mathematical lump isn't really surveying.
I resolve my surveys one line or corner at a time regardless of how I make the measurements. Yes, I run LS to evaluate my data, but not to opine upon the remaining evidence.
@roger_LS:
There is another tool you can use to put back missing markers between known original markers:
The ALIGN command in AutoCAD.
I'm not knocking least squares, or necessarily promoting one above the other.... However, ALIGN will give you a direct pro-rate.
I like to look at all solutions.
And sometimes this will yield further answers, so that more answers become apparent.
With missing monuments and large discrepancies, going back and forth between field and office as new answers become apparent is SOP for me.
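For anyone unfamiliar with the term, the direct pro-rate described above between two held monuments is just proportionate arithmetic. A small Python sketch (the record distances and found coordinates are made up for illustration):

# Record distances along a line of corners; the two end monuments were found,
# the intermediate corners are missing.
record_dists = [200.00, 300.00, 250.00]
found_start = (1000.00, 1000.00)     # (N, E) of first found monument
found_end = (1000.95, 1751.30)       # (N, E) of last found monument

total = sum(record_dists)
dN = found_end[0] - found_start[0]
dE = found_end[1] - found_start[1]

# Each missing corner goes at the same proportion of the measured line that its
# record distance is of the total record distance (single proportionate measure).
run = 0.0
for i, d in enumerate(record_dists[:-1], start=1):
    run += d
    f = run / total
    print(f"corner {i}:  N {found_start[0] + f*dN:.2f}  E {found_start[1] + f*dE:.2f}")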
thebionicman, post: 415902, member: 8136 wrote: I don't disagree that there are numerous methods, and I'm not saying toss the tool. I am saying that replacing monuments purely on a recreation of a prior survey as a standalone mathematical lump isn't really surveying.
I resolve my surveys one line or corner at a time regardless of how I make the measurements. Yes, I run LS to evaluate my data, but not to opine upon the remaining evidence.
In PLSSia, I do imagine that there is the luxury of being able to even think about dealing with just a line at a time, but in typical retracement problems in Texas that would not be a successful plan for a resurvey.
For example, there is a series of surveys made along the Rio Grande for the Texas & Pacific Railroad Company running upstream from Presidio where extensive obliteration of the original marks has been commonplace. The surveys were located from traverses run along the river, marking corners at 950-vara intervals along the river front, which were mostly either washed away or farmed out. Patents issued on surveys made in 1887 or so that are mostly phantoms as a result of the obliteration of evidence. While the grants ran back from the river for two miles, they terminated in the desert at corners "marked" by completely fictitious monuments, none of which were set.
The best physical evidence that remains of the original surveys is just a bare few rock mounds that happened to have survived and miles of nothing in between. The best record that perpetuates the original surveys is itself nearly a phantom, a traverse run a century ago when some of the marks and their bearing trees could be identified but which itself was only tied to a bare few things that still exist.
Trying to reconstruct each line of each survey in sequence would be an exercise in complete futility, since part of the exercise dealt with the fact that the original surveyor was using a wildly incorrect variation in his compass, and part of the exercise is to arrive at the most reliable measures of the actual direction of his "North" at various points along the phantom of his survey.
In other words, the only way to solve the problem is to deal with very many lines run by at least a couple of surveyors in a comprehensive way, which includes bootstrapping the effort by deriving his "North" from his calls for bearings to topo features like peaks from the positions of long-vanished markers, a "North" that is then used to compute those positions.
So many interconnected, dependent parts related by a survey mean there is no option but to deal with the whole pattern.