
Blunder and Error Detection Procedures

110 Posts
13 Users
0 Reactions
17 Views
(@norman-oklahoma)
Posts: 7610
Registered
 

Question for yswami

> Honestly, I don't know what happened here....
Unlike Kent, the vast majority of the data I deal with, including that I manipulate with StarNet, has been collected by others. In my time I've seen data from various iterations of TDS, Trimble, and Spectra Precision Survey Pro. I've seen data from Trimble Survey Controller and Access. I've seen it from Microsurvey's Field Genius running Topcons and Leicas. I've even seen some Leica VIVA. Every one of them gives some sort of real time warning to the instrumentman at the gun of a bust like this. And a majority of those instrumentmen have been good people making a good faith effort to get it right. Still, suffice it to say I've seen data bolloxed up in a great many ways, which I fancy would not be the case were I collecting it myself. But I'm probably only fooling myself there.

Measure up busts happen frequently. They are - by far - the most common type of bust. I dare say that it is a rare day of data that does not contain at least one such bust. I think that it is often because the operator will simply re-record a busted shot with a new, corrected shot and not delete the busted one from the raw data. The StarNet converter simply converts all the raw data without regard to whether it was subsequently overwritten.
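The converter behavior described above suggests a simple office-side screen before adjustment: treat a later shot from the same setup to the same point as superseding the earlier, busted one. A hypothetical Python sketch with an invented record layout — this is not something the StarNet converter itself does:

```python
# Hypothetical raw-data records: occupied station, target point,
# horizontal angle, slope distance.
def drop_superseded(observations):
    """Keep only the last observation for each (occupied, target) pair,
    on the assumption that a re-recorded shot supersedes the busted one."""
    latest = {}
    for i, obs in enumerate(observations):
        latest[(obs["occ"], obs["tgt"])] = i   # later record wins
    keep = set(latest.values())
    return [obs for i, obs in enumerate(observations) if i in keep]

raw = [
    {"occ": 1, "tgt": 7, "ha": 123.4567, "sd": 250.12},   # busted shot
    {"occ": 1, "tgt": 8, "ha": 200.1234, "sd": 180.44},
    {"occ": 1, "tgt": 7, "ha": 123.4575, "sd": 250.31},   # re-recorded, corrected
]
cleaned = drop_superseded(raw)
# cleaned keeps the shot to point 8 and only the second shot to point 7
```

Note the caveat: a duplicate shot may be a deliberate redundant observation rather than a bust, in which case you want to keep both and let the adjustment use them.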

 
Posted : February 23, 2015 10:05 am
(@williwaw)
Posts: 3321
Registered
 

Question for yswami

With SP, I typically will store a new point number on a side shot into existing control, enter it in my field book with the descriptor 'CHK#', and do a quick inverse to see how well I'm hitting. If anything funny is going on, it's going to jump up and bite me and I need to resolve it before I continue. I do this as a matter of routine. Access can handle this differently but it's what I'm used to, and I like having that check shot in the notes to give me a warm fuzzy that everything's going to be okay if and when I get grilled later. I'm more paranoid if the control was RTK'd in.

Note to Yswami. Rule of thumb for me when running a primary control traverse: avoid short legs that could be filled in later with secondary control off your primary, after you've closed out your trav and adjusted everything. Short or steep back sights/fore sights will introduce a lot of error into your traverse that could otherwise be avoided by simply keeping the legs as long as possible. Always do a back sight check, basically measuring the horizontal and vertical distances from both ends of the line you're running. If the numbers don't jibe, rinse and repeat. A bad HI or HR will show up there and you'll have the opportunity to fix it while the problem is still simple. 😉
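The back sight check described above boils down to comparing the elevation difference computed from each end of the leg. A minimal Python sketch with made-up heights and an arbitrary tolerance — not any data collector's actual routine:

```python
def delta_elev(vd, hi, ht):
    """Elevation difference, occupied point to target, from one end of a leg:
    measured vertical distance plus instrument height minus target height."""
    return vd + hi - ht

# Reciprocal observations over leg A-B (all values hypothetical, in feet).
dz_ab = delta_elev(vd=+2.315, hi=5.21, ht=5.40)   # set up on A, sighting B
dz_ba = delta_elev(vd=-2.310, hi=5.35, ht=5.21)   # set up on B, sighting A

# If the HIs and rod heights were entered correctly, the two determinations
# should cancel; a fat-fingered HI or HR shows up as a large misclose.
misclose = dz_ab + dz_ba
TOL = 0.03   # arbitrary example tolerance, feet
if abs(misclose) > TOL:
    print(f"check back sight: reciprocal dz misclose {misclose:+.3f} ft")
```

Here the 0.045 ft disagreement would send you hunting for a bad HI or HR before the error gets carried forward.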

 
Posted : February 23, 2015 10:23 am
(@kent-mcmillan)
Posts: 11419
 

Question for yswami

> Measure up busts happen frequently. They are - by far - the most common type of bust. I dare say that it is a rare day of data that does not contain at least one such bust. I think that it is often because the operator will simply re-record a busted shot with a new, corrected shot and not delete the busted one from the raw data.

It surprises me that just about all data collection software wouldn't have user-defined tolerances and, when those were exceeded, flag the human to request a manual override in order to record the measurements and proceed.

 
Posted : February 23, 2015 11:01 am
(@kent-mcmillan)
Posts: 11419
 

Question for yswami

> With SP, I typically will store a new point number on a side shot into existing control and enter it in my field book giving it a descriptor 'CHK#' and do a quick inverse to see how well I'm hitting.

It seems as if it would be simpler just to have pre-set tolerances for Horiz and Zenith Angles and for Distances and have the DC flag the human user when they are exceeded.

A tie to any other coordinated point should generate a difference in Azimuth, Horizontal Distance, and DeltaHgt which, if outside of tolerance, would ask the user whether he or she wanted to proceed and to store the measurements anyway.
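That suggestion reads as roughly the following logic, sketched in Python with invented tolerances and (northing, easting, elevation) tuples — not any actual DC firmware:

```python
import math

def inverse(p, q):
    """Azimuth (deg), horizontal distance, and height difference from p to q,
    with points given as (northing, easting, elevation)."""
    dn, de, dh = q[0] - p[0], q[1] - p[1], q[2] - p[2]
    az = math.degrees(math.atan2(de, dn)) % 360.0
    return az, math.hypot(dn, de), dh

def tie_check(occ, tgt, meas_az, meas_hd, meas_dh,
              tol_az=0.01, tol_hd=0.05, tol_dh=0.05):
    """Compare a measured tie against the inverse from stored coordinates.
    Returns the out-of-tolerance components; an empty list means the tie checks."""
    az, hd, dh = inverse(occ, tgt)
    d_az = abs(meas_az - az)
    flags = []
    if min(d_az, 360.0 - d_az) > tol_az:     # handle wrap-around at 0/360
        flags.append("azimuth")
    if abs(meas_hd - hd) > tol_hd:
        flags.append("horiz dist")
    if abs(meas_dh - dh) > tol_dh:
        flags.append("delta hgt")
    return flags

# A tie where the rod height was busted by 0.4': only the height flags.
flags = tie_check((1000.0, 1000.0, 100.0), (1100.0, 1000.0, 101.0),
                  meas_az=0.0, meas_hd=100.02, meas_dh=1.40)
```

In a real DC the non-empty `flags` list would drive the "do you really want to store this?" prompt rather than a hard stop.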

 
Posted : February 23, 2015 11:07 am
(@norman-oklahoma)
Posts: 7610
Registered
 

Question for yswami

> It surprises me that just about all data collection software wouldn't have user-defined tolerances and, when those were exceeded, flag the human to request a manual override in order to record the measurements and proceed.
But they do. They all do. It still happens.

 
Posted : February 23, 2015 11:51 am
(@kent-mcmillan)
Posts: 11419
 

Question for yswami

> > It surprises me that just about all data collection software wouldn't have user-defined tolerances and, when those were exceeded, flag the human to request a manual override in order to record the measurements and proceed.
> But they do. They all do. It still happens.

I can't tell you how thrilled I am to only have to deal with data that I myself have collected. :> I'd be hard pressed to send a client a bill for hours spent debugging some field FUBAR caused by ignoring some major out-of-tolerance flag and just carrying on.

 
Posted : February 23, 2015 12:23 pm
(@williwaw)
Posts: 3321
Registered
 

Question for yswami

I collect all my own data for the most part and I'm reluctant to rely on some tolerance setting buried in a setup menu to warn me of impending disaster. My solution to that is, every time I set up conventionally, to measure to the back sight prism, note the difference between the inversed and my current measurements, and book it, vertically if I'm carrying elevations. That forces me to decide if that error is tolerable, not the DC. If the vertical differs by 0.4', it's a safe bet I fat-fingered an HI or HR. Doesn't help with horizontal angular errors, but it will catch nearly all bad measure ups. I just assumed everyone did this. It only takes a minute and saves me hours of head scratching.

If I'm trig leveling, I'll compute the elevations separately by measuring the vertical direct and reverse and averaging the result, assuming they are close. If not, I double check all my assumptions. Relying too much on the DC and software isn't always such a great idea, at least for me anyway. That being said, I rarely ever have this problem.

 
Posted : February 23, 2015 12:42 pm
 rfc
(@rfc)
Posts: 1901
Registered
 

Question for yswami

> My solution to that is, every time I set up conventionally, to measure to the back sight prism, note the difference between the inversed and my current measurements, and book it, vertically if I'm carrying elevations.

By that, do you mean look at your previous measurements TO your current location from the BS location, and compare the VD to the BS VD?

 
Posted : February 23, 2015 12:53 pm
(@williwaw)
Posts: 3321
Registered
 

Question for yswami

Yes. It's a way to prove up the leg before moving on, given you have redundant measurements to compare from both ends of the measured line. If they don't agree within reasonable tolerance, that means there's a problem, and that would be the ideal time to find and fix it before you carry that error forward.

 
Posted : February 23, 2015 1:04 pm
(@norman-oklahoma)
Posts: 7610
Registered
 

Question for yswami

>... I'd be hard pressed to send a client a bill for hours spent debugging some field FUBAR caused by ignoring some major out-of-tolerance flag and just carrying on...
I've become fairly well practiced at running down errors of this sort, so it rarely takes hours. Things like knowing the unextended rod height each crew has, and the difference in height between an instrument and a target on the same tripod (zero is ideal, hurrah for Leica). Having the crew note the setup information. And knowing that if something has been tied more times than seems proper, it is likely that the early ties were out of tolerance.

Of course it's best if they get it right in the first place, but if someone has to scratch their head and trace a bust it's probably better to be me in the comfort of the office rather than a crew with blackberry vines down their necks on a 3 inch screen. That's why I get the big bucks.

 
Posted : February 23, 2015 2:07 pm
(@lookinatchya)
Posts: 133
Registered
 

Jeff Lucas is right. Surveyors are addicted to math like it's crack.

 
Posted : February 23, 2015 2:17 pm
(@yswami)
Posts: 948
Registered
Topic starter
 

Question for yswami

Aloha, Willi:
Thank you for participating in this discussion. Since you've been using Survey Pro for a long time, your clarification is very helpful.

I have a habit of running the stake out routine to the back sight after each setup is complete, before taking any measurements, just to be sure I got the setup correct and to verify the values. I also do the stakeout routine for each control point. Somehow I really got these two messed up! Maybe I was in a meditative state of mind ;-)

Aloha

> Yswami,
>
> I use the same Survey Pro software. I think what Kent is referring to is your back sight check routine when you're solving your occupy/back sight setup. When you set up on a station and enter your HI and BS HR and shoot your back sight, the software will give you the vertical and horizontal discrepancy between what was measured and what was computed using the numbers that you entered. It's a routine check that would reveal the discrepancy that went undetected. If you entered a bad BS HR or instrument HI, the discrepancy would show up there when you sighted and shot your BS. Careful attention to those checks will save you a lot of grief later on.

 
Posted : February 23, 2015 2:41 pm
(@yswami)
Posts: 948
Registered
Topic starter
 

Aloha, Norman:
I will do this experiment.

> Back in the day (pre-Star Net), if I found that I had an angular blunder, I most-times would be able to find out where it occurred thus:
> Draw a perpendicular to the closure leg (that is, the little 0.3' misclosure leg) through the closing point. It should point to the station with the busted angle. That is if the bust is angular, and it is isolated to one station.
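That rule of thumb can be mechanized: an isolated angular bust at a station rotates everything beyond it about that station, so the misclosure vector ends up roughly perpendicular to the line from that station to the closing point. A Python sketch with invented coordinates (a screening aid, not a rigorous test):

```python
import math

def suspect_station(close_pt, misclose_vec, stations):
    """Perpendicular-to-the-misclosure-leg test for an isolated angular bust.

    The line through the closing point, perpendicular to the misclosure
    vector, should pass near the station whose angle is busted.  Returns
    the stations sorted by their offset from that line, closest first.
    close_pt is (northing, easting); stations are (name, northing, easting)."""
    mn, me = misclose_vec
    length = math.hypot(mn, me)
    un, ue = mn / length, me / length          # unit vector along the misclosure
    def offset(st):
        _, n, e = st
        # distance from the perpendicular line = component of the displacement
        # from the closing point taken along the misclosure direction
        return abs((n - close_pt[0]) * un + (e - close_pt[1]) * ue)
    return sorted(stations, key=offset)

# A 0.30' misclosure due east at the closing point; three hypothetical stations.
ranked = suspect_station((0.0, 0.0), (0.0, 0.30),
                         [("A", 100.0, 0.05), ("B", 60.0, 5.0), ("C", -80.0, 2.0)])
# Station A sits almost on the perpendicular, so it heads the suspect list.
```

As the quoted post says, this only works when the bust is angular and isolated to one station; multiple busts smear the geometry.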

 
Posted : February 23, 2015 2:42 pm
(@yswami)
Posts: 948
Registered
Topic starter
 

Question for yswami

Aloha, Kent:
I caused this oddity! :-$

When I first started the job I was keeping the point number and the description the same. Then I hit the first redundant point, #7. I collected that point, then realized my control point numbers and descriptions would begin to drift. So I stopped shooting points for redundancy and decided I'd go back and collect all the redundant points once I closed the traverse loop.

It is obviously not the proper field procedure! How do you do this normally?

> One odd feature of the data entry file was that quite a few of the stations had i.d. nos. in the descriptors that didn't match the pt. i.d. nos used by the DC and I had to wonder why that would be.

 
Posted : February 23, 2015 2:49 pm
(@yswami)
Posts: 948
Registered
Topic starter
 

Question for yswami

Aloha, Norman:

It is comforting to read that those who do this for their livelihood have the same challenge. I do find carrying elevations challenging and prone to operator error. I wish that one day there will be a built-in laser distance-measuring unit in all total stations and prisms! It is one variable that I am not yet comfortable with while learning surveying.

You seem to have dealt with various data formats, so I have a question in this regard. The raw data processed by Dave Karoly looks nice and clean. The same data looks so confusing in SurvNet (Carlson). I wonder what the best way is to clean up the data in Carlson's SurvNet so it looks like what Dave and Kent posted? See attached screenshot.

Thank you!

> > Honestly, I don't know what happened here....
> Unlike Kent, the vast majority of the data I deal with, including that I manipulate with StarNet, has been collected by others. In my time I've seen data from various iterations of TDS, Trimble, and Spectra Precision Survey Pro. I've seen data from Trimble Survey Controller and Access. I've seen it from Microsurvey's Field Genius running Topcons and Leicas. I've even seen some Leica VIVA. Every one of them gives some sort of real time warning to the instrumentman at the gun of a bust like this. And a majority of those instrumentmen have been good people making a good faith effort to get it right. Still, suffice it to say I've seen data bolloxed up in a great many ways, which I fancy would not be the case were I collecting it myself. But I'm probably only fooling myself there.
>
> Measure up busts happen frequently. They are - by far - the most common type of bust. I dare say that it is a rare day of data that does not contain at least one such bust. I think that it is often because the operator will simply re-record a busted shot with a new, corrected shot and not delete the busted one from the raw data. The StarNet converter simply converts all the raw data without regard to whether it was subsequently overwritten.

 
Posted : February 23, 2015 3:04 pm
(@dave-karoly)
Posts: 12001
 

Question for yswami

Point 7 should always be Point 7. If you want to collect more sets to point 7, the DC will ask if you want to overwrite or collect more observations, or something like that; you want to collect more observations. Then your DC will store the angles to the correct point number.

This way a least squares program can process and adjust using all the data to that point.
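That point — keep the redundant observations and let least squares use them all — can be illustrated with the simplest possible case. For direct, independent determinations of one point, the least-squares solution is just the weighted mean (a toy sketch, nothing like a full network adjustment in Star*Net):

```python
def lsq_point(obs):
    """Least-squares position from redundant determinations of a single point.

    For direct coordinate observations weighted by w = 1/sigma^2, the
    least-squares estimate reduces to the weighted mean.  obs is a list of
    (northing, easting, sigma) tuples -- a made-up record layout."""
    w = [1.0 / s ** 2 for (_, _, s) in obs]
    total = sum(w)
    n = sum(wi * o[0] for wi, o in zip(w, obs)) / total
    e = sum(wi * o[1] for wi, o in zip(w, obs)) / total
    return n, e

# Three independent determinations of "point 7"; the tighter one gets more weight.
n, e = lsq_point([(1000.02, 500.00, 0.02),
                  (1000.00, 500.04, 0.02),
                  (1000.01, 500.02, 0.01)])
# n, e come out near (1000.01, 500.02), pulled toward the best observation
```

Throwing away the redundant shots, as in the workflow yswami described, discards exactly the information that makes this averaging (and blunder detection) possible.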

 
Posted : February 23, 2015 3:50 pm
(@norman-oklahoma)
Posts: 7610
Registered
 

> ....Surveyors are addicted to math like it's crack.
When Mulford wrote that having a bad position on the right point is better than a good position on the wrong one, he did not intend to endorse mediocre positioning.

 
Posted : February 23, 2015 3:56 pm
(@kent-mcmillan)
Posts: 11419
 

Question for yswami

> When I first started the job I was keeping the point number and the description the same. Then I hit the first redundant point, #7. I collected that point, then realized my control point numbers and descriptions would begin to drift. So I stopped shooting points for redundancy and decided I'd go back and collect all the redundant points once I closed the traverse loop.
>
> It is obviously not the proper field procedure! How do you do this normally?

The data collector I use (an ancient SDR-33 Pro) will let me collect as many redundant observations as I want to a Pt.7 from as many different instrument stations as I want. It will compare the azimuth, horizontal distance, and elevation difference computed from the angles and distances from the occupied point to Pt.7 to those it calculates from the stored coordinates of both points. If tolerances are exceeded, it will ask me if I really, really want to store the observations, but will do it.

Likewise, in closing a traverse, I don't have to rename the point closed upon, but can use the actual point name.

Of course, I'm exporting the measurement file to Star*Net for adjustment in the office, not trying to do it in the field in the DC.

 
Posted : February 23, 2015 4:02 pm
(@norman-oklahoma)
Posts: 7610
Registered
 

Question for yswami

> You seems to have dealt with various data format. I have question in this regard. The raw data process by Dave Karoly looks nice and clean. The same data looks so confusing in SurvNet (Carlson). I wonder what is the best way to clean up the data like Dave and Kent posted in Carlson's SurvNet?
I have only the briefest of experience with Carlson SurvNet, so perhaps others could advise on this matter better than I can. What you are looking at in your screen cap is the raw pointing data. The StarNet stuff Dave and Kent have posted is reduced to angles. Are you importing to SurvNet via the RW5 editor? Isn't there an alternate method of converting the data inside SurvNet that yields data that looks more like StarNet?

 
Posted : February 23, 2015 4:02 pm
(@yswami)
Posts: 948
Registered
Topic starter
 

Question for yswami

Aloha, Kent:
Survey Pro will not allow duplicate point numbers. They have to be unique.

Thank you!

>
> The data collector I use (an ancient SDR-33 Pro) will let me collect as many redundant observations as I want to a Pt.7 from as many different instrument stations as I want. It will compare the azimuth, horizontal distance, and elevation difference computed from the angles and distances from the occupied point to Pt.7 to those it calculates from the stored coordinates of both points. If tolerances are exceeded, it will ask me if I really, really want to store the observations, but will do it.
>
> Likewise, in closing a traverse, I don't have to rename the point closed upon, but can use the actual point name.
>
> Of course, I'm exporting the measurement file to Star*Net for adjustment in the office, not trying to do it in the field in the DC.

 
Posted : February 23, 2015 4:07 pm
Page 4 / 6