
What to edit in a Rinex file?

(@bill93)
Posts: 9834
Topic starter
 

I've played with Teqc and can get the basic job done, but nowhere have I found a tutorial on what you should and should not do to get best results from your data.

If you have a file with a few hours of observation data and realize that you observed an hour into a bad DOP period, should you cut off that data? Does that data hurt your accuracy or just not help much? If receiver time stayed good, it seems like it should help slightly, but if the receiver time wanders, maybe that extra data hurts your results instead?

Likewise, if you have a satellite that was only high enough to clear obstacles for a short time, and has bad start and ending data with some good stuff in between, how do you decide whether to reject it or not?

What other editing should I pay attention to?

 
Posted : 01/11/2015 6:17 am
(@geeoddmike)
Posts: 1556
Registered
 

Of course the answer is: it depends.

During the early years, when there were not many SVs, we had to be careful not to lose data. With the current constellations it can be a valid decision to exclude an SV. Of course, good software will make these decisions for you: a sufficiently robust package can handle cycle slips and noise.

"Tweaking" data is time consuming and often fruitless. I recall helping someone examine the unsatisfactory output of a processing session. The software helpfully showed unfixed cycle slips, and we were able to delete the troublesome data. Later versions of the software dealt with the problem correctly on their own.

There are always tradeoffs in processing. Academic and high-accuracy processing frequently uses a 5-degree elevation mask for both observation and processing. The lower-elevation data provides additional geometric strength in height determinations, but it does introduce a lot of noisy data.

I have only used teqc to translate, extract, and set new session start and end times, and, after analysis of the processing, to remove SVs when the processing software did not provide that option. The teqc quality plots and summary were always an essential part of the processing plan.

HTH,

DMM

 
Posted : 01/11/2015 3:01 pm
(@john1minor2)
Posts: 699
Registered
 

Bill
Go to this link and slide to the bottom of the page.
TEQC | Software | UNAVCO

 
Posted : 01/11/2015 5:10 pm
(@bill93)
Posts: 9834
Topic starter
 

I've been using the UNAVCO_Teqc_Tutorial.pdf and find it pretty good on HOW to edit and get reports. Now I'm looking for how to make the judgment of WHEN and WHAT to edit to get optimum results from OPUS-S.

There's some stuff linked at the bottom of that page that I'll look into, but I don't find anything that is really on this topic. Did I miss something?

 
Posted : 01/11/2015 5:39 pm
(@john1minor2)
Posts: 699
Registered
 

Sorry but I won't be able to help with that.

 
Posted : 01/11/2015 6:09 pm
(@paul-in-pa)
Posts: 6044
Registered
 

Most do-it-yourself software allows you to set the elevation mask. I played with that and probably only once or twice got better results by increasing it over the default of 10°.

If you have a satellite with only 10 minutes of peeking-over-the-horizon data, first check your planning. If you have other satellites in that neck of the woods, simply discard it. If not, remove the early and late observations that do not have clean data all around. Of course, most do-it-yourself software allows you to look at the residuals for that satellite, which will tell you if and when you should remove some data. Most software is good enough to ignore the bad data and not bother you with trimming.

High PDOP is usually flagged by a generic program that thinks there is strength in quantity. If you are worried about high PDOP, also use your planning tool for that time frame and look at the positioning. You can get a low PDOP with 8 satellites, and planning can show you they are still not a good geometry.
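The quantity-versus-geometry point can be illustrated with a small sketch. This is the standard DOP computation from a geometry matrix of receiver-to-satellite unit vectors; the satellite azimuth/elevation values are hypothetical, chosen only to contrast a bunched-up constellation with a spread-out one:

```python
import numpy as np

def pdop(az_el_deg):
    """PDOP from a list of (azimuth, elevation) pairs in degrees."""
    az, el = np.radians(np.array(az_el_deg)).T
    # Geometry matrix: one row per SV, [unit vector to SV, 1 for clock]
    g = np.column_stack([np.cos(el) * np.sin(az),
                         np.cos(el) * np.cos(az),
                         np.sin(el),
                         np.ones_like(el)])
    q = np.linalg.inv(g.T @ g)            # cofactor matrix of the position fix
    return float(np.sqrt(q[0, 0] + q[1, 1] + q[2, 2]))

# Eight SVs bunched into one small patch of sky...
clustered = [(az, 40 + i * 2.5) for i, az in enumerate(range(0, 40, 5))]
# ...versus only five SVs spread around the horizon plus one near zenith
spread = [(0, 30), (90, 30), (180, 30), (270, 30), (45, 80)]
```

Running this, the eight clustered satellites give a far worse PDOP than the five well-distributed ones, which is exactly why a planning tool's sky plot tells you more than the satellite count.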

OPUS and OPUS-RS tell you how many out of your total observations were used in the solution, so remember to look at that.

As far as a general idea on the data: your L1 data for an observation should be in the ratio of 1.283333 times the L2 data. Your C1, P1, P2 and C2 data should be in the range of 19-24 million and within a few units of each other for each observation; that represents the distance in meters from the satellite to your antenna.
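That rule of thumb can be sketched as a quick per-observation filter. This is a hypothetical helper, not part of teqc: the 0.002 tolerance is my own assumption, and the phase ratio is only a rough indicator since carrier phase is accumulated relative to lock-on:

```python
# GPS L1 = 154 x 10.23 MHz and L2 = 120 x 10.23 MHz, so L1/L2 = 154/120
L1_OVER_L2 = 154 / 120  # = 1.283333...

def looks_healthy(l1_phase, l2_phase, pseudorange_m, tol=0.002):
    """Rough sanity check on one observation: L1/L2 phase ratio near
    154/120 and pseudorange within roughly 19-24 million meters."""
    if l2_phase == 0:                      # L2 missing entirely
        return False
    ratio_ok = abs(l1_phase / l2_phase - L1_OVER_L2) < tol
    range_ok = 19e6 < pseudorange_m < 24e6
    return ratio_ok and range_ok
```

Fed the values quoted below, this passes the "better data" rows but rejects both the L1-only startup rows and the rows whose ratios are off.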

From the file you sent me.

5 L1 L2 C1 S1 S2 # / TYPES OF OBSERV

15 10 26 15 40 45.0000000 0 5G06G02G05G25G12
-0.02817 Blank 20681730.914 Blank 23.750
-0.57316 Blank 20812511.953 Blank 19.250
-0.92815 Blank 23645082.266 Blank 10.500
-0.49214 Blank 23621948.469 Blank 7.250
-0.73617 Blank 20972499.461 Blank 20.750
15 10 26 15 41 0.0000000 0 5G06G02G05G25G12
-15634.039 7 Blank 20678756.328 Blank 23.750
-49690.417 6 Blank 20803052.992 Blank 18.000
-92561.971 5 Blank 23627471.398 Blank 10.500
-82082.743 4 Blank 23606328.406 Blank 7.500
-52536.083 7 Blank 20962505.430 Blank 21.500

L1 only, useless startup data.

15 10 26 16 57 45.0000000 0 6G06G02G05G25G12G29
9038467.534 6 7067261.46221 22401704.227 16.500 1.500
-1029108.332 6 -685911.01923 20616678.211 19.750 3.031
-14468476.867 7 -2243222.41721 20891818.164 20.500 0.813
-9240363.003 6 -7008566.22921 21863565.070 15.500 1.000
8725931.536 6 6861880.11721 21315000.609 17.750 0.594
6962336.347 4 8132488.07421 23563445.633 7.250 0.031

One of only a few times you had L1/L2 data for all observations but the ratios are off.

This is what better data looks like.

15 10 16 23 48 30.0000000 0 7G24G22G21G20G18G15G14 0.000000002
14513034.000 8 11314523.13647 20685439.490 20685440.0544 20685446.0054
-21.204 -16.523
14239738.161 7 11111091.37446 22050053.142 22050052.2504 22050055.4124
2461.570 1918.106
14715678.954 7 11466250.36746 21473071.893 21473071.3124 21473074.7234
-1843.591 -1436.564
14830673.258 7 11552162.05546 22850935.215 22850934.2464 22850938.6714
-2865.421 -2232.796
14433817.522 7 11255515.17347 20474820.810 20474820.6964 20474824.1144
709.376 552.760
14807490.254 6 11534166.34745 22624444.327 22624445.1254 22624448.5144
-2769.528 -2158.074
14296192.831 7 11173843.45946 22804448.479 22804449.2794 22804452.9844
2375.998 1851.427

Paul in PA

 
Posted : 01/11/2015 8:05 pm
(@bill93)
Posts: 9834
Topic starter
 

Paul in PA, post: 342673, member: 236 wrote: Most software is good enough to ignore the bad data and not bother you with trimming.

Are you saying I need to do my own trimming, using criteria like you are describing, because OPUS isn't good enough at it, whereas some vendors' software does a better job?

 
Posted : 01/11/2015 8:30 pm
(@paul-in-pa)
Posts: 6044
Registered
 

OPUS-RS does not look through your whole file for good data; if it is not there at the beginning, it quits.

That being said, I got a recent solution using only 16% of the observations.

Paul in PA

 
Posted : 01/11/2015 8:34 pm
(@bill93)
Posts: 9834
Topic starter
 

I think we've determined that the receiver I'm playing with won't support OPUS-RS. I'm asking about OPUS-S.

 
Posted : 01/11/2015 8:49 pm
(@paul-in-pa)
Posts: 6044
Registered
 

The receiver you have is not doing well enough on L2 to get an OPUS-S solution.

Paul in PA

 
Posted : 02/11/2015 3:10 am
(@makerofmaps)
Posts: 548
Registered
 

There are some tutorials here. WinTEQC Editor

 
Posted : 02/11/2015 9:02 am
(@williwaw)
Posts: 3321
Registered
 

Excellent thread Bill93. I'll be studying it in some detail.

 
Posted : 02/11/2015 9:27 am
(@geeoddmike)
Posts: 1556
Registered
 

Just curious: does the RINEX file you create indicate that you have squared L2 data? A "2" should appear in the WAVELENGTH FACT entry in the header. See: TEQC Tutorial | Software | UNAVCO

 
Posted : 02/11/2015 10:08 am
(@bill93)
Posts: 9834
Topic starter
 

OPUS-S may be ignoring most of the L2 data, but I have reports from it on three points with "horizontal network accuracy" of 3.3, 1.1, and 1.9 cm (none with precise orbits yet). The point with 3.3 is in a bad signal environment. All coordinates are reasonable in comparison to values from my optical sighting network.

That's useful for my purposes.

I plan to check my results on a HARN station as the real proof of the pudding.

 
Posted : 02/11/2015 12:04 pm
(@bill93)
Posts: 9834
Topic starter
 

I think it is a worthwhile topic, but so far the answers haven't been particularly clear and directed at the original questions.

 
Posted : 02/11/2015 12:06 pm
(@bill93)
Posts: 9834
Topic starter
 

Surprisingly, there is no 2. When I convert my .dat file to .15o, that line says
1 1 Wavelength Fact L1/L2

Maybe I've missed an option to force it to treat that data properly?

 
Posted : 02/11/2015 12:08 pm
(@dan-patterson)
Posts: 1272
Registered
 

Paul in PA, post: 342675, member: 236 wrote: OPUS-RS does not look through your whole file for good data. If it is not there in the beginning it quits.

That being said I got a recent solution using only 16% of observations.

Paul in PA

16%?!?!?!? I would throw that one away.

 
Posted : 02/11/2015 12:22 pm
(@williwaw)
Posts: 3321
Registered
 

That would have been one of my files Paul was gracious enough to take a look at. You'd likely hesitate to throw it away if it took you half a day of driving, boating and hiking to reach the corner. I ended up going back to hit it again, but just the same.

 
Posted : 02/11/2015 1:20 pm
(@geeoddmike)
Posts: 1556
Registered
 

As you saw in my link (to section 17 of the HTML version of the documentation), the wavelength factor for squared-L2 receivers is required to be a "2".

If the file you submitted to OPUS did not include the squared L2 code, I would edit it and try again.
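If you do edit that header record by hand, note that it is fixed-format: two right-justified six-column integers, with the label in columns 61-80. A minimal sketch of the rewrite (my own helper function, not a teqc feature):

```python
def set_wavelength_fact(header_line, l1=1, l2=2):
    """Rewrite a RINEX 2 'WAVELENGTH FACT L1/2' header record.
    Columns 1-6 hold the L1 factor, columns 7-12 the L2 factor;
    a 2 in the L2 field flags squared (codeless) L2 data."""
    label = header_line[60:80].rstrip()
    if label != "WAVELENGTH FACT L1/2":
        raise ValueError("not a wavelength-factor record: " + label)
    return f"{l1:6d}{l2:6d}" + " " * 48 + "WAVELENGTH FACT L1/2"

original = "     1     1" + " " * 48 + "WAVELENGTH FACT L1/2"
fixed = set_wavelength_fact(original)
```

Whether OPUS then actually uses the squared L2 data is a separate question, but this at least makes the header honest about what the receiver recorded.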

BTW, have you tried submitting your files to any of the other GPS processing utilities? See this page for links:

Geodesy at Texas A&M-Corpus Christi / FrontPage

I do not have any 4000SST data around to play with. I last used one of these receivers in 1994.

 
Posted : 02/11/2015 2:09 pm
(@dan-patterson)
Posts: 1272
Registered
 

Williwaw, post: 342762, member: 7066 wrote: That would have been one of my files Paul was gracious enough to take a look at. You'd likely hesitate to throw it away if it took you half a day of driving, boating and hiking to reach the corner. I ended up going back to hit it again, but just the same.

That sucks, but it still doesn't bode well for the data quality.

 
Posted : 03/11/2015 5:14 am