We all know that many GNSS units have two parts: an antenna and a "receiver", connected by a coaxial cable. We also know the antenna has a reference point, which is often the bottom of the mount where the 5/8-inch retaining screw goes.
What I'm wondering is how the electronics are divided between the antenna and the receiver so that the transit time through the coaxial cable does not have to be considered. Or is it a matter of only being allowed to use a specific kind of cable of a specific length, so the transit time is known?
I don't think it would matter. For static, the raw GNSS observables contain timestamps that are used during post-processing.
For real-time work, I am pretty sure the radio or cell transmission rate between base and rover is still way slower than the signal going from the antenna to the receiver, so I have to imagine the difference between cabled and integrated would be negligible. Maybe if you had 300+ feet of cable it might start to make a difference?
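A quick back-of-the-envelope comparison supports that hunch. The numbers below are assumptions for illustration (coax velocity factor of 0.8, a ~1 second correction-link latency), not measured values:

```python
# Rough comparison of antenna-cable transit time vs. typical RTK correction
# latency. All figures are illustrative assumptions, not measured values.

C = 299_792_458.0          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.8      # assumed coax propagation velocity factor

def cable_delay_ns(length_m: float) -> float:
    """One-way signal transit time through the coax, in nanoseconds."""
    return length_m / (C * VELOCITY_FACTOR) * 1e9

rtk_link_latency_ns = 1.0e9            # assumed ~1 s radio/cell latency

delay_300ft = cable_delay_ns(300 * 0.3048)   # 300 ft of cable in metres
print(f"300 ft cable: ~{delay_300ft:.0f} ns transit time")
print(f"RTK link latency is ~{rtk_link_latency_ns / delay_300ft:.0f}x larger")
```

Even an unusually long 300 ft run adds only a few hundred nanoseconds, millions of times smaller than the latency of the correction link itself.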
It's been a while since I have used a receiver with an external antenna, but I don't recall any difference in position quality or fix times. Integrated is pretty much the industry standard these days anyway.
Don't think the ARP position would be relevant to the above...
In every GNSS receiver algorithm I've read about, the position is computed from the differences in arrival of the satellite signals at the antenna. The delay through the coax is essentially the same for all of them and cancels out in the differencing.
The additional coax has almost no effect other than signal-strength loss if it is excessively long (many meters) or of poor quality.
To fully convince myself I did some experiments with 100 ft vs 6 ft and got positions within each other's error ellipse.
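The cancellation is easy to demonstrate with invented numbers: add the same cable delay to every satellite's measured range and the between-satellite differences, which drive the position solution, are unchanged:

```python
# Demonstration that a delay common to all satellite signals (e.g. the coax
# transit time) cancels when arrivals are differenced between satellites.
# Ranges are made-up values for illustration.

C = 299_792_458.0                                  # m/s
true_ranges = [20.1e6, 21.3e6, 22.7e6, 23.4e6]     # metres, hypothetical

cable_delay = 100e-9                               # 100 ns common coax delay
measured = [r + C * cable_delay for r in true_ranges]

# Between-satellite differences are what the solution actually depends on:
diff_true = [true_ranges[i] - true_ranges[0] for i in range(1, 4)]
diff_meas = [measured[i] - measured[0] for i in range(1, 4)]

assert all(abs(a - b) < 1e-6 for a, b in zip(diff_true, diff_meas))
print("between-satellite differences unchanged by the common cable delay")
```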
what the division of electronics is between the antenna and receiver
The antenna will contain a preamplifier to boost the signal strength. That is why signal loss in the coax is usually insignificant. The receiver sends power up the coax to run the preamp.
I'm with the "it doesn't matter" crowd. I generally use a 10-meter cable with my old receivers, and they match positions with my newer integrated unit.
@bill93 Thinking about it, I believe the arrivals of the various signals are timestamped against the receiver's internal clock. This is known to be inaccurate, which is why at least 4 satellites are needed: 4 unknowns (x, y, z, and internal clock error) and 4 measurements. The cable just causes the internal clock to appear a few nanoseconds faster than it would otherwise be.
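That 4-unknown solve can be sketched with a toy Gauss-Newton iteration. The satellite positions, receiver position, and clock bias below are all invented for illustration; the point is that the common delay is absorbed into the clock term while the position is recovered cleanly:

```python
import numpy as np

# Toy Gauss-Newton solve of the classic 4-unknown GNSS problem
# (x, y, z, receiver clock bias) from four pseudoranges.
# All geometry and the clock bias are invented for illustration.

C = 299_792_458.0  # m/s

sats = np.array([                     # satellite positions, metres (made up)
    [15.6e6,  7.5e6, 19.9e6],
    [18.9e6, -9.4e6, 14.9e6],
    [-4.5e6, 17.0e6, 19.3e6],
    [11.3e6, 20.1e6,  9.9e6],
])
true_pos = np.array([1.1e6, 4.9e6, 3.9e6])
true_clock_bias_s = 150e-9            # includes any ns-level cable delay

# Simulated pseudoranges: geometric range plus clock bias (in metres).
rho = np.linalg.norm(sats - true_pos, axis=1) + C * true_clock_bias_s

x = np.zeros(4)                       # guess: earth centre, zero bias
for _ in range(10):
    ranges = np.linalg.norm(sats - x[:3], axis=1)
    pred = ranges + x[3]                              # bias kept in metres
    H = np.hstack([(x[:3] - sats) / ranges[:, None],  # line-of-sight rows
                   np.ones((4, 1))])                  # clock partial
    x += np.linalg.solve(H.T @ H, H.T @ (rho - pred))

print(x[:3])                          # recovered position, metres
print(x[3] / C * 1e9, "ns")           # recovered clock bias, ~150 ns
```

With more than four satellites the same normal-equations step becomes an overdetermined least-squares fit, which is why extra satellites improve the solution.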
As Bill said, the cable length does not matter other than the loss of signal strength. For longer runs you need to use a heavier, less flexible cable to reduce signal loss. The signal for each frequency is collected at a discrete electrical phase center that is related to the ARP through modeling. The data is then transmitted from the antenna to the computation engine of the receiver, where it is deciphered and stored relative to the phase centers. In the case of RTK, the data is then processed against the correction data to compute a location. In survey-grade receivers, the signal's code is used to establish a rough distance, and its phase characteristics are used to glean the precise distance from each observed satellite. The receivers do a good job of determining where on the wavelength the signal hit the phase center; it is the integer number of wavelengths that poses the problem.
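A small worked example of that last point, with invented range and code-error numbers: the L1 carrier wavelength is only about 19 cm, so a metre-level code range spans several whole cycles, and the integer count N cannot simply be rounded off:

```python
# Sketch of why the integer ambiguity is the hard part: the carrier phase
# pins down the fractional wavelength precisely, but the whole number of
# cycles N must still be resolved. Range and code error are made up.

C = 299_792_458.0
L1_HZ = 1575.42e6
LAM = C / L1_HZ                        # ~0.1903 m L1 wavelength

true_range = 20_123_456.789            # metres (hypothetical)
N_true = int(true_range // LAM)        # true integer cycle count
phase_fraction = true_range / LAM - N_true   # measured very precisely

code_range = true_range + 1.5          # code solution with ~1.5 m error (assumed)
N_guess = round((code_range - phase_fraction * LAM) / LAM)

# The 1.5 m code error spans ~8 of the 19 cm wavelengths, so naive
# rounding lands several integers away -- hence ambiguity resolution.
print(f"wavelength {LAM:.4f} m, naive N is off by {N_guess - N_true} cycles")
```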
The integrated unit came along as a convenience for RTK. If you ever did RTK with a non-integrated system you would understand why. You can get much cleaner signal data from a larger dedicated antenna. I cannot think of one CORS that uses an integrated antenna/receiver.
The signal and time offsets are at the antenna. The data stream doesn't get rearranged in the antenna cable.
Even back in 1987 our Trimbles came with a long monster cable and 10 m cables, and a signal splitter to record data from one antenna on two receivers.
The signal and time offsets are at the antenna. The data stream doesn't get rearranged in the antenna cable.
I think the concern is/was the time comparison to the receiver's internal clock.
This is all above my pay grade, but I've always found it curious that the old NGS observation log sheets had an entry for antenna cable length.
@jim-frame I believe the receiver time is the time at the antenna. As for the NGS log sheets detailing antenna cable length, the operative word is old logs. In compiling large data sets, a bias might've been observed with cable length, so that detail was captured. It's no longer a concern.
I believe.
Two quick points.
One: as noted by others, the antenna is most often 'just an antenna', with the receiver housing the correlation circuits that decode the GNSS signals and the navigation filter that uses that data to estimate position. If there is a real coax cable, that is probably where they are divided. The coax used must support the GNSS signal, and shorter is better to reduce signal loss. But there are many variations, and many antenna designs where the correlator and/or nav filter reside in the same physical antenna package and the only I/O is power and the control signals.
Two: the length of the coax does have a tiny effect, but the way GNSS works is that all the signals in space collapse at the antenna surface (in precise fact, at its electrical phase center). This is a key point to understand. The phase relationship is maintained as the signal travels down the coax to the correlator, and it is that relationship (as well as orbital details) that is used to determine the position and time estimate. The effect of the coax length is simply to shift the GNSS device's internal clock estimate from the 'real' time (for example, at roughly the speed of light, ~1 ns per foot, a GNSS receiver with a 30 ft cable is computing a time ~30 ns behind ideal).
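That rule of thumb is easy to tabulate. The free-space figure is close to 1 ns per foot; in real coax the (assumed here) velocity factor of ~0.8 pushes it a bit higher:

```python
# The ~1 ns-per-foot rule of thumb for apparent clock offset from cable
# length. The 0.8 coax velocity factor is an assumed typical value.

C = 299_792_458.0      # speed of light in vacuum, m/s
FT = 0.3048            # metres per foot

free_space_ns_per_ft = FT / C * 1e9           # ~1.02 ns per foot
coax_ns_per_ft = FT / (0.8 * C) * 1e9         # ~1.27 ns with velocity factor 0.8

for length_ft in (6, 30, 100, 300):
    offset = length_ft * coax_ns_per_ft
    print(f"{length_ft:4d} ft cable: ~{offset:.0f} ns apparent clock offset")
```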
Unless you are using GNSS for precise time (in which case you just calibrate the delay out based on cable length), it is of little concern. And in RTK navigation, the filter that computes and tracks the time difference between the base and the rover also does not care much, as long as the value is semi-stable. For what it is worth, the two clocks of the base and the rover will drift perhaps 200 to 400 ps every minute or so as they slowly beat against each other.
No problem, over the years I have been surprised several times that many otherwise well-versed GNSS engineers did not 'get' that bit either.