I'm coming from the "field" end of the sandbox. Not at all the academic end.
From the field end, I see an increase in field efficiency of about 1.5x, meaning 1.5x the data per unit time, and sometimes as much as 2x. This can be significant. It can also allow higher accuracy in heavy canopy, i.e., two shots on the same point instead of one. This comparison is against the same Javad receiver running at a 1 Hz rate.
This is my observation.
Nate
Javad then came up with a proprietary approach that would extrapolate the correction in a special way, so that the correction could be sent at a 1-second rate while the processor still performed the calculation five times per second.
Salesman mumbo jumbo spotted!
Why would you need to interpolate from 1 Hz when you are computing at 5 Hz? Why not send what was computed at the second itself?
Can you explain what a 'special way' of extrapolation is? I am sure some mathematicians would be interested in this special method.
Why would you need to interpolate from 1 Hz when you are computing at 5 Hz? Why not send what was computed at the second itself?
Can you explain what a 'special way' of extrapolation is? I am sure some mathematicians would be interested in this special method.
As I read Shawn's post, the bottleneck is the transmission to the rover, here taken as limited to 1 Hz to prevent overheating the transmitter. The extrapolation in the rover gives 5 values per second for the computation.
My guess is that "extrapolated in a special way" is an "adaptive Kalman filter predictor". Look that up if you want more than the salesman words and see if it makes sense in this context.
My guess is that "extrapolated in a special way" is an "adaptive Kalman filter predictor". Look that up if you want more than the salesman words and see if it makes sense in this context.
Without any guessing, whatever they did, it was great!
At the time it was introduced, the users all said stuff like: "it's a beast, man!". From that time, it became known as "beast mode".
It made winners out of its users.
N
My guess is that "extrapolated in a special way" is an "adaptive Kalman filter predictor". Look that up if you want more than the salesman words and see if it makes sense in this context.
Back in the early 1990s, when I was working as a GIS image analyst, I remember attending a lecture given by ESRI at one of their symposia. The gist of the lecture was that in digital imagery it is very easy to downsample a large image and still get a usable image, but the reverse is not true: you get a fuzzy image when you try to upsample one. No algorithm can create genuinely new pixels from the surrounding pixel map. All algorithms just interpolate surrounding pixel values to produce a new pixel, and that value is not real-world data. This is why camera sensors get higher resolutions every year; if interpolation were the key, we would be using 1-megapixel sensors and have software fill in the gaps.
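The downsample/upsample asymmetry is easy to demonstrate. Below is a minimal sketch of my own (not from the ESRI lecture): a sharply varying 1-D signal survives decimation as a usable low-resolution version, but linear interpolation back to full resolution can only invent smooth in-between values, not recover the detail that was thrown away.

```python
# Illustration only: decimate a 1-D "signal", then try to restore it by
# linear interpolation. The restored values are plausible but wrong.

def linear_upsample(samples, factor):
    """Insert (factor - 1) linearly interpolated points between samples."""
    out = []
    for a, b in zip(samples, samples[1:]):
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(samples[-1])
    return out

original = [0, 9, 1, 8, 2, 7, 3, 6, 4]   # fine detail alternates sharply
downsampled = original[::2]               # keep every other sample: still usable
restored = linear_upsample(downsampled, 2)

print(original)    # the real data
print(restored)    # a smooth ramp: the 9, 8, 7, 6 detail is gone for good
```

The interpolated image (or signal) looks reasonable, but the high-frequency content never comes back, which is exactly the lecture's point.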
This is the same as what the salesman was describing. If the rover only receives the even-second corrections, no amount of 'special way' interpolation can provide new accurate data for the intervening time samples.
If the objective of higher correction rates is to give the rover data to compute faster using sub-second ambiguity resolutions, then the only reliable data are the even-second ambiguities. It's the same scenario as sending ambiguity data every 10 seconds and letting the rover interpolate it to every 2 seconds. You send more data because, as someone said, not all data can get through the canopy, and you are hoping that more of it gets through somehow: a shotgun against a single-shot rifle. Interpolating from 1-second data to get 0.2-second data means the rover may not even be getting all of the 1-second data, because not all of the 1-second signals can get through the canopy. And if not all of it is getting through, then why bother with the 'special way of extrapolating' when there is no difference from sending actual data every second?
Like driving a Ferrari in rush hour traffic.
If the rover only receives the even-second corrections, no amount of 'special way' interpolation can provide new accurate data for the intervening time samples.
That would be true if all samples were independent.
But the true values at the times for which the rover computes estimates at 5 Hz are correlated: each one doesn't tell you exactly what the next one will be, but the recent history tells you a value the next one is likely to be close to. So you can make a prediction that is not as good as what the base could produce at 5 Hz, but much better than not using any values at all between the 1-second epochs.
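That argument can be illustrated with a toy model. The sketch below is purely hypothetical (nobody outside Javad knows their actual algorithm), and it stands in a smooth sinusoid for the real correction term: a rover that linearly extrapolates from its last two 1 Hz corrections tracks the 0.2-second epochs much more closely than one that simply holds the last received value, precisely because nearby values are correlated.

```python
import math

def correction(t):
    # Hypothetical stand-in for a smoothly varying correction term.
    # The real correction is not a sinusoid; this only models the
    # correlation between nearby epochs described above.
    return math.sin(0.3 * t)

hold_err = pred_err = 0.0
n = 0
for sec in range(2, 30):                  # whole seconds with two past 1 Hz samples
    prev, last = correction(sec - 1), correction(sec)
    rate = last - prev                    # per-second slope from the 1 Hz history
    for k in range(1, 5):                 # the four 0.2 s epochs inside the second
        t = sec + 0.2 * k
        truth = correction(t)
        hold_err += abs(truth - last)                     # hold last 1 Hz value
        pred_err += abs(truth - (last + rate * 0.2 * k))  # linear extrapolation
        n += 1

print("hold last value :", round(hold_err / n, 4))
print("extrapolated    :", round(pred_err / n, 4))
```

Even this crude two-point extrapolator beats holding the last value; an adaptive Kalman predictor would do better still, and none of this claims to recover what a true 5 Hz base transmission would provide.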
This is the same as what the salesman was describing. If the rover only receives the even-second corrections, no amount of 'special way' interpolation can provide new accurate data for the intervening time samples.
I think the system only needs "accurate enough" data to make effective use of the higher solution rate, something a well-designed extrapolation engine can likely provide. But I'm just an end user who will never understand all the number-crunching that goes on in the box, so I can't offer details.
I do know that when it comes to choosing an RTK system I'll take the recommendation of a widely-respected and innovative GNSS system designer over that of a former GIS Image Analyst all day long.
each one doesn't tell you exactly what the next one will be, but the recent history tells you a value the next one is likely to be close to
That's wrong, Bill. There is no research paper that says you can predict the next value, because the ambiguity parameters are not linear. The ambiguity parameters are themselves located inside statistical ellipses, meaning there is no exact or correct value for each instance. As you yourself stated, they are correlated, but not linearly related. If what you say were true, then someone would have made something akin to a logarithmic table to predict each 0.2 seconds of ambiguity-resolved parameters.
I do know that when it comes to choosing an RTK system I'll take the recommendation of a widely-respected and innovative GNSS system designer over that of a former GIS Image Analyst all day long.
That's ok, Jim. Someone has to buy the stuff so they can keep their payroll going. It's difficult for people trained in only one specific area of the geomatics field to grasp that geodesy does not cover only boundary work. Unless you have never used Google Earth in your work, I will enlighten you: the images on Google Earth did not just appear out of nowhere on your laptop. GIS analysts made those images zoom in with increasing detail using digital orthorectification software worth more than your house, car, and several years' income, all just to make sure you won't get lost while driving to your nearest bar.
But hey, what do we know about GPS positions right?
recommendation of a widely-respected and innovative GNSS system designer
And who voted for them to be a 'widely respected and innovative GNSS system designer'?
must have been locked up in the GIS room when the survey was sent out.
I guess we'll just have to disagree until someone reveals the algorithm used for this extrapolation so we know for sure.
I've made my guess based on the ideas of correlated random process theory.
I'm not sure what you think their algorithm might be.
must have been locked up in the GIS room when the survey was sent out.
If you don't know who Javad Ashjaee was, then I agree -- you must have been locked away somewhere.
If you don't know who Javad Ashjaee was, then I agree -- you must have been locked away somewhere.
How about you? Do you know who Javad was? If you did, then you would have realized that Trimble and Topcon products were also made by Javad. So you see, Javad the company does not hold the sole title of 'widely-respected and innovative GNSS system designer'. If you believe Javad's product to be superior, then logic would also point to Trimble and Topcon having superior products, since they were based on Javad's work too.
So how's the boundary work Jim? If you need any help on GIS or image rectification or satellite imagery processing, give me a heads up. Never too late to teach an old dog new tricks. Lol
Why do I have a feeling that you consider yourself superior to a GIS analyst? Am I correct? If you think GIS tech is low compared to your boundary surveys, then do more research, my friend. You would be surprised to know that what we were using 20 years ago is only now being used in your surveying field. The drones you use to get aerial imagery are based on technology from the GIS field. The stitching of imagery into seamless coverage is based on the GIS field. The street navigation system in your car is based on the GIS field. The databases we had to create and edit held more rows than the number of shots in your largest topo survey.
Sometimes mocking others just reveals your insecurities and ignorance, my friend, and I am really sorry that at your age you have such insecurities.
I do see a lot of insecurities on display. Verbose ones, at that.