So I have a project that involves a lot of inventory mapping. It is 99% linework and it gets added to daily. If I use Trimble Access and do the automatic linework, I know I can see the lines draw as I go. This is one of those jobs with many things going on at the same time. Say on day one I map what is there, the contractors add more, I go back and continue, and this repeats daily. Then I export the DXF straight from Access and just keep adding to it so multiple crews know what has been located, without loading all the points in; that part I understand. And when needed, I know Access can manually draw a line to a previous day's work and continue on.

What I don't know is how this affects Civil 3D for my final delivery. We deliver each day, but how do I manage bringing in the same linework over and over again on the Civil 3D side? I am not a Civil 3D guy, but I realize my idea probably has its cons. I worked this project this week, and with different phases all going on at the same time the points just get cluttered. I have managed this by starting a new job and only linking to the points needed for stakeout and such. I am just trying to improve the process and save time in both the field and the office.

I know I can also keep bringing the points into TBC and CAD and redrawing daily to check for errors in coding and so on; I just don't know yet how laying lines on top of lines affects my CAD workflow. I am still wet behind the ears on that. My end goal is to still bring my job files into TBC and check rod heights, instrument heights, and all the QA/QC, but speed up the daily PDF sketch deliverable without messing up the final stamped version when that portion is completed. We already have several different CAD files for different things going on in the same project and same area.
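For the "same linework over and over" part, one possible approach (just a sketch, not something pulled from this workflow) is to diff each day's DXF export against the previous day's and only bring the new entities into CAD. Here is a rough Python sketch using the ezdxf library; the file names are made up and only LWPOLYLINE entities are handled, so treat it as a starting point rather than a finished tool.

```python
# Rough sketch: keep only the linework that is new in today's DXF export.
# File names are placeholders; only LWPOLYLINE entities are handled here.
import ezdxf

yesterday = ezdxf.readfile("export_day1.dxf")
today = ezdxf.readfile("export_day2.dxf")

def signature(pline):
    # Identify a polyline by its layer and rounded vertex coordinates.
    pts = tuple((round(x, 3), round(y, 3)) for x, y, *_ in pline.get_points())
    return (pline.dxf.layer, pts)

seen = {signature(p) for p in yesterday.modelspace().query("LWPOLYLINE")}

delta = ezdxf.new()
msp = delta.modelspace()
for pline in today.modelspace().query("LWPOLYLINE"):
    if signature(pline) in seen:
        continue  # already delivered on a previous day
    layer = pline.dxf.layer
    if layer not in delta.layers:
        delta.layers.new(layer)
    msp.add_lwpolyline(
        [(x, y) for x, y, *_ in pline.get_points()],
        dxfattribs={"layer": layer},
    )

delta.saveas("export_day2_new_only.dxf")
```

Anything already delivered stays out of the new file, so the CAD drawing never gets the same line twice.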
What you propose will probably work, but it is not the approach that I have used. I'll check the raw data and reduce the coordinates in the adjustment package of my choice, which happens to be StarNet, and then import the points to C3D and run F2F on them.
Whichever method you use, the main thing is keeping up with the daily mapping and giving the field crew near-real-time feedback.
Two things come to mind with a large project like this. First, you want the control settled before the topo begins. The topo crew can maybe throw out a working point or two, but the general framework must be complete. Second, as you accumulate data, the drawings can become too big to handle conveniently. You may wish to divide the project area in some convenient manner.
@mark-mayer Thank you. Yes, on this project the control has been established and there is not much else to do in that regard. Daily adjustments are very rarely needed. Every now and then a new traverse point needs to be established the way it is currently done. I would have done it a little differently, as a resection would work 99.9% of the time; however, they have not allowed that historically. It is one of those setups where you set up on a known with a known backsight, move up to the new point, and close on a known into a known. I would have just done a two- or three-point resection and moved on.
The job in my question is all in the same area daily, with just new materials to map in layers, and most of it is linework. I do go through it on the office side and check for rod-height busts and so on. I agree that quick turnaround to my field folks is key, since there is no way to really mark everything you have shot; but to me it is so much easier to see the linework and know this line connects to the other line from yesterday, instead of trying to determine which point to use when several are within a few feet of each other. I truly am about as green as one can be on the Civil 3D side, which is the reason for asking. With Terramodel I would have just redrawn each day's work from the field codes and done it that way, but that was before map screens on data collectors. Thank you, I will ponder and look further into your suggestions for sure.
As a member of a larger firm that is naturally inclined to "assembly-line surveying", I fight with my colleagues over this issue a lot.
There's nothing wrong with punching out daily unadjusted DXFs simply as a way to keep all crews up to date with what has been done so far. It works well and doesn't require office personnel to drop everything to get perfect data back out to the field.
But as @mark-mayer mentioned, the QC/QA has to be done, and that's best completed in the post-processing software package.
In your case, you have TBC, which will let you retain previous edits to a raw data file if you re-import that same file again with additional, newer data. You can process linework only on selected raw data files without altering others, modify feature codes and attributes and reprocess if necessary, and keep adding to it as much as you like.
In my opinion, the TBC project should be where the data lives until the vast majority of the project (and cleanup) has been done. CAD is just a deliverable development package, with the possible exception of boundary analysis and maybe surfaces, and even those things can be done in TBC. CAD's good for making things look pretty and adding labels and putting everything on a sheet set.
Also, keep in mind that you can turn on a single JXL's worth of data and only export the linework and points from that file - if you have already QC'd and exported previous days' data, you don't have to re-export that work.
The View Filter Manager is a fantastic tool for isolating only the things you want to see.
@rover83 You always write in a very clear and understandable way, and I truly appreciate that. I am going to do some more experiments with exactly this. I do have TBC and have been self-taught, aside from people like you pointing me in the right direction and the Trimble knowledge centers. The whole idea of processing by job JXL files and using the filters is something I have hardly done at all. Heck, where I am now they kind of threw up a red flag when I brought terrestrial data and GPS into one project. I did this on jobs at my previous place, but even they couldn't wrap their heads around it; to me it was not a big deal. I can see from this comment and a few others that I need to dive into the View Filter Manager and the data-handling tools more. I have mainly focused on the adjustment process and on using the different spreadsheets to look for blunders, like the optical spreadsheet and the vector spreadsheet, which are very useful. And I learned I can customize them to show only what I want to see.
On one project I made a sheet so the crew knew what time they had last shot a point, because I wanted them to RTK it again with at least a 4-hour gap, several days later. It was one of those jobs where we pushed GPS hard in canopy and I wanted to make sure multipath wasn't getting me.
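For what it's worth, a sheet like that could probably be generated with a short script. Here is a minimal sketch assuming the point names and observation times have been exported to a CSV; the file name and column headers are hypothetical.

```python
import csv
from datetime import datetime, timedelta

# Assumed CSV export with columns "Point" and "ObsTime" (e.g. 2023-05-01 09:32:00).
# Both the file name and the column names are hypothetical.
OBS_FILE = "daily_observations.csv"
MIN_GAP = timedelta(hours=4)

times = {}  # point name -> list of observation times
with open(OBS_FILE, newline="") as f:
    for row in csv.DictReader(f):
        stamp = datetime.strptime(row["ObsTime"], "%Y-%m-%d %H:%M:%S")
        times.setdefault(row["Point"], []).append(stamp)

for point, stamps in sorted(times.items()):
    stamps.sort()
    gap = stamps[-1] - stamps[0]
    if len(stamps) < 2 or gap < MIN_GAP:
        # Flag points that still need a repeat RTK shot 4+ hours after the first one.
        print(f"{point}: last shot {stamps[-1]}, needs re-observation (gap so far: {gap})")
```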
I bet if I sat down with you for a few days, you would teach me more about TBC than I have figured out on my own in the last year of part-time use.
I do have a post-processing question I have been meaning to ask you. I usually do static work and mimic what NGS does from the CORS stations, a spokes-of-a-wagon-wheel approach, and I usually never process the vectors between the CORS stations themselves. I will, however, work the triangles throughout my network and have a check back to CORS. I have seen some projects where people are processing between the CORS stations and not paying attention to independent versus dependent baselines; closing a triangle all in one session doesn't make sense to me. Now, when I say I don't process between CORS, I might do it as a check, independently of my overall network. I want my new control to be tight relative to itself and as close to true as possible, and flowing out to unknown points from a CORS is great for that. Will TBC handle this approach, or has it been determined to be bad practice in this case? Understanding of the software is something you truly seem to have a grasp of. Doing geodetic work you don't always create triangles the way we practice on the private-sector side. I see both sides, but I also understand the software is written differently as well. GPS doesn't really need all the triangles, but it's the way we think, I reckon. Maybe it falls back to strength-of-figure basics as well.
Oh, and thank you. It is always great reading your information.
It's likely that you'll get some pretty wide-ranging opinions on how to process static data against CORS in a commercial program. My two cents:
Personally, I will always process the vectors between CORS initially, but only include those vectors in the initial minimally constrained adjustment(s), in order to see how well they align with each other. Once I decide which ones to hold fixed, I strip out or just disable the baselines between the stations I am going to hold. (I guess I could look at the short-term time series but it's easy enough to just run the adjustment and see.)
TBC does not process baselines by session (like PAGES), but by observation pairs, so while processed vectors with overlapping data may be somewhat correlated, there are no baselines that are straight up independent - at least, not in the sense that there would be in an OPUS/PAGES session. You'll never see a 100% perfect 3-legged loop closure.
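For anyone following along, the loop-closure check being referred to is just summing the three baseline vectors around the triangle and seeing what is left over. A minimal sketch with invented numbers:

```python
# Minimal loop-closure check for a 3-legged GNSS triangle.
# The delta-X/Y/Z values below are invented for illustration only.
import math

# ECEF baseline components in metres: A->B, B->C, C->A
ab = (1520.431, -884.207, 310.552)
bc = (-750.118, 1203.664, -95.870)
ca = (-770.306, -319.452, -214.690)

# Around a closed loop the components should sum to zero;
# whatever is left over is the misclosure vector.
mis = [a + b + c for a, b, c in zip(ab, bc, ca)]
mis_len = math.sqrt(sum(m * m for m in mis))

# Total loop length, for reporting the closure as a ratio (1:N).
loop_len = sum(math.sqrt(sum(c * c for c in v)) for v in (ab, bc, ca))

print(f"misclosure dX/dY/dZ: {mis[0]:.4f} {mis[1]:.4f} {mis[2]:.4f} m")
print(f"misclosure length:   {mis_len:.4f} m  (about 1:{loop_len / mis_len:,.0f})")
```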
My SOP (doesn't happen on all of our projects) is to run two reference receivers and then leapfrog around with "rover" receivers. Sometimes I will connect the CORS only to the reference receivers, and sometimes to everything. Depends on length of lines, session time, satellite visibility, etc.
For the rest of the network, triangles is the name of the game. Maybe I'm in the minority but I don't care if baselines cross each other, unless it's high-precision geodetic work and we are running with specific requirements to not cross them. Adjacent points are always occupied simultaneously at least once, usually twice.
Yes, yes, independent, separate sessions at different times of day, with all receivers logging at the same time on all points, is still the best way...but it's far more likely that we're leapfrogging, overlapping a lot for some pairs and less for others, with far fewer receivers than it would take to observe the entire network at once. TBC works great for that.
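As a side note on the independent-versus-trivial baseline bookkeeping: with r receivers logging simultaneously you get r(r-1)/2 possible vectors per session but only r-1 independent ones. A quick back-of-the-envelope calculation, with example receiver and session counts:

```python
# Back-of-the-envelope count of independent vs. trivial baselines.
# Receiver and session counts below are example numbers, not from a real project.
def baseline_counts(receivers: int, sessions: int) -> dict:
    possible_per_session = receivers * (receivers - 1) // 2  # every pair
    independent_per_session = receivers - 1                  # the rest are "trivial"
    return {
        "possible": possible_per_session * sessions,
        "independent": independent_per_session * sessions,
        "trivial": (possible_per_session - independent_per_session) * sessions,
    }

# e.g. 4 receivers leapfrogging through 6 sessions
print(baseline_counts(receivers=4, sessions=6))
# -> {'possible': 36, 'independent': 18, 'trivial': 18}
```

That lines up with the earlier point about TBC processing every observation pair rather than working session by session.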
@rover83 That makes a lot of sense. I kind of did the same thing with GPSurvey, pre-TBC and pre-TGO. So TBC does allow you to disable vectors that you do not want, then. I follow a very similar approach to yours. I will test the CORS stations against themselves before setting foot on a job, looking for issues like a wrong antenna. If a non-NGS-owned station gets a new antenna today, it doesn't always get updated at the NGS level right away. I learned this the hard way when adjusting a network for an RTN in Georgia. I had pulled a few CORS in Florida, and it literally took a trip of a few days before I found the two culprits that had choke-ring antennas instead of the Zephyr Geodetic listed at the time; they had been changed one week prior. Then I would just flow from a few CORS to different points in different quadrants of my project area, which was the state of Georgia, and a few landed in the middle, east to west, which made it nice. Unfortunately, we found that the software itself gave results just as good as the post-processing. This was before OPUS Projects. I used TGO, and also a beta version of TBC, but it was crashing so much it didn't go well. Then I used PAGES and ADJUST and compared everything. A few months ago I went through the OPUS Projects Manager course. Man, they make it easy. I have not done any static work recently, just been reviewing some in TBC, and on every project it seems they used every baseline vector that was produced in TBC.
I don't mind crossing vectors at all and have never had an issue. If it makes sense for redundancy, it should not matter. I think a lot of that aversion to crossing dates back to old software that could not solve a compass-rule adjustment if a traverse crossed itself.
I hope I did not miss anything, but I have crossed vectors with traverse and RTK observed-control-point methods on a few jobs because it just made sense for the redundancy I needed; I was using the network adjustment routine in TBC. I am a firm believer that most issues come from poor field procedures: if good procedures go in, good data is more than likely going to come out. Yes, sometimes we get a blunder or an issue. When I was in the service we did networks all over, but a lot of those derived from absolute positions, versus here in the States with CORS. We also were not concerned about projections until the end; all GPS was latitude, longitude, and ellipsoid heights, and no NAD83, all WGS84. After getting the GPS done, we would bring the traverse and triangulation data into the mix, and at the end the leveling data.
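Purely to illustrate why redundant or crossing observations never hurt a least-squares network adjustment, here is a toy one-dimensional example (invented heights and sigmas, numpy for the matrix work): every extra observation is just another row in the design matrix, and more rows only strengthen the solution.

```python
# Toy 1-D least-squares adjustment: three unknown heights B, C, D tied to a fixed
# benchmark A = 100.000. Every observed height difference, redundant or not,
# is simply one more row in the design matrix. All numbers are invented.
import numpy as np

# Observations: (from, to, observed dH in metres, sigma in metres)
obs = [
    ("A", "B", 1.504, 0.003),
    ("B", "C", 0.998, 0.003),
    ("C", "D", -0.502, 0.003),
    ("A", "C", 2.499, 0.004),   # redundant tie
    ("B", "D", 0.494, 0.004),   # redundant tie that "crosses" the A-C tie
]

fixed = {"A": 100.000}
unknowns = ["B", "C", "D"]
idx = {name: i for i, name in enumerate(unknowns)}

A = np.zeros((len(obs), len(unknowns)))
L = np.zeros(len(obs))
W = np.zeros(len(obs))

for k, (frm, to, dh, sigma) in enumerate(obs):
    L[k] = dh
    W[k] = 1.0 / sigma**2
    for name, sign in ((frm, -1.0), (to, +1.0)):
        if name in fixed:
            L[k] -= sign * fixed[name]   # move fixed heights to the observation side
        else:
            A[k, idx[name]] = sign

# Weighted normal equations: (A^T W A) x = A^T W L
N = A.T @ np.diag(W) @ A
x = np.linalg.solve(N, A.T @ np.diag(W) @ L)

for name in unknowns:
    print(f"{name}: {x[idx[name]]:.4f} m")
```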
Now I am trying to apply that knowledge to TBC and the rest as I learn it. On the East Coast, though, not much static work is done. Static on some of our jobs would be more productive in my opinion, but for some reason they can't get past the minimum of 15 minutes of fast static (or more, depending on baseline length and so on) versus a 3-minute RTK shot. I do have a 2,000-acre boundary coming up where I said let's use static for part of the area and then RTK and conventional for the rest, and I am going to do a time study to compare. There is a lot of driving around this site, when I could use a few crews for a day and have most corners and control done, then get the rest by other methods. I could be wrong, but three or four receivers burning and moving would cover some ground pretty quickly, I think.
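For that time study, a rough planning calculation along these lines might help frame the comparison up front; every number below (session lengths, travel times, point count) is a placeholder, not from the actual job.

```python
# Rough static-vs-RTK field-time comparison for planning purposes.
# All session lengths, drive/walk times, and point counts are placeholder values.
points = 40              # corners + control to observe
receivers = 4            # static receivers leapfrogging

static_session_min = 20  # fast-static occupation per point
static_move_min = 15     # average relocation time between setups
rtk_shot_min = 3         # RTK occupation per point
rtk_travel_min = 12      # average drive/walk between points for one RTK crew

# Static: receivers work in parallel, so rounds = ceil(points / receivers).
rounds = -(-points // receivers)
static_hours = rounds * (static_session_min + static_move_min) / 60

# RTK: one crew visits every point in sequence.
rtk_hours = points * (rtk_shot_min + rtk_travel_min) / 60

print(f"static, {receivers} receivers: ~{static_hours:.1f} elapsed hours "
      f"(assuming enough people to move every receiver each round)")
print(f"single RTK crew:              ~{rtk_hours:.1f} hours of shots and travel")
```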