Data Management
Quote from party-chef on October 12, 2024, 4:09 am

From my experience, poor data management is the root cause of the majority of blunders in construction and surveying.
I would like to improve my understanding of best practices and procedures in how to build and manage data sets, including design, field and correspondence.
Currently working with Trimble Access, Trimble Sync, SharePoint, and Civil 3D, but I am really most interested in general rules and advice.
What is your approach and why?
Job numbers, field book indexing, file structure, layer management, point group management, spatial systems like GIS or Google Maps, extended data in properties, photos with chalkboards in frame like Kent does, etc. What have you seen work and what have you seen fail?
How do you build a system that others, both present and future, will be able to follow, especially across multiple crews, techs, managers, and years?
I imagine that municipal agencies have some real insight, especially regarding standards. Anyone have a recommended PDF that breaks it down a little?
Drop some knowledge on me.
When I have worked in operations with good systems in place, they just kind of chugged along over my head, with me doing my part to work within the defined path (coding, documenting, and filing) but without really seeing the whole system. I now want to know how to define the path.
Quote from OleManRiver on October 12, 2024, 7:37 am

I think you have a great point, as this is a tremendous issue for sure. I think you have a great handle on what I see happening all the time. Back in the 90s we had a system that flowed from office to field and field to office. Oftentimes this was not even 100% digital, the way it is today.
We kept, say, a copy of a subdivision plat where we would draw in or plot with colored pencil, at the very least, where control was on a site. As a crew chief, one of the things I had to do was keep track of what control was destroyed and what new control was added over the life of a project, so that any crew and office personnel would be aware and could note this.
Trimble Access in itself is very good at this. The concept of a project folder holding many files (CSV, DWG, DXF, job files, etc.) allows for it. I always had a two-way conversation with my crews about control and stakeout points. I would have an updated control CSV they used, and as control was destroyed or added they received an updated version. The crews would take the older job files and older control files (each CSV named with year-month-day and brief metadata) and place them in a subfolder within the project folder, so the newer files were easily seen and could be used.

I kept different CSV files for like items for stakeout. Say, curb and gutter was its own CSV, structures had their own, waterline and other utilities had their own. That way the crews could link to what they needed at any time. Again, each name began with year-month-day along with a description of what it was. As we all know, designs change, so updated files were used. Once all data had been backed up to the server and confirmed, the older files were deleted from the data collector. This did a few things: it kept the project folder in TA clean and organized, and it kept the office management side flowing as well. In TBC, on the office side, I had a layer that all the destroyed control went to. Also, for any comps that changed per design, I never liked deleting anything, so to have it if needed, it simply moved to a layer for its group.
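That archiving step, moving every superseded dated CSV into a subfolder so only the current files are visible, can be sketched in a few lines. This is a hedged illustration, not any Trimble workflow: it assumes file names begin with a `YYYYMMDD_` date prefix (e.g. `20241012_curb.csv`), groups files by what follows the date, and keeps only the newest of each group in the project folder.

```python
# Sketch: sweep superseded dated CSVs into a "superseded" subfolder, keeping
# only the newest file of each kind visible. Assumes a YYYYMMDD_ name prefix;
# both that convention and the folder name are illustrative assumptions.
from pathlib import Path
from collections import defaultdict
import re
import shutil

DATE_RE = re.compile(r"^(\d{8})_")

def archive_superseded(project_dir: str, archive_name: str = "superseded") -> list[str]:
    """Move all but the newest dated CSV of each kind into a subfolder."""
    project = Path(project_dir)
    groups: dict[str, list[Path]] = defaultdict(list)
    for p in project.glob("*.csv"):
        m = DATE_RE.match(p.name)
        if m:
            # Group "20241001_curb.csv" and "20241012_curb.csv" together.
            groups[p.name[m.end():]].append(p)
    archive = project / archive_name
    moved = []
    for files in groups.values():
        files.sort(key=lambda p: p.name[:8])  # date prefix sorts chronologically
        for old in files[:-1]:                # keep the newest file in place
            archive.mkdir(exist_ok=True)
            shutil.move(str(old), str(archive / old.name))
            moved.append(old.name)
    return sorted(moved)
```

The grouping mirrors the per-item CSVs described above: an older curb file never hides a newer waterline file, and vice versa.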
Civil 3D doesn't like alphanumeric point names, but TBC doesn't mind at all. So all control and all property corners used an alphanumeric system. Topo was all point numbers. Stakeout design also used an alphanumeric system: if it was curb, the design might begin at point BC100. The as-staked point would carry this in the code, and an alpha prefix was assigned to the design name for the stored name; TA handles this fairly well. Same for structures and any other stakeout. All of these went to as-staked layers of their own on completion, so if upper management asked, I could at any time show exactly who did what, by what method, and the time of day and date something was staked. It also allowed me to track how many re-stakes there were to completion of anything on the project. I could say curb and gutter is at 75% completion, we have 15% of re-stake billing, etc.

In a regular CAD program like Civil 3D, you truly have to keep very tightly managed CSV or TXT files that relate back to who, which crew, and what date, or it can go south fast. When point numbers duplicated by two different field crews get renumbered in the CAD, that data traceability is often lost if no one is tracking it. Have a new hire come in or someone leave, and all of that is lost, short of someone going back to the original job files and tracking it down. But that takes time and money on a large site. There are many ways you can set this up, but the most important thing is that office and field are on the same page and maintain the discipline to always follow the system that's set in place, even when at times it is tough: it's Friday, you just want to get home, and you set one extra point and didn't use the proper naming convention.
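The design-name-to-as-staked-name link described above, plus the completion percentage it makes possible, can be shown in a small sketch. The `AS_` prefix, the record fields, and the function names are my own illustrations, not Trimble Access behavior; the point is that a stored as-staked name always resolves back to its design point.

```python
# Sketch of the design-point -> as-staked-point traceability described above.
# The "AS_" prefix and field names are illustrative assumptions.
from dataclasses import dataclass
from datetime import date

@dataclass
class StakedRecord:
    design_name: str   # e.g. "BC100" (beginning of curb design)
    staked_name: str   # name stored in the field job
    crew: str
    staked_on: date

def as_staked_name(design_name: str, prefix: str = "AS_") -> str:
    """Derive a stored name that always points back to the design point."""
    return f"{prefix}{design_name}"

def completion(design_names: set[str], staked: list[StakedRecord]) -> float:
    """Percent of design points staked at least once -- the '75% complete' figure."""
    done = {r.design_name for r in staked if r.design_name in design_names}
    return 100.0 * len(done) / len(design_names) if design_names else 0.0
```

Because each staked record carries crew and date, the "who staked what, and when" question becomes a filter over the records rather than an archaeology project.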
One of my biggest pet peeves is getting a CSV or TXT file with everything in it: check shots, adjusted and unadjusted points, and no idea which of the 10 points, all within a coke bottle cap of each other, is the one I am supposed to set up on for control. I have had this happen many times, and it takes time to figure out just what is good and what is bad.
Quote from party-chef on October 13, 2024, 4:09 am

Thanks for chiming in, yes, a tremendous issue indeed.
In terms of looking backwards, for sure I think that the days of old often had systems in place that were easier to follow. I started surveying at a shop that had a library of jobs and field books going back to maybe the 1960s, and I would occasionally open a field book and be able to at least start to understand the project pretty quickly.
To take a moment for nostalgia or history: I remember hearing stories from my party chief of when the calcs in the office were performed at a desk by the LS on a legal pad, with the math written out and then included in the folder. Then the PC would take that, go to the field with the same documents used by the LS, and start searching and generating their own set of measurements and calcs, all on paper, while checking the work of the LS.
Like a number of challenges that should be easier with technology, it is also possible to lose our way on the question of data management.
Your point about losing traceability with a CSV in a drawing is a good one, and I imagine fairly common. The relationship between raw data and output data is often not understood or appreciated by beginners, and by a startling number of veterans.
I have been shocked time and time again by office surveyors who had no oversight of the raw data and are only interested in the CSV (PNEZD). Users like yourself and rover who look at the raw in TBC, or norman who looks with Star*Net, have the right idea for sure.
The subject grows so quickly that it can be tricky to attempt to summarize.
Generally I think the first step is to generate calculated values, most typically in CAD but occasionally in survey software such as TBC. This is the first opportunity to strengthen or weaken the chain of data: how do we document the why and where of the data sources used to develop the design model?
How do we keep the model clean and concise while also rich enough to provide rationale?
Do you develop a narrative in your design model, do you annotate it or leave only the elements?
In addition to the data, there is the question of managing correspondence and documentation. I have seen that go pretty sideways also, once with an engineering company accusing the survey company I was working with of hacking their FTP, because we had design data they had included in error on a compact disc.
This is more or less my first time posting on here since the upgrade; it will take me a minute to get oriented to the new format, but I hope that we can get some traction with this thread, because I am sure there is a lot of great advice amongst the heads here.
Quote from Norman_Oklahoma on October 14, 2024, 9:58 am

I've worked in a lot of offices and seen a lot of "systems", but I've never seen any two quite the same. The main thing is to have a system that everyone who is using it understands, buys into, and follows. Which is rare. Best would be to have something in place at startup that everyone joining the organization will find on entry. By the time you have two employees you will have three different fixed ideas of what should happen. If you have a going concern you should introduce new things step by step, get buy in, then move on to the next step.
The details of the structure are not critical. That there be a workable structure and it be strictly enforced is.
Some of the elements of said structure should include:
- Job numbering. I tend to prefer a system that goes by year, such as 24001, 24002, etc. But just sequential numbering works, too.
- Some directory structure for keeping raw data, control adjustments, reduced coordinates, drawings, other work products, research materials, etc. organized. Plus, a place to keep as-submitted copies of work products. Also a place for copies of as-received work products - such as plan sets and associated CAD for staking jobs - from outside sources.
- Regimented file naming for drawings and other work products. I prefer real words for names like "topography.dwg, boundary.dwg, Record of Survey.dwg, Jones Legal Description.docx, etc." rather than cryptic combinations of numbers and letters. We haven't been restricted to eight characters for a generation.
- Raw data file naming. I use a system of numbering raw data files as jobnumber-date, such as a file for today's work might be named 24035-241014. Daily downloads of raw data files.
- Point numbering. I currently use 1-99 for control points, 100-399 for boundary points, and 10000 and up for topo. 400-499 are search calcs. 500-999 are for plat corner staking calcs. 1000-1999 are as-staked record points. The 2000, 3000, 4000, ... 9000 series points are for different types of staking calcs (i.e. building corners, curb and roadway, storm, sanitary, water, etc.).
- A workflow for examining the raw data and processing it into mappable coordinates. I, too, am appalled by the all too common practice of just downloading coordinates straight from the dc into a CAD drawing. There are certain specific circumstances where this is acceptable but many more where it isn't. CSV files are not raw data.
- Field to Finish. Uniform field coding of topographic features, template files, and automated linework drawing.
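The point-number ranges and the jobnumber-date raw file convention above lend themselves to a tiny lookup, so field and office software can agree on what a bare number means. This is just a sketch of the scheme as listed; the function names are my own.

```python
# Minimal lookup table for the point-number ranges listed above, plus the
# jobnumber-YYMMDD raw file name. Mirrors the post; names are illustrative.
from datetime import date

RANGES = [
    (1, 99, "control"),
    (100, 399, "boundary"),
    (400, 499, "search calc"),
    (500, 999, "plat corner staking calc"),
    (1000, 1999, "as-staked record"),
    (2000, 9999, "staking calc (by type)"),
    (10000, None, "topo"),
]

def point_category(num: int) -> str:
    """Map a point number to its role under the range scheme above."""
    for lo, hi, label in RANGES:
        if num >= lo and (hi is None or num <= hi):
            return label
    raise ValueError(f"point number {num} outside defined ranges")

def raw_file_name(job: str, d: date) -> str:
    """Raw data file name as jobnumber-YYMMDD, e.g. 24035-241014."""
    return f"{job}-{d:%y%m%d}"
```

Encoding the ranges once, instead of in everyone's head, is what lets the convention outlive any one employee.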
Stopping here. There is lots more.
Quote from OleManRiver on October 14, 2024, 6:26 pm

Sorry for getting back so late. But you have a lot of knowledgeable folks here to get ideas from. I just read @Norman_Oklahoma's reply, and he most definitely has some great pointers. I have been in hundreds of survey companies while doing training or working with them to transition from old technology to new. One of the things I learned early on was sort of what Norman was stating: get buy-in and commitment. I am working at a place now that has been around for many years, and we are implementing some changes and working daily together on brainstorming how and what to implement next. The man I am working for has true wisdom and keeps me in check. I have always had a knack for looking at how something is done, whether it's the way data is stored or how many folders and subfolders one has to go through to place a CSV, raw data, pics, DWGs, scopes, etc., and seeing where it could be improved and where the pitfalls are. But I have to take baby steps, because you have to bring everyone on board, and not everyone has the same skills or knows the big picture. A crew chief might not know or even be exposed to why they name files a certain way, but once they get to the office it becomes more clear. So try to understand how the CAD guy looks for his/her data, and the same for the LS who has to check, the technician, etc.
Now I say all of that, but the real magic is when and if surveyors get a handle on a geodatabase structure vs. subfolders. Once that is achieved, a lot of time is paid back in an easily searchable database that can be as simple as a touchscreen monitor, where you can visualize an area and know and see everything that relates or has related to that new project over time. There are many ways to interact with that sort of system. Many years ago I kept a simple Excel spreadsheet on my desktop where I hyperlinked all the projects I was working on. These were global, so survey projects that integrated with weather, command AORs, GIS, etc.; this allowed me to get to my folder structure quickly. Once we moved to a geodatabase, that Excel quick sheet slowly disappeared, as a smart young man set me up some icons, almost like shortcuts, on my desktop that did that and more, and also gave me some basic search functionality, though once in, the geodatabase took over. This doesn't change the naming formats needing to be unique, but it gives more flexibility: say in surveying you have a project number, and the DWG, the CSV, and the reports can all be named identically, since the extensions make them different.
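The "searchable by area" idea can be sketched with nothing more than SQLite: index each project with its folder path and a bounding box, and "what have we done near here?" becomes a query instead of a folder crawl. A real geodatabase (Esri file geodatabase, PostGIS, etc.) does far more; the table and column names below are my own illustration.

```python
# Hedged sketch of a searchable project index: a SQLite table of projects
# with bounding boxes, queried by a point of interest. Schema is my own.
import sqlite3

def open_index(path: str = ":memory:") -> sqlite3.Connection:
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS projects (
        job TEXT PRIMARY KEY, name TEXT, folder TEXT,
        min_e REAL, min_n REAL, max_e REAL, max_n REAL)""")
    return con

def add_project(con, job, name, folder, bbox):
    """bbox = (min_e, min_n, max_e, max_n) in the project coordinate system."""
    con.execute("INSERT OR REPLACE INTO projects VALUES (?,?,?,?,?,?,?)",
                (job, name, folder, *bbox))

def projects_near(con, e, n, buffer=100.0):
    """Every project whose bounding box (plus a buffer) contains the point."""
    cur = con.execute(
        "SELECT job, name, folder FROM projects "
        "WHERE min_e-? <= ? AND max_e+? >= ? AND min_n-? <= ? AND max_n+? >= ?",
        (buffer, e, buffer, e, buffer, n, buffer, n))
    return cur.fetchall()
```

Note this only works if everything is on a known datum and projection, which is exactly the point made above about taking advantage of being on a published coordinate system.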
Quote from party-chef on October 15, 2024, 2:33 am

Thanks for chiming in Norman, I had a feeling you would have some ideas.
How do you guys handle folder names that contain lots of chronological data?
I know that some will rename the folder at every addition of new data and others will keep the original dated name and some will not have a date in the name at all.
What do you mean by geo database? Is it a GIS product or a drawing with links to files or?...
The best advice I ever got for taking field notes was that the notes should tell the story of the day. I reflected on that nugget big time. What's the best advice for file management? From Norman I am getting that it should be workable and enforced. To me that points to simplicity, plus participation and follow-up from someone to ensure it is being used correctly.
Quote from john-hamilton on October 15, 2024, 9:30 am

A related question... point numbering. I set up my workflow to have unique point identifiers for all points except topo ground shots. I use the job number (i.e. 24035) plus one or more letters, for example 24035AAC. Each point can also have another name, like GCP01 or T001, usually assigned by the client. Right now I have a job going on where the client assigned names like "H24-11-087". While that is a unique identifier for them (DOT), it is an issue for a program like Star*Net, which uses "-" to separate point names. Yes, that can be changed in Star*Net to something else using the .SEP inline option. The other LSQ program that I use has point names in fixed-width fields, so it doesn't matter in that program.
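The jobnumber-plus-letters scheme is effectively a base-26 counter appended to the job number: 24035AAA, 24035AAB, ..., 24035AAZ, 24035ABA, and so on. A minimal sketch, assuming a three-letter width (the actual width in use may differ):

```python
# Sketch of the jobnumber + letters unique-ID scheme. The three-letter
# width is an assumption; the increment is plain base-26 over A-Z.
import string

def alpha_id(job: str, index: int, width: int = 3) -> str:
    """index 0 -> <job>AAA, 25 -> <job>AAZ, 26 -> <job>ABA, etc."""
    letters = []
    for _ in range(width):
        index, r = divmod(index, 26)
        letters.append(string.ascii_uppercase[r])
    if index:
        raise ValueError("index exceeds the chosen letter width")
    return job + "".join(reversed(letters))
```

Three letters give 17,576 unique IDs per job, which comfortably covers control and monument points while leaving topo shots to plain numbers, as described above.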
So, the question is, how many of you reuse the same point numbers on every job, like T001, T1, 101, etc.? I see it all the time driving around when I spot control points on the side of the road with high stakes, usually 101, 102, etc.
Quote from party-chef on October 15, 2024, 11:12 am

I have been surveying since 2001 and have never worked at a survey shop that did not reuse the point numbers on every job.
Generally there is some range designation like what Norman was saying, and occasionally some real handy rules or maybe an alpha suffix thrown in there.
I have heard of one operation where the point numbers were something like book number_page number_date_shot number that day...
I worked with a manager on multi-year projects, and his approach (except for control) was to have you start with the same number every day and then add some number to it on import. Whatever he was doing worked really well; over ten years I do not recall us ever losing data.
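That import-time renumbering can be sketched simply: crews start at the same base number each day, and the office adds a per-day block offset on import, so every point ends up unique and traceable to its day. The offset formula below is my guess at the shape of such a scheme, not the manager's actual method.

```python
# Rough sketch of offset-on-import renumbering. The block size and base
# are assumptions; the idea is collision-free, day-traceable numbering.
def import_offset(day_index: int, base: int = 10000, block: int = 1000) -> int:
    """Day 0 points land at 10000+, day 1 at 11000+, and so on."""
    return base + day_index * block

def renumber(points: dict[int, tuple], day_index: int) -> dict[int, tuple]:
    """Shift a day's field point numbers into that day's reserved block."""
    off = import_offset(day_index)
    return {num + off: data for num, data in points.items()}
```

With a 1000-point block per day, a point numbered 11047 immediately tells the office it was shot as point 47 on day 1, which is the traceability being described.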
Quote from MightyMoe on October 15, 2024, 11:14 am

For point numbers we use the DOT convention, since it flows well with our work. 1-200 is control, 200-400 are calculated points (they get deleted constantly as they become located or staked), 500-600 are ROW monuments, 600-1000 are property monuments. 1000-up are topo locations, which include fence, surfaces, power, gas, water, etc.
All the Field to Finish is controlled by the feature code. There is a book of codes that we use, it's not simple but it works.
As we bring in small jobs into a bigger file say for the city monumentation we renumber the points and leave a Job Number next to the areas to cross-reference.
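The DOT-style ranges above amount to a lookup by number band, which could be sketched like this. The boundaries are as stated in the post; which band an exact boundary value (e.g., 200 or 600) belongs to is an assumption, as is the handling of the unassigned 400-499 gap between calculated points and ROW monuments.

```python
def classify_point(n: int) -> str:
    """Map a point number to its category under the DOT-style ranges
    described above. Boundary ownership and the 400-499 gap handling
    are assumptions made for illustration.
    """
    if 1 <= n < 200:
        return "control"
    if 200 <= n < 400:
        return "calculated"
    if 500 <= n < 600:
        return "ROW monument"
    if 600 <= n < 1000:
        return "property monument"
    if n >= 1000:
        return "topo"
    return "unassigned"

# classify_point(150)  -> "control"
# classify_point(450)  -> "unassigned" (gap between 400 and 500)
# classify_point(2500) -> "topo"
```

A banded scheme like this is easy for crews to memorize, but note the trade-off the earlier posts raise: the same numbers recur on every job, so uniqueness only holds within one job file.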
Quote from OleManRiver on October 15, 2024, 5:46 pm
@john-hamilton I am so with you on point numbers being unique, especially in the control arena and for property corners. I did this and implemented it on a small scale for several projects. Unfortunately, CAD programs don't like alphanumeric point names. I also did this with comp points for staking: TBC001, with the 001 auto-incremented. So for Top Back of Curb, the description or feature code carried the offset, etc. When they stored that point, the as-staked number got a prefix or suffix added automatically, which always allowed a connection from the computed point to the point that was actually staked. For control I used the state, then county abbreviation, followed by a zone and an auto-incremented number. Property corners were similar, so if I came back to an adjoining lot, that number was still that number.
We live in a time where we can easily be on a known datum, first of all, and then on a known projected coordinate system, and I believe we should utilize and benefit from that. I am working on a site now where I am wading through txt files and looking to densify control, but I have to sort through all the point numbers, and there are way too many duplicates for sure.
The same goes for mapping projects: use attributes along with codes instead of just codes and text afterwards. I honestly try to follow the example of NGS, which has who knows how many marks over a couple hundred years, all uniquely identifiable, and on the mapping side, GIS with attributes. For example, iron rod found, or pipe, or concrete monument: one code, MON, is all that is needed, then an attribute of found or set; what it is (rebar, size 5/8"); condition (bent northerly). But that's how my mind works. Add a picture or photo: found the disc, stamping = picture, etc. I remember doing rubbings with a pencil, so pics are very nice nowadays.
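The one-code-plus-attributes idea above (a single MON code carrying found/set status, mark type, size, condition, and a photo) is essentially a small record structure. A minimal sketch, with field names that are illustrative assumptions rather than any particular data collector's schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonumentShot:
    """One field shot using a single feature code (MON) plus
    attributes, instead of many compound codes like IRF/IPF/CMF.
    All field names here are illustrative assumptions.
    """
    point_id: str
    code: str = "MON"
    status: str = "found"          # "found" or "set"
    mark_type: str = ""            # e.g. "rebar", "pipe", "conc mon"
    size: str = ""                 # e.g. '5/8"'
    condition: str = ""            # e.g. "bent northerly"
    photo: Optional[str] = None    # path to a photo of the disc/stamping

shot = MonumentShot("24035AAC", status="found", mark_type="rebar",
                    size='5/8"', condition="bent northerly",
                    photo="24035/photos/AAC_disc.jpg")
```

The payoff of modeling it this way is queryability: with one code and structured attributes, "all set 5/8-inch rebar" is a filter, not a text search across dozens of compound code variants.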