
Certification or Licensing

151 Posts
34 Users
(@eapls2708)
Posts: 1862
Registered
 

Ric Moore, post: 436697, member: 731 wrote: I thought that too originally Evan, but when seeking clarification on that later, I found that if someone received zero because they didn't answer the questions, that result was treated like a "no show" and not considered in the scoring criteria. Now, the ones who actually answered some questions that still resulted in receiving a zero score (1-2 each admin), their answers were considered in the scoring process but only for the questions they actually answered. I can see Evan's point on affecting the overall weight of each question, but that was the point of the standard setting process (at that time) with a combination of licensees and testing experts making those decisions. Thankfully, we have evolved way beyond this process and it is only memories now.

I think we described the same thing. As I recall, if the examinee did nothing more than place their number on the booklet and made no other marks, it was a non-attempt and disregarded. If they made any mark in any answer area for any question on the problem that might possibly have been the start of an answer, it was counted as an attempt and went into consideration with the rest. You were probably clearer that these attempt/non-attempt determinations affected the exam on a problem-by-problem basis.

Example: Four problems (each with many associated questions) on the exam. Examinee #328 provides answers to all or most of the questions for problems 1 and 3, completely skips problem 2, and makes one half-hearted attempt to begin an answer to one question for problem 4. Answer booklet 328-2 is not scored (examinee #328 receives 0 for that portion) and it is not considered in the standard setting for that problem. Answer booklets 328-1 & 328-3 are fully scored and considered in standard setting (as makes sense). Answer booklet 328-4, even though it is quite clear there was no serious attempt to answer any portion of it, is (or was) considered an attempt nonetheless and went into the statistics affecting the scoring standards of that problem.

However, since the scores for each problem are combined in the end rather than reported individually as pass/fail, those near-attempts do affect the overall cut score.
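To make the attempt/non-attempt rule above concrete, here is a minimal sketch in Python. The booklet data, field names, and messages are all hypothetical; this only illustrates the classification logic described in these posts, not the Board's actual scoring process.

# Hypothetical illustration of the per-problem attempt/non-attempt rule.
booklets = {
    "328-1": {"questions_answered": 11, "total_questions": 12},
    "328-2": {"questions_answered": 0,  "total_questions": 12},  # skipped entirely
    "328-3": {"questions_answered": 12, "total_questions": 12},
    "328-4": {"questions_answered": 1,  "total_questions": 12},  # one half-hearted start
}

for booklet_id, b in booklets.items():
    # Any mark in any answer area counts as an attempt; a blank booklet does not.
    attempted = b["questions_answered"] > 0
    if attempted:
        print(f"{booklet_id}: scored and included in standard setting")
    else:
        print(f"{booklet_id}: score of 0 recorded, excluded from standard setting")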

Ric Moore, post: 436697, member: 731 wrote: Maybe I failed to explain it properly in a previous post to this thread, but that's what I was leading to when I asked what is the professional society doing about this. The licensing board doesn't force people to submit applications. When they do decide to submit an application is based on any number of factors which is their prerogative, not the licensing board's. What I'm asking is what is the professional society, the professional community, the individual licensees in responsible charge, etc. doing to prepare these people for becoming licensed? Isn't there a responsibility there? That's what I meant when I said "growing the profession".

As to what the CLSA is doing, I described a lot of that, which is quite a bit. There is always room for improvement, but as a professional society, the CLSA is not slacking. I'm sure many other state societies are likewise making good efforts in this regard. I started another thread on this topic hoping others would chime in about what's going on in their states, but there wasn't much follow-up on it by others.

The problem though, as we've discussed, is licensees signing for, and assessing as qualified, candidates whom they know, or should know, are nowhere near ready. That's a matter of personal integrity, and there is even less that a professional society can do about it than the Board can (given its authority through the rulemaking and/or legislative processes). You had some good ideas about it that could put some accountability back on the reference licensees of those candidates who demonstrate that they were clearly not ready by failing the exam several times. I think with some collaboration between BPELSG, CLSA, and the other licensee societies, that idea can be developed into a workable solution.

Ric Moore, post: 436697, member: 731 wrote: Regarding the "CA has some of the lowest experience and education standards in the nation" statement, Evan is correct that I have stated that several times as well as discussed this with him. However, the more I've learned about the requirements imposed across the nation, the more I've learned to appreciate CA's requirements. CA is the only state that I am aware of, that requires surveyor applicants to clearly demonstrate 12 months of responsible field training and 12 months of responsible office training. Nearly every other state shares the common requirement of a 4 year ABET degree and 4 years of experience.

Unless the states I've previously lived in, and whose licensing requirements I looked into, have since changed their criteria, I don't think that's true. Each called for "progressive" or "responsible" experience. In those states with a degree requirement, some don't count experience prior to attaining the degree. It was explained to me that for most, the pre-education experience was not likely to be in a responsible position, so the "responsible" part of the experience requirement, if not expressly stated, is implied by that acceptance criterion. Some required certain percentages of experience in certain areas of practice.

I've not looked at them all, but of those states that I have looked into in the past, most required at least 8 years of experience or combined education and experience. Just prior to going back to school to complete my degree (got my 4-yr degree on the 14-yr plan), I was living in a state that required significantly more experience than CA, and required the vast majority of it to be boundary related. At the time, I had about 10 years of experience, a little more than half of it construction staking. Only a small amount of the construction experience would have counted as general experience; the rest wouldn't count. To qualify, I either needed to complete my degree and/or get 2 to 3 more years of boundary experience. That was also the only state I recall where the requirements were such that you pretty much had to qualify for the LS in order to sit for the LSIT. Upon passing the LSIT, you could schedule to sit for the next LS exam.

When I moved to CA, I was very surprised at the requirements. Where I would have been considered barely qualified in the state I had previously lived in, I had roughly double the education and experience necessary upon moving to CA. CA is 6 times the size of that other state, but typically had 10 to 15 times as many examinees annually. Where CA's LS pass rate was typically between 10% and 20%, it was between 50% and 80% in that other state.

Ric Moore, post: 436699, member: 731 wrote: I'm one that does not have a surveying degree. Do I believe that individuals with a 4 year degree have knowledge that I don't? Absolutely. Do I also believe that after 4 years of surveying experience I had knowledge that an individual with a 4 year degree didn't? Absolutely. It's just different knowledge.

Education is designed for breadth of knowledge.

Experience should give depth of knowledge.

 
Posted : July 13, 2017 7:00 am
(@warren-smith)
Posts: 830
Registered
 

eapls2708, post: 436845, member: 589 wrote: Education is designed for breadth of knowledge.

Experience should give depth of knowledge.

Evan - I like that!

 
Posted : July 13, 2017 7:09 am
(@clearcut)
Posts: 937
Registered
 

eapls2708, post: 436845, member: 589 wrote:

Education is designed for breadth of knowledge.

Experience should give depth of knowledge.

Just to be argumentative, I would have to say that I received a deeper understanding of many profession-related matters from formal and self-education efforts than I could have possibly obtained by job experience alone.

 
Posted : July 13, 2017 7:24 am
(@thebionicman)
Posts: 4438
Customer
 

Many of my past jobs and volunteer efforts involved teaching. The idea that one form of learning can be declared superior in every case is beyond wrong. Every person has unique traits and will respond differently to the same experiences.
While I understand the need for objective policy, it is a plain fact that one size does not fit all. I took the national and state-specific exams in one sitting. At the time I had 4 semester hours of formal education and 21 years of experience. I was one of 7 examinees and the only one without a degree. Nobody else passed all 3 tests, and my scores were all top of the pile. That doesn't make me better than the other guys, but it shows that experience works for me. I do not support policies that say nobody else is in my special club. I can be a jerk, but I'm generally not that arrogant. Generally...

 
Posted : July 13, 2017 8:18 am
(@eapls2708)
Posts: 1862
Registered
 

Gene Kooper, post: 436769, member: 9850 wrote: Well, no problem with agreeing to disagree. I don't think my position is horrible and it should definitely be brought up should a state decide to implement your suggestion.

I don't agree with your position, but I recognize it as well thought out and well articulated. There wouldn't be much of a discussion if we all agreed.

Gene Kooper, post: 436769, member: 9850 wrote: When I was involved in the continuing professional competency rule making, among the suggestions was something tailored after medical doctors. Architects could obtain specialty certification to show continued competency; like a doctor with a board certification in thoracic surgery. I don't see that being practical for surveyors, but it was an option discussed on how best to demonstrate continued competency.

I would agree with you for those jurisdictions with a narrow definition of the practice of surveying, such as those that limit it to boundary, topo, and little or nothing else. Some states, including CA, have a very broad definition such that it isn't practically possible for one to be fully competent in all areas of practice under the definition.

In the area of practice I've chosen to focus on, I see the harmful results when people who think they are qualified by virtue of being licensed incompetently identify boundaries, often providing the basis for needless strife and litigation between neighbors. Those who focus in other areas most likely have similar observations.

For those states with broader definitions of the practice of surveying (which I personally favor), I believe that the time for specialized certification of advanced knowledge is here, and perhaps has been here for quite some time.

Gene Kooper, post: 436769, member: 9850 wrote: I see no problem with a professional informally following the continuing professional competency guidelines in their quest to improve and/or add expertise to their practice. ...

Although I suspect that relatively few do, I would hope that all practicing professionals would do this, or at least agree with it in principle.

Gene Kooper, post: 436769, member: 9850 wrote: I also think it is important to recognize that the examinations are designed to segregate two groups of people, those that are minimally competent to become licensed and those that are not. I'm no psychometrician, but before it is used for a "higher" purpose there should be some analysis on whether it can discern a professional's strengths and weaknesses. If so, the examination should be given to active licensees after being blessed by a host of psychometricians. Active licensees would be required to take the test every 5 to 10 years. They would not be required to pass the exam and their scores would not be used to establish the cutoff score. They would receive a confidential diagnostic that could be a substitute for creating a list of weaknesses and deficiencies in their practice. I would hope that there would be statutory guarantees that the diagnostic shall not be used in Board disciplinary actions and by an opposing party in civil litigation. Call it a means for a licensed professional to show continued competency without the threat of sanctions.

Now, I say this somewhat with my tongue in my cheek, but if the proponents of giving new licensees an exam diagnostic are correct, then it should have equal value to the old hand licensees. In all seriousness, if the object is to raise the profession then all should participate.

Tongue in cheek or not, the basic idea is interesting. My initial thought is that certification for specific areas of practice is more practical. Once one has been practicing for many years, some will remain a jack-of-all-trades kind of surveyor and some will have settled into working only in certain areas of practice. If a surveyor has specialized in construction, they probably also need to be at least minimally competent in topographic methods and, depending upon size and scope of the projects they get involved in, have anywhere from minimal to advanced competency in geodetic methods. But they may never deal directly with boundary determination. If they comply with the statute or code provision to practice only in their areas of competence by always subbing out boundary determination work, why should they be tested on that area of practice?

I expect that we can agree that very few, if any, surveyors are fully competent in all areas of survey practice (whether covered under a statutory definition or simply generally recognized as "surveying"). We may also agree (maybe not) that the possession of a license to survey does little to inform potential clients whether the surveyor they are considering for their project is fully competent in the areas of practice required to complete the project. We apparently don't agree that if a surveyor had both a general surveying license and certificates in boundary, topographic methods, and construction, it tells that potential client that this surveyor is most likely more qualified to see the project all the way through without creating problems than another surveyor who does not have a certification for a critical portion of the project. Or the client may be informed that he should hire different surveyors for different portions of the project. In my view, that would be useful toward the goal of protecting the public. You may or may not agree.

Gene Kooper, post: 436769, member: 9850 wrote: My comment starting with "I don't see" is aimed at whether the existing exams are capable of discerning strengths and weaknesses of a minimally competent land surveyor. I have not seen any data that supports the notion that exam diagnostics have any value other than to discriminate those that are minimally competent from those that are not. If someone has some data, I'd appreciate it being shared here. Not to dismiss others' ideas that the exam diagnostic may help and cannot hurt a new licensee, IMO that reasoning comes up short.

If the exam is ineffective to discern strengths and weaknesses, then there is no point in administering it at all. All exams are for the purpose of discerning strengths and weaknesses. Most are for the purpose of determining whether the examinee has sufficient strengths for some particular purpose, whether it's to surgically treat human beings, survey their lands, or to pass from the 4th grade to the 5th.

If the exam is incapable of being used to discern strengths and weaknesses, then the authority to survey independently should be granted automatically based on a candidate's experience and abilities, as attested by a certain number of licensed surveyors - completely on the honor system. There certainly would be no reason to provide a diagnostic report to a failed examinee. Not only is there no public protection aspect to providing an unlicensed person with such a report, but you assert that it would be utterly meaningless as the exam is inadequate to measure the information it purports to provide.

Although I find your points to be well considered and well articulated, IMO, it's your reasoning that comes up short. Perhaps we will need to agree to disagree on this.

Mike Marks, post: 436781, member: 1108 wrote: May not be so now but when involved I got a few letters asking if I'd retake the test 5-15+- years later; apparently with no consequences and I can't remember whether it was simultaneously with the applicants or a few months prior to "fine tune" the test. Through the grapevine I heard the "psychometrician" said the PLS's scored an 85%+- pass rate, but it was BS because only the top tier PLS's volunteered, not the jacklegs (called "selection bias"). So they're already doing that (or were).

California did, and likely still does, this. The idea is to see how a group presumed to be representative of the "just competent" population (people judged at least minimally competent fairly recently, but who haven't been practicing so long as to have developed a lot of advanced knowledge) performs on an exam designed to measure minimal competence. Theoretically, this group should score a few points better overall than the "just competent" examinees who pass within just a few points of the cut score. If you're aiming for a cut score around 70%, the group of recent licensees should probably average 75% to 80%.
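As a back-of-the-envelope illustration of that sanity check, the idea is simply to compare the pilot group's average against the band you'd expect for the proposed cut score. The scores below are invented for illustration, not BPELSG data.

# Hypothetical pilot scores from recently licensed volunteers (percent correct).
recent_licensee_scores = [74, 81, 78, 72, 83, 77, 79, 76, 80, 75]

proposed_cut_score = 70
pilot_mean = sum(recent_licensee_scores) / len(recent_licensee_scores)

print(f"proposed cut score: {proposed_cut_score}%")
print(f"pilot group mean:   {pilot_mean:.1f}%")

# Expect the recent-licensee group to land roughly 5 to 10 points above the cut score.
if proposed_cut_score + 5 <= pilot_mean <= proposed_cut_score + 10:
    print("pilot mean is in the expected band; cut score looks plausible")
else:
    print("pilot mean is outside the expected band; revisit question wording or difficulty")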

That exam session was used not so much to adjust the difficulty or complexity of questions, although it could be used for that purpose, but more often to adjust the wording of questions to make them more understandable or to better focus them toward a principle or fact you want the examinees to recognize. I never participated as part of that test group, but later, when I was involved in exam development, I thought it was a very useful step in the exam development process. It's not easy to write questions that effectively eliminate or limit room for unintended assumptions, or to avoid overlooking the fact that certain assumptions might be required to answer the question. This step helped to identify those issues through the perspective of someone taking the exam. Had I understood how that process was used 10 years earlier, I likely would have volunteered to be one of the guinea pigs.

 
Posted : July 13, 2017 8:29 am
(@eapls2708)
Posts: 1862
Registered
 

Mike Marks, post: 436781, member: 1108 wrote: That is exactly the test purpose. The applicants that got a 69% vs. the 71%, one passed, one failed, but the latter is now a PLS. Which one will grow into a competent surveyor? What other result can be implemented in a testing protocol with a cut score? Contrast this with the McDonald's "fry room" test, where only a score of 100% is passing. Similarly, a Navy jet pilot's testing regimen is 100% correct answers before live flat-top training commences. Are you suggesting a testing regimen that guarantees 100% competency can be created for Land Surveyors? Impossible given the complexity of the profession.

Quite right. The difference with the fry and carrier landing tests is that they are testing to ensure that the examinee is capable of following a specified protocol or process to safely achieve consistent results. An improper step at any point of the set process might result in undercooked/overcooked fries and oil burns on the person standing close to the vat, or several million or billion dollars of damage to a lot of very expensive high-tech equipment along with killing or maiming the flight crew and many members of the deck crew. These tests are about memorizing and following a set process, not about exercising judgment about what process should be used and how it should be used. In that regard, 100% is a reasonable expectation, and the process is such that if the examinee improperly performs part of it, that mistake or act of incompetence is often immediately identifiable to both examiner and examinee.

Since a professional practice exam tests many disparate abilities and areas of knowledge, to be effective as a measure of competence, it had also better be effective as an indicator of areas of strengths and weaknesses. I think we're saying the same thing.

 
Posted : July 13, 2017 8:30 am
(@eapls2708)
Posts: 1862
Registered
 

clearcut, post: 436853, member: 297 wrote: Just to be argumentative, I would have to say that I received a deeper understanding of many profession-related matters from formal and self-education efforts than I could have possibly obtained by job experience alone.

Obviously that's true. If education provides breadth, there will be many areas where you learned something that you would not have had the opportunity to learn by experience alone. But the education likely provided only a good base of knowledge upon which to develop deeper understanding and expertise. It is very unlikely that you became expert in any area of practice due to formal education alone. And if you think that you were expert in any area by virtue of education alone, it's more likely that you're fooling yourself than it is that you actually were expert upon graduation.

Self-education is outside of the formal education requirement and, from a licensing standpoint, is part of experience.

 
Posted : July 13, 2017 8:36 am
(@clearcut)
Posts: 937
Registered
 

eapls2708, post: 436878, member: 589 wrote: But the education likely provided only a good base of knowledge upon which to develop deeper understanding and expertise.

Would be curious if your understanding of depth of knowledge borne from an educational background would be different if you had attended CSU instead?

:p

 
Posted : July 13, 2017 11:16 am
(@edward-reading)
Posts: 559
Registered
 

clearcut, post: 436900, member: 297 wrote: Would be curious if your understanding of depth of knowledge borne from an educational background would be different if you had attended CSU instead?

:p

Obviously, his depth of knowledge would be considerably less.
Beak 'em Owls!

 
Posted : July 13, 2017 11:39 am
(@eapls2708)
Posts: 1862
Registered
 

You got that right Ed!

Jeff, are you a Fresnoid?

 
Posted : July 13, 2017 12:06 pm
(@gene-kooper)
Posts: 1318
Registered
 

Gene Kooper, post: 436769, member: 9850 wrote: Well, no problem with agreeing to disagree. I don't think my position is horrible and it should definitely be brought up should a state decide to implement your suggestion.

eapls2708, post: 436876, member: 589 wrote: I don't agree with your position, but I recognize it as well thought out and well articulated. There wouldn't be much of a discussion if we all agreed.

Oh, I think we agree quite a bit, Evan. We definitely have different perspectives that are most likely created by our differing educations and experiences.

eapls2708, post: 436876, member: 589 wrote: In the area of practice I've chosen to focus on, I see the harmful results when people who think they are qualified by virtue of being licensed incompetently identify boundaries, often providing the basis for needless strife and litigation between neighbors. Those who focus in other areas most likely have similar observations.

I see the same problem. My concentration is on mineral surveys and GPS control surveys. I cringe when I chat with peers whose usual practice is urban surveying and they tell me that they really enjoyed doing a mineral survey, because it was a nice simple rectangle. What they enjoyed was getting up in the mountains. I also know some old hands who finally decided to make some "easy" money with used GPS gear. Their understanding encompasses no more than the basics of being able to set up and turn on the black box, turn it off, download the data and after a few failed attempts get the data sent to NGS for an OPUS solution. They may have shiny new geodetic coordinates but no more.

eapls2708, post: 436876, member: 589 wrote: For those states with broader definitions of the practice of surveying (which I personally favor), I believe that the time for specialized certification of advanced knowledge is here, and perhaps has been here for quite some time.

I'd be interested in any ideas you might have to implement a certification program. As an example from the BPELSG Board, Professional Geologists can become Certified Engineering Geologists and/or Certified Hydrogeologists. Perhaps some similar certifications could be developed for the licensed California PLS. For example, certifications in rectangular PLSS surveys, riparian surveys, mineral surveys and geodetic surveys. As for voluntary certifications, CFedS certification is an option for demonstrating proficiency in PLSS surveying. In the BLM's introductory video on the 2009 Manual, Don Buhler stated that CFedS is a "certification of knowledge."

eapls2708, post: 436876, member: 589 wrote: We apparently don't agree that if a surveyor had both a general surveying license and certificates in boundary, topographic methods, and construction, it tells that potential client that this surveyor is most likely more qualified to see the project all the way through without creating problems than another surveyor who does not have a certification for a critical portion of the project. Or the client may be informed that he should hire different surveyors for different portions of the project. In my view, that would be useful toward the goal of protecting the public. You may or may not agree.

Actually, I did not provide an opinion regarding additional certification and whether I believe it would better protect the public health, safety, and welfare. My answer would depend on the particulars of how to administer the additional certifications. Would it be reviewed by those who currently hold the certification? A certification exam? A combination of the two?

I did have an additional certification, but have allowed my CFedS to expire. I've taken the necessary continuing education courses to qualify for recertification, but I haven't remitted the renewal and penalty fees. The main reasons I haven't renewed are twofold. I don't need the additional letters after my name to practice in the areas I restrict my practice to and none of my clients have any idea of what a CFedS is. Almost all of my work comes from referrals by other surveyors. I guess that could be regarded as an informal certification. 🙂

Gene Kooper, post: 436769, member: 9850 wrote: My comment starting with "I don't see" is aimed at whether the existing exams are capable of discerning strengths and weaknesses of a minimally competent land surveyor. I have not seen any data that supports the notion that exam diagnostics have any value other than to discriminate those that are minimally competent from those that are not. If someone has some data, I'd appreciate it being shared here. Not to dismiss others' ideas that the exam diagnostic may help and cannot hurt a new licensee, IMO that reasoning comes up short.

eapls2708, post: 436876, member: 589 wrote: If the exam is ineffective to discern strengths and weaknesses, then there is no point in administering it at all. All exams are for the purpose of discerning strengths and weaknesses. Most are for the purpose of determining whether the examinee has sufficient strengths for some particular purpose, whether it's to surgically treat human beings, survey their lands, or to pass from the 4th grade to the 5th.

If the exam is incapable of being used to discern strengths and weaknesses, then the authority to survey independently should be granted automatically based on a candidate's experience and abilities, as attested by a certain number of licensed surveyors - completely on the honor system. There certainly would be no reason to provide a diagnostic report to a failed examinee. Not only is there no public protection aspect to providing an unlicensed person with such a report, but you assert that it would be utterly meaningless as the exam is inadequate to measure the information it purports to provide.

Although I find your points to be well considered and well articulated, IMO, it's your reasoning that comes up short. Perhaps we will need to agree to disagree on this.

This is where I think our different perspectives are the reason for any perceived disagreement. My background is that I had nearly 7 years of apprenticeship sprinkled with the ICS correspondence courses. I then went back to school and obtained degrees in geological engineering and hydrogeology. During my graduate program I took a course in geologic data analysis. It dealt mainly with multivariate statistical methods to evaluate geologic data. Geologists use least squares not to come up with unbiased error estimates of our measurements, but to segregate the anomalies (what we are looking for) from the background noise.

One of those methods is called discriminant analysis. Not to go into the weeds, but it attempts to create a statistical algorithm that separates the data into two groups. For example, say I have several stream sediment samples that I sent to a lab to get concentrations of several heavy metals. My purpose is to use the stream sediment samples as a reconnaissance tool to screen mineralized areas from barren areas. In other words, I want to look for the unique characteristics that indicate the stream sediment was eroded from a nearby ore body. This is similar to determining a "cut score" for discriminating between the incompetent and minimally competent. This is what I see as being equivalent to the licensing exam. Its designed purpose is to place each examinee in one of the two groups. Once I have isolated the mineralized zones, I will use other tools to evaluate ore grades from good to better to best to bestest!

I am not saying that the licensing exam is incapable of discerning a licensee's strengths and weaknesses. What I am saying is that I have not seen any analysis that demonstrates it is capable of making those distinctions.
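For anyone who wants to see the idea in code, here is a minimal sketch of that two-group discriminant in Python, using scikit-learn's LinearDiscriminantAnalysis on made-up heavy-metal concentrations. The numbers, the three-metal setup, and the new reconnaissance sample are purely illustrative of the cut-score analogy, not a real exploration workflow.

# Illustrative only: synthetic stream-sediment geochemistry, not real survey data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(42)

# Synthetic training data: columns are heavy-metal concentrations (ppm) in stream sediment.
barren      = rng.normal(loc=[20.0, 60.0, 15.0], scale=[5.0, 15.0, 4.0], size=(40, 3))
mineralized = rng.normal(loc=[45.0, 120.0, 35.0], scale=[10.0, 25.0, 8.0], size=(40, 3))

X = np.vstack([barren, mineralized])
y = np.array([0] * 40 + [1] * 40)   # 0 = barren, 1 = mineralized

# The discriminant is a single rule that splits samples into two groups,
# much like a cut score splits examinees into not-yet-competent / minimally competent.
lda = LinearDiscriminantAnalysis().fit(X, y)

# Classify a new reconnaissance sample and report how strongly it falls in each group.
new_sample = np.array([[38.0, 105.0, 30.0]])
label = "mineralized" if lda.predict(new_sample)[0] == 1 else "barren"
print("predicted group:", label)
print("group probabilities (barren, mineralized):", lda.predict_proba(new_sample).round(3))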

IIRC you went to college before gaining experience in surveying (sorry if I misstated your path). I know that you have experience with the California exam. I have no experience in creating exam questions or in administering exams. My point of view is based on the stated objective of the exam. Although we have different experiences and education that have shaped our perspectives, I know we both appreciate the value of education in our profession, and both of us seek to raise the professional bar (so to speak). Plus, we both seem to enjoy giving presentations and interacting with our peers. Conference presentations are not solely the presentation of information; hopefully they are an exchange of ideas with our peers.

If we continue with the discussion, don't be surprised if I employ the geologist's friend, colored pencils. It is approaching the point where I'm going to start using color to distinguish the levels of replies. Cheers!

 
Posted : July 13, 2017 2:31 pm