thebionicman, post: 436478, member: 8136 wrote: So far I have only seen one remotely valid argument against giving passing examinees a diagnostic report. The Board could easily deal with the 'superior' attitude.
I received my first license in 1994 from Oregon. The letter notifying me of my successful attempt included scores for both the National and state specific portions of the test. I did quite well and, as a joke, proceeded to post copies of the letter with dollar signs anywhere I thought they might be seen. Other than that, the extra points were a waste of time. I could have spent more time drinking at the strip club the night before (true story). I have never had anyone ask how well I did on the exam.
At the time I sat, you could review your results with the state. But you did have to drive to Salem.
I still can't get my head around this fluctuating passing grade everyone keeps talking about. Competence has nothing to do with how well everyone else did on the exam.
John Putnam, post: 436529, member: 1188 wrote: I could have spent more time drinking at the strip club
Words to live by
Here are the statistics for the California exams going back to 1998, when only 9 candidates out of 471 passed the PLS exam. Cutscores are included up through 2011, after which the exam was changed to the computer based multiple choice format. As you will see, the cutscore was most often less than 50% of the total possible points and the passage rate was under 30%, except in a few cases. The April 2008 exam was somewhat of a statistical anomaly with the cutscore of 220 out of 400 and a passage rate of almost 32%. Until 2012, the exam was only given once annually.
http://www.bpelsg.ca.gov/applicants/exam_statistics.shtml
If interested, old exams are posted here (without solutions) from 1980-1992. I passed the 1986 exam.
http://www.dot.ca.gov/hq/row/landsurveys/Study_material/LS_Exams/index.html
thebionicman, post: 436478, member: 8136 wrote: So far I have only seen one remotely valid argument against giving passing examinees a diagnostic report. The Board could easily deal with the 'superior' attitude. Make it confidential just like the exam. Use of the results in advertising or marketing gets penalized the same as exam subversion. Fixed.
The fact that some won't use it is a ridiculous argument. By that logic the entire Board and system of regulation should go away. There are, after all, folks who will never accept Board authority or work within the rules.
It boggles my mind that a simple tool with great potential meets with objection. Is it a panacea for all that is wrong with us? No. It is however a low cost, high potential tool.
Let me try and describe another, more ominous argument against giving passing examinees a diagnostic report. First, a little background about how I came to my position.
Many years ago, the state architects society (AIA Colorado) desired to implement a continuing education requirement for license renewal. The Colorado Dept. of Regulatory Agencies (DORA) opposed this. The state folks countered that they would only support a new concept called continuing professional competency. The architects agreed to this continuing education alternative as long as they were allowed to participate in the development of the rules for its implementation. Legislation was passed. Being the PLSC Executive Director at the time, I participated on the licensing board's committee tasked with promulgating rules for implementing a continuing professional competency requirement for license renewals of architects.
At the time, continuing professional competency was a novel concept implemented in the health care professions. Nurses in North Carolina were the first to adopt this alternative to continuing education. The committee met monthly for 18 months before the architects finally threw up their hands in disgust, hired a lobbyist and got legislation passed to rescind the continuing professional competency requirement and replace it with a classic continuing education requirement.
The main sticking point between the architects and DORA had to do with a requirement that each architect list the weaknesses and deficiencies in their practice. The Board had hired an expert with experience in implementing the program for nurses in NC. He was a psychometrician and got DORA to buy into this as a means of evaluating the "effectiveness" of the program. The state went so far as to say that each architect must not only write a self-evaluation listing their weaknesses and deficiencies, they would have to develop a continuing education program targeting their deficiencies and after completing the courses evaluate the effect on their practice. This self evaluation was to be formally entered onto a state computer. The architects went ballistic, esp. with the notion that the information would be stored on a state computer. Should an architect ever be the object of a lawsuit, it was more than likely that a smart attorney would be able to obtain the "confessional" during discovery and use the architect's own words against them in court!
I see something similar with providing new licensees with a diagnostic from the exam listing strengths and weaknesses. I took the NCEES exams for the LSIT, PS and EIT in 1984 and 1985. Back then, the Board included the exam score in addition to informing me that I passed. They no longer do that for a myriad of reasons. CFedS has a similar policy of not telling those that pass what their exam scores are. The only exception to that rule is that the first two CFedS numbers (1001 and 1002) were awarded to the person with the highest overall exam score and the person that scored 100% on one of the three exam modules. IMO the licensing exam is a crude tool to determine those regarded as being minimally competent to practice. Most states have a requirement that the registrant/licensee shall limit their practice to the areas in which they are competent.
I can see the diagnostic tool being used as a weapon against a licensee should they face a lawsuit. The exam's purpose isn't to delineate the areas that a licensee is competent to practice in; it is a determination that the licensee is minimally competent to practice in any area of land surveying. It is incumbent upon each registrant/licensee to extend their learning throughout their career, not the state licensing board or professional society. Being a simple country boy, I know the truth in the old adage that you can bring a horse to water, but you can't make it drink.
From what I saw in the CA review class I took, a lot of people were taking the exam that really were not ready. I was shocked at how many people had sat for the exam repeatedly. Someone there told me it was partly because CALTRANS paid more for a PLS no matter what their job description was. As I said before, the cut score should not be changed to increase the pass ratio; the candidates just need to be better prepared.
John Putnam, post: 436529, member: 1188 wrote:
...I still can't get my head around this fluctuating passing grade everyone keeps talking about. Competence has nothing to do with how well everyone else did on the exam.
Making exams with the same degree of difficulty as previous versions of the same exam is hard. An easier approach is to just shift the passing score. Suppose last year's exam had 100 questions and the passing score was 70%. The average candidate's score was 80%. Seventy-five questions were retained; last year's candidates scored 79% on the retained questions and this year's candidates scored 81% on the retained questions, so the candidates' abilities are about the same. But the overall average score this year is 63%, indicating that the new questions are quite hard. So the passing score should be adjusted downward because the new questions are harder than last year's questions.
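To put rough numbers on that (purely back-of-the-envelope; NCEES's actual equating procedure is more sophisticated than this), the adjustment might look something like this in Python:

def adjusted_cut_score(old_cut, old_avg, new_avg, old_anchor_avg, new_anchor_avg):
    # Difference in candidate ability, measured on the retained (anchor) questions
    ability_shift = new_anchor_avg - old_anchor_avg
    # Drop in the overall average beyond what the ability difference explains,
    # attributed to the new questions being harder
    difficulty_shift = (new_avg - old_avg) - ability_shift
    return old_cut + difficulty_shift

# Numbers from the example above: cut score 70%, overall averages 80% then 63%,
# retained-question averages 79% then 81%.
print(adjusted_cut_score(70, 80, 63, 79, 81))   # -> 51

So with those made-up numbers the passing score would drop from 70 to 51, even though the candidates themselves are no weaker.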
Gene Kooper, post: 436551, member: 9850 wrote: Let me try and describe another, more ominous argument against giving passing examinees a diagnostic report. First, a little background about how I came to my position. ...
Which is all a non-problem if it's confidential and tied to exam security.
Gene Kooper, post: 436551, member: 9850 wrote: Let me try and describe another, more ominous argument against giving passing examinees a diagnostic report. First, a little background about how I came to my position. ...
Interesting perspective Gene.
I could see how individual's diagnostic reports could be used/abused as part of civil litigation.
I also can see how it would be unlikely these individual reports would be upheld as being restricted from others' review, whether by subpoena or possibly even by a PRA request.
On a similar note, if the Board of Registration were to issue a license with an accompanying diagnostic report that showed the licensee had a low score in a particular subject area, would it be plausible that the Board would then face civil liability for not restricting the licensee from said low score subject area? The CYA law that states the licensee should only practice in the areas he/she is competent in may not go far enough to prevent the board from such exposure if they issue such a report denoting specific deficiencies without any accompanying specific restrictions.
thebionicman, post: 436573, member: 8136 wrote: Which is all a non-problem if it's confidential and tied to exam security.
All I can add is that the architects didn't believe that their "confessionals" would remain confidential and I agreed with their assessment. Even if the diagnostic remains confidential, that doesn't necessarily mean that the Board could not review the diagnostic should the licensee have a complaint filed against them.
I don't see a licensing exam as a measure of how competent the exam taker is. Licensing exams (at least in their current form) are intended to discriminate between those that are not competent and those that are minimally competent. I don't see them as serving a higher purpose. YMMV.
James Fleming, post: 436383, member: 136 wrote: I favor a degree requirement for a number of reasons, but that statement is so inherently illogical that it makes my brain hurt.
Individual A - four year degree and passed national exams.
Individual B - autodidact and passed national exams.
Without subjecting Individual B to the same battery of tests that Individual A took to graduate, there is no way to compare the two...nothing is "proved" at all; it's all confirmation bias and conjecture.
The person with the four year degree has proved they know enough to pass the degree program. The person who does not have the degree has not proved that. I made no value judgment on the person who does not have the degree. They may know more than the person with one, but they have not proved it to anyone. As I said before, the state boards could make an effort to determine this, but I doubt most states are willing to foot the bill. It would also make becoming a surveyor harder. People on here would be complaining about the arbitrary tests rather than the arbitrary degree. If we are to have licensing, we need a way to know if a person is competent. The degree requirement is not a perfect way to do that, but I haven't really seen anyone offer a practical alternative.
Jawja, post: 436319, member: 12766 wrote: I disagree. The judges exceeded their authority. Legislation called them professionals. But a group of 9 individuals ignored what 100+ people stated and chose to look outside the law and decide to ignore what the legislature called them. In my opinion if the legislation says you are a professional, then no court had a right to ignore that and decide on its own what defines profession.
If you really believe that someone with a 4 yr degree is automatically better than someone without one, then we are done discussing. Continue your conversation with your person with a four year degree from the University of Northern Virginia.
The legislature did not bestow upon land surveyors the classification of professional. They passed a very narrow "professional licensing" statute. The use of a narrowly defined term in one law does not automatically apply to all legal matters. A law pertaining to licensing does not apply to liability unless the law shows an intention to do that. The courts can, and did, consider the terms of a statute unrelated to the case at hand when trying to define a term the legislature did not define, but it would be very arbitrary to stop their inquiry there.
If the legislature disagreed with the court they could have easily passed a law solving the problem. Instead they seemed to agree with the court and legislated accordingly.
I am not sure where you got the idea that I think someone with a degree is automatically better than someone without one. I never said anything like that.
I have no idea what the University of Northern Virginia comment means.
Dave Karoly, post: 436518, member: 94 wrote:
I personally think one thing that would help is to remove Surveying education from the Engineering Schools.
This is crucial. Engineering Schools certainly can provide a service with geomatics engineering programs, but they don't seem to even understand what a land surveyor is.
Gene Kooper, post: 436551, member: 9850 wrote: ...The main sticking point between the architects and DORA had to do with a requirement that each architect list the weaknesses and deficiencies in their practice. The Board had hired an expert with experience in implementing the program for nurses in NC. He was a psychometrician and got DORA to buy into this as a means of evaluating the "effectiveness" of the program. The state went so far as to say that each architect must not only write a self-evaluation listing their weaknesses and deficiencies, they would have to develop a continuing education program targeting their deficiencies and after completing the courses evaluate the effect on their practice. This self evaluation was to be formally entered onto a state computer. The architects went ballistic, esp. with the notion that the information would be stored on a state computer. Should an architect ever be the object of a lawsuit, it was more than likely that a smart attorney would be able to obtain the "confessional" during discovery and use the architect's own words against them in court!
Wow, I would have a very difficult time agreeing to that concept, let alone being convinced to go to my board to talk them into it.
Gene Kooper, post: 436551, member: 9850 wrote: ...I see something similar with providing new licensees with a diagnostic from the exam listing strengths and weaknesses. I took the NCEES exams for the LSIT, PS and EIT in 1984 and 1985. Back then, the Board included the exam score in addition to informing me that I passed. They no longer do that for a myriad of reasons. CFedS has a similar policy of not telling those that pass what their exam scores are. The only exception to that rule is that the first two CFedS numbers (1001 and 1002) were awarded to the person with the highest overall exam score and the person that scored 100% on one of the three exam modules. IMO the licensing exam is a crude tool to determine those regarded as being minimally competent to practice. Most states have a requirement that the registrant/licensee shall limit their practice to the areas in which they are competent.
I can see the diagnostic tool being used as a weapon against a licensee should they face a lawsuit. The exam's purpose isn't to delineate the areas that a licensee is competent to practice in; it is a determination that the licensee is minimally competent to practice in any area of land surveying. It is incumbent upon each registrant/licensee to extend their learning throughout their career, not the state licensing board or professional society. Being a simple country boy, I know the truth in the old adage that you can bring a horse to water, but you can't make it drink.
Very well said Gene
Discussions like this always seem to turn into a pissing match over who is the most intelligent. Among fellow surveyors, no less. Surveyors are their own worst enemy in a sense because of this, especially when 90% (that might be a bold assumption) of the folks determining someone's qualifications were mentored by someone who didn't have a college degree. Sure, the mentor might have had some formal education, but no one reaches even a supervisory role in this business without some education. For someone to strive to the peak of becoming licensed without holding a 4 year degree, only to be held back and looked down upon due to the lack of "formal education", is an abomination to the profession and runs counter to all of the principles of teaching a future generation.

Not to mention the fact that the arrogance of the profession is so profound that surveyors would like to divvy up the licenses into groups, so as to further segregate each other, as if one is better than the next and is more qualified to measure .01 or .001 or .00000000000000000000001. Yes, there are different facets that require different expertise, and to operate outside of your knowledge realm would be irresponsible, but it doesn't take a 4 year degree to learn that, and it doesn't take a 4 year degree to understand boundary law.
Surveyors, especially those titled "PLS, RLS, LLS, PSM, etc." or whatever your state requires, always seem to look down on others, and this may not go over well with the choir here (I do think of the folks here as professionals with more experience and knowledge, if only perhaps due to age). If your field guy has a legitimate complaint, it's typically dismissed because he or she "doesn't grasp the gravity of the equation" or is just bitchin' to bitch. In the same sense, the ones that have put up with the 4 year requirement and gave an arm and a leg to get it look at others as lazy individuals, figuring that if you really want it, it can be done. I have heard it time and time again: "I did it with 3 kids, a wife, and it was snowing. (It was miserable and I hated myself and I barely get paid enough most days to pay off my student debt, but that's beside the point)"
Land surveyors are not rocket scientists. If we were to survey the moon, we would, but NASA would get us there, and once there we would pin cushion the hell out of it.
aliquot, post: 436598, member: 2486 wrote: The person with the four year degree has proved they know enough to pass the degree program. The person who does not have the degree has not proved that. I made no value judgment on the person who does not have the degree. They may know more than the person with one, but they have not proved it to anyone. As I said before, the state boards could make an effort to determine this, but I doubt most states are willing to foot the bill. It would also make becoming a surveyor harder. People on here would be complaining about the arbitrary tests rather than the arbitrary degree. If we are to have licensing, we need a way to know if a person is competent. The degree requirement is not a perfect way to do that, but I haven't really seen anyone offer a practical alternative.
Agreed 100%; but that's not what you originally stated. You wrote that a person with a degree has proved they know more than a person without one. If you've run a five minute mile, and I've never run a timed mile, you've proven that you can run a mile in five minutes and I haven't. However you haven't proven that you can run a mile faster than I can.
One option I haven't seen mentioned. Since the FS is designed to test for a knowledge base equivalent to an accredited four year degree, eliminate the FS for holders of accredited degrees and expand it into a multi day comprehensive exam for those without one.
James Fleming, post: 436629, member: 136 wrote: Agreed 100%; but that's not what you originally stated. You wrote that a person with a degree has proved they know more than a person without one. If you've run a five minute mile, and I've never run a timed mile, you've proven that you can run a mile in five minutes and I haven't. However you haven't proven that you can run a mile faster than I can.
One option I haven't seen mentioned. Since the FS is designed to test for a knowledge base equivalent to an accredited four year degree, eliminate the FS for holders of accredited degrees and expand it into a multi day comprehensive exam for those without one.
Definitely don't agree with eliminating the FS. It isn't the problem here. The two-step path to licensing is only a problem for those without a degree who have learned everything on the job but don't possess that crucial piece of paper. That requirement is actually arbitrary and hypocritical, since we've got licensed surveyors, some with no degree themselves, who made that decision based on feeling and not factual data (in some jurisdictions).
Brian McEachern, post: 436614, member: 9299 wrote: Somewhat comical definition of our profession, but also a sad reality. Found on a page of fellow surveyors who probably share my sentiments toward a four year degree, and being looked down the nose upon.
That's from a T-shirt. I have seen the same one for a computer programmer, and I am sure other professions have the same one as well.
Gene Kooper, post: 436551, member: 9850 wrote: Many years ago, the state architects society (AIA Colorado) desired to implement a continuing education requirement for license renewal. ...
The main sticking point between the architects and DORA had to do with a requirement that each architect list the weaknesses and deficiencies in their practice. The Board had hired an expert with experience in implementing the program for nurses in NC. He was a psychometrician and got DORA to buy into this as a means of evaluating the "effectiveness" of the program. The state went so far as to say that each architect must not only write a self-evaluation listing their weaknesses and deficiencies, they would have to develop a continuing education program targeting their deficiencies and after completing the courses evaluate the effect on their practice. This self evaluation was to be formally entered onto a state computer. The architects went ballistic, esp. with the notion that the information would be stored on a state computer. Should an architect ever be the object of a lawsuit, it was more than likely that a smart attorney would be able to obtain the "confessional" during discovery and use the architect's own words against them in court!... I can see the diagnostic tool being used as a weapon against a licensee should they face a lawsuit.
This takes the idea and turns its purpose into something else entirely. The diagnostic report given to a failed examinee is for that examinee's personal use to focus their review for the next time they take the exam. It could have the same use for new examinees.
If the board's statutory purpose is to ensure or promote protection of the public, I fail to see the logic behind helping the incompetent get just over the bar so that they are then able to practice for the public without guidance while refusing to help the just barely competent identify areas to focus on to become more than just minimally competent.
Unless the State chose to enact some harebrained scheme where licensees had their strengths and weaknesses cataloged in a category of records which would be public, such as for meeting the CE requirements, the report, along with the scores, remains confidential info exempt from any State PRA or FOIA. If exam diagnostic reports could effectively be used in civil liability cases, then I'm sure we would have seen it happen for someone who passed on a subsequent attempt. Keep it part of the exam record, and the only part of that record which is open to public access is the pass/fail result.
Gene Kooper, post: 436551, member: 9850 wrote: I see something similar with providing new licensees with a diagnostic from the exam listing strengths and weaknesses. I took the NCEES exams for the LSIT, PS and EIT in 1984 and 1985. Back then, the Board included the exam score in addition to informing me that I passed. They no longer do that for a myriad of reasons.
As I said in previous posts, the percentage score as a measure of overall exam performance isn't all that useful if the exam tests more than one practice area. It does nothing to inform the examinee about areas of strength vs. areas of weakness.
I took a look at an example of a diagnostic report on the NCEES site. As I recall, it had performance ratings of "proficient", "marginal", and "not proficient" (or maybe "not competent", I'd have to look again). That's all the report would need to say. It would not need to say "you scored 86% as compared to a minimum requirement of 62%". It doesn't have to have the ego boosting aspect many in this thread want to claim others are looking for.
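To illustrate (the area names and ratings below are invented; I'm only going from memory of the NCEES sample report), the entire diagnostic could be as simple as:

# Hypothetical category-only diagnostic report: no raw scores, no cut scores.
diagnostic_report = {
    "Boundary and cadastral law": "Proficient",
    "Measurement and computations": "Marginal",
    "Public land survey system": "Proficient",
    "Business and professional practice": "Not proficient",
}

for area, rating in diagnostic_report.items():
    print(f"{area}: {rating}")

Nothing in that tells anyone how they ranked against other examinees; it only points at where to keep studying.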
Gene Kooper, post: 436551, member: 9850 wrote: CFedS has a similar policy of not telling those that pass what their exam scores are. The only exception to that rule is that the first two CFedS numbers (1001 and 1002) were awarded to the person with the highest overall exam score and the person that scored 100% on one of the three exam modules. IMO the licensing exam is a crude tool to determine those regarded as being minimally competent to practice. Most states have a requirement that the registrant/licensee shall limit their practice to the areas in which they are competent.
I agree with those who are opposed to any public class ranking information. That's not what I or anyone else in this thread has suggested. It's not really helpful to anyone but potentially detrimental not only to the "bottom" tier for those who passed, but to the profession as a whole.
Gene Kooper, post: 436551, member: 9850 wrote: The exam's purpose isn't to delineate the areas that a licensee is competent to practice in; it is a determination that the licensee is minimally competent to practice in any area of land surveying.
If it were a determination that a licensee is minimally competent to practice in any area of land surveying, then there would be a cut score for each practice area and the examinee would be required to meet or exceed that score in every area tested. That's not how the exam is designed. It is an exam to determine if the examinee can demonstrate minimal competence in enough areas to pass the exam as a whole. There were years that the cut score in CA was as low as 44.6%. By that score, a person may have passed yet not even scored half of the points for any given practice area.
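A toy example of the difference (the area breakdown and point values are invented; the 44.6% figure is the low CA cut score mentioned above):

# Invented numbers showing how an overall cut score differs from per-area cut scores.
scores = {                          # (points earned, points possible) per practice area
    "boundary law": (30, 100),
    "measurement/computations": (85, 100),
    "public land system": (65, 100),
    "professional practice": (50, 100),
}
overall_cut = 0.446                 # lowest CA cut score mentioned above
area_cut = 0.50                     # what a per-area minimum might look like

earned = sum(e for e, _ in scores.values())
possible = sum(p for _, p in scores.values())
print(f"overall {earned / possible:.1%}:", "pass" if earned / possible >= overall_cut else "fail")
for area, (e, p) in scores.items():
    print(f"  {area} {e / p:.0%}:", "ok" if e / p >= area_cut else "below a per-area cut")

That candidate passes the exam as a whole at 57.5% while earning less than a third of the boundary law points.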
I've heard the argument that it's the Board's job to ensure minimal competency, and that anything beyond that is not their responsibility. I strongly disagree. It's the Board's statutory mission to protect the public from incompetent practice. If an examinee just barely passes, and Ric has correctly stated that most pass or fail within just a few points of the cut score, then it's practically assured that the new licensee is still incompetent, even if just barely, in one or more of the areas of practice tested.
The State says "You passed" without qualifying the statement other than the implied reference to the statute or administrative rule that says to practice only within your area of competence. Without qualification, and as you described the exam, that grant of a license covers all activities defined as the practice of surveying and by definition, the examinee has no areas of deficiency. If that were truly the purpose of the exam and what it actually was measuring, there would be no need for the "practice within areas of competence" provisions in the law.
An examinee can reasonably know their areas of competence if they went in knowing they lacked knowledge in certain areas and either didn't answer or guessed at the questions in those areas; in that case they can assume that they did quite well in the areas they had confidence in. But those examinees who passed by being just barely competent in some areas and just barely incompetent in others can't reasonably be expected to know which was which.
Gene Kooper, post: 436551, member: 9850 wrote: It is incumbent upon each registrant/licensee to extend their learning throughout their career, not the state licensing board or professional society. Being a simple country boy, I know the truth in the old adage that you can bring a horse to water, but you can't make it drink.
I completely agree with this statement. But the corollary is that just because you can't force the horse to drink is no reason to deprive it of water.
James Fleming, post: 436629, member: 136 wrote: Agreed 100%; but that's not what you originally stated. You wrote that a person with a degree has proved they know more than a person without one. If you've run a five minute mile, and I've never run a timed mile, you've proven that you can run a mile in five minutes and I haven't. However you haven't proven that you can run a mile faster than I can.
One option I haven't seen mentioned. Since the FS is designed to test for a knowledge base equivalent to an accredited four year degree, eliminate the FS for holders of accredited degrees and expand it into a multi day comprehensive exam for those without one.
Yeah, I am not sure what additional value the FS has for evaluating a survey program graduate. I think that approach, along with state specific exams that do a better job of testing in the areas where the surveying degrees are weak, could work.