Best Practices For Certification Exam Blueprints
By Joe Cannata, August 6, 2019

Before a subject matter expert (SME) writes the first item, and before the exam cut score is determined, you need a solid foundation and strategy on which to base your exam. That cornerstone is the exam blueprint. This critical document contains the content and specifications for an exam. In this blog, I address multiple-choice and short answer exams, since blueprints for performance-based exams require a different set of rules and constraints.

As part of any job task analysis (JTA) performed in an exam development cycle, one of the major deliverables is an exam blueprint. An exam blueprint contains three key elements:

- The number of items asked in the exam.
- The major sections, which represent the main areas of focus for the exam, like modules in a course.
- The objectives, which represent the complete body of knowledge that the exam covers, as well as what you expect the candidate to know to earn a passing score. These are equivalent to individual module learning objectives.

One best practice for a blueprint: use no more than seven sections. The content domain dictates how many sections you have. Typically, in a JTA, you use some methodology to determine the number of sections and objectives.

The number of objectives is really the key number. As your number of objectives increases, the number of questions you can ask for each objective decreases. That means that, due to the constraints of the blueprint, you may not be able to dive as deeply into the technology as you would like. For those objectives where you have only one item, ask yourself: is answering one question correctly enough for a candidate to demonstrate mastery of the concept? Think of the exam as a job interview in which you may ask only 60 questions. If you were interviewing a candidate about advanced network architecture, would one question about IP addresses convince you that the person knows the subject?

Look to pare your objective count by making objectives broader, rather than more specific, in scope.
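To see how quickly per-objective coverage shrinks as the objective count grows, here is a minimal sketch. The 60-item budget echoes the job-interview analogy above; the objective counts are hypothetical examples, not figures from the blueprint process itself.

```python
# Illustrative sketch: spread a fixed item budget evenly across objectives.
# The objective counts below are hypothetical examples.

def items_per_objective(total_items: int, num_objectives: int) -> tuple[int, int]:
    """Return (base items per objective, number of objectives that get one extra item)."""
    base, remainder = divmod(total_items, num_objectives)
    return base, remainder

# A 60-item exam over 20 objectives allows 3 items per objective:
print(items_per_objective(60, 20))  # (3, 0)
# With 45 objectives, most objectives get only a single item -- the
# "one question to demonstrate mastery" problem described above:
print(items_per_objective(60, 45))  # (1, 15)
```

Once most objectives are down to a single item, that is the signal to merge objectives into broader umbrella statements rather than add more.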
A carefully worded umbrella objective can span multiple concepts and helps reduce your objective count. You could also increase the number of items on the exam, along with the exam time. However, consider that some delivery platforms charge more for longer seat time, which could lead to higher delivery costs.

When creating objectives, give thought to the level of the exam. There are two types of objectives: recall and reasoning. Recall is straight knowledge of a fact. Reasoning requires a higher cognitive skill to arrive at a solution given a situation. Especially on multiple-choice or short answer exams, drive for a higher number of reasoning objectives. You want the exam to be more than a regurgitation of facts.

Entry-level exams: 70% recall, 30% reasoning.

If you are able to, it is good practice to exceed the percentages for reasoning items; the more reasoning items on the exam, the higher the cognitive demand will be. There are times when straight fact recall is important as well, so don't avoid recall items entirely. That said, there is no good reason to write a scenario that makes an item longer if the scenario is not necessary. If you can cover up the scenario and still answer the item based on the stem alone, you do not have a reasoning item.

One last best practice: once all the SMEs ratify the sections, objectives, and item counts per objective, consider the blueprint cast in concrete. It is important to stress this with the SMEs during the JTA process, because inevitably they will plead later on to redistribute item counts when they cannot meet their item targets. The JTA workshop leader should have the wisdom and experience to caution the SMEs throughout the process so they do not make a promise they will not be able to fulfill. Once the SMEs set a blueprint, it should remain unchanged.
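The recall/reasoning split described above can be turned into concrete item counts with simple integer arithmetic. A sketch follows; only the 70/30 entry-level split comes from the text, while the 60-item total and the round-down rule are my assumptions for illustration.

```python
# Sketch: split an exam's item budget by cognitive level.
# Only the 70% recall / 30% reasoning entry-level split is from the text;
# the 60-item total is an assumed example.

def split_items(total_items: int, recall_pct: int) -> tuple[int, int]:
    """Return (recall items, reasoning items); recall rounds down."""
    recall = total_items * recall_pct // 100
    return recall, total_items - recall

# Entry-level exam, 60 items, 70% recall:
print(split_items(60, 70))  # (42, 18)
```

Since exceeding the reasoning percentage is encouraged, treat the reasoning count produced here as a floor rather than a target.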