Blackboard Battles: Test Tool

Blackboard has a feature for creating tests and surveys. I have used it for the pre-labs for three 1st-year experiments since 2013/14, and for a lab safety quiz as well. There are various question styles available: some can be automatically graded (and are more sophisticated than multiple choice), while others require manual grading.

I use multiple choice questions to create short formative tests for students to check their understanding. Often these are simpler versions of the MCQs I use for Peer Instruction, or are derived from the banks of MCQs that most textbooks seem to come with. Generally they have feedback. For calculations, I generate banks of similar MCQs (inspiration: ViCE2012 http://www.possibilitiesendless.com/2012/09/variety-in-chemistry-education-2012/) via a semi-automated process using Excel and its CONCATENATE function. This also works for any other multiple choice question where it is easy to define a question stem plus variables and simply change the variables.
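
To give a flavour of that semi-automated step, here is a minimal sketch of the same CONCATENATE trick done in Python rather than Excel. The NaCl question, the numbers and the distractors are made-up illustrations, and the tab-delimited layout assumes Blackboard's "upload questions" text format (MC rows of question text followed by answer/correct-or-incorrect pairs) – check your own version's import format before relying on it.

```python
# Sketch: generate a bank of similar MCQs by varying values in one stem.
# Output is a tab-delimited file of the sort Blackboard's question upload
# tool can accept (MC, question, answer, correct|incorrect, ...).
import random

stem = ("What mass of NaCl (58.44 g/mol) is needed to make "
        "{vol} mL of a {conc} M solution?")

rows = []
for vol in (100, 250, 500):
    for conc in (0.10, 0.25, 0.50):
        correct = 58.44 * conc * vol / 1000           # mass in grams
        distractors = [correct * 10, correct / 10,    # common unit slips
                       58.44 * conc * vol]            # forgot mL -> L
        answers = [(f"{correct:.2f} g", "correct")] + \
                  [(f"{d:.2f} g", "incorrect") for d in distractors]
        random.shuffle(answers)
        fields = ["MC", stem.format(vol=vol, conc=conc)]
        for text, flag in answers:
            fields += [text, flag]
        rows.append("\t".join(fields))

with open("mcq_bank.txt", "w") as f:
    f.write("\n".join(rows))
```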

The fill-in-multiple-blanks style lends itself to more complex calculations or to balancing equations. For example, I create a table of Beer-Lambert law calculations with columns for concentration, absorbance, pathlength and molar absorption coefficient, and leave one cell blank per row for the student to calculate the missing value from the others. I have to specify the exact answer I want, so there is little room for rounding errors in this question style (compared to the calculated numeric style I'll talk about below). It also works for balancing equations – fill in the blank bits – but some guidance on the required syntax and style needs to be given, or you need to regrade manually to accommodate some variance. One example is [Co(OH2)6]2+ and [Co(H2O)6]2+: both would probably be marked as correct by a human but not necessarily by automated marking. It is also easier not to require subscripts and superscripts in answers, although that can create mixed messages about correct form.
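
As a concrete illustration, here is a rough sketch (with made-up numbers) of how such a Beer-Lambert table can be generated, using A = εlc and blanking one cell per row:

```python
# Sketch: build Beer-Lambert table rows (A = epsilon * l * c) and blank
# one value per row for a fill-in-multiple-blanks question.
# All values here are illustrative, not from a real experiment.
import random

epsilon = 1500.0   # molar absorption coefficient / L mol^-1 cm^-1
columns = ["c / mol L^-1", "A", "l / cm", "epsilon / L mol^-1 cm^-1"]

for _ in range(5):
    c = round(random.uniform(1e-4, 5e-4), 5)    # concentration
    l = random.choice([0.5, 1.0, 2.0])          # pathlength
    A = epsilon * l * c                          # Beer-Lambert law
    row = {"c / mol L^-1": c, "A": round(A, 3),
           "l / cm": l, "epsilon / L mol^-1 cm^-1": epsilon}
    blank = random.choice(columns)               # cell the student must fill
    answer = row[blank]
    row[blank] = "____"
    print(row, "-> expected answer:", answer)
```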

Calculated numeric questions are brilliant for practice at basic calculations like converting between concentration, volume, mass and moles. You can input a range of acceptable values, which accommodates rounding issues in multi-step calculations. Again, a bit of a steer is required for the students on how many decimal places you want and whether units are required. I've not done it, but I could see a nice question pair: a calculated numeric question giving the value, followed by an MCQ to select the appropriate units. Again, the lack of units may create mixed messages but can be handled well.
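
For instance, here is a sketch of how the acceptable range for one of these conversions might be worked out – the compound, values and ±1% window below are all illustrative assumptions rather than my actual settings:

```python
# Sketch: compute the exact answer for a concentration calculation, then
# widen the accepted window enough to absorb intermediate rounding.
M_NaOH = 40.00          # molar mass / g mol^-1 (illustrative compound)

mass = 2.50             # g dissolved
volume = 0.250          # L of solution
moles = mass / M_NaOH   # 0.0625 mol
conc = moles / volume   # exact answer: 0.25 mol/L

tolerance = 0.01 * conc            # +/- 1% covers most rounding routes
lo, hi = conc - tolerance, conc + tolerance
print(f"answer {conc:.4f} mol/L, accept {lo:.4f} to {hi:.4f}")
```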

I use the matching question style for things like determining the products of transition metal reactions. The students get a list of half equations (half in the sense of the reactant half and the product half, not in the electrochemical sense) and match them up. This system can accommodate two or more routes to the same answer – I just forget that it can from time to time! I think it's helpful for the TM reactions, particularly where students struggle even to think up what the products might be; they can be a bit more deductive in puzzling it out. I've found that the reactant half of the equation causes a bit of confusion when it represents several steps in one, for example:

Fe(NO3)2(s) + H2O(l) + NaOH(aq) + H2O2(aq)

The students clearly imagine shoving all of that into a test tube, when it's much easier to take each thing in turn and consider it as the dissolution of iron nitrate in water, followed by formation of iron hydroxide, then subsequent oxidation. I may break this type of question down into separate steps in the future; I haven't decided yet because there are several equations, the first two of which cover:

Fe(NO3)2(s) + H2O(l)

and

Fe(NO3)2(s) + H2O(l) + NaOH(aq)

It might be easier to ensure they appear in that order rather than being randomised by the computer.
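
On the grading side, the way a matching question copes with two or more routes to the same answer boils down to letting more than one reactant half map to the same product half. A toy sketch, with placeholder equation strings rather than my real bank:

```python
# Sketch: a matching answer key where two reactant halves legitimately
# map to the same product half, so two routes earn full credit.
answer_key = {
    "Fe2+(aq) + 2OH-(aq)":             "Fe(OH)2(s)",
    "Fe3+(aq) + 3OH-(aq)":             "Fe(OH)3(s)",
    "Fe(OH)2(s) oxidised by H2O2(aq)": "Fe(OH)3(s)",  # second route, same product
}

def grade(responses):
    """Score a student's matches: reactant half -> chosen product half."""
    return sum(1 for reactant, product in responses.items()
               if answer_key.get(reactant) == product)

# A student who matched both routes to Fe(OH)3(s) gets both marks.
print(grade({"Fe3+(aq) + 3OH-(aq)": "Fe(OH)3(s)",
             "Fe(OH)2(s) oxidised by H2O2(aq)": "Fe(OH)3(s)"}))  # -> 2
```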

For several of these question types, I create banks of questions and students get a random selection. This means that two students doing the pre-lab together are unlikely to get precisely the same questions. It works really well for calculated numeric types, where it's easy to vary volume/moles/mass for the same compound, and for complete-the-blanks styles. For matching, it requires a completely different set of equations, but that's possible for me because we study several metals in the lab in question.

I set these tests up with two attempts per student, just in case someone has an issue with their internet connection; two or three attempts accommodates likely connection failures or other problems. The year I allowed only one attempt, I was inundated with emails asking for more. This year, for the safety quiz, I gave students unlimited attempts and some were excessive in their pursuit of a high mark. Where I consistently screw up is the settings for releasing grades: ideally I would release grades after the due date, but I never quite get that right. I've considered whether I should simply allow multiple attempts so students can get the best possible grade, something that isn't possible with paper-based pre-labs. I've reached no firm conclusions, except to continue offering a limited number of attempts and counting the highest mark obtained. It would be possible to let students see the questions they got wrong, plus the feedback, before making another attempt, and this has advantages for learning, but it differs from how other staff run their paper-based pre-labs. After the due date, students can see their grade and feedback, and after the late-submission deadline they get full access – grade, feedback, correct and incorrect answers. I'm still not convinced I've got these settings precisely right, but it works.

For my two pre-labs that have now run for three years, the average marks have ranged between 13.71 and 14.36 out of 20 for one, and between 12.55 and 14.84 out of 20 for the other. I'm now looking at specific questions to see how they have varied over the three years. There have been changes to how I teach in that time, so it will be interesting to investigate. I am using these data to determine areas where more support is needed, particularly around calculations and transition metal reaction equations. Hopefully I can work with a summer student this year to produce new questions and better feedback. It's nice to have the data (which wouldn't be possible in the same way with paper-based pre-labs – statistics and so on) because it means I can see which issues are cohort-specific and which recur annually.
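
The sort of question-level comparison I mean looks roughly like this – a sketch assuming per-question scores exported to CSV files, with hypothetical file and column names (a real Grade Centre export would need renaming):

```python
# Sketch: compare average per-question scores across three cohorts to
# separate cohort-specific blips from annually recurring problem areas.
# File names and columns ("question_id", "score") are assumptions.
import pandas as pd

years = {y: pd.read_csv(f"prelab_{y}.csv") for y in (2014, 2015, 2016)}

summary = pd.DataFrame({
    y: df.groupby("question_id")["score"].mean() for y, df in years.items()
})
summary["range"] = summary.max(axis=1) - summary.min(axis=1)

# Large range -> cohort-specific; consistently low rows -> annual issues.
print(summary.sort_values("range", ascending=False).head(10))
```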

The real advantage of these tests is that, after the initial set-up time (which is definitely longer than setting a paper-based pre-lab), checking through the marks and manually grading a few bits here and there takes far less time than marking paper-based submissions. The number of submissions has increased from 53 to 96 over the past three years, reflecting our increasing student numbers. I can spend the time I would otherwise spend marking on developing new questions, improving the feedback on existing questions, or producing additional resources for areas where some students struggle. This year I wrote a lot of model calculations and used selective release to make them available to students who had attempted the pre-lab.

The disadvantages are the initial set-up time and the need to think creatively about how to translate paper-based questions into more rigid electronic formats. With a bit of creativity and the inclusion of videos and images, it's possible to accommodate most things. There is even a format where a student clicks part of an image – this could work for mechanisms (identifying where a curly arrow starts or ends) or for spotting the functional group that gives a specific peak in an IR spectrum.

And as for the personal touch, there’s room to write specific feedback through manual grading if it is felt to be necessary.
