The deadline (and the seven-day late deadline) has passed but I’ve not yet marked the work, so this seems like a good time to talk about the first years’ information retrieval exercise. We use 15% of the marks in one of our first-year modules to get the students accustomed to retrieving and evaluating chemical information. Last year I set a quiz that required the students to familiarise themselves with a core textbook. It took quite a while to set up the questions, despite using Dr Stephen Ashworth’s wonderful Excel method for writing questions quickly. I chose last year to write questions based on only one of our four core books, and picked the most general one. Unfortunately that one was cut this year and replaced with a more specific physical chemistry one. Sigh. You can’t make the students do a quiz based on a book that they haven’t been asked to obtain. Effort wasted: these types of quiz are only ‘worth’ the time spent if they can be used for several years; one year is really insufficient versus the time spent setting and marking an assignment.
I have two pet hates when it comes to student work, and they are linked in the same way that twins separated at birth may well be. The first is the portrayal of chemistry on the internet and the number of supposedly legitimate sites that get the chemistry vastly wrong. The second is the unquestioning way in which some (by no means all) students will google first, think second. Therein lie the makings of an information retrieval exercise. I will admit that it was with a degree of tongue in cheek that I sent the assessment criteria and an example of what should be done to my colleagues. I didn’t really expect a positive response, but I did get one, so I decided to go ahead with the exercise.
The brief to the students was simple: find an example of inaccurate chemistry information on the internet and, using the core textbooks and other sources as appropriate, explain why it is incorrect and suggest changes that should be made to the site. I directed the students to the Wayback Machine in case they found a good example which was subsequently changed. My example piece of work was one such case – a BBC article on the science of Breaking Bad. Initially the article referred to hydrofluoric acid as a strong acid; it was later edited to say powerful.
I think there has been a mixed response to this assignment. Some students really struggled to find a piece of wrong chemistry information, and I suggested using image searches or internet forums. Some resorted to social media and found conversations on Twitter. Others have found some wonderful examples of chemical imprecision, if not inaccuracy. Some students have found generalisations on A-level chemistry sites that they clearly understand to be only part of the whole story, which is heartening in itself. Some students have found things swiftly and with the sort of ease that we often assume all our computer-savvy students have. It is safe to say that, despite having unprecedented access to ‘online’, many of our students are reluctant to move beyond their well-trodden paths and familiar sites. Perhaps google first, think second is less common than I first believed.
I’m sure I will have more comments as the marking commences. I’m blogging about it today because Turnitin seems momentarily down for maintenance but secretly I’m glad that I can curl up and write rather than mark this afternoon!
Stephen H. Ashworth (2013) Generating Large Question Banks of Graded Questions with Tailored Feedback and its Effect on Student Performance. New Directions 9(1), 55–59. DOI: 10.11120/ndir.2013.00006. Available at http://journals.heacademy.ac.uk/doi/abs/10.11120/ndir.2013.00006 (accessed 2/11/13)
Wayback Machine: http://archive.org/web/ (accessed 2/11/13)