Impractically Impossibly Imperfect

This is post 2 on laboratory practicals, sparked by discussion on Twitter of A-level science reforms to replace practical assessment with an endorsement. In post 1, I considered that the only way to assess practical work directly is to observe students doing practical work. In this post I’m aiming to consider the impact that prior experience of practical work has on 1st year chemistry students.

We overtly assume little when it comes to the practical skills of our incoming students. Take any group of students and their backgrounds will be sufficiently diverse that assumptions are very dangerous. Our lab course is designed to build up the basic skills in the first semester. That's not to say we go slowly, and it's not to say that good prior experience doesn't help: we go at a reasonable pace, and knowing your way around a lab is an advantage. We do, however, unconsciously assume a great deal about the practical skills of our incoming students, but these are practical skills in a far broader context.

Firstly, we assume that students can follow instructions to carry out practical procedures, and that it goes without saying that this involves gathering equipment, planning an efficient way to work, working in an organised and tidy manner, and cleaning up effectively. I think this is a fair assumption, to a point. These skills rely on extensive experience of lessons in all subjects that aren't just sitting doing worksheets. That can be anything from art to home economics, chemistry to woodwork. It can be hobbies or clubs, activities with family, friends, scouts or guides. And yet it seems that some students struggle with this kind of thing. Being aware of the surroundings and of the other people working nearby is essential here as well: not leaving drawers or cupboard doors open, moving the stool so no one trips over it, returning items promptly after use. We assume that our students are used to working in a busy environment, again something that comes from experience.

Secondly, we assume that students are practised at critiquing practical procedures and at knowing when something isn't right. I'm not talking about the mythical powers of any demonstrator to identify a product waved in their face and answer the 'does this look right?' question. I'm talking about the ability to use an ice bath so that the flask doesn't tip, or to recognise whether something is boiling or not. There are sometimes logic gaps in how people work in a lab that come down to a lack of experience of doing stuff.

Essentially, how can we hope to teach students to transfer air sensitive materials via syringe (even after 2 semesters of lab) if they are unaccustomed to translating written instructions into physical actions, incorporating verbal suggestions, finding equipment and performing basic tasks? Less practical work in any context is bad news for the first year lab class.




Posted in lab mahem, Teaching | Leave a comment

Practically Possibly Perfect

I believe I’m supposed to have an opinion on the changes to A-level science subjects with respect to practical work. I can’t muster up much enthusiasm for the debates, although I like what I saw late last week on Twitter when it was pointed out that Nobel Laureates don’t know very much about how practical work is run/assessed in schools, teachers do.  I don’t know very much about how practical work is run/assessed in schools but I do know what it looks like from the point of view of teaching the 1st year undergraduate chemistry cohort. The question for this post is how to assess practical work effectively. I’ll probably write a second post on how we deal with our incoming students’ practical skills.

We have two practical exams, one in December of 1st year and the other in May of 2nd year. Both are intended to assess the practical skills of our chemistry students directly, by observing them in the lab, and also via the more typical indirect measures of competent practical work such as the results obtained, the lab diary entry and the data analysis. The vast majority of practical work assessment that goes on throughout degree courses is indirect assessment – that is, assessing the analysis of the results obtained (and the results themselves). This is poor for a number of reasons. Firstly, many people confuse assessing results analysis with assessing practical skills. They aren't the same thing. Given that most undergraduate practicals are comparatively fail-safe, it may not take that much skill to get enough product to do the write-up. Secondly, results can be fudged (or, more scientifically, be open to measurement bias – more on that later), shared, googled, extrapolated and just plain made up. How many departments list 'doctoring experimental results' as academic misconduct? Finally, some mark schemes don't assign enough marks to the bits that require critical appraisal of how the experiment has been carried out. The yield is good for this, the NMR can be clean or not (unless the peaks have been deleted in processing), and looking at the quantity of stuff in a sample vial tells you more. Marks have to be assigned for those tasks.

For many experiments, it is possible to be biased when making measurements. I can understand why titrations are popular experiments: a nice convenient number that can be verified by a competent person using the same chemicals and procedure. However, getting two or three concordant readings after running a rough titration doesn't strike me as particularly robust. Once you know roughly where the end point is, it's probably very easy for your brain to trick you into thinking you've reached it, giving you better results than you might otherwise get. You need to be very honest when performing such techniques, with a good sense of what the end point looks like, to be consistent here. I'd prefer three titrations using three different concentrations of stuff in the burette, to see how close each rough titration is to the precise value. That's a bit less open to bias, and possibly a better test of the observation and skill needed.

Practical exams are stressful for all concerned and there is no way to get around that. You have to watch what people are doing. Frustratingly, there is substantial variation in what is viewed as correct technique, even within the same module or course.  One member of staff may believe that putting the lab diary down on the balance table when weighing out is bad practice, another just prefers that the students write the mass down quickly and doesn’t care where the lab diary is. One member of staff prefers that students use a weighing boat and quantitative transfer when preparing samples for UV-Vis analysis, another may feel that weighing directly into the volumetric flask is less likely to induce errors in the results (the ones they will mark).  The only way to overcome this is to have a mark scheme that allows for subjectivity on the part of the marker:

In the opinion of the invigilator, the student performed the task (e.g. preparing a solution in volumetric glassware):

0 – did not perform task (e.g. ran out of time, produced no product…)

1 – poorly (significant mistakes made that would affect the quality of the results obtained – e.g. weighed out on a weighing boat but failed to quantitatively transfer the material or weigh by difference)

2 – adequately (minor mistakes made that would affect the quality of the results – e.g. weighed directly into the flask but failed to notice the quantity spilled on the balance resulting in lower mass than expected in the flask)

3 – well (minor mistakes made with some impact on the quality of the results – e.g. used the recommended mass from the instructions to calculate the concentration of the prepared solution rather than the actual mass used, actual mass not recorded)

4 – very well (minor mistakes related to the procedure that would not affect the quality of the results – e.g. used the recommended mass from the instructions to calculate the concentration of the prepared solution rather than the actual mass used, but the actual mass was recorded and the mistake could be rectified with prompting)

5 – exceptionally (could not find fault)

Where does the threshold for a pass sit in such a scheme? Probably around 2.5 I’d say, so 50%, a little higher than the standard 40%.
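For what it's worth, the arithmetic of that threshold is easy to sanity-check if the rubric is written down as data. This is just a hypothetical sketch mirroring the 0–5 descriptors above, not part of any real mark-entry system:

```python
# Hypothetical encoding of the 0-5 observation rubric above, with the
# suggested pass threshold of 2.5 out of 5 (i.e. 50%).
RUBRIC = {
    0: "did not perform task",
    1: "poorly",
    2: "adequately",
    3: "well",
    4: "very well",
    5: "exceptionally",
}

PASS_THRESHOLD = 2.5  # out of 5


def to_percentage(score: float, max_score: int = 5) -> float:
    """Convert a rubric score to a percentage mark."""
    return 100 * score / max_score


def passes(score: float) -> bool:
    """True if an observed score meets the suggested pass threshold."""
    return score >= PASS_THRESHOLD


print(to_percentage(PASS_THRESHOLD))  # 50.0
```

A score of 3 ('well') passes comfortably, while 2 ('adequately') falls just short of the 2.5 line – which is the sense in which this scheme is a little stiffer than the standard 40%.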

Safety is the next thing to factor in. For our 2nd year practical exam, we operated a 3 strikes policy – if a student had to be prompted on a safety aspect more than 3 times, they failed the component. We meant things like taking off safety goggles, spilling substances and not alerting a demonstrator or cleaning it up (no points were to be deducted for spilling, only if it was ignored), or setting up equipment in a way that meant the demonstrator had to intervene before it was turned on, such as an unclamped reflux. I'm not convinced this is the way to go – perhaps a deduction of marks rather than limiting it to 3 strikes, because there will always be a couple of students who push that to its limits! We also reserved the right to ask a student to leave if we felt they were unable to work safely and were a danger to those around them, if they turned up inappropriately attired, or if they broke any of the general lab safety rules in a serious way.

Finally, a serious discussion needs to be had about what a pass looks like in terms of lab performance. A bare pass probably involves a fair bit of prompting by the demonstrators, but with the student able to complete the tasks when prompted. For example, a prep may say that the reaction was cooled rapidly to 0 °C and a student may have forgotten how to do this. A demonstrator may prompt with 'use an ice bath' and the student would then do the task. I'd probably give a bare pass if they just used an ice bath, but if they used an ice-water bath and clamped the flask in it, I'd give a little more because there is at least evidence of further thought. This is entirely debatable, however, and it may be the opinion of those conducting the assessment that prompts should not be needed.

The only suggestions I have to minimise the stress of being watched doing lab work are to run a mock exam (which also helps you test drive the mark scheme and train the demonstrators), to ask each demonstrator to mark 4–6 students at a time, and to ask the students to think of it as a really quiet lab class, but one where common courtesy such as 'are you finished with the acetone?' is still permitted. The last point makes a big difference to stress levels and atmosphere. It's also really important that students understand that they can and should ask for prompts in order to get through the allocated tasks.

Posted in Uncategorized | 2 Comments

On the recording of lectures

I’ve been using Camtasia, a tablet PC (an old-style laptop one) and a dictaphone to record lectures for about 4 or 5 years; I forget how long. I started by recording audio only, but after getting the tablet PC I moved to full screen capture. I upload processed versions of these captures to my modules. Processing is really topping and tailing, with a small amount of editing if we’ve been doing lots of problem solving or if a situation has arisen that warrants it. I generally make the recordings available to those who have attended or who have been absent with good cause, and it’s not unknown for me to set selective release to those people who were there. I’m never happy enough with how a session has gone to contemplate using the lecture capture the following year as a flipped session. Also, I think short recordings are better than long recordings for flipping. I ensure my screencasts are produced with a table of contents (from the PowerPoint titles) and a controller to allow pause, fast forward and rewind. The annotated notes may also be downloaded.

I’m starting to question the wisdom of lecture capture, however. At first I believed it was a viable solution to the ‘problems’ with traditional lecturing – too much information, sage on a stage, and very passive. I felt that for complex topics, students would welcome the opportunity to revisit the explanations given, or to supplement the notes they took with more detail, and that those who couldn’t write as fast or who were absent could catch up. For complex topics, I’d rather create a specific screencast, but I struggle to find the time to create those resources, and I really hate recording things in my office. So lecture capture seemed a good trade-off.

But the passive nature is starting to really, really bother me. Sitting re-watching lectures as if they were some kind of educational box set is not revising. It sure as hell isn’t learning. And despite my seemingly formidable knowledge of the many TV box sets that I have watched repeatedly, I suspect that my students generally lack the stamina or time to watch my lectures enough to really memorise the content! Yes, I know, I could be flipping, or running some hybrid version with greater interactivity… I should also be doing 50,000 other things as well, and converting courses to more interactive versions is not high enough on the priority list at this time. But at a time when more and more institutions around the country investigate or continue to use lecture capture technology as standard, these are questions worth asking, because a great deal of lecturing is still conventional and is now being captured. And lecture capture is, by all accounts, very, very popular with students.

Sitting re-watching lectures is not revision. It’s learning the lecture which, at best, is an outline of the material designed to be complemented by trying examples, reading around the topic, and challenging, consolidating and improving subject knowledge. Yes, it’s great when you can re-watch that complex derivation, or watch the mechanism being drawn more closely to really see the specifics, but you have to go beyond that. Our intended learning outcomes are designed to define what 40%, the barest of passes, is. Maybe it would be better to consider the lecture in the same light – learn the lecture, get 40%: have knowledge of multiple concepts in isolation but fail to build a framework of understanding where they are interlinked, complementary and amenable to tackling complex problems.

So that’s my line this coming semester: learn the lectures and you might get 40% if you do a halfway decent job of it. Attend the problem classes and try the problem sheets, and you’ll head towards 60%; but if you want to break 70%, if you want to head into 1st class, do more. Do every example you can find. Read the recommended reading and start to develop a deeper understanding. Take responsibility – your learning comes down to your effort, not my effort in a short series of lectures. I’ll be trying to convey a sense of a fairly complex topic in a non-ideal room while you all struggle to wake up or struggle to stay awake, depending on which side of noon the class is. If we acknowledge the non-ideality of the situation, we might get somewhere. And I’ll keep recording for now.


Posted in Uncategorized | 25 Comments

End of week 9 update

No matter how determined I am to ‘write more’ of any sort (and there is a direct correlation between the quantity of writing here and the quantity of work done on drafts of papers – more blogging = more writing), semester has ways of defeating the best of intentions.

I have been travelling a little for work to attend a committee meeting at the RSC and to do a programme validation. I’m amazed that (a) I can find days with nothing scheduled to travel, and (b) just how tiring a simple train or plane trip can be for a day.

I have been writing new lectures at a ridiculous rate (at least 1 new prep per week this semester, some weeks up to 3).  It is hard going developing new-to-me content, worse still when some is new-to-my-uni content meaning there are no previous lecture notes to use as a road map.

I have been working on the paperwork for a new degree programme here and find that writing programme specifications temporarily kills any other writing ability. I’ve been reading many module proposals, trying to figure out what regulations are needed, and giving the programme handbook a wide berth until the other components of the course are finalised.

I was happy to see one of my posts featured in this month’s Nature Chemistry Blogroll column. It was my ‘begin again’ post of a few weeks ago (was that just a few weeks ago? It seems much longer).

And I’m counting down to my own version of ‘Black Friday’ – the day when all my coursework is due! Somehow I managed to schedule work from three different courses to come in at 4pm on Friday of week 11. This gives me time to mark things over the holidays, but fingers crossed that no disaster befalls the submission software! I always forget that proximity to deadlines increases the number of student questions that require prompt responses…


Posted in Academia Nuts, Teaching | Leave a comment

If I had a million dollars…what I’d do!

£50,000 is an interesting sum for UK academics – it’s just about enough for a PhD studentship funded at RCUK rates and some consumables (potential issue with fees but not a major one). It’s probably about enough for a 1-year postdoctoral researcher or 18 months of a graduate research assistant. It might well be sufficient for teaching buy-out for a year’s sabbatical and expenses. It could also be a large number of summer studentships, all or part of a shiny bit of kit, or a means of accessing facilities at another institution.

So what would I do? A postdoc or research assistant would really be more use to me than a PhD studentship: a 12–18 month position to get some initial data on a couple of projects that would allow for initial publications and future funding applications. I’d shy away from a PhD studentship because, despite the longer term benefits of training someone, I’d rather focus on the research. I have a few project ideas brewing and it would be really nice to have a dedicated person to tackle them rather than the piecemeal effort I make, balancing everything else.



Posted in Academia Nuts, grant hell | 1 Comment

If I had a million dollars…

Thought experiment for academics:

If someone handed you £50,000 (approx 65,000 euros and $80,000 US), what would you do with it?

Would you use it for work related purposes? Perhaps a sabbatical, pump priming a pet project or funding a research worker. Assume no constraints and no ‘overheads’ can be taken from the money.

Posted in Academia Nuts, grant hell | Leave a comment

Chemical Shortcuts

I was writing some lecture notes this morning when it occurred to me that I’d probably never told my students about some shortcuts that are really useful. I was trying to draw structures for a synthetic route towards paclitaxel… trying is the operative word, unless you are aware of what a SMILES string is and of the fact that you can copy-paste one through Paste Special in ChemDraw and other programmes and the structure will be drawn for you. In the case of paclitaxel derivatives, that’s hours of pain reduced to seconds of smugness.

The best source of SMILES or InChI strings is ChemSpider, which has all kinds of useful information as well as that bit. What I didn’t know before writing this post was that SMILES (apart from creating a great many smiles at the sheer simplicity of complex structures) stands for Simplified Molecular-Input Line-Entry System, and InChI for International Chemical Identifier.

Another simple trick when looking for scientific journal articles (or making quick references in work ahead of proper referencing) is to copy-paste the DOI of the article rather than the reference. Digital Object Identifiers are found on most reputable articles (and probably most disreputable ones as well, but hey) and can be a very quick and easy way to keep tabs on references electronically. They are found on most journal articles and on their webpages on the publisher’s site, and are readily googled to find the full text. Of course, they do not replace the proper reference in the desired format, but they do make wrangling a reference collection in early drafts much easier.
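As a tiny illustration of how cheap this is, a bare DOI becomes a clickable link simply by prefixing it with the doi.org resolver. The sketch below uses the DOI Foundation’s example identifier rather than a real citation:

```python
def doi_to_url(doi: str) -> str:
    """Turn a bare DOI into a resolvable link via the doi.org proxy."""
    return "https://doi.org/" + doi.strip()


# The DOI Foundation's own example identifier (resolves to the DOI Handbook):
print(doi_to_url("10.1000/182"))  # https://doi.org/10.1000/182
```

Keep the bare DOIs in the draft, and swap them for properly formatted references at the end.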

I can’t think of any more at the moment – any suggestions?

Posted in Academia Nuts, Chemistry, organic gripes, Teaching | Leave a comment

REPOST: The Ritualistic Nature of Literature Searching

I’m reposting this today as I’ve got a session with our 2nd years on information literacy this afternoon. We’ll be covering Web of Science searches along with other hints and tips for wrangling the scientific literature.

First published February 9th, 2009 on v2.0…

It has just gone 8.30am and on this snowy Monday morning I have reached my desk, set my shoes to dry on the office radiator and booted the computer. Before I make my first cup of tea of the day, I’m blogging. Procrastination? Actually, no, let’s call it a plea for help.
The first thing I do on Monday mornings is log into Web of Knowledge and search for ‘dendrimer’ with the ‘Latest (current) week’ setting. I know, you’re thinking that there is nothing unusual about this. Perhaps you’ve seen my profile and noticed that part of my research is to do with these dendrimer things. It is only natural that I’d search for recent papers published that may be relevant. No, the problem is that I do this every Monday morning that I’m at a desk with a computer, and I have done this every Monday morning (except for holidays and brief unemployment) since the turn of the millennium.
It has never occurred to me to automate this search in any way, shape or form. It has, if I may suggest such a thing, become a research related ritual. The difference this morning is that it struck me as odd.
As keyword searches go, it is pretty lousy. One lonely keyword spanning a whole subfield of chemistry. No specification for biomedical applications, or environmental metal stuff, no acknowledgment that other keywords may yield relevant results (for example: polymer, hyperbranched polymer etc), and that combinations of them may be quite efficient.
It doesn’t work, yet I do it weekly. One task for today: find a better way! Will update later…
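One obvious improvement is to fold the related keywords into a single combined topic search. Here’s a minimal sketch, assuming Web of Science’s TS= topic field tag and OR operator; the keyword list is purely illustrative:

```python
def topic_query(keywords):
    """Build a Web of Science style topic query from a list of keywords,
    quoting multi-word terms so they are searched as phrases."""
    quoted = ['"%s"' % k if " " in k else k for k in keywords]
    return "TS=(" + " OR ".join(quoted) + ")"


print(topic_query(["dendrimer", "hyperbranched polymer", "dendron"]))
# TS=(dendrimer OR "hyperbranched polymer" OR dendron)
```

One combined query run weekly (or better, saved as an alert) would at least acknowledge that the subfield has more than one name.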

Posted in Academia Nuts, Chemistry, Publications | 14 Comments

REPOST: All the pretty colours…

One of the most spectacular chemistry experiments is the flame test. I think it’s a particularly elegant experiment, and also one that can easily be done in high schools. I remember doing it when I was about 14 or 15, as part of standard grade chemistry. It is very simple: you take some wire, clean it in concentrated acid, dip the wire in acidic solutions of various metal salts and then put the wire in a blue Bunsen flame. It has some of the key components of a good chemistry experiment – acids, Bunsen burners and pretty colours. In a similar manner to fireworks, different metals burn with different coloured flames. I’m sure that my standard grade chemistry teacher had a sense of humour when she suggested to us that we leave sodium until last because it was hard to see. Yes, the bright and vivid orange colour was quite surprising coming after potassium’s pale lilac, strontium’s red and copper’s blue-green.

I was thinking about this during a lab session last week when we were looking at exothermic reactions. OK, we were looking at some really great demonstration-type experiments, including thermite and the potassium dichromate volcano. More about them another time. We were also doing the screaming jelly baby experiment, also known as the jelly baby rocket or rocket to the moon. Simply, this involves taking a boiling tube, clamping it at a 45 degree angle and filling it with some potassium chlorate. The potassium chlorate is melted with a Bunsen burner, and the whole thing is done in a fume hood. Once molten, half a jelly baby is dropped into the boiling tube and a rapid and spectacular reaction ensues as the sugar reacts, creating flames, sparks, sometimes a roaring noise and lots and lots of smoke (one good reason for the fume hood). We noticed that the Bunsen flame, still set to blue, turned lilac due to the quantity of potassium ions in the smoky atmosphere. It was pretty impressive really, and despite my watching 5 or 6 groups of students perform the experiment, it never lost its magic.
There are lots of simple, elegant and impressive chemistry demonstrations like this, often used to bribe potential students at university visit days and enchant more disruptive classes at school. They are also the things that people remember most strongly about high school chemistry.

First Published May 19th 2009, on version 2.0!

Posted in Chemistry, lab mahem | 2 Comments

On the process of doing research.

Question: if you are the first person in the known universe to attempt a reaction, why on earth would you assume that the reaction has worked and progress to the next step without characterising your product fully?

I appreciate that undergraduate labs are often ‘illustration of concept’ labs that have been designed and further refined to work properly with only a tenuous link to experimental technique, but there seems to be a major logical gap between the idea of carrying out research and verifying that the experiments have worked. This comes up from time to time with students and it always makes me stop and wonder what’s going on. And where’s the curiosity? Where is the drive to prove that you were the first to do a reaction and here is the best possible evidence to support that? That’s probably more to do with ego than curiosity but still.

I suppose the flip side is far bleaker – when did I become so cynical that my working assumption is that a reaction has not worked? And that I need to analyse it fully to determine just how much it has failed before varying something and trying again! Perhaps the more optimistic approach is better, but then there is the disappointment…

Posted in Academia Nuts, reactions, Teaching | Leave a comment