ViCEPHEC 2012 Storify Backup

As a preamble to ViCEPHEC you may want to see the Storifys prepared by @kjhaxton (https://twitter.com/kjhaxton) of last year’s ViCE conference.

The Social Media Story, storified by Katherine Haxton (http://storify.com/kjhaxton)

Variety in Chemistry Education: how it happened on Twitter, storified by Katherine Haxton (http://storify.com/kjhaxton)

By the miracle of the web, we even have a record of ViCE 2010!

Variety – Living Up to its Name!

For ViCEPHEC 2012 the story starts here:

For those suffering withdrawal between ViCE 2011 and ViCEPHEC, the HEA session on Lecturing without Lectures provided some relief and a Storify all of its own: sfy.co/j0D1

Is Matthew on Twitter?

Hands up if you too have a wiggly giggly!

ViCEPHEC 2014 Storify Backup

ViCEPHEC 2013 Storify Backup

Stop attacking technology

There were a lot of articles, Twitter threads and the like flying around a few weeks ago about the alleged impact of technology on, shall we say, the ‘quality’ of learning.

There was an #LTHEchat on lecture capture, and anecdotes about lecture capture encouraging superficial learning (the box-set mentality) circulate regularly. The lack of evidence that it improves learning is also frequently cited by the anti-lecture-capture side.

There have been the articles claiming that mobile devices (laptops, phones) damage learning when used to take notes in lectures, and, inevitably, the calls for electronic devices to be banned from classrooms and lecture theatres.

I’m tired of both of these ‘issues’. I’m tired of having to point out that there are students who cannot participate fully in learning sessions without the ability to take notes by laptop. I’m tired of having to point out that there are students who cannot attend every session and who benefit from lecture recordings even when only limited portions of the recorded session are ‘transmission of information’ (yes, even sessions with a lot of active learning are worth recording). I’m tired of wanting to remind people who should simply know better that:

(a) they were not representative of all styles and modes of learning when they were students and, given that they are now academics, were likely atypical.

(b) as researchers, they should be willing to base decisions on evidence rather than personal opinion, anecdote, or herd decisions. And if the evidence for their specific context is lacking, they probably have the ability to conduct a study and generate some.

(c) if the use, or absence, of technology has that critical an impact on students’ success, one has to consider whether the pedagogy being employed is flawed.

It’s been a long semester and I’m all kinds of tired, but mostly I’m tired of people (who should know better) forgetting that students are humans.

 

Keele Seminar

I gave a seminar at Keele University (my own uni) yesterday. I decided to present two works in progress (WIPs) rather than anything complete. I read somewhere that presenting work before it’s completely polished is a useful way to get feedback and drive your thinking about a project, so I thought I’d give it a go.

From Tesla to TESTA: Meanderings in Chemistry Education Research

This seminar comes in two parts. The first looks at the use of diagnostic tests to evaluate the knowledge of students at the start of a block of teaching. Over the past three years, two diagnostic tests have been developed: one evaluating first-year students’ knowledge of topics in spectroscopy on entry, and the second evaluating second-year students’ knowledge of NMR and related topics in preparation for a multinuclear NMR course. The prevalence of key misconceptions is determined and areas to be addressed in subsequent teaching identified. The second part looks at the use of the TESTA (Transforming the Experience of Students Through Assessment) process to catalogue changes in the Chemistry course from 2010 to 2017. The TESTA process provides metrics to evaluate the nature of assessment and feedback processes; however, it is deficient in a key regard: the impact of assessment deadlines on student workload. Assuming an ‘ideal spherical student’, a student workload model is proposed and considered in the context of students having sufficient time to participate in assessment-for-learning activities.

 

If you’ve seen any of my poster presentations in the past year, you’ll be familiar with some of the diagnostic test stuff. I’ve been plugging away developing diagnostic tests for about three years now and I’m getting fairly close to happy with the question sets. I find they are very useful in informing my teaching, and a couple of changes I made this year seem to have added clarity to the responses received. Firstly, I moved the tests from paper-based (MCQ, confidence scale, free-text response for explanation) to online (MCQ, confidence scale, studied before yes/no/maybe). This allowed for automatic marking and meant I didn’t need to spend hours typing in the answers. I also changed the confidence scale from a 1 – 7 scale, with 1 being ‘not at all confident’ and 7 being ‘highly confident’, to a categorical scale. I intended this to make it easier for students to select an answer (I found myself becoming tied up in the difference between a 5 and a 6, whereas the difference between ‘neither confident nor unconfident’ and ‘a little bit confident’ seemed easier to grapple with). Nevertheless, I do employ NSS-style groupings to split the responses into low, medium and high confidence.
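For what it’s worth, that grouping step is trivial to automate once the responses are exported. Here is a minimal sketch; the category wording is assumed for illustration rather than taken from my actual forms:

```python
# A minimal sketch of collapsing a categorical confidence scale into
# NSS-style low/medium/high groupings. The category labels below are
# assumed for illustration, not the exact wording used on my forms.

CONFIDENCE_GROUPS = {
    "not at all confident": "low",
    "not very confident": "low",
    "neither confident nor unconfident": "medium",
    "a little bit confident": "medium",
    "fairly confident": "high",
    "very confident": "high",
}

def group_confidence(response: str) -> str:
    """Map one confidence response onto low/medium/high."""
    return CONFIDENCE_GROUPS.get(response.strip().lower(), "unknown")

responses = ["Very confident", "A little bit confident", "Not at all confident"]
print([group_confidence(r) for r in responses])  # ['high', 'medium', 'low']
```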

 

The second part is based in part on the ‘Spherical Students’ post and is a consideration of a student workload model, as well as the use of the TESTA (Transforming the Experience of Students Through Assessment) process to evaluate a curriculum review. When I looked at our TESTA data from several years ago with a view to seeing how things had evolved since the modules were shiny and new, I realised that while it accounts nicely for the type and number of assessments and goes into great detail on feedback, there’s little about the timing of assignments. This, combined with my ongoing bewilderment about how more ‘active’ forms of learning requiring not insubstantial amounts of pre-sessional activity should be accounted for in module proposals (where hours are broken down) and in timetabling (if you’ve got 8 lectures a week, can every lecture come with an hour of prep?), has led me to start reading about student workload models.

It’s fascinating stuff and seems fairly under-researched in HE. The basic problem is obvious: it’s really difficult to pin a meaningful metric on student workload, and methods vary. They include word counts and proxy word counts (e.g. a 15 credit module is 4000 – 5000 words), which obviously don’t work that well for calculations and the like, and time on task, which can be objective (how much time should be spent) or subjective (how much time is spent). Both are further complicated by the idea that researching and writing 1000 words at FHEQ level 4 (1st year UK uni) is very different to researching and writing 1000 words at FHEQ level 6 (3rd year UK uni). An effective student workload model must make some allowance for the level of difficulty of the material, but really should be much richer than that.

So far I’ve only got as far as figuring out that where we put our deadlines is really important in whether students appear overloaded in any given week or not, but I’ve got more reading and thinking and modelling to do on this one.
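To make the deadline-placement point concrete, here is a minimal sketch of the kind of weekly workload model I have in mind, under the ‘ideal spherical student’ assumption. The contact hours, prep ratio, assignment loads and the 20 h/week threshold are all invented for illustration, not figures from TESTA or from our modules:

```python
# 'Ideal spherical student' weekly workload sketch: spread the notional hours
# for each assignment evenly over the weeks before its deadline and add them
# to timetabled contact plus pre-session prep. All figures are illustrative
# assumptions, not real module data.

WEEKS = 12
contact_hours_per_week = 8      # e.g. 8 lectures a week
prep_hours_per_contact = 1.0    # assume an hour of prep per lecture

# (deadline_week, total_hours_of_work, weeks_over_which_work_is_spread)
assignments = [
    (4, 10, 2),    # short report due week 4
    (8, 15, 3),    # problem set due week 8
    (12, 20, 4),   # end-of-semester assignment due week 12
]

weekly_load = [contact_hours_per_week * (1 + prep_hours_per_contact)] * WEEKS

for deadline, hours, spread in assignments:
    for week_index in range(deadline - spread, deadline):
        weekly_load[week_index] += hours / spread

for week, load in enumerate(weekly_load, start=1):
    flag = "  <-- heavy week" if load > 20 else ""  # 20 h/week is an assumed threshold
    print(f"week {week:2d}: {load:4.1f} h{flag}")
```

Even this crude version makes the point: shift one deadline and the ‘heavy weeks’ move with it.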

As it was about work in progress, I’m not uploading the slides. I got some really good questions at the end and need to think some more about some of the issues.

Spooky Slimey Vampire Science

Last Saturday we did the ‘Spooktacular’ science outreach event at Keele University Hub. If my recollections are correct, it’s the 6th Spooktacular I’ve taken part in and the 5th where I’ve designed and run activities with the help of student volunteers. The audience is families with wee kids, typically primary aged so there has to be lots of seasonal fun and wee bits of science. These events are extremely worthwhile because:

(a) the whole family can attend and so parents and other adults can get involved with the science bit too, helping with the tasks and hearing the explanations.

(b) it allows undergraduates, postgraduates and academics to run activities in a marketplace format and hence test out their science communication skills in a dynamic and changing environment before (for example) taking on a larger project such as going into a school.

(c) it’s fantastic publicity for the institution and cements the role of the university as part of the community. It is a generally positive experience for attendees and breaks down barriers about what a university is.

I started out with one planned activity but then got rather a lot of student volunteers so came up with a second. Here’s what we did and, importantly, why.

  1. Slime (well it’s Hallowe’en, you can’t not!).

Since the whole PVA-borax fiasco, which has resulted in a lot of recipes for PVA slime using contact lens solution with boron-based buffers being bandied about, we changed our normal approach to slime and moved to alginates. In the past we’ve had mixed success with them, primarily because the interaction was a little on the short side. This time:

  • measure 20 mL of distilled water into a plastic cup
  • add one spatula of alginate powder
  • stir with lolly stick (highly sophisticated scientific labware)

Now, it takes a fair bit of time to dissolve (a couple of minutes), and while this can be challenging for the youngest participants, it provides a good opportunity to talk about the science of what’s going on.

  • add food dye and glitter to the now thick solution
  • using a large small pipette (comedy description, but this year I bought kids’ science pipettes from Amazon that are around 5 mL and have a larger bulb on them than the usual disposable 3 mL plastic pipettes), add to calcium chloride solution
  • make worms/beads/lumpy aggregates and fish from the solution using large forceps (I bought kids’ forceps as metal lab ones are often a little difficult for wee hands). Prod, poke and generally play.

 

  2. Mission Starlight…Vampire Edition

I have a backpack in my office from Summer 2016 when we took the RSC’s global experiment ‘Mission Starlight’ to a summer science event. That was when we had the UV-active beads on teddies and we were protecting teddy from the sun and talking sun safety with visitors. The UV beads change colour in sunlight and we provide various cloths, films and sunscreens to build the perfect UV protection suit. For Hallowe’en we adapted it to protect Vampires from sunlight – what would a Vampire wear to avoid the sun? I bought some rubber bats and put the UV beads on them. It worked pretty well, but one drawback is that it involves the participants going in and out of the room, which can be alarming for parents and guardians, particularly if the room is busy. We also had some UV-emitting LEDs which came attached to pens as part of an ‘invisible ink’ set. UV-active dye in the pen, no obvious marks, shine the LED at it…you get the idea. In any case, those worked well when the sun went in a wee bit.

Mission Starlight was one of the best RSC Global Experiments in my opinion because there is no clean-up, few consumables (the fabrics can be reused), and it’s dry to store. I’m in favour of outreach being a bit more sustainable in the generating limited waste department. Other experiments such as the polymer hydrogels and the vitamin C in foodstuffs required quite a bit of wet prep and disposal. I need to find more activities like this, although I’ll note that not pre-dissolving the alginates did cut down on prep time significantly and allowed the participants the chance to dye their own alginate.

Next on the outreach calendar is our regional Top of the Bench Heat. I’ve got the practical challenge sorted (shhhh! Top secret until after!) but need to figure out the written challenge.

Automated Marking of Tests/Quizzes

I’ve been dabbling in the automatic marking of tests and quizzes for several years now. By this, I mean a web-based set of questions that a student completes on a specific topic, which automatically grades the answers as correct/incorrect (sometimes giving partial credit) and returns the mark (sometimes with feedback) to the student at a specific time. Before you think ‘oh, this is good, no marking’, I’d warn you about the setup burden.

What kinds of assignments make really good auto-tests? I have used them for the following:

  • pre-laboratory exercises that give practice at calculations, some safety aspects, and identifying products/balancing equations (Blackboard Test/Pool)

  • online safety quiz  (Blackboard Test/Pool)

  • assessed tutorial with a very tight deadline before the exam  (Blackboard Test/Pool)

  • referencing and academic conduct test  (Blackboard Test/Pool)

  • diagnostic test (new for 2017! Google Form Test)

The technology has limitations, particularly related to the type of questions you can ask. I find the following types useful:

  • multiple choice questions

  • calculated numeric [with the caveat that Blackboard can’t treat 5.5 and 5,5 as equivalent, and can’t handle units or the number of decimal places]

  • fill in the blank or short answer [with the caveat that students often can’t spell (even when given a list of words to select their answer from), and sometimes deducing the syntax of the required answer is tricky]

  • matching pairs [really good for reactant/product equation matching in transition metal redox chemistry]

I also like the ability to write a pool of questions and a system that allows each student to be asked a number of questions from the pool. If every question is from a different pool, this reduces the scope for collusion. An example of a good pool question stem for a calculated numeric question:

Calculate the mass of copper sulfate required to prepare a [Y] molar aqueous solution in a [X] mL volumetric flask. 

You can see how simple it is to vary X and Y within the realms of possibility, generate all the correct answers in Excel (or a short script, as sketched below), and make a pool of questions.
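A sketch of that step, assuming anhydrous copper(II) sulfate at 159.61 g/mol (swap in 249.69 g/mol if the pentahydrate is intended); the X and Y values are arbitrary examples:

```python
# Sketch: enumerate variants of the pool question and the correct answers.
# mass (g) = concentration (mol/L) x volume (L) x molar mass (g/mol)
# Assumes anhydrous CuSO4, 159.61 g/mol (use 249.69 g/mol for the pentahydrate).

MOLAR_MASS_CUSO4 = 159.61              # g/mol, assumed anhydrous

volumes_mL = [100, 250, 500]           # X values
concentrations_M = [0.05, 0.10, 0.20]  # Y values

for x in volumes_mL:
    for y in concentrations_M:
        mass = y * (x / 1000) * MOLAR_MASS_CUSO4
        print(f"[Y] = {y} M, [X] = {x} mL: answer = {mass:.2f} g")
```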

The setup burden is how long it takes to create the initial test. As a rough guide, I’d estimate it to be at least twice as long as it would take to mark manually! So for a pre-lab done by 50 students, taking me 10 hours to mark, I’d expect to spend about 20 hours developing the online version. I do not recommend doing the online test thing unless you know you can use it for at least 2 years – one reason for doing it is to reduce the marking load and you don’t really start to make gains until the 3rd year of running. On the other hand, it’s a convenient way to shift time from semester (marking time) into quieter times of the year (prep time). I estimate that each test requires 1 – 2 hours of tweaking and set-up each year, usually after reviewing the analytics from Blackboard, weeding out poorer questions, adding a couple of new ones…that sort of thing.
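A quick back-of-the-envelope check of that break-even point, treating the rough figures above (setup around twice one round of marking, 1 – 2 hours of tweaking a year) as assumptions:

```python
# Cumulative hours for manual marking vs an online test, using the rough
# estimates from the text as assumptions.

setup_hours = 20            # initial build (roughly 2x one round of marking)
tweak_hours = 1.5           # annual tweaking after reviewing the analytics
manual_marking_hours = 10   # marking the same assignment by hand, per year

cumulative_online, cumulative_manual = 0.0, 0.0
for year in range(1, 6):
    cumulative_online += setup_hours if year == 1 else tweak_hours
    cumulative_manual += manual_marking_hours
    print(f"year {year}: online {cumulative_online:5.1f} h, manual {cumulative_manual:5.1f} h")
# With these figures the online route only pulls ahead around the 3rd year.
```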

Why do I do this? Well each of the assignments I’ve outlined are reason enough in themselves, but some have transitioned from paper-based to online (pre-labs, diagnostic test) and some would not exist if they could not be online (safety, referencing, academic conduct, assessed tutorial). So sometimes there is no reduction in marking time for me because I wouldn’t offer the assignment in an alternative manner. Technology facilitates the use of formative tests to aid learning, so I use it.

This year I’m expanding my range of formative tests by transferring my 1st year spectroscopy ‘drill’ questions into an online format. When teaching things like the equations of light, basic NMR, IR etc., I recognize the value in doing lots of examples. I also recognize the value in those examples stepping up in difficulty every so often; I’ve been calling these steps levels.

For example, using the equation E = hν

Level 1 – calculation of E in J, with ν in Hz

Level 2 – calculation of E in kJ/mol

Level 3 – calculation of E in kJ/mol with ν in ‘insert standard prefix here’ Hz

Level 4 – calculation of ν with energy in J

Level 5 – …

You get the idea anyway.  I read a paper on this a few years back, about stepping up calculations in small steps.  So I’m making question pools for each level, bundling a few levels together into a quiz then setting a minimum score requirement to gain access to the next levels. Students will do quiz 1 and if their mark is high enough (80%+) they get access to quiz 2. If it isn’t, they’ll get access to a couple of worked examples and the chance to re-do quiz 1 to get the mark.
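To show what I mean by the levels stepping up, here is the underlying arithmetic for the first few levels as a quick sketch (the example frequencies and energy are made up):

```python
# Sketch of the arithmetic behind the first few 'levels' for E = h*nu.
# The example frequencies and energy below are invented for illustration.

h = 6.626e-34    # Planck constant, J s
N_A = 6.022e23   # Avogadro constant, mol^-1

nu = 4.5e14      # Hz

# Level 1: E in J, with nu in Hz
E_J = h * nu
print(f"Level 1: E = {E_J:.3e} J")

# Level 2: E in kJ/mol
E_kJ_mol = E_J * N_A / 1000
print(f"Level 2: E = {E_kJ_mol:.1f} kJ/mol")

# Level 3: E in kJ/mol, with nu given using a standard prefix (e.g. 450 THz)
nu_THz = 450
E_kJ_mol_prefixed = h * (nu_THz * 1e12) * N_A / 1000
print(f"Level 3: E = {E_kJ_mol_prefixed:.1f} kJ/mol")

# Level 4: nu from E in J
E_given = 3.0e-19  # J
print(f"Level 4: nu = {E_given / h:.3e} Hz")
```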

I’m aware that this type of drill enforces a purely algorithmic approach, but if my students can’t do these bits, they are going to run into a whole lot of problems at higher levels. When setting exam questions, I balance the question between the algorithmic problem solving stuff like performing a calculation, and the ‘explain your answer’ part where they need to demonstrate a degree of understanding. We can argue over the appropriate balance between those sections but I think the algorithmic stuff should be 40 – 60% of the marks available (depending on the level of the paper) and the balance should be the explanation stuff, or higher level problem solving such as unseen, unfamiliar problem types. With this balance I’m saying ‘well you can probably pass the exam if you can do the basics but you need to show more to get a great mark’.  I also assume that intended learning outcomes define a passing mark (40%) or a low 2:2 mark (50%), rather than a 100% mark.

The experience of setting up and running a diagnostic test through Google Form Tests requires a post of its own, so I’ll come back to that.

 

Reflections on Variety in Chemistry Education/Physics Higher Education Conference 2017

This year’s conference started on Thursday 24th August 2017. I did not attend the lab events on Wednesday because I think a 2-day conference is sufficient – I get seriously ‘conferenced out’ after a while and I’d rather focus on the talks. I did, however, arrange to arrive in time for Wednesday dinner, to avoid early trains and a very long day had I travelled on Thursday.

This was also the first time I attended a conference seriously aware that I had limited energy and that I needed to take care not to overdo things. There were a few challenges in that regard: simple things like getting between venues, accommodation, dining rooms, and bus stops were harder, as was standing for several hours in a poster session. It’s quite eye-opening when you notice that you struggle to do things you previously took for granted. The only advice I have for people organising conferences is to be really explicit in delegate information before arrival about distances between things and requirements to take buses, and to give serious thought to how much seating is available during sessions such as posters and lunches.

I enjoyed many of the sessions, and a lot of the talks. I found several talks highly frustrating, particularly those that could easily play into the unconscious bias held by many. These included the keynote on gender differences in the physics force concept inventory, and the closing keynote on perspectives from either side of the A-level/university chemistry transition. Here’s the thing: unconscious bias afflicts us all, both as recipients and as people who hold biases. One particularly insidious one is the bias that those who teach in HE often have towards those who teach in secondary. I’ve heard a lot of academics disparage teachers of chemistry at A-level, particularly around ‘teaching to the test’. I do not doubt that many HE teachers view incoming students through a deficit lens – they focus on what the students can’t do rather than respecting the enormous hard work, effort and learning that has taken place in secondary (and further) education. And let’s be clear, I don’t mean all HE teachers, but I’d rather focus on what our incoming students can do and acknowledge the diversity inherent in the secondary and further education sectors, and the enormous pressure on both teachers and students. I disliked that the keynote could feed those attitudes by drawing greater attention to the issues.

My (annual?) issue with the gender thing. Males and females performed differently on a test that was developed to see how much students understand about some physics stuff. It wasn’t designed to investigate gender, and if it shows a difference in male/female performance, we should absolutely investigate and be concerned. But we have to be really careful how we do this and make sure that we’re using the right research tools and asking the right research questions. We also have to be really careful to explore gender sensitively which means considering all aspects of gender, rather than treating it as a male/female binary.

 

Moving on from that, there was an interesting talk on teaching tips for making a more inclusive teaching environment. I think that’s something we all need a reminder of from time to time but I would have liked a few more concrete suggestions on things that work. PDF format is variable and not all versions work for screenreaders…OK then, which way of generating a PDF is best? That sort of thing. And it would be really good to share ideas about how to convince all those who teach in HE to adopt practices that make teaching as inclusive as possible.

The poster session didn’t really work for me – there were two rooms, with many of those in the first room unaware of the second room. There were also very few seats and that made it very very tiring. I had several very good discussions with people over my posters but never really made it round them all. Fortunately the majority of the posters were shared electronically beforehand and this allowed me to review them at my own pace. There were still lots of people I wanted to talk to in the poster session but I never found them – and by the time lunch was served, I was out of energy.

There is also some question over what the Variety in Chemistry Education conference is for – what type of chemistry education work it should showcase. To me, the strength of Variety is in the variety: I go to see great chem ed ideas presented, but I also enjoy presentations that steer more strongly towards chemistry education research. I like the mix and I wouldn’t want to see that change. I do, however, have one caveat: I dislike intensely anything presented that has not been sensibly evaluated. And to evaluate a teaching innovation, one must carry out the activity with students. I’m OK with the evaluation being done purely from the perspective of the teacher; I’m OK with the evaluation being done from all perspectives. I’m not OK with unevaluated activities being presented. I suppose I should comment on the role of Variety in seeking collaborations, or putting a flag up to say you’re doing a certain thing and would anyone like to help: that’s what oral bytes are for, nothing longer. I would call on future organisers of this conference, and those who scrutinise the abstracts, to ensure that evidence of evaluation is clear.

I’ll also note that I have no intention of attending the pre-conference activities. 2 days of conference and dinner the night before is sufficient for me, and I don’t need to attend more.

We also had a chat on Twitter the other night about whether we should facilitate the attendance of undergraduates who’ve done chemistry education final year projects. I think it would be good to see undergrads attending Variety and if their work is of sufficient standard to justify submitting an abstract for a presentation, then great stuff, what an opportunity! I would not, however, reduce the ‘standard’ expected of that abstract or subsequent presentation. So I’m not in favour of giving undergraduates (or any other group) more chance of getting a presentation slot just because of what they are.

 

This post has been several drafts in the making. I’m still not happy that I’m articulating what I want to say very clearly, but right now I’m done with it and am hitting publish! Time to move on!

 

Poster: Alternative Assessments #ViCEPHEC17

This is my poster for #ViCEPHEC17 on a range of assessments through 3 years of the chemistry curriculum. It was a surprisingly difficult poster to put together as there was so much I wanted to include. In the end I focussed on the general skills developed rather than the specifics of each assignment.

One of the highlights of these assessments is getting students to create ‘What Am I?’ puzzles. This started as a part of this blog where I’d post the chemical components of everyday items and readers had to guess the item. Since running this as an assessment, I’ve hidden those blog posts. The students have to decide on an everyday item, investigate the chemical composition of it, draw the compounds using ChemDraw and make it an appealing single page graphic. The purpose of the first page is to allow the reader (that’d be me) to guess the item. I’m getting very good at identifying various body sprays by chemical structure. On the 2nd page they should indicate what the item is, and briefly outline the purpose of each chemical in that item, finishing with a reference list.  Should you wish to play, here is the What Am I? Variety in Chemistry Education Edition:

Below I’ve listed the key chemicals found in a common thing, which may be slightly UK-centric.  I’ve drawn the chemical structures of principal components where simple and appropriate; given the E number or CAS number (however tempting Sigma-Aldrich catalogue numbers would be) if no simple chemical structure exists for an additive; and given the chemical formulae or name if neither of the above make sense.  See if you can guess what this is!  If you guess on Twitter (@kjhaxton), please DM your guess so others can play.

 

[Image: structures of chemicals contained in a common item]

 

And should you wish to find more of these puzzles, try: http://www.possibilitiesendless.com/category/what-am-i/

With respect to the other assessments I mention, I have presented on some of the elements before so there are a couple of slide decks for further info.

For more information on the 3rd year infographics:

For more information on some of the other assessments

And on the 1st year screencast presentations

 

Poster: NMR Diagnostic Test #ViCEPHEC17

One purpose of this blog post is to provide additional materials for my poster at Variety in Chemistry Education 2017.

This is the third of three posters presented this year on our NMR diagnostic test project. It’s a subset of a larger project but the NMR results have been quite interesting and I’ve also been fortunate enough to have a couple of students work on the project.

For those attending ViCEPHEC17 who fancy having a go at some spectroscopy diagnostic test questions, here’s a link to a shortened online version. All submissions are anonymous and there are only 5 questions!  https://goo.gl/forms/zmuTiOxxf39hyF2m1

If you have any feedback on the test, you can comment on this post, or get in touch at Variety or by email. Using GoogleForms is new this year for this type of test.

The ViCEPHEC17 Poster is here:

 

For the RSC Twitter Poster competition, I submitted this poster, which outlines some of the key initial findings:

https://rscposter.tumblr.com/post/158621983310/novice-to-expert-alternative-conceptions-in-nmr

For Methods in Chemistry Education Research, the poster focussed on the research methods we were using to investigate this project:

Poster: Methods of investigating alternative conceptions in NMR