“I know it when I see it”
What do good scientific research and hardcore pornography have in common?
Sweat, yes. Lack of sufficient protection for their performers, maybe. Faked enthusiasm, sometimes. But the key thing – or at least, what TIR wants to focus on here – is the intuitive recognition of what they are. As Supreme Court Justice Potter Stewart observed, it is difficult to define obscenity [hardcore pornography], but “I know it when I see it”. The same applies to the assessment of scientific research. It can be fiendishly difficult to nail down what constitutes good science, but most people have an intuitive sense of what it looks like.
Put succinctly, we recognise clarity of thought. We see when the background is clearly and confidently sketched, when the project’s aims are clearly articulated, when the experiments test the proposed hypotheses and the controls address the caveats, and when the interpretations are within the bounds of the obtained data.
Sometimes we see all those things in a single project, but that’s the rare 10% of the time when assessment becomes a pleasure rather than a chore. More often we see one or more elements of excellence, rather than the complete crown jewel – a clear background, but vague aims; clear aims, but unfeasible experiments; clear data, but premature or overeager interpretation.
We sometimes see that clarity of thought in papers, we sometimes see it in presentations, but we never, we cannot, see it in CVs. A CV contextualises a person, but will say nothing about how they think.
This is exactly why it is so important to assess people in person. An interview will quickly show if a candidate is not as good as their CV might suggest, or if they’re good but overconfident and unable to take criticism. It will also show if someone can think on their feet, interact productively, and show potential.
In the absence of an in-person assessment, it becomes much more difficult to determine quality. Metrics attempt to do this, of course, but while they can be very effective, they are not a one-size-fits-all measure. Metrics work when you can parametrise the thing you’re assessing and assign weightings to those parameters to ensure you get what you’re looking for – but what you’re looking for will vary depending on the situation (a good teacher? a good researcher? a good communicator? a theoretician? an engineer?). Being able to render an applicant into a single number has an obvious appeal to the lazy side of our nature, but it’s valid only when great care has been put into the design of the process.
An additional disadvantage of the impersonal assessment is that it places a greater emphasis on language. Science has three main ways of publicising past and future work: the presentation, the paper, and the proposal. The latter two have a strong literary element, and writing inevitably imposes certain handicaps based on fluency in the language being used (almost invariably English at present). When it comes to funding, however, it’s always the written proposal – whether in short or long form – that constitutes the first and sometimes only means of assessment.
Which raises the question: is this the best way of identifying that clarity of thought? Or are you in fact more likely to hear a good presentation than you are to read a good paper? Is a slideshow a more authentic way of representing someone’s thought processes?
Perhaps instead of requiring applicants to write proposals, we should make them prepare presentations*. This would probably take applicants less time to prepare, would still impart a high degree of rigour (ask anyone who’s had to give a short presentation at a conference), and would also enable faster evaluation and higher throughput. While raising the stakes somewhat, it also gives the applicant a greater degree of control – it’s easier to steer the audience’s attention with a slideshow than it is with a document, because you can determine when things appear.
So perhaps it’s time for scientists, like pornographers, to get behind the camera.
*(With either a voiceover or displayed notes; a videoed presentation risks bias on the basis of the applicant’s appearance.)
Originally published on Total Internal Reflection.