Latest entries

Sunday, 26 July 2015

One year old baby

The other day I commented to the father of my child that having a baby is a little bit like going to prison. Not the ‘nice’ sort of prison where they let you do Open University courses and try to make you a better person. A Victorian-style prison where the inmates are forced to turn a crank thousands of times a day or walk on a treadmill for hours and hours with no end product to show for their labours. You do your time and then you’re released into a world you no longer recognise, having lost half your hair and a year of your life.

You know, writing that makes me realise why my partner looked at me with his ‘What the fuck?’ face. Yes, yes, I have a one year old miniature human to show for all those hours spent pacing to and fro sob-singing Somewhere over the Rainbow while the baby screamed because she was so flipping tired that she just couldn’t sleep. And, yes, all that panic-Googling because said baby has a highly disturbing habit of rolling her eyes over white when it’s windy wasn’t entirely wasted. I now know more about neurological disorders in infants than the average neurologist.

But I can’t say the first year of my baby’s life has lived up to my expectations of motherhood. Don’t get me wrong—it’s been totally worth it. My child really is the best baby in the world, even taking into account that month-long phase where she got so fat that all our baby photos look like we’ve dressed a small pig in human clothes. Or that way she hisses and bares her teeth whenever there’s a bright light nearby, like some blood-sucking vampire grub. Or the time she gave me Hand, Foot and Mouth disease that adults totally aren’t meant to get, which I totally did get, and am in fact currently typing with a fingernail that is totally going to fall off any day now, and urggggghhh.

I know what you’re thinking: if this is Kathryn’s first foray into Mummy-blogging then, Dear God, nooooo. You’re alright, don’t worry. I am fully aware that, while I’ve somehow managed to find myself in possession of a predominantly happy and healthy toddler, it has not been thanks to any previously dormant Mother-skills that need to be unleashed upon the world. I barely know what I am doing when it comes to my own child, so I am most definitely not in any position to offer parenting advice to others.

What I do want to talk about is how wasted time isn’t always wasted time. I worked in scientific research for, what, 8 years as a post-doc and for 12 years total if you count the time spent working towards my PhD. And of all the work I did during that time, maybe 5% amounted to something useful. Maybe less, depending on your feelings towards basic tuberculosis microbiology. All those hours in the lab, all that funders' money. You can’t be scared of failure if you want to be a scientist. 

I’ve read papers before where years of work have been condensed down into a few lines. The protein could not be crystallised. A gene deletion mutant showed no phenotype. Compounds showed poor activity in vivo. No one gets into science to spend their days performing the grunt work that provides the filler for the rare discovery that changes the status quo. But the reality of working in the lab is that taking the easy road rarely leads to the really interesting results, while taking a risk—trying something completely new—will often lead absolutely nowhere.

Now that I’ve left the lab and am embarking on a new adventure in the form of writing a popular science book, I am getting to see science from a new perspective. It’s dizzying to see how much research there is out there that no one outside the field ever hears about. Work that’s published in the best journals can still be just another drop in the ocean. Years, sometimes decades, of work. Even the biggest discoveries can sometimes look very boring from the outside. It’s enough to make my own work feel very small and insignificant.

But somehow, while you’re hunched over the lab bench, it doesn’t feel like you’re wasting your time. Two weeks struggling to make a protein expression vector. Two months purifying a protein so that you can get started on the real experiments. Two years screening inhibitors only to conclude that the protein you picked at the start wasn’t the best drug target after all. Go back to the beginning and try again. From the outside, it looks like wasted effort but, along the way, there were small successes. New techniques that will improve future attempts. Students trained who will go on to do their own research. Interesting side projects that may or may not lead to something exciting.

I’d rather hoped that the lessons science taught me about the value of failure might translate into a bottomless well of patience when it came to stay-at-home-parenting. Hypothesis rejected! One of the issues is that reproducibility goes out the window when it comes to babies and there’s an inverse relationship between how much research you do and how good the outcome is. But, like my scientific career, I sometimes look back on the first year of my baby’s life and wonder where all the time went. What have I achieved? Could I have done more?

Now that I’m coming out the other side of the baby period and starting to piece back together my own life and ambitions, I can see why so many women find it so difficult to juggle motherhood and a career. Pre-baby, it seemed so simple. I couldn’t understand why some women seemed to cease to exist as individuals once they had kids. Now? I can’t decide whether those failures in the lab would feel like time that could have been better wasted at home with my daughter, or vice versa.

So instead of returning to work for someone else, I am going to attempt to carve out a freelance writing career that will let me work on something I want to work on while also being around to clean spaghetti off the walls. No doubt there will be plenty of failures along the way and I am sure there will be times where I regret wasting my time on something that leads nowhere. But, hopefully, in a year’s time when I look back, it will all have been worth it.

Sunday, 22 February 2015

In a small chapel just outside Prague, a chandelier made from every bone in the human body hangs from a garland of skulls like the world's creepiest wind-chime. Nearby, a coat of arms features an almost comical bone bird—its wings a human hand and its neck a gnarled vertebra—that pecks at a skull's eye socket. In each corner of the room, several thousand, maybe more, bones are tightly packed into huge bell-shaped mounds.

Back in 1278, the abbot of the Sedlec monastery sprinkled some earth from Jesus' supposed burial site onto the abbey graveyard. The effect was much like the opening of a new Crossrail station in a previously affordable London borough. One moment, fashionable types wouldn't be seen dead there; the next there's a trendy pub opening up with artfully stained sofas and an influx of skinny jeans. Or, in the case of the Sedlec graveyard, corpses. Suddenly, it was the place to go and die, much like Southwold, only without the beach.

By the 16th century, the Black Death, or plague, had deposited so many bodies in Sedlec that there was literally no more room. So a partially-sighted monk was tasked with digging up all the bones crammed into the graveyard and stacking them up in neat little piles. The remains of 40,000 people eventually found their way into the ossuary (I like to imagine that each and every one was moved by that one dedicated monk). Later, during the 19th century, a local family employed a wood carver to make the bones pretty. Clearly this wood carver was a distant relation of Tim Burton, and his creations were eerie and strange, and surprisingly beautiful.

Is that shameful to admit? The Black Death, after all, has to be one of history's most terrible killers, decimating huge swaths of the population over the centuries. Shouldn't the remains of all those dead evoke feelings of solemnity and sadness, not awe?

This is an issue which, as a scientist working on a killer disease, I've often wondered about. Can an infectious agent that kills millions ever be beautiful and the subject of admiration and respect, or should I have felt horror and disgust with every swirl of the culture flask, every squirt of my pipette? After all, there were days when confessing my profession to a stranger left me feeling a little like I’d just admitted to marrying a serial killer in a prison chapel.

I worked on a different type of plague to the Black Death—the White Plague, or tuberculosis. Even today, tuberculosis kills almost two million people every year. Despite this, I’ve always held a strange affection for the bacterium. There's something amazing about peering into the minuscule world of the viruses and bacteria and, like an astronomer looking out into space, feeling wonder at the complexity and elegance present in each and every tiny species.



Recently, an artist called Luke Jerram created blown glass sculptures of various killer viruses and bacteria. He didn’t make one of the Black Death bacterium, Yersinia pestis, but he did create Ebola, smallpox and HIV, among others. While many people marvelled at the beauty of his art, more than one person raised the question of whether it was distasteful to admire diseases responsible for untold misery.

Perhaps a better way of looking at it is that it is the science that is beautiful. Once, we believed that the Black Death was a judgement from God or a curse from the odd lady down the road with tangled hair who talks to cats (don't we all?). Now, we can look this tiny killer in the pili and see it for what it is—an amazing creation of nature that walks the fine line between horror and beauty. The more we understand, the less there is to fear.

When I look at the Sedlec Ossuary, I see the humanity in the careful way that the bones have been arranged, and am reminded of how fragile and fleeting life can be. I see a memorial to all the lives claimed by diseases such as the Black Death and I remember how far we’ve come since the days when the plague killed between 30 and 60% of the European population. The bones aren't beautiful because they are dead but because, once, they were alive.

Saturday, 20 December 2014

Scientist spends 8 years training for a career in research. Scientist gives up said career to embark on her greatest experiment yet—an adorable squishy baby! Only the scientific method no longer works and that Excel spreadsheet titled “How I will raise baby” is woefully inadequate…

1. Your PhD did not prepare you for this 
At no point in all your years in the lab did you encounter a problem that required you to wrestle your boob into another human’s mouth while they screamed to the point of liquefaction. Transferable skills my ass.

2. To do lists now consist of ‘Keep baby alive’ and ‘Remember to eat’ 
Once, you juggled multiple projects; now, it’s time to burn that list of things you planned to do on maternity leave. Learn French? Ha ha ha. Have another Jaffa cake.

3. Virkon-ing your child is frowned upon  
Interesting fact: microbiology labs and babies are often interchangeable when it comes to their smell. Only no one sells pink powder that can be sprinkled on a human child to decontaminate it.

4. You’re the annoying student and there’s no postdoc to help 
Instead, you have your mother’s thirty-year-old parenting advice. Remember that time your PI decided to run a gel for ‘old time’s sake’ and set two undergrads on fire…?

5. There’s no peer-review for baby articles 
One minute you’re researching developmental milestones; the next, Google has diagnosed your little troll with goat plague spread by vaccination.

6. Evidence and reason no longer apply 
My baby has goat plague! What do you mean only goats can get it? Oh my god, I think my baby’s a goat.

7. No one cares about your results anymore 
And here’s a photo of her sticking her right foot in her mouth. And I’m teaching her to grab the left one too! I think there’s a picture somewhere here if you just…wait, where are you going? 

8. Reproducibility is a distant memory 
One day, you can pace to-and-fro shushing like a deranged librarian and It Sleeps! The next? You get the ‘Like that’s going to work on me, Mother-Creature’-look of baby contempt.

9. The baby doesn’t give a shit how smart you are 
Like the plot to the sequel of ‘Flowers for Algernon’, you’ve experienced a career where using your brain is kind of handy and now you’re singing ‘You are my Sunshine’ for the thousandth time.

10. I don’t remember what this one was going to be 
So I read the draft of a paper the other day and got halfway through before I realised that I’d written the damn thing. The old me. The one who had an attention span long enough to…huh what?


Saturday, 5 October 2013

Chimp holding a skull
 
Steve Jones began his talk at the Henley Literature Festival by breaking the news that he was not the same Steve Jones who played guitar in the Sex Pistols. I was personally quite glad about this because it would have made writing this article on genetics somewhat difficult.

The Welsh-geneticist-and-snail-fan iteration of Steve Jones has a new book out called The Serpent's Promise: The Bible Retold as Science. In the words of one reviewer, it is a “re-cranking of the Darwinian barrel organ – accompanied by the monkey of New Atheism as it screeches petulantly at religion.” Sounds awesome, right?

His talk focused on the role of genetics in the nature versus nurture argument, using the Bible as his starting point. Are we born already tainted by Adam and Eve’s transgressions in the Garden of Eden or, to put it in more scientific terms, is it worth trying to fight the genes we’re dealt at conception, or are we all screwed before we even get started?

What does genetics tell us?

Are you sitting with a few others? If so, check out the two people closest to you. Statistically speaking, two out of the three of you are going to die as a result of your genes. Cheery thought, right? Although, in Shakespeare’s time, two of you would already be dead so that’s something to be thankful for. It was with this introduction that Jones began his discussion of what genetics can—and what it can’t—tell us about who we are.

Clearly plenty of human attributes are linked to our genes. Thanks to my parents, I am at risk of developing high blood pressure and have my mother’s nose (in a jar on the mantelpiece, mwah ha ha). But it’s not a simple case of Genetics=Destiny, despite what certain scientists and members of the media would lead you to believe.

“Ignorance more frequently breeds confidence than does knowledge.”

Type ‘Scientists find the gene for’ into Google and more than 10,000 pages pop up. Among the hits is the slightly dubious premature ejaculation gene. Jones was quick to explain how this kind of reporting contributes to the public misunderstanding of genetics.

Overhyping the role of genetics is what was behind the UK’s disastrous Eleven Plus education policy in which they attempted to identify the ‘naturally talented’ kids worthy of a decent schooling. Take a class of kids and measure their IQ and there’ll be a natural variation in their scores. But we now know that, during childhood, genes can only explain 10% of this variation (interestingly, this goes up to 70% in the 65-70 age range). You see this when you look at twins adopted into different families—their environment plays a far bigger role in academic performance than genes do.

So the Eleven Plus didn’t really measure a child’s potential; all it did was deprive some kids of an education that could have drastically changed their lives.

Extreme poverty drags everyone down regardless of genes

The part of the talk that stuck with me the most has to be how the contribution of genes to IQ differs dramatically depending on how rich you are. Among the top percentile for income, the contribution of genes to the population’s variation in IQ comes in at 0.7 (70% of the variation can be explained by genetic variance). Bottom percentile for income, and it drops to 0.1. For these people, their genes don’t make the damnedest bit of difference. Cue embarrassed shuffling from some members of the entirely middle- and upper-class audience.

It’s a similar situation when you consider the ‘gene for criminality’ found in half of the population. Yup, that’s the one responsible for testosterone production in the violent, dangerous creatures otherwise known as ‘men’. If you look at murder rates by age for men and women, those in possession of a Y chromosome commit around 10 times more murders than women. The peak age for criminality is between 20 and 30, gradually tailing off into ‘grumpy old men’ as Jones put it.

Compare the graphs for the UK and Detroit, and they look identical until you notice the scale of the Y axis. In the UK, the peak murder rate is 25 in 1 million. In Detroit, it’s 1000 in 1 million. Men still commit 10 times more murders than women but something about the environment has changed the scale of the problem. I’ve never been to Detroit and, after listening to Jones’ talk, I am not sure I want to.

Do our lizard brains impact on criminality?

Setting aside the limitations of brain scanning as a science, it can be used to demonstrate something really clever about how genetics can influence criminality. There’s this primitive little bit of the brain called the amygdala that’s responsible for emotional responses and, if you surprise someone in an MRI scanner, you can make this region light up.

The degree to which the amygdala is activated depends on levels of a protein called monoamine oxidase A (MAO-A)—an enzyme that breaks down some of the neurotransmitters that carry nerve impulses. Those who are genetically programmed to produce low levels of MAO-A tend to respond more strongly to sudden shocks, giving them a worse temper than those who produce normal levels of MAO-A.

But MAO-A levels are not deterministic when it comes to aggression and criminality. There are plenty of people who make low levels of MAO-A (Steve Jones, for one) who don’t run around fighting and murdering. And there are people with normal levels who, thanks to their circumstances, end up taking a less than virtuous path in life. The interesting difference appears when we compare the effect of stress and trauma on antisocial behaviour in those who make low and high levels of MAO-A.

Normal levels of MAO-A + unstable childhood = 3 times higher rates of antisocial behaviour.
Low levels of MAO-A + unstable childhood = 20 times higher rates of antisocial behaviour.

It seems that, in this case, genetics can predispose someone towards violence but environment plays a huge role in deciding if a person will live up to their ill-fated inheritance. Should genes, therefore, be taken into account in the justice system? Should upbringing? Or are we all ultimately responsible for making the best of whatever cards we are dealt?

"We don’t need more geneticists, we need more theologians"

With genome sequencing becoming easier and cheaper, Jones believes that we will soon reach a point where it is possible to sequence the DNA of every child born in Britain. The problem is that this won’t tell us very much. No matter what genes we are born with, nurture still gets a look-in. It’s why I get frustrated every time I see newspapers sloppily reporting scientific developments with headlines such as ‘Are you a victim of the hunger gene?’ It misleads people into thinking that human behaviour can be explained in genetic terms when it is far, far more complicated in reality.

Monday, 2 September 2013


Let’s imagine for a moment that uncertain job prospects and too much caffeine push me over the edge and I gather up every monkey in the world and shut them in a room with a bunch of computers. Sometime later, I return to a lot of flung poo and, among all the random strings of letters typed by the unfortunate (and now cannibalistic) monkeys, I discover that one capuchin has typed the sentence: “HELLO KAT”.

This is a version of the Infinite Monkey Theorem, which basically states that a monkey hammering on a keyboard for an infinite amount of time will eventually type out the complete works of William Shakespeare. It’s all about probabilities.

Give a million monkeys ten years, and the probability that one of them will type ‘HELLO KAT’ entirely by chance is roughly 1 in 2. The same as guessing the outcome of a coin toss*. Throw in all the other 9-character sentences that can be made from the letters on a keyboard, and the likelihood of none of the monkeys typing something meaningful by chance is practically zero.

But what happens if I now take that one, single monkey, and I publish a paper saying that I have found the world’s first literate capuchin? Disregarding all the random sentences typed by all the other monkeys, I proclaim that there was only a 1 in 1.8x10^7 chance that my monkey could have randomly typed ‘HELLO KAT’. Those odds are so slim that surely this particular monkey must have intentionally hit those particular keys?

This is an example of Survivorship Bias, in which focussing only on the successes while ignoring the failures can lead you to draw incorrect conclusions.

The same thing happens when it comes to careers. I can’t count the number of times I’ve listened to a leading scientist explain how they made it to the top using their formula of:

(Being smart) x (Choosing the right field) x (Hard Work) + (Networking) = Success

The thing is, this doesn’t take into account all the people who are plugging the exact same numbers into the exact same formula and coming up with entirely different results. When something is heavily dependent on chance and luck, you can’t make conclusions based only on the survivors—you need to check the graveyard too. The road to permanent scientific positions is littered with the tombstones of postdocs who have fallen along the way and I can’t believe I didn’t notice them until the point at which I was down on my hands and knees, scrabbling around in the dirt.

The stupid thing is that others did try to warn me when I started out, but I didn’t want to listen. Looking back, I wish I hadn’t been so quick to disregard the experiences of older scientists finding themselves in the same position I am now in. It was all too easy to presume that they’d done something wrong; that they hadn’t tried hard enough or they simply weren’t very good at science. Understanding the role that luck plays in a scientific career wouldn’t have stopped me from becoming a scientist, but it might have made me less of a dick.

Now that I am picking myself back up and heading off for pastures new, I am experiencing yet another example of Survivorship Bias. People who know that I write science fiction novels in my spare time keep sending me articles about self-publishing success stories. Why are you trying to find a traditional publisher when E. L. James self-published 50 Shades of Grey and look at her now! What they don’t realise is that, for every wannabe author who becomes famous from self-publishing, there are hundreds of thousands who fail miserably.

When it is a scientist who tries to tell me how to be successful as a writer, I ask them if they would self-publish a scientific paper that had been rejected by a few dozen journals. It’s not the same, they say, science isn’t subjective like writing. You either do it right or wrong. Then I sit back and wait for them to do everything right and find that it still isn’t quite enough.


*Let’s just say there are 50 keys on my keyboard. So the probability of that monkey hitting the first ‘H’ is 1/50. The probability of it typing ‘HE’ is (1/50) x (1/50), and so on. I worked it out, and the overall probability of typing ‘HELLO KAT’ is 1 in 1.9x10^15, which in the grand scheme of things is extremely close to zero. But let’s say that a monkey can type at a speed of 200 characters a minute and it manages to type around 100 million strings of 9 characters over one year. If we work out the probability that the monkey will type ‘HELLO KAT’ at some point over the year, it works out at 1 in 1.8x10^7 – still very, very unlikely. But what if we give a million monkeys ten years? Now the probability that one will type ‘HELLO KAT’ entirely by chance is up to 1 in 2.3. Entirely doable.
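
If you fancy checking the arithmetic yourself, here’s a rough sketch that lands in the same ballpark as the numbers above. It assumes, as in the footnote, a 50-key keyboard and 200 characters a minute, and it treats each overlapping 9-character window as an independent trial, which is good enough for a back-of-the-envelope estimate.

```python
# A rough check of the monkey-typing odds in the footnote above.
# Assumptions: a 50-key keyboard, a 9-character target ("HELLO KAT"),
# 200 characters typed per minute, and each overlapping 9-character
# window treated as an independent trial (fine for a rough estimate).
import math

KEYS = 50
TARGET_LEN = 9
p_window = (1 / KEYS) ** TARGET_LEN        # chance one window spells the target

chars_per_year = 200 * 60 * 24 * 365       # ~105 million characters a year
windows_per_year = chars_per_year - TARGET_LEN + 1

def p_at_least_once(p, n_trials):
    """Probability of at least one success in n_trials independent attempts."""
    return -math.expm1(n_trials * math.log1p(-p))

one_monkey_one_year = p_at_least_once(p_window, windows_per_year)
million_monkeys_ten_years = p_at_least_once(p_window,
                                            windows_per_year * 1_000_000 * 10)

print(f"One window:                   1 in {1 / p_window:.1e}")
print(f"One monkey, one year:         1 in {1 / one_monkey_one_year:.1e}")
print(f"A million monkeys, ten years: 1 in {1 / million_monkeys_ten_years:.1f}")
```

The exact figures wobble a little depending on how you round the typing speed, but the conclusion doesn’t change: one monkey is hopeless, while a million monkeys with a decade to spare are in with a decent shout.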

Tuesday, 20 August 2013


Science embodied as a person would be a rubbish date. You’d be so dazzled by Science's awesome that you’d not only end up paying for dinner, but you’d find yourself promising them your undying loyalty. Then, before you know it, you're feeling guilty for not spending all of your time with Science and Ohhh that Kool-Aid looks really tasty*.

Misplaced loyalty to a career undoubtedly isn’t unique to scientists, but it does sometimes seem to be worryingly common in my profession. How many non-science people do you know who’d continue to work when they’re no longer being paid? Not many, yet it’s all too common among end-of-PhD students, and sometimes even postdocs, who need to do that last experiment for the paper. And don’t get me started on the long hours and weekend work that seem to be the norm in most research laboratories.

We tell ourselves that we’re doing it for our own benefit—because we love what we do and want to give ourselves an edge in a very competitive environment. But lab heads and universities happily take advantage of this devotion to our careers and there comes a point where they are benefiting far more than the temporary scientist. Like the vampires inexplicably romanticised by young adult fiction, of course employers aren't going to say no to willing victims eager to be sucked dry of their intellectual creativity*. But maybe they should.

Sure, fewer PhDs would get funded because it would cost a hell of a lot to keep paying every student until the moment they submit their thesis. Some papers wouldn’t get finished if universities couldn’t find extra money to keep on postdocs at the end of a grant. And the scientists would be the first ones to complain and defend their right to be exploited.

With four months left in the lab, I’m not sure what scares me more—coming to the end of my contract and not having a new career to move on to, or finding a new position with time to spare and having to leave my project unfinished. Come December 31st, neither my current boss nor my research career is going to be buying my New Year’s Eve beers, so why do I feel like I would be letting both down if I don’t stick it out until the very last chime of Big Ben?

I’m sure that there are few lab heads out there who, if offered the professorship of their dreams, would turn it down out of loyalty to their postdocs and PhD students. So why do some temporary staff like me feel so guilty at the prospect of jeopardising a lab’s future grants and papers by making a selfish decision that would be in our best interest? It's like I have to keep reminding myself that my contract with the university is for a three-year postdoc and not my soul.

My relationship with Science has reached the point where I’m sat comfortably on the sofa in jogging bottoms, with barbeque sauce smeared around my face. Science is out there being all sciencey and cool, and here I am, clinging on to the memories of all our happy times together*. I keep telling myself that loyalty is only worth as much as the rewards it yields, that I could be so much happier in a new relationship, but it is so hard to not feel guilty about leaving. 

*I blame impending unemployment for all this melodrama. If any potential employers are reading this, I really am entirely sane. Please give me a job. 

Thursday, 1 August 2013

They can't really break your arm with their wings

On leaving laboratory science and why it’s going to be awesome (but first a rant)

A few years ago, I went on this residential course for postdocs whose years of experience outnumbered their output of Nature/Science/Cell papers. We made paper bridges for hamsters and drew our innermost feelings on giant shields for a reason I can’t quite fathom. Later, when we’d all got to know each other through the medium of unrelenting pessimism and beer, we went around the room and stuck post-it note ideas for alternative careers on everyone else’s shields.

My alternative career suggestions? Science fiction novelist, science journalist, or primary school art teacher.

Today, with five months left of my contract and the decision made that it is time to move on to a new career, I find myself looking back on those suggestions and thinking how absolutely, ridiculously naïve they were. We all know that there are more postdocs than there are permanent research jobs, and that most of us will have to pack up our ‘transferable skills’ in a little knotted handkerchief and venture out into the big wide world. But this enthusiastic ‘You can do anything you want with a PhD!’-mentality doesn’t help anyone. It’s right up there with patting a five-year-old on the head and telling her that of course she can grow up to be an astronaut if she Just Dreams Big Enough.

It’s the science journalist suggestion that bugs me the most because I hear this from students. All. The. Time.

“What are you going to do after your PhD?”
“Oh, you know, I’ll probably just go into science journalism.”

No experience, no training, no particular interest in science communication. But, for some reason, there seems to be a prevailing attitude among a worrying proportion of scientists that having Dr in front of our name somehow qualifies us to hop out of the lab and easily pinch someone else’s hard-won career. It might be our plan B, but it will do. And I worry that it makes the rest of us look like dicks by association.

This is my big problem with PhD training.

Universities are churning out all these slightly entitled 25-year-olds with no idea of how the real world works. Students are paying more and more for their undergraduate courses, and the teaching is becoming increasingly structured with teaching fellows taking the lectures instead of researchers. To me, it feels a bit like we’re spoon-feeding people who should be able to learn independently by this point in their life. Then some start a PhD and a small minority never pause to consider that maybe they should stop thinking of themselves as students and start acting like adults.

No one really fails a PhD—I've seen far too many poor students saunter through their vivas with no problems after spending 3 or 4 years treating their PhD like a hobby rather than a professional job. And this devalues the PhD for everyone else. With so many of us wanting to use our skills in other careers, I'd kind of like it to represent the pinnacle of scientific education and not become an esoteric qualification unworthy of respect.

It's hard enough for the best and the brightest scientists to secure fellowships or lectureships, so why are we letting people continue wasting their time and money on a pointless PhD that won't help them become a scientist and hasn't taught them anything they couldn't have learnt better in the workplace?

There is no grading for a PhD but maybe there should be. Or maybe the standards just need to be higher and more consistent. Would I have passed if this was the case, or even got a project in the first place? I like to think I'd have risen to the challenge but we will never know. 

But what does this bitter little rant have to do with me leaving science?

At the end of the day, it’s not the poor job prospects and uncertainty that got me (although that didn't help). No, it’s being part of a system that spews out more and more PhDs despite knowing that there just aren’t enough jobs, then tells us that we can do anything we want with our little qualification and starts again with the next batch of naive wannabe scientists. Throw enough people at Science and a few will stick. Everyone else? Transferable skills!

It lets everyone down—the students who don’t have a clue and the postdocs who become demoralised at the thankless task of mothering adults who don't know why they’re doing a PhD in the first place. Science and scientists are complicit in a system that screws over postdocs in more than one way and it's shit.

So I'm going to take all my 'transferable skills' and find a job that makes me happy instead of frustrated; challenged instead of used; that respects me for the things I am good at instead of treating me like a disposable scientific thinker, broken equipment tinkerer and exhausted nursemaid.

I’ve always felt like leaving science and starting something new would feel like I’d failed. And I guess this is part of the reason why I’m jumping before I am pushed. But, now that I’ve told my boss that this postdoc is it for me, I feel inexplicably happy. I have no idea what I will be doing after Christmas, and it’s going to be awesome finding out.