Tuesday, 11 September 2012

Scientists must talk to people other than scientists!

(This piece appeared in substantially the same form in the BSA magazine but the web version has now been taken down so it is archived here.)

I have been sharing my scientific proclivities, in public, since I was 13. My early enthusiasm for science was ignited by space exploration, the rise of micro-electronics, and the promise of unlimited energy from artificial suns. It has remained part of my everyday work ever since: in industry and further education; in schools, prisons and drop-in centres for disabled people; at public events and evening classes; in engineering firms, and at the Workers’ Education Association.

I didn’t apply for my first research job until 2001. It was immediately obvious that many academics, whether simply disinclined or already stretched by the increasing emphasis on undergraduate teaching, saw engagement with the wider public as a mere side-show.

Of course researchers know that the funding councils and big industries don’t print the money that funds research. They also know that we live in an open, connected society. So any attempt to ignore where the funding bodies get their money is, as Professor Brian Cox put it at a recent British Science Association conference, ‘myopic’.

You cannot blame academics for being short-sighted. We have had decades of short-term contracts, the pell-mell pursuit of scarce posts via a good publication record, and increasing pressure to secure funding, all piled on top of the demand for excellence in undergraduate teaching. And anyway, when I emailed a colleague recently to ask for someone to represent his research group at a university-sponsored public event, he said one of his postdocs might be willing, but that it was ‘outside her job description’.

He is absolutely right; it is. The myopia is, by omission, part of the contract.

I am lucky to be working alongside senior colleagues who can see that there is value in my continuing outreach activities. When the new Cognition Institute at Plymouth University came into existence, I successfully applied for the first research fellowship at the university to incorporate an explicit public engagement remit. For me, at least, it is inside my job description. I regarded this as a small victory, despite the short-term, part-time contract.

Seven years ago I sailed too close to the enchanted islands of the public engagement community and, with no one to tie me to the mast, was lured onto the rocks by the sirens (they were called Sharon, Gina, and Timandra). I took part in a science communication competition called FameLab and reached the final, which led to my first invitation to speak at the Cheltenham Science Festival.

This opened my eyes to many more disparate routes through which academics can develop ideas and cooperate on projects: national competitions, open-mic events, citizen science, and many others. If we are to reach the widest audience, it is essential that the projects we support are diverse and inventive.

Not all academics will want to get involved in any of these, but I have been surprised by how sceptical, or dismissive, many are of their value. This is particularly true of my regular support for science and maths in primary schools, which I was recently told is ‘pointless’. There is certainly a lot of work yet to be done.

Science needs to foster a joint enterprise with the society that funds it, and which benefits from its work. When I say this out loud I still tend to receive blank looks and awkward silences. This isn’t just about publicity for your research, getting your face in the media, building your CV, or meeting a grant deliverable.

If you believe that democracy is strengthened when the people who vote understand the issues, then it is a matter of citizenship. Only the research community can take responsibility for this, and as a result universities must commit to taking a leading role.

Thursday, 12 July 2012

Epigenetics - new hair, new legs, new science.

Many of the most powerful and elegant ideas in science don’t address why, or how, things happen. For example, Newton realised that under perfect conditions, if you push something, like a car, it speeds up, and if you push it twice as hard it speeds up twice as quickly. If you push a bigger car, one with exactly twice the mass, and apply exactly the same force, it speeds up exactly half as quickly. This relationship is so precise and so famous that it is usually just called “the Second Law of Motion”, and it has a role in everything from building bridges to planning missions to Mars.
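The proportionalities described above can be checked with a few lines of arithmetic; here is a minimal sketch (the masses and forces chosen are purely illustrative):

```python
def acceleration(force_newtons, mass_kg):
    """Newton's second law, rearranged: a = F / m (in m/s^2)."""
    return force_newtons / mass_kg

# Push a 1000 kg car with 500 N of force.
a1 = acceleration(500, 1000)

# Push twice as hard: the car speeds up exactly twice as quickly.
a2 = acceleration(1000, 1000)

# Same force on a car with twice the mass: half the acceleration.
a3 = acceleration(500, 2000)
```

The doubling and halving fall straight out of the division, which is exactly the precision the law is famous for.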

However, the laws of motion don't tell us why mass and acceleration should both be related to force in this simple way. (Not to mention why objects have mass in the first place, how an object can be accelerated by gravity without being touched, and a host of other difficult questions.) Newton himself thought that speculations about such things “have no place in experimental philosophy”. Despite this, physicists have tried to answer these more difficult questions and have made a lot of progress.

Newton might not have been interested in why or how things happen, but most of the rest of us are. The answers we come up with may not be as elegant or precise as the Second Law, but they make a real difference. So scientists spend most of their time attempting to emulate Dorothy when she pulls back the curtain to reveal the old man behind all the apparently magical goings on in Oz: “Aha! That’s what makes it all happen!”

Take genetics as an example; from its beginnings in the mid-nineteenth century there has been an increasingly powerful sense of the curtain being drawn back and the machinery of life being revealed. As a result we are all now familiar with the idea that our genes come from our parents and contain some sort of “blueprint” which determines how we are put together. This idea is clear enough. But all our cells contain exactly the same instructions, and although we start out as a ball of essentially identical cells, we do not, of course, end up that way.

Cells all develop differently: they differentiate, becoming heart cells, brain cells, or whatever type is needed, and they pass on this specialism, this memory of the sort of cell they are, to their offspring. This inheritance cannot be genetic, because the genetics of every cell in the organism is identical. The mechanism that does this lies outside genetics; it is epigenetic. If genetics is the study of the cards you are dealt, then epigenetics is the study of how the hand is played out.

Gardeners know that new plants can be generated from small cuttings. This is because the new cells produced as the cutting grows take on new roles, in fact all the roles necessary to make a whole plant. In contrast, most animal cells remain true to their epigenetic inheritance, and so animals have only limited abilities to regenerate. However, in some amphibians the cells near a limb amputation have their differentiation switched off. As a result, embryo-like development starts over again and a new limb grows, complete with bones, muscles, and nerve connections. Never mind Newton, people really want to know how this works.

In humans (and mammals in general) regenerative abilities are rare, but one or two examples are worth noting. The nerve connections that allow mammals to smell are, unusually, determined to re-grow after all sorts of injury and abuse, and they often succeed. There is also a strain of laboratory-bred mice that can regenerate in remarkable ways; for example, large holes punched in their ears do not stay open but heal, not with scar tissue but with supple new cartilage and skin complete with regenerated hair follicles, thus offering hope to middle-aged men everywhere.

A 2012 paper in the journal PLoS One (Tyrka et al., 2012) is one of many to take epigenetic investigations much further. It presents evidence that epigenetics may help explain why people who are affected by maltreatment, abandonment, or bereavement in childhood are at higher risk of mental illness as adults. Parts of the genetic code are, it seems, epigenetically switched off as a result of what the authors call “Disruption or lack of adequate nurturing”. Notice again that we are interested in why this link exists, not simply that it exists.

From growing new nerves and limbs, to interventions that prevent long-term traumatic stress blighting the lives of millions affected by war and disaster, epigenetics promises to be a vital part of the story. It is a science in its infancy, and progress may initially be sporadic and disconnected. But we really do want to know how it works.

Tuesday, 6 March 2012

Discovery is messy, we mustn't try to clean it up

Albert Szent-Gyorgyi (a Hungarian whom hardly anybody has heard of) was recognised with a Nobel Prize in 1937 as the man who discovered Vitamin C. In fact this is a typically messy scientific story. The discovery, in reality, relied on contributions from dozens of people going all the way back to James Lind, another forgotten man.

In 1747, almost 200 years before the work by Szent-Gyorgyi, Lind showed that scurvy could be prevented, and cured, by fruit juice. (He used lemons; the Royal Navy later relied on limes, which is the origin of the American term for the British: ‘Limeys’.) Apart from this, Lind, who was a British naval surgeon, also has the distinction of being the man who conducted the first scientifically valid controlled clinical trial in history. He did this precisely in order to get the results concerning scurvy. It is a turning point in our civilisation which deserves as much credit as the work itself!

Along the way another key contribution came from Axel Holst and Theodor Frolich (circa 1900, really obscure guys), who discovered, by accident, that guinea pigs suffer from scurvy if their diet lacks fresh fruit and veg, just like people. Most mammals cannot get scurvy because they make their own Vitamin C. With this discovery it became possible to study scurvy in the lab, and it is probably the origin of the term ‘guinea pig’ for a test subject in an experiment.

There was a final double-fuddle in this story. First, the confirmation that the cure for scurvy had been positively identified used materials prepared and supplied by Szent-Gyorgyi to another lab. This work was published without giving him any credit, which led to serious bad feeling on all sides. Then the Nobel committee awarded the 1937 Nobel Prize for medicine to Szent-Gyorgyi alone, without crediting any of the people who actually did the animal studies! A mess all round.

I supply this example (one of many I could have chosen) to illustrate the folly of the narrow-mindedness currently prevalent in the funding of research. If we legislate against a diverse range of apparently unconnected projects, competing laboratories, work that seems to have no use at the time, patience, and dumb luck, science will not be streamlined; it will be hobbled.