Messy research methods

I did my PhD under “non-traditional” supervisory arrangements.  This is a euphemism meaning that things didn’t work out between my assigned supervisor and me, so I had to carve out my own path.

One by-product of these non-traditional arrangements was that I wrote my thesis by publication. I didn’t have a supervisor to guide me through writing a traditional thesis and provide feedback on my written work, so instead my thesis took the form of three published articles.

When I came to put the three articles together, I was advised that I needed some sort of thread that linked them together. Easy! That was me, of course. I was the thread that united all the elements of my research.

Unsurprisingly, my PhD research was a circuitous adventure. I fell into an existing project and pursued it eagerly, but thoughtlessly. I was researching by brute force, hoping if I worked hard enough, something interesting would fall out.

Eventually it did. I was excited that my first revelation (my first paper) raised more questions than it answered. My early results were tantalising but equivocal. Having spent years in the field and in the lab, I was no closer to understanding the climate processes that were pertinent to my study area.

And so I leapt to the next paper. I used vastly different approaches to test some of the assumptions I made in my first study. I was eventually able to show that in some regions, prevailing interpretations of past climatic change were not necessarily the most accurate.

Mid-PhD, I proudly presented my results at a conference. I was approached after my presentation by a researcher expressing relief that his high-latitude field sites provided comparatively simpler study areas than the tropics I had just discussed.

Uh oh. I hadn’t thought this necessarily followed from my presentation. And so I leapt onto my third paper.

But of course, it’s not that easy. Looking at my nearly finished thesis, it was evident to me that what defined my research was me.

But a thesis, it turned out, couldn’t carry a thread defined by me, the researcher. Instead, I retrospectively fitted order to my research. Hypotheses were posited, objectives met. Each piece became a carefully constructed study that fed seamlessly into the next.

I was frustrated. I hated the finished product and I still hesitate to drag my thesis off my bookshelf.  Rather than creating something more meaningful and useful from my research experience, the sanitised version seemed less valuable.

I recently started John Law’s After Method, a book about mess in social science research. Here, traditional method is described as a system for offering bankable guarantees that guide us quickly to our destination.

But the danger in adopting conventional, risk-averse approaches is that they give themselves over to mechanical replacement. That is, research becomes automated.

Law suggests that social researchers unmake their methodological habits. Rather than craving certainty and the expectation that we can arrive at more or less stable conclusions about the way things really are, we broaden, subvert and remake method.

We divest ourselves of our distracting concern with these hygienic, sanitised approaches and instead embrace multiple, diverse and uncertain methods.

In practice, embracing diverse and uncertain scientific approaches may look little different from what we do now. After all, I did it during my PhD, but never had space in which to talk about it.

As a starting point, it might be useful to talk about a recent Science article on the ongoing difficulties in coaxing researchers to share negative results. Researchers and journals alike tend to cast their results as a story they believe others will read. Negative findings are left unpublished and equivocal findings inflated, with words selected to emphasise the importance of a result (an ‘alarming’ increase in obesity, rather than a ‘modest’ one).

In this article, Daniele Fanelli, who studies bias and misconduct at the University of Montreal, argues “the only way out of this [is that] people report their studies saying exactly what they did.” We need a shift in our cultural norm, embracing and sharing null results without regret or hesitation.

I don’t expect we will soon see climate science PhD theses in which short ethnographies detailing the experience of research connect a series of peer-reviewed papers.

But embracing uncertainty, diversity and indefiniteness within research could save us a lot of time and effort, as well as encouraging high risk but potentially high reward research.

Law comments that method isn’t just a set of techniques or a philosophy. Indeed, researchers are their methods: method goes with a way of working and a way of being.

As a scientist, I participate in the social world, being shaped by it and shaping it in turn. I don’t want to give myself over to mechanical replacement. Surely there is room for this to be reflected somewhere?
