The Craft of Research and Salsa Dancing

Note-taking for Principles of Empirical Research with Catherine Voulgarides

The Craft of Research by Wayne Booth, Gregory Colomb and Joseph Williams is a sober and traditional guide to humanities scholarship.

The Craft of Research

The authors begin by asking: why do research at all? For one thing, you can best evaluate the research of others if you understand the messy process that led to its creation. You can read around what’s presented on the page, knowing that it reflects both the questions asked and not asked, and the way the results were shaped with a particular audience in mind. How do we know what research is trustworthy if we don’t have firsthand knowledge to compare it to? This is an especially pertinent question in the age of Trump. Ultimately, we have to trust the researcher not to mislead us, intentionally or otherwise.

The authors mention that the only audience for most student research is their professors. (That makes me sad, which is why I post all of my writing online.) But even without an audience, writing is still worthwhile, because it helps you remember, understand, and gain perspective.

Writers have to create an imagined reader, a user persona in UX terms. We enter into a social contract with readers: they give us attention, and in return, we resist the urge to info dump and ramble. We offer to tell them something interesting, or to solve a practical problem, or to propose an answer to a burning question.

The authors’ advice on working in groups is written as if the group members all have a lot of time and attention to devote to the project. The scholarly writing I’ve done so far has been undertaken in a hurry, squeezed in around everyone’s kids and jobs and such. But maybe the situation is different when you’re not collaborating with a bunch of adjuncts and grad students?

Booth et al. urge you to defeat the anxiety of being a novice by writing as you go. This is another reason why I love blogging. By the time I get to the “real” writing, I’ve already done versions of it repeatedly in less formal language. You can’t just write everything up after you’re done researching, because you may not know exactly what it is you’re researching until you’re writing about it.

Research can solve a practical problem, or a “research problem,” incomplete knowledge or flawed understanding. It’s a challenge to find research problems with the urgency of practical problems. I’ve resisted academia for a long time because of this difficulty. One sense of the word “academic” given by my computer dictionary: theoretical, conceptual, notional, philosophical, hypothetical, speculative, conjectural, suppositional; impractical, unrealistic, ivory-tower. I’d like to avoid doing that kind of research. I guess the answer is to take a cue from Miles Davis and constantly ask: so what?

For a very different guide to research, we were also assigned Salsa Dancing Into The Social Sciences by Kristin Luker. I like the title. Sadly, she doesn’t literally mean you should conduct research by salsa dancing. It’s more of a shorthand she uses for hip, nontraditional scholarship, as opposed to the canonical quantitative sociology she was trained in. Luker refers to her tribe of scholars as “salsa dancing sociologists.” That’s way too awkward for me, so I’ll just say “we” instead.

Salsa Dancing Into The Social Sciences

Luker begins by pointing out that asking a student to “review the literature” means something very different now than it did when the literature had a finite and manageable size.

Canonical social science tries to make predictions. Luker’s hip sociologist tries to make discoveries. We are not making comparisons between categories of people; rather, we are letting new categories emerge from our investigations.

Luker observes that the sole task of the social sciences is to investigate how people make meaning out of their surroundings, and how they are constrained or supported in doing so by the invisible social world. We aren’t equipped to build comprehensive theories about people drawn from random samples. We don’t know in advance who we’re studying exactly; we have to discover the parameters of our sample in the course of the research itself.

If we aren’t using data from a statistically significant random sample, can we still generalize from our observations? Or are we limited to doing slow-paced journalism on populations of one or a few people? Luker is confident that while we can’t generalize statistically, we can still extrapolate logically. We can make valid inferences by ascertaining which social features are important and which aren’t, by analyzing the categories we arrive at, and extending our inferences from there.

How can you turn a research interest into a research question? For Luker, a research question has four characteristics: it proposes a relationship between or among concepts; it uses that relationship to explain something significant about social life; it admits a range of possible answers that we can examine to see which one best fits the data; and it advances broader scholarly discourse. Journalists report the facts, but sociologists have to explain the facts using some kind of theory (however intuitive that theory might be).

We don’t necessarily sample from huge data sets to do intuitive sociology, but we still sample. There’s a perfect analogy to musical sampling. We weed through a huge body of data and try to isolate a few little pieces of interest. Then we scrutinize those pieces, combine them in new ways, extend and expand on them, and use them to build something new. This is just as true for James Brown loops as it is for sociological theory.

Our sampling might be intuitive, but it still has to be valid. We need to find a case (or a set of cases) that’s reasonably representative of the larger phenomenon we’re investigating. This is not the same as the way that canonical sociology looks for samples that are representative of the larger population. We can build our sample using a tacit control group, or a formal and theoretically driven comparison. There’s a risk here of coming up with a theory and then hunting for a group of people who confirm the theory, but that’s a risk in quantitative research too. The best we can do is to wear our preconceptions on our sleeves.