Triangulate That Ish

Over the last couple of days I have used critiques of “Lower Ed” as an opportunity to talk more about how this project came about, plus a few other tangential issues.

Today, I thought it would be useful to talk about something I hinted at on Twitter in my discussion of generalizability, given Carrie Wofford’s critique that my data are not generalizable and my argument overgeneralizes.

In “Lower Ed” readers see the big story of what is really a collection of five research projects: a document analysis of SEC documents and legal cases; participant observation at nine for-profit colleges; an autoethnography of working as an enrollment officer in two for-profit colleges; a network analysis of for-profit college leaders; and interviews with for-profit college students.

It is easy to assume, as Carrie does, that my analysis draws broad claims only from interview data or participant observation. That is the bulk of the data that I present in the book because it is the most novel and interesting, from a creative standpoint. Almost no one wants to read a book that’s just me tracing 19 codes through 400 or so pages of SEC documents, even though that analysis was critical to my final conclusions. That’s why a project that took me almost four months to complete shows up in the book as barely half a paragraph and a footnote. Trust me, it killed me to do it, but that is what book writing is (as I have learned, painfully).

But this is a good moment to reflect on the importance of data triangulation.

I have already discussed how important site selection was to my study. I tell my students that in qualitative work, your site selection works very much like controls in a quantitative study: it delimits the data in a way that is deliberate and important to the analysis. When I chose Atlanta and not Charlotte or Chicago, that had implications for my data. My job was to try to know as much about those implications as possible so that I could interpret the data as best I could.

Further, in any kind of qualitative study it is important to identify and exhaust all available sources of data triangulation. I am going to borrow, with credit, this graphic from Saad Aqueel and R Campbell to depict some basic conclusions in the field of qualitative research regarding triangulation:

[Graphic from Saad Aqueel and R Campbell depicting data triangulation in qualitative research]

The other way to think about this method is the art of “disciplined case study design” (that’s me giving you google search terms).

What you are essentially trying to do is get your arms around the context of a site of inquiry. In my case, I was trying to get my arms around the social problem of risky credentialing processes.

That meant understanding the legal context of financial disclosure forms (for which I took an independent study in the business school), the social context of higher education choices, the lived experiences of college choice, the organizational structuration of choice, the system of higher education, some governance structures and economic processes.

Interviews cannot provide you that context. People rarely reflect on their social location. Trust me, it would be much easier if they did. But they don’t. And if they do? It’s a research red flag. Probe, probe, probe. Because people don’t often reflect on their social location, triangulation is critical to data integrity. It cannot make a qualitative study like mine “generalizable”, but that isn’t the point. The point is depth, not breadth. For depth, triangulation is a kind of check on respondents’ experience of the world and a check on ideological renderings of those experiences.

For all that important context, you have to investigate the structure of experiences.

This graphic gives an idea of how to do that.

That data triangulation suggested that while my interviews were not “generalizable”, my site selection did emphasize generalizable patterns in the expansion of for-profit credentials: basically, it emphasized the status groups from which students came, the organization of their choices, and the political economy of those choices.

Had there been any reason to explore, say, the structure of the Department of Education, I would have done so. But that need did not emerge from theory, data, or the literature on the subject. ED’s structure would be critical to understanding governance, for example, but instead, some of the traces of ED decisions and processes showed up in my data triangulation: students experienced things like job placement data in ways consistent with their social contexts. Whether that is the intent of how those regulations are designed isn’t the point of this study. But it does provide something useful for social policy folks: people aren’t experiencing job data the way you think they should. Maybe you might care about that.

For my purposes, data triangulation informed how I recruited, how I chose my site, how I analyzed data and ultimately how I came to understand Lower Ed’s big story.

What’s the big story? Few people have summarized it as well as Matt Reed, so I’ll just rip it from him:

McMillan Cottom worked as a recruiter for two for-profit colleges before going to graduate school in sociology, and some of the more vivid parts of the book draw on her own time in those roles. (I worked as both faculty and, eventually, administration at a campus of DeVry from 1997 to 2003. Based on that, I can attest that much of what she describes rings true.) Working closely with students there, and later interviewing them for her research, she found that they weren’t the clueless rubes that the “predatory” narrative suggests. In fact, the area of most rapid growth in for-profit higher ed in the 2000’s was graduate degrees, drawing almost entirely on students who had attained bachelor’s degrees in traditional settings. If those students are witless, we have a much larger problem.

But they’re not. Instead, they’re up against an increasingly unforgiving political economy in which all manner of risk has been shifted onto employees (and prospective employees).  The mid-century model had colleges providing broad education, and companies providing specific training. That made some level of sense when both employers and employees expected workers to stick with a single company for decades, if not for an entire career. Now, companies hire and shed workers much more quickly, and entrepreneurialism — or what she calls “the hustle” — has become a de facto requirement for survival. The costs of training have been displaced onto the worker, or the prospective worker.

For-profits embody “the hustle,” and adapt well to it. McMillan Cottom applied as a student to several for-profits in the course of her research to see how they’d treat her, and contrasted their methods to the multi-step process her alma mater used. As she put it, “the enrollment process I experienced at for-profit institutions never once assumed that I had been cultivated to navigate a complex bureaucracy.” (126) Compared to the low-touch, DIY approach that most community colleges use for admissions, for-profits offer a sort of concierge service that walks the student through the paperwork and assumes that the student has neither the time nor the taste for hoop-jumping.

That’s the big story and triangulation is how I got there.
