I’m futzing with a new site theme. Excuse the mess.
What you may not know about me is that, once upon a time, I went to graduate school fully intending to focus my efforts on Irish literature. I had the opportunity to meet Seamus Heaney on a couple of occasions, and to hear him read several times, both in the States and during my semester abroad in Dublin. So it was with some sadness that I learned this morning that he had passed away.
Here is what you don’t know. Some of this has to do with Heaney, some more of it with Seamus Deane (who was a visiting professor at Carleton one term and from whom I took a course), but most of it has to do with Irish literature in general. As an English major in college, I took plenty of literature courses, and all of that literature was mediated through the printed page, of course. And the page, it flattens things. Birth and death dates follow the names of the writers included in our Norton anthologies, important numbers, but ultimately pretty meaningless to a 20-year-old. When I started studying Irish literature, it was a bit newer, certainly, but as it stretched to the current day, to authors still living, and later to authors that I was meeting, literature changed a bit for me. This is the analogy my undergraduate mind devised: as I started out, the texts I was reading were like stars in the night sky, bits of brilliance against a much vaster sea of dark. What Irish literature did for me was to flip that metaphor on its head–I began to see literature itself as the field and the text-stars as intensities rather than disconnected objects.
A big part of that was that the writers I met in Ireland all knew each other, and wrote both for and to each other. It may seem obvious to me now, but they were part of an ecology, a network, a community, and something was lost when you read them in isolation from the others. They weren’t writing in isolation and so they changed the way I read–little wonder that when, in a few years, I encountered what rhetoric and composition had to say about the myth of the isolated, originary writer, I was already primed for that work to resonate with me. And while it may be a stretch from the outside to connect the 20-year-old me stepping out of a Dublin pub with the me who’s focused on ecologies and networks for the past several years, for me it’s always made perfect sense.
So, below the fold, is Heaney’s “The Ministry of Fear,” written for Seamus Deane:
Apropos of a couple of different Twitter conversations yesterday, I found myself thinking last night about just how much our tiny, academic corner of the media landscape has changed in the past five or ten years. The first such conversation involved a simple request for resources–someone was looking for an article on a topic (archives) that could be included in a syllabus–and the second was about citation, and whether a particular writer had engaged with the scholarship in a specific discipline. Depending on the Venn diagram of our following, you may have seen one or both of these. I’m being purposefully vague here, though, because the details here matter less to me than the fact of the conversations themselves.
I entered my PhD program right about the time that Mosaic was making the transition to Netscape–the latter was released during my first semester, although I wouldn’t know it for another year or two. Here’s how long ago this was: unless you specifically requested otherwise, the school put the last four digits of your Social Security number in your email username–that’s how oblivious we/they were at the time. Before I devolve here into a combination of tech history and Abe Simpson stories, I’ll just say that it was a different time.
The availability of Facebook and Twitter (and any number of other sites), and the presence of fairly usable journal databases, are in the process of changing much of what it’s meant to be an academic. Those changes are happening in fits and starts, and they’re unevenly distributed, but I think they’re slowly shifting our expectations for ourselves, in ways that will only become more apparent over the next decade. It wasn’t so long ago that “searching the journal database” would have been a euphemism for going to my office and thumbing through back issues. And the database was limited strictly to whichever journals you and your colleagues had subscribed to.
And that’s to say nothing of trying to access journals or topics from other fields; tracking an idea or topic across multiple disciplines would have required such effort as to make it practically impossible. When I was in graduate school, I used to use the campus bookstore that way–I would walk the stacks and look at the texts required in other programs’ graduate courses, to see what sounded relevant or interesting. Now? I follow folks from any number of disciplines on Twitter, I read their blogs, and I trace back their citations online. Part of the reason for this, of course, is that I have certain interests–technology cuts across disciplines–but the conditions of possibility for my interests have emerged alongside social media.
If there’s an interesting side effect to this, it’s that having access to a broader swath of scholarship changes our assumptions. For those of us who use these platforms, the boundaries among disciplines feel much more porous and arbitrary, such that interdisciplinarity or transdisciplinarity becomes our default. So in the case of the second conversation I refer to above, part of me wonders how much of “citation politics” is being driven by these platforms. That conversation was not an isolated occurrence–with the emergence of digital humanities, there have been times when I felt myself resenting the fact that its history is defined almost exclusively in terms of “humanities computing” rather than “computers and writing.” There’s also a long history of consternation about the import/export ratio in writing studies–why do “we” draw on “their” ideas but “they” don’t return the favor? And so on.
This isn’t to say that, with a wave of the Twitter wand, everyone everywhere should now be aware of everything. But I do think that social media are rapidly simplifying what used to be insurmountable issues of material access, and as a result, our expectations for intellectual access are changing as well. I’m fairly sure that this is a positive development–when I was young, the only way your work would be read outside of your home discipline was if you were lucky enough to transcend your discipline and become a superstar. This probably isn’t the right term for it, but I think of this as “vertical interdisciplinarity.” I think what we’re seeing emerge now is a more “lateral interdisciplinarity,” where only a couple of degrees of separation are required to search in other fields–if I know someone who knows someone, that’s all it takes. And that “knowing” can be something as simple as following a blog or a couple of people on Twitter. A lot of my own interest and work in network studies comes less from superstars and more from finding places where people in other fields are sharing their syllabi and writing occasional blog posts.
But we’re in a transitional time, and that means that not everyone is working this way, even if we believe that they all could be. The assumption that we are all equally visible to one another sounds a little absurd, but I catch myself at it all the time. I know about you; how can you not know about me?! Between the defunding of academic presses and the proliferation of social media platforms, I find myself having to think more about whether and how visibility is my own responsibility. At the very least, this raises issues of how our notions of audience are changing.
I was about to post this when it occurred to me that this could be read as a kind of “blame the victim” argument with respect to any kind of disciplinary asymmetries–that I could be taken to mean that if a particular group of people isn’t adequately represented in a conversation, it’s somehow their fault for not working hard enough to make themselves visible. That’s definitely not my point here. I don’t think these changes alter our professional ethics in that regard–if anything, they should (ideally) make them simpler to achieve. It should be easier to assemble a broader range of citations, course readings, and/or keynote speakers–to my mind, there is even less excuse for homogeneity. A second, related point is that, at a time when open access challenges the model of scarcity upon which many of our organizations have built themselves, I think visibility and aggregation are services that those organizations should work harder to provide.
I’ve got more to say, but also syllabi to finish. So that’s all for now.
Next spring, I’ll be teaching our grad program’s DH course for the second time. While not a complete loss, the first time I taught the course was affected in no small measure by the fact that I had major surgery just prior to the start of the semester. It turned out that debilitating pain and medication-hazed convalescence were not especially conducive to course planning.
So in many ways, then, it feels like I’m really teaching the course for the first time. I’m going to paste below my preliminary outline for course readings, for which I would especially welcome your feedback. It’s not really close to done yet, but I feel like I’ve got enough now that I can finally move on and plan my fall courses (!!). I still have several layers of research to do (bookmarks, Instapaper, fave tweets, TOCs, etc.) before this will feel finalized. So if you have any suggestions for readings, please share them here, or drop me a note–this will be an ongoing process throughout much of the fall semester.
A couple of additional notes: In addition to weekly readings, I expect that I’ll ask the students to look at 1-2 online projects a week–I have a huge list of possibilities, but I haven’t sorted through them yet to match them up with readings and topics. We’ll also be spending time each week in the lab working with various tools–again, big list that needs sorting and matching. Finally, I’ll be hosting a more dynamic version of the syllabus at http://rcdh14.wikispaces.com/. The static version of the syllabus will be available both there and here once I’m closer to finalizing.
Oh, and there are only 13 weeks because we’ll lose one to CCCC, I think, and because I always leave a week open for flexibility’s sake.
CCR 733: Rhetoric, Composition, and Digital Humanities
This is not required, but I highly recommend you read either or both of the following over winter break–if you are new to DH, both books provide a nice introduction to some of the issues we’ll be discussing:
There are no required readings for our first meeting, but I do recommend setting up some sort of blogspace (WordPress, e.g.) in advance, as well as a Twitter account if you don’t already have one. We will spend some time discussing this site, the role that blogs and Twitter will play in the course. If you have questions about this prior to the actual course, feel free to contact me.
David M. Berry, “Introduction,” Understanding Digital Humanities
Anne Burdick, et al., “A Short Guide to the Digital Humanities,” from Digital_Humanities (PDF)
Alan Liu, “The Meaning of the Digital Humanities,” PMLA 128.2 (March 2013): 409-423.
Willard McCarty, “The Future of Digital Humanities is a Matter of Words.” from A Companion to New Media Dynamics (PDF)
Part I of Debates in the Digital Humanities, “Defining the Digital Humanities” (available online)
Andrew Abbott, “The Chaos of Disciplines,” from Chaos of Disciplines (PDF)
Part III of Debates in the Digital Humanities, “Critiquing the Digital Humanities” (You should at least skim this)
Charles Cooney, et al. “The Notion of the Textbase: Design and Use of Textbases in the Humanities.” LSDA
Sharon Daniel, “The Database: An Aesthetics of Dignity.” from Database Aesthetics (PDF)
Ed Folsom, “Database as Genre: The Epic Transformation of Archives.” PMLA 122.5 (Oct 2007): 1571-1612 (includes responses).
Christiane Paul, “The Database as System and Cultural Form: Anatomies of Cultural Narratives.” Database Aesthetics (PDF)
Geoffrey Sirc, “Serial Composition.” Rhetorics and Technologies (PDF)
Collin G Brooke, “Databases, Data Mining” and “Personal Patterns: Mapping and Mining” from Lingua Fracta (PDF)
Casey Boyle, “Low-Fidelity in High-Definition: Speculations on Rhetorical Editions.” RDH
Tarez Samra Graban, Alexis Ramsey-Tobienne, and Whitney Myers, “In, Through, and About the Archive: What Digitization (Dis)Allows.” RDH
Michael Neal, et al., “Making Meaning at the Intersections.” Kairos
Liza Potts, “Archive Experiences: A Vision for User-Centered Design in the Digital Humanities.” RDH
Jenny Rice and Jeff Rice, “Pop Up Archives.” RDH
Week 5: Metadata
Tarez Samra Graban, “From Location(s) to Locatability: Mapping Feminist Recovery and Archival Activity Through Metadata.” College English
Kieran Healy, “Using Metadata to Find Paul Revere.”
Richard McNabb, “Making the Gesture: Graduate Student Submissions and Expectations of Journal Referees.” Composition Studies, 29.1 (2001): 9-26.
Geoffrey Nunberg, “Google’s Book Search: A Disaster for Scholars.” CHE, August 31, 2009.
Jessica Reyman, “User Data on the Social Web: Authorship, Agency, and Appropriation.” College English 75.5 (May 2013): 513-522.
Jentery Sayers, et al., “Standards in the Making”
Week 6: Algorithm
Kevin Brock, “One Hundred Thousand Billion Processes: Oulipian Computation and the Composition of Digital Cybertexts”
James Brown, “Making Machines”
Paul Eggert, “Text as Algorithm and as Process.” from Text and Genre in Reconstruction (PDF)
Stephen Ramsay, Reading Machines: Toward an Algorithmic Criticism
Mark Sample, “Hacking the Accident”
Don Foster, from Author Unknown
Matthew Jockers & Julia Flanders, “A Matter of Scale”
Seth Long, “Text Network and Corpus Analysis of the Unabomber Manifesto.” Feb 12, 2013
Geoffrey Rockwell, “What is Text Analysis, Really?” LLC 18.2 (2003): 209-219.
Greg Urban, “The Once and Future Thing.” from Metaculture (PDF)
Patrick Juola, “Rowling and ‘Galbraith’: An Authorial Analysis.” Language Log, July 16, 2013.
Ben Zimmer, “Decoding Your Email Personality.” NYT, July 23, 2011
Ben Zimmer, “The Science that Uncovered J.K. Rowling’s Literary Hocus-Pocus.” WSJ, July 16, 2013.
Journal of Law and Policy Symposium on Authorship Attribution (PDF available at site)
Carol Berkenkotter & Thomas Huckin, “Conventions, Conversations, and the Writer”
David Hoffman and Don Waisanen, “At the Digital Frontier of Rhetorical Studies: An Overview of Tools and Methods for Computer-Aided Textual Analysis.” RDH
Dan Wang, “Is There a Canon in Economic Sociology?” Accounts 11.2 (2012): 1-8.
Issue 2.1 of the Journal of Digital Humanities on Topic Modeling
Matthew Jockers, Macroanalysis
Nelya Koteyko, “Corpus-Assisted Analysis of Internet-Based Discourses: From Patterns to Rhetoric.” RDH
Franco Moretti, “Conjectures on World Literature.” NLR
Gregory Crane, “What Do You Do With a Million Books?” D-Lib Magazine 12.3 (2006).
Jean-Baptiste Michel et al., “Quantitative Analysis of Culture Using Millions of Digitized Books.” Science 331 (2011): 176-182.
Ted Underwood, “How Not To Do Things with Words,” blog post, The Stone and the Shell, 25 August 2012.
Albert-László Barabási, from Linked and Bursts
Anna Munster, “Prelude to the Movements of Networks.” from An Aesthesia of Networks
James Porter. “Rhetoric in (as) a Digital Economy.” Rhetorics and Technologies (PDF)
Duncan Watts, from Six Degrees.
Tanya Clement, “Text Analysis, Data Mining, and Visualizations in Literary Scholarship.” LSDA
Johanna Drucker, “Graphesis: Visual knowledge production and representation” (PDF)
Johanna Drucker, “Humanities Approaches to Graphical Display.” DHQ
Martyn Jessop, “Digital Visualisation as a Scholarly Activity” LLC 23.3 (2008): 281-293.
Krista Kennedy and Seth Long, “The Trees Within the Forest: Extracting, Coding, and Visualizing Subjective Data in Authorship Studies.” RDH
Manuel Lima, “The Syntax of a New Language.” from Visual Complexity (PDF) (Companion Site)
Stéfan Sinclair, et al., “Information Visualization for Humanities Scholars.” LSDA
Max Black. from Models and Metaphors
Morgan Currie, “The Feminist Critique: Mapping Controversy in Wikipedia.” UDH
Kieran Healy, “A Co-Citation Network for Philosophy.”
Brad Lucas and Drew Loewe, “Coordinating Citations and the Cartography of Knowledge.” The Changing of Knowledge in Composition
Derek Mueller, “Grasping Rhetoric and Composition by Its Long Tail: What Graphs Can Tell Us about the Field’s Changing Shape.” CCC 64.1 (Sep 2012): 195-223.
Anne Stevens and Jay Williams, “The Footnote, in Theory.” Critical Inquiry
Scot Barnett, “Psychogeographies of Writing: Ma(r)king Space at the Limits of Representation.” Kairos
Franco Moretti, “Maps.”
Franco Moretti, “Network Theory, Plot Analysis.”
I am not an historian, nor a member of AHA, nor an early-stage scholar, nor a publisher, nor am I responsible for library acquisitions. But then, the same can be said of plenty of folk who have weighed in on the decision by the American Historical Association to release a statement allowing for (and by implication, perhaps, endorsing) the “embargo” of history dissertations. As Rick Anderson notes (in a Scholarly Kitchen post that provides a pretty strong overview), the AHA “smack[ed] the hornet’s nest.” I follow enough Digital Humanities and Open Access inclined historians on Twitter that this statement, and the furor that ensued, registered substantially throughout my feed. And over the past week or so, the discussion has trickled upwards to the usual suspects (and beyond!) and sideways to other disciplines. At least it has to my own, based on listserv discussions and retweets.
And it should spread, because it’s not just an issue for historians. Times for university presses and for academic libraries are tough all over, and that affects every discipline. As someone who routinely advises late-stage graduate students and untenured faculty, I think that the questions raised by the AHA statement are ones that everyone in the humanities should be thinking about, not just members of that particular organization. For a good cross-section of the various positions and issues, my best recommendation is Open History, a project that began as part of the backlash against the AHA statement, but one that I’ll be watching with interest. They’ve got a pretty thorough collection of the responses to date, and a mechanism for adding others (addressing a weakness of the link round-up posts I’ve seen). I’m thinking about using that “issue” in my DH course for next spring, incidentally.
I have a couple of thoughts, neither of which necessarily addresses the core issues at play in the Statement or the responses to it. The first is that, put simply, I think discussions like these reveal their scale-free status, or at least raise the question of scale. As I’ve been sorting through what I want to do with my network rhetorics project, I’ve been returning to some of the terms and ideas that I’ve been taking for granted, and one of those is this idea of scale. We describe random networks as having scale when you can select any of the nodes and treat them as roughly representative of each of the nodes in the network; in other words, the behavior of a given node “scales up” as typical to the network itself. In a scale-free network, however, you can’t generalize from the behavior of a single node to the network. Easy enough, right?
Whether we treat the system addressed by the AHA Statement as a single network, or see it as a set of overlapping networks of various stakeholders (graduate students, faculty, universities, publishers, libraries, et al.), those networks are scale-free, which makes framing any discussion problematic. Insofar as we can speak of a system here, it’s difficult for me to see any particular node or group of nodes as typical. I don’t think there are many among us in the academy who would be arrogant enough to describe their own writing process, hiring process(es), job histories, departmental and college relationships, or publication records as typical of all of their colleagues–whether in academia at large, in their own field, or even in their own department. Part of how scale-free networks function is iterative–the status of a given network affects its future development. So, for example, highly trafficked websites are more likely to attract additional traffic–our experience of the web isn’t random (or scaled). Each stakeholder in this conversation has a variety of factors to balance, and the ratio among them is a decision that happens locally, based on circumstance, and that ratio necessarily shifts over time.
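(For the curious, the difference between the two kinds of networks is easy to see in a toy simulation. Nothing below comes from the Statement or the responses–it’s just an illustration, with made-up parameters, of the “rich get richer” growth described above: one network attaches each new node to existing nodes chosen uniformly at random, the other prefers already-popular nodes, and the hubs that result look very different.)

```python
import random

def random_network(n, m):
    """Grow a network where each new node links to m existing nodes
    chosen uniformly at random (no preference for popular nodes)."""
    degrees = {i: 0 for i in range(n)}
    for new in range(m, n):
        for target in random.sample(range(new), m):
            degrees[target] += 1
            degrees[new] += 1
    return degrees

def preferential_network(n, m):
    """Grow a network where new nodes prefer high-degree nodes:
    each existing node is picked with probability proportional to its
    degree, so early/popular nodes keep attracting links."""
    degrees = {i: 1 for i in range(m)}  # simplification: seed nodes start at degree 1
    stubs = list(range(m))              # node i appears in stubs degrees[i] times
    for new in range(m, n):
        targets = set()
        while len(targets) < m:
            targets.add(random.choice(stubs))  # degree-proportional pick
        degrees[new] = 0
        for target in targets:
            degrees[target] += 1
            degrees[new] += 1
            stubs.extend([target, new])
    return degrees

random.seed(7)
rand = random_network(2000, 2)
pref = preferential_network(2000, 2)

# Both networks have the same average degree (about 4), but the
# preferential network concentrates links in a few massive hubs,
# so no single node is "typical" of the whole.
print("random max degree:      ", max(rand.values()))
print("preferential max degree:", max(pref.values()))
```

In the random network, the biggest hub stays within shouting distance of the average node; in the preferential one, the largest hubs dwarf it, which is exactly why generalizing from any one node (or any one stakeholder’s experience) fails.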
I know that this sounds painfully self-evident (different Xs behave differently!), but you can see a certain amount of incommensurability creep into the discussion. We can’t speak in generalized terms about the policies of academic presses, so we push for specific evidence. But because those specifics don’t scale, they can never function beyond the level of anecdote, and ultimately make it more difficult to speak in generalized terms. I work in a field where perhaps the best-selling book of all time (Cross-Talk in Comp Theory, now in its 3rd edition) is not published by a university press but a professional organization, 95% of which appeared in print prior to its inclusion in that volume, and is almost entirely available to anyone for free provided they have access to JSTOR. And that tells you exactly nothing about NCTE’s own editorial policies, much less the other presses in our field, much less academic presses in general.
In the absence of representative anecdotes, what can be done? Do arguments like this get to a point where we should just throw up our hands? Not really. I do think, though, that at a certain point, networks of arguments hit a threshold where incommensurability sets in. When that happens, a different kind of argument takes place. And maybe this is my second point: lemons are to lemonade as scale-free arguments are to scale (as in “when life gives you X, you make Y”). While a number of people have responded to the AHA Statement as a particular kind of intervention (endorsing a rapidly deteriorating and increasingly obsolete model of scholarly communication), I wonder if it’s more appropriate to think of it as an attempt to introduce scale into the discussion, to establish some kind of baseline that allows us to say certain things about the typical behavior of that system. Bear with me.
It’s an imperfect analogy, to be sure, but think about professional sports, and the effects achieved through salary caps. It’s still hard to speak of the typical franchise in, say, the NBA (the Celtics and Lakers are not representative, e.g.), but a salary cap mitigates the scale-free quality of the network of franchises, allowing Oklahoma City to compete with Los Angeles, even if the two take very different approaches to team development, talent acquisition, etc. Even though teams may appear to ignore the cap (hello, Brooklyn!), the penalties for doing so are not insubstantial (increased luxury taxes, the inability to sign players for more than the league minimum, etc.), and can’t be maintained indefinitely. The salary cap doesn’t homogenize the league nor guarantee any sort of sustained success for a franchise, but it does level the field in terms of opportunity. There are still strategies and tactics involved, but there’s sufficient scale in the network (we might argue) to allow every team to compete.
Part of the AHA Statement, then, is about scale: all graduates should be able to choose what to do with their dissertations. In this sense, the AHA Statement isn’t so far away from guidelines offered by the MLA regarding the value of digital work:
Institutions and departments should develop written guidelines so that faculty members who create, study, and teach with digital objects; engage in collaborative work; or use technology for pedagogy can be adequately and fairly evaluated and rewarded. The written guidelines should provide clear directions for appointment, reappointment, merit increases, tenure, and promotion and should take into consideration the growing number of resources for evaluating digital scholarship and the creation of born-digital objects.
If your department or discipline falls under the MLA umbrella, then this provides a pretty clear statement about the importance and value of digital work–whether or not every single department follows this Statement to the letter, MLA provides a baseline for the responsible evaluation of tenure/promotion cases that include such work. It doesn’t say, however, that you should not be tenured unless you do such work, or that folks who don’t work digitally are less likely to be tenured. If I were forced to name the difference between these two sets of claims, perhaps I’d call them policy statements and position statements. The MLA here is setting policy, or at least offering a Statement that itself can be adopted and adapted to policy on the local level.
Where the hornet’s nest gets smacked with the AHA Statement, I think, is precisely those moments when it drifts away from policy towards position:
an increasing number of university presses are reluctant to offer a publishing contract to newly minted PhDs whose dissertations have been freely available via online sources….online dissertations that are free and immediately accessible make possible a form of distribution that publishers consider too widespread to make revised publication in book form viable…
I don’t think it’s particularly controversial to note that, if your interest is in crafting policy, the phrase “tangible threat” is not the best choice. If nothing else, that phrase alone functions deictically, calling attention to the specific circumstances surrounding the Statement and inviting the polarization that occurred in response. It treats those circumstances not as variable or conditional (which they surely are, if others’ responses are to be trusted), but as given. And it turns the policy into an outgrowth of a position that is conjectural (and which proved to be controversial). I don’t know if I’d go so far as to say that it’s simply a matter of tone, but it’s hard for me to imagine that there would have been nearly as much outcry if the Statement had presented the policy as a necessary update given the shifts in scholarly communication/technology. I think that the case (and the policy) could have been made without implying that OA advocates’ beliefs constitute a (tangible!) threat to their newer colleagues.
If I were to put a bow on this, I think I’m slowly articulating an idea for myself about network rhetorics: this notion of making an argument to scale is something that I find coming up again and again, one that only really emerges with clarity when you think about arguments ecologically or as networked. I don’t quite have the vocabulary for it yet, but most networks don’t occupy either the random or the scale-free end of the spectrum; they’re semi-scaled, maybe, and maybe I’m thinking about scale as a consequence of rhetoric. That may be the next puzzle piece for me as I work on my next book. I wish I had a witty remark to insert here about my failure to embargo this post, but let your imagination run wild…
[I wasn’t sure whether or not I really wanted to share this piece of writing. Perhaps you’ll understand both why I was hesitant to do so, and why I have. -cgb]
If you were on FB or Twitter this weekend, and are associated with academia, you probably caught a glimpse of a tweet from an evolutionary psychologist who suggested that “obese PhD applicants” should save themselves the trouble of applying for doctoral programs, since their obvious lack of willpower will keep them from being able to write a dissertation. I’m not going to link in any way to Geoffrey Miller’s work, but this Jezebel story will tell you most of what you need to know. Miller himself has progressed quickly through the life cycle of denial: he initially defended his statement, then deleted it, then apologized for it, then disavowed it, and finally, when pressed by his university, claimed that it was part of a “research project.” My guess is that Miller has managed to damage himself pretty seriously; it wouldn’t shock me to hear that his home institution will have nothing more to do with him.
Like a lot of people, my first response to that tweet was both outrage and rage. It was a shitty thing to say. The more I thought about it, though, the more layers I found. Some of them were prompted by others’ comments about Miller’s tweet, but I’ve been thinking a lot about my own embodied response as well. If you’ve never met me in person, then one thing you need to understand, first off, is that I’m obese, fat, overweight. That’s not something I talk about much, and I never write about it. I was a big kid growing up–my father played football and rugby, and I inherited his size. When I was a kid, I was pretty active: I played a lot of different sports, except the one (football) for which my body was probably best suited. All through college and into graduate school, I think that a lot of people assumed that I played football. Anyhow, as I got into college and grad school, my life became more sedentary (reading will do that for you), while my eating and exercise habits declined. While you might have charitably described me as “big” in high school, by the time I graduated from college, I was overweight. And that hasn’t really changed.
The odd thing about Miller’s remark isn’t that society treats an excess of body mass as a deficit of willpower or self-discipline; frankly, he’s saying out loud there what plenty of people believe. The odd thing is that he thinks that there’s just one kind of willpower, and that “evidence” of its absence is somehow universal. This was my experience: as a fat academic, I was thrilled to be in a field where (ostensibly) I would be judged for the quality of my mind rather than the “failures” of my body. With blind peer review, no one can see that you’re fat. And so, if I lacked self-discipline when it came to carbs, I could throw all of my effort into writing (I’m doing it right now) and be disciplined there. I wrote my 250-odd page dissertation in less than 3 months, and my lack of willpower regarding exercise and healthy eating had nothing to do with it; if anything, my willingness to focus like a laser on writing, and not worry about my body at all, helped me. If we imagine that willpower, like attention, is a networked phenomenon, spread amongst a variety of objects, then there was/is a sense in which my lack of physical willpower helped to feed my intellectual willpower. I’m sure that it’s not that simple, of course, nor should my experience somehow be generalized to “disprove” Miller’s prejudice. My own experience, more than 20 years in academia, tells me that there’s no formula here–successful academics come in all shapes and sizes. To imagine otherwise, as Miller does, seems to me to be stupid.
When I say that Miller’s just saying out loud what many people already believe, I say this because I believe it too, at least on some level. The thing that folks who aren’t overweight don’t typically understand is that our experience of the world is different from theirs, in a range of ways. I rarely fly, in part because being above average in both height and width means that airplane seats don’t fit. When I was at my heaviest, they were physically painful to wedge myself into. And don’t get me started on the number of comedic scenes and/or commercials about being condemned to sit next to the fat person on the airplane–I feel that shame every time I walk onto one. I don’t fit into smaller cars, and there was a time when, if I got a ride from someone else, I had to suck in my stomach to get the seat belt to fasten. For several years, I couldn’t sit at the molded desks in the classrooms where I taught. When I went home for holidays, I had to make sure to sit in chairs without armrests, because again, they were a tight fit at best. And even when I can fit on a chair, it might not be able to support my weight without creaking, or god forbid, breaking. I couldn’t walk past someone on a tight staircase without pressing against the wall. The floors in an old house are always a little more aware of my presence than they are of anyone else. The world around me tells me that I’m the wrong shape and size, that I don’t fit. Faced with a constant stream of small indications that there’s something wrong with my size, I am amazed and inspired by those who are better able than I am to accept themselves. If there’s a place where I feel my own lack of willpower, it’s there.
And if there are those among you who doubt the idea of non-human rhetorics, let me introduce you to the suasive force of the clothing industry. When you are the wrong size, as a man, there’s really only one place where you can buy clothes, the big-and-tall store, usually located in a strip mall. For a long time, big-and-tall clothing was constructed according to the principle that there was only one true body shape, and that you were just taller or wider. Even when the clothes “fit,” they often didn’t. The ratios among my various measurements are not the same as those of a “normal” person, and so buying clothes to fit one part of my body often meant ignoring others. “Tall” clothes often assume basketball-player sized people, and thus a smaller waist, but dress shirts also often have additional length and an extra button, making them easier for a fat person to keep tucked in–so I often had to choose between shirts that were tight around my midsection or much too large for my shoulders and arms. Both options served as a constant reminder that I was malformed, though. And fat people aren’t allowed to care about fashion–“if they really cared about how they looked…”
Big and tall clothing has improved in recent years, but decades of shame over “trying on new clothes” is hard to overcome. And I think about all these things knowing that they’re not intentional. Although I do sometimes think that it was a room full of skinny assholes who came up with the idea of the television show The Biggest Loser (“no, they’re ‘losers’ because they’ll be losing weight–ha ha ha!”), I know that the world is the world. There are millions of people out there who suffer from prejudices far more intentional and pernicious. Partly, this is the shame talking, but I do have more control over my weight than many people have over their own embodied circumstances, and so I don’t tend to think publicly about my size. Compared to what many other people go through as a result of circumstances they can’t control, claiming or emphasizing my own struggles has always felt presumptuous.
I did want to make one more point, though. While I was gratified to see the speedy, collective outrage over Miller’s tweet, it made me think back a couple of weeks to a conversation that happened on Twitter about how academics should dress. Once upon a time, I was told (quietly) that if I expected to receive tenure, I would need to dress better. The thing is, when you’re overweight and wearing clothes that aren’t tailored to your body’s shape, your body puts different stresses on those clothes. Dress clothes in particular tend to assume the “norm,” and while it’s funny to watch Chris Farley split a jacket or the rear seam on a pair of pants, imagine doing it while you’re teaching a class, bending over to retrieve a pen or a piece of chalk. And then imagine that the simple act of dressing oneself every day carries with it that extra layer of anxiety over whether today will be the day that your body betrays and humiliates you. Most ties are manufactured with certain assumptions about the size of the neck around which they will be worn; for a fat person, a regular tie often doesn’t fit. Sports jackets often assume a particular shoulder-to-waist ratio. I normally teach in jeans, because for a variety of reasons, they tend to be manufactured to handle more stress and wear than dress pants. In this Twitter conversation, however, the idea of teaching in jeans was one of the things that was considered unprofessional among faculty of a certain age. I don’t mean to call anyone out about this, but I will say that I felt no less shame seeing this conversation than I did seeing Miller’s remarks. I didn’t see all the responses to the thread, but I’m pretty sure that most people didn’t think of it as fat-shaming, or respond to it with the same outrage. It probably didn’t register to them.
I guess my point is this: my wish would be to take a small piece of the outrage, and apply it to awareness. Try to be a little more conscious of the ways that our assumptions about the world, whether it’s dress codes or the way we arrange our spaces, subtly reinforce the fat-shaming that Miller was engaging in explicitly. Even if it’s something as simple as not assuming that everyone has the same relationship to clothing as you, or understanding that not every seat in the restaurant is equally comfortable for someone who’s overweight. It can be tricky to be more interventionist without also shaming, but it’s possible to invite someone for a walk rather than a cup of coffee, or to have them over for a healthier meal than you’re likely to find at a restaurant. It should probably go without saying that we should all, myself included, try harder to catch ourselves when we make assumptions about people based on their appearance (not just their size or shape), but it’s worth reminding ourselves precisely when stuff like this happens. My gut reaction was outrage, but my second thought was to ask myself if I’d been guilty of that prejudice myself.
There’s a lot more to say about this, I’m sure. I’ve alluded several times to the fact that I’m not as heavy as I used to be. Far from being a story about the triumph of the Collin will, the fact of the matter is that I came kind of close to dying a couple of years ago, partly because of my unhealthy ways, and partly because my shame over it kept me from getting the help I needed to get more healthy. Neither of those things is easy for me to admit, and they’re what makes Miller’s tweet particularly cruel. Most of us don’t have the level of “control” over ourselves that his comment implied–I know I still don’t. It took major surgery and more than a year’s worth of recovery for me to break through even a part of my own complex of shame and guilt and habit to find a healthier place. I’m fortunate to be healthier now physically, although it’s something that I have to work at constantly.
The implication that “fat” is a problem easily solved through the application of willpower is laughable to me, though, and that’s the biggest part of what I find objectionable in that tweet. It takes a partial truth (we do have some control over our body’s health) and twists it to rationalize a prejudice that itself works against that truth through shame. And that’s pretty evil.
I think I’m done now. Time to go for a walk…
So, Jim put out this call for advice this week:
— Jim (@jamesjbrownjr) June 1, 2013
It’s been a while since I last posted here, and Jim’s tweet got me to thinking, so I figured I might write a few thoughts down. They’re not necessarily complete, because I do think that discipline and venue matter quite a bit, as does the student’s progress, work habits, and readiness. While it might be nice if there were a simple 10-point listicle that provided us all we ever needed to know about publishing, the fact of the matter is that it’d be pretty horoscopic. I’m not sure my advice will be any better, but it’s generally worked for me.
There are a few essays that I hand out to graduate students on a semi-regular basis, pieces that I’ve found really useful to have and to revisit every so often for my own writing. In honor of the listicle, I present to you my Top 5 Must-Read Essays for the Aspiring Scholarly Writer:
* C. Wright Mills, “On Intellectual Craftsmanship” (PDF) — It’s dated, and it’s from the social sciences, but it’s worth every graduate student’s time to read and adapt Mills’ advice:
By keeping an adequate file and thus developing self-reflective habits, you learn how to keep your inner world awake. Whenever you feel strongly about events or ideas you must try not to let them pass from your mind, but instead to formulate them for your files and in so doing draw out their implications, show yourself either how foolish these feelings or ideas are, or how they might be articulated into productive shape. The file also helps you build up the habit of writing. You cannot ‘keep your hand in’ if you do not write something at least every week. In developing the file, you can experiment as a writer and thus, as they say, develop your powers of expression.
Mills’ piece is a new one for me–I picked up a used copy of The Sociological Imagination years ago, but only happened to read its appendix recently. It may seem overly simple to imagine that there is someone who doesn’t realize that writers must “write something at least every week,” but it took me a long time to figure this out. I no longer assume that it’s something that goes without saying. Publishing is the tip of a massive iceberg of writing.
* Joseph Williams, “Problems into PROBLEMS” (PDF) — This is a long read, the academic equivalent of a novella, longer than an article but shorter than a book. Again, this may seem like obvious stuff, but I assure you, it can be really helpful to use the framework that Williams supplies to look at one’s own writing. The putative topic of Williams’ book is learning how to stage introductions effectively, and that in itself is worth the price of admission. But I use this text less as a means of helping me write my introductions than I do as a way to help me crystallize the point of whatever I’m working on at the time.
…posing and solving PROBLEMS is what most of us do, but most of our students, both undergraduate and graduate, seem unaware of not just how to pose a PROBLEM, but that their first task is to find one. As a consequence, they often seem just to “write about” some topic, and when they do, we judge them to be not thinking “critically,” to be writing in ways that are at best immature (Berkenkotter, Huckin, and Ackerman), at worst incompetent. Yet many of our students who do not seem to engage with academic PROBLEM-solving, in fact, do. Their problem is that they are ignorant of the conventional ways by which they should reveal that engagement; ours is that we have no systematic way of demonstrating to them the rhetoric of doing so.
The first time I read this work (the first of many, many), I was a little resistant to the idea that everything could be “reduced” to problem-solution; I’m not sure I feel that way any longer. I don’t think that it’s always necessary to make that framework explicit in one’s writing, certainly, and I think that there are times when we invent the “problems” we are solving, particularly in the humanities. On balance, though, it has helped me to think through my work in terms of this framework. I return to Chapter 1 frequently.
* Richard McNabb, “Making the Gesture: Graduate Student Submissions and the Expectations of Referees” (PDF) — This may be the single best essay for the aspiring graduate student that you’ve never heard of. It was published in Composition Studies in 2001, and is based on a study of graduate student submissions to Rhetoric Review over the course of nearly a decade.
The typical graduate manuscripts I saw as an associate editor suggest that the success of one’s argument depends on the appropriation of the correct gestures, that is, the discursive conventions that govern the ways of arguing and evaluating that define the language of the field. As I have tried to illustrate, writing for publication goes beyond producing a coherent, effective, well-supported argument; a writer has to be able to negotiate the publishing system by making the right gestures. I have identified two such gestures present in the scholarship (22).
“Gestures to a Rhetorical Mode” draws on Goggin’s taxonomy of description, testimony, history, theory, rhetorical analysis, and research report. “Gestures to a Problem Presentation” draws on MacDonald, Swales, and others to differentiate between epistemic and non-epistemic presentations. I don’t think I’m giving away any secrets to say that McNabb sees many graduate student submissions that rely on testimony and present themselves non-epistemically. What’s interesting about this piece is that it’s a rare study of a category of submissions that isn’t defined in terms of success, a problem that we run into if we only look at published writing when we talk about how to publish–it’s instructive to see the differences.
* Carol Berkenkotter and Thomas Huckin, “Gatekeeping at an Academic Convention” (from Genre Knowledge in Disciplinary Communication) — Speaking of differences: once upon a time, our national conference made its submissions, both those that had and those that hadn’t been accepted, available to researchers. Right towards the tail end of that time (the early 90s, I think), B&H examined a fairly large random sample of CCCC abstracts pulled from three years’ worth of submissions, “in hopes of getting a more comprehensive picture of the genre” (102). As with the other pieces on this list, you can’t take too literally the results of a study of conference abstracts, one from 20 years ago at that, but at the same time:
“In this chapter we have illustrated at least two of the principles laid out in Chapter 1, namely those of form and content and of community ownership. The former states that “Genre knowledge embraces both form and content, including a sense of what content is appropriate to a particular purpose in a particular situation at a particular point in time” (13). It is clear from our study, we think, that the ability to write a successful CCCC abstract depends on a knowledge of what constitutes “interestingness” to an insider audience, which in turn depends on timeliness, or kairos. The principle of community ownership states that “Genre conventions signal a discourse community’s norms, epistemology, ideology, and social ontology” (21). Here, too, we think our study provides some insight into the intellectual constitution of the rhetoric and composition community” (115).
Much of this book is worth reading, if for no other reason than to think carefully about what B&H call “genre knowledge,” and to learn how to recognize and to internalize it throughout one’s graduate career.
* [Insert Role Model Here]: This is not as tongue-in-cheek as you might think. When I watch a show or movie that I really like, I end up internalizing pieces of the characters, and the same goes for academic writing that I find particularly inspiring. One of the best things you can do is to locate your own role models for writing, and to read and reread them on a regular basis. I don’t do so in order to imitate them, necessarily, but I find that part of what I find inspiring about them is the way that they write, not just what they have to say. Don’t share your models with anyone–they are yours and yours alone. As soon as you start choosing your models according to what you think others expect from you, you’re sort of missing the point.
One of the common threads among all of the pieces I’m recommending here is the idea of genre knowledge–we tend to overemphasize “originality” of content at the expense of timeliness of contribution when it comes to scholarly communication. And timeliness is not something that can be planned out ahead of time, or captured in a listicle. It requires us to engage with the conversation, to see what others have to say, to think about where we might contribute, to account for the context of the discussion, and to make it worth reading in both form and content.
My words here are hardly the last ones on the subject, but these are the things that I’ve found helpful in my own work. Good luck!
UPDATE: Just as you find the perfect citation only after you send that article out for review, hitting publish helped me to remember a variety of texts that I could very well have included on this list. I first taught a grad course in 2005 that was a combination of genre studies and EAP, where I used these and many other readings. Some of the other books I could have easily recommended include:
- Casanave and Vandrick, Writing for Scholarly Publication
- Bazerman and Paradis, Textual Dynamics of the Professions
- pretty much any of John Swales’ books
and so on. Please feel free to add your own recommendations in the comments–I’m aware of how partial my own list is…
I want to wish everyone a happy Burkeday — Kenneth Burke was born on this day in 1897, making today as good a day as any to celebrate rhetoric.
KB is part of my origin story: When I returned to graduate school for my PhD, my first course wasn’t actually official. The summer before I started, I sat in on Victor Vitanza’s Kenneth Burke course. For me, it was like a homecoming, and only partly because I was glad to get back to academia. I was a fairly half-hearted rhetoric and composition person, having done a concentration in my MA program on the counsel of our graduate advisor. I’d originally gone to graduate school thinking to study Irish literature, and I was possessed of a fondness for critical theory. While I could see some connections with rhet/comp, they were weak ties at best, and it may not have been an accident that I ended up taking a couple of years off after my first attempt.
Anyhow, reading Burke was a revelation for me. It wasn’t always easy reading, nor would I say that I agree with everything he wrote, but I’ve always felt a resonance with his work. I don’t doubt that it shows up in my own writing from time to time. But reading Burke was one of the things that made me feel (finally) like I’d made the right decisions to go back to graduate school and to stick with rhetoric and composition. One of Burke’s passages that has always appealed to me comes from the Afterword to the 3rd edition of Attitudes Toward History, revised a bit for an interview he gave later on:
Remember the big traffic jam in New York when the subways stopped? That’s when I learned the word gridlock. Gridlock means you can’t go any way. The traffic is so jammed, it can’t go forward, backwards, or sideways. What I had was counter-gridlock….So, I’d write six or seven pages; then another tangent would seem needed, and I’d start over again, with the same baffling outcome. Instead of no way out, there was a clutter of ways out, each in its own way running into something that cancelled it.
Kenneth Burke, “Counter-Gridlock”
I don’t know if other people’s minds work that way, but mine sure did. I think that’s part of what drew me to hypertext originally, and eventually to blogging and social media. Along the way, I’ve learned tricks to help tame my own counter-gridlock (cut the first 5 pages, work on multiple parts at once, etc.), but it’s always there, making it harder for me to force my ideas into the shapes that I know they need to take.
There’s another piece of Burke that always appealed to me secretly. Burke was raised on the work of Mary Baker Eddy (who founded Christian Science), and while he turned away from those ideas to an extent, there is a sense that runs throughout his work that language is not simply representational but material, that the ideas we hold affect us physiologically. The idea of literature as “equipment for living” is a mild expression of this. There’s a story about him that says that one of the reasons why he never published the third volume of the Motives trilogy was that he would be “finished,” and not just in the intellectual sense.
You might imagine how even the hint of this would appeal to a kid who grew up reading and gaming in worlds where language did have that power. There’s a “not really…but maybe” quality to it all in my head that sometimes crosses over the line separating figurative and literal. If you were to connect this idea to a passage from an academic text like this one, published in the fall before I started back to graduate school–
After all, anyone the least bit familiar with the workings of the new era’s definitive technology, the computer, knows that it operates on a principle impracticably difficult to distinguish from the pre-Enlightenment principle of the magic word: the commands you type into a computer are a kind of speech that doesn’t so much communicate as make things happen, directly and ineluctably, the same way pulling a trigger does. They are incantations, in other words, and anyone at all attuned to the technosocial megatrends of the moment — from the growing dependence of economies on the global flow of intensely fetishized words and numbers to the burgeoning ability of bioengineers to speak the spells written in the four-letter text of DNA — knows that the logic of the incantation is rapidly permeating the fabric of our lives.
Julian Dibbell, A Rape in Cyberspace, Village Voice, December 1993
–well, then, you might begin to tease out some of my own motives and interests. In The Philosophy of Literary Form, Burke writes, “The magical decree is implicit in all language, for the mere act of naming an object or situation decrees that it is to be singled out as such-and-such rather than as something other” (4).
Here’s where it gets even less rational. Imagine that you’re a person for whom writing has never been a struggle, but who does struggle with putting it into a straight line. And imagine further that you’ve got a secret fascination with what Dibbell calls that “logic of the incantation.” You force your work into those shapes, article after article and conference papers galore, and eventually, you even manage to craft your own little snow globe, your first book.
I shouldn’t continue in 2nd person here. I started deflecting before I even realized that I’d done it. I am those things, and have done those things. The process of taking Lingua Fracta from initial manuscript to published volume, however, took almost 5 years. If you’ve read my book, you’ll know that my father passed away before he had a chance to see it published. What you may not know is that my grandfather did as well, about a year later. And my grandmother’s health at the end of her life was such that she probably only caught a glimpse.
Part of me loves my book, and part of me blames my book. It makes no sense, and even sounds silly to me as I write it down like this. But the fact of the matter is that I stopped wanting to write for a long time. The gradual fade of my first blog took place over about 3 months following my father’s death, and the loss of my grandparents sealed the deal. In my brain, I know that this is a story (events happen) that a tiny part of me has turned into a plot (events are connected!)–that’s the very definition of superstition–but sometimes all it takes is a tiny part.
The tiny thing that helped me come out of this, to the degree that I’m out of this, came last summer. Since my book was published, I’ve been thinking on and off about what I’ll do my next book on. I’ve had several possibilities in mind, but I’m fairly sure that when I hit a certain level of detail in the planning, something in me just shut down. It was too easy to turn to something else and just forget about it. I’ll spare you the long stories of my self-distraction.
Last summer, though, I realized that I don’t have to write another book. Ever. I say this fully aware that this is a luxury; I am in an incredibly privileged position to be able to say it. But I don’t mean it in the sense that I no longer have to work: I’ve been writing articles and chapters for collections, supervising students, teaching and designing courses, mentoring as best as I’ve been able–I overfill my time (sometimes) with the work that I’m obliged to do and the work that I enjoy doing. What I mean by this is that I can continue my work, my reading, my writing, my teaching, my mentoring, my participation–and none of those things have to take the particular material form of a book.
Is this distinction clear enough? Because it’s made all the difference for me. In that deep part of me that associated my book with loss and grief, the idea that it could be the book as formal obligation rather than the specific incantation I wove to meet that obligation shook something loose in me that’s allowed me to start relearning how to write. I know that this might sound like “Aha! It wasn’t my fault after all, but the evil institution that made me do it!” But that’s not quite right. Writing had become this thing that forced me against my inclinations and ended in heartbreak. It wasn’t a matter for me of finding someone else to blame; rather, it was working my way through to a place where “blame” didn’t quite work to capture the full range of possible relations. It’s not like it doesn’t still occupy me, but I no longer feel locked in by it.
I don’t know if this quite makes sense. It does in my head.
Here’s a last little odd fact about me. When I was young, I was fascinated by writing backwards and writing upside down. To this day, I can read text upside down almost as quickly as right side up. I would practice backwards cursive with a mirror–something about inverting and reversing the shapes of letters felt like magic to me. I loved codes, non-Roman alphabets, letter substitutions, all that stuff. Our daily paper had a cryptoquote next to the crossword that I would try and solve in my head. Palindromes, ambigrams, word ladders, snowball poems, I have always been fascinated by the extravagant capacities of language. So add that fascination to my discomfort with the book as form and my fascination with the logic of incantations, mix it together with a little technology expertise, and it makes perfect sense that what I should do is to do it backwards, to announce the “publication” of my new “book.”
Believe it or not, I’m not joking.
My next project is called Rhetworks, and I’m publishing it today, even though it hasn’t been written yet. It may or may not become a book; I’ve toyed with the idea of describing it as a BOOC, a Book-Sized Open Online Colloquium. I’ve been thinking about the relationships between rhetoric and networks for close to 10 years now, and I think I’m going to start writing something big and sprawling on the subject.
Over the next 2 years, starting today, I’m going to write it online, using a PBWiki installation. That means that I’m going to write in public, which scares the heck out of me, but not nearly as much as it used to. I’m going to make mistakes and I’m going to have to trust in the generosity of my readers. At the end of two years, if I feel like I have enough material to justify publishing it as a book, I may do so. But I’m equally prepared for the possibility that I won’t. In either case, I’ll be writing under a Creative Commons License and it will stay up there, freely available to anyone who’s interested, regardless of any subsequent form it might take.
I have a hypothesis, a fairly grand one, that I want to work through, and I even have a set of keywords that may someday provide me with the chapter structure for a book. But neither of those things will drive this project. I am interested instead in giving rein to my counter-gridlock, without knowing ahead of time whether or not it will actually work. But at its most basic level, this is an experiment. It may not catch on, I may grow bored with it, other people may find it stupid or silly or self-indulgent–I can imagine a hundred different ways that this could fail. And that’s why I’m going to do it.
Oh, but there’s more. My first idea was to write Rhetworks on a private wiki, and invite people to visit it once I’d gotten “enough” of it going to feel comfortable sharing. What I’m doing instead is to invite you to participate in it from the get go, and to contribute to it as much as you’re comfortable with. For some, this may mean correcting a typo or two, asking some questions in the comments, or adding a work or two to my bibliography. And that’s fine. But that’s just the start of what’s possible. I’m willing to collaborate with you on sections. I’m willing to list you as co-author. As long as you’re comfortable, I’m willing to let you publish your own work on the site, and even in the pages of the book, if it comes to that. My only request is that you make your own work as available and editable and shareable as I’m making my own. Does that mean that I’d be willing to include a chapter written by someone else entirely in the book version of this? Or include entire sections or chapters that disagree with me? Yes. Yes, it does. I’m also open to the possibility of using the site as an invention space and breaking off pieces of it to publish collaboratively in other venues–I know that not everyone can afford to invest time and effort as open-endedly as I can.
And yes, I can imagine that this project could be derailed by edit wars, or that someone might get it into their head to try and ruin it. I’ll be restricting editing access to registered users, so that I can exercise some minimal amount of supervisory influence. But I have thought about a lot of different ways that people might make use of the site, add to it in ways that I cannot predict, and even disagree with me in fundamental ways, and I find that I’m surprisingly okay with that.
A scholarly project sits at the heart of a network by nature. The traditional model of publication, though, encourages us to mediate that network ourselves, often out of fear of what would happen if we let others see before it was complete. William Germano described it as a snow globe in the Chronicle a couple of weeks ago:
Within the realm of the snow globe, every authority on the subject has been cited or pacified. Look inside and find a perfect, tidy, improbable world where no questions are asked, or invited. Scholarly books, especially first ones, are a paranoid genre—their structure assumes that someone is always watching, eager to find fault. And they take every precaution against criticism.
He asks if we dare write for readers–what I want to do here is write WITH readers, with you. I want to create a book-sized network of scholarship that itself is the product of the network. It’s not coincidental that it’s about networks, too.
I go back and forth about this. On the one hand, it feels like the next step, or maybe a leap of faith: the idea that scholarship can locate itself somewhere that’s part text, part connectivist MOOC, part community. Germano suggests that maybe “the best form a book can take—even an academic book—is as a never-ending story, a kind of radically unfinished scholarly inquiry,” and part of me believes that enough to give it a try. Maybe what I’m describing is actually a 2-year online course on networks and rhetoric, open to anyone who’s interested. (I will almost certainly use it to some degree in the digital humanities course I teach next spring, and I hope others will take it up that way, too.) It pushes the idea of public, online review even further, and maybe it will ultimately push at our ideas of what acceptable (and accessible) online scholarship can look like.
And then there are days where I imagine that I’m so crazy to even think of this that I can’t see outside of the crazy. Even if I manage to summon the effort, time, and energy to do this successfully, it feels insanely risky, when I could just sit down, open my books, fire up my browser, and bang out a publishable manuscript.
And then I think about the “clutter of ways” that I want to give voice to.
I think about how maybe if I cast my spell backwards this time, something magnificent might happen.
And I think about Clay Shirky’s incantation–publish, then filter–and how much more sense it makes to me, even if it sounds upside down.
And then, one day in early May, I publish a book that doesn’t yet exist, and invite you to write it with me. I wonder what could possibly happen next.
Last night, in lieu of watching television, getting caught up on my work, or doing any number of other, more productive things around the house, I let myself get sucked into rearranging some of my bookshelves. As I mentioned on Facebook afterwards, one of the things I started thinking about was how I would arrange them if I were going to play Bookshelf Bingo, a game I invented while I was shelving.
Bookshelf Bingo is not all that different from Hipster Bingo or SXSW Bingo. My first idea was that you could play it during a keynote address at a conference, though I would think a Twitter feed might work as well (albeit with more difficulty, as I explain below). Here are the rules:
1. Each player needs to start with roughly similar shelving units. I’m a huge fan of the cube shelves myself, which have the added advantage of providing the bingo grid. Each cube holds around 15 books, give or take. The grids should be the same for each player, and the grid sizes as well, to keep it fair.
2. Each player can arrange books on and among the shelves in any way he or she sees fit.
3. Each player then takes a high-resolution photo of their shelves and prints it out. This is the Bingo card.
4. Then, at a keynote address (or panel), a player gets to cross off a square each time a book in that square is cited by the speaker(s).
5. First player to complete a row, column, or diagonal wins! And is crowned the biggest Nerd in the audience! (Jumping up and yelling “Bingo!” during the talk itself is not recommended.)
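For the programmatically inclined, the rules above reduce to a small algorithm: map each book to a grid position, mark positions as titles are cited, and check rows, columns, and diagonals for a win. Here's a playful sketch in Python; the card layout, function names, and "book-N" titles are all my own invention, not anything from the official (ahem) rulebook.

```python
# A toy model of a Bookshelf Bingo card: a 5x5 grid of book titles,
# with squares marked off as the speaker cites them.

GRID_SIZE = 5

def make_card(titles):
    """Build a card from a flat list of 25 titles, filled row by row."""
    assert len(titles) == GRID_SIZE * GRID_SIZE
    return [titles[r * GRID_SIZE:(r + 1) * GRID_SIZE] for r in range(GRID_SIZE)]

def has_bingo(marked):
    """Check a set of (row, col) marks for a complete row, column, or diagonal."""
    rows = any(all((r, c) in marked for c in range(GRID_SIZE)) for r in range(GRID_SIZE))
    cols = any(all((r, c) in marked for r in range(GRID_SIZE)) for c in range(GRID_SIZE))
    diag1 = all((i, i) in marked for i in range(GRID_SIZE))
    diag2 = all((i, GRID_SIZE - 1 - i) in marked for i in range(GRID_SIZE))
    return rows or cols or diag1 or diag2

def play(card, citations):
    """Mark squares as cited titles come up; return True at the first bingo."""
    positions = {card[r][c]: (r, c)
                 for r in range(GRID_SIZE) for c in range(GRID_SIZE)}
    marked = set()
    for title in citations:
        if title in positions:
            marked.add(positions[title])
            if has_bingo(marked):
                return True  # jumping up and yelling "Bingo!" still not recommended
    return False
```

Rule 2 — arrange your books however you see fit — is just the ordering of the list you hand to `make_card`; everything after that is bookkeeping.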
Variations: It occurred to me at first that you could do this with a Twitter feed in lieu of a keynote address, probably because A1 contains a bunch of books that I’ve purchased as they’ve come across my feed in the past month (Jockers, Golbeck, Hofstadter, Morozov, et al.). That would require folks to subscribe to roughly the same feeds, although you could create a Bingo list in Twitter and share it with the other players.
Since doing the shelf version requires that the players own all these books, I thought too about just putting 1 book per square–it’d be easy to pull covers from Amazon, arrange them in a 5×5 table on a page, and do it that way. That would be the much less expensive version, and might make an interesting pedagogical exercise for a graduate course. If they’d read some of a speaker’s work, and then created a book-per-square (or even 1 author per square) grid, it’d be a fun way to watch a streaming keynote, perhaps. It’d be a novel way of thinking about which thinkers and sources a particular speaker was most likely to rely upon for his or her work.
So that’s Bookshelf Bingo, coming soon to an academic conference near you!
Every once in a while, everything just seems to flow into one large conversation full of resonances and connections, like striking a tuning fork. This is a post about the challenges of graduate education, and perhaps, by extension, academic work for those of us who identify with the digital humanities. Let me see if I can gather the threads together.
There’s a little history. Jokingly, I tell people that one of my biggest academic regrets is a paper I delivered at CCCC a few years back (2010). Our session took place in a huge ballroom (the size of our audience did not do it justice), and rather than a projector and portable screen, we had like a 30-foot monitor. It was colossal, and one of the things I regret is that, not knowing about it ahead of time, I didn’t prepare a full slide deck. Instead, I gave the only talk I’ve ever given that had just one, solitary slide. Don’t get me wrong, I was proud of that slide, and I wish that I hadn’t lost it in the Great Laptop Crash of 2011. It was a screen capture of a cover of an old issue of Field & Stream magazine, lovingly Photoshopped to reflect the topics in my talk, which was called “Writing Retooled: Loop, Channel, Layer, Stream.” Keep in mind that this was 3 years ago, when Twitter was still relatively exotic for academics, but what I was arguing was that
For those of us who engage with the field through social media, though, that engagement may seem more shallow in the short term, but it is constant and ongoing. We are setting foot in the river every day, rather than waiting for the occasional, official “event” to do so.
Think of it this way: who is more likely to shape the field? The person who sits in the audience for a presentation or reads a journal article that’s already been written, or the one who participates in weblog or Twitter conversations about that writing as it is being done? And yet, if you asked 100 people at this conference whether they’d rather publish an essay in CCC or have a couple of hundred followers on Twitter, I’m pretty sure most people would choose the first option.
A couple of hundred. Heh. Anyways, I suggested that, rather than focusing exclusively on the “field” of writing studies, we needed to be building the tools and habits necessary for dealing with the “stream.” I was arguing and, not or, but my talk was certainly weighted towards the stream, given where the field was (is?) at the time.
Anyhow, someone reminded me of that talk this year at CCCC, my first trip back since I gave it, so I’ve had cause in the past month or so to remember it fondly. Over the past couple of days, it’s connected for me with a few different links. First, there’s Anil Dash’s talk yesterday at the Berkman Center on “The Web We Lost.” There are a number of things in there worth thinking about, but Doug Hesse pointed out in my FB comments something that I’m not sure we’ve all really processed:
We built the Web for pages, but increasingly we’re moving from pages to streams (most recently-updated on top, generally), on our phones but also on bigger screens. Sites that were pages have become streams. E.g., YouTube and Yahoo. These streams feel like apps, not pages. Our arrogance keeps us thinking that the Web is still about pages. Nope. The percentage of time we spend online looking at streams is rapidly increasing. It is already dominant.
In Writing Studies, I think that we still think of ourselves as being in the business of writing pages. Think about all of the infrastructure we have, from page counts to citation formats, that make this simple assumption about the “object” of our practices. Or about how vital .PDF has been in finally getting people to accept that scholarship isn’t necessarily inferior because it’s online. (None of these are particularly thrilling examples to me.)
As part of my own stream, I just came across a tweet from Jay Rosen that provides some nice overlap as well:
But I actually think stock and flow is the master metaphor for media today. Here’s what I mean:
- Flow is the feed. It’s the posts and the tweets. It’s the stream of daily and sub-daily updates that remind people that you exist.
- Stock is the durable stuff. It’s the content you produce that’s as interesting in two months (or two years) as it is today. It’s what people discover via search. It’s what spreads slowly but surely, building fans over time.
I feel like flow is ascendant these days, for obvious reasons—but we neglect stock at our own peril. I mean that both in terms of the health of an audience and, like, the health of a soul. Flow is a treadmill, and you can’t spend all of your time running on the treadmill. Well, you can. But then one day you’ll get off and look around and go: Oh man. I’ve got nothing here.
If you push on, as I did, and read the Rushkoff interview, then you’ll see Sloan’s treadmill metaphor writ large, and translated into “present shock.” This is a line from the book that the interviewer quotes:
When we attempt to pack the requirements of storage into media or flow, or to reap the benefits of flow from media that locks things into storage, we end up in present shock.
I realize here that I’m making my own talk appear far more prescient (and perhaps more sophisticated) than it actually was. I was in good shape just identifying the difference between what I was calling field and stream, I suspect.
Another thing that I talked about with several people at this year’s CCCC was how I was sometimes struggling with the presentism of social media. It’s particularly acute for me as I dip into conversations around the digital humanities, as so much of that discussion seems to happen on Twitter. You could argue variously that this is a symptom of its relative novelty but also of its dynamic energy, and even perhaps a combination of the two. Talk to me in five years, I suppose. It’s sometimes become difficult for me, though, to step back from social media and to focus instead on the page-oriented commitments that I have. The virtue of being in my position is that, if I want, I can just tone down the commitments and focus instead on more short-form work of the sort that social media energizes and provokes from me. I’m conscious that not everyone has that luxury, though.
This is not a post where I want to scold anyone. Rushkoff has a particular position that he’s promoting, to be sure, and there are hints of it in Dash and Sloan, I suppose, but my own interest is in thinking about how the balance that I was arguing for back in 2010 has so radically shifted in the other direction. But only in certain places. I’m slated to teach our Rhetoric, Composition, and Digital Humanities graduate course next spring, and already I’m thinking about how I can hack the curricular and conceptual space of my classroom to allow for a more dynamic and distributed course experience. But now I find myself in the odd position of thinking about whether that kind of course will provide enough field, enough stock, for students who (as I was arguing three years ago)
are more likely to rely on bookmarking than bookshelving. They are more likely to read an article that has well‐developed keywords than one with page numbers. And they are more likely to follow citation trails than to sit still and read a paper journal cover‐to‐cover. They are more accustomed to managing the flows of information, sorting them, and assembling them for their own uses. In short, they are much more likely today to be what Thomas Rickert and I have described as practitioners of ambient research.
I’ve been deeply committed to making over my pedagogy in ways that help students work with flow, but as a colleague and I were talking about today, those students still have to go through a comprehensive exam process and to write a dissertation. Believe me when I say that I know all the arguments for reshaping those requirements, and that I agree with them. But I have to reconcile them with my own ethical beliefs about graduate education and whether it prepares students adequately for what follows. I’m not so full of myself as to think that a single graduate course with me will make the difference in a student’s ability to finish or not; however, years spent as a graduate director have made me keenly aware that every course is itself a blend of stock and flow, with obligations both to itself and to the ongoing curriculum that it is a part of.
So while the blogger in me celebrates the short-form and the streams, the academic in me starts to wonder if the shift away from more traditional academic practices doesn’t ultimately do my students a disservice–I think about whether or not I’m responsibly modeling the kind of balance they’re going to need in their own careers. I say that fully aware that it sounds like the first step on the road to rationalization, but it’s not. Really. I think that it means that I’ll think more carefully about how I hack my course next spring, not whether or not I’ll do so. It’s an issue that I’ll likely grapple with for some time, and this is really just the beginning of that process for me. That’s all.
(ps. If you’ve read the above and thought, “why isn’t he doing something about this in his research?” or some variation on the hack/yack question, then you’ve happened upon one of the driving forces behind my next major project. About which, more soon. )