Listing entries tagged with peer_review
expressive processing: post-game analysis begins 03.20.2008, 3:27 AM
So Noah's just wrapped up the blog peer review of his manuscript in progress, and is currently debating whether to post the final, unfinished chapter. He's also just received the blind peer reviews from MIT Press and is in the process of comparing them with the online discussion. That'll all be written up soon; we're still discussing format.
Meanwhile, Ian Bogost (the noted game designer, critic and professor) started an interesting thread a couple of weeks back on the troubles of reading Expressive Processing, and by extension, any long-form text or argument, on the Web:
The peer review part of the project seems to be going splendidly. But here's a problem, at least for me: I'm having considerable trouble reading the book online. A book, unlike a blog, is a lengthy, sustained argument with examples and supporting materials. A book is textual, of course, and it can thus be serialized easily into a set of blog posts. But that doesn't make the blog posts legible as a book...
...in their drive to move textual matter online, creators of online books and journals have not thought enough about the materiality of specific print media forms. This includes both the physicality of the artifacts themselves (I violently dogear and mark up my print matter) and the contexts in which people read them (I need to concentrate and avoid distraction when reading scholarship). These factors extend beyond scholarship too: the same could be said of newspapers and magazines, which arguably read much more casually and serendipitously in print form than they do in online form.
I've often considered Bolter and Grusin's term "remediation" to be a derogatory one. Borrowing and refashioning the conventions of one medium in another opens the risk of ignoring what unremediated features are lost. The web has still not done much more than move text (or images, or video) into a new distribution channel. Digitizing and uploading analog material is easy and has immediate, significant impact: web, iPod, YouTube. We've prized simple solutions because they are cheap and easy, but they are also insufficient. In the case of books and journal articles, to offer a PDF or print version of the online matter is to equivocate. And the fashionable alternative, a metaverse-like 3D web of the sort to which Second Life points, strikes me as a dismal sidestepping of the question.
conversation, revision, trust... 02.18.2008, 1:07 PM
A thought-provoking "meta-post" from Noah Wardrip-Fruin on Grand Text Auto reflecting on the blog-based review of his new book manuscript four chapters (and weeks) into the process. Really interesting stuff, so I'm quoting at length:
This week, when I was talking with Jessica Bell about her story for the Daily Pennsylvanian, I realized one of the most important things, for me, about the blog-based peer review form. In most cases, when I get back the traditional, blind peer review comments on my papers and book proposals and conference submissions, I don't know who to believe. Most issues are only raised by one reviewer. I find myself wondering, "Is this a general issue that I need to fix, or just something that rubbed one particular person the wrong way?" I try to look back at the piece with fresh eyes, using myself as a check on the review, or sometimes seek the advice of someone else involved in the process (e.g., the papers chair of the conference).
But with this blog-based review it's been a quite different experience. This is most clear to me around the discussion of "process intensity" in section 1.2. If I recall correctly, this began with Nick's comment on paragraph 14. Nick would be a perfect candidate for traditional peer review of my manuscript -- well-versed in the subject, articulate, and active in many of the same communities I hope will enjoy the book. But faced with just his comment, in anonymous form, I might have made only a small change. The same is true of Barry's comment on the same paragraph, left later the same day. However, once they started the conversation rolling, others agreed with their points and expanded beyond a focus on The Sims -- and people also engaged me as I started thinking aloud about how to fix things -- and the results made it clear that the larger discussion of process intensity was problematic, not just my treatment of one example. In other words, the blog-based review form not only brings in more voices (which may identify more potential issues), and not only provides some "review of the reviews" (with reviewers weighing in on the issues raised by others), but is also, crucially, a conversation (my proposals for a quick fix to the discussion of one example helped unearth the breadth and seriousness of the larger issues with the section).
On some level, all this might be seen as implied with the initial proposal of bringing together manuscript review and blog commenting (or already clear in the discussions, by Kathleen Fitzpatrick and others, of "peer to peer review"). But, personally, I didn't foresee it. I expected to compare the recommendations of commenters on the blog and the anonymous, press-solicited reviewers -- treating the two basically the same way. But it turns out that the blog commentaries will have been through a social process that, in some ways, will probably make me trust them more.
expressive processing meta 01.29.2008, 2:20 PM
To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you'll see, there's already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here's an intro to the project.
expressive processing: an experiment in blog-based peer review 01.22.2008, 5:30 AM
An exciting new experiment begins today, one which ties together many of the threads begun in our earlier "networked book" projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, and an open peer review process -- and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which for many readers here probably needs no further introduction. The author, Noah Wardrip-Fruin, is a professor of communication at UC San Diego, a writer/maker of digital fictions, and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation -- the machines and software we use, the systems and processes we must learn and employ in order to make media -- and how this changes how and what we create. It's an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks, how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly inter-disciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out some time next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah's leading peers in the new media field, but also a slew of non-academic experts -- writers, digital media makers, artists, gamers, game designers etc. -- who provide crucial alternative perspectives and valuable hands-on knowledge that can't be gotten through more formal channels. Noah:
Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences -- and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that's just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I've seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history) the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere....I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.
Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it) and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The challenge technically was to integrate CommentPress into an existing blog template, applying its functionality selectively -- in other words, to make it work for a specific group of posts rather than for all content in the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and technical integration of the review process were inseparable. I've since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notions of Noah's book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal: either the whole site works with paragraph-level commenting, or none of it does. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we've done -- with the help of a talented WordPress developer named Mark Edwards, and invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog) -- is make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a new flexibility that it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we're all ears...
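The theme-versus-plugin distinction described above can be sketched in miniature. The snippet below is a hypothetical toy model, not CommentPress's actual code (which is PHP running inside WordPress, and the function and field names here are invented for illustration): a "theme" restructures rendering for every post on the site, while a "plugin" applies paragraph-level comment anchors only to posts that opt in, e.g. the manuscript sections, leaving ordinary blog posts untouched.

```python
# Toy model, for illustration only, of CommentPress as theme vs. plugin.
# All names here are hypothetical; CommentPress itself is WordPress PHP.

def split_paragraphs(body):
    """Break a post body into the paragraph units that receive anchors."""
    return [p for p in body.split("\n\n") if p.strip()]

def render_with_comments(post):
    """Paragraph-level commenting: each paragraph gets its own anchor."""
    return [{"anchor": f"{post['slug']}-p{i}", "text": p}
            for i, p in enumerate(split_paragraphs(post["body"]), start=1)]

def render_plain(post):
    """Ordinary rendering: paragraphs with no comment anchors."""
    return [{"text": p} for p in split_paragraphs(post["body"])]

def theme_render(posts):
    # A theme is all-or-nothing: every post on the site is restructured.
    return {p["slug"]: render_with_comments(p) for p in posts}

def plugin_render(posts):
    # A plugin applies the behavior selectively -- here, only to posts
    # tagged as manuscript sections, leaving the rest of the blog alone.
    return {p["slug"]: (render_with_comments(p)
                        if "manuscript" in p["tags"] else render_plain(p))
            for p in posts}
```

Run against a mixed set of posts, `plugin_render` attaches anchors only to the manuscript sections, which is exactly the selective integration the Expressive Processing experiment required.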
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press -- one of the most distinguished and most innovative in the world -- toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don't think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the older order will die off to be sure, but other parts of it will adapt and combine with the new in interesting ways. What's particularly compelling about this present experiment is that it has the potential to be (perhaps now or perhaps only in retrospect, further down the line) one of these important hybrid moments -- a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes leveled on publishing by the Internet, and the ones that get by far the most attention, are in the area of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest to pin down yet most essential areas of publishing -- the territory of editorial instinct, reputation, identity, trust, taste, community... These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well -- to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, that transcends technological upheaval -- you might say the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing, designing those conversations may well become the main work of publishing and of editors.
I could go on, but better to hold off on further speculation and to just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100 thousand words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated and the whole thing bundled together into some sort of package for MIT. We're still figuring out how that part will work. Please go over and take a look and if a thought is provoked, join the discussion.
ithaka report on scholarly publishing 07.30.2007, 12:12 PM
From a first skim and browsing of initial responses, the new report from the non-profit scholarly technologies research group Ithaka, "University Publishing in a Digital Age," seems like a breath of fresh air. The Institute was one of the many stops along the way for the Ithaka team, which included the brilliant Laura Brown, former director of Oxford University Press in the States, and we're glad to see Gamer Theory is referenced as an important experiment with the monograph form.
A good summary of the report and a roundup of notable reactions (all positive) in the academic community is up on Inside Higher Ed. Recommendations center around better coordination among presses on combining services, tools and infrastructure for digital scholarship. They also advocate closer integration of presses with the infrastructure and scholarly life of their host universities, especially the library systems, which have much to offer in the area of digital communications. This is something we've argued for a long time and it's encouraging to see it put forth in what will no doubt be an influential document in the field.
One area that, from my initial reading, is not significantly dealt with is the evolution of scholarly authority (peer review, institutional warrants etc.) and the emergence of alternative models for its production. Kathleen Fitzpatrick ponders this on the MediaCommons blog:
The report calls universities to task for their failures to recognize the ways that digital modes of communication are reshaping the ways that scholarly communication takes place, resulting in, as they say, "a scholarly publishing industry that many in the university community find to be increasingly out of step with the important values of the academy."
Perhaps I'll find this when I read the full report, but it seems to me that the inverse is perversely true as well, that the stated "important values of the academy" -- those that have us clinging to established models of authority as embodied in traditional publishing structures -- are increasingly out of step with the ways scholarly communication actually takes place today, and the new modes of authority that the digital makes possible. This is the gap that MediaCommons hopes to bridge, not just updating the scholarly publishing industry, but updating the ways that academic assessments of authority are conducted.
nature opens slush pile to the world 06.20.2007, 6:16 AM
This is potentially a big deal for scholarly publishing in the sciences. Inspired by popular "preprint" servers like the Cornell-hosted arXiv.org, the journal Nature just launched a site, "Nature Precedings", where unreviewed scientific papers can be posted under a CC license, then discussed, voted upon, and cited according to standards usually employed for peer-reviewed scholarship.
Over the past decade, preprint archives have become increasingly common as a means of taking the pulse of new scientific research before official arbitration by journals, and as a way to plant a flag in front of the gatekeepers' gates in order to outmaneuver competition in a crowded field. Peer review journals are still the sine qua non in terms of the institutional warranting of scholarship, and in the process of academic credentialling and the general garnering of prestige, but the Web has emerged as the arena where new papers first see the light of day and where discussion among scholars begins to percolate. More and more, print publication has been transforming into a formal seal of approval at the end of a more unfiltered, networked process. Clearly, Precedings is Nature's effort to claim some of the Web territory for itself.
From a cursory inspection of the site, it appears that they're serious about providing a stable open access archive, referenceable in perpetuity through broadly accepted standards like DOI (Digital Object Identifier) and Handles (which, as far as I can tell, are a way of handling citations of revised papers). They also seem earnest about hosting an active intellectual community, providing features like scholar profiles and a variety of feedback mechanisms. This is a big step for Nature, especially following their tentative experiment last year with opening up peer review. At that time they seemed almost keen to prove that a re-jiggering of the review process would fail to yield interesting results, and they stacked their "trial" against the open approach by not actually altering the process, or ultimately, the stakes, of the closed-door procedure. Not surprisingly, few participated and the experiment was declared an interesting failure. Obviously their thinking on this matter did not end there.
Hosting community-moderated works-in-development might just be a new model for scholarly presses, and Nature might just be leading the way. We'll be watching this one.
More on David Weinberger's blog.
sketches toward peer-to-peer review 05.24.2007, 1:26 AM
Last Friday, Clancy Ratliff gave a presentation at the Computers and Writing Conference at Wayne State on the peer-to-peer review system we're developing at MediaCommons. Clancy is on the MC editorial board so the points in her slides below are drawn directly from the group's inaugural meeting this past March. Notes on this and other core elements of the project are sketched out in greater detail here on the MediaCommons blog, but these slides give a basic sense of how the p2p review process might work.
MediaCommons paper up in commentable form 03.30.2007, 1:12 PM
We've just put up a version of a talk Kathleen Fitzpatrick has been giving over the past few months describing the genesis of MediaCommons and its goals for reinventing the peer review process. The paper is in CommentPress -- unfortunately not the new version, which we're still working on (revised estimated release late April); it's more or less the same build we used for the Iraq Study Group Report. The exciting thing here is that the form of the paper, constructed to solicit reader feedback directly alongside the text, actually enacts its content: radically transparent peer-to-peer review, scholars talking in the open, shepherding the development of each other's work. As of this writing there are already 21 comments posted in the page margins by members of the editorial board (fresh off of last weekend's retreat) and one or two others. This is an important first step toward what will hopefully become a routine practice in the MediaCommons community.
In less than an hour, Kathleen will be delivering the talk, drawing on some of the comments, at this event at the University of Rochester. Kathleen also briefly introduced the paper yesterday on the MediaCommons blog and posed an interesting question that came out of the weekend's discussion: whether we should actually be calling this group the "editorial board." Some interesting discussion ensued. Also check out this: "A First Stab at Some General Principles".
MediaCommons editorial board convenes 03.26.2007, 3:49 PM
Big things are stirring that belie the surface calm on this page. Bob, Eddie and I are down on the Jersey shore with the newly appointed editorial board of MediaCommons. Kathleen and Avi have assembled a brilliant and energetic group, all dedicated to changing the forms and processes of scholarly communication in media studies and beyond. We're thrilled to be finally together in the same room to start plotting out how this initiative will grow from a rudimentary sketch into a fully functioning networked press/community. The excitement here is palpable. Soon we'll be posting some follow-up notes and a CommentPress edition of a paper by Kathleen. Stay tuned.