future boy 01.30.2008, 7:48 AM
The picture is of a Futurizer, based on the kind of contraption I built as a child from cardboard, balsa wood and string, which allowed me to communicate with other planets and centuries. It was reconstructed by a group of us at a conference on Transliteracy at the Institute of Creative Technologies at De Montfort University, organised by PART. The aim of the day was to try to make some transliterate objects and in so doing consider if such things can, could or should exist. We had an enjoyable if inconclusive time grappling with this.
Plug headphones into an iPod or Xbox and you will be able to listen to one of a large but finite range of sounds. Plug headphones into a cardboard box and you can (not) hear anything you can possibly imagine. Travelling back through the years to my childhood, these machines allowed me to think across time and space, out of the (cardboard) box. They were also a means of engaging with the TV I loved, in a bygone era when no adult expressed any interest in the way I read my TV21 comic or consumed Thunderbirds and The Man from U.N.C.L.E.
Unlike those friends who screwed together bits of Meccano to build working bridges, or fiddled with circuit boards until bulbs lit up, my games were all about interfaces.
I never worried for a moment about how these things might actually work. Now a lot of inventiveness is once again going into cutting and sticking, playing with Facebook applications and YouTube clips like we used Corn Flake packets and sticky-backed plastic. Isn't it great, living here in the future?
By the end of the day the Futurizer had been photographed and uploaded to Second Life - a fitting place for it to end up, really: transmogrified, transliterated, futurized.
expressive processing meta 01.29.2008, 2:20 PM
To mark the posting of the final chunk of chapter 1 of the Expressive Processing manuscript on Grand Text Auto, Noah has kicked off what will hopefully be a revealing meta-discussion to run alongside the blog-based peer review experiment. The first meta post includes a roundup of comments from the first week and invites readers to comment on the process as a whole. As you'll see, there's already been some incisive feedback and Noah is mulling over revisions. Chapter 2 starts tomorrow.
In case you missed it, here's an intro to the project.
amazon reviewer no. 7 and the ambiguities of web 2.0 01.29.2008, 3:21 AM
Slate takes a look at Grady Harp, Amazon's no. 7-ranked book reviewer, and finds the amateur-driven literary culture there to be a much grayer area than expected:
Absent the institutional standards that govern (however notionally) professional journalists, Web 2.0 stakes its credibility on the transparency of users' motives and their freedom from top-down interference. Amazon, for example, describes its Top Reviewers as "clear-eyed critics [who] provide their fellow shoppers with helpful, honest, tell-it-like-it-is product information." But beneath the just-us-folks rhetoric lurks an unresolved tension between transparency and opacity; in this respect, Amazon exemplifies the ambiguities of Web 2.0. The Top 10 List promises interactivity - "How do I become a Top Reviewer?" - yet Amazon guards its rankings algorithms closely.... As in any numbers game (tax returns, elections) opacity abets manipulation.
ace research news in the uk 01.25.2008, 8:13 AM
The Institute for the Future of the Book has been appointed by Arts Council England to undertake research into digital developments in literature. This is exciting news for us, not least because it marks the official launch of our London office.
Over the next few months Chris Meade and Sebastian Mary Harrington will be talking to a wide range of organisations including Arts Council England literature clients and others whose work could provide useful models to the sector.
We'll be looking at book publishing and magazines, reader development, writers including collaborative and new media authors and the blurring of distinctions between amateur and professional, live literature and festivals, plus other web activity that could provide inspiration to agencies working to spread the word about the word - and we'll be posting questions and comments on the ifbook blog as we go along.
Sebastian Mary Harrington's scarf captured live under construction at the Institute's London HQ, skillfully knitted in the colours of The Institute for the Future of the Book - and The School of Everything - to celebrate the start of our new research project.
freedom of expression - free nyc screening jan. 31 01.24.2008, 1:33 AM
If you're in the New York City region, this is worth checking out (features Institute fellow Siva Vaidhyanathan):
From Free Culture @ NYU:
In 1998, university professor Kembrew McLeod trademarked the phrase "freedom of expression" - a startling comment on the way that intellectual property law can restrict creativity and the expression of ideas. This provocative and amusing documentary explores the battles being waged in courts, classrooms, museums, film studios, and the Internet over control of our cultural commons. Based on McLeod's award-winning book of the same title, Freedom of Expression® charts the many successful attempts to push back the assault on free expression by overzealous copyright holders.
In cooperation with the Media Education Foundation and La Lutta, Free Culture @ NYU is screening Freedom of Expression®: Resistance and Repression in the Age of Intellectual Property at 9pm on Thursday, January 31.
Narrated by Naomi Klein, the film features interviews with Stanford Law's Lawrence Lessig, Illegal Art Show curator Carrie McLaren, Negativland's Mark Hosler, UVA media scholar Siva Vaidhyanathan, and Free Culture @ NYU co-founder Inga Chernyak, among many others. This 53-minute documentary will be preceded by selections from Negativland's new DVD, Our Favorite Things, and it will be followed by a Q&A with Freedom of Expression® author and director Kembrew McLeod and co-producer Jeremy Smith.
Freedom of Expression Screening and Q&A with Creators
Sponsored by Free Culture @ NYU, NYU ACM, and WiNC
Free and Open to the Public (bring ID if non-NYU)
Thursday, January 31, 2008
NYU's Courant Institute
251 Mercer Street b/w Bleecker and W. 4th
On the film's site, I found this very clever (if slightly spastic) DVD extra, "A Fair(y) Use Tale":
expressive processing: an experiment in blog-based peer review 01.22.2008, 5:30 AM
An exciting new experiment begins today, one which ties together many of the threads begun in our earlier "networked book" projects, from Without Gods to Gamer Theory to CommentPress. It involves a community, a manuscript, and an open peer review process - and, very significantly, the blessing of a leading academic press. (The Chronicle of Higher Education also reports.)
The community in question is Grand Text Auto, a popular multi-author blog about all things relating to digital narrative, games and new media, which for many readers here probably needs no further introduction. The author is Noah Wardrip-Fruin, a professor of communication at UC San Diego, a writer/maker of digital fictions, and, of course, a blogger at GTxA. His book, which starting today will be posted in small chunks, open to reader feedback, every weekday over a ten-week period, is called Expressive Processing: Digital Fictions, Computer Games, and Software Studies. It probes the fundamental nature of digital media, looking specifically at the technical aspects of creation - the machines and software we use, the systems and processes we must learn and employ in order to make media - and how this changes how and what we create. It's an appropriate guinea pig, when you think about it, for an open review experiment that implicitly asks: how does this new technology (and the new social arrangements it makes possible) change how a book is made?
The press that has given the green light to all of this is none other than MIT, with whom Noah has published several important, vibrantly interdisciplinary anthologies of new media writing. Expressive Processing, his first solo-authored work with the press, will come out some time next year, but now is the time when the manuscript gets sent out for review by a small group of handpicked academic peers. Doug Sery, the editor at MIT, asked Noah who would be the ideal readers for this book. To Noah, the answer was obvious: the Grand Text Auto community, which encompasses not only many of Noah's leading peers in the new media field, but also a slew of non-academic experts - writers, digital media makers, artists, gamers, game designers etc. - who provide crucial alternative perspectives and valuable hands-on knowledge that can't be gotten through more formal channels. Noah:
Blogging has already changed how I work as a scholar and creator of digital media. Reading blogs started out as a way to keep up with the field between conferences -- and I soon realized that blogs also contain raw research, early results, and other useful information that never gets presented at conferences. But, of course, that's just the beginning. We founded Grand Text Auto, in 2003, for an even more important reason: blogs can create community. And the communities around blogs can be much more open and welcoming than those at conferences and festivals, drawing in people from industry, universities, the arts, and the general public. Interdisciplinary conversations happen on blogs that are more diverse and sustained than any I've seen in person.
Given that ours is a field in which major expertise is located outside the academy (like many other fields, from 1950s cinema to Civil War history) the Grand Text Auto community has been invaluable for my work. In fact, while writing the manuscript for Expressive Processing I found myself regularly citing blog posts and comments, both from Grand Text Auto and elsewhere....I immediately realized that the peer review I most wanted was from the community around Grand Text Auto.
Sery was enthusiastic about the idea (although he insisted that the traditional blind review process proceed alongside it) and so Noah contacted me about working together to adapt CommentPress to the task at hand.
The technical challenge was to integrate CommentPress into an existing blog template, applying its functionality selectively - in other words, to make it work for a specific group of posts rather than for all the content on the site. We could have made a standalone web site dedicated to the book, but the idea was to literally weave sections of the manuscript into the daily traffic of the blog. From the beginning, Noah was very clear that this was the way it needed to work, insisting that the social and technical integration of the review process were inseparable. I've since come to appreciate how crucial this choice was for making a larger point about the value of blog-based communities in scholarly production, and moreover how elegantly it chimes with the central notion of Noah's book: that form and content, process and output, can never truly be separated.
Up to this point, CommentPress has been an all-or-nothing deal: either an entire site works with paragraph-level commenting or none of it does. In the technical terms of WordPress, its platform, CommentPress is a theme: a template for restructuring an entire blog to work with the CommentPress interface. What we've done - with the help of a talented WordPress developer named Mark Edwards, and invaluable guidance and insight from Jeremy Douglass of the Software Studies project at UC San Diego (and the Writer Response Theory blog) - is make CommentPress into a plugin: a program that enables a specific function on demand within a larger program or site. This is an important step for CommentPress, giving it a new flexibility that it has sorely lacked and acknowledging that it is not a one-size-fits-all solution.
Just to be clear, these changes are not yet packaged into the general CommentPress codebase, although they will be before too long. A good test run is still needed to refine the new model, and important decisions have to be made about the overall direction of CommentPress: whether from here it definitively becomes a plugin, or perhaps forks into two paths (theme and plugin), or somehow combines both options within a single package. If you have opinions on this matter, we're all ears...
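The theme-versus-plugin distinction can be sketched in a toy, language-agnostic way. This is illustrative Python, not actual WordPress/PHP code; the hook name "the_content" is borrowed from WordPress's filter system, but the classes and the `peer_review` flag are hypothetical, invented purely to show the idea: a theme replaces the whole presentation layer, while a plugin registers a function on a hook and applies its behavior selectively - here, only to posts flagged for review.

```python
# Toy model of plugin-style hooks: the blog renders every post the same
# way, but registered filter functions can modify the output on demand.

class Blog:
    def __init__(self):
        # hook name -> list of filter callables (plugin registrations)
        self.filters = {}

    def add_filter(self, hook, fn):
        self.filters.setdefault(hook, []).append(fn)

    def render_post(self, post):
        html = f"<p>{post['body']}</p>"
        # run any filters registered on "the_content", plugin-style
        for fn in self.filters.get("the_content", []):
            html = fn(html, post)
        return html

def commentpress_style_filter(html, post):
    # Selective behavior: only posts flagged for peer review get the
    # paragraph-level-commenting treatment; everything else is untouched.
    if post.get("peer_review"):
        return html.replace("<p>", '<p class="commentable">')
    return html

blog = Blog()
blog.add_filter("the_content", commentpress_style_filter)

print(blog.render_post({"body": "Chapter 1", "peer_review": True}))
print(blog.render_post({"body": "Regular post"}))
```

A theme, by contrast, would be the equivalent of replacing `render_post` wholesale - which is why, as a theme, CommentPress had to take over the entire site rather than just the manuscript posts.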
But the potential impact of this project goes well beyond the technical.
It represents a bold step by a scholarly press - one of the most distinguished and most innovative in the world - toward developing new procedures for vetting material and assuring excellence, and more specifically, toward meaningful collaboration with existing online scholarly communities to develop and promote new scholarship.
It seems to me that the presses that will survive the present upheaval will be those that learn to productively interact with grassroots publishing communities in the wild of the Web and to adopt the forms and methods they generate. I don't think this will be a simple story of the blogosphere and other emerging media ecologies overthrowing the old order. Some of the older order will die off to be sure, but other parts of it will adapt and combine with the new in interesting ways. What's particularly compelling about this present experiment is that it has the potential to be (perhaps now or perhaps only in retrospect, further down the line) one of these important hybrid moments - a genuine, if slightly tentative, interface between two publishing cultures.
Whether the MIT folks realize it or not (their attitude at the outset seems to be respectful but skeptical), this small experiment may contain the seeds of larger shifts that will redefine their trade. The most obvious changes leveled on publishing by the Internet, and the ones that get by far the most attention, are in the area of distribution and economic models. The net flattens distribution, making everyone a publisher, and radically undercuts the heretofore profitable construct of copyright and the whole system of information commodities. The effects are less clear, however, in those hardest to pin down yet most essential areas of publishing - the territory of editorial instinct, reputation, identity, trust, taste, community... These are things that the best print publishers still do quite well, even as their accounting departments and managing directors descend into panic about the great digital undoing. And these are things that bloggers and bookmarkers and other web curators, archivists and filterers are also learning to do well - to sift through the information deluge, to chart a path of quality and relevance through the incredible, unprecedented din.
This is the part of publishing that is most important, that transcends technological upheaval - you might say the human part. And there is great potential for productive alliances between print publishers and editors and the digital upstarts. By delegating half of the review process to an existing blog-based peer community, effectively plugging a node of his press into the Web-based communications circuit, Doug Sery is trying out a new kind of editorial relationship and exercising a new kind of editorial choice. Over time, we may see MIT evolve to take on some of the functions that blog communities currently serve, to start providing technical and social infrastructure for authors and scholarly collectives, and to play the valuable (and time-consuming) roles of facilitator, moderator and curator within these vast overlapping conversations. Fostering, organizing, designing those conversations may well become the main work of publishing and of editors.
I could go on, but better to hold off on further speculation and to just watch how it unfolds. The Expressive Processing peer review experiment begins today (the first actual manuscript section is here) and will run for approximately ten weeks and 100 thousand words on Grand Text Auto, with a new post every weekday during that period. At the end, comments will be sorted, selected and incorporated and the whole thing bundled together into some sort of package for MIT. We're still figuring out how that part will work. Please go over and take a look and if a thought is provoked, join the discussion.
watch wikipedia happen 01.22.2008, 3:59 AM
Markers move around the map registering Wikipedia edits in close to real time. Weirdly compelling.
new commentpress version available: plays well with latest wordpress! 01.21.2008, 11:53 AM
We've finally squashed the bug that made CommentPress incompatible with the latest version of WordPress (2.3), so anyone out there with a CP installation can finally go ahead and upgrade:
Other than the compatibility fix, 1.4.1 is exactly the same as 1.4. Of course, there are numerous improvements we'd still like to make, and plans for that are underway. Stay tuned.
Also: tomorrow I'm going to be announcing an exciting new CommentPress publishing experiment that will suggest possible future directions for the tool's development.
london calling 01.21.2008, 11:14 AM
Thurs 13 Mar @ Bishopsgate Institute
BOOK FUTURES: Scott Pack (thefridayproject.co.uk) + Chris Meade (futureofthebook.org) + John Lenehan + Shirley Dent (Chair)
What does the future hold for reading, writing and publishing? When we all go digital, what will be left of the book? Is the Ebook actually any good? And has Blogging really revolutionised literature? These are just some of the questions tackled by our expert panel of writers and publishers, which includes Scott Pack and Chris Meade. This is one for voracious readers, aspiring writers, keen Bloggers and tech heads alike.
Go to www.londonwordfestival.com for further details.
literature electrique 01.21.2008, 10:23 AM
I've been meaning to post something for a while about The Reprover, or Le Reprobateur, a hugely impressive work of digital fiction by François Coulon, a Paris-based digital writer. It includes excellent cartoons, live video of the main character, and a witty text in French and elaborate English which expands and contracts - the same sentence blooming different additional clauses each time you pass a mouse across it. This is a deeply disconcerting effect at first, but once you've got used to it, a whole new kind of three-dimensional reading emerges. It's a fascinating idea which could only work on the web.
I've been meaning to post.. but haven't got round to it. That's why I need a Reprobateur, "someone who would be there simply to give us a bad conscience." Part psychoanalyst, part priest, part bloke in a suit, the Reprover is a wonderful creation. The story is set in the 80s and you can navigate around it by spinning a 3D polyhedron. "It's literature plus electricity!" says Coulon.
It's also plus so many tricks and distractions that it's hard to settle into - there's too much fun to be had clicking, spinning and adjusting the layers of soundtrack to actually immerse oneself in the story. The Reprover is beautifully produced and costs real money: 16 Euros or 160 for institutions, but you can get an excellent taster by going to http://www.totonium.com.
I've been going back to this one several times for more. Once you're signed up you can contact the narrator for free advice from your very own Reprover. You'll wonder how you coped all those years without one.
read this 01.16.2008, 1:27 PM
An interesting experiment on Vimeo. See what's going on?
Via IT IN place.
emergency books 01.16.2008, 12:18 AM
In the course of looking for something else entirely, I just stumbled upon Emergency Books. It's a (slightly dormant) side project of Litromagazine, a freesheet that publishes and distributes short fiction outside London Underground stations. Emergency Books are, very simply, out-of-print texts taken from Project Gutenberg and dropped wholesale into a PDF template that makes them easy and economical to print on a standard home printer. They're designed "for when you've nothing to read and a standard issue of Litro is too short", the publisher (is that the right word here?) explains:
Each 'double page spread' fits nicely in an Acrobat Reader window, which results in minimal need for scrolling. On- or off-screen, the columns are relatively narrow and short so you don't get lost in a sea of text (as you would if you simply printed direct from Project Gutenberg). There is little of the blank white space found in standard books - this is to get as much text on the page as possible thereby reducing the total number of pages required (for example, The Call of the Wild by Jack London, at 128 pages in book form, takes only 15 double-sided printed A4 sheets as an Emergency Book - while being just as easy to read). This saves on resources as well as making the printed Emergency Book easier to fold and carry around.
If you are a 'format purist', you may well hate them. But if you love literature for the content, Emergency Books could be for you.
Of the small number who've saved Emergency Books on del.icio.us, one noted that Emergency Books are 'for reading when you're caught short. If that ever happens'. I like the idea of literature being, like cigarettes, something one can be 'caught short' without - for all that in this age of information overload the reverse more often feels true. There aren't that many texts there at present, and I'm slightly baffled by the extant choice. But whatever you think of Conan Doyle, Emergency Books shows a refreshingly pragmatic grasp of the relation between digital and paper publishing formats, and represents an interesting attempt at minimising the downsides of each in the interests of guaranteeing the reading addict a regular fix.
nominate the best tech writing of 2007 01.15.2008, 8:44 AM
digitalculturebooks, a collaborative imprint of the University of Michigan Press and Library, publishes an annual anthology of the year's best technology writing. The nominating process is open to the public and they're giving people until January 31st to suggest exemplary articles on "any and every technology topic--biotech, information technology, gadgetry, tech policy, Silicon Valley, and software engineering" etc.
The 2007 collection is being edited by Clive Thompson. Last year's was edited by Steven Levy. When complete, the collection is published as a trade paperback and put online in its entirety in clean, fully searchable HTML editions, so head over and help build what will become a terrific open access resource.
orson whales in high def 01.14.2008, 11:01 AM
youtube purges: fair use tested 01.11.2008, 2:18 PM
Last week there was a wave of takedowns on YouTube of copyright-infringing material - mostly clips from television and movies. MediaCommons, the nascent media studies network we help to run, felt this rather acutely. In Media Res, an area of the site where media scholars post and comment on video clips, uses YouTube and other free hosting sites like Veoh and blip.tv to stream its video. The upside of this is that it's convenient, free and fast. The downside is that it leaves In Media Res, which is quickly becoming a valuable archive of critically annotated media artifacts, vulnerable to the copyright purges that periodically sweep fan-driven media sites, YouTube especially.
In this latest episode, a full 27 posts on In Media Res suddenly found themselves with gaping holes where video clips once had been - the biggest single takedown we've yet experienced. Fortunately, since we regard these sorts of media quotations as fair use, we make it a policy to rip backups of every externally hosted clip so that we can remount them on our own server in the event of a takedown. And so, with a little work, nearly everything was restored - there were a few clips that for various reasons we had failed to back up. We're still trying to scrounge up other copies.
The MediaCommons fair use statement reads as follows:
MediaCommons is a strong advocate for the right of media scholars to quote from the materials they analyze, as protected by the principle of "fair use." If such quotation is necessary to a scholar's argument, if the quotation serves to support a scholar's original analysis or pedagogical purpose, and if the quotation does not harm the market value of the original text -- but rather, and on the contrary, enhances it -- we must defend the scholar's right to quote from the media texts under study.
The good news is that In Media Res carries on relatively unruffled, but these recent events serve as a sobering reminder of the fragility of the media ecology we are collectively building, of the importance of the all too infrequently invoked right of fair use in non-textual media contexts, and of the need for more robust, legally insulated media archives. They also supply us with a handy moral: keep backups of everything. Without a practical contingency plan, fair use is just a bunch of words.
Incidentally, some of these questions were raised in a good In Media Res post last August by Sharon Shahaf of the University of Texas, Austin: The Promises and Challenges of Fan-Based On-Line Archives for Global Television.
poem for no one 01.11.2008, 8:08 AM
Just came across something lovely. Video for "Jed's Other Poem (Beautiful Ground)" by the now disbanded Grandaddy from their great album The Sophtware Slump (2000). Jed is a character who weaves in and out of the album, a forlorn humanoid robot made of junk parts who eventually dies, leaving behind a few mournful poems.
Creator Stewart Smith: "I programmed this entirely in Applesoft BASIC on a vintage 1979 Apple ][+ with 48K of RAM -- a computer so old it has no hard drive, mouse, or up/down arrow keys, and only types in capitals. First open-source music video, code available on website. Cinematography by Jeff Bernier." A nice detail of the story is that this was originally a fan vid but was eventually adopted as the "official" video for the song.
Thanks to Alex Itin for the link!
no longer separated by a common language 01.10.2008, 8:06 AM
LibraryThing now interfaces with the British Library and loads of other UK sources:
The BL is a catch in more than one way. It's huge, of course. But, unlike some other sources, BL data isn't normally available to the public. To get it, our friends at Talis, the UK-based library software company, have granted us special access to their Talis Base product, an elephantine mass of book data. In the case of the BL, that's some twelve million unique records, two copies of the Gutenberg Bible and two copies of the Magna Carta.
reading between the lines? 01.09.2008, 12:13 PM
The NEA claims it wishes to "initiate a serious discussion" over the findings of its latest report, but the public statements from representatives of the Endowment have had a terse or caustic tone, such as in Sunil Iyengar's reply to Nancy Kaplan. Another example is Mark Bauerlein's letter to the editor in response to my December 7, 2007 Chronicle Review piece, "How Reading is Being Reimagined," a letter in which Bauerlein seems unable or unwilling to elevate the discourse beyond branding me a "votary" of screen reading and suggesting that I "do some homework before passing opinions on matters out of [my] depth."
One suspects that, stung by critical responses to the earlier Reading at Risk report (2004), the decision this time around was that the best defense is a good offense. Bauerlein chastises me for not matching data with data, that is, for failing to provide any quantitative documentation in support of various observations about screen reading and new media (not able to resist the opportunity for insult, he also suggests such indolence is only to be expected of a digital partisan). Yet data wrangling was not the focus of my piece, and I said as much in print: rather, I wanted to raise questions about the NEA's report in the context of the history of reading, questions which have also been asked by Harvard scholar Leah Price in a recent essay in the New York Times Book Review.
If my work is lacking in statistical heavy mettle, the NEA's description of reading proceeds as though the last three decades of scholarship by figures like Elizabeth Eisenstein, Harvey Graff, Anthony Grafton, Lisa Jardine, Bill Sherman, Adrian Johns, Roger Chartier, Peter Stallybrass, Patricia Crain, Lisa Gitelman, and many others simply does not exist. But this body of work has demolished the idea that reading is a stable or historically homogeneous activity, thereby ripping the support out from under the quaint notion that the codex book is the simple, self-consistent artifact it is presented as in the reports, while also documenting the numerous varieties of cultural anxiety that have attended the act of reading and questions over whether we're reading not enough or too much.
It's worth underscoring that the academic response to the NEA's two reports has been largely skeptical. Why is this? After all, in the ivied circles I move in, everyone loves books, cherishes reading, and wants people to read more, in whatever venue or medium. I also know that's true of the people at if:book (and thanks to Ben Vershbow, by the way, for giving me the opportunity to respond here). And yet we bristle at the data as presented by the NEA. Is it because, as academics, eggheads, and other varieties of bookwormish nerds and geeks we're all hopelessly ensorcelled by the pleasures of problematizing and complicating rather than accepting hard evidence at face value? Herein lies the curious anti-intellectualism to which I think at least some of us are reacting, an anti-intellectualism that manifests superficially in the rancorous and dismissive tone that Bauerlein and Iyengar have brought to the very conversation they claim they sought to initiate, but anti-intellectualism which, at its root, is - just possibly - about a frustration that the professors won't stop indulging their fancy theories and footnotes and ditzy digital rhetoric. (Too much book larnin' going on up at the college? Is that what I'm reading between the lines?)
Or maybe I'm wrong about that last bit. I hope so. Because as I said in my Chronicle Review piece, there's no doubt it's time for a serious conversation about reading. Perhaps we can have a portion of it here on if:book.
University of Maryland
Related: "the NEA's misreading of reading"
the year of reading dangerously 01.08.2008, 6:48 PM
2008 is going well so far for the Institute in London - I was invited to 10 Downing Street this morning for the launch of the National Year of Reading which takes place in 2008, as one of a small group including literacy promoters, librarians, teachers, schoolchildren, authors and Richard Madeley, the presenter who, with his partner Judy, has become the British equivalent of Oprah, hosting a hugely influential TV book group which helps the trade to sell stacks of the titles it recommends. Prime Minister Gordon Brown has had a rough few months since taking over from Blair, but was at his best today - he's a genuine enthusiast for reading.
One topic for discussion was the importance of fathers reading to their children, and in particular to their sons. There are so many opportunities for new media here to help reach out to those who don't think of themselves as 'book people'.
Ten years ago the first Year of Reading kicked off a lot of activities and alliances which have thrived since, but I don't remember anyone giving much attention to the internet - except as a place to download resources from. So I was delighted to be there this time representing the Institute, and able to make the point at the outset that any promotion of the importance of literacy skills, reading appetite and the pleasure of literature must recognise the cultural importance of the networked screen and the interconnectedness of different media in the minds of young people and the lives of us all, even those who don't acknowledge this. Well, I kind of made that point...briefly and perhaps not so clearly. Anyway, I was there and got to speak up for if:book. The year has a different theme each month, ending with the Future of Reading in December, so we are planning all kinds of activities to link with that. Watch this space.
NEA reading debate round 2: an exchange between sunil iyengar and nancy kaplan 01.08.2008, 6:39 PM
Last week I received an email from Sunil Iyengar of the National Endowment for the Arts responding to Nancy Kaplan's critique (published here on if:book) of the NEA's handling of literacy data in its report "To Read or Not to Read." I'm reproducing the letter followed by Nancy's response.
The National Endowment for the Arts welcomes a "careful and responsible" reading of the report, To Read or Not To Read, and the data used to generate it. Unfortunately, Nancy Kaplan's critique (11/30/07) misconstrues the NEA's presentation of Department of Education test data as a "distortion," although all of the report's charts are clearly and accurately labeled.
For example, in Charts 5A to 5D of the full report, the reader is invited to view long-term trends in the average reading score of students at ages 9, 13, and 17. The charts show test scores from 1984 through 2004. Why did we choose that interval? Simply because most of the trend data in the preceding chapters--starting with the NEA's own study data featured in Chapter One--cover the same 20-year period. For the sake of consistency, Charts 5A to 5D refer to those years.
Dr. Kaplan notes that the Department of Education's database contains reading score trends from 1971 onward. The NEA report also emphasizes this fact, in several places. In 2004, the report observes, the average reading score for 17-year-olds dipped back to where it was in 1971. "For more than 30 years...17-year-olds have not sustained improvements in reading scores," the report states on p. 57. Nine-year-olds, by contrast, scored significantly higher in 2004 than in 1971.
Further, unlike the chart in Dr. Kaplan's critique, the NEA's Charts 5A to 5D explain that the "test years occurred at irregular intervals," and each test year from 1984 to 2004 is provided. Also omitted from the critique's reproduction are labels for the charts' vertical axes, which provide 5-point rather than the 10-point intervals used by the Department of Education chart. Again, there is no mystery here. Five-point intervals were chosen to make the trends easier to read.
Dr. Kaplan makes another mistake in her analysis. She suggests that the NEA report is wrong to draw attention to declines in the average reading score of adult Americans of virtually every education level, and an overall decline in the percentage of adult readers who are proficient. But the Department of Education itself records these declines. In their separate reports, the NEA and the Department of Education each acknowledge that the average reading score of adults has remained unchanged. That's because from 1992 to 2003, the percentage of adults with postsecondary education increased and the percentage who did not finish high school decreased. "After all," the NEA report notes, "compared with adults who do not complete high school, adults with postsecondary education tend to attain higher prose scores." Yet this fact in no way invalidates the finding that average reading scores and proficiency levels are declining even at the highest education levels.
"There is little evidence of an actual decline in literacy rates or proficiency," Dr. Kaplan concludes. We respectfully disagree.
Director, Research & Analysis
National Endowment for the Arts
I appreciate Mr. Iyengar's engagement with issues at the level of data and am happy to acknowledge that the NEA's report includes a single sentence on pages 55-56 with the crucial concession that over the entire period for which we have data, the average scale scores of 17-year-olds have not changed: "By 2004, the average scale score had retreated to 285, virtually the same score as in 1971, though not shown in the chart." I will even concede the accuracy of the following sentence: "For more than 30 years, in other words, 17-year-olds have not sustained improvements in reading scores" [emphasis in the original]. What the report fails to note or account for, however, is that there actually was a period of statistically significant improvement in scores for 17-year-olds from 1971 to 1984. Although I did not mention it in my original critique, the report handles data from 13-year-olds in the same way: "the scores for 13-year-olds have remained largely flat from 1984-2004, with no significant change between the 2004 average score and the scores from the preceding seven test years. Although not apparent from the chart, the 2004 score does represent a significant improvement over the 1971 average - a four-point increase" (p. 56).
In other words, a completely accurate and honest assessment of the data shows that reading proficiency among 17 year-olds has fluctuated over the past 30 years, but has not declined over that entire period. At the same time, reading proficiency among 9 year-olds and 13 year-olds has improved significantly. Why does the NEA not state the case in the simple, accurate and complete way I have just written? The answer Mr. Iyengar proffers is consistency, but that response may be a bit disingenuous.
Plenty of graphs in the NEA report show a variety of time periods, so there is at best a weak rationale for choosing 1984 as the starting point for the graphs in question. Consistency, in this case, is surely less important than accuracy and completeness. Given the inferences the report draws from the data, then, it is more likely that the sample of data the NEA used in its representations was chosen precisely because, as Mr. Iyengar admits, that sample would make "the trends easier to read." My point is that the "trends" the report wants to foreground are not the only trends in the data: truncating the data set makes other, equally important trends literally invisible. A single sentence in the middle of a paragraph cannot excuse the act of erasure here. As both Edward Tufte (The Visual Display of Quantitative Information) and Jacques Bertin (Semiology of Graphics), the two most prominent authorities on graphical representations of data, demonstrate in their seminal works on the subject, selective representation of data constitutes distortion of that data.
Similarly, labels attached to a graph, even when they state that the tests occurred at irregular intervals, do not substitute for representing the irregularity of the intervals in the graph itself (again, see Tufte and Bertin). To do otherwise is to turn disinterested analysis into polemic. "Regularizing" the intervals in the graphic representation distorts the data.
The NEA report wants us to focus on a possible correlation between choosing to read books in one's leisure time, reading proficiency, and a host of worthy social and civic activities. Fine. But if the reading scores of 17-year-olds improved from 1971 to 1984 but there is no evidence that during the period of improvement these youngsters were reading more, the case the NEA is trying to build becomes shaky at best. Similarly, the reading scores of 13-year-olds improved from 1971 to 1984 but "have remained largely flat from 1984-2004 ...." Yet during that same period, the NEA report claims, leisure reading among 13-year-olds was declining. So what exactly is the hypothesis here - that sometimes declines in leisure reading correlate with declines in reading proficiency but sometimes such a decline is not accompanied by a decline in reading proficiency? I'm skeptical.
My critique is aimed at the management of data (rather than the a-historical definition of reading the NEA employs, a somewhat richer and more potent issue joined by Matthew Kirschenbaum and others) because I believe that a crucial component of contemporary literacy, in its most capacious sense, includes the ability to understand the relationships between claims, evidence and the warrants for that evidence. The NEA's data need to be read with great care and its argument held to a high scientific standard lest we promulgate worthless or wasteful public policy based on weak research.
I am a humanist by training and so have come to my appreciation of quantitative studies rather late in my intellectual life. I cannot claim to have a deep understanding of statistics, yet I know what "confounding factors" are. When the NEA report chooses to claim that the reading proficiency of adults is declining while at the same time ignoring the NCES explanation of the statistical paradox that explains the data, it is difficult to avoid the conclusion that the report's authors are not engaging in a disinterested (that is, dispassionate) exploration of what we can know about the state of literacy in America today but are instead cherry-picking the elements that best suit the case they want to make.
Nancy Kaplan, Executive Director
School of Information Arts and Technologies
University of Baltimore
the year of the author 01.07.2008, 9:16 PM
Natalie Merchant, one of my favorite artists, was featured in The New York Times today. She is back after a long hiatus, but if you want to hear her new songs you'd better stand in line for a ticket to one of her shows, because she doesn't plan to release an album anytime soon. She appeared this weekend at the Hiro Ballroom in New York City. According to the Times, when a voice in the crowd asked when Ms. Merchant would release a new album, she said with a smile that she was awaiting "a new paradigm for the recording industry."
hmm, well, the good news is that the paradigm is shifting, fast. But we don't yet know if this will be a good thing or a bad thing. It's certainly a bad thing for the major labels, who are losing market share faster than polar bears are losing their ice (sorry for the awful metaphor). But as they continue to shrink, so do the services and protections they offer to the artists. And the more content moves online, the less customers are willing to pay for it. Radiohead's recent experiment proves that.
But artists are still embracing new media and using it to take matters into their own hands. In the music industry, a long-tail entrepreneurial system supported by online networks and e-commerce is beginning to emerge. Sites like nimbit empower artists to manage their own sales and promotion, bypassing itunes, which takes a hefty 50% off the top and, unlike record labels, does nothing to shape or nurture an artist's career.
Now, indulge me for a moment while I talk about the Kindle as though it were the ipod of ebooks. It's not, for lots of reasons. But it does have one thing in common with its music industry counterpart: it allows authors to upload their own content and sell it on amazon. That is huge. That alone might be enough to start a similar paradigm shift in publishing. In this week's issue of Publishers Weekly, Mike Shatzkin predicts it will.
So why have I titled this "the year of the author"? (I borrowed that phrase from Mike Shatzkin's prediction #3, btw.) I'm not trying to say it will be a great year for authors. New media is going to squeeze them as it is squeezing musicians and striking writers' guild members. It is the year of the author because authors will be the ones who drive the paradigm shift. They may begin to use online publishing and distribution tools to bypass traditional publishers and put their work out there en masse. OR they will opt out of the internet's "give-up-your-work-for-free" model and create a new model altogether. Natalie Merchant is opting to (temporarily, I hope) bring back the troubadour tradition in the music biz. It will be interesting to see what choices authors make as the publishing industry's ice begins to shift.
the future of the sustainable book 01.02.2008, 10:59 PM
On New Year's Eve, I got lost in Yonkers trying to take my son's gently-used toys to the Salvation Army. The Yonkers store was the only one I could find willing to take them. The guy on the phone hesitated, "Are they in good condition?" he asked, clearly unhappy about my impending donation. I assured him they were, and he sighed and told me to come on over.
On principle, I try (really hard) to give away anything that is not completely worn out. But it is getting harder and harder to do. Nobody wants my old furniture or clothes or books. And they especially don't want used children's toys. My attempt to give them away was ill-fated. A police barricade stopped me at Nepperhan Avenue (a construction site disaster). Then I drove around for forty minutes until I found an alternate route but was thwarted at Ashburton Ave (building on fire, streets blocked). I gave up and went home. With a stomach full of guilt, I put the plastic toys in the dumpster. My son didn't mind because he had a brand new pile of toys in his playroom, Christmas gifts from relatives and friends who couldn't be dissuaded.
Point is, it seems increasingly difficult to opt out of the cycle of waste-creation. Plastic kids' toys are just one example. I'm also guilty of consuming and transforming lots of other things into waste: clothes, computers, cell phones, magazines, all sorts of complicatedly-packaged food and beverage items, etc... So yesterday, when I contemplated how best to spend 2008, I decided to focus on figuring out how to create a more sustainable lifestyle. And since I work in book publishing, job one is to figure out what it means to create a sustainable book. Lots of models come to mind. Good ones like Wikipedia (device-neutral and always in the latest, free, edition) and bad ones like the Kindle (which tries to create a market for an ebook reader with designed obsolescence).
Anyway, I thought it might be useful to weave the sustainability discussion into if:book's ongoing consideration of networked ebooks, because at this stage in their development, networked books could be shaped with sustainability in mind. So, I'm hoping to stir up some interesting discussion and serious contemplation of the perfectly sustainable book: one that is constantly revised, but never needs to be reprinted (or repurchased); one that is lean and simple and doesn't require a small server farm or a special device; one that makes an enormous impact, but leaves a teeny tiny carbon footprint; one we can live with for ever and ever without getting bored or satiated.
coming soon to a laptop near you 01.02.2008, 3:22 AM
What makes me think 2008 will be a big year for the future of the book?
Last night in London we went to see the movie of The Golden Compass, adapted from the excellent Northern Lights by Philip Pullman. Imagine my surprise when all three trailers shown were for films about books - not just film adaptations but movies in which the book itself stars.
Number one: Inkheart. "Maggie and her father had a special gift when it came to reading stories... but there's one book they should never have read... " It's based on the novels of Cornelia Funke.
Number two: Spiderwick, based on books by artist Tony DiTerlizzi and Holly Black. 'Their World is Closer Than You Think'.
A child reads: 'Do not dare to read this book for if you do but take a look...'
The trailer ends with an evil monster growling, "Give me the book!"
And finally, for the adults, Nicolas Cage in National Treasure: Book of Secrets.
Quotes from the trailer: "I need to see the page - there's a symbol... it's the President's secret book, it contains all the conspiracies... The book exists!" It's the Search for the Code of the Bride of Harry Potter & Da Vinci.
Meanwhile I loved The Golden Compass, which has been less compromised by Hollywood than I'd feared. When the movie was launched I was horrified to hear a radio debate in New York about the dangers of Pullman's philosophy contaminating innocent children - nobody voiced the view that kids deserved more atheist messages, not fewer.
But I thought the least successful element was the tricksy way they showed the all-knowing alethiometer at work: swirling dust revealing fuzzy orange images. Only in text can you convincingly describe what it would feel like to know the future.
Will this fascination with the secret world of reading lead to increased sales for conventional tomes or is this the beginning of the final battle between page and screen?! And will it lead to more interest in new ways of mixing literature and image to make networked works of staggering genius?
This spring, experience IFBOOK! 'It's a novel, Bob... but not as we know it.'
Happy New Year