Listing entries tagged with internet
lessig: read/write internet under threat 02.17.2006, 1:51 PM
In an important speech to the Open Source Business Conference in San Francisco, Lawrence Lessig warned that decreased regulation of network infrastructure could fundamentally throw off the balance of the "read/write" internet, gearing the medium toward commercial consumption and away from creative production by everyday people. Interestingly, he cites Apple's iTunes music store, generally praised as the shining example of enlightened digital media commerce, as an example of what a "read-only" internet might look like: a site where you load up your plate and then go off to eat alone.
Lessig is drawing an important connection between the question of regulation and the question of copyright. Initially, copyright was conceived as a way to stimulate creative expression -- for the immediate benefit of the author, but for the overall benefit of society. But over the past few decades, copyright has been twisted by powerful interests to mean the protection of media industry business models, which are now treated like a sacred, inviolable trust. Lessig argues that it's time for a values check -- time to return to the original spirit of copyright:
It's never been the policy of the U.S. government to choose business models, but to protect the authors and artists... I'm sure there is a way for [new models to emerge] that will let artists succeed. I'm not sure we should care if the record companies survive. They care, but I don't think the government should.
Big media have always lobbied for more control over how people use culture, but until now, it's largely been through changes to the copyright statutes. The distribution apparatus -- record stores, booksellers, movie theaters etc. -- was not a concern since it was secure and pretty much by definition "read-only." But when we're dealing with digital media, the distribution apparatus becomes a central concern, and that's because the apparatus is the internet, which at present, no single entity controls.
Which is where the issue of regulation comes in. The cable and phone companies believe that since the culture flows through their physical infrastructure, they should be able to control how it flows. They want the right to shape the flow of culture to best fit their ideal architecture of revenue. You can see, then, how if they had it their way, the internet would come to look much more like an on-demand broadcast service than the vibrant two-way medium we have today: simply because it's easier to make money from read-only than from read/write -- from broadcast than from public access.
Control over culture goes hand in hand with control over bandwidth -- one monopoly supporting the other. And unless more moderates like Lessig start lobbying for the public interest, I'm afraid our government will be seduced by this fanatical philosophy of control, which when aired among business-minded people, does have a certain logic: "It's our content! Our pipes! Why should we be bled dry?" It's time to remind the media industries that their business models are not synonymous with culture. To remind the phone and cable companies that they are nothing more than utility companies and that they should behave accordingly. And to remind the government who copyright and regulation are really meant to serve: the actual creators -- and the public.
Posted by ben vershbow at 01:51 PM
| Comments (6)
tags: Copyright and Copyleft , DRM , Network_Freedom , broadband , copyleft , copyright , internet , lessig , media , network_freedom , network_neutrality , policy , read/write_web
google gets mid-evil 01.30.2006, 3:46 PM
At the World Economic Forum in Davos last Friday, Google CEO Eric Schmidt assured a questioner in the audience that his company had in fact thoroughly searched its soul before deciding to roll out a politically sanitized search engine in China:
We concluded that although we weren't wild about the restrictions, it was even worse to not try to serve those users at all... We actually did an evil scale and decided not to serve at all was worse evil.
what I heard at MIT 01.26.2006, 9:47 AM
Over the next few days I'll be sifting through notes, links, and assorted epiphanies crumpled up in my pocket from two packed, and at times profound, days at the Economics of Open Content symposium, hosted in Cambridge, MA by Intelligent Television and MIT Open CourseWare. For now, here are some initial impressions -- things I heard, both spoken in the room and ricocheting inside my head during and since. An oral history of the conference? Not exactly. More an attempt to jog the memory. Hopefully, though, something coherent will come across. I'll pick up some of these threads in greater detail over the next few days. I should add that this post owes a substantial debt in form to Eliot Weinberger's "What I Heard in Iraq" series (here and here).
Naturally, I heard a lot about "open content."
I heard that there are two kinds of "open." Open as in open access -- to knowledge, archives, medical information etc. (like Public Library of Science or Project Gutenberg). And open as in open process -- work that is out in the open, open to input, even open-ended (like Linux, Wikipedia or our experiment with Mitch Stephens, Without Gods).
I heard that "content" is actually a demeaning term, treating works of authorship as filler for slots -- a commodity as opposed to a public good.
I heard that open content is not necessarily the same as free content. Both can be part of a business model, but the defining difference is control -- open content is often still controlled content.
I heard that if you build the open-access resources and demonstrate their value, the money will come later.
I heard that content should be given away for free and that the money is to be made talking about the content.
I heard that reputation and an audience are the most valuable currency anyway.
I heard that the academy's core mission -- education, research and public service -- makes it a moral imperative to have all scholarly knowledge fully accessible to the public.
I heard that if knowledge is not made widely available and usable then its status as knowledge is in question.
I heard that libraries may become the digital publishing centers of tomorrow through simple, open-access platforms, overhauling the print journal system and redefining how scholarship is disseminated throughout the world.
And I heard a lot about copyright...
I heard that probably about 50% of the production budget of an average documentary film goes toward rights clearances.
I heard that many of those clearances are for "underlying" rights to third-party materials appearing in the background or reproduced within reproduced footage. I heard that these are often things like incidental images, video or sound; or corporate logos or facades of buildings that happen to be caught on film.
I heard that there is basically no "fair use" space carved out for visual and aural media.
I heard that this all but paralyzes our ability as a culture to fully examine ourselves in terms of the media that surround us.
I heard that the various alternative copyright movements are not necessarily all pulling in the same direction.
I heard that there is an "inter-operability" problem between alternative licensing schemes -- that, for instance, Wikipedia's GNU Free Documentation License is not inter-operable with any Creative Commons licenses.
I heard that since the mass market content industries have such tremendous influence on policy, that a significant extension of existing copyright laws (in the United States, at least) is likely in the near future.
I heard one person go so far as to call this a "totalitarian" intellectual property regime -- a police state for content.
I heard that one possible benefit of this extension would be a general improvement of internet content distribution, and possibly greater freedom for creators to independently sell their work since they would have greater control over the flow of digital copies and be less reliant on infrastructure that today only big companies can provide.
I heard that another possible benefit of such control would be price discrimination -- i.e. a graduated pricing scale for content, varying according to the means of individual consumers, which could result in fairer prices. Basically, a graduated cultural consumption tax imposed by media conglomerates.
I heard, however, that such a system would be possible only through a substantial invasion of users' privacy: tracking users' consumption patterns in other markets (right down to their local grocery store), pinpointing of users' geographical location and analysis of their socioeconomic status.
I heard that this degree of control could be achieved only through persistent surveillance of the flow of content through codes and controls embedded in files, software and hardware.
I heard that such a wholesale compromise on privacy is all but inevitable -- is in fact already happening.
I heard that in an "information economy," user data is a major asset of companies -- an asset that, like financial or physical property assets, can be liquidated, traded or sold to other companies in the event of bankruptcy, merger or acquisition.
I heard that within such an over-extended (and personally intrusive) copyright system, there would still exist the possibility of less restrictive alternatives -- e.g. a peer-to-peer content cooperative where, for a single low fee, one can exchange and consume content without restriction; money is then distributed to content creators in proportion to the demand for and use of their content.
I heard that such an alternative could theoretically be implemented on the state level, with every citizen paying a single low tax (less than $10 per year) giving them unfettered access to all published media, and easily maintaining the profit margins of media industries.
I heard that, while such a scheme is highly unlikely to be implemented in the United States, a similar proposal is in early stages of debate in the French parliament.
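The cooperative model described above -- a single flat fee pooled and then distributed to creators in proportion to demand for their work -- comes down to simple proportional arithmetic. Here is a minimal sketch of that payout logic; the function name, the fee figures, and the artist names are all hypothetical illustrations, not anything proposed at the symposium:

```python
# Hypothetical sketch of the flat-fee content cooperative's payout rule:
# pool the members' fees, then divide the pool among creators in
# proportion to how often each creator's work was used.

def split_pool(pool, usage):
    """Divide `pool` among creators proportionally to their `usage` counts."""
    total = sum(usage.values())
    if total == 0:
        # No recorded use: nothing to distribute.
        return {creator: 0.0 for creator in usage}
    return {creator: pool * count / total for creator, count in usage.items()}

# e.g. 1,000 members paying a $10/year fee -> a $10,000 pool
payouts = split_pool(10_000, {"artist_a": 600, "artist_b": 300, "artist_c": 100})
print(payouts)  # artist_a, with 60% of total use, receives 60% of the pool
```

The point of the sketch is only that the scheme's fairness hinges entirely on how "demand and use" are measured -- which is exactly where the surveillance concerns raised above come in.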
And I heard a lot about peer-to-peer...
I heard that p2p is not just a way to exchange files or information; it is a paradigm shift that is totally changing the way societies communicate, trade, and build.
I heard that between 1840 and 1850 the first newspapers that could be said to have mass circulation appeared in America. I heard that as a result -- in the space of that single decade -- the cost of starting a print daily rose approximately 250%.
I heard that modern democracies have basically always existed within a mass media system, a system that goes hand in hand with a centralized, mass-market capital structure.
I heard that we are now moving into a radically decentralized capital structure based on social modes of production in a peer-to-peer information commons, in what is essentially a new chapter for democratic societies.
I heard that the public sphere will never be the same again.
I heard that emerging practices of "remix culture" are in an apprentice stage focused on popular entertainment, but will soon begin manifesting in higher stakes arenas (as suggested by politically charged works like "The French Democracy" or this latest Black Lantern video about the Stanley Williams execution in California).
I heard that in a networked information commons the potential for political critique, free inquiry, and citizen action will be greatly increased.
I heard that whether we will live up to our potential is far from clear.
I heard that there is a battle over pipes, the outcome of which could have huge consequences for the health and wealth of p2p.
I heard that since the telecom monopolies have such tremendous influence on policy, a radical deregulation of physical network infrastructure is likely in the near future.
I heard that this will entrench those monopolies, shifting the balance of the internet to consumption rather than production.
I heard this is because pre-p2p business models see one-way distribution with maximum control over individual copies, downloads and streams as the most profitable way to move content.
I heard also that policing works most effectively through top-down control over broadband.
I heard that the Chinese can attest to this.
I heard that what we need is an open spectrum commons, where connections to the network are as distributed, decentralized, and collaboratively load-sharing as the network itself.
I heard that there is nothing sacred about a business model -- that it is totally dependent on capital structures, which are constantly changing throughout history.
I heard that history is shifting in a big way.
I heard it is shifting to p2p.
I heard this is the most powerful mechanism for distributing material and intellectual wealth the world has ever seen.
I heard, however, that old business models will be radically clung to, as though they are sacred.
I heard that this will be painful.
Posted by ben vershbow at 09:47 AM
| Comments (5)
tags: Copyright and Copyleft , Education , Network_Freedom , Publishing, Broadcast, and the Press , Remix , academia , academy , broadband , conferences_and_excursions , copyleft , copyright , creative_commons , cyberlaw , democracy , economics , economics_of_open_content , film , freedom , internet , media , monopoly , music , network , open_content , open_spectrum , p2p , politics , publishing , scholarship , technology , wikipedia
the book is reading you 01.19.2006, 1:42 PM
I just noticed that Google Book Search requires users to be logged in on a Google account to view pages of copyrighted works.
They provide the following explanation:
Why do I have to log in to see certain pages?
Because many of the books in Google Book Search are still under copyright, we limit the amount of a book that a user can see. In order to enforce these limits, we make some pages available only after you log in to an existing Google Account (such as a Gmail account) or create a new one. The aim of Google Book Search is to help you discover books, not read them cover to cover, so you may not be able to see every page you're interested in.
So they're tracking how much we've looked at and capping our number of page views. Presumably a bone tossed to publishers, who I'm sure will continue suing Google all the same (more on this here). There's also the possibility that publishers have requested information on who's looking at their books -- geographical breakdowns and stats on click-throughs to retailers and libraries. I doubt, though, that Google would share this sort of user data. Substantial privacy issues aside, that's valuable information they want to keep for themselves.
That's because "the aim of Google Book Search" is also to discover who you are. It's capturing your clickstreams, analyzing what you've searched and the terms you've used to get there. The book is reading you. Substantial privacy issues aside (it seems more and more that's where we'll be leaving them), Google will use this data to refine its search algorithms and, who knows, might even develop some sort of personalized recommendation system similar to Amazon's -- you know, where the computer lists other titles that might interest you based on what you've read, bought or browsed in the past (a system that works only if you are logged in). It's possible Google is thinking of Book Search as the cornerstone of a larger venture that could compete with Amazon.
There are many ways Google could eventually capitalize on its books database -- that is, beyond the contextual advertising that is currently its main source of revenue. It might turn the scanned texts into readable editions, hammer out licensing agreements with publishers, and become the world's biggest ebook store. It could start a print-on-demand service -- a Xerox machine on steroids (and the return of Google Print?). It could work out deals with publishers to sell access to complete online editions -- a searchable text to go along with the physical book -- as Amazon announced it will do with its Upgrade service. Or it could start selling sections of books -- individual pages, chapters etc. -- as Amazon has also planned to do with its Pages program.
Amazon has long served as a valuable research tool for books in print, so much so that some university library systems are now emulating it. Recent additions to the Search Inside the Book program such as concordances, interlinked citations, and statistically improbable phrases (where distinctive terms in the book act as machine-generated tags) are especially fun to play with. Although first and foremost a retailer, Amazon feels more and more like a search system every day (and its A9 engine, though seemingly always on the back burner, is also developing some interesting features). On the flip side Google, though a search system, could start feeling more like a retailer. In either case, you'll have to log in first.
Posted by ben vershbow at 01:42 PM
| Comments (5)
tags: Copyright and Copyleft , Libraries, Search and the Web , POD , amazon , books , e-commerce , e-publishing , ebooks , google , google_book_search , google_print , internet , print_on_demand , privacy , publishing , search , web
who owns the network? 01.12.2006, 5:15 PM
Susan Crawford recently floated the idea of the internet network (see comments 1 and 2) as a public trust that, like America's national parks or seashore, requires the protection of the state against the undue influence of private interests.
...it's fine to build special services and make them available online. But broadband access companies that cover the waterfront (literally -- are interfering with our navigation online) should be confronted with the power of the state to protect entry into this self-owned commons, the internet. And the state may not abdicate its duty to take on this battle.
Others argue that a strong government hand will create as many problems as it fixes, and that only true competition between private, municipal and grassroots parties -- across not just broadband, but multiple platforms like wireless mesh networks and satellite -- can guarantee a free net open to corporations and individuals in equal measure.
Discussing this around the table today, Ray raised the important issue of open content: freely available knowledge resources like textbooks, reference works, scholarly journals, media databases and archives. What are the implications of having these resources reside on a network that increasingly is subject to control by phone and cable companies -- companies that would like to transform the net from a many-to-many public square into a few-to-many entertainment distribution system? How open is the content when the network is in danger of becoming distinctly less open?
end of cyberspace 01.11.2006, 7:26 AM
The End of Cyberspace is a brand-new blog by Alex Soojung-Kim Pang, former academic editor and print-to-digital overseer at Encyclopedia Britannica, and currently a research director at the Institute for the Future (no relation). Pang has been toying with this idea of the end of cyberspace for several years now, but just last week he set up this blog as "a public research notebook" where he can begin working through things more systematically. To what precise end, I'm not certain.
The end of cyberspace refers to the blurring, or outright erasure, of the line between the virtual and the actual world. With the proliferation of mobile devices that are always online, along with increasingly sophisticated social software and "Web 2.0" applications, we are moving steadily away from a conception of the virtual -- of cyberspace -- as a place one accesses exclusively through a computer console. Pang explains:
Our experience of interacting with digital information is changing. We're moving to a world in which we (or objects acting on our behalf) are online all the time, everywhere.
Designers and computer scientists are also trying hard to create a new generation of devices and interfaces that don't monopolize our attention, but ride on the edges of our awareness. We'll no longer have to choose between cyberspace and the world; we'll constantly access the first while being fully part of the second.
Because of this, the idea of cyberspace as separate from the real world will collapse.
If the future of the book, defined broadly, is about the book in the context of the network, then certainly we must examine how the network exists in relation to the world, and on what terms we engage with it. I'm not sure cyberspace has ever really been a home for the book, but it has, in a very short time, totally altered the way we read. Now, gradually, we return to the world. But changed. This could get interesting.
.tv 01.09.2006, 6:15 PM
People have been talking about internet television for a while now. But Google and Yahoo's unveiling of their new video search and subscription services last week at the Consumer Electronics Show in Las Vegas seemed to make it real.
Sifting through the predictions and prophecies that subsequently poured forth, I stumbled on something sort of interesting -- a small concrete discovery that helped put some of this in perspective. Over the weekend, Slate Magazine quietly announced its partnership with "meaningoflife.tv," a web-based interview series hosted by Robert Wright, author of Nonzero and The Moral Animal, dealing with big questions at the perilous intersection of science and religion.
Launched last fall (presumably in response to the intelligent design fracas), meaningoflife.tv is a web page featuring a playlist of video interviews with an intriguing roster of "cosmic thinkers" -- philosophers, scientists and religious types -- on such topics as "Direction in evolution," "Limits in science," and "The Godhead."
This is just one of several experiments in which Slate is fiddling with its text-to-media ratio. Today's Pictures, a collaboration with Magnum Photos, presents a daily gallery of images and audio-photo essays, recalling both the heyday of long-form photojournalism and a possible future of hybrid documentary forms. One problem is that it's not terribly easy to find these projects on Slate's site. The Magnum page has an ad tucked discreetly on the sidebar, but meaningoflife.tv seems to have disappeared from the front page after a brief splash this weekend. For a born-digital publication that has always thought of itself in terms of the web, Slate still suffers from a pretty appalling design, with its small headline area capping a more or less undifferentiated stream of headlines and teasers.
Still, I'm intrigued by these collaborations, especially in light of the forecast TV-net convergence. While internet TV seems to promise fragmentation, these projects provide a comforting dose of coherence -- a strong editorial hand and a conscious effort to grapple with big ideas and issues, like the reassuringly nutritious programming of PBS or the BBC. It's interesting to see text-based publications moving now into the realm of television. As Tivo, on demand, and now, the internet atomize TV beyond recognition, perhaps magazines and newspapers will fill part of the void left by channels.
Limited as it may now seem, traditional broadcast TV can provide us with valuable cultural touchstones, common frames of reference that help us speak a common language about our culture. That's one thing I worry we'll lose as the net blows broadcast media apart. Then again, even in the age of five gazillion cable channels, we still have our water-cooler shows, our mega-hits, our television "events." And we'll probably have them on the internet too, even when "by appointment" television is long gone. We'll just have more choice regarding where, when and how we get at them. Perhaps the difference is that in an age of fragmentation, we view these touchstone programs with a mildly ironic awareness of their mainstream status, through the multiple lenses of our more idiosyncratic and infinitely gratified niche affiliations. They are islands of commonality in seas of specialization. And maybe that makes them all the more refreshing. Shows like "24" and "American Idol," a Ken Burns documentary, or major sporting events like the World Cup or the Olympics draw us like prairie dogs out of our niches. Coming up for air from deep submersion in our self-tailored, optional worlds.
Posted by ben vershbow at 06:15 PM
| Comments (6)
tags: Publishing, Broadcast, and the Press , TV , broadband , broadcast , documentary , google , internet , journalism , media , media_consumption , multimedia , network , photography , religion , science , slate , television , yahoo
new mission statement 01.02.2006, 4:30 PM
the institute is a bit over a year old now. our understanding of what we're doing has deepened considerably during the year, so we thought it was time for a serious re-statement of our goals. here's a draft for a new mission statement. we're confident that your input can make it better, so please send your ideas and criticisms.
The Institute for the Future of the Book is a project of the Annenberg Center for Communication at USC. Starting with the assumption that the locus of intellectual discourse is shifting from printed page to networked screen, the primary goal of the Institute is to explore, understand and hopefully influence this evolution.
We use the word "book" metaphorically. For the past several hundred years, humans have used print to move big ideas across time and space for the purpose of carrying on conversations about important subjects. Radio, movies, and TV emerged in the last century, and now, with the advent of computers, we are combining media to forge new forms of expression. For now, we use "book" to convey the past, the present transformation, and a number of possible futures.
THE WORK & THE NETWORK
One major consequence of the shift to digital is the addition of graphical, audio, and video elements to the written word. More profound, however, are the consequences of the relocation of the book within the network. We are transforming books from bounded objects to documents that evolve over time, bringing about fundamental changes in our concepts of reading and writing, as well as the role of author and reader.
SHORT TERM/LONG TERM
The Institute values theory and practice equally. Part of our work involves doing what we can with the tools at hand (short term). Examples include last year's Gates Memory Project or the new author's thinking-out-loud blogging effort. Part of our work involves trying to build new tools and effecting industry-wide change (medium term): see the Sophie Project and Next\Text. And a significant part of our work involves blue-sky thinking about what might be possible someday, somehow (long term). Our blog, if:book, covers the full range of our interests.
As part of the Mellon Foundation's project to develop an open-source digital infrastructure for higher education, the Institute is building Sophie, a set of high-end tools for writing and reading rich media electronic documents. Our goal is to enable anyone to assemble complex, elegant, and robust documents without the necessity of mastering overly complicated applications or the help of programmers.
NEW FORMS, NEW PROCESSES
Academic institutes arose in the age of print, which informed the structure and rhythm of their work. The Institute for the Future of the Book was born in the digital era, and we seek to conduct our work in ways appropriate to the emerging modes of communication and rhythms of the networked world. Freed from the traditional print publishing cycles and hierarchies of authority, the Institute seeks to conduct its activities as much as possible in the open and in real time.
HUMANISM & TECHNOLOGY
Although we are excited about the potential of digital technologies to amplify human potential in wondrous ways, we believe it is crucial to consciously consider the social impact of the long-term changes to society afforded by new technologies.
Although the institute is based in the U.S., we take seriously the potential of the internet and digital media to transcend borders. We think it's important to pay attention to developments all over the world, recognizing that the future of the book will likely be determined as much by Beijing, Buenos Aires, Cairo, Mumbai and Accra as by New York and Los Angeles.
another view on the stacey/gamma flap 12.28.2005, 12:42 PM
For an alternative view of Lisa's earlier post ... i wonder if Gamma's submission of Adam Stacey's image with the "Adam Stacey/Gamma" attribution doesn't show the strength of the Creative Commons concept. As i see it, Stacey published his image without any restrictions beyond attribution. Gamma, a well-respected photo agency, started distributing the image attributed to Stacey. Isn't this exactly what the CC license was supposed to enable — the free flow of information on the net? perhaps Stacey chose the wrong license and didn't mean for his work to be distributed by a for-profit company. If so, that is a reminder to all of us to be careful about which Creative Commons license we choose. One thing i'm not clear on is whether Gamma referenced the CC license. They are supposed to do that, and if they didn't, they should have.
last week: wikipedia, r kelly, gaming and google panels, and more... 12.18.2005, 4:27 PM
Here's an overview of what we've been posting over the last week. As well, a few of us having been talking about ways to graphically represent text, so I thought I would include a mind map of this overview.
As a follow-up on the increasingly controversial wikipedia front, Daniel Brandt uncovered that Brian Chase posted the false information about John Seigenthaler that was reported here last week. To add fuel to the fire, Nature weighed in with a study suggesting that Encyclopedia Britannica may be no more reliable than Wikipedia.
Business Week noted a possible future of pricing for data transfer. Currently, carriers such as phone and cable companies are developing technology to identify and control what types of media (voice, images, text or video) are being uploaded. This ability opens the door to charging different rates for different uses of data transfer, which would have a huge impact on uploading content for personal creative use of the internet.
Liz Barry and Bill Wetzel shared some of their experiences from their "Talk to Me" Project. With their "talk to me" sign in tow, they travel around New York and the rest of the US looking for conversation. We were impressed at how they do not have a specific agenda besides talking to people. In the mediated age, they are not motivated by external political/ religious/ documentary intentions. What they do document is available on their website, and we look forward to seeing what they come up with next.
The Google Book Search debate continues as well, via a panel discussion hosted by the American Bar Association. Interestingly, publishers spoke as if the wide-scale use of ebooks is imminent. More importantly, even if this particular case settles out of court, the courts will face a pressing need to define copyright and fair use guidelines for these emerging uses.
With the protest of the WTO meetings in Hong Kong this past week, new journalism forms took one step forward. The website Curbside @ WTO covered the meetings with submissions from journalism students, bloggers and professional journalists.
McDonalds filed a patent which suggests that it intends to offer clips of movies instead of the traditional toys in its kids-oriented Happy Meals. Lisa pondered whether a video clip can successfully replace a toy, and if it does, what the effects on children's imaginations might be.
Dan examined R. Kelly's experiments in form and the "serial song" through his Trapped in the Closet recordings. While R. Kelly has varying success in this endeavor, Dan compared the experience to not only the serial novel, but also Julie Powell's foray into transferring her blog into book form, and considered what she might have learned from R. Kelly (it's hard to make unified pieces maintain an overall coherency).
The world of academic publishing was challenged with a proposal calling for the creation of an electronic academic press. This segment seems especially ripe for the shift to digital publishing, as many journals with small circulations face rising printing and production costs.
Sol and others from the institute attended "Making Games Matter," a panel with contributors from The Game Design Reader: A Rules of Play Anthology, edited by Katie Salen and Eric Zimmerman. The discussion covered among other things: involving the academy in creating a discourse for gaming and game design, obstacles in studying and creating games, and the game "industry" itself. The book and panel called out for games and gaming to undergo a formal study akin to the novel and the experience of reading. Also, in the gaming world, the class economics of the real and virtual began to emerge as a Chinese firm pays employees to build up characters in MMOGs to sell to affluent gamers.
Posted by ray cha at 04:27 PM
| Comments (0)
tags: Roundup , academia , broadband , e-publishing , fast_food , gaming , google , google_book_search , internet , mcdonalds , network_neutrality , publishing , r_kelly , video_games , wikipedia
the net as we know it 12.16.2005, 7:27 AM
There's a good article in Business Week describing the threat posed by unregulated phone and cable companies to the freedom and neutrality of the internet. The net we know now favors top-down and bottom-up publishing equally. Yahoo! or The New York Times may have more technical resources at their disposal than your average blogger, but in the pipes that run in and out of your home connecting you to the net, they are equals.
That could change, however. Unless government gets proactive on behalf of ordinary users, broadband providers will be free to privilege certain kinds of use and certain kinds of users, creating the conditions for a broadcast-oriented web and charging higher premiums for more independently creative uses of bandwidth.
Here's how it might work:
So the network operators figure they can charge at the source of the traffic -- and they're turning to technology for help. Sandvine and other companies, including Cisco Systems, are making tools that can identify whether users are sending video, e-mail, or phone calls. This gear could give network operators the ability to speed up or slow down certain uses.
That capability could be used to help Internet surfers. BellSouth, for one, wants to guarantee that an Internet-TV viewer doesn't experience annoying millisecond delays during the Super Bowl because his teenage daughter is downloading music files in another room.
But express lanes for certain bits could give network providers a chance to shunt other services into the slow lane, unless they pay up. A phone company could tell Google or another independent Web service that it must pay extra to ensure speedy, reliable service.
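For the technically curious, here is roughly what that kind of traffic discrimination looks like in miniature. This is a toy Python sketch, not any real carrier's system; the port numbers, traffic types, and priority tiers are all invented for illustration:

```python
# Toy model of carrier-side traffic prioritization: guess what kind of
# traffic a packet carries, then reorder the send queue so favored
# traffic goes first. All classifications here are crude stand-ins.

PRIORITY = {
    "voip": 0,   # least tolerant of delay, so highest priority
    "video": 1,
    "email": 2,
    "p2p": 3,    # the candidate for the "slow lane"
}

def classify(packet: dict) -> str:
    """Guess a traffic type from a packet's destination port (crude heuristic)."""
    port = packet.get("dst_port", 0)
    if port == 5060:
        return "voip"
    if port in (554, 1935):
        return "video"
    if port in (25, 110, 143):
        return "email"
    return "p2p"

def schedule(packets):
    """Return packets reordered so higher-priority traffic is sent first."""
    return sorted(packets, key=lambda p: PRIORITY[classify(p)])
```

The unsettling part is how little is needed: once the gear can label traffic, reordering (or pricing) it is a one-liner.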
One commenter suggests a rather unsavory scheme:
The best solution is to have ISPs change monthly billing to mirror cell phone bills: X amount of monthly bandwidth, and any overage would be charged accordingly. File sharing could become legit, as monies from our monthly bills could be funneled to the appropriate copyright holder (big media to the regular Joe making music in his room), and the network operators would be making more dough on their investment. With the Skypes of the world I can't see this not happening!
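To make the commenter's cell-phone-style scheme concrete, here is a toy sketch of metered billing in Python; the base fee, allowance, and overage rate are all made up for illustration:

```python
def monthly_bill(used_gb: float, included_gb: float = 50.0,
                 base_fee: float = 30.0, overage_per_gb: float = 2.0) -> float:
    """Cell-phone-style billing: a flat fee covers an included allowance,
    and every gigabyte past it is charged at a per-GB rate.
    All dollar figures here are invented."""
    overage = max(0.0, used_gb - included_gb)
    return base_fee + overage * overage_per_gb
```

Under these invented rates, a light user pays the flat $30, while someone who moves 60 GB pays $50. The scheme's appeal to carriers, and its chilling effect on heavy creative uploading, are both visible in that one `max()` call.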
It seems appropriate that when I initially tried to read this article, a glitchy web ad was blocking part of the text -- an ad for broadband access no less. Bastards.
Posted by ben vershbow at 07:27 AM
| Comments (1)
tags: ISP , Network_Freedom , Publishing, Broadcast, and the Press , bandwidth , broadband , cable , e-publishing , internet , network_neutrality , phone
the role of note taking in the information age 12.03.2005, 3:19 PM
An article by Ann Blair in a recent issue of Critical Inquiry (vol 31 no 1) discusses the changing conceptions of the function of note-taking from about the sixth century to the present, and ends with a speculation on the way that textual searches (such as Google Book Search) might change practices of note-taking in the twenty-first century. Blair argues that "one of the most significant shifts in the history of note taking" occurred in the beginning of the twentieth century, when the use of notes as memorization aids gave way to the use of notes as an aid to replace the memorization of too-abundant information. With the advent of the net, she notes:
Today we delegate to sources that we consider authoritative the extraction of information on all but a few carefully specialized areas in which we cultivate direct experience and original research. New technologies increasingly enable us to delegate more tasks of remembering to the computer, in that shifting division of labor between human and thing. We have thus mechanized many research tasks. It is possible that further changes would affect even the existence of note taking. At a theoretical extreme, for example, if every text one wanted were constantly available for searching anew, perhaps the note itself, the selection made for later reuse, might play a less prominent role.
The result of this externalization, Blair notes, is that we come to think of long-term memory as something that is stored elsewhere, in "media outside the mind." At the same time, she writes, "notes must be rememorated or absorbed in the short-term memory at least enough to be intelligently integrated into an argument; judgment can only be applied to experiences that are present to the mind."
Blair's article doesn't say that this bifurcation between short-term and long-term memory is a problem: she simply observes it as a phenomenon. But there's a resonance between Blair's article and Naomi Baron's recent Los Angeles Times piece on Google Book Search: both point to the fact that what we have commonly defined as scholarly reflection has increasingly become a process of database management. Baron seems to see reflection and database management as being in tension, though I'm not completely convinced by her argument. Blair, less apocalyptic than Baron, nonetheless gives me something to ponder. What happens to us if (or when) all of our efforts to make the contents of our extrasomatic memory "present to our mind" happen without the mediation of notes? Blair's piece focuses on the epistemology rather than the phenomenology of note taking — still, she leads me to wonder what happens if the mediating function of the note is lost, when the triangular relation between book, scholar and note becomes a relation between database and user.
insidious tactic #348: charge for web speed 12.02.2005, 8:31 AM
An article in yesterday's Washington Post -- "Executive Wants to Charge for Web Speed" -- brings us back to the question of pipes and the future of the internet. The chief technology officer for BellSouth says telecoms and cable companies ought to be allowed to offer priority deals to individual sites, charging them extra for faster connections. The Post:
Several big technology firms and public interest groups say that approach would enshrine Internet access providers as online toll booths, favoring certain content and shutting out small companies trying to compete with their offerings.
Among these "big technology firms" are Google, Yahoo!, Amazon and eBay, all of whom have pressed the FCC for strong "network neutrality" provisions in the latest round of updates to the 1996 Telecommunications Act. These would forbid discrimination by internet providers against certain kinds of content and services (i.e. the little guys). BellSouth claims to support the provisions, though the statements of its tech officer suggest otherwise.
Turning speed into a bargaining chip will undoubtedly privilege the richer, more powerful companies and stifle competition -- hardly a net-neutral scenario. They claim it's no different from an airline offering business class -- it doesn't prevent folks from riding coach and reaching their destination. But we all know how cramped and awful coach is. The truth is that the service providers discriminate against everyone on the web. We're all just freeloaders leeching off their pipes. The only thing that separates Google from the lady blogging about her cat is how much money they can potentially pay for pipe rental. That's where the "priorities" come in.
Moreover, the web is on its way to merging with cable television, and this, in turn, will increase the demand for faster connections that can handle heavy traffic. So "priority" status with the broadband providers will come at an ever increasing premium. That's their ideal business model, allowing them to charge the highest tolls for the use of their infrastructure. That's why the telcos and cablecos want to ensure, through speed-baiting and other screw-tightening tactics, that the net transforms from a messy democratic commons into a streamlined broadcast medium. Alternative media, video blogging, local video artists? These will not be "priorities" in the new internet. Maximum profit for pipe-holders will mean minimum diversity and a one-way web for us.
In a Business Week interview last month, SBC Telecommunications CEO Edward Whitacre expressed what seemed almost like a lust for revenge. Asked, "How concerned are you about Internet upstarts like Google, MSN, Vonage, and others?" he replied:
How do you think they're going to get to customers? Through a broadband pipe. Cable companies have them. We have them. Now what they would like to do is use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it. So there's going to have to be some mechanism for these people who use these pipes to pay for the portion they're using. Why should they be allowed to use my pipes?
The Internet can't be free in that sense, because we and the cable companies have made an investment and for a Google or Yahoo! or Vonage or anybody to expect to use these pipes [for] free is nuts!
This makes me worry that discussions about "network neutrality" overlook a more fundamental problem: lack of competition. "That's the voice of someone who doesn't think he has any competitors," says Susan Crawford, a cyberlaw and intellectual property professor at Cardozo Law School who blogs eloquently on these issues. She believes the strategy to promote network neutrality will ultimately fail because it accepts a status quo in which a handful of broadband monopolies dominate the market. "We need to find higher ground," she says.
I think the real fight should be over rights of way and platform competition. There's a clear lack of competition in the last mile -- that's where choice has to exist, and it doesn't now. Even the FCC's own figures reveal that cable modem and DSL providers are responsible for 98% of broadband access in the U.S., and two doesn't make a pool. If the FCC is getting in the way of cross-platform competition, we need to fix that. In a sense, we need to look down -- at the relationship between the provider and the customer -- rather than up at the relationship between the provider and the bits it agrees to carry or block...
...Competition in the market for pipes has to be the issue to focus on, not the neutrality of those pipes once they have been installed. We'll always lose when our argument sounds like asking a regulator to shape the business model of particular companies.
The broadband monopolies have their priorities figured out. Do we?
image: "explosion" (reminded me of fiber optic cable) by The Baboon, via Flickr
katrina archive on internet archive 12.01.2005, 2:26 PM
The Internet Archive has just established an archive dedicated to preserving the online response to the Katrina catastrophe. According to the Archive:
The Internet Archive and many individual contributors worked together to put together a comprehensive list of websites to create a historical record of the devastation caused by Hurricane Katrina and the massive relief effort which followed. This collection has over 25 million unique pages, all text searchable, from over 1500 sites. The web archive commenced on September 4th.
If you try to link to the Internet Archive today, you might not get through, because everyone is on the site talking about the Grateful Dead's decision to allow free downloading.
flushing the net down the tubes 11.29.2005, 8:11 AM
Grand theories about upheavals on the internet horizon are in ready supply. Singularities are near. Explosions can be expected in the next six to eight months. Or the whole thing might just get "flushed" down the tubes. This last scenario is described at length in a recent essay in Linux Journal by Doc Searls, which predicts the imminent hijacking of the net by phone and cable companies who will turn it into a top-down, one-way broadcast medium. In other words, the net's utopian moment, the "read/write" web, may be about to end. Reading Searls' piece, I couldn't help thinking about the story of radio and a wonderful essay Brecht wrote on the subject in 1932:
Here is a positive suggestion: change this apparatus over from distribution to communication. The radio would be the finest possible communication apparatus in public life, a vast network of pipes. That is to say, it would be if it knew how to receive as well as to transmit, how to let the listener speak as well as hear, how to bring him into a relationship instead of isolating him. On this principle the radio should step out of the supply business and organize its listeners as suppliers....turning the audience not only into pupils but into teachers.
Unless you're the military, law enforcement, or a short-wave hobbyist, two-way radio never happened. On the mainstream commercial front, radio has always been about broadcast: a one-way funnel. The big FM tower to the many receivers, "prettifying public life," as Brecht puts it. Radio as an agitation? As an invitation to a debate, rousing families from the dinner table into a critical encounter with their world? Well, that would have been neat.
Now there's the internet, a two-way, every-which-way medium -- a stage of stages -- that would have positively staggered a provocateur like Brecht. But although the net may be a virtual place, it's built on some pretty actual stuff. Copper wire, fiber optic cable, trunks, routers, packets -- "the vast network of pipes." The pipes are owned by the phone and cable companies -- the broadband providers -- and these guys expect a big return (far bigger than they're getting now) on the billions they've invested in laying down the plumbing. Searls:
The choke points are in the pipes, the permission is coming from the lawmakers and regulators, and the choking will be done....The carriers are going to lobby for the laws and regulations they need, and they're going to do the deals they need to do. The new system will be theirs, not ours....The new carrier-based Net will work in the same asymmetrical few-to-many, top-down pyramidal way made familiar by TV, radio, newspapers, books, magazines and other Industrial Age media now being sucked into Information Age pipes. Movement still will go from producers to consumers, just like it always did.
If Brecht were around today I'm sure he would have already written (or blogged) to this effect, no doubt reciting the sad fate of radio as a cautionary tale. Watch the pipes, he would say. If companies talk about "broad" as in "broadband," make sure they're talking about both ends of the pipe. The way broadband works today, the pipe running into your house dwarfs the one running out. That means more download and less upload, and it's paving the way for a content delivery platform every bit as powerful as cable on an infinitely broader band. Data storage, domain hosting -- anything you put up there -- will be increasingly costly, though there will likely remain plenty of chat space and web mail provided for free, anything that allows consumers to fire their enthusiasm for commodities through the synapse chain.
If the net goes the way of radio, that will be the difference (allow me to indulge in a little dystopia). Imagine a classic Philco cathedral radio but with a few little funnel-ended hoses extending from the side that connect you to other listeners. "Tune into this frequency!" "You gotta hear this!" You whisper recommendations through the tube. It's sending a link. Viral marketing. Yes, the net will remain two-way to the extent that it helps fuel the market. Web browsers, like the old Philco, would essentially be receivers, enabling participation only to the extent that it encouraged others to receive.
You might even get your blog hosted for free if you promote products -- a sports shoe with gelatinous heels or a music video that allows you to undress the dancing girls with your mouse. Throw in some political rants in between to blow off some steam, no problem. That's entrepreneurial consumerism. Make a living out of your appetites and your ability to make them infectious. Hip recommenders can build a cosy little livelihood out of their endorsements. But any non-consumer activity will be more like amateur short-wave radio: a mildly eccentric (and expensive) hobby (and they'll even make a saccharine movie about a guy communing with his dead firefighter dad through a ghost blog).
Searls sees it as above all a war of language and metaphor. The phone and cable companies will dominate as long as the internet is understood fundamentally as a network of pipes, a kind of information transport system. This places the carriers at the top of the hierarchy -- the highway authority setting the rules of the road and collecting the tolls. So far the carriers have managed, through various regulatory wrangling and court rulings, to ensure that the "transport metaphor" has prevailed.
But obviously the net is much more than the sum of its pipes. It's a public square. It's a community center. It's a market. And it's the biggest publishing system the world has ever known. Searls wants to promote "place metaphors" like these. Sure, unless you're a lobbyist for Verizon or SBC, you probably already think of it this way. But in the end it's the lobbyists that will make all the difference. Unless, that is, an enlightened citizens' lobby begins making some noise. So a broad, broad as in broadband, public conversation is in order. Far broader than what goes on in the usual progressive online feedback loops -- the Linux and open source communities, the creative commies, and the techno-hip blogosphere, which I'm sure are already in agreement about this.
Google also seems to have an eye on the pipes, reportedly having bought thousands of miles of "dark fiber" -- pipe that has been laid but is not yet in use. Some predict a nationwide "Googlenet." But this can of worms is best saved for another post.
Posted by ben vershbow at 08:11 AM
| Comments (2)
tags: Network_Freedom , Publishing, Broadcast, and the Press , brecht , broadband , broadcast , cable , fiber , google , internet , linux , media , net , radio , short_wave , telecom , telephone , tubes , utopia , verizon , web
virtual libraries, real ones, empires 11.28.2005, 12:36 PM
Last Tuesday, a Washington Post editorial written by Library of Congress librarian James Billington outlined the possible benefits of a World Digital Library, a proposed LOC endeavor discussed last week in a post by Ben Vershbow. Billington seemed to imagine the library as sort of a United Nations of information: claiming that "deep conflict between cultures is fired up rather than cooled down by this revolution in communications," he argued that a US-sponsored, globally inclusive digital library could serve to promote harmony over conflict:
Libraries are inherently islands of freedom and antidotes to fanaticism. They are temples of pluralism where books that contradict one another stand peacefully side by side just as intellectual antagonists work peacefully next to each other in reading rooms. It is legitimate and in our nation's interest that the new technology be used internationally, both by the private sector to promote economic enterprise and by the public sector to promote democratic institutions. But it is also necessary that America have a more inclusive foreign cultural policy -- and not just to blunt charges that we are insensitive cultural imperialists. We have an opportunity and an obligation to form a private-public partnership to use this new technology to celebrate the cultural variety of the world.
What's interesting about this quote (among other things) is that Billington seems to be suggesting that a World Digital Library would function in much the same manner as a real-world library, and yet he's also arguing for the importance of actual physical proximity. He writes, after all, about books literally, not virtually, touching each other, and about researchers meeting up in a shared reading room. There seems to be a tension here, in other words, between Billington's embrace of the idea of a world digital library, and a real anxiety about what a "library" becomes when it goes online.
I also feel like there's some tension here — in Billington's editorial and in the whole World Digital Library project — between "inclusiveness" and "imperialism." Granted, if the United States provides Brazilians access to their own national literature online, this might be used by some as an argument against the idea that we are "insensitive cultural imperialists." But there are many varieties of empire: indeed, as many have noted, the sun stopped setting on Google's empire a while ago.
To be clear, I'm not attacking the idea of the World Digital Library. Having watched the Smithsonian invest in, and waffle on, some of its digital projects, I'm all for a sustained commitment to putting more material online. But there needs to be some careful consideration of the differences between real-world libraries and virtual ones — as well as a bit more discussion of just what a privately-funded digital library might eventually morph into.
war on text? 11.27.2005, 3:23 PM
Last week, there was a heated discussion on the 1600-member Yahoo Groups videoblogging list about the idea of videobloggers launching a "war on text" — not necessarily calling for book burning, but at least promoting the use of threaded video conversations as a way of replacing text-based communication online. It began with a post to the list by Steve Watkins and led to responses such as this enthusiastic embrace of the end of using text to communicate ideas:
Audio and video are a more natural medium than text for most humans. The only reason why net content is mainly text is that it's easier for programs to work with -- audio and video are opaque as far as programs are concerned. On top of that, it's a lot easier to treat text as hypertext, and hypertext has a viral quality.
As a text-based attack on the printed word, the "war on text" debate had a Phaedrus aura about it, especially since the vloggers seemed to be gravitating toward the idea of secondary orality originally proposed by Walter Ong in Orality and Literacy — a form of communication that involves at least the representation of an oral exchange, but that also draws on a world defined by textual literacy. The vloggers' debt to the written word was more explicitly acknowledged in some posts, such as one by Steve Garfield that declared his work to be a "marriage of text and video."
Over several days, the discussion veered to cover topics such as film editing, the over-mediation of existence, and the transition from analog to digital. The sophistication and passion of the discussion gave a sense of the way at least some in the video blogging community are thinking, both about the relationship between their work and text-based blogging and about the larger relationship between the written word and other forms of digitally mediated communication.
Perhaps the most radical suggestion in the entire exchange was the prediction that video itself would soon seem to be an outmoded form of communication:
in my opinion, before video will replace text, something will replace video...new technologies have already been developed that are more likely to play a large role in communications over this century... how about the one that can directly interface to the brain (new scientist reports on electroencephalography with quadriplegics able to make a wheelchair move forward, left or right)... considering the full implications of devices like this, it's not hard to see where the real revolutions will occur in communications.
This comment implies that debates such as the "war on text" are missing the point — other forms of mediation are on the horizon that will radically change our understanding of what "communication" entails, and make the distinction between orality and literacy seem relatively minuscule. It's an apocalyptic idea (like the idea that the internet will explode), but perhaps one worth talking about.
explosion 11.22.2005, 2:10 PM
A Nov. 18 post on Adam Green's Darwinian Web makes the claim that the web will "explode" (does he mean implode?) over the next year. According to Green, RSS feeds will render many websites obsolete:
The explosion I am talking about is the shifting of a website's content from internal to external. Instead of a website being a "place" where data "is" and other sites "point" to, a website will be a source of data that is in many external databases, including Google. Why "go" to a website when all of its content has already been absorbed and remixed into the collective datastream.
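Green's point is easy to see in miniature: once a site publishes a feed, its content can be pulled into any external database without anyone "going" to the site at all. Here is a small Python sketch using only the standard library; the feed itself is a made-up example, not a real site:

```python
import xml.etree.ElementTree as ET

# A hypothetical RSS 2.0 feed, of the kind any blog or news site emits.
FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>http://example.com/1</link></item>
  <item><title>Second post</title><link>http://example.com/2</link></item>
</channel></rss>"""

def extract_items(feed_xml: str):
    """Return (title, link) pairs from an RSS 2.0 feed string.

    This is the whole trick: the site's content, absorbed into
    someone else's datastream with a few lines of parsing."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

An aggregator, a search engine, or a "remixed collective datastream" is, at bottom, this loop run over thousands of feeds on a schedule.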
Does anyone agree with Green? Will feeds bring about the restructuring of "the way content is distributed, valued and consumed?" More on this here.
Posted by lisa lynch at 02:10 PM
| Comments (5)
tags: Libraries, Search and the Web , Online , Publishing, Broadcast, and the Press , RSS , blogging , blogs , darwin , darwinism , google , internet , singularity , syndication , web , xml
"open source media" -- not the radio show -- launches a best-bloggers site 11.19.2005, 1:07 PM
On Wednesday, November 17, a media corporation called Open Source Media launched a portal site that intends to assemble the best bloggers on the internet in one place. According to the Associated Press, some 70 Web journalists, including Instapundit's Glenn Reynolds and David Corn, Washington editor of the Nation magazine, have agreed to participate. The site will link to individual blog postings and highlight the best contributions in a special section: bloggers will be paid for content depending on the amount of traffic they generate.
Far from a seat-of-the-pants effort, OSM has $3.5 million in venture capital funding. Supposedly, the site will pay for itself — and pay its bloggers — with the advertising it generates. In the "about" section of the site, the founders of OSM lay out their vision for remaking the future of blogging and media in general:
OSM’s mission is to expand the influence of weblogs by finding and promoting the best of them, providing bloggers with a forum to meet and share resources, and the chance to join a for-profit network that will give them additional leverage to pursue knowledge wherever they may find it. From academics, professionals and decorated experts, to ordinary citizens sitting around the house opining in their pajamas, our community of bloggers are among the most widely read and influential citizen journalists out there, and our roster will be expanding daily. We also plan to provide a bridge between old media and new, bringing bloggers and mainstream journalists—more and more of whom have started to blog—together in a debate-friendly forum.
We at if:book like the idea of a blog portal, especially one staffed by a series of editors selecting the best posts on the blogs they've chosen. But this venture — which fits perfectly with John Battelle's vision for the web's second coming — also seems to nicely embody the tension between doing good and making money: all that venture capital and overhead is going to put a lot of pressure on OSM to deliver the Oprah of the blogging world, if she's out there. And paying bloggers based on how many readers they get is certainly going to shape the content that appears on the site. Unlike others who conceived of their blogs from the get-go as small businesses, most of the bloggers chosen by OSM haven't been trying to make money from their blogs until now.
OSM also shot themselves in the foot by stealing the name of the newish public radio show Open Source Media, which we've written about here. The two are currently involved in a dispute over the name, and OSM hasn't really been able to come up with a good reason why they should keep using a name that belongs to someone else. They have trademarked OSM, and they now refer to their unabbreviated name as "not a trade name," but "a description of who we are and what we do."
Needless to say, OSM has generated a fair amount of bad blood by appropriating the name of a nonprofit, and most of the grumbling has taken place in exactly the same place OSM hopes to make a difference — the blogosphere.
a better boom? 11.18.2005, 12:01 PM
An editorial in today's New York Times by The Search author John Battelle makes the argument that the current resurgence in technology stocks is not the sign of another technology "bubble," but rather an indication that companies have finally figured out how to capitalize on the internet. Battelle writes:
... we are witnessing the Web's second coming, and it's even got a name, "Web 2.0" - although exactly what that moniker stands for is the topic of debate in the technology industry. For most it signifies a new way of starting and running companies - with less capital, more focus on the customer and a far more open business model when it comes to working with others. Archetypal Web 2.0 companies include Flickr, a photo sharing site; Bloglines, a blog reading service; and MySpace, a music and social networking site.
In other words, Battelle is pointing out that one way to "get it right" is not to sell content to users, but rather to give them the opportunity to create and search their own content. This is not only good business sense, he says, it's also more enlightened — the creators of social software such as Flickr are motivated equally by a desire to "do good in the world" and a desire to make money. "The culture of Web 2.0 is, in fact, decidedly missionary," Battelle writes, "from the communitarian ethos of Craigslist to Google's informal motto, 'don't be evil.'"
O.K. Doing good while making money. Reading this, I'm reminded of Paul Hawken's Natural Capitalism and the larger sustainability movement — the optimistic philosophy that weaves together environmental ethics and profitability. But is that what's really going on here? Isn't the "missionary" culture of the internet a bit OLDER than Web 2.0? Battelle is suggesting that Internet capitalists have gotten all misty and utopian; isn't it the case that some of the folks who were already misty and utopian have just started making some money?
I guess the more viable comparison here would be to Marc Andreessen's decision to transform his Mosaic browser from its public-domain University of Illinois incarnation into the Netscape Browser. Andreessen certainly started out as a browser missionary — and, like the companies Battelle sees as characteristic of Web 2.0, Andreessen's vision for Netscape (and in the beginning, Jim Clark's vision as well) was a strong customer focus and open business model. What happened? Netscape's meteoric success helped inflate the internet "bubble" Battelle's referring to, and in the end, after the long battle with Microsoft, the company's misfortunes helped to burst that bubble as well.
So what paradigm fits? Is "Web 2.0" really new and more socially enlightened? Or are we just seeing a group of social software businesses — and one big search engine — in the early stages of an inevitable transformation into corporations that are less interested in doing good than in making money?
Incidentally, last month, Marc Andreessen launched a social networking platform called Ning.
the times they are a-changin' 11.14.2005, 4:18 PM
Knight Ridder Inc., the second largest newspaper conglomerate in the U.S., is under intense pressure from its more powerful investors to start selling off papers. The New York Times reports that the company is now contemplating "strategic alternatives." Consider the following in terms of what Bob is saying one post down about time. With the rise of the 24-hour news cycle and the internet, news is adopting a different time signature.
It is unclear who may want to buy Knight Ridder. Newspaper companies, though still immensely profitable, have a murky future that is clouded by a shrinking readership and weak advertising revenue, both of which are being leeched away by the Internet.
...In the six months that ended in September, newspaper circulation nationally fell 2.6 percent daily and 3.1 percent on Sundays, the biggest decline in any comparable period since 1991, according to the Audit Bureau of Circulations. All in all, 45.2 million people subscribed to 1,457 reporting papers, down from a peak of 63.3 million people and 1,688 newspapers in 1984.
By comparison, 47 million people visited newspaper Web sites, about a third of United States Internet users, according to the circulation bureau.
The time it takes to read the newspaper in print -- a massive quilt, chopped up and parceled (I believe Gary Frost said something about this) -- arguably leads to a different sort of understanding of the world around you. It seems to me that the newspapers that will last longest in print are the Sunday editions, aimed at a leisurely audience, taking stock of the week that has just ended and preparing for the one about to commence. On Sundays, the world spreads out before you in print, and perhaps you make a point of taking some time away from the computer (at least, this might be the case for hybrid monkeys like me who are more or less at home with both print and digital). The briskness of discourse on the web and in popular culture does not afford the time to engage with big ideas. Bob talks, not without irony, about "tithing to the church of big ideas" -- setting aside the time to engage with world-changing ideas, willfully turning away from the screen.
The persistence of the Sunday print edition, if it comes to pass, might in some way reflect this kind of tithing, this intentional slowing down.
Posted by ben vershbow at 04:18 PM
| Comments (1)
tags: Mediated Existence , Online , Publishing, Broadcast, and the Press , Transliteracies , internet , journalism , knight_ridder , media , news , newspaper , sunday , web
blogging and beyond 11.10.2005, 6:01 AM
Yesterday on Talking Points Memo, Josh Marshall drew back momentarily from the relentless news cycle to air a few meta thoughts on blogs and blogging, fleshing out some of the ideas behind his TPM Cafe venture (a multi-blog hub on politics and society) and his recent hiring notice for a "reporter-blogger" to cover Capitol Hill.
Marshall's ruminations tie in nicely with a meeting the institute is holding tomorrow (I'm running to the airport shortly) at our institutional digs at the University of Southern California in Los Angeles to discuss possible futures of the blogging medium, particularly in regard to the academy and the role of public intellectual. Gathering around the table for a full day of discussion will be a number of blogger-professors and doctoral students, several journalists and journalism profs, and a few interesting miscellaneous spoons to help stir the pot. We've set up a blog (very much resembling this one) as a planning stage for the meeting. Feel free to take a look and comment on the agenda and the list of participants.
The meeting is a sort of brainstorm session for a project the institute is hatching that aims to encourage academics with expert knowledge and a distinctive voice to use blogs and other internet-based vehicles to step beyond the boundaries of the academy to reach out to a broader public audience. Issues/questions/problems we hope to address include the individual voice in conflict with (or in complement to) mainstream media. How the individual voice establishes and maintains integrity on the web. How several voices could be aggregated in a way that expands both the audience and the interaction with readers without sacrificing the independence of the individual voices. Blogging as a bridge medium between the academy and the world at large. Blogging as a bridge medium between disciplines in the academy in a way that sheds holistic light on issues of importance to a larger public. And strengths and weaknesses of the blog form itself.
This last point has been on our minds a lot lately and I hope it will get amply discussed at the meeting. A year or two ago, the word "blog" didn't mean anything to most people. Now it is all but fully embraced as the medium of the web. But exciting as the change has been, it shouldn't be assumed that blogs are the ideal tool for all kinds of discourse. In fact, what's interesting about blogs right now, especially the more intellectually ambitious ones, is how much they are doing in so limiting a form. With their ruthlessly temporal structure and swift burial of anything more than 48 hours old, blogs work great for sites like TPM whose raison d'être is to comment on the news cycle, or sites like Boing Boing, Gawker, or Fark.com serving up oddities, gossip and boredom cures for the daily grind. But if, god forbid, you want ideas and discussion to unfold over time, and for writing to enjoy a more ample window of relevance, blogs are frustratingly limited.
Even Josh Marshall, a politics blogger who is served well by the form, wishes it could go deeper:
...the stories that interest me right now are a) the interconnected web of corruption scandals bubbling up out of the reigning Washington political machine and b) the upcoming mid-term elections.
I cover a little of both. And I've particularly tried to give some overview of the Abramoff story. But I'm never able to dig deeply enough into the stories or for a sustained enough period of time or to keep track of how all the different ones fit together. That's a site I'd like to read every day -- one that pieced together these different threads of public corruption for me, showed me how the different ones fit together (Abramoff with DeLay with Rove with the shenanigans at PBS and crony-fied bureaucracies like the one Michael Brown was overseeing at FEMA) and kept tabs on how they're all playing in different congressional elections around the country.
That's a site I'd like to read because I'm never able to keep up with all of it myself. So we're going to try to create it.
I'm excited to hear from folks at tomorrow's meeting where they'd like blogging to go. I'd like to think that we're groping toward a new web genre, perhaps an extension of blogs, that is less temporal and more thematic -- where ideas, not time, are the primary organizing factor. This question of form goes hand in hand with the content question that our meeting will hopefully address: how do we get more people with big ideas and expertise to start engaging the world in a serious way through these burgeoning forms? I could say more, but I've got a plane to catch.
transliterature: can humanism transform the web? 10.25.2005, 3:03 PM
For decades now, hypertext guru Ted Nelson has slipped in and out of public awareness, often left for dead or permanently exiled in Xanadu, only to re-emerge suddenly in a wonderful burst of curmudgeonly dissent. A recent Slashdot thread discusses his latest project, or more accurately, the latest stage in his ongoing quest: transliterature, "a humanist format for re-usable documents and media," or, an alternative to the constricting protocols of the world wide web. What exactly will this new format entail? It's hard to tell. But Nelson's plea is worth heeding:
The tekkies have hijacked literature- with the best intentions, of course!-) - but now the humanists have to get it back. Nearly every form of electronic document- Word, Acrobat, HTML, XML- represents some business or ideological agenda. Many believe Word and Acrobat are out to entrap users; HTML and XML enact a very limited kind of hypertext with great internal complexity. All imitate paper and (internally) hierarchy. I propose a different document agenda: I believe we need new electronic documents which are transparent, public, principled, and freed from the traditions of hierarchy and paper. In that case they can be far more powerful, with deep and rich new interconnections and properties- able to quote dynamically from other documents and buckle sideways to other documents, such as comments or successive versions; able to present third-party links; and much more. Most urgently: if we have different document structures we can build a new copyright realm, where everything can be freely and legally quoted and remixed in any amount without negotiation.
Nelson is always given a nod as the coiner of "hypertext", but his other concepts -- "transclusion", "virtual rearrangement", "clinks," for example -- are largely dismissed, or simply unknown to most people. But elements of his thinking can be observed far and wide in some of the emerging practices -- blogging, wikis, APIs -- of what people are calling "Web 2.0", or, the web as operating system. Over the past few years, the web has transformed from an interlinked series of brochures into a massive hypertext conversation, a platform in which we are increasingly able to weave, quote and track back to other documents. This is at least in the neighborhood of what Nelson is talking about.
Granted, the microeconomy of quotation (transclusion) that Nelson envisions has not yet materialized, but that may only be because he is thinking so far ahead of his time. Staying focused on the present, it's worth taking a look at what is developing with online advertising. Keyword ads, Google's "AdSense", Amazon's web services, and even voluntary donation models like PayPal tip jars -- couldn't you say these are the humble foundations of an online micropayment economy? The explosion of electronic self-publishing has not as yet produced an equivalent commercial rigging, but with blogging now accepted as an important medium, that could soon change.
The next generation of publishing software may include a more robust infrastructure that could support some kind of quotation or cross-referencing economy. Right now, the few blogs that make money do so by encrusting themselves with ads. Advertisers will buy space if the site can demonstrate impressive traffic stats. But doesn't this all sort of skirt around the edge of what makes blogging exciting and influential? What if talented bloggers could earn money when significant portions of their writing were quoted?
You can already quote images, video and sound in the way Nelson dreams of quoting text: by loading it remotely, i.e. from another location on the internet. Of course, there is no microtransaction infrastructure in place. It's much more roughshod than that. You simply pull html from the source site, or embed the file's address in a media player, and plug it into your page. That's how I've transcluded John Ashbery reading his poem "The Tennis Court Oath" (source - ubuweb):
There's still a long way to go, but the points of contact with Nelson's theories are many. For me, it's his humanist philosophy, more than the fuzzy mechanics of his proposed system, that is most inspiring. There's a generosity, an understanding of the interdependency of form and content, that is conspicuously absent in the prevailing tekkie culture. Perhaps the thinker closest of kin to Nelson was Jef Raskin, whose work on the humane interface is founded on many of the same convictions about usability and connectedness. I also find there's a kind of poetry in Nelson's dream of a literary hypertext economy, captured not only in his writings but in his frayed, manic illustrations (transquoted here):
I think he's a kindred spirit of the institute too. Here's Nelson on electronic literature (sadly, not transquoted, just cut-and-paste):
What is literature? Literature is (among other things) the study and design of documents, their structure and connections. Therefore today's electronic documents are literature, electronic literature, and the question is what electronic literature people really need.
Electronic literature should belong to all the world, not just be hoarded by a priesthood, and it should do what people need in order to organize and present human ideas with the least difficulty in the richest possible form.
A document is not necessarily a simulation of paper. In the most general sense, a document is a package of ideas created by human minds and addressed to human minds, intended for the furtherance of those ideas and those minds. Human ideas manifest as text, connections, diagrams and more: thus how to store them and present them is a crucial issue for civilization.
The furtherance of the ideas, and the furtherance of the minds that present them and take them in, are the real objectives. And so what is important in documents is the expression, reception and re-use of ideas. Connections, annotations, and most especially re-use-- the traceable flow of content among documents and their versions-- must be our central objectives, not the simulation of paper.
Posted by ben vershbow at 03:03 PM
| Comments (2)
tags: Transliteracies , design_curmudgeonry , digital_literature , ebooks , history_of_interactive_media , html , hypertext , internet , literature , ted_nelson , transclusion , transliterature , web , web_2.0 , xanadu
google expands book-scanning project to europe 10.18.2005, 8:56 AM
This week Google will be paying a visit to the Frankfurt Book Fair to talk with European publishers and chief librarians (including arch nemesis Jean-Noël Jeanneney) about eight new local incarnations of Google Print. (more)
Posted by ben vershbow at 08:56 AM
| Comments (0)
tags: Libraries, Search and the Web , Online , books , copyright , ebook , europe , frankfurt , google , internet , library , publishing , search , web
a future written in electronic ink? 10.18.2005, 8:47 AM
Discussions about the future of newspapers often allude to a moment in the Steven Spielberg film "Minority Report," set in the year 2054, in which a commuter on the train is reading something that looks like a paper copy of USA Today, but which seems to be automatically updating and rearranging its contents like a web page. This is a comforting vision for the newspaper business: reassigning the un-bottled genie of the internet to the familiar commodity of the broadsheet. But as with most science fiction, the fallacy lies in the projection of our contemporary selves into an imagined future, when in fact people and the way they read may have very much changed by the year 2054.
Being a newspaper is no fun these days. The demand for news is undiminished, but online readers (most of us now) feel entitled to a free supply. Print circulation numbers continue to plummet, while the cost of newsprint steadily rises -- it hovers right now at about $625 per metric ton (according to The Washington Post, a national U.S. paper can go through around 200,000 tons in a year).
Staffs are being cut, hiring freezes put into effect. Some newspapers (The Guardian in Britain and soon the Wall Street Journal) are changing the look and reducing the size of their print product to lure readers and cut costs. But given the rather grim forecast, some papers are beginning to ponder how other technologies might help them survive.
Last week, David Carr wrote in the Times about "an ipod for text" as a possible savior -- a popular, portable device that would reinforce the idea of the newspaper as something you have in your hand, that you take with you, thereby rationalizing a new kind of subscription delivery. This weekend, the Washington Post hinted at what that device might actually be: a flexible, paper-like screen using "e-ink" technology.
An e-ink display is essentially a laminated sheet containing a thin layer of fluid sandwiched between positive and negative electrodes. Tiny capsules of black and white pigment float in between and arrange themselves into images and text through variance in the charge (the black are negatively charged and the white positively charged). Since the display is not light-based (like the electronic screens we use today), it has an appearance closer to paper. It can be read in bright sunlight, and requires virtually no power to maintain an image.
Frank Ahrens, who wrote the Post piece, held a public online chat with Russ Wilcox, the chief exec of E Ink Corp. Wilcox predicts that large e-ink screens will be available within a year or two, opening the door for newspapers to develop an electronic product that combines web and broadsheet. Even offering the screens to subscribers for free, he calculates, would be more cost-efficient than the current paper delivery system.
A number of major newspaper conglomerates -- including The Hearst Corporation, Gannett Co. (publisher of USA Today), TOPPAN Printing Company of Japan, and France's Vivendi Universal Publishing -- are interested enough in the potential of e-ink that they have become investors.
But maybe it won't be the storied old broadsheet that people crave. A little over a month ago at a trade show in Berlin, Philips Polymer Vision presented a prototype of its new "Readius" -- a device about the size of a mobile phone with a roll-out e-ink screen. This, too, could be available soon. Like it or not, it might make more sense to watch what's developing with cell phones to get a hint of the future.
But even if electronic paper catches on -- and it seems likely that it, or something similar, will -- I wouldn't count on it to solve the problems of the print news industry. It's often tempting to think of new technologies that fundamentally change the way we operate as simply a matter of pouring old wine into new bottles. But electronic paper will be a technology for delivering the web, or even internet television -- not individual newspapers. So then how do we preserve (or transfer) all that is good about print media, about institutions like the Times and the Post, assuming that their prospects continue to worsen? The answer to that, at least for now, is written in invisible ink.
Posted by ben vershbow at 08:47 AM
| Comments (2)
tags: Online , Publishing, Broadcast, and the Press , The Ideal Device? , book , books , computer , e-ink , ebook , eink , gadget , gadgets , interactive , internet , ipod , journalism , media , media_consumption , newspaper , paper , print , publishing , reading , readius , spielberg , technology , web
nicholas carr on "the amorality of web 2.0" 10.17.2005, 9:00 AM
Nicholas Carr, who writes about business and technology and formerly was an editor of the Harvard Business Review, has published an interesting though problematic piece on "the amorality of web 2.0". I was drawn to the piece because it seemed to be questioning the giddy optimism surrounding "web 2.0", specifically Kevin Kelly's rapturous late-summer retrospective on ten years of the world wide web, from Netscape IPO to now. While he does poke some much-needed holes in the carnival floats, Carr fails to adequately address the new media practices on their own terms and ends up bashing Wikipedia with some highly selective quotes.
Carr is skeptical that the collectivist paradigms of the web can lead to the creation of high-quality, authoritative work (encyclopedias, journalism etc.). Forced to choose, he'd take the professionals over the amateurs. But put this way it's a false choice. Flawed as it is, Wikipedia is in its infancy and is probably not going away. Whereas the future of Britannica is less sure. And it's not just amateurs that are participating in new forms of discourse (take as an example the new law faculty blog at U. Chicago). Anyway, here's Carr:
The Internet is changing the economics of creative work - or, to put it more broadly, the economics of culture - and it's doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it's created by amateurs rather than professionals, it's free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we've recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can't imagine anything more frightening.
He then has a nice follow-up in which he republishes a letter from an administrator at Wikipedia, which responds to the above.
Encyclopedia Britannica is an amazing work. It's of consistent high quality, it's one of the great books in the English language and it's doomed. Brilliant but pricey has difficulty competing economically with free and apparently adequate....
...So if we want a good encyclopedia in ten years, it's going to have to be a good Wikipedia. So those who care about getting a good encyclopedia are going to have to work out how to make Wikipedia better, or there won't be anything.
Posted by ben vershbow at 09:00 AM
| Comments (5)
tags: Libraries, Search and the Web , OS , Online , Publishing, Broadcast, and the Press , Social Software , Web2.0 , amateur , blog , blogging , blogs , book , books , britannica , collective , encyclopedia , encyclopedia_britannica , internet , journalism , mainstream_media , media , msm , open_content , open_source , publishing , web , web_2.0 , wiki , wikipedia
an ipod for text 10.13.2005, 9:26 AM
When I ride the subway, I see a mix of paper and plastic. Invariably several passengers are lost in their ipods (there must be a higher ipod-per-square-meter concentration in New York than anywhere else). One or two are playing a video game of some kind. Many just sit quietly with their thoughts. A few are conversing. More than a few are reading. The subway is enormously literate. A book, a magazine, The Times, The Post, The Daily News, AM New York, Metro, or just the ads that blanket the car interior. I may spend a lot of time online at home or at work, but on the subway, out in the city, paper is going strong.
Before long, they'll be watching television on the subway too, seeing as the latest ipod now plays video. But rewind to Monday, when David Carr wrote in the NY Times about another kind of ipod -- one that would totally change the way people read newspapers. He suggests that to bounce back from these troubled times (sagging print circulation, no reliable business model for their websites), newspapers need a new gadget to appear on the market: a light-weight, highly portable device, easy on the eyes, easy on the batteries, that uploads articles from the web so you can read them anywhere. An ipod for text.
This raises an important question: is it all just a matter of the reading device? Once there are sufficient advances in display technology, and a hot new gadget to incorporate them, will we see a rapid, decisive shift away from paper toward portable electronic text, just as we have witnessed a widespread migration to digital music and digital photography? Carr points to a recent study that found that in every age bracket below 65, a majority of reading is now done online. This is mostly desktop reading, stationary reading. But if the greater part of the population is already sold on web-based reading, perhaps it's not too techno-deterministic to suppose that an ipod-like device would in fact bring sweeping change for portable reading, at least for periodicals.
But the thing is, online reading is quite different from print reading. There's a lot of hopping around, a lot of digression. Any new hardware that would seek to tempt people to convert from paper would have to be able to surf the web. With the mobile web and wireless networks spreading, people would expect nothing less (even the new Sony PSP portable gaming device has a web browser). But is there a good way to read online text when you're offline? Should we be concerned with this? Until wi-fi is ubiquitous and we're online all the time (a frightening thought), the answer is yes.
We're talking about a device that you plug into your computer that automatically pulls articles from pre-selected sources, presumably via RSS feeds. This is more or less how podcasting works. But for this to have an appeal with text, it will have to go further. What if in addition to uploading new articles in your feed list, it also pulled every document that those articles linked to, so you could click through to referenced sites just as you would if you were online?
It would be a bounded hypertext system. You could do all the hopping around you like within the cosmos of that day's feeds, and not beyond -- you would have the feeling of the network without actually being hooked in. Text does not take up a lot of hard drive space, and with the way flash memory is advancing, building a device with this capacity would not be hard to achieve. Of course, uploading link upon link could lead down an infinite paper trail. So a limit could be imposed, say, a 15-step cap -- a limit that few are likely to brush up against.
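The bounded hypertext system described above is essentially a breadth-first traversal of a link graph with a depth cap. Here is a minimal sketch of that idea (the function name, the toy link graph, and the cap value are all hypothetical; a real device would fetch pages over the network rather than read from a dictionary):

```python
from collections import deque

def bounded_crawl(start_urls, links, max_depth=15):
    """Collect every document reachable from the day's feed items,
    following links breadth-first up to max_depth hops away.
    `links` maps a URL to the list of URLs it references."""
    seen = set(start_urls)
    queue = deque((url, 0) for url in start_urls)
    while queue:
        url, depth = queue.popleft()
        if depth == max_depth:
            continue  # the cap: don't follow links any further out
        for linked in links.get(url, []):
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, depth + 1))
    return seen

# A toy link graph standing in for one day's feeds:
graph = {
    "feed/article-1": ["ref/a", "ref/b"],
    "ref/a": ["ref/c"],
}
cache = bounded_crawl(["feed/article-1"], graph, max_depth=2)
# cache now holds everything within two clicks of the article
```

Because text is so small, even a generous cap like the 15 steps suggested above would fit comfortably on flash memory; the crawl simply stops expanding once a page sits at the cap distance from the day's feeds.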
So where does the money come in? If you want an ipod for text, you're going to need an itunes for text. The "portable, bounded hypertext RSS reader" (they'd have to come up with a catchier name --the tpod, or some such techno-cuteness) would be keyed in to a subscription service. It would not be publication-specific, because then you'd have to tediously sign up with dozens of sites, and no reasonable person would do this.
So newspapers, magazines, blogs, whoever, will sign licensing agreements with the tpod folks and get their corresponding slice of the profits based on the success of their feeds. There's a site called KeepMedia that is experimenting with such a model on the web, though not with any specific device in mind (and it only includes mainstream media, no blogs). That would be the next step. Premium papers like the Times or The Washington Post might become the HBOs and Showtimes of this text-ripping scheme -- pay a little extra and you get the entire electronic edition uploaded daily to your tpod.
As for the device, well, the Sony Librie has had reasonable success in Japan and will soon be released in the States. The Librie is incredibly light and uses an "e-ink" display that is reflective like paper (i.e. it can be read in bright sunlight), and can run through 10,000 page views on four triple-A batteries.
The disadvantages: it's only black-and-white and has no internet connectivity. It also doesn't seem to be geared for pulling syndicated text. Bob brought one back from Japan. It's nice and light, and the e-ink screen is surprisingly sharp. But all in all, it's not quite there yet.
There's always the do-it-yourself approach. The Voyager Company in Japan has developed a program called T-Time (the image at the top is from their site) that helps you drag and drop text from the web into an elegant ebook format configurable for a wide range of mobile devices: phones, PDAs, ipods, handheld video games, camcorders, you name it. This demo (in Japanese, but you'll get the idea) shows how it works.
Presumably, you would also read novels on your text pod. I personally would be loath to give up paper here, unless it was a novel that had to be read electronically because it was multimedia, or networked, or something like that. But for syndicated text -- periodicals, serials, essays -- I can definitely see the appeal of this theoretical device. I think it's something people would use.
Posted by ben vershbow at 09:26 AM
| Comments (2)
tags: Online , Publishing, Broadcast, and the Press , RSS , The Ideal Device? , apple , book , books , e-ink , e_ink , ebook , ebooks , gadget , internet , ipod , japan , journalism , librie , media , news , newspaper , paper , paperless , podcast , podcasting , print , publishing , reader , reading , sony , syndication , technology , web
trackback, adieu 10.13.2005, 1:20 AM
We've officially and permanently shut off the trackback function on if:book. We're sad to do it. The idea of trackback is such a good one -- a way to send signals (pings) to other blogs alerting them that one of their posts is being discussed on your site. It ties the blogosphere together, fosters conversations across the web. It was a beautiful dream, but spammers killed it.
Tom Coates pronounced trackback dead back in April, but if:book was only a few months old at the time, still green and optimistic. We were also less known, so spam was only coming in a light sprinkle. Now it's been a month since our last legitimate ping, and the daily dose of spam has grown so large (and so filthy) that it hardly seems worth it to keep the door open. Fewer bloggers are tracking back now anyway since most have accepted that it is a dying practice, or perhaps haven't even heard of it at all.
So trackback is done. I just want to say a few goodbyes...
Goodbye, diet pills.
Goodbye, discount sneakers.
Goodbye, hentai comics.
Goodbye, cheap loans (spelled lones).
Goodbye, online pharmacy.
Goodbye, online casino.
Goodbye, texas holdem.
Goodbye, arbitrage sports betting.
Goodbye, free nude black jack.
Goodbye, rape fantasies.
Goodbye, incest stories.
Goodbye, shemale porn.
Goodbye, animal sex.
Goodbye, gay erotica.
Goodbye, tranny surprise.
Goodbye, sex grannies.
A big middle finger to all of you.
Posted by ben vershbow at 01:20 AM
| Comments (0)
tags: Online , blog , blogging , blogs , elegy , internet , movable_type , ping , social_software , socialsoftware , spam , spammer , spamming , trackback , trust , web
google dystopia 10.10.2005, 10:06 AM
Google as big brother -- the paranoia certainly seems to be creeping into the mainstream. "Op-Art" by Randy Siegel from today's NY Times:
Posted by ben vershbow at 10:06 AM
| Comments (0)
tags: 1984 , 2084 , Libraries, Search and the Web , NYTimes , Online , algorithm , art , cartoon , dystopia , editorial , google , information , internet , newspaper , orwell , paranoia , privacy , satire , search , technology , web
ubu, king again 10.07.2005, 1:21 AM
It's nice to see that UbuWeb, the great public web library of the avant garde, is back online after "a long summer of rebuilding." At times when the web feels depressingly shallow, Ubu can be the perfect medicine. Among the many masterworks you will find is Samuel Beckett's "Film" (1965), starring a very old Buster Keaton. It's wonderful that anyone can watch this online (I've just spent half an hour in its thrall).
Also worth checking out are /ubu Editions - handsomely designed electronic texts ranging across an interesting selection of poetry, prose and theatre, including Ron Silliman's "The Chinese Notebook," which Dan blogged about a couple weeks back. These, like everything else on Ubu, are free.
Posted by ben vershbow at 01:21 AM
| Comments (1)
tags: Libraries, Search and the Web , Online , avant_garde , avantgarde , beckett , buster_keaton , curated , ebook , experimental , fiction , film , gallery , internet , keaton , library , media , museum , music , poetry , samuel_beckett , silliman , theatre , ubu , ubuweb , web
premature burial, or, the electronic word in time and space 10.06.2005, 2:09 PM
We were talking yesterday (and Bob earlier) about how to better organize content on if:book - how to highlight active discussion threads, or draw attention to our various categories. Something more dynamic than a list of links on the sidebar, or a bunch of hot threads advertised at the top. A significant problem with blogs is the tyranny of the vertical column, where new entries call out for attention on a stack of rapidly forgotten material, much of which might still be worth reading even though it was posted back in the dark ages (i.e. three days ago). Some of the posts that get buried still have active discussions stemming from them. Just today, "ways of seeing, ways of writing" - posted nearly two weeks ago - received another comment. The conversation is still going. (See also Dan's "blog reading: what's left behind".)
This points to another thorny problem, still unsolved nearly 15 years into the world wide web, and several years into the blogging craze: how to visualize asynchronous conversations - that is, conversations in which time lapses between remarks. If the conversation is between only two people, a simple chronological column works fine - it's a basic back-and-forth. But consider the place where some of the most dynamic multi-person asynchronous conversations are going on: in the comment streams of blog entries. Here you have multiple forking paths, hopping back and forth between earlier and later remarks, people sticking close to the thread, people dropping in and out. But again, you have the tyranny of the vertical column.
We're using an open source platform called Drupal for our Next\Text project, which has a blog as its central element but can be expanded with modular units to do much more than we're able to do here. The way Drupal handles comments is nice. You have the usual column arranged chronologically, with comments streaming downward, but readers have the option of replying to specific comments, not just to the parent post. Replies to specific comments are indented slightly, creating a sort of sub-stream, and the fork can keep on going indefinitely, indenting rightward.
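Under the hood, that indented-reply scheme is just a tree walk: each comment points at its parent, and rendering recurses one level deeper for each reply. A minimal sketch - hypothetical records, not Drupal's actual schema or code:

```python
# Hypothetical comment records: (id, parent_id, author).
# parent_id=None means a reply to the post itself.
comments = [
    (1, None, "alice"),
    (2, 1, "bob"),       # reply to alice's comment
    (3, None, "carol"),
    (4, 2, "alice"),     # reply to bob - indents one level further right
]

def render_thread(comments, parent=None, depth=0):
    """List comments chronologically, indenting each reply under its parent."""
    lines = []
    for cid, pid, author in comments:
        if pid == parent:
            lines.append("    " * depth + f"#{cid} {author}")
            lines.extend(render_thread(comments, parent=cid, depth=depth + 1))
    return lines

for line in render_thread(comments):
    print(line)
```

The vertical column survives, but the rightward drift at least makes the sub-conversations visible.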
This handles forks and leaps fairly well, but offers at best only a partial solution. We're still working with a print paradigm: the outline. Headers, sub-headers, bullet points. These distinguish areas in a linear stream, but they don't handle the non-linear character of complex conversations. There is always the linear element of time, but this is extremely limiting as an organizing principle. Interesting conversations make loops. They tangle. They soar. They sag. They connect to other conversations.
But the web has so far been dominated by time as an organizing principle, new at the top and old at the bottom (or vice versa), and this is one of the most-repeated complaints people have about it. The web favors the new, the hot, the immediate. But we're dealing with a medium that can also handle space, or at least the perception of space. We need not be bound to lists and outlines, we need not plod along in chronological order. We could be looking at conversations as terrains, as topographies.
The electronic word finds itself in an increasingly social context. We need to design a better way to capture this - something that gives the sense of the whole (the big picture), but allows one to dive directly into the details. This would be a great challenge to drop into a design class. Warren Sack developed a "conversation map" for newsgroups in the late '90s. From what I can tell, it's a little overwhelming. I'm talking about something that draws people right in and gets them talking. Let's look around.
Posted by ben vershbow at 02:09 PM
| Comments (4)
tags: Online , blog , blogging , blogs , comment , comments , content , conversation , design , design_curmudgeonry , dialogue , display , drupal , flow , graphical , graphics , infoviz , internet , layout , metadata , movable_type , platform , publishing , software , space , time , visualization , viz , web
the big picture 10.05.2005, 7:26 PM
Though a substantial portion of our reading now takes place online, we still chafe against the electronic page, in part because today's screens are hostile to the eye, but also, I think, because we are waiting for something new - something beyond a shallow mimicry of print. Occasionally, however, you come across something that suggests a new possibility for what a page, or series of pages, can be when words move to the screen.
I came across such a thing today on CNET's new site, which has a feature called "The Big Picture," a dynamic graphical display that places articles at the center of a constellation, drawing connections to related pieces, themes, and company profiles.
Click on another document in the cluster and the items re-arrange around a new center, and so on - ontologies are traced. But CNET's feature does not go terribly far in illuminating the connections, or rather the different kinds of connections, between articles and ideas. They should consider degrees of relevance, proximity in time, or overlaps in classification. Combined with a tagging system, this could get interesting. As it stands, it doesn't do much that a simple bullet list of related articles can't already do admirably, albeit with fewer bells and whistles.
But this is pushing in an interesting direction, testing ways in which a web publication can organize and weave together content, shedding certain holdovers from print that are no longer useful in digital space. CNET should keep playing with this idea of an article ontology viewer - it could be refined into a legitimately useful tool.
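The tagging suggestion above is easy to make concrete: score the relatedness of two articles by how much their tag sets overlap. A toy sketch, with invented articles and tags - CNET's actual system is surely more elaborate:

```python
def jaccard(a, b):
    """Overlap between two tag sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical articles and their tags - for illustration only.
articles = {
    "google-ipo": {"google", "finance", "search"},
    "yahoo-news": {"yahoo", "news", "search"},
    "quarterly-earnings": {"finance", "markets"},
}

def related(title, articles):
    """Rank the other articles by tag overlap with the given one."""
    base = articles[title]
    ranked = sorted(
        ((jaccard(base, tags), other)
         for other, tags in articles.items() if other != title),
        reverse=True,
    )
    return [other for score, other in ranked if score > 0]

print(related("google-ipo", articles))
```

Degrees of relevance then fall out of the scores themselves, rather than a flat "related articles" list.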
Posted by ben vershbow at 07:26 PM
| Comments (1)
tags: CNET , Online , browser , cluster , constellation , design , folksonomy , infoviz , internet , layout , magazine , news , newspaper , ontology , page , print , publishing , tagging , tags , visualization , viz , web
hacking nature 10.05.2005, 7:19 AM
Slate is trying something new with its art criticism: a "gallery" feature in which, each month, an important artist will be discussed alongside a rich media presentation of their work.
...we're hoping to emphasize exciting new video and digital art—the kind of art that is hard to reproduce in print magazines.
For their first subject, they don't push the print envelope terribly far (just a simple slideshow), but they do draw attention to some stunning work by Canadian photographer Edward Burtynsky, who (happily for us New Yorkers) has shows coming this week to the Brooklyn Museum and the Charles Cowles Gallery in Manhattan. Burtynsky documents landscapes bearing the mark of extreme human exploitation - the infernal streams flowing from nickel mines, junked ocean liners rusting in chunks on the beach, abandoned quarries ripe with algae in their cubic trenches, and an arresting series from recent travels through China's industrial belt.
These photographs carry startling information through the image-surplussed web. But Burtynsky disappoints in one vital, perhaps decisive, respect:
...his position on the moral and political implications of his work is studiously neutral. He doesn't point fingers or call for change; instead, he accepts industry's exploitation of the land as the inevitable result of modern progress. "We have extracted from the land from the moment we stood on two feet," he said in an interview in the exhibition catalog. "The entire 20th century has been a revving up of this large consumptive engine. It's not a question of whether we are going to stop consuming. It's not going to happen…"
As someone who believes that struggling to prevent (or at least mitigate) global ecological disaster should be the transcending narrative of our times, I find Burtynsky's detachment deeply depressing and self-defeating. His images glory in the sick beauty of these ravaged scenes, and the cultural consumers that will no doubt pay large sums for these photographs at his upcoming Chelsea show only compound the cynicism.
Posted by ben vershbow at 07:19 AM
| Comments (1)
tags: Burtynsky , Online , art , beautiful , brooklyn , china , crit , criticism , environment , exhibit , gallery , images , internet , journalism , magazine , museum , new_york , nickel , photo , photography , quarry , slate , web
wikipedia compiles britannica errors 10.03.2005, 11:12 AM
Whatever one's hesitations concerning the accuracy and reliability of Wikipedia, one has to admire their panache. Wikipedia applies the de-bugging ethic of programming to the production of knowledge, and this page is a wonderful cultural document - biting the collective thumb at print snobbism.
learning from failure: the dot com archive 09.22.2005, 11:37 AM
The University of Maryland's Robert H. Smith School of Business is building an archive of primary source documents related to the dot com boom and bust. The Business Plan Archive contains business plans, marketing plans, venture presentations and other business documents from thousands of failed and successful Internet start-ups. In the upcoming second phase of the project, the archive's creator, assistant professor David A. Kirsch, will collect oral histories from investors, entrepreneurs, and workers, in order to create a complete picture of the so-called internet bubble.
With support from the Alfred P. Sloan Foundation, The Library of Congress, and Maryland's business school, Mr. Kirsch is creating a teaching tool as well as an historical archive. Students in his management and organization courses at Maryland's School of Business must choose a company from the archive and analyze what went wrong (or right). Scholars and students at other institutions are also using it for course assignments and research.
An article in the Chronicle of Higher Education, Creating an Archive of Failed Dot-Coms, points out that Mr. Kirsch won't profit much, despite the success of the archive.
Mr. Kirsch concedes that spending his time building an online archive might not be the best marketing strategy for an assistant professor who would like to earn tenure and a promotion. Online scholarship, he says, does not always generate the same respect in academic circles that publishing hardcover books does.
"My database has 39,000 registered users from 70 countries," he says. "If that were my book sales, it would be the best-selling academic book of the year."
Even so, Mr. Kirsch believes, the archive fills an important role in preserving firsthand materials.
"Archivists and scholars normally wait around for the records of the past to cascade down through various hands to the netherworld of historical archives," he says. "With digital records, we can't afford to wait."
the database of intentions 09.16.2005, 11:16 AM
Interesting edition of Open Source last week on "Google Sociology" with David Weinberger and John Battelle, author of the just-published "The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture". Listen here.
Weinberger has some interesting things to say about Google (and the other search engines) as "publishers." I have some thoughts on that too. More to come later.
Battelle has done a great deal of thinking on search from a variety of angles: the technology of search, the economics of search, and the more esoteric dimensions of a "search" culture. He touches briefly on this last point, laying out a construct that is probably treated more extensively in his book: the "database of intentions." By this he means the archive, or "artifact," of the world's search queries. A picture of the collective consciousness formed by the questions everyone is asking. Even now, when you are logged in to Google, a history of all your search query strings is kept - your own database of intentions. The potential value of this database is still being determined, but obvious uses are targeted advertising, and more relevant search results based on analysis of search histories.
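At its crudest, building such a database is just counting: aggregate the raw query log and the most frequent strings become a snapshot of what everyone is asking. A toy version, with a made-up log - nothing like the scale or sophistication of what the search engines actually do:

```python
from collections import Counter

# Made-up query log - illustrative only.
log = [
    "cheap flights", "katrina relief", "cheap flights",
    "heart of darkness summary", "katrina relief", "cheap flights",
]

# Tally the queries: the result is a tiny "database of intentions."
intentions = Counter(log)
print(intentions.most_common(2))
```

Everything Battelle describes - trend-spotting, ad targeting, personalization - starts from aggregates like this one, sliced by user, by region, or by time.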
As regards the collective database of intentions, Battelle speculates that future advances in artificial intelligence will likely draw on this enormous crop of information about how humans think and seek.
Posted by ben vershbow at 11:16 AM
| Comments (0)
tags: Libraries, Search and the Web , Online , algorithm , audio , battelle , database , google , internet , listen , opensource , podcast , radio , radioopensource , search , searchengine , web , weinberger
uh oh 09.12.2005, 6:13 PM
It's really happening. Next Monday, The New York Times will inaugurate its "Times Select" subscription service. NYTimes.com will remain free, with much of the usual content still available (including multimedia), but op-eds and columnists will be pay-only. Oh well, the Washington Post opinion page is better anyway. The 100-article-per-month archive access is slightly tempting though.
The Times is betting that significant numbers of readers will shell out, just like they do for a premium channel on cable. Can the Times be the HBO of web news? Casual reader poll: who's thinking of paying?
(link: Letter From the Editor explaining the new service to readers)
Posted by ben vershbow at 06:13 PM
| Comments (2)
tags: HBO , NYTimes , Online , Publishing, Broadcast, and the Press , internet , journalism , media , news , newspaper , newyork , newyorktimes , subscription , times , timesselect , web
yahoo! experiments with multimedia journalism 09.12.2005, 10:36 AM
Yahoo! has enlisted tele-journalist and blogger Kevin Sites to produce a one-year web program chronicling the world's conflict zones in multimedia format.
Sites has become known for his jaunts as a "solo journalist," trundling from hot spot to hot spot with a backpack full of gadgetry, beaming reports from his one-man broadcast station. It's a formula that is tailor-made for the web. Clearly, Yahoo! was paying attention. The NY Times reports on "Kevin Sites In the Hot Zone":
As he travels to these places, Mr. Sites will write a 600- to 800-word dispatch each day and produce a slide show of 5 to 10 digital photographs. He will also narrate audio travelogues. There will be several forms of video - relatively unedited footage posted several times a week, and once a week, a more traditional video report, edited in the style of a network news broadcast.
Mr. Sites will also be the host of regular online chats with Yahoo users who will be able to post comments on message boards. And he will post quick text messages on the site updating his activities throughout the day.
Counting on war and carnage as a surefire crowd draw, Yahoo! makes a rather tawdry entrance into independent journalism. But this is a very significant move nonetheless, evidence that Yahoo! is evolving into a full-fledged media company, and suggesting that the one-man-band approach to journalism and webcast might become a regular thing. If the Sites show finds an audience, they should try out serious investigative reporting or medium-length documentary.
Posted by ben vershbow at 10:36 AM
| Comments (0)
tags: Online , Publishing, Broadcast, and the Press , blogger , blogging , broadcast , conflict , hotzone , internet , journalism , kevinsites , media , news , reporter , search , sites , war , web , yahoo , yahoo!
craigslist new orleans - web 2.0 in action 09.09.2005, 1:19 PM
You can find just about anything on craigslist. Bikes, mattresses, futons, stereos, landscapers, moving vans, graphic designers, jobs. You can even find missing persons, or a safe haven thousands of miles from what was once your home. How a public classifieds section transformed itself overnight into a dynamic networked survival book - a central node in the effort to locate the missing and provide shelter to the uprooted - captures the significance of what has happened over the past two weeks in Katrina's wake. The web has been pushed to its full potential, capturing both the enormity of the disaster (in a way that the professional media, working alone, would have been unable to), and the details - the individual lives, the specific intersections of streets - that got swept up in the flood. This give-and-take between global and "hyperlocal" is what Web 2.0 is all about. Danah Boyd recently described this as "glocalization" - "a dance between the individual and the collective":
In business, glocalization usually refers to a sort of internationalization where a global product is adapted to fit the local norms of a particular region. Yet, in the social sciences, the term is often used to describe an active process where there's an ongoing negotiation between the local and the global (not simply a directed settling point). In other words, there is a global influence that is altered by local culture and re-inserted into the global in a constant cycle. Think of it as a complex tango with information constantly flowing between the global and the local, altered at each junction.
The diverse, simultaneous efforts on the web to bear witness and bring relief to the ravaged Gulf Coast - a Knight Ridder newspaper running hyperlocal blogs out of a hurricane bunker (nola.com); a frantic text message sent from a phone in a rapidly flooding attic to relatives in Idaho who, in turn, post precise coordinates for rescue on a missing persons forum (anecdote from Craig Newmark of craigslist); an apartment rental registry turned into a disaster relief housing index; images from consumer digital cameras leading the network news; scipionus.com, the interactive map wiki where users can post specific, geographically situated information about missing persons and flood levels - that is the dance. The case of the scrappy craigslist, or rather its users, rising to the occasion is particularly moving.
Posted by ben vershbow at 01:19 PM
| Comments (0)
tags: Social Software , Web2.0 , craigslist , danahboyd , glocalization , gulfcoast , hurricanekatrina , hyperlocal , internet , katrina , network , neworleans , web
tower of babel or trivial pursuit? 12.20.2004, 3:59 PM
In an article in yesterday’s NY Times, Alberto Manguel compares the Genesis story of Babel and the ambitions of the library at Alexandria with their alleged modern-day counterpart—Google’s commitment to digitize all human knowledge. Are we constructing a modern-day tower of Babel—a monument to the hubris of what might be possible if we could just get a little smarter? Will Google help us find answers to perennial puzzlers like: where did we come from? Is anyone or anything in charge? And, what’s the meaning of it all? I went online to find out. I Googled the question “What is the meaning of it all?” and got the following:
The Meaning of Emmanuel
... "What is the meaning of it all?" "What is its purpose?" The human tendency always is to forget origins. And now that Christmas has grown to be such a ...
The Kubrick Site: John Morgan on 2001 vs. 2010
... What is the meaning of it all? Is there a God? What is the purpose of Art? Is there a merging of Art and Science?' Where Clarke in comparison only asks ...
The meaning of life, the universe and everything
... What is the meaning of it all? 'Antennae' colliding galaxies. When we contemplate the unimaginable vastness of the universe, the incredible diversity ...
London theater musical on stage in London's West End Shaftesbury ...
... But what is the meaning of it all? Well, mainly that the dreamy idealist, Boney, had all he needed in Anastasia Barzee’s sweetly trilling Jo and never ...
'Rings' actor: 'It'll be the biggest film of all time'
... What is the meaning of it all? In some ways, that sort of inquiry is completely unfashionable. "I often think one of the reasons people are dismissive ...
Becoming a Wise Elder
... Questions such as "What is the meaning of it all?" and "Does my life make any kind of difference to anyone?" were very unlikely to arise. ...
Psychology Today: Still news
... PT: What is the meaning of it all now? BB: There was a recklessness in Kennedy's life that I didn't see, a sexual recklessness I don't understand. ...
None of these offerings brought me closer to a substantive answer. Demoralized by the thought of having to go through the other 517 possibilities, I decided to respond to the suggestion at the top of my page:
Tip: Have a question? Ask the researchers at Google Answers.
I clicked "Google Answers" and entered my question: What is the meaning of it all?
Then I had to set a price for my question, between $2 and $200. I clicked on “How do I price my question?” and found the following guidelines:
*The more you pay, the more time and effort a Researcher will likely spend on your answer. However, this depends somewhat on the nature of your question.
*Above all - try to pay what the information is worth to you, not what you think you can get it for - that is the best way to get a good answer - but only you can know the value of the information you seek.
Hmm, what is the information worth to me?
I took a look at Google’s examples to get an idea of where my question might fit on the pay scale. Fifty dollars is the “minimum price appropriate for complex, multi-part questions. Researchers will typically spend at least one hour on $50 questions and be very responsive to follow-up questions.” One hundred dollar questions merit two to four hours of “highly thorough research.” Examples of hundred dollar questions included “Parking in New York City” and “How does infant-family bonding develop?” The two hundred dollar question required researchers to “spend extensive amounts of time (4 hours plus).” Examples of $200 questions included: “Searching for Barrett's Ginger Beer,” “Applications using databases,” and “What is the impact of a baby with Down's Syndrome on its family?”
None of those examples seemed to be in the same league as “what’s the meaning of it all?” Can a Google researcher find the answer in four hours? Probably not, although I do wonder what they would come up with. Anyway, the point of all this is that Google is set up to search out trivial, quotidian sorts of things, and it will be interesting to see how/if they can make the transition from those who can tell you how to “search for Barrett’s Ginger Beer” to gatekeepers of all human knowledge.
lizards! defying the laws of mass market physics 12.16.2004, 5:25 PM
Found this yesterday on changethis.com - a site devoted to publishing and disseminating manifestos. Documents are smartly designed pdfs, spread primarily through the viral channels of the blogosphere and personal email mentions.
In "The Long Tail" Wired editor-in-chief Chris Anderson predicts a new age of abundance, in which the Internet elevates niche markets and makes mass market quotas irrelevant. Of course, this is already happening, much to the distress of mass media dinosaurs, who are scrambling to protect their creaking architecture of revenue.
The "long tail" refers to the slender expanse of obscure niche sales enjoyed by a web retailer, as represented on an x-y graph. It extends from the body of high volume, mainstream sales (Wal-Mart and the like) like the caudal appendage of a lizard.
google and big brother 12.15.2004, 7:35 PM
Can Google remain true to its promise to "do no evil," now that it has shareholders to worry about, advertisers to please, and an ever-increasing reach into the repositories of human knowledge? Google still gives you that warm and fuzzy feeling. It's got the goofy name, those cute seasonal tailorings of its masthead, the lava lamps. And this is not to mention the various amusing pastimes - the "Googlewhack" game in which you try to find two words that cohabit only one of the search engine's eight billion web pages; or every writer's guilty pleasure, the Googling of the self, the "auto-Google," that delicious act of cyber-onanism.
But where might it lead? One day, when I open my fridge, might a sensor not read my searching eye and know that I am looking for milk? And knowing that I have run out, suggest an array of retailers who might be able to replenish my supply? Could Google come to mediate every exchange of information, no matter how inane, or how carnal?
Or could it come to resemble something like the Central Intelligence Corporation in Neal Stephenson's Snow Crash - a cross between the CIA, the Library of Congress, and DARPA's "Total Information Awareness" program?
Posted by ben vershbow at 07:35 PM
| Comments (0)
tags: Libraries, Search and the Web , evil , google , internet , library , library_of_congress , neal_stephenson , privacy , search , surveillance , web
Dr. Dial-up 12.15.2004, 1:05 PM
There is a new initiative underway to make biomedical research immediately available online and free to the public. According to the Pew Internet & American Life Project, 66% of those with internet access have used it to look for health/medical information. That means that over 85 million Americans (and who knows how many people worldwide) went online last year to doctor themselves. Is this a new kind of do-it-yourselfer, the amateur physician, Googling a diagnosis and a cure? And when all of this new “information” becomes available, will the office visit—which the HMOs are already putting the squeeze on—become a thing of the past?
NYPL ebook collection leaves much to be desired 12.10.2004, 1:51 PM
I just checked out two titles from the New York Public Library's ebook catalog, only to learn, to my great astonishment, that those books are now effectively "checked out," and cannot be downloaded again by anyone else until my copies time out.
It boggles the mind that NYPL would go to the trouble of establishing a collection of electronic titles, only to wipe out every advantage offered by digital texts. In fact, they do more than simply keep the ebooks on the level of print; they limit them further than that, since there are generally multiple copies of most print titles in the NYPL system.
The people responsible for this catalog have either entirely failed to grasp the concept of infinitely accessible, screen-based books, or they grasp it all too well and are trying to stunt it at its inception, perhaps out of fear of the print librarian's extinction. More likely, they are under heavy pressure from a paranoid copyright regime. Whatever the reason, the new ebook catalog shows a total lack of imagination and offers nearly no tangible benefit for the reader.
Beyond that, the books themselves are poorly designed and unpleasant to read. My downloaded copy of Conrad's Heart of Darkness (which, by the way, I found in the "Romance" section) evidences no more than ten minutes' worth of design work, and appears to be simply a cut-and-pasted ASCII file from Gutenberg with a garish graphic slapped on the cover. My copy of Chain of Command by Seymour Hersh was a bit more respectable – more or less a pdf facsimile of the print edition.
On an amusing note, the "literary criticism" section is populated almost entirely by Cliff's Notes.
Posted by ben vershbow at 01:51 PM
| Comments (0)
tags: DRM , Libraries, Search and the Web , books , copyright , design , e-publishing , ebook , ebooks , internet , libraries , library , manhattan , new_york , publishing