en plein txt

logical negation | cross.ova.ing ][4rm.blog.2.log][ | [SUPER8.LOG]

Entries tagged "internet".

The Web is broken

...as a platform for electronic publishing for long-term storage and reference. One might embrace this instability, as net.art has done from early on. But it means that the world remains stuck with print books and journals for all "serious" publishing, with all the negative - and no longer necessary - implications for public access to information, research, study and learning opportunities outside rich institutions and countries.

Reasons why the Web is broken for long-term electronic publishing:

  • The URL system is broken since it doesn't sufficiently abstract from physical server addresses. This has always been a problem, but has escalated with the (a) commercialization of the Web and (b) proliferation of content management systems: (a) URLs rely on DNS, DNS does not abstract enough from IP addresses and has been tainted by branding and trademarks. (b) Content management systems create internal namespaces (which taint URLs and document structure) and are highly unstable, getting rewritten or replaced every couple of years. If a document lives in a CMS, it is unlikely to survive for years at its URL. (Ultimately, CMSs just add another layer of spam to the Web.)

  • As a side-effect, no kind of reference to an online resource - be it a citation, link or embedded quotation - can be made reliably, which is why the WWW has not met its original goal of providing a distributed hypertext system.

  • Still, the Web is not distributed enough because web servers provide single points of failure (and document death - an issue closely linked to the URL and DNS system).

  • HTML does not provide enough structure for research-level publishing. More capable alternatives such as extended XHTML, DocBook XML and TEI XML have not succeeded because their complexity is too much for people trained on graphical software that emulates, and thus artificially extends, analog tools and their workflows. Because of this legacy, not even the rudimentary semantic markup structure of HTML has been widely understood and used.

  • Changes / editing histories of documents can only be tracked on the level of individual content management systems (such as Wiki engines), not the Web as a whole. However, built-in revision control and version rollback are a necessary precondition for reliable referencing of documents. [Ted Nelson had that figured out in the 1970s.]

What could be done:

  • Introduce a new document identifier/addressing system that fully abstracts from DNS, using cryptographic hashes as document identifiers and a distributed registry for those document hashes.

  • Introduce document revision control and rollback on protocol/API level. This would allow server software to implement whatever individual local revision control, respectively rely on existing systems (RCS/CVS, Subversion, git etc.) while still maintaining network compatibility.

  • There is no hope for high-level standardization of document formats (such as simplified TEI XML), so simply allow any open standard document format.
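The first two proposals - hash-based document identifiers abstracted from DNS, and revision control built into the protocol itself - can be sketched together in a few lines. Everything below (names, API, storage layout) is a hypothetical illustration of the idea, not an existing system: each revision is identified by the SHA-256 hash of its bytes, records the hash of its parent revision, and can be verified by any client regardless of which server delivered it.

```python
import hashlib

def doc_id(content: bytes) -> str:
    """A location-independent document identifier: the SHA-256 hash
    of the document's bytes, not a server address."""
    return hashlib.sha256(content).hexdigest()

class DocumentStore:
    """Hypothetical content-addressed store with built-in revision chains.
    Each revision records the hash of its parent, so a citation can pin
    an exact revision and the full history can be walked back."""

    def __init__(self):
        self.blobs = {}    # doc_id -> content bytes
        self.parents = {}  # doc_id -> parent doc_id (or None)

    def publish(self, content: bytes, parent=None) -> str:
        ident = doc_id(content)
        self.blobs[ident] = content
        self.parents[ident] = parent
        return ident

    def fetch(self, ident: str) -> bytes:
        # Retrieval by hash: any mirror holding the bytes can serve them,
        # and the client verifies integrity by re-hashing.
        content = self.blobs[ident]
        assert doc_id(content) == ident
        return content

    def history(self, ident: str) -> list:
        # Walk the parent links back to the first revision.
        chain = []
        while ident is not None:
            chain.append(ident)
            ident = self.parents[ident]
        return chain

store = DocumentStore()
v1 = store.publish(b"The Web is broken...")
v2 = store.publish(b"The Web is broken, revised.", parent=v1)
print(store.history(v2))  # newest-first chain of revision identifiers
```

This is essentially how git addresses objects internally; the proposal above would lift the same principle to the level of the Web's addressing and transfer protocols, so that servers could keep whatever local revision control they like while remaining network-compatible.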

This needs to be written up as a paper, with technical terms unwrapped for non-technical readers.

Tags: internet.
2nd January 2009

Critique of the "Digital Humanities Manifesto"

Remarks on the Digital Humanities Manifesto:

There are, to put it diplomatically, issues with this manifesto, both in its precision of terminology and critical thinking. First of all, the term "digital humanities" is fuzzy. Does it mean the cultural study of digital information systems, or simply the use of these systems in humanities research and education? If the latter is meant, why differentiate between humanities and other fields of study and not talk about "digital technology-based research and education" in general?

Paragraph 1 of the manifesto states that...

Digital humanities is not a unified field but an array of convergent practices that explore a universe in which print is no longer the exclusive or the normative medium in which knowledge is produced and/or disseminated.

This is a straightforward paraphrase of McLuhan's "end of the Gutenberg Galaxy", with the only catch that McLuhan referred to analog media - film, radio, television. So it seems as if the authors thoroughly confuse "electronic" and "paper" with "digital" and "analog". But, technically speaking, the movable type printing press is not an analog but a digital system, in that it breaks all writing down into discrete, countable [and thus computable] units.

On top of that, there are very contemporary positions in the so-called 'new media' field that are much more differentiated and a few steps ahead in their reflection of the relation between online and print publishing. In his introductory essay to the first Mag.net reader, Alessandro Ludovico soundly argues that "print is becoming the quintessence of the web", a stable long-term medium for which the unstable medium of the Web serves as a production and filtering platform.

Like all media revolutions, the first wave of the digital revolution looked backwards as it moved forward. It replicated a world where print was primary and visuality was secondary, while vastly accelerating search and retrieval.

The common assumption that media studies suffer from a lack of mid- and long-term memory is confirmed by this paragraph. Historically, the opposite is true. In their "first wave of the digital revolution", the humanities chiefly associated the new technology with the holographic visuality of "virtual reality" and "cyberspace". The humanities needed about ten years to catch up and grasp that computing and the Internet were based on code, and thus on linguistic logic.

Now it must look forwards into an immediate future in which the medium specific features of the digital become its core.

First of all, "the digital" is not a medium, but a type of information; information made up of discrete units [such as numbers] instead of an analog continuum [such as waves]. The medium - the carrier - itself is, strictly speaking, always analog: electricity, airwaves, magnetic platters, optical rays, paper.

To insist on this terminological precision is not just technological nitpicking, but of political significance. It reminds us of the concrete materiality of the Internet and computing, which involves the exploitation of energy, natural resources and human labor, as opposed to falsely buying, by virtue of abstraction, into the "immateriality" of "digital media".

The first wave was quantitative, mobilizing the vertiginous search and retrieval powers of the database. The second wave is qualitative, interpretive, experiential, even emotive. It immerses the digital toolkit within what represents the very core strength of the Humanities: complexity.

As it remains totally vague what this "second wave" represents - YouTube and social networking as the next evolutionary step after Google Search? [Seriously? How young are the people who wrote this?] - it is nearly impossible to seriously discuss this argument. It also seems quite futile to argue whether the humanities or the sciences have the better grip on "complexity" - a word which is a systems-theoretical null signifier, typically serving as a dialectical device for reducing the very thing it means; saying that something is "complex" is a truism, and thus a simplification.

Aside from that, the above argument is seriously flawed in its implicit assumption that there was no, or less, social and cultural complexity involved in what it calls the "quantitative" formalisms of databases and programming. It's a blatant regression behind the research of critical media scholars [like Matthew Fuller, Wendy Chun, McKenzie Wark and many others] and hacker activists of the past decade; research that has shown again and again how these very formalisms are "qualitative", i.e. designed by human groups and shaped by cultural, economic and political interests through and through.

Interdisciplinarity/transdisciplinarity/multidisciplinarity are empty words unless they imply changes in language, practice, method, and output.

And the words in this paragraph are just as empty because they state a completely generic truism.

The digital is the realm of the open: open source, open resources, open doors. Anything that attempts to close this space should be recognized for what it is: the enemy.

I'm slightly tempted to put the above paragraph, as a sarcastic joke, into my E-Mail signature, because it is the perfect [if surely unintended] joining of the ideological opposites of a liberal Popperian ideology of "the open" with a right-wing, Carl Schmittian agonistic rhetoric of "the enemy".

I'll stop here in order not to produce a prolonged rant - and sincerely apologize for my harshness if the "Digital Humanities Manifesto" should turn out to be a text written by younger students.

22nd January 2009

Rereading "The Cathedral and the Bazaar"

Replying to a question on Nettime on "open source and implementing this metaphor to a curatorial practice":

"Curatorship" remains a problematic term not only in this context - "self-organization" may be more appropriate - but the question also extrapolates the systems beliefs within Internet culture. ESR's text needs, on the one hand, to be read in its historical context of optimistic 1990s Internet-cultural visions of "crowd wisdom", "collective intelligence", "smart mobs" etc.; cybernetic memes as much as variations of the liberal tropes of the "invisible hand" (A. Smith) and the "open society" (Popper). In Raymond's text - which has been overrated, but is nevertheless a historical document - the "bazaar" is first of all a systemic free-market metaphor.

Linux and, more recently, Wikipedia and other phenomena show that "critical mass" theories are not completely off. The issues are, in essence, the same as with all consensus-based projects - such as architectural visions: Linux reimplemented Unix instead of the Plan9 or Lisp Machine kernel architectures simply because the Unix kernel architecture is computer science textbook knowledge. Correspondingly, Wikipedia implements the most clearly consensus-based form of writing, the general encyclopedia. (Still, its value lies in the frequent eccentricity and obscurity of the phenomena it tracks, unless this is stifled by angst-ridden editorial self-control.)

That "open collaboration" is not a magic bullet, and that "open curatorship" is older than "Open Source", may best be studied in the Mail Art network, beginning with Ray Johnson's New York Correspondance School in the 1960s, with the festivals and non-juried exhibitions of previous avant-garde art movements as an even older precedent. Bob Black said everything that needs to be said about Mail Art when he compared it to the Paralympics, i.e. a seemingly alternative but really just parallel system to the established one [hard to avoid the term here], based on its own - quantitative instead of qualitative - logic of reward and punishment.

Obsessed with egalitarianism, the Mail Art network made it a rule never to reject any contribution to an open-call project, despite the known and often enough deplored "junk mail" phenomenon. This ultimately renders "Mail Art" yet another cybernetic, systems-obsessed art, paralleling the decline [or rather: the continually present, but ultimately dominant aspect] of Fluxus into the "intermedia" laboratory art described in Gene Youngblood's "Expanded Cinema" (1970) [cf. H. Flynt's criticism of Fluxus].

"Self-organizing systems" up yours.

Back to "The Cathedral and the Bazaar": it is, like Barthes' "The Death of the Author", a text that nobody has read yet everybody has an opinion about. Contrary to popular belief and urban myth, it does not actually pit an Open Source "bazaar" model against a proprietary, Microsoft-ish "cathedral" model of software development, but analyzes the decentralized development of one specific piece of software: the Linux kernel, supervised by L. Torvalds. The urban myth probably originates in the fact that non-technical readers are unlikely to understand that it is not about [what is commonly called] the "Linux operating system" as a whole. In fact, the classical "cathedral" model of software development in small, closed committees had been characteristic of, among others, GNU software, the free BSDs and the X Window System, i.e. all the base components of a typical "Linux distribution" except the kernel itself.

Ten years later, a clear-cut division of "bazaar" and "cathedral" no longer exists in Free Software development: the development of the Linux kernel has become more hierarchical, while the development of GNU and BSD software has become more distributed and adapted to the Internet. (Viz. the now-standard use of networked version control systems.)

While not using the term "Open Source" in its initial version, the essay preempts the later Open Source-vs.-Free Software debate by discussing open, distributed development processes as technically superior to closed ones. This is its main point [with, as pointed out, striking similarities to Bertalanffy's theory of open systems and Popper's theory of the open society as the counter-model to societies founded on philosophical idealism]. Again: while the distributed model has its advantages - most obvious in the fact that, thanks to BSD, GNU and Linux, Unix hasn't died but has improved and blessed us with tools that don't offend the human intellect, such as zsh and vim, never mind the complete lack of proprietary commercial interest in developing such software - it is not the answer to all questions. Raymond's conclusion that "given enough eyeballs, all bugs are shallow" is a bit loudmouthed considering, for example, the issue of MD5 hash collisions.

The reverse is true as well: if there are not enough eyes, bugs can bite you, for example in FLOSS multimedia authoring software from Cinelerra to PD, with its minuscule communities of often non-professional programmers.

All critique of "open systems" ideology pales, however, in comparison to the issues of (contemporary visual) art. Art literally wears the emperor's new clothes, and suffers from a severely if not pathologically distorted self-perception of its actual contemporariness. It is the only one of the modern arts that is still structurally feudalist, with an economy firmly based on the notion of a single material fetish object, with reproduction - unlike in books, music records, films, software - being merely a second-rate, plebeian illustration of the aristocratic "original". It is financed by the modern successors to the old feudal authorities: back then, the church and the courts; today, the rich as the successors to the aristocracy and the state as the grant-giving successor to the church.

22nd February 2009
