What was Web 2.0? Versions and the politics of Internet history

Posted in Events, Presentations, Seminars and presentations on May 4th, 2011 by admin – Be the first to comment

This presentation, given at the Oxford Internet Institute on 4 May 2011, is a reduced version of a paper of the same title that includes substantially more examples and all appropriate references. Please refer to this paper for a full account.

What was Web 2.0? [full paper]

Introduction

In 2008, the journal Fibreculture published an issue entitled “After Convergence” exploring questions of human, social and technological connectivity within a world where computer networks had led to the convergence of formerly disparate cultural practices. In describing the several contributions to the issue, the editors wrote:

we have asked not only what makes ‘2.0’ distinct from ‘what came before’ but also how it will be understood in the future. We ask this question not least because we are somewhat alarmed by visions of proliferating version control as 2.0 merges with 3.0 and 4.0 looms on the horizon (Bassett et al.)

My paper, in general terms, takes its lead from this critical interest in things ‘2.0’, focusing specifically upon Web 2.0. I will outline the way in which the emergence of Web 2.0 brought to the web the discourse of versions. A history created in versions, a particular form of an object’s history, required that, as well as Web 2.0, there had to be a Web 1.0, and of course presupposed the emergence of Web 3.0. The articulation of one depended, explicitly or implicitly, on the others.

Is Web 2.0 dead?

Web 3.0 labels many trends in the development of the web that might presage a ‘new’ time involving such ideas as the Semantic Web, both systemically and in specific application; new investment opportunities; and even new scholarly critiques and theories. So, perhaps it is time to ask: “Le Web 2.0 est-il mort?” (“Is Web 2.0 dead?”) (Lequien). Far and wide across the web, the phrase Web 3.0 yields a vast array of returns from search engines, whether they reference marketing slogans, political commentary, technical discussions or techno-evangelist opinion.

Yet the discourse of Web 3.0 bears an uncanny resemblance to the rise of Web 2.0: different in time, not substance, and marked by the same jumble of competing, but inherently irreconcilable, differences of perspective and purpose as people position themselves, their technologies, and their ideals in relation to what has come before and what might come in future. In such circumstances, the real questions to ask, then, are: how did the web come to have versions in the first place; what is the discursive process by which these versions come to make sense; and what is revealed by analyzing this history of versions?

Web 2.0 and Web 1.0 – continuity or change

Around the start of 2006, Web 2.0 became the principal way to describe the then-current web rather than being a term which looked towards an as-yet unreached future. Yet several businesses and web services thought exemplary of, or essential to, Web 2.0 date from much earlier times, as do the technologies on which they rely. Examples include blogging (Blogger and LiveJournal), distributed payment services (PayPal), crowd-sourced, user-generated content (Wikipedia), social networking (SixDegrees), and algorithmic search and associated marketing (Google).

Equally, behaviours and sensibilities which have for several years regularly been discussed in terms of Web 2.0 pre-date its origin and extend back beyond the web itself. It has been claimed that:

… the essential difference between Web 1.0 and Web 2.0 is that content creators were few in Web 1.0 with the vast majority of users simply acting as consumers of content, while any participant can be a content creator in Web 2.0 and numerous technological aids have been created to maximize the potential for content creation. (Cormode and Krishnamurthy).

The visibility of such behaviours in earlier Internet times suggests that Web 2.0 did not create this change but merely promoted it. Examples from before the web, including USENET, bulletin boards, email lists, chat environments, and MUDs, all demonstrated the founding basis of socio-technological networking: people wished to share information, create content and work with others in doing so. The World Wide Web did not mean an end to these earlier forms, but enabled a rapid increase in their utility and visibility; there was also significant website creation and curation by individual users, and the concept of the ‘webring’ emerged to enable networks of users and authors to form.

These examples stand in contradiction to the current ‘history’ of Web 2.0, which generally views the technologies, businesses and social formations of the past several years as initiating and making possible an online world of participatory, user-generated and open content and communication. Of course, the point is not that this current history is wrong: histories are not wrong, per se, but contingent on the circumstances and purposes of their creation and circulation and current contestations are fought out through, rather than by reference to, historical accounts.

Juxtaposing examples which suppose a continuity of development online with a dominant history of Web 2.0 as radical transformation allows us to look more deeply at the cultural complexity of the notion of ‘version’. The idea of versions is becoming a cultural commonplace of today’s world because of the rise of computing: the conventions of software development are coming to play a critical role in the consumption of goods and services dependent on digital technology. While it is not new for goods and services to be promoted and sold on the basis that they replace what came before (for example, a new model of car or washing machine), it is definitely the case that, in computing culture, consumers are now receptive to the idea of purchasing something which they know will be replaced in a short period of time by a new version and willingly enter into this transaction, becoming part of the development cycle as much as being its recipients. Consumers accept, if reluctantly, that the digital products they buy are not entirely finished, and will be regularly ‘patched’.

These points are not merely a comment on current software marketing: they reveal the semiotic work that the discourse of versions can do. Versions allow products to claim to be new, but not threateningly so, because they also sustain continuity and promise an easy transition from that which came before. At the same time, versions allow new products to derogate earlier incarnations of themselves as limited or otherwise demanding of replacement. Through this duality of appeal, a discourse of versions both reassures consumers that they were right to buy the product the first time around and insists that they must now, of course, consume again rather than rely on what they had already bought. Ultimately, the fetish object for consumer gratification becomes the process of upgrading, rather than what is possessed after that act.

Perhaps, then, technology might legitimize the move to a new, second version of the web while also sustaining continuity? Code within Web 2.0 is more sophisticated and enables developers and users to do far more, and in many different ways. Yet many Web 2.0 sites do not demonstrate technological sophistication, nor rely on innovations in code. Further, the importance for the computing industry of shifting the way we think of the Internet from channel (media discourse) to platform (computing discourse) cannot be overstated as a rationale, and implies technological change is consequential to a more fundamental commercial re-orientation. Thus, as Berry has argued, the insistence on technology as a discriminator between Web 2.0 and things other is more of a demand for what ought to be than an objective description of actual change. Further, if technology is to authorize the legitimacy of claims for a transition to Web 2.0, then necessarily technology is presumed to determine, or at least substantially control, the consequences and meanings of that change.

Not all explanations or discussions of the transition to Web 2.0 relied on appeals to technology, however. We can contrast contemporary commentators Schauer and Hinchcliffe, who argued respectively that new behaviours emerged because of technological development, or that traditional behaviours became influential because of the large number of regular Internet users which the web created. In both cases, however, these authors demonstrate a fundamental tension within the language of Web 2.0: the need to manage the transition between versions, to explain the 2.0 which both ‘breaks’ with the past and also connects to it. This tension is not easily resolvable because, in truth, the tension is what gives ‘versions’ their semiotic cogency.

Yet, without an articulation of change and continuity between ‘then’ and ‘now’ there could be no rationale for Web 2.0 and thus, even as this form of the web was proposed as novel, it had to be presented as less than novel. Throughout the texts of Web 2.0, simple dichotomies of new and old are presented hand-in-glove with the assertion of a contradictory, more developmental path from earlier times, often within a few short sentences of such assertions. While popular advice might be that “The definition of Web 1.0 completely depends upon the definition of Web 2.0.” (Strickland), in fact the existence of Web 2.0 depends utterly on Web 1.0. Without it, the absences and failures which Web 2.0 solves would not be knowable.

Web 2.0 and Web 0.0 – realignment to ideals

There are other ways in which the creation of Web 2.0 came to define the particular sense we have of the history of the web, with strong claims that Web 2.0 returned to the origins of the web, and indeed the Internet more generally, realigning everyday technology and social practice with the ideals which had first given birth to the web. In other words, Web 2.0 was not a continuation from Web 1.0 so much as a ‘reset and restart’, returning the web to its alpha version 0. This alpha version, according to Berners-Lee, was:

a common information space in which we communicate by sharing information [and] … dependent on the Web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialize… we could then use computers to help us analyse it, make sense of what we are doing, where we individually fit in, and how we can better work together

In this respect, the web represented ideals of social practices through network connectivity inspired by the prototypical cultural forms of the Internet. As Dean puts it, “Web 2.0 designates nonetheless the surprising truth of computer-mediated interactions: the return of the human”.

Within this particular conception of Web 2.0 the web needed a restart because Web 1.0 represented the failure of 1990s business. Not appreciating the web’s ‘true’ origins and seeking only commercial gain, business had imposed upon the web ideas and expectations drawn from the traditional media. Not only had most businesses involved in Web 1.0 mismanaged their own affairs pursuing the illusory goal of media convergence, but in doing so had threatened the potential of the web to transform the world. The dotcom crash showed that the reality of how and why people used the Internet was not what business had thought and thus proved the original ideals of open communication, sharing and so on were not only good, but true: “the Internet was literally given back to the people” (Raffl et al.).

Yet the emergence of Web 2.0 saw a return to speculative behaviour and commercial exploitation of the common information space:

The 2005 Web 2.0 conference reminded me of Internet trade shows during the Bubble, full of prowling VCs looking for the next hot startup. There was that same odd atmosphere created by a large number of people determined not to miss out. Miss out on what? They didn’t know. Whatever was going to happen—whatever Web 2.0 turned out to be (Graham).

Thus if Web 2.0 was a return to an earlier time, before Web 1.0, it would be marked by the same political economy as the 1990s, as part of informational capitalism and with competing forces vying to constitute the web as that particular fusion of technology and capital necessary to their commercial interests. Thus Web 2.0 could not completely ‘reset’ the web’s development, for it was intrinsically part of the competition within capital over the ways in which to appropriate the value of consumers’ attention, labour and tastes.

The particular nature of the wrong direction of the web is best understood by a specific analysis of the problem of ‘design’ and its relationship to the technologies through which design came to dominate people’s internet-mediated interactions. The importance of design is evident in the brief article by DiNucci which, in 1999, first coined the term Web 2.0, and in the opposed views of good design held by visual designers (e.g. David Siegel) and HCI experts (e.g. Jakob Nielsen). Exemplifying the way the web disrupted conventions and refused easy definition of norms and standards, the debate sums up the primary techno-capitalist challenge of the 1990s: how might media and computing combine, whether within one site, or within corporations set on paths of convergence. Yet this debate shows how Web 1.0 had diverged from the origins of the Internet, which were resolutely outside of the media, forming a space of information and collaboration that specifically did not draw on classic media forms, tropes or models. By 1995, from the corporate media perspective, rather than being a novelty suited to computer enthusiasts, the “WWW was seen principally as something you watched, rather than something you built a small corner of yourself” (Roscoe), and the source of this maturity was the imposition of media models to explain its future significance.

Hagel at the time stated that “In many respects, Web 2.0 represents a return to origins of Internet”, portraying Web 1.0 as a radical discontinuity from the ‘web’ which might have existed. Web 2.0 could then be proposed as a further discontinuity which would undo Web 1.0. However, Web 2.0 could never fully return to that time as if there had not been this misdirection. The need for the version (rather than a return to just ‘the web’) stemmed from the fact that commercialization could not be undone, and a new form and approach was required. Furthermore, a vast ‘audience’ had developed: the users whom Berners-Lee’s idealistic vision was to serve. Their needs and expectations, courtesy of Web 1.0, were not what had been assumed in that vision. Just as Web 1.0 could be said to have failed because it did not understand the cultures of use of the Internet, so too there was no way to reset the web without now accounting for the new cultures of use that had emerged in the 1990s.

The discourse of versions

The emergence of Web 3.0 in recent years was inevitable once Web 2.0 started to be used. It is just too easy for technology evangelists to slip into the language of versions to help communicate their messages. But, it is not just superficial talk. A move from discussing ‘the web’ to discussing Web 2.0 creates the foundations for a teleology of development legitimising what had come before, was current, and what was still to come. The emergence of one version, to replace another, ineluctably requires there to be yet another version, still to come, which in time will become current and, eventually itself be replaced. How the web came to have a history shows us that, in building this meaningful narrative, a discourse of versions works in three ways.

First, the discourse enables a return to origins, to create the legitimacy for current moves that, far from being developed from the previous version, in fact realign with a trajectory of development originally intended. The recent past is placed to one side and ‘normal’ progress is resumed. Web 1.0 becomes the repressed other, only visible because it explains the contradiction of the move to Web 2.0 from the alpha version zero. In this repression, certain key features of the Internet in society (excess value appropriation; the complex relationship of the Internet and media; and contestations within informational capitalism) are strategically obscured.

Second, a discourse of versions enables a different movement from the past into the present, whereby the recent past is normalised as creating the pre-conditions for what has now emerged. The past is overturned by incorporating it within the self of the present, not repressing Web 1.0 but adding “1” to it. Here, version zero is repressed, since the latest iteration only references that which immediately preceded it and thus helps explain away contradictions which might produce a critique of the latest version.

Third, versions create the conditions for knowing and anticipating the future in an orderly manner, managing what is to come as astutely as what has been, positioning those who control the specific meaning of each revision as the authorities on what ought to be, based on their success in modifying current reality. Web 2.0 is a part-completed project, the model for Web 3.0: thus, the reasons for current failings and problems can be safely ignored because solutions are just one step away, to the next version. The discourse of versions enables ‘erasure’ of the current version, even as it speaks it, directing attention towards the version next-to-come. The perfectibility of the Internet, and along with it the whole technocratic project that it signifies, is reassured.

Conclusion

The dominant, popular history of the web is told through versions; these versions provide the semiotic sites at which critical debates about financial, technological and regulatory issues can be played out, in a fight to define the future, through control over the meaning of the past and the referential present. For that reason there is no single, stable accounting of the versions and what they mean: the web is not software engineering, where versions represent agreed and defined iterations of the design and coding process. Yet, the origin of versions in engineering is apt: versions create order, control and mastery over a process that might otherwise become impossibly flawed in the absence of a consensus about the history of the application or product. A history of the web told in versions is all about the way that people seek to influence the direction of future development to suit their ideals, profits, or personal ambitions, but only insofar as this historical account becomes the basis for collective, or shared, understanding.

What of the other sorts of histories to which we should pay attention? Shared history of versions of the web, where periods are defined, originators and pioneers identified, and generalisations made, occludes the private, personal histories of Internet use that tell of the individual experience of connectivity and which reveal a very different kind of relationship between technology and individuals. I will conclude with two examples.

Consider the case of Justin Hall, as described by Walker:

Hall’s narration of his life online began in January 1994, with a simple homepage, and extended into a detailed hypertextual version of his life told in traditional node and link HTML. When weblogging software became accessible, Hall started using it, and posted almost daily fragments in this decade-long autobiographical project until early 2005. At this point, Hall posted a video where he discussed the problems of publicly narrating one’s life at the same time as relating to the people in one’s life, and ceased his personal blogging.

This history (and there are many more like it) stands in stark contrast to the idea that Web 1.0 was a time of static, commercially oriented content produced for a mass audience, and that this inappropriate form of the web was heroically undone by the expertise (whether business or technical) of the Web 2.0 revolution, enabling people to lead social lives online. Hall can be characterized as Web 2.0 before the term existed, and as returning to Web 1.0 during the time of Web 2.0.

Second, the web is effectively designed by the preferences, behaviours and interests of its users and not by software engineers (or indeed communications designers). As Millard and Ross found:

… the relationship between Web 2.0 and those original [hypertext pioneers’] visions is more complex than [expected]: many of the aspirations of the hypertext community have been fulfilled in Web 2.0, but as a collection of diverse applications, interoperating on top of a common Web platform (rather than as one engineered hypertext system)… The Web 2.0 model is heterogeneous, ad-hoc, evolutionary rather than designed…

This example suggests an entirely different history of the web, consisting in the innumerable and largely invisible minute acts by all the users of the web which, in their effects, become a crowd-sourced history and, in the end, visible only in its effects or in the recollection of the place within the crowd which any individual occupied at a given time.

Web 2.0 engendered a history of the Internet (its technologies, peoples, businesses and politics) that both depends on and asserts the primacy of the discourse of versions as the correct way to tell this history. This effective claim that the only legitimate way in which web histories can be told is with due deference to the technical language of the originating discipline is, ultimately, the most profound consequence of Web 2.0. Perhaps, if now we are asking “Is Web 2.0 dead?”, we might more positively ask: what other ways might we explore the histories of the web such that users’ agency in their own historicity is more fully realized?

Growing Knowledge: what is the future of research?

Posted in Events, Seminars and presentations on May 4th, 2011 by admin

 

Disclaimer: Live blogging

Growing Knowledge: what is the future of research?

(details)

A Times Higher Education debate hosted by the British Library, featuring Matthew Gamble, David Gauntlett, Alex Krotoski, Ben Hickey and chaired by Phil Baty.


Phil Baty starts the debate: it is fundamentally about the way that IT will profoundly change the nature of research. Introduces the speakers.

Hickey

(A-level student)

Has grown up surrounded by network technologies and assumes they will be crucial during his time at university. He ponders, however, whether research collaboration between people and computers might lead more traditional people to question the validity of his work, because the boundaries between him as researcher and the technology are indeterminate. [Cyborg researcher?] Perhaps universities, because of their traditional outlook, may hinder learning and research. On the other hand, maybe technology creates too narrow a vision, and the voice of experience from earlier times can shed revealing light on a problem. Points to a problem: younger people with whom Hickey spoke are largely uninterested in universities and research, seeing them as irrelevant and distanced from the real-world problems they face.

What is revealing about Hickey’s contribution is the way in which someone who has grown up with technologies of networks, intelligent agents and so on construes the role of technology in research: as something that, in effect, stands OUTSIDE the normal practices of researchers and potentially enables research and learning directly from / with computing code, without human (academic) intervention.

Gamble

(PhD candidate)

Mismatch between the potential that technology provides (connectivity, immediacy and scale) and what is currently normal practice in academic research. This potential is, however, what causes the problems as well. The web might become the “invisible college” which promotes the circulation of scholarly literature outside of the norms of academic journal publishing and, indeed, the formal structures of universities.

Provides the example of crowd-sourced data analysis within the Galaxy Zoo project, where large amounts of data were given to many individuals online for them to do micro-analysis, out of interest in the subject. Participants discovered things which the researchers were not even aware they should be looking for.

Gamble’s critique of traditional science is important: he reveals that lurking within the technologies of network collaboration is, in fact, a deeply ideological project towards openness and altruism. Open science, while often construed as made possible through the Internet and similar tools, is more about a reaction against the institutionalised, narrow and profit-oriented sciences which have emerged over the past fifty years.

Notes the resistance of scientists who reject open data (the so-called “selfish scientist”) and who are obsessed with publishing, not finding things out. “Altruism is quickly beaten out of young scientists”. So, there are tools for collaboration, but they are not used significantly.

Concludes by calling for a different mode of publishing: it’s not just open publishing, but also publishing of data, the methods, processes, the discussions about projects and so on.

Krotoski

Discussing Web 2.0 and scholarship. Ponders the reality of such technology in the real world, outside of the world of enthusiasts (such as myself, I should admit). Recounts how she spoke with PhD students as they commenced their studies: almost none of them had any kind of online presence, and definitely no blogging and so on. Students told her that they were discouraged by their supervisors from being online and open. They certainly were not taught how to do it. This was, from the traditional perspective, ‘wrong’.

So, she continues, what of the future? She emphasises the validity of blogs or similar: ideas can be trialled and discussed with peers, useful self-promotion (on the basis of quality, not spin), writing becomes a habit and reflection possible. Krotoski views scientific / technological research in the USA, where this use of social media and Web 2.0 is more prominent, as being influenced by industry, who are not interested in long-term peer review publishing but rapid and iterative publishing of ideas and their development.

I wonder if there needs to be greater discrimination between types of ‘web 2.0’ use [which I had discussed with Aleks before the event, so no criticism here]. This discrimination is, pretty much, about identifying the unknown but useful tools of the web which critics of ‘web 2.0’ probably use without realising that these tools could, from another perspective, be seen as web 2.0.

K. comes back to key point: how do we trust what is online; is it valid and reliable; how can we assess that? Normal position emphasised — it’s about training people to have that capacity to assess. Baty contributes a point: traditional publishing filters the content to give it more reliability.

Gauntlett

Online publishing and distribution of information is very useful, even required, for academics. Open publishing helps the world and is ethically required; it is great, too, for academics because it makes them self-reliant. Moreover, the web and similar tools make academics public intellectuals again, rather than closeted.

Scholarly publishing dates from a time when distribution was very limited, and filters were needed because of low bandwidth. G. has a great view on the failures of the peer-review system, which assumes reviewers are entirely uninvested in the outcome except from a rational scientific perspective. Perhaps academics can do the filtering themselves by using what is good, from their view.

Gauntlett noted he first built a website in 1997; some of the most keen advocates for web 2.0 and knowledge networking are often longer-term Internet users who, perhaps, have understood the web more from a self-creative perspective?

Debate now ensues

Something of a confusion emerges from the discussion between the academics about peer review – there’s a slight problem with comparing and contrasting peer review with complete ‘openness’ (e.g. Twitter). In fact, the discussion might more usefully concern the reshaping of peer review so that it is more productive, improving and expanding work in a supportive manner. One example is the peer review process of Critical Studies in Peer Production.

Question from audience regarding new kinds of research methods which the Internet might produce. — too much data produces new methods; online behaviour produces new methods; nice contradiction between Gamble enthusing about the Semantic Web vs Krotoski worried about the missing human condition.

Gauntlett makes an interesting comment — it appears that crowd-sourcing can elevate people to being partners in science (as in the Galaxy Zoo), “citizen scientists”; this is like citizen journalists and so on. I read this as another example of the meme/trope of participation and democracy which is ideally or occasionally true but, in fact, is a general mythos within which hierarchies and elites persist.

Innovative Education Online: Ideas for the future of learning & the Internet

Posted in reports, Writing on April 25th, 2011 by admin

In 2009 I ran a series of workshops as the first main component of my ALTC Fellowship to group brainstorm and analyse ideas about online learning and web 2.0 technologies.  During these workshops, so many good ideas were raised that I felt compelled to write up a report distilling the wisdom of more than 200 participants at 7 locations so that it might provide something of a guide for others.

At the same time, as I reflected on the workshops and what happened within them, I realised that they gave me an insight into the discourse of e-learning and Web 2.0 versions thereof in contemporary Australian higher education. Thus, I have also reported my responses to and analysis of those workshops. It’s one reason why the report has taken a while to produce and finalise.

Finally, then, here is the report Innovative Education Online:  Ideas for the future of learning & the Internet

My thanks again to everyone who attended and helped organise these events.

 

Knowledge / network / learning

Posted in sites on April 24th, 2011 by admin

I have just completed my Australian Learning and Teaching Council Fellowship program, Learning in Networks of Knowledge. This 2-year program involved, in part, the development of an extensive resource for academics to use to assist in selecting Web 2.0 applications for use in their teaching practice.  The choice and use of these tools was underpinned by the possibility of now facilitating student learning within the knowledge networking paradigm.

The site is now fully operational, both fixed content and regular updates, at:

http://knowledgenetworklearning.net

In a simple image, here is what I am attempting to do in proposing a knowledge network learning approach:

 

What was Web 2.0? Versions past, present, future and the development of Internet historicity

Posted in Events, Seminars and presentations on April 22nd, 2011 by admin

Upcoming seminar at the OII, Oxford
 
What was Web 2.0? Versions past, present, future and the development of Internet historicity

4 May 2011

UPDATE: my paper is slightly different, now that it is finished. I have concentrated more on detailing the particular way in which versions came to the web, the consequences of that, and generally exploring the way ‘versions’ work as a particular kind of (popular) historiography. I will work on the historicity stuff next!


In this paper, I discuss the emergence of the historicity of the Internet – that is, the explicit sense with practical consequences that the Internet has a history, and that it occupies a place in history which, through our use of it, also defines us as beings in time. While the term historicity has a long tradition within religious scholarship, marking efforts to determine the factual (as opposed to mythic) status of various ‘historical’ figures, I use the term with a more postmodern perspective. From this perspective it might be said all facts are myths and all myths are facts except that the politico-cultural discourses within which we know the world determine for us very clear, if contingent, boundaries between fact and myth. Historicity is better understood, therefore, as marking out that state in which the history of a phenomenon is established, and used, for particular purposes and said phenomenon is therefore experienced as having ‘a place’ in history.

For many years, the Internet existed as a kind of cultural future-in-the-present. For example in the 1990s, talk of the ‘Internet frontier’ was a metaphor to give cultural substance to this new and inexplicable space called cyberspace. But it was also a temporal metaphor: the frontier was the future, as much as it was a place (perhaps the past as well, so influential was America’s colonial history in this time). The speculative economics of the dot com boom were, similarly, a future-in-the-present, exploitation of which would (when that future actually arrived) bring untold wealth, as a bare handful of clever domain name squatters found. The alterity of the Internet, where people found freedoms not imaginable in ‘the real world’ was also an alternative time, if you like, a world of future possibilities, made real through the magic of networked computing. The Internet might have had a history (traced on Zakon’s timeline, sketched in Where Wizards Stay Up Late) but it had no historicity.

That has changed because of Web 2.0: not so much the technologies of Web 2.0 as the snowballing effects of Tim O’Reilly’s creative marketing of the term. There never was a Web 1.0 … until he (and we) started to discuss Web 2.0. Even then, Web 1.0 existed as a kind of shadow, rarely spoken of but always implicit. Moreover, almost as soon as Web 2.0 had become popular, Web 3.0 came into use as well, despite the fact that what it labelled (the Semantic Web) preceded Web 2.0 (see Allen, 2009).

What can we make of the last decade or so of the web, which has, in popular commentary, clever marketing, and actual socio-technological development, become a second version of the web we had in the 1990s? What are the consequences for the Internet of coming into history, and is there another version yet to come? Or have we reached a time when all we have is ‘the contemporary web’? My conclusions will, I hope, both inform our understanding of the Internet itself and give some guide to how we might research it.

(some of this has been sketched before when I presented on Historicising the Internet at the OII Doctoral Summer School in 2009).

Beyond the Edgeless University

Posted in Events, Summits and Workshops on April 21st, 2011 by admin – Be the first to comment

Upcoming Workshop

A Question of Boundaries: What Next for the ‘Edgeless University’?

(Workshop details)

I will be organising and facilitating a workshop on the impacts of network technologies on universities at the Oxford Internet Institute in May, focusing on a critical appraisal of the notion of ‘edgelessness’. Here is the extended abstract and plan for the workshop:


In 2009, the UK Demos Foundation released a report, The Edgeless University (Bradwell, 2009), exploring the impact of digital network technologies on British universities.

Subtitled, ‘Why higher education must embrace technology’, its author, Peter Bradwell, argues cogently for both the opportunity and necessity to remake higher education according to the new realities of a world relentlessly connected, digitised and increasingly distributed in time and space away from centralised locations.

Repurposing Robert Lang’s insights about the edgeless city, in which the functions of the city still occur but the form is more fluid, dispersed and without the clear boundaries that previously helped define ‘the urban’, Bradwell proposes a shift in higher education analogous to that within the popular music industry: technologies will not ‘do away’ with universities but, to prosper, those institutions must change systematically and with a “coherent narrative” to embrace digital networks.

While much has changed in the funding, politics and general cultural climate around higher education in recent times, nothing has lessened the threat of conservatism in the face of global knowledge networking, nor reduced the opportunity universities have to become central to the new forms of knowledge and learning which the Internet and related technologies demand.

This workshop will explore practical opportunities and problems that confront academics and institutions of higher learning in light of Bradwell’s prognosis for the technology-oriented future. The focus for the workshop is to ask:

what exactly should a modern comprehensive university do to unleash the creativity of students and staff and maximise the potential of distributed, edgeless learning while, at the same time, making the most of the physical spaces that will remain critical markers of ‘a university’? In other words, how can we use digital technologies and networks to fashion ‘new’ edges — temporary boundaries, if you like — that assist us in making education a collaborative, collective experience?

Format of Workshop

The workshop will run for half a day, including lunch:

  • Introduction / overview (30 minutes – Matthew Allen)
  • Open discussion: what are the key changes needed for enhanced, engaged teaching and learning within the edgeless university paradigm? (30 minutes – plenary)
  • Groups work on the key changes proposed, critically examining their validity, refining them and making sense of the likely outcomes (30 minutes – sub-groups)
  • Lunch (45 minutes)
  • Report back and group presentations and discussion, including consideration of the need for edges to be re-introduced at times (60 minutes)
  • Conclusion, including overall response (15 minutes – speaker TBA)

Examples of authentic learning in Internet Communications III: NET204

Posted in Ideas, Presentations on December 4th, 2010 by admin – Be the first to comment

See also other posts including the first one, on Web Communications 101, which explains more of the context.

Internet Communities and Social Networks 204

(basic unit description)

One of the most authentic learning experiences we try to offer students in the BA (Internet Communications) is the network conference, the focal point and driving force for the unit NET204. In this unit, the whole learning journey is designed around a 3-week online asynchronous conference in the latter stages of the study period: the first part of the unit involves writing the conference paper, improving it after feedback, and also designing and discussing how to run the conference and promote it.

Because every element of the unit is designed ‘around’ the conference, this unit is more than just an authentic assessment task: rather, it is an authentic learning experience, with the assessment almost ‘blending’ in with that experience. For example, the conference paper is submitted, assistance is given, and then students can improve it, rather than it simply being done and marked, as in traditional approaches. Very few activities in the real world involve submitting intellectual work that cannot be improved once completed.

While the academics set up the website and managed submissions, they were not the only ‘producers’ and users of the web for knowledge networking: students produced a YouTube video, used a Ning group and promoted the conference through Facebook and Twitter.

Portfolios, digital and reflection: interleaving Michael Dyson

Posted in Conferences, Events, Ideas on December 2nd, 2010 by admin – Be the first to comment

Listening to Michael Dyson, from Monash, talking about portfolios in teacher education: great presentation.

Dyson says:

  • Education of educators is premised, first of all, on turning them into people who practise self-development; he gives the example of the very first unit. [So, care of the self is central, and making students include themselves as subjects in the learning process - nice!]
  • Learning is changing dramatically – globalisation, computing, and so on. [But, perhaps, there is an important qualification on some of the more optimistic claims for 'new' learning: learning is embedded within society in ways that shape those possibilities, and not entirely in the interests of 'better' learning. At the very least, the definition of better is contested: is it cheaper? is it more orderly and commodifiable? is it linked to national norms and needs?]
  • The creating mind is the goal. [Interesting - not creative, but more positive and active - creating. Good difference]
  • Reflection is essential to achieving success in self-developmental learning; using Dewey (2003), emphasises “active persistent and careful consideration”; reflection is not taking “things for granted…[leading to] ethical judgment and strategic actions” (Groundwater-Smith, 2003). [Further work needed, perhaps, to understand reflection for this new generation, if one takes as given the significant changes in knowledge: is reflection as developed in the 20th century the right kind of reflection?]
  • ALACT model – action, looking back, awareness of the essential aspects, create alternatives, trial.

image of ALACT

[This is really helpful - I like the added 5th step, compared to the normal action research 4-step model]

  • “the artefacts placed in their portfolio showcase who they are and their current online learning”; these artefacts are attached to the standards which define what it is to be an educated teacher according to the outcomes required. [So portfolios are a clear negotiation of the student's understanding of those requirements and standards?]
  • Exploration of the actual portfolios that students have created, using a paid-for service iwebfolio (was subsidised). Variety of successes and failures, all the material goes into a digital, not paper portfolio. Notes the fact that the metadata on when and how material uploaded is available, unlike other means of generating a portfolio. [I emphasise: the portfolio is a genuine, real requirement for teaching employment. It is authentic learning]
  • Use of standards / outcomes as information architecture to drive cognition in inputting information (adding artefacts, commenting etc.). [So, the portfolio is 'scaffolding' into which a building goes, with a clear design brief. It might be a highly structured knowledge engine]

I am wondering if the students genuinely are doing this work for themselves or if they imagine an audience of ‘judges’ – the teachers who grade the portfolio or the employers who might use it? Managing multiple audiences is tricky, even with technology that allows it – because if you can shape the portfolio for several audiences… then does the self as audience survive?

Then again, maybe the whole point is that the students are not yet capable of being their own audience.

Some other portfolio software (and look how it is more than just a portfolio…)

http://www.pebblepad.com

Examples of authentic learning in Internet Communications II: WEB206

Posted in Ideas, Presentations on December 1st, 2010 by admin – 1 Comment

See also other posts including the first one, on Web Communications 101, which explains more of the context.

Web Publishing 206

(basic unit description)

Students doing the BA (Internet Communications) learn, in WEB101, to create a web presence that acts as the primary locus of their online identity, with links to other services and applications. In Web Publishing 206, the focus moves much more directly to writing effectively for the web (where writing can also include other media, but emphasises the written word).

The authenticity of the assessments in Web Publishing 206 is principally mobilised by requiring students to write regularly, on their blogs, exploring different aspects and techniques of good online writing. The blog is assessed in its own terms, and also as the basis for students’ reflective essays, which ensure that students are thinking about (as well as doing) this crucial online communication task.

Some examples of students’ blogs are:

Notably, most students make virtually no reference to the ‘study’ component of these blogs: they are genuine blogs addressing audiences outside universities. Use of the tag Web206, however, enables academic staff to look into them to find relevant content! And one student cleverly ‘colonised’ the name WEB206: WEB206 | a Curtin University of Technology unit

While in WEB101 there was a strong sense that other students were the audience (along with the teacher), in WEB206 students are developing a much greater awareness of real audiences. In this respect, if no other, the assessment task is significantly advantaged by making it public knowledge networking.

As before, the blogging linked with other services and tools, principally delicious, as in these examples:

Once again, we see the value of the tag – the tag Web206 enables just the relevant links to be pulled from delicious into the blog, enabling a student to also use delicious for many other purposes. In this way, knowledge networking drives the nature of the assessment completion.
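The tag-filtering idea described above can be sketched in a few lines. This is an illustrative example only, not the actual course tooling: the `bookmarks_with_tag` function and the sample bookmark data are invented, standing in for the per-user, per-tag JSON feeds that a social bookmarking service like delicious exposed, from which a blog could pull only the unit-relevant links.

```python
# Sketch: select only the bookmarks tagged "Web206" from a feed,
# leaving the student free to use the same account for other purposes.
# The feed below is a hand-made sample, not real service output.

def bookmarks_with_tag(bookmarks, tag):
    """Return only the bookmarks carrying the given tag (case-insensitive)."""
    wanted = tag.lower()
    return [b for b in bookmarks if wanted in (t.lower() for t in b["tags"])]

sample_feed = [
    {"url": "http://example.com/writing-for-the-web", "tags": ["Web206", "writing"]},
    {"url": "http://example.com/holiday-photos", "tags": ["personal"]},
    {"url": "http://example.com/blog-design-tips", "tags": ["web206", "design"]},
]

# Only the two unit-tagged bookmarks survive the filter.
unit_links = bookmarks_with_tag(sample_feed, "Web206")
```

The point of the design is in the case-insensitive match: students tag inconsistently (`Web206`, `web206`), so the filter, rather than the student, absorbs that variation.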

More findings from WEB206 (which ran for the first time in late 2010) will emerge over time. Thanks to Dr Helen Merrick, chief wrangler of publishing.

—————————————————-

Examples of authentic learning in Internet Communications I: WEB101

Posted in Ideas, Presentations on December 1st, 2010 by admin – 1 Comment

The first of several posts, each relating to a different unit of study at Curtin

Introduction

Over the past two years, students in Internet Studies, Curtin University, studying the BA (Internet Communications) and related courses have been doing a lot of authentic assessment involving online activities. These assignments are authentic in that they are ‘true’ to the content of their studies (that is, aligned with the outcomes), ‘real’ within the likely fields of employment for graduates, and ‘natural’ for the emerging dominance of knowledge networking in society. More on these three variations on authenticity in a moment.

Not all assessments fit this pattern (nor should they), but we have seen significant improvements in the motivation of students to complete and exceed the requirements of assignments, as well as a greater degree of creativity and expression suggesting deeper engagement with learning. It has also, we think, improved students’ attention to more traditional scholarly assignments (such as essays) because of the variety we engendered across all assignment tasks. (And, it should be noted: essays are authentic too – to the lifeworld of academia, which remains important alongside work and elsewhere.)

Much of what makes these assessment approaches authentic is that they are public. Here, then, are some examples which suggest some of the value of embracing public knowledge networking as the basis for assessment, at least in courses that involve digital media and communications but, most likely, in any course where students need to work with, communicate and reflect on knowledge and, in doing so, become producers, not just receivers.

Web Communications 101 (WEB101)

A major component of the assessment in this unit is a ‘web presence’. More than a website and blog, a web presence interlinks a central node with linked services and nodes to expand the digital footprint of a user and establish their online identity. The negotiation and communication of identity is central to this unit: it’s not just ‘how to blog’.

A very small number of examples of these web presences are:

Over 400 students have taken the unit: sorry, can’t show them all. In particular, look at how some students have made their web presence almost entirely ‘real’, with bare hints of what it connects to (their study); others have not. Some students, as evidenced by these presences, are now using them as part of other units of study too.

Note that students happily created their own informal, computer-mediated network spaces such as Web101 – Curtin University | Facebook; and teaching staff also use the web as it was intended – free and rapid information exchange – to support this unit: Web101 Assignments FAQ.

A big part of the unit also involves the use of Twitter: see the most recent Twitter search; delicious is also used.

Please look at “I Tweet Therefore I am?” by Dr Tama Leaver, chief architect of the WEB101 learning experience.

———————————-

As I have argued elsewhere: the authenticity of these assessments is not a simple ‘flip’ from artificial academic work into ‘real’ web work. They are a negotiation and a compromise in which equally valid requirements from both knowledge networking and education are brought into a creative and productive tension. In the next instalment, I will provide some examples of what happens for students in the followup unit to WEB101.