British library and information schools: the research of the School of Information Management (MIC), London Metropolitan University (ISBN 9781846634031, 9781846634024)

This is the third in a series devoted to the research output of British library and information schools.



ISSN 0001-253X

Volume 59 Number 2 2007

Aslib Proceedings: New Information Perspectives
British library and information schools: the research of the School of Information Management (MIC), London Metropolitan University
Guest Editor: Rosemary McGuinness

www.emeraldinsight.com

Aslib Proceedings: New Information Perspectives

ISSN 0001-253X Volume 59 Number 2 2007

British library and information schools: the research of the School of Information Management (MIC), London Metropolitan University Guest Editor Rosemary McGuinness

CONTENTS

Access this journal online _______________________________ 119

Editorial advisory board _________________________________ 120

GUEST EDITORIAL
Research in the School of Information Management, London Metropolitan University
Rosemary McGuinness ______________________________________ 121

Managing the relationship between knowledge and power in organisations
Catherine Kelly __________________________________________ 125

The I in information architecture: the challenge of content management
Sue Batley _______________________________________________ 139

Phenomenography: a conceptual framework for information literacy education
Susie Andretta ___________________________________________ 152

Post-structuralism, hypertext, and the World Wide Web
Luke Tredinnick __________________________________________ 169

Learning by doing: lifelong learning through innovations projects at DASS
Shiraz Durrani ___________________________________________ 187

Access this journal electronically
The current and past volumes of this journal are available at: www.emeraldinsight.com/0001-253X.htm
You can also search more than 150 additional Emerald journals in Emerald Management Xtra (www.emeraldinsight.com).
See page following contents for full details of what your access includes.

www.emeraldinsight.com/ap.htm
As a subscriber to this journal, you can benefit from instant, electronic access to this title via Emerald Management Xtra. Your access includes a variety of features that increase the value of your journal subscription.

How to access this journal electronically
To benefit from electronic access to this journal, please contact [email protected]. A set of login details will then be provided to you. Should you wish to access via IP, please provide these details in your e-mail. Once registration is completed, your institution will have instant access to all articles through the journal's Table of Contents page at www.emeraldinsight.com/0001-253X.htm. More information about the journal is also available at www.emeraldinsight.com/ap.htm

Your access includes a variety of features that add to the functionality and value of your journal subscription. Our liberal institution-wide licence allows everyone within your institution to access your journal electronically, making your subscription more cost-effective. Our web site has been designed to provide you with a comprehensive, simple system that needs only minimum administration. Access is available via IP authentication or username and password.

Key features of Emerald electronic journals

Automatic permission to make up to 25 copies of individual articles
This facility can be used for training purposes, course notes, seminars etc. This only applies to articles of which Emerald owns copyright. For further details visit www.emeraldinsight.com/copyright

Online publishing and archiving
As well as current volumes of the journal, you can also gain access to past volumes on the internet via Emerald Management Xtra. You can browse or search these databases for relevant articles.

Key readings
This feature provides abstracts of related articles chosen by the journal editor, selected to provide readers with current awareness of interesting articles from other publications in the field.

Non-article content
Material in our journals such as product information, industry trends, company news, conferences, etc. is available online and can be accessed by users.

Reference linking
Direct links from the journal article references to abstracts of the most influential articles cited. Where possible, this link is to the full text of the article.

E-mail an article
Allows users to e-mail links to relevant and interesting articles to another computer for later use, reference or printing purposes.

Structured abstracts
Emerald structured abstracts provide consistent, clear and informative summaries of the content of the articles, allowing faster evaluation of papers.

Additional complimentary services available

E-mail alert services
These services allow you to be kept up to date with the latest additions to the journal via e-mail, as soon as new material enters the database. Further information about the services available can be found at www.emeraldinsight.com/alerts

Emerald online training services
Visit www.emeraldinsight.com/training and take an Emerald online tour to help you get the most from your subscription.

Xtra resources and collections
When you register your journal subscription online, you will gain access to Xtra resources for Librarians, Faculty, Authors, Researchers, Deans and Managers. In addition you can access Emerald Collections, which include case studies, book reviews, guru interviews and literature reviews.

Emerald Research Connections
An online meeting place for the research community where researchers present their own work and interests and seek other researchers for future projects. Register yourself or search our database of researchers at www.emeraldinsight.com/connections

Choice of access
Electronic access to this journal is available via a number of channels. Our web site www.emeraldinsight.com is the recommended means of electronic access, as it provides fully searchable and value added access to the complete content of the journal. However, you can also access and search the article content of this journal through the following journal delivery services:
EBSCOHost Electronic Journals Service: ejournals.ebsco.com
Informatics J-Gate: www.j-gate.informindia.co.in
Ingenta: www.ingenta.com
Minerva Electronic Online Services: www.minerva.at
OCLC FirstSearch: www.oclc.org/firstsearch
SilverLinker: www.ovid.com
SwetsWise: www.swetswise.com

Emerald Customer Support
For customer support and technical help contact:
E-mail: [email protected]
Web: www.emeraldinsight.com/customercharter
Tel: +44 (0) 1274 785278
Fax: +44 (0) 1274 785201

EDITORIAL ADVISORY BOARD

Ralph Adam, Freelance Journalist, City University, London, UK
John Akeroyd, Director, Learning Resources, South Bank University, London, UK
Andrew Boyd, Research Associate, CIBER, School of Library, Archive and Information Studies, University College London, UK
David Brown, Director, Publishing Relations, The British Library, UK
Peter Chapman, Information Consultant, Banbury, UK
Professor Peter Cole, Head of Department, Department of Journalism, University of Sheffield, UK
Dr Tom Dobrowolski, Senior Lecturer, Institute of Information Science and the Book, University of Warsaw, Poland
Harry East, Honorary Research Fellow, Department of Information Science, City University, London, UK
Professor David Ellis, Research Director, Department of Information and Library Studies, University of Wales, Aberystwyth, UK
Dr Ben Fouche, Principal, Knowledge Leadership Associates, South Africa
Eti Herman, Vice-Director, Library, University of Haifa, Israel
Nat Lievesley, Network Manager, Centre for Policy on Ageing, London, UK
Helen Martin, Media Information Consultant, London, UK
Anthony Watkinson, Publishing Consultant, Oxford, UK
Richard Withey, Global Director, Interactive Media, Independent News and Media, UK

Aslib Proceedings: New Information Perspectives, Vol. 59 No. 2, 2007, p. 120. © Emerald Group Publishing Limited, 0001-253X


GUEST EDITORIAL

Research in the School of Information Management, London Metropolitan University
Rosemary McGuinness
School of Information Management, London Metropolitan University, London, UK

Received 24 March 2006; revised 29 March 2006; accepted 31 December 2006

Abstract
Purpose – The purpose of this paper is to demonstrate the practical applications of Information and Knowledge Management to society from the academic team at London Metropolitan University through an introduction to the papers in this special issue.
Design/methodology/approach – A narrative which provides a social context to Information and Knowledge Management.
Findings – There is a concrete link between the theory and practice of Information and Knowledge Management in delivering an equitable society.
Research limitations/implications – This paper is limited to a descriptive editorial.
Practical implications – Information and Knowledge providers and services are primarily for individuals and organisations in society.
Originality/value – This paper represents an introductory look at the Information Society.
Keywords: Information management, Knowledge management, Society
Paper type: Viewpoint

In September 2000, aggrieved farmers in North Wales voted after a meeting on 6 September to blockade a fuel depot when diesel and petrol prices were increased due to the rising cost of crude oil. Two days later they started their blockade of the Stanlow oil depot in Merseyside, encouraged in this action by their French counterparts a week earlier. Underlying the protest was the fact that the UK Government levied the highest fuel tax in Europe. It is well documented that this local protest spread like wildfire nationwide, fanned by the Internet, mobile telephony and local radio. The original group was joined in the protest by hauliers and taxi drivers, and indeed had the support of the oil suppliers themselves. This convergence of technologies, aligned with the speed of transference of information, sparked scenes of panic and temporary hardship. Many petrol stations closed down and deliveries of basic food to shops were curtailed. The protests spread to other European countries: Ireland, Spain, Sweden, Norway, Germany and the Netherlands.

This London Metropolitan University special issue of Aslib Proceedings opens with a reminder of the above event. The story which started on 6 September 2000 highlights the challenges facing a civil and informed society. The Information Management subject domain has its home at London Met in the Department of Applied Social Sciences, a location of particular relevance when you come to read the articles by the academics.

Aslib Proceedings: New Information Perspectives, Vol. 59 No. 2, 2007, pp. 121-124. © Emerald Group Publishing Limited, 0001-253X. DOI 10.1108/00012530710736636


Information and Knowledge Management at London Metropolitan University is well grounded in the Social Sciences, and the slant of the work by the authors in this special edition of Aslib Proceedings is testament to this. In considering society it is useful to draw on the three-legged stool as a metaphor. Consider that the seat of the stool is society itself and that the three legs represent the citizen, the state and capital. If one of these legs becomes either too long or too short then the stool will topple over. The oil blockade highlights the fragility of this stool perfectly. The citizen, capital and the state entered a conflict and temporarily the stool's balance was threatened. The above event illustrates clearly how the power of the citizen can be used and how vulnerable both the state and the citizen are to capital. In other societies, where the leg representing the state is too long, economic well-being can be stifled by inefficiencies and over-regulation. All three legs have to be finely balanced in order to have a civil and equitable society. However, without capital society would not be able to function. The development of the knowledge economy is a central plank of government policy, and organisations playing their part in contributing to GDP are key to the economic well-being of society and of the citizens within it.

Information, knowledge, and the effective management of them, are impossible to divorce from the social structure and the global web of networks. This is mirrored by the exponential growth of information and the twentieth-century shift from a goods-producing economy to a service economy where, according to Daniel Bell, knowledge is placed at the fulcrum. Information is the lubricant and source of energy in this society. Professionals who manage information and knowledge have a different role today than heretofore. No longer are they merely guardians and organisers of information and knowledge; they now have to work within a greater framework. The shift is fundamental in that they have to deliver outcomes to meet the needs of the economy and society and provide a service which is increasingly accountable and, moreover, provide that service both legally and ethically. Additionally, the technological and communications infrastructure provides opportunities to push information more readily.

Citizen engagement with the state is crucial as well, and access to information and learning is key in promoting this. Technology aids this as well. However, the state is rightly concerned about the increasing disengagement of citizens, and the manipulation and "ownership" of information is critical in this regard. One of the key debates is around "fee or free" with regard to information, the value that is placed on it and, of course, who values it. Ownership of media and information and communications technologies is a perennial issue, and the question of "free" information goes beyond financial cost. How free is it from manipulation and spin-doctoring? The key to democracy is information, and without it, or access to it, how can citizens engage in a Habermasian public sphere?

In designing curricula to produce information professionals there has been a shift in emphasis here at London Metropolitan University. The principal thrust of that shift has been to provide the market with graduates with the skills and competencies to meet the demands of a changing society. A deeper knowledge of the centrality of information and knowledge to both society and organisations is important to impart.

Discussions with employers over the last few years have shown that, while there are particular challenges to be met with the increase in digital information and its organisation, they are very keen to recruit graduates with "can do" attitudes, with excellent communication skills being paramount.

The vitality of the economy is fundamental to society, and organisations large and small, public and private, are at the heart of this. Two of the articles draw particular attention to the organisational issues around information and knowledge. The first of these, by Kelly, clearly steps outside the traditional Information Management box and deals with the issue of knowledge as a power resource in organisations. The article draws on the literature from a vast range of disciplines and ultimately proposes that, for organisations to be successful, the transformation of knowledge from a personal resource into a communal, shared one is of prime importance. The key to this is recognising the value of social capital and the need for organisations to invest in it. Through this investment, individual knowledge can become communal via the personal networks of acquaintances that individuals develop. Research methods from the social sciences, such as social network analysis and ethnography, can reveal these human links, which are vital in establishing Knowledge Management practice within organisations.
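The social network analysis referred to here can be made concrete: at its simplest it means mapping who exchanges know-how with whom, and then asking who brokers knowledge between otherwise separate groups. The short sketch below is purely illustrative, using invented names and ties and the open-source networkx library; it is not drawn from any of the studies discussed in this issue.

```python
# A minimal sketch of an "advice network" analysis: invented colleagues and
# ties, not data from any study discussed in this issue.
import networkx as nx

# Hypothetical pairs of colleagues who regularly exchange know-how.
advice_ties = [
    ("Asha", "Ben"), ("Ben", "Chloe"), ("Chloe", "Asha"),   # one working group
    ("Dev", "Elena"), ("Elena", "Farid"),                   # a second group
    ("Chloe", "Dev"),                                       # the only bridge between them
]
G = nx.Graph(advice_ties)

# Degree centrality: who is consulted most often.
degree = nx.degree_centrality(G)

# Betweenness centrality: who sits on the paths between groups, i.e. the
# brokers on whom organisation-wide knowledge flow depends.
betweenness = nx.betweenness_centrality(G)

for person in sorted(G.nodes, key=betweenness.get, reverse=True):
    print(f"{person}: degree={degree[person]:.2f}, betweenness={betweenness[person]:.2f}")
```

On this reading, individuals with high betweenness but modest degree are the informal brokers whose departure would sever knowledge flows between groups, which is one way of making visible the social capital in which, as Kelly argues, organisations need to invest.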
Batley's article is highly relevant to the emergent digital information landscape that organisations have to manage. Its focus is on addressing the challenge of content and records management in order that information can be readily retrieved. Information architecture provides the framework for working towards this goal, and her paper is all the more relevant as the demands for efficiency are ever present and compliance with legislation such as the Freedom of Information Act is a requirement for public sector organisations.

The Library and Information Commission's mantra for the information society in 1997, which set the scene for the Public Library Network, centred on its three Cs: Connectivity, Content and Competence. It was recognised that individuals would have to be able to utilise the newly arrived technological information tools. At a base level, Competence was concerned with Information Literacy. However, Information Literacy has a much wider remit than mere competence: the underlying tenet is the transference of learning skills and the development of independent learners, not only in education but in all areas of employment. Andretta, in her article, looks in particular at reflection as a key component of learning and posits that phenomenography and information literacy education have a great deal of overlap, with particular application in Higher Education. At one level, learners acquire knowledge of their particular discipline; at a second level, they reflect on what they have learned and how they apply that knowledge. This article will be of particular relevance to information professionals with a role in providing information literacy education or training, and to those who play a key part in promoting organisational learning.

The traditional assumption of the neutrality of the information profession is debated by Tredinnick. In his paper he discusses post-structuralism and goes on to apply these theories to information and knowledge management. While post-structuralism has been applied principally in the domains of communication and cultural studies, there has been little attempt to engage these theories in Information and Knowledge Management. Tredinnick suggests that the structures of information are not wholly neutral and that classification schemes can indeed impose relationships between items, which can result in the politicisation of information and knowledge. Ostensibly neutral descriptions of knowledge stand in contradiction to interpretative decisions which have a socio-cultural impact and can influence the way knowledge is received by citizens and society.

Workforce development is key to delivering the goals of a fair and equitable information society, and Durrani's paper highlights two key projects which address the need for leaders of libraries and information services in two distinct parts of the globe: the UK and Kenya. The Quality Leaders Project – Youth (QLP-Y) is addressing one of society's most important and yet, in many respects, least understood and most diverse groups. Many areas in the Department of Applied Social Sciences at London Met have a youth agenda, including Cultural Studies, Health, Digital Media, Social Work and Sociology, and the QLP-Y is no different. It is working to develop leaders in public libraries to provide services appropriate to youth; refugees and asylum seekers are of great concern here. The other project described by Durrani is one set up by the Progressive African Library and Information Activists' Group (PALIAct), in a context where neither the state nor capital has delivered the goals of society. Experts have been drawn in from the USA, the UK and Finland to look at developing information services for citizens in Kenya, with funding being sought from NGOs by highlighting this essential work through agencies such as IFLA and the World Social Forum.

The key conclusion from the body of work in this issue is the centrality of both information and knowledge in society. Information and knowledge professionals not only have to sit on that three-legged stool, but need to ensure that it remains upright, robust and stable.

Note
Rosemary McGuinness left London Metropolitan University in spring 2006, and is now a teaching fellow in the School of Education at Queen's University, Belfast. Since her departure, the Information Management School has merged with other fields in the Department of Applied Social Sciences to form the new multidisciplinary area of Media, Information and Communications (MIC). This area is led by Dr Anna Gough-Yates, and focuses on cultural, socio-economic, organisational, managerial and political issues as they relate to information and knowledge management, the media and global communications. Rosemary McGuinness can be contacted at: [email protected]

To purchase reprints of this article please e-mail: [email protected] Or visit our web site for further details: www.emeraldinsight.com/reprints


Managing the relationship between knowledge and power in organisations
Catherine Kelly
School of Information Management, London Metropolitan University, London, UK

Received 24 March 2006; revised 19 October 2006; accepted 7 December 2006

Abstract
Purpose – The purpose of this paper is to focus on knowledge management implementation from an organisational culture perspective and analyse the relationship between knowledge and power within this context. It outlines the reasons why knowledge is a power resource, and proposes that, as such, it can only be managed successfully within the framework of an effective and legitimate use of all organisational power resources. The paper looks at the factors that constitute a legitimate use of power in the Western organisational context of the twenty-first century which in turn engenders the development of trust within employment relationships. The development of trust ensures that knowledge is used to further the achievement of organisational goals. Finally, the paper addresses the ways in which effective knowledge management practice contributes to this desired state, and outlines the role of the knowledge manager in facilitating this.
Design/methodology/approach – The method adopted is a literature-based analysis of the main issues covered. These include: the development of the knowledge society and attendant theories around optimal organisational structures, the relationship between knowledge and power, the development of legitimate authority within organisations, and how this impacts on the creation of trust, and finally the impact which the presence of trust has on knowledge-sharing behaviours within the organisation.
Findings – Pulling together evidence from across a wide range of academic disciplines leads to the conclusion that the successful management of the relationship between access to knowledge and access to power must be framed within an overall organisational context, in which all power resources are seen to be exercised in a legitimate manner. In this context, knowledge is no longer regarded as a personal power resource, but rather as a communal resource which will then be more likely to be shared freely in order to facilitate the joint and mutually beneficial achievement of organisational goals. Underpinning this organisational dynamic is an environment of trust.
Originality/value – The paper provides a summary of the literature around pivotal aspects of the question of the relationship between access to knowledge and the perception of knowledge as a source of power in the organisational context. It pulls together a range of material looking at the needs of the knowledge economy and at issues around the development of legitimate authority and the development of trust in the organisational context. It then relates this back to the successful development of a knowledge-sharing culture, and outlines the role of the knowledge manager in working with employees at all levels in the organisation in developing an optimal culture for knowledge creation and sharing.
Keywords: Knowledge management, Trust, Authority, Organisational structures
Paper type: Viewpoint

Introduction
It has become increasingly clear in recent years to knowledge management practitioners that, although technology solutions offer the ability to share information and knowledge, the presence of technology in itself will not create and maintain a commitment to knowledge sharing behaviour (Currie and Kerrin, 2004).

Aslib Proceedings: New Information Perspectives, Vol. 59 No. 2, 2007, pp. 125-138. © Emerald Group Publishing Limited, 0001-253X. DOI 10.1108/00012530710736645


Davenport and Prusak (1988, p. 80) called such cultural habits that inhibit knowledge transfer "frictions". It is intended here to move towards addressing questions of how to manage these frictions effectively, how to develop trust in the organisation, and how to manage human behaviour in such a way as to create a conductive organisation in terms of knowledge flows (Saint-Onge and Armstrong, 2004). The focus for discussion in the following paper will be on the management of tacit knowledge, defined by Polanyi (1966) as personal, context-specific and therefore hard to formalize and communicate. It is not intended here to deal with the management of explicit or codifiable knowledge or information.

Early knowledge management initiatives which focused exclusively on technology solutions in attempting to manage tacit knowledge have consistently been shown to have met with limited success (Lucier and Torsiliera, 1997; Newell et al., 2001; McKinlay, 2002). Further studies attempting to identify critical success factors for knowledge management strategies have highlighted the view that successful knowledge management in organisations hinges on creating an organisational culture in which employees are committed to working together towards the common goals of the organisation, rather than working largely towards individualistic, egoistic goals (Pan and Scarbrough, 1999; Storey and Barnett, 2000; Garvey and Williamson, 2002; Hislop, 2002). One of the key aspects of the management of organisational culture offered for further investigation is how to understand and successfully manage the perceived relationship between knowledge and power in organisations, in order to increase the level of appropriate knowledge sharing within them. Hayes and Walsham clearly identified this problem when they commented:

At present, accounts of the political and normative issues in the mainstream knowledge management literature, particularly with reference to the role of information systems, are not as plentiful as these issues warrant (Hayes and Walsham, 2000, p. 70).

It is proposed here that the question of knowledge as a power resource cannot, however, be dealt with in isolation, but needs to be looked at in the context of the overall power and authority structures within organisations. It is proposed that the development of trust is affected by the way in which power and authority are exercised within an organisational context. The way in which organisations create legitimacy in the exercise of power over their employees, and the way in which employees respond to this exercise of power, underpin the levels of trust which exist in the employee/employer relationship. The development of trust, and the existence of mutually reciprocal relationships, in turn impact strongly on the way in which knowledge is used, shared and developed within the organisation. This paper will also look at the role of the professional knowledge manager in working collaboratively towards the creation of "high trust" work environments, in order to ensure that knowledge creation and transfer are embedded in the social fabric of an organisation.

To summarise, therefore, this paper will unlock the dynamics impacting on knowledge creation and knowledge sharing in Western organisations through a discussion of the following areas:

(1) The development of the knowledge society and the fundamental ways in which organisational structure affects employee behaviour.
(2) The relationship which exists between knowledge and power in organisations.
(3) How the exercise of power or authority may be legitimised within the organisational context.
(4) The relationship between trust in the organisation and effective knowledge flows.
(5) The relationship between the exercise of legitimate authority and the development of trust in the organisational context.
(6) The role of the knowledge manager in creating the necessary conditions for knowledge flows.

1. The development of the knowledge society and the impact of organisational structure on knowledge sharing
The genesis of the ideas around the emergence and development of a knowledge society is most widely credited to the writings of Daniel Bell in the key publication The Coming of Post-Industrial Society, first published in 1973. Bell's work has provided a foundation for many contemporary writers attempting to analyse the impact of the knowledge economy. Bell created a typology of societies characterised by their dominant mode of employment. Industrial societies are described as primarily goods-producing societies. In post-industrial society, he proposes, the dominant source of employment is the service sector. He goes on to say, "A post industrial society is based on services. Hence, it is a game between persons" (Bell, 1973, p. 127). Bell further develops this view when he comments, "Information becomes a central resource, and within organisations a source of power" (p. 128). Bell's analysis of the development of the knowledge society represented a significant shift from previous analyses of societal behaviour, offered in Western society from the onset of the industrial age in the 1870s onwards.

Although Bell's writings have created a recognition of the changing drivers for organisational and societal success in the "information age", there is also an awareness that the ways in which organisations are structured and employees managed still reflect an earlier industrial age characterised by, for example, the hierarchical/bureaucratic model of command and control discussed in some depth by Max Weber (1978). The bureaucratic/hierarchical organisational model has been utilised widely during the 20th century, especially in large public sector organisations. These organisations typically have exhaustive documented rules and regulations that are ostensibly designed to achieve organisational objectives, and to control and monitor employees working towards the achievement of these objectives. However, when organisations seek to closely control and monitor worker behaviour, this has the negative effect of destroying creativity and innovation amongst employees, along with any freedom to react appropriately in order to meet customer needs. For example, employees of such organisations may focus on "ticking the box" or following the documented procedures in fulfilling a particular role, rather than completing a task or delivering a service in the most effective way possible.


Many people will have experienced service delivery where the organisation may feel that it has fulfilled expectations according to its documented procedures, while the customer may not feel the service has met actual customer needs. Employees in such organisations often find themselves in the impossible situation of being unable to react appropriately to changing circumstances, as the job requirement is to stick to "the rules". Over time, employees may themselves begin to lose sight of the actual purpose of the organisation and develop an institutionalised mindset in dealing with customers.

In order to address these issues, early writers on knowledge management proposed new forms of flat or networked organisational structures to allow for increased team working and ongoing innovation within the service-based or knowledge economy. For example, Thomas Stewart (1991, p. 47) suggests that managing knowledge effectively "requires a corporate culture that allows it to flow freely, which means breaking down hierarchies and getting rid of rules that stifle new ideas". Palmer (1998) has pointed out that steep functional hierarchies can promote a culture of distrust and monitoring. However, it is very clear to many information and knowledge professionals in the field that there has not been adequate reflection on the questions this raises about the relationship between access to knowledge and access to power in the organisational context. The proposed dissolving of power bases has proved, in reality, difficult to implement without a fuller understanding of the dynamics between these two areas. It is now proposed to address this by looking a little further at knowledge and power, and at why and how knowledge is seen as a source of power in the organisational context.

2. The relationship between knowledge and power in organisations
Before discussing the relationship between power and knowledge, it is useful first of all to offer a definition of power. Heywood (2004) defines power simply as follows:

. . . a question of who gets their way, how often they get their way and over what issues they get their way (Heywood, 2004, p. 124).

However, power can only be exercised through the use of power resources. Power resources are described by Hales as "those things which bestow the means through which the behaviour of others may be influenced and modified" (Hales, 1993, p. 18). Hales goes on to provide an extremely useful typology of power resources, as seen in Table I. He points out that these resources can be available either through personal possession or through an organisational position allowing access to them, and thus the way in which this power can be utilised can also be either personal or positional. As can be seen from Table I, he outlines two types of power resource based on knowledge, namely technical knowledge and administrative knowledge.

Technical knowledge allows access to and control over technical information, or know-how. A typical example would be the IT professionals within an organisation, who are crucial to its effective working, but who are often regarded by other employees as the "techies". Non-IT professionals generally have little real understanding of what IT professionals actually do on a day-to-day basis. This is a source of power for those with the technical know-how, and the same dynamic applies across many other professional and technical areas of practice, ranging from scientists, lawyers and academics to engineers, mechanics and plumbers. Administrative knowledge, however, offers a different kind of power base, in that it allows different levels of access to, and control over, organisational information.

Table I. Power resources

Physical
  Personal: individual strength/possession of means of violence
  Positional: access to means of violence

Economic
  Personal: individual wealth/income
  Positional: access to/disposal of organisational resources

Knowledge – 1. Administrative
  Personal: individual expertise
  Positional: access to/control over organisational information

Knowledge – 2. Technical
  Personal: individual skill/expertise
  Positional: access to/control over technical information and technology

Normative
  Personal: individual beliefs, values, ideas, personal qualities
  Positional: access to/control over organisational values and ideas; "aura" of office

Source: Hales (1993)
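Read as a data structure, Hales's typology is simply a mapping from each power resource to the personal and positional forms through which it may be exercised. The sketch below is purely illustrative (the encoding and field names are not Hales's own); it records the rows of Table I and prints them as a checklist of the kind that might support a power-resource or knowledge audit.

```python
# Illustrative encoding of Table I (wording abridged from Hales, 1993);
# the structure and names are this sketch's own, not Hales's.
from dataclasses import dataclass

@dataclass(frozen=True)
class PowerResource:
    """One row of the typology: a resource and its personal and positional forms."""
    name: str
    personal: str
    positional: str

HALES_TYPOLOGY = [
    PowerResource("physical", "individual strength / means of violence",
                  "access to means of violence"),
    PowerResource("economic", "individual wealth/income",
                  "access to / disposal of organisational resources"),
    PowerResource("knowledge (administrative)", "individual expertise",
                  "access to / control over organisational information"),
    PowerResource("knowledge (technical)", "individual skill/expertise",
                  "access to / control over technical information and technology"),
    PowerResource("normative", "beliefs, values, ideas, personal qualities",
                  "control over organisational values and ideas; 'aura' of office"),
]

if __name__ == "__main__":
    # Print the typology as a simple audit checklist.
    for resource in HALES_TYPOLOGY:
        print(f"{resource.name}\n  personal:   {resource.personal}"
              f"\n  positional: {resource.positional}")
```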

Herbert Simon (1945) highlighted the fact that higher-ranking jobs in organisations entail greater degrees of decision making (the "director") with less focus on completing actual tasks (the "practitioner"). Higher-level employees therefore spend a good deal of their time managing and co-ordinating the tasks of those lower in the hierarchy. Part of the exercise of power in this regard is associated with access to organisational knowledge, and sharing this knowledge means giving away the basis for that power. In saying this, it is not advocated that all information and knowledge can or should be freely shared, but rather that which enables employees at different levels in the organisation to do their jobs more effectively. Hislop summarises this problem when he comments:

A willingness to share knowledge with others may be driven by a desire to contribute to organisational performance or to receive status and rewards from being seen to use personal knowledge, whereas a reluctance to share knowledge may be due to concerns that one is giving away what makes one powerful, or from a desire to prevent certain individuals/groups gaining access to one's knowledge (Hislop, 2005, p. 96).

Accepting, on the basis of the above discussion, that knowledge is a power resource, the next question is how to maximise the chances that knowledge is used in ways which further organisational goals rather than individual goals. It is proposed that a key aspect of promoting the appropriate leveraging of knowledge as a power resource is to ensure that all power resources within the organisational context are seen by as many employees as possible as being directed in a legitimate way towards mutually agreed ends; in other words, that the organisation develops a legitimate and mutually agreed authority base. The next section discusses how this may be achieved.

3. How the exercise of power or authority may be legitimised in the organisational context
Heywood offers a useful definition of the relationship between power and authority:

While power can be defined as the ability to influence the behaviour of others, authority can be understood as the right to do so (Heywood, 2004, p. 130).



Within the Western societal and organisational context, it is important that power is seen to be exercised in what are regarded as legitimate ways, but how this legitimacy can be maximised is a subject of some debate. It is important to mention here again the work of Weber and his attempts to address this difficult question in the early part of the 20th century. Max Weber famously outlined three types of authority: traditional authority, charismatic authority, and rational-legal authority. Traditional authority is based upon respect for long-established customs and traditions, and does not apply in many of today's organisations in the Western world. Charismatic authority depends upon the power of an individual's personality and his or her charismatic or leadership qualities; leaders such as Bill Clinton, the former US president, could be put in this category. Rational-legal authority operates through formal and clearly defined rules, and has as its basis a respect for the rule of law. This ensures that those who exercise power do so within a framework of office that has clearly delineated rules. In each of these cases, Weber puts forward the view that authority is legitimate if it is regarded as such by those subject to it (Morrison, 2005).

Although rational-legal authority could be regarded as the norm in many large organisations, the legitimacy it creates is based on written rules and procedures, usually sanctioned and implemented by those in positions of power. Often these rules apply more stringently to those who are lower in the hierarchy, while those at the top of the hierarchy are allowed greater freedom to act with autonomy. Organisations like this do not necessarily work on strongly democratic principles: power can be wielded largely without reference to workers at lower levels in the organisational hierarchy. Exercise of authority in this way will often result in a low-trust environment and, as a consequence, inadequate use of knowledge in furthering the attainment of organisational goals.

A different basis of legitimate authority has been proposed for the 21st century, one that will perhaps better meet the needs of the knowledge economy. A very pertinent study, The Legitimation of Power (1991) by David Beetham, has attempted to develop this alternative concept of legitimacy around the exercise of power. In Beetham's view, to define legitimacy, as Weber does, as nothing more than a belief in legitimacy is to ignore some key issues: simply because people believe in the legitimacy of power does not mean that this power was acquired or is exercised in a legitimate fashion. He therefore proposed that:

For power to be fully legitimate, then, three conditions are required: its conformity to established rules; the justifiability of the rules by reference to shared beliefs [and] the express consent of the subordinate to the particular relations of power (Beetham, 1991, p. 19).

It is proposed here that the exercise of legitimate authority impacts strongly on the development of trust in the workplace. This will be looked at in more detail in Section 5 below.

4. The relationship between trust in the organisation and effective knowledge flows
At this point, it is important to outline the relationship between trust and the creation of knowledge-enabled enterprises. In truth, knowledge management practitioners have long recognised the development of trust in the organisational context as being absolutely pivotal to the successful development of a knowledge sharing culture.

The presence of trust is regarded as a necessary condition in the facilitation of co-operative work practices and the effective use of resources. Dodgson and Rothwell (1994) argue that the basis for any effective collaborative work practice is the development of high-trust relationships between the related parties, and that only in this way can the exchange of knowledge be truly effective. A top manager in a case study of Buckman Laboratories (Pan and Scarbrough, 1999, p. 8), a company often cited as a success story in its implementation of a knowledge management strategy, has highlighted this point further:

This is the most difficult aspect of knowledge sharing to achieve. If you can't do it, you can't succeed. We grew up learning to hoard knowledge to achieve power. Buckman created a culture of trust encouraging active knowledge sharing across time and space among all of the company's employees across the world. The most valuable employee is one who becomes a source of knowledge and actively shares the knowledge with other people.

Therefore, creating trust in the organisational context is a key aspect of effective knowledge management practice. It is proposed here that if authority in an organisation is exercised in a legitimate fashion, then trust will follow more naturally. If trust exists, then power resources will be used to further organisational goals rather than individual goals, and since knowledge is a power resource, knowledge flows will thus be greatly enhanced and improved. The relationship between these areas is made more explicit in the following section.

5. The relationship between the exercise of legitimate authority and the development of trust in the organisational context
How does the exercise of legitimate authority impact on the development of trust? Trust is a term which is much used, but often not so clearly understood. Discussions around trust often move backwards and forwards between attempting to define what trust is and discussing the impact of the existence of trust in a social or organisational context. Perhaps the big question to address within the organisational context is not just one of creating trust, but also of creating trustworthiness. Hardin makes this point well when he suggests that:

Creating institutions that help secure trustworthiness thus helps to support or induce trust (Hardin, 2002, p. 30).

Creating organisations in which these states of trust and trustworthiness exist, it is proposed here, depends on creating organisations in which authority is exercised in a transparent and legitimate manner. To illustrate this point further, it is intended to relate the development of trust to the three conditions outlined by Beetham (1991) as essential to the creation of a legitimate authority base in organisations, namely:

(1) Conformity to established rules.
(2) The justifiability of the rules by reference to shared beliefs.
(3) Express consent of the subordinate to the particular relations of power.

Condition 1: conformity to established rules
How does conformity to established rules impact on the development of trust in the organisational context? Dasgupta (1988) argues that trust is a commodity.


In order for this commodity to be effectively exchanged, it must be situated within an organisation in which rules are clearly articulated, and in which enforcement and punishments (i.e. rewards and sanctions) are also clearly outlined and understood by organisational members. This important point can also be related back to Thomas Hobbes's (1968) theories of the social contract, in which he suggests that human beings agree to form societies founded upon covenants and agreements. In agreeing to this so-called "social contract", individuals agree to yield to a higher authority that enforces the contract. This authority ensures that people will do what they have undertaken to do, and in doing so protects those who agree to abide by these established rules. If people do not conform to these established rules, then they may engage in excessive individualistic or competitive behaviour which undermines the benefits, which organisations seek to harness, of people working together towards common goals. Thus, people may need the threat of sanctions as well as the attraction of reward systems to encourage them not to break their promise of abiding by the rules. This social contract could be said to apply equally within the microcosm of an organisation as it does within the wider societal context.

Heywood (2004) proposes that self-interest, and the recognition that private interests overlap with others, can make possible the development of an agreed set of rules within society, resulting in a natural balance amongst competing individuals. This is described by Hayek, cited in Heywood, as the "spontaneous order of economic life" (Heywood, 2004, p. 44). Hardin (2002) comments that "much of our ability to trust others . . . depends on having institutions in place that block especially destructive implications of untrustworthiness" (p. 109).

Condition 2: the justifiability of the rules by reference to shared beliefs
Beetham (1991) secondly proposes that the perceived legitimacy of governance must be grounded in shared belief systems. How do shared belief systems impact on the development of trust in the working relationship? The knowledge management literature has, for a number of years, been trying to address how to manage the creation of shared belief systems and exert what Etzioni (1964) referred to as normative control. Kunda (1992) describes this normative control as the "attempt to elicit and direct the required efforts of members by controlling the underlying experience, thoughts and feelings that guide their actions" (Kunda, 1992, p. 11). The preoccupation with the development of normative control relates to the desire to gain employee commitment to organisational goals at a deeper level, and thereby to encourage the development of trusting and co-operative working relationships.

Do shared belief systems therefore impact on the development of trust, and thereby effective knowledge sharing? In answering this, it will be useful to outline some of the theories as to why people trust each other. Hardin suggests that:

Trust is relational . . . That is, I trust you because your interest encapsulates mine, which is to say that you have an interest in fulfilling my trust. Any expectations that I have are grounded in an understanding of your interests, especially with respect to me (Hardin, 2002, p. 3).

Simmel argues that the dominant social relation in modern societies is exchange. Exchange teaches reciprocity and is "one of the functions that creates an inner bond between people – a society in place of a mere collection of individuals" (Simmel, 1971, p. 175). Applying these views to the organisational context, Tyler suggests that:

Trusting those that we think have well intentioned motivations extends our willingness to co-operate with others . . . the needs of organisations encourages such trust based co-operation (Tyler, 2003, p. 560).

Coleman (1990) argues that trust is based largely on self-interest, and that explicit communication about joint interests, accompanied by joint sanctioning, therefore consciously creates solidarity in the organisational context. This is also argued by Lewis and Weigert (1985), who suggest that we trust when we believe that trusting will enhance or contribute to our interests. Therefore the embedding in organisations of shared belief systems or behavioural norms will impact strongly on the development of trust, in that organisational members will regard themselves as part of a network of mutually reciprocal relationships in which all parties will gain if all work together towards the common objectives of the organisation.

Condition 3: express consent of the subordinate to the particular relations of power
When an employee agrees to work for an organisation in return for a particular pay and benefits package, that individual enters into an economic relationship with the organisation based upon that agreed contract. If an employee freely and knowingly enters this employment relationship, then they undertake to take their place within the organisation, within its established power hierarchies. A basic level of trust is built around the acceptance by both sides of this contract as to the nature of the employment relationship and the contractual obligations agreed within it.

Outside of these contractual obligations, however, there is another, more intangible contract in existence in the employment relationship: the psychological contract. Rousseau and Anton (1991) define the psychological contract as the reciprocal set of expectations that individual employees, and the organisation in which they work, have of each other. If the proposed psychological contract between the employee and the employer is made as explicit and honest as possible from the outset (and, most importantly, at the pre-contract signing stage), then it creates greater levels of trust. Thus the written employment contract and a transparent psychological contract provide a framework which allows for the emergence of trust, as there is clarity of expectation around respective behaviour on both sides and, furthermore, the expectation of mutually reciprocal behaviour amongst individuals within the organisation as a whole.

Hardin comments that there are inherent problems in trusting another who has great power over one's prospects, and that one of the most important achievements of modern democratic societies is the regulation of various kinds of organisational relations to make them less subject to the caprices of power (Hardin, 2002, p. 101). People in ostensibly powerful positions do need co-operation from those under them if they are to succeed. When this is true, those with less power may well be able to trust those with greater power, if both parties are to benefit from a mutually beneficial or reciprocal relationship.


Organisations which have unclear contracts with employees will have higher cost bases in terms of employee turnover, with attendant loss of organisational knowledge, alongside the more explicit costs of recruiting and training new staff. Equally, if the employee fails to respect and work within the existing power hierarchies, then that employee has also broken both the written contract and the psychological contract and is failing to fulfil the legitimate expectations of others. When either of these situations occurs, either party may choose to terminate the employment contract. Systemic distrust between management and staff is highly destructive in its impact, both for the individuals within the organisation and for the organisation's ability to meet its objectives over the long term. Fukuyama makes this point when he comments:

The greatest economic efficiency was not necessarily achieved by rational self interested individuals but rather by groups of individuals who, because of a pre existing moral community, are able to work together effectively (Fukuyama, 1995, p. 21).

A clear employment contract, at both a tangible and an intangible level, agreed to by both employee and employer, is thus a prerequisite for the creation of an environment of trust.

Trust and legitimate authority on a societal level?
On a broader societal level, the same connection between the legitimate exercise of power and the consequent development of trust as prerequisites for a functioning global knowledge economy has been made by Michalski et al. (2000) in a paper presented at an OECD-sponsored conference entitled 21st Century Governance in the Global Knowledge Economy and Society. In it, the authors comment:

Many places lack either the rules and/or mechanisms for sufficiently impartial enforcement needed to make contracts trustworthy, guarantees reliable, politicians or managers accountable and the institutional fabric of society legitimate. In such places, from market exchange to sharing knowledge can be burdensome to the point of economic and social breakdown. Under these circumstances the policy agenda takes as its primary aim the restoration or establishment of confidence in the laws and institutions that structure daily life. For societies where rule of law and institutional legitimacy have reigned for considerable periods of time, the challenge is how to deepen or renew confidence in the face of higher aspirations (Michalski et al., 2000, p. 480).

6. The role of the knowledge manager in creating the necessary conditions for knowledge flows
Following on from the discussion in the paper so far, it becomes clear that developing trust hinges on the legitimate use of power, and that effective knowledge management in turn hinges on the existence of trust in the employment relationship. What is the role of the knowledge manager in contributing to the development of trust and thereby effective knowledge conductivity within the organisation? The knowledge manager has a key role to play in facilitating the creation of the necessary conditions for organisational information and knowledge flows and the attendant responsiveness and adaptation to changing customer or stakeholder needs.

This will entail working with people at all levels in the organisation to create both appropriate systems and appropriate practice around the creation of legitimate authority as outlined above, and consequently an environment of trust. At present, in many organisations, there is a managerial void in this regard, and into this vacuum rush distrust and misinformation, overhung by dysfunctional organisational dynamics.

This is further borne out by a recent Economist Intelligence Unit (2006) report entitled Foresight 2020, which surveyed business leaders and senior executives worldwide. The survey found that respondents believe that knowledge workers will be organisations' most valuable source of competitive advantage up to the year 2020. The report states that effective customer relationships, underpinned by organisational structures which enhance communication and thereby productivity, will offer key leverage in utilising knowledge workers. Collaborative relationships amongst employees, and with customers and suppliers, are the goal of these improved organisational structures and cultures. The report also forecasts a shift in spending towards technologies that enable ongoing communication and collaboration.

Therefore, knowledge management practitioners, working in tandem with senior management, HR practitioners and IT professionals, must seek to ensure that:

- clarity exists around the psychological contract and organisational norms, and that employees who commit to work for the organisation do so on the basis of an understanding of these behavioural norms;
- open and clear lines of communication are in place around organisational objectives, and how they may be achieved, at strategic, functional and individual levels;
- the organisation and communication of information within the organisation is achieved in an optimum way, through professional information management practice;
- there is ongoing development of shared working practices directed towards continuous improvement in practice, and timely responsiveness to changing customer needs;
- there is an embedding of a knowledge sharing culture through appropriate reward and sanction systems; and
- systems exist to ensure that knowledge management activities are directly and clearly linked to organisational goals and objectives, and are effectively coordinated to achieve this.

Summary and conclusion
In this paper, a number of different perspectives on the development of knowledge sharing behaviours within organisations have been explored:

(1) A brief discussion of the emergence of the knowledge society and the impact of organisational structure on knowledge flows.
(2) A discussion of the key literature around the relationship between knowledge and power.
(3) Discussions around the development of a legitimate authority base in organisations.


(4) A discussion on the nature of trust and its relationship with knowledge management practice. (5) How the development of trust hinges on the exercise of legitimate authority. (6) The role that the knowledge manager plays in facilitating the creation of a knowledge sharing culture has been outlined. It is important to recognise that the effective management of knowledge is primarily predicated on the effective management of people. It is proposed overall that the relationship between power and knowledge is part of a wider power relationship that exists within all organisations. Therefore the successful management of this power relationship overall is predicated on effective and consensual management, based on a legitimate exercise of authority which can then result in the development of an overall environment of trust. In this context, knowledge flows are a living breathing embodiment of the levels of trust which exist within any organisation and as such may be constantly reinforced and calibrated through effective communication and clarity of expectations, backed up by reward systems as well as sanction systems, and, of course, effective underlying information management practice. The result of this is an environment which is truly driven by the adequate creation, use, and leveraging of the necessary information and knowledge to maintain strength and continuity in the face of an increasingly competitive and necessarily client focused operating environment. The aim in this paper has been to shed light on the key issues as to how to manage organisational dynamics around knowledge and power in such a way as to ensure that employees at all levels in the organisation are enabled to work in cooperative and consensual ways to ensure the success of the knowledge driven enterprise within an increasingly knowledge driven global economy. An initial blueprint of the role, which the professional knowledge manager can play in creating and facilitating such an enterprise, has also been outlined. It is undoubtedly true that success in the future for organisations of all kinds will depend on managing information and knowledge successfully, and it is suggested here that the dangers of ignoring this management imperative are grave indeed, both for the individual organisation, and for the competitiveness and effectiveness of economies as a whole. References Beetham, D. (1991), The Legitimation of Power, Macmillan Press, London. Bell, D. (1973), The Coming of Post-Industrial Society, Penguin, Harmondsworth. Coleman, J. (1990), Foundations of Social Theory, Harvard University Press, Cambridge, MA. Currie, G. and Kerrin, M. (2004), “The limits of a technological fix to knowledge management epistemological, political and cultural issues in the case of intranet implementation”, Management Learning, Vol. 35 No. 1, pp. 9-21. Dasgupta, P. (Ed.) (1988), “Trust as a commodity”, Trust: Making and Breaking Cooperative Relations, Basil Blackwell, Oxford, pp. 49-72. Davenport, T. and Prusak, L. (1988), Working Knowledge: How Organisations Manage What They Know, Harvard Business School Press, Boston, MA. Dodgson, M. and Rothwell, R. (1994), The Handbook of Industrial Innovation, Edward Elgar Publishing, Aldershot.

Economist Intelligence Unit (2006), Foresight 2020: Economic, Industry and Corporate Trends, Economist Intelligence Unit, London, available at: www.eiu.com/foresight2020 (accessed 24 November 2006). Etzioni, A. (1964), Modern Organisations, Prentice-Hall, Englewood Cliffs, NJ. Fukuyama, F. (1995), Trust: The Social Virtues and the Creation of Prosperity, Simon & Schuster, New York, NY. Garvey, B. and Williamson, B. (2002), Beyond Knowledge Management: Dialogue, Creativity and the Corporate Curriculum, Financial Times/Prentice-Hall, Harlow. Hales, C. (1993), Managing through Organisations, Thomson Learning, Surrey. Hardin, R. (2002), Trust and Trustworthiness, Russell Sage Foundation, New York, NY. Hayes, N. and Walsham, G. (2000), “Safe enclave, political enclaves and knowledge working”, in Prichard, C., Hull, R., Chumer, M. and Willmott, H. (Eds), Managing Knowledge: Critical Investigations of Work and Learning, Macmillan, London, pp. 69-87. Heywood, A. (2004), Political Theory: An Introduction, Palgrave, Basingstoke. Hislop, D. (2002), “Linking human resource management and knowledge management: a review and research agenda’”, Employee Relations, Vol. 25 No. 2, pp. 182-202. Hislop, D. (2005), Knowledge Management in Organizations: A Critical Introduction, Oxford University Press, Oxford. Hobbes, T. (1968), in Macpherson, C.B. (Ed.), Leviathan, Penguin, Harmondsworth. Kunda, G. (1992), Engineering Culture: Control and Commitment in a High-Tech Corporation, Temple University Press, Philadelphia, PA. Lewis, J.D. and Weigert, A. (1985), “Trust as a social reality”, Social Forces, Vol. 63 No. 1, pp. 967-85. Lucier, C. and Torsiliera, J. (1997), “Why knowledge programs fail”, Strategy and Business, No. 4, pp. 14-28. McKinlay, A. (2002), “The limits of knowledge management”, New Technology, Work and Environment, Vol. 17 No. 2, pp. 76-88. Michalski, W., Miller, R. and Stevens, B. (2000), “Governance in the 21st century: power in the global economy and society”, The Journal of Future Studies, Strategic Thinking and Policy, Vol. 2 No. 5, pp. 471-82. Morrison, K. (2005), Marx, Durkheim, Weber: Formations of Modern Social Thought, Sage Publications, London. Newell, S., Scarbrough, H. and Swan, J. (2001), “From global knowledge to internal electronic fences: contradictory outcomes of intranet development”, British Journal of Management, Vol. 12 No. 2, pp. 97-111. Palmer, J. (1998), “The human organisation”, Journal of Knowledge Management, Vol. 1 No. 4, pp. 294-307. Pan, S. and Scarbrough, H. (1999), “Knowledge management in practice: an exploratory case study”, Technology Analysis and Strategic Management, Vol. 11 No. 3, pp. 359-74. Polanyi, M. (1966), The Tacit Dimension, Doubleday, New York, NY. Rousseau, D. and Anton, R. (1991), “Fairness and implied contract obligations in job terminations: the role of contributions, promises and performance”, Journal of Organisational Behaviour, Vol. 12 No. 4, pp. 287-99. Saint-Onge, H. and Armstrong, C. (2004), The Conductive Organisation: Building beyond Sustainability, Elsevier Butterworth-Heinemann, Oxford.


Simmel, G. (1971), Georg Simmel on Individuality and Social Form, University of Chicago Press, Chicago, IL. Simon, H. (1945), Administrative Behavior, Macmillan, New York, NY. Stewart, T.A. (1991), “Brainpower”, Fortune, Vol. 123 No. 11, pp. 44-60. Storey, J. and Barnett, E. (2000), “Knowledge management initiatives: learning from failure”, Journal of Knowledge Management, Vol. 4 No. 2, pp. 145-6. Tyler, T. (2003), “Trust within organisations”, Personnel Review, Vol. 32 No. 5, pp. 556-68. Weber, M. (1978), in Roth, G. and Wittich, C. (Eds), Economy and Society, Vols 1/2, University of California Press, Berkeley, CA. Further reading Cohen, M. (2001), Political Philosophy from Plato to Mao, Pluto Press, London. Corresponding author Catherine Kelly can be contacted at: [email protected]


The I in information architecture: the challenge of content management Sue Batley School of Information Management, London Metropolitan University, London, UK

Received 24 March 2006. Revised 1 December 2006. Accepted 13 December 2006.

Abstract Purpose – The purpose of this paper is to provide a review of content management in the context of information architecture. Design/methodology/approach – The method adopted is a review of definitions of information architecture and an analysis of the importance of content and its management within information architecture. Findings – Concludes that reality will not necessarily match the vision of organisations investing in information architecture. Originality/value – The paper considers practical issues around content and records management. Keywords Information management, Content management, Records management Paper type Conceptual paper

Introduction A lot of literature is being generated around information architecture (IA) at present. As is the case in many emerging disciplines, examination of the theory and practice of IA is found in the literature within diverse fields of study. In particular, IA is examined within the context of web usability and information management. Web usability literature on IA tends to focus on interface design, while information management literature tends to focus on taxonomy creation. This paper places concepts of IA within a broader framework, by acknowledging the importance of design and organisation, but focusing on managing content. Information architecture: definitions and scope A fundamental problem for anyone wishing to engage with concepts of IA is the lack of a clear definition of its scope. The Information Architecture Institute (2006) defines information architecture as: . the structural design of shared information environments; . the art and science of organizing and labeling websites, intranets, online communities, and software to support usability and findability; and . an emerging community of practice focused on bringing principles of design and architecture to the digital landscape. This largely mirrors Rosenfeld and Morville (2002, p. 4): . the combination of organization, labelling and navigation schemes within an information system;

the architectural design of an information space to facilitate task completion and intuitive access to content; the art and science of structuring and classifying websites and intranets to help people find and manage information; and an emerging discipline and community of practice focused on bringing principles of design and architecture to the digital landscape.

The problem, or perhaps benefit, for anyone involved in the field is that there is no single, straightforward definition of information architecture. Part of the reason certainly lies in the fact that the discipline is, as the definitions above acknowledge, still emerging. Looking closely at the definitions offered, it is clear that there are common elements, particularly the concepts of organisation and design. Some authors would even argue that IA is synonymous with taxonomy (see, for example, Wikipedia’s (2006) IA entry); this does not seem helpful, as the taxonomy is only one part of the information environment. Taking inspiration from the built environment, Worple (2000, p. 32) points out that: Architecture is concerned with more than just the frame, skin and external detailing of a building.

Content was mentioned only in passing in the definitions of IA cited above, but it is not just the structure and the external appearance that matter in IA: the information itself is of central concern. No matter how well designed and aesthetically pleasing the IA, it is only going to be of benefit if it includes all the documentation and information its users need, and if that information and documentation can be easily retrieved. Architecture is not only about creating robust structures; it is also about functionality. Much of the literature on information architecture examines content but restricts its coverage to high-level content, dealing primarily with the information audit and with the organisation and design of systems, which takes us back to IA as synonymous with taxonomy. This is a top-down approach to system content – representing an ideal. This paper considers what actually exists in the system and the type of documentation that is likely to be created, by discussing how to review and manage the individual information items present in the organisation. The reality will not necessarily match the vision.

Records management
The field of records management is of central concern here, and a good place to begin is to define the term “record”. The Association of Records Managers and Administrators (ARMA, 1989, p. 16) define a record as: Recorded information, regardless of medium or characteristics, made or received by an organisation that is useful in the operation of the organisation.

This is obviously a very broad definition, but the implication that the form or medium of the record is secondary to the information it carries is important. This is emphasised on the ARMA (2006) website: It’s estimated that more than 90% of the records being created today are electronic. Coupled with the overwhelming growth of electronic messages – most notably e-mail and instant

messaging – the management of electronic records has become a critical business issue. How that information is managed has significant business, legal, and technology ramifications. Ultimately, it doesn’t matter what medium is used to create, deliver, or store information when determining if content is a record and should be managed accordingly.

The clear implication here is that records management is selective. A lot of documentation, a lot of content, will be produced and acquired by an organisation, but not everything will merit the status of a record. It is only the records that need to be stored and made accessible to system users. Schwartz and Hernon (1993) emphasise the same point but take it a stage further when they say that records can be characterised by form, status and function. They point out that form is an important consideration in storage and retrieval, but that it is the record’s status and function that determine its value. Status refers to the activity and permanency of the record, function refers to the role it plays within the organisation. In terms of status some records will be of transient interest, while others will need to be accessible for longer periods. In terms of function some records will be of general interest and importance and will need to be widely available, while others may be of interest only to limited numbers of specialist users within the organisation. Content analysis and mapping What strategies can be employed to determine what is a record (documentation that needs to be made accessible on an information system), and how can the status and function of those records be determined? This introduces the practice of content analysis. It was stated earlier that much of the IA literature looks at high level content, a top-down approach, in focusing on taxonomies and information audit. Rosenfeld and Morville (2002, p. 221) say that: Content analysis is a defining component of the bottom-up approach to architecture, involving careful review of the documents and objects that actually exist. What’s in the site may not match the visions articulated by the strategy team and the opinion leaders. You’ll need to identify and address these gaps between top-down vision and bottom-up reality.

A lot of the documentation that exists in any system might not possess the status and function that will define it as a record. Part of developing the information architecture will involve examining the documentation, deciding what should and should not be present in the system, and drawing up guidelines for authoring and content management. There has to be a clear strategy to formalise content gathering, analysis and mapping. There is no quick and easy way to analyse content. No existing formula or software package is going to take into account the needs of a particular organisation. If the content analysis does not address specific organisational and user needs, then its value is questionable. A sensible first step is to attempt to identify a representative sample of documents that will need to be present in the system. Rosenfeld and Morville (2002) suggest a “Noah’s Ark” approach – begin the process by attempting to capture a couple of each type of animal. They divide species of document according to: (1) Format. Paper and electronic text, audio-visual materials, software applications. (2) Document type. News articles, technical reports, presentations, brochures, etc.


(3) Source. Documents from the various departments within the organisation: customer support, marketing, research and development, human resources, and so on. (4) Subject. An existing taxonomy, thesaurus or classification scheme can be used to determine the subject range. (5) Existing architecture. Assuming some kind of information system is already in place, a sample of the documentation that already exists within the system can be taken. .
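For illustration only, the brief Python sketch below shows one way of recording a representative sample against these five dimensions and of grouping it by any one of them, so that gaps in coverage become visible. The class, the field names and the sample titles are hypothetical and are not drawn from Rosenfeld and Morville.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SampleDocument:
    """One item in the representative ("Noah's Ark") content sample."""
    title: str
    doc_format: str   # paper, electronic text, audio-visual, software application
    doc_type: str     # news article, technical report, presentation, brochure, etc.
    source: str       # originating department
    subject: str      # term taken from an existing taxonomy or thesaurus

def coverage_by(sample, dimension):
    """Group the sample by one dimension so that gaps in coverage stand out."""
    groups = defaultdict(list)
    for doc in sample:
        groups[getattr(doc, dimension)].append(doc.title)
    return dict(groups)

sample = [
    SampleDocument("Staff newsletter, March", "electronic text", "news article",
                   "human resources", "internal communication"),
    SampleDocument("Intranet usability review", "electronic text", "technical report",
                   "research and development", "usability"),
]

print(coverage_by(sample, "source"))
```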

When the representative documentation has been gathered, the analysis of content can begin. When individual documents are being analysed there are four questions that should be kept in mind: (1) What is this? (2) How can I describe this? (3) What distinguishes this from other documents? (4) How can I make this document retrievable? The “what is this?” question is the most general. The format is not significant if applications are provided to access the content. The creator of the document and the type of document (annual report, technical report, etc.) are also of limited importance. It is the information content of the document that determines its status and function as a record, so it can be prioritised in the analysis. The information content is addressed in the final three questions: “how can I describe this?”, “what distinguishes this from other documents” and “how can I make this document retrievable?” Content mapping partly addresses these three questions and will help in determining status and function. Content mapping involves identifying what Rosenfeld and Morville (2002, p. 289) term “information chunks” within documents. They define an information chunk as: “The most finely grained portion of content that merits or requires individual treatment.” Content mapping involves asking additional questions: (1) Can this document be segmented into multiple chunks that users might want to access separately? (2) What is the smallest section of document content that needs to be individually indexed? (3) Will the content of this document need to be repurposed across multiple documents or as part of multiple processes? Finding answers to these questions involves identifying information chunks. First of all, can chunks be identified? Second, how small are the chunks? Third, will individual chunks be re-used across different documents or for different purposes – this last question obviously addresses the function or role of the content. An information chunk is essentially a record. Each chunk will have a status and a function (and a form) that will merit its definition as a discrete record. Clearly then, it is not the document itself that merits the status of a record, rather it is the chunks of information content each document contains that are the records that need to be managed.
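Treating chunks, rather than whole documents, as the managed records is easier to see in concrete form. The minimal sketch below is a hypothetical data structure, not an implementation described in the literature cited here: each chunk carries a status and a function, together with a note of the documents that re-use it, so that editing one chunk immediately identifies everything that needs re-publishing.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InformationChunk:
    """The smallest portion of content that merits individual treatment."""
    chunk_id: str
    text: str
    status: str                    # e.g. "current", "transient", "archival"
    function: str                  # the role the content plays in the organisation
    used_in: List[str] = field(default_factory=list)  # documents that re-use the chunk

# A module-aims chunk re-used in the module specification and the prospectus.
aims = InformationChunk(
    chunk_id="im101-aims",
    text="This module introduces the principles of information architecture.",
    status="current",
    function="course documentation",
    used_in=["module-spec-im101", "prospectus-2007"],
)

def documents_to_republish(chunk: InformationChunk) -> List[str]:
    """Editing one chunk identifies every document that now needs re-publishing."""
    return list(chunk.used_in)

print(documents_to_republish(aims))
```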

All information systems contain a lot of records or chunks that are re-used in different documents for different purposes. Identifying and mapping the individual records or chunks is time-consuming initially, but will avoid duplication, storing the same information several times, and will make updating information on the system much quicker and more efficient. To make all this economy possible, the information architecture has to identify and map the relationships between records. Rosenfeld and Morville (2002) identify four different types of relationship that can exist between chunks: (1) Sequencing. Chunks can be placed together in a sequence. It is often possible to identify a sequential relationship, particularly when dealing with processes. In an academic environment, for example, there are a series of steps involved in developing and delivering modules and short courses. It is useful, in that case, to present information about the process sequentially. (2) Co-location. Chunks can be placed in the same document. Again using the university context as an example, module aims and learning outcomes will be reflected in the syllabus, so even though module aims, learning outcomes and syllabus may merit the status of information chunks in their own right, it is sensible to present all of those chunks on the same web-page. (3) Linking. Chunks can link to other chunks. Hypertext links can be inserted to link to other chunks or records within other documents. In a module specification, there can be a link from the name of the module convenor to individual staff web-pages. The content of a reading list can link to the library website for information about whether there are loan copies of books available. (4) Shared meaning. Even if chunks are not explicitly linked, they can share semantic characteristics that ensure they are co-located in an ad hoc manner. If metadata is created for a module specification and includes, for example, subject tags, or if the text of the module specification is automatically indexed, then anyone searching under the metadata tag or typing in a subject name or phrase in a search engine, will retrieve the module specification, plus all the other records that include the same metadata or the same text. Metadata So far the focus has been upon looking at documentation in a broad sense. Having introduced the term in the previous paragraph, it is useful at this point to consider content at the individual document level by looking at metadata. Metadata essentially helps to describe and manage documents, regardless of format, although it is usually associated with electronic resources and it is usually associated with resource discovery or information retrieval. A very simple, widely used, definition of metadata is that it means data about data. A rather more detailed definition is provided by Vellucci (1998):Data that describe attributes of a resource, characterise its relationships, support its discovery and effective use, and exist in an electronic environment.Haynes (2004) says that metadata has five purposes: (1) resource description; (2) information retrieval; (3) management of information;


(4) rights management, ownership and authenticity; and (5) interoperability and e-commerce. Resource description underpins the other four functions of metadata. At this level the focus is essentially upon cataloguing. Adequate description of resources means that the associated metadata can be used for retrieval, records management, and so on. Every resource has identifiable elements that can be used to describe it and differentiate it from other resources. These would include its title, the date it was produced, its creator, and its format, for example. Consistent and detailed resource description hugely increases the efficiency of information systems for retrieval and management of information. It is clear that detailed resource description, using metadata, increases the efficiency of retrieval. Users can search for particular attributes like a particular author, date, format, and subject – assuming subject descriptors have been assigned to a resource. Essentially, metadata can increase the precision of searching by allowing for sophisticated field searching and subject searching. Subject metadata can also be used to assign resources to a category in taxonomy – increasing consistency of categorisation and so assisting browsing. In terms of managing information, some metadata elements are specifically designed for records management. The e-Government metadata standard has an element called “preservation”. This identifies resources that need to be archived and stored in the long term. In the traditional library environment, all information professionals are familiar with the life-cycle of materials – from acquisition, through circulation, to disposal. Metadata created at the ordering stage is used to manage and track all the other processes. Rights management, ownership and authenticity are extremely important. Information is a commodity with a real economic value in most organisations. Haynes (2004) points out that one of the drivers for the development of metadata standards in the publishing industry has been the need to manage intellectual property rights. Metadata can include information about ownership of intellectual property, and information about provenance that can determine the authenticity of a resource and increase its value. This is particularly important in the digital environment, where information can be easily accessed and re-used. Finally, interoperability and e-commerce is concerned with the exchange of information, sharing of resources and their commercial exploitation. It is useful to provide a formal definition of interoperability (Shirky, 2001): Two systems are interoperable if a user of one system can access even some resources or functions of other systems.

This is essentially what the UK e-Government initiative is devised to achieve. Information can flow seamlessly across government departments and can be accessed by the public. Use of the Dublin Core Metadata Element Set ensures that government websites share a common framework. MARC21 is another example – it generates metadata that allows for record sharing between libraries on a global scale. E-commerce depends on the ability to exchange data from one system to another and process it. Again, this is facilitated by a shared framework for managing e-resources.
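By way of illustration, the following sketch shows what a descriptive record built from the fifteen Dublin Core element names might look like, together with a simple fielded search of the kind discussed above. The values, the URL and the additional review_date field are invented for this example; review_date merely gestures towards records-management elements such as the e-GMS preservation element rather than reproducing any standard.

```python
# An illustrative metadata record using the fifteen Dublin Core element names.
# The values, the URL and the extra review_date field are invented for this sketch.
module_spec_metadata = {
    "title": "IM101 module specification",
    "creator": "School of Information Management",
    "subject": ["information architecture", "content management"],
    "description": "Aims, learning outcomes and syllabus for module IM101.",
    "publisher": "London Metropolitan University",
    "contributor": [],
    "date": "2007-02-01",
    "type": "Text",
    "format": "text/html",
    "identifier": "http://intranet.example.ac.uk/modules/im101",  # hypothetical URL
    "source": "",
    "language": "en-GB",
    "relation": ["prospectus-2007"],
    "coverage": "2007/2008 academic session",
    "rights": "Internal use only",
    "review_date": "2007-09-01",  # hypothetical records-management extension
}

def field_matches(record, element, value):
    """A simple fielded search: the kind of precision gain described above."""
    stored = record.get(element, "")
    if isinstance(stored, list):
        return value in stored
    return value in str(stored)

print(field_matches(module_spec_metadata, "subject", "content management"))  # True
```

Because every record shares the same element names, the same search works across resources from different departments, which is the interoperability point made above.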

Content management
Content analysis and content mapping are part of the information architecture development process. They will influence basic functionality, i.e. software applications, and taxonomy creation. Continuing usability of the system presupposes strategies for managing the content. As stated previously, no matter how well designed the system, no matter how elegant and robust the architecture is, the system is only going to be of benefit if it includes all the documentation and information its users need. That presupposes that the information and documentation is effectively managed. Content management systems are often treated as synonymous with information architecture in the literature. An information system stores, organises and provides access to content. The information system has an architecture. These concepts are, therefore, inter-related, and an analysis of content management can be usefully developed within the context of general aspects of information systems and information architecture. Boiko (2001, p. 8) defines content management as: The process behind matching what “you” have with what “they” want. “You” are an organisation with information and functionality of value. “They” are a set of definable audiences who want that value. Content management is an overall process for collecting, managing and publishing content to any outlet.

The purpose of content management is to control the information lifecycle: through creation, approval, updating and weeding. This includes managing documents and managing records – those chunks of information that may be re-used or re-purposed across the system. There are commercial content management systems (CMS) that can be utilised, but decisions have to be made about which CMS will best suit an organisation’s needs, and the CMS has to be managed. Here the focus is upon strategy and process, rather than individual applications. McKeever (2003) outlines a four-layer hierarchy to provide a context for content management. This neatly demonstrates that content management underpins the whole system architecture. At the top of McKeever’s hierarchy is the audience layer: the users of the system, who may be staff, and/or customers of the organisation, for example. One level below that is the outlet layer: the interface to the system. Below that is the activity layer: where contents are created and deployed, deployment being controlled by content analysis and metadata tagging. At the lowest level is the content itself: the records and documentation that are created, analysed, and deployed. Here is the entire range of information architecture. Information audit and needs analysis examines the audience layer: identifying different types of user and their associated needs. Principles of user-centred design are addressed at the outlet layer, which will support a range of tasks and individual resources. At the activity layer, documents and records are created, categorised, and tagged so that they can be represented at the outlet layer and retrieved by the audience. At the content layer is the information itself. Content management incorporates everything from information through use and so does not restrict its sphere of interest to one level. Studying content management requires an examination of the whole system architecture. Before going on to consider content management strategy, it is useful to look at the process in more detail. Here the focus is on the activity level of McKeever’s model and the information lifecycle. Several writers including Boiko (2001) and Tredinnick (2005) map out the information lifecycle. Sequential models of the information lifecycle are


common, but the reality is rather more complex: phases and activities within the information lifecycle are iterative. Information is continually created, deployed, communicated and used. The first phase is information creation and collection. A lot of people within the organisation may be concerned with this. Staff throughout the organisation may be authors of documents or creators of content. Some people may also have responsibility for collecting and making available external documentation or sources of information. The second phase, which some organisations may ignore – depending on how closely content is managed, is approval. Some organisations will take a laissez-faire approach and allow content creators to decide what should and what should not be accessible. Other organisations will impose some form of centralised control to manage content. Approval may be the responsibility of departmental managers, or it may be the responsibility of an information manager or information management team. Certainly once a document has been created, it has to be approved quickly, so that it can move on to the next phase. The third phase is deployment or publishing of content. This involves making the content accessible: creating web pages, assigning documentation to a category in a taxonomy, providing metadata to assist retrieval and re-use of information across multiple web pages, and so on. Creators of documents may have responsibility for publishing their own content, or the process may be overseen by specialist staff: information managers, for example. The fourth phase is review of content. Once information is published, it is certainly not the end of the lifecycle. Content has to be continually reviewed to assess its status: currency, authority, value, and so on. The metadata assigned to a document can include information about review: when a document should be updated, when the information it contains will be out-of-date, when it should have been superseded by a new document. Including review information in the metadata should at least ensure that individual documents are checked for currency and accuracy every month, every three months, or whatever period seems to be appropriate. No organisation wants documentation present in their information systems after it has lost its value or relevancy. The fifth phase is archiving and deletion of content. The review phase will identify documentation that is no longer needed. Some of the information contained in the documents will have been of transient value and the document should therefore be deleted. Other information, while no longer needed, may still have some value to the organisation and should be archived. When organisations relied on paper documentation, archiving was a given. Documents were filed away in the records management department, in the library, in people’s filing cabinets. It is easy to delete digital information, and there has to be a policy and strategy whereby important documentation is archived for future reference. A content management strategy should address a number of issues that are central to designing, maintaining and managing the information architecture. The strategy should establish roles and responsibilities, improve communication and ensure co-ordination of all information-related activities. Essentially the strategy should set out a series of objectives for managing content across the organisation, the overall aim being to improve information and resource sharing. 
It is important that the strategy include all stakeholders: managers, specialists and users. One of the barriers to

effective information sharing in organisations is the organisational culture. People will only participate if they see personal benefits in doing so. It can be very difficult to balance the needs of users and authors with the goals of the organisation, but an inclusive content management strategy can help. Brys’s (2004) paper provides a useful, practical outline of how to implement a content management (CM) strategy, and has been used in the following overview of the various issues: . Emphasise the importance of information and its communication. This can be helped by drawing up clear policies and guidelines. . Set out clear responsibilities. This should cover all levels: organisational, departmental and individual. . Provide training. For managers, authors and users. . Communicate clearly and inclusively. By making sure that effective communication channels about the information system are established – possibly through managers at departmental level. . Emphasise the importance of the information architecture team in providing support and quality control. . Set out workflow procedures for publication of content. To ensure smooth transition through authoring, approval and publication. . Establish best practices. This might be done on a departmental level if general organisational guidelines do not address specific needs. It is useful to divide implementation of the CM strategy into three stages: formulation of policy, planning and implementation. The CM policy would necessarily reflect the overall goals of the organisation. The CM strategy should set out a series of objectives; the policy should provide the means by which the achievement of these objectives can be measured. The UK’s e-Government Policy Framework for Electronic Records Management (2001) stresses the setting of goals to help in determining the extent to which the overall strategy is followed and how far the policy guidelines are being met. CM policy would incorporate all aspects of the strategy outlined below, but central to the policy is an emphasis upon the importance of information and its communication within the organisation, and establishment of roles and responsibilities. The policy should clearly state how the content management strategy supports the work of the organisation. It should also set out guidelines about ownership of information and adherence to legal requirements generally. The policy can be quite specific, including things like using the Dublin Core Metadata Element Set to ensure consistency in describing documents. The policy should also specify individual and group responsibilities. For example, authors may be responsible for providing metadata; departmental information managers may be responsible for quality checks and approving publication of documentation. The planning phase would emphasise Brys’s next three points, which would also be carried through into the implementation phase: provision of training, clear and inclusive communication, the importance of the information architecture team in providing support and quality control. Training programmes must be designed prior to implementation. All the training resources must be in place: training staff, accommodation for training workshops and seminars, documentation, a help desk and


other online support. Everyone within the organisation should be informed about any changes. It is important in managing change that everyone who will be affected by the change feels that they are stakeholders and that proper consultation has taken place. The information audit should help people to feel that their needs are being taken into account. A commonly cited reason for IA development is to improve information flow that suggests that current channels of communication are not very effective. This can obviously create problems here. A mix of paper, electronic and face-to-face communication may need to be utilised and feedback encouraged. The final point in the planning stage pre-supposes that there is an information architecture team. If an organisation is willing to invest in improving access to information, then there should be a team of specialists, co-ordinating design, planning and implementing the new system. If the information architecture team has been involved in the information audit, they should already have quite a high profile within the organisation. At the implementation stage, the architecture is in place, and the focus is upon its maintenance and effectiveness as an information resource. This involves issues previously outlined, training, for example, will be ongoing, but publication of content and quality issues merit discussion at this stage. Workflow procedures for publication of content should be established to ensure smooth transition through the information lifecycle: from creation and collection, through approval, publication, review and disposal. Everyone involved in the process should know exactly what their role is, and should perform that role efficiently. Ensuring that information is published quickly is vital, as is ensuring that only current information is on the system. To safeguard the quality of the resource and the information it contains Brys (2004) suggests that individual departments or services should establish best practices (and also guidelines where organisational guidelines don’t seem to apply). The important point is to ensure that authors’ needs are balanced with departmental or service needs and the needs of the organisation as a whole. Consistency is the key that predicates that certain important decisions should not be left up to individual authors. By delegating responsibility to departments or services, diversity can be accommodated while ensuring a degree of consistency. Summary: implementation and maintenance issues Finally, there are various questions that any organisation implementing information architecture and/or content management systems should have considered and found answers to: . Who should be responsible for strategy and policy formulation? The final decisions will be made by senior management, but it is important to involve information and technical specialists, and, where possible, users. This relates to effective change management. It also relates to an issue raised earlier. It was stated that much of the IA literature examines high-level content: subjects that should be present within a taxonomy, and organisational needs as revealed in the information audit. That was a top down approach to system content – representing an ideal. Strategy and policy formation is also almost certainly going to represent an ideal – and in one sense that is perfectly acceptable. The strategy, in particular, will reflect the overall goals of the organisation. 
But by involving people at all levels of the organisation, it should be possible to learn about and incorporate (at least in the policy) what is realistically achievable.

Who should be responsible for design and implementation? It is to be hoped that there will be an information architecture team. Designated, specialist staff should be recruited at the start of the process to provide input into the strategy and policy formation phase, which will feed into the design and, of course, the implementation. Members of the team should be involved in the information audit – apart from the need to feed the resulting data into the system design, it will also ensure that eventual users of the system get to know the team. Ultimately, potentially everyone within the organisation will be involved. Users will influence the design, and will be involved in testing prototype systems. Who should be responsible for maintenance? Ideally the information architecture team or an information manager should have overarching responsibility for managing the content of the system as a whole and, of course, the taxonomy. If everyone can add what they like where they like, then a nicely structured search and retrieval tool will quickly descend into chaos. Who should be responsible for deciding what is included? Again, ideally an information manager, or information management team, with an overview of the whole organisation and its needs and the needs of individual members of staff, should be responsible for deciding what resources are included, and how the resources are indexed – checking subject metadata and deciding which categories in a taxonomy resources should be allocated to. It seems sensible to leave the metadata creation and categorisation to the originators of documents, but authors cannot be expected to be consistent – although providing a list of descriptors in a glossary or thesaurus can help. It may be safer to leave the indexing to the information manager or the information team – but the volume of work involved may be prohibitive. Should everything be included? If the answer is yes, then that dispenses with the need for someone to make decisions about what is included. But, organisations generate enormous amounts of documentation, should everything be accessible to everyone? One of the key issues in records management is deciding what deserves the status of a record. Role and function require objective evaluation and authors might not be the best people to assess whether their documents have a function that merits their publication. Also, what is stored and capable of being retrieved within the information system will not necessarily be made available within the taxonomy. Taxonomies should be a means of improving resource sharing, if they grow too large or too complex or if they contain a lot of documentation that is of no possible interest to the majority of users, then they lose their efficiency as information retrieval tools. This leads to another question. Who gets access – does everyone get access to everything? Some information will almost certainly have to be password protected. This also impacts on the taxonomy. Should there be a single taxonomy for the whole organisation or is it more appropriate to have a shared taxonomy plus a series of specialist taxonomies? This seems to be contrary to the whole idea of a taxonomy as a tool to facilitate resource sharing, but are some parts of the organisation sufficiently specialised to merit their own smaller, highly specialised taxonomies to facilitate resource sharing among a few highly specialised individuals? Some resources might only be for the eyes of certain people within the organisation, so are parts

of the taxonomy closed to groups of users? The taxonomy will be mounted on an organisation’s intranet, but is it appropriate to mount parts of it on the extranet? The answer is almost certainly yes. Remember that information chunks can be re-used and so will be included in documentation or web pages designed for external users. This introduces a set of new issues around public access and security. Who should be responsible for deciding what should be removed? Does an information manager decide, or should the creators of documents decide? If the information manager is responsible for deciding what goes in then it seems logical that they should be responsible for what comes out. Straightforward if, for example, an interim report is being superseded by a final report, but the information manager may lack sufficient specialist knowledge to decide when a document contains out-of-date and possibly misleading information. So should authors or other specialist staff be involved in the weeding process? Metatags can be used to determine the life of a document or to signal when content should be reviewed. But, is it safe to let authors decide what should be deleted and what should be archived?
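The last of these questions, whether metadata can drive review and weeding, can be sketched very simply. The fragment below assumes two hypothetical metadata fields, review_date and disposal, and it only flags content for human attention; the decision to delete or archive still rests with authors, specialist staff or the information manager.

```python
from datetime import date

# Hypothetical review metadata; neither field name is taken from any standard.
records = [
    {"id": "interim-report-2006", "review_date": date(2006, 12, 1), "disposal": "delete"},
    {"id": "final-report-2006",   "review_date": date(2011, 12, 1), "disposal": "archive"},
    {"id": "module-spec-im101",   "review_date": date(2007, 9, 1),  "disposal": "review"},
]

def due_for_attention(records, today):
    """Flag content whose review date has passed; a person still makes the decision."""
    return [(r["id"], r["disposal"]) for r in records if r["review_date"] <= today]

print(due_for_attention(records, date(2007, 3, 1)))  # [('interim-report-2006', 'delete')]
```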

All these questions impact on information architecture and content management strategy and all focus primarily on the content itself. Information architecture, in focusing upon structure and design, takes a top-down approach to system development: providing a framework and techniques that assist in the development of ideal information environments. In reality, the architecture is only as good as the information it houses. By focusing on content, this paper has outlined a series of management issues that should be addressed to ensure that the reality matches the vision.

References ARMA (Association of Records Managers and Administrators) (1989), Glossary of Records Management Terms, ARMA International, Prairie Village, KS. ARMA (Association of Records Managers and Administrators) (2006), available at: www.arma. org/erecords/index.cfm (accessed 28 January 2006). Boiko, B. (2001), “Understanding content management”, Bulletin of the American Society for Information Science and Technology, Vol. 28 No. 1, pp. 8-13. Brys, C.M. (2004), Discussion Paper on Content Management Strategy, University of Glasgow Web Advisory Group, available at: www.gla.ac.uk/infostrat/WAG/paper_cmstrategy.pdf (accessed 31 January 2006). Government Policy Framework for Electronic Records Management (2001), available at: www. nationalarchives.gov.uk/electronicrecords/pdf/egov_framework.pdf (accessed 28 January 2006). Haynes, D. (2004), Metadata for Information Management and Retrieval, Facet, London. Information Architecture Institute (2006), available at: iainstitute.org/pg/about_us.php (accessed 28 January 2006). McKeever, S. (2003), “Understanding web content management systems”, Industrial Management & Data Systems, Vol. 103 No. 9, pp. 686-92.

Rosenfeld, L. and Morville, P. (2002), Information Architecture for the World Wide Web, O’Reilly, Sebastopol, CA. Schwartz, C. and Hernon, P. (1993), Records Management in the Library, Ablex, Norwood, NJ. Shirky, C. (2001), Interoperability, not Standards, available at: www.openp2p.com/pub/a/p2p/ 2001/03/15/clay_interop.html (accessed 31 January 2006). Tredinnick, L. (2005), Why Intranets Fail (and How to Fix Them), Chandos, Oxford. Vellucci, S. (1998), “Metadata”, ARIST, Vol. 33, pp. 187-222. Wikipedia (2006), available at: en.wikipedia.org/wiki/Information_architecture (accessed 28 January 2006). Worple, K. (2000), The Value of Architecture: Design, Economy and the Architectural Imagination, RIBA Future Studies, London. Corresponding author Sue Batley can be contacted at: [email protected]


Phenomenography: a conceptual framework for information literacy education


Susie Andretta

Received 24 March 2006. Revised 30 July 2006. Accepted 2 January 2007.

School of Information Management, London Metropolitan University, London, UK Abstract Purpose – The purpose of this paper is to explore the adoption of a phenomenographic conceptual framework to investigate learning from the perspective of the learner, with the aim of reflecting on the features that this approach shares with information literacy education in general, and with the relational model in particular. Design/methodology/approach – The study offers an analysis of phenomenographic research on learning undertaken by Marton, which is further elaborated by examples of collaborative work by Marton and Booth, as well as by Fazey and Marton. The relationship between understanding and learning, promoted by this perspective, is explored in this paper to illustrate its impact on retention and transfer of the learning process. This is compared with the iterative and independent learning approaches promoted by information literacy education, and specific examples are used to illustrate the pedagogical overlap between phenomenography and information literacy. In addition, the paper examines the relational approach of information literacy promoted by the individual and collective works of Bruce, Lupton, and Edwards to demonstrate how the person-world relation, advocated by phenomenography, is used to examine the learner-information relationship promoted by the work of these authors. Findings – The paper reflects on the potential impact that phenomenography and the relational perspectives have on pedagogical practices in Higher Education. In particular, it aims to demonstrate how the relational approach, together with the learn-how-to-learn ethos of information literacy, is fundamental in promoting a framework for lifelong learning that leads to the empowering of the learner through an iterative cycle of reflection and practice, i.e. what phenomenography defines as variation in practice to foster the ownership of learning. Originality/value – In line with the person-world relation, the paper explores the relationship between learners and information by outlining its internal/subjective and external/objective dynamics. Claims that the learner’s ability to reflect on these dynamics enhances his or her independent learning attitude are explored in the light of current phenomenographic and information literacy research. Keywords Information literacy, Learning, Education Paper type Conceptual paper


Introduction This paper aims to explore information literacy from within a phenomenographic conceptual framework in an attempt to redefine learning as experienced from the perspective of the learner, and reflect on the impact that this approach exerts on information literacy practice. This paper is based on the following premises. First that information literacy is an intrinsic part of learning (Bruce, 2002; Lupton, 2004) and the foundation of independent and lifelong learning (Abid, 2004). The link between information literacy and learning is also found in the definition of an information

literate person as one who has “learned how to learn” (American Library Association, 1989). This has generated the view that information literacy is a necessary development for addressing the requirements of a learning society (Lantz and Brage, 2006) in line with Bundy’s (2001) claim that information literacy is the literacy of the 21st Century. The relationship between information literacy and the process of knowledge production is what makes information literacy the foundation of independent learning, particularly as the iterative nature of the learn-how-to-learn approach generates information-seeking attitudes that are fully transferable to other contexts (Orr et al., 2001; Bruce, 2002; Grafstein, 2002). This paper aims to illustrate how the learn-how-to-learn approach promoted by information literacy mirrors the phenomenographic principle of deep learning. The second point proposed by this paper is that the phenomenographic person-world relation[1] has influenced the development of the information literacy relational model. The aim here is to examine the main tenets of phenomenography, given that this approach is used as the conceptual framework for the relational model. This influence is evident in the individual research by Australian educators, namely Bruce (1997), Lupton (2004) and Edwards (2006), and through their collaborative work (Bruce et al., 2006) which promotes a systematic classification of information literacy through the “Six frames for information literacy education”[2]. A review of the individual and collective works is given in this paper to illustrate examples of how the subject-object relation, defined by phenomenography as the internal relation, is contextualised in the subject-information relation underpinning the phenomenon of information literacy. It is important to note that Bruce et al. (2006) do not present information literacy as a theory of learning, but propose that the participants’ perspectives of teaching and learning influence their interpretation of, and attitude towards information literacy, and that this is illustrated by the variation in the way information literacy education is implemented to suit diverse educational (and pedagogical) environments. The relational approach is promoted as one of the six frames and describes the relationship between learner and information literacy in terms of complex and different ways of interacting with information. In line with Bruce’s original relational model (1997), this frame supports the view that in order to improve learning we need to understand the students’ perspectives and appreciate the variation in the students’ conception of information literacy (Bruce et al., 2006). Learning defined through phenomenographic and relational perspectives Phenomenography has evolved from the empirical studies on learning in the 1970s into a research specialisation which focuses on human experience (Pramling, 1994), rather than on human behaviour or mental states (Marton and Booth, 1997). One of the main points of debate in the literature is whether phenomenography should be classified as part of the phenomenological tradition (Hasselgren and Beach, 1996; Marton and Booth, 1997; Marton, 1994). Given that the exploration of human experience promoted by phenomenography is also the aim of phenomenology it would be reasonable to assume an affinity between these two perspectives. However, Marton and Booth (1997) claim that a fundamental difference exists between phenomenography and phenomenology. 
In the latter it is the researcher’s perspective that is examined through reflection on his/her experience of the world. This is achieved by dissociating oneself from the natural attitude where the process of experiencing is taken for


granted. From the outset such a philosophical approach aims to develop a single theory of experience. Phenomenography, on the other hand, employs an empirical approach to study other people’s perspective of the world and devise collective categories that describe the variation of this experience (Marton and Booth, 1997, p. 116). Therefore the aim of uncovering the richness of the individual’s experience, the person’s life-world sought by phenomenology, is in direct contrast with this phenomenographic goal of exploring the variation in the collective conception of the experience. Hasselgren and Beach (1996) present a similar point by arguing that even though phenomenology and phenomenography share the same object of research their perspectives differ substantially in that the former emphasises the individual view, or first-order reality, while the latter focuses on the perspective of others, also known as second-order reality. In a study on learning Marton (1981) observes that from a first-order perspective learning is defined by the correct acquisition of disciplinary knowledge, and therefore the learning content is dictated by the discipline studied. In his view, this type of learning reflects the notion of transferring subject-specific and ready-made concepts and principles into the learner’s head. On the other hand, the second-order perspective, promoted by phenomenography, focuses on how students relate to what they are taught and how they make use of knowledge they already possess. Learning, from this perspective, encapsulates the experience of the learner-world relationship which reflects people’s interpretation of significant aspects of the learning process. Therefore, the aim of phenomenographic research is: [. . .] not [. . .] to classify people, nor it is to compare groups, to explain, to predict, not to make fair or unfair judgments of people. It is to find and systematise forms of thought in terms of which people interpret aspects of reality which are socially significant [. . .] (Marton, 1981).

Phenomenography derives etymologically from the Greek noun fainomenon, which translates as the apparent, or that which manifests itself, and grafia which means to describe in words or pictures that which designates an aspect or an experience of reality (Marton and Booth, 1997, p. 110). It originates from research by the Department of Education and Educational Research, Göteborg University, in Sweden, whose main interests involved the investigation of what and how people learn from their world in order to explore the process of learning and enhance the quality of this experience. Marton’s work in 1975 marked the starting point of phenomenography (Marton, 1994) which began with the observation that some students are more effective learners than others. This study focused on two questions: first, what does it mean when we say that one person is a better learner compared to another; and second, why this is the case (Marton, 1994). The first question was addressed by asking students to read a text and, later on during individual interviews, to describe this experience in order to determine their level of understanding of the text (content) and ascertain the process the students employed to accomplish the task (the act of learning). The initial analysis of the interviews produced a limited number of different ways of understanding the content of the text and these developed into categories of descriptions that defined in detail each different way of understanding the text. These categories were also found to be logically related to one another and set in a hierarchical order. Marton refers to these hierarchical categories as the outcome space and this gives an indication of the level of success of the learning task reflected in the different ways in which the text is understood. Therefore, the qualitative difference in the learning outcome was initially assessed by the quality of

the students’ understanding and remembering of the text. However, further analysis of the students’ account of the learning process used to understand the text generated two contrasting approaches: deep and surface learning. The identification of these two approaches helped to address the question of why some people are better learners than others. Students who tried to understand the content of the text were associated with a deep approach and the higher categories of the learning outcome, that is they achieved a better understanding of the text, while students who focused on the task of “moving the text into their heads” (Marton, 1994) employed a surface approach and their understanding of the text was consequently shallow. The findings illustrated that there is a relationship between the way learners experience the learning situation and the outcome of learning. The implication of this claim is that learning from a phenomenographic perspective involves experiencing both the act and the content; this is identified by Pramling (1994) as the “how” and the “what” of learning. The first connection between phenomenography and information literacy was established by Christine Bruce’s (1997) work which emphasises an holistic evaluation of people’s experience of information literacy as an aspect of learning, rather than the assessment of measurable attributes and skills associated with specific information-seeking practices. Bruce adopts Marton and Ramsden’s definition of learning as the starting point for her research: [. . .] a qualitative change in a person’s way of seeing, experiencing, understanding, conceptualising something in the real world – rather than a change in the amount of knowledge which someone possesses[3] (cited in Bruce, 1997, p. 60).

The relational model, the Seven Faces of Information Literacy, was born out of this study, in which Bruce examines the conceptions of information literacy from the perspective of educators in an HE environment:

Describing information literacy in terms of the varying ways in which it is experienced by people, that is their conceptions, is the alternative which I propose. Studying information literacy from the viewpoint of the people [. . .] is the first step towards a relational view of information literacy (Bruce, 1997, p. 39).

This marks a change in the pedagogical practices underpinning information literacy, where the focus shifts from the development and assessment of information skills associated with traditional library instruction (Mellon, 1988) to the application of a reflective approach promoted by the relational model of information literacy. The latter, thanks to its holistic view of learning, can facilitate the shift towards a learner-centred pedagogy.

The subject-object relation
According to the phenomenographic perspective the way of experiencing something is characterised by the relationship between subject and object. Here the unit of research is a way of experiencing a particular phenomenon, while the variation in ways of experiencing this phenomenon becomes the focus of the research (Marton and Booth, 1997, p. 111). Marton (1994) also states that phenomenographers adopt a number of terms to describe the relation between subject and object, such as conceptualise, understand, perceive. However, these terms should be used interchangeably to portray the process of conceptualisation not as a mental representation or a cognitive structure, but as a way of being aware of something. He argues that awareness is affected by the
relation between subject and object, where the former relates to and is revealed by the latter either through experience or conceptualisation. As the two exist in relation to each other, what Marton calls the internal relation, he concludes that the way of experiencing or conceptualising the object (or phenomenon) also reveals aspects of the experiencing subject. He observes that the structural aspects of the experience define the subject’s boundaries of awareness and these are described as the internal and external horizons (Marton, 1994). However, he interprets awareness as the person’s total experience of the world at a given point in time rather than as a dichotomy of conscious and subconscious states. By this he means that awareness involves a relationship of constant variation between things in the foreground of awareness which are explicit and thematised (the internal horizon), and those in the background that are implicit and unthematised (the external horizon). By examining the differences in the structure of awareness we can deduce the meaning of the phenomenon or situation from the learner’s perspective.

The outcome space of learning and its impact on the information literacy relational model
Marton (1981) cautions us about defining the diverse ways of understanding reality. In his view, these perspectives are not conceived by phenomenography as individual qualities, but as categories of description that portray a collective conception of a phenomenon. Given that the second-order perspective accommodates different ways in which people experience or conceptualise any aspect of the world around them, the basis for similarities is found in the commonness of the perspective, while the source of variation originates from people’s interpretation of the phenomenon. The relationship between learners and learning is therefore encapsulated in the development of distinct categories of experience, and these are arranged according to a hierarchical structure, or outcome space, also described as a diagrammatic representation of the logical link between the categories of description[4]. Marton argues that in order to identify levels of variation in how the phenomenon is experienced, conceptualised, and understood these categories need to be explored both in terms of the common critical attributes that describe each category (the commonness of the perspective), and in terms of the features that distinguish one category from another (the variation in the interpretation of the phenomenon). Variation of the learning experience to expand the subject’s focal awareness is also promoted by Fazey and Marton, who view the qualitatively different ways in which people experience or make sense of something as crucial aspects of what is learned:

Encouraging students to practice varying their perspectives, approaches, and the skills they employ is based on the variations that we have each experienced [. . .] the person-world relationship is a space established, but not delimited by, the dimensions of variation experienced by the learner. In other words understanding is the space of experiential variation (Fazey and Marton, 2002, p. 248).

Similarly, Marton and Booth (1997) promote the outcome space as an effective way of conceptualising learning and understanding, and identify different aspects that are discerned by the learner as the space of experiential variation. An example of the outcome space drawn from students’ learning experiences is outlined in Table I. The categories in the first column of this table demonstrate the distinct ways in which learning is experienced. In the first two categories this is interpreted as the memorisation of words or meaning, where the person-world relationship operates entirely on memory,
Table I. The outcome space of learning
(The three right-hand columns represent the temporal facet of the experience of learning: acquiring, knowing and making use of.)

Ways of experiencing learning | Acquiring | Knowing | Making use of
Committing to memory (words) | memorising (words) | remembering (words) | reproducing (words)
Committing to memory (meaning) | memorising (meaning) | remembering (meaning) | reproducing (meaning)
Understanding (meaning) | gaining understanding (meaning) | having understanding (meaning) | being able to do something; being able to do something differently; being able to do something different
Understanding (phenomenon) | gaining understanding (phenomenon) | having understanding (phenomenon) | relating

Source: Marton and Booth, 1997, p. 43
leading to the reproduction of the task and surface learning[5]. On the other hand, learning that is described as a way of understanding meaning, or of becoming aware of the whole phenomenon, implies a more active level of engagement on the part of the learner which is associated with deep learning. Whilst a full interpretation of these categories goes beyond the scope of this paper, it is worth noting that in the last two categories the learning experiences, shown by understanding of the meaning or of the phenomenon, are associated with a transfer of learning. This is demonstrated by the (independent) accomplishment of a task in its own right, or the completion of the same task approached in a different way, or the ability to do something different from the task learned. The final category illustrates the learner’s ability to apply what he/she has learned which is associated with a level of understanding that goes beyond the process of memorisation underpinning the first two categories. The outcome space of learning that is based on understanding generates greater learner awareness and this supports Marton’s (1994) view that educators need to focus on how students relate to what they are learning, and on how they use the knowledge they already possess in order to enhance their learning experience. As we shall see later, this type of understanding is also found in the work of Fazey and Marton (2002) and it involves the mastery of the process of learning as well as the appropriation of the content.

The relational models produced by Bruce (1997), Lupton (2004) and Edwards (2006) offer individual examples of the outcome space contextualised within the information-world. Table II provides a diagrammatic representation of the categories of description that each model has devised to interpret the relationship between the user/learner and information in three different contexts. Summaries of these three models are included here to give a brief illustration of their interpretations of the relationship between learner and information[6]. Bruce’s (1997) relational model frames information literacy into seven different ways of experiencing information-use through active and reflective engagement with the
Table II. Outlines of the three relational models of information literacy

Bruce (1997), The Seven Faces of Information Literacy: Information Technology conception; Information Sources conception; Information Processing conception; Information Control conception; Knowledge Construction conception; Knowledge Extension conception; Wisdom conception

Lupton (2004), The Learning Connection. Information Literacy and the Student Experience: Seeking evidence; Developing an argument; Learning as a social responsibility

Edwards (2006), Panning for Gold: Influencing the Experience of Web-based Information Searching. Information searching is seen as: looking for a needle in a haystack; a way through a maze; using the tools as a filter; panning for gold
relevant information practices[7]. The users’ conceptions of information literacy produce seven categories of description:
(1) Information Technology conception, which associates information literacy with the use of IT to gather and communicate information.
(2) Information Sources conception, where information literacy is perceived as the knowledge of sources and the ability to access these directly or indirectly via an intermediary.
(3) Information Processing conception, which describes information literacy as “executing a process” (Bruce, 1997, p. 128), where a new situation is tackled through the use of an appropriate strategy to find and use information. The nature of the process varies according to the participant of this process.
(4) Information Control conception. Here information literacy is associated with the effective control and manipulation of information through the use of mechanical devices, memory, or IT.
(5) Knowledge Construction conception, where information literacy is perceived as “building a personal knowledge base in a new area of interest” (Bruce, 1997, p. 137). Bruce stresses that this differs from the storage of information, because it involves the application of critical analysis of the information read.
(6) Knowledge Extension conception, which envisages the application of knowledge and personal perspectives that lead to new insights.
(7) Wisdom conception, which is associated with the wise and ethical use of information considered in a wider historical or cultural context. In addition, the information here undergoes “a process of reflection which is part of the experience of effective information use” (Bruce, 1997, p. 148).
Bruce contends that in order to fully explore the learning process, a clear understanding of the relations between the learner and the subject studied must be sought, in combination with an evaluation of the learner’s perspective of information and the learning environment. The aim is to promote an holistic experience of learning

which involves the ability to perform information literacy tasks, such as formulating an information problem and finding an appropriate solution to it; most importantly though, this process must be perceived by the learner as fully transferable to solve other information problems even when they occur in unfamiliar contexts. Lupton explores the relational perspective of information literacy by examining the students’ relationship with information within a problem-solving scenario in Higher Education. She emphasises the connection between information literacy and learning by examining the students’ attitudes towards an assessment task. Her study generates three categories that describe their level of engagement with an essay and with the underlying information literacy practices. These are:
(1) Seeking evidence. Here a familiar topic is selected and information is seen as external to the learning process. Students assume an instrumental approach and see the essay as a product that needs to be done to complete the course. Lupton describes this category further by listing three distinct types of seeking evidence, including the simple use of statistics to support an argument, searching for ideas or opinions that support the students’ viewpoint, or promoting an objective view by identifying contrasting perspectives on the topic. In this first category students focus on the essay task rather than the transferable and transformational aspects of the learning process.
(2) Developing an argument. Here the information is internalised and personalised as students learn more about the topic by gathering background information to set the context and gain increased awareness of the subject. The main stages of developing an argument include learning about the topic, setting the essay within a context and reformulating the topic. Students in this category engage with the process of topic formulation, rather than limiting their efforts to the completion of the essay task.
(3) Learning as a social responsibility. This offers a more comprehensive view of information literacy which is experienced as the interrelationship between the essay, the information, and the learning process. The essay is seen as an end in itself, a tool for learning and for communication. As with the second category, students in this group focus on the process of learning rather than the specific task of writing an essay. In this case, however, the emphasis is on activities that transcend the educational context, such as the application of learning to help the community bring about social and political change. This, Lupton argues, is what makes information literacy a transformational agent operating at both personal and social levels.
Lupton’s research emphasises that information literacy cannot be decontextualised from the learning process (Lupton, 2004, p. 89), and as such it is seen not as a characteristic of the learner, but as a response to a context. This view echoes Bruce’s relational approach in that it examines the intent of searching for information as the driving force behind information literacy practices, rather than focusing on the measuring of competences in information seeking and use. Similarly to Lupton’s study, Edwards’ (2006) relational model of information literacy examines students in a Higher Education context, although this research covers the learners’ experience of information searching when using the Internet and library databases, rather than the accomplishment of an assessment task. Edwards
identifies four main categories which describe different ways of experiencing the search and reveal different awareness structures, different approaches to learning, and different search outcomes. Information searching is conceptualised as:
(1) Looking for a needle in a haystack. Students here operate under the assumption that understanding the research topic is a necessary step to find information “out there”. Edwards observes that there is little reflection on the research process and this is illustrated by the fact that students lack an appreciation of the information environment structure, or of the range of research tools at their disposal. In particular, students in this category are not aware that these tools are instrumental in retrieving the information they need.
(2) Finding a way through a maze. Here the metaphor is changed from a needle in the haystack to a maze, implying that students perceive information searching as involving the systematic processing and planning of a search. Students in this category also become aware of the wide range of search tools they have access to. Whilst they still prioritise the topic of research, students engage with advanced search facilities of the available tools, and begin to assess the quality of the information retrieved.
(3) Using the tools as a filter. Searching for information at this point involves the use of the searching tools as a way of filtering the information. Here planning and reflection are evident as students concentrate on a thorough analysis of the initial terms used, apply appropriate synonyms, and ultimately adapt their searching strategy in response to previous searching attempts. Their awareness of the structure characterising the tools used is also heightened, as is their ability to adapt their searching strategy according to the tool they use.
(4) Panning for gold. This stage builds on the previous one where the search tools are used to filter the available information with the added outcome of limiting the results to high quality information. Students in this category select the appropriate tools to retrieve the required resources, and the searching strategy is rooted in systematic planning and careful reflection on information searching as a process.
Edwards concludes that there is a major conceptual gap between students in the first category, who experience information searching as a helpless task of finding a needle in a haystack, and those in categories two, three, and four whose conceptual engagement with the process of searching is illustrated by increasingly complex ways of interacting with the tools and complemented by reflective topic and search formulation practices.

The subject-information relation
It is a contention of this study that the relational model of information literacy identifies information as the “object” in the subject-object relation. The categories of description derived from this relation reflect the subjects’ perception of information and further an understanding of their experience of the information literacy phenomenon:

Descriptions of these conceptions, or experiences, reveal variation in the internal relation between subjects (people) and some object (in this case information) [. . .] internal variation suggests that the meaning of information literacy is derived from the ways in which people interact with information rather than from any external influence (Bruce, 1997, p. 9).

Further elaboration on the subject-information relationship is found in the collaborative work of Bruce et al. (2006) who adopt a phenomenographic perspective to devise six conceptual frames of information literacy, consisting of the Content frame, the Competency frame, the Learning to learn frame, the Personal relevance frame, the Social impact frame and the Relational frame[8]. In this section examples are drawn from individual and collaborative relational studies to demonstrate the variations in the relationship between learners and information as external-objective, internal-subjective, and transformational. The external-objective conditions show that information is experienced as part of the external environment, and Edwards’ (2006) category of looking for a needle in a haystack illustrates this by observing that the students perceive information as an external entity that exists “out there”. Bruce et al. (2006) associate the external role of information with the first two frames, where information is transmitted from one person to another (Content frame), or where it assists the performance of the relevant competence (Competency frame). In a subjective-internal condition information is seen as open to interpretation and reflection, leading to its internalisation by the learner. This process of appropriation is exemplified by Bruce’s (1997) knowledge extension conception and Lupton’s (2004) developing an argument. Bruce et al. (2006) associate this condition with the next three frames. In the Learning to learn frame, for example, information shifts to a subjective role when the learner internalises it. As is demonstrated in the following section of this paper, the process of internalisation is associated with the phenomenographic view that full understanding is successfully accomplished only when the learner has taken it in (Fazey and Marton, 2002, p. 235). On the other hand, in the Personal relevance frame information is viewed from a specific context that gives information a personal value; Bruce’s (1997) knowledge construction conception comes under this frame. Alternatively, information may assume a social dimension through the Social impact frame, and Lupton’s (2004) learning as a social responsibility is an example of this. In the Relational frame information assumes objective and subjective guises and the shift from the former to the latter leads to a transforming effect on the learner. This transformation was first defined by Bruce (1997) as a change in the users’ perception of information from external to internal. In the first instance information is seen as an objective, or external source which becomes subjective, that is an integrated part of their personal knowledge base, thanks to a reflective process that leads to a sense of ownership of the information. Similarly, Bruce et al. (2006) argue that internalising information promotes a change in the learner’s perspective which, in turn, initiates a transformation of the relationship between user and information. As we shall see later on in this paper, the transformative impact of the subject-information relation is based on the work by Marton and Booth (1997) who argue that a change in the person-world relation necessarily initiates a change in the outcome of the relationship. Furthermore, Bruce et al. (2006) argue that the relational perspective can be employed as a way of combining a number of information literacy approaches, thus fostering variation in its provision and practice.
This is in line with the phenomenographic view promoted by Marton and Booth (1997) that variation in practice encourages retention and transfer of learning which is discussed in a later section of this paper. Bruce et al. (2006) illustrate how from a relational frame’s perspective the subject-information relation promotes the principle that learning operates within a content and a process, and therefore makes use of the Content and
Competency frames. In addition, learning is encapsulated in the relationship between learners and their information environments, thus enabling the application of a combination of the remaining three frames. When Learning to learn is blended with Personal relevance, for example, the emphasis is on learning that develops a conceptual structure and critical thinking to complement the need to make the learning experience relevant to personal interests, while if Learning to learn is combined with the Social impact frame the conceptualisation process of the former underpins the evaluation of perspectives associated with social change.

Exploring the relationship between understanding and learning to promote ownership of learning
The studies on learning and its relationship to understanding, cited by Fazey and Marton (2002), reveal three stages of relational development between these two phenomena. This relationship is explored here because it helps illustrate the phenomenographic view of learning, and consequently provides an insight into the pedagogical rationale underpinning information literacy. In the first stage, understanding refers to coming to terms with what is supposed to be learned; the next stage points to grasping, that is processing and absorbing, what is being learned; while full understanding develops in the final stage and is reflected in an established ownership of the learning experience, what Fazey and Marton describe as making what is learned “your own” (Fazey and Marton, 2002, p. 235). It is this appropriation of learning that is associated with the internalisation of information promoted by the relational model of information literacy (Bruce et al., 2006). Fazey and Marton (2002) conclude that understanding and learning are inter-related. For example, high-school students in Hong Kong[9] use these two terms interchangeably as reflected in their explanation of both phenomena: “when you really understood/learned something then you can do it again” (Fazey and Marton, 2002, p. 236). One can also detect a strong element of transfer implied in this relationship and described by the students as being able to do again the thing learned or understood. The issue of transfer is examined in the next section through the principle of variation in practice. Here it is worth exploring the implications generated by the relationship between learning and understanding as these offer a clear rationale for the principle of variation in practice or iterative learning. First, understanding corresponds to what is learned, depending on the learner’s starting point. At this stage Fazey and Marton argue that motivational as well as environmental factors determine variations in the way understanding occurs. Second, understanding is achieved by the learner; in other words, there is no “correct” way to achieve understanding. Arriving at one’s own understanding also reflects a sense of ownership of the process of learning. This point is linked to the emancipation and the empowerment of the learner fostered by information literacy education (Bundy, 2004a, 2004b; Andretta, 2005a, 2005b, 2006; Todd, 2006). Third, understanding enables one to do certain things, and therefore variation in the way understanding occurs is complemented by variation in what understanding enables one to do. This point complements the principle of discernment promoted by Fazey and Marton (2002), and adopted by the relational model of Bruce et al. (2006), which is explored in a later section.

Variation in practice: fostering retention and transfer through the learning to learn attitude
Fazey and Marton (2002) promote variation in practice as a way of addressing the relational development between understanding and learning in order to foster deep and therefore transferable learning practices. A similar view is presented by the promoters of information literacy through iterative and reflective learning strategies (Edwards and Bruce, 2002; Andretta, 2005c). Fazey and Marton warn, however, that variation in practice does not simply mean repetition. On the contrary, this iterative approach creates conditions where the learners are able “to create, invent, adapt and progress, in the light of previous experience” (Fazey and Marton, 2002, p. 240). Variation in practice is seen by the learners as having an impact on:
. their motivation to learn;
. their expectation of success;
. the sense of value placed on their learning; and
. the sense of mastery of the content.
Iterative variation is exemplified by Fazey and Parker’s (2001) study (quoted in Fazey and Marton, 2002, p. 240) of final year undergraduate students at Bangor, University of Wales. Here the students were introduced to variation with the aim of fostering a “learning to learn attitude” through activities such as mind-mapping, writing and editing papers as well as reviewing drafts in group-based settings. This research illustrates that the students experienced a profound change in their attitudes towards studying that moved away from simply storing facts to understanding in a meaningful way. There is substantial overlap between this approach and the learn-how-to-learn perspective that underpins information literacy education (ACRL, 2000; Bundy, 2004a), in that they both foster the development of high-order thinking that leads to more confident learners, as their attitudes towards learning and their ability to learn change. To validate the concept of variation in practice Fazey and Marton (2002) draw from research on motor skills, which promotes a strong correlation between practice and skilled performance. Such a perspective, they argue, is supported by the common-sense view that one has to practice something to gain full mastery of it. They conclude that practice is what initiates the shift in the learner’s perception of a task from unfamiliar, that is unpredictable and therefore difficult, to familiar and therefore easily accomplished:

The progress from beginner to expert, from novice to skilled performer, involves a progression in accurately anticipating when and what will happen and matching that to appropriately selected actions. Such a progress can be expressed, from the learner’s point of view, as the change from something being difficult to it becoming easy (Fazey and Marton, 2002, pp. 243-4).

The shift from the unfamiliar to the familiar is confirmed by research examining the impact of information literacy on first year undergraduate students at London Metropolitan University (Andretta, 2005a, p. 90). This study shows that, when first introduced to the online newspaper database, the students expected the task of finding information to be difficult because they had never used this information system before: “Being a first time user of the database I assumed that locating the information I needed would be difficult”. However, as soon as the hurdle of unfamiliarity is
overcome, the resource is described as a fruitful source for articles: “At the beginning this was very unfamiliar. It soon became one of my favourite databases and was especially useful for my research into the history of the press.” It should also be emphasised that, in this case, increased confidence generated by variation in practice through familiarisation with the database is complemented by the usefulness of the resource in accomplishing an academic task, i.e. an essay. This, in turn, strengthens the student’s motivation to engage with the newspaper database because he/she perceives it as a useful source of information. Fazey and Marton (2002) conclude that as the experience of variation increases so does the effectiveness of this practice in achieving retention and transfer. Similar claims are found in the literature about the positive impact of information literacy education, exemplified by the enhancement of student learning experience, and the consequent feeling of empowerment generated by increased confidence in their ability to learn (Lupton, 2004, p. 190). Lupton’s research shows that as students’ confidence grows their level of motivation also increases, giving opportunity for greater retention and transfer of learning to new experiences. However, Fazey and Marton argue that the process of transfer, promoted by the motor-skills perspective, cannot explain how a learner can do something that is different from the activity learned through practice. In this case, they claim that what is learned differs from what appears to be learned, and that the phenomenon of transfer is fully realised only when the learner has mastered the process of variation itself (Fazey and Marton, 2002, p. 245). This point is supported by information literacy promoters such as McInnis and Symes (1991, p. 225) who argue that the emphasis in education should be moved from “what one learns [to] how one learns”. In addition, Bruce (1997) claims that the pedagogical shift from “what” to “how” requires a parallel shift in provision from content to process orientation, where the ability to learn takes priority over the amount of content learned. A specific example of this is given by Andretta and Cutting (2003) who integrate information literacy in learning outcomes that promote the development of transferable skills by articulating these in terms of the “know how”, rather than the “know what”. “Examples illustrating this transferability include competences in searching that can be used to query the Internet or a subject-specific database.” (Andretta and Cutting, 2003, p. 204). Focusing on the learner’s conceptualisation of the process rather than on the specific task is also promoted by Laurillard (1993) who argues that educators need to be clear about what the conception actually means to the learner in order to fully understand how learning is experienced, and employ the appropriate type of support to further enhance this experience. She uses research on subtraction procedures to illustrate this point and claims that this study identified over 80 ways of doing subtraction wrongly. However, as soon as the focus of the research shifts to the level of understanding, the findings show only two ways of misconceptualising the subtraction process.
It is at this level that the problem must be tackled so that if a student incorrectly borrows across zero then he/she needs to learn what borrowing means rather than focus on the actual task of borrowing across zero: It makes no sense to remediate a faulty procedural skill with reference to the procedure alone; we have to appeal to the conceptual apparatus that supports it as well. [. . .] knowledge is situated in action, and by the same token, action manifests knowledge; and this “buggy” behaviour manifests an underlying conceptualisation that itself needs remediation (Laurillard, 1993, p. 37).
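A schematic illustration may help here; the figures below are invented for the purpose and are not taken from Laurillard’s study. In the subtraction 304 - 126, correct regrouping borrows across the zero: the hundreds digit 3 is reduced to 2, the tens 0 becomes 9, and the units 4 becomes 14, giving 14 - 6 = 8, 9 - 2 = 7 and 2 - 1 = 1, that is 178. The classic “borrowing across zero” bug changes the 0 to 9 but leaves the hundreds digit untouched, so the final column gives 3 - 1 = 2 and the answer 278.

    304 - 126:  correct regrouping   3 0 4 -> 2 9 14  ->  1 7 8
                buggy regrouping     3 0 4 -> 3 9 14  ->  2 7 8

A purely procedural correction (“remember to cross out the 3”) treats the symptom; understanding that borrowing exchanges one hundred for ten tens, and one ten for ten units, explains why both digits must change, which is precisely the conceptual remediation Laurillard calls for.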

In summary, Fazey and Marton define the phenomenographic view of learning as a change in the person-world relationship (Fazey and Marton, 2002, p. 246) and explain the dynamics of this relationship as the discernment of a specific aspect of reality which is followed by either the establishment of a sensitivity from the learner towards this aspect, or by making this reality a central feature in the learner’s world. The first year student who came to define the newspaper database as one of his favourite sources of information would subscribe to both of these categories. These permutations illustrate that either the relationship has changed, or alternatively that the person or the world perceived by the person have changed, thus influencing the outcome of the relationship: In throwing a ball, for instance, distance, ball weight, ball size, air movement etc. are all discernible and discerned, thanks to our previous [. . .] varied experiences. What the person can then do (or say, or imagine) in a relationship with the world has also changed in some way that is intimately linked to the variation in those experiences. This relational view of learning – the emphasis on the one-ness of person and world – implies that we are not aiming at describing what is taking place in terms of the interplay of any underlying mental machinery. We aim at describing the nature and dynamics of awareness – the subject-object relation is our unit (Fazey and Marton, 2002, p. 247).

The principle of discernment is fully articulated in the perspective of information literacy proposed by the relational model (Bruce et al., 2006) and demonstrates the person-information relationship: Users of the Relational frame are oriented towards the ways in which learners are aware of information literacy or specific relevant phenomena associated with information literacy. They are interested in designing experiences that help learners discern more powerful ways of seeing the phenomena in question (Bruce et al., 2006).

To contextualise the concept of learners’ discernment within the relational model of information literacy Bruce et al. use the example of searching a database. Only when the user becomes aware that the structure of the source has an impact on searching, and can distinguish the effect of this from searching that does not take into account the structure of the database, will he/she have discerned the full implications of searching, and will become acquainted with the variation of this experience.

Conclusion
This paper has reviewed some of the main tenets of phenomenography with the aim of presenting learning from the learner’s perspective. As we have illustrated, phenomenographic studies on the relationship between learning and understanding promote a strong correlation between practice and skilled performance, and claim that variation in practice, that is the act of learning, has a positive effect on retention and transfer of the learning process. There is a parallel between this approach and the impact of iterative and learn-how-to-learn practices supported by information literacy education, in that both demonstrate that variation in practice fosters ownership and mastery of learning that shifts the person-information relationship from unfamiliar to familiar. This shift is complemented by the learners’ increased awareness of more complex ways of experiencing a phenomenon which translates into deep and therefore transferable learning. In addition, by exploring learning from the learner’s point of view, and by focusing on the relationship between user/learner and information, the relational model
proposes an holistic evaluation of learning exemplified by the qualitative changes in the way a person conceives and interacts with the world, rather than the testing of the amount of knowledge, or measuring the set of skills a learner acquires. The relational model promoted by Bruce et al. (2006) explores the dynamic relationship between learner and information within the context of information literacy, although the conceptual framework of the six frames of information literacy could be applied to any subject-specific scenario. This perspective necessarily calls for a shift of emphasis in Higher Education provision away from a “learning what” approach and towards a “learning how” attitude. To facilitate this shift Bruce et al. (2006) suggest that the relational model can be used to moderate other approaches to information literacy, thus promoting a pedagogy based on variation of learning that fosters independent and lifelong learning attitudes.

Notes
1. In this paper the term person-world relation is used interchangeably with the term subject-object relation.
2. The collaborative work is part of the special issue on Information Literacy published by Italics under the auspices of the Higher Education Academy – Information and Computer Sciences. The issue is available at: www.ics.heacademy.ac.uk/italics/vol5iss1.htm (accessed 16 January 2006).
3. Original quote: Marton and Ramsden (1988, p. 271).
4. For a detailed explanation of how to apply a phenomenographic approach to explore information literacy see Edwards (2006, pp. 51-62).
5. As we have seen earlier, in his initial study Marton (1975) identified two processes of learning, surface and deep. He links the former to memorisation and describes this process as the learners moving the text into their heads.
6. There is no scope in this paper for a full account of these relational studies, or for a detailed examination of their impact on information literacy practice. Readers who wish to explore the practical implications of these approaches should read the original research by Bruce (1997), Lupton (2004), Edwards (2006) and their collective work (Bruce et al., 2006) cited in this paper.
7. The overview of Bruce’s Seven Faces of Information Literacy is taken from Andretta (2005a, p. 18).
8. The author is currently employing this six frames approach to assess the perception of information literacy from a range of views, including HE institutions, faculty and library staff as well as students (see Information literacy: from whose perspective?, available at: www.ilit.org/phd&conferences/index.htm). Findings from this research will be published at a later date; however, it should be noted here that the initial analysis has shown that the Competency and Content frames dominate the information literacy approach adopted by HE institutions in the UK.
9. A more detailed account of this study can be found in Marton and Booth (1997, pp. 39-40).

References
Abid, A. (2004), “Information literacy for lifelong learning”, World Library and Information Congress: 70th IFLA General Conference and Council, 22-27 August 2004, pp. 1-38.

American Library Association (1989), ALA Presidential Committee on Information Literacy, Washington DC, available at: www.ala.org/ala/acrl/acrlpubs/whitepapers/presidential.htm (accessed 7 March 2004).
Andretta, S. (2005a), Information Literacy: A Practitioner’s Guide, Chandos Publishing, Oxford.
Andretta, S. (2005b), “Empowering the learner ‘against all odds’”, paper presented at LILAC 2005: Librarians’ Information Literacy Annual Conference, 4-6 April 2005, Imperial College, London, available at: www.personal.leeds.ac.uk/%7Epolaw/tangentium/may05/feature3.html (accessed 16 January 2006).
Andretta, S. (2005c), “Applied information research helping students learn how to learn”, Library + Information Update, Vol. 4 Nos 7-8, pp. 54-5.
Andretta, S. (Ed.) (2006), “Information literacy: challenges of implementation”, Italics, Vol. 5 No. 1, available at: www.ics.heacademy.ac.uk/italics/vol5iss1.htm (accessed 16 January 2006).
Andretta, S. and Cutting, A. (2003), “Information literacy: a plug and play approach”, Libri, Vol. 53 No. 3, pp. 202-9.
Association of College and Research Libraries (2000), Information Literacy Competency Standards for Higher Education, American Library Association, available at: www.ala.org/acrl/il/toolkit/intro.html (accessed 12 December 2002).
Bruce, C. (1997), The Seven Faces of Information Literacy, Auslib Press, Adelaide.
Bruce, C. (2002), Information Literacy as a Catalyst for Educational Change: A Background Paper, White Paper prepared for Unesco, the US National Commission on Libraries and Information Science and the National Forum on Information Literacy, for use at the Information Literacy Meetings of Experts, Prague, The Czech Republic, July 2002, available at: www.nclis.gov/libinter/infolitconf&meet/papers/bruce-fullpaper.pdf (accessed 7 April 2004).
Bruce, C., Lupton, M. and Edwards, S.L. (2006), “Six frames for information literacy education: a conceptual framework for interpreting the relationships between theory and practice”, in Andretta, S. (Ed.), Italics, Vol. 5 No. 1, available at: www.ics.heacademy.ac.uk/italics/vol5iss1.htm (accessed 16 January 2006).
Bundy, A. (2001), Information Literacy: The Key Competency for the 21st Century, available at: www.library.unisa.edu.au/papers/inlit21.htm (accessed 8 November 2001).
Bundy, A. (Ed.) (2004a), Australian and New Zealand Information Literacy Framework. Principles, Standards and Practice, 2nd ed., Australian and New Zealand Institute for Information Literacy, Adelaide.
Bundy, A. (2004b), “Zeitgeist: information literacy and educational change”, paper presented at the 4th Frankfurt Scientific Symposium, Germany, 4 October 2004, available at: www.library.unisa.edu.au/about/papers/abpapers.asp (accessed 16 January 2006).
Edwards, S. (2006), Panning for Gold. Information Literacy and the Net Lenses Model, Auslib Press, Adelaide.
Edwards, S.L. and Bruce, C. (2002), “Reflective Internet searching: an action research model”, The Learning Organization, Vol. 9 No. 4, pp. 180-8.
Fazey, J. and Marton, F. (2002), “Understanding the space of experiential variation”, Active Learning in Higher Education, Vol. 3 No. 3, pp. 234-50.
Fazey, J.A. and Parker, S. (2001), “Variations in practice: testing a strategy for promoting understanding”, in Rust, C. (Ed.), Improving Student Learning Strategically, Proceedings of the 8th International Symposium, Centre for Staff and Learning Development, Oxford.
Grafstein, A. (2002), “A discipline-based approach to information literacy”, Journal of Academic Librarianship, Vol. 28 No. 4, pp. 197-204.
Hasselgren, B. and Beach, D. (1996), Phenomenography: A “Good-for-Nothing Brother” of Phenomenology?, Report No. 1996:05, Department of Education and Educational Research, Göteborg University, Göteborg, available at: www.ped.gu.se/biorn/phgraph/misc/constr/goodno2.html (accessed 22 January 2006).
Lantz, A. and Brage, C. (2006), “Towards a learning society – exploring the challenge of applied information literacy through reality-based scenarios”, in Andretta, S. (Ed.), Italics, Vol. 5 No. 1, available at: www.ics.heacademy.ac.uk/italics/vol5iss1.htm (accessed 16 January 2006).
Laurillard, D. (1993), Rethinking University Teaching. A Framework for the Effective Use of Educational Technology, Routledge, London.
Lupton, M. (2004), The Learning Connection. Information Literacy and the Student Experience, Auslib Press, Adelaide.
McInnis, R. and Symes, D.S. (1991), “Running backwards from the finish line: a new concept for bibliographic instruction”, Library Trends, Vol. 39 No. 3, pp. 223-37.
Marton, F. (1975), “On non-verbatim learning: level of processing and level of outcome”, Scandinavian Journal of Education, Vol. 16, pp. 273-9.
Marton, F. (1981), “Phenomenography – describing conceptions of the world around us”, Instructional Science, Vol. 10, pp. 177-200, available at: www.ped.gu.se/biorn/phgraph/misc/constr/phegraph.html (accessed 16 January 2006).
Marton, F. (1994), “Phenomenography”, in Husén, T. and Postlethwaite, T.N. (Eds), The International Encyclopedia of Education, 2nd ed., Vol. 8, Pergamon, Oxford, available at: www.ped.gu.se/biorn/phgraph/civil/main/1res.appr.html (accessed 20 January 2006).
Marton, F. and Booth, S. (1997), Learning and Awareness, Lawrence Erlbaum Associates, Mahwah, NJ.
Marton, F. and Ramsden, P. (1988), “What does it take to improve learning?”, in Ramsden, P. (Ed.), Improving Learning: New Perspectives, Kogan Page, London, pp. 268-86.
Marton, F., Watkins, D. and Tang, C. (1997), “Discontinuities and continuities in the experience of learning: an interview study of high-school students in Hong Kong”, Learning and Instruction, Vol. 7 No. 1, pp. 21-48.
Mellon, C.A. (1988), “Information problem-solving: a developmental approach to library instruction”, in Oberman, C. and Strauch, K. (Eds), Theories of Bibliographic Education, Bowker, New Providence, NJ, pp. 75-89.
Orr, D., Appleton, M. and Wallin, M. (2001), “Information literacy and flexible delivery: creating a conceptual framework and model”, Journal of Academic Librarianship, Vol. 27 No. 6, pp. 457-63.
Pramling, I. (1994), “Becoming able. Testing a phenomenographic approach to develop children’s ways of conceiving the world around us”, Acta Universitatis Gothoburgensis, Phenomenographica, available at: www.ped.gu.se/biorn/phgraph/civil/graphica/oth.su/praml94.html (accessed 20 February 2006).
Todd, R. (2006), “It’s all about getting ‘As’”, Library + Information Update, Vol. 5 Nos 1-2, pp. 34-46.

Corresponding author
Susie Andretta can be contacted at: [email protected]


Post-structuralism, hypertext, and the World Wide Web


Luke Tredinnick
School of Information Management, London Metropolitan University, London, UK

Received 24 March 2006. Revised 31 July 2006. Accepted 5 November 2006.

Abstract
Purpose – The purpose of this paper is to explore the application of post-structuralist theory to understanding hypertext and the World Wide Web, and the challenge posed by digital information technology to the practices of the information profession.
Design/methodology/approach – The method adopted is that of a critical study.
Findings – The paper argues for the importance of post-structuralism for an understanding of the implications of digital information for the information management profession.
Originality/value – Focuses on an epistemological gap between the traditional practices of the information profession, and the structure of the World Wide Web.
Keywords: Worldwide Web, Information management, Information profession
Paper type: General review

Introduction
This paper explores the application of post-structuralism to understanding hypertext and the World Wide Web. It is argued that hypertext and the Web were developed with an explicit rejection of the epistemological models applied to traditional approaches to managing information. This rejection mirrors the critique of post-enlightenment rationalism offered by the French post-structuralist tradition. Post-structuralism therefore offers a theoretical framework and critical vocabulary that can be adapted for understanding the structures of information on the Web. Hypertext has its origins in a paper by Vannevar Bush in which the automated retrieval of conceptually related texts using in-text cross-referencing was first described in the form of a conceptual information retrieval model, the Memex machine (McKnight et al., 1991, p. 7; Woodhead, 1991, p. 5; see Bush, 1945). Bush cited “the artificiality of systems of indexing” as the key problem in effective information retrieval (1945). Hypertext itself was conceptualised by Ted Nelson, who acknowledged the influence of the Memex machine in his work (Naughton, 1999, p. 220; Gillies and Cailliau, 2000). Nelson developed the idea of hypertext into Xanadu, a projected global information system (see Nelson, 2000). He felt that the fundamental problem with indexing systems is not that they are bad, but that different people at different times require different approaches (Landow, 1997, pp. 74-5). He therefore conceptualised hypertext as a supplement to classification. The World Wide Web itself arose from Enquire, a personal information retrieval system developed by Tim Berners-Lee, apparently not directly influenced by Nelson or Bush, although with conceptual similarities (Berners-Lee, 1999, p. 5). The Web was developed as an information management tool (Berners-Lee, 1990, 1999, pp. 17-19), although its potential as a global information system was recognised by Berners-Lee
(1991, 1999, p. 26) from the outset. Berners-Lee (1990, 1999, p. 22) believed that the formal hierarchical structures imposed on information management solutions inhibited information retrieval and sought to overcome this problem with textual networks connected by semantic and associative relationships. The aspiration behind both hypertext and the Web was therefore in part an attempt to overcome perceived limitations in traditional approaches to managing information. These perceived limitations were identified with the imposition of formal classificatory structures on information. The perception was that these kinds of structures arbitrarily disassociated related information as a result of the application of a priori formal criteria. Hypertext and the Web, by exploiting loose associative relationships emerging from the texts of an information collection to create associative networks of information, were intended to overcome these perceived limitations. A question arises as to why networks based on associative relationships should be seen as a more effective approach to organising information. Like Bush (1945), Berners-Lee (1999) conceived of the associative network as a model of the way in which the brain processes information, and therefore a better reflection of the cognitive processes involved in information retrieval. Ellis (1992, p. 56) has observed that this perhaps betrays a naïve understanding of cognitive processes. However, the role of network structures in the understanding of cognitive processes has become an increasingly important part of cognitive psychology (Gardner, 1987; Pinker, 1998). In particular, McCulloch and Pitts’ theory of neural interaction posits that the cognitive function resembles a network with weighted interactions (Gardner, 1987; Pinker, 1998; Waldrop, 1993). This has led to the mapping of cognitive processes through the use of graph theory (Pinker, 1998). This connection between weighted network structures and cognitive processes was highly influential on complexity theory, and the modelling of cognitive processes in information theory and neural networking (Waldrop, 1993; Lewin, 1993; Cilliers, 1998). Berners-Lee focuses attention on the relationships between the elements of an information system, rather than on the elements themselves. Berners-Lee wrote of his inspiration for the Web:

In an extreme view the world can be thought of as only connections, nothing else. [. . .] I liked the idea that a piece of information is only defined by what it’s related to, and how it’s related. There is really little else to meaning. The structure is everything (Berners-Lee, 1999, p. 14).
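The idea of a node defined entirely by its weighted links can be made concrete with a small sketch. The following fragment is purely illustrative and forms no part of Berners-Lee’s or of this paper’s argument: the node names, weights and threshold are invented, and the function is only a toy rendering of the McCulloch and Pitts style of unit mentioned above, in which a node becomes active when the weighted sum of activity arriving from its neighbours reaches a threshold.

    # Illustrative sketch only: a toy associative network in which a node's
    # behaviour is determined entirely by its weighted links (invented values).
    from typing import Dict

    network: Dict[str, Dict[str, float]] = {
        "memex": {"hypertext": 0.9, "indexing": 0.4},
        "hypertext": {"xanadu": 0.8, "web": 0.7},
        "web": {"hypertext": 0.7, "indexing": 0.3},
        "xanadu": {"hypertext": 0.8},
        "indexing": {"memex": 0.4},
    }

    def fires(node: str, active: Dict[str, float], threshold: float = 0.5) -> bool:
        """McCulloch-and-Pitts-style unit: the node fires when the weighted sum
        of activity flowing in from active neighbours reaches the threshold."""
        weighted_input = sum(network.get(source, {}).get(node, 0.0) * value
                             for source, value in active.items())
        return weighted_input >= threshold

    # One step of activity spreading from a single initially active node.
    active = {"memex": 1.0}
    print({node: fires(node, active) for node in network})

On this reading, what “memex” amounts to in the toy network is exhausted by the pattern of weighted links running to and from it, which is the intuition that Berners-Lee’s description expresses.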

Berners-Lee’s description is very similar to the description of information processing arising out of complexity theory and neural networking. The theory of meaning implied here is not a classical mimetic model in which tokens stand in direct correspondence to the things for which they stand in place, but that of semiosis where meaning permeates, or emerges from the interactions implicit within the symbolic system itself (see Cilliers, 1998). Although Berners-Lee does not explicitly make the connection with complexity theory, the direction of his thinking highlighted in the quotation above does suggest the influence of complexity theory, widely popularised within the computing community of the time (see Tredinnick, 2006). Cilliers (1998, 2005) has argued persuasively for the parallel between complexity theory and post-structuralist thinking. He has noted that both contain a rejection of post-enlightenment thought focussed on mimetic theories of representation and meaning. In their place, both erect theories of meaning posited on distributed

representation that have parallels with cognitive models developed in cognitive psychology. This concentration on the distributed nature of representation exploits the power of network structures to define relationships between disparate nodes. Two questions arise: whether the perceived limitations of formal information organisation structures identified by Nelson and Berners-Lee do in fact limit the use of information in the way they imply, and, if they do, whether hypertext and the Web are able to overcome these limitations. It is in answer to these two questions that post-structuralism is able to clarify the rather nebulous set of ideas that contributed to the development of the Web. The critique offered by the structuralist, and later post-structuralist, theorists in many ways prefigures much of the current debate surrounding the influence of the Web on representation, and the growing dominance of network structures in information organisation and retrieval. There follows a brief overview of that critique, and the initial stages of a mapping of these issues onto the theory of the Web and hypertext.

Structuralism and post-structuralism
Post-structuralism originated in a reaction against the positivist outlook within the semiotic and structuralist critical tradition. Broadly speaking, semiotics is the study of signs and sign systems, and structuralism the application of semiotics to the study of cultural artefacts and cultural practices. Semiotics and structuralism largely derived from the linguistic analysis of Ferdinand de Saussure and the logic of Charles Sanders Peirce. Peirce identified a semiotic field of study, borrowing the term from Locke (Peirce, 1995, 1992; see Chandler, 2002, p. 6). By contrast, de Saussure (1966, p. 16) anticipated a “science of signs” named semiology, of which linguistics would be just one aspect. The two terms are generally used interchangeably. Eco (1976, p. 7) has written that semiotics “concerns everything that can be taken as a sign”. Harland (1987, p. 2) has observed that semiotics and structuralism appropriate the scientific stance of objectivity and the scientific goal of truth, and Lotman (1990, pp. 17, 4) has argued that behind de Saussure is the culture of the nineteenth century with its faith in positivistic science and describes semiotics as a “scientific discipline”. However, Bell et al. (2004) have termed semiotics a “so-called science”, Rylance (1987, p. 112) “quasi-scientific”, Turner (1990, p. 17) “not entirely scientific” and Harland (1987, p. 63) “more of a scientific aspiration than a realised science”. The analysis of de Saussure and Peirce was widely influential on critical and cultural theory. Culler (1975, p. 4) has suggested that this influence was based on two fundamental insights by later theorists: that social and cultural phenomena are themselves objects or events with meaning, and therefore signs; and that their meanings are not essential but defined by a network of internal and external relationships. The adoption of semiotic theory within cultural disciplines is therefore a product of its ability to reveal the mechanisms through which we make sense of the world (Turner, 1990, p. 14). Semiotics has as a result been applied to the study of diverse cultural practices (see Barthes, 1966 in Barthes, 1977, pp. 79-124, 1972; Chandler, 2002; Culler, 1975). Within the English-speaking tradition, Harland (1987, p.
4) has identified a focus on concrete application within semiotic practice that is “largely indifferent to matters of philosophy” and that fails to expel “certain assumptions deriving from Anglo-Saxon empiricism”. Turner (1990, p. 22) notes that at the most elementary level, semiotics

supplies a terminology and conceptual frame for the analysis of symbolic systems and from this perspective the philosophy of structuralism becomes reduced to “method and technique” (Harland, 1987, p. 4). Within information science, semiotics has been recognised as a useful theoretical tool, although semiotic practice has been limited (Warner, 1990; Brier, 1996; Raber and Budd, 2003). Raber (2003, p. 225) has observed that “at first glance, de Saussure’s work should be a central pillar of information science, yet the discipline has not embraced the implications”. A theoretical framework for developing semiotics within the information disciplines has yet to emerge; however, it is worth highlighting that semiotics is the one area of critical and cultural theory that has received serious attention within information science, and this is perhaps because its positivistic claims aligned with the early scientific aspirations of information science. Although the techniques of semiotic analysis continue to be utilised in a range of fields, de Saussure and Peirce invited more questions than structuralism itself could answer. Pure structuralism flourished only briefly (Rylance, 1987). In its wake, the post-structuralist movement attempted to tease out some of the uncomfortable implications of semiotics that structuralism had tended to gloss over. The leading writers of this movement, Roland Barthes, Michel Foucault, Julia Kristeva, Jacques Derrida and Gérard Genette, came out of the same French intellectual scene which had fermented structuralism. Post-structuralism is then both a continuation and critique of the structuralist tradition that preceded it; the “post-” reflects both its temporal and philosophical status. Despite the concentration of post-structuralism on text and texts, the study of information has largely failed to exploit post-structuralist theory (Day, 2005). Roland Barthes was particularly important to this re-evaluation of semiotics after structuralism. By the late 1960s he had begun to turn away from his previous devotion to structuralist theory and explore some of the contradictions extant within the writing of de Saussure and Peirce. His essay From Work to Text (1971) represented a turning point not only in his own thought (Rylance, 1987, p. 112), but also in the structuralist movement. In it Barthes (1971) explored the transition of the literary object from a discrete artefact (the Work) to a node in a network of textual signification (the Text). He proposed that “a certain change has taken place in our conception of language and, consequently, of the literary work which owes at least its phenomenal existence to language”. That change was the break that de Saussure had made between signs and the things that they signify, or between words and their meaning. Barthes touched on a dichotomy between the physical manifestation of the literary work, and its signifying presence within the language system. The shift from Work to Text became for Barthes (1971) a symptom of an “unease in classification”: The work is a fragment of substance, occupying a part in the space of books (in a library, for example), the Text is a methodological field [. . .] It follows that the Text cannot stop (for example on a library shelf); its constitutive movement is that of cutting across (Barthes, 1971, p. 157).

Behind the literary text is the idea of the original individual creative act, an idea of originality that the diffuse nature of meaning in text destabilises. The work “functions as a general sign” – the work can be fitted into a discrete set of classification because its signification is discrete – it signifies the idea of itself. Thus in the Work is an assumption about the mimetic quality of language, the one-to-one correspondence between words and things, that was unsettled by de Saussure’s negative value of the sign. As a result of that negative value, the Text “practices the infinite deferment of the

signified” which Barthes calls “a playing” (Barthes, 1971, p. 158) in a similar sense that Derrida later exploited (see Derrida, 1978). The Work is seen as a singularity, but the metaphor of the Text is the network (Barthes, 1971, p. 160). While it was once understood that texts have determinate meanings that are essentially open to us, providing we pay enough attention, Barthes argues that the logical conclusion of semiotic theory is that the meaning of texts is always just out of view. There is here a parallel with the kind of thinking about information that influenced the origination of hypertext and the Web, outlined above. Barthes presents the work as bounded by the discrete signification imposed upon it, which severs its associative relationships with other works. This is a new way of looking at the function of text in culture which shifts the ontological status of text itself. Hypertext seeks to exploit the kinds of inter-relationship that Barthes suggests exist between texts, to allow individual texts to draw meaning from other texts through the use of associative connections. It encloses a tacit assumption that the full meaning of texts is drawn not simply from what they signify, but also from their place within the wider textual culture. Foucault, another key theorist of the post-structuralist movement, is best known for exploring the role of power within discourse. Discourse for Foucault is not just a way of speaking or writing, but the “whole ‘mental set’ and ideology which encloses the thinking of all members of society” (Barry, 2002, p. 176). In other words, discourse is a framework through which knowledge is transmitted and exploited, and, what is more, a framework regulated by power relations. Those power relationships are evident both between individuals, and more importantly between groups. Discourse therefore delimits not only what it is acceptable to say, but also what it is possible to say about given subjects at given times, and is therefore always the manifestation of power. This central relationship in Foucault’s work between power and knowledge is not, as it may appear, synonymous with Bacon’s (1994, p. 43) assertion that “human knowledge and human power come to the same thing”; Bacon believed that knowledge was essentially empowering, whereas Foucault argued that power regulates what comes to be constituted as knowledge (see Foucault, 1980). His archaeological method (see Foucault, 1972), to be distinguished from a traditional historical method that implied the narration of experiences from a position of power and privilege, was developed in a series of alternative histories: The Order of Things (Foucault, 1970), Madness and Civilization (Foucault, 1967), Discipline and Punish: the Birth of the Prison (Foucault, 1977) and The History of Sexuality (Foucault, 1979, 1984b, 1984c). These sought to undercut received historical understanding by highlighting the histories of the disempowered, or what he termed subjugated knowledge (Allen, 2000). Foucault has been influential in post-modernist theory (see Jenkins, 1991; Eagleton, 1996; Malpas, 2005), and certain strands of neo-Marxist critical theory (see Brannigan, 1998). However, Derrida (1978, pp. 36-76) criticised Foucault’s deployment of dominant discursive practices in analysing the status of the supposedly silenced or subjugated. Hayden White (1979, p. 81) has argued that Foucault rejected both logic and conventional analysis; however, Harland (1987, p.
101) observes that “he rejects the notion of truth as a correspondence of ideas to things”, or in other words the rejection of a mimetic model of representation. In The Order of Things Foucault (1970) argues that received structures of knowledge influence the way in which the world is understood. Elsewhere, Foucault (1980, pp. 131-2) located power in its application through control of discourse by essentially delimiting what it is possible to say. He denies the

concreteness of the symbolic referent and rejects the notion that there is a “reality” which precedes discourse (White, 1979, p. 85). Derrida is probably the most famous and the most controversial of the post-structuralist theorists. His death in 2004 brought a mixture of lament, acclaim, perplexity and hostility. The announcement of his death by the French President, Jacques Chirac, indicated the degree to which Derrida had been accepted into an intellectual and academic tradition of which he was always wary (Deutscher, 2005). However, an acerbic obituary in The New York Times, which accused Derrida of being an “abstruse theorist” (Kandell, 2004), demonstrates the antipathy that Derrida sometimes provoked. In the Times Higher, Simon Blackburn (2004) suggested that “Whether his intention was serious scholarship [. . .] his practice encouraged mainly mockery”. More balanced obituaries appeared elsewhere, including that appearing in The Guardian which noted not only his influence on diverse fields such as architecture, theology, art, political theory, pedagogy, film, law and history, but also the “widespread and sometimes bitter” resistance to his writing (Attridge and Baldwin, 2004). Derrida’s general methodological approach was to subject texts to rigorous, piercing scrutiny, to uncover their hidden ambiguities, ambivalences and self-contradictions. This might imply that Derrida applied nothing more than close reading; however, his analysis turned on the self-negating qualities of language, which seem at the same time to denote meaning, and deny it. For Derrida, there was no “transcendental signified” in which meaning could ultimately be vested; meaning in language is deferred through endless chains of signification; we never arrive at a final signified because words are only meaningful through their difference with one another, and therefore there is no stable anchor to which meaning can be chained. Writing for Derrida was not a process of the denotation of objective meaning, but a play of meaning and signification. Derrida therefore sought out the points where texts exhibited the tension implied by a lack of signification, the points where texts spiral into self-contradiction, or get caught by their inability to articulate the clear distinctions that they assume. Although Derrida did not systematise his ideas into a deconstructive method, his work was adopted and transformed into a method, particularly in the United States where the New Criticism movement was beginning to grow old. Out of this emerged confusion between the writings of Derrida and the writings of his followers, and a conflation of some of the tenets of New Criticism and Derrida’s own work. Derrida makes clear repeatedly that deconstruction is not to be considered a method to be applied to texts, but a quality of text itself. In a late roundtable discussion, he said: Deconstruction is not a method or tool that you can apply to something from the outside [. . .] Deconstruction is something which happens inside; there is a deconstruction at work in Plato’s texts, for instance (Derrida, 1997; cited by Deutscher, 2005, p. 6).

To read Derrida is therefore to grapple with two texts simultaneously: Derrida’s own, and the text that he critiques. This results in an elusive rhetorical style which while succeeding in positioning his own writing on the edge of the process of signification he critiques (see Derrida, 2002), also renders his writing subject to endless reinterpretation. The kind of critical approach Derrida adopts has led to accusations of unrestrained relativism. However, Cilliers (1998, p. 22) has written that Derrida could only be termed relativistic by the “ignorant”, Eagleton (2003, pp. 92-3) that Derrida seems “far too painstaking a reader” enthralled to an over-literal interpretation, and

Norris (2002, pp. 156-78), that Derrida’s painstakingly careful analysis requires that we bring a similarly careful analysis to his own texts. This concentration of post-structuralism on a set of themes including the transmission of meaning through language, the role of networks of signification, and the perpetuation of power has obvious parallels with the descriptions of hypertext and the World Wide Web explored above. The originators of hypertext and the Web share with post-structuralist theorists similar notions about the structure and workings of text, and, in particular, the network as the coordinating principle behind the transmission of meaning through texts. Post-structuralist theory challenges the assumption that organising structures can be imposed on information in a neutral and objective fashion. It mirrors the mistrust of discrete-set approaches to information organisation and retrieval that influenced the innovators of hypertext and the Web.

Classification and rationalism
A key motivation informing the development of hypertext and the Web was the belief that formal information organisation structures impeded the potential use of information as a result of the way in which they poorly reflect the cognitive processes at play in information retrieval. In The Order of Things, Foucault (1970, p. xvi) outlined his interest in the influence of conceptual schemas on naturalised classification. He cited a passage in Borges in which “a certain Chinese encyclopaedia” is said to categorise animals as: a) belonging to the Emperor, b) embalmed, c) tame, d) suckling pigs, e) sirens, f) fabulous, g) stray dogs, h) included in the present classification, i) frenzied, j) innumerable, k) drawn with a very fine camelhair brush, l) et cetera, m) having just broken the water pitcher, n) that from a very long way off look like flies.

Foucault (1970, p. xvi) identified that “the exotic charm of another system, is the limitation of our own”. The strangeness of the scheme reveals the limitation of our own conceptual frameworks. Foucault suggests that there are different legitimate ways of seeing the world that are reflected in the structural order we impose on phenomena, and that these differences depend on interpreting relationships between things in varying ways. Such classifications, therefore, are not derived from properties of things, but from ways of interpreting the relationships between them. Classification becomes here an extension of the play of semiotics. Foucault proceeded to suggest that our current understanding is limited by rationalism, which denies the legitimacy of certain kinds of associative relationships while asserting the greater legitimacy of others. He attempted to outline the analogical nature of pre-modern Western thought and contrast it with the inductive nature of post-enlightenment thought. This argument was influenced by Lévi-Strauss, who made a distinction between scientific and savage ways of thinking. Lévi-Strauss suggested that “scientific” modes of thought were fundamentally analytical, dividing nature into ever more precise categories. “Savage” modes of thought, on the other hand, were more holistic, seeking to understand nature in its entirety, and not just in its parts (Fiske, 1990, p. 115). It also mirrored an interest within medieval studies in the rhetorical mode of medieval writing, and in particular the extension of biblical exegesis into the practices of cultural production in other fields, typified by the groundbreaking work of Robertson (1962). Foucault, however, argued that the different structures of thought in the pre-modern and

post-enlightenment ages permeate the structures of language, and by extension the classifications that we apply to things. The dilemma of subjectivity is, of course, inescapable in any act of classification, and Foucault follows in a long tradition questioning the basis of our knowledge about the phenomenal world, adopting a wholly relativist position. But Foucault’s interest is less with the ontological status of aspects of the phenomenal world, and more with the function of the kinds of claims that are made about that status in the perpetuation of dominant modes of discourse and received interpretations. In other words, for Foucault it is not just that the rational basis of any form of classification can be questioned; rather his interest is in the function of the claims to authority that are manifest in the ways in which we legitimise certain ways of carving up phenomenal experience. The modern age relies more singularly on causal relationships in the formulation of phenomenal categories; the pre-modern age on analogical relationships. Both mark out the boundaries of an aspiration to power in the legitimisation of particular forms of knowledge, and the marginalisation of others. The information profession relies heavily on the kinds of classificatory frameworks that Foucault critiques. These kinds of structures foreclose the possible meanings of information and to a degree impose particular meanings and interpretations. Foucault highlights the function of particular kinds of classifications in perpetuating certain dominant forms of knowledge. A question arises whether the particular kinds of classifications the information profession exploits act to perpetuate dominant forms of knowledge. This issue of the influence of classification on the cultural process is not unacknowledged in the information profession. Wiegand (1998, pp. 183, 190), for example, in exploring the influences on the development of the Dewey Decimal Classification, has focused on the role of the curriculum at Amherst College in Dewey’s thinking, and notes that that curriculum represented “patriarchal White Western (and of course, Christian) civilization”. He has suggested that “it is probably [. . .] fair to say that for the past century the scheme itself has quietly – almost invisibly – occupied an influential position as one of the forces sustaining the discursive formations of a Eurocentric patriarchy”. This kind of influence is concealed by the apparent natural order of knowledge toward which the classification scheme strives, and the socio-cultural place of the library as a neutral gateway to knowledge. The theory of library classification in the age of Dewey made a claim to the objective status of the classification scheme as measured against the true order of knowledge (Hjørland, 2004). Dewey drew on Hegel and Bacon in providing the framework for his classification scheme, both of whom presented a teleological theory of knowledge (Graziano, 1959). However, the point is not that particular classification schemes impose particular perspectives that delimit knowledge, and act to legitimise certain privileged forms of knowledge. The point is that any act of formal classification cannot avoid doing this. The very act of classification is to impose upon texts surrogate meanings that mediate the interpretations of texts. The information profession and the practice of organising books within libraries can from this perspective be seen as legitimising particular perspectives in culture.
It is precisely the limitations for the use of knowledge implied by these kinds of issues that, as we have seen, motivated the innovators of hypertext and the Web to pursue alternative ways of organising and retrieving information, although those issues were not fully theorised and remained only partially articulated

by Berners-Lee and Nelson. That is not to suggest that a concern with the perpetuation of power through dominant discourse was behind the Web, but rather recognition that particular ways of ordering information pose particular limitations which disrupt the full exploitation of information. In balance with these limitations of classifications is the pragmatic aim to best organise information and knowledge for the purposes of information retrieval. Foucault’s critique draws us towards a form of relativism in which all ways of organising information become equal, and concern with the utility of the classification scheme is marginalised. However, recognising the ways in which received knowledge can come to be inculcated into the structures of information is not necessarily the first step in rejecting all forms of information organisation, but a necessary first step in seeking ways of offsetting this influence while still aspiring to useful information organisation for the purposes of retrieval.

Classification and structures of knowledge
Subject classification acts to mediate the experience of the text. Particular texts are associated together, and certain aspects of the meanings of those texts valorised into a subject relationship; this by its very nature functions to foreclose the text and provide a final signification (if provisionally so). In other words, classification seeks to define the subject matter of particular texts, and in the process mediates the way in which those texts are subsequently encountered and read. But subject classification is just the most obvious way in which attempts to organise information by formal means impose limits on the uses that can be made of information. The library and information profession makes use of a wide range of apparatus for mediating the experience of texts. The catalogue surrogate, for example, seeks to crystallise the essence of texts in sets of stable qualities. These qualities can themselves be seen as a way of stabilising meaning, or of reducing what Derrida (1978) termed the play of text. The particular qualities of texts that are valorised through bibliographic description centre on quasi-objective characteristics of text, such as author, subject matter, publisher, and so on, that all act to foreclose associative relationships between different texts. This unavoidably imposes certain conceptual frameworks that mediate understanding and interpretation. Bibliographic details, such as the title, presuppose an innate textual unity (Wolfreys, 1998) and act to direct interpretation (see Genette, 1997; Kristeva, 1980; Wolfreys, 1998). These same qualities of text intrinsic to bibliographic extraction were described by Genette as the paratext, consisting of peritextual and epitextual components, or the textual scaffolding alongside, and outside, the text. Genette noted how texts rarely come without textual scaffolding that directs interpretation. Such scaffolding might include prefaces, title pages, the cover blurb, publisher’s descriptions, and so on. According to Genette (1997), paratexts of this kind impose on the meaning of texts and are in part a mechanism for directing interpretation. They focus on particular qualities of the text, and act to define the meaning of the text, so that the experience of the text is mediated by a pre-apprehension of that meaning. The bibliographic description can also be thought of as a kind of paratext acting to direct interpretation.
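The point can be put in more concrete terms. What follows is a minimal Python sketch, not part of the original argument: the record fields, titles and class labels are invented for illustration, and no real cataloguing standard is implied. It simply shows how a surrogate fixes a small set of "stable" qualities and a single class mark, and how the same document acquires quite different shelf neighbours under a different classification scheme.

```python
# Illustrative only: invented records and class labels, not a real cataloguing standard.
from dataclasses import dataclass

@dataclass(frozen=True)
class Surrogate:
    """A bibliographic surrogate: a fixed set of 'stable' qualities standing in for the text."""
    author: str
    title: str
    publisher: str
    year: int
    class_mark: str  # a single subject class; other possible groupings are foreclosed

doc = Surrogate("Barthes, R.", "Image Music Text", "Fontana", 1977, "801.95")

# Two hypothetical classification schemes applied to the same small collection.
scheme_a = {"Image Music Text": "literary criticism",
            "Of Grammatology": "philosophy",
            "Weaving the Web": "computing"}
scheme_b = {"Image Music Text": "theories of text",
            "Of Grammatology": "theories of text",
            "Weaving the Web": "theories of text"}

def shelf_neighbours(scheme, title):
    """Documents that end up 'next to' a title, i.e. assigned the same class."""
    return sorted(t for t, c in scheme.items() if c == scheme[title] and t != title)

print(doc)
print("Scheme A neighbours:", shelf_neighbours(scheme_a, "Image Music Text"))  # []
print("Scheme B neighbours:", shelf_neighbours(scheme_b, "Image Music Text"))  # two titles
```

The sketch is only a toy, but it makes the mediating role of the surrogate visible: whichever scheme is chosen, the record itself, not the text, determines which associations a reader encounters first.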
The particular qualities isolated in the bibliographic description may appear to have no discernable impact on the apprehension of the meaning of the text they describe. However, these qualities were isolated as a result of the development of print as the

dominant form of textual representation, and reflect a particular assertion about the ontological status of texts. This is relatively clear in many cases; concepts such as publisher, place of publication, and date of publication have little relevance to the age prior to print, when textual reproduction occurred through ad hoc formal and informal provision, with hand-to-hand distribution and personal copying playing a very significant role in the dissemination of texts (see Clanchy, 1993; Justice, 1996). Steinberg (1974) has traced the development of the title page out of the codex in early printed works, and the growing importance of bibliographic information as a result of the commercialisation of literary production. It is less obvious, although just as true, that other aspects of bibliographic description, such as the titles of works, and the compositions of works, also resulted from the stabilising effects of print. The stabilising effect of print on titles has acted to strengthen the notion of the discrete signification of the textual work. Titles can be seen as a means of branding books, and of claiming of them a particular signification (see Genette, 1997). This process extends as far as the concept of authorship. The modern association of authorship with an original creative act largely developed after the introduction of print. The medieval use of the term author tends to refer to the authority behind the text, and the relationship between authority and authorship was much more explicit. That does not mean that texts did not have writers, as such, but that the process of composition was not valorised in an individual creative act (Burrow, 1982). The skill more prized in composition was not originality, but the skilful reuse of authoritative sources of one kind or another. Thus, much medieval literature is composite in nature, and the source of texts was not as clearly associated with an original creative act as in the modern age. Scribes had an important role not just in copying texts, but also in editing, compiling, interpreting, glossing, censoring, translating and ultimately transforming the meaning of texts in a wide variety of ways (Burrow, 1982). The concepts of intellectual property and plagiarism that developed as a way of securing the economic exploitation of texts were completely unrecognised; once “published” the potential for asserting continuing control over the structure or context of texts was extremely limited. The function of the modern concept of the author has drawn specific comment from many of the key figures in the post-structuralist movement. Foucault (1984a) points out that the modern concept of the author is a post-enlightenment creation. Genette (1997) discusses the peritextual function of the author. Barthes (1968, in Barthes, 1977, pp. 142-8) argues that to give a text an author is to impose a limit on it, and attempt to provide it with a final signified value. Indeed, Foucault argues that the author really identifies an editorial function; that it works to enclose texts, to enclose relationships between texts, and to legitimise texts within discourse. The author is not intrinsic to the text, but imposed upon it in order to regularise discourse and knowledge. He argues that some discourses are endowed with an “authorial function” while others are deprived of it, and that this distinction, having changed over time, maps the application of power within discourse.
The thrust of the post-structuralist critique of the author is that the identification of authorship itself acts as a means of asserting authority and control over text. Two points are worth recognising: firstly, that the elements of bibliographic description can be understood to function to direct the interpretation of particular works by highlighting particular associations with other works, and by isolating particular qualities of significance, such as author, title, publisher and date of

publication. Secondly, bibliographic description, by asserting a form of essentialism which focuses on the essential qualities of a work in which all copies partake to varying degrees, also asserts a particular understanding of the status of text. These kinds of structures can all be seen as mechanisms for closing down the play of the text, and means of isolating preferred interpretations. The bibliographic description and subject classification furnish a text with a final signified. This attempt to impose a final signification through the identification and recording of formal bibliographic description downplays the degree to which the meaning of texts and the relationships between texts are constantly under re-interpretation. Kristeva (1980) described the text as a site of constant cultural production, meaning that the impression given that the meaning of texts is stable, and their place within culture settled, is an illusion of the mechanisms which within print culture seek to stabilise the apprehension of texts. What Barthes (1971) called the “constitutive movement [. . .] of cutting across” is bounded by the mediation of bibliographic description, and the printing conventions from which the categories of bibliographic description arise. Text itself, which tended to be a constantly mutable medium in the age prior to print, has become a medium which allows the crystallisation of authoritative versions of individual texts. It is worth noting in the light of this the difficulty in achieving bibliographic control over the Web. Torok (2003, pp. 203-4) has written in this context that “the dream of universal bibliographic control seems quite remote” in a Web that “consists of a vast unchecked sea” containing a “proliferation of document formats” that amount to “a large body of knowledge needing organising”. However, Torok also notes that the Web is largely self-organising; and this indeed is precisely the point. The lack of bibliographic control is in essence a lack of centralised control; it reflects a shift in the control of text and information away from the major loci of the publishing age: publishers, governments, universities and other similar organisations. Organising information on the Web has become less a matter of defining what it is and what it is about, and more a matter of defining its relationship with other bits of information regardless of these formal qualities. The Web represents a turning away from final signification of textual works.

The intertextual
But that does not mean that the “cutting across” which Barthes describes ceases entirely in the printed text, but rather that our apprehension of it is suppressed. Derrida’s (1976) aphorism that “there is nothing outside the text” is frequently interpreted, in the light of New Criticism, as meaning that considerations external to the text, such as authorial intention, are to be discounted. However, Derrida (1988) later clarified that there is nothing outside of context: that there is no such thing as unmediated experience as founded on a metaphysical subject/object binary dualism. Such boundaries and thresholds as are imposed on texts are artificial attempts to delimit meaning. Foucault (1972, p. 25) similarly writes that “the frontiers of a book are never clear cut”. For Derrida (1978, pp. 351-78), “meaning is endlessly deferred”; he denies the possibility of a “transcendental signified” (Derrida, 2002, pp. 15-31) in which meaning can ultimately be vested.
Thus, for Derrida, interpretation becomes about the play of language, rather than pinning down meaning. Barthes (1968, p. 147, in Barthes, 1977, pp. 142-8) makes a similar point that “Writing ceaselessly posits meaning ceaselessly to evaporate it, carrying out a systematic exemption of meaning”. Barthes

(1971) noted that the Work as a material object can be treated as a sign with an iconic relationship to the semantic meaning of the text of which it is comprised. From a post-structuralist perspective texts find meaning in their relationship with and difference from other texts, and the meaning of any text is diffused across this network of inter-related texts. Thus Foucault (1972, p. 25) argues that the book “is caught up in a system of references to other books, other texts, other sentences: it is a node within a network”. From this perspective, traditional information management practices such as classification and cataloguing become attempts to limit this native intertextuality, to impose what Derrida termed a transcendental signified upon the textual work, and to delimit and stabilise the meaning of the text. This play of language which post-structuralism emphasises, where determinant meaning is never invested in individual texts but dispersed between texts, has become known as intertextuality, which Culler (1981, p. 114) identifies as the designation of a text’s “participation in the discursive space of culture”. The intertextuality of text derives in part from the associations that texts form between themselves by being related in some way, and in part from associations that link texts due to the context in which those texts are discovered. Contexts may include paratextual scaffolding (Genette, 1997), but also the place of texts within collections by which means they become associated. Beghtol (1986, pp. 94-5), for example, describes “the intertextuality that obtains between the primary texts of documents that, by virtue of having been assigned to the same class in the same classification system, are intertextually linked” and notes that “a single document may be found to partake of different intertextual relationships when it is classified by different classification systems”. As meaning is diffused across intertextual and connotative relationships, the precise value of those intertextual connotations is dependent on the contextual arrangement or retrieval of texts (see Kristeva, 1980). How texts are read depends partly on the contexts in which they are encountered, or by where, in Barthes’ (1971) sense, their works do stop on the shelf. Information management practices of classification and bibliographic description tend to constrain intertextual relationships, and constrain the contexts in which texts are encountered. By doing so, they perhaps impinge on the meaning of those texts. Under these conditions, such acts become effective interpretations in their own right, interpretations imposed on information by the information management profession. As Foucault’s analysis suggests, it is very difficult to see how, if these practices fail to acknowledge the processes at work, they can avoid becoming mechanisms for the perpetuation of dominant ideas. This description of intertextuality is also, of course, a description of hypertext; Berners-Lee (1999) made this explicit in his discussion of the epistemological basis of the Web, and Nelson described a docuverse in which “any user should be able to follow origins and links of material across boundaries of documents, servers, networks, and individual implementations” (cited by Naughton, 1999, p. 233). The action of hypertext and the World Wide Web as described by Nelson and Berners-Lee is precisely about breaking down the boundaries of the text, and making connections between texts that would not otherwise be possible.
Hypertext and the Web therefore enclose a tacit rejection of the print tradition, and a promotion of the “cutting across” described by Barthes.
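A toy sketch may help to make the contrast concrete. The following Python fragment is not part of the original article, and the document names and links are invented; it simply shows what "following links across the boundaries of documents" amounts to in practice: a breadth-first traversal of an associative link graph, which relates documents regardless of how a classification scheme might have shelved them apart.

```python
# Illustrative only: a toy 'docuverse' with invented documents and links.
from collections import deque

links = {
    "work-to-text.html": ["order-of-things.html", "paratexts.html"],
    "order-of-things.html": ["madness-and-civilization.html"],
    "paratexts.html": ["work-to-text.html"],
    "madness-and-civilization.html": [],
    "as-we-may-think.html": ["work-to-text.html"],
}

def reachable(start, max_hops):
    """Follow embedded links outward from one document (breadth-first)."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        doc, hops = frontier.popleft()
        if hops == max_hops:
            continue
        for target in links.get(doc, []):
            if target not in seen:
                seen.add(target)
                frontier.append((target, hops + 1))
    return seen - {start}

# Everything within two link-traversals of one essay, irrespective of any
# class mark or shelf position assigned to these documents.
print(sorted(reachable("work-to-text.html", 2)))
```

Nothing in the traversal consults a subject class or a bibliographic record; relatedness is carried entirely by the links themselves, which is the sense in which hypertext promotes the "cutting across" described above.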

Post-structuralism and the World Wide Web
The question that remains to be addressed is whether digital information in the model of hypertext and the Web represents a new paradigm for understanding and organising information, or a continuation by other means of an existing paradigm. It is clear that there are many parallels between the aspirations informing the creation of hypertext and the Web, and the critique of rationalism offered by post-structuralism. It is not, however, clear that these aspirations have been realised in the model of hypertext and the Web that resulted from the work of Nelson and Berners-Lee, among others. Although it is probably too early in the development of the Web to resolve this question, we can at least note some pointers that help to clarify this relationship. Hypertext is conceptualised as non-linear text with inter-textual linking through embedded cross-references. Neither of these elements is intrinsically novel (see McKnight et al., 1991, pp. 15-38). The use of in-text cross-referencing predates hypertext. Medieval manuscripts were frequently glossed with commentaries and references to other texts (see Clanchy, 1993). In the printing age, referencing and footnoting have become a normalised method of highlighting the intertextual co-dependence of texts. By highlighting non-linearity as a distinctive feature of hypertext, we tacitly imply that printed texts are understood to be rigidly sequential. This is questionable. Non-linearity has been a feature of certain texts throughout the literary age. Examples include dictionaries, bestiaries and encyclopaedias. More than one commentator has observed the way in which these forms of literature tend to defer meaning by referencing other parts of themselves (e.g. Barthes, 1968, in Barthes, 1977, pp. 142-8). However, even “linear” texts are open to being read non-sequentially. Textual apparatus such as indices and contents pages point out this possibility. Linearity or lack of it largely proves to be a false criterion of distinction. Hypertext differs, then, not in these characteristics, but in automating the retrieval of conceptually related texts, through what McKnight et al. (1991, p. 3) describe as “machine supported” links. Through this automation, the irreducible potential to combine and re-combine texts within new contexts is promoted. This challenges the stability and fixedness of the meaning of individual texts because it privileges intertextual relationships that impose on that meaning. The balance between disparate texts is upset, and the prominence of the other or external promoted. Text becomes re-conceptualised as a network of deferred meaning. It is in this incorporation of intertextual values, values that allow individual texts to become the locus of meaning drawn from disparate nodes, to become the site of the constant production of meanings, and to defer the attribution of absolute meaning, that any wider paradigm shift is likely to be centred. However, it cannot go without comment that the very structures that Berners-Lee was seeking to avoid have tended to proliferate on the Web. Berners-Lee saw hierarchical sets as problematic, largely because of the possibility of related information being disassociated by virtue of the structure of the classification used. However, hierarchical structures have tended to proliferate on the Web.
This is offset by the widespread use of hyperlinks to cut across these kinds of structures, but websites predominantly reflect not a network structure but a semi-networked hierarchical tree. It should also be noted that the first decade of web development largely followed a publishing model of information dissemination, in which control over information was asserted through the centralisation of information creation. In this light, hypertext and the Web cannot be seen to represent a new paradigm, but merely a new medium.
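The idea of a "semi-networked hierarchical tree" can also be sketched in a few lines of Python. The example is not drawn from the article and the site structure is invented; it illustrates only the general point that navigation along a strict tree keeps related pages several clicks apart, while a single embedded cross-link collapses that distance.

```python
# Illustrative only: an invented site map, not a description of any real website.
from collections import deque

def add_edge(graph, a, b):
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def clicks(graph, start, goal):
    """Shortest number of link-follows between two pages (breadth-first search)."""
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        page, dist = frontier.popleft()
        if page == goal:
            return dist
        for nxt in graph.get(page, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None

# A strictly hierarchical site: home -> sections -> pages.
tree = {}
for parent, child in [("home", "teaching"), ("home", "research"),
                      ("teaching", "modules"), ("research", "publications")]:
    add_edge(tree, parent, child)

print("tree only:", clicks(tree, "modules", "publications"))        # 4 clicks via home

# The same site with one associative hyperlink embedded in a page.
add_edge(tree, "modules", "publications")
print("with cross-link:", clicks(tree, "modules", "publications"))  # 1 click
```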

This mixed picture can be seen as the product of two conflicting forces that may govern the development of the Web: the decentralising desire to cede to users control over texts, and the centralising desire to maintain with producers control over texts. To date, the latter tendency has been in the ascendant, with an increasing commodification of the Web from its hobbyist and newsgroup origins (see Naughton, 1999). However, this is not the whole story. In the first place, the characteristics of digital information itself perhaps pose a challenge to the way in which we regard information more generally. The ease of duplication of digital information that arises from its essential detachment from its material vehicle, for most purposes to which that information is put, encourages the rapid reuse, migration and transformation of information. This has made bibliographic control over digital information, and the Web in particular, intrinsically problematic. Bibliographic description became possible as a result of the stabilisation of text that arose out of printing. Steinberg has argued that: What was epoch-making in Gutenberg’s process was the possibility of editing, sub-editing and correcting a text which was (at least in theory) identical in every copy (Steinberg, 1974, p. 20).

Mechanical reproduction ensured that each copy of a single text would be for all intents and purposes identical. This allowed the qualities that all copies of a particular text shared to be abstracted and used as the basis of bibliographic description. The mutability of digital information makes this isolation of essential characteristics of texts more problematic. Text becomes more explicitly the site of constant cultural production, as it is used and reused in different contexts, and transformed in the process. To some degree, this has set the Web at odds with existing intellectual property legislation. That is to say not only that the means of protecting intellectual property becomes more problematic as the means of extracting and transforming texts becomes easier, but also that the identification of the original creative act that underpins intellectual property becomes more problematic as text is subject to constant transformation. If we accept that the contexts within which information is discovered impinges on the meaning of information, then the rapid propagation of different contextual uses of the same information makes its absolute meaning more difficult to pin down, and more subject to mutability. In the second case, the model of the Web that has dominated does not represent Berners-Lee’s original conception. Berners-Lee (1999) conceptualised the Web as a more collaborative environment, where the browser would also act as the web-editor, and users would be engaged in the use and transformation of the resources they used. The divorcing of browser and web-editor imposed on the Web a publishing model of information transmission. Such asynchronous communications tools as were developed, such as forums and discussion groups, generally relied on the gradual accretion of contributions, rather than the transformation of the original text. This has had the effect of divorcing the act of creation, and the subsequent use of information. The reader and writer remained in a very different relationship to the text. This failure to fully realise Berners-Lee’s aspirations would be beside the point, if it were not that in the recent development of what have become known as Web 2.0 technologies can be seen a return to some of those original aspirations. The Wiki, in which users engage collaboratively in the authorship of texts is a particular case in point. Wiki content is unstable, changing in relation to the use that is made of it. The authenticity of Wiki content cannot be secured against an original creative act which

stabilises an authoritative version of the text. The creative process at work on the Wiki is more akin to the transformation of manuscripts through dissemination by hand-to-hand transmission and self-copying. It would be a mistake to push this analogy too far and claim a return to a previous textual mode, because in more important ways the Wiki and digital information are unlike the uses made of text in manuscript culture. However, they do point out a collaborative mode in knowledge creation that is increasingly finding a place on the Web, a mode which underplays the contributions of individual minds and focuses on the cultural process. The Wiki is not the only example of this mode; open source software relies in a similar way on collaboration. It is unsurprising, then, to see at the same time different approaches to the protection of intellectual property which, in the same way that the Web undermines traditional information management processes, seek to undermine copyright itself. Copyleft, for example, developed out of the GNU General Public License, promotes the reuse of intellectual property, rather than its restriction. The boundaries between writing and reading are perhaps becoming less distinct in these media, such that to engage in reading becomes also a process of writing. In the light of this, the question of whether hypertext and the Web represent new paradigms is perhaps superficial in its irresolvability. What is perhaps of more importance with regard to information management practice is the critique of traditional professional practices that is implied by the hypertext model. This does not, of course, mean that traditional information management practices no longer have a place in the digital age. But it does perhaps imply that there is a need to better understand how different approaches to organising information can act to disempower users, as well as to empower them.

References
Allen, G. (2000), Intertextuality, Routledge, London and New York, NY.
Attridge, D. and Baldwin, T. (2004), “Obituary: Jacques Derrida”, The Guardian, 11 October, p. 21.
Bacon, F. (1994), Novum Organum, with Other Parts of the Great Instauration, translated and edited by Peter Urbach and John Gibson, Open Court, Chicago, IL and La Salle, IL.
Barry, P. (2002), Beginning Theory: An Introduction to Literary and Cultural Theory, Manchester University Press, Manchester and New York, NY.
Barthes, R. (1968), “The death of the author”, in Barthes, R. (1977), Image Music Text: Essays Selected and Translated by Stephen Heath, Fontana Press, London, pp. 142-8.
Barthes, R. (1971), “From work to text”, in Barthes, R. (1977), Image Music Text: Essays Selected and Translated by Stephen Heath, Fontana Press, London.
Barthes, R. (1972), Mythologies; Selected and Translated from the French by Annette Lavers, Jonathan Cape, London.
Beghtol, C. (1986), “Bibliographic classification theory and text linguistics: aboutness analysis, intertextuality and the cognitive act of classifying documents”, Journal of Documentation, Vol. 42 No. 2, pp. 84-113.
Bell, D., Loader, B., Pleace, N. and Schuler, D. (2004), Cyberculture: The Key Concepts, Routledge, London.
Berners-Lee, T. (1990), Information Management: A Proposal, available at: www.w3.org/History/1989/proposal.html (accessed 1 June 2006).

Berners-Lee, T. (1991), “World Wide Web: a summary”, email to: alt.hypertext, June 8, 1991, available at: http://groups.google.co.uk/groups?hl=en&lr=&frame=right&th=684c55870fb47ede&seekm=6487%40cernvax.cern.ch#link1 (accessed 28 April 2005).
Berners-Lee, T. (1999), Weaving the Web: The Past, Present and Future of the World Wide Web by its Creator, Orion Business Press, London.
Blackburn, S. (2004), “Derrida may deserve some credit for trying, but less for succeeding”, Times Higher Education Supplement, 12 November, p. 16.
Brannigan, J. (1998), New Historicism and Cultural Materialism, Macmillan Press, Basingstoke.
Brier, S. (1996), “Cybersemiotics: a new interdisciplinary development applied to the problems of knowledge organisation and document retrieval in information science”, Journal of Documentation, Vol. 52 No. 3, pp. 296-344.
Burrow, J.A. (1982), Medieval Writers and Their Work: Middle English Literature and its Background 1100-1500, Oxford University Press, Oxford.
Bush, V. (1945), “As we may think”, Atlantic Monthly, Vol. 176 No. 1, pp. 101-8.
Chandler, D. (2002), Semiotics: The Basics, Routledge, London.
Cilliers, P. (1998), Complexity and Postmodernism: Understanding Complex Systems, Routledge, London and New York, NY.
Cilliers, P. (2005), “Complexity, deconstruction and relativism”, Theory, Culture & Society, Vol. 22 No. 5, pp. 255-67.
Clanchy, M.T. (1993), From Memory to Written Record: England 1066-1307, Blackwell Publishing, Oxford.
Culler, J. (1975), Structuralist Poetics: Structuralism, Linguistics, and the Study of Literature, Cornell University Press, Ithaca, NY.
Culler, J. (1981), The Pursuit of Signs: Semiotics, Literature, Deconstruction, Routledge, London and New York, NY.
Day, R.E. (2005), “Poststructuralism and information studies”, Annual Review of Information Science and Technology, Vol. 39, pp. 575-609.
de Saussure, F. (1966), Course in General Linguistics, Bally, C. and Sechehaye, A. (Eds), in collaboration with Riedlinger, A., translated with an Introduction by Baskin, W., McGraw-Hill, New York, NY.
Derrida, J. (1976), Of Grammatology, Gayatri Chakravorty Spivak (Trans.), Johns Hopkins University Press, Baltimore, MD and London.
Derrida, J. (1978), Writing and Difference, Alan Bass (Trans.), Routledge & Kegan Paul, London and New York, NY.
Derrida, J. (1988), Limited Inc., Northwestern University Press, Evanston, IL.
Derrida, J. (1997), in Caputo, J.D. (Ed.), Deconstruction in a Nutshell, Fordham University Press, New York, NY.
Derrida, J. (2002), Positions, translated and annotated by Alan Bass, Continuum, London.
Deutscher, P. (2005), How to Read Derrida, Granta Books, London.
Eagleton, T. (1996), The Illusions of Post-Modernism, Blackwell Publishing, Oxford.
Eagleton, T. (2003), After Theory, Allen Lane, London.
Eco, U. (1976), A Theory of Semiotics, Indiana University Press, Bloomington, IN.
Ellis, D. (1992), “The physical and cognitive paradigms in information retrieval research”, Journal of Documentation, Vol. 48 No. 1, pp. 45-64.

Fiske, J. (1990), Introduction to Communications Studies, Routledge, London and New York, NY.
Foucault, M. (1967), Madness and Civilization: A History of Insanity in the Age of Reason, Howard, R. (Trans.), Tavistock Publications, London.
Foucault, M. (1970), The Order of Things: An Archaeology of the Human Sciences, Tavistock Publications, London.
Foucault, M. (1972), The Archaeology of Knowledge, Sheridan Smith, A.M. (Trans.), Tavistock Publications, London.
Foucault, M. (1977), Discipline and Punish: The Birth of the Prison, translated from the French by Sheridan, A., Allen Lane, London.
Foucault, M. (1979), The History of Sexuality, Vol. 1: An Introduction, translated from the French by Hurley, R., Allen Lane, London.
Foucault, M. (1980), in Gordon, C. (Ed.), Power/Knowledge: Selected Interviews and Other Writings 1972-1977, Longman, London.
Foucault, M. (1984a), “What is an author?”, in Rabinow, P. (Ed.), The Foucault Reader, an Introduction to Foucault’s Thought, Peregrine Books, Harmondsworth, pp. 51-75.
Foucault, M. (1984b), The History of Sexuality, Vol. 2: The Use of Pleasure, translated from the French by Hurley, R., Penguin, Harmondsworth.
Foucault, M. (1984c), The History of Sexuality, Vol. 3: The Care of the Self, translated from the French by Hurley, R., Penguin, Harmondsworth.
Gardner, H. (1987), The Mind’s New Science: A History of the Cognitive Revolution; with an Epilogue Written by the Author, HarperCollins, London.
Genette, G. (1997), Paratexts: Thresholds of Interpretation, Lewin, J.E. (Trans.), foreword by Macksey, R., Cambridge University Press, Cambridge.
Gillies, J. and Cailliau, R. (2000), How the Web Was Born, Oxford University Press, Oxford.
Graziano, E.E. (1959), “Hegel’s philosophy as basis for the Dewey Decimal Classification schedule”, Libri, Vol. 9 No. 1, pp. 45-52.
Harland, R. (1987), Superstructuralism: The Philosophy of Structuralism and Post-Structuralism, Methuen, London.
Hjørland, B. (2004), “Arguments for philosophical realism in library and information science”, Library Trends, Vol. 52 No. 3, pp. 487-506.
Jenkins, K. (1991), Re-Thinking History, Routledge, London.
Justice, S. (1996), Writing and Rebellion: England in 1381, University of California Press, London.
Kandell, J. (2004), “Jacques Derrida, abstruse theorist, dies in Paris at 74”, The New York Times, 10 October, p. 1.
Kristeva, J. (1980), Desire in Language: A Semiotic Approach to Literature and Art, Columbia University Press, New York, NY.
Landow, G.P. (1997), Hypertext 2.0: The Convergence of Contemporary Critical Theory and Technology, Johns Hopkins University Press, Baltimore, MD and London.
Lewin, R. (1993), Complexity: Life at the Edge of Chaos, J.M. Dent, London.
Lotman, Y.M. (1990), Universe of the Mind: A Semiotic Theory of Culture, introduction by Umberto Eco, I.B. Tauris & Co, New York, NY.
McKnight, C., Dillon, A. and Richardson, J. (1991), Hypertext in Context, Cambridge University Press, Cambridge.
Malpas, S. (2005), The Postmodern, Routledge, London and New York, NY.

Naughton, J. (1999), A Brief History of the Future: The Origins of the Internet, Weidenfeld & Nicolson, London.
Nelson, T.H. (2000), Xanalogical Structure, Needed Now More than Ever: Parallel Documents, Deep Links to Content, Deep Versioning and Deep Re-Use, available at: http://xanadu.com.au/ted/XUsurvey/xuDation.html (accessed 5 May 2005).
Norris, C. (2002), Deconstruction: Theory and Practice, 3rd ed., Routledge, London and New York, NY.
Peirce, C.S. (1992), in Houser, N. and Kloesel, C. (Eds), The Essential Peirce: Selected Philosophical Writings, Vol. 1, Indiana University Press, Bloomington and Indianapolis, IN.
Peirce, C.S. (1995), Philosophical Writings of Peirce, Selected and Edited with an Introduction by Justus Buchler, Dover, New York, NY.
Pinker, S. (1998), How the Mind Works, Allen Lane, London.
Raber, D. (2003), The Problem of Information: An Introduction to Information Science, The Scarecrow Press, Lanham, MD and Oxford.
Raber, D. and Budd, J.M. (2003), “Information as sign: semiotics and information science”, Journal of Documentation, Vol. 59 No. 5, pp. 507-22.
Robertson, D.W. (1962), A Preface to Chaucer: Studies in Medieval Perspectives, Oxford University Press, London.
Rylance, R. (Ed.) (1987), Debating Texts: A Reader in 20th Century Literary Theory and Practice, Open University Press, Milton Keynes.
Steinberg, S.H. (1974), Five Hundred Years of Printing, foreword by Warde, B., Penguin Books, Middlesex.
Torok, A.G. (2003), “Introduction to organizing the internet”, Library Trends, Vol. 52 No. 2, pp. 203-8.
Tredinnick, L. (2006), Digital Information Contexts: Theoretical Approaches to Understanding Digital Information, Chandos Publishing, Oxford.
Turner, G. (1990), British Cultural Studies: An Introduction, Routledge, London and New York, NY.
Waldrop, M.M. (1993), Complexity: The Emerging Science at the Edge of Order and Chaos, Viking, London.
Warner, J. (1990), “Semiotics, information science, documents and computers”, Journal of Documentation, Vol. 46 No. 1, pp. 16-32.
White, H. (1979), “Michel Foucault”, in Sturrock, J. (Ed.), Structuralism and Since: From Lévi-Strauss to Derrida, Oxford University Press, Oxford and New York, NY, pp. 81-115.
Wiegand, W.A. (1998), “The ‘Amherst Method’: the origins of the Dewey Decimal Classification System”, Libraries & Culture, Vol. 33 No. 2, pp. 175-94.
Wolfreys, J. (1998), Deconstruction: Derrida, Macmillan Press, London.
Woodhead, N. (1991), Hypertext and Hypermedia: Theory and Applications, Sigma Press, Wilmslow.
Corresponding author
Luke Tredinnick can be contacted at: [email protected]

To purchase reprints of this article please e-mail: [email protected] Or visit our web site for further details: www.emeraldinsight.com/reprints

The current issue and full text archive of this journal is available at www.emeraldinsight.com/0001-253X.htm

Learning by doing: lifelong learning through innovations projects at DASS

Shiraz Durrani
School of Information Management, London Metropolitan University, London, UK

Received 24 March 2006; revised 31 December 2006; accepted 5 January 2007

Abstract
Purpose – The purpose of this paper is to examine the changes in the way information management is taught at the Department of Applied Social Sciences (DASS) in the context of a fast-changing world situation. It looks at the way reflective learning is being incorporated into teaching and provides details of projects and modules which build reflective learning into teaching and learning programmes.
Design/methodology/approach – The paper examines the changing global situation to which a university education needs to respond. It then focuses on the information field and gives details of the way in which some new projects and modules are being developed to meet new challenges.
Findings – As this is ongoing work, a final analysis is not possible at this stage. External evaluation of the Quality Leaders Project (QLP) will provide further assessment of this approach, and responses from employers and students will further inform its direction.
Practical implications – The paper highlights the need for change in the teaching of information management. Changes in curricula and learning practices at universities, and direct intervention through pilot projects, can offer one solution. The experience gained has the potential to develop a new teaching model with lifelong learning at its core.
Originality/value – This paper brings ideas and practices from teaching, learning and management to the information sector. It will be of interest to a number of professions – teaching, management, lifelong learning and information – as well as to political activists and organisations whose learning needs are largely ignored in mainstream education systems.
Keywords Innovation, Learning, Information management, Teaching
Paper type Case study

Introduction
Learning and teaching programmes, such as the Information Management ones housed at the Department of Applied Social Sciences (DASS) at the London Metropolitan University, have a particular responsibility in developing an appropriate learning culture among students even as they develop their knowledge, skills, and awareness. This is given greater seriousness by three linked challenges facing British society and the workplace today – at a socio-economic, pedagogical, and workforce development level. The first section looks at these challenges, whilst the subsequent sections examine two projects which can be seen as a way of addressing them – the Quality Leaders Project (www.seapn.org.uk/qlp.html) and the Progressive African Library and Information Activists' Group (PALIAct) (www.seapn.org.uk/PALIAct-new.html).

Aslib Proceedings: New Information Perspectives, Vol. 59 No. 2, 2007, pp. 187-200. © Emerald Group Publishing Limited, 0001-253X. DOI 10.1108/00012530710736681


The learning and teaching environment
Globalisation and the information society
The first challenge facing British society, and its teaching institutions, is the process of globalisation influenced by, and in turn influencing, information technology. While it is not intended to explore this in depth in this paper, it is necessary to be clear about what the process means and what it implies for institutions such as ours. Castells' (1997, p. 1) explanation of globalisation is still valid. He saw it as:
. . . [a] technological revolution, centred around information (which) has transformed the way we think, we produce, we consume, we trade, we manage, we communicate, we live, we die, we make war, and we make love: a dynamic global economy has been constituted around the planet, linking up valuable people and activities from all over the world, while switching off from the networks of the power and wealth, people and territories dubbed as irrelevant from the perspectives of dominant interests.

Sivanandan (1999) explored the wider impact of these changes: The technological revolution of the past three decades has resulted in a qualitative leap in the productive forces to the point where capital is no longer dependent on labour in the same way as before, to the same extent as before, in the same quantities as before and in the same place as before. Its assembly lines are global, its plant is movable, its workforce is flexible. It can produce ad hoc, just-in-time, and custom-build mass production, without stockpiling or wastage, laying off labour as and when it pleases. And, instead of importing cheap labour, it can move to the labour pools of the Third World, where labour is captive and plentiful and move from one labour pool to another, extracting maximum surplus value from each, abandoning each when done.

Kundnani (1999, pp. 49-50) sees “the economic paradigms of the industrial age in the process of being replaced by new paradigms of the globalised, information age”. He notes: Developments in information technology since 1970s have made possible new forms of economic organisation in both manufacturing and also in media industries, which have undergone substantial changes in the last twenty years.

The IBM Community Development Foundation (1997) report defines the term “Information Society” in terms of its economic contribution: . . .the creation, distribution, and manipulation of information has become the most significant economic and cultural activity. An Information Society may be contrasted with societies in which the economic underpinning is primarily industrial or agricultural. The tools of the Information Society are computers and telecommunications. . . . [Information Society is] characterised by a high level of information intensity in the everyday life of most citizens, in most organisations and workplaces; by the use of common or compatible technology for a wide range of personal, social, educational and business activities, and by the ability to transmit, receive and exchange digital data rapidly between places irrespective of distance.

These developments in globalisation have a profound impact on the form and content of teaching and learning in all disciplines. The impact is greatest in the information sector, however, because the developments affect the very core of the teaching programmes and the skills students need to survive in an increasingly globalised world. It is the course content and the very process of learning, the how and the what, that need to be constantly re-examined and made more relevant to today's reality.

The constant examination and revision of the learning process is particularly necessary in the natural and social sciences, as it is here that the more rapid changes take place in the context of globalisation. The laws of nature do not change as rapidly and are perhaps easier to quantify and teach. In contrast, social rules are more dynamic and need constant examination and codification. It is in this dynamic context that the new approach to learning explored in DASS needs to be understood.


Reflective learning
Reflective learning has now come to be recognised as an important tool for lifelong learning in the teaching and learning field. Bourner (2003, pp. 267, 272) sets reflective learning within the context of lifelong learning, which is then seen in its dual aspects: planned and unplanned learning. "Much learning across the lifespan is unplanned, experiential and emergent", he says, explaining that it is "reflection which turns experience into learning". The key idea is that "developing students' capacity for reflective learning is part of developing their capacity to learn how to learn". Bourner concludes by looking at the need in universities to prepare students for lifelong learning "that will comprise reflective learning as well as planned learning". This has particular application for public libraries which, according to Leadbeater (2003), "are in serious trouble". Leadbeater maintains that "public service renewal requires strong political leadership. . . Libraries lack such leadership." One way of injecting leadership is wider use of reflective learning. The Museums, Libraries and Archives Council (MLA) recognises that "lack of a learning culture is the single most important barrier to developing the workforce" (Museums, Libraries and Archives Council, 2006) and expands on this:
Learning is a process of active engagement with experience. It is what people do when they want to make sense of the world. It may involve the development or deepening of skills, knowledge, understanding, awareness, values, ideas and feelings, or an increase in the capacity to reflect. Effective learning leads to change, development and the desire to learn more. The unique resources of museums, libraries and archives offer a range of learning opportunities and support for everyone to engage in learning activity.

Thus the second challenge for information management courses is to ensure that they inculcate reflective learning among students as part of a lifelong learning process. However, this is not always possible within the formal learning and teaching environment at universities: time constraints and the need to cover a large set of learning outcomes often leave little room for effective reflective learning to take place. It is for this reason that the innovations projects at DASS can provide an alternative route to reflective learning.
Leadership, innovation and workforce development
The final challenge is to ensure that personal and organisational development takes place on an on-going basis in keeping with an ever-changing social reality. This can be ensured through developing leadership skills in students who will be the future leaders in the information profession. The need to innovate constantly requires appropriate skills among students, and this also needs to be part of the learning and teaching process at universities.


This is linked with the need for "workforce development". Thus, the MLA's Workforce Development Strategy has four overarching strategic objectives, as explained by the Museums, Libraries and Archives Council (2004):
(1) a workforce fit for purpose: with the challenge of diversifying the workforce composition identified as the key priority;
(2) enhancing leadership and workforce skills: addressing skills, knowledge, attitudes and behaviour throughout the workforce; enabling museums, libraries and archives to adapt and respond to new and emerging modernizing agendas;
(3) empowering learning and change: with the biggest barrier to change identified as the lack of a learning culture in the sector and the organizations; and
(4) research for action: with the challenge being the lack of robust, usable workforce data and in-depth research to enable museums, libraries and archives to make a strong case for investment to key funding bodies.
If such strategies are to lead to meaningful action, information courses need to address the issues raised in the above objectives. In addition, innovation is one area that has not been adequately addressed in the information sector. As Mulgan and Albury (2003) say:
Innovation should be a core activity of the public sector: it helps public services to improve performance and increase public value; respond to the expectations of citizens and adapt to the needs of users; increase service efficiency and minimise costs.

Mulgan and Albury (2003) also provide a useful definition of innovation:
Successful innovation is the creation and implementation of new processes, products, services and methods of delivery which result in significant improvements in outcomes, efficiency, effectiveness or quality.

Yet it is not easy to teach such "innovation" within the context of a modular learning environment. It is in the above context that the rest of this paper looks at two projects run by the information management team in DASS. This does not imply that the projects are the only way in which the challenges mentioned above are addressed. Other initiatives include the development of a short course on "Leadership for innovation, equality and change", for example, and the new Masters-level module entitled "Information for Development". Thus the projects below need to be seen as aspects of an on-going realignment of the content of various existing modules and the creation of new ones. Further discussions are also taking place within DASS to develop a department-wide modular programme which would offer all DASS modules on a pick-and-mix basis to meet varied learning needs among different organisations and individuals.
The Quality Leaders Project – Youth
The Quality Leaders Project – Youth (QLP-Y) is an externally funded project based in DASS, working closely with the University's Management Research Centre (MRC). QLP is a work-based learning and developmental programme which uses the approach of "management development through service development". It requires joint work between relevant local authority departments and their local communities. It provides skills, support, and experience to "quality leaders" and their teams in participating authorities.

It also develops innovative, relevant services for young people based on consultation. Each authority releases a Quality Leader (QL) for at least two days a week, as well as a diverse team working with the QL. In addition, the authority provides a mentor and a sponsor to support the QL. One of the roles of the mentor is establishing the learning needs of the Quality Leader. Actual learning, especially reflective learning, is then assessed by the QLP Steering Group when the mentors submit regular feedback forms, for which "mentors' guidance notes" have been provided.
The project started in 1999. Its feasibility study and pilot phase were funded by the Museums, Libraries and Archives Council (MLA). QLP was "highly commended" in the Organisational Change category of the Diversity Awards by the Chartered Institute of Library and Information Professionals (CILIP, 2003). In 2005, the Mayor of London's Commission on African and Asian Heritage also acknowledged the contribution of QLP (Mayor of London, 2005). QLP-Y produces QLP News, as well as an irregular newsletter, Youth Ideas & Action, which provide updates on the project as well as on developments in youth work and staff development. Copies of these, together with other material such as the feasibility and pilot phase reports, conference presentations, and journal articles, are available on the QLP website (www.seapn.org.uk/qlp.html).
QLP-Youth – the current strand
The current strand of QLP-Youth is in two phases. The first, six-month phase was funded by the National Youth Agency. During this period, participants from youth and library authorities were given project management skills and they devised service development proposals in consultation with young people. The second phase lasts for two years, during which the service development proposals will be put into practice. This phase is funded by the Paul Hamlyn Foundation and started with a Development Day on 13 October 2005 at DASS. Youth and library services from Barnet, Haringey, Lincolnshire and Portsmouth are participating in the programme.
The programme places as much emphasis on developing staff as on developing services, as one is seen as strengthening the other. Skills being developed include the delivery of "audience development" workshops. These seek to increase the reach of libraries and youth services to meet the needs of all young people, particularly refugees and asylum seekers and those who have not been reached before. Audience development is a more inclusive term than "reader development". It allows for connecting people to a learning and "reading experience" through non-print media, such as arts, cinema, music, drama and other cultural activities. It involves all the senses. Thus, the Quality Leaders develop their skills in consultation, in project management, and in organising new services, which involves partnership work. Emphasis is placed on self-learning, particularly through reflective learning. At the service delivery level, the programme aims to provide regular workshop sessions for young people in activities such as the ones listed below, the final choice being made by the young people themselves through an intensive consultation process:
• presentations from writers, poets, film makers, media and other professionals;
• production of youth magazines by young people themselves;
• websites designed and maintained by young people;
• music workshops, book and newsletter production sessions, radio broadcast workshops, film and video making modules;
• podcasting; and
• organising guest speakers (including young people themselves) from different fields so as to enable the young people to meet potential role models from diverse communities and different fields.

As part of the funding arrangements, each participating authority is provided with a small budget to run audience development workshops and to purchase basic equipment. Whereas a "normal" university teaching course would find it difficult to impart skills in such a diverse range of activities, doing so is an integral part of QLP-Y. In the longer term, there is scope for research to develop the "management development through service development" concept further, assess its effectiveness, and extend the scope of the project. Publishing a "QLP Manual" to capture all management and service development ideas and experiences is already being actively considered.
QLP, as a model, has the potential to develop its approach so that it can be offered to a larger number of authorities within Britain. There is also a possibility of extending the programme to other fields where there is felt to be a need for improvement in staff skills or services, for example skills in delivering health information. In the past it has been used for developing staff skills and services to Black and Ethnic Minority communities, while the current strand addresses youth services and skills and includes refugees and asylum seekers. CILIP, MLA and the London Mayor's Commission are among organisations which have expressed an interest in QLP. A group of librarians from Sweden came to London in 2005 to learn more about the QLP approach to developing services to marginalised and excluded social groups. Durrani and Bartlett (2004, p. 22) explain the rationale of the QLP approach:
The Public Library Service in Britain needs a major shift in its very mindset. It is sometimes assumed that the role of the public library never changes and remains the same as it has done for over 150 years. Whether the current service is relevant to meet current needs is an issue that requires urgent resolve. Questions such as: "What are libraries for?", "Who do they serve?", "What services are needed?", "What is the best way of providing these services?". . . are increasingly raised by policy makers and people on whose behalf services are provided. The questions have become even more urgent in the context of rapid changes in information and communications technologies and the tremendous rate of change in every aspect of our lives.

Percival (2005) looks at the service development aspect of QLP-Y: I believe that QLP-Youth will make a significant contribution to our library service, by increasing levels of participation by young people, empowering them by giving them the confidence to air their views, and most importantly ensure that they are taken seriously. For far too long, public libraries in Britain have been run from above, with only the views of an extant readership taken into account. QLP gives young people a real chance to affect change from below and I sincerely believe that a change in the way libraries deliver their services to young people and other socially excluded groups will help bring about the fundamental social changes in wider society that we all desire.

However, such service development cannot take place unless there is development in staff skills and learning at the same time – and that is what QLP is all about.

QLP-Y approaches staff and service development for young people within the context of the national policy framework, from both library and youth services aspects. In order to make this possible, a dedicated post of "Lecturer in Youth Policy" has been established in the Department of Applied Social Sciences. It is the responsibility of this post-holder to ensure that all relevant youth policies are reflected in all aspects of QLP-Y work. She also researches practices in library and youth services and prepares a "good practice" guide. Such policy and practical experiences are then circulated through the irregular QLP Youth Ideas and Issues and the various presentations at Development Days, and included in the draft "QLP Manual" as a way of embedding relevant ones in QLP-Y programmes.
QLP-Y is currently in the process of appointing external evaluators who will look at the real impact of this new approach to learning and skill development. The aspects examined will include the reflective learning achieved by QLs, as well as the overall impact of QLP-Y in changing services and cultures in local authorities. If funds allow, the evaluation will also include an analysis of learning achieved through QLP-Y as compared with other projects such as Mosside Powerhouse and Blackburn's Curve.
DASS, by supporting initiatives such as QLP, is clearly exploring alternative methods of providing a new learning environment which meets needs in a changing world. At the same time, the QLP approach combines the theories of learning and teaching and of service and project development with practice, thereby ensuring a longer-term, better embedded learning process which can be useful in other circumstances as well. Successful candidates also achieve a Diploma in Work-based Learning, which requires evidence of new skills and learning acquired through participating in the programme. One example of the QLP programme influencing other aspects of teaching is the module "Combating racism, managing equality". This was initially designed for the first phase of QLP-Y, but has since been incorporated in the MA module "Information and Social Exclusion". This places particular emphasis on reflective learning as it is built up over several months with relevant readings and discussions relating to reality in workplaces.
Progressive African Library and Information Activists' Group (PALIAct)
The Progressive African Library and Information Activists' Group (PALIAct) is a DASS initiative in partnership with a group of progressive African librarians and information workers. DASS has played a crucial role in bringing together a number of progressive information activists from Africa by providing a new vision based on the experiences of progressive librarians and political activists in Africa, Europe and the USA. It is part of a wider movement of progressive librarians' groups around the world, perhaps the most prominent being the Progressive Librarians Guild in the USA with its publication Progressive Librarian. A brief history of PALIAct is available in Durrani (2006a).
Just as QLP is an attempt to develop relevant skills and competencies in staff to deliver a relevant service to young people, PALIAct is a similar attempt at an international level. It has not developed to the same extent as QLP-Y, but has attracted a great deal of interest, as seen in press coverage, in a short period of time. At the same time, a pilot project is being established in Kenya and this has already set some ambitious targets.
It has developed new skills and provided valuable experiences to PALIAct staff. At the same time, the Kenya Centre, in partnership with the Network
Institute for Global Democratization (NIGD), based in Helsinki, and with the Kenya Library Association, has provided training to librarians in collecting and disseminating information from the World Social Forum, the next one taking place in Nairobi in January 2007.
PALIAct aims to provide a new vision to help create a people-orientated information service that could meet the information needs of workers and peasants. It works towards providing an anti-imperialist and Pan-African world outlook among African librarians and information workers. It also seeks to set up an alternative information service in partnership with the potential users of the service as a way of showing what needs to be done. PALIAct aims to form partnerships with progressive information and other workers within Africa and overseas. One of its aims is to mainstream this new approach to service and staff development. It thus does not compete with existing service providers, but seeks to inject the spirit of innovation and risk-taking into a field which remains rather conservative and isolated from the social and political forces around it.
PALIAct recognises that there is a public library service in African countries and it does not claim to be the only initiative that seeks to provide a relevant service. For example, the recently concluded XVII Standing Conference of Eastern, Central, and Southern African Library and Information Professionals (SCECSAL XVII), in Dar es Salaam, Tanzania (10-14 July 2006), attracted twenty-five papers from participants from a large number of countries[1]. At the same time, representatives from the library associations gave details about the innovative services they offer. For example, the Kenya Country Report (Gitachu, 2006) mentions some initiatives:
KLA has over the last ten (10) years been involved with projects aimed at promoting reading and literacy among the disadvantaged urban, peri-urban slums and rural grassroots communities in Kenya. This is in line with the principle of "Education for All" and the Millenium [sic] Development Goals, and the related global human rights covenants on access to information and education for all. KLA facilitated the reading promotion project in collaboration with the National Book Development Council of Kenya (NBDC-K) through the East African Book Development Association (EABDA), and the Ministry of Education. During this period, several Children Reading Tents (CRTs) were hosted in primary schools in identified needy rural districts/divisions in Kenya. The primary objective of the CRTs is to develop a desire for reading for children in their formative years for lifelong education, and to improve on their literacy and numeracy skills, thus enhancing their capacity for creative and life skills, and empowering them to cope with the different subjects for educational attainment.

PALIAct does not aim to replace or challenge the existing library associations and services. It is seeking active partnerships with them and with other stakeholders as the challenge of developing a relevant service requires close co-operation from all. In fact, PALIAct in Kenya worked in partnership with the Kenya Library Association in delivering a training programme for librarians from Kenya and Uganda to prepare them for the World Social Forum (WSF) taking place in Nairobi in January 2007. The purpose of the workshop was: . . .to prepare and train librarians for participation in the World Social Forum 20-25 January 2007, and in the WSF process, both as citizens and in their role as information specialists. This particular workshop should also be useful for trainers, who will themselves prepare and train more librarians for participation in the WSF (The WSF Goes to the Library, 2006).

This approach shows how PALIAct aims to develop an alternative model of information services to ensure development for working people. But such initiatives remain at the margin of the policies and practices of current information services. While many information services are now looking at ways of meeting changed information needs in a fast-changing world, they are incapable of bringing about major changes in the way information services are provided. Their inability stems from a complex set of reasons, ranging from historical factors to a lack of effective leadership skills among many current senior managers and decision makers. For various reasons, they are not active in the one area that is an essential requirement for any meaningful change to take place: becoming politically involved in developing an information service that is based on creating and delivering a vision of service built around meeting the needs of the majority of people. This may sometimes require the profession to challenge those in power, which is difficult in any situation, but particularly so in an African context. It is this political aspect that PALIAct seeks to incorporate in its work, and it is in this aspect, among others, that it hopes to provide an alternative model of information service.
Another important point which distinguishes PALIAct is that it seeks to develop a new, participative model that can be applied differently in different situations. It does not seek to impose a "universal model" in all countries or even in different parts of the same country. Local activists, working closely with communities, decide what services are relevant for meeting their local needs. In Kenya, for example, there are already two centres – one in Nairobi and one in Naivasha – each working out relevant services for their local communities. A key point to be noted about PALIAct is the term "activists", which distances it from the strong focus on "professionals" that is so important to current information services. Thus PALIAct works with activists from other fields, such as medicine and agriculture, among others. The Naivasha Centre in Kenya, for example, is planning to work with local workers, artisans, architects, and designers in its proposals for developing a community information, educational, and cultural centre. Further thoughts on the ideas that have influenced the PALIAct approach in developing an alternative, anti-imperialist information service are provided by Durrani (2006b).
PALIAct puts into practice several of the ideas on reflective learning developed in the first part of this paper. Those involved in PALIAct do not attend formal classes where they acquire new information, knowledge, or skills. Their learning is almost entirely through a reflective process, aided by workshops, discussions, readings, and increased awareness of new ideas and practices (for example, through PALIAct Ideas and Issues, now renamed Information Equality, Africa). Weekly workshops and discussion and learning meetings are an integral learning process for the Naivasha Centre in Kenya. The WSF workshop mentioned earlier is yet another example of such reflective learning; in this case, the evidence for the actual learning is the group's input into the World Social Forum meeting in Nairobi in January 2007. In all cases, the evidence that any learning takes place is in the actual delivery of new services as judged by the communities served.
Some early experiences from the Kenya PALIAct Centre
Plans to establish a community library at Maua Primary School in Naivasha. PALIAct activists in Kenya visited the Maua Primary School in February 2006. The community
in which the school stands is composed largely of immigrant farm labourers who have moved to Naivasha from around the country to work in the flower industry. It is the only school in the community that accommodates all children regardless of where their parents work. The community and PALIAct agreed to set up a community school that would cater for all children and their parents. It was felt that the library should cater for the entire community. In such a situation, the children would improve their academic performance even more, as the parents will also be informed and, one hopes, more receptive to new ideas. They discussed the long-term possibility of purchasing land and building a community educational and cultural centre with a library at its heart. In the meantime, setting up "reading tents" for children was considered as a possible activity that could start in the near future. Students of librarianship will be able to do their work experience in such initiatives and so become more aware of local conditions and of how to meet people's needs. The Kenya PALIAct Centre is also looking at the possibility of developing health information services in Nairobi's slums.
Caterpillar Book Box Project. A meeting was held in London between John Lake and Shiraz Durrani on 20 January at DASS to discuss co-operation between PALIAct and the IFLA Public Libraries Standing Committee (IFLA-PLS). Lake is the Secretary of Section 8 of the IFLA Public Libraries Standing Committee and Division III, Libraries Serving the General Public. The discussion covered a wide range of areas of co-operation, including the possibility of PALIAct Kenya Centres being involved in the Caterpillar Book Box Project. Lake (2006) provides further details about the project:
One of strategic plans for the next 2 years [for IFLA Public Libraries Section] is to assist with practical steps for librarians and information workers in Africa particularly in relation to the provision of HIV/Aids health information and to continue a pilot mobile library type project called the Caterpillar Book Box to provide books to rural communities in Africa. The term "Caterpillar Project" was coined from an existing project in Kenya. . .which was tested in the North (Kenya) and South (rural South Africa) areas of Africa. The Caterpillar Book Box is a folding case which is 1.8m high on castors for ease of movement and the shelves accommodate approximately 100 books fuelled by a crate depot of approximately 500 books to replenish the stock in circulation. . . The pilot scheme is located in Koekenaap which is a very poor farming area where 60% of the adults are illiterate and only 30% of nine year olds can read. They are too poor to travel the 20 miles to the nearest library. The adults are nomadic as they earn a living during the grape season which lasts only three months a year before they move in search of other work. The Caterpillar Book Box is the only access that this community has to books. The children have been very excited by the existence of the first Caterpillar Book Box which bears the IFLA logo.

When finalised, this co-operation will add a new dimension to the work of PALIAct. It will connect a major international information organisation with information and community activists in delivering a service with the active participation of local communities. It has the potential to set a new standard in delivering community-based information services, thereby providing an alternative model to suit African conditions. The real contribution that PALIAct can make to African

development can be better seen in the context of the brain drain facing Africa. The Association of University Teachers (2006) explains the seriousness of the situation: . . .roughly 30% of Africa’s university trained professionals live beyond the continent’s borders. Moreover, a recent estimate suggests that up to 50,000 Africans with PhDs are working outside the continent. All of this has occurred at a time in which demand for higher education in Africa is increasing.

It is in this context that PALIAct is attempting to support African professionals in their own countries, helping them to develop their skills to solve their own problems. It is hoped that this will not only help to fill the gap created by the loss of trained professionals and academics, but also discourage the persistence of the brain drain.
Putting ideas into action
It is too early to judge how successful the two projects, QLP and PALIAct, will be in developing the lifelong skills of participants. However, valuable lessons are being learnt about the process of learning as well as about issues around developing relevant information-related services. That DASS has encouraged such projects is an indication of its commitment to innovation in learning and teaching, particularly in the information sector.
The long-term significance of the two projects discussed above should not be seen only in the context of the specific aims of each project. A crucial aspect of such projects is how, and to what extent, they influence the "mainstream" learning and teaching programmes in information management. Some examples of how the projects have influenced the teaching programmes have been mentioned. The first phase of a more fundamental shift in the teaching programme in Information Management was implemented in May 2006, when a number of changes were made in some of the modules taught by the author; this involved changing the module titles (and scope) of some modules, and starting a new module entitled "Information for Development". It is noteworthy that the rationale for these changes included the need to incorporate reflective learning in a more formal way. London Metropolitan University (DASS, 2006a) explains this further:
There have been substantial social and technical developments since the Modules were introduced. These changes need to be reflected in the learning and teaching environment so as to meet the changing needs in the workplace as expressed in various policy initiatives at national and international levels. Key developments can be summed up as:
• Rapid globalisation and development of the Information Society;
• The need to ensure reflective learning among students;
• The need to develop effective leadership skills;
• The need to innovate in order to meet changing needs; and
• Workforce development to meet new challenges as an on-going process.
It thus falls on Universities to have a long term view and vision of the learning and teaching needs that a society faces in order to ensure national and personal well being. It is their responsibility to meet the needs of the society through developing appropriate and relevant learning and teaching programmes. The proposals for modification and changes in these modules are one such attempt.


One manifestation of the changes envisaged above is the short course on "Leadership for Innovation, Equality and Change". The learning outcome for this course indicates that reflective learning is an important aspect of it. The first part of the course deals with the theoretical and policy aspects of the topic; the second part then aims to put the theories learnt into practice through an active reflective learning process. London Metropolitan University, Department of Applied Social Sciences (DASS, 2006b) provides details of the course:
The Course will equip you with new ideas that you can take to your workplaces. It will help you challenge organisational habits that deaden. It will introduce you to new ideas for effective and creative leadership while enabling you to create your own solutions to your own challenges. It will help you to put your ideas into action. All this will take place in a friendly learning environment offering you opportunities for discussion, debate and reflection. The Course offers a place to think, a place to be creative, a place to plan. . . and act. Each participant will produce an "Ideas and Action" plan based on their own experiences and real-life situations. You will have an opportunity to present and get feedback on your concerns and plans. In addition, there will be opportunities to set up peer support groups and a network to carry on a dialogue with participants and Course Leader after the Course so that you will be able to discuss concrete issues and challenges that emerge in your workplace over the coming months. This will be an opportunity to get support for putting into practice ideas developed during the Course. A follow-up course can be set up if there is a need to review progress and challenges. The Course will thus help you to become more aware of alternative ideas. It will provide opportunities for reflective learning from reports and journal articles and your and other people's experiences. It will help you become confident leaders taking control of your organisations and workplaces.

The short course itself will be evaluated internally, both in terms of take-up and of the learning that it enables participants to achieve.

Conclusion
This paper set out to examine the learning and teaching programmes in information management at DASS. It looked at changing learning needs in the context of a fast-changing world situation, and examined a number of projects and modules which seek to address the need to incorporate reflective learning in university courses.
There remain a number of areas that require further research and implementation. Perhaps the key one is finding appropriate methods for assessing reflective learning in the projects as well as in modules and short courses. This aspect has been recognised by a number of authorities. For example, Kember et al. (2000) conclude that most courses in higher education lack methods for reflective learning. At the same time, the Quality Assurance Agency (QAA) mentions the following as course requirements:
• benchmarking statement on reflective skills (QAA, 2002);
• mechanisms to monitor achievement (QAA, 2002); and
• responsive to reflective skills (QAA, 2006).

Similarly, Dearing (1997) identifies the need for such learning as relevant to employability, lifelong learning and Masters-level requirements. The work described here is part of an on-going process of change in curriculum development, as well as of finding an appropriate method of enhancing the learning and teaching experience. The process includes further research and evaluation to see whether the approaches described in this article meet the needs of a changing learning and teaching environment.


Note
1. The Conference Proceedings are available from: www.tlatz.org/scecsal2006/Volume1.pdf (accessed 29 July 2006).

References
Association of University Teachers (2006), "Reversing Africa's brain drain", available at: www.aut.org.uk/media/pdf/s/8/ReversingAfricasBrainDrain.pdf (accessed 8 March 2006).
Bourner, T. (2003), "Assessing reflective learning", Education + Training, Vol. 45 No. 5, pp. 267-72.
Castells, M. (1997), The Power of Identity, The Information Age: Economy, Society and Culture, Vol. 2, Blackwell, Oxford.
CILIP (2003), CILIP Rewards Outstanding Achievement in Promoting Diversity and Challenging Inequality, CILIP, London, available at: www.cilip.org.uk/aboutcilip/newsandpressreleases/archive2003/news031121e.htm (accessed 30 July 2006).
Dearing, R. (1997), National Committee of Inquiry into Higher Education, available at: www.leeds.ac.uk/educol/ncihe/ (accessed 15 May 2006).
Durrani, S. (2006a), "Progressive librarianship in Africa, the PALIAct story", Focus on International Library and Information Work, Vol. 37 No. 1, pp. 4-8, available at: www.cilip.org.uk/specialinterestgroups/bysubject/international/publications/focus/currentissue.htm (accessed 29 December 2006).
Durrani, S. (2006b), "Politics of information and knowledge in Africa: the struggle for an information inclusive society in a globalised world", paper presented at the XVII Standing Conference of Eastern, Central, & Southern African Library & Information Professionals (SCECSAL XVII), Dar es Salaam, Tanzania, 10-14 July 2006, pp. 40-65, available at: www.tlatz.org/scecsal2006/Volume1.pdf (accessed 28 July 2006).
Durrani, S. and Bartlett, D. (2004), "Young people in control", Public Library Journal, No. 2, pp. 22-5.
Gitachu, R. (2006), Kenya Library Association (KLA): Kenya Country Report, 2004-2006, presented during the SCECSAL business meeting held on 14 July 2006 in Dar es Salaam, Tanzania, available at: www.tlatz.org/scecsal2006/COUNTRY%20REPORT.html (accessed 29 July 2006).
IBM Community Development Foundation (1997), The Net Result – Report of the National Working Party for Social Inclusion, available at: www.britishcouncil.org/ [email protected] (accessed 7 March 2006).
Kember, D., Leung, D.Y.P. and Jones, A. (2000), "Development of a questionnaire to measure the level of reflective thinking", Assessment & Evaluation in Higher Education, Vol. 25 No. 4, pp. 381-96.
Kundnani, A. (1999), "Where do you want to go today?", Race & Class, Vol. 40 Nos 2/3, pp. 49-71.
Lake, J. (2006), "The IFLA Caterpillar Book Box Project", PALIAct Ideas & Action, No. 1, pp. 2-3, available at: www.seapn.org.uk/documents/PALIACTIdeasandActionNo.1Jan06.pdf (accessed 30 July 2006).
Leadbeater, C. (2003), "Overdue: how to create a modern public library service", Laser Foundation Report, Demos, London, available at: www.demos.co.uk/publications/overdue (accessed 7 March 2006).
London Metropolitan University, Department of Applied Social Sciences (2006a), "Minor modification proforma", MA in Information Services Management, unpublished internal document, available from the author.
London Metropolitan University, Department of Applied Social Sciences (2006b), "Leadership for innovation, equality and change; a two day Ideas into Action short course", draft publicity, unpublished internal document, available from the author.
Mayor of London (2005), Delivering Shared Heritage: The London Mayor's Commission on African and Asian Heritage, Mayor of London, London, p. 35, the QLP section available at: www.seapn.org.uk/qlp_extract.doc (accessed 7 March 2006).
Mulgan, G. and Albury, D. (2003), Innovation in the Public Sector, Prime Minister's Strategy Unit, Cabinet Office, discussion paper, available at: www.strategy.gov.uk/downloads/files/pubinov2.pdf (accessed 7 March 2006).
Museums, Libraries and Archives Council (2004), Learning for Change: Workforce Development Strategy, London, available at: www.mla.gov.uk/resources/assets//W/wfd_learning_for_change_pdf_5661.pdf (accessed 15 October 2005).
Museums, Libraries and Archives Council, London (2006), ALM London Workforce Development Strategy: Creating Skills, Supporting Development and Championing Diversity, available at: www.mlalondon.org.uk/lmal/index.cfm?NavigationID=309 (accessed 30 December 2006).
Percival, D. (2005), "Libraries, young people and QLP-Y", forthcoming, bis, Bibliotek i Samhälle (BiS: Libraries in Society), available at: www.seapn.org.uk/dpercival_youthlibs0905.rtf (accessed 3 November 2005).
Quality Assurance Agency (QAA) (2002), Masters Awards in Business and Management, Subject Benchmark Statements, Quality Assurance Agency for Higher Education, available at: www.qaa.ac.uk/academicinfrastructure/benchmark/masters/mba.pdf (accessed 14 May 2006).
Quality Assurance Agency (QAA) (2006), Code of Practice for the Assurance of Academic Quality and Standards in Higher Education, Section 6: Assessment of Students, draft for consultation, Quality Assurance Agency for Higher Education, available at: www.qaa.ac.uk/academicinfrastructure/codeOfPractice/section6/draft/COP%20Section%206%20draft.pdf (accessed 14 May 2006).
Sivanandan, A. (1999), "Globalism and the Left", Race & Class, Vol. 40 Nos 2/3, pp. 5-19.
The WSF Goes to the Library (2006), available at: www.nigd.org/libraries/bamako-nairobi. The quoted text was written by Mikael Böök for the "Training the trainers, WSF workshop for librarians", Nairobi, 3-5 July, available at: www.nigd.org/libraries/bamako-nairobi/tot-workshop/ (accessed 29 July 2006).

Corresponding author
Shiraz Durrani can be contacted at: [email protected]