Are You Who You Say You Are?


This article was downloaded by: [ECU Libraries] on 10 October 2014, at 04:57. Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954; registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Journal of Access Services

Nancy Hunt-Coffey
University of California at Los Angeles, USA; Glendale Public Library; Glendale Community College, Glendale, CA, USA

Published online: 20 Oct 2008.

To cite this article: Nancy Hunt-Coffey (2002) "Are You Who You Say You Are?," Journal of Access Services, 1:1, 119-150, DOI: 10.1300/J204v01n01_06
Are You Who You Say You Are? Network Access Management in Community College Libraries

Nancy Hunt-Coffey

Nancy Hunt-Coffey is a PhD student at the University of California at Los Angeles, Executive Analyst/Automation Services Coordinator at the Glendale Public Library, and Technology Consultant at Glendale Community College, Glendale, CA.

ABSTRACT. A growing yet unresolved problem in access services today is controlling user access to resources in a networked environment. This paper describes the history of network authentication, explores the issues involved in selecting an authentication method, discusses the benefits and drawbacks of the various methods, identifies efforts by some integrated library services vendors and academic institutions to address this problem, and makes recommendations for selecting an authentication scheme for a college library. The discussions are based in part upon the results of a survey conducted with California community college libraries, various integrated library system vendors, and online resource companies. Technical terms are defined in a glossary. [Article copies available for a fee from The Haworth Document Delivery Service: 1-800-HAWORTH. © 2002 by The Haworth Press, Inc. All rights reserved.]

KEYWORDS. Networked information, Internet, authentication

BACKGROUND AND METHODOLOGY

Introduction

Regulating user access to networked resources is a rapidly evolving but unresolved access services problem for libraries today. In the recent past, libraries have had manageable methods of controlling access to certain resources in a networked environment, such as the library catalog or in-house resources.
For example, different logons provide different views of the catalog data, which allow different levels of interactivity for the public, librarians, circulation staff, and catalogers. However, with the advent of Internet-based online products that can exist anywhere in the world, these system-centric mechanisms no longer suffice. Libraries have addressed this problem by developing homegrown solutions or implementing software filters; however, no industry standards have been set.

Homegrown solutions may not take into account the many questions that should be considered when selecting a comprehensive access management system. What type of burden is placed on the user trying to access a networked resource? How difficult is it for staff to implement and maintain the chosen technology as the user base grows? How does the library gather statistics on resource usage while still protecting user privacy? How can the library track down a user who has hacked a network resource? What are the ethical implications for libraries that implement such technical barriers to information? It seems appropriate in this inaugural issue of the Journal of Access Services that the current state of access to networked resources should be considered.

Project Background

The experiences of the Glendale Community College Library provide a fairly typical example of how many libraries have addressed access issues. The library began providing access to its first online reference resources in August 1996. The library employs IP filtering to control on-campus access to these resources. Some can also be accessed via the Web by home users and distance learners. To access these resources, students must complete a paper or Web-based application form and submit it to a library staff member. Staff then verify that the student is currently registered and enter the student information into a secure name database.
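The IP filtering just mentioned is, at its simplest, a range check on the requesting client's address. A minimal sketch follows; the address blocks and function name are illustrative assumptions (using documentation-reserved example ranges), not Glendale's actual configuration:

```python
import ipaddress

# Hypothetical on-campus network blocks. A real deployment would list
# the institution's registered address ranges instead (RFC 5737 example
# ranges are used here).
CAMPUS_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_on_campus(client_ip: str) -> bool:
    """Return True if the client address falls inside a campus block."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in CAMPUS_NETWORKS)

print(is_on_campus("192.0.2.17"))    # True  (on campus)
print(is_on_campus("203.0.113.5"))   # False (off campus)
```

A gateway in front of the licensed product would run such a check on each request and, for off-campus addresses, fall back to a login prompt or an application-form workflow of the kind described above.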
The student is then issued a userid and password to access the online resources.

While this access methodology has worked well for the past two years, there are some major drawbacks, especially with the remote access component. Even minimal safeguards entail some degree of inconvenience. Students who have automatic access to in-house library services are not granted the same ease of access to remote online resources. They must first know these products are available to them, which may not always be the case with off-site users. They must take the time to complete and submit the access form, then wait a few days for it to be processed and access to be enabled. With the advent of the Internet, online users have come to expect immediate and unencumbered access to information. The time factor is important because many students balance full-time work and school schedules, and a delay of even a few days can seriously affect their research effectiveness. Some argue that it is fundamentally inappropriate for libraries to erect such barriers to information. Furthermore, as distance education evolves, local proximity to a physical campus will be less vital, and students will be able to choose the community college that best meets their educational goals regardless of location. Easy, quick access to networked research resources may be a deciding factor for students when choosing which community college to attend.

From the practical side of library workload, the need to manually assign passwords to each student and faculty member is labor intensive. As the user community grows, maintenance of this type of system becomes unwieldy. Furthermore, it is very difficult to manage the accounts once they have been established.
For example, to faithfully carry out the terms of many online product licensing agreements, student accounts should be deleted or disabled whenever students graduate, leave the school, or take off a term. This level of account management would require a real-time interaction between the secure name server and the student enrollment database. Security and technical difficulties make this type of connection problematic.

In late 1999 the Glendale Community College Library team competed for and was awarded California Community Colleges Technology Model Applications Pilot funds to study and explore alternate means of authenticating users to online resources. This paper summarizes the findings from that research within the context of global trends and how libraries in general are dealing with them.

The team conducted a literature review of academic and trade journals on the subject of network and remote authentication. Multiple surveys were developed for use with different groups, and in some cases follow-up interviewing was conducted to gather additional data. Of the 106 California Community College libraries that were sent a two-page survey, 54 responded (51%). The goal of this survey was to determine the state of technology deployment at each library, the level and type of technical support available, and the types of authentication schemes (in-house and remote) currently in use. Additionally, 28 online product vendors were surveyed, and 16 responded (57%). The goal of this survey was to determine the type of authentication methodology recommended or accepted by these vendors, and whether the online products could interface with an integrated library system (determined by use of the Z39.50 protocol). Finally, 13 integrated library system (ILS) vendors were surveyed, and six responded (46%).
The goal of this survey was to determine the integratability of the library catalog with online resources based on usage of the Z39.50 protocol, and whether the ILS would handle the authentication of users to online products automatically (i.e., whether no further authentication methodologies would be needed).

History

Traditionally, the problem of controlling access to networked resources has focused on the physical location of a facility or institution. With the rise of the mainframe in the 1960s, the need to regulate access to files and services led to the problem of user authentication. Due to the expense of mainframe systems, time-sharing became common, and institutions had to pay for the amount of time used on the machine. Tracking user access to resources on the mainframe became very important because it was the basis of billing and revenue generation.1

In the 1970s and 1980s, companies like CompuServe began charging for online access time. As Clifford Lynch points out, the focus of these public networks was still internal; for the most part, they were closed communities. Once users got online with CompuServe, access management systems determined what services or files users could access on the internal company servers. In the 1980s distributed computing models emerged in universities and research labs, through experiments like Athena at MIT.2 These projects explored logistical problems, such as having servers scattered geographically across a large campus and the need to recognize the same set of users and their associated attributes on each of these machines. While these experiments were cross-departmental, the focus was still on resources within an institution.

With the emergence of the Web, users and resources can be located anywhere in the world. Colleges and universities must now provide a means of authenticating remote users who access campus resources, many of which run on the same network as sensitive faculty/student information.
Therefore, they must make certain resources available securely outside of the campus network. Institution-centered access methodologies no longer suffice.

Cryptography is generally considered the best and most broadly supported solution for network security. Initial work on public-key encryption was done in the mid-1970s by Diffie and Hellman and was later converted into an architecture by Rivest, Shamir, and Adleman (RSA).3 Work on digital certificates dates to the mid-1980s, when this technology was used to support secure e-mail transactions.4 It was not until the 1990s that the use of cryptography and digital certificates became more widespread. In addition to its impact upon higher education, the explosive growth of the Internet and e-commerce has fueled the development of more mature methods of securing online transactions and authenticating users.

The fundamental problem with controlling user access to network resources lies with the architecture of TCP/IP itself. TCP/IP was designed to allow robust communication among diverse platforms; to a certain degree, security was an afterthought. While the Open System Interconnection (OSI) model ". . . designed a layer between the application and network that could manage secure handshakes," TCP/IP left everything above the communication layer to the application.5 As a result, individual application developers were required to implement security solutions for each program. Rather than building a generic, reusable security mechanism, applications relied on "proprietary or embedded techniques for shuffling account information across the network to the platform or application for validation."6 Although some efforts have been made with technologies like Secure Sockets Layer (SSL), it has taken a long time for security standards to be established. Even within individual industries, security standards have not been established.
For example, some ILS vendors are implementing their security measures using different and perhaps incompatible methodologies and standards. These divergent methodologies may present significant standards problems in the future.

ISSUES

Defining Our Terms

There are a number of issues involved in choosing the right user access methodology for a college library. Much of the decision is based on the needs of the library, its users, and the requirements of the technologists and product vendors.

As Clifford Lynch adeptly points out in his White Paper on Authentication and Access Management Issues in Cross-Organizational Use of Networked Information Resources, the problem of user access to network resources has two parts: (1) authentication, and (2) authorization.7 Authentication is the process whereby users supply some kind of secret information (e.g., a password) to establish themselves as being permitted to use an online identifier of some sort. It answers the question: Do you have the right to use a particular userid? (Authentication of a network resource itself, i.e., determining whether the resource that has been retrieved has been altered in some way, is beyond the scope of this report.) Authorization, on the other hand, is the process of determining whether a userid is able to access a resource or perform a given transaction. It answers the question: What can or can't you have access to? Following Lynch's use of terminology, this paper will use the term "access management" to indicate a methodology that incorporates both aspects of authentication and authorization.

While these definitions seem fairly straightforward, there are a number of issues associated with them. Some licensing agreements require the institution or university to guarantee it is ". . .
testing or verifying that individuals are really a member of this community according to pre-agreed criteria" and of having the institution "vouch for or credential the individual in some way that the resource operator can understand."8 However, such licensing agreements make the assumption that an institution can tell whether the user who possesses the rights to the userid is actually the person accessing the system. This is not always the case. For example, if a password is stolen, the userid is no longer associated with the appropriate individual and this particular assumption is invalid. Therefore, problems arise even with these basic definitions.

It is important to define who the key players are in choosing an access management methodology and what their needs are. These definitions can also be vague. For these purposes, the term "library" will refer to the library or its parent institution that provides access to online products. (It should be noted, however, that the issues identified here could be relevant to any campus division that wants to provide remote access to its resources.) The needs of the users are the foremost concern of the library. The library wants to provide a common user interface for all products and a single login to all resources. The library is concerned with user privacy, but it also requires some sort of usage statistics in order to evaluate services and to justify the cost of the online products. To meet the requirements of licensing agreements and for the protection of its user community, the library seeks an authentication methodology that protects users, is not cumbersome to support, and does not complicate access to the products.

There are three other key groups. Systems staff members who work in the library or for the campus information systems division can be called the "technologists."
The primary concern of the technologists in providing access to network resources is security. There can be serious consequences for information systems staff if proper precautions are not taken to protect sensitive data, such as student records. As a result, the technologists seek strong authentication schemes to protect sensitive data. The "vendors" provide the product that the library is licensing. The vendors need to protect their online resources or services from unauthorized users while providing access for legitimate users. The vendor loses money if unauthorized users are allowed to use its products. The primary concern for the vendor, therefore, is to ensure that the library is providing an authentication scheme that protects and regulates access to the product. Finally, the "users" can be individual information consumers, members of particular groups, or entire institutions. Users want easy-to-use programs that are not impeded by complex setup or specialized software. They also want a single logon for all resources. They may or may not be concerned with having an authentication scheme in place that protects their privacy.

Sometimes the library, the technologists, the vendors, and the users can all be part of the same organization. For example, a library may provide an in-house, online resource to a select group of campus users, such as the faculty. The library may ask its in-house technology staff to protect this resource from unauthorized access. At once, the library embodies the vendor of the product and the technologists who secure it. A subset of the library's patrons constitutes the user community. For this discussion, however, these entities will be considered as distinct.

Scope of User Base and Choice of User Database

The first step in choosing an appropriate user access methodology is to define the scope of the user base. In a community college library setting, the user community is comprised mostly of students, faculty, staff, and alumni.
This can be a large and heterogeneous group. For example, students may leave school in the middle of the semester; adjunct faculty may not teach some semesters; and staff members may be hired or leave their jobs at any point during the year. All of these issues must be considered when determining the scope of the user community. Once these parameters have been defined, it must be decided just what levels of access to offer for which databases and how to authenticate and authorize users. Many questions arise. Does the library only plan to offer onsite access to online resources? If so, IP authentication may be a satisfactory solution; however, offering only onsite access to resources limits user flexibility. Should the library use the circulation database or the campus student database (if separate) as a basis for authenticating users? If the library decides to use the campus student database, how would faculty, alumni, staff, and contractor accounts be addressed? What if the library wants to deny access to online products based on a user's accumulation of excessive library fines? This may not be possible if the account information is not reflected in the student database. Will the Registrar's Office allow library staff to make additions or notations to the student database? If the library chooses to use the circulation database instead, will students need to authenticate against a separate student database to pay their tuition or retrieve records? How will the library be notified if a student drops out in the middle of a semester or if a faculty member finds another job?

In the survey of community college libraries, it was found that of the 54 that responded, 51 make online products available inside the library. Thirty-eight also make products available remotely. Of these, ten authenticate users against the circulation database and four authenticate against the student database.
In one case the campus and circulation databases were the same. Twenty-three use another database, such as a listing of user names and passwords developed by the library, or did not know. Further research might be done with these libraries to determine the compelling reasons for choosing one database over another.

Depending on the installed base of technology, available funding, and the spirit of collaboration on a particular campus, the best solution may be to use the centralized student database or another campus-wide database that provides the required flexibility and control. Students could then be assigned one master login that would allow them to interact with all campus online resources, including remote access to reference resources.9 Excessive library fines could then be linked automatically to a student's ability to register for classes. As soon as a student graduates or leaves the campus, access to online resources would be halted. While a campus-wide solution of this magnitude might be expensive, it may be the most cost effective in the long run. Staffing costs would be saved since the data would be entered into one database instead of two (student records and circulation) or more. However, issues such as who can access and update the data in the centralized database would need to be resolved. Also, libraries would want to use library protocol standards, such as SIP2, which student record systems may not support.

Clifford Lynch promotes the concept of database centralization even further: "A single, network-wide (not merely institution wide) access management authority would simplify many processes by allowing rights assigned to an individual by different organizations to become attributes of a master name rather than having them embodied in different names authorized by different organizations . .
."10 Even so, Lynch admits that this view of having one central authority may be Utopian. Centralization at this macro level raises questions such as: Who controls the master database? How can one institution trust that another has performed proper authentication and authorization service? However, there are successful examples of centralized, coordinated access to online resources. The ATHENS project is an online network that handles authorization and authentication centrally for 1.5 million students and 200,000 staff members in the United Kingdom higher education system.11 Even with the large user population, the project has scaled well. The obstacle here is that a project of this scope requires enormous insight, planning, and commitment in order to be successful.

Despite the difficulties, the Utopian vision remains compelling. Imagine that students could be assigned one userid and password that would follow them as they use school, public, community college, and university libraries. Could such a centralized user database be developed for students throughout a state or region? Many libraries are already beginning to collaborate in interesting ways that could eventually lead to such a reality. Some are experimenting with combined union catalogs. Patron records could be similarly melded into a distributed database with a single interface. Obviously, a great deal of research and trust-building would need to happen for a project of this magnitude to succeed; however, the benefits of such an experiment could be tremendous.

Feasibility/Support

In choosing an authentication methodology, all aspects of feasibility and support should be carefully examined. Libraries should choose a solution that can be supported and will scale as their user community grows. For users, access should be streamlined. Authentication that requires multiple logins or special software configuration presents a barrier to access.
Users must be able to access resources independent of physical location and the technology available at that location.12 Technologists must implement solutions able to run on low-end computers that may have slower modems or telecommunications lines. This is especially true in remote areas where it may be years before DSL and cable modems are available. Bandwidth can be especially problematic if solutions like digital certificates and SSL are implemented. Depending on the level of encryption and the amount of interaction between the server and the browser, these technologies can slow the connection to the point of making the application unusable. The vendors require a system that is easy to deploy and administer, but which offers the appropriate level of security. For all of these groups, the software that is chosen should be available commercially or free (e.g., a Web browser), match the existing technology base, and be as generic as possible.13

Determining feasibility is especially important for community colleges, which may have very little technical support or expertise. According to the survey of community college libraries, only seven of the 54 respondent libraries support their own technology in-house. Eighteen rely entirely on another agency (generally the college information systems department) for help, and 29 use a combination of library and college staff to support the library's technology. As authentication methodologies become more complex, higher-level support and technical knowledge are required.
Further research should be done on the depth and type of technical expertise that is available to these libraries and on their ability to implement new technologies.

Security and Privacy

The issues of security and privacy are more important than ever. Lynch observes that "any system connected to the Internet today needs to expect systematic, repeated, and concerted attacks."14 As the phenomenon of computer hacking has become more common, individuals have become more vulnerable. Increasingly, users rely on the Internet as a primary means of conducting business, communicating, and doing research, which may require submitting confidential information. Likewise, all manner of personal communication, such as correspondence and photos, family histories, travel itineraries, and recreational preferences, is transmitted across the Internet, yet little protection is available for much of this information. Furthermore, some technologies may give users a permanent network address and be designed so that the user's computer is always on the network. If these PCs are connected to the Internet (via DSL, for example) without a firewall or other means of protection, they could be vulnerable to attack.

There are other means of obtaining personal data online. Many Web sites write cookie files to user systems, which discreetly gather information on users as they navigate the site. By offering incentives or prizes, some Web sites encourage users to register their names and other personal information. The site administrators can then analyze use patterns in conjunction with names, addresses, demographics, and other personal information. Likewise, vendors may collect data on users and their use patterns.
Lynch again notes that "because information is often accessed at the publisher site, the publisher may know a great deal about who is accessing what material and how often."15 In these cases the vendor could sell data to other commercial entities without the users' knowledge or permission. Vendors may justify this practice by claiming that they have a right to know who is using their resources; however, Lynch argues that it is none of the vendor's business what resource the user is accessing as long as the user is a ". . . legitimate member of the host organization's community which is included in the site license."16

Some questions that libraries should ask when examining an online product's license are: Does the vendor require users to register with the service before using it? Can users e-mail themselves articles from within the product? If so, does the vendor retain that address? What concerns exist with systems that allow customizable views, where users can determine which resources appear on their screens? Do users compromise their privacy in order to register for these customizable systems? The library may determine that a particular online product should be offered regardless of these issues; however, users should be made aware of potential compromises to their privacy.

Institutions must examine their security needs, the nature of the risks, and the technical support required, and balance these with users' expectations for easy access before selecting a solution. Some common practices in community college libraries raise potential concerns. For example, many libraries use the same userids and passwords to access online products remotely as they do for student e-mail accounts. If a user name and password were stolen, the thief could potentially gain access to private e-mail as well as the user's privileges to online resources. Security should be as strong as possible without being so complicated that an average user cannot use it easily.
Another significant concern is the licensing requirements of product vendors. Finally, as much as possible, a system should also take into account future security needs. For example, a library may begin by providing low-level security for its online resources; however, if the long-term goal is to use the same security methodology to allow students and faculty to access other resources (such as student records), a stronger authentication methodology should be considered. The trade-off is that stronger authentication methodologies may involve complex setup and configuration procedures.

A common challenge for many community college libraries is providing access to online resources remotely. Of the 54 respondents to our survey, 16 do not yet make online products available remotely, although many indicated that they plan to provide remote access. Of the 38 that make products available remotely, 34 control access by some type of login, two have implemented a proxy server, one stated that remote authentication to its online products is not necessary, and another did not specify. A remaining question is whether these passwords are secured or are sent merely as clear text across the network. A few libraries have taken the login concept a step further by writing cookies to the user's hard drive the first time authentication takes place. This cookie writing eliminates the need for users to log in subsequently if they are using the same computer. However, it does not necessarily prevent another user of that computer from gaining unfettered access to the product. Further research might explore how effectively community college library staff feel their userid/password, cookie-writing, or proxy server methodologies work. In some cases, it might be supposed that these technologies fully meet their needs.
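The cookie-writing technique described above amounts to caching a successful login on the client machine. A minimal sketch of the idea follows; the function names, token scheme, and in-memory session store are illustrative assumptions, not any library system's actual implementation. Note that the cookie identifies the browser on a particular computer, not the person, which is exactly the weakness noted above:

```python
import hashlib
import secrets
from typing import Dict, Optional

# Server-side table of issued cookie tokens -> userid. A real system
# would persist this and expire old entries.
_sessions: Dict[str, str] = {}

def login(userid: str, password: str, password_db: Dict[str, str]) -> Optional[str]:
    """Verify credentials; on success return a random token to set as a cookie."""
    hashed = hashlib.sha256(password.encode()).hexdigest()
    if password_db.get(userid) != hashed:
        return None
    token = secrets.token_hex(16)
    _sessions[token] = userid
    return token

def user_from_cookie(token: Optional[str]) -> Optional[str]:
    """Later visits skip the login form if the cookie token is still valid.
    Caveat: anyone using the same browser presents the same cookie."""
    return _sessions.get(token) if token else None
```

Storing only a random token, rather than the userid and password themselves, at least keeps credentials out of the cookie file, though the cached session still grants access to any subsequent user of that computer.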
In other cases, the systems might not provide as high a level of security as desired, but are the best that can be implemented with available funds and technical support.

Even while libraries may object to certain data gathering practices, they must balance those concerns with the need to collect usage data. Often, libraries need to know how many times a particular resource is being accessed and which groups of users (students, faculty, etc.) are using it. They may also need to know the subject areas or articles that are being accessed most frequently. Depending on the authentication scheme that is chosen, many of these data may reside on the vendor's machines.

To address this problem, Lynch suggests protecting user privacy with anonymous or pseudonymous access.17 Using anonymous access, all usernames would be transmitted to the vendor as if they belonged to a single user. This solution might work for a library providing a proxy server to authenticate its users. Proxy servers can gather user data before requests leave the local network and after the results return from the vendor. An individual's user name can be translated into a generic one before leaving the network, then restored after the results are retrieved. Some proxies can provide usage data down to the level of the particular document that was retrieved. Responsibility for tracking resource usage then rests entirely with the library. The other, often preferable, solution to this problem is pseudonymous access. With pseudonymous access, users are monitored in a way that prevents their identities from being determined, but data can still be collected on group usage patterns. For example, an institution could assign a faculty member a user name such as "fac411." The vendor could then track the resources and documents that "fac411" accesses without gathering any private or demographic data, other than that the user is a faculty member.
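The pseudonymous mapping just described can be sketched in a few lines. This is an illustration only: the class name, prefix rule, and numbering are invented, and a real proxy would persist the table rather than hold it in memory.

```python
# A minimal sketch of pseudonymous access; the group prefixes and
# numbering scheme are assumptions, not part of any product.
class Pseudonymizer:
    """Map real usernames to stable pseudonyms that reveal only the group."""

    def __init__(self):
        self._map = {}        # real username -> pseudonym
        self._counters = {}   # group -> last number issued

    def pseudonym(self, username, group):
        # Reuse the same pseudonym on every visit, so the vendor can track
        # group usage patterns without ever learning who the user is.
        if username not in self._map:
            n = self._counters.get(group, 0) + 1
            self._counters[group] = n
            self._map[username] = f"{group[:3]}{n}"
        return self._map[username]
```

The mapping table stays inside the library's network; only the pseudonyms ever reach the vendor.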
The library could then derive reports on usage and the types of queries its faculty members are making without compromising user identities.

Usage data can be used to determine the level of granularity a library will need in order to control access to resources, i.e., to what degree of specificity access needs to be controlled. Will the library offer electronic reserves where access must be controlled on a user-by-user basis? Or will access be determined on a group basis (students can access one set of resources and faculty another)? If the system becomes too finely grained, the cost of adding individual users to each appropriate group increases tremendously, and management can become difficult.18 Also, finely grained systems may gather inappropriate information about usage patterns, potentially compromising privacy. By contrast, if a system does not provide adequate granularity, libraries will not have the flexibility to allow certain kinds of access to specific groups.

If, despite all security, an unauthorized user breaks into the system, there must be some means of determining that a breach has occurred and, ideally, of identifying the hacker. Many licensing agreements state that the institution may be called upon to prosecute problem users and hold them accountable for security breaches. Strong security should therefore be placed on the files of user information, and a careful policy should be established controlling access to this database.

Access Issues

Ultimately, institutions must analyze how authentication technologies affect access within their own environments. Traditionally, many colleges have restricted access to their collections to those currently affiliated with the institution. Those not affiliated would not be allowed to use the resources, or they might be required to pay a fee to do so. By contrast, public and some college library resources have been free and available to anyone.
This philosophy can conflict with the policies of many online product vendors, which require libraries to implement user restrictions. For example, many public libraries are required to restrict remote access to online products to users within a defined zip code. An unfortunate effect of restricting access is that it widens the divide between social classes. People who live in zip code areas where the tax base is higher will most likely have access to more and better quality resources. This is a new phenomenon of the online age, because traditional library resources and services have never been restricted in this way. One library solution to this inequity is consortial purchase of online materials, in which groups of libraries join together to purchase a product at a reduced price and all of their constituents gain access. This may not go far enough, though. State and federal governments may ultimately have to take action to close these information divides.

SOLUTIONS

Variety of Solutions

There are several solutions to the user authentication/authorization problem, and the choice of the appropriate solution varies with the needs of the organization and the composition of its user community. Commonly, an institution will combine multiple approaches, using one method for onsite access to resources and another for remote access. As discussed previously, these include, among other things: IP filtering, passwording, proxy servers, digital certificates, and vendor-supplied turnkey authentication. The following sections examine these solutions in more detail.

IP Filtering

Of the several methodologies employed to control user access, the first and most popular in-house method of authentication among community college libraries is IP filtering, where the institution guarantees to the vendor of the product . . .
that all traffic coming from a given set of IP addresses represents legitimate traffic on behalf of the licensee institution's user community.19 Thirty-seven of the 51 surveyed college libraries that offer onsite access to electronic resources use IP authentication, sometimes in combination with a userid/password, to enable access. Fourteen of the 16 vendors support IP filtering as the means of controlling in-house access. One vendor supported only userid and password access, and one left this decision to the discretion of the library.

With IP filtering, the system presumes that all traffic coming from a range of IP addresses is valid. The library provides a range of IP addresses to the vendor. If the vendor's server recognizes an incoming IP address as coming from the library, it allows the user to access the resource. This solution works very well in computer labs where there are many shared terminals. It is very easy to deploy and support, and it does not require any special software or configuration. It can also be readily combined with another methodology for remote authentication. IP filtering is not easily vulnerable to hacking, and it provides much stronger privacy for users. Even if a hacker could steal the IP address, subnet mask, and gateway IP number, this information is useless if that person is not connected to the same subnet. Further, there is little motivation to hack the system: unlike a userid/password, which may allow access to a user's private resources, stolen IP information only allows the hacker to access a public networked resource.

There are three major disadvantages to IP filtering. First, it is generally only useful to facility-based users, since most dial-up users are assigned a temporary network address by their ISP; this address changes with each new connection. (As more users install cable modems and DSL at their homes this could change, since these connections may assign the user a permanent IP address.) It is possible to use IP filtering for remote access if users dial directly into the institutional network; however, remote users may incur long distance charges to do so. It is also impractical for small institutions to provide dial-up access, because doing so requires them to maintain a modem bank and security on the dial-in system. These requirements are generally beyond the means of community college libraries.

The second disadvantage of IP filtering is that access is granted not to the user, but to the user's machine. It is not possible to guarantee that all users of a given machine are valid. This problem is magnified in a multi-user environment. For example, while it would be relatively easy to dedicate an IP number to an individual user's own computer, accountability is nearly impossible in a lab-style situation where many users share a single computer.

The third major drawback is that IP filtering does not offer fine granularity. For example, an institution may want to offer certain resources to faculty and others to students. This level of granularity becomes very difficult with IP filtering. The network administrator would need to map group membership (e.g., the "faculty" group) to network addresses in order to restrict access. Faculty members using a computer mapped for student use would only be able to use the student set of resources. Conversely, students who access a faculty computer could use the faculty set of resources, which might include sensitive data and personal records. Protocols like DHCP, which dynamically assign IP addresses to user computers as they request network resources, offer virtually no accountability without special programming.
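On the vendor's side, the essence of IP filtering is a simple range-membership test. A minimal sketch, assuming invented (documentation-range) network blocks registered by a hypothetical library:

```python
import ipaddress

# Hypothetical ranges a library might register with a vendor.
LICENSED_NETWORKS = [
    ipaddress.ip_network("192.0.2.0/24"),      # campus lab subnet (example)
    ipaddress.ip_network("198.51.100.0/25"),   # library workstations (example)
]

def is_licensed(client_ip: str) -> bool:
    """Return True if the request's source IP falls in a licensed range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in LICENSED_NETWORKS)
```

Anyone at a licensed address passes the test, which is exactly why the method identifies machines rather than users.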
Likewise, IP filtering would not work for libraries that dial out to the Internet and are assigned addresses dynamically.

Userid/Password

As we have seen, many community college libraries have implemented basic methods of controlling user access to online resources: assigning userids and passwords to students, or requiring them to enter their library card barcode numbers. The userid identifies the user as a member of a group, and has attributes associated with it that designate the particular set of network resources to which the user or user group has access. The password confirms that the user knows a "secret" associated with that userid.20 It is intended to be known only by the user and can be changed without affecting the attributes associated with that account. Similarly, the library card barcode number combines the authorization and authentication functions into one. The assumption is that if someone enters a library card number, that person must be the user associated with that card and therefore has access to certain assigned privileges.

To authenticate userids and passwords, many libraries have combined homegrown programming scripts (generally CGI or ASP) with scripts or mini-programs provided by either the ILS vendor or the online product vendor. Generally, these check whether the username and password match a record in a database of users (for example, the circulation database). Alternatively, the system may check whether the patron barcode falls into a valid alphanumeric range used by the library. When checking a barcode, the scripts may also consult the library patron database to verify that the patron's account is in good standing (e.g., the card is not blocked or expired), and adjust the patron's allowable privileges accordingly. Assigning userids and passwords is common because, among other advantages, the overhead is small. With this method, resources can be used from anywhere, both locally and remotely.
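A homegrown validation routine of the kind described might look as follows. The barcode format, field names, and patron records are all invented for illustration; a real script would query the circulation database rather than an in-memory dictionary:

```python
import re

# Invented patron records standing in for a circulation database.
PATRONS = {
    "21447000123456": {"blocked": False, "expired": False, "group": "student"},
    "21447000654321": {"blocked": True,  "expired": False, "group": "faculty"},
}

# Invented rule: 14 digits beginning with the library's prefix 21447.
BARCODE_PATTERN = re.compile(r"^21447\d{9}$")

def authorize(barcode: str):
    """Return the patron's group if the card is valid, else None."""
    if not BARCODE_PATTERN.match(barcode):       # range/format check
        return None
    record = PATRONS.get(barcode)
    if record is None or record["blocked"] or record["expired"]:
        return None                              # not in good standing
    return record["group"]
```

The two checks mirror the two strategies the survey found: a purely syntactic range test, and a lookup that verifies the account's standing.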
Userids and passwords do not require special software or configuration on the user's computer, and tracking usage is easy because access is linked to a user name.

There are disadvantages, however. Assigning user names and passwords becomes cumbersome as the user community grows. Userids and passwords can be easily shared or stolen. For that reason, many security consultants recommend that passwords expire every 30 days, be at least eight characters long, and include a combination of alphanumeric characters. This places a burden on the user to change passwords frequently, and the passwords themselves may be long and complicated. The problem multiplies if login methods vary from product to product, so that users must enter a different userid and password for each one.

Another problem with userid/password systems is that they provide poor network security. As discussed, TCP/IP does not provide security above the communication layer. Cearley and Winsor note the consequence: "As a result, on most Local Area Networks and Wide Area connections, usernames and passwords are transmitted in clear text."21 When using basic login services like those built into Microsoft's Internet Information Server, for example, passwords are sent along the network unencrypted. A piece of software and/or hardware called a "sniffer" (freeware versions are available) can be placed on the network to capture transmitted packets and reveal userids and passwords or barcode numbers. Even with SSL encryption, each Web server through which the request passes must handle the username and password, and these could be captured there. The usernames and passwords are therefore only as secure as each Web server that handles them.22

Despite its apparent simplicity, assigning userids and passwords can in fact become a very complex solution over time.
It may involve as many as two CGI scripts customized for every online resource, plus a proprietary program provided by the ILS. Each of these in turn may reside on a different server. As more scripts, programs, and servers are introduced into the process, there is a greater chance of failure. Industry-wide, there has been much sharing of CGI scripts, but there is no real standard. Therefore, technologists who move from one library to another may need to implement different scripts and software.

Kerberos

Kerberos is an upscale version of the userid/password-barcode scheme. It is a ". . . cryptographic authentication scheme designed for secure use over public computer networks."23 Users are authenticated against a central database of accounts; therefore, they have a single userid and password for every resource. Rather than being transmitted as clear text, the password is encrypted at each point, making for better security. Kerberos emerged from experiments in the 1980s with the MIT ATHENA network, and it is still used on many university campuses. These experiments focused on authenticating users to networked resources located at the institution, so this solution works well in a closed environment. However, some problems arise with the need to expand beyond the local network. Kerberos software must be loaded on the client computer, which makes support more intensive and requires that users load and configure the software. (Traditionally, Kerberos has not been built into standard operating systems or software packages, although support for Kerberos is built into Windows 2000.) Another limitation of Kerberos is that it handles only authentication; the institution must develop a separate application to handle user authorization.

Proxy Server

A technology that is commonly combined with IP filtering is a proxy server.
There are two main types of proxy servers: (1) mechanical and (2) application-based. Both work in generally the same way. Requests for access to an online resource are sent through a central proxy server located on campus, which then directs the requests to the vendor. Since the proxy server is on campus, it has a local IP address; to the vendor, the request therefore appears to come from a computer on campus, and the information provider allows it to proceed as if the user were in a campus facility.

The proxy server handles the traffic from remote users, and IP filtering handles traffic from on campus. Nine of the online product vendors surveyed recommended proxy servers as a solution to the remote access problem. This is clearly their preferred method of remote access management because it relieves them of the burden of developing their own solution. Proxy servers can provide excellent privacy to users if implemented properly, since the proxy server can repackage the information before it is transmitted; user information is not compromised. Proxy servers also provide detailed usage statistics.

Unfortunately, proxy servers are not without inherent problems. They introduce additional manual overhead (users having to configure their browsers to work with the proxy) and network-related overhead (packaging and repackaging of user requests). Since the proxy must process all remote requests, it becomes a possible single point of failure.24 After a transmission has been sent, accountability is poor. For example, it would be difficult for a vendor to determine where problems originate when the data are sent through a proxy server. If someone hacks the proxy, that person gains access to all of its resources. Another limitation is that proxies only repackage and forward data; they do not provide authentication/authorization services. Because of this, Lynch refers to proxy servers as a "specialized form of IP filtering."25 A separate, homegrown access management system must be employed to verify user access. Finally, proxies can be difficult to program, requiring additional staff and expertise. In the survey, just two of the 54 community college libraries use proxy servers to control access to online resources; for the rest, the costs and difficulties are apparently prohibitive.

Mechanical Proxies

Mechanical proxies26 take advantage of services built into protocols such as HTTP. Users configure their browsers to pass ". . . all HTTP requests not directly to the destination host, but instead to a proxy server, which intercepts these requests and when necessary retransmits them to the true destination host."27 In addition to the previously mentioned advantages and disadvantages, mechanical proxies require a Web browser that supports Automatic Proxy Configuration, a feature available in release 2.x (or later) of Netscape Navigator/Communicator and in version 4.x of Microsoft's Internet Explorer. AOL's browsers below version 4.0 do not support Automatic Proxy Configuration; the user must therefore establish an AOL connection and then use an appropriate version of Netscape or Internet Explorer. Configuring a browser to use the proxy server can require moderate technical expertise. Furthermore, some ISPs require users to point their browsers to the ISP's own proxy servers to gain Internet access. The drawback is that many browsers cannot handle multiple proxy settings, which would make it impossible to access some online resources.
For example, some users associated with more than one library would need more than one distinct proxy setting, requiring them to configure the proxy settings for one set of resources and reconfigure them for another.

Application Proxies

Application proxies,28 also known as "pass-through proxies" or "gateways" (although they do not generally perform the protocol translation of traditional gateways), work like mechanical proxies by receiving the request from the user and forwarding it to the appropriate vendor; however, application proxies do not make use of protocol services like HTTP. Instead, the proxy software handles the interaction. It may also dynamically rewrite the data from the remote server before presenting it to the user.29 The advantage of the application proxy is that users do not have to configure their browsers in order to gain access to resources. There are some disadvantages, too. Because the proxy server handles the bulk of the negotiation between the user and vendor machines, the overhead from the additional processing is significant. Application proxies do not scale as well to support large numbers of users. Some application proxies may be blocked by corporate firewalls that do not allow traffic on multiple ports. Furthermore, if a vendor changes a Web site to include Java, PDFs, URLs in cascading style sheets, or JavaScript, the change could impair the proxy's ability to rewrite pages and present them to the user.30 Finally, many application proxies require that every unique URL for a vendor product be entered in advance. This is not the case for mechanical proxies, which only require entry of the domain. Entering URLs becomes more problematic as their number grows.
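The rewriting step of an application proxy can be sketched in miniature; the proxy hostname and vendor domains below are invented:

```python
import re

# Invented names: the library's proxy host and the licensed vendor domains.
PROXY_HOST = "proxy.example.edu"
VENDOR_DOMAINS = ["db.vendor-one.example.com", "journals.vendor-two.example.com"]

def rewrite_page(html: str) -> str:
    """Rewrite vendor links so follow-up requests also flow through the proxy."""
    for domain in VENDOR_DOMAINS:
        # http://<vendor>/path  ->  http://<proxy>/<vendor>/path
        html = re.sub(
            rf"http://{re.escape(domain)}/",
            f"http://{PROXY_HOST}/{domain}/",
            html,
        )
    return html
```

Links embedded in JavaScript or style sheets will not match a simple pattern like this, which is one reason vendor site changes can break an application proxy's rewriting.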
If the library uses the proxy for remote access to e-journals, which may have unique URLs for each article, this can become cumbersome.

Certificates

Certificates are based on the cryptographic public key concept. A certificate is a special kind of public key that has been issued by a Certificate Authority (CA). The CA attests that the owner of the key is authentic.31 Wiseman summarizes the process: "The user is provided with a cryptographic version of an ID card, digitally signed by an authority which recognizes them."32 The certificate provided by the CA contains the user's name, public key, expiration date, the name of the certificate issuer, the serial number of the certificate, and the digital signature of the certificate issuer.33 Most recent work on certificates is based on version 3 of the X.509 digital certificate international standard.

There are many advantages to digital certificates. Once configured, they do not need to be reconfigured for every new application or service (unless the user has certificates with more than one institution). With certificates, users and their institutions do not need to manage multiple usernames and passwords. Digital certificates provide end-to-end encryption, which ensures better security. Unlike Kerberos, browsers and servers already have built-in support for certificates. The X.509 standard is a well-defined protocol that allows for authentication of resources across multiple agencies. Thus, this solution could allow students to interact seamlessly with secure resources located anywhere on campus or cross-organizationally.

There are some major drawbacks to the use of digital certificates, however. They require a certificate server, and the library or college must distribute certificates to the students or outsource that job at additional expense.
Users must install the certificate on every machine they use, causing problems similar to those associated with mechanical proxies. Implementing this type of solution requires that the vendor accept certificates as a means of authentication, which involves additional programming and setup expense. Furthermore, it is very difficult to revoke certificates prior to their expiration date. If students do not pay their tuition bills or leave in the middle of a term, it can be difficult to revoke their ability to access network resources. High-level cryptography cannot be implemented if the institution serves international users, since browsers that support high-level encryption cannot always be downloaded outside of the United States. Certificates are difficult to administer in a lab, where users must install their own certificates and then remove them upon completion of each task. If users leave their certificate settings in place and forget to log off, their accounts can be compromised. Certificates can be protected on the local machine by a passphrase, so that someone else using that machine cannot submit the certificates as credentials without it. As with any sort of passwording, the institution must be ready to support users who have forgotten their passphrases.34

Like the other solutions, certificates do not completely solve the user access problem. At present, certificates, like Kerberos, perform only the authentication function; they do not provide authorization services. Some recent research into establishing an authorization system that could be coupled with digital certificates is promising, however.35 The Digital Library Federation initiated a pilot project ". . . to investigate a new architecture for networked service provision across organizational boundaries,"36 which employed a standard called LDAP 3.0 (Lightweight Directory Access Protocol). LDAP has many advantages.
LDAP provides APIs for accessing information across the network from Java, C, C++, or Perl, so that almost any application can connect. It supports synchronization among directories and fault tolerance, so the user database can be distributed across multiple servers.37 LDAP supports public key authentication to its own directories, so vendors who access the user database for verification can also be assigned a digital certificate, ensuring that only approved vendors can access the database. It allows for flexible granularity, so the library can define the appropriate levels of group membership for its community. With LDAP, users can even update some of their own attributes. Many Web browsers, Web servers, and certificate servers are integrating LDAP features and functionality. As a result, user attributes and data can be stored, easily updated, and backed up. Programmers who are familiar with languages such as C++ or Java can use the API to customize their applications to allow LDAP authorization.

With a certificate/LDAP hybrid, the user would send a request to a vendor for an online product or service. The vendor's server would examine the user's certificate and either act as the CA or check the certificate against the school's or a commercial CA. Once the user is authenticated, the vendor would pass its own certificate to the LDAP authorization server at the institution, which would check the vendor's certificate against its CA database to make sure the vendor is approved to access the files. The LDAP server would then run a query against the user database and pass to the vendor the attributes associated with the user (i.e., the list of resources that the user can access).
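The hybrid exchange described above can be sketched as follows. Every name here (certificate identifiers, directory entries, the trusted-vendor list) is invented, and simple set lookups stand in for real X.509 verification and LDAP queries:

```python
# All data below is invented; a real deployment would use an X.509
# library and an LDAP directory rather than in-memory structures.
TRUSTED_USER_CERTS = {"cert-student-042"}    # certificates the campus CA issued
TRUSTED_VENDOR_CERTS = {"cert-vendor-acme"}  # vendors approved to query the directory

DIRECTORY = {  # stand-in for the institutional LDAP directory
    "cert-student-042": {"group": "student", "resources": ["catalog", "journals"]},
}

def vendor_grants_access(user_cert, vendor_cert, resource):
    # 1. Vendor verifies the user's certificate against the campus CA.
    if user_cert not in TRUSTED_USER_CERTS:
        return False
    # 2. The institution's LDAP server verifies the *vendor's* certificate
    #    before releasing any attributes.
    if vendor_cert not in TRUSTED_VENDOR_CERTS:
        return False
    # 3. The LDAP query returns the user's attributes (resource list).
    attrs = DIRECTORY.get(user_cert, {})
    # 4. The vendor allows or denies based on the retrieved attributes.
    return resource in attrs.get("resources", [])
```

Note that only the attribute list crosses the network; the identifying data behind it stays on the institutional server.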
The vendor server would then allow (or deny) the user access to the resource based on the access management data it has retrieved.

In many ways this certificate/LDAP hybrid provides the best means of controlling user access to the network. Digital certificates provide the best commercially available, open-standard solution for securing a user's transaction. LDAP provides a flexible means of authorizing the user and gives the desired level of granularity. Furthermore, because user attributes are kept private on the institutional LDAP server, in-depth user data are not available over the public network. If security were compromised during the authentication process, minimal user data would be subject to damage.

Proprietary Vendor Solutions

Aware of the need for remote access to online resources, many library catalog vendors have built access management schemes into their integrated library systems. Most catalog vendors support version three of Z39.50, which allows them to provide an interface not only to catalog data but also to online products that support Z39.50. The advantage of this scheme is that users do not need to learn multiple searching strategies for the different online products that the library provides. However, the Z39.50 protocol is fairly limited to the library world and therefore is not supported by a great number of vendors. The survey showed that only five of the 16 online product vendors that responded support the Z39.50 standard. Also, Z39.50 does not handle access management. As a result, many ILS vendors have opted to develop their own proprietary means of access management.

The most compelling reason for a library to use an ILS-based authentication solution is that it does not then have to develop its own authentication routine, eliminating the need to deploy a proxy server or manage digital certificates.
Most ILS solutions interface directly with the patron database and can therefore provide real-time information on patron status. Libraries do not have to maintain a separate name server of approved users; the system interfaces with the circulation database directly. Some solutions also negotiate the authentication process with the online product vendor and will accommodate multiple authentication methodologies.

All ILS-provided solutions are designed to interface with the library's patron database; they are not designed to interact with the campus student database. Some vendors will create batch programs to upload the student database file into the patron database, but batch processing does not provide real-time updates. A second disadvantage is that ILS-provided solutions can be proprietary, and ILS vendors will often not reveal their code.

Among the community college libraries surveyed, four ILS vendors currently dominate the market: Endeavor, Epixtech, DRA, and Sirsi. To illustrate the problems of access management in ILS systems, the following discussions examine two of these.

Endeavor Voyager: The Complete Solution

Incorporated in 1994, Endeavor is a relative newcomer to the ILS business, although its customer base is growing. According to the survey, approximately 25% of the respondent libraries use or are in the process of upgrading to Endeavor's Voyager product. Endeavor has developed an integrated solution to the access management problem. In Voyager, when a library user is added to the system, that person is assigned to one or more patron groups (e.g., staff, student, faculty). The groups are then given specific access rights to resources. For example, the group "faculty" might be given access rights to more or different resources than those available to the group "staff." This ability to control access at the group level provides a useful level of granularity for libraries.
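Group-level rights of this sort reduce to a simple mapping from groups to resource sets. A sketch with invented group and resource names (not Voyager's actual data model):

```python
# Invented group names and resource lists, illustrating group-level rights.
GROUP_RIGHTS = {
    "staff":   {"catalog", "circulation-tools"},
    "student": {"catalog", "article-databases"},
    "faculty": {"catalog", "article-databases", "e-reserves"},
}

PATRON_GROUPS = {  # a patron may belong to more than one group
    "p1001": ["student"],
    "p2002": ["faculty", "staff"],
}

def allowed_resources(patron_id):
    """Union of rights across all of the patron's groups."""
    rights = set()
    for group in PATRON_GROUPS.get(patron_id, []):
        rights |= GROUP_RIGHTS.get(group, set())
    return rights
```

Adding a patron means assigning groups, not enumerating resources, which is what keeps group-level granularity manageable.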
When users access Voyager remotely, they log in only once, regardless of the number or type of resources available to them. The interactions between users and the server are encrypted to provide secure transactions. Voyager authenticates the user against the underlying Oracle patron database using a CGI script. The system can approve or deny access to resources based on factors such as excessive fines or expired cards. If the user is approved, Voyager then handles the authentication negotiation with the vendor. The Voyager server is programmed to recognize which methodology each online product vendor requires for authentication (e.g., IP filtering, userid/password, vendor-provided script) and presents the appropriate access information to the vendor. This negotiation is seamless to the user.

There is one specific disadvantage to this access management methodology. If a library wishes to purchase an online product that uses an authentication methodology not supported by the ILS, the library must develop a separate solution to handle those transactions. For example, an ILS may not support technologies such as IP filtering for remote users. If, as is sometimes the case, a product vendor supports IP filtering as the only means of controlling access to the product, the library would have to implement a proxy server in order to make remote user sessions look as if they were coming from inside the library network. However, the library can still use the access management technology built into the ILS to handle the authentication/authorization problem.

DRA: The Open Standard

DRA's Taos product offers an access management solution similar to Endeavor's Voyager product.
Beyond that, DRA also recently released a product called Patron Authentication Server (PAS), which is part of its Web2 1.3 product but licensed separately. PAS uses a combination of HTTP and XML to provide particular kinds of patron authentication for services that cannot be easily integrated into the ILS. For example, a library might want to make interlibrary loan services or an indexing or abstracting database available only to its patrons. Some of these services might not interface with the ILS directly; however, using PAS, the library could still take advantage of the ILS access management capabilities either in-house or remotely. When users of the DRA ILS want to access a resource, they enter their userid and password, initiating an HTTP (HTTPS is also supported) request to the DRA Patron Authentication Server. The server looks up the user in the patron database and returns an XML file to the client containing the authentication response. The authentication response can be "patron valid for activity," "unverified borrower," "borrower reported to collection agency," etc. The server does not make any decisions; it just returns the data. The client must then process the response and determine if the user can access the resource.

There are a few clear advantages to the authentication methodology that DRA has implemented. XML is an open standard, and PAS has been written so that processing of patron data is performed on the client. This is a significant step toward establishing common standards for library access management. If an authentication issue arises that cannot be handled by the DRA ILS directly, a client could be written by the library or an online product vendor to interface with the DRA PAS server. True to the open standard philosophy, DRA makes the required data elements and format description available to any developers.
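Because PAS returns data rather than decisions, each client must parse the XML response and apply its own policy. The short sketch below assumes a hypothetical response format; DRA's actual element names and status strings are not reproduced here and may differ:

```python
import xml.etree.ElementTree as ET

# Hypothetical PAS-style authentication response; the real schema differs.
SAMPLE_RESPONSE = """<authentication>
  <status>patron valid for activity</status>
  <patron-id>12345</patron-id>
</authentication>"""

def patron_may_access(xml_text):
    """Client-side policy: the server only reports status; the client decides."""
    root = ET.fromstring(xml_text)
    status = root.findtext("status", default="")
    return status == "patron valid for activity"
```

A different client could just as easily let an "unverified borrower" reach some resources but not others; the point is that the policy lives entirely on the client side.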
The obvious disadvantage to this system is that a client of some sort must be written for each application. A client may be as simple as a customized Web page or some CGI scripts, but it may also involve developing a specialized Web browser that must be downloaded.

FUTURE DIRECTIONS AND CONCLUSIONS

New authentication technologies are on the horizon, such as smart cards, positive photo identification, and fingerprint and biometric devices that provide much stronger security. Many of these require additional specialized hardware at each workstation (e.g., a card reader or fingerprint pad). Smart cards may eventually overlay the certificate-based structure, making certificates more portable for the user. Instead of using a userid/password system or requiring special browser configuration, users will simply swipe their cards and gain access to all of their resources. Beyond that, biometric devices that identify people by their physical traits, such as fingerprints, irises, faces, or voice patterns, are beginning to appear in the business world. At present, they can be quite expensive and may not be practical for libraries for several years.

Recommendations

Which technology is the best choice for community college libraries? There are a number of issues involved in selecting the right access management methodology. For community college libraries, budget limitations and technical support are major issues. Many college libraries do not have on-site staff dedicated to technology support. Furthermore, many specialized technical skills are required. For example, the technicians employed at community colleges may specialize in desktop and ILS support, but not in server and networking development. They may not be familiar with technologies such as proxy and certificate servers.
For the immediate future, the most practical option is a tiered approach to the problem, based on the technical expertise and funding available to the library. The ultimate solution may be a collaborative approach.

Tier One

Tier One is most likely to be implemented by libraries that have limited technical expertise and budget constraints. These libraries can perhaps benefit most by selecting an ILS that can perform access management for the library. By implementing an ILS, not only does the library benefit from having a solid access management system that provides privacy, granularity, and ease of use and maintenance, but it also gains a Web-based catalog available for remote users, full MARC records, and the ability to integrate Z39.50 resources. Furthermore, library staff do not have to create and maintain separate databases of userids and passwords, access to resources can be linked to fines or other administrative stops, and access can be granted on a group-membership basis. Depending on the online resources that a library wants to make available, additional technologies, such as a proxy server, may need to be implemented. A time may come when the library wishes to migrate to a new ILS, which can be an expensive and arduous task, but these new services might also provide valuable enhancements.

Tier Two

At Tier Two, some libraries may need to offer levels of access and resources beyond those possible with an ILS. It may also be the case that, during the process of implementing and maintaining an ILS, staff have developed higher technical competencies. If the longer-term goal is to migrate to this solution, then the library would be wise to build upon the access management system of its ILS by implementing a proxy server that is suited for libraries, such as EZProxy.
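One common pattern with a proxy such as EZProxy is the "starting point" URL: links on library pages point at the proxy's login endpoint with the target resource appended, so that an approved user is passed through the proxy and the vendor sees the request as coming from the library's address range. The hostnames below are hypothetical, and this is only a sketch of the URL-rewriting idea rather than a complete proxy configuration:

```python
from urllib.parse import quote

def starting_point_url(proxy_base, target):
    """Build a proxy 'starting point' link: the user authenticates at the
    proxy's login endpoint and is then forwarded to the target resource."""
    # Leave ':' and '/' unescaped so the target stays readable, matching
    # the familiar .../login?url=http://vendor... form of EZProxy links.
    return f"{proxy_base}/login?url={quote(target, safe=':/')}"

# e.g. starting_point_url("http://ezproxy.library.example.edu",
#                         "http://vendor.example.com/search")
```

Rewriting vendor links this way is what lets the library keep IP-filtered resources usable for remote patrons while the ILS (or the proxy's own user file) handles the actual authentication step.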
Not only will an application proxy allow the library to use resources that can only be accessed through IP authentication, but it will also allow the library to gather in-depth statistics on usage. It will provide enhanced privacy protection for users while allowing for proper user accountability. The ILS can handle authentication and authorization functions as well as interactions with vendors, who may require a userid and password or a referring URL. In cases where IP filtering is the preferred method of authentication, users who are approved by the ILS can be forwarded to the proxy server to negotiate the interaction with the vendor.

In the Long Run

In the long run, the optimum solution might be for libraries to pool their resources, especially when providing services to their remote users. Thus, college libraries should consider forming and funding consortial technology development offices, which can implement and standardize system-wide solutions that meet the needs of member libraries. The digital certificate/LDAP hybrid is currently the best solution to the user access problem. Even so, it is cost prohibitive for many individual libraries to implement and support this technology, which requires a certificate server and the issuing of certificates on a user-by-user basis. Furthermore, most vendors do not currently support digital certificate technology. By combining funds and resources, a group of libraries could implement a solution like digital certificates/LDAP, and in doing so they could achieve the critical mass required to convince online product vendors to accept certificates. They could also benefit from group pricing of resources. Centralized technical support might provide expertise beyond any possessed by individual libraries.
The ATHENS project in the United Kingdom is a model of successful cooperation and resource sharing among multiple institutions.

The digital certificates/LDAP solution addresses many of the questions raised at the beginning of this article. Certificates are not difficult to install and would therefore not be a tremendous burden on the user. Again, cooperation may be the ultimate solution. While there would be a number of political and bureaucratic issues involved in setting up these kinds of consortia, the cost savings and benefits could be tremendous in the long run. For this and many other reasons, the climate is ripe for consortial development aimed at addressing issues that are common to all libraries, among which access management is one of the most pressing.

GLOSSARY

The following are definitions for terms and acronyms appearing in this article.

API: Abbreviation of application program interface, a set of routines, protocols, and tools for building software applications. A good API makes it easier to develop a program by providing all the building blocks; a programmer puts the blocks together.

ASP: Active Server Pages. A specification for a dynamically created Web page with an .ASP extension that utilizes ActiveX scripting, usually VBScript or JScript code. When a browser requests an ASP page, the Web server generates a page with HTML code and sends it back to the browser. ASP pages are thus similar to CGI scripts, but they enable Visual Basic programmers to work with familiar tools.

CA: A trusted third-party organization or company that issues digital certificates used to create digital signatures and public-private key pairs.
The role of the CA in this process is to guarantee that the individual granted the unique certificate is, in fact, who he or she claims to be. Usually, this means that the CA has an arrangement with a financial institution, such as a credit card company, which provides it with information to confirm an individual's claimed identity. CAs are a critical component in data security and electronic commerce because they guarantee that the two parties exchanging information are who they claim to be.

CGI: Abbreviation of Common Gateway Interface, a specification for transferring information between a World Wide Web server and a CGI program. A CGI program is any program designed to accept and return data that conforms to the CGI specification. The program could be written in any programming language, including C, Perl, Java, or Visual Basic. CGI programs are the most common way for Web servers to interact dynamically with users. Many HTML pages that contain forms, for example, use a CGI program to process the form's data once it is submitted.

DHCP: Short for Dynamic Host Configuration Protocol, a protocol for assigning dynamic IP addresses to devices on a network. With dynamic addressing, a device can have a different IP address every time it connects to the network. In some systems, the device's IP address can even change while it is still connected. DHCP also supports a mix of static and dynamic IP addresses. Dynamic addressing simplifies network administration because the software keeps track of IP addresses rather than requiring an administrator to manage the task.

DSL: DSL technology uses existing 2-wire copper telephone wiring to deliver high-speed data services to businesses and homes ... DSL uses the existing phone line and in most cases does not require an additional phone line ...
In its various forms, including ADSL, HDSL, IDSL, R-ADSL, SDSL, and VDSL, DSL offers users a choice of speeds ranging from 32 Kbps to, in laboratory settings, more than 50 Mbps ... Over any given link, the maximum DSL speed is determined by the distance between the customer site and the central office.

HTTP: Short for HyperText Transfer Protocol, the underlying protocol used by the World Wide Web. HTTP defines how messages are formatted and transmitted, and what actions Web servers and browsers should take in response to various commands.

HTTPS: Also known as S-HTTP. An extension to the HTTP protocol to support sending data securely over the World Wide Web. Not all Web browsers and servers support S-HTTP. S-HTTP was developed by Enterprise Integration Technologies (EIT), which was acquired by Verifone, Inc. in 1995.

ILS: Integrated Library System, traditionally known as an online library catalog system, which may include the online public access catalog (OPAC), circulation, acquisitions, cataloging, and other features.

ISP: Short for Internet Service Provider, a company that provides access to the Internet. For a monthly fee, the service provider gives you a software package, username, password, and access phone number.

LDAP: Short for Lightweight Directory Access Protocol, a set of protocols for accessing information directories. LDAP is based on the standards contained within the X.500 standard, but is significantly simpler. And unlike X.500, LDAP supports TCP/IP, which is necessary for any type of Internet access. Because it is a simpler version of X.500, LDAP is sometimes called "X.500-lite." Although not yet widely implemented, LDAP should eventually make it possible for almost any application running on virtually any computer platform to obtain directory information, such as e-mail addresses and public keys.
OSI: Short for Open System Interconnection, an ISO standard for worldwide communications that defines a networking framework for implementing protocols in seven layers. Control is passed from one layer to the next, starting at the application layer in one station, proceeding to the bottom layer, over the channel to the next station, and back up the hierarchy.

PDF: Short for Portable Document Format, a file format developed by Adobe Systems. PDF captures formatting information from a variety of desktop publishing applications, making it possible to send formatted documents and have them appear on the recipient's monitor or printer as they were intended.

SIP2: A transfer protocol between library automation devices and Automated Circulation Systems (ACS). This protocol provides a standard interface between a library's Automated Circulation System and library automation devices. The protocol was developed originally as an interface between the Automated Circulation System and the 3M SelfCheck system ... This standard will be applicable to ACS system interfaces to automated devices and services where patron information and/or library material information is required. (3M Standard Interchange Protocol v.2.00, available upon request.)

SSL: Short for Secure Sockets Layer, a protocol developed by Netscape for transmitting private documents via the Internet. SSL works by using a private key to encrypt data that is transferred over the SSL connection. Both Netscape Navigator and Internet Explorer support SSL, and many Web sites use the protocol to obtain confidential user information, such as credit card numbers. By convention, Web pages that require an SSL connection start with "https:" instead of "http:".

SSL versus HTTPS: Whereas SSL creates a secure connection between a client and a server, over which any amount of data can be sent securely, S-HTTP is designed to transmit individual messages securely ...
SSL and S-HTTP have very different designs and goals, so it is possible to use the two protocols together. Whereas SSL is designed to establish a secure connection between two computers, S-HTTP is designed to send individual messages securely.

TCP/IP: Short for Transmission Control Protocol/Internet Protocol, the suite of communications protocols used to connect hosts on the Internet. TCP/IP uses several protocols, the two main ones being TCP and IP. TCP/IP is built into the UNIX operating system and is used by the Internet, making it the de facto standard for transmitting data over networks.

X.509: The most widely used standard for defining digital certificates.

XML: Extensible Markup Language, a specification developed by the W3C. XML is a pared-down version of SGML, designed especially for Web documents. It allows designers to create their own customized tags, enabling the definition, transmission, validation, and interpretation of data between applications and between organizations.

Z39.50: The Z39.50 standard supports a client-server computing environment in which a single user interface (the client) can access information from multiple sources (the servers). The system allows libraries a common user interface for searching their library catalog, locally mounted reference databases, and popular commercial databases mounted on remote servers around the world.

NOTES

1. Clifford Lynch. "The Changing Role in a Networked Information Environment," Library Hi Tech 15/1-2 (1997): 30-38. Note: much of this history of authentication is based on Lynch's work in this article.
2. Clifford Lynch. "Authentication: Progress in the US" (report #30). 1/19/98 [4/2/01].
3. Kent Cearley and Lindsay Winsor. "Securing IT Resources with Digital Certificates and LDAP." University of Colorado System Office. 3/31/98 [3/31/01].
4. Lynch. "Changing Role."
5. Cearley and Winsor.
6. Ibid.
7. Lynch.
"A White Paper on Authentication and Access Management Issues in Cross-Organizational Use of Networked Information Resources." April 14, 1998 [4/13/01].
8. Ibid.
9. There are companies that sell entire access management packages to campuses. Some of these packages include access management components for online resources. One example is Campus Pipeline [4/2/01].
10. Lynch. "White Paper."
11. Norman Wiseman. "Athens: One Year On." Presentation to CNI. Seattle, 12/98 [4/2/01].
12. Lynch. "White Paper."
13. Ibid.
14. Lynch. "Changing Role."
15. Lynch. "White Paper."
16. Ibid.
17. Ibid.
18. Ibid.
19. Ibid.
20. "University Common Authentication Project." Information Resources & Communications. 7/1/97 [4/2/01].
21. Cearley and Winsor.
22. Timothy Cole. "Using Bluestem for Web Users Authentication and Access Control of Library Resources," Library Hi Tech 15/1-2 (1997): 58-71.
23. William V. Garrison and Gregory A. McClellan. "Authentication and Authorization, Part 2. Tao of Gateway: Providing Internet Access to Licensed Databases," Library Hi Tech 15/1-2 (1997): 39-54.
24. George Machovec. "User Authentication and Authorization in a Networked Library Environment." 11/97 [3/30/01].
25. Lynch. "White Paper."
26. An example of a mechanical proxy server is Microsoft Proxy Server.
27. Lynch. "White Paper."
28. There was an excellent discussion of the benefits of application proxies on the Web4Lib discussion group in late 1999 and early 2000.
29. A product that many university libraries have implemented is EZProxy by Useful Utilities. See [3/30/01].
30. Lynch. "White Paper."
31. Cearley and Winsor.
32. Norman Wiseman. "Authentication: Progress in the UK" (report #31). 1/19/98 [11/12/99].
33. Ira H. Fuchs. "Remote Authentication and Authorization for JSTOR." 9/3/98 [11/17/99].
34. Lynch. "The Changing Role."
35.
See David Millman and Sal Gurnani on "An Architectural Prototype for Certificate-based Authentication and Authorization" [4/2/01]; and Ariel Glenn and David Millman. "Access Management of Web-based Services: An Incremental Approach to Cross-organizational Authentication and Authorization." 11/98 [4/2/01].
36. David Millman. "Cross-Organizational Access Management: A Digital Library Authentication and Authorization Architecture" [4/2/01].
37. Cearley and Winsor.

Submitted: 2/7/2001
Accepted: 7/13/2001

