[Eril-l] Discovery System/ILS comparison charts

Donley, Leah donley at bnl.gov
Fri Nov 9 09:42:16 PST 2018


Thank you, Robert!  This looks very helpful.

Regards,
Leah


Leah Donley
Research Library - Building 477
Brookhaven National Laboratory
Tel: 631-344-7469
Email: donley at bnl.gov



From: Robert Heaton <robert.heaton at usu.edu>
Sent: Wednesday, November 07, 2018 4:40 PM
To: Donley, Leah <donley at bnl.gov>; 'eril-l at lists.eril-l.org' <eril-l at lists.eril-l.org>
Subject: RE: Discovery System/ILS comparison charts

Leah,

We have not gone through the process of considering and selecting a discovery system recently, but we have found this article very helpful in framing our planning: Joseph Deodato, "Evaluating web-scale discovery: A step-by-step guide," Information Technology and Libraries, vol. 34, no. 2 (2015), https://ejournals.bc.edu/ojs/index.php/ital/article/view/5745/pdf. It gives example user surveys, sample searches for testing relevancy ranking, questions by category to ask vendors in an RFP, and a scoring-rubric template. There's enough detail there that you'd need to pick and choose the processes and criteria that best apply to your needs.

Another helpful resource is Mary Pagliero Popp & Diane Dallis (eds.), Planning and Implementing Resource Discovery Tools in Academic Libraries, 2012, Information Science Reference/IGI Global. Along with other valuable information, chapter 8, "Designing an evaluation process for resource discovery tools," by David Bietila and Tod Olson, includes two "checklists": one for evaluating technical features, the other for evaluating user-facing functions. There would still be some work to define something like a rubric, expanding on what you're looking for in each feature area. I hope I'm not out of line in reproducing those tables here:

Table 1. Technical evaluation checklist with sample requirements

Technical Feature / Important Issues

Hardware requirements
*  Affects cost of ownership
*  Backup or development server may be desired
*  Hosted applications require no local hardware purchase
*  Tradeoff between cost and local control of environment

Operating system requirements
*  Staff must be able to support the OS and keep up with patches, security, and system monitoring
*  Overhead differs between a familiar and an unfamiliar OS; bringing in a new OS will place additional demands on staff

Indexing performance and capacity
*  Indexing may affect search performance, or require a dedicated indexing host at large scale
*  Complete re-index vs. incremental updates
*  Issues of scale with respect to the data (records or full text) and the number of simultaneous users

Search responsiveness
*  Do the users perceive a performance problem?

Server downtime requirements
*  Users increasingly expect 24/7 uptime
*  Locally hosted systems may require some maintenance window

Ongoing maintenance requirements
*  Regular backups
*  Frequency of software updates
*  Staff time required for software updates
*  Data management issues, e.g. regular imports

Staff skills required for ongoing operations and maintenance
*  OS administration skills
*  Any required programming or scripting skills
*  Familiarity with application operation and configuration

Configuration options and customization
*  Visual customization and branding
*  Display of data fields and labels
*  Choice of data types and sources to include in indexes

Support for importing local data; supported data formats
*  Can key local collections or data sources of local significance be added?
*  Supported data formats (MARC, Dublin Core, etc.)

Integration APIs and output formats
*  Possible to integrate with other applications:
   o  incorporate results into other systems
   o  launch searches into the discovery tool
*  Are the APIs well documented? Do they conform to existing standards?
*  Ensure support for needed formats, e.g. XML APIs, SRU, RSS feeds, etc.

Browser requirements
*  Must work equally well with all major Web browsers
*  Does not require JavaScript to be turned on


Table 2. Functional evaluation checklist

Functional Area / Specific Features

Content
*  handling of content types, data formats, and metadata formats
*  coverage
   o  the range of particular article databases and electronic resources included in the index
*  stop words
*  language support
   o  non-English searching
   o  storage and display of non-English data
   o  support for Unicode-encoded text

Search
*  advanced (or fielded) searching
*  Boolean searching
*  nested searching
*  wildcards and truncation

Query expansion
*  search suggestions
*  related or synonymous search terms
*  spelling suggestions
*  optional federated search features

Results
*  search performance
*  sorting options
*  deduplication of results
*  support for export to citation management tools
*  search refinements
*  faceted browsing of results
*  relevance ranking

Record management
*  saving searches and records
*  adding tags or personal content
*  annotation or manipulation of saved content

Administration
*  logging features
   o  search logs
   o  download logs
*  administrative interface and configuration options

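To turn checklists like these into a side-by-side comparison chart, one common approach is a weighted scoring sheet: weight each feature area by local importance, have staff score each vendor per area, and compare the weighted totals. A minimal sketch in Python; the feature areas, weights, vendor names, and scores below are illustrative placeholders, not taken from either source:

```python
# Minimal weighted-scoring sketch for comparing discovery platforms.
# All feature areas, weights, vendors, and scores are placeholders.

# Weight each feature area by local importance (higher = more important).
weights = {
    "Content coverage": 5,
    "Search features": 4,
    "Results handling": 4,
    "Integration APIs": 3,
    "Ongoing maintenance": 2,
}

# Average staff scores (0-5) per vendor, per feature area.
scores = {
    "Vendor A": {"Content coverage": 4, "Search features": 3,
                 "Results handling": 5, "Integration APIs": 2,
                 "Ongoing maintenance": 4},
    "Vendor B": {"Content coverage": 3, "Search features": 5,
                 "Results handling": 4, "Integration APIs": 4,
                 "Ongoing maintenance": 3},
}

def weighted_total(vendor_scores, weights):
    """Sum of (score * weight) across all feature areas."""
    return sum(vendor_scores[area] * w for area, w in weights.items())

# Print vendors ranked by weighted total, highest first.
for vendor, s in sorted(scores.items(),
                        key=lambda kv: weighted_total(kv[1], weights),
                        reverse=True):
    print(f"{vendor}: {weighted_total(s, weights)}")
```

The per-area scores are as useful as the totals: areas where the leading product scores poorly become negotiation points or workflow risks to flag before committing.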


Robert Heaton
Collection Management Librarian
Utah State University Libraries

From: Eril-l <eril-l-bounces at lists.eril-l.org> On Behalf Of Donley, Leah
Sent: Wednesday, November 7, 2018 7:32 AM
To: 'eril-l at lists.eril-l.org' <eril-l at lists.eril-l.org>
Subject: [Eril-l] FW: Discovery System/ILS comparison charts

I'm realizing that I didn't word my question very well. That being said, the responses I've received so far have been very helpful!

I'm hoping for a form, maybe a chart or a question-based template (or perhaps something better that I'm not thinking of!), that we can use while evaluating the various platforms to compare their features and functionality. In other words, something that will let us compare both the nitty-gritty and the big-picture functionality as it relates to our local workflows. The feedback I received from staff after the demos we saw was helpful, but not very focused. We're a small staff, and our roles are somewhat specialized, so I'm really interested in comparing how each product would actually work for us, beyond reactions like "I really like this" or "I don't like this" that never get into the details (although those initial gut reactions are of course helpful too).

Thanks again,
Leah


Leah Donley
Research Library - Building 477
Brookhaven National Laboratory
Tel: 631-344-7469
Email: donley at bnl.gov



From: Eril-l <eril-l-bounces at lists.eril-l.org> On Behalf Of Donley, Leah
Sent: Tuesday, November 06, 2018 11:15 AM
To: eril-l at lists.eril-l.org
Subject: [Eril-l] Discovery System/ILS comparison charts

Has anyone used a comparison sheet/chart etc. when looking at Discovery Systems and/or Library Services Platforms?  I'm looking for something to guide us and organize our thoughts as we utilize various platform trials from different vendors.


Thank you,
Leah


Leah Donley
Research Library - Building 477
Brookhaven National Laboratory
Tel: 631-344-7469
Email: donley at bnl.gov

