Thursday, January 18, 2007

OJAX Federated Search Service Software

An exciting announcement about OJAX, an open-source federated search tool.
OJAX federated search service software is now in Beta release and available for download. Version 0.7 has improved performance, stability and user feedback, as well as additional features such as RSS/Atom feed support. (Atom feeds of stored searches alert users when new content matching their interests is harvested.)

OJAX illustrates how federated search services can respond to new user expectations generated by Web 2.0:

  • Rich, dynamic user experience. OJAX uses Ajax technology to provide immediate dynamic response to user input.
  • Intuitive interface. The OJAX interface provides the simplicity and familiarity of Google but with the power of advanced search.
  • Integration, interoperability and reuse. OJAX uses loosely coupled Web Services and supports the OpenSearch RSS standard, thus facilitating integration with a range of virtual library environments, institutional repositories, course management systems and institutional portals.
  • Open source standards-compliance. OJAX supports best-practice open source standards and software, including OpenSearch, OAI-PMH, StAX and Apache Lucene.
Features of OJAX:
  • Auto-completion of search terms
  • Triggering of auto-searches
  • Dynamically scrollable search results - no more navigating between pages
  • Auto-expansion of search result details
  • Rapid sorting of results
  • Integrated with the Firefox 2 / IE 7 search feature
  • Supports OpenSearch Discovery
  • Stored searches as Atom feeds
  • Includes an OAI-PMH harvester
  • Easy to install in your own institution
Further information, demo and download.
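Because OJAX supports OpenSearch, its result feeds can be consumed by any RSS-aware client. Here is a minimal sketch of such a client; the base URL and query parameter names are illustrative only (a real client would read them from the service's OpenSearch description document, and OJAX's actual endpoint may differ):

```python
# Sketch: consuming an OpenSearch-style RSS result feed.
# The endpoint URL and parameter names are assumptions, not OJAX's real API.
from urllib.parse import urlencode
from urllib.request import urlopen
import xml.etree.ElementTree as ET

def parse_rss_titles(xml_text):
    """Pull the item titles out of an RSS 2.0 result feed."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title") for item in root.iter("item")]

def opensearch_titles(base_url, terms, count=10):
    """Query a (hypothetical) OpenSearch endpoint and return matching titles."""
    query = urlencode({"searchTerms": terms, "count": count})
    with urlopen(f"{base_url}?{query}") as resp:
        return parse_rss_titles(resp.read())
```

The same parsing works for the stored-search Atom feeds mentioned above, with the Atom namespace substituted for the plain RSS element names.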

Two alternative packages are available:

  1. OJAX GUI, Web Services & Harvester
  2. OJAX GUI, Web Services, Harvester & example repository index
--
Dr Judith Wusteman

WebDAV

I'm wondering why Web-based Distributed Authoring and Versioning (WebDAV) is not more common. It seems pretty simple and has MS support, yet I never have heard of it being used. Or am I just missing it?
The WebDAV protocol's aim was to make the World Wide Web a readable and writable medium, in line with Tim Berners-Lee's original vision. It provides functionality to create, change and move documents on a remote server (typically a web server or "web share"). This is useful, among other things, for authoring the documents a web server serves, but it can also be used for general web-based file storage accessible from anywhere.

Important features of the WebDAV protocol include locking (overwrite prevention), properties (creation, removal, and querying of information such as author and modified date), namespace management (the ability to copy and move Web pages within a server's namespace) and collections (creation, removal, and listing of resources).

Most modern operating systems provide built-in support for WebDAV. With the right client and a fast network, using files on a WebDAV server can be almost as easy as using those stored in local directories.
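To make the protocol concrete, here is a sketch of a PROPFIND request (the WebDAV method for listing a collection) using only Python's standard library. The host and path are placeholders, and real servers usually require authentication on top of this:

```python
# Sketch of a WebDAV PROPFIND request. Host/path are placeholders;
# production use needs authentication and HTTPS.
import http.client
import xml.etree.ElementTree as ET

def parse_hrefs(xml_bytes):
    """Extract resource hrefs from a 207 Multi-Status response body."""
    root = ET.fromstring(xml_bytes)
    return [e.text for e in root.iter("{DAV:}href")]

def propfind(host, path, depth="1"):
    """List the members of a WebDAV collection."""
    body = '<?xml version="1.0"?><propfind xmlns="DAV:"><allprop/></propfind>'
    conn = http.client.HTTPConnection(host)
    conn.request("PROPFIND", path, body,
                 {"Depth": depth, "Content-Type": "application/xml"})
    resp = conn.getresponse()  # a WebDAV server answers 207 Multi-Status
    return parse_hrefs(resp.read())
```

A `Depth: 1` request lists the collection and its immediate members; `Depth: 0` returns properties of the resource itself.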

Wednesday, January 17, 2007

Call to Koha Users

LibLime, the support company for Koha, posted this announcement:
The Koha project is working to improve Koha's visibility by adding Koha users to an important automation list. This list is maintained by Marshall Breeding (U.S. researcher), and tracks libraries worldwide and what ILS they are using. Marshall puts out a library technology guide every year (this year's is upcoming) which is very influential in helping libraries select an automation system.

Traditionally, Koha has not been included in his guide, and we are trying to change that this year :) In fact, Marshall has specifically invited Koha users to include themselves in the guide, to ensure open-source automation gets the recognition it deserves. We've been encouraging Koha users to add information about themselves to the site, and adding libraries ourselves as we come across them.

If your library is using Koha, make sure you are counted! To check whether your library has already been added to Marshall Breeding's list, you can do a search for your library.

If your library is not listed, you can add yourself.

You'll need to add your library by the end of the month to be counted for this year's guide.

Tuesday, January 16, 2007

PURL Spam

The spam problem on the OCLC PURL server has been resolved.
A new PURL server has been put into service.

We deleted what we thought was spam and added a disclaimer about PURLs to the first page.

53,674 PURLs were deleted, along with 95 user IDs.

PURLs has now changed: to get a user ID, you will need to request one from the system administrator (me, for now).

Users with an existing user ID should still be able to create PURLs, domains, groups, etc.

I don't know if I caught everything we think was spam, but if any are found, let me know; likewise, if any user ID was deleted that should not have been, let me know.

Tom Dehn
OCLC Inc.

D-Lib Magazine

The latest issue of D-Lib Magazine contains several pieces of possible interest to catalogers.
  • Distinguishing Content from Carrier: The RDA/ONIX Framework for Resource Categorization by Gordon Dunsire discusses the results of a meeting between the RDA and ONIX communities.
  • Resource Description and Access (RDA): Cataloging Rules for the 20th Century by Karen Coyle and Diane Hillmann examines problems with RDA.
  • The Online Library Catalog: Paradise Lost and Paradise Regained? by Karen Markey calls for a replacement to the OPAC.

Link Evaluator

Link Evaluator is a free Firefox add-on from OCLC.
Link Evaluator is a Firefox extension designed to help users evaluate the availability of online resources linked to from a given Web page. When started, it automatically follows all links on the current page, and assesses the responses of each URL (link).

Link Evaluator examines both the HTTP status code and the page contents returned by each URL.
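The general technique is easy to sketch. This is not OCLC's code, just an illustration of the approach: collect the links on a page, then check each URL's HTTP status to judge availability:

```python
# Illustrative link checker (not OCLC's implementation): gather the
# absolute links on a page, then record each URL's HTTP status code.
from html.parser import HTMLParser
from urllib.request import urlopen, Request
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collect absolute http(s) links from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("http"):
                self.links.append(href)

def check_links(html):
    """Return {url: status_code} for every link on the page (None if unreachable)."""
    collector = LinkCollector()
    collector.feed(html)
    results = {}
    for url in collector.links:
        try:
            with urlopen(Request(url, method="HEAD"), timeout=10) as r:
                results[url] = r.status
        except HTTPError as e:
            results[url] = e.code
        except URLError:
            results[url] = None
    return results
```

A fuller checker would also inspect the returned page body, as Link Evaluator does, since many "soft 404" pages come back with a 200 status.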

Friday, January 12, 2007

OLAC Newsletter

I got my copy of the OLAC Newsletter yesterday. Always a good read. This issue has conference reports as well as the ever-enlightening Cataloger's Judgment. The reports often have the PowerPoint presentations and examples available. The on-line version is open and free to all. Looks like it was a great conference. The next one is in two years; start planning to attend now. A three-year membership in OLAC is only $70.00, a best buy to be sure.

Wednesday, January 10, 2007

MIT Catalog

The MIT catalog is once again available for download, not as MARC but only as MODS and MODS/RDF. If you need or want a large dataset for testing or research, this is a good option. The announcement gives more details.

The CONSER Standard Record and RDA

A comparison of the proposed CONSER standard record and RDA has been mounted on the JSC Web site.

Monday, January 08, 2007

Using Barcodes in Search

Brian Surratt at Texadata mentions he is having some success scanning barcodes on the back of books when doing ISBN searches on OCLC. It is not yet 100% accurate; many of these numbers are in the 024 field. Yet it is working and will only get better with time.

I've been using barcode scans in ISBN searching on my Z39.50 client with mixed results. OCLC is creating 13-digit codes based on the 10-digit input code. They are also moving those in 024 to 020. Few if any institutions will do this work. Using scans in Z39.50 searches will be uncertain for many years to come. Maybe the profiles could do some remapping and have an ISBN search hit both the 020 and 024 fields?
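The 10-to-13-digit derivation itself is mechanical: prefix "978", drop the old check digit, and recompute the new one with the ISBN-13 weighting. A sketch:

```python
# Convert an ISBN-10 to its ISBN-13 form, the same derivation OCLC applies.
def isbn10_to_isbn13(isbn10):
    isbn10 = isbn10.replace("-", "")
    core = "978" + isbn10[:9]  # keep the 9 data digits, drop the old check digit
    # ISBN-13 check digit: weights alternate 1, 3 across the first 12 digits
    total = sum((1 if i % 2 == 0 else 3) * int(d) for i, d in enumerate(core))
    check = (10 - total % 10) % 10
    return core + str(check)
```

Because the check digit is recomputed, the conversion works even when the ISBN-10 ends in "X".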

Additions to the MARC Code Lists for Relators, Sources, Description Conventions

The codes listed below have been recently approved for use in MARC 21 records. The codes will be added to the online MARC Code Lists for Relators, Sources, Description Conventions.

The codes should not be used in exchange records until after March 5, 2007. This 60-day waiting period is required to provide MARC 21 implementers time to include newly defined codes in any validation tables they may apply to the MARC fields where the codes are used.
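A validation table of this kind can be as simple as a mapping from each code to its effective date. The sketch below is illustrative only, using the codes announced here:

```python
# Illustrative validation table for subfield $2 source codes: each code
# carries the date after which it may appear in exchange records.
# Structure and API are assumptions, not any actual MARC software.
from datetime import date

SOURCE_CODES = {
    "biccbmc": date(2007, 3, 5),  # BIC Children's Books Marketing Classifications
    "ykl": date(2007, 3, 5),      # Finnish public libraries classification
}

def code_is_valid(code, on=None):
    """True if the $2 code exists and its waiting period has elapsed."""
    effective = SOURCE_CODES.get(code)
    if effective is None:
        return False
    return (on or date.today()) > effective  # usable strictly after that date
```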

Term, Name, Title Sources

The following code is for use in subfield $2 in Bibliographic and Community Information records in fields 600-651 and 655-657 and in subfield $f in field 040 (Cataloging Source) in Authority records.

Addition:
biccbmc
BIC Children's Books Marketing Classifications (London: Book Industry Communication) [use after March 5, 2007]
Category Code Sources

The following code is for use in subfield $2 in field 072 in Authority and Bibliographic records (Subject Category Code / Code source).

Addition:
biccbmc
BIC Children's Books Marketing Classifications (London: Book Industry Communication) [use after March 5, 2007]
Classification Sources

The following code is for use in subfield $2 in field 084 in Bibliographic and Community Information records (Other Classification Number), in subfield $2 in field 084 in Classification records (Classification Scheme and Edition) and in subfield $2 in field 065 in Authority records (Other Classification Number).

Addition:
ykl
Yleisten kirjastojen luokitusjärjestelmä (Finnish public libraries classification system) [former code: fplcs] [use after March 5, 2007]

Friday, January 05, 2007

Rights in the PREMIS Data Model

News from LC:
The Library of Congress' Network Development and MARC Standards Office is pleased to announce the availability of a study written by Karen Coyle on how rights information needed for digital preservation activities is handled in the PREMIS data dictionary.
Rights in the PREMIS Data Model.

MAB2 --> MARC21

"Moving from MAB2 to MARC 21: Recording of multipart monographs and their parts in MARC 21: Predefinitions and examples" gives an interesting German view of MARC.

Thursday, January 04, 2007

Notice to Music Teachers

Cora Bigwood, my spouse, is running for region rep for the American Orff Schulwerk Association (AOSA). If you or someone you know belongs to AOSA, check your mail for your ballot and consider voting for Cora for Region III. Here is part of a bio she recently wrote:
My music background began with a BM in Music Teacher Education K-12 (Instrumental and Vocal) from the University of Houston and an MM in Music Education from the New England Conservatory. I have Orff Certification, a Kodaly Certificate, and Mid-Management and Supervision Certificates.

I presently serve as the TMEA Elementary VP and have been a music teacher for 27 years in the Texas public school system. Having taught at the elementary, secondary, and college levels, I bring to my leadership in our organization a wide perspective on the many issues that face all disciplines (band, orchestra, vocal, elementary, college) across Texas.

My responsibilities on the State board are numerous. As division VP, I organized the Elementary portion of the 2006 TMEA state conference and am proud of the following achievements:

  • Over 50 clinicians from around the state and country were solicited for our members
  • Increased the performance venues for my division and solicited performances throughout the state
  • Implemented poster sessions for interdisciplinary lesson plans
In addition to these achievements at the 2006 convention, I actively participated in meetings and committees focusing on the following issues:
  • Region and area alignment implementation
  • TMEA appeals process
  • Vertical alignment of the Music TEKS as defined in the TMEA/TMAC Curriculum-Assessment Project
  • Neutralizing the Negative Impact of TAKS
  • Non-attendance due to TAKS remediation
  • Staffing shortages and ever-present state funding concerns

As a teacher, I have worked with elementary through secondary-aged children, teaching general music, choir, and individual flute students. I judge UIL events and supervise student teachers. I train our future music educators at the college level during summers (University of Houston and University of Texas at Brownsville).

I present staff development workshops for school districts and Education Service Centers. In my spare time, I also make music on my own. From 1993 to 2003, I managed and performed in the folk band Permanent Wave, which performed for my session at TMEA 2002. In addition, my Orff ensemble has performed at TMEA. I have performed with the Bay Area Chorus, Clear Lake Symphony, Baytown Symphony, and Houston Symphonic Band.

I can't think of anyone better qualified. Anyone who reads this knows I am very proud of her accomplishments.

MARC Content Designation Utilization (MCDU) Project

The progress report on the MARC Content Designation Utilization (MCDU) Project is available. Very important work.

OPAC Replaced by FISH

Christopher Harris at Infomancy has an OPAC replacement, a FISH: Free (as in kittens) Integrated Search Handler. Using freely available tools, he has put together a system to replace the catalog. Seeing what he has done is fascinating and inspiring.

With all the work being done by the Koha and Evergreen folks, the Solr/Lucene crowd, and the eXtensible Catalog project, it seems open-source solutions for the OPAC will become increasingly attractive alternatives to the large commercial products.

unAPI Plugin for WordPress

Technosophia has announced a WordPress plugin for unAPI.
I've finally gotten around to updating the unAPI plugin for WordPress so that it fits into the WordPress plugin architecture, making it simple to install and maintain. I'm calling it version 1.0 since it's the first substantial release of the plugin since I got involved.
And just what is unAPI?
unAPI is a tiny HTTP API for the few basic operations necessary to copy discrete, identified content from any kind of web application.

There are already many cool APIs and protocols for syndicating, searching, harvesting, and linking from diverse services on the web. They're great, and they're widely used, but they're all different, for different reasons. unAPI only provides the few basic operations necessary to perform simple clipboard-like copy of content objects across all sites. It can be quickly implemented, consistently used, and easily layered over other well-known APIs.
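The whole protocol is small enough to sketch in a few lines. This toy version (the object store and format list are stand-ins, not the WordPress plugin's code) shows the three kinds of unAPI response: no arguments, an id alone, and an id plus format:

```python
# Toy unAPI responder. OBJECTS and the "rss" format are placeholders;
# the status codes follow the unAPI convention (300 for a per-id format list).
OBJECTS = {"post-1": {"rss": "<item><title>Example post</title></item>"}}

FORMATS = '<formats><format name="rss" type="application/xml"/></formats>'

def unapi(id=None, format=None):
    """Return (http_status, body) for a unAPI request."""
    if id is None:
        return 200, FORMATS        # formats the server supports overall
    if format is None:
        return 300, FORMATS        # formats available for this identifier
    obj = OBJECTS.get(id, {}).get(format)
    return (200, obj) if obj is not None else (404, "")
```

Hanging this off a single URL in a web application is all it takes to make its content harvestable by unAPI-aware tools.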

Wednesday, January 03, 2007

OCLC PURL Server

This message from Tom Dehn was distributed on the PURL-L e-mail distribution list.
There are 58000+ spam sites in the PURL Server.

I am working on a means to delete them (It needs further development and testing).

The PURL server has been available to anyone in the past, but we have to discuss that policy, because being available to everyone means it's available to spammers. Also, the ability to delete a PURL was not part of the initial development.

As of this time, the ability to add or modify PURLs or domains has been disabled. It will remain that way until we have derived a new policy.

The ability to access existing PURLs is still available.

What tool won't the spammers destroy?