RFID matters…

5 03 2016

So we are finally RFID-tagging our library. Better late than never! A few thoughts on our experiences. Unlike many other libraries, which had the luxury of hiring a company to do this, we’ve had to use our existing staff resources due to a lack of funding. We’re looking at tagging around 150,000 items over three floors on one site.

We opted to use teams of two people in order to reduce the boredom factor. Everyone from the top down is involved and gets at least one weekly one-hour slot on the rota. We’ve exceeded expectations in terms of throughput, tagging about 5,000 items per week. The funny thing is that sticking tags on books is strangely therapeutic when you spend the rest of the week on more cerebral activities. It also greatly aids staff communication, and you discover things about the library that you never knew. Anyway, if that helps anyone else who has to sell this to their staff, so much the better.

The other thing we had to do was gauge the right time to start tagging books as they were returned. We opted to do this about three months into a ten-month project, which was probably a little late: it is surprising how much stock gets tagged via that route. You do, however, have to be careful not to re-tag a book that has already been tagged and returned to the shelf. I’ve done it myself once or twice and was sent to stand in the corner!

 





Conducting a lean and yet not mean process review

24 02 2015

In 2014, we conducted a process review in preparation for migrating to a new Library Management System (LMS). We had been using our SirsiDynix system since about 1998, and in common with many other libraries, we were keen to move to a next-generation system reflecting the changing landscape of the sector, in particular the shift towards digital.

Taking note of the JISC LMS Change tools, we wanted to conduct a process review which would help to inform our specification and, in turn, the system we ended up with. I attended a useful talk at the Ex Libris User Group meeting in which a university described the procedure they had followed prior to choosing and implementing Ex Libris ALMA. They described the use of the LEAN process (this area is acronym-laden), in which essentially the following steps are taken:

  1. Document current processes.
  2. Review each process and derive the objective behind it.
  3. Identify the quality drivers which need to be in place for the objective to be achieved.
  4. Describe how we will measure whether this has happened.
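As a toy illustration of the output of those four steps (the field names are entirely my own, not part of the LEAN method), one reviewed process could be captured as a simple record:

```javascript
// Illustrative only: a single reviewed process as a plain record.
// The field names mirror the four steps above; none of this is a LEAN artefact.
var storeRequest = {
  process: 'User fills in a paper slip to request an item from the store',
  objective: 'User obtains store material with minimal delay and effort',
  qualityDrivers: [
    'Accurate item location data',
    'Requests checked without re-keying catalogue details'
  ],
  measure: 'Median time from request to item collection'
};
```

Collecting the records in a consistent shape like this makes it easy to spot processes whose objective no longer justifies the effort.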

The great thing about the process is that it is totally focused on the needs of the user, as opposed to what a librarian thinks the user needs. The other great thing is that it empowers your staff to tell you their experiences from the coal face, which is likely to lead to much better outcomes than relying on a bunch of senior staff theorising in an ivory tower.

Having said that, it is still quite difficult to get some of your library staff to be objective and to really unshackle themselves from that ever-lurking “the librarian always knows best” attitude.

One of the other challenges is to train staff who prefer words (presumably that’s why some of us became librarians) to use largely visual tools such as flow charts. However, I think some brief work to explain the workings of these beforehand did pay dividends and so it was not as much of an obstacle as I had feared.

We were very careful to decouple the processes from the library system per se. We asked staff to discount the fact that we do such-and-such because it’s the only way the library system can handle a process, and instead to think critically about why we do things. We framed this as thinking from the point of view of the services the user needs.

It was pleasing to see that leaving junior and middle-level managers to get on with this in groups, using post-it notes, was something people felt able to actively participate in. Almost all groups realised, at least to some degree, that various services we had been offering were no longer as necessary as they had been. This makes the decision to decommission any services that really aren’t needed far easier to implement, as your staff are with you rather than resisting change. What was perhaps less successful was our request to think of services in new areas which we don’t offer at present but perhaps should be in the 21st century. I found this especially surprising from younger staff, who were probably HE library users themselves quite recently, but perhaps this indicates just how fast things are changing in the sector.

If I were doing it again, what would I change? I think it would be very powerful to introduce some real users into some of the groups to offer their view and trigger conversations in areas which might not otherwise have been considered. The old Amazon-voucher trick may well be money well spent and provide a better outcome.

 





Store forms

6 06 2013

We’ve been using bits of paper to record requests for material in the stores for years. Each one has to be checked by someone (who has to look the item up in the catalogue again) to ensure the user has completed it correctly. So I have done some work on our SirsiDynix Symphony system to automate this process and free staff to do more interesting things.

The basic idea was to expose a link on categories of material which are located in store and provide a request button at item level. We also wanted it to be as user-friendly as possible. So if you click the link when in public mode, it will prompt for login and then take you back to the request form.

Amongst other things, it means requests will be creatable from outside (via the web) and we can start to offer a pre-ordering service.

When a request is created, it appears in the user’s account area so they can monitor progress. The requests are printed in batch (one page per request) each day, and the “fetch” can happen as normal, with the slips being inserted into the items.

A fair amount of customisation was needed both at the public interface and within the reports. Some of it ain’t pretty but it seems to work.

The request form is rendered entirely in JavaScript, as this was the only way to pass the parameters from page to page. Data codes available on one page are not necessarily available on another, and finding which apply where is often a case of trial and error.
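A minimal sketch of the sort of thing involved, assuming the item details are carried between pages in the query string (`parseQuery` and `buildQuery` are my own illustrative helpers, not Symphony functions):

```javascript
// Hypothetical helpers for carrying data between pages via the query string,
// since Symphony data codes are not available on every page.
function parseQuery(search) {
  var params = {};
  search.replace(/^\?/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    var kv = pair.split('=');
    params[decodeURIComponent(kv[0])] = decodeURIComponent(kv[1] || '');
  });
  return params;
}

function buildQuery(params) {
  return '?' + Object.keys(params).map(function (k) {
    return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
  }).join('&');
}
```

The record-display page can then build a request link carrying the item details, and the form page can read them back without needing the same data codes to be exposed there.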

For the reports, we wanted the user’s name to appear at the top of the slip (bold was not possible, so we resorted to lots of stars). To do this, we actually modify the default finished output file with a custom “reformatting” report. Again, crude, but it seems to do the job.
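A rough sketch of the star idea (the helper name and layout are my own; the real work happens in the Symphony report output, not client-side code):

```javascript
// Wrap a name line in a star banner, since bold output was not available.
// emphasiseName is a hypothetical helper; the layout is illustrative.
function emphasiseName(name) {
  var inner = '* ' + name + ' *';
  var banner = new Array(inner.length + 1).join('*'); // row of stars the same width
  return banner + '\n' + inner + '\n' + banner;
}
```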

Next step is to load this all onto the live server!





COSIEMEA Conference Birmingham

9 06 2011

Just started. Looking forward to finding out about the latest developments in Web Services and mobile apps for Symphony. Session on creating a “forgotten PIN” routine in elibrary: very useful and relatively simple.





Koha implementation at Staffordshire University

8 06 2011

A really useful discussion with the library guys from Staffs Uni about their experience of implementing Koha at a university. They have been brave enough to put their heads above the parapet and show this can be done. We will need to give careful consideration to whether to follow, as the arguments in favour of doing so are extremely persuasive. A great meeting.





Having fun with Google Analytics

3 06 2011

We’ve had GA running on the library catalogue and repositories for a while now, but it was very much a ten-minute set-it-up-and-see-what-it-produces approach. Today I’ve been looking at page event tracking. We’d like to know which 856 links our users click on, so we can see what impact these have. Similarly, in the eprints repositories, it would be great to know which files are being downloaded. There are in-house stats for eprints, and part of our experiment will be to see how far we can push GA to replicate (or exceed?) them, or not as the case may be.

Basically, adding page event code is a two-stage process. First, add the asynchronous GA snippet to your page header, just above the </head> tag. In our SirsiDynix Symphony LMS (we are on the revD version of elibrary), you amend html_head.h and html_head_tabbed.h and paste the code there. Next, go to the page with the event you want to track. For us, this was view_marc.h, and a line which looked like:

<a href="#" onclick="javascript:open_win('<SIRSI_Data List_DC="nH" NORTL CVTJS>')">

We simply added a call to the _gaq.push function, viz:

<a href="#" onclick="javascript: _gaq.push(['_trackPageview', '<SIRSI_Data List_DC="nH" NORTL CVTJS>']); open_win('<SIRSI_Data List_DC="nH" NORTL CVTJS>')">
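One reason this is safe to paste in even before ga.js has finished loading is that `_gaq` starts life as a plain JavaScript array, so early calls are simply queued and replayed later (the account ID and path below are placeholders of my own):

```javascript
// Before ga.js loads, _gaq is just an array; push() queues commands
// which the real tracker replays once the script arrives.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXX-X']); // placeholder account ID
_gaq.push(['_trackPageview', '/elibrary/856-click']); // virtual pageview, illustrative path
```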

Now waiting to see if it works!
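For the repository downloads mentioned above, classic ga.js also offers event tracking, which might suit file links better than virtual pageviews. A sketch, with the category, action, and file path purely illustrative:

```javascript
// Queue a download event for an eprints file link using classic ga.js.
// 'Downloads'/'click' and the example path are my own choices, not fixed values.
var _gaq = _gaq || [];
function trackDownload(fileUrl) {
  _gaq.push(['_trackEvent', 'Downloads', 'click', fileUrl]);
}
trackDownload('/eprints/1234/thesis.pdf');
```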

A question which remains is how this will react to a user coming from a PDF link in a Google result. It matters to us because we believe a lot of our traffic to the repositories comes via Google. If such a visit bypasses the header of the web page, presumably it won’t be recorded. Hmm, more testing needed, methinks.