March 15, 2018

RLUK presentation: Understanding Academics

Today I'm presenting at RLUK about our 18-month UX project, Understanding Academics. We're really proud of the project and what we've achieved with it, and this speed presentation provides an overview of both.

We've already written a lot about our UX activities, including this project. If you're interested, you could look at some of our previous posts, including our first post on the project, Understanding Academics UX project, and one detailing how we dealt with all the data the project created, Processing and acting on 100+ hours of ethnography: a 5 stage approach.

There are also lots of other posts about our UX activities and we are in the process of writing an article which, when published, will be linked to from here.

And as always if you want to know more please get in touch with us.

January 30, 2018

Running library induction like a marketing campaign

By Ned Potter, Academic Liaison Librarian 

[Embedded slides: Running Academic Library induction as a marketing campaign, by Ned Potter]

We won an award! York just received a Bronze Marketing Award from CILIP's Publicity and Public Relations Group at the annual PPRG Conference, held in Birmingham in January.

It was in relation to the work we'd done on our Induction project: in 2016 (and again this year) we got rid of the big orientation games and ran induction like a marketing campaign. It worked really well and we got a great response; the elements that weren't successful we improved on this year.

The slides from my talk are above, or alternatively you can view a video below. It has the audio from my talk, with a version of the slides designed for the video rather than for face-to-face presenting - there are videos within the video, and other things to make it a bit more interesting to watch...

It was a great conference. The other award winners had all done really good campaign marketing too, which supports an idea I bang on about all the time: it's campaign marketing that really works. The same message, tailored across multiple platforms, for a concerted period of time - that's what's required for marketing to be effective.

Thanks to the (soon to be renamed!) PPRG for the award and the excellent event.

January 25, 2018

Digital Literacy Programme

By Michelle Blake, Head of Relationship Management

At York we have recently set up a Digital Literacy Programme to improve the digital capabilities and confidence of all our staff and students. Over the last few years we’ve concentrated on student digital literacy, working in partnership with other professional support services and academics on enhancing programmes through the York Pedagogy project. This project involved us commenting on every undergraduate and postgraduate programme and we’ll write something on it soon. It also saw the development of our Skills Guides and Digital Wednesdays initiative.

Over the last year we have begun to focus on staff, and our first stage, information gathering, is now complete. It sought to:
  • Identify the issues and challenges when developing staff digital capabilities across the University;
  • Identify priorities for digital skills development for Library staff and any gaps in confidence; 
  • Learn lessons about staff digital literacy that can be applied to the wider University staff. 

We have been able to condense the detailed findings into these key issues to be addressed:
  • Gaps in foundational level understanding and skills that lead to longer term problems with confidence, attitude and skill 
  • Lack of understanding of collaborative and organisational tools
  • Staff expectations (their own and their managers')
  • Support for new staff 
  • How to support and change practice amongst existing staff 
  • Lack of awareness of what support is available and how to access it
  • Commitment to ongoing staff development
  • Sharing of good practice, inspiring staff and maximising impact
  • Attitudinal barriers and varying levels of confidence
  • Sustainability and scalability

The main outcome of our first project is the establishment of three new projects under the Digital Literacy Programme:

Library & Archives: building digital skills

Timescale: November 2017 - Summer 2019 (anticipated)
This project will take forward many of the findings of this initial scoping project. It will include the analysis of the pilot group data and alignment of individual team recommendations to develop, implement and evaluate a programme of digital skills support for Library & Archives staff. Project components to be taken forward:
  • Three-level training and development programme:
      • IT Essentials: How IT works
      • Working practices: How we work
      • Aspirational: How do I make it work better
  • Creation of role descriptors
  • Exploration of scalability and sustainability
  • Support for existing staff
  • Support for new staff
(It is anticipated that this project will be used as the basis for supporting staff across the University.)

Digital leaders

Timescale: October 2017 - December 2019
This project partners with the University's Leadership and Development Team to embed digital skills within the Leadership in Action programme through the:
  • Creation of a new session on digital leadership
  • Creation and embedding of a self-diagnostic resource for participants 
  • Links to sources of further support for developing personal digital skills
  • Embedding digital tools and tasks throughout the course to model some best practice use of collaborative tools in particular

The first cohort commences in January 2018 and the second finishes in September 2018. We expect to trial approaches during 2018 and refine them for 2019 and beyond.

Training pathways

Timescale: January 2018 - December 2020 (anticipated)
The Digital Literacy Training Pathways project seeks to understand the digital literacy linked to core working practices from across a range of staff stakeholder groups, to ensure that we are providing adequate support and training to new and existing staff. The project proposes the development of generic and bespoke support and training materials tailored to the requirements of staff undertaking a range of roles at the University at different stages in their career.

Some of the expected outputs of the training pathways project include:
  • Surfacing of existing content under the banner of staff digital literacy and identified practices;
  • Identification of digital literacy training requirements of academic staff, professional staff and administrative staff based on work undertaken with pilot groups;
  • Development of new training and support materials based on stakeholder requirements and core digital competencies, including:
      • Online support materials
      • Face-to-face workshops/activities to be embedded in existing programmes
  • Awareness raising and marketing of digital tools/practices;
  • Creation of a collection of case studies from pilot groups to encourage digital literacy development more widely across the University;
  • Development of online platform and access routes to training materials. 

This is the first post introducing the programme of work that we're undertaking, and we hope to provide regular updates about the projects here.

January 09, 2018

Reading Lists: finding out what users really want

By Elizabeth Simpson and Kirsty Whitehead

In summer 2016 it was decided that the Library would upgrade its reading list system. The in-house system we had developed was at the end of its life, and we wanted to switch to a commercially provided product. Members of the Academic Liaison team were on the project group, and their primary responsibility was to engage our university community in the process of selecting the new system.

Gathering user requirements
We were fortunate to be able to pull feedback from a variety of sources. In addition to the knowledge we already had through our work with academic departments, we were able to draw on existing feedback from our subject emails, CRM and, most importantly, the Understanding Academics project that had recently started. Rather than going back out and asking people to comment again, we started by synthesising the rich commentary we already had. The aim was to be able to clearly articulate what features people would like to see in the new reading list system.

What did we do?
Initial work had already been done to code up the Understanding Academics interviews. During the summer of 2016 we looked at the 30+ pages of comments related to reading lists. The first step was to analyse the comments in more detail and identify the main themes that emerged. We also noted our initial thoughts and ideas about how we could take this forward, and created a summary of them. Below is a snapshot of the document:


(Table columns: number of comments; summary of comments; take forward.)

Summary of comments:
  • Alternatives to EARL: currently using paperpile to generate lists.
  • Needs to be more user-friendly for staff and student users. Issues raised: multiple screens, slowness, too many clicks, it's ugly, difficulty moving items within lists and between lists (especially if scans are attached), only works properly with IE; rolling over could be smoother or automatic; should be able to hide specific items for future use; difficulty finding items which are already in stock.

Take forward:
  • Invite staff/student reps to the supplier demos.
  • Make it easier for staff to input the details of the item(s) they want to include (importing/bookmarklet), especially for items on the catalogue.
  • Provide more flexibility for editing lists.
  • Make it easier to find items already in the catalogue?
  • Improve rollover (automatic if possible).
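The coding and tallying step described above can be sketched in a few lines. This is a hypothetical illustration (the actual analysis was done by hand in a shared document, and the themes and comments below are invented for the example):

```python
from collections import Counter

# Hypothetical coded comments: (theme, comment) pairs, as might be
# exported from coded interview notes. Themes are our own labels.
coded_comments = [
    ("usability", "too many clicks"),
    ("usability", "only works properly with IE"),
    ("rollover", "rolling over could be smoother or automatic"),
    ("importing", "currently using paperpile to generate lists"),
    ("usability", "difficulty moving items within lists"),
]

# Count how many comments fall under each theme, most frequent first,
# to produce the "number of comments" column of a summary table.
counts = Counter(theme for theme, _ in coded_comments)
for theme, n in counts.most_common():
    print(f"{theme}: {n} comments")
```

Sorting themes by frequency is a quick way to see where to focus the "take forward" actions first.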

However, this didn't provide clear enough criteria for assessing potential systems later in the project. We developed a second document that really drilled down, listing each requirement separately and stating the needs and goals of our academic staff and students. Below is a snapshot of the document; in total there were 39 requirements listed. This was also useful to share with other colleagues in the project team, who came from Collections and the Electronic Learning Development Team (ELDT).


(Table columns: epic; category; I want... (to do); so that (goal)...; would like to have; HL acceptance criteria.)

Epic: Create a new reading list
Category: Academic staff
I want to: import references from a list I have already created, e.g. Word, paperpile, Google Doc, EndNote or other reference management software (i.e. batch import)
So that: I don't have to spend hours/days creating the list in a new piece of software, and it is easy to include things that I know we already have in stock
Would like to have: Must Have
HL acceptance criteria: must be able to import files from a variety of locations in a variety of formats (e.g. RIS); must be able to identify resources which are in stock from the imported list, and automatically create links to these on the reading list

Epic: Create a new reading list
Category: Academic staff
I want to: import a list of references from our library catalogue, e.g. a list of starred items, rather than having to type everything into EARL individually and hope I've included the right information to find the item I want
So that: I don't have to spend hours/days creating the list in a new piece of software, and it is easy to include things that I know we already have in stock
Would like to have: Must Have
HL acceptance criteria: must be able to import files from a variety of locations in a variety of formats (e.g. RIS); must be able to identify resources which are in stock from the imported list, and automatically create links to these on the reading list
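Each row of the second document can be pictured as a structured requirement record. The sketch below is hypothetical - the field names are our own shorthand, not the spreadsheet's exact headings - but it shows how listing requirements this way makes it easy to filter, for instance, the Must Haves for supplier assessment:

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    epic: str              # e.g. "Create a new reading list"
    who: str               # the stakeholder group, e.g. "Academic staff"
    i_want: str            # the action the user wants to perform
    so_that: str           # the goal behind the action
    priority: str          # MoSCoW-style rating, e.g. "Must Have"
    acceptance: list[str]  # high-level acceptance criteria

requirements = [
    Requirement(
        epic="Create a new reading list",
        who="Academic staff",
        i_want="Import references from an existing list (batch import)",
        so_that="I don't have to re-create the list in new software",
        priority="Must Have",
        acceptance=[
            "Import files in a variety of formats (e.g. RIS)",
            "Identify in-stock resources and link to them automatically",
        ],
    ),
]

# Pull out just the Must Have requirements, e.g. for supplier scoring.
must_haves = [r for r in requirements if r.priority == "Must Have"]
```

Keeping the acceptance criteria attached to each requirement also makes the later evaluation step straightforward: every record can be checked off individually.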

How did this feed into the selection process?
We decided that we needed to involve academic staff, and also students, in the process of selecting a reading list system. Using the feedback we had gathered via the process above, and also from our conversations with other institutions, we devised three scenarios for the invited suppliers. These scenarios covered all the points from the user requirements spreadsheet.

Five people from the project team were the official scorers, but the staff and students who attended were also given a scorecard with which to score the scenarios, and prompts were provided. There was also space for them to write any further comments. The feedback from the university community was to act as a benchmark in case the official scores diverged wildly. As it happened, we all scored the suppliers in a very similar manner.

“What does EARL* even mean?” This was one of the comments about our old system (called EARL) from one of the Understanding Academics interviews. We used the switch to the new system as a chance to update the name, and now simply call the system Reading Lists. This has elicited lots of positive feedback from academic staff and students, who refer to lists as ‘reading lists’ and were often confused by our use of ‘resource lists’. The new name reflects this user preference, even though lists contain much more than just reading.

What’s next?
18 months later, as part of reflecting on how successful the project has been, we’ve been able to return to the user requirements spreadsheet and evaluate the system that we have now. Having looked at each requirement individually, 91% of them have been met partially, fully or, in some cases, exceeded. The few requirements which haven’t been met relate to referencing styles, which is a work in progress, and to restrictions in the library catalogue system. After a really busy few months combining the start of term and the switch to the new system, being able to go back to the user requirements spreadsheet and use it to evaluate the work that’s been done has been invaluable, especially in helping us to plan our next steps.
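The arithmetic behind a figure like the 91% is simple once each requirement has a recorded status. As a sketch with purely illustrative counts (not the project's actual breakdown, which lived in the spreadsheet):

```python
# Purely illustrative statuses for 39 requirements; the real evaluation
# recorded one status per row of the user requirements spreadsheet.
statuses = (["fully met"] * 25 + ["partially met"] * 8
            + ["exceeded"] * 3 + ["not met"] * 3)

# Count everything that was at least partially met.
met = sum(1 for s in statuses if s != "not met")
pct_met = round(100 * met / len(statuses))
print(f"{met}/{len(statuses)} requirements met ({pct_met}%)")
```

Keeping the unmet items as explicit statuses (rather than just a percentage) is what makes it possible to say precisely which areas - here, referencing styles and catalogue restrictions - still need work.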

Indeed, the work that we’re doing around this shouldn’t stop now that the system is fully up and running. We need to sustain a dialogue with users of the system in order to remain aware of their experiences of using Reading Lists: what they like and don’t like about it, and how this should shape proposed developments in the system.

For instance, in November we ran a focus group to which we invited staff and students from all departments. We felt it was timely to do this since we had reached the end of Reading Lists’ first term, and it seemed a good way to keep the conversation open with the academic community, who had been so engaged with the process right from the start. The session allowed us to gather some useful feedback about how people are using the system (most of it very positive), and it also gave us the chance to talk about the proposed changes we want to make to Reading Lists. We will continue to run regular focus groups to keep this conversation going, and we are also continually collecting feedback from Boards of Studies in our departments, anecdotally from staff and students, and via the training sessions that we provide. One of the great things about having a new system is that we can finally be more responsive to the concerns of our user community, and we are keen to ensure that we continue to actively engage with them.
*Easy Access to Resource Lists