Tales from The Library: A growing collection of library love stories 

I love pictures. They often take me places and create different kinds of stories in my head. I like many different kinds of pictures, but I really dig pictures of libraries and the people who work there and the people who use them. Pictures of rows of books, students reading, kids learning to code and all the small and big sparks of the good things that libraries are.

I’ve visited many libraries in my life and had many great library experiences. Some of them I’ve captured in a photo. I thought it could be nice to find those photos and put ’em up somewhere so people can enjoy them (I don’t really know what I’m doing when I take pictures and it’s just iPhone pics, but it’s pics of libraries so that secures 99% of the joy) and use them for websites, library marketing etc.

So here you go: Tales from The Library, a growing collection of quality library love stories from around the world: https://www.instagram.com/librarylovestoires



Tales from The Library: https://www.instagram.com/librarylovestoires 

Don’t be square. Roskilde University Library

Sparks. New York University Library

Alone with books. TU Delft Library


And The Winner is… How libraries can support students working for a better life on campus

Altruism. People who voluntarily do things for other people, simply because they think it makes a difference, are to me one of the most beautiful things in this world. Those people are making the planet a better place to live. All the time. A lot of altruism is going on in higher education and on campus. Students organise social and educational events, arrange debates, run bars, board game clubs, campus days etc. in order to create a good study environment and great experiences for their fellow students.

I think the academic library has a huge stake in students’ success and well-being in higher education. As libraries we are platforms and spaces that support students’ studies, learning and collaboration, and since we lack the evaluating role and authority of the mother institution, we are in a position where we can focus on students’ success with no limits but our own imagination and drive to make them the best students they can be. In this context I’ve been thinking about how the library can support the volunteer spirits and activities on campus, and here is one idea: The Study Environment Award.

The Study Environment Award is given to students and student organisations who organise activities for their fellow students in order to create a better study environment.

The purpose of the award is twofold:

1) To honor the students and organisations who put effort, energy and good ideas into creating a better study environment

2) To inspire other students to do the same by putting a positive focus on voluntary activities targeting the study environment on campus.

The concept is really simple: Just like the Oscars and the Grammys put praise and focus not only on individual films and music but on whole industries, The Study Environment Award is meant both to give a big hug and a high-five to the cool students who make an effort to make study life better for everybody, and to open fellow students’ eyes to the opportunities to contribute.

Two campuses, two libraries, two student councils, two awards

I’m library director for both Roskilde University Library and the Faculty Library of Social Sciences at Copenhagen University Library, and we have created a Study Environment Award on both campuses. We haven’t done it alone. At both Roskilde University and the Faculty of Social Sciences we have teamed up with the student councils, which share an interest in supporting and developing the study environment. Collaborating with student organisations is really rewarding: They bring in valuable and highly relevant perspectives which library folks cannot cover alone, they often have a huge outreach on campus, and working with them creates ownership and awareness of the library’s activities.

For the actual ‘award show’ we aim to tap into some of the larger activities going on at campus. At the Faculty of Social Sciences the award is given at the yearly Campus Day at the end of April, and in Roskilde at the students’ yearly Summer Party at the beginning of June.

The Handshake. Emma Bach from the Student Council at the Faculty of Social Sciences and yours truly sealing the deal on The Study Environment Award.

And The Winner is… 

The award show has everything it takes to be an… award show: readings and praise for the nominated students and organisations, an envelope with the winner’s name, the breathless seconds just before the winner is announced, endless happiness, joy and spotlight for the honorable winner, a reading of the motivation for selecting that particular winner, the handover of the prize, bubbles and flowers, and a thank-you speech from the winner.

In both cases the prize has been money for new study environment activities.

At the Faculty of Social Sciences the award went to ‘Pedalarmen’ (The Pedal Arm) – a space for fixing bikes, drinking coffee and having chats with fellow students from across campus. One of the reasons The Pedal Arm won was that it is not focused on subjects or education – it simply creates a social space, a welcoming, friendly and open get-together where students can meet other students (and teachers) from other disciplines (and learn something about how to fix one’s bike – Copenhagen is a biking city, you know, and for many people the bike is like an extra part of the body).

The happy winners of the Study Environment Award 2017 at the Faculty of Social Sciences

Benefits of the award approach to supporting student-driven study environment activities

The award approach comes with several benefits:

  1. It celebrates the cool students and volunteer spirits who are doing something good for the study environment
  2. By creating focus, it can inspire others to do the same
  3. It’s a well-known concept, easy to arrange, and everybody loves a good award show
  4. Collaborating with student councils and student organisations creates a valuable network for the library within the student community
  5. Having the library’s name on a popular award is good branding and storytelling for the library

This is working for us and can be twisted in many ways. A Study Environment Award is just one of a number of ways the academic library can support student-driven activities on campus and in higher education. Let me hear yours.



“Facts are real. Sincerely, Your Librarians” and other badass signs from #WomensMarch

On January 21 2017, millions of people around the world took to the streets to rally for women’s rights at the Women’s March. The first protest was planned for Washington, D.C., and was organized as a grassroots movement that took place on the day after Donald Trump’s January 20 inauguration. It aimed to “send a bold message to our new administration on their first day in office”. Quickly the rally spread to other cities around the world.

I find the Women’s March a mind-blowing and strong statement, and I take great comfort in the fact that a huge number of fellow citizens around the world will unite to protect women’s rights and other causes under fire, like protection of the natural environment, LGBTQ rights, racial justice, freedom of religion, and workers’ rights.

Lots of library workers participated in the Women’s March, and some of them were out there flashing some really badass protest signs. I’ve collected a few of them here for collective memory and inspiration. (I’ve tried to locate the primary sources for the photos without much luck. If you know the people who took the photos, please ask them to reach out to me so I can either credit them or, if they don’t want the photo in this blog post, remove it.)

“Thank a school librarian if you can tell fact from fiction”

“Librarians against Trump – ‘Book Him'”

“Read and Resist” – Librarians are pissed

“You know it’s bad when Librarians are marching”

“Librarians against post-truth”

“Real news? Fake news? Ask a Librarian”

I’m telling you, kids are the future. This one proves it big time.

Yep, science really is real

Love this one
My fave: “Facts are real. Sincerely, Your Librarians”



Why usability testing should be a part of regular library activity

Guest post from Library Lab fellow and UX ninja, Anneli Friberg, Linköping University Library

Interest in user experience (UX) and usability in libraries has grown rapidly over the past years, and UX has now become an essential tool for developing and assessing a library’s digital services and physical spaces. It is necessary, though, to recognize that UX incorporates much more than just usability. Norman and Nielsen (n.d.) summarize user experience as something that ‘encompasses all aspects of the end-user’s interaction with the company, its services, and its products’ and continue:

The first requirement for an exemplary user experience is to meet the exact needs of the customer, without fuss or bother. Next comes simplicity and elegance that produce products that are a joy to own, a joy to use. True user experience goes far beyond giving customers what they say they want, or providing checklist features. In order to achieve high-quality user experience in a company’s offerings there must be a seamless merging of the services of multiple disciplines, including engineering, marketing, graphical and industrial design, and interface design.

Furthermore, they state that it is important to separate the overall user experience from usability, since the latter ‘is a quality attribute of the UI [user interface], covering whether the system is easy to learn, efficient to use, pleasant, and so forth.’ (Norman and Nielsen, n.d.).

At Linköping University Library (LiUB) we are slowly moving towards a ‘culture of usability’ where users are being observed interacting with both physical and virtual spaces, the way Godfrey (2015) advocates, but this paper will only focus on the library’s online presence. The main objective with this paper is to argue for continuous usability testing, as a part of regular library activity.

Usability testing per se is nothing new within the library sector, but it is usually done in the process of launching a new or redesigned website/UI or implementing a new library system. Most often it has a distinct focus on web development, and is not so much used to develop other services or physical spaces. This is confirmed in numerous articles and UX blog posts (e.g. Gasparini 2015; Godfrey 2015; Broadwater 2016; Dominguez, Hammill & Brillat 2015). Sometimes the tests are not conducted by library staff, but by external consultants. Our approach, however, is an in-house, continuous process which is applied not only to the library’s website structure, but also to other digital services such as the search box on the library start page, the link resolver user interface and the link resolver icon in the discovery tool.

Rettig (2014) asks whether such a thing as ‘grassroots UX’ exists in libraries. She wonders if ‘the UX hopeful, [who] do not have the mandate or team or job title’, can find ‘ways to apply UX methods to smaller-scale, day-to-day work in the library?’ I am inclined to say that it is possible. A UX perspective can and should be integrated in any development project, big or small. The UX philosophy does not have to be initiated as a top-down initiative, and in a sense LiUB’s systematic way of doing usability testing started out as a grassroots initiative.


Linköping University (LiU) is one of 16 universities in Sweden. LiU has four campuses in three cities (Linköping, Norrköping and Stockholm) and four faculties: Science & Engineering, Medicine & Health Science, Arts & Science and Educational Sciences. LiUB consists of four physical libraries, one on each campus, with approximately 90 staff members in total.

In order to make sure that LiUB contributes in a useful and valuable way to student learning and research, we have tried to find different ways to understand our users’ needs and behaviour. We use our insights to improve the digital library in order to provide a user-friendly and intuitive way for students and researchers at LiU to access the information they need for their studies and research.

The groundwork for the library’s systematic user involvement was done within a web strategy project in the spring of 2014. Throughout the project we had the opportunity to test different methods for collecting user data. During this time we also formed a usability team at the library. The team consists of five people (of which three are librarians), including myself, with different skills and roles such as system manager, computer programmer, webmaster, UX expert and cognitive scientist. Over the last 24 months, the usability team has gathered once a month to do testing. The advantage of having a permanent usability team is that the library does not have to mobilize a team whenever the need occurs. This approach is also advocated by Nichols, Bobal & McEvoy (2009):

A permanent usability team allows an organization to build expertise and tackle more usability projects than ad hoc teams. Having a usability team already in place makes it more likely that usability studies will be done on projects that may otherwise have been overlooked because of the “burden” of asking staff to be part of another project on top of their already busy schedule.

The LiU Library Experience

The web strategy project in 2014 established usability and user benefits as central to the continuous web development process. In order to accomplish a user-centered library website we decided to find a doable model for user-involvement. The book Rocket Surgery Made Easy: the Do-It-Yourself Guide to Finding and Fixing Usability Problems by Krug (2010) became our inspiration. The workflow for usability testing at LiUB is illustrated in Fig. 1.


Fig 1.: The workflow for usability testing at LiUB

When we first started, we asked ourselves how many test participants were needed. According to Nielsen (2012) five users are enough when doing usability testing, because then ‘you almost get close to user testing’s maximum benefit-cost ratio.’ Krug (2010, p. 43) on the other hand claims that three users are good enough for ‘the do-it-yourselfer’, considering ‘you’re not interested in what it takes to uncover most of the problems; you only care about what it takes to uncover as many problems as you can fix.’
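The contrast between Nielsen’s five users and Krug’s three rests on a diminishing-returns argument: if each test participant independently uncovers some share of the existing usability problems (Nielsen has reported roughly 31% as a typical average in his writing on sample sizes), the proportion found by n participants grows as 1 − (1 − L)^n, so each additional user adds less than the one before. A minimal sketch of that model (the function name and the 31% figure are illustrative assumptions, not part of LiUB’s workflow):

```python
def problems_found(n_users: int, per_user: float = 0.31) -> float:
    """Expected share of usability problems uncovered by n independent users,
    assuming each user reveals the proportion `per_user` of all problems."""
    return 1 - (1 - per_user) ** n_users

# Diminishing returns: the first few users do most of the work.
for n in (1, 3, 5, 15):
    print(f"{n:>2} users -> {problems_found(n):.0%} of problems")
```

Under these assumptions three users already surface about two-thirds of the problems and five about 84%, which is why small, frequent test rounds beat rare large ones.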

As we evidently belong in the category of ‘do-it-yourselfers’, we started with three test participants per session during the first year. Last semester we decided to increase the number to four users per session, since we thought we had the capacity to expand. However, after our last evaluation we decided to go back to three users, since it was difficult for me as facilitator, and also for the observers, to stay focused and perceptive with four users and to get enough time for summarizing and debriefing. Krug (2010, p. 43) made a list of arguments for why three test participants are enough, and after trying with four, I am willing to agree. Some of Krug’s reasons are:

  • The first three users are very likely to encounter many of the most significant problems related to the tasks you’re testing.
  • Finding three participants is less work than finding more.
  • Testing with three users makes it possible to test and debrief in the same day.
  • When you test with more than three at a time, you often end up with more notes than anyone has time to process – many of them about things that are really “nits”. This can make it harder to see the most serious problems – the “can’t see the forest for the trees” effect.

For the tests we use randomly chosen employees and/or students as test participants. In my experience, engaging face to face is the most successful way to recruit users. For example, I usually recruit students I meet in the library. Regarding employees we always recruit research or teaching staff such as PhD students, lecturers, university teachers and professors. My experience is that most students and employees I ask are willing to help us as long as they can find the time for it. They all want to be part of a process that aims to improve the user experience.

When it comes to deciding what to test, we make a preliminary plan at the beginning of each semester. This plan sometimes changes during the semester. What we actually test depends on different projects in progress at the library. We never test systems or interfaces that we can’t alter or modify ourselves to some extent.

We conduct usability testing monthly during each semester, which gives us approximately eight test sessions per year. This enables an agile and iterative approach to assessing the users’ experiences of the digital library as well as helping in the development of our digital services.

On the test day, the usability team divides into two groups in two different locations: a test room (see Fig. 2) and an observation room (see Fig. 3). The facilitator and one observer go to the test room, while the rest of the team goes to the observation room. Often the latter are accompanied by other observers and stakeholders; sometimes colleagues from other departments within the University, such as the division for IT Services, and sometimes external guests, such as librarians from other universities.

Fig 2.: Test room

Fig 3.: Observation room

We combine different methods like observation, think-aloud protocol and capturing screen activity. By using different practices that complement each other, we avoid the uncertainty of using just one method. One of the benefits of triangulation of data is that we get a more complete picture of the usability issues that need to be addressed.

Each test person is given a specific assignment based on a common user scenario for the service to be tested. The test person attempts to complete the assignment while thinking aloud. If needed, the facilitator encourages the test participant to think aloud and describe what he/she is trying to do. At the same time, the team in the observation room records what the test person says and does. We use Camtasia to record screen activity, and we set up an Adobe Connect meeting to share screens between the test room and the observation room. Obviously we do not record anything without permission from the users. Before we begin the test session, the test participant signs a written consent form.

After the test, the facilitator and observer from the test room join the rest of the usability team in the observation room and a debriefing session starts. We then collect and discuss the usability problems we have noticed and put them together in an aggregated list of feasible improvements. We also prioritize the things on the list.

After each test session the usability team starts to improve the things listed. Depending on what the problems are and what has to be done, we involve different colleagues outside the usability team. The recordings have proven valuable for the analyses and development in between the test sessions. They are an essential complement to the observers’ notes.

Another valuable complement is so-called guerrilla testing, which we sometimes do in between the monthly test sessions. This type of testing is both agile and flexible. It is a ‘low cost method of user testing. The term “guerrilla” refers to its “out in the wild” style, in the fact that it can be conducted anywhere…’ (GOV.UK n.d.). When we perform guerrilla testing we approach people in the library and ask them for quick feedback. This fits well with our thinking that some testing is better than no testing.


The improvements we have made as a result of what we have seen during our usability testing range from very small terminological changes to more structural changes on our website. One of the first things we tested was the information architecture for a new library website. For that, we used a tool called Treejack. We did one test session with students and one with employees. This enabled us to get valuable feedback on the site structure.

For several years we had a tabbed search box on the library start page (see Fig. 4). Last year we decided to renew the design, inspired by MIT Libraries. Before we launched the new search box (see Fig. 5) we made a prototype which we used to perform both regular usability testing and guerrilla testing. The feedback we got gave us useful input to the design process.

Fig 4.: Old design of the library start page with a tabbed search box

Fig 5.: The new search box

We have also tested different features and new services for the discovery tool, such as a new search service for e-publications. We tested this service twice – once with undergraduate students and once with PhD students. In addition to getting feedback on what adjustments to make, we also learned that undergraduate students have quite a different attitude to journals than PhD students have. We have seen this in other situations, for instance when doing interviews as part of the web strategy project in 2014, but seeing it again during usability testing confirmed our previous insights.

Things we have also tested and improved include terminology, holdings information and the link resolver user interface. Sometimes we make changes and then do a new round of testing, but more often we get indirect feedback on earlier changes while testing new things.


A deep understanding of our users is the foundation of any user-centered development. By combining qualitative and quantitative methods and applying a UX perspective, we are better equipped to meet our users’ changing needs and behaviour. It also allows a more agile workflow. The trick is to keep it simple. We do not consider ourselves researchers; what we do are continuous modifications based on input from real users. Our motivation is to enhance users’ experiences of the library’s digital services.

Based on our experiences from the last 24 months we have found that systematic usability testing can and should be a part of the regular library activity and that it can encompass so much more than just the website structure. The key to success is the model itself, particularly when it is carried out monthly during the academic year. By involving real users continuously, we avoid getting stuck in our own internal assumptions of how users interact with the library’s digital services.

Additionally, usability testing is an excellent way to make our services more visible to users.


Broadwater, T 2016, Why am I doing this to our users? A case study about the wrong turns taken during a redesign project and the impact of design-by-committee on team morale, viewed 11 July 2016, <http://libux.co/why-am-i-doing-this-to-our-users/>

Dominguez, G, Hammill, SJ, Brillat, AI 2015, ‘Toward a usable academic library web site: a case study of tried and tested usability practices’, Journal of Web Librarianship, vol. 9, issue 2-3, pp. 99-120.

Gasparini, AA 2015, ‘A holistic approach to user experience in the context of an academic library interactive system’. Lecture Notes in Computer Science, vol. 9188, pp. 173-184.

Godfrey, K 2015, ‘Creating a culture of usability’, Weave: Journal of Library User Experience, vol. 1, issue 3, viewed 8 October 2015, <http://dx.doi.org/10.3998/weave.12535642.0001.301>

GOV.UK n.d., Guerrilla testing: getting input into products and services, viewed 27 September 2016, <https://www.gov.uk/service-manual/user-centred-design/user-research/guerrilla-testing.html>

Krug, S 2010, Rocket surgery made easy: the do-it-yourself guide to finding and fixing usability problems, New Riders Publishing, Berkeley, CA.

Nichols, J, Bobal, AM, McEvoy, S 2009, ‘Using a permanent usability team to advance user-centered design in libraries’, Electronic Journal of Academic and Special Librarianship, vol. 10, no. 2, viewed 13 July 2016, <http://southernlibrarianship.icaap.org/content/v10n02/nichols_j01.html>

Nielsen, J 2012, How many test users in a usability study?, viewed 8 July 2016, <https://www.nngroup.com/articles/how-many-test-users/>

Norman, D and Nielsen, J n.d., The definition of user experience, viewed 8 July 2016, <https://www.nngroup.com/articles/definition-user-experience/>

Rettig, M 2014, ‘Grassroots UXD in the library: a review essay’, Weave: Journal of Library User Experience, vol. 1, issue 1, viewed 1 April 2016, <http://dx.doi.org/10.3998/weave.12535642.0001.103>