
Living evidence-based practice: Part 1

Last updated on 26 August 2022

My expertise in evidence-based library and information practice (EBLIP) has developed in several ways: academically, through my LIS degree program and through undertaking research; by applying evidence-based approaches to government recordkeeping policy early in my career; and by being a guide, facilitator and mentor of EBLIP across the library in a previous role. And now? A return to living evidence-based practice while transforming a research support service.

Early in my current role, a peer from another university library reached out to ask: how are you applying evidence-based practice in your new role? Tell me! So I did. We still chat regularly, which I love. I’m so excited to be expanding my professional network in this role. There are some very fabulous and knowledgeable people in the research support space here in Australia and New Zealand. Just sayin’.

I asked: do you think it would be helpful if I shared the process on my blog? Yes! she replied.

In this post, I will share the (thinking) steps I’ve taken so far in applying what I’ve come to know about evidence-based practice to an emerging research support service.

Grab yourself a cuppa.

A little background first

The University of the Sunshine Coast is a multi-campus, regional university in Queensland with a growing research capacity. USC’s research is making a splash, and the number of higher degree by research (HDR) students is increasing each year. The creation of my Research Support Librarian role last year demonstrates the Library’s strategic commitment to its continuing contribution to USC’s research.

The initial focus was on improving the HDR student experience. Evidence of this need was mounting. We needed to get some ‘runs on the board’ quickly – try out a few things, establish stakeholder relationships, evaluate and keep improving.

Design thinking meets evidence-based practice

In moving the research support service forward, I am combining design thinking with evidence-based practice. This means taking a client/student-centred approach: not making assumptions about how students or clients engage with, or want to engage with, the library service, and then incorporating what we learn with other local and professional evidence. This way of working is about being open-minded to the story the evidence tells.

By using design thinking to solve problems and seize opportunities, we let iterative steps and improvements create a ‘snowball’ effect in the eventual transformation of the service. This promotes an agile way of working that takes a number of realities into account.

    1. In practice, you don’t have time to figure out every detail before implementing an improvement, a transformation or something new. Even the ‘something new’ might need to start off as an MVP (minimum viable product) to limit costs and test the return on investment.
    2. The industry landscape, and therefore the local context, is not static. What the evidence says about the local context one week may be different the next.
    3. There are organisational culture considerations to become aware of when implementing an improvement, a change or something new. Taking the team’s ‘pulse’ at, or even during, each iteration provides critical evidence to consider for the next step.
[MUST READ: Howard, Zaana & Davis, Kate (2011). From solving puzzles to designing solutions: Integrating design thinking into evidence based practice. Evidence Based Library and Information Practice, 6(4), pp. 15-21.]


Here’s what I think this combined ‘design thinking meets EBP’ approach looks like.

[Diagram: Iterative improvements inform the strategic approach to transforming the research support service at USC. Is ‘rapid EBLIP’ a thing?]


As the strategic, ‘big picture’ thinking progresses to shape a renewed or transformed service, the iterative improvements or ‘runs on the board’ both inform strategic decisions and deliver value to students and clients at a more rapid pace.

Initial evidence-based practice steps in service improvement

If we consider the ‘big picture’ here of enhancing the HDR student experience (so not the incremental improvements themselves, but their sum, perhaps), I’d say I’m moving roughly backwards and forwards between steps one (articulate), two (assemble) and sometimes three (assess) of the 5 A’s model.

1. Defining the problem/opportunity

As a team, we needed to bring together what we currently knew about the HDR experience. Here’s a snapshot of the questions we asked.

It was interesting to see how my very early research into evidence-based practice rang true in this scenario: the ‘proximity’ of evidence to the question or problem can carry the most weight when assessing or appraising evidence. Local evidence was more appropriate than other types of evidence at this stage.

[SEE: ‘Spheres of relevance’ diagram in Howlett, A. & Howard, Z. (2015). Exploring the use of evidence in practice by Australian special librarians. Information Research, 20(1), paper 657. Retrieved from http://InformationR.net/ir/20-1/paper657.html]


2. Weighing up and appraising the evidence: what did it tell me?

There is no such thing as perfect data. The best we can do is understand the limitations of data and evidence in order to ‘weigh’ them up appropriately. Just as we assess a research study’s limitations, validity and reliability, I can’t stress enough that we must do the same when we appraise local evidence. ‘Best available’ evidence is what we have to go on in professional practice, so it’s important to know how to identify it.

A mix of evidence, collected to answer the above questions about the HDR experience, told me:

a) The library’s research support service must evolve now, rather than later, to meet current and future needs. It must grow sustainably and strategically, particularly if it is to serve the broader university research community beyond HDRs.

b) There’s an opportunity for the Library to build upon its service experience strengths and to be more explicitly valued as a partner in the university’s research endeavours.

c) Incremental improvements and experiments that will deliver value and make a positive impact on the HDR student experience can, more or less, start being developed straight away. Current examples include online workshops (and recordings), participation in student association meetings and relevant working groups, topic mapping for a future training program, and a ‘welcome email’ template and process.

From here, interesting things then started to happen. Other stakeholders wanted in, and wanted to know what we were up to 😉 …


3. Moving forward: the next questions needing answers

Okay, so the problems and opportunities listed above now pose new questions. How does a library’s research support service evolve? How do we up our stakeholder engagement game? Which training topics do we focus on first to build the foundation for more?

And more specifically,

  • What service model should the library adopt to ensure its sustainability going forward?
  • How have research support services evolved over time? What frameworks, methodologies, etc. have been used to advance services?
  • What stakeholder engagement and outreach practices should we implement to increase our local intel and communicate our contribution more strategically?
  • What are the trends and issues relating to HDR student support and to the functions or services that the library offers to support research?

Parts of these questions may be answered with ‘point in time’ evidence gathering. For example, a literature review will help generate a narrative of what’s been happening over the last five to ten years. Other parts will help focus conversations, professional development outcomes, and our ‘ear to the ground’ attention to what’s happening ‘out there’ in the months ahead. Continued local evidence gathering, such as through feedback processes, will also help build the local evidence base for informed future decisions about the service.


Applying evidence-based practice is messy, huh? The main takeaways here are:

  • ask specific questions about the service (often the hardest part)
  • match the questions with ‘best available’ evidence
  • understand limitations of data and evidence
  • be open-minded and client-centred
  • take action
  • do it all again

I welcome your input, suggestions and feedback. What would you do differently? Or, are there any parts that are inspiring?
