
Jisc co-design workshop: ‘Digital Skills for Research’

On 25th April I was among colleagues from a wide range of stakeholder organisations invited by Jisc to a co-design workshop on ‘Digital Skills for Research’.

Co-design is Jisc’s “collaborative innovation model”, through which it has engaged with the sector to identify six discrete challenges to explore; we were there to think about next generation research environments.

Our first task was to brainstorm around research support roles, which turned out to be rather more discursive than our hosts perhaps expected. Rather than a clearly defined list of discrete roles, a consensus began to emerge that the associated skill sets are extremely fluid, and that professional nomenclature and institutional structure can have the effect of artificially limiting the scope of a role. (It is perhaps instructive in this context to revisit the range of job titles posted to the UKCoRR mailing list in 2016 and, no doubt, the many similarities and differences of the associated job descriptions.)

[Image: research support roles]

This led on to a discussion about qualifications, training and appropriate professional accreditation, which is certainly an issue for repositories and Open Access (see also the related blog posts from Cambridge at the bottom of this post). A similar issue was also raised in the context of research software development.

ARMA’s Professional Development Framework is perhaps the obvious resource for research support skills, while the Vitae Researcher Development Framework (RDF) is a valuable resource for researchers and their support services alike; it might also highlight the increasingly collaborative relationship between researchers and support services, a theme that also emerged at a recent White Rose Libraries Digital Scholarship Event.

We got back on track in the next exercise, considering the range of (digital) skills required by professionals working in research support roles, although it was observed that ‘skill’ doesn’t necessarily capture the sheer range of knowledge required, such as an overview of the myriad funder policies around open access and data management, for example.

To consider bibliometrics as but one skill set that might often fall to a librarian or other information professional in lieu of a trained bibliometrician (how many institutions have one of these exotic beasts?): there are myriad proprietary data sources covering both ‘traditional’ and ‘alternative’ metrics that we must be familiar with, and which might help to inform impact assessment, and yet there is no clear training offer, at least none that I’m aware of. The best resource I have found is MyRI, a collaborative project of three Irish academic libraries, freely available at http://myri.conul.ie/.

So, which organisations are responsible for fostering these myriad digital skills, now and in the future? The day’s final exercise identified the usual suspects – Jisc, ARMA, CILIP and of course our very own UKCoRR – though there is an ongoing question around our capacity as an unfunded, voluntary organisation and our positioning in relation to other organisations. We hope to continue our consultation around the future vision and remit for UKCoRR (survey) at our members’ day at the University of Warwick on 7th July.

A useful lunchtime conversation with Helen Blanchett considered some sort of OA training provision and network support from Jisc; the discussion was obviously informal, but we hope that Helen will be in Warwick for our members’ day to discuss the idea further.

If the landscape is complex now, it will only become more so, with ever more specialist roles and associated skill sets. The final discussion was around the potential role for Jisc and, by extension, for our own purposes, for UKCoRR:

[Image: future research]

Jisc’s provision currently comprises “144 guides and case studies”, as well as a number of face-to-face and online courses, on both fee-paying and free models, including their Digital leaders programme and the Digifest conference. One suggestion was that there is in fact a gap for a conference dedicated to cutting-edge digital research practice, given that Digifest 2017 focused on teaching rather than research.

Related posts from University of Cambridge Office of Scholarly Communication:


Results of the SHERPA/FACT accuracy testing – 95% accurate

UKCoRR welcomes the results of a recent exercise – undertaken by UK librarians, repository managers and Sherpa Services – that has shown that the results produced by SHERPA/FACT (Funders & Authors Compliance Tool) have an accuracy rate of over 95%.

The FACT service was developed to help researchers get a simple answer to the question “does this journal have an open access publishing policy compliant with my funder’s open access mandate?”. The FACT service – which draws its information from the SHERPA/ROMEO and SHERPA/JULIET databases – seeks to provide a yes/no answer to this question, as well as providing information about how an author can comply with a funder policy.
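Conceptually, that decision combines a publisher’s open access options with a funder’s requirements. The sketch below, in Python, is purely illustrative: the field names and policy data are invented for the example and do not reflect SHERPA’s actual schema, API or decision logic.

```python
# Purely illustrative: these field names and policies are invented for the
# example and are not SHERPA's actual schema, API or decision logic.

# A journal's OA options, in the spirit of a SHERPA/RoMEO record.
journal = {
    "gold_oa_offered": True,        # immediate (paid) open access available
    "green_embargo_months": 12,     # delay before self-archiving is allowed
}

# A funder's mandate, in the spirit of a SHERPA/JULIET record.
funder = {
    "accepts_gold": True,
    "max_green_embargo_months": 6,  # longest embargo the funder tolerates
}

def compliance_check(journal, funder):
    """Return a yes/no verdict plus the routes by which an author can comply."""
    routes = []
    if funder["accepts_gold"] and journal["gold_oa_offered"]:
        routes.append("gold: publish with immediate open access")
    if journal["green_embargo_months"] <= funder["max_green_embargo_months"]:
        routes.append("green: self-archive the accepted manuscript")
    return ("yes" if routes else "no", routes)

verdict, routes = compliance_check(journal, funder)
print("Compliant?", verdict)
for route in routes:
    print(" -", route)
```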

There had been some discussion at the SHERPA/FACT Advisory Board – raised by UKCoRR members and their institutions – as to whether the information provided by FACT was accurate.

To address this issue an exercise was undertaken – by members of UKCoRR – to manually check a statistically significant number of journal/funder combinations and then compare the information this group had found with the information provided by FACT. Where the independent reviewers arrived at a different conclusion to that provided by FACT, that journal/funder combination was subjected to detailed and exhaustive investigation to arrive at an evidenced answer.
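In outline, the comparison amounts to scoring agreement between the two sets of answers and then adjudicating the disagreements. The minimal Python sketch below illustrates that scoring logic only; the combinations, answers and adjudication outcomes are entirely hypothetical and are not the study’s data.

```python
# Illustrative sketch of the scoring logic only; the combinations, answers
# and adjudication outcomes here are hypothetical, not the study's data.

# Each entry: (journal/funder combination, manual answer, FACT answer).
checks = [
    ("Journal A / Funder X", "yes", "yes"),
    ("Journal B / Funder X", "no",  "yes"),  # disagreement -> investigate
    ("Journal C / Funder Y", "yes", "yes"),
    ("Journal D / Funder Y", "no",  "no"),
]

agreements = [c for c in checks if c[1] == c[2]]
disputes = [c for c in checks if c[1] != c[2]]

# Raw agreement rate before detailed investigation
# (57% in the study's initial pass).
raw_accuracy = len(agreements) / len(checks)

# Each disputed combination is then investigated exhaustively to reach an
# evidenced answer; the dict below stands in for that manual adjudication.
adjudicated = {"Journal B / Funder X": "yes"}  # evidence backed FACT here

resolved_for_fact = sum(
    1 for combo, _manual, fact in disputes if adjudicated[combo] == fact
)

# FACT counts as correct wherever the evidenced answer matches its output
# (over 95% of combinations in the study).
final_accuracy = (len(agreements) + resolved_for_fact) / len(checks)
print(f"raw agreement: {raw_accuracy:.0%}, final accuracy: {final_accuracy:.0%}")
```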

At the end of this exercise, it was found that the FACT service provides correct information in over 95% of cases.

The study clearly highlighted the difficulties that even highly experienced repository staff have in deciphering publisher OA policies. Indeed, the initial testing undertaken by UKCoRR members suggested that FACT was accurate on only 57% of occasions. When these journal/funder combinations were investigated further, however, close examination of the often complex conditions, and the interactions between different statements and policies, showed that FACT was correct in almost all of the cases.

The SHERPA/FACT team, as well as the SHERPA/FACT Advisory Group, would like to extend their thanks to the UKCoRR members who took part in this checking process for their time commitment and their extensive knowledge of this area of work. This exercise has proved that the SHERPA/FACT service can be relied upon as a source of advice for UK researchers. UKCoRR also encourages its members to continue to report discrepancies to SHERPA/FACT, helping to improve the quality of the information the service relies upon.

Issues of interpretation, and the interaction between the various policies, proved to be the key to the discrepancy between our manually checked results and Sherpa’s findings. There is further work to be done here, and UKCoRR looks forward to continuing to work with the SHERPA/FACT Advisory Group to develop increased clarity in this area.

To see the full data and study methodology, visit Figshare.

The study was commissioned by the SHERPA/FACT Advisory Board – which includes representatives from UKCoRR, Jisc, Wellcome Trust, Research Councils UK (RCUK), CRC Nottingham, Association of Learned & Professional Society Publishers (ALPSP), Higher Education Funding Council for England (HEFCE), Publishers Association and SCONUL.

A blog post from Jisc on this project is available, as well as a press release.
