Falk-Krzesinski, "Administrator (Institutional Use of the Data): Data-informed Strategic Planning for the Research Enterprise"


About This Presentation

This presentation was provided by Holly Falk-Krzesinski of Elsevier during the NISO event, "Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems," held on February 20, 2019.


Slide Content

Research Metrics: Data-informed Strategic Planning for the Research Enterprise
NISO Virtual Conference ♦ February 20, 2019 ♦ Is This Still Working? Incentives to Publish, Metrics, and New Reward Systems
Holly J. Falk-Krzesinski, PhD, Vice President, Research Intelligence ♦ Elsevier

Presentation Roadmap
Research metrics for institutions
Data for metrics in institutional research information systems

Presentation Roadmap
Research metrics for institutions
Data for metrics in institutional research information systems

Strategic Context for Research Metrics
Decreasing government grant funding for research
Increasing competition for government research funding
Rise of interdisciplinary and international grand challenge themes
Increased team science and cross-sector collaboration
Competition to attract the best research leaders globally
Growing need to demonstrate both economic and social impact of research

Research Metrics at Different Levels (https://libraryconnect.elsevier.com/articles/librarian-quick-reference-cards-research-impact-metrics)
Journal level: CiteScore; Journal Impact Factor; SCImago Journal Rank (SJR); Source Normalized Impact per Paper (SNIP)
Article level: citation count; citations per paper; Field-Weighted Citation Impact (FWCI); outputs in top quartile; citations in policy and medical guidelines; usage; captures; mentions; social media
Researcher level: document count; total citations; h-index; i10-index; g-index
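
The researcher-level indices listed above are simple functions of a list of per-paper citation counts. As an illustrative sketch (independent of any particular data source), the h-index and i10-index can be computed as follows:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """i10-index: the number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Example: six papers with these citation counts
papers = [25, 8, 5, 3, 3, 0]
print(h_index(papers))    # 3 (three papers have at least 3 citations each)
print(i10_index(papers))  # 1
```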

Research Metrics Throughout the Research Process

Categories of Metrics for Analysis https://www.elsevier.com/__data/assets/pdf_file/0020/53327/ELSV-13013-Elsevier-Research-Metrics-Book-r5-Web.pdf

Categories of Metrics for Analysis
Framework dimensions: qualitative input; metric theme; metric sub-theme; specific metrics.
Funding (Awards): Awards Volume
Outputs (Productivity of research outputs): Scholarly Output; Number, Type and Growth; Subject Area Count
Outputs (Visibility of communication channels): Publications in Top Journal Percentiles
Research Impact (Research influence): Citations Count; Field-Weighted Citation Impact; Outputs in Top Citation Percentiles; Citations per Publication; Cited Publications; h-indices; Number of Citing Countries; Views Count; Outputs in Top Views Percentiles; Views per Publication; Field-Weighted Views Impact
Research Impact (Knowledge transfer): Academic-Corporate Collaboration; Citing-Patents Count; Patent-Cited Count
Engagement (Academic network): Collaboration; Collaboration Impact
Engagement (Non-academic network): Academic-Corporate Collaboration; Academic-Corporate Collaboration Impact
Engagement (Economic development): Academic-Corporate Collaboration; Citing-Patents Count; Patent-Cited Count
Societal Impact (Broader impact): Academic-Corporate Collaboration; Citing-Patents Count; Patent-Cited Scholarly Output; Patent-Citations Count; Mass Media; Media Exposure; Field-Weighted Mass Media
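
Field-Weighted Citation Impact (FWCI), which recurs across these themes, is the ratio of the citations an output actually received to the citations expected for outputs of the same field, type, and publication year, so 1.00 represents the world average. Below is a minimal sketch of that ratio, using made-up expected-citation values rather than real Scopus benchmarks; averaging the per-output ratios for a portfolio is one common convention:

```python
def fwci(actual_citations, expected_citations):
    """FWCI for a single output: citations received divided by the citations
    expected for outputs of the same field, type, and publication year."""
    if expected_citations <= 0:
        raise ValueError("expected citations must be positive")
    return actual_citations / expected_citations

def portfolio_fwci(outputs):
    """FWCI for a set of outputs: the mean of the per-output ratios."""
    ratios = [fwci(actual, expected) for actual, expected in outputs]
    return sum(ratios) / len(ratios)

# Hypothetical pairs of (citations received, expected citations for similar outputs)
outputs = [(12, 6.0), (3, 6.0), (0, 2.5)]
print(round(portfolio_fwci(outputs), 2))  # 0.83, i.e. below the world average of 1.00
```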

Research Metrics Use Cases
Analyze research strengths
Determine where research is a good potential investment
Demonstrate impact (return on investment) of research funding
Showcase researchers or identify rising stars
Tell a better narrative about everything that is happening with research

A Building Need to Demonstrate Impact https://report.nih.gov/nihdatabook/report/20

“Given how tight budgets are around the world, governments are rightfully demanding effectiveness in the programs they pay for. To address these demands, we need better measurement tools to determine which approaches work and which do not.” Bill Gates, Gates Foundation Annual Letter, 2013

Research Impact Frameworks https://becker.wustl.edu/impact-assessment/

Diverse Set of Metrics for Demonstrating Impact
Types of impact: educational impact, societal impact, commercial impact, innovation, informational impact, academic impact, promotion/attention/buzz.
Example metrics: number of library holdings (WorldCat OCLC); views on SlideShare; plays on YouTube; Amazon book reviews; clinical citations or health policy/guideline citations; government policy citations; news mentions; patent citations; academic-industry partnerships; licenses; business consultancy activities; number of patents filed and granted; Wikipedia citations; blog mentions; StackExchange links; downloads from GitHub, RePEc, and institutional repositories; citations (field-normalised, percentiles, counts); collaborators on GitHub; full-text, PDF, and HTML views on ScienceDirect, Figshare, etc.; social media metrics (shares, likes, +1s, tweets).
Qualitative input: expert feedback on quality and impact of research.

Wide Range of Research Output Types
abstracts, articles, audio files, bibliographies, blogs, blog posts, books, book chapters, brochures/pamphlets, cases, catalogues, clinical trials, code/software, collections, commentaries, conference papers, corrections, data sets, designs/architectural plans, editorials, exhibitions/events, expert opinions, file sets, figures, government documents, grants, guidelines, images, interviews, issues, journals, learning objects, lectures/presentations, letters, live performances, manuscripts, maps, media files, musical scores, newsletters, news, online courses, papers, patents, policy, posters, preprints, press releases, projects, recorded works, reference entries/works, reports, research proposals, reviews, retractions, speeches, standards, syllabi, technical documentation, textual works, theses/dissertations, videos, visual arts, volumes, web pages, web resources, other
https://plumanalytics.com/learn/about-artifacts/

Research Data Metrics (https://rdmi.uchicago.edu/papers/08212017144742_deWaard082117.pdf)
Goal; metric; how to measure:
Research data is shared:
1. Stored (safely available in a long-term repository); number of datasets in long-term storage; Mendeley Data, Pure; Plum indexes Figshare, Dryad, and Mendeley Data, and is working on Dataverse
2. Published (long-term preserved, accessible via the web, with a GUID, citable, with proper metadata); number of datasets published in some form; Scholix, ScienceDirect, Scopus
3. Linked (to articles or other datasets); number of datasets linked to articles; Scholix, Scopus
4. Validated (by a reviewer/curator); number of datasets in curated databases or peer reviewed in data articles; ScienceDirect, DataSearch (for curated databases)
Research data is seen and used:
5. Discovered; number of datasets viewed in databases, websites, and search engines; DataSearch, metrics from other search engines and repositories
6. Identified; DOI is resolved; DataCite DOI resolution, where made available
7. Mentioned; social media and news mentions; Plum and Newsflo
8. Cited; number of datasets cited in articles; Scopus
9. Downloaded; downloads from repositories; downloads from Mendeley Data, access data from Figshare/Dryad
10. Reused; mention of usage in an article or other dataset; ScienceDirect, access to other data repositories

Open Science Metrics
Impact of Open Science; engagement in Open Science activities and the impact of that engagement.
https://ec.europa.eu/research/openscience/pdf/os_rewards_wgreport_final.pdf

Golden Rules for Using Research Metrics
Always use both qualitative and quantitative input into your decisions.
Always use more than one research metric as the quantitative input.
Using multiple metrics drives desirable changes in behavior.
There are many different ways of representing impact.
A research metric's strengths can complement the weaknesses of other metrics.
Combining both approaches will get you closer to the whole story.
Valuable intelligence is available from the points where these approaches differ in their message.
This is about benefitting from the strengths of both approaches, not about replacing one with the other.

Mechanisms for Gathering Data for Metrics
From the NISO Code of Conduct for Altmetrics (https://www.niso.org/press-releases/2016/02/niso-releases-draft-recommended-practice-altmetrics-data-quality-public):
Describe all known limitations of the data.
Provide a clear definition of each metric.
Describe how data are aggregated.
Detail how often data are updated.
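
One practical way to keep this documentation alongside each metric is a small, structured record. The field names below are illustrative only and are not part of the NISO recommended practice:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MetricDescription:
    """Documentation for one metric, mirroring the Code of Conduct items above."""
    name: str
    definition: str             # a clear definition of the metric
    aggregation: str            # how the underlying data are aggregated
    update_frequency: str       # how often the data are refreshed
    known_limitations: List[str] = field(default_factory=list)

news_mentions = MetricDescription(
    name="News mentions",
    definition="Count of links to the output from tracked news sources",
    aggregation="Summed per output across all tracked sources, deduplicated by URL",
    update_frequency="Daily",
    known_limitations=[
        "Coverage is limited to the provider's source list",
        "Mentions without a resolvable link are missed",
    ],
)
```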

Responsible Metrics
Robustness: basing metrics on the best possible data in terms of accuracy and scope.
Humility: recognizing that quantitative evaluation should support, but not supplant, qualitative, expert assessment.
Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results.
Diversity: accounting for variation by field, and using a variety of indicators to support diversity across the research system.
Reflexivity: recognizing the systemic and potential effects of indicators and updating them in response.
http://www.hefce.ac.uk/pubs/rereports/year/2015/metrictide/

Transparency in Research Metrics
https://www.snowballmetrics.com/
https://www.elsevier.com/connect/new-metrics-will-make-journal-evaluation-easier-and-more-transparent

Presentation Roadmap
Research metrics for institutions
Data for metrics in institutional research information systems

Gathering Data for Evidence
Gather data as you go along rather than retrospectively.
Think about what success would look like for each question or impact activity and how to evidence it.
Use all the data available, be clear and specific, and build a coherent narrative to provide context.

Data in Institutional Systems
Persons: researchers, postgraduate students, external persons
Organizations: faculties, departments, research groups, external units
Publications: peer-reviewed journal articles, books, chapters, theses, non-textual outputs, etc.
Publishers and journals: names, IDs, ratings
Bibliometrics: citations, impact factors, altmetrics
Activities: conferences, boards, learned societies, peer reviewing, prizes
Narratives: narrative recordings of the impact of research
Datasets: stored locally or in a separate data repository
Equipment: type, placement, ownership details
Funding opportunities: funder, program, eligibility, etc.
Grant applications: stage, funder, program, amount applied for, documents attached
Grant awards: funder, program, amount, dates, contract documents, applicants, budget
Projects: budget, expenditure, participants, collaborators, students, outputs
Press clippings: national and international papers, electronic media
Source: Joachim Schöpfel et al., Procedia Computer Science 106 (2017), 305-320
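
A research information system links these content types to one another: publications to persons, persons to organizations, awards to projects and outputs. Below is a minimal, hypothetical sketch of a few of these entities as Python dataclasses; real CRIS data models such as Pure's are far richer:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Organization:
    name: str                          # faculty, department, research group, or external unit
    parent: Optional["Organization"] = None

@dataclass
class Person:
    name: str                          # researcher, postgraduate student, or external person
    affiliation: Organization

@dataclass
class Publication:
    title: str
    year: int
    authors: List[Person]
    citation_count: int = 0            # bibliometrics attached to the output

@dataclass
class GrantAward:
    funder: str
    program: str
    amount: float
    applicants: List[Person] = field(default_factory=list)
    outputs: List[Publication] = field(default_factory=list)
```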

Sources of Data and Ingestion Options (Elsevier Pure data model)
Persons: source is the internal HR system; ingested via the Pure XML format (automatic recurring sync job).
Organizations: source is the internal HR system; ingested via the Pure XML format (automatic recurring sync job).
Publications: sources are manual entry, online sources (e.g. Scopus), and legacy systems; ingested via user-friendly templates, single-record import, automated import by person or by department, the Pure XML format (single or repeated legacy import), or the Elsevier PRS service.
Publishers and journals: sources are manual entry, online sources (e.g. Scopus), and legacy systems; ingested automatically together with publication import, via the Elsevier PRS service, or via the Pure XML format (single or repeated legacy import).
Bibliometrics: sources are Scopus and Web of Science; ingested automatically together with publication import, via the automatic citation sync job (Scopus and WoS), via the Pure XML format (single or repeated legacy import), or via the Elsevier PRS service (Scopus bibliometrics only).
Activities: sources are manual entry and legacy systems; ingested via user-friendly templates or the XML format for legacy import.
Narratives: source is manual entry; ingested via user-friendly templates.
Datasets: sources are manual entry and legacy systems; ingested via user-friendly templates or the XML format for legacy import.
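
The "automatic recurring sync job" pattern above boils down to periodically exporting records from the source system, transforming them into the CRIS's bulk-import format, and submitting the result. The sketch below is purely illustrative: the XML element names are placeholders rather than Pure's actual schema, and the HR-export and submit functions are assumed to exist.

```python
import xml.etree.ElementTree as ET

def build_persons_payload(hr_records):
    """Map HR records to a bulk-import XML payload.
    Element names are placeholders, not the real Pure schema."""
    root = ET.Element("persons")
    for rec in hr_records:
        person = ET.SubElement(root, "person", id=rec["employee_id"])
        ET.SubElement(person, "name").text = rec["name"]
        ET.SubElement(person, "organization").text = rec["department"]
    return ET.tostring(root, encoding="unicode")

def run_sync(fetch_hr_records, submit_payload):
    """One iteration of the recurring sync: export, transform, submit.
    Both callables are assumptions standing in for the HR export and the
    CRIS import endpoint; schedule this function with cron or similar."""
    payload = build_persons_payload(fetch_hr_records())
    submit_payload(payload)
```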

Research Metrics Dashboard https://www.osti.gov/biblio/1462196

Research Metrics in Research Information Systems https://plumanalytics.com/integrate/load-your-data/pure-plumx-integration/