Data and Software Citations. What you don’t know CAN hurt you.
Date
Wednesday, February 15, 2023 11:00 AM EST
Description
When we read a published scholarly article we rarely, if ever, ask to see the machine-actionable version of the text. And yet this hidden version enables many downstream services, such as automated attribution and credit. When it comes to data and software citations in the reference section, until recently the probability of an accurate machine-readable version was very low. For some journals, even zero.
Why, you ask? The citation looks just fine in the online version and the downloadable PDF; what could possibly have gone wrong?
Well, there is a plethora of challenges to uncover. First, data and software citations require different validation steps during the production process. Because of this, their machine-readable text is often not processed correctly, and some text may be altered such that the citation is no longer actionable. How many times have you seen the name of the journal in the title of the dataset? Gobs. Further, Crossref requirements also differ for these types of citations, so citations deposited improperly often land on the cutting room floor.
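One way to see the problem for yourself is to look at the machine-readable reference list Crossref holds for a published article. Below is a minimal sketch, assuming Python with the requests package; the DOI is a placeholder, not a real article. It pulls a work's metadata from the public Crossref REST API and reports which deposited references carry a resolvable DOI, which is roughly what downstream attribution services depend on.

```python
# Sketch: inspect the machine-readable references Crossref exposes for an article.
# Assumes the `requests` package; the DOI below is a hypothetical placeholder.
import requests

ARTICLE_DOI = "10.1234/example.article"  # placeholder DOI for illustration only


def check_references(doi: str) -> None:
    """Fetch Crossref metadata for `doi` and report which references are actionable."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    refs = resp.json()["message"].get("reference", [])

    if not refs:
        print("No machine-readable references were deposited for this article.")
        return

    for ref in refs:
        label = ref.get("key", "?")
        if "DOI" in ref:
            print(f"{label}: actionable, resolves to https://doi.org/{ref['DOI']}")
        else:
            # Unstructured text only: downstream services cannot reliably link it.
            text = ref.get("unstructured", "<no citation text deposited>")
            print(f"{label}: not actionable -> {text[:80]}")


if __name__ == "__main__":
    check_references(ARTICLE_DOI)
```

If a dataset or software citation that looks fine in the PDF shows up here only as unstructured text, or not at all, it has effectively vanished from the record that attribution and credit systems read.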
In this session we will detail the differences in the production process and provide specific guidance to make the necessary corrections. This work has been led by the Journal Task Force for the FORCE11 Software Citation Implementation Working Group.