For the third year in a row, Crossref hosted a roundtable on research integrity prior to the Frankfurt Book Fair. This year the event looked at Crossmark, our tool for displaying retractions and other post-publication updates to readers.
Since the start of 2024, we have been carrying out a consultation on Crossmark, gathering feedback and input from a range of members. The roundtable discussion was a chance to check and refine some of the conclusions we’ve come to, and gather more suggestions on the way forward.
In our previous blog post in this series, we explained why no metadata matching strategy can return perfect results. Thankfully, however, this does not mean that it’s impossible to know anything about the quality of matching. Indeed, we can (and should!) measure how close (or far) we are from achieving perfection with our matching. Read on to learn how this can be done!
How about we start with a quiz? Imagine a database of scholarly metadata that needs to be enriched with identifiers, such as ORCIDs or ROR IDs.
We’re in year two of the Resourcing Crossref for Future Sustainability (RCFS) research. This report provides an update on progress to date, specifically on research we’ve conducted to better understand the impact of our fees and possible changes.
Crossref is in a good financial position with our current fees, which haven’t increased in 20 years. This project is seeking to future-proof our fees by:
- Making fees more equitable
- Simplifying our complex fee schedule
- Rebalancing revenue sources

In order to review all aspects of our fees, we’ve planned five projects to look into specific aspects of our current fees that may need to change to achieve the goals above.
On behalf of the Nominating Committee, I’m pleased to share the slate of candidates for the 2024 board election.
Each year we do an open call for board interest. This year, the Nominating Committee received 53 submissions from members worldwide to fill four open board seats.
We maintain a balanced board of eight large-member seats and eight small-member seats. Size is determined by the organization’s membership tier (small members fall in the $0-$1,650 tiers and large members in the $3,900-$50,000 tiers).
Crossref’s Similarity Check service is used by our members to detect text overlap with previously published work that may indicate plagiarism of scholarly or professional works. Manuscripts can be checked against millions of publications from other participating Crossref members and general web content using the iThenticate text comparison software from Turnitin.
The 2,000 members who already use Similarity Check upload almost 2,000,000 documents each month to look for matching text in other publications.
We have some great news for those 2,000 members: a completely new version of iThenticate is on its way, and it will start to roll out to users in the coming months.
New functionality has been developed based on your feedback over the past few years and includes:
An improved Document Viewer that makes PDFs searchable and accessible, with responsive design for ease of use on different screen sizes. All of the functionality of the Viewer and the Text-only reports in the previous version has been streamlined into just two views: Sources Overview and All Sources.
Improved exclusion options to make refining matches even easier. Smarter citation detection now identifies probable citations both inline and in reference sections.
A new “Content Portal” where you can see what percentage of your own content has been successfully indexed for the iThenticate comparison database, and download reports of indexing errors that need to be fixed.
A new API for integration with manuscript submission systems allows display of the largest matching word count and the top 5 source matches alongside the Similarity Score.
The maximum number of pages and maximum file size per document have both been doubled, to 800 pages and 200 MB.
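To illustrate how a submission system might surface these new fields, here is a minimal sketch of handling such an API response. This is purely hypothetical: the field names (`similarity_score`, `largest_matching_word_count`, `top_matches`) and the payload shape are assumptions for illustration, not the actual iThenticate v2 API schema.

```python
import json

# Hypothetical payload; real iThenticate v2 API field names may differ.
payload = json.dumps({
    "similarity_score": 23,
    "largest_matching_word_count": 412,
    "top_matches": [
        {"source": "Journal A, 2019", "percent": 9},
        {"source": "Preprint B, 2021", "percent": 6},
    ],
})

report = json.loads(payload)

# Display the Similarity Score alongside the new match details.
print(f"Similarity Score: {report['similarity_score']}%")
print(f"Largest match: {report['largest_matching_word_count']} words")
for match in report["top_matches"][:5]:  # the API exposes up to 5 sources
    print(f"  {match['source']}: {match['percent']}%")
```

A real integration would fetch this payload from the API with authenticated requests rather than constructing it locally.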
Crossref members can use Similarity Check directly by logging in, or via an integration with a submission/peer review system. We are working with many system providers to bring v2.0 to you as soon as possible. In the meantime, we are looking for members to help us test the new system directly in the iThenticate user interface. If you are interested and can spare a few hours at some point in the next month, please let me know.
And if your organization is not yet using Similarity Check to assess the originality of the manuscripts you receive, do take a look at the many benefits the service has to offer.