For the third year in a row, Crossref hosted a roundtable on research integrity ahead of the Frankfurt Book Fair. This year the event looked at Crossmark, our tool for displaying retractions and other post-publication updates to readers.
Since the start of 2024, we have been carrying out a consultation on Crossmark, gathering feedback and input from a range of members. The roundtable discussion was a chance to check and refine some of the conclusions we’ve come to, and gather more suggestions on the way forward.
In our previous blog post in this series, we explained why no metadata matching strategy can return perfect results. Thankfully, however, this does not mean that it’s impossible to know anything about the quality of matching. Indeed, we can (and should!) measure how close (or far) we are from achieving perfection with our matching. Read on to learn how this can be done!
How about we start with a quiz? Imagine a database of scholarly metadata that needs to be enriched with identifiers, such as ORCIDs or ROR IDs.
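Before the quiz, here is a hedged sketch of the general idea behind measuring matching quality: compare a strategy's output against a small hand-curated sample and compute precision and recall. This is an illustration, not Crossref's actual evaluation code, and every record ID and ROR-style identifier below is invented for the example.

```python
# Hand-labelled "ground truth": record ID -> correct ROR ID (None = no match exists).
truth = {
    "rec-1": "https://ror.org/0example1",
    "rec-2": None,
    "rec-3": "https://ror.org/0example3",
}

# Output of a hypothetical matching strategy on the same records.
predicted = {
    "rec-1": "https://ror.org/0example1",  # correct match
    "rec-2": "https://ror.org/0example9",  # spurious match (false positive)
    "rec-3": None,                         # missed match (false negative)
}

true_pos = sum(1 for rec, m in predicted.items() if m is not None and m == truth[rec])
false_pos = sum(1 for rec, m in predicted.items() if m is not None and m != truth[rec])
false_neg = sum(1 for rec, m in truth.items() if m is not None and predicted[rec] != m)

precision = true_pos / (true_pos + false_pos)  # share of returned matches that are correct
recall = true_pos / (true_pos + false_neg)     # share of correct matches that were found

print(f"precision={precision:.2f}, recall={recall:.2f}")  # here: 0.50, 0.50
```

The point of the toy numbers is only that the two measures answer different questions: a strategy can return few wrong matches (high precision) while still missing many real ones (low recall), or the reverse.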
We’re in year two of the Resourcing Crossref for Future Sustainability (RCFS) research. This report provides an update on progress to date, specifically on research we’ve conducted to better understand the impact of our fees and possible changes.
Crossref is in a good financial position with our current fees, which haven’t increased in 20 years. This project is seeking to future-proof our fees by:
- Making fees more equitable
- Simplifying our complex fee schedule
- Rebalancing revenue sources

In order to review all aspects of our fees, we’ve planned five projects to look into specific aspects of our current fees that may need to change to achieve the goals above.
On behalf of the Nominating Committee, I’m pleased to share the slate of candidates for the 2024 board election.
Each year we do an open call for board interest. This year, the Nominating Committee received 53 submissions from members worldwide to fill four open board seats.
We maintain a balanced board of 8 large member seats and 8 small member seats. Size is determined by the organization’s membership tier (small members fall in the $0-$1,650 tiers and large members in the $3,900-$50,000 tiers).
The Metadata Manager tool is in beta and contains many bugs. It’s being deprecated at the end of 2021. We recommend using the web deposit tool as an alternative, or the OJS plugin if your content is hosted on the OJS platform from PKP.
Once you click Deposit, we immediately process the deposit and display the results for accepted and rejected deposits. All deposit records accepted by the system have a live DOI.
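If you want to double-check that an accepted record is live, one option outside Metadata Manager itself is to look the DOI up in Crossref’s public REST API. The sketch below is a hedged illustration rather than an official check: the DOI shown is a placeholder, and a newly registered record can take a little while to appear in the REST API index.

```python
# Look up a registered DOI in Crossref's public REST API.
# The DOI below is a placeholder -- substitute one you have registered.
import json
import urllib.error
import urllib.request

doi = "10.5555/12345678"  # placeholder DOI
url = f"https://api.crossref.org/works/{doi}"

try:
    with urllib.request.urlopen(url) as response:
        record = json.load(response)
    print("Registered:", record["message"].get("title"))
except urllib.error.HTTPError as err:
    # A 404 can simply mean the new record has not been indexed yet.
    print("Not found in the REST API:", err.code)
```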
All deposit results are archived and available for reference on the Deposit history tab on the top menu bar.
You can also see your deposit history in the admin tool - go to the Administration tab, then the Submissions tab. Metadata Manager deposit filenames begin with MDT. You can even review the XML that Metadata Manager has created on your behalf.
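If you do download that XML, the sketch below shows one way to skim it and list the DOIs a deposit file registers. It is a hedged illustration, not specific to Metadata Manager’s output: it ignores namespaces so it isn’t tied to one schema version, and "deposit.xml" is a placeholder filename for a file you have saved locally.

```python
# List every DOI registered by a Crossref deposit XML file.
import xml.etree.ElementTree as ET

tree = ET.parse("deposit.xml")  # placeholder filename

for element in tree.iter():
    # Tags carry the schema namespace, e.g. "{http://www.crossref.org/schema/...}doi",
    # so compare only the local part of the tag name.
    if element.tag.rsplit("}", 1)[-1] == "doi":
        print(element.text)
```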
Updating existing records and failed deposits
Metadata Manager also makes it easy to update existing records, even if you didn’t use Metadata Manager to make the deposit in the first place. You must add the journal to your workspace before you can update the records associated with it - learn more about setting up a new journal in your workspace.
Accepted and Failed submissions can be updated using the respective tabs in the workspace. Click into the journal, and then click into the article. Add or make changes to the information, and then deposit.
What does the status “warning” in my submission result mean?
When similar metadata is registered for more than one DOI, it’s possible that the additional DOIs are duplicates. Because DOIs are intended to be unique, the potentially duplicated DOI is called a conflict. Learn more about the conflict report.
In Metadata Manager, if you register bibliographic metadata that is very similar to that for an existing DOI, you will see a status “warning” with your submission result. This is accurate.
When you return to your journal workspace in Metadata Manager to review your list of DOIs, the DOI that returned the “warning” will display as “failed”. This is inaccurate: the deposit was accepted and the DOI is registered, as you can confirm by resolving it. We are working on improving the wording in this part of the process to make it less confusing.
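If you want to confirm this for yourself, the sketch below asks the DOI resolver directly (rather than the REST API) whether the DOI is registered and where it points, using doi.org’s handle lookup. It is a hedged illustration, not an official tool; the DOI shown is a placeholder for the one that returned the warning.

```python
# Check whether a DOI is registered and where it resolves, via doi.org's
# handle lookup (avoids visiting the publisher's landing page).
import json
import urllib.error
import urllib.request

doi = "10.5555/12345678"  # placeholder DOI
url = f"https://doi.org/api/handles/{doi}"

try:
    with urllib.request.urlopen(url) as response:
        handle = json.load(response)
    targets = [v["data"]["value"] for v in handle["values"] if v["type"] == "URL"]
    print("Registered, resolves to:", targets)
except urllib.error.HTTPError as err:
    print("Not registered:", err.code)
```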
Page owner: Sara Bowman | Last updated 2022-July-22