Summary review statistics

OJS version 3.1.2 onwards.

From 2021, many funding agencies (mostly European, though the list is expected to expand) will require that all funded research be immediately free to read on publication.

This is imaginatively called Plan S. There are several requirements that a journal must satisfy (see Part III). I’m writing about this one:

at least basic statistics must be published annually, covering in particular … the number of reviews requested, the number of reviews received …

I don’t know why that information would interest anyone outside the journal, but if it is required how do we find it? I can see the Review Report spreadsheet from which that information might be extracted with some effort – is that the only option? An obvious place to provide it would be in Statistics->Editorial.

Thanks, Brendan.

Hi, Brendan,
I would like to add to your request to have more visibility in “report format” about journal statistics…
Lucia

Thanks @bdm and @lsteele. Can I clarify your request? Is the need to show, for each submission, how many reviewers were assigned and what their recommendations were? Or is the need to get overall statistics for the journal, similar to those in the editorial statistics page (average number of reviews per submission, etc)?

Hi @NateWr, The need is for annual summary statistics. Here is the official requirement:

  • The journal/platform must provide, on its website, a detailed description of its editorial policies and decision-making processes. In addition, at least basic statistics must be published annually, covering in particular the number of submissions, the number of reviews requested, the number of reviews received, the approval rate, and the average time between submission and publication.

Cheers, Brendan.

Thanks @bdm! Sorry to keep hammering on this, but to clarify further, are you expecting a stat for a year like this:

  • 425 reviews assigned / 129 accept / 221 reject / 75 abandoned

Or more like this:

  • Each submission was assigned an average of 2.75 reviews; on average 1.25 reviews recommended acceptance, 1 recommended rejection, and 0.5 reviews were abandoned.

Also, should these statistics count only published submissions, or include those that are rejected? Should it be broken down by accepted/rejected? Do they need to be done on a section-by-section basis (i.e. some formats may not undergo a typical peer review)?

Hi @NateWr,

Alas what I quoted is the full text of what I know.

I think that “number of submissions” means all submissions in that year, regardless of what happened to them.

The “approval rate” is, I think, the fraction of submissions that are accepted by the journal (not the fraction of reviews that recommend acceptance). It doesn’t say what to do about articles submitted in one year and accepted/rejected in the next, but that sort of thing averages out over time, so I think it doesn’t matter much. So something like this:

In 2020, 451 articles were submitted, 305 were rejected, 107 were accepted and 115 were published. In total, 678 reviews were requested and 499 were received. For the 115 articles published in 2020, the average time since submission was 201 days.

All the numbers except the 201 days are just counts of events that occurred during the calendar year and don’t require tracking the progress of individual papers. (That’s why 305+107 is not equal to 451 for example.)
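To sketch that counting idea in code (with entirely made-up data; I don’t know how OJS stores these events internally, so this is just the logic, not the implementation):

```python
from datetime import date

# Hypothetical event records: (event_type, date). In a real system these
# would come from the journal's database or an exported report.
events = [
    ("submitted", date(2020, 1, 15)),
    ("submitted", date(2020, 3, 2)),
    ("rejected", date(2020, 4, 10)),       # may belong to a prior year's submission
    ("accepted", date(2020, 6, 1)),
    ("review_requested", date(2020, 2, 1)),
    ("review_received", date(2020, 3, 1)),
    ("published", date(2020, 7, 1)),
]

def annual_counts(events, year):
    """Count events that occurred in a given calendar year.

    No per-paper tracking is needed: each event is attributed to the year
    it happened, which is why accepted + rejected need not equal submitted.
    """
    counts = {}
    for event_type, when in events:
        if when.year == year:
            counts[event_type] = counts.get(event_type, 0) + 1
    return counts

print(annual_counts(events, 2020))
```

Only the “average time since submission” figure needs each published article’s own submission date; everything else is a simple tally like this.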

I don’t think there is a need for OJS to display this information to the public, but if it is available to editors we can easily maintain a table on our information pages by adding an extra row every year.

Thanks, Brendan.

Hi Nate and bdm,
just adding to this interesting discussion (I am still using 3.1.1.2).
My report wish list would include:

  • Submission report: showing how many submissions the journal has received and what happened to them in how many days, which editor handled each submission, the topic, etc. This type of report, run regularly, would enable you to create accurate stats.
  • Peer review report: showing all reviewers involved in the journal and to what extent; this report would allow you to go into more detail on your reviewer activity.
  • Weekly manuscript status report: I create this weekly for our editors and it is very time consuming. I think the statuses are not clear enough, so I have to keep an Excel sheet regularly updated with statuses and dates, which is rather old-fashioned. On my dream list I would like to see each manuscript’s status shown more accurately in the Submissions tab. Presently you cannot distinguish between a manuscript under review and one with the author for revision.
  • Weekly production report: this would again be very useful to track down those manuscripts in production (think for example of first, second and final proofs, or galleys).

Ideally you would export a table of data (xls? csv?) and then derive your stats in a journal-consistent way over the years. Let’s also bear in mind whether we use a submission date or a decision date, as in real life these can span two years (submitted in December, decided on in January…).

And I am sorry to leave this last, but it is not least: I honestly have not managed to pull out a user list… I find it quite amazing that you can only export an XML list of fields and not a CSV, which is much more popular.

I hope to have been able to contribute to this hot discussion!
Lucia

Thanks, that’s helpful!

@bdm Have you seen the editorial stats that are available from 3.2 on? These can be viewed by going to Statistics > Editorial. They should include all of the stats that you listed in your example except the review assignments.

(Pay careful attention to the description of some of the stats. As @lsteele pointed out, it matters a lot when a submission was submitted vs when an editorial decision was made, and this can make it difficult to get accurate statistics except for date ranges from at least a year ago.)

@lsteele I can see what you’re looking for is more of an overview of current editorial status. I’ll take each one in turn:

  • Submission Report: some overall statistics should be available from the editorial stats (received, accepted, days to decisions). Have you seen that? What we don’t have is a breakdown of these stats by editor.
  • Peer Review Report: I don’t think we have had a request for this specifically in the past, but we have had similar requests to share this information with reviewers (see #3534). It wouldn’t be hard, once we have this data, to provide an overall list to journal managers.
  • Weekly manuscript status report: It’s interesting that you provide a spreadsheet to your editors each week. Would you be willing to share more information about this? I’d love to know what the motivation is, what is lacking in the submission list for the editors to review it themselves, and what you gain by compiling it and sending it out to them each week. That will help us as we think about what information is lacking in the existing submission lists. We do have a lot of requests for better tracking of submissions within their stage. In particular, the review stage gets the most requests, and we’re cautiously optimistic that we’ll see forward progress on this in the next couple of major releases.
  • Weekly production report: Stages outside of review are harder for us to track automatically. That’s because the copyediting and production stages have no fixed tasks. Each journal has its own process, and we have no way of saying where in that process a submission has reached. We have discussed providing a generic tasks tool, which would let journals define tasks and track how many are done in each stage, but we haven’t yet been able to prioritise work on that.

I honestly have not managed to pull out a user list

We have this scheduled for release in 3.3. It should be possible to get a CSV export from that version on.

@NateWr, Yes I know about the Editorial statistics, that’s why I initially asked only about reviews. If review statistics were added to that page, it would be a fine solution.

Incidentally, all the percentages on that page (the four items called “XXX Rate”) show as “0.00%” for us. We are using 3.1.2.4 with PHP 7.2.30 and PostgreSQL 9.6.17.

Cheers.

The rates are only going to work when a long date range has been specified. That’s because we only count submissions that were submitted and received a final decision within that range. So if the date range is the last three months, probably very few (or zero) have been submitted and been accepted/rejected in that period.
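A rough sketch of that logic (hypothetical data and field names, not the actual implementation) shows why a short range yields 0%:

```python
from datetime import date

# Hypothetical submission records: (date_submitted, date_decided, decision).
# date_decided is None while no final decision has been recorded.
submissions = [
    (date(2020, 1, 10), date(2020, 5, 2), "accept"),
    (date(2020, 2, 20), date(2020, 6, 15), "decline"),
    (date(2020, 11, 1), None, None),  # still in review
]

def acceptance_rate(submissions, start, end):
    """Acceptance rate over a date range, counting only submissions that
    were BOTH submitted and decided within the range. With a short range
    this pool is often empty, so the rate shows as 0."""
    pool = [
        decision
        for submitted, decided, decision in submissions
        if start <= submitted <= end
        and decided is not None
        and start <= decided <= end
    ]
    if not pool:
        return 0.0
    return sum(1 for d in pool if d == "accept") / len(pool)

# A full year captures one accept and one decline -> 0.5
print(acceptance_rate(submissions, date(2020, 1, 1), date(2020, 12, 31)))
# A three-month window captures no decided submissions -> 0.0
print(acceptance_rate(submissions, date(2020, 10, 1), date(2020, 12, 31)))
```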

@NateWr
It still shows 0.00% for all percentages even if I choose a long range, including “all dates”.

I notice that sometimes when changing the date range there is a pop-up saying “An unexpected error occurred. Please reload the page and try again.” If that happens, a very long error message appears in the PHP log:

Slim Application Error:\nType: Illuminate\Database\QueryException\nCode: 22012\nMessage: SQLSTATE[22012]: Division by zero: 7 ERROR: division by zero (SQL: select statistics.*, COALESCE(statistics.submission_declined_initial, 0) …

It’s about 5000 characters long. I can send it all (how?) if that would help.
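Judging from the SQLSTATE, the rate query presumably divides by a submission count that can be zero for short date ranges. The missing guard is trivial; sketched here in Python rather than the actual SQL (which I haven’t seen):

```python
def safe_rate(numerator, denominator):
    """Return a percentage, guarding against a zero denominator
    (the kind of division-by-zero the SQL error above reports)."""
    if denominator == 0:
        return 0.0
    return 100.0 * numerator / denominator

print(safe_rate(107, 451))  # ~23.7%, e.g. 107 accepted of 451 submitted
print(safe_rate(0, 0))      # 0.0 instead of an error
```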

An unexpected error occurred

I believe that this has been fixed in more recent versions, but if you see this issue after you have upgraded to v3.2.1 or above please let us know!

It still shows 0.00% for all percentages even if I choose a long range, including “all dates”.

If you’re still on 3.1.2 you may be hitting some bugs we had with the editorial stats there. I can’t remember the exact details, but I know some of the stats were not calculated correctly in certain circumstances where data was missing. If you’re able to run a test upgrade on a copied instance, I’d be curious to know whether those numbers get fixed in our latest versions.

Hi Nate, I am very happy to expand on reports, as we are asked more and more frequently for in-depth analysis.
Referring to your replies above:

Submission Report: I have tested it and I get a very garbled, imported xls table.

I am not a very experienced user of OJS, nor skilled enough to create any “do it yourself” adaptations, unfortunately… so if I am doing anything incorrectly I will be very happy to learn more.

Peer review report: I exported the reviews report, but the final result is also garbled (it appears to pull in the abstract):

Weekly manuscript status report: here is a comparative view of the submission tab in OJS and my report (anonymised)

These are my comments on the submission view, from an editor’s perspective:

  1. you cannot sort submissions by status: this means you have all submissions in order of receipt, whether or not that is what you want. An editor should be able to focus only on submissions which are still in editorial review, ignoring the accepted ones (for example). However, if I remove the editor from the Participants, I will not easily see who the editor handling the submission was
  2. it is impossible to distinguish between a submission under review and a submission with the authors for revision
  3. there are no dates (submission date and due dates) easily visible
  4. you have no clear visibility of reviewers invited, reviewers who have accepted, and reviewers who need to be reminded. There is no alert for “Response late”, for example (while there is for “A review is overdue”). This means you have to investigate each single reviewer invitation to find out when the response or review is due.
  5. when you have more rounds of review, there is no visibility in the submission table of which round of review the article has reached, which would be easy to display and is important (in my report I decided to skip this to make life easier)
  6. as a journal manager, you have no visibility in the submission table of the editor assigned (in case you have more than one editor, of course)
  7. when you open a submission detail, you lose the ms number… this really is weird

Weekly production report: I have started to use OJS for basic production tracking. What I find strange is that you can notify a participant but not assign a file at the same time, so I resort to copying/pasting the text of a notification email into the discussion and attaching the files I need to submit (instead of selecting from the submission files, but this is our choice). This works quite well. Maybe a simple flag system could help with production tracking: each user could mark different stages with flags (proof 1/2, with author, etc.)

Sorry if this has become a long email… and thanks for following up on reports. I really appreciate being able to share my difficulties.
Lucia

I have tested it and I get a very garbled, imported xls table.

Hmm, those spreadsheets look to me like maybe they need to be imported or opened with a specific delimiter, and that might be why they are looking garbled in your instance. @pmangahis have you encountered this before? Do you know if there are any recommendations for making sure the columns show up properly when a report is opened?
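If it helps diagnose it, the delimiter a report actually uses can be checked with Python’s standard `csv` module (the sample content here is made up; the real report has many more columns):

```python
import csv
import io

# A few lines as they might appear in an exported report (hypothetical
# content). If a spreadsheet program guesses the wrong delimiter,
# every row lands in a single garbled column.
sample = "Submission ID\tReviewer\tRecommendation\n12\tJ. Smith\tAccept\n"

# csv.Sniffer guesses which delimiter the data actually uses.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t")
rows = list(csv.reader(io.StringIO(sample), dialect))

print(f"Detected delimiter: {dialect.delimiter!r}")  # '\t'
print(rows[1])  # ['12', 'J. Smith', 'Accept']
```

Once the delimiter is known, most spreadsheet programs let you pick it explicitly in their text-import dialog, which usually fixes the garbled columns.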

here is a comparative view of the submission tab in OJS and my report (anonymised)

Thanks for that! This looks like it aligns with a lot of the feedback we’ve gotten from focus groups that we’ve run at our community sprints. Editors want more insight into what’s going on within a stage – particularly the review stage – and how long a submission has sat with editors/reviewers.

We’ve collected a number of these requests in a project on Github. I think the most relevant to your needs is going to be work to improve visualisation of review stage status. If you see any proposals in there that you particularly like, let us know!

We don’t have this work scheduled at the moment but we know it’s a priority and we want to get to it as soon as possible.

you cannot sort submissions by status

This is also a community priority.

There is no alert for “Response late”, for example (while there is for “A review is overdue”)

This alert should appear whether the reviewer has missed the response due date or the review due date. I think I’ve heard before that these should be distinguished better, so that’s confirmation of that.

as a journal manager, you have no visibility in the submission table of the editor assigned

The next major version (3.3) will introduce a filter for journal managers to view submissions by assigned editor.

when you open a submission detail, you lose the ms number

This is addressed in v3.2.

you can notify a participant but not assign a file

I think we’ve had this request before but I couldn’t find it filed so I created a feature request.

@bdm Thanks, we got sidetracked a bit but I have filed your original post as a feature request: https://github.com/pkp/pkp-lib/issues/6130

Dear Nate, thanks a million… I love the way you make our small problems look important and track them to improve the workability of the system.
It looks like I should really get V.3.2 installed, which our provider is reluctant to do because of fixes which may be required once installed.
I’d love to be able to export a users table: any hint on how to do this from the XML would be greatly appreciated.
Have a good day,
Lucia