Upgrade to 3.1.1-2 from 2.4.8: PHP Fatal error: Call to a member function getStatus()

See the FAQ section; there is an entry about that.

Nice, doing that right now. Will removing the users also remove their submissions, or do I need to clean those up manually?

OJS handles that by merging accounts. So when you want to delete an account, you actually merge it into another account. In our installation I have one account just for this purpose and I merge all spam accounts into it. The merge moves all of the user's actions (submissions etc.) to the other account and removes the merged account.

Does 2.x have a mapping table from role_id to role names? I've been trying to find everyone who has not published anything and is not a reviewer, but I don't know the reviewer role ID.
Edit:

Found it in the code directly
define('ROLE_ID_SITE_ADMIN', 0x00000001);
define('ROLE_ID_JOURNAL_MANAGER', 0x00000010);
define('ROLE_ID_EDITOR', 0x00000100);
define('ROLE_ID_SECTION_EDITOR', 0x00000200);
define('ROLE_ID_LAYOUT_EDITOR', 0x00000300);
define('ROLE_ID_REVIEWER', 0x00001000);
define('ROLE_ID_COPYEDITOR', 0x00002000);
define('ROLE_ID_PROOFREADER', 0x00003000);
define('ROLE_ID_AUTHOR', 0x00010000);
define('ROLE_ID_READER', 0x00100000);
define('ROLE_ID_SUBSCRIPTION_MANAGER', 0x00200000);
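
With those values (0x00001000 = 4096 for ROLE_ID_REVIEWER), a query along these lines should list accounts with no submissions and no reviewer role. This is just a sketch assuming the standard 2.x tables users, roles (journal_id, user_id, role_id) and articles (which has a user_id column):

-- Users who have never submitted an article and hold no reviewer role
-- (0x00001000 = 4096 = ROLE_ID_REVIEWER)
SELECT u.user_id, u.username, u.email
FROM users u
WHERE NOT EXISTS (SELECT 1 FROM articles a WHERE a.user_id = u.user_id)
  AND NOT EXISTS (SELECT 1 FROM roles r
                  WHERE r.user_id = u.user_id AND r.role_id = 4096);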

see https://github.com/pkp/ojs/blob/master/plugins/importexport/users/sample.xml

Thanks. What do you do with the user you merged all the others into? I'm assuming I can delete that user and all the bogus submissions as well, I'm just not sure how.

EDIT:
Just found your other topic @ajnyga
for i in {INSERTIDHERE..INSERTIDHERE}; do php tools/deleteSubmissions.php $i; done
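
To get the IDs to feed that loop, I'm first listing the submissions owned by the merge account. This assumes the 2.x articles.user_id column; 999 is just a placeholder for the merge account's user_id:

-- Article IDs belonging to the account the spam users were merged into;
-- these get fed to tools/deleteSubmissions.php
-- (999 = placeholder for the merge account's user_id)
SELECT article_id
FROM articles
WHERE user_id = 999
ORDER BY article_id;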

After a round of cleaning, I’m still getting duplicates with the same submission_file_id

[code: Installer Installer::provideSupplementaryFilesForReview]

DB Error: Duplicate entry '2147483647-1' for key 'PRIMARY'

ojs2: DB Error: Duplicate entry '2147483647-1' for key 'PRIMARY'

Searching my database dump, I can see 8 article_id values with the same submission_file_id. Does that make sense? What's the best approach to clean up duplicates like this?
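
A grouped query should make them easier to list than grepping the dump; something like this, assuming the 2.x articles table:

-- submission_file_id values shared by more than one article
SELECT submission_file_id, COUNT(*) AS n, GROUP_CONCAT(article_id) AS article_ids
FROM articles
WHERE submission_file_id IS NOT NULL
GROUP BY submission_file_id
HAVING COUNT(*) > 1;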

There are also 27k articles with submission id = null. Will that be a problem?

There is no delete option in OJS; you can only merge users. You could of course delete users straight from the database and try to clean up everything attached to them, but that is probably a lot of work. Individual submissions you can delete from the OJS UI.
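
Just to illustrate why it is a lot of work, a minimal sketch of the manual route would start with something like the statements below, and user_id is referenced from many more tables (assignments, notifications, sessions and so on), which is why merging is the safer route:

-- Minimal sketch only, not a complete clean-up; replace 999 with the
-- spam account's user_id. Many other tables also reference user_id.
DELETE FROM roles         WHERE user_id = 999;
DELETE FROM user_settings WHERE user_id = 999;
DELETE FROM users         WHERE user_id = 999;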

where exactly are the duplicates? In what table?

do you mean that in the database table submissions (or articles) you have submissions with submission_id null?

I'm still not 100% sure where they are. From the logs, the duplicate appears when it tries to insert into submission_supplementary_files:

(mysql): INSERT INTO submission_supplementary_files
(file_id, revision)
VALUES
(2147483647, 1)
1062: Duplicate entry '2147483647-1' for key 'PRIMARY'

I'm now doing a full SQL dump with "mysqldump -p --skip-extended-insert | grep -i 2147483647" to find everything with that ID. I don't know why multiple rows in the 2.x articles table would have submission_file_id = 2147483647.

As for the nulls: yes, in the articles table of the 2.x database I see 27k entries with NULL in submission_file_id.

I'm still struggling to understand everything the upgrade from 2.x to 3.x does, but I feel the code is not lenient enough to skip over some errors. I would prefer to end up with an upgraded system and some duplicates removed or orphaned (to be fixed manually afterwards) rather than have it simply die.

Remember to check the revision number as well. Depending on where the duplicates are, you could have a submission file with id 2147483647 and several of these files in the database; however, it is only a problem if they also have the same revision number.
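
You can check that directly; a sketch assuming the 2.x article_files table:

-- (file_id, revision) pairs that occur more than once are what would
-- break the primary key the upgrade tries to fill
SELECT file_id, revision, COUNT(*) AS n
FROM article_files
GROUP BY file_id, revision
HAVING COUNT(*) > 1;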

But why would articles have the same submission file id? I can’t think of a reason why that would happen.

In any case, I've removed with tools/deleteSubmissions.php all articles that have that particular file id; let's see if it works. The waiting is what kills me :slight_smile: . I've cleared the metrics table, but that did not reduce the upgrade time; I'm still at around 8 hours.

It is weird that it takes 8 hours even without the metrics, unless you have a large database to begin with, with a lot of journals and submissions. If that is the case, prepare for a very, very long upgrade once the metrics table is back and you get past these errors. I did an upgrade of a database with 100 journals and it took 36 hours to complete, 30 hours of which was upgrading the metrics table. But that probably does not affect your upgrade yet, because you most likely crash before reaching that part.

With the nulls, it could be that you need this commit: pkp/pkp#3783 consider supp files without file id in the migration · pkp/ojs@1899a4e · GitHub (which you have if you use the latest ojs-stable-3_1_1 branch from GitHub, but not if you downloaded OJS from the PKP site).
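
You can see whether that case applies to you with something like this (2.x table names assumed):

-- Supplementary file records with no underlying file at all,
-- which is the case that commit handles
SELECT COUNT(*) AS supp_without_file
FROM article_supplementary_files
WHERE file_id IS NULL OR file_id = 0;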

There are a lot of cases where you have a single file_id with several revisions. If for example in OJS2 a reviewer uploaded several review files, all those files have the same file_id and just a different revision number. I do not remember all the places where OJS2 created revisions of the same file, but there are definitely other places as well.

But if I understand correctly, the duplicates must originally come from here: ojs/Upgrade.inc.php at ojs-stable-3_1_1 · pkp/ojs · GitHub, so check the original article_supplementary_files table for duplicate entries.
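
Something along these lines should show them, again assuming the 2.x schema:

-- file_id values referenced by more than one supplementary file record;
-- these are the rows that collide when the upgrade fills
-- submission_supplementary_files (file_id, revision)
SELECT file_id, COUNT(*) AS n, GROUP_CONCAT(supp_id) AS supp_ids
FROM article_supplementary_files
WHERE file_id IS NOT NULL AND file_id <> 0
GROUP BY file_id
HAVING COUNT(*) > 1;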

Truncating the metrics table does not seem to have reduced the time significantly.
The database dump is 106 MB, so I would say it is very small, and maybe around 25 GB of files.

I do have that patch, though; I applied it manually as an attempt to fix the issue.

For this duplicate it is always the same revision (1). I will take a look at the article_supplementary_files table once I'm at home.

Edit:
I see it spends quite a long time (4 hours already) on:

UPDATE articles SET review_file_id=NULL WHERE review_file_id = 0

(mysql): DELETE FROM review_rounds WHERE submission_id = 24372

(mysql): SELECT file_id, revision, file_name FROM article_files WHERE file_id = 0

(mysql): DELETE FROM article_files WHERE file_id = 0

(mysql): UPDATE articles SET review_file_id=NULL WHERE review_file_id = 0
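
To get a sense of how much data those repeated statements have to work through, I'm also checking a couple of counts; purely diagnostic, against the 2.x tables:

-- How many rows the repeated clean-up statements actually touch
SELECT
  (SELECT COUNT(*) FROM article_files WHERE file_id = 0)   AS files_with_id_0,
  (SELECT COUNT(*) FROM articles WHERE review_file_id = 0) AS articles_with_review_file_0;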

Hmm, maybe there are not that many statistics gathered then. In our installation the metrics table takes up half of the database size, 600 megabytes out of 1 GB, and in a similar installation with 100 journals it took, as I mentioned above, around 30 hours to complete a single SQL statement (out of the 36 hours the whole upgrade took). But you are probably not even reaching the part where the metrics table updates start.
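
If you want to see what is actually taking up the space, something like this works in MySQL:

-- Per-table size in MB for the current database
SELECT table_name,
       ROUND((data_length + index_length) / 1024 / 1024, 1) AS size_mb,
       table_rows
FROM information_schema.tables
WHERE table_schema = DATABASE()
ORDER BY (data_length + index_length) DESC
LIMIT 10;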

The part you mentioned does take a considerable amount of time as well, true.

This looks like the next part that takes the most time:

(mysql): UPDATE submission_files SET file_stage=10 WHERE file_id IN (SELECT file_id FROM submission_galleys)

+----+------+-----------------------------------+----------+---------+------+--------------+-----------------------------------------------------------------------------------------------------+
| Id | User | Host                              | db       | Command | Time | State        | Info                                                                                                |
+----+------+-----------------------------------+----------+---------+------+--------------+-----------------------------------------------------------------------------------------------------+
| 24 | root | localhost                         | revistas | Query   | 5015| Sending data | UPDATE submission_files SET file_stage=10 WHERE file_id IN (SELECT file_id FROM submission_galleys) |
| 64 | root | localhost                         | NULL     | Query   |    0 | NULL         | show processlist                                                                                    |
+----+------+-----------------------------------+----------+---------+------+--------------+-----------------------------------------------------------------------------------------------------+
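
If that statement stays stuck and I end up patching it, I might try the join form, which should be equivalent in effect and avoids MySQL re-evaluating the subquery per row. This is just a sketch, not something taken from the upgrade scripts:

-- Join form of the same update; older MySQL versions handle
-- "IN (SELECT ...)" poorly and may evaluate it per row
UPDATE submission_files sf
JOIN submission_galleys sg ON sg.file_id = sf.file_id
SET sf.file_stage = 10;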

Ok, now this borders on the ridiculous.
I'm still getting the same error with the duplicate 2147483647, but there are zero mentions of that id anywhere in the database before the upgrade. How is that possible?

I was going to say earlier that the number sounds weird (big). It seems there is something special about it: php - Incorrect Integer (2147483647) is inserted into MySQL? - Stack Overflow. It is basically the largest value a signed INT column can hold in MySQL.

So I think that, for some reason, the upgrade keeps looping over something almost indefinitely, something it should not, and ends up trying to insert that maximum value twice => your error.

I think that in order to get to the bottom of this, you need to add some debugging code around here: ojs/Upgrade.inc.php at ojs-stable-3_1_1 · pkp/ojs · GitHub

Maybe @asmecher has some pointers to this, this is nothing I have seen before.

Edit: or do you have file_id values larger than that number in the article_supplementary_files table? That is the highest value a signed 32-bit integer supports, I think, and any higher value would end up stored as 2147483647.
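
You can check that quickly with something like this (2.x table and column names assumed):

-- 2147483647 is the signed 32-bit INT maximum; see whether any
-- source file_id is at or above it
SELECT 'article_files' AS source, MAX(file_id) AS max_file_id FROM article_files
UNION ALL
SELECT 'article_supplementary_files', MAX(file_id) FROM article_supplementary_files
UNION ALL
SELECT 'articles.submission_file_id', MAX(submission_file_id) FROM articles;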

You see, now that's an amazing rationale and it makes sense to me, including why it is taking so long.
I'm trying the upgrade with 3.0.2 now to see what happens (I tried it before, but that was before handling the null issues). I will investigate this after it finishes (or crashes).

3.0.2 failed with the error below:

(mysql): SELECT * FROM users WHERE user_id = 23

(mysql): SELECT * FROM user_settings WHERE user_id = '23'

PHP Warning:  assert(): Assertion failed in.../lib/pkp/classes/submission/PKPSubmissionFileDAO.inc.php on line 355
PHP Fatal error:  Call to a member function getFilePath() on null in .../lib/pkp/classes/submission/PKPSubmissionFileDAO.inc.php on line 377

@ajnyga In article_supplementary_files there are 30 entries, nothing bigger than 46371, so it looks legit. Now we need to find what could be causing the loop.