Description of issue or problem I’m having:
We are looking for a recommended method for generating automatic backups of a production environment. Our server has 160GB of files and a 22GB database.
Currently a scheduled routine at night updates the file copy and produces a database dump, which usually takes some time. We believe this is not the best process, because the files and the database are copied at different times and could end up inconsistent with each other.
Would there be any method of putting the System on read-only?
Should we suspend web access to generate the backup?
Steps I took leading up to the issue:
Dump database and file copy at scheduled times.
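For reference, a nightly routine like the one described might look like this in cron (times, paths, and the database name are illustrative placeholders, not the actual setup):

```shell
# Hypothetical crontab: dump the 22GB database at 02:00, then sync the
# 160GB files tree at 02:30 -- the gap between the two jobs is exactly
# where the files/database inconsistency described above can creep in.
0 2 * * *  mysqldump ojs | gzip > /backups/ojs-$(date +\%F).sql.gz
30 2 * * * rsync -a /var/www/ojs/files/ /backups/files/
```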
Interesting case. If you need to protect the database during the mysqldump process, you can lock the tables with this query:
FLUSH TABLES WITH READ LOCK;
After the dump is finished you can release the lock by this query :
UNLOCK TABLES;
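One caveat: FLUSH TABLES WITH READ LOCK is only held while the session that issued it stays connected, so the dump has to run while that session remains open. Alternatively, mysqldump can handle the locking itself. A hedged sketch, with the database name and output path as placeholders:

```shell
# Hypothetical wrapper functions; "ojs" and the output path below are
# placeholders, not an actual installation's names.

dump_with_lock() {
    # --lock-all-tables makes mysqldump issue FLUSH TABLES WITH READ LOCK
    # itself and hold it for the duration of the dump.
    mysqldump --lock-all-tables "$1" > "$2"
}

dump_with_snapshot() {
    # For InnoDB tables, --single-transaction takes a consistent snapshot
    # without blocking writes, avoiding the global read lock entirely.
    mysqldump --single-transaction "$1" > "$2"
}
```

For a mostly-InnoDB schema, the `--single-transaction` variant is usually preferable because the site stays fully writable during the dump.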
However, our team has never run into this case ourselves, so we have never tested it on a real OJS installation.
There is a github issue filed to add a maintenance mode to OJS.
However, I don’t recommend something like this for a regular backup process as a multi-GB backup can take a long time, and having the website regularly in maintenance or read-only mode is annoying for users if their working time zone corresponds to your maintenance window.
I would recommend leaving the site live and backing up the database first, then the files directory. The contents of the files directory are generally added to (e.g. new uploads) and almost never modified. If the database is backed up first and, say, an author makes a new submission between the two steps, the database backup will still be internally consistent; the files directory will merely contain an extra file or two that the database does not refer to. You will be able to restore from this backup without problems.
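The recommended order above can be sketched as a small script. This is a minimal illustration, assuming a hypothetical database name, paths, and backup location; it is not an official OJS backup tool:

```shell
#!/bin/sh
# Hypothetical sketch: dump the database FIRST, then copy the files
# directory. DB_NAME, FILES_DIR, and BACKUP_DIR are placeholders.
set -eu

DB_NAME="ojs"
FILES_DIR="/var/www/ojs/files"
BACKUP_DIR="/backups"

backup_site() {
    stamp=$(date +%Y%m%d-%H%M%S)
    # 1) Database first: any file uploaded after this point simply becomes
    #    an extra, unreferenced file in the files backup.
    mysqldump "$DB_NAME" | gzip > "$BACKUP_DIR/db-$stamp.sql.gz"
    # 2) Files second: rsync transfers only new and changed files, so the
    #    160GB tree is not re-copied in full every night.
    rsync -a "$FILES_DIR/" "$BACKUP_DIR/files/"
}
```

Running `backup_site` nightly from cron keeps the site live for the whole window; the only consistency cost is the possible unreferenced extra file noted above.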
Regards,
Alec Smecher
Public Knowledge Project Team