OJS 2.4.6 Reading Tools validate URLs timeout error

Hi there,

I am running an OJS 2.4.6 instance and the editor would like to enable Reading Tools. However, when we run the “Validate URLs for Reading Tools” link (i.e. journalURL/rtadmin/validateUrls) we get a 500 Internal Server Error that appears to be a timeout.

Internal Server Error
The server encountered an internal error or misconfiguration and was unable to complete your request.
Please contact the server administrator, root@localhost and inform them of the time the error occurred, and anything you might have done that may have caused the error.
More information about this error may be available in the server error log.

Errors in /etc/httpd/logs/error_log:

[Mon Mar 28 11:35:19 2016] [warn] [client] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server, referer: http://canadianfoodstudies.uwaterloo.ca/index.php/cfs/rtadmin
[Mon Mar 28 11:35:19 2016] [warn] [client] (104)Connection reset by peer: mod_fcgid: ap_pass_brigade failed in handle_request_ipc function, referer: http://canadianfoodstudies.uwaterloo.ca/index.php/cfs/rtadmin
[Mon Mar 28 11:35:35 2016] [error] mod_fcgid: process /var/www/journals/cfs/fcgi-bin/php-wrapper(25497) exit(communication error), get signal 11, possible coredump generated

I have tried changing various timeout, memory-limit, and URL-length values in the following locations, to no avail. Any pointers in the right direction would be greatly appreciated!

PHP (php.ini):

  • max_execution_time = 300
  • memory_limit = 256M

Apache (httpd.conf):

  • Timeout 3600
  • ProxyTimeout 3600
  • MaxKeepAliveRequests 1000

mod_fcgid:

  • FcgidIdleTimeout 3600
  • FcgidBusyTimeout 600
  • FcgidIOTimeout 300


Graham Faulkner

What is the exact message from your web server’s error log?

Oops, sorry about that. I have edited the original question to include the error log messages.

Thanks for catching that.

Just checking in again to see if anyone has some insight to share.



The code that is timing out is probably in RTAdminHandler::rtadmin_validate_url().

Perhaps you could add an error_log() statement at the top of the function to log each $url, so you can see which URL is being processed before it dies (or whether the failure is a cumulative timeout rather than one bad URL).
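For anyone following along, the idea is simply to log each URL before it is fetched, so the last log line identifies the request that hung. The actual handler is PHP; this is just a sketch of the pattern in Python, where `fetch` and the URL list are stand-ins:

```python
import logging

logging.basicConfig(level=logging.INFO)

def validate_urls(urls, fetch):
    """Validate each URL, logging it first so the last log line pinpoints a hang."""
    results = {}
    for url in urls:
        logging.info("validating %s", url)  # analogous to error_log($url) in PHP
        try:
            results[url] = fetch(url)
        except Exception as exc:  # record the failure and keep going
            results[url] = exc
    return results
```

If the process dies mid-loop, the final “validating …” line names the culprit; if every URL logs fine and the request still fails, a cumulative timeout is the more likely problem.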

Thanks for the feedback, Clinton.

I put an error_log() in the code you suggested and watched the logs. It would always stop at the same (but valid) URL, so I wondered if it was maxing out memory. After some experimentation I bumped up php.ini’s memory_limit and my.cnf’s wait_timeout, and it would run a bit further, but eventually time out.

However, I discovered I could also just run the validation on each related item set individually, which accomplishes the same thing. This is what I ended up recommending to the journal editor, as the memory limits required to validate everything in one request were getting too high.
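For anyone who lands here with the same problem: validating one related item set at a time is effectively just chunking the work so that each HTTP request stays short and small. A rough sketch of the idea (the helper name and batch size are made up for illustration):

```python
def chunk(items, size):
    """Split a list into batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]
```

Each batch can then be validated in its own request, so no single request has to fetch and hold results for every URL at once.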