Difference between revisions of "ISFDB:Configure Nightly Processing"

From ISFDB
[[Category:Installation instructions]]
 

Revision as of 15:41, 21 April 2021

Note: the two scripts referenced below, nightly_job.py and monthly_job.py, reside in the "nightly" subdirectory under INSTALL_HTML.

First, configure crontab to run nightly_job.py once a day, e.g.:

00    01    *    *    *    /var/www/html/nightly/nightly_job.py > /dev/null 2>&1

When it runs, this task will regenerate database statistics and rerun the cleanup reports.
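The five leading columns in each crontab entry are the standard cron schedule fields (minute, hour, day of month, month, day of week). As an annotated fragment, the entry above reads:

```shell
# Crontab field order: minute hour day-of-month month day-of-week command.
# "00 01 * * *" means 01:00 every day; output is discarded via > /dev/null 2>&1.
00    01    *    *    *    /var/www/html/nightly/nightly_job.py > /dev/null 2>&1
```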

Next, configure crontab to run monthly_job.py once a month, e.g.:

00    02    7    *    *    /var/www/html/nightly/monthly_job.py > /dev/null 2>&1

When it runs, this task will recreate the "Suspected Duplicate Authors" cleanup report. Note that this process takes a long time and puts a lot of stress on the server, so it should only be run infrequently.
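The two entries can also be installed without opening an interactive editor. The following is a sketch only, reusing the example paths above; it assumes the entries are being added to the crontab of the user that should run the jobs, and it leaves the actual `crontab -` install step as a comment:

```shell
# Sketch: build the two crontab entries shown in the examples above.
CRON_ENTRIES='00 01 * * * /var/www/html/nightly/nightly_job.py > /dev/null 2>&1
00 02 7 * * /var/www/html/nightly/monthly_job.py > /dev/null 2>&1'

# To install, preserve any existing entries (crontab -l fails harmlessly if
# none exist yet) and pipe the combined result back to cron:
#   ( crontab -l 2>/dev/null; echo "$CRON_ENTRIES" ) | crontab -

# Sanity check: count the entries that begin with a "00"-minute schedule.
echo "$CRON_ENTRIES" | grep -c '^00 '
```

Running the sanity check should report both entries.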