
Extension:DumpsOnDemand

MediaWiki extensions manual
DumpsOnDemand
Release status: stable
Implementation: Special page
Description: Allows users to request and download database dumps on the wiki
Author(s): Mainframe98 (talk)
Latest version: 1.0.2 (2020-03-28)
Compatibility policy: Snapshots releases along with MediaWiki. Master is not backward compatible.
MediaWiki: >= 1.41.0
Database changes: No
License: MIT License
Parameters:
  • $wgDumpsOnDemandUseDefaultJobQueue
  • $wgDumpsOnDemandRequestLimit
  • $wgDumpsOnDemandDumpFileBackend
  • $wgDumpsOnDemandCompression
Added rights:
  • dumprequestlog
  • dumpsondemand
  • dumpsondemand-limit-exempt
Quarterly downloads: 7 (Ranked 130th)
Public wikis using: 5,645 (Ranked 46th)
Vagrant role: dumpsondemand
Issues: Open tasks · Report a bug

The DumpsOnDemand extension allows users to request and download database dumps on the wiki. Database dumps can be downloaded from Special:RequestDump. If the user has the dumpsondemand right, they can request a new dump whenever they like. Two dumps are available: a dump containing only the current revisions, suitable for bot use, and a dump containing all revisions, suitable for archiving.

DumpsOnDemand is based on the Dumps sub-extension for Wikia's WikiFactory extension.

Installation

  • Download and move the extracted DumpsOnDemand folder to your extensions/ directory.
    Developers and code contributors should install the extension from Git instead, using:
    cd extensions/
    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/DumpsOnDemand
  • Add the following code at the bottom of your LocalSettings.php file:
    wfLoadExtension( 'DumpsOnDemand' );
    
  • Configure as required.
  • Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.


Vagrant installation:

  • If using Vagrant, install with vagrant roles enable dumpsondemand --provision

Configuration


Parameters

  • $wgDumpsOnDemandUseDefaultJobQueue - When enabled, this setting makes the jobs used by DumpsOnDemand execute along with the regular job queue. By default, the dump jobs are only run when their type is explicitly specified to the job runner. Enabling this setting is only recommended for small wikis or wikis that have sufficient job runner capacity (see the example configuration after this list).
  • $wgDumpsOnDemandCompression - This setting configures which compression format is used to compress the dumps. By default, DumpsOnDemand chooses an algorithm based on the available PHP extensions. An invalid option results in regular, uncompressed dumps. Supported options are:
    • gz for GZip
    • bz2 for BZip2
    • zip for Zip
  • $wgDumpsOnDemandRequestLimit - This setting configures the minimum time between subsequent dump requests. It specifies the number of seconds that must have passed before a new dump can be requested. Users with the dumpsondemand-limit-exempt right can ignore this restriction.
  • $wgDumpsOnDemandDumpFileBackend - This setting specifies an ObjectFactory spec for a FileBackend instance. The provided object is used by DumpsOnDemand to write the dumps to and to read the URLs from. DumpsOnDemand only provides a backend that writes to $wgUploadDirectory, but you can add your own by extending the FileBackend class and specifying it in this setting.
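
As an illustration only, the parameters above can be combined in LocalSettings.php as follows; the values shown here (bzip2 compression, a one-week request limit) are example choices, not the extension's defaults:

    wfLoadExtension( 'DumpsOnDemand' );

    // Let the dump jobs run together with the regular job queue
    // (only advisable for small wikis or wikis with enough job runner capacity).
    $wgDumpsOnDemandUseDefaultJobQueue = true;

    // Compress dumps with bzip2; requires PHP's bz2 extension.
    $wgDumpsOnDemandCompression = 'bz2';

    // Require at least one week (in seconds) between dump requests.
    $wgDumpsOnDemandRequestLimit = 7 * 24 * 60 * 60;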

User rights

  • dumpsondemand - This user right allows users to request a new dump on Special:RequestDump.
  • dumpsondemand-limit-exempt - This user right allows users to ignore the time limit between dump requests. Users must still have the dumpsondemand right to request a new dump.
  • dumprequestlog - This user right allows users to view the database dump request log. An example of granting these rights is shown below.
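
These rights can be granted with MediaWiki's standard $wgGroupPermissions setting in LocalSettings.php. The group names below are illustrative; assign the rights to whichever groups suit your wiki:

    // Allow administrators to request new dumps and to view the dump request log.
    $wgGroupPermissions['sysop']['dumpsondemand'] = true;
    $wgGroupPermissions['sysop']['dumprequestlog'] = true;

    // Allow bureaucrats to bypass the time limit between dump requests.
    // (They still need the dumpsondemand right to request a dump.)
    $wgGroupPermissions['bureaucrat']['dumpsondemand-limit-exempt'] = true;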

Other important notes

  • DumpsOnDemand generates the dumps using the job queue. Because the creation of a database dump can take a long time, DumpsOnDemand jobs are not executed along with the regular jobs by default. This behaviour can be changed by setting $wgDumpsOnDemandUseDefaultJobQueue to true, but that is only recommended for small wikis or wikis with sufficient job running capacity. Otherwise, the dump jobs need to be executed by a dedicated job runner, as sketched below.
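
For wikis that keep the default setting, a dedicated job runner has to execute the dump jobs. A minimal sketch using MediaWiki's standard maintenance/runJobs.php script; the job type name registered by the extension is not documented on this page and has to be looked up in the extension's extension.json:

    // Keep the default: dump jobs are skipped by the regular job queue.
    $wgDumpsOnDemandUseDefaultJobQueue = false;

    // The dump jobs must then be executed by a dedicated runner, for example a
    // cron job that calls maintenance/runJobs.php with --type set to the job
    // type(s) registered by DumpsOnDemand in its extension.json.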

See also
