Manual:Edit throttling
Edit throttling is a security mechanism to deal with buggy or malicious bots by disallowing multiple edits from the same user or IP address within a certain time period.
Rationale
Wikis in general, including wikis running the MediaWiki engine, depend on human intervention to maintain editorial integrity. If one contributor, through malice or ignorance, makes unwanted edits, other contributors can simply revert or correct the edits.
This strategy, however, depends on the community being able to keep up with the pace of unwanted edits. A buggy or malicious bot can make edits much faster than a human can by hand, and may be able to outpace even a large community of editors trying to clean up its mistakes.
One tool in the MediaWiki software for dealing with unruly bots is a block -- disallowing edits from the IP address the bot is running from. Another is edit throttling -- disallowing more than a certain number of edits from any one user or IP address within a given period of time.
Hostile bots could, of course, keep their edit rate just below the throttling rate. Even so, the damage is contained: the bot cannot edit faster than the community can manually revert, and the community has time to respond with other measures to handle the unwanted edits.
Design
The general idea is to restrict edits from any given user or IP address to X edits per Y seconds. It is important for an individual MediaWiki installation to tune the variables X and Y so that hostile or buggy bots are blocked without blocking perfectly well-meaning human editors.
In addition, it is worth considering how big the wiki is and how active the community is -- that is, how much damage a hostile bot can do with X edits in Y seconds before someone notices and uses another mechanism to stop it. Lastly, the delay suggested to friendly, well-behaved bots should be set so that they do not exceed this rate (or the rate should be set so that it does not block them).
Some example values of X and Y:
- X = 1, Y = 10: no more than 1 edit in 10 seconds. Note that a human could conceivably make two edits in 10 seconds and get a warning, so X = 1 is probably never a good choice.
- X = 20, Y = 600: no more than 20 edits every 10 minutes. It is less likely that a human editor could keep up such a rapid pace of editing.
- X = 100, Y = 3600: few human editors would make more than 100 edits per hour. However, on some sites 100 edits could cause considerable damage. Such damage can be reverted fairly easily through the offending user's contributions page, but if your wiki only has 50 pages, this is clearly too high a setting.
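For illustration, the last example above would look roughly like this in LocalSettings.php, using the $wgRateLimits setting described below (this is only a sketch: the group key 'user' applies the limit to each registered user, and the numbers are illustrative, not recommendations):

  // Sketch: at most 100 edits per rolling hour for each registered user
  $wgRateLimits['edit']['user'] = [ 100, 3600 ];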
Throttling is configured with the $wgRateLimits setting. With that setting, you can throttle the following actions:
- edits
- page moves (with Special:MovePage)
- file uploads
- rollbacks
- new password requests ("E-mail new password" in Special:Userlogin)
- sending mails to other users with Special:EmailUser
- purging of pages
- purging of link tables
- rendering of files (thumbnails at standard and non-standard sizes)
- stashing edits into cache prior to saving
- adding or removing change tags
Also, you can define different levels of throttling depending on the user's status:
- for each registered user
- for each new user (i.e. users without the "autoconfirmed" right)
- for each anonymous user
- for all IPs from the same subnet
- for the sum of all anonymous users
Users with the noratelimit right are completely exempt from throttling.
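Putting these pieces together, a configuration in LocalSettings.php might look roughly like the following sketch. The action keys ('edit', 'move') and group keys ('anon', 'ip', 'subnet', 'newbie', 'user') correspond to the lists above; the numbers are illustrative only, and assigning noratelimit to administrators is just one possible choice:

  $wgRateLimits['edit'] = [
      'anon'   => [ 200, 600 ],  // all anonymous editors combined: 200 edits per 10 minutes
      'ip'     => [ 8, 60 ],     // each IP address (anonymous and newly registered users)
      'subnet' => [ 8, 60 ],     // like 'ip', but counted across a whole subnet
      'newbie' => [ 8, 60 ],     // each newly registered (not yet autoconfirmed) user
      'user'   => [ 90, 60 ],    // each registered user
  ];
  $wgRateLimits['move'] = [
      'newbie' => [ 2, 120 ],    // new users: 2 page moves per 2 minutes
      'user'   => [ 8, 60 ],
  ];
  // Exempt a trusted group (here: administrators) from all rate limits
  $wgGroupPermissions['sysop']['noratelimit'] = true;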
Advantages
- Limits damage from hostile or buggy bots
- Relatively invisible to human editors
Disadvantages
- Misconfigured installations could block perfectly good edits by humans
- Many users sharing the same IP address could collectively trigger throttling
- High temptation for malicious admins to misuse the feature
See also
- Extension:AbuseFilter - extension that allows customizable edit throttling to be set up