Topic on Project:Support desk

500/502 errors affecting certain pages on my wiki

6
Slgrandson (talkcontribs)

Morning, MW team. This month, certain pages in a custom namespace on my creative-venture wiki have run into rather bad luck with creation and/or access/loading.

As explained and detailed in this Miraheze Phabricator filing, a slew of titles in the custom "Entry" namespace--for my conlang dictionary--cannot be created through Page Forms for whatever reason. (The affected range is "S"/"T", but a few pages outside of it also fall into this trap; the form used is "Entry/RFM", imported/expanded from the original Referata version. ["RFM" stands for Tovasala, the conlang so covered.] Everything worked fine up till said form's [re-]introduction at the start of March, and that may be causing the problem--the same reason why I tagged this case as "Unbreak Now!" on Phabricator.)

This contributor uses Chrome on a Galaxy Tab A to edit Miraheze. On and off, on affected links, Chrome outputs this browser-specific message:

This page isn't working

constantnoble.miraheze.org is currently unable to handle this request.

HTTP ERROR 500

And sometimes, Miraheze itself displays its "502 Bad Gateway" message.

Both of them, inevitably, force me to hit "Refresh" several times or more before the affected page appears.

On a related note: The Form namespace isn't configured to recognise subpages at the moment, at least on my wiki, and I don't want anything to break through ManageWiki or otherwise. See this November 2020 discussion with developer/Referata founder Yaron Koren (talk · contribs), as well as this local namespace API overview.

TL;DR: A 500/502 problem that affects only a certain subset of titles in the Entry namespace (viz. those starting with "S"/"T") and no other pages. The extensions involved are DPL3 (Universal Omega (talk · contribs)) and Page Forms (Koren).

What I designate as "Wave 1" of the Referata-to-Miraheze imports can't finish soon enough on account of that bottleneck....

P.S. Sending @Yaron Koren: Yeah, you know the deal.

(MW 1.37.1 [8cea63d] / PHP 7.4.28 [fpm-fcgi] / 10.5.15-MariaDB-1:10.5.15+maria~bullseye-log)

--Slgrandson (talk) 14:00, 26 March 2022 (UTC)

Universal Omega (talkcontribs)

DPL3 shouldn't be involved in this anymore. My recent changes to it seem to have successfully mitigated it causing issues like this. It does seem, though, that Page Forms could be at fault.

Slgrandson (talkcontribs)

And to @Bawolff: From Universal Omega (talk · contribs) himself at MH Phab, minutes ago (and apologies if it's nighttime EDT where I am):

[410434a990e99c6d7d7f7a99] /w/index.php?title=Entry:slenge&action=submit   PHP Fatal Error from line 341 of /srv/mediawiki/w/includes/libs/objectcache/MemcachedPeclBagOStuff.php: Allowed memory size of 134217728 bytes exhausted (tried to allocate 20480 bytes)

...which may bring us one step closer to diagnosis (if that comes to pass). --Slgrandson (talk) 02:32, 13 April 2022 (UTC)

Bawolff (talkcontribs)

Pretty generic OOM error. Note that PHP OOM errors are unreliable when it comes to the line number they report, so there is not necessarily any reason to think this has anything to do with MemcachedPeclBagOStuff.php.

It's still possible that this is triggered by an extension that has some sort of memory leak, or that just processes a huge amount of data. You might want to try to generate a minimal test case.

128 MB, while definitely a reasonable limit, is on the low side (for comparison, I think WMF uses 666 MB). Perhaps it should just be increased. However, whether that is an appropriate solution depends on the hosting situation, so it would be better answered by Miraheze staff (I want to be very clear that it would be 100% reasonable for Miraheze staff to say that, due to the way their hosting is set up, 128 MB is where the memory limit has to stay).
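For reference, if staff did decide to raise it, MediaWiki exposes this as a configuration setting rather than requiring a php.ini change. A minimal sketch for LocalSettings.php (the 256M value is an arbitrary example, not a recommendation):

```php
<?php
// LocalSettings.php — hypothetical example only; whether raising the
// limit is appropriate depends entirely on the hosting environment.
// $wgMemoryLimit is the PHP memory limit MediaWiki applies to web
// requests (it raises PHP's memory_limit ini setting at startup).
$wgMemoryLimit = "256M";
```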

Slgrandson (talkcontribs)

From Universal Omega (talk · contribs) at MH Phab, last night:

I did some digging into this. The main issue seems to be the mentioned OOM (which I found is caused by RegexFunctions), with the Loops extension seemingly playing a part as well, though more rarely. I decided to remove all overrides for the Loops extension, so the max limit must remain at 100, since anything higher was causing issues for us.

It also seems that ParserFunctions, given how heavily it is used, is playing a big part here, causing OOMs and significantly slowing down the jobs.

Another issue (unrelated; this one is not causing OOMs, by itself anyway) is DPL3. Once I tested with ParserFunctions disabled, I sometimes got an error from DPL3:

2022-04-14 01:28:06 refreshLinks Template:Definition/Arrays pages={"1647":[3034,"dr\u00e8ve"]} rootJobSignature=28e5340e1f5eb5a76c907900dd8ec37c53011775 rootJobTimestamp=20220413202831 triggeredRecursive=1 causeAction=edit-page causeAgent=Routhwick namespace=10 title=Definition/Arrays requestId=da452712a4d937b5d2641c4f (uuid=f759c456ba214c50adf311d4833c64e7,timestamp=1649892120) t=553 error=Wikimedia\Rdbms\DBQueryError: Error 1146: Table 'constantnoblewiki.dpl_clview' doesn't exist (db101)
Function: MediaWiki\Extension\DynamicPageList3\Query::buildAndSelect - Entry:drève
Query: SELECT DISTINCT `page`.page_namespace AS `page_namespace`,`page`.page_id AS `page_id`,`page`.page_title AS `page_title`  FROM `page` INNER JOIN `dpl_clview` `cl1` ON ((`page`.page_id = cl1.cl_from AND (cl1.cl_to = 'rfm=Tovasala' OR cl1.cl_to = 'en' OR cl1.cl_to = '' OR cl1.cl_to = 'rfm=CH')))   WHERE `page`.page_is_redirect = 0 AND `page`.page_namespace = 3034  LIMIT 500
This one occurs because when DPL3 is installed through ManageWiki, the dpl_clview VIEW is never created.
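The failing query above treats dpl_clview as a categorylinks-like view in which uncategorized pages still appear, with an empty cl_to (note the cl1.cl_to = '' condition). A sketch of the view that query implies, inferred from the query itself; the authoritative DDL ships with the DPL3 extension, so treat this as an illustration, not the exact statement:

```sql
-- Hypothetical reconstruction of dpl_clview: every page joined to its
-- category links, with uncategorized pages kept and given cl_to = ''.
CREATE VIEW dpl_clview AS
SELECT IFNULL(cl_from, page_id)       AS cl_from,
       IFNULL(cl_to, '')              AS cl_to,
       IFNULL(cl_sortkey, page_title) AS cl_sortkey
FROM page
LEFT OUTER JOIN categorylinks ON page_id = cl_from;
```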

And on top of that, "RegexFunctions has been disabled as it's causing OOMs." This effectively means my dictionary entries have once again been gutted, design-wise, for the time being. Once everything returns to normal, I'll remind you.

In the meantime, I'm bringing in RegexFunctions developer @Skizzerz: perhaps he can look into the issue further than we can. --Slgrandson (talk) 16:32, 14 April 2022 (UTC)

Skizzerz (talkcontribs)

RegexFunctions will not block you from using a terrible regex that causes all sorts of backtracking and uses up a ton of resources. Either optimize your regexes or move to a solution like Scribunto (and Lua's pattern matching, which is a lot lighter-weight than regex). If you want to go the former route, there is plenty of information online on how to avoid regex patterns that cause excessive backtracking.
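To illustrate (in Python, since the wiki's actual regexes aren't shown in this thread): the classic trap is a nested quantifier, which lets the engine try exponentially many ways to partition the input once the overall match fails. Flattening the pattern removes the blow-up while matching exactly the same strings:

```python
import re

# Pathological shape: in (a+)+ the engine can split a run of 'a's into
# groups in exponentially many ways, and it tries them all when the
# trailing 'b' fails to match. Do NOT run this against a long
# non-matching input such as "a" * 40 + "c".
slow = re.compile(r"^(a+)+b$")

# Backtracking-safe rewrite: the nesting adds nothing, so a single flat
# quantifier accepts exactly the same language and fails instantly.
fast = re.compile(r"^a+b$")

print(fast.fullmatch("aaab") is not None)          # True
print(fast.fullmatch("a" * 30 + "c") is not None)  # False, returns instantly
```

The same principle applies in any backtracking engine, including the PCRE that PHP (and hence RegexFunctions) uses.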

Reply to "500/502 errors affecting certain pages on my wiki"