Extension talk:MetaMan

Detected bug in an extension! Hook MetaMan:removeDeletedPage failed to return a value; should return true to continue hook processing or false to abort.

Backtrace:

  #0 /home/content/e/g/y/egykalapalatt/html/wiki_new/mediawiki/includes/Article.php(2478): wfRunHooks('ArticleDeleteCo...', Array)
  #1 /home/content/e/g/y/egykalapalatt/html/wiki_new/mediawiki/includes/Article.php(2283): Article->doDelete('a lap tartalma:...', false)
  #2 /home/content/e/g/y/egykalapalatt/html/wiki_new/mediawiki/includes/Wiki.php(470): Article->delete()
  #3 /home/content/e/g/y/egykalapalatt/html/wiki_new/mediawiki/includes/Wiki.php(63): MediaWiki->performAction(Object(OutputPage), Object(WikilogItemPage), Object(Title), Object(User), Object(WebRequest))
  #4 /home/content/e/g/y/egykalapalatt/html/wiki_new/mediawiki/index.php(118): MediaWiki->initialize(Object(Title), Object(WikilogItemPage), Object(OutputPage), Object(User), Object(WebRequest))
  #5 {main}

213.134.24.86 01:42, 12 November 2009 (UTC)

2009-11-12: Fixed, thx for testing. There was a colon missing in metaman.php where the hook gets registered ("MetaMan::removeDeletedPage" is correct).
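For reference, a minimal sketch of what the registration and handler look like; the hook name is truncated in the backtrace above (presumably ArticleDeleteComplete), and the handler body below is hypothetical rather than the extension's actual code. The two points that matter are the double colon in the static callback name and the handler returning true.

// metaman.php -- registration sketch; the double colon marks a static method callback
$wgHooks['ArticleDeleteComplete'][] = 'MetaMan::removeDeletedPage';

// metaman.body.php -- hypothetical handler skeleton
class MetaMan {
    public static function removeDeletedPage( &$article, &$user, $reason, $id ) {
        // ... drop the deleted page's metadata here ...
        return true; // a missing return value triggers the "failed to return a value" error above
    }
}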

Hangs when retrieving categories and properties

Great extension, with the nicest approach to managing categories of all the similar ones I've come across. It hangs, though, because of a failed attempt to create a table; in Firebug's words:

Internal Error. A database error has occurred
Query: create table `ru_metaman_pagerev` (
id integer(10) unsigned not null,
rev integer(10) unsigned not null,
foreign key(id) references page(page_id) on delete cascade
) engine = InnoDB, default charset = binary;
Function:
Error: 1005 Can't create table 'xxx.ru_metaman_pagerev' (errno: 150) (xxx.mysql.xxx.com.ua)

The correct syntax for my MySQL server was:

create table `ru_metaman_pagerev`(
id integer(10) unsigned not null,
rev integer(10) unsigned not null,
foreign key(id) references ru_page(page_id) on delete cascade
) engine = InnoDB, default charset = binary
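
The errno 150 above comes from the FOREIGN KEY clause: with the table prefix "ru_" the referenced core table is ru_page, not page. Below is a sketch of how the statement could be built so that MediaWiki applies the prefix itself; only wfGetDB(), tableName() and query() are standard API calls here, the surrounding code is assumed and is not the extension's actual implementation.

// Hypothetical table-creation sketch: let MediaWiki prepend $wgDBprefix
// to both the new table and the referenced core table.
$dbw = wfGetDB( DB_MASTER );
$pagerev = $dbw->tableName( 'metaman_pagerev' ); // e.g. `ru_metaman_pagerev`
$page    = $dbw->tableName( 'page' );            // e.g. `ru_page`
$dbw->query(
    "CREATE TABLE IF NOT EXISTS $pagerev (
        id  INTEGER(10) UNSIGNED NOT NULL,
        rev INTEGER(10) UNSIGNED NOT NULL,
        FOREIGN KEY (id) REFERENCES $page(page_id) ON DELETE CASCADE
    ) ENGINE = InnoDB, DEFAULT CHARSET = binary",
    __METHOD__
);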

Nevertheless, it is still hanging. Firebug doesn't say anything specific; it just shows JSON.parse marked in red:

function printSuggestions(request) {
    var suggestions = JSON.parse(request.responseText);

Let me know if I can do anything to help you fix this. I like this extension and would love to have it on my site. --Erithion 11:26, 4 March 2011 (UTC)

2011-05-09: I'm going to have to take a closer look at this issue. For now: the extension needs to be able to create additional database tables. If that, for some reason, isn't possible with your MediaWiki, there is no similarity matrix and thus no metadata suggestions. Thank you for providing such detailed feedback, I really appreciate it!

Yes, it is possible. I have complete control over my MediaWiki database. However, it seems the problem is precisely in this matrix. I am debugging your extension in Zend Studio, but since I am not a web developer, I cannot yet say exactly why there was such a large allocation. I did find that it hangs in MetaMan::iteratePagePairs. And in the Firebug console, I discovered the following:
( ! ) Fatal error: Allowed memory size of 104857600 bytes exhausted (tried to allocate 89653261 bytes) in /home/raide/mediawiki-1.16.0/extensions/metaman/metaman.body.php on line 732
Call Stack
#   Time      Memory      Function                            Location
1   0.0533    86660       {main}( )                           ../index.php:0
2   0.1971    12949476    AjaxDispatcher->performAction( )    ../index.php:74
3   0.1972    12950188    call_user_func_array( ???, ??? )    ../AjaxDispatcher.php:103
4   0.1993    13192916    MetaMan::getSuggestions( ??? )      ../AjaxDispatcher.php:0
5   4.6700    13435748    MetaMan::iteratePagePairs( ??? )    ../metaman.body.php:883
6   71.5634   103343960   MetaMan::getTermVector( ??? )       ../metaman.body.php:809
Hope that makes sense to you. --Erithion 11:46, 6 June 2011 (UTC)

2011-06-06: That does indeed make some sense. MetaMan doesn't scale; it tries to process everything in memory. Depending on the size of your DB and the size of individual pages, memory can be exhausted. For now you could try raising the allowed memory size. I'm sorry the extension doesn't detect that issue by itself.
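
For anyone hitting the same limit: the quickest workaround is raising PHP's memory limit from LocalSettings.php; the 256M value below is only an example, use whatever your host allows. On large wikis this merely postpones the problem, since the pair-wise similarity computation still runs entirely in memory.

// LocalSettings.php -- raise PHP's memory limit for this wiki.
// 256M is an example value; adjust to what your host permits.
ini_set( 'memory_limit', '256M' );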