Extension talk:External Data/Archive 2019 to 2020

From mediawiki.org
Latest comment: 3 years ago by Alex Mashin in topic Raw data


:::Oh, sorry, that's right - with "text" you need the "regex" parameter. There may be a way to do a hack and set "regex" so that it holds the entire text - I tried that for a little bit, but couldn't get it working. (Actually, I'm not sure "regex" is working at all right now.) But I guess there's no standard way for External Data to simply get the entire contents of a URL and set a variable to that. [[User:Yaron Koren|Yaron Koren]] ([[User talk:Yaron Koren|talk]]) 15:25, 8 November 2020 (UTC)
::::This will work:
::::<syntaxhighlight lang="lua">mw.ext.externaldata.getWebData {
    url = uriToJson
    , format = "text"
    , data = { json = 'text' }
}.json</syntaxhighlight>
::::The key part is <syntaxhighlight lang="lua" inline>data = { json = 'text' }</syntaxhighlight>. <code>text</code> is a pre-defined external variable for the format <code>text</code>. [[User:Alex Mashin|Alex Mashin]] ([[User talk:Alex Mashin|talk]]) 06:36, 15 August 2021 (UTC)


== Content from csv-file with #get_web_data does not render on a wikipage ==
|data=SomeColumn=SomeColumn}}
{| class="wikitable"
! SomeColumn{{#for_external_table:<nowiki/>
{{!}}-
! {{{SomeColumn}}}
|data=SomeColumn=SomeColumn}}
{| class="wikitable"
! SomeColumn{{#for_external_table:<nowiki/>
{{!}}-
! {{{SomeColumn}}}

Revision as of 06:36, 15 August 2021

Fetch data from external RDF-file

As I get it, this is not supported by this extension as of now (Apr. 2020). Would it make sense to add it, or is there some other way to do that?

Scenario

We have a list of Open Hardware projects, each hosted in its own git repository. Each such repository has a meta-data.ttl (RDF/Turtle) file in it. In our MediaWiki page, we would like to have a table showing this meta-data, on the wiki page for the respective project.

fetchURL error after upgrade

Hello... I recently upgraded MediaWiki to 1.33 and can no longer use #get_web_data. I get errors about a string being passed to Http::get() instead of an array. Even the sample code/urls from the External Data wiki give the same error.

[XUNJvirhcjlnb-i73WIgxAAAA@g] /wiki/index.php?title=X&action=submit TypeError from line 98 of /myurl.com/wiki/includes/http/Http.php: Argument 2 passed to Http::get() must be of the type array, string given, called in /myurl/wiki/extensions/ExternalData/includes/ED_Utils.php on line 873

Running
MediaWiki 1.33.0
PHP 7.1.14 (cgi-fcgi)
MySQL 5.6.41-84.1
ICU 4.2.1

Any chance someone has seen this before?

Are you using the very latest External Data code? This bug may have just been fixed a few days ago. Yaron Koren (talk) 02:25, 2 August 2019 (UTC)

Yeah. I just updated now to be sure, and am still seeing the same error. I did do a manual update from an older version of MW. Made the necessary updates to the syntax in my LocalSettings file. Perhaps I have missed something?

Are you sure you're using the latest ED code? ED_Utils.php doesn't call Http::get() any longer - it now calls HttpWithHeaders::get(). Yaron Koren (talk) 15:09, 2 August 2019 (UTC)
According to Yaron Koren's comments you need to switch that extension to the master branch. This solves the problem in my case: "MW 1.33/REL1_33" with "External Data master" Spas.Z.Spasov (talk) 21:24, 21 November 2019 (UTC).

Populating Wiki Page from CSV spreadsheet

Hi there! So, I run a website with a wiki. … I'm interested in having wiki pages contain an entry populated from a spreadsheet I will be hosting on my own website (Encyclopedia_Greyhawkania_DATA ONLY.csv).

As an example, I'd like it to look similar to this example page, where I've manually hand-written entries from this index-spreadsheet (without calling it from the spreadsheet). It's the bit at the bottom under "Encyclopedia Greyhawkania".


I'd like to call data from the spreadsheet using the {{PAGENAMEE}} as part of the call parameter. Ideally what it'd do is use the {{PAGENAMEE}} to search ColumnA, for example, then return any row (with cells A through D) that has the {{PAGENAMEE}} in it, and have it in a table or a list of some kind.

Is there any chance I can get some help with how to call the data?

Something like this might work, if I understand the question correctly:
{{#get_web_data:url=URL goes here
 |format=csv with header
 |data=A=ColumnA,B=ColumnB,C=ColumnC,D=ColumnD
 |filters=ColumnA={{PAGENAMEE}}
 }}
 {| class="wikitable"
 ! A
 ! B
 ! C
 ! D {{#for_external_table:
 {{!}}-
 {{!}} {{{A}}}
 {{!}} {{{B}}}
 {{!}} {{{C}}}
 {{!}} {{{D}}}
 }}
 |}
Yaron Koren (talk) 17:00, 24 July 2019 (UTC)
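The "filters=ColumnA={{PAGENAMEE}}" part does an exact-match selection on one column of the fetched CSV. A rough Python sketch of that selection logic, just to illustrate the semantics (the column names and data here are invented):

```python
import csv
import io

def filter_rows(csv_text, column, value):
    """Return only the rows of a CSV (with header) whose given column equals value."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row[column] == value]

data = "ColumnA,ColumnB\nSulm,History\nVeluna,Geography\nSulm,Magic\n"
matches = filter_rows(data, "ColumnA", "Sulm")
```

Each matching row still carries all of its columns, like the A through D variables in the wikitext above.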
Thank you so much Yaron Koren!!!
I used a variation of that, replacing the "! A" with just the direct title of the header (e.g. "Topic"), instead of using the parameter.
I got it to generate the table, but, with no rows below the headers. I wasn't able to make it work with using {{PAGENAMEE}}. I even tried using a different parameter (e.g. "Sulm"). It simply wasn't populating the table, but, the Extension troubleshooting section says that might've been because of the "$wgTimeOut" thingie ... so I updated that in my LocalSettings.php, and sure enough, even with up to 300 seconds, it's giving a "500 error".
Here's what I'm using, on the off chance that you might be able to help refine it more.
  {{#get_web_data:url=https://greyhawkonline.com/docs/Encyclopedia_Greyhawkania_DATAONLY.csv
   |format=CSV with header
   |data=A=Topic,B=Type,C=Product,D=Page/Card/Image
   |filters=Topic=Sulm
   }}
   {| class="wikitable"
   ! Topic
   ! Type
   ! Product
   ! Page/Card/Image {{#for_external_table:
   {{!}}- 
   {{!}} {{{A}}}
   {{!}} {{{B}}}
   {{!}} {{{C}}}
   {{!}} {{{D}}}
   }}
   |}
I don't know if you're able to help troubleshooting it, but, here's the address for the page I'm using as an example.
-IcarusATB (talk) 16:20, 25 July 2019 (UTC)
That's strange - I just tried out that exact wikitext on one of my wikis, and it worked out fine, displaying five well-formatted rows (plus the header row). Is it still not working for you? Yaron Koren (talk) 18:44, 26 July 2019 (UTC)
First, why would the behaviour of the code change on my end? I haven't changed the code further, therefore, it's still getting the same result. That's really not "strange", at all. What it indicates, to me, is that if you're writing, testing, and checking it on your server and it works fine, there's likely something on your server that isn't on mine. Probably an extension, addon, plugin, whatever, that I don't know to install, or something else, other than the code, that's making it behave differently.
IcarusATB (talk) 01:49, 27 July 2019 (UTC)
I thought maybe the CSV data was temporarily down or something for you, but working when I tried it. I still think it's strange. Yes, we have different setups, but I can't think of any reason why we'd be seeing these different results, assuming you're using the latest External Data code. Yaron Koren (talk) 18:04, 29 July 2019 (UTC)

Get template data from another wiki

Hello, Can I use this extension to do a semantic query from another site?

If I have a page "Foo" on SiteA that contains a template "Foo" as:

{{Foo|Bar=123}} 

and I want to create a page on SiteB called "Foo Status" that contains a template "Foo" as:

{{Foo|Bar= {{#get_web_data:url=SiteA |format={???}}} |}}

is this possible?

Yes - if the source wiki stores its data via either SMW or Cargo, both of those extensions provide an API of sorts to let you get query results in either CSV or JSON formats, either of which is parseable by External Data. Yaron Koren (talk) 18:46, 17 April 2019 (UTC)
Thanks, Yaron. Any chance you could point me to an SMW example of this? :-)
I can't think of one. Yaron Koren (talk) 02:18, 18 April 2019 (UTC)

prerequisites for getting LDAP data

Hi.

I'm trying to use the #get_ldap_data: function of "External Data" to get LDAP attributes about my users and I'm getting the following error:

Fatal error: Call to undefined function ldap_connect() in /opt/htdocs/mediawiki/extensions/ExternalData/ED_Utils.php on line 136

A quick "grep -R ldap_connect" in the "extensions/External Data" folder shows only a use-call to ldap_connect and nothing anywhere actually defining it.

A quick search online of "ldap_connect" seems to indicate that it is defined in the PHP module "PHP-LDAP"

A quick inspection of my phpinfo() page shows that the php-ldap module is not loaded.

Before I install the php-ldap module on my system, can someone confirm that php-ldap is indeed a pre-requisite for the #get_ldap_data: function of "External Data" to work.

[Solved] Poor man's Sync from WikiA to WikiB

Hi.

Can I use this extension as a real-time clone of a page in one wiki to another?

For example, Mediawiki Site A has a page called "Foo" with arbitrary text.

Can I create a page called "Foo" on another wiki that contains something like:

{{#get_web_data:url=https://www.mysite.com/A/Special:GetData/Foo|format=raw|data=Foo=Foo}}
{{#external_value:Foo}}

and functionally get a copy of the page Foo on Site A?

Well, I'm pretty sure you can get the right wikitext - after all, a single piece of text is valid CSV. Whether it'll display correctly is a different story - template calls won't work, for instance, unless you have a local copy of those templates. It's probably easier to just do an iframe, using one of the iframe-supporting extensions, with "action=render" on the source URL to leave out the skin. Yaron Koren (talk) 02:25, 18 April 2019 (UTC)
The ideal way is also the poor man's way: set up a PyWikiBot (it's free) and use the [https://www.mediawiki.org/wiki/Manual:Pywikibot/transferbot.py transferbot.py] script. - Revansx (talk) 17:10, 13 May 2020 (UTC)
That would actually move over the contents, which is different. Having two different copies of the content, both of which can be edited, is not necessarily ideal. Yaron Koren (talk) 17:31, 13 May 2020 (UTC)
Well, if a cron job was used to run the PWB script such that WikiB was always "synch'ed" with what WikiA had.. then it would meet the need of this topic as it was originally expressed when I wrote it. - Revansx (talk) 23:45, 13 May 2020 (UTC)

Nested SQL functions throw a rdbms error

Per an email with Yaron, "There's no special handling of replace() or any other commands."
This would lead me to think the below syntax should work... but it does not.
Does anyone have any suggestions or corrections to the code I'm trying to get working? Much appreciated if you do.

Unable to perform nested SQL REPLACE functions

I am attempting to query data from a local database, but need to replace 3 different characters with alternatives that won't trigger mediawiki parsing of the result. No matter the format I have tried, I receive the error pictured to the right when trying to display the page.

This snippet works, but only replaces one of the three characters necessary:

|data=zoneNAME=replace(zone_settings.name,'_',' ')

I need the below SQL code to work, but it creates a parse error whenever run inside a mediawiki page:

|data=zoneNAME=replace(replace(replace(zone_settings.name,'_',' '), '[','('), ']',')')

or with html codes in place of characters:

|data=zoneNAME=replace(replace(replace(zone_settings.name,'_',' '), &#91;,&#40;), &#93;,&#41;)

and in any combination of the character/html code I could think of. I even tried variations using the mediawiki replace command like this:

|data=zoneNAME={{#replace:{{#replace:(replace(zone_settings.name,'_',' ')|&#91;|&#40;}}|&#93;|&#41;}}

and

|data=zoneNAME={{#replace:{{#replace:(replace(zone_settings.name,'_',' ')|<nowiki>[</nowiki>|<nowiki>(</nowiki>}}|<nowiki>]</nowiki>|<nowiki>)</nowiki>}}

Same error, different SQL commands

In a very similar usage scenario, I would like to capitalize the first letter of the string returned, and make sure the rest of the letters are lowercase. Unfortunately, I must also replace all '_' with a space, as in my above issue. Each of these examples works separately, but once they are nested together, I get the same error as in my image above:

|data=itemNAME=REPLACE(item_basic.name,'_',' ')
|data=itemNAME=UCASE(item_basic.name)
|data=itemNAME=LCASE(item_basic.name)

Once nested, in any combination, the same error as above is shown:

|data=itemNAME=REPLACE(UCASE(item_basic.name),'_',' ')

Eventually, this is the SQL I would like run, but if it's too much for the extension I will work on finding a different solution:

CONCAT(UCASE(LEFT(item_basic.name, 1)), LCASE(SUBSTRING(item_basic.name, 2)))

or

UPPER(LEFT(mob_spawn_points.polutils_name,1))+LOWER(SUBSTRING(mob_spawn_points.polutils_name,2,LEN(mob_spawn_points.polutils_name)))

Thanks for any help provided!
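For what it's worth, the transformation being attempted here (underscores to spaces, square brackets to parentheses, then first letter upper-cased and the rest lower-cased) is easy to express outside SQL, for example in a small wrapper script. A Python sketch of the same logic, using an invented zone name:

```python
def display_name(raw):
    """Underscores become spaces, square brackets become parentheses,
    then the first letter is upper-cased and the rest lower-cased."""
    cleaned = raw.replace("_", " ").replace("[", "(").replace("]", ")")
    return cleaned[:1].upper() + cleaned[1:].lower()

pretty = display_name("SOME_ZONE_[EAST]")
```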

Get nested JSON data

If an API returns a non-flat JSON structure, it seems like it's not possible to access the deeper data. Or is it? Sophivorus (talk) 03:27, 28 June 2019 (UTC)
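One workaround, if you control a proxy script between the API and the wiki, is to flatten the nested structure into single-level keys before External Data sees it. A Python sketch of such flattening (the input shape below is invented for illustration):

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts/lists into one level, joining key parts with dots."""
    if isinstance(obj, dict):
        items = list(obj.items())
    elif isinstance(obj, list):
        items = [(str(i), v) for i, v in enumerate(obj)]
    else:
        return {prefix: obj}
    flat = {}
    for key, value in items:
        full = prefix + "." + key if prefix else key
        flat.update(flatten(value, full))
    return flat

flat = flatten({"item": {"name": "sword", "stats": [10, 20]}})
```

The flat keys ("item.name", "item.stats.0", ...) can then be mapped to wiki variables one by one.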

Add http headers to get_web_data

We need to get data from an API with an authentication header, produced dynamically. I have read this answer, which is not helping - this is not a SOAP API.

To solve it, I inserted a hook mechanism into the get_web_data logic, so I can alter data before the request. But now I have another problem - the Http class has no way to add headers to the request. I needed to extend it just for this need. (And to also redeclare its get() and post() methods, because the original uses Http::request instead of self::request.)

So the full solution patch is here in gerrit - it allows hooking the call ($url and $options passed by ref) and inserting additional headers via $options['headers']. I think that the HttpWithHeaders class is overkill - this feature would be better in the core Http class - but that's a start.
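In generic terms, the feature being patched in is just the ability to attach arbitrary headers before the request goes out. A Python sketch of the same idea with urllib (the token and URL are placeholders):

```python
import urllib.request

def build_authed_request(url, token):
    """Build a GET request carrying a dynamically produced auth header."""
    return urllib.request.Request(
        url,
        headers={"Authorization": "Bearer " + token},
    )

req = build_authed_request("https://api.example.com/data", "s3cret")
# urllib.request.urlopen(req) would send the header along with the request.
```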

Getting file data from UNC path?

Hello, this is working great with local files, but I would like to use a json file stored on a network drive. I have tried using a mapped drive (which I did not expect to work anyway), but I had hoped I could access it via \\domain\sharedfolder\file.json defined in LocalSettings. I tried with both forward and backward slashes, but always get Error: No file found. I hope I am just doing something syntactically wrong. I appreciate the help!

I don't know - the key is that the server can access that path, not just your computer. Could that be the issue? Yaron Koren (talk) 19:22, 6 August 2019 (UTC)

Yes, it is accessible from the server. Is there a particular syntax to use for UNC paths? Thank you!

I don't know - External Data is using the PHP file_get_contents() function, so based on what it says here, the path you entered should work. Yaron Koren (talk) 17:28, 7 August 2019 (UTC)

Stale cache

If {{#get_web_data:}} manages to fetch data, and then, when the cache expires, the data source is gone, will the function return the stale data from the cache?

What about "Stored Procedures" in SQL systems?

Hello. There is an SQL application on my network that I'm trying to get data from using "Extension:External Data", and the SQL app owner has told me that they can make a "Stored Procedure" available to me rather than grant me "read" access to the SQL database itself. Can someone shed some light on how this could be done using this extension? What should I ask for? Are SQL "Stored Procedures" the SQL implementation of SOAP? Can someone please offer some insight on how to do this, if it's possible at all. Thanks!

You can do it - you just need to create a "mini-API", i.e. a script in some language like PHP or anything else, that calls that stored procedure, possibly passing in to it some values from the query string, and then output the results on the screen in CSV or JSON. External Data can then access that API via #get_web_data, and make use of the results. Yaron Koren (talk) 13:23, 6 September 2019 (UTC)
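The "mini-API" described above only has to run the stored procedure and print the result set in a format External Data can parse. A Python sketch of the output half (the header and rows below are stand-ins for whatever the stored-procedure call would return):

```python
import csv
import io

def rows_to_csv(header, rows):
    """Serialize a result set as CSV text suitable for format=csv with header."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

# Stand-in for a database call such as cursor.callproc(...) followed by fetchall():
csv_text = rows_to_csv(["title", "titleCount"], [["Clerk", 4], ["Manager", 1]])
```

A script that prints this text over HTTP can then be queried with #get_web_data like any other URL.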
Thanks, Yaron. I'll pursue this approach.

Use to populate a template in a for loop?

First off, amazing work on this plugin. I really like it! However, would it be at all possible to use #for_external_table or similar to populate a template? I.e. something like:

{{#for_external_table:
{{Template
| field 1 = {{{fieldvalue}}}
| field 2 = {{{fieldvalue2}}}
}}
}}

That would totally make my day. I run an RPG system where it's handy to have information in a database for the various apps that access it (rather than scraping from the wiki) and this would certainly save on queries per page load for those pages that list skills or objects.

I'm glad you like it! Yes, you can do that using #display_external_table. Yaron Koren (talk) 00:34, 22 September 2019 (UTC)

Argh how did I miss that XD Thank you!

TypeError

Just to let you know:

  • MediaWiki: 1.33.0
  • PHP: 7.3.10
  • External Data: 1.9.1
TypeError from line 98 of /w/includes/http/Http.php: Argument 2 passed to Http::get() must be of the type array, string given, called in /w/extensions/ExternalData/includes/ED_Utils.php on line 873

Backtrace:

#0 /w/extensions/ExternalData/includes/ED_Utils.php(873): Http::get(string, string, array)
#1 /w/extensions/ExternalData/includes/ED_Utils.php(976): EDUtils::fetchURL(string, string, integer)
#2 /w/extensions/ExternalData/includes/ED_ParserFunctions.php(133): EDUtils::getDataFromURL(string, string, array, string, integer, integer)
#3 /w/includes/parser/Parser.php(3528): EDParserFunctions::doGetWebData(Parser, string, string, string)
#4 /w/includes/parser/Parser.php(3235): Parser->callParserFunction(PPTemplateFrame_DOM, string, array)
#5 /w/includes/parser/Preprocessor_DOM.php(1285): Parser->braceSubstitution(array, PPTemplateFrame_DOM)
#6 /w/includes/parser/Parser.php(3409): PPFrame_DOM->expand(DOMElement)
#7 /w/includes/parser/Preprocessor_DOM.php(1285): Parser->braceSubstitution(array, PPFrame_DOM)
#8 /w/includes/parser/Parser.php(3049): PPFrame_DOM->expand(DOMElement, integer)
#9 /w/includes/parser/Parser.php(1359): Parser->replaceVariables(string)
#10 /w/includes/parser/Parser.php(491): Parser->internalParse(string)
#11 /w/extensions/PageForms/specials/PF_RunQuery.php(98): Parser->parse(string, Title, ParserOptions, boolean, boolean)
#12 /w/extensions/PageForms/specials/PF_RunQuery.php(26): PFRunQuery->printPage(string, boolean)
#13 /w/includes/specialpage/SpecialPage.php(569): PFRunQuery->execute(string)
#14 /w/includes/specialpage/SpecialPageFactory.php(558): SpecialPage->run(string)
#15 /w/includes/MediaWiki.php(288): MediaWiki\Special\SpecialPageFactory->executePath(Title, RequestContext)
#16 /w/includes/MediaWiki.php(865): MediaWiki->performRequest()
#17 /w/includes/MediaWiki.php(515): MediaWiki->main()
#18 /w/index.php(42): MediaWiki->run()
#19 {main}

Jaider msg 14:48, 5 October 2019 (UTC)

Sorry, External Data is well overdue for a new version. This problem may have been fixed already - could you try running the latest External Data code to see if the problem is still there? Yaron Koren (talk) 17:16, 6 October 2019 (UTC)
Yes, I have just checked out master now and I confirm it is fixed. Thanks. Jaider msg 17:35, 6 October 2019 (UTC)
Great. And I'll try to release a new version soon. Yaron Koren (talk) 01:21, 7 October 2019 (UTC)

MW Version 1.33.1 Upgrade

Hi,

I too have just attempted to upgrade from 1.27.5 to 1.33.1 of Mediawiki. I rely heavily on ExternalData (Using development Master after 1.33.1, have tried REL_1.33 too) to populate my pages and would like to get my site back up and running.

As an example, when I use a page that queries MySQL (Version 5.6.22), running under Apache (Version 2.4) and PHP (Version 7.2.21), I get overlapping trace backs on the page. Below is the listing from the Apache log file.

I am not sure how to provide the relevant information to debug.

Any suggestions?

Thanks,

Gregg


more MW.log
[218636] PHP Warning: A non-numeric value encountered in /.../mediawiki/1.33.1/includes/libs/rdbms/database/Database.php on line 304, referer: .../mediawiki/1.33.1/index.php?title=Special:UserLogin&returnto=Special%3ARunQuery%2FSystem+Table&returntoquery=pfRunQueryFormName%3DSystem%2BTable%26System_Table%255BApproach%255D%3DStandard%26System_Table%255BPlayer%255D%3DAnyone%26System_Table%255BPartner%255D%3DAnyone%26wpRunQuery%3DDisplay%2BSystems%26pf_free_text%3D
[239131] PHP Stack trace:, referer: .../mediawiki/1.33.1/index.php?title=Special:UserLogin&returnto=Special%3ARunQuery%2FSystem+Table&returntoquery=pfRunQueryFormName%3DSystem%2BTable%26System_Table%255BApproach%255D%3DStandard%26System_Table%255BPlayer%255D%3DAnyone%26System_Table%255BPartner%255D%3DAnyone%26wpRunQuery%3DDisplay%2BSystems%26pf_free_text%3D
[239167] PHP 1. {main}() /.../mediawiki/1.33.1/index.php:0, referer: .../mediawiki/1.33.1/index.php?title=Special:UserLogin&returnto=Special%3ARunQuery%2FSystem+Table&returntoquery=pfRunQueryFormName%3DSystem%2BTable%26System_Table%255BApproach%255D%3DStandard%26System_Table%255BPlayer%255D%3DAnyone%26System_Table%255BPartner%255D%3DAnyone%26wpRunQuery%3DDisplay%2BSystems%26pf_free_text%3D
[239186] PHP 2. MediaWiki->run() /.../mediawiki/1.33.1/index.php:42, referer: .../mediawiki/1.33.1/index.php?title=Special:UserLogin&returnto=Special%3ARunQuery%2FSystem+Table&returntoquery=pfRunQueryFormName%3DSystem%2BTable%26System_Table%255BApproach%255D%3DStandard%26System_Table%255BPlayer%255D%3DAnyone%26System_Table%255BPartner%255D%3DAnyone%26wpRunQuery%3DDisplay%2BSystems%26pf_free_text%3D

[removed the rest]

I think that's just a single error message ("A non-numeric value encountered"). What code do you have on on line 304 of /includes/libs/rdbms/database/Database.php? Yaron Koren (talk) 14:27, 14 October 2019 (UTC)

Hi Yaron, Here is what I see from the standard MW 1.33.1 installation: 304: if ( $this->flags & self::DBO_DEFAULT )

Here is a bit longer code snippet:

      /**
        * @note exceptions for missing libraries/drivers should be thrown in initConnection()
        * @param array $params Parameters passed from Database::factory()
        */
       protected function __construct( array $params ) {
               foreach ( [ 'host', 'user', 'password', 'dbname', 'schema', 'tablePrefix' ] as $name ) {
                       $this->connectionParams[$name] = $params[$name];
               }
               $this->cliMode = $params['cliMode'];
               // Agent name is added to SQL queries in a comment, so make sure it can't break out
               $this->agent = str_replace( '/', '-', $params['agent'] );
               $this->flags = $params['flags'];

304: if ( $this->flags & self::DBO_DEFAULT ) {

                       if ( $this->cliMode ) {
                               $this->flags &= ~self::DBO_TRX;
                       } else {
                               $this->flags |= self::DBO_TRX;
                       }
               }
               // Disregard deprecated DBO_IGNORE flag (T189999)
               $this->flags &= ~self::DBO_IGNORE;
               $this->sessionVars = $params['variables'];
               $this->srvCache = $params['srvCache'] ?? new HashBagOStuff();
               $this->profiler = is_callable( $params['profiler'] ) ? $params['profiler'] : null;
               $this->trxProfiler = $params['trxProfiler'];
               $this->connLogger = $params['connLogger'];
               $this->queryLogger = $params['queryLogger'];
               $this->errorLogger = $params['errorLogger'];
               $this->deprecationLogger = $params['deprecationLogger'];
               if ( isset( $params['nonNativeInsertSelectBatchSize'] ) ) {
                       $this->nonNativeInsertSelectBatchSize = $params['nonNativeInsertSelectBatchSize'];
               }
               // Set initial dummy domain until open() sets the final DB/prefix
               $this->currentDomain = new DatabaseDomain(
                        $params['dbname'] != '' ? $params['dbname'] : null,
                        $params['schema'] != '' ? $params['schema'] : null,
                       $params['tablePrefix']
               );
       }

Hope that helps and let me know if you need anything else.

Thanks again

Gregg

Okay, thanks. Are you modifying the value of $edgDBFlags in your LocalSettings.php file? Yaron Koren (talk) 16:47, 15 October 2019 (UTC)

-- Aha. Yes I am

$edgDBFlags['bdb'] = "DBO_DEFAULT";

Should I remove this statement?

Gregg

--

Hi Yaron,

Removing this statement seems to fix this problem!

I have different problems now, but at first glance, I am back in business for the most part.

Thanks,

Gregg

Great! That particular statement looks unnecessary (DBO_DEFAULT is already the default), though if you do want to have it, I think you should remove the quotes around DBO_DEFAULT. Yaron Koren (talk) 19:31, 15 October 2019 (UTC)

TAB delimited CSV files

I have a TAB delimited text data file on my server that I have configured the "External Data" extension to be able to read.

I have proven that the |delimiter= argument works with |format=csv by replacing all my tab literals with the @ char such that |format=csv|delimiter=@ actually works well.

My question is: how can I express a TAB as a delimiter? as that is the way the data file is being generated.

Thank you! /Rich

That's a good question, and I'm surprised that this never came up before, given that TSV (tab-separated values) is a somewhat popular data format. There was no way to handle this, as far as I know - I just checked in a way to do this, so now, if you have "delimiter=\t", it will handle tabs, because the "\t" will be interpreted as an actual tab. Yaron Koren (talk) 03:04, 29 October 2019 (UTC)
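The substitution Yaron describes amounts to turning the literal two-character sequence backslash-t that the user types into a real tab character before the file is split (the PHP version uses str_replace). A Python sketch of that step:

```python
def parse_delimiter(raw):
    """Turn a typed backslash-t into a real tab; other delimiters pass through."""
    return raw.replace("\\t", "\t")

# A tab-separated line split with the translated delimiter:
fields = "alpha\tbeta\tgamma".split(parse_delimiter("\\t"))
```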
Thanks, Yaron! You rock. --- Rich
Unfortunately, after manually making the changes identified in your commit [1] to my v1.8.3 version of ED, it was not successful :/ ... The affected parts of "ED_ParserFunctions.php" were identical to your commit. Any idea why this isn't working for me?
line 103 as $delimiter = str_replace( 'T', "\t", $args['delimiter'] ); works better for me. just fyi

[1] https://github.com/wikimedia/mediawiki-extensions-ExternalData/commit/33eed80a2100c4922387da91033e799c144f4618

Sorry about that - that always seems to happen when I don't test my changes! I checked in what I think is a fix for this. Yaron Koren (talk) 18:35, 29 October 2019 (UTC)
Hi Yaron, apart from the doGetWebData function, could you also make the change within the doGetFileData function? Thanks! --Platinops (talk) 18:02, 3 December 2019 (UTC)
Good idea - I think I forgot about #get_file_data... I just checked in that change. Yaron Koren (talk) 14:39, 4 December 2019 (UTC)

#get_web_data question

What am I doing wrong?

My wiki page isn't passing data to my php page via POST process.

Here's the code in the page itself:

{{#get_web_data:url=ht*tp://localhost/experiments/program_title_count.php
|format=CSV
|post data=office=03809
|data=title=1,titleCount=2
|cache seconds=0.05
}}

Title Count {{#for_external_table:
{{{title}}} {{{titleCount}}}
}}

And here's the code in "program_title_count.php" that is supposed to get the POSTed data:

$whichOffice = $_POST["office"];

if(empty($whichOffice))
{
    $whichOffice = "03808";
}

$query1 = "SELECT title,COUNT(*) AS titleCount
FROM list_personnel INNER JOIN affiliation_program ON list_personnel.id_list_personnel = affiliation_program.id_list_personnel
WHERE program_number = '$whichOffice'
GROUP BY title";

It runs the query, but uses the default value for $whichOffice (03808) rather than "03809". What am I missing? Sorry, I'm not an IT professional; please forgive me if I'm missing something obvious.
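For reference, "post data=office=03809" should reach the server as an ordinary form-encoded request body, which is what PHP exposes through $_POST. A Python sketch of that wire format (the field name and value are the ones from the question):

```python
import urllib.parse

# The 'post data' value is already in application/x-www-form-urlencoded form;
# encoding and decoding it round-trips the office number.
body = urllib.parse.urlencode({"office": "03809"})
parsed = urllib.parse.parse_qs(body)
```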

What versions of External Data and MediaWiki are you running? Yaron Koren (talk) 18:48, 15 November 2019 (UTC)

Lua modules

Hi. Is there a way to use this extension through a Scribunto module? I would like to get the raw file and use my own functions to process it. Tinker Bell (talk) 23:47, 1 December 2019 (UTC)

Not entirely sure what you mean here, but this details how to call parser functions from modules: Extension:Scribunto/Lua reference manual#frame:callParserFunction DSquirrelGM 00:12, 2 December 2019 (UTC)
No, DSquirrelGM, I just want to get a JSON file from a webserver, and process it with a function I wrote in Lua. And using callParserFunction won't work because it only generates a strip marker that can't be used by Scribunto. Tinker Bell (talk) 06:25, 6 December 2019 (UTC)

Can I use this extension to upload CSV/xml files and have mediawiki turn them to wiki tables?

Hi, unfortunately the example you provide in the extension is not working. Will this extension allow me to convert an uploaded CSV file in my mediawiki to a wiki table that can be used in a page? Thanks MavropaliasG (talk) 15:11, 2 December 2019 (UTC)

Hello, MavropaliasG, your code works as expected. I've tested it on my private wiki; here is the result: https://i.stack.imgur.com/EOhqG.png IMO, the problem in your case is the version of Extension:ExternalData; try to use the master branch instead of REL1_33 for the extension. This solved a very similar problem with my MW 1.33. Regards. Spas.Z.Spasov (talk) 15:38, 2 December 2019 (UTC)

Hi thanks for the reply. I was talking about the example given on the extension page in that box (under download). It links to a mediawiki which returns an error. Anyway I wanted to know if this extension allow me to convert an uploaded CSV file in my mediawiki to a wiki table that can be used in a page? Thank you MavropaliasG (talk) 15:53, 2 December 2019 (UTC)

Yes. The error you're seeing on discoursedb.org is due to a temporary bug in another extension. Yaron Koren (talk) 17:13, 2 December 2019 (UTC)

php warning on rebuildall.php

Hi, I'm seeing a php warning upon deploying here. The error looks like this:

PHP Warning:  array_key_exists() expects parameter 2 to be array, null given in /var/www/wiki/extensions/ExternalData/includes/ED_Utils.php on line 165

My versions are:

  • MW 1.34.0
  • Running on debian 10.2
  • php version 7.3.14
  • ED from git, and the 1_34 branch checked out

The workflow leading up to this is essentially, install MW, composer install some extensions (SMW etc.), then in order:

  • update.php --quick
  • importDump.php --no-updates (about 6000 XML dumped pages)
  • rebuildall.php (the warning occurs here 3 times)

This warning doesn't appear in the 1.31.6 LTS.

Any thoughts?

--JosefAssad (talk) 09:21, 19 February 2020 (UTC)

I don't know why it's happening for one MediaWiki version and not another, but I'm guessing you can ignore that warning. Are you seeing any actual problems? Yaron Koren (talk) 15:51, 19 February 2020 (UTC)

Nope, no obvious errors, but to be honest I haven't explicitly tested ED yet. :) I have been assuming it's PHP being chatty; will update here if I see something obvious. --JosefAssad (talk) 08:15, 24 February 2020 (UTC)

rebuildData.php doesn't seem to work from cron for get_web_data [solved]

We're using ED (1.9) more and more to query data and store it in articles, and have been using the rebuildData script to keep things fresh. Some server environment variables were adjusted yesterday, which has caused some really weird behavior when it comes to the cron jobs.

  • When a user views an article that uses #get_web_data, everything fetches and renders as expected.
  • When a user saves an article, the #store_external_table function works as expected, storing SMW subobjects.
  • If I run rebuildData with sudo, the objects are stored. (without sudo, they do NOT get fetched/stored)
  • When the cron runs, the subobjects disappear (as though the #get_web_data failed)

I suspect it's some kind of permissions issue that has gone bad, but I can't figure out why an article save would work fine and the cron (running as www-data, the same user Apache does) would not. #get_file_data updates work totally fine. Is there a nuance in the extension or environment I should be looking for?

Thanks!

Lbillett (talk) 17:01, 4 March 2020 (UTC)

Yeah, so this was all my fault. The changes we made caused some obsolete environment variables to get picked up from etc/bash.bashrc and etc/wgetrc. While I don't know why some were set and others not, depending on how it was being run (cron vs. page save), it seems clear this was the issue. All perfect now. - Lbillett (talk) 19:40, 19 March 2020 (UTC)
That's great to hear! Yaron Koren (talk) 20:35, 19 March 2020 (UTC)

function fetchURL: options for HTTP request differ if caching is used

Hi Yaron,

I am using External Data to fetch some JSON data from an asset management software. This worked great until I started using caching by adding

 $edgCacheTable = 'ed_url_cache';

to my settings. After that I only saw

 Error: No contents found at URL https://...

on the respective pages. Looking at the code in

 includes/EDUtils.php

I saw that you are using different options for the HTTP request when caching is used (line 914 of that file in the master branch) than when caching is not used (lines 880 ff in the master branch). After changing the options for the caching case to the same ones as for the non-caching case I was able to get rid of that error.

Is there a reason you are using different options? Am I missing something?

Regards, HermannSchwƤrzler (talk)

Sorry about that. No good reason. I just checked in what I think is a fix for this - hopefully the new code works better for both you and the person below. Yaron Koren (talk) 01:26, 6 March 2020 (UTC)
Thank you, Yaron! I looked at your commit and it's exactly the change that I would have made to fix this problem. :-) I have to find some time to test this in my setup; maybe the Wuestenarchitekten are faster in testing it...

Class 'LoggerFactory' not found

I'm trying to update our wiki to this:

MediaWiki 1.34.0 (94e7e1a)
PHP 7.4.3 (fpm-fcgi)
ExternalData 1.9.1 (b4671a2)

I get the following error: [787309cf2a5266c9e3679c48] /FBX_COMP Error from line 32 of /var/www/html/extensions/ExternalData/includes/ED_HttpWithHeaders.php: Class 'LoggerFactory' not found

I then found this and upgraded ExternalData to 2.0.1 (b4671a2), but now get: Error: No contents found at URL http://localhost/api.php?action=query&list=categorymembers&cmtitle=Category:COMPs&format=xml&cmlimit=500

The mentioned URL does return a list as expected.

My Template that is using ExternalData has the following:

{{#get_web_data: url={{SERVER}}{{SCRIPTPATH}}/api.php?action=query&list=categorymembers&cmtitle=Category:{{{Category|}}}&format=xml&cmlimit=500
  | format=XML
  | data=title=title}}
{|
! class="navBox"|{{{Category|}}}
|-
|{{#display_external_table:template=catList
  | delimiter=ā€¢&nbsp;
  | data=title=title}}
|}

Any hints on what I should do? Thx! --Wuestenarchitekten (talk) 19:49, 5 March 2020 (UTC)

It sounds like you're seeing the same issue as the person above. Do you have $edgCacheTable set to some value? Yaron Koren (talk) 19:51, 5 March 2020 (UTC)
Ah - I didn't even see that - yes, same setting $edgCacheTable = 'ed_url_cache'; - Do I need to create that table again?--Wuestenarchitekten (talk) 20:00, 5 March 2020 (UTC)
No. The code needs to be fixed; until that happens, you should probably just comment out that setting. Yaron Koren (talk) 20:04, 5 March 2020 (UTC)

get_db_data with special characters gets corrupted

Hello, I'm trying to update my wiki and change the connection for External Data from MSSQL to MySQL (MSSQL is not supported in this new wiki version). I have a problem with the get_db_data function. One column has a special character in its name: ä. As you can see in the error message, this gets converted to \xC3\xA4. If I remove this column, everything works fine. So the SQL statement itself gets corrupted. The returned data also contains special characters like ä, ö, and ü, and these come through correctly in the results. (I have shortened the result message and removed some internal information):

[7593c8420d9ec2599c4e594d] /SQL_Server Wikimedia\Rdbms\DBQueryError from line 1603 of XXXXX\wiki\includes\libs\rdbms\database\Database.php: A database query error has occurred. 
Query: SELECT ServerName,DatabaseName,Integrität,Ansprechpartner FROM `SQL_Datenbanken` WHERE ServerName='XXX' LIMIT 10000
Function: EDUtils::searchDB
Error: 1064 You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '\xC3\xA4t,Ansprechpartner FROM `SQL_Datenbanken` WHERE ServerName' at line 1 (mysql.XXXXX)
Backtrace:

#0 XXXXX\wiki\includes\libs\rdbms\database\Database.php(1574): Wikimedia\Rdbms\Database->getQueryExceptionAndLog()
#1 XXXXX\wiki\includes\libs\rdbms\database\Database.php(1152): Wikimedia\Rdbms\Database->reportQueryError()
#2 XXXXX\wiki\includes\libs\rdbms\database\Database.php(1807): Wikimedia\Rdbms\Database->query()
#3 XXXXX\wiki\extensions\ExternalData\includes\ED_Utils.php(485): Wikimedia\Rdbms\Database->select()
#4 XXXXX\wiki\extensions\ExternalData\includes\ED_Utils.php(250): EDUtils::searchDB()

My Installation/Configuration:

  • MediaWiki: 1.34.0
  • PHP: 7.4.5 (apache2handler)
  • MySQL: 8.0.19
  • Elasticsearch: 6.5.4

--TomyLee (talk) 11:23, 28 April 2020 (UTC)

delimiter=T (TAB) not working after upgrade to 2.0 on MW 1.34.1

Using ED 1.9 on MW 1.31 I had wiki page that was reading a locally hosted tab-separated variable text file with the following code:

LocalSettings.php

# File XYZ.txt is a "tab-separated-variable" file with n rows of m tab-separated columns
$edgFilePath['XYZ'] = "/opt/htdocs/XYZ.txt";

with the wiki page as:

{{#get_file_data:
 file=XYZ
 |format=CSV
 |data=aaa=1,bbb=2,ccc=3
 |delimiter=T
 |filters=1=<someValueThataaaIsInFileXYZ>
 |cache seconds=1400
}}
* {{#external_value:aaa}}
* {{#external_value:bbb}}
* {{#external_value:ccc}}

I upgraded my site to MW 1.34.1 and ED 2.0 and now #get_file_data: produces the following error:

Notice: Undefined variable: regex in "/opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php" on line 229

and my #external_value: statements generate the error:

Notice: Undefined offset: 0 in /opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php on line 442

Any thoughts on what might have gone wrong? -- Revansx (talk) 00:02, 2 May 2020 (UTC)

I don't know, but there's a more recent version of External Data, 2.0.1 - you should try upgrading to that. Yaron Koren (talk) 23:10, 3 May 2020 (UTC)
Not working on ED 2.0.1 either - same issue. Have you tested 2.0.1 with a tab-delimited file (i.e. |delimiter=\t)? - Revansx (talk) 23:04, 12 May 2020 (UTC)

UPDATE - This fixed it.

diff --git a/includes/ED_ParserFunctions.php b/includes/ED_ParserFunctions.php
index 0cc9886..3b5ee12 100644
--- a/includes/ED_ParserFunctions.php
+++ b/includes/ED_ParserFunctions.php
@@ -190,6 +190,12 @@ class EDParserFunctions {
                } else {
                        $format = '';
                }
+
+                $regex = $format === 'text' && array_key_exists( 'regex', $args )
+                        ? html_entity_decode( $args['regex'] )
+                        : null;
+
+
                if ( $format == 'xml' ) {
                        if ( array_key_exists( 'use xpath', $args ) ) {
                                // Somewhat of a hack - store the fact that

Urlencoding encodes space to + instead of %20

The very useful function {{{term.urlencode}}} encodes spaces as "+"; instead it should encode them as "%20".

Does this lead to problems? Yaron Koren (talk) 13:41, 12 May 2020 (UTC)
It can yes. -- GreenC (talk) 04:45, 12 October 2020 (UTC)
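Until that is changed, one possible wikitext workaround is to pass the raw value into the row template and re-encode it there with core's urlencode parser function, whose PATH mode encodes spaces as %20 rather than +. The URL and the {{{term}}} parameter here are hypothetical:

```wikitext
<!-- Inside the row template: instead of {{{term.urlencode}}}, re-encode the raw value. -->
<!-- {{urlencode:...|PATH}} yields %20 for spaces; QUERY (the default) yields +. -->
[https://example.org/files/{{urlencode:{{{term}}}|PATH}} {{{term}}}]
```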

Undefined variable regex in ED_ParserFunctions.php after upgrade to mw 1.34.1 and ED 2.0.1

I was successfully using External Data on my MW 1.31 wiki to read a local file on my wiki server.

After I upgraded to MW 1.34.1 and ED 2.0.1 my {{#get_file_data: .. }} call no longer works and I see this error:

Notice: Undefined variable: regex in /opt/htdocs/mediawiki/extensions/ExternalData/includes/ED_ParserFunctions.php on line 233

Exact same wiki page code, exact same tab-delimited file. I switched to a true CSV and it works; tab-delimited causes the error. - Revansx (talk) 22:05, 12 May 2020 (UTC)

UPDATE - This fixed it.

diff --git a/includes/ED_ParserFunctions.php b/includes/ED_ParserFunctions.php
index 0cc9886..3b5ee12 100644
--- a/includes/ED_ParserFunctions.php
+++ b/includes/ED_ParserFunctions.php
@@ -190,6 +190,12 @@ class EDParserFunctions {
                } else {
                        $format = '';
                }
+
+                $regex = $format === 'text' && array_key_exists( 'regex', $args )
+                        ? html_entity_decode( $args['regex'] )
+                        : null;
+
+
                if ( $format == 'xml' ) {
                        if ( array_key_exists( 'use xpath', $args ) ) {
                                // Somewhat of a hack - store the fact that
Sure, that would fix the "undefined variable" problem - but did that also fix the problem of reading from the tab-delimited file? That would be surprising. Yaron Koren (talk) 23:28, 12 May 2020 (UTC)
So far so good. I think it was the same problem: $regex wasn't defined in the "doGetFileData" function at all. I don't know why, but true CSV was working and tab-separated was not. I added the patch listed above and, for whatever reason, all the errors went away and the parser function was able to read the values from the TSV file. *shrugs* - Revansx (talk) 00:36, 13 May 2020 (UTC)
Okay, great. I just checked in a change to initialize $regex, so hopefully everything works now. Yaron Koren (talk) 13:58, 13 May 2020 (UTC)
Thanks. I'll test it today and, assuming everything works well, I'll list this topic as "solved". - Revansx (talk) 17:05, 13 May 2020 (UTC)

HTTP Request Options

I would like to create a new configuration variable for the extension to hold default HTTP options to use in EDUtils::fetchURL(). I need it to handle redirects properly; I am also considering using proxies, like i2p. Would filing an issue and uploading a patch be OK, or would the author rather not have it?
Alex Mashin (talk) 09:54, 4 June 2020 (UTC)

HTML mode

Please note this feature request.
Alex Mashin (talk) 14:33, 9 June 2020 (UTC)

Stale cache

Please review this patch. Alex Mashin (talk) 09:31, 26 June 2020 (UTC)

Encoding detection

Please review a new patch to improve encoding detection in loaded texts.

Alex Mashin (talk) 07:21, 4 July 2020 (UTC)

SOAP

Did {{#get_soap_data:}} ever work? If it did, was it cached, as the page claims?
Alex Mashin (talk) 22:02, 10 July 2020 (UTC)

I don't know how well it worked - I never tried it myself. Yaron Koren (talk) 18:02, 13 July 2020 (UTC)

Patchset for Scribunto

Please review this patchset.
Alex Mashin (talk) 17:48, 13 July 2020 (UTC)

Graceful handling of missing data

If I can't be sure the external URL will be present or active, is there a graceful way for it to fail without the red message 'Error: No contents found at URL', so I can just silently not display the data? Vicarage (talk) 09:58, 27 July 2020 (UTC)

I can hack it by suppressing .error with CSS and using {{#ifeq:{{#external_value:title}} to check if any data was imported, but the method is crude and nasty. Vicarage (talk) 17:12, 27 July 2020 (UTC)

Have you considered caching the data, so that it will still be displayed if the URL temporarily goes offline? Or is this more than a temporary problem? Yaron Koren (talk) 17:17, 27 July 2020 (UTC)
It's mapping 5000 pages to 1000 directories, both in flux, so it's a matter of speculatively hoping data might be there rather than hard-coding 1000 links. Vicarage (talk) 22:22, 27 July 2020 (UTC)

Exception with get_db_data - "Prefix must be a string"

I use get_db_data with MySQL and MariaDB and now have to specify a prefix even an empty string.

[exception] [d450ec84b1a6cfaf825fbc01] /wiki/Customers   InvalidArgumentException from line 68 of /var/www/html/1.35.0-wmf.41/includes/libs/rdbms/database/domain/DatabaseDomain.php: Prefix must be a string.
#0 /var/www/html/1.35.0-wmf.41/includes/libs/rdbms/database/Database.php(291): Wikimedia\Rdbms\DatabaseDomain->__construct(string, NULL, NULL)
#1 /var/www/html/1.35.0-wmf.41/includes/libs/rdbms/database/DatabaseMysqlBase.php(111): Wikimedia\Rdbms\Database->__construct(array)
#2 /var/www/html/1.35.0-wmf.41/includes/libs/rdbms/database/Database.php(433): Wikimedia\Rdbms\DatabaseMysqlBase->__construct(array)
#3 /var/www/html/local-extensions/ExternalData/includes/connectors/EDConnectorRelational.php(66): Wikimedia\Rdbms\Database::factory(string, array)
#4 /var/www/html/local-extensions/ExternalData/includes/EDParserFunctions.php(79): EDConnectorRelational->run()
...

To fix this I added entries like this to each database definition in LocalSettings.php:

$edgDBTablePrefix['mydb'] = '';

I also got this error when updating from 1.34 to 1.35 and the latest version of External Data. None of my tables have a prefix but once I realised what it was it was easily rectified by setting:

$edgDBTablePrefix['mydb'] = '';

It would be useful if this setting defaulted to an empty string to prevent this error and if the error message said "Database table prefix must be a string"

R Appleby 4 Jan 2021

Patch for review

Please review this patch.
Alex Mashin (talk) 04:45, 12 August 2020 (UTC)

[SOLVED] Cannot get #for_external_table to work on MW 1.34

MW 1.34, External Data 2.0.1 (9a1fdcb)

First I verified that the URL https://discoursedb.org/AfricaCSV.txt has the correct data.

Then I created page called "Test" in my wiki with wikitext:

{{#get_web_data:url=https://discoursedb.org/AfricaCSV.txt
  |format=CSV with header
  |data=Country name=Country,Countries bordered=borders,Population=population,Area=area
}}

{| class="wikitable"
! Name
! Borders
! Population
! Area 
{{#display_external_table:template=Country info row
  |data=Country name=Country,Countries bordered=borders,Population=population,Area=area
}}
|}

with page "Template:Country info row" as:

|-
|{{{Country name}}}
|{{{Countries bordered}}}
|{{{Population}}}
|{{{Area}}}

Results are:

  • Only Table Heading renders. No Data Rows in the table
  • No errors on the page
  • No errors in the debug

Can someone help me debug this?

Thanks!

/Rich

Fwiw, this works fine:
{{#get_web_data:url=https://discoursedb.org/AfricaCSV.txt
  |format=CSV with header
  |data=Country name=Country,Countries bordered=borders,Population=population,Area=area
}}
<table class=wikitable>
{{#for_external_table:<tr><td>{{{Country name}}}</td><td>{{{Countries bordered}}}</td><td>{{{Population}}}</td></tr> }}
</table>
Turns out the documentation is misleading for #display_external_table: you do not need a "|data=local=external,..." line in that function. "|data=local=external,..." should ONLY be in the #get_web_data function.
It depends on whether the template has named or numbered parameters, I think. Yaron Koren (talk) 19:52, 12 August 2020 (UTC)
External variables stay in {{#get_web_data:}}. For displaying data, you need only local ones.
Alex Mashin (talk) 03:10, 13 August 2020 (UTC)
  • The correct code is:
{{#get_web_data:url=https://discoursedb.org/AfricaCSV.txt
  |format=CSV with header
  |data=Country name=country,Countries bordered=borders,Population=population,Area=area
 }}

 {| class="wikitable"
 ! Name
 ! Borders
 ! Population
 ! Area
 {{#display_external_table:template=Country info row
  |data=Country name=Country name,Countries bordered=Countries bordered,Population=Population,Area=Area
 }}
 |}

Alex Mashin (talk) 03:08, 13 August 2020 (UTC)

Skipping processing if no data

It's great to see the error suppression going in, but what will #external_value return if there is no data?

I have in mind logic like {{#get_web_data: url=http://fanfiles.priory/{{#rreplace:{{{1|}}}/{{PAGENAMEE}}|[ ]|_}}/files.csv | format=CSV | data=file=1,thumb=2,title=3}} {{#ifeq:{{#external_value:title}}|||{{#display_external_table:template=FileRow|data=file=file,thumb=thumb,title=title}}}}

But can I be sure that if the file is missing then {{#external_value:title}} will return a blank string I can check for? Vicarage (talk) 09:32, 22 August 2020 (UTC)

Ah-ha, setting $edgExternalValueVerbose = false; in LocalSettings.php gets rid of the red error, and leaves the function returning a blank I can test against. Vicarage (talk) 09:53, 22 August 2020 (UTC)
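Putting the pieces together, a minimal sketch of a silent fallback (the URL and column mapping here are made up; "suppress error" is the per-call parameter External Data checks for hiding error messages):

```wikitext
{{#get_web_data: url=https://example.org/files.csv
 | format=CSV
 | data=title=3
 | suppress error
}}<!-- #if treats an empty result as false, so nothing renders when the fetch fails
-->{{#if: {{#external_value:title}}
 | Title: {{#external_value:title}}
}}
```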

How to solve problem database error?

How do I solve this database error?

[633ddcf8fda36b6380350be9] /index.php?title=%E0%B8%A0%E0%B8%B2%E0%B8%A7%E0%B8%B0%E0%B8%AA%E0%B8%A1%E0%B8%AD%E0%B8%87%E0%B9%80%E0%B8%AA%E0%B8%B7%E0%B9%88%E0%B8%AD%E0%B8%A1&action=submit Wikimedia\Rdbms\DBQueryError from line 1603 of D:\WAPP\mediawiki\includes\libs\rdbms\database\Database.php: A database query error has occurred. Did you forget to run your application's database schema updater after upgrading?

Query: SELECT name,age FROM "test"

Function: EDConnectorRelational::searchDB

Error: 42P01 ERROR: relation "test" does not exist

LINE 1: ...searchDB index.php@TPITAPWOD01 */ name,age FROM "test" ...

It seems like you're connecting to a PostgreSQL database. Does that database contain a table called "test"? And is it capitalized exactly that way, and not, say, "Test"? Yaron Koren (talk) 02:25, 16 September 2020 (UTC)
I'm connecting to a PostgreSQL database and my table is named "test". Somebody suggested I run the update.php file, but it still shows the same error. Ss.sss16 (talk)
Are you sure all the database connection details are correct? Yaron Koren (talk) 02:55, 16 September 2020 (UTC)
Yes, I'm sure. Now it doesn't error, but the data doesn't appear. Ss.sss16 (talk)
Well, that sounds better already. Are you sure the variable names you're using in #external_value or #for_external_table are correct? Yaron Koren (talk) 03:31, 16 September 2020 (UTC)
Thank you for your support. I would like to ask you how to random data from database to display on website everyday.Ss.sss16 (talk)
I don't understand. Yaron Koren (talk) 03:40, 16 September 2020 (UTC)
I want to pick a random vocabulary word from my database to display on the wiki every day (1 word per day). Can I do it? Ss.sss16 (talk)
Sure - it sounds like you just need to get the External Data calls to work. Yaron Koren (talk) 04:26, 16 September 2020 (UTC)
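Assuming the "order by" and "limit" parameters of #get_db_data (both documented for that function), one word per day can be sketched like this; the database ID "vocab", table "words", and column "word" are placeholders, and the MySQL expression makes the ordering deterministic for a given day rather than random per page parse:

```wikitext
{{#get_db_data: db=vocab
 | from=words
 | data=word=word
 | order by=md5(concat(word, curdate()))
 | limit=1
}}
Word of the day: {{#external_value:word}}
```

Note that the page cache would need to expire (or be purged) daily for the word to actually rotate.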

Using #switch with #for_external_table

I have a setup like this:

{{#get_web_data:
 url= ...
 |format=JSON
 |use jsonpath
 |data=index=$[*].index,name=$[*].name,description=$[*].quote,image=$[*].img,month=$[*].month,year=$[*].year,set=$[*].set
}}

{| class="wikitable"
! Name
! Image
! Date
! Set
! style="width:60%" | Description {{#for_external_table:<nowiki/>
{{!}}-
{{!}} #{{{index}}} [[{{{name}}}]]
{{!}} [[File:{{{image}}}|150px]]
{{!}} {{month|{{{month}}}}} {{{year}}}
{{!}} {{set|{{{set}}}}}
{{!}} {{{description}}}
}}
|}

When I save this, the month and set values do not display; they return 'default'. To be clear, everything else here works fine. Template:month is

{{#switch:{{{1}}}
|1=January
|2=February
|3=March
|4=April
|5=May
|6=June
|7=July
|8=August
|9=September
|10=October
|11=November
|12=December
|default}}

and template:set is basically the same thing but with different words. If I remove the template call and just leave {{{set}}} and {{{month}}} on their own, they display as the numbers that they are supposed to have. How come they won't pass into the template properly? ParserFunctions is definitely enabled, if I call month with parameter 2 it does return 'February', it's just here in the #for_external_table that it doesn't work. I'm stumped here Kumi netsuha (talk) 21:31, 21 October 2020 (UTC)

Well, for the "month" template, you're passing the month and year together as the first parameter - so it's no surprise that this template isn't working, for at least that reason. If you fix that problem, does it work better? Yaron Koren (talk) 02:18, 22 October 2020 (UTC)

Raw data

Hi. Is it possible to get a raw resource, using mw.ext.externalData.getWebData, but without any further processing? I want to get a JSON file, but I want to process on my own, without using any extension functions. Thanks. --Tinker Bell ā˜… ā™„ 03:45, 31 October 2020 (UTC)

Try using "format=text". Yaron Koren (talk) 21:42, 5 November 2020 (UTC)
@Yaron Koren: It doesn't work: doing mw.ext.externaldata.getWebData { url = uriToJson, format = "text" } returns nil, not a string. --Tinker Bell ā˜… ā™„ 09:46, 8 November 2020 (UTC)
Oh, sorry, that's right - with "text" you need the "regex" parameter. There may be a way to do a hack and set "regex" so that it holds the entire text - I tried that for a little bit, but couldn't get it working. (Actually, I'm not sure "regex" is working at all right now.) But I guess there's no standard way for External Data to simply get the entire contents of a URL and set a variable to that. Yaron Koren (talk) 15:25, 8 November 2020 (UTC)
This will work:
mw.ext.externaldata.getWebData {
	url = uriToJson
	, format = "text"
	, data = { json = 'text' }
}.json
The key part is data = { json = 'text' }. text is a pre-defined external variable for the format text. Alex Mashin (talk) 06:36, 15 August 2021 (UTC)

Content from csv-file with #get_web_data does not render on a wikipage

<math>R_\text{min}</math> is supposed to look like Rmin, but if I retrieve the code/text "<math>R_\text{min}</math>" from a CSV file, it will print <math>R_\text{min}</math> on the page, not Rmin.

This works fine if I use <math>R_\text{min}</math> directly on the page, not from the csv-file.

Same if I try to use a template, e.g. {{mytemplate}}: it will print {{mytemplate}} on the page, not the content of the template.

Does anyone know what the problem is? or is it not possible?

~ Thanks - rada 05 nov 2020 ~

My guess is that it's not possible... what's coming from the outside data sources (like this CSV file) is supposed to be data, not wikitext or anything else wiki-related. If you want these values to get parsed, you may need to use External Data in conjunction with Scribunto. (I'm not sure that will help, but you can do a lot with Lua.) Yaron Koren (talk) 21:55, 5 November 2020 (UTC)

Select on database views broken

When going from MW 1.34.4 to MW 1.35.0 the results from a database query show up empty when the select is on a view.

  • For MW 1.34.4 the ExternalData version is: 1.9.1 (b4671a2)
  • For MW 1.35.0 the ExternalData version is: 2.2 (4312dfa)
  • Database is MariaDB version: 10.4.11-MariaDB

LocalSettings.php settings:

wfLoadExtension( 'ExternalData' );

$edgDBServer['somedatabase'] = "localhost";
$edgDBServerType['somedatabase'] = "mysql"; 
$edgDBName['somedatabase'] = "somedatabasedb";
$edgDBUser['somedatabase'] = "somedatabaseuser";
$edgDBPass['somedatabase'] = "password";

$wgHTTPTimeout = 20;

When using the below code on MW 1.35 directly on a table it returns value(s)

{{#get_db_data:
db=somedatabase
|from=sometable
|data=SomeColumn=SomeColumn}}
{| class="wikitable"
! SomeColumn{{#for_external_table:
{{!}}-
! {{{SomeColumn}}}
}}
|}{{#clear_external_data:}}

When the same is done on a view it does not return any value(s).

{{#get_db_data:
db=somedatabase
|from=viewonsometable
|data=SomeColumn=SomeColumn}}
{| class="wikitable"
! SomeColumn{{#for_external_table:
{{!}}-
! {{{SomeColumn}}}
}}
|}{{#clear_external_data:}}

SQL query for database view with upper case letters in column name:

SELECT SomeColumn
FROM sometable

Rewriting the view to lower case:

SELECT SomeColumn AS somecolumn
FROM sometable


I think I found the problem.
When selecting data on a database view with {{#get_db_data: it seems that the column names are converted to lower case.
When rewriting the column name in the view to use only lower case letters it works just fine.
I have updated the above examples and added the views.
I tried to find out where and why this happens, but I did not succeed :-)
I tested this on a clean MW-1.35.0 install with only ExternalData (2.2 (4312dfa)) enabled. --Felipe (talk) 11:07, 19 November 2020 (UTC)
Some more digging.
When using the view with the upper case column name SELECT SomeColumn
and data=SomeColumn=SomeColumn the below error is in the Apache log
PHP Notice: Undefined index: somecolumn in C:\\xampp\\htdocs\\mediawiki-1.35.0\\extensions\\ExternalData\\includes\\connectors\\EDConnectorRelational.php on line 115
This seems to mean that it tries to do a select on somecolumn, but that column does not exist.
This only happens in MW-1.35 and not in MW-1.34.4 --Felipe (talk) 12:55, 19 November 2020 (UTC)
Some more digging, and I found the code in extensions\ExternalData\includes\EDParsesParams.php that is responsible for lowering the case of the data fields.
On line 54, the result of !( $this->parser ? $this->parser : $this )->preservesCase() = 1 (true) and is stored in $lower.
With that value for $lower it enters protected static function paramToArray on line 55, thus converting the data fields to lower case.
I can not find out why the result of !( $this->parser ? $this->parser : $this )->preservesCase() is true,
and why you would convert the data (column) names to lower case when querying against MySQL or MariaDB.
I did notice that the column names from a SELECT on a view follow what is in the view.
This means that if the view contains capital letters in the column names, it returns those names.
The query against the database works just fine, but the returned column names do not match what External Data is expecting:
somecolumn in the SELECT is not the same as SomeColumn returned by the database.
If lowering the case is necessary ($lower=true), then External Data should also lower the case of all returned column names. Is this correct?
It was nice digging into the code; it was too long ago :-) --Felipe (talk) 11:45, 20 November 2020 (UTC)
	protected function __construct( array &$args ) {
		// Bring keys to lowercase:
		$args = self::paramToArray( $args, true, false );
		// Add secrets from wiki settings:
		$args = self::supplementParams( $args );

		// Text parser, if needed.
		if ( static::$needs_parser ) {	// late binding.
			// Encoding override supplied by wiki user may also be needed.
			$this->encoding = isset( $args['encoding'] ) && $args['encoding'] ? $args['encoding'] : null;
			try {
				$this->parser = EDParserBase::getParser( $args );
			} catch ( EDParserException $e ) {
				$this->error( $e->code(), $e->params() );
			}
		}

		// Data mappings. May be handled by the parser or by self.
		if ( array_key_exists( 'data', $args ) ) {
			// Whether to bring the external variables to lower case. It depends on the parser, if any.
			$lower = !( $this->parser ? $this->parser : $this )->preservesCase();	// late binding in both.
			$this->mappings = self::paramToArray( $args['data'], false, $lower );
		} else {
			$this->error( 'externaldata-no-param-specified', 'data' );
		}

		// Filters.
		$this->filters = array_key_exists( 'filters', $args ) && $args['filters']
					   ? self::paramToArray( $args['filters'], true, false )
					   : [];

		// Whether to suppress error messages.
		if ( array_key_exists( 'suppress error', $args ) ) {
			$this->suppress_error = true;
		}
	}

Sorry about that, and thanks for all the research into the problem. Alex Mashin just checked in a fix for this - if you can, please get the latest code, and let us know if that fixed things. Yaron Koren (talk) 19:03, 20 November 2020 (UTC)

No problem and yes, that fixed it. Thanks to Alex and yourself for this quick fix. --Felipe (talk) 13:03, 21 November 2020 (UTC)