API talk:Parsing wikitext
Minor issues with expandtemplates
Hi,
I just want to report two issues with expandtemplates:
- expandtemplates seems to work for only one level of templates (templates inside templates are not expanded)
- expandtemplates doesn't take into account <includeonly> or <noinclude>
For example, on the French Wikipedia, expandtemplates for {{e}} gives <includeonly>{{exp|e}}</includeonly><noinclude> {{/Documentation}} </noinclude>
- Possibly you should report this to http://bugzilla.wikimedia.org/, so the issues can be properly tracked and fixed. -- Tbleher 10:16, 22 January 2008 (UTC)
- Ok, done. There's the same problem for Special:ExpandTemplates. --NicoV 14:54, 22 January 2008 (UTC)
- Seems to be fixed --NicoV 17:54, 23 January 2008 (UTC)
Error with parse...
I am getting the following return...
<?xml version="1.0"?> <api> <error code="params" info="The page parameter cannot be used together with the text and title parameters" xml:space="preserve">
My query string looks like this...
/api.php?action=parse&page=Houdini
I am using MW 1.15.1. I get nothing of value from the MW debug log, and I have verified that my rewrite rules are not adding the title parameter. Any thoughts? 67.97.209.36 14:25, 18 November 2009 (UTC)
- If I change my query string to this...
/api.php?action=parse&text={{Houdini}}
- I get the expected result.
- 67.97.209.36 16:46, 18 November 2009 (UTC)
I'm seeing the same problems. I think we should write it up as a bug. --Duke33 01:58, 14 January 2010 (UTC)
- I took the liberty of writing it up: bugzilla:22684 --Duke33 17:16, 1 March 2010 (UTC)
Parsing into printable text?
Hi,
Is there a way to parse wikitext to get a simplified text (without HTML, with external and internal links replaced by their text, ...)?
My need is the following:
- The project Check Wikipedia uses a configuration file for each wiki (for example: en)
- It's used among other things to generate pages in Wiki format (for example: en)
- In the configuration file, you can see, for example, the description of error no. 1: error_001_desc_script=This article has no bold title like <nowiki>'''Title'''</nowiki>, so it contains wiki text.
- I am writing a Java program (WikiCleaner) to help fix the errors reported by this tool. I'd like to display this text in my program as simple text: This article has no bold title like '''Title'''.
Thanks, NicoV --16:33, 28 March 2010 (UTC)
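A hedged sketch of one possible approach (it assumes a browser environment with fetch and DOMParser, and an action=parse that accepts raw text; none of this is confirmed by the thread): render the snippet, then take the DOM's plain textContent, which drops tags and keeps link text.
const res = await fetch("/w/api.php?" + new URLSearchParams({
    action: "parse",
    format: "json",
    text: "This article has no bold title like <nowiki>'''Title'''</nowiki>",
}));
// The rendered HTML is under parse.text["*"] in the JSON response.
const html = (await res.json()).parse.text["*"];
// textContent strips all markup and replaces links with their visible text.
const doc = new DOMParser().parseFromString(html, "text/html");
console.log(doc.body.textContent.trim());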
Error with DPL
DPL and DPL in templates are not expanded; e.g., these fail:
- api.php?action=query&prop=revisions&titles=yourtitle&rvprop=content&rvexpandtemplates
- api.php?action=expandtemplates&text={{TemplateWithDPL}}
Hamishwillee 00:02, 19 July 2010 (UTC)
Simple template output wrapped in paragraph tags
Suppose I have Template:Foo, which contains this wikitext:
This calls the hello world template: {{hello world}}.
Template:Hello world contains this text:
Hello, world!
I would expect the text output to look like this:
This calls the hello world template: Hello, world!
This is my API call:
http://example.com/w/api.php?action=parse&pst=1&disablepp=1&format=json&redirects=1&text=%7B%7BFoo%7D%7D
The problem
The JSON returned wraps the output in paragraph tags for some reason, which I don't want.
array(1) {
["parse"]=>
array(12) {
["title"]=>
string(3) "API"
["text"]=>
array(1) {
["*"]=>
string(59) "<p>This calls the hello world template: Hello, world!.</p>"
}
["langlinks"]=>
array(0) {
}
["categories"]=>
array(0) {
}
["links"]=>
array(0) {
}
["templates"]=>
array(0) {
}
["images"]=>
array(0) {
}
["externallinks"]=>
array(0) {
}
["sections"]=>
array(0) {
}
["displaytitle"]=>
string(3) "API"
["iwlinks"]=>
array(0) {
}
["properties"]=>
array(0) {
}
}
}
- Templates that contain tables do not wrap their output in paragraph (<p>) tags.
- When the templates are used on a normal wiki page, no such paragraph tag wrapping occurs.
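A hedged client-side workaround (no API switch for this is mentioned in the thread, so the wrapping pair is stripped by hand):
function unwrapParagraph(html) {
    // Strip a single wrapping <p>...</p> pair, if present, and return the inner HTML.
    const match = html.trim().match(/^<p>([\s\S]*)<\/p>$/);
    return match ? match[1].trim() : html;
}
console.log(unwrapParagraph("<p>This calls the hello world template: Hello, world!.</p>"));
// "This calls the hello world template: Hello, world!."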
"byteoffset" is a misnomer
In the "sections" array returned by this API call, the field named "byteoffset" holds the offset of the section within the wikitext markup. Contrary to its name, the offset is measured in code points, not bytes. (Note: non-BMP characters count as one. Beware of UTF-16, especially in JavaScript, where "💩".length === 2.) Keφr 22:11, 10 March 2014 (UTC)
- ARGH! This just bit me very hard. I'm going to see if we can note this in the documentation. Enterprisey (talk) 05:42, 2 October 2022 (UTC)
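For later readers, a minimal JavaScript sketch of the conversion implied above (the section object's shape follows the prop=sections output; the sample string is made up):
function codePointOffsetToUtf16Index(text, codePointOffset) {
    // "byteoffset" counts Unicode code points; JavaScript strings are UTF-16,
    // where non-BMP characters occupy two units, so the indices can diverge.
    let index = 0;
    for (let i = 0; i < codePointOffset; i++) {
        index += text.codePointAt(index) > 0xFFFF ? 2 : 1;
    }
    return index;
}
const wikitext = "💩 == Section ==";
const section = { byteoffset: 2 }; // reported by the API in code points
console.log(wikitext.slice(codePointOffsetToUtf16Index(wikitext, section.byteoffset)));
// "== Section ==" (a naive wikitext.slice(2) would start at the space instead)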
action=parse should give a basetimestamp/starttimestamp when given an oldid
Say I want to use action=parse to take the wikitext and some of the more "advanced" parser output, like the section list or the XML tree. If I want to transform the wikitext and save it back, with edit conflict resolution, I need a starttimestamp and/or basetimestamp. Without having it here, I have to round-trip the server twice. Kludgy. Keφr 11:11, 11 March 2014 (UTC)
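A sketch of the double round trip being described (modern fetch syntax; "Sandbox" is a placeholder page name):
const api = "/w/api.php";
// First trip: the parser output (wikitext, section list), but no timestamps.
const parsed = await (await fetch(api + "?" + new URLSearchParams({
    action: "parse", page: "Sandbox", prop: "wikitext|sections", format: "json",
}))).json();
// Second trip, needed only to fetch the base timestamp for conflict detection.
const rev = await (await fetch(api + "?" + new URLSearchParams({
    action: "query", prop: "revisions", rvprop: "timestamp",
    titles: "Sandbox", format: "json",
}))).json();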
Argument "disablepp" does not work
Hi, I hope that I am doing something wrong. Here's my query:
As you can see in the output, there is still a comment with preprocessor stuff:
<!-- \nNewPP limit report\nParsed by mw1092\nCPU time usage: 0.364 seconds\nReal time usage: 0.482 seconds\nPreprocessor visited node count: 934/1000000\nPreprocessor generated node count: 2144/1500000\nPost\u2010expand include size: 12867/2048000 bytes\nTemplate argument size: 1558/2048000 bytes\nHighest expansion depth: 7/40\nExpensive parser function count: 2/500\n-->
--2A02:8071:B486:2300:2210:7AFF:FEF8:7EEE 21:52, 9 October 2014 (UTC)
- You're right. This is a known bug, as reported here and here. – RobinHood70 talk 08:35, 10 October 2014 (UTC)
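Until that is fixed, a hedged client-side workaround is to strip HTML comments from the returned text:
function stripHtmlComments(html) {
    // Removes every <!-- ... --> comment, including the NewPP limit report.
    return html.replace(/<!--[\s\S]*?-->/g, "");
}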
Request Header Or Cookie Too Large
Hi, how do I avoid "400 Bad Request - Request Header Or Cookie Too Large"?
or "414 Request-URI Too Long"?
I'm trying to parse the content of this; I would like to make a few automated changes to the source and then show it in an HTML page. אור שפירא (talk) 07:10, 22 July 2015 (UTC)
- Are you using GET requests or POST requests? Try using POST requests (for GET requests, parameters are in the URI; for POST requests, they are outside the URI). --NicoV (talk) 07:15, 22 July 2015 (UTC)
- I am having the same issue. I don't see a way to use action=parse with the text parameter if the payload is larger than about 6KB, since action=parse seems to be limited to GET requests, and most webservers will limit URL sizes to approximately 8192 bytes.
- Is there a solution to this? There's a lot of speed-profiling I'd like to do on my wiki that is much easier if I can mix and match which parts I send, and then look at the PP report, but I don't have a way to do that if I'm limited to tiny payloads. 198.134.98.50 06:29, 4 August 2021 (UTC)
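A sketch of the POST approach suggested above ({{SomeTemplate}} is a placeholder; with POST the parameters travel in the request body, so URI length limits do not apply):
const largeWikitext = "{{SomeTemplate}}\n".repeat(5000); // far beyond any URI limit
const res = await fetch("/w/api.php", {
    method: "POST",
    body: new URLSearchParams({ action: "parse", format: "json", text: largeWikitext }),
});
const data = await res.json();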
Import wikitext from parsetree
I wanted to edit wikitext automatically and easily, so I tried to edit it with parsetree. But I couldn't find a way to get wikitext from a parsetree. Can't I convert a parsetree to wikitext with the API? --Gustmd7410 (talk) 23:40, 3 June 2018 (UTC)
Group categories by sections
Currently, it is possible to get the list of categories for a specific section: https://ru.wiktionary.org/w/api.php?action=parse&page=весь&prop=sections|categories&format=json&section=5. Is it possible to get a map (a dict in Python) with the categories corresponding to each section? Example:
{ section_1 -> [categories of section_1], section_2 -> [categories of section_2], ...}
Soshial (talk) 13:43, 9 June 2019 (UTC)
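There seems to be no single call for this, but here is a hedged sketch: it loops over the sections and queries each one's categories separately, so it costs one request per section (origin=* is assumed for anonymous cross-origin use).
const api = "https://ru.wiktionary.org/w/api.php";
const page = "весь";
// First request: the section list for the page.
const { parse } = await (await fetch(api + "?" + new URLSearchParams({
    action: "parse", page, prop: "sections", format: "json", origin: "*",
}))).json();
const categoriesBySection = {};
for (const s of parse.sections) {
    // One request per section: parse only that section and collect its categories.
    const r = await (await fetch(api + "?" + new URLSearchParams({
        action: "parse", page, prop: "categories",
        section: s.index, format: "json", origin: "*",
    }))).json();
    categoriesBySection[s.line] = (r.parse.categories || []).map(c => c["*"]);
}
// categoriesBySection: { section_1: [categories of section_1], ... }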
How can the API be used to get the printable version?
Some context: Archiving selected pages from our MediaWiki based project wiki using a browser to manually save them can become a bit tedious, so we have a script to do that. Now we want to switch to the current LTS version 1.31.5, which together with other requirements means the script has to use Bot passwords for logging in. But using Bot passwords means index.php isn't available:
Clients using bot passwords can only access the API, not the normal web interface.
Gluing together what action=parse and prop=text|headhtml return, I get an HTML document almost, but not quite, entirely unlike the result from index.php. Indeed it is good enough for the rest of the script to work without further changes -- except for the missing parameter "printable=1" in the links to the stylesheets. The stylesheets retrieved with these links result in very noticeable differences, e.g. the wrong font in a very small size.
Hacking the missing parameter into the links with brute force seems to work, but is not a solution I really like. Is there a way to ask action=parse to provide the printable version of the rendered HTML?
-- Cat's paw (talk) 17:19, 16 January 2020 (UTC)
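For reference, a hedged sketch of that brute-force hack (it assumes the stylesheets appear as load.php links inside the headhtml string):
function addPrintableParam(headHtml) {
    // Append printable=1 to every load.php URL in the head HTML.
    return headHtml.replace(/(load\.php\?[^"]*)/g, "$1&printable=1");
}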
"Gives the templates"
What does "gives the templates" mean?
Suppose I have a template named Q. It has something to do with Wikidata.
In my personal wiki, I can find all the pages transcluding Q easily enough with another API. And then I can fetch the page, requesting that it "gives the templates".
But what I really want to know are the template arguments for each instance (I expect many instances on most pages).
Suppose I want to do this as a stopgap measure before wading into Extension:Cargo. Will this API do the job for me or not? What does "gives the templates" mean: the names of all templates used, or the invocations of each template, with the full text of each invocation?
Undefined "give" likely applies to much else herein. MaxEnt (talk) 21:02, 27 November 2021 (UTC)
Replace "let" and "const" keywords with "var" in the examples
Hi there! Please, someone, replace the let and const keywords with the var keyword in the examples, because once those identifiers have been declared in the console session, re-running a snippet will not work and will produce an error message. I tried with var and saw the result without any issues. See an example of the error message from browsers:
Uncaught SyntaxError: Identifier 'params' has already been declared
Thanks! Aram (talk) 21:14, 30 April 2022 (UTC)
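An alternative worth noting (a sketch, not a documented fix): wrapping a pasted snippet in a block gives the let/const declarations a fresh scope on every run, so re-pasting it does not trigger the redeclaration error.
{
    // Each paste creates a new block scope, so the const binding is fresh
    // every time and no "already been declared" error occurs.
    const params = { action: "parse", page: "Sandbox", format: "json" };
    console.log(params);
}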
Unrecognized parameter: parsoid
[edit]Hello,
I am trying to use the "parsoid" parameter in a parse action API call in order to retrieve an HTML conversion of some wikitext that includes <section> ... </section> tags to wrap logical sections, as described in part of Specs/HTML/2.7.0.
Parsoid doesn't seem to work. I get the warning "Unrecognized parameter: parsoid." in my API response text. Moreover, a quick inspection of the API Help page of the wiki I am working on shows that, unlike the MediaWiki API Help page, it does not have "parsoid" listed as one of the available parameters. The wiki I am working on is at version 1.39.2.
Isn't Parsoid pre-packaged and automatically enabled with MediaWiki 1.39.2? Shouldn't it just work then? Is there a mismatch between the version of the documentation I am reading and the one available on MediaWiki? 217.67.225.138 12:03, 30 March 2023 (UTC)
mobileformat
[edit]Is the prop "mobileformat" stable?
Are there any plans to change/update the render output any time soon? TomerLerner (talk) 19:18, 7 March 2024 (UTC)