Extension talk:AddScript
XSS Vulnerability Description
Version 1.1 is vulnerable to XSS attacks. Consider the following:
- A malicious user creates a page called "EvilScript" and fills it with evil JavaScript (perhaps adding another <script> tag to the DOM, etc.).
- The malicious user then creates a page containing the following:
<addscript src='../index.php?title=EvilScript&action=raw&ctype=text/javascript'/>
- Note: This could even be the same page (for example, in a JS comment), so an admin going there for the first time to check it out/delete it could be hit immediately!
- This becomes the following in the document HEAD:
<script src="/wikiroot/jsscripts/../index.php?title=EvilScript...
- Note: Modern browsers will intelligently strip '/../' along with the parent dir when found in fetched URLs.
- An innocent visitor visits the containing page, which loads EvilScript - the user's account is compromised.
Hopefully the above will help the author correct this very serious problem. --Jimbojw 19:37, 29 March 2007 (UTC)
- So, is the fix for this sort of vulnerability to remove the '/../' in the user-provided string? Is this 'good enough', since any other URL must be part of the sysop-controlled 'jsscripts' path? Jean-Lou Dupont 12:37, 30 March 2007 (UTC)
- Disallowing the patterns '/./' and '/../' in the URL might be good enough - I'm not sure. If you intend for all the scripts to be directly under the 'jsscripts' directory (i.e. no subdirectories), then it might be sufficient to strip out all forward slashes.
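A whitelist is generally safer than stripping known-bad patterns. The sketch below illustrates the idea discussed above with a hypothetical validator (the function name is not from the AddScript extension): accept only plain file names and reject anything containing path separators or query syntax.

```php
<?php
// Illustrative sketch only: validate the user-supplied src value so it
// cannot escape the jsscripts directory. The function name is
// hypothetical, not part of AddScript.
function isSafeScriptName( $src ) {
    // Allow only letters, digits, underscore, and dash, with an optional
    // ".js" suffix. This forbids '/', '..', '?', quotes, and everything
    // else that could form a traversal or a crafted URL.
    return (bool)preg_match( '/^[A-Za-z0-9_\-]+(\.js)?$/', $src );
}

var_dump( isSafeScriptName( 'whatever.js' ) );                    // true
var_dump( isSafeScriptName( '../index.php?title=EvilScript' ) );  // false
```

Because the check is an allow-list, new attack syntaxes (backslashes, encoded slashes, and so on) are rejected by default rather than requiring another blacklist entry.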
- As a side note, if you continue allowing users to specify information that's to become part of a URL, you should really urlencode() the input. As it stands now, even if you solved the '/../' problem, people could still specify a double-quote to escape the src attribute, then specify other attributes within the <script> tag. I'm not sure how effective this would be as an attack vector, but it could be done.
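To make the attribute-escape concern concrete, here is a hedged sketch of building the tag safely; the function name and the `/wikiroot/jsscripts/` prefix are assumptions for illustration, not the extension's actual code. The name is URL-encoded first, then the whole URL is HTML-escaped when emitted, so a double-quote in the input cannot terminate the src="" attribute early:

```php
<?php
// Sketch only: encode the user-supplied name before it becomes part of
// a URL, then escape again for the HTML attribute context. Function
// name and path prefix are hypothetical.
function buildScriptTag( $name ) {
    $url = '/wikiroot/jsscripts/' . rawurlencode( $name ) . '.js';
    return '<script src="' . htmlspecialchars( $url, ENT_QUOTES ) . '"></script>';
}

// A double-quote in the input is neutralised to %22:
echo buildScriptTag( 'a"b' );
// <script src="/wikiroot/jsscripts/a%22b.js"></script>
```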
- Regarding a more robust solution to the current dilemma: since you're requiring file-system-level access in the first place (to put the scripts in the jsscripts dir), you could have a mapping of short-hand strings to full paths. For example, instead of this:
<addscript src="path/to/whatever.js" />
You could have the user provide this:
<addscript name="whatever" />
Then, in the configuration for the extension (LocalSettings.php), you'd have a mapping of names to paths. Continuing with this example, you'd use this:
$wgAddScriptMappings = array( 'whatever' => 'path/to/whatever.js' );
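The mapping idea above can be sketched in a few lines. `$wgAddScriptMappings` is the variable name proposed in the comment; the lookup function is a hypothetical illustration, not extension code:

```php
<?php
// Sketch of the name-to-path mapping approach: only names the sysop
// has explicitly listed in LocalSettings.php resolve to a path, so
// user input never reaches the filesystem or the URL directly.
$wgAddScriptMappings = array( 'whatever' => 'path/to/whatever.js' );

function resolveScriptName( $name, $mappings ) {
    // Anything not in the allow-list is rejected outright.
    return isset( $mappings[$name] ) ? $mappings[$name] : null;
}

var_dump( resolveScriptName( 'whatever', $wgAddScriptMappings ) ); // "path/to/whatever.js"
var_dump( resolveScriptName( 'EvilScript', $wgAddScriptMappings ) ); // NULL
```

The security benefit is that the wikitext-supplied string is used only as an array key, so no sanitisation of paths or URLs is needed at all.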
- Implementation of any or all of the above recommendations is left as an exercise for the reader :) --Jimbojw 20:44, 30 March 2007 (UTC)
What Jimbojw suggests is quite similar to the precautions I took with Extension:HTMLets. It's actually quite similar to this extension, only that it reads HTML from files and inserts it into pages directly, instead of referencing scripts via URL. But the code would look similar, I guess. -- Duesentrieb 23:52, 30 March 2007 (UTC)
- I have uploaded v1.2, please check it out. I have tested it on my home windows system as well as on my 1and1 Linux hosted site. Jean-Lou Dupont 01:10, 31 March 2007 (UTC)
Script doesn't work on bluehost.com...
...but with a small mod it works. Instead of
return file_exists( $bpath."/{$spath}" );
if you use
return file_exists( self::$base.$uri.".js" );
then things seem to work. I haven't had time to figure out which PHP settings cause this, maybe will come back to this later. - all the best, Adam
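One host-independent way to write this kind of existence check (a hedged sketch, not the extension's actual CheckURI code; all names here are illustrative) is to canonicalise both paths with realpath() before comparing, which also guards against traversal regardless of how the host resolves relative paths:

```php
<?php
// Hypothetical variant of a CheckURI-style test: canonicalise the base
// directory and the candidate file, then confirm the candidate really
// lives under the base. Names are illustrative, not from AddScript.
function scriptExistsUnder( $base, $uri ) {
    $baseReal = realpath( $base );
    $fileReal = realpath( $base . '/' . $uri . '.js' );
    if ( $baseReal === false || $fileReal === false ) {
        return false; // missing file or unresolvable path
    }
    // The resolved file must sit strictly inside the base directory.
    return strpos( $fileReal, $baseReal . DIRECTORY_SEPARATOR ) === 0;
}
```

Because realpath() collapses '/../' segments before the comparison, the check behaves the same whether or not the host's PHP configuration resolves relative paths the way the original code expected.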
Which version of PHP are you running on bluehost.com? There isn't much code behind the code line you are pointing to in your snippet... only 'dirname'-based statements... Jean-Lou Dupont 19:05, 21 April 2007 (UTC)
- The code snippets were regarding the CheckURI function in your code. My PHP version is 5.1.6 (although bluehost's default PHP is 4.something - I just asked to be upgraded so that MediaWiki would install). Do you want me to send you phpinfo()? --Amacieli 09:34, 24 April 2007 (UTC)
- Another question: Just noticed that headscripts are only loaded if I use &action=purge in the URL. I don't think this is a cache setting problem as my other wiki pages update just fine. What I'm doing that's not working is using <addscript src=mapcontrol /> as the first line in my wiki page. Any ideas? --Adam 12:19, 24 April 2007 (UTC)
- This is probably 'parser cache' related. Please use Parser Cache Control extension. Alternatively, give a try to version 1.3 of the extension. Jean-Lou Dupont 15:14, 24 April 2007 (UTC)
- What Adam is describing (with regards to the action=purge statement) is still the case with 1.3. The problem is that any modifications to $wgOut that a parser extension makes (even a hook to ParserBeforeTidy) will expire as soon as the output is cached.
- The good news is that there are ways around this without forcing a purge on every page view. I discuss one such method in my article Doing more with MediaWiki parser extensions. Hope this helps :) --Jimbojw 18:38, 24 April 2007 (UTC)
- I went too fast in modifying the code for releasing v1.3 (release 1.3 still works OK but wasn't really necessary): I had analysed the problem and came to the conclusion that the answer was Extension:ParserCacheControl *if* one wants to retain all the power of the 'parser cache' functionality without going through additional hoops (thanks anyway, Jimbojw) Jean-Lou Dupont 18:53, 24 April 2007 (UTC)
- No problem. Another option would be to just expire the cache yourself. I believe this is possible to do from the parser tag code. In fact, there used to be an extension - called PurgePage? - that added a tag called <purge> which would do just that. I'm not sure if PurgePage is still around or not.
- For what it's worth, I strongly encourage you to find a way to not force the user to constantly purge/bypass the cache. For large, popular pages on wikis with appreciable amounts of traffic, it may not be feasible to disable caching due to performance losses. --Jimbojw 22:09, 24 April 2007 (UTC)
- I gave Jimbojw's code a whirl, and it works as advertised, you can just refresh your browser without purging, and the head script stays. I just replaced the addMeta line with addScript so that I got what I needed. Thanks for all your help, Jimbojw and JLD! --Adam 02:29, 25 April 2007 (UTC)
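For readers landing here later, the "expire the cache yourself" option Jimbojw mentions could look roughly like the fragment below. This is a hedged sketch in the classic 2007-era extension style, not the PurgePage extension's actual code; verify that Parser::disableCache() exists in your MediaWiki version before relying on it.

```php
<?php
// Sketch of a <purge>-style parser tag that marks the page's output as
// uncacheable. Hook and function names here are illustrative.
$wgExtensionFunctions[] = 'wfPurgeTagSetup';

function wfPurgeTagSetup() {
    global $wgParser;
    $wgParser->setHook( 'purge', 'wfPurgeTagRender' );
}

function wfPurgeTagRender( $input, $args, $parser ) {
    $parser->disableCache(); // this page's parser output will not be cached
    return '';
}
```

As Jimbojw notes, disabling the cache this way trades correctness for performance, so it is only suitable for pages that are not heavily trafficked.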
Multiple Instances
I'm finding that in some cases the feedScripts() function is called multiple times due to another extension. When this happens, the <script> link is added multiple times. My solution is to add $this->slist = array(); to the end of the feedScripts() function, just before the return call. --Hypercubed 08:07, 20 January 2008 (UTC)
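Hypercubed's fix can be demonstrated in isolation. The class below is a standalone sketch, not the extension's actual code (only the $slist field and feedScripts() name mirror the discussion): without the reset, a second call would re-emit the same <script> tags.

```php
<?php
// Standalone illustration of the duplicate-script problem and the fix:
// clear the queued-script list after flushing it, so a repeated call
// to feedScripts() emits nothing instead of duplicating the tags.
class ScriptFeeder {
    private $slist = array();

    public function queueScript( $src ) {
        $this->slist[] = $src;
    }

    public function feedScripts() {
        $out = '';
        foreach ( array_unique( $this->slist ) as $src ) {
            $out .= '<script src="' . htmlspecialchars( $src, ENT_QUOTES ) . '"></script>';
        }
        $this->slist = array(); // the fix: reset before returning
        return $out;
    }
}
```

The array_unique() call additionally guards against the same script being queued twice before the first flush, which covers both ways the duplicate tags could arise.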