Manual:Creating a bot
Robots or bots are automatic processes that interact with Wikipedia (and other Wikimedia projects) as though they were human editors. This page attempts to explain how to carry out the development of a bot for use on Wikimedia projects; much of this is transferable to other wikis based on MediaWiki. The explanation is geared mainly towards those who have some prior programming experience, but are unsure of how to apply this knowledge to creating a Wikipedia bot.
Why would I need to create a bot?
Bots can automate tasks and perform them much faster than humans. If you have a simple task that you need to perform lots of times (an example might be to add a template to all pages in a category with 1000 pages), then this is a task better suited to a bot than a human.
Considerations before creating a bot
There are a number of semi-bots available to anyone. Most of these take the form of enhanced web browsers with MediaWiki-specific functionality. The most popular of these is AutoWikiBrowser (AWB), a browser specifically designed to assist with editing on Wikipedia and other Wikimedia projects. A complete list of semi-bots can be found at Manual:Tools/Editing tools. Semi-bots, such as AWB, can often be operated with little or no understanding of programming.
If you decide you need a bot of your own due to the frequency or novelty of your requirements, you don't need to write one from scratch. Many bots publish their source code, which can sometimes be reused with little additional development time. There are also a number of standard bot frameworks available for download. These frameworks comprise the vast majority of a bot's code; because they are in common use and their complex core code has been written and heavily tested by others, it is far easier to get bots based on them approved for use. The most popular and common of these frameworks is Pywikibot (PWB), a bot framework written in Python. It is well documented and tested, and in addition to the framework itself, many standardized scripts (bot instructions) are available for it. Other examples of bot frameworks can be found below. For some of these frameworks, such as PWB, a general familiarity with scripts is all that is necessary to run the bot successfully; the framework is frequently updated, and it is important to apply these updates regularly.
If you wish to write a new bot, be aware that it may require significant programming ability and a completely new bot will be required to undergo substantial testing before it will be approved for regular operation. Planning is crucial to obtain an error-free, efficient, and effective program. The following initial considerations are important:
- Will the bot be manually assisted or fully automated?
- Will you create the bot alone, or with the help of other programmers?
- What language will be used to implement the bot?
- Will the bot's requests, edits, or other actions be logged? If so, will the logs be stored on local media, or on wiki pages?
- Will the bot run inside a web browser (for example, written in JavaScript), or will it be a standalone program?
- If the bot is a standalone program, will it run on your local computer, or on a remote server such as the Wikimedia Labs?
- If the bot runs on a remote server, will other editors be able to operate the bot or start it running?
How does a MediaWiki bot work?
Overview of operation
Just like a human editor, a MediaWiki bot reads wiki pages, and makes changes where it thinks changes need to be made. The difference is that although bots are faster and less prone to fatigue than humans, they are nowhere near as bright as we are. Bots are good at repetitive tasks that have easily defined patterns, where few decisions have to be made.
In the most typical case, a bot logs in to its own account and requests pages from the wiki in much the same way as a browser does – although it does not display the page on screen, but works on it in memory – and then programmatically examines the page code to see if any changes need to be made. It then makes and submits whatever edits it was designed to do, again in much the same way a browser would.
Because bots access pages the same way people do, bots can experience the same kind of difficulties that human users do. They can get caught in edit conflicts, have page timeouts, or run across other unexpected complications while requesting pages or making edits. Because the volume of work done by a bot is larger than that done by a live person, the bot is more likely to encounter these issues. Thus, it is important to consider these situations when writing a bot.
APIs for bots
In order to make changes to wiki pages, a bot necessarily has to retrieve pages from the wiki and send edits back. There are several Application Programming Interfaces (APIs) available for that purpose.
- MediaWiki API (api.php). This interface was specifically written to permit automated processes such as bots to make queries and post changes. Data is available in many different machine-readable formats (JSON, XML, YAML, ...). Features have been fully ported from the older Query API interface.
- Status: Available on all Wikimedia projects, with a very complete set of queries. The ability to edit pages via api.php has also been enabled on all Wikimedia projects, enabling bots to operate entirely without screen scraping.
- There is also an API sandbox for those wanting to test api.php's features.
- Screen scraping (index.php). Screen scraping involves requesting a Wikipedia page, looking at the raw HTML code (what you would see if you clicked View → Source in most browsers), and then analyzing the HTML for patterns. There are certain problems with this approach: the Wikipedia interface can change without notice, which may break the bot code, and calling for HTML creates a larger server load than processing the wikitext itself. There is basically no reason to use this technique anymore.
- Status: Deprecated.
- Special:Export can be used to obtain bulk export of page content in XML form. See Manual:Parameters to Special:Export for arguments;
- Status: Built-in feature of MediaWiki, available on all Wikimedia servers.
- Raw (Wikitext) page processing: sending an action=raw or action=raw&templates=expand GET request to index.php will give the unprocessed wikitext source code of a page. An API query with prop=revisions&rvprop=content or prop=revisions&rvprop=content&rvexpandtemplates=1 is roughly equivalent, and allows for retrieving additional information.
- Status: Built-in feature of MediaWiki, available on all Wikimedia servers.
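For illustration, here is a minimal Python sketch that retrieves the wikitext of a page both ways. It uses the third-party requests library (a common choice, not something mandated by the API); the page title is an arbitrary example:

```python
import requests

TITLE = "Sandbox"  # arbitrary example page

# Raw wikitext via index.php with action=raw
raw = requests.get(
    "https://en.wikipedia.org/w/index.php",
    params={"title": TITLE, "action": "raw"},
).text

# Roughly equivalent API query (other formats such as XML are available)
reply = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": TITLE,
        "format": "json",
    },
).json()
page = next(iter(reply["query"]["pages"].values()))
wikitext = page["revisions"][0]["*"]  # content sits under the "*" key in this JSON format
```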
Some web servers are configured to serve compressed (gzip) content on request. This can be done by including a line "Accept-Encoding: gzip" in the HTTP request header; if the HTTP reply header contains "Content-Encoding: gzip", the document is in gzip form; otherwise, it is in the regular uncompressed form. Note that this is specific to the web server and not to the MediaWiki software. Other sites employing MediaWiki may not have this feature. If you are using an existing bot framework, it should handle low-level operations like this.
Logging in
Approved bots need to be logged in to make edits. Although a bot can make read requests without logging in, bots that have completed testing should log in for all activities. Bots logged in from an account with the bot flag can obtain more results per query from the MediaWiki API (api.php). Most bot frameworks should handle login and cookies automatically, but if you are not using an existing framework, you will need to follow these steps.
For security, login data must be passed using the HTTP POST method. Because the parameters of HTTP GET requests are easily visible in the URL, logins via GET are disabled.
To log a bot in using the MediaWiki API, 2 POST requests are needed:
- Request 1
- URL:
http://en.wikipedia.org/w/api.php?action=login&format=xml
- POST parameters:
lgname=BOTUSERNAME
lgpassword=BOTPASSWORD
If the password is correct, this will return a "NeedToken" result and a "token" parameter in XML form, as documented at API:Login. Other output formats are available. It will also return HTTP cookies as described below.
- Request 2
- URL:
http://en.wikipedia.org/w/api.php?action=login&format=xml
- POST parameters:
lgname=BOTUSERNAME
lgpassword=BOTPASSWORD
lgtoken=TOKEN
where TOKEN is the token from the previous result. The HTTP cookies from the previous request must also be passed with the second request.
A successful login attempt will result in the Wikimedia server setting several HTTP cookies. The bot must save these cookies and send them back every time it makes a request (this is particularly crucial for editing). On the English Wikipedia, the following cookies should be used: enwikiUserID, enwikiToken, and enwikiUserName. The enwiki_session cookie is required to actually send an edit or commit some change, otherwise the MediaWiki:Session fail preview error message will be returned.
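A minimal sketch of this two-step login in Python, using the third-party requests library; a requests.Session object saves the cookies from the first request and sends them back with the second, as required. The credentials are placeholders:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()  # keeps cookies across requests

# Request 1: returns a "NeedToken" result plus a token and cookies
r1 = session.post(API, data={
    "action": "login",
    "format": "json",
    "lgname": "BOTUSERNAME",      # placeholder
    "lgpassword": "BOTPASSWORD",  # placeholder
})
token = r1.json()["login"]["token"]

# Request 2: repeat the login with the token; the session resends the cookies
r2 = session.post(API, data={
    "action": "login",
    "format": "json",
    "lgname": "BOTUSERNAME",
    "lgpassword": "BOTPASSWORD",
    "lgtoken": token,
})
assert r2.json()["login"]["result"] == "Success"
```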
Editing; edit tokens
Wikipedia uses a system of edit tokens for making edits to Wikipedia pages, as well as other operations that modify existing content such as rollback. The token looks like a long hexadecimal number followed by '+\', for example:
- d41d8cd98f00b204e9800998ecf8427e+\
The role of edit tokens is to prevent "edit hijacking", where users are tricked into making an edit by clicking a single link.
The editing process involves two HTTP requests. First, a request for an edit token must be made. Then, a second HTTP request must be made that sends the new content of the page along with the edit token just obtained. It is not possible to make an edit in a single HTTP request. An edit token remains the same for the duration of a logged-in session, so the edit token needs to be retrieved only once and can be used for all subsequent edits.
To obtain an edit token, follow these steps:
- MediaWiki API (api.php). Make a request with the following parameters (see API:Edit - Create&Edit pages).
action=query
prop=info
titles=PAGENAME
intoken=edit
The token will be returned in the edittoken attribute of the response.
If the edit token the bot receives does not contain the hexadecimal string (i.e., the edit token is just '+\') then the bot most likely is not logged in. This might be due to a number of factors: failure in authentication with the server, a dropped connection, a timeout of some sort, or an error in storing or returning the correct cookies. If it is not because of a programming error, just log in again to refresh the login cookies. Bots may use the Assert Edit extension to make sure that they are logged in.
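Continuing the login sketch above (reusing its session and API variables), this is roughly how the token request and the '+\' check might look under the intoken=edit scheme described here; PAGENAME is a placeholder:

```python
# Reuse the logged-in session from the login sketch above
reply = session.get(API, params={
    "action": "query",
    "prop": "info",
    "intoken": "edit",
    "titles": "PAGENAME",  # placeholder title
    "format": "json",
}).json()
page = next(iter(reply["query"]["pages"].values()))
edit_token = page["edittoken"]

# A bare '+\' token means the session is not actually logged in
if edit_token == "+\\":
    raise RuntimeError("Not logged in; refresh the login cookies")
```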
Edit conflicts
Edit conflicts occur when multiple, overlapping edit attempts are made on the same page. Almost every bot will eventually get caught in an edit conflict of one sort or another, and should include some mechanism to test for and accommodate these issues.
Bots that use the MediaWiki API (api.php) should retrieve the edit token, along with the starttimestamp and the last revision "base" timestamp, before loading the page text in preparation for the edit; prop=info|revisions can be used to retrieve both the token and the page contents in one query (example). When submitting the edit, set the starttimestamp and basetimestamp attributes and check the server responses for indications of errors. For more details, see API:Edit - Create&Edit pages.
Generally speaking, if an edit fails to complete the bot should check the page again before trying to make a new edit, to make sure the edit is still appropriate. Further, if a bot rechecks a page to resubmit a change, it should be careful to avoid any behavior that could lead to an infinite loop and any behavior that could even resemble edit warring.
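Sketching the pattern just described, again with the session from the login example: one query fetches the token, timestamps, and content; the edit then carries basetimestamp and starttimestamp so the server can detect a conflict. The text transformation and edit summary are placeholders:

```python
# One query for the edit token, timestamps, and current content
reply = session.get(API, params={
    "action": "query",
    "prop": "info|revisions",
    "intoken": "edit",
    "rvprop": "content|timestamp",
    "titles": "PAGENAME",  # placeholder
    "format": "json",
}).json()
page = next(iter(reply["query"]["pages"].values()))
token = page["edittoken"]
basetimestamp = page["revisions"][0]["timestamp"]
starttimestamp = page["starttimestamp"]
text = page["revisions"][0]["*"]

new_text = text.replace("foo", "bar")  # placeholder transformation

result = session.post(API, data={
    "action": "edit",
    "title": "PAGENAME",
    "text": new_text,
    "summary": "Bot: example edit",  # placeholder summary
    "basetimestamp": basetimestamp,
    "starttimestamp": starttimestamp,
    "token": token,
    "format": "json",
}).json()
if result.get("error", {}).get("code") == "editconflict":
    # Someone edited in between: re-read the page and decide whether
    # the change is still appropriate before retrying
    pass
```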
Overview of the process of developing a bot
Writing the code is only one part of developing a bot.
Idea
The first task in creating a MediaWiki bot is extracting the requirements or coming up with an idea.
Specification
- Specification is the task of precisely describing the software to be written, possibly in a rigorous way. You should come up with a detailed proposal of what you want it to do. Try to discuss this proposal with some editors and refine it based on feedback. Even a great idea can be made better by incorporating ideas from other editors.
- In the most basic form, your specified bot must meet the following criteria:
- The bot is harmless (it must not make edits that could be considered disruptive to the smooth running of the encyclopedia)
- The bot is useful (it provides a useful service more effectively than a human editor could)
- The bot does not waste server resources.
Software architecture
- Think about how you might create it and which programming language(s) and tools you would use. Architecture is concerned with making sure the software system will meet the requirements of the product, as well as ensuring that future requirements can be addressed. Certain programming languages are better suited to some tasks than others; for more details, see the section on programming languages below.
Implementation
Implementation (or coding) involves turning design and planning into code. It may be the most obvious part of the software engineering job, but it is not necessarily the largest portion. In the implementation stage you should:
- Create an account for your bot. Your bot's edits must not be made under your own account; the bot needs its own account, with its own username and password. Create the bot's account while logged in to your own account, so that the two accounts are linked.
- Create a user page for your bot.
- Add the same information to the user page of the bot. It would be a good idea to add a link to the approval page (whether approved or not) for each function.
- Code your bot in your chosen programming language.
Testing
A good way of testing your bot as you are developing is to have it show the changes (if any) it would have made to a page, rather than actually editing the live wiki. Some bot frameworks (such as pywikibot) have pre-coded methods for showing diffs.
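For example, a dry-run helper can be built from Python's standard difflib module; this hypothetical preview_edit function prints a unified diff instead of saving:

```python
import difflib

def preview_edit(old_text: str, new_text: str, title: str) -> None:
    """Print the changes the bot would make, without saving them."""
    diff = difflib.unified_diff(
        old_text.splitlines(),
        new_text.splitlines(),
        fromfile=f"{title} (current)",
        tofile=f"{title} (proposed)",
        lineterm="",
    )
    print("\n".join(diff))

preview_edit("Some foo text.", "Some bar text.", "PAGENAME")
```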
Documentation
An important (and often overlooked) task is documenting the internal design of your bot for the purpose of future maintenance and enhancement. This is especially important if you are going to allow clones of your bot. Ideally, you should post the source code of your bot on its userpage or in a revision control system (see #Open-source bots) if you want others to be able to run clones of it. This code should be well documented (usually using comments) for ease of use.
Maintenance
Maintaining and enhancing your bot to cope with newly discovered bugs or new requirements can take far more time than the initial development of the software. Not only may it be necessary to add code that does not fit the original design, but just determining how software works at some point after it is completed may require significant effort (this is another reason to document your code as you go along).
General guidelines for running a bot
In addition to the official bot policy, which covers the main points to consider, there are a number of more general advisory points to keep in mind when developing your bot.
Bot best practices
- Set a custom User-Agent header for your bot (per the Wikimedia User-Agent policy, if your bot will be operating on Wikimedia wikis).
- Use the maxlag parameter with a maximum lag of 5 seconds. This will enable the bot to run quickly when server load is low, and throttle the bot when server load is high. (A sketch combining maxlag with error back-off appears after this list.)
- If writing a bot in a framework that does not support maxlag, limit the total requests (read and write requests together) to no more than 10/minute.
- Use the API whenever possible, and set the query limits to the largest values that the server permits, to minimize the total number of requests that must be made.
- Edit (write) requests are more expensive in server time than read requests. Be edit-light and design your code to keep edits to a minimum.
- Try to consolidate edits. One single large edit is better than 10 smaller ones.
- Enable HTTP persistent connections and compression in your HTTP client library, if possible.
- Do not make multi-threaded requests. Wait for one server request to complete before beginning another.
- Back off upon receiving errors from the server. Errors such as squid timeouts are often an indication of heavy server load. Use a sequence of increasingly longer delays between repeated requests.
- Make use of the Assert Edit extension, an extension explicitly designed for bots to check certain conditions, which is enabled on Wikipedia.
- Test your code thoroughly before making large automated runs. Individually examine all edits on trial runs to verify they are perfect.
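The following Python sketch (using the third-party requests library) combines the maxlag and back-off advice above; the retry limit and delays are arbitrary example values:

```python
import time
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()

def api_request(params, max_retries=5):
    """Send an API request with maxlag=5, backing off on errors."""
    params = dict(params, maxlag=5, format="json")
    for attempt in range(max_retries):
        try:
            reply = session.get(API, params=params).json()
        except requests.RequestException:
            reply = None  # network-level error; retry below
        if reply is not None:
            error = reply.get("error")
            if error is None:
                return reply
            if error.get("code") != "maxlag":
                raise RuntimeError(error)
        # Use a sequence of increasingly longer delays between repeated requests
        time.sleep(2 ** attempt)
    raise RuntimeError("Server overloaded or unreachable; giving up")
```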
Common bot features you should consider implementing
Manual assistance
If your bot is doing anything that requires judgment or evaluation of context (e.g., correcting spelling) then you should consider making your bot manually-assisted, which means that a human verifies all edits before they are saved. This significantly reduces the bot's speed, but it also significantly reduces errors.
Disabling the bot
It is good bot policy to have a feature to disable the bot's operation if it is requested. Remember that if your bot goes bad, it is your responsibility to clean up after it! You could have the bot refuse to run if a message has been left on its talk page, on the assumption that the message may be a complaint against its activities; this can be checked using the API meta=userinfo query (example). Or you could have a page that will turn the bot off if text on the page is changed (e.g. require the page be empty, contain only the word "True", or something similar); this can be checked by loading the page contents before each edit. A sketch of both checks appears below.
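A rough Python sketch of both shutoff checks, reusing the session from the earlier examples; the stop-page title is an invented example:

```python
def should_stop(session, api_url):
    """Return True if the bot has new talk-page messages, or if the
    on-wiki stop page no longer says "True"."""
    # Check for new talk page messages via meta=userinfo
    reply = session.get(api_url, params={
        "action": "query",
        "meta": "userinfo",
        "uiprop": "hasmsg",
        "format": "json",
    }).json()
    if "messages" in reply["query"]["userinfo"]:
        return True

    # Check an on-wiki stop page (invented example title)
    reply = session.get(api_url, params={
        "action": "query",
        "prop": "revisions",
        "rvprop": "content",
        "titles": "User:ExampleBot/Run",
        "format": "json",
    }).json()
    page = next(iter(reply["query"]["pages"].values()))
    content = page.get("revisions", [{}])[0].get("*", "")
    return content.strip() != "True"
```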
Open-source bots
Many bot operators choose to make their code open source, and occasionally it may be required before approval for particularly complex bots. Making your code open source has several advantages:
- It allows others to review your code for potential bugs. As with prose, it is often difficult for the author of code to adequately review it.
- Others can use your code to build their own bots. A user new to bot writing may be able to use your code as an example or a template for their own bots.
- It encourages good security practices, rather than security through obscurity.
- If you leave the project, it allows other users to run your bot tasks without having to write new code.
Open-source code, while rarely required, is typically encouraged in keeping with the open and transparent nature of wikis, though there are some cases when code should not be made public. For example, the open proxy-finding code of ProcseeBot could be used for malicious purposes on other sites.
Making code open source can add some extra work to coding. One has to make sure that sensitive information such as passwords is separated into a file that isn't made public.
There are several options available for users wishing to make their code open. Some users choose to put the code in a subpage of the bot's userspace, although this can be a hassle to maintain if not automated and results in the code being multi-licensed under the wiki's licensing terms in addition to any other terms you may specify. Another solution is to use a revision control system such as SVN, Git, or Mercurial. Wikipedia has articles comparing the different software options and websites for code hosting, many of which have no cost. The Wikimedia Toolserver also offers SVN hosting for its users.
Programming languages and libraries
- See also: API:Client code
Bots can be written in almost any programming language. The choice of a language often depends on the experience of the bot writer (which languages are familiar) or on the availability of pre-developed libraries to perform the desired task. The following list includes some languages that have libraries to assist with bot tasks.
Perl
Perl has a run-time compiler. This means that it is not necessary to compile builds of your code yourself as it is with other programming languages. Instead, you simply create your program using a text editor such as vim. You then run the code by passing it to an interpreter. This can be located either on your own computer or on a remote computer (webserver). If located on a webserver, you can start your program running and interface with your program while it is running via the Common Gateway Interface from your browser. Perl is available for most operating systems, including Microsoft Windows, Mac OS X and UNIX/Linux. If your internet service provider provides you with webspace, the chances are good that you have access to a perl build on the webserver from which you can run your Perl programs.
Guides to getting started with Perl programming:
- A Beginner's Introduction to Perl
- CGI Programming 101: Learn CGI Today!
- Perl lessons
- Get started learning Perl
Libraries:
- MediaWiki::Bot – A fairly complete MediaWiki bot framework written in Perl. Provides a higher level of abstraction than MediaWiki::API. Plugins provide administrator and steward functionality.
- Mediawiki::API – a library by CBM with robust automatic error handling and wrappers for many common API.php uses. This is not the same as the library on CPAN.
PHP
PHP can also be used for programming bots. MediaWiki developers are already familiar with PHP, since that is the language MediaWiki and its extensions are written in. PHP is an especially good choice if you wish to provide a webform-based interface to your bot. For example, suppose you wanted to create a bot for renaming categories. You could create an HTML form into which you will type the current and desired names of a category. When the form is submitted, your bot could read these inputs, then edit all the articles in the current category and move them to the desired category. (Obviously, any bot with a form interface would need to be secured somehow from random web surfers.)
The PHP bot functions table may provide some insight into the capabilities of the major bot frameworks.
Key people | Name | PHP version | Last update | Uses API | Exclusion compliant | Admin functions | Plugins | Repository | Notes |
---|---|---|---|---|---|---|---|---|---|
Adam | BasicBot | 5 or 4 | 2007 | Unknown | Unknown | No | No | wikisum.com | Fairly out of date |
w:User:Cyberpower678, w:User:Addshore, and w:User:Jarry1250 | Peachy | 5.2.1 | 2014 | Yes | Yes | Yes | Yes | GitHub | Large framework, currently undergoing rewrite. Documentation currently non-existent, so poke w:User:Cyberpower678 for help. |
w:User:Addshore | mediawiki-api-base | 5.3 | 2014 | Yes | N/A | N/A | extra libs | GitHub | Base library for interaction with the MediaWiki API; provides ways to handle logging in and out, token handling, and easily getting and posting requests. |
w:User:Addshore | mediawiki-api | 5.3 | 2014 | Yes | No | some | extra libs | GitHub | Built on top of mediawiki-api-base, this adds more advanced services for the API such as RevisionGetter, UserGetter, PageDeleter, RevisionPatroller, RevisionSaver, etc. |
Kaspo | Phpwikibot | Unknown | 2009 | Partial | No | No | No | Google Code | Uses a single class. |
Jarry1250 | Wikibot | 5 | 2009 | Yes | Yes | No | No | enwiki | Used solely by LivingBot. A fork of Phpwikibot. Uses a single class. |
Foxy Loxy | PHPediaWiki | 5 | 2009 | Yes | No | Yes | No | SourceForge | Fork of SxWiki |
Sam Korn | Pillar | 5 | 2009 | Yes | Yes | Yes | No | Google Code | MIT license |
w:User:nzhamstar | Wikimate | 5.3.2 | 2014 | Yes | No | No | No | GitHub | Supports the main article tasks: authentication, reading and editing pages/sections, checking whether pages exist. Tested and working. Aims to be easy to use. |
Григор Гачев | Apibot | 5.1 | 2014 | Yes | Yes | Yes | Yes | Latest development code | Full API support up to MW 1.21 inclusive, gzipped transfers, HTTPS, HTTP auth, GET sorting, auto site/user/paraminfo caching and usage, page bot exclusion compliance, close to 1000 functions, DB support, etc. Easily extendable modular structure. A UNIX-like overlaid 'assembly line' framework. AGPL 3.0 or later. |
Kaleb Heitzman | MediaWIkiBot | 5 | 2012 | Yes | No | No | No | GitHub | Supports the entire API including uploading and importing. Also supports Semantic MediaWiki. Single Class that creates dynamic methods to work with any of the API calls. |
Edward Z. Yang | Wikipedia Bot in PHP | Unknown | 2005 | No | No | No | No | enwiki | "Probably stale" source code |
GeorgeMoney | Bot Framework | 5 or 4 | 2006 | Unknown | Unknown | Unknown | No | enwiki | Hard to get hold of current code |
Cobi | wikibot.classes | 5 | 2010 | Yes | Yes | No | No | enwiki | Used by multiple large bots (e.g. ClueBot and SoxBot). Uses several classes. |
Chris G | botclasses.php | 5 | 2012 | Yes | Yes | Yes | No | Toolserver | Fork of wikibot.classes. Updated for 2010 API changes. Supports file uploading. |
Python
Python is a popular interpreted language with object-oriented features.
- Getting started with Python
- Official Python tutorial
- Beginner's Guide to coding in python
- Dive Into Python
- Non-Programmer's Tutorial for Python 2.6 on w:Wikibooks
Libraries:
- Pywikibot – The most used Python bot framework.
- wikitools – A lightweight bot framework that uses the MediaWiki API exclusively for getting data and editing, used and maintained by Mr.Z-man (downloads)
- mwclient – An API-based framework maintained by Bryan
- mwparserfromhell – A Python parser for MediaWiki text, maintained by The Earwig
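As a quick taste of the framework, here is a minimal Pywikibot sketch. It assumes Pywikibot is installed and a user-config.py has been generated; the page title and summary are example values:

```python
import pywikibot

site = pywikibot.Site("en", "wikipedia")  # requires a configured user-config.py
page = pywikibot.Page(site, "Sandbox")    # example page title

text = page.text                          # fetch the current wikitext
page.text = text + "\n<!-- example edit -->"
page.save(summary="Bot: example edit")    # example summary
```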
Microsoft .NET
Microsoft .NET is a set of languages including C#, C++/CLI, Visual Basic .NET, J#, JScript .NET, IronPython, and Windows PowerShell. The Microsoft Visual Studio integrated development environment is often used, or the free Microsoft Visual Studio Express versions. Using Mono Project, .NET programs can also run on Linux, Unix, BSD, Solaris and Mac OS X as well as under Windows.
Getting started:
- Add links here!
- MSDN Visual Basic portal
- MSDN Visual C# portal
Libraries:
- DotNetWikiBot Framework – a full-featured client API for .NET that makes it easy to build programs and web robots to manage information on MediaWiki-powered sites. Now translated into several languages. Detailed compiled documentation is available in English.
- WikiFunctions .NET library – Bundled with AWB, this is a library of functionality useful for bots, such as generating lists, loading and editing articles, connecting to the recent-changes IRC channel, and more.
- WikiAccess library
- MediaWikiEngine, used by Commonplace upload tool
- Tyng.MediaWiki class library, a MediaWiki API written in C# used by NrhpBot
- LinqToWiki, strongly typed library for accessing most of MediaWiki API, with support for autocompletion
Java
Java programs are generally developed with an IDE, such as Eclipse or NetBeans; development using a command line console (with the javac and java programs) is also an option.
Getting started:
Libraries:
JavaScript
JavaScript is a scripting language used mainly on web pages. JavaScript can be used to enhance Wikipedia by adding scripts to your vector.js or your monobook.js pages. In some circumstances it is also possible to execute scripts off-site.
Libraries:
- A MediaWiki module exists for Node.js. The module can also be added to your Wikimedia .js page and used as a library for on-wiki JS calls. It provides a framework of standard requests (e.g. log in, log out, etc.) as well as a general wrapper method for the MediaWiki API, and includes throttling.
Ruby
Ruby is a popular dynamic, object-oriented programming language. It is designed for programmer productivity and fun.
Libraries:
- MediaWiki::Gateway – Ruby framework for the API. No longer in active development, tested up to MediaWiki 1.22, compatible with Wikimedia wikis.
- wikipedia-client – Ruby framework using the API.
- mediawiki/ruby/api – Ruby API client library, in active development as of April 2014.
Chicken Scheme
Iron Chicken is an extension or "egg" for Chicken Scheme that makes the MediaWiki API programmable using s-expressions, and presents API and HTML output as SXML that can be queried easily.
A simple example that gets members of a category and writes them to a page in the client user's userspace is:
Libraries:
Common Lisp
- CL-MediaWiki implements the MediaWiki API as a Common Lisp package. It is planned to use JSON as a query data format. Supports the maxlag parameter and the assert edit extension.
Haskell
Tcl
- MediaWiki Tcl Bot Framework, includes IRC-RC Interface
C++/Qt4.5
- [1] A simple MediaWiki bot, written in C++/Qt, with only a few functions such as logging in, getting page source, and saving a page. (The sources are published on a Russian-language forum – see the attachment to the topic.)
VBScript
VBScript is a scripting language based on the Visual Basic programming language. A VBScript engine is installed by default on most Windows machines, making installation of additional frameworks unnecessary (and making distribution of scripts easy). VBScript is used by many system administrators as an automation tool. It can be used to control Internet Explorer to automate repetitive tasks. Alternatively, you can use MSHTML (the Trident layout engine) in VBScript with COM.
Getting started:
- VBScript, MSDN
- Windows Script 5.6 Documentation, Microsoft
- InternetExplorer Object, MSDN
- Document Object, MSDN
- Displaying Data by Using Internet Explorer, TechNet
- MSHTML Reference, MSDN
Examples:
- w:User:Smallman12q/Scripts/cleanuplistingtowiki - Login and give preview of edit
- w:User:Smallman12q/VBS/Savewatchlist - Login, get raw watchlist, save to file, logout, close IE
- w:Commons:User:Smallbot#Sources - Several scripts showing the usage of VBScript (Javascript, XMLHTTP, MSHTML, XMLDOM, COM) for batch uploads.
Google Apps Script
Google Apps Script is a cloud-based scripting language for lightweight application development on the Google Apps platform. It requires no installation of any kind and can be used to create a bot. Example:
- NeechalBOT source code – a Wikipedia bot written in Google Apps Script
Bash
Bash is a Unix shell.
- See API:Client_code/Bash. Requires the cURL package.