mediawiki-api

How to get the Infobox from a Wikipedia article with the MediaWiki API?

Question: Wikipedia articles may have Infobox templates. With the following call I can get the first section of an article, which includes the Infobox:

http://en.wikipedia.org/w/api.php?action=parse&pageid=568801&section=0&prop=wikitext

What I want is a query that returns only the Infobox data. Is this possible?

Answer 1: You can do it with a URL call to the Wikipedia API like this:

http://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&format=xmlfm&titles=Scary%20Monsters%20and%20Nice…
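The answer above is cut off, so here is a minimal PHP sketch of one common approach (an assumption, not the original answer's code): fetch the section 0 wikitext with action=parse and cut out the {{Infobox ...}} template by matching braces. The User-Agent contact and the simplified brace matching are illustrative.

    <?php
    // Sketch: fetch section 0 wikitext, then extract the {{Infobox ...}} template.
    // Brace matching is simplified; the contact address is a placeholder.
    $url = 'https://en.wikipedia.org/w/api.php?action=parse&pageid=568801'
         . '&section=0&prop=wikitext&format=json';
    $ctx = stream_context_create(['http' => [
        'header' => "User-Agent: InfoboxDemo/1.0 (you@example.com)\r\n",
    ]]);
    $data = json_decode(file_get_contents($url, false, $ctx), true);
    $wikitext = $data['parse']['wikitext']['*'];

    $start = stripos($wikitext, '{{Infobox');
    if ($start !== false) {
        $depth = 0;
        for ($i = $start; $i < strlen($wikitext) - 1; $i++) {
            $pair = substr($wikitext, $i, 2);
            if ($pair === '{{')     { $depth++; $i++; }
            elseif ($pair === '}}') { $depth--; $i++; }
            if ($depth === 0) break;
        }
        echo substr($wikitext, $start, $i - $start + 1);
    }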

How to use the MediaWiki API to create a table on a wiki page in PHP

Question: I intend to use the MediaWiki API to create a table on a wiki page. In MediaWiki, the syntax for creating a table looks like the following:

{|class="wikitable sortable"
!'''Full Name'''||'''Short Name'''
|-
|MediaWiki||MW
|}

When I called the MediaWiki API, I stored the content I want to add to the page in a string:

$content = "{|class='wikitable sortable'<br>!'''Full Name'''||'''Short Name'''<br>|-<br>|MediaWiki||MW|}";

I intended to use "br" to make line breaks, but failed to create a…
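The question is cut off above, but the usual fix for exactly this symptom is that wikitext table markup is line-oriented: the |, |- and ! markers are only recognised at the start of a line, so the string needs real newline characters ("\n"), not <br> tags. A sketch of that assumption:

    <?php
    // Sketch: build the table wikitext with literal newlines instead of <br>,
    // since MediaWiki only recognises row/cell markers at the start of a line.
    $content = "{|class='wikitable sortable'\n"
             . "!'''Full Name'''||'''Short Name'''\n"
             . "|-\n"
             . "|MediaWiki||MW\n"
             . "|}";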

How can I access a JS object property whose name is an unknown integer?

Question: The JSON response returned from this Wikipedia API call is a series of nested objects. To travel down the object property chain and access the text of interest, I first have to access a property whose name is a random number that depends on the Wikipedia page I query by title. An example for the page titled "San%20Francisco" (page id = 49728):

Object property chain: responseJSON.wiki[0].query.pages[<<page id>>].extract

Example API call: https://en.wikipedia.org/w/api.php/?origin=*&format…
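The example call above is truncated, so the sketch below assumes extract parameters (prop=extracts&exintro) with the title from the question. In PHP terms the same problem appears after json_decode: the key under pages is the unknown page id, and array_key_first() (or Object.keys(...)[0] in JS) reads it without knowing the number in advance.

    <?php
    // Sketch: the page-id key under 'pages' is not known ahead of time,
    // so fetch it generically. The query parameters here are assumptions.
    $url = 'https://en.wikipedia.org/w/api.php?origin=*&format=json&action=query'
         . '&prop=extracts&exintro=1&titles=' . urlencode('San Francisco');
    $data   = json_decode(file_get_contents($url), true);
    $pages  = $data['query']['pages'];
    $pageId = array_key_first($pages);  // PHP 7.3+; older: reset($pages); key($pages);
    echo $pages[$pageId]['extract'];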

How to permanently delete a page from a MediaWiki wiki?

Question: I administer a MediaWiki wiki that has been hit by a ton of spam. I've managed to get rid of the spam pages that were added to the wiki by using an extension, but the pages' data is still present in the wiki's MySQL database, bloating it to over 3 GB. Is there a way to permanently delete the spam pages from the wiki, so that they are completely removed from the database?

Answer 1: Run the maintenance script named deleteArchivedRevisions.php. Note that running MediaWiki…
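The answer is truncated; per Manual:deleteArchivedRevisions.php the script lives in the maintenance/ directory and only deletes when given the --delete flag (verify the flags against your MediaWiki version, and back up the database first):

    # Run from the wiki's installation directory
    php maintenance/deleteArchivedRevisions.php           # reports what would be removed
    php maintenance/deleteArchivedRevisions.php --delete  # actually deletes archived revisions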

How to get internal links from the latest revision of a Wikipedia page?

Question: I'm trying to extract internal links from Wikipedia pages. This is the query I'm using:

/w/api.php?action=query&prop=links&format=xml&plnamespace=0&pllimit=max&titles=pageTitle

However, the result does not reflect what's on the wiki page. Take for example a random article, Von Mises–Fisher distribution. There are only a dozen links on this page. However, when I make the query

/w/api.php?action=query&prop=links&format=xml&plnamespace=0&pllimit=max&titles=Von_Mises%E2%80%93Fisher_distribution

I got back 187 links. I…
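The entry ends abruptly, but the usual explanation (stated here as the well-known behaviour of prop=links, not as the truncated answer) is that the links table covers the fully rendered page, so links added by transcluded templates such as navboxes are counted too. A PHP sketch that pages through every link with plcontinue:

    <?php
    // Sketch: collect all namespace-0 links, following 'plcontinue'.
    // Counts include links contributed by transcluded templates (navboxes etc.).
    $base = 'https://en.wikipedia.org/w/api.php?action=query&prop=links&format=json'
          . '&plnamespace=0&pllimit=max&titles=Von_Mises%E2%80%93Fisher_distribution';
    $links = [];
    $extra = '';
    do {
        $data = json_decode(file_get_contents($base . $extra), true);
        foreach ($data['query']['pages'] as $page) {
            foreach ($page['links'] ?? [] as $link) {
                $links[] = $link['title'];
            }
        }
        $extra = isset($data['continue']['plcontinue'])
            ? '&plcontinue=' . urlencode($data['continue']['plcontinue'])
            : null;
    } while ($extra);
    echo count($links), " links\n";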

MediaWiki Query and/or WikidataQuery to find Wikipedia articles

Question: This isn't so much a question about AngularJS as it is about the Wikimedia and Wikidata query APIs. Although I am trying to display the content of a Wikipedia article in AngularJS after running a certain query, that isn't the problem; I already know how to display it. The problem is the search for an article or articles. I'm trying to query Wikipedia by historical event date as well as by geo-location. Let's pick a random event, any event. Let's say "1986 Mozambican Tupolev Tu-134 crash".…
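The question is truncated and no answer survived the excerpt, but for the geo-location half, Extension:GeoData's list=geosearch is the standard MediaWiki API entry point. A PHP sketch; the coordinates below are an illustrative assumption, not taken from the question:

    <?php
    // Sketch: list=geosearch (Extension:GeoData) finds articles near a point.
    // Coordinates and radius are illustrative placeholders.
    $url = 'https://en.wikipedia.org/w/api.php?action=query&list=geosearch&format=json'
         . '&gscoord=' . urlencode('-25.91|31.95') . '&gsradius=10000&gslimit=10';
    $data = json_decode(file_get_contents($url), true);
    foreach ($data['query']['geosearch'] as $hit) {
        echo $hit['title'], ' (', $hit['dist'], " m)\n";
    }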

Download images with MediaWiki API?

Question: Is it possible to download images from Wikipedia with the MediaWiki API?

Answer 1: No, it is not possible to get the images themselves via the API. Images in a MediaWiki installation are stored in plain folders, not in a database, and are not delivered dynamically (more information in the Manual:Image administration). However, you can retrieve the URLs of those image files via the API; see for example the API:Allimages list or the imageinfo property query module. Then you can download the files from those URLs with your…
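Continuing that truncated answer with a short sketch (the file title is an illustrative assumption): first ask prop=imageinfo for the URL, then fetch the file over plain HTTP.

    <?php
    // Sketch: resolve a file page to its raw URL via prop=imageinfo, then download.
    // 'File:Example.jpg' is an illustrative title.
    $api = 'https://commons.wikimedia.org/w/api.php?action=query&prop=imageinfo'
         . '&iiprop=url&format=json&titles=' . urlencode('File:Example.jpg');
    $data = json_decode(file_get_contents($api), true);
    $page = current($data['query']['pages']);
    $url  = $page['imageinfo'][0]['url'];
    file_put_contents(basename($url), file_get_contents($url));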

Retrieve a list of all Wikipedia languages programmatically

Question: I need to retrieve a list of all existing languages for a certain wiki project, for example all Wikivoyage or all Wikipedia languages, just like on their landing pages. I would prefer to do this via the MediaWiki API, if possible. Thanks for your time.

Answer 1: Approach 3: using an API in the Wikimedia wiki farm and Extension:Sitematrix:

https://commons.wikimedia.org/w/api.php?action=sitematrix&smtype=language

While this returns all wikis the matrix knows about, it is easily filtered client-side.…
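A PHP sketch of that client-side filtering (picking Wikivoyage is an illustrative choice; the layout handling follows the sitematrix response format, where language groups sit under numeric keys):

    <?php
    // Sketch: fetch the sitematrix and keep one project's language editions.
    // Non-numeric keys such as 'count' are skipped.
    $url = 'https://commons.wikimedia.org/w/api.php?action=sitematrix'
         . '&smtype=language&format=json';
    $data = json_decode(file_get_contents($url), true);
    foreach ($data['sitematrix'] as $key => $group) {
        if (!is_numeric($key)) continue;
        foreach ($group['site'] ?? [] as $site) {
            if ($site['code'] === 'wikivoyage') {
                echo $group['code'], '  ', $site['url'], "\n";
            }
        }
    }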

How do I log into MediaWiki using PHP cURL?

Question: I'm trying to integrate MediaWiki into my website, but I'm having trouble. I think the problem has something to do with the cookies, because I get a success response from the MediaWiki API. Here's my code:

function mw_session_manager($Action = "") {
    $Root = $_SERVER['SERVER_ADDR'];
    $API_Location = "${Root}/w/api.php";
    $expire = 60*60*24*14 + time();
    $CookieFilePath = tempnam("/tmp/thedirectory", "CURLCOOKIE");
    $CookiePrefix = 'theprefix';
    $Domain = 'thedomain';
    if($Action == 'login') {
        // Retrieves…
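The snippet breaks off above, so here is a hedged sketch of the standard two-step login flow it is presumably building towards. Given the cookie suspicion in the question, the key point is that both requests must share one cookie jar (CURLOPT_COOKIEJAR and CURLOPT_COOKIEFILE pointing at the same file), because the login token is bound to the session cookie. URL and credentials are placeholders.

    <?php
    // Sketch: fetch a login token, then log in, reusing ONE cookie jar.
    $api = 'https://thedomain/w/api.php';            // placeholder
    $jar = tempnam(sys_get_temp_dir(), 'CURLCOOKIE');

    function api_post($api, $jar, array $fields) {
        $ch = curl_init($api);
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => http_build_query($fields),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_COOKIEJAR      => $jar,  // write cookies here...
            CURLOPT_COOKIEFILE     => $jar,  // ...and send them back from here
        ]);
        $out = json_decode(curl_exec($ch), true);
        curl_close($ch);
        return $out;
    }

    // Step 1: login token (this also creates the session cookie).
    $tok = api_post($api, $jar, [
        'action' => 'query', 'meta' => 'tokens', 'type' => 'login', 'format' => 'json',
    ]);

    // Step 2: log in over the same cookie jar (bot passwords recommended).
    $login = api_post($api, $jar, [
        'action'     => 'login',
        'lgname'     => 'BotUser@BotName',       // placeholder
        'lgpassword' => 'bot_password_here',     // placeholder
        'lgtoken'    => $tok['query']['tokens']['logintoken'],
        'format'     => 'json',
    ]);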

Get location with Wikimedia API

Question: How do I get the location of a Wikipedia article, in terms of city/country, with the MediaWiki API? Let's say I want to determine what country and what city the Sagrada Família cathedral is located in. What property should I use?

Answer 1: Try the following query:

https://en.wikipedia.org/w/api.php?action=query&prop=coordinates&titles=Sagrada%20Fam%C3%ADlia&coprop=country|type|name|dim|region

And see Extension:GeoData for the documentation. I'm not sure if we can get the city name using the Wikipedia API, but there…
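A PHP sketch of consuming that query (per Extension:GeoData the country and region come back as codes, not names; the output formatting below is an assumption):

    <?php
    // Sketch: read the coordinates plus country/region codes for one article.
    $url = 'https://en.wikipedia.org/w/api.php?action=query&prop=coordinates&format=json'
         . '&titles=' . urlencode('Sagrada Família')
         . '&coprop=' . urlencode('country|type|name|dim|region');
    $data  = json_decode(file_get_contents($url), true);
    $page  = current($data['query']['pages']);
    $coord = $page['coordinates'][0];
    printf("lat %.4f  lon %.4f  country %s  region %s\n",
        $coord['lat'], $coord['lon'],
        $coord['country'] ?? 'n/a', $coord['region'] ?? 'n/a');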