mediawiki

Working example of wikitext-to-HTML in ANTLR 3

爱你&永不变心 submitted on 2019-12-06 11:47:35
Question: I'm trying to flesh out a wikitext-to-HTML translator in ANTLR 3, but I keep getting stuck. Do you know of a working example that I can inspect? I tried the MediaWiki ANTLR grammar and the Wiki Creole grammar, but I can't get either of them to generate a lexer and parser in ANTLR 3. Here are the links to the two grammars I've tried: http://www.mediawiki.org/wiki/Markup_spec/ANTLR and http://www.wikicreole.org/wiki/EBNFGrammarForCreole1.0. Neither one will generate my Java lexer and parser.

WikiEditor toolbar is missing

不打扰是莪最后的温柔 submitted on 2019-12-06 09:19:47
I am running MediaWiki 1.23 with a custom skin under Chrome. I have followed the instructions on the MediaWiki page about WikiEditor (Extension:WikiEditor). Below are the lines I've added after the end of the default configuration in LocalSettings.php:
# End of automatically generated settings.
# Add more configuration options below.
require_once( "$IP/skins/Fresh/Fresh.php" );
require_once( "$IP/extensions/WikiEditor/WikiEditor.php" );
$wgDefaultUserOptions['usebetatoolbar'] = 1;
$wgDefaultUserOptions['usebetatoolbar-cgd'] = 1;
$wgDefaultUserOptions['wikieditor-preview'] = 1;
Afterwards, I

No 'Access-Control-Allow-Origin' header is present. Origin is therefore not allowed access

大城市里の小女人 submitted on 2019-12-06 08:33:21
Question: EDIT: No JSONP! Yes, I know CORS is handled by the server, and yes, the server does support it. The issue is on my side. I am trying to use the MediaWiki API from the browser. I am sending a GET request through XMLHttpRequest, but due to CORS issues it's just not working. I am getting the following message from my browser after it receives the response: XMLHttpRequest cannot load https://en.wikipedia.org/w/api.php?format=json&action=query&list=search&srsearch=Oculus&utf8. No 'Access-Control-Allow
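For unauthenticated requests, the Wikimedia API will emit the missing CORS headers if the request includes origin=* in the query string. The sketch below shows the same search issued that way; it assumes a browser (or other fetch-capable runtime) and an anonymous request, and is only an illustration, not the accepted answer.

// Minimal sketch: origin=* asks the MediaWiki API to allow anonymous cross-origin requests,
// so the browser receives the Access-Control-Allow-Origin header it is complaining about.
const params = new URLSearchParams({
  format: 'json',
  action: 'query',
  list: 'search',
  srsearch: 'Oculus',
  utf8: '1',
  origin: '*', // required for anonymous CORS requests to Wikimedia wikis
});

fetch(`https://en.wikipedia.org/w/api.php?${params}`)
  .then((response) => response.json())
  .then((data) => console.log(data.query.search));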

How to add other languages to TeX

不问归期 submitted on 2019-12-06 07:38:23
Question: In MediaWiki, if you put non-English text in formulas, it gets cut. For example, if you write \text{щfбb} (щ and б are Russian (Cyrillic) letters), the output will be fb instead of щfбb.
Answer 1: First of all, if your MediaWiki version is lower than 1.18, open the file includes/Math.php and find (this code is for version 1.16):
escapeshellarg( $wgTmpDirectory ).' ';
escapeshellarg( $this->tex ).' ';
and replace it with:
escapeshellarg( $wgTmpDirectory ).' ';
setlocale(LC_CTYPE, "en_US.utf8");
$cmd .= escapeshellarg( $this-

Find main category for article using Wikipedia API

时光总嘲笑我的痴心妄想 submitted on 2019-12-06 06:44:05
Question: I have a list of articles and I want to find the main category of each article. Wikipedia lists its main categories here: http://en.wikipedia.org/wiki/Portal:Contents/Categories. I am able to find the categories of each article using: http://en.wikipedia.org/w/api.php?action=query&prop=categories&titles=%s&format=xml I am also able to check whether an article is within a given category: http://en.wikipedia.org/w/api.php?action=query&titles=Dog&prop=categories&clcategories=Domesticated
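As a sketch of that first call in code (assuming the public English Wikipedia endpoint; the example title and the clshow=!hidden and cllimit=max parameters are illustrative additions, not part of the original question):

// List the (non-hidden) categories of an article via action=query&prop=categories.
async function getCategories(title: string): Promise<string[]> {
  const params = new URLSearchParams({
    action: 'query',
    prop: 'categories',
    titles: title,
    clshow: '!hidden', // skip hidden maintenance categories
    cllimit: 'max',
    format: 'json',
    origin: '*',
  });
  const response = await fetch(`https://en.wikipedia.org/w/api.php?${params}`);
  const data = await response.json();
  const pages = data.query.pages;
  const page = pages[Object.keys(pages)[0]];
  return (page.categories ?? []).map((c: { title: string }) => c.title);
}

getCategories('Dog').then((cats) => console.log(cats));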

How to login using mediawiki's api, curl, and bash?

青春壹個敷衍的年華 submitted on 2019-12-06 06:09:44
My understanding of the process: From mediawikis login manual https://www.mediawiki.org/wiki/API:Login When using MediaWiki's web service API, you will probably need your application or client to log in. This involves submitting a login query, constructing a cookie, and confirming the login by resubmitting the login request with the confirmation token returned. 1) Attempt to login with username and password, this will fail with 'result="NeedToken"' as part of response html. Response will also contain the token to be passed in for the next login attempt. 2) Attempt to login again, this time
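A minimal sketch of that two-step, legacy action=login flow, written with fetch in Node rather than the curl/bash the question asks about (the endpoint and credentials are placeholders, and headers.getSetCookie() assumes a recent runtime; newer MediaWiki versions use meta=tokens and action=clientlogin instead):

// Sketch of the legacy two-step action=login flow described above.
const api = 'https://example.org/w/api.php'; // placeholder endpoint

async function login(username: string, password: string): Promise<void> {
  // Step 1: first login attempt; expect result="NeedToken" plus a session cookie.
  const body1 = new URLSearchParams({
    action: 'login',
    lgname: username,
    lgpassword: password,
    format: 'json',
  });
  const res1 = await fetch(api, { method: 'POST', body: body1 });
  const token = (await res1.json()).login.token;
  const cookies = res1.headers
    .getSetCookie()               // requires a runtime that exposes getSetCookie()
    .map((c) => c.split(';')[0])  // keep only the name=value pairs
    .join('; ');

  // Step 2: resubmit with lgtoken and the session cookie; expect result="Success".
  const body2 = new URLSearchParams({
    action: 'login',
    lgname: username,
    lgpassword: password,
    lgtoken: token,
    format: 'json',
  });
  const res2 = await fetch(api, {
    method: 'POST',
    body: body2,
    headers: { Cookie: cookies },
  });
  console.log((await res2.json()).login.result); // should print "Success"
}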

Sharepoint 2010 search cannot crawl mediawiki site

余生颓废 submitted on 2019-12-06 05:54:22
Question: Using SharePoint 2010 enterprise search, we are trying to crawl our internal MediaWiki-based wiki site. The search fails with the error: 'The URL was permanently moved. ( URL redirected to ... )'. Since the wiki site has case-sensitive URLs, when SharePoint 2010 tries to crawl with lower-case URL names, the wiki says the page does not exist and redirects with a 301! Does anyone have a solution? Thanks in advance.
Answer 1: By default, all crawled links are converted to lower case by the SharePoint search indexer

Use SQL to delete old MediaWiki revisions without shell access?

元气小坏坏 submitted on 2019-12-06 03:52:48
Does anyone know a SQL query that will purge a MediaWiki database of old revisions? My database has grown out of control, and I need to prune it to make it possible to download and manage. I don't have shell access, so I need to do this with a SQL query. I have tried the solution suggested here, but it doesn't work: http://www.mediawiki.org/wiki/Extension_talk:SpecialDeleteOldRevisions2#Deleting_only_archived_revisions Thanks for reading :)
Nicholas: Like you, I don't have shell access to my MediaWiki, so I can't do a lot of maintenance tasks. Here is my solution: host your MediaWiki web

Making registration for MediaWiki require admin approval?

穿精又带淫゛_ submitted on 2019-12-06 03:29:55
Question: A wiki I maintain has been hit pretty hard by spam bots... we don't have a lot of users, and I'd rather not saddle the legitimate users with a captcha. Is there a simple way to make registration confirmation go to an admin? I've looked through the manual and haven't been able to figure out how to do it.
Answer 1: You could create a new user right, e.g. "approved", allow admins to assign that right, and restrict things like editing to only approved users, like this:
// Disallow editing and uploading

Query Wikipedia pages with properties

心不动则不痛 submitted on 2019-12-06 02:12:22
Question: I need to use the Wikipedia Query API, or any other API such as Opensearch, to query for a simple list of pages with some properties. Input: a list of page (article) titles or IDs. Output: a list of pages, each with the following properties: page id, title, snippet/description (like in the opensearch API), page url, and image url (like in the opensearch API). A result similar to this: http://en.wikipedia.org/w/api.php?action=opensearch&search=miles%20davis&limit=20&format=xml Only with page ids and not for a
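One way to approximate that output with a single action=query request is to combine several prop modules. The sketch below assumes the TextExtracts and PageImages extensions (both present on Wikipedia) and uses illustrative titles and limits; it is not from the original thread.

// Fetch page id, title, URL, a short plain-text description, and a thumbnail
// for a batch of titles in one request.
const params = new URLSearchParams({
  action: 'query',
  titles: 'Miles Davis|John Coltrane', // illustrative titles
  prop: 'info|extracts|pageimages',
  inprop: 'url',        // info: add fullurl to each page
  exintro: '1',         // extracts: only the lead section
  explaintext: '1',     // extracts: plain text instead of HTML
  exlimit: 'max',
  piprop: 'thumbnail',  // pageimages: return a thumbnail URL
  pithumbsize: '200',
  format: 'json',
  origin: '*',
});

fetch(`https://en.wikipedia.org/w/api.php?${params}`)
  .then((r) => r.json())
  .then((data) => {
    for (const page of Object.values(data.query.pages) as any[]) {
      console.log(page.pageid, page.title, page.fullurl, page.extract, page.thumbnail?.source);
    }
  });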