Apart from reporting on the English Wikipedia, The Signpost routinely covers stories from elsewhere in the Wikimedia world (for example, see this week's "News and notes" on activities of the French and Dutch chapters), and is read by many Wikimedians from other projects. Starting this week, we are offering a global subscription service that can deliver each new Signpost issue to user talk pages on any Wikimedia project (example). You can subscribe (and unsubscribe) on the Sign-up page on Meta.
This is an extension of the existing talk page delivery on the English Wikipedia, which has over 1,000 subscribers and is carried out by MZMcBride's bot EdwardsBot, which has reliably distributed more than 47,000 Signpost copies since October last year, in addition to other newsletters. The new global message delivery service is also based on EdwardsBot, and is open to others who need to establish subscriber networks across other Wikimedia projects.
Another new feature we are introducing with this issue is a "Share this" list of links on each page, enabling readers to easily share our articles via email, Twitter, Facebook and other social bookmarking sites. Although it is standard practice on many news sites to arrange them into a conspicuous icon bar, we tried to keep them unobtrusive – click "show" to expand the list in the top right corner. Suggestions and problems can be reported here.
In related news, The Signpost's Twitter feed (inaugurated 18 months ago and also available on Identi.ca) surpassed 1,000 followers last week. Apart from announcements of each new Signpost issue, it contains notice of Wikipedia- and Wikimedia-related news in advance of more detailed coverage in the next Signpost issue. And for a few weeks now, the official Wikipedia page on Facebook has been featuring highlights from new Signpost issues.
To serve this expanding readership, we still need more good writers, especially for the following two beats:
One possibility for helping out is to write up suggestions from our tip line – even if it just becomes a short note in the "Briefly" part of a section. Or you could scour the sources listed on our resources page, which has an overview of each section's scope. For more advice, participate in the Newsroom, ask regulars in our IRC channel #wikisignpost (webchat), or contact wikipediasignpost@gmail.com.
We hope all readers are enjoying The Signpost, and we welcome feedback and questions.
Much of the Spaarnestad collection of some 2.5 million images dating back to the late 19th century narrowly escaped destruction in the mid-1980s, when the original publishing house experienced a financial and housing crisis. But prompt action by the newly formed non-profit Spaarnestad Foundation saved this priceless record of modern Dutch history. Private benefactors and the City of Haarlem provided funds for the interim location of the collection, which was transferred to the National Archives in 2008.
The donation is part of the Archive's Images for the Future project to preserve and digitise visual materials, and to make them publicly available, and was the result of a collaboration with Wikimedia Nederland. One of the most significant gifts of historical images ever made to the Wikimedia Foundation, it was marked at a public event in The Hague attended by several current and former politicians, who shared their personal memories surrounding specific images now freely available at Commons. A spokesperson for the Archives said, "Wikipedia is a good, reliable and social platform, and our goal is to disseminate our materials as widely as possible."
Lodewijk Gelauff, Vice Chair of Wikimedia Nederland, said "This generous release will provide photos for many related Wikipedia articles that until now had no image to accompany the article.... One of the best ways to get a good photo is through partnerships like this. [I hope] that soon more institutions will follow the example of the National Archive.... I invite everybody to incorporate the images on their language projects as they become available in the near future on Wikimedia Commons."
In July, French-speaking Wikimedians Ludo29 and Inisheer took part in the "Raid Paris – Cap Nord", a photographic challenge in which competitors are ranked by a jury on the basis of the pictures they take during the trip. The journey starts in Paris, runs through Finland, Sweden and Norway up to North Cape, the northernmost point of Europe, and ends back in Paris. Over the four weeks, the raiders drove 12,000 km in a car branded with the logos of Wikipedia, Wikimedia Commons and Wikimédia France. The French chapter provided financial support.
The two Wikimedians took around 300 photographs of the landscapes, buildings, fauna and flora of these Nordic countries, many of which filled gaps in Commons' coverage. They also produced content for Wikinews, including an interview with Philippe Boucher (Google translation), creator and organiser of the raid, and a report about lifeboatmen in Norway (Google translation).
The Wikimedian team was ranked seventh out of 22 teams in the challenge. The Wikimedia logos on their car provided opportunities to talk with local people about the Foundation and its projects.
An article in The New York Times, "Limbaugh taken in: the judge was not loaded for bear", reports that conservative US talk show host Rush Limbaugh relied on erroneous information from the Wikipedia article about federal judge Roger Vinson when he told listeners that Vinson is an avid hunter and hobby taxidermist who, in 2003, hung the stuffed heads of three bears he had killed over a courtroom door to "instill the fear of God" into the accused. Limbaugh insinuated that this might improve the chances of the court case against President Obama's health-care act, which Vinson is currently hearing.
The hoax information had been added by a new user on September 13 (UTC), who removed it the next day. Denying that the statement was based on Wikipedia, a spokesman for Limbaugh said it came from an article on the website of the Pensacola News Journal – coincidentally the offline reference cited in the Wikipedia article (but with a non-existent date: "June 31, 2003"). However, the paper's managing editor denied that it had ever published such an article – a point also made in its own coverage of the affair ("Rush Limbaugh falls for wacky hoax about Judge Roger Vinson").
Less than a year ago, Limbaugh criticized journalists who rely on Wikipedia without fact-checking as "literal professional scum", after false quotes attributed to him in Wikiquote made it into the media (see Signpost coverage). In 2005, he criticized Wikipedia as biased and announced he would insert the word "afristocracy" into Wikipedia to "spread" it (see Signpost coverage and deletion discussion).
After extensive articles about the Wikimedia Foundation's Public Policy Initiative had appeared in Inside Higher Ed and USA Today (see last week's In the news), two student newspapers covered it from a local perspective:
An article in George Washington University's GW Hatchet (Wikipedia recruits GW students to edit website's content) quoted Dr Joseph Cordes, associate director of the Trachtenberg School of Public Policy and Public Administration, who had supported the three-day workshop at the school to train Wikimedia's "Campus ambassadors" (Signpost coverage), and Dr Donna Infeld, whose graduate course on public policy earlier this year had been the first at the university to offer Wikipedia assignments. One of her students was quoted regarding his experiences contributing to the article Don't ask, don't tell as part of that course.
At Georgetown University, The Hoya interviewed Professor Rochelle Davis (Wikipedia: a class tool), whose "Introduction to the study of the Arab world" course participates in the Wikipedia initiative. Correcting earlier media reports that she had assigned students to read Wikipedia, she said "some of the interviews seemed to have missed that. Wikipedia is an encyclopedia. I wouldn't assign my students to read it, just like I wouldn't assign them Britannica."
Also last week, Indiana University announced that a seminar at its School of Public and Environmental Affairs would be producing public policy articles for Wikipedia, proudly describing the school as "one of five leading public-policy programs where the Wikimedia Foundation, the nonprofit organization behind Wikipedia, is debuting its Public Policy Initiative".
See other Signpost coverage of the Public Policy Initiative: "Introducing the Public Policy Initiative", "Public policy initiative announces advisory board, starts training campus ambassadors", "Public policy initiative announces participating classes", and "Experiments with article assessment".
This week, we have a ticket to ride with WikiProject Trains. Started in March 2004, the project has grown to include over 58,000 pages, including 35 featured articles, 22 featured lists, 157 good articles, and an unassessed article backlog of 13,000. The project's total number of members is staggering. WikiProject Trains is aided by a network of related projects, including WikiProject Stations, WikiProject Streetcars, and several projects covering country-specific railways. The project maintains a featured portal, to-do list, manual of style, resources for new articles, and 12 task forces. We interviewed Mjroots, DavidCane, Slambo, Iridescent, Redrose64, Simply south, and Oakshade.
What motivated you to join WikiProject Trains? Do you consider yourself a railfan?
The project covers a wide range of articles related to modern and historical trains throughout the world. What are some of your favorite articles?
The project has 34 featured articles, 22 featured lists, and 156 good articles. Have you contributed to any of these articles? Are there any other articles you are currently working on bringing to FA or GA status?
The project is home to 12 task forces with focuses ranging from drawing maps and adding images to reducing the assessment backlog and maintaining the project's portal. Tell us a little about how these task forces contribute to the project's overall goals.
When adding {{WikiProject Trains}} to a talk page, I usually omit all the task force parameters; the only ones I ever set are |locos=yes and/or |models=yes, since it's a pretty clear decision whether the article falls within those or not. As for participation in task forces, I don't.
Are there any other projects that collaborate and share resources with WikiProject Trains?
Anything else you'd like to add?
Next week we'll admire a buttress. Until then, you can build up your knowledge of WikiProjects in the archive.
The Signpost congratulates two editors on their promotion to adminship.
Choice of the week. The Signpost asked FA nominator and reviewer Aaroncrick to select the best of the week:
“As always, the FAC process throws up some obscure and fascinating articles, making the choice of the week difficult. William Calcraft was one of the more enthralling, describing his 45-year career of some 450 executions, many using the unusual short-drop method. Three articles were on my short-list: Wintjiya Napaltjarri, Princess Charlotte of Wales and FC Barcelona – all intriguing reads. Being Australian, I couldn't help but feel a connection with Napaltjarri, an Indigenous artist whose work is showcased in many of the country's galleries. More will be familiar with Princess Charlotte, who died in 1817 when just 21, ending her quest to become Queen. However, being a sports lover and contributor, my choice of the week is FC Barcelona. The article is a comprehensive, engaging account of one of the biggest and most successful sporting clubs in the world.”
Four featured articles were delisted:
Eleven lists were promoted:
Choice of the week. We asked FL nominator and reviewer Wizardman for his choice of the best:
“My original plan was to look past the sports-related articles, since I'm more familiar with them, but given how many were promoted this week I'm choosing two articles as best, one sports and one not. The sports article that caught my eye was List of Major League Baseball hitters with four home runs in one game. It really shows how significant an accomplishment hitting four home runs in a game is, though that could be my bias talking. The second article I would choose is Registrar of the University of Oxford. Oxford is not something usually on my radar, but this list is a very nice look at the position and the people who have held it. It was a very interesting read, with the descriptions of each person adding a good deal to the article.”
One topic was promoted: Supernatural (season 1) (nom), concerning the 22-episode first season (2005–06) of an American television series. The topic comprises two featured articles and one good article (nominator Ophois).
Seven images were promoted:
Choice of the week. Dschwen, a regular reviewer and nominator at featured picture candidates, told The Signpost:
“My choice this week, the Impala, is not only a masterfully executed shot of a beautiful animal, but it comes as a package. Linked in its image description are two more detail shots of the head and horns, and when viewed on our sister site Commons, this featured picture has annotations that pop up when you hover the mouse over the animal's various body parts, indicating characteristic traits of its species. It is exciting to see the potential of our online format being used in a way that greatly enhances the educational value of a nice piece of photography.”
This article is a continuation of Tools, part 1, in a series meant to introduce readers to useful tools for editing. This time, we cover tools related to internal links (wikilinks) and to the version histories of wiki pages.
Many tools are user scripts – JavaScript code running in your browser – which can be imported by adding importScript("User:Example/awesome script.js") to your Special:MyPage/skin.js page. Compatibility varies with skin and browser, with Internet Explorer being the most problematic. A more extensive script list is at Wikipedia:WikiProject User scripts/Scripts.
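As a minimal sketch, a skin.js page that loads two of the scripts covered below might contain nothing more than the following (the file can otherwise hold any other scripts you use):

// Special:MyPage/skin.js
// Each importScript call loads one user script on every page view;
// importStylesheet loads an accompanying style sheet.
importScript('User:Splarka/dabfinder.js');           // disambiguation-link finder (see below)
importScript('User:Anomie/linkclassifier.js');       // link highlighting (see below)
importStylesheet('User:Anomie/linkclassifier.css');  // Linkclassifier's default styles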
Various other tools are hosted on the Wikimedia Toolserver (currently provided by the German Wikimedia chapter) and can be accessed via a web interface. Some are also hosted on non-Wikimedia websites.
Dabfinder adds a "Find disambiguations" link to your sidebar, outlines disambiguation links in green, and allows you to fix them on the fly without having to go to a separate page. Unlike Begriffsklärungs-Check ("disambiguation check") from the German Wikipedia, it works on all languages. To try it once, paste javascript:importScript('User:Splarka/dabfinder.js');findDABsButton() into your browser's address bar while viewing or previewing the article of interest; to install it permanently, add {{subst:js|User:Splarka/dabfinder.js}} to your Special:MyPage/skin.js page.
Dablinks is a Toolserver tool which checks for disambiguation links. It can check individual pages, or up to 500 pages from a category, a list, or a user's recent contributions. A companion tool, Dab solver (accessible via "(fix links)"), provides an easy-to-use, menu-driven interface for resolving all the links. The tools also collect statistics to assist WikiProject Disambiguation.
Linkclassifier assigns over a dozen possible attributes to links. Users can keep the default style sheet or create their own, with the look and color they want for each attribute. The default highlights disambiguation links and self-redirects, and outlines non-free images. What sets this tool apart from others is its ability to identify set index articles; while there is no firm standard, these are typically hybrids between a list article and a disambiguation page, and writers may link to them intentionally if they want, for example, a description or history of a ship's name. To try it once, paste javascript:void(importScript('User:Anomie/linkclassifier-demo.js')) into your browser's address bar while viewing or previewing the article of interest; to install it permanently, add {{subst:js|User:Anomie/linkclassifier.js}} and importStylesheet('User:Anomie/linkclassifier.css'); to your Special:MyPage/skin.js page.
The Contributors tool lists the users who have edited a page, sorted by the number of their contributions – a good way to identify the major contributors to an article. The tool can also display the page history in other formats. (Documentation)
Article revision statistics by X! also shows the users who have edited a particular page, sorted by number of edits, but it provides many other statistics about the page's history as well, such as the number of edits per month or the percentage of anonymous edits. (The core tally behind both tools is sketched below.)
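For the curious, the heart of both tools is a simple per-user tally over the page's revision list. Here is a minimal sketch using the standard MediaWiki web API (continuation via rvcontinue is omitted for brevity, so only the 500 most recent revisions are counted):

// Tally edits per user for one page via the MediaWiki API.
const API = 'https://en.wikipedia.org/w/api.php';

async function editCounts(title) {
  const url = `${API}?action=query&prop=revisions&titles=${encodeURIComponent(title)}` +
              `&rvprop=user&rvlimit=max&format=json&formatversion=2&origin=*`;
  const data = await (await fetch(url)).json();
  const counts = {};
  for (const rev of data.query.pages[0].revisions) {
    counts[rev.user] = (counts[rev.user] || 0) + 1;
  }
  // Sort descending by edit count, as the tools do.
  return Object.entries(counts).sort((a, b) => b[1] - a[1]);
}

editCounts('Train').then(top => console.log(top.slice(0, 10)));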
WikiDashboard displays an article together with a timeline showing editing activity, and also lists the contributors with the most edits to the article. Editing while using the dashboard is not possible. It was the subject of an article in Technology Review last year; see also the Signpost coverage of its release in 2007.
Revisionjumper allows easier navigation of a page's history, generating diffs between arbitrary revisions or time periods with just a few clicks. It was developed on the German Wikipedia and is used by around 1,500 users. To install it, add {{subst:js|MediaWiki:Gadget-revisionjumper.js}} to your Special:MyPage/skin.js page, or go to Special:Preferences, check its box under "Gadgets", and click "Save".
WikiBlame (documentation) searches the revisions of a page for a text string, in either the HTML or the wikitext, marking each revision date with a green circle if the string is present or a red X if it is not. This is useful if you need to ask the author of a particular statement for a clarification or a reference, and it is certainly faster than searching by hand. Article Blamer by X! promises similar functionality in a streamlined interface. WikiTrust (see below) is another alternative.
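The trick that makes this faster than checking by hand is a binary search over the revision history. The sketch below is a hedged illustration using the standard MediaWiki API: it assumes the string stays in the page once added (as WikiBlame's binary-search mode effectively does), and for brevity it fetches only the first 500 revisions without continuation.

// Find the earliest revision of a page containing a given string.
const API = 'https://en.wikipedia.org/w/api.php';

async function revisionIds(title) {
  // Oldest-first list of revision IDs for the page.
  const url = `${API}?action=query&prop=revisions&titles=${encodeURIComponent(title)}` +
              `&rvprop=ids&rvdir=newer&rvlimit=max&format=json&formatversion=2&origin=*`;
  const data = await (await fetch(url)).json();
  return data.query.pages[0].revisions.map(r => r.revid);
}

async function revisionText(revid) {
  const url = `${API}?action=query&prop=revisions&revids=${revid}` +
              `&rvprop=content&rvslots=main&format=json&formatversion=2&origin=*`;
  const data = await (await fetch(url)).json();
  return data.query.pages[0].revisions[0].slots.main.content;
}

async function firstRevisionWith(title, needle) {
  const ids = await revisionIds(title);
  let lo = 0, hi = ids.length - 1, found = null;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if ((await revisionText(ids[mid])).includes(needle)) {
      found = ids[mid];   // present here; look for an earlier occurrence
      hi = mid - 1;
    } else {
      lo = mid + 1;       // absent here; look later
    }
  }
  return found;           // revision ID, or null if the string never appears
}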
WikiTrust analyzes an article's history and the contributions of its authors to calculate a trust score for each part of the text, displayed as a color (white = trustworthy; yellow or orange = unstable). It is also possible to check directly who contributed a given part: a Ctrl-Alt-click on a word takes you to the diff where it was added.
WikiTrust is currently available as a browser add-on for Firefox. The Wikimedia Foundation has indicated that it may eventually be integrated into Wikipedia itself (see Signpost coverage).
Page view statistics graphs the number of views per day for a Wikipedia page. The tool also aggregates a list of the "most viewed pages", although this is often several months behind. The same data is used in a new, still-experimental tool by Emw that graphs views over longer periods.
Note: Due to past problems with the underlying data (squid logs), page views may be under-reported from November 2009 to July 2010.
The Arbitration Committee opened no new cases, leaving one open.
This case resulted from the merging of several Arbitration requests on the same topic into a single case, and the failure of a related request for comment to make headway. Innovations have been introduced for this case, including special rules of conduct that were put in place at the start. However, the handling of the case has been criticized by some participants; for example, although the evidence and workshop pages were closed for an extended period, no proposals were posted on the proposed decision page and participants were prevented from further discussing their case on the case pages (see earlier Signpost coverage).
The proposed decision, drafted by Newyorkbrad, Risker, and Rlevse, sparked a large quantity of unstructured discussion, much of it comprising concerns about the proposed decision (see earlier Signpost coverage). A number of users, including case participants and arbitrator Carcharoth, worked to give the discussion more structure, but its volume has continued to grow significantly. Rlevse had said that arbitrators were trying to complete the proposed decision before September 6, but it was later made clear that he would no longer be voting on this decision. This week, arbitrators made further additions to the proposed decision and further attempts to manage the quantity of discussion.
Before discretionary sanctions can be imposed on an editor, the editor is required to be "given a warning advising of the problems with his or her editing". Additionally, where appropriate, the editor should be "counseled on specific steps that he or she can take to improve" his or her editing. The exception to this requirement is where there is "gross misconduct".
Littleolive oil filed a clarification request regarding this requirement and asked the Committee to overturn the revert restriction imposed on her by Future Perfect at Sunrise. A few users characterised the request as "forum shopping", and arbitrator Coren said it was "not a request for clarification ... but an appeal/protest". However, the filer stated that the clarification fundamentally affects the restriction, and that the Committee should stand by its statement that discretionary sanctions may be appealed to the Committee.
Arbitrators Newyorkbrad and Roger Davies clarified that the warnings should come from "a neutral third-party" rather than "an opponent in a content dispute". In response, an administrator suggested that the discretionary sanctions from this case be replaced with what some arbitrators refer to as "standard discretionary sanctions". However, practical issues with the latter approach were pointed out in the Climate change case:
The "standard" sanctions have changed to some extent every time that the Committee has used them, so they're hardly standard. As the wording on that page changes, editors in affected areas will have no way of knowing that the "rules" have changed. This will also lead to disputes about whether the current wording of the so-called standard discretionary sanctions, or the one in effect at the time of the decision, will hold sway. [Even with announcements of changes]...sanctions cover hundreds of pages and potentially apply to thousands of editors. Most of them don't watch WP:AN or the village pumps, and even fewer of them watch arbitration pages.
— Arbitrator Risker, sentences taken from comments made at 04:30 and 17:39, 6 September 2010 (UTC), in Wikipedia:Arbitration/Requests/Case/Climate change/Proposed decision.
In light of the clarification, the filer suggested the Committee keep the existing discretionary-sanctions wording for this case, because it "clearly defines the criteria", making it "easier to determine if the criteria has been met or not". Recently, she also asked the Committee whether she will need to file another case to deal with the "allegations of wrongdoing" and the editors who were "improperly sanctioned".
At the time of writing, arbitrators have not yet responded to the request to reimpose an Eastern European topic ban on Radeksz, although it has been over a week since the request was filed.
Arbitrators have, however, responded to the similarly week-old request to impose a topic ban on Ferahgo the Assassin from articles related to race and intelligence. Kirill Lokshin stated that he did not see any reason to presume wrongdoing, but Shell Kinney advised Ferahgo the Assassin to avoid editing the topic, particularly in light of policy and the facts of the case. Roger Davies also stated that he would support a topic ban.
On the Wikimedia Techblog, contractor Chad Horohoe announced the first Wikimedia "hack-a-ton", an event at which developers, amateur and professional, get together with the explicit aim of fixing bugs and generally getting "down and dirty with the code". Designed as a counterpoint to the "MediaWiki Developers' Meetup" in Berlin, which focuses on demonstrations, workshops and small-group discussions, the event is scheduled for October 22–24 in Washington, D.C. Bugs for the weekend will be tracked using a new keyword in Bugzilla, "bugsmash". MediaWiki has around 4,900 bugs and feature requests outstanding, from a total pool of around 25,000, though not all relate to the core MediaWiki software.
We continue a series of articles about this year's Google Summer of Code (GSoC) with Samuel Lampa, a biotechnology student at Uppsala University, who describes his project to develop a system for the general import and export of RDF metadata in the Semantic MediaWiki software.
“Some of you might know Semantic MediaWiki, the MediaWiki extension that (if installed, which is not currently the case on Wikimedia wikis) lets users annotate facts in articles with a special syntax, making them "machine readable". This allows external software tools to use the facts for powerful things like integrating data, querying the data in a bandwidth-saving way, providing powerful search facilities, and so on. For example, on the Stockholm article, one would add: [[is capital of::Sweden]]. Annotations are of course best embedded in templates such as the infobox on the Methane article, where they can make use of the already formatted information without bothering users with additional syntax.
Apart from Wikipedia, MediaWiki is used by numerous organizations and companies for all kinds of knowledge bases. In fields such as construction and engineering, loads of data are available in strictly formalized and standardized document formats that, if stored in Semantic MediaWiki, could be turned into "machine readable", queryable databases, simply by adding semantic annotations in the templates, for example. Now, what if this data were exposed in a standardized format used by the rest of the web, with everyone using the same identifier for "Stockholm" and "Bosch spark plug no 0001"? This would enable connecting all the available data into a big "web of things" instead of a "web of documents", which can be queried much more intelligently – asking explicitly for "all cities in Europe", or "all spark plugs that fit Volvo V70", for example, instead of guessing the keyword combination that returns such a document on a search engine like Google. Such a format is already available: RDF.
Semantic MediaWiki already allows the static export of articles in RDF, but does not allow its import; nor does it provide a method out of the box to select from a remote wiki only exactly those pieces of data you want. The RDFIO extension, which I built for the Google Summer of Code, addresses these gaps by providing the ability to import RDF, as well as an interface for both querying and updating facts via a so-called "SPARQL endpoint" (see here for an example), which external tools can also very easily talk to. This new ability to update semantic facts remotely opens up some interesting use cases. For example, chemists and biologists using Bioclipse can take their working data and export it to a wiki where their peers can make corrections, before importing it again for further analysis. This workflow is in fact already possible, as hinted in this blog post / screencast, and is the focus of my current work (progress is documented on the blog). For a more technical description, as well as download and install instructions, see the RDFIO Extension page. The development of, and thoughts behind, RDFIO were documented on this blog.”
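To make the "external tools can easily talk to it" point concrete: under the standard SPARQL protocol, a query is just an HTTP request. The sketch below is illustrative only – the endpoint URL and the property URI are placeholders, not RDFIO's actual paths; consult the RDFIO Extension page for the real interface.

// Hypothetical SPARQL query against a Semantic MediaWiki wiki running RDFIO.
// The endpoint URL and property URI below are illustrative placeholders.
const endpoint = 'https://wiki.example.org/Special:SPARQLEndpoint';
const query = `
  SELECT ?city ?country WHERE {
    ?city <http://wiki.example.org/property/Is_capital_of> ?country .
  }`;

fetch(endpoint + '?query=' + encodeURIComponent(query), {
  headers: { Accept: 'application/sparql-results+json' },  // standard SPARQL JSON results
})
  .then(r => r.json())
  .then(data => console.log(data.results.bindings));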
Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for many weeks.