As the Signpost has moved from publishing every month to every two weeks (or every three weeks, or every two and a half weeks; at any rate, more frequently), we've hit our share of snips and snags, including a couple of weeks ago, when a rather brashly opinionated technology report spurred about a hundred kilobytes of discussion, a big-ass thread at administrators' noticeboard slash incidents, and a currently-open request for comment linked to from WP:CENT.
In light of this most recent debacle, I've been going through old Signpost archives in order to find some editorial guidance. What I've found is grim: it turns out this is far from the only time we've made a questionable call on a hot-button issue. In fact, we have run a great number of ill-advised pieces over the years. But under new editorship, we now have the chance to turn over a new leaf. So I'd like to take a few minutes and apologize for some of the times we've gotten it wrong.
Starting from the beginning.[1]
“Anyone who wants to write for The Signpost needs to be screened for ideological sympathies and potential fifth-columnism. Fortunately, this is simple. Our process is to ask them what they think of a world in which every single person on the planet is given free access to the sum of all human knowledge. If they say something like "hell yeah!" or "based!" or "that would be nice", that's how we know they are a Communist, or a homosexual, or some other kind of freak, and we start keeping a close eye on them and their associates.”
Wikipedia's new sound logo has been rolled out, as announced on Wikimedia News.
The Verge says the winner of the contest to create the sound was Thaddeus Osborne, "a nuclear engineer and part-time music producer from Virginia". Osborne describes the sound design as a combination of whirring pages and clicking keys.
Gizmodo says the sound is "cute". – B
The WMF has announced that its board ratified the Universal Code of Conduct Enforcement Guidelines on 9 March 2023. This means the Enforcement Guidelines are now in force and—
may not be circumvented, eroded, or ignored by Wikimedia Foundation officers or staff nor local policies of any Wikimedia project.
A Voter Comments Report summarising community comments made as part of the recent community vote on the Enforcement Guidelines (see previous Signpost coverage) has been published as well.
The Enforcement Guidelines now in force state:
Enforcement of the UCoC by local governance structures will be supported in multiple ways. Communities will be able to choose from different mechanisms or approaches based on several factors such as: the capacity of their enforcement structures, approach to governance, and community preferences. Some of these approaches can include:
- An Arbitration Committee (ArbCom) for a specific Wikimedia project
- An ArbCom shared amongst multiple Wikimedia projects
- Advanced rights holders enforcing local policies consistent with the UCoC in a decentralized manner
- Panels of local administrators enforcing policies
- Local contributors enforcing local policies through community discussion and agreement
As for systemic failure to follow or enforce the Code, the Guidelines state:
Systemic failure to follow the UCoC
- Handled by U4C
- Some examples of systemic failure include:
- Lack of local capacity to enforce the UCoC
- Consistent local decisions that conflict with the UCoC
- Refusal to enforce the UCoC
- Lack of resources or lack of will to address issues
The "U4C" here refers to the UCoC Coordinating Committee that the WMF will form. The adoption of the Enforcement Guidelines attracted press coverage (see this issue's In the media section); the underlying Wikimedia Foundation press release is here. – AK
An attempt to create a policy about AI-generated articles is happening at Wikipedia:Large language models.
The draft policy as of this writing includes reiterations of existing content policies, including no original research and verifiability. The draft adds that in-text attribution is necessary for AI-generated content.
In related news, the Wikimedia Foundation has published a "Copyright Analysis of ChatGPT" (which, despite the title, also touches on the subject of AI-generated images), and on March 23 held a community call on the topic of "Artificial Intelligence in Wikimedia" (meeting notes). – B & T
Canary Media, an affiliate of the activist non-profit RMI, reports that Wikipedia has a climatetech problem. They urge "climatetech professionals" to edit articles because "The problem is that Wikipedia is often out of date, particularly when it comes to emerging or fast-changing subjects such as clean energy and decarbonization."
WMF staffer Alex Stinson is quoted giving some good advice in a 2020 article, as well as on a Wikiproject page, including how to find malicious edits, flag bad information, and mark a missing citation. – S
The Hong Kong International Airport sponsored free airline tickets to let tourists know that the city was open for business after a long COVID slowdown. The plan was to give away 500,000 tickets via multiple lotteries, including 80,000 to be distributed by Cathay Pacific. According to Mothership, Cathay Pacific was to give out 12,500 of those tickets for the Singapore-Hong Kong route, to people who applied between March 2 and March 8. All you had to do to enter the contest was fill out an online form and answer some trivia questions about the history of Cathay Pacific. To find the answers, about 6,000 more people than usual visited the English Wikipedia's article about Cathay Pacific on March 2 (according to pageview data). And about five people, using IP addresses traceable to Singapore, made about a dozen edits "to prevent others from winning". For example, the founding date in the article was changed from 1946 to 1947 and then to 1949. In the first 43 minutes after registration opened, 100,000 people entered and the contest was closed early. A Wikipedia admin locked the article at about the same time.
Cathay Pacific named the winners on time on March 20, and Mothership reported the outcome. It turns out the "free tickets" weren't really free, because a fuel surcharge, taxes and other fees needed to be paid to claim them. The net amount you needed to pay: S$194.50 (about US$145) for the S$474.50 tickets. – S
Wikipedia:Arbitration/Requests/Case/World War II and the history of Jews in Poland was accepted 13 March. New parties were added to the case as recently as 24 March.
The clerks have set target dates for the case timeline.
Scope: Conduct of named parties in the topic areas of World War II history of Poland and the history of the Jews in Poland, broadly construed
In accepting the case, arbitrator CaptainEek said "we've received scholarly rebuke for our actions, and it is apparent that the entire Holocaust in Poland topic area is broken", referring to an academic paper about the management of English Wikipedia's editing process on the topic (see prior Signpost coverage in In the media, issue 4, and Recent research, issue 6).
An editor has requested that Arbcom de-sysop an admin, Dbachmann, at Wikipedia:Arbitration/Requests#Dbachmann.
Back in 2014 we did a poetical, jokey featured content report for April Fools'. We haven't done this in recent years, particularly as the monthly schedule meant that we rarely had a suitable date for it. But this year...
Ten articles were promoted to Featured article status this period.
Six featured pictures were promoted this period, including the ones at the top and bottom of this article.
One featured list was promoted this period.
A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.
This arXiv preprint[1] (which according to the authors grew out of a student project for a course titled "Critical Thinking in Data Science" at Harvard University) finds that
[...] Google and its most prominent returned results – Wikipedia and YouTube – simply reflect the narrow set of cultural stereotypes tied to the search language for complex topics like "Buddhism," "Liberalism," "colonization," "Iran" and "America." Simply stated, they present, to varying degrees, distinct information across the same search in different languages (we call it 'language bias'). Instead of presenting a global picture of a complex topic, our online searches turn us into the proverbial blind person touching a small portion of an elephant, ignorant of the existence of other cultural perspectives.
Regarding Wikipedia, the authors note it "is an encyclopedia that provides summaries of knowledge and is written from a neutral point of view", concluding that
[...] even though the tones of voice and views do not differ much in Wikipedia articles across languages, topic coverage in Wikipedia articles tends to be directed by the dominant intellectual traditions and camps across different language communities, i.e., a French Wikipedia article focuses on French thinkers, and a Chinese article stresses on Chinese intellectual movements. Wikipedia’s fundamental principles or objectives filter language bias, making it heavily rely on intellectual and academic traditions.
While the authors employ some quantitative methods to study the bias on the other three sites (particularly Google), the Wikipedia part of the paper is almost entirely qualitative in nature. It focuses on an in-depth comparison of a small set of (quite apparently non-randomly chosen) article topics across languages, not unlike various earlier studies of language bias on Wikipedia (e.g. on the coverage of the Holocaust in different languages, see our previous coverage here and here). Unfortunately, the paper fails to cite such earlier research (which has also included quantitative results, such as those represented in the "Wikipedia Diversity Observatory", which among other things includes data on topic coverage across 300+ Wikipedia languages) – despite asserting "there has been a lack of investigation into language bias on platforms such as Google, ChatGPT, Wikipedia, and YouTube".
The first and largest part of the paper's examination of Wikipedia's coverage concerns articles about Buddhism and various subtopics, in the English, French, German, Vietnamese, Chinese, Thai, and Nepali Wikipedias. The authors indicate that they chose this topic starting out from the observation that
To Westerners, Buddhism is generally associated with spirituality, meditation, and philosophy, but people who primarily come from a Vietnamese background might see Buddhism as closely tied to the lunar calendar, holidays, mother god worship, and capable of bringing good luck. One from a Thai culture might regard Buddhism as a canopy against demons, while a Nepali might see Buddhism as a protector to destroy bad karma and defilements.
Somewhat in confirmation of this hypothesis, they find that
Compared to Google’s language bias, we find that Wikipedia articles' content titles mainly differ in topic coverage but not much in tones of voice. The preferences of topics tend to correlate with the dominant intellectual traditions and camps in different language communities.
However, the authors also observe that "randomness is involved to some degree in terms of topic coverage on Wikipedia", defying overly simplistic predictions of biases based on intellectual traditions. E.g.
Looking at the Chinese article on "Buddhism", it addresses topics like "dharma name", "cloth", and "hairstyle" that do not exist on other languages' pages. There are several potential causes for its special treatment on these issues. First, many Buddhist texts, such as the Lankavatara Sutra (楞伽经) and Vinaya Piṭaka (律藏), that address these issues were translated into Chinese during medieval China, and these texts are still widely circulated in China today. Second, according to the official central Chinese government statistics, there are over 33,000 monasteries in China, so people who are interested in writing Wikipedia articles might think it is helpful to address these issues on Wikipedia. However, like the pattern in the French article, Vietnam, Thailand, and Nepal all have millions of Buddhist practitioners, and the Lankavatara Sutra and Vinaya Piṭaka are also widely circulated among South Asian Buddhist traditions, but their Wikipedia pages do not address these issues like the Chinese article.
A second, shorter section focuses on comparing Wikipedia articles on liberalism and Marxism across languages. Among other things, it observes that the "English article has a long section on Keynesian economics", likely due to its prominent role in the New Deal reforms in the US in the 1930s. In contrast,
In the French article on liberalism, the focus is not solely on the modern interpretation of the term but rather on its historical roots and development. It traces its origins from antiquity to the Renaissance period, with a focus on French history. It also highlights the works of French theorists such as Montesquieu and Tocqueville [...]. The Italian article has a lengthy section on "Liberalism and Christianity" because liberalism can be seen as a threat to the catholic church. Hebrew has a section discussing the Zionist movement in Israel. The German article is much shorter than the French, Italian, and Hebrew ones. Due to Germany's loss in WWII, its post-WWII state was a liberal state and was occupied by the Allied forces consisting of troops from the U.S., U.K., France, and the Soviet Union. This might have influenced Germany's perception and approach to liberalism.
Among other proposals for reducing language bias on the four sites, the paper proposes that
"[Wikipedia] could potentially invite scholars to contribute articles in other languages to improve the multilingual coverage of the site. Additionally, Wikipedia could merge non-overlapping sections of articles on the same term but written in different languages into a single article, like how multiple branches of code are merged on GitHub. Like Google, Wikipedia could translate the newly inserted paragraphs into the user’s target language and place a tag to indicate its source language.
Returning to their title metaphor, the authors give Wikipedia credit for at least "show[ing] a rough silhouette of the elephant", whereas e.g. Google only "presents a piece of the elephant based on a user's query language". However, this "silhouette – topic coverage – differs by language. [Wikipedia] writes in a descriptive tone and contextualizes first-person narratives and subjective opinions as cultural, historical, or religious phenomena." YouTube, on the other hand, "displays the 'color' and 'texture' of the elephant as it incorporates images and sounds that are effective in invoking emotions." But its top-rated videos "tend to create a more profound ethnocentric experience as they zoom into a highly confined range of topics or views that conform to the majority's interests".
The paper singles out the new AI-based chatbots as particularly problematic regarding language bias:
The problem with language bias is compounded by ChatGPT. As it is primarily trained on English language data, it presents the Anglo-American perspective as truth [even when giving answers in other languages] – as if it were the only valid knowledge.
On the other hand, the paper's examination of the biases of "ChatGPT-Bing" [sic] highlights among other concerns its reliance on Wikipedia among the sources it cites in its output:
[...] all responses list Wikipedia articles as its #1 source, which means that language bias in Wikipedia articles is inevitably permeated in ChatGPT-Bing's answers.
Other recent publications that could not be covered in time for this issue include the items listed below. Contributions, whether reviewing or summarizing newly published research, are always welcome.
From the abstract:[2]
"A systematic review was conducted to identify and evaluate how DH [Digital Humanities] projects perceive and utilize Wikidata, as well as its potential and challenges as demonstrated through use. This research concludes that: (1) Wikidata is understood in the DH projects as a content provider, a platform, and a technology stack; (2) it is commonly implemented for annotation and enrichment, metadata curation, knowledge modelling, and Named Entity Recognition (NER); (3) Most projects tend to consume data from Wikidata, whereas there is more potential to utilize it as a platform and a technology stack to publish data on Wikidata or to create an ecosystem of data exchange; and (4) Projects face two types of challenges: technical issues in the implementations and concerns with Wikidata’s data quality."
From the abstract:[3]
"In this work we introduce WikiEvolve, a dataset for document-level promotional tone detection. Unlike previously proposed datasets, WikiEvolve contains seven versions of the same article from Wikipedia, from different points in its revision history; one with promotional tone, and six without it. This allows for obtaining more precise training signal for learning models from promotional tone detection. [...] In our experiments, our proposed adaptation of gradient reversal improves the accuracy of four different architectures on both in-domain and out-of-domain evaluation."
From the abstract:[4]
"Wikipedia’s policy on maintaining a neutral point of view has inspired recent research on bias detection, including 'weasel words' and 'hedges'. Yet to date, little work has been done on identifying 'puffery,' phrases that are overly positive without a verifiable source. We [...] construct a dataset by combining Wikipedia editorial annotations and information retrieval techniques. We compare several approaches to predicting puffery, and achieve 0.963 f1 score by incorporating citation features into a RoBERTa model. Finally, we demonstrate how to integrate our model with Wikipedia’s public infrastructure [at User:PeacockPhraseFinderBot] to give back to the Wikipedia editor community."
From the abstract:[5]
"The shortage of volunteers brings to Wikipedia many issues, including developing content for over 300 languages at the present. Therefore, the benefit that machines can automatically generate content to reduce human efforts on Wikipedia language projects could be considerable. In this paper, we propose our mapping process for the task of converting Wikidata statements to natural language text (WS2T) for Wikipedia projects at the sentence level. The main step is to organize statements, represented as a group of quadruples and triples, and then to map them to corresponding sentences in English Wikipedia. We evaluate the output corpus in various aspects: sentence structure analysis, noise filtering, and relationships between sentence components based on word embedding models."
Among other examples given in the paper, a Wikidata statement involving the items and properties Q217760, P54, Q221525 and P580 is mapped to the Wikipedia sentence "On 30 January 2010, Wiltord signed with Metz until the end of the season."
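To make the "quadruples and triples" structure concrete, the statement in that example might be written out as below. This is an illustrative sketch rather than the paper's code: P54 is Wikidata's "member of sports team" property and P580 its "start time" qualifier, while the item labels are inferred from the mapped sentence rather than looked up.

```python
# Illustrative only: a Wikidata statement with one qualifier, written out as
# a quadruple (subject, property, value, qualifier) before mapping to text.
statement = {
    "subject":   "Q217760",          # item from the example; presumably Sylvain Wiltord
    "property":  "P54",              # "member of sports team"
    "value":     "Q221525",          # presumably FC Metz, given the mapped sentence
    "qualifier": {"P580": "+2010-01-30T00:00:00Z"},  # "start time"
}

# A naive template-based verbalisation; the WS2T task described in the paper
# learns such mappings from aligned English Wikipedia sentences instead of
# hard-coding them like this.
labels = {"Q217760": "Wiltord", "Q221525": "Metz", "P54": "signed with"}
date = statement["qualifier"]["P580"][1:11]  # "2010-01-30"
print(f"On {date}, {labels[statement['subject']]} "
      f"{labels[statement['property']]} {labels[statement['value']]}.")
```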
Judging from the paper's citations, the authors appear to have been unaware of the Abstract Wikipedia project, which is pursuing a closely related effort.
From the paper:[6]
"[...] the UnderRepresented Writers Knowledge Graph (URW-KG), a dataset of writers and their works targeted at assessing and reducing their potential lack of representation [...] has been designed to support the following research objectives (ROs):
1. Exploring the underrepresentation of non-Western writers in Wikidata by aligning it with external sources of knowledge; [...] A quantitative overview of the information retrieved from external sources [Goodreads, Open Library, and Google Books] shows a significant increase of works (they are 16 times more than in Wikidata) as well as an increase of the information about them. External resources include 787,125 blurbs against the 40,532 present in Wikipedia, and both the number of subjects and publishers extentively grow.
[...] the impact of data from OpenLibrary and Goodreads is more significant for Transnational writers [...] than for Western [...]. This means that the number of Transnational works gathered from external resources is higher, reflecting the wider [compared to Wikidata] preferences of readers and publishers in these crowdsourcing platforms."
To simplify things, years in the headers will link to the documentation for all pranks that year. The Signpost coverage – where available – will be linked in the text, highlighting some of the best or most controversial pranks. Since Wikipedia's early days tended to have the biggest pranks, the second half will cover rather more years.
We didn't have The Signpost to document Wikipedia's first April Fools' Day, and it was fairly tame compared to later years: a proposal to delete the Main Page, an attempt to block localhost for vandalism, and other things mentioned in joking that feel like things that would actually happen later. We did get one rather good news item on our main page:
...but it was short-lived, and compared to what was to come...
While having an Arbitration Committee was controversial in 2005, publishing a blatant hoax as the featured article on the main page and announcing Wikipedia's imminent takeover by Britannica were apparently fine and dandy. And things got more and more goofy as the day went on:
Even the interface changed. The text you clicked on to "edit this page" was replaced with "vandalise this page". And then later...
Our coverage attempts to dig through this chaos. An attempt was made to set some rules, and the original plan for this year – just using a silly article that sounded fake, but was actually real – would be used in later years, instead of inventing fancy mediæval toilet paper holders.
More user interface shenanigans: the "delete" tab became "baleeted". Cyde changed "My watchlist" to "Stalked pages", and was blocked accordingly. Drini (later renamed Magister Mathematicae) wasn't blocked for his unprotection of the main page, though.
I'd say the meanest prank, however, was adding this to the community bulletin board:
Paid editing for all?
Our coverage is here.
The first year The Signpost missed out on any coverage. The big innovation this year was finally implementing Raul's idea from 2005 for Today's featured article:
George Washington was an early inventor of instant coffee, and worked to ensure a full supply to soldiers fighting at the front. Early on, his campaign was based in Brooklyn, but later he crossed into New Jersey toward a more profitable position. In the countryside, he demonstrated a love of wild creatures, and was often seen with a bird or a monkey on his shoulder. Washington's choice beverage was taken up by the soldiers for its psychoactive properties, even though it tasted terrible. Some thought his brewed powder could even remedy the chemical weapons then in use. But, despite this, Washington failed in his first bid for the Presidency, as papers were filed too late, and the nominator forgot to tell him about it. (more...)
Recently featured: New Carissa – Ivan Alexander of Bulgaria – Cleveland
Meanwhile, we rescinded the payments from last year. Recent changes got a new notice:
Wikipedia Announcement: The Wikimedia Foundation has decided there is no other option at the present than to charge people to edit the English Wikipedia. "For too long people have been free to hack this website. It's about time they paid" states Theresa Knott the new funding officer. "Allowing free access to all simply encourages vandalism. By asking for a quid an edit we stop kids vandalising, spammers spamming and edit warriors warrioring " Minor edits will naturally be cheaper, although the exact pricing details have not yet been fully worked out. Debate on this is welcome. All users should register their credit card at Wikipedia:Credit Card Registration by noon on 1.4.07. Otherwise their editing privileges will be suspended. Members of the cabel are, of course, exempt.
Honestly, the Did you know section really knocked it out of the park this year:
(Aside: this is one of the illustrations in Wiener sausage.
...That's a wiener, alright.)
The featured article was Ima Hogg, one of those people who probably hated her parents a bit for their naming choices. To quote the article: She endeavored to downplay her unusual name by signing her first name illegibly and having her stationery printed with "I. Hogg" or "Miss Hogg".
Six administrators were blocked this year, one for making Wikipedia's tagline "From Wikipedia, the free encyclopedia administer [sic] by people with a stick up their lavender passageway". Lovely.
Besides the above-linked article, we also had a short history of April Fools' on Wikipedia.
Finally, my favourite joke nomination at featured picture candidates: "800 x enlargement of a pixel".
Today's featured article: "The Museum of Bad Art (MOBA) is a world-renowned institution dedicated to showcasing the finest art acquired from Boston-area refuse. The museum started in a pile of trash in 1994, in a serendipitous moment when an antiques dealer came across a painting of astonishing power and compositional incompetence that had been tragically discarded."
Other jokes include a proposal to close the English Wikipedia, and the dark Terminal Event Management Policy, about what to do on Wikipedia if the world was ending, particularly useful as Skynet was approved to begin operations.
The page that collects jokes also has this hilarious, but undocumented screenshot:
We briefly covered things, but, honestly, the only thing worth speaking about is the main page fun. The article was wife selling, which is kind of boring, but DYK once again ruled the roost:
More next issue!
The financial sector of the economy has taken a beating over the last few weeks. Three of the larger mid-sized US banks, Silicon Valley Bank (SVB), Signature Bank and Silvergate Capital, collapsed or closed their businesses in March. They had all been important sources of dollars for cryptocurrency traders. The run on SVB was the "largest bank run in modern U.S. history".
Outside the US, Credit Suisse, one of the thirty most systemically important banks in the world, had to be bailed out by another systemically important Swiss bank, UBS, with the help of the Swiss central bank.
Fortunately, the threat of bank runs seems to be over. The US stock market even went up in March. But the finance sector has one eternal problem – the only product it can sell is trust. Bank depositors need to trust that they can get their money back at a moment's notice. Stock market traders should know that buying stocks is risky, but they need to trust that they will be treated fairly by corporate management, and again when the time comes to sell their stocks. Without trust, there is no financial sector.
So should we trust banks? Do they lie to us? This article examines this question from a special Wikipedia point of view. On the English-language Wikipedia, do banks follow Wikipedia's rules on paid editing? Or are the articles about banks riddled with falsehoods placed there by sock puppets? I examine the articles about the thirty most systemically important banks in the world, as listed by the Financial Stability Board.
The answers are not uniform for all thirty banks. On average, each of the bank articles examined has been edited by 17.5 socks. The range is very wide however. The article on France's Groupe BPCE was edited by only one blocked sock, the lowest number. The article on America's Goldman Sachs was edited by 55 blocked socks, the highest number. So it appears that at least some of the world's systemically important banks might have been breaking our rules.
This is not an academic paper – simply a quick overview of an important recent question – but the method used should be explained so that the reader can better follow the evidence presented in the following table and the explanations below it. I've used the same method before in articles about The oligarchs' socks last year and The "largest con in corporate history"? last month. It's just about counting the number of banned or blocked editors who have edited an article and determining how many of those editors were blocked as sock puppets (users who use multiple accounts to deceive) and how many of those socks were blocked or banned for reasons likely to indicate paid editing. In this case I looked for wording in the block summary or in the sockpuppet investigation such as "undeclared paid editing" or UPE, "terms of use" or ToU, "conflict of interest" or COI. I also included those socks who were noted as being blocked by checkusers, which indicates a serious case of socking, where the sock can only be unblocked by another checkuser. The first step – finding the editors who have been banned or blocked on each article – is easily automated. The remaining steps sometimes involve the use of judgement.
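For readers who want to try the automated first step themselves (gathering an article's contributors and checking which of them are currently blocked, and with what stated reason), here is a minimal sketch using the standard MediaWiki API. The script, its helper names and its keyword list are illustrative assumptions rather than the exact tool used for this report, and it only sees current blocks, so it will not exactly reproduce the counts in the table below.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"  # standard MediaWiki API endpoint

def get_contributors(title):
    """Return the registered contributors to an article (paged via pccontinue)."""
    users, params = [], {
        "action": "query", "prop": "contributors", "titles": title,
        "pclimit": "max", "format": "json",
    }
    while True:
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        users += [c["name"] for c in page.get("contributors", [])]
        if "continue" not in data:
            return users
        params.update(data["continue"])

def blocked_with_reasons(usernames):
    """Return {username: block reason} for the accounts that are currently blocked."""
    blocked = {}
    for i in range(0, len(usernames), 50):  # the API accepts up to 50 names per request
        data = requests.get(API, params={
            "action": "query", "list": "users",
            "ususers": "|".join(usernames[i:i + 50]),
            "usprop": "blockinfo", "format": "json",
        }).json()
        for u in data["query"]["users"]:
            if "blockreason" in u:
                blocked[u["name"]] = u["blockreason"]
    return blocked

# Rough heuristic mirroring the judgement calls described above:
# flag block reasons that hint at paid editing or checkuser action.
KEYWORDS = ("paid", "upe", "terms of use", "tou", "coi",
            "conflict of interest", "checkuser")
blocked = blocked_with_reasons(get_contributors("Goldman Sachs"))
suspicious = {u: r for u, r in blocked.items()
              if any(k in r.lower() for k in KEYWORDS)}
print(len(blocked), "blocked contributors;",
      len(suspicious), "with UPE/COI-style block reasons")
```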
The bank articles examined are about the thirty most systemically important banks in the world. These banks are all active internationally, are generally large and well known, and are involved in some of the more complex and controversial areas of the banking business, such as derivatives trading. They are selected by the Financial Stability Board because they present the largest risk to the global economy if they were to fail. They should also be the most trustworthy banks in the world, if only because they are the most tightly regulated, both domestically and internationally.
The table below starts with Risk Class in the first column (officially, the Financial Stability Board calls these classes Buckets); JP Morgan Chase, the only bank in the highest class, should be viewed as the one whose failure would matter most among the 30 systemically important banks listed. The fourth column gives the total number of editors blocked or banned who have edited the article. This number includes blocked bots, editors blocked for vandalism, and editors blocked for many other wiki-sins likely unrelated to paid editing. The fifth column is the most important. It includes sock puppets blocked for paid editing or conflict of interest violations, plus official checkuser blocks. The sixth column includes other socks who may or may not be suspected of paid editing.
Many sock puppets and sock farms have edited several of these articles, suggesting that there might be a network of socks operating in banking sector articles. The seventh column lists several socks who have attracted my attention, mostly for the number of these articles they have edited. But others are also listed, such as Russavia and Eostrix, who have become infamous on Wikipedia for their editing or other actions. Similarly I have listed some of the sock farms (large collections of sock puppets apparently working together) in the final column.
| Risk class | Country | Bank | Total blocked or banned | COI, UPE, or checkuser blocked editors | Other blocked socks | Selected socks | Selected sock farms |
|---|---|---|---|---|---|---|---|
| 4 | US | JP Morgan Chase | 84 | 11 | 20 | Anandmoorti, Cyberfan195, Kkm010, LivinRealGüd | JayJasper, MP1440, Rock5410, VentureKit |
| 3 | US | Bank of America | 125 | 21 | 28 | Anandmoorti, Coffeedrinker115, Cyberfan195, LivinRealGüd, WikiDon | JayJasper, MP1440, VentureKit, Wikiwriter700, Yoodaba |
| 3 | US | Citigroup | 88 | 12 | 28 | Anandmoorti, Coffeedrinker115, Cyberfan195, Kkm010, Pig de Wig | JayJasper, MP1440, Rock5410, VentureKit, Wikiwriter700 |
| 3 | UK | HSBC | 72 | 12 | 21 | Anandmoorti, Coffeedrinker115, Kkm010, WikiDon | GoldDragon, JayJasper, VentureKit, Yoodaba |
| 2 | CHN | Bank of China | 32 | 2 | 11 | Kkm010, Russavia | Yoodaba |
| 2 | UK | Barclays | 63 | 7 | 9 | Coffeedrinker115, Cyberfan195, Kkm010 | Rock5410 |
| 2 | FRA | BNP Paribas | 38 | 8 | 9 | Avaya1, Cyberfan195, Kkm010 | JayJasper, Yoodaba |
| 2 | DEU | Deutsche Bank | 65 | 21 | 17 | Cyberfan195, CLCStudent, Kkm010, LivinRealGüd | Excel23, MP1440, VentureKit, Wikiwriter700, Yoodaba |
| 2 | US | Goldman Sachs | 108 | 28 | 27 | Cyberfan195, CLCStudent, Glaewnis, Kkm010, LivinRealGüd, Russavia | AlexLevyOne, JayJasper, MP1440, VentureKit, Wikiwriter700, Yoodaba |
| 2 | CHN | Industrial and Commercial Bank of China | 27 | 3 | 10 | Cyberfan195, Kkm010, Russavia | Yoodaba |
| 2 | JPN | Mitsubishi UFJ Financial Group | 16 | 1 | 2 | Cyberfan195, Kkm010 | |
| 1 | CHN | Agricultural Bank of China | 26 | 1 | 11 | Cyberfan195, Kkm010, Russavia | |
| 1 | US | BNY Mellon | 20 | 5 | 3 | Cyberfan195, Kkm010 | JayJasper |
| 1 | CHN | China Construction Bank | 19 | 0 | 6 | Cyberfan195, Kkm010 | JayJasper |
| 1 | CH | Credit Suisse | 30 | 6 | 5 | Cyberfan195, Kkm010, LivinRealGüd | MP1440 |
| 1 | FRA | Groupe BPCE | 3 | 0 | 1 | Bitholov | |
| 1 | FRA | Crédit Agricole | 13 | 0 | 2 | Cyberfan195, Kkm010 | |
| 1 | NLD | ING | 38 | 4 | 22 | Anandmoorti, Cyberfan195, Kkm010 | Rock5410, VentureKit |
| 1 | JPN | Mizuho Financial Group | 17 | 2 | 4 | Cyberfan195, Kkm010 | |
| 1 | US | Morgan Stanley | 44 | 8 | 12 | CLCStudent, Cyberfan195, Kkm010, LivinRealGüd | MP1440, VentureKit, Wikiwriter700, Yoodaba |
| 1 | CAN | Royal Bank of Canada | 28 | 6 | 9 | Cyberfan195, Eostrix, Torontopedia | GoldDragon, Yoodaba |
| 1 | ESP | Banco Santander | 36 | 7 | 10 | CLCStudent, Cyberfan195, Kkm010 | VentureKit |
| 1 | FRA | Société Générale | 32 | 6 | 8 | Cyberfan195, Glaewnis, Kkm010 | MP1440 |
| 1 | UK | Standard Chartered | 31 | 5 | 5 | Cyberfan195, Kkm010 | Rock5410, VentureKit |
| 1 | US | State Street | 20 | 3 | 2 | Cyberfan195, LivinRealGüd, Pig de Wig | VentureKit, Yoodaba |
| 1 | JPN | Sumitomo Mitsui | 12 | 3 | 1 | Cyberfan195 | |
| 1 | CAN | Toronto-Dominion Bank | 32 | 8 | 4 | Cyberfan195 | Excel23, MP1440 |
| 1 | CH | UBS | 51 | 9 | 10 | Cyberfan195, Kkm010, LivinRealGüd | JayJasper, John254 |
| 1 | ITA | UniCredit | 14 | 0 | 3 | Cyberfan195 | |
| 1 | US | Wells Fargo | 76 | 11 | 26 | CLCStudent, Cyberfan195, Kkm010, LivinRealGüd, Pig de Wig | Excel23, JayJasper, MP1440, Rock5410, Yoodaba |
The table shows a range of sock puppet editing among these articles on systemically important banks. Some banks such as the French banks Groupe BPCE and Crédit Agricole, Italy's UniCredit, and the China Construction Bank show no sock puppeting by the most likely paid editing socks. On the other hand, three banks, America's Goldman Sachs, the Bank of America, and Germany's Deutsche Bank, have had over 20 socks of this type editing their articles. In general, with some exceptions, American banks have had the most edits by the most likely paid editing socks. Chinese and Japanese banks, along with the above French and Italian banks have the fewest socks of this type editing the articles about them. The two Swiss banks in the news, Credit Suisse and UBS, are in the broad middle ground with only six and nine socks of this type editing the articles about them.
The articles on banks in the highest two risk classes tend to have higher indications of socking than other classes and the articles about the lowest risk class have the lowest indications of socking, with the possible exception of Wells Fargo.
The seventh column shows a selected group of editors who have edited the articles and been blocked for socking. Glaewnis, whose block summary reads "UPE – appeal is only to the Arbitration Committee", has edited two of these articles. Kkm010, who edited several articles on the Adani Group, edited at least 23 of these articles. Several other now-blocked socks edited multiple articles in this list.
Perhaps the most interesting column in the table is the final one showing the sock farms who edited multiple articles. The Yoodaba sock farm edited at least 11 of these articles. They are known for editing business articles, especially finance articles, as well as political articles. The JayJasper and VentureKit sock farms edited almost as many.
We remind our readers that no examination based purely on Wikipedia's edit history can prove or disprove whether an editor has been paid to edit articles. Nevertheless, we can say that we have little or no reason to suspect those twelve banks which have fewer than five editors listed in column 5 of paying for Wikipedia editing. Similarly, we might say that the three banks with more than twenty editors listed in column 5 are the most likely among these thirty banks to have paid for Wikipedia editing.
None of this evidence can be taken as final proof of any rules being broken, but there certainly is some interesting evidence.