Elon Musk envisions Grokipedia, xAI's anti-woke spin on Wikipedia, as an ultimate monument to human knowledge: something complete and honest enough to etch in stone and preserve in space. In reality, it's a hot mess, and it's gotten even worse now that anyone can suggest edits.
Grokipedia was not always editable. When it first launched in October, it was a closed set of roughly 800,000 articles written by Grok. It was chaos then too: racist, transphobic, embarrassingly flattering of Musk, and in places copied directly from Wikipedia. But at least that was to be expected. That changed a few weeks ago, when Musk announced the release of version 0.2 and opened the door for anyone to suggest edits.
Suggesting edits on Grokipedia is simple, so simple that the site apparently doesn't feel the need to give instructions on how to do it. You select some text, click the Suggest Edit button, and fill out a form with a summary of your proposed change, with the option to suggest replacement text and provide supporting sources. Reviewing those suggestions is Grok, xAI's problematic, Musk-worshipping AI chatbot. Grok, yes, the chatbot, is also the one making the actual changes to the site. Most edits on Wikipedia don't require approval either, but there is an active community of human editors who watch the Recent Changes page closely.
It's not entirely clear what changes Grok is making; the system is confusing and not very transparent. Grokipedia tells me 22,319 edits have been approved so far, though I have no way of knowing what they were, which pages they touched, or who suggested them. That contrasts with Wikipedia's well-documented edit logs, which can be sorted by page, user, or, in the case of anonymous users, IP address. My hunch is that many Grokipedia edits add internal links to other Grokipedia pages within articles, though I have no conclusive proof beyond scrolling through a few pages.
The closest I could get to seeing where edits actually landed was on the homepage. There's a small panel below the search bar that rotates through five or so recent updates, though these only give the article's name and say an unspecified edit has been approved. It's not remotely comprehensive, and it's at the mercy of whatever users want to suggest, resulting in a confusing mix of topics. Elon Musk and religious pages were the only things that seemed to pop up frequently when I looked, interspersed with entries like the TV shows Friends and The Traitors UK, and a request to note the potential medical benefits of camel urine.
Wikipedia has a clear timeline of edits showing what happened, who did what, and why, with viewable talk pages for controversial issues. There's also ample guidance on editing style, sourcing requirements, and process, and you can compare revisions directly on the site to see exactly what changed and where. Grokipedia has no such guidelines, and it shows: many submissions are a jumbled mess. It does have an edit history, but it's a nightmare that only gestures at transparency. The history, which displays only the timestamp, the suggestion, Grok's decision, and often convoluted reasoning from the AI, has to be scrolled manually in a small pop-up panel on the side of the page, with no way to skip ahead or sort by time or edit type. It's frustrating even with only a few edits, and it doesn't show where the changes were actually implemented. With more edits, it will be completely unusable.
Unsurprisingly, Grok doesn't seem to be the most consistent editor. That makes for confusing reading at times, and the edit records reveal a lack of clear guidance for would-be editors. For example, the edit history of Musk's bio page shows several suggestions about his daughter, Vivian, who is transgender. Some editors suggest using her name and pronouns consistent with her gender identity; others push the name and pronouns she was assigned at birth. While it's almost impossible to follow exactly what happened, Grok's piecemeal approach to approving edits means there's a confusing mix of the two all over the page.
As a chatbot, Grok is persuadable. In one proposed edit to Musk's bio page, a user wrote, "This statement needs to be fact-checked," referring to a quote linking the fall of Rome to falling birth rates. In a wordier response than the suggestion deserved, Grok dismissed it as unnecessary. Faced with a similar request phrased differently, Grok reached the opposite conclusion, accepting the suggestion and adding the kind of context it had previously called unnecessary. It's not hard to imagine how requests could be massaged to ensure edits get accepted.
While all of this is technically possible on Wikipedia, the site has a small army of volunteer administrators, chosen through a review and election process, to keep things under control. They enforce standards by blocking accounts or IP addresses from editing, and by locking pages in cases of vandalism or edit wars. Grokipedia doesn't appear to have anything equivalent, leaving it entirely at the mercy of random users and a chatbot that once called itself MechaHitler. The problem is already visible on several pages related to World War II and Hitler, for example: I found repeated (rejected) requests to point out that the dictator was also a painter, and to claim that far fewer people died in the Holocaust than actually did. The corresponding Wikipedia pages are "protected," meaning they can only be edited by certain accounts, with detailed logs explaining the decision to protect them. If the editing system, or the site in general, were easier to navigate, I'm sure I'd find more examples.
Pages like these are obvious targets for abuse, and it's no surprise they were among the first hit by malicious editors. They won't be the last, and with Grokipedia's chaotic editing system and Grok's limited guardrails, it may soon be hard to tell what's vandalism and what isn't. At this rate, Grokipedia doesn't look ready for the stars; it looks ready to collapse into a morass of hard-to-read misinformation.