It seems to be adding tons of articles, then deleting some of them.
I assume it's been allocated lots of compute.
The model is outcompeting Wikipedia on sheer number of pages per topic.
If Wikipedia merges an article into a broader one while Grokipedia keeps a
dedicated page for it, search engines and LLMs will surface the Grokipedia version front and center.
Grokipedia seems to have no scope limit, so topics Wikipedia rejects as "non-notable"
still get their own pages; searches for those topic names will be SEO-funneled
toward whichever sites carry them, eventually settling on AI content farms as the primary destination.