A study from Profound of OpenAI’s ChatGPT, Google AI Overviews and Perplexity shows that while ChatGPT mostly sources its information from Wikipedia, Google AI Overviews and Perplexity mostly source their information from Reddit.
I think the academic advice about Wikipedia was sadly mistaken. It's true that Wikipedia contains errors, but so do other sources. The problem was that it was a new thing, and the idea that someone could vandalize a page startled people. It turns out, though, that Wikipedia has pretty good controls for this within a reasonable time window. And there's a history of edits. And most pages are accurate and free from vandalism.
Just as you should not uncritically read any of your other sources, you shouldn't uncritically read Wikipedia. But if you are going to read uncritically, Wikipedia is far from the worst thing to blindly trust.
Yeah, a lot of people had your perspective about Wikipedia while I was in college, but they are wrong, according to Wikipedia.
From the link:
I personally use ChatGPT like I would Wikipedia. It’s a great introduction to a subject, especially in my line of work, which is software development. I can get summarized information about new languages and frameworks really quickly, and then I can dive into the official documentation when I have a high level understanding of the topic at hand. Unfortunately, most people do not use LLMs this way.
The whole paragraph is kinda FUD except for this. Normal research practice is to (get ready for a shock) do research, not just copy a high-level summary of what other people have done. If your professors were saying, "don't cite encyclopedias, and that includes Wikipedia," then that's fine. But my experience was that Wikipedia was specifically called out as being especially unreliable, and that's just nonsense.
Eesh. The value of a tertiary source is that it cites the secondary sources (which cite the primary). If you strip that out, how’s it different from “some guy told me…”? I think your professors did a bad job of teaching you about how to read sources. Maybe because they didn’t know themselves. :-(
Let me clarify then. It's unreliable as a cited source in academia. I'm drawing a parallel and criticizing the way people use ChatGPT, i.e., taking it at face value with zero caution and using it as if it were a primary source of information.
Did you read beyond the sentence that you quoted?
Here:
Example: you're a junior developer trying to figure out what this JavaScript syntax is:
const {x} = response?.data
It's difficult to figure out what destructuring and optional chaining are without knowing what they're called. With ChatGPT, you can copy and paste that code and ask "tell me what every piece of syntax is in this line of JavaScript." Then you can check the official docs to learn more.
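To make that concrete, here's a rough sketch of what those two features are doing, using a made-up response object (the names are just for illustration):

// Hypothetical response object, purely for illustration
const response = { data: { x: 42 } };

// Optional chaining: response?.data evaluates to undefined instead of
// throwing if response is null or undefined.
const data = (response === null || response === undefined) ? undefined : response.data;

// Destructuring: pull the property named x out of data.
// (Like the original one-liner, this throws if data is undefined.)
const { x } = data;

console.log(x); // 42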