AI can be a useful tool, and I think it will slowly become more common in the workplace; for example, it can be very convenient for knowledge retrieval. But it’s laughable to think that it can replace humans. I’d wager that any time “AI” can replace a human, the job could already have been automated through other means.
LLMs are absolute garbage for knowledge retrieval.
Generalized LLMs like ChatGPT are. If you train a model on your own documentation, then all it “knows” is what’s in the docs, and it can perform very well at finding relevant results. It’s essentially a context-aware search engine at that point.
The problem, again, is that companies mostly aren’t doing that; they’re trying to replace humans with ChatGPT.
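To make the “context-aware search engine” idea concrete, here’s a rough sketch of retrieval over your own docs. It uses scikit-learn TF-IDF as a stand-in for whatever embeddings you’d actually use, the sample docs are made up, and answer_with_llm() is a hypothetical placeholder for the model call, not a real API:

    # Minimal docs-grounded retrieval sketch. TF-IDF stands in for embeddings;
    # answer_with_llm() is a hypothetical placeholder for the model call.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [  # made-up internal documentation
        "To rotate an API key, open Settings > Keys and click Rotate.",
        "Deployments run nightly at 02:00 UTC from the main branch.",
        "On-call escalation goes pager -> team lead -> engineering manager.",
    ]

    vectorizer = TfidfVectorizer()
    doc_matrix = vectorizer.fit_transform(docs)

    def retrieve(query, k=2):
        """Return the k docs most similar to the query, with their scores."""
        scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
        ranked = scores.argsort()[::-1][:k]
        return [(docs[i], float(scores[i])) for i in ranked]

    query = "how do I rotate an api key?"
    context = "\n".join(doc for doc, _ in retrieve(query))
    prompt = f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"
    # answer = answer_with_llm(prompt)  # hypothetical model call

The model only ever sees text pulled from the docs, which is what keeps it behaving like search rather than open-ended generation.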
Except that your context-aware search engine would tell you when there is no result, while AI will just make shit up and distort the results it did find.
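That “tell you when there is no result” behavior can be bolted back on by gating the model call on the retrieval score. A sketch, reusing the hypothetical retrieve() helper above; the 0.2 cutoff is an arbitrary assumption that would need tuning:

    # Refuse to answer when retrieval finds nothing relevant, instead of
    # letting the model guess. Threshold value is arbitrary.
    NO_RESULT_THRESHOLD = 0.2

    def grounded_answer(query):
        hits = retrieve(query, k=2)
        best_score = hits[0][1] if hits else 0.0
        if best_score < NO_RESULT_THRESHOLD:
            return "No relevant documentation found."  # explicit miss, no guessing
        context = "\n".join(doc for doc, _ in hits)
        prompt = f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"
        # return answer_with_llm(prompt)  # hypothetical model call
        return prompt

It doesn’t eliminate hallucination inside the answer, but it at least distinguishes “nothing found” from “here is something plausible-sounding.”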
That’s not true.
Vector DBs and LLMs are really powerful at knowledge retrieval.
See NotebookLM and its open-source alternatives.
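The vector-DB version of the same retrieval step is only a few lines. A sketch assuming the chromadb Python client (Chroma embeds the documents with its default embedding model; the exact API is worth checking against the current docs), with the same made-up sample docs as above:

    # Sketch of vector-DB retrieval with Chroma; results feed an LLM as context.
    import chromadb

    client = chromadb.Client()  # in-memory; a persistent client exists for disk
    collection = client.create_collection(name="internal_docs")

    collection.add(
        ids=["doc1", "doc2", "doc3"],
        documents=[
            "To rotate an API key, open Settings > Keys and click Rotate.",
            "Deployments run nightly at 02:00 UTC from the main branch.",
            "On-call escalation goes pager -> team lead -> engineering manager.",
        ],
    )

    results = collection.query(
        query_texts=["how do I rotate an api key?"],
        n_results=2,
    )
    print(results["documents"][0])  # nearest docs for the first (only) query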