

It’s like that “Joey, repeat after me” meme from Friends haha
I mean it implies that they CAN start with the conclusion or the “thought” and then generate the text to verbalize that.
It’s shocking the lengths humans will go to in explaining how their wetware neural network is fundamentally different, and how it’s impossible for LLMs to think or reason in any way. Honestly, LLMs teach us more about human intelligence (or the lack thereof) than about machine intelligence. Like Obi-Wan said, “The ability to speak does not make you intelligent” haha.
But how is this different from your average redditor?
You know they don’t think - even though “It’s a peculiar truth that we don’t understand how large language models (LLMs) actually work.”?
It’s truly shocking to read this from a mess of connected neurons and synapses like yourself. You’re simply doing fancy next-word prediction /s
These were shit graphics even back in 1996, because they only use primary or fully saturated colors. It’s “dev art”, made by someone with no artistic talent. Or maybe made for 4-year-old kids. I missed out on a lot of games with good gameplay because I just can’t stand this abomination of a color palette.
The worst thing? When games actually went with a more pastel or naturalistic color palette, moronic games journalists would say the colors look “drab” or some shit.
Better yet, teach AI to write code that replaces specific optimized AI networks. Then automatically profile, optimize, and unit test it!