It’s easy to tune a chatbot to confidently spout any bullshit take. But overriding what an AI has learned with alignment steps like this has been shown to measurably weaken its capabilities.
So here we have a guy who’s so butthurt by reality that he decided to make his own product stupider just to reinforce his echo chamber. (I think we all saw this coming.)