Maven (famous)@lemmy.zip to Programmer Humor@lemmy.ml · 4 months ago
In case Copilot was too much work
cross-posted from: https://lemmy.zip/post/27030131

The full repo: https://github.com/vongaisberg/gpt3_macro
fossphi@lemm.ee · 4 months ago
Is this the freaking antithesis of reproducible builds‽ Sheesh, just thinking of the implications in the build pipeline/supply chain makes me shudder.
OsrsNeedsF2P@lemmy.ml · 4 months ago
Just set the temperature to zero, duh.
Finadil@lemmy.world · 4 months ago
Looking at the source, they thankfully already use a temperature of zero, but max tokens is 320. That doesn't seem like much for code, especially since most symbols are a whole token.
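For concreteness, the settings discussed above would sit in the completion request the macro sends at build time. This is only a sketch of such a request body: the temperature of 0 and max_tokens of 320 come from the thread, while the model name and prompt here are placeholder assumptions, not taken from the repo.

```python
import json

def build_request(prompt: str) -> str:
    """Sketch of a GPT-3 completion request body with the settings
    mentioned in the thread. Model name is an assumed placeholder."""
    payload = {
        "model": "text-davinci-003",  # assumption, not from the repo
        "prompt": prompt,
        "temperature": 0,    # greedy decoding, for more reproducible output
        "max_tokens": 320,   # the small budget Finadil points out
    }
    return json.dumps(payload)

# Example: asking for an implementation of a Rust function signature.
print(build_request("fn fizzbuzz(n: u64) -> String"))
```

Even with temperature 0, identical builds are not guaranteed: the model behind the API can change underneath you, which is the reproducibility worry raised above.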
Xanthrax@lemmy.world · 4 months ago
You'd have to consider it somewhat of a black box, which is what people already do.
bountygiver [any]@lemmy.ml · edited · 4 months ago
This is how we end up with lost tech a few decades later.