swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 2 months ago

Lavalamp too hot
dream_weasel@sh.itjust.works · 2 months ago

This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there isn't enough data in the training set, but it's not an intentional addition. Output length is a whole deal.