swiftywizard@discuss.tchncs.de to Programmer Humor@programming.dev · 29 days ago — "Lavalamp too hot" (discuss.tchncs.de, image, 77 comments)
dream_weasel@sh.itjust.works · 7 days ago — This kind of stuff happens on any model you train from scratch, even before training for multi-step reasoning. It seems to happen more when there's not enough data in the training set, but it's not an intentional addition. Output length is a whole deal.