fossilesque@mander.xyz (mod) to Science Memes@mander.xyz · English · 2 months ago — "Squiggly Boi" (image · 67 comments · 877 upvotes)
Tamo240@programming.dev · 2 months ago: It's an abstraction for neural networks. Different individual networks might vary in number of layers (columns), nodes (circles), or loss function (lines), but the concept is consistent across all.
NotANumber@lemmy.dbzer0.com · edited · 2 months ago: Kinda but also no. That's specifically a dense neural network or MLP. It gets a lot more complicated than that in some cases.
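For anyone curious what the diagram being discussed actually computes: a minimal sketch of a dense network (MLP) forward pass, using NumPy. The layer sizes and random weights here are arbitrary illustrations, not anything from the meme itself — the point is just that the circles are the entries of each layer's vector and the lines are the entries of each weight matrix.

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass through a dense (fully connected) network.

    Each layer computes y = relu(W @ x + b). In the classic diagram,
    the circles in a column are the entries of x (or y), and the lines
    between columns are the entries of W.
    """
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0, W @ x + b)       # hidden layers with ReLU
    return weights[-1] @ x + biases[-1]    # linear output layer

# A toy 3 -> 4 -> 2 network (sizes chosen only for illustration)
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(2, 4))]
biases = [np.zeros(4), np.zeros(2)]
out = mlp_forward(np.ones(3), weights, biases)
print(out.shape)  # (2,)
```

As the reply notes, this only covers the dense/MLP case — convolutional, recurrent, and attention-based networks replace the plain matrix multiply with other structures, so the columns-and-lines picture stops being a faithful abstraction there.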