fossilesque@mander.xyz M to Science Memes@mander.xyz · English · 17 hours ago
Have you know??? (mander.xyz) · 54 comments
NewOldGuard@lemmy.ml · 11 hours ago
The training is a huge power sink, but so is inference (i.e. generating the images). You are absolutely spinning up a bunch of silicon that's sucking back hundreds of watts with each image that's output, on top of the impacts of training the model.
m532@lemmygrad.ml · 5 hours ago
The inference takes <10 Wh, aka pretty much nothing.
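(For scale, per-image energy is just power draw multiplied by generation time. A minimal back-of-envelope sketch in Python, assuming a 300 W GPU draw and 10 seconds per image; both figures are assumptions for illustration, not measurements:)

```python
# Back-of-envelope energy per generated image.
# Both inputs below are assumed values, not measurements.
gpu_power_watts = 300     # assumed GPU draw during inference
seconds_per_image = 10    # assumed generation time per image

# watts * seconds -> watt-hours (divide by 3600 seconds per hour)
energy_wh = gpu_power_watts * seconds_per_image / 3600
print(f"{energy_wh:.2f} Wh per image")  # ~0.83 Wh with these assumptions
```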