misk@sopuli.xyz to Technology@lemmy.world, English · 1 day ago
Alibaba Releases Advanced Open Video Model, Immediately Becomes AI Porn Machine (www.404media.co)
206 comments · cross-posted to: [email protected]
AbsoluteChicagoDog@lemm.ee · 31 points · 11 hours ago
What’s the URL for this generator so I can avoid it?
brucethemoose@lemmy.world · 3 points · edited · 4 hours ago
This is Lemmy. Why not self-host the generation? :)
SippyCup@feddit.nl · 27 points · 10 hours ago
Definitely don’t click on this link, otherwise you might try to install an AI locally.
madcaesar@lemmy.world · 5 points · 8 hours ago
Depraved! Disgusting! I’d never! Unless??? (👁 ͜ʖ👁)
morrowind@lemm.ee · 4 points · 7 hours ago
Good luck trying to run a video model locally, unless you have top-tier hardware.
brucethemoose@lemmy.world · 4 points · 7 hours ago
1.4B should be surprisingly doable, especially once quantization/optimized kernels are figured out. HunyuanVideo can already run on a 12GB desktop 3060.
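For context on that last claim, here is a minimal local-generation sketch using the Hugging Face diffusers HunyuanVideo pipeline. The community model id, resolution, and frame count are illustrative rather than a tuned recipe; VAE tiling plus CPU offload are the memory-saving tricks that make ~12 GB cards plausible.

```python
# Minimal text-to-video sketch with the diffusers HunyuanVideo pipeline.
# Assumes a recent diffusers install with HunyuanVideo support and a CUDA GPU;
# the model id and generation parameters below are illustrative.
import torch
from diffusers import HunyuanVideoPipeline, HunyuanVideoTransformer3DModel
from diffusers.utils import export_to_video

model_id = "hunyuanvideo-community/HunyuanVideo"

# Load the transformer in bf16 to roughly halve its VRAM footprint vs fp32.
transformer = HunyuanVideoTransformer3DModel.from_pretrained(
    model_id, subfolder="transformer", torch_dtype=torch.bfloat16
)
pipe = HunyuanVideoPipeline.from_pretrained(
    model_id, transformer=transformer, torch_dtype=torch.float16
)

# These two calls are what make small cards viable: decode the VAE in tiles,
# and stream model components between CPU and GPU instead of keeping
# everything resident in VRAM at once.
pipe.vae.enable_tiling()
pipe.enable_model_cpu_offload()

video = pipe(
    prompt="A cat walks on the grass, realistic style",
    height=320,
    width=512,
    num_frames=61,
    num_inference_steps=30,
).frames[0]
export_to_video(video, "output.mp4", fps=15)
```

Expect generation to be slow with offloading enabled; the trade is wall-clock time for fitting in consumer VRAM.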