alessandro@lemmy.ca to PC Gaming@lemmy.ca · 8 months ago
Convai aims to put an end to basic 'one-line' NPCs with AI but says it 'needs more high quality writers and artists, not less' to get it done (www.pcgamer.com)
cross-posted to: games@sh.itjust.works
tvbusy@lemmy.dbzer0.com · 8 months ago
Internet connection required at all times? No thanks.
umbrella@lemmy.ml · 8 months ago
Tbf, you would need a pretty beefy GPU to do both rendering and AI locally. As much as I hate to say it (because this idea sounds awesome), the tech is not there yet, and depending on the cloud for this always goes wrong.
cynar@lemmy.world · 8 months ago
A limited LLM would run on a lot of newer graphics cards. It could also be done as a semi-online thing: if you have the grunt, you can run it locally; otherwise, you can farm it out to an online server.
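A minimal sketch of the local-first / cloud-fallback setup described above, assuming a hypothetical llama.cpp-style local server and a hypothetical hosted endpoint; the URLs, model threshold, and NPC prompt are illustrative, not a real game or vendor API:

```python
# Hypothetical sketch of "run it locally if you have the grunt, otherwise
# farm it out to an online server". Endpoints and the VRAM threshold are
# illustrative assumptions, not a real NPC-dialogue API.
import subprocess
import requests

LOCAL_ENDPOINT = "http://localhost:8080/v1/completions"            # assumed local LLM server
CLOUD_ENDPOINT = "https://npc-dialogue.example.com/v1/completions"  # assumed hosted fallback
MIN_LOCAL_VRAM_GB = 8.0  # assumed minimum for running a small local model


def detect_local_vram_gb() -> float:
    """Rough check of available GPU memory via nvidia-smi; returns 0 if none is found."""
    try:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
            text=True,
        )
        return max(float(line) for line in out.splitlines()) / 1024.0  # MiB -> GiB
    except Exception:
        return 0.0


def npc_reply(prompt: str) -> str:
    """Generate an NPC line locally when the hardware allows, otherwise via the cloud."""
    endpoint = LOCAL_ENDPOINT if detect_local_vram_gb() >= MIN_LOCAL_VRAM_GB else CLOUD_ENDPOINT
    resp = requests.post(endpoint, json={"prompt": prompt, "max_tokens": 64}, timeout=10)
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"].strip()


if __name__ == "__main__":
    print(npc_reply("The blacksmith greets a traveller asking about the old mine:"))
```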