QuentinCallaghan@sopuli.xyz to Political Memes · English · 1 month ago
Meanwhile at DeepSeek (image, 142 comments)
474D@lemmy.world · 1 month ago
You can do it in LM Studio in like 5 clicks; I'm currently using it.

AtHeartEngineer@lemmy.world · 1 month ago
Running an uncensored DeepSeek model that doesn't perform significantly worse than the regular DeepSeek models? I know how to download and run models; I just haven't seen an uncensored DeepSeek model that performs as well as the baseline one.

474D@lemmy.world · 1 month ago
I mean, obviously you need to run a lower-parameter model locally. That's not a fault of the model; you just don't have the same computational power.

AtHeartEngineer@lemmy.world · 1 month ago
In both cases I was talking about local models: the deepseek-r1 32B-parameter model vs. an equivalent uncensored one from Hugging Face.
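The "computational power" point above mostly comes down to memory. A rough back-of-envelope sketch (my own illustration, not from the thread; it ignores KV cache and runtime overhead) of why a 32B-parameter model like the deepseek-r1 distill only fits on consumer hardware when quantized:

```python
def approx_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate GB needed just to hold the model weights.

    Ignores KV cache, activations, and runtime overhead, so real
    requirements are somewhat higher.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 32B model at common precisions (approximate):
print(approx_weight_gb(32, 16))  # fp16: 64.0 GB -> out of reach for one consumer GPU
print(approx_weight_gb(32, 4))   # 4-bit quant: 16.0 GB -> fits a 24 GB GPU
```

This is why tools like LM Studio typically serve quantized builds: the same weights at 4-bit take a quarter of the memory of fp16, at some cost in quality.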