In contrast, citizens in China, India, and Turkiye prefer a quick end to the war even if Ukraine has to concede territory.
i wouldn’t say this is supporting Russia, more like wanting an end to this senseless violence (actually interesting how this is a virtually non-existent trend among westerners; i’m assuming it’s because they’ve never had any proximity to similar violence and misery, and are therefore ready to sacrifice as many Ukrainians as it takes) :(
certainly models with more weights contain more general information, which is pretty useful if you’re using a model as a sort of secondary search engine, but models can be very performant on certain benchmarks while containing little general data
this isn’t really by design, up until now (and it’s still continuing to be that way); it’s just that we don’t know how to create an LLM that can generate coherent text without absorbing a huge portion of the training material
i’ve tried several models based on facebook’s llama LLMs, and i can say that the 13B and definitely the 30B versions are comparable to chatGPT in terms of quality (maybe not in terms of the amount of information they have access to, but definitely in other regards)
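for anyone curious, here’s a minimal sketch of what trying one of these looks like with huggingface transformers (the model directory is hypothetical, since you have to convert the llama weights yourself, and 8-bit loading assumes you have bitsandbytes and a CUDA GPU):

```python
# minimal sketch: loading a local llama-13B checkpoint with transformers.
# "path/to/llama-13b-hf" is a hypothetical directory of converted weights;
# load_in_8bit needs the bitsandbytes package and a CUDA GPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/llama-13b-hf"  # hypothetical: your own converted weights
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",   # spread layers across available GPUs/CPU
    load_in_8bit=True,   # ~13 GB for 13B params instead of ~26 GB at fp16
)

prompt = "Explain why smaller fine-tuned models can rival larger ones:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

at 8-bit the 13B weights fit in roughly 13 GB, which is why it’s runnable on a single beefy consumer GPU rather than a datacenter rig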
recent advancements in LLMs that are small enough to be accessible to regular ppl (alpaca/llama) but also performant enough to sometimes even outperform chatGPT are more interesting to me personally
while the size of this model is certainly super impressive, even if the weights were released, it would require roughly half a terabyte of VRAM at int4 MINIMUM, so you’d need ~100k usd of hardware just to run inference on this thing at decent accuracy :(
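to spell the arithmetic out (the parameter count below is my own assumption, not a disclosed figure; it’s just the ballpark that makes “half a terabyte at int4” come out right):

```python
# back-of-the-envelope VRAM estimate for inference: weights only,
# ignoring activations and the KV cache, so real usage is higher.
BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_vram_gb(n_params: float, precision: str) -> float:
    """GB of memory needed just to hold the weights at a given precision."""
    return n_params * BYTES_PER_PARAM[precision] / 1e9

# assumed 1 trillion parameters -- NOT a disclosed figure, just the
# size that matches the "half a terabyte at int4" estimate above
n = 1e12
for p in ("fp16", "int8", "int4"):
    print(f"{p}: ~{weight_vram_gb(n, p):,.0f} GB")
# fp16: ~2,000 GB   int8: ~1,000 GB   int4: ~500 GB
```

and at roughly 15k usd per 80 GB A100, the seven or so cards needed to hold 500 GB of weights land right around that ~100k figure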
my brother in christ, read the article: it links directly to a government website that publishes preliminary drafts of legislative documents
kinda sick of ppl here not bothering to extend their critique of the source material beyond “publisher bad” lmao