• CyberSeeker@discuss.tchncs.de
    9 months ago

    Who cares if the code is open source, or pre-training weights are released? Virtually every Masters in CS student in 2024 is building this from scratch. The differentiator is the training dataset, or at worst, the weights after fine tuning the model.

  • BetaDoggo_@lemmy.world
    9 months ago

    Its size makes it basically useless. It underperforms models even in its active-weight class. It’s nice that it’s available, but Grok-0 would have been far more interesting.

  • AutoTL;DR@lemmings.world
    9 months ago

    This is the best summary I could come up with:

    It is not fine-tuned for applications such as natural language dialog; it represents the raw base model checkpoint from the pre-training phase, which concluded in October 2023.

    Grok will be familiar to users of Musk’s social media platform, X, and subscribers have been able to ask the chatbot questions and receive answers.

    If a user flicks through a dog-eared copy of The Hitchhiker’s Guide to the Galaxy radio scripts, the following definition can be found lurking in Fit the Tenth: "The Hitchhiker’s Guide to the Galaxy is an indispensable companion to all those who are keen to make sense of life in an infinitely complex and confusing universe, for though it cannot hope to be useful or informative on all matters, it does make the reassuring claim that where it is inaccurate, it is at least definitively inaccurate."

    The release comes on the first anniversary of the launch of OpenAI’s GPT-4 model, and Musk’s legal spat with his former AI pals remains in the background.

    OpenAI responded by releasing a trove of emails, claiming Musk was fully aware of its plans and wanted it folded into Tesla.

    By opening up the weights behind Grok-1, Musk is attempting to plant a flag in the opposite camp to the proprietary world of OpenAI.


    The original article contains 639 words, the summary contains 210 words. Saved 67%. I’m a bot and I’m open source!