• corvus@lemmy.ml · 7 hours ago

    I use the jan-beta GUI; locally you can run any model that supports tool calling, like qwen3-30B or jan-nano. You can download and install MCP servers (from, say, mcp.so) that serve different tools for the model to use, such as web search, deep research, web scraping, and downloading or summarizing videos. There are hundreds of MCP servers for different use cases.
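
    A minimal sketch of what the client side of that looks like, assuming the official Python MCP SDK (`mcp` package); the server command, tool name, and arguments below are hypothetical placeholders standing in for a server installed from mcp.so:

    ```python
    # Spawn a locally installed MCP server over stdio and list the tools it
    # exposes. The tool list (name, description, input schema) is what the
    # GUI hands to a tool-calling model such as qwen3-30B.
    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Hypothetical placeholder for an MCP server installed from mcp.so.
    params = StdioServerParameters(command="npx", args=["-y", "@example/web-search-mcp"])

    async def main() -> None:
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                for tool in tools.tools:
                    print(tool.name, "-", tool.description)
                # Hypothetical call: the tool name and arguments depend on
                # whatever the server actually exposes.
                result = await session.call_tool("web_search", {"query": "lemmy"})
                print(result.content)

    asyncio.run(main())
    ```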

    • wise_pancakeOP · 10 hours ago (edited)

      Basically it’s a layer to let your LLMs plug into tools.

      They generally run on your machine (I use Docker to sandbox them) and may or may not call out to useful APIs.
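
      The Docker-sandbox pattern is just a matter of pointing the client's launch command at `docker run`; this is a sketch under that assumption, with a hypothetical image name:

      ```python
      # Launch the MCP server inside a container instead of directly on the
      # host; the container boundary provides the sandbox.
      from mcp import StdioServerParameters

      params = StdioServerParameters(
          command="docker",
          # "-i" keeps stdin open so the MCP stdio transport works through
          # the pipe; "--rm" cleans the container up on exit.
          args=["run", "-i", "--rm", "example/weather-mcp:latest"],
      )
      ```

      These parameters drop into the same `stdio_client(params)` flow as any other stdio server.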

      One example: I just wrote one to connect to my national weather service's RSS feed, so my LLMs can fetch and summarize the weather for me in the morning.
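
      A minimal sketch of such a server, assuming the official Python MCP SDK (`mcp` package) and the `feedparser` library; the feed URL is a hypothetical placeholder:

      ```python
      # An MCP server exposing one tool that fetches a weather RSS feed and
      # returns the latest entries as text for the model to summarize.
      import feedparser
      from mcp.server.fastmcp import FastMCP

      mcp = FastMCP("weather")

      # Hypothetical placeholder -- substitute your weather service's feed.
      FEED_URL = "https://weather.example.org/city/forecast.rss"

      @mcp.tool()
      def get_weather() -> str:
          """Fetch the latest forecast entries from the weather RSS feed."""
          feed = feedparser.parse(FEED_URL)
          lines = [f"{e.get('title', '')}: {e.get('summary', '')}" for e in feed.entries[:5]]
          return "\n".join(lines) or "No forecast entries found."

      if __name__ == "__main__":
          # Serve over stdio so a local MCP client can spawn the process.
          mcp.run()
      ```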

      Works well with Gemma 3n