• dinckel@lemmy.world

    It’s a really bold claim. Every time a new package manager and/or dependency resolver comes around, we get the exact same headline.

    • BitSound@lemmy.world

      It is a bold claim, but based on their success with ruff, I’m optimistic that it might pan out.

      • monogram@feddit.nl

        pipx, poetry, pipsi, fades, pae, pactivate, pyenv, virtualenv, pipenv

        Let’s hope this next one will be the true standard.

          • dallen@programming.dev

            I’ve been mostly a poetry guy but have tested out uv a bit lately. The two main advantages I see: it can install Python itself (I relied on pyenv for that before), and it’s waaay faster at resolving and installing dependencies.
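
            For anyone curious, the Python-install side is roughly this (the version number is just an example):

            uv python install 3.12
            uv venv --python 3.12

            which covers what I was using pyenv for.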

            • sugar_in_your_tea@sh.itjust.works

              Yeah, it certainly looks nice, but my problems are:

              • everything runs in a Docker container locally, so I don’t think the caching is going to be a huge win
              • we have a half-dozen teams and a dozen or so repositories across three time zones, so big changes require a fair amount of effort
              • we just finished porting to poetry specifically to split our dependencies into groups, and going back to not having that is a tough sell

              So for me, it needs at least feature parity with poetry’s dependency groups (the kind of setup sketched below) before I’d seriously consider it.
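
              For context, the kind of setup we rely on is roughly this (group and package names are just examples):

              [tool.poetry.group.test.dependencies]
              pytest = "^8.0"
              coverage = "^7.0"

              installed in CI with poetry install --with test.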

              • Eager Eagle@lemmy.world

                uv is still faster with a cold cache

                and uv does have dep groups

                as for the second problem, there’s an open issue about writing a migration guide, but migrating manually is not too difficult.

                • sugar_in_your_tea@sh.itjust.works

                  I’m not really worried about the migration work; from what I can tell, it’s basically just moving a few things around. I’m more worried about giving up features the team likes just to gain performance.

                  Our primary use cases are:

                  • dev tools - standardize versions of tools like black, pylint, etc.; not necessary if we move to ruff, since we’d just standardize on a version of that instead (like we do with poetry today)
                  • tests - extra deps for CI/CD for things like coverage reports

                  I like the syntax poetry has, but I’d be willing to use something else, like what’s proposed in PEP 735.
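
                  If I’m reading the draft right, that would look something like this in pyproject.toml (group contents are just examples):

                  [dependency-groups]
                  dev = ["black", "pylint"]
                  test = ["coverage"]

                  which would map pretty cleanly onto what we have in poetry today.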

                  One thing we also need is a way to define additional package repos, since we use an internal one. I didn’t see that called out in the PEP, and I haven’t looked at uv enough to know what their plan is, but this issue seems to be intended to fix it. We pin a specific repo for a handful of packages in each project, and we need that to keep working as well.
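
                  For reference, what we do in poetry today is roughly this (names and URL are placeholders):

                  [[tool.poetry.source]]
                  name = "internal"
                  url = "https://pypi.internal.example/simple"
                  priority = "supplemental"

                  [tool.poetry.dependencies]
                  some-internal-pkg = { version = "^1.0", source = "internal" }

                  and whatever we move to needs an equivalent.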

                  I’m currently looking to use ruff to replace some of our dev tools, and I’ll look back at uv in another release or two to see what the progress is on our blockers.

  • ertai@programming.dev

    Yet another Python packager… It’s insane that such a popular language still doesn’t have this basic problem solved.

    • CodeMonkey@programming.dev

      pip is a perfectly usable package manager and is included in most Python distributions now. Is it perfect? No, but it has been good enough for every team I’ve been on.
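
      For what it’s worth, the plain workflow that has been good enough, more or less (file names are just an example):

      python -m venv .venv
      source .venv/bin/activate
      pip install -r requirements.txt        # or: pip install -e .
      pip freeze > requirements-pinned.txt   # pin exact versions for reproducibility

      Not fancy, but it works.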

      • Eager Eagle@lemmy.world

        it’s usable, yet it doesn’t attempt to solve a third of the problems uv, poetry, and pdm address.

        it’s also not hard to end up with a broken env with pip.
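
        a typical way it happens (package names are hypothetical):

        pip install "app-a"   # pulls in somelib<2
        pip install "app-b"   # later pulls somelib up to 2.x to satisfy itself
        pip check             # only now do you see that app-a's requirement is broken

        the second install warns, but still leaves the environment inconsistent.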

      • Moc@lemmy.world

        Except that it’s slower than uv, and therefore strictly worse for build processes.
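
        The nice part is that it can be a near drop-in in a build step, something like:

        uv pip install -r requirements.txt

        assuming the pip-compatible interface covers what the build needs.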

      • corsicanguppy

        pip is …

        … a tool that obscures its state from the system, gleefully supports supply-chain attacks, and has the same bad level of validation as debs.

      • uthredii@programming.dev

        Putting aside the speed, uv has a bunch of features that usually require 2-4 separate tools. Those tools are very popular but not very well liked, and the fact that they are so popular shows that pip isn’t sufficient for many use cases. Other languages have a single, well-liked tool for all of this (e.g. cargo).
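
        Roughly, the consolidation looks like this (commands as I understand them, so worth double-checking against the docs):

        uv venv              # replaces virtualenv / manual venv management
        uv pip install ...   # replaces pip
        uv python install    # replaces pyenv-style interpreter installs
        uvx ruff             # replaces pipx for running tools

        With cargo you get all of that in one well-liked tool, which is the bar people hope uv clears.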

    • sum_yung_gai@lemm.ee

      I use poetry and it works really well. I’d consider the problem solved, but that doesn’t rule out the possibility of a better solution.

    • ertai@programming.dev

      Glad I use Arch, btw; pacman manages my Python packages, so I don’t have to deal with all this mess.

    • corsicanguppy

      still doesn’t have this basic problem solved.

      It was solved by the OS on which it installs, because that’s the system-wide package management of record, and the only one that properly reports to enterprise monitoring.

      Losers reinventing RPM are the same class of slug as those inventing their own crypto.

      • gigachad@sh.itjust.works

        We do geodata science and rely on some pretty specific C++ libraries that are only distributed via conda. On Unix-based systems it’s possible to get some of them from other channels or even build them from source, but we mostly have Windows machines in production, where we’re not that flexible. Docker is unfortunately not an option due to security concerns.

        If you’re asking why I hate it: it’s bloated, uses more space than needed, and it’s rare that I can reproduce an environment from the environment file without running into errors. Using it feels unintuitive; I still google commands after years. It was very slow until recently, when the libmamba solver was finally integrated. Last but not least, licensing is a pain in the ass.
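
        The reproducibility problem in a nutshell, as a rough sketch:

        conda env export > environment.yml     # on the machine where everything works
        conda env create -f environment.yml    # on a fresh machine; solving this often fails for us

        The full export pins platform-specific build strings, which is presumably part of why recreating it elsewhere breaks so often.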

        • db0@lemmy.dbzer0.com

          Interesting. We use conda via micromamba for my own project, as it makes installation much easier for end users: they can just run a shell script that installs Python, CUDA, and all the needed dependencies.
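
          The script boils down to something along these lines (the package list is specific to the project, so treat it as a placeholder):

          curl -Ls https://micro.mamba.pm/api/micromamba/linux-64/latest | tar -xvj bin/micromamba
          ./bin/micromamba create -n app -c conda-forge python=3.11 cudatoolkit

          plus activating the environment and launching the app.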

        • Eager Eagle@lemmy.world

          I share the same frustration with trying to replicate an environment. I’m glad I can avoid it these days; the community needs a way out of the conda lock-in.

        • rutrum@lm.paradisus.day

          I’ve been using micromamba/mamba and haven’t had the solver issues I had with conda. I’m glad conda integrated libmamba.

          Question: why were docker containers deemed security risks?

            • BatmanAoD@programming.dev

              I’m no expert, but isn’t running in a VM strictly better than running on bare metal from a security perspective? It’s generally more locked down, and breaking out of the virtualization layer requires a separate breach beyond gaining access to the running container.

          • gigachad@sh.itjust.works

            Yes, mamba is a huge improvement. Regarding Docker, I can’t really tell you, as I’m not an infrastructure guy.

    • sugar_in_your_tea@sh.itjust.works

      Looks like it has basic support:

      • requires-python = "..."
      • dependencies = [ ... ]
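
      i.e. a bare-bones [project] table along these lines (name and versions are placeholders):

      [project]
      name = "my-service"
      version = "0.1.0"
      requires-python = ">=3.11"
      dependencies = ["requests>=2.31"]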

      Once it gets dependency groups, I’ll try it out. I’m currently using poetry, which works, but I’m always interested in better perf.

      • Eager Eagle@lemmy.world

        it already has dep groups; e.g.

        uv add --optional staging pytest

        then

        uv sync --extra staging

        to install / uninstall packages accordingly.

        They have a --dev shorthand for dev dependencies, but it seems the dependency group PEP is not final, so there isn’t a standardized way of doing this yet.
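
        the shorthand being something like:

        uv add --dev pytest
        uv sync              # dev group is installed by default

        so day-to-day it already covers the common case, just without a finalized spec behind it.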

        • beeng@discuss.tchncs.de

          Private PyPI too?

          We’re coming from poetry, but it’s slow and needs its own .venv, so a single uv binary would be very nice.
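
          From a quick look, the pip-compatible side seems to take the usual index settings, e.g. (URL and package name are placeholders):

          UV_EXTRA_INDEX_URL=https://pypi.internal.example/simple uv pip install our-package

          but I haven’t checked how that plays with the project/lockfile workflow yet.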

        • sugar_in_your_tea@sh.itjust.works

          Oh cool, I’ll definitely look into that.

          And honestly, the one I need more is a test group for CI, for things like coverage reporting and whatnot. If I can get that, and if having multiple package indexes works properly (i.e. it checks my private repo first, then PyPI), I can probably port our projects to uv, at which point it’s an internal discussion instead of a technical one.