quickslowdown a day ago

I highly, highly recommend uv. It resolves & installs dependencies incredibly fast, and the CLI is very intuitive once you've memorized a couple of commands. It handles monorepos well with the "workspaces" concept, it can replace pipx with "uv tool install", it handles building & publishing, and the Docker image is great: you just add a COPY --from line near the top to pull in the uv binary from /uv.
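
To be concrete about the Docker bit: the pattern is just copying the binary out of their image. A sketch (the base image and tag are whatever you use):

```dockerfile
FROM python:3.12-slim
# Copy the uv (and uvx) binaries from the official image.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
```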

I've used 'em all, pip + virtualenv, conda (and all its variants), Poetry, PDM (my personal favorite before switching to uv). Uv handles everything I need in a way that makes it so I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.

I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment block to the top of the script that tells uv which packages to install when you run it. The first time you run "uv run <script_name.py>", uv installs everything you need and executes the script. Subsequent runs use the cached dependencies, so the script starts immediately.

If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install & test, I really recommend giving it a try on your next script or pet project!

  • actinium226 a day ago

    The script thing is great. By the way, those 'oddly formatted' comments at the top are not a uv thing; they're an official Python metadata format (inline script metadata, PEP 723), specifically designed to make it possible for 3rd-party tools like uv to figure out and install the relevant packages.

    And in case it wasn't clear to readers of your comment, uv run script.py creates an ephemeral venv and runs your script in that, so you don't pollute your system env or whatever env you happen to be in.

  • fluidcruft a day ago

    I generally agree, but one thing I find very frustrating (i.e. have not figured out yet) is how to deal with extras well, particularly with pytorch. Some of my machines have a GPU, some don't, and things like "uv add" end up uninstalling everything and installing the opposite, forcing a resync with the appropriate --extra tag. The examples in the docs do things like CPU on Windows and GPU on Linux, but all my boxes are Linux. There has to be a way to tell it "hey, I always want --extra gpu on this box", but I haven't figured it out yet.

    • shawnz a day ago

      Getting the right version of PyTorch installed to have the correct kind of acceleration on each different platform you support has been a long-standing headache across many Python dependency management tools, not just uv. For example, here's the bug in poetry regarding this issue: https://github.com/python-poetry/poetry/issues/6409

      As I understand it, recent versions of PyTorch have made this process somewhat easier, so maybe it's worth another try.

      • fluidcruft 8 hours ago

        uv actually handles the issues described there very well (the uv docs have a page showing a few ways to do it). The issue for me is that uv has massive amnesia about which one was selected, and you end up thrashing packages because of that. uv is very fast at thrashing, though, so it's not as bad as if poetry were thrashing.

      • tmaly 10 hours ago

        I end up going to the torch website and they have a nice little UI I can click what I have and it gives me the pip line to use.

        • shawnz 10 hours ago

          That's fine if you are just trying to get it running on your machine specifically, but the problems come in when you want to support multiple different combinations of OS and compute platform in your project.

      • amelius 17 hours ago

        On nvidia jetson systems, I always end up compiling torchvision, while torch always comes as a wheel. It seems so random.

    • DrBenCarson a day ago

      It sounds like you’re just looking for dependency groups? uv supports adding custom groups (and comes with syntactic sugar for a development group).

      • fluidcruft 12 hours ago

        It is... but basically uv needs to remember which groups are synced. For example, if you use an extra, you have to keep track of it constantly, because sync thrashes between states all the time unless you pay close and tedious attention. At least I haven't figured out how to make it remember which extras are "active".

            uv sync --extra gpu
            uv add matplotlib # the sync this runs undoes the --extra gpu
            uv sync # oops also undoes all the --extra
        
        What you have to do to avoid this is remember to use --no-sync all the time and then meticulously sync manually, listing all the extras you actually want:

            uv sync --extra gpu --extra foo --extra bar
            uv add --no-sync matplotlib
            uv sync --extra gpu --extra foo --extra bar
        
        It's just so... tedious and kludgy. It needs an "extras.lock" or "sync.lock" or something. I would love it if someone tells me I'm wrong and missing something obvious in the docs.
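
        In the meantime I've hacked around it with a tiny wrapper that records the extras in a file and replays them on every sync. To be clear, this is my own kludge, not a uv feature, and the ".uv-extras" file name is made up:

```python
from pathlib import Path

def sync_command(extras_file: str = ".uv-extras") -> list[str]:
    """Build a `uv sync` argv replaying extras recorded in a file."""
    cmd = ["uv", "sync"]
    path = Path(extras_file)
    if path.exists():
        # One extra per whitespace-separated token, e.g. "gpu foo bar".
        for extra in path.read_text().split():
            cmd += ["--extra", extra]
    return cmd
```

        Write "gpu foo bar" into .uv-extras once, then call the wrapper (e.g. subprocess.run(sync_command())) instead of raw "uv sync".
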
        • DrBenCarson 12 hours ago

          To make the change in your environment:

          1. Create or edit the UV configuration file in one of these locations:

          - `~/.config/uv/config.toml` (Linux/macOS)

          - `%APPDATA%\uv\config.toml` (Windows)

          2. Add a section for default groups to sync:

          ```toml
          [sync]
          include-groups = ["dev", "test", "docs"]  # Replace with your desired group names
          ```

          Alternatively, you can do something similar in pyproject.toml if you want to apply this to the repo:

          ```toml
          [tool.uv]
          sync.include-groups = ["dev", "test", "docs"]  # Replace with your desired group names
          ```

          • fluidcruft 9 hours ago

            Thank you! That's good to know. Unfortunately it doesn't seem to work for "extras". There may be some target other than sync.include-groups but I haven't found it yet.

            What I am struggling with is what you get after following the Configuring Accelerators With Optional Dependencies example:

            https://docs.astral.sh/uv/guides/integration/pytorch/#config...

            Part of what that does is set up rules that prevent simultaneously installing the cpu and gpu versions (which isn't possible). If you use the optional-dependencies example pyproject.toml, then this is what happens:

                $ uv sync --extra cpu --extra cu124
                Using CPython 3.12.7
                Creating virtual environment at: .venv
                Resolved 32 packages in 1.65s
                error: Extras `cpu` and `cu124` are incompatible with the declared conflicts: {`project[cpu]`, `project[cu124]`}
            
            And if you remove the declared conflict, then uv ends up with two incompatible sources to install the same packages from:

                uv sync --extra cpu --extra cu124
                error: Requirements contain conflicting indexes for package `torch` in all marker environments:
                - https://download.pytorch.org/whl/cpu
                - https://download.pytorch.org/whl/cu124
            
            After your comment I initially thought that perhaps the extras might be rewritten as dependency groups somehow to use the ~/.config/uv/config.toml, but according to the docs, group dependencies are not allowed to conflict with each other and must be installable simultaneously (which makes sense, since there is an --all-groups flag).
    • satvikpendem a day ago

      This happened to me too; that's why I stopped using it for ML-related projects and stuck with good old venv. For other Python projects I can see it being very useful, however.

    • ibic 21 hours ago

      I'm not sure if I got your issue, but I can do platform-dependent `pytorch` installation from a custom `index` using the following snippet in `pyproject.toml`, and `uv sync` just handles it accordingly:

          [tool.uv.sources]
          torch = [{ index = "pytorch-cu124", marker = "sys_platform == 'win32'" }]
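
      Spelled out more fully, following the pattern in uv's PyTorch guide (the index names and the second marker are illustrative; adjust to your platforms):

```toml
[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true

[tool.uv.sources]
torch = [
  { index = "pytorch-cu124", marker = "sys_platform == 'win32'" },
  { index = "pytorch-cpu", marker = "sys_platform == 'linux'" },
]
```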

      • satvikpendem 17 hours ago

        Some Windows machines have compatible GPUs while others don't, so this doesn't necessarily help. What is really required is querying the OS for what type of compute unit it has and then installing the right version of an ML library, but I'm not sure that will be done.
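
        A rough sketch of the kind of query I mean (the nvidia-smi heuristic and the extra names are made up, not something uv or torch provide):

```python
import shutil

def pick_torch_extra() -> str:
    # Crude heuristic: treat a box with nvidia-smi on PATH as CUDA-capable.
    return "cu124" if shutil.which("nvidia-smi") else "cpu"
```

        You could then feed the result to something like: uv sync --extra "$(python detect.py)".
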

        • fluidcruft 16 hours ago

          Even without querying, it would help to be able to set an environment variable, or to have uv remember in some way which extras are already applied to the already-synced .venv.

    • synergy20 a day ago

      i use uv+torch+cuda on linux just fine, never used the extra flag, i wonder what's the problem here?

      • dagw 18 hours ago

        Getting something that works out of the box on just your computer is normally fine. Getting something that works out of the box on many different computers with many different OS and hardware configurations is much much harder.

  • scribu a day ago

    The install speed alone makes it worthwhile for me. It went from minutes to seconds.

    • BoorishBears a day ago

      I was working on a Raspberry Pi at a hackathon, and pip install was eating several minutes at a time.

      Tried uv for the first time and it was down to seconds.

      • guappa 18 hours ago

        Why would you be redoing your venv more than once?

        • dagw 15 hours ago

          Once rebuilding your venv takes negligible time, it opens up for all kinds of new ways to develop. For example I now always run my tests in a clean environment, just to make sure I haven't added anything that only happens to work in my dev venv.

          • kstrauser 13 hours ago

            That's smart. Oh, you used `pip install` to fix a missing import, but forgot to add it to pyproject.toml? You'll find out quickly.

        • BoorishBears 10 hours ago

          It has nothing to do with redoing venv: some package installs were just taking multiple minutes.

          I cancelled one at 4 minutes before switching to uv and having it finish in a few seconds

  • midhun1234 a day ago

    Can confirm this is all true. I used to be the "why should I switch" guy. The productivity improvement from not context switching while pip installs a requirements file is completely worth it.

  • mcintyre1994 a day ago

    That scripting trick is awesome! One of the really nice things about Elixir and its dependency manager is that you can just write Mix.install(…) in your script and it’ll fetch those dependencies for you, with the same caching you mentioned too.

    Does uv work with Jupyter notebooks too? When I used it a while ago dependencies were really annoying compared to Livebook with that Mix.install support.

  • para_parolu a day ago

    As a person who doesn’t often work on python code but occasionally needs to run a server or a tool, I find uv a blessing. Before, I would beg people for help just so I wouldn't have to figure out which combination of obscure python tools I needed. Now "uv run server.py" usually just works.

  • ibic 21 hours ago

    I happened to use uv recently for a pet project, and I totally agree with you. It's really, really good. I couldn't believe its dependency resolution and pulling could be so fast. Imho, it's the python package manager (I don't know the most suitable name for the category) done right: everything just works, the correct way.

  • insane_dreamer 16 hours ago

    uv is great and we’re switching over from conda for some projects. The resolver is lightning fast and the toml support is good.

    Having said that, there are 2 areas where we still need conda:

    - uv doesn’t handle non-python wheels, so if you need to use something like mkl, no luck

    - uv assumes that you want to use one env per project. However, with complex projects you may need to use a different env with different branches of your code base. Conda makes this easy: just activate the conda env you want (all of your envs can be stored in some central location outside your projects) and run your code. Uv wants to use the project toml file and stores the packages in .venv by default (which you don't want to commit, but then need different versions of). Yes, you can store your project venv elsewhere with an env var, but that's not a practical solution. There needs to be support for multiple toml files, where the location of the env can be specified inside the toml file (not in an env var).

    • serjester 15 hours ago

      You may want to check out uv's workspaces - they're very handy for large monorepos.

      • insane_dreamer 11 hours ago

        Thanks. I looked at that but I believe it solves a different problem.

  • crabbone 18 hours ago

    > It solves & installs dependencies incredibly fast

    If you are lucky, and you don't have to build them, because the exceptionally gifted person who packaged them didn't know how to distribute them and the bright minds running PyPI.org allowed that garbage to be uploaded and made it so pip would install that garbage by default.

    > can replace pipx with "uv tool install,"

    That's a stupid idea. Nobody needed pipx in the first place... The band-aid that was applied some years ago is now cast in stone...

    The whole idea of Python tools trying to replace virtual environments, but doing it slightly better, is moronic. Virtual environments are the band-aid. They need to go. The Python developers need to be pressured into removing this garbage and instead working on program manifests or something similar. Python has virtual environments due to the incompetence of its authors and their unwillingness to make things right once that incompetence was discovered.

    ----

    NB. As it stands today, if you want to make your project work well, you shouldn't use any tools that install packages by solving dependencies and downloading them from PyPI. It's not the function of the tool doing that, it's the bad design of the index.

    The reasonable thing to do is to install the packages (for applications) you need during development, figure out what you actually need, and then store the part you need for your package to work locally. Only repeat this process when you feel the need to upgrade.

    If you need packages for libraries, then you need a way to install various permutations within allowed versions: no tool for package installation today knows how to do it. So, you might as well not use any anyways.

    But, the ironic part is that nobody in Python community does it right. And that's why there are tons of incompatibilities, and the numbers increase dramatically when projects age even slightly.

IshKebab a day ago

Uv really fixes Python. It takes it from "oh god I have to fight Python again" to "wow it was actually fast and easy".

I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)

The only very minor issue I've had is that once or twice the package cache invalidation hasn't worked correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal, though, considering it solves so many Python clusterfucks.

  • javchz a day ago

    Agree. I mostly do front end in my day job, and despite JavaScript being a bit of a mess lang, dealing with npm is way better than juggling anaconda, miniforge, Poetry, pip, venv, etc depending on the project.

    UV is such a smooth UX that it makes you wonder how something like it wasn’t part of Python from the start.

    • baq a day ago

      +1

      …but we did have to wait for cargo, npm (I include yarn and pnpm here) and maybe golang to blaze the ‘this is how it’s done’ trail. Obvious in hindsight.

      • dontlaugh 20 hours ago

        Ruby's bundler had already invented the correct model many years ago. It only took time for others to accept that.

        • zelphirkalt 20 hours ago

          Wait, a bundler? What needs to be bundled when using Ruby? Maybe this is not the same meaning as with JS bundlers. And why does a bundler manage dependencies?

        • hackerbrother 19 hours ago

          Yeah, but unlike bundle, uv locks in your Python version and downloads that Python version as needed. It’s like bundle and rbenv combined.

          • dontlaugh 19 hours ago

            Sure. That seems less important to me than the packages. As long as the language version is checked, that’s the important bit.

    • Aeolun a day ago

      More importantly, migrating from npm, to pnpm, to yarn, to bun, is very nearly seamless. Migrating in the Python ecosystem? Not anywhere close.

      • woodrowbarlow 18 hours ago

        standardizing pyproject.toml helped but it didn't go quite far enough.

    • EdwardDiego 20 hours ago

      Feels like you're doing it wrong if you're dealing with all of those.

      • IshKebab 17 hours ago

        "depending on the project"

    • zelphirkalt 20 hours ago

      You mean off the job you have to juggle all those tools? On the job that would be kind of crazy, to allow every project its own tool chain.

  • dilawar a day ago

    True.

    I had to give up on mypy and move to pyright because mypy uses pip to install missing types and they refuse to support uv. In the CI pipeline where I use uv, I don't have pip installed, so mypy complains about a missing pip.

    Of course I can do it myself by adding the typing pkgs to a requirements.txt file, but then what's the point of devtools! And I don't want a requirements.txt when I already have pyproject.toml.
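
    If I do end up adding them myself, at least they can live in pyproject.toml as a dependency group (PEP 735, which uv understands) rather than in a requirements.txt. The stub names below are just examples:

```toml
[dependency-groups]
dev = [
  "types-PyYAML",
  "types-requests",
]
```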

    Once you get used to cargo from rust, you just can't tolerate shitty tooling anymore. I used to think pip was great (compared to C++ tooling).

    • WhyNotHugo 18 hours ago

      Mypy doesn't install anything by default, you're probably setting the `--install-types` flag somehow.

    • IshKebab a day ago

      Pyright is waaay better than Mypy anyway so I'd say they did you a favour.

  • tacitusarc a day ago

    I think their only big gap is the inability to alias general non-python project scripts in uv. This forces you to use something like a justfile, and it would be much more ergonomic to keep it all in uv.

  • guappa 21 hours ago

    uv belongs to a startup. They will surely introduce some wacky monetisation scheme sooner or later.

    I wouldn't get too used to it.

    • IshKebab 17 hours ago

      Maybe, but even if that is the case it's sooooo much better that even the worst case (fork when they try to monetise it) is way better than any alternatives.

      • Hackbraten 14 hours ago

        > fork when they try to monetise it

        To maintain a successful fork, not only are you going to need to find people who volunteer for maintaining a fork at that scale (including a large user base due to popularity), you’ll need to find skilled Rust developers, too.

        That’s going to be immensely difficult.

        • IshKebab 12 hours ago

          I mean, maybe. But I'd still much rather take that risk than flagellate myself with Pip or Pyenv or Poetry.

  • loeber a day ago

    Strong agree. The respectful act for the other package managers would be to consider themselves deprecated and point to uv instead.

    • baq a day ago

      The risk is obviously uv losing funding. I kinda hope the PSF has thought about this and has a contingency plan for uv winning and dying/becoming enshittified soon after.

      • guappa 21 hours ago

        If they never made any plan about how modules are installed and there is no official way… i doubt they made a plan about uv.

        • zelphirkalt 10 hours ago

          I think you mean packages, not modules. And actually there is site-packages, where your stuff lands when installed with pip, so there kind of is an official way. Just many tools implementing it.

      • loeber 7 hours ago

        It's open source. If necessary, uv can be forked and maintained entirely as OSS.

  • albert_e 17 hours ago

    I am sold. Sign me up.

    I have never used virtual environments well -- the learning curve after dealing with python installation and conda/pip setup and environment variables was exhausting enough. Gave up multiple times or only used them when working through step wise workshops.

    If anyone can recommend a good learning resource, I'd love to take another stab at it.

  • kyawzazaw a day ago

    Which companies run heavily (either solely or for huge parts) on Python? They should take the initiative and start blogging.

  • robertlagrant 20 hours ago

    I totally disagree. Having a single vendor with that much power is a bad idea. If the PSF were able to focus on tooling rather than their current focus, they would be great stewards of this sort of thing. Sadly I doubt that will happen, in which case I think many options is the best approach.

    • IshKebab 17 hours ago

      > If the PSF were able to focus on tooling rather than their current focus

      Well yeah maybe if the PSF were able to get their shit together it wouldn't have taken a single third party vendor to do it for them. But they weren't and it did, so here we are.

  • crabbone 18 hours ago

    Uv doesn't fix anything. The fixing that Python needs is the removal of the concept of virtual environments and fixing the import and packaging systems instead.

    The only thing it does is make bad things happen faster. Who cares...

    • IshKebab 17 hours ago

      Well maybe "fixes" is the wrong word. It certainly fixes the bad UX caused by virtual environments.

      Basically it handles the virtual environments for you so you don't have to deal with their nonsense.

      But you're right it doesn't fix it in the same way that Deno did.

  • tootie a day ago

    Every time people debate the merits of languages, I always put developer environment at the top of my list. Build tools, IDE, readable stack traces. Those things boost productivity far more than concise list comprehensions or any gimmicky syntax thing. It's why Python always felt stone-age to me despite having such lovely semantics.

kubav027 a day ago

I am pretty happy with poetry for the near future. I prefer using python interpreters installed by the linux package manager. In the cloud I use the python docker image. Poetry recently added an option to install python too, if I change my mind.

I have already set up CI/CD pipelines for programs and python libraries. Using uv would probably save some time on dependency updates, but it would require changing my workflow and CI/CD. I do not think it is worth the time right now.

But if you use older environments without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch anytime it looks more appealing.

Another thing to consider in the long term is how astral's tooling will change when they need to make money.

  • js2 a day ago

    > I prefer using python interpreters installed by linux package manager.

    uv will defer to any python it finds in PATH as long as it satisfies your version requirements (if any):

    https://docs.astral.sh/uv/concepts/python-versions/

    It also respects any virtual environment you've already created, so you can also do something like this:

       /usr/bin/python3 -m venv .venv
       .venv/bin/pip install uv
       .venv/bin/uv pip install -r requirements.txt # or
       .venv/bin/uv run script ...
    
    It's a very flexible and well-thought-out tool, and somehow it manages to do what I think it ought to do. I rarely need to go to its documentation.

    > Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD.

    I found it very straightforward to switch to uv. It accommodates most existing workflows.

  • irjustin 20 hours ago

    I'm pretty much with you, and still trying to figure out why I'd want to switch away from pyenv+poetry.

    I get that uv does both, but I'm very happy with pyenv+poetry combo.

    Old baggage, but I came from the rvm world, which attempted to do exactly what uv does, but rvm was an absolute mess in 2013. rbenv+bundler solved so many problems for me, and the experience was so bad that when I saw uv my gut reaction was to say "never again".

    But this thread has so many praises for it that one day maybe I'll give it a try.

    • armanckeser 19 hours ago

      uv's dependency solving is light-years faster than poetry's. If you are working on actual projects with many dependencies, poetry is a liability.

  • WhyNotHugo 18 hours ago

    Yeah, using the package manager is the logical choice and usually the most likely one to work.

    IIRC, uv downloads dynamically linked builds of Python, which may or may not work depending on your distribution and whether linked libraries are locally available or not. Not sure if things have changed in recent times.

kylecordes a day ago

UV is such a big improvement that it moves Python from my "would use again if I had to, but would really not look forward to it" pile to my "happy to use this as needed" pile. Without disparaging the hard work by many that came before, UV shows just how much previous tools left unsolved.

  • crabbone 18 hours ago

    It doesn't do anything differently besides the speed... Why do people keep praising it so much? It doesn't solve any of the real problems... come on. The problem wasn't the tools; the problem is the bad design of the imports and packaging systems, which cannot be addressed by an external tool: the language needs to change.

TheIronYuppie a day ago

For scripting... HIGHLY recommend putting your dependencies inline.

E.g.:

  #!/usr/bin/env python3
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [
  #     "psycopg2-binary",
  #     "pyyaml",
  # ]
  # ///
Then -

  uv run -s file.py
  • maleldil a day ago

    How does this interact with your code editor or IDE? When you edit the file, where does the editor look for information about the imported third-party libraries?

    • AlphaSite a day ago

      Usually the venv and the import lines are enough.

      • maleldil 19 hours ago

        How do you determine where the venv is? AFAIK, uv run in script mode creates the venv in some random temporary directory.

        • JimDabell 17 hours ago

          I don’t know of a convenient way of doing it, but a clumsy way of doing it is to run this in your script:

            import os
            
            print(os.environ['VIRTUAL_ENV'] + '/bin/python')
          
          Then, e.g. in VS Code, you bring up the command palette, run Python: Select Interpreter, and enter the result.
  • marcthe12 a day ago

    Do you need a wrapper script for scripts on PATH or for execve? I would usually chmod +x the script, but I am not sure here.

    • Manfred a day ago

      If you want to make it work regardless of where uv is installed, you can use the following shebang line:

        #!/usr/bin/env -S uv run --script
    • tetha a day ago

      Not at a laptop to try this right now, but shouldn't this be possible with the shebang? Something along the lines of:

          #!/home/tetha/Tools/uv run
      • dfinninger a day ago

        Yes it is, I just converted my work scripts over this afternoon.

            #!/usr/bin/env -S uv run
selimnairb 19 hours ago

I have been using Python for 20 years, and have been an intermediate to advanced user of it for last 5-7 years. I use it mostly for scientific computing (so lots of Numpy, SciPy, etc.), IoT data processing, and also for some microservices that don’t need to be super fast. I publish and maintain a few packages in PyPI and conda (though I almost never use conda myself), including a C++ library with Python bindings generated by SWIG (SWIG wouldn’t be my first choice, but I inherited it).

In what I’ve done, I’ve never found things like pipenv, let alone uv, to be necessary. Am I missing something? What would uv get me?

  • crabbone 18 hours ago

    If you need to package for Anaconda, uv has nothing to offer you. It's a replacement for a number of PyPA tools, so it's not compatible with Anaconda tools.

    The selling point of uv is that it does things faster than the tools it aims to replace, but on a conceptual level it doesn't add anything substantially new. The tools it aims to replace were borne of the defects in Python import and packaging systems (something that Anaconda also tried to address, but failed). They are not good tools designed to do things the right way. They are band-aids designed to mitigate some of the more common problems stemming from the bad design choices in the imports and packaging systems.

    My personal problem with tools like uv is that, just like Web browsers in the early days of the Web tried to win users by tolerating the mistakes made by Web site authors, they allow delaying the solution of the essential problems in Python's infrastructure by offering some pain relief to those who are using the band-aid tools.

runjake a day ago

For my use cases, uv is so frictionless it has effectively made Python tolerable for me. I primarily discovered it via Simon Willison's (@simonw) blog posts[1]. I recommend his blog highly.

1. https://simonwillison.net/tags/uv/

vslira a day ago

I'm using exclusively uv for personal projects - and small prototypes at work - and I can't recommend it enough.

Uv makes python go from "batteries included" to "attached to a nuclear reactor"

  • scratchyone a day ago

    i’ve started slipping uv into production work projects along with an auto generated requirements.txt for anyone who doesn’t wanna use uv. hoping i can drive adoption on my team while still leaving an alternative for people who don’t wanna use it

    • globular-toast 18 hours ago

      You mean `uv pip compile pyproject.toml > requirements.txt`?

jillesvangurp 21 hours ago

I dabble with python occasionally and I'm always fighting with tools and tool combinations that don't really combine well. The last time, I settled on using conda to get some isolation of python versions, and then pipenv for some sane package management with a lock file. Not pretty, but it kind of worked. Except I had a hard time convincing VS Code and PyCharm of the correct environment with that combination (they couldn't resolve the libraries I'd installed). I got it working eventually, but it wasn't a great experience.

It sounds like uv should replace the combination. Of course there is the risk of this being another case of the python community ritually moving the problem every few years without properly solving it. But it sounds like uv is mostly doing the right thing; which is making global package installation the exception rather than the default. Most stuff you install should be for the project only unless you tell it otherwise.

Will give this a try next time I need to do some python stuff.

  • fnands 21 hours ago

    Do. I was sceptical at first, exactly because of the points you make: I mostly do ML, so conda was basically my go-to for getting PyTorch and CUDA etc. to play nice.

    We use poetry at work, but getting it to play nice with PyTorch is always a bit of an art. I tried to get into Pixi, but have been a little annoyed, as it seems to have inherited conda's issues when mixing conda and PyPI.

    uv so far has been relatively smooth sailing, and they even have an entire section on using it with PyTorch: https://docs.astral.sh/uv/guides/integration/pytorch/

mrbonner a day ago

And you can now install Python and set it as the default in your path with the --default flag. Big plus for me; I can finally replace pyenv.

  • thefreeman a day ago

    finally! this was the thing keeping me from switching every time i looked into it.

rsyring a day ago

15 year Python dev who usually adopts tooling slowly. Just do it, uv's absolutely worth it.

I also use mise with it, which is a great combination and gives you automatic venv activation among other things.

See, among other mise docs related to Python, https://mise.jdx.dev/mise-cookbook/python.html

See also a Python project template I maintain built on mise + uv: https://github.com/level12/coppy

  • jdxcode a day ago

    ideally mise could be replaced entirely by uv or at least just be a thin wrapper around uv (in some ways that's already the case), but given this article requires the use of the custom uv-python-symlink utility it seems uv isn't quite there yet

    • rsyring a day ago

      Mise does way more than uv, it's a much larger scope than just Python tooling.

      I think the current status quo, that of mise utilizing uv for its Python integration support, makes sense and I don't see that changing.

      Also, FWIW, mise has other methods for Python integration support, e.g. pyenv, virtualenv, etc.

      Edit:

      Ha... Didn't realize who I was replying to. Don't need me to tell you anything about mise. I apparently misinterpreted your comment.

      • jdxcode a day ago

        the reality that I'm sure you've heard me say many times is that I'm just not a python dev and astral is likely always going to build a better solution around python than I ever could. They've just focused a lot more on the package manager side of things than the runtime/venv management side so far but I suspect that will change—and given astral's velocity I doubt we'll be waiting long

        and btw mise's venv support isn't going anywhere probably ever, but I do hope that at some point we could either let uv do the heavy lifting internally or point users to uv as a better solution

  • NeutralForest a day ago

    I used to install Python through mise but now I just use uv tbh.

    • rsyring a day ago

      Similar. But we get other benefits through mise, like tasks and other tool installs (e.g. Terraform). So we still use them together.

      • NeutralForest a day ago

        That's fair, it's also nice if you have a backend in Python and a frontend in JS since mise also handles node.

        • rsyring a day ago

          Forgot about that! Yes, another significant benefit of why we use mise.

          In particular, we use flask-vite and it's so nice to be able to have the right version of Node specified in the same management system as we specify the Python version. This solved a not insignificant amount of angst around FE development for me personally since I spend most of my time in the BE.

          It's not like it was insurmountable before. But now, with mise, it's in that "just works" category for me.

          • NeutralForest a day ago

            100% agreed, it just takes a task that was a 10-15min setup depending on your environment and personal knowledge to a 2min thing. It just makes life easier and it puts the bar for starting lower, a win in my book =)

unsnap_biceps a day ago

Uv in script mode has made me love python again.

pzo a day ago

I want to switch to uv from pyenv, but one use case I haven't managed to figure out is whether I can have a setup like pyenv's, where I install a few Python versions and set one as the global default (configured in zsh). I know that for bigger projects the proper way is to set up a virtual environment for each new project, but I write many mini (throwaway) Python scripts, experiments, and test repos, and it would be really annoying to set up an environment for each of those. So far pyenv has worked well for me for such cases, with pretty much no dependency conflicts.

  • js2 a day ago

    Yes, uv is well suited to that use case by declaring your Python version and/or dependencies right in the script itself:

    https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...

    You can use an alternate shebang line so you can run the script directly:

      #!/usr/bin/env -S uv run --script
      # /// script
      # requires-python = ">=3.12"
      # dependencies = [
      #   "requests",
      #   "typer-slim",
      # ]
      # ///
      import requests
      import typer
      # ...
    • pzo a day ago

      Yes, but then I still have to declare all dependencies for every tiny throwaway script. Right now I have a global Python in pyenv with tons of packages installed, and I haven't had many issues with conflicts, so it was good enough for me.

      • jsmeaton 18 hours ago

        I used to have the same setup - a global tools venv with some useful dependencies.

        Uv takes the position that since it’s so fast to install dependencies and create environments, you don’t maintain a global venv.

          uvx ruff file.py
        
        Will set up a venv, install ruff into it, and run it over your file. https://docs.astral.sh/uv/guides/tools/

        Otherwise you can:

          uv run --with package file.py
        
        If you don’t want to declare your dependencies.

        https://docs.astral.sh/uv/guides/scripts/#running-a-script-w...

      • biorach a day ago

        How about just installing a Python version with uv and installing everything into that?

        Or better, do the above, then create a virtual env, set the virtual env in your .bashrc, and install everything into that.

        Better still... use `uv init --script`. See other comments on this post.

      • sorenjan 15 hours ago

        You could make an alias for creating a new python script that adds your favorite libraries as dependencies:

          alias newpy='function _newpy() { 
              default_deps=("rich" "requests");
              uv init --script "$1" && uv add "${default_deps[@]}" --script "$1"; 
          }; _newpy'

BewareTheYiga 5 hours ago

I can't say enough good things about UV. It has simplified and accelerated my python and Jupyter projects. I even run it in my pipelines.

eikenberry a day ago

Maybe this one will finally be adopted as the official package manager for Python? Only 20 years late, but it would be a nice development.

  • wirHga 20 hours ago

    Please not. Python core is suffering from corporate capture and ruins everything it touches.

    The PSF would probably dig up some old posts from the uv authors, defame them, take the code and make it worse.

  • 0cf8612b2e1e a day ago

    Pfft. Pull the other one. The PSF hates the idea of dealing with something so icky.

    I have been pretty pleased with uv, but I am continually worried about the funding model. What happens when the VC starts demanding a return?

    • __MatrixMan__ a day ago

      We fork it. Whatever carrot the VCs dangle can be chased by the handful who care, and the rest of us can continue using the important part.

      • Kwpolska a day ago

        Ah, so that's why it's written in Rust. Less people who care about Python packaging are capable of forking it.

        • __MatrixMan__ 12 hours ago

          I guess this is a joke but honestly I think that Python -> Rust is a pretty strong one.

          I'd bet that the sort of person who is maintaining packaging for a bunch of Python users would like an opportunity to learn Rust on the job. I would.

          • eikenberry 11 hours ago

            IMO this is unlikely. Rust and Python have very little in common.

aequitas 21 hours ago

I recently switched our Python projects to uv and I love it. It just does everything and is really fast (this cannot be overstated in terms of what it means for your workflow).

I've tried almost every Python packaging solution under the sun in the past 15 years but they all had their problems. Finally I just stuck with pip/pip-tools and plain venv's but strung together with a complicated Makefile to optimize the entire workflow for iteration speed (rebuilding .txt files when .in requirements changes, rebuilding venv if requirements change, etc). I've been able to reduce it to basically one Make target calling uv to do it all.

tomrod a day ago

I love using it. I'm concerned that they'll go the route of Terraform and put in place pricing and values that differ from what their users support.

bnycum a day ago

I decided to give uv a shot on my new machine over pyenv and I've been enjoying it. Just last week I had to generate 90 slides from some data last minute. I quickly created a project, added my dependencies (pandas, matplotlib, python-pptx), then crunched out some code. Absolutely zero friction, with a much easier-to-use set of commands in my opinion.

bigfatfrock a day ago

I converted along with most of the people in this thread.

IMO no really hard problem is ever truly solved but as can be seen in other comments, this group of people really crushed the pain of me and *many* others, so bravo alone on that - you have truly done humanity a service.

xucian 5 days ago

has anybody doing complex projects achieved success with uv completely replacing pyenv, with mostly pros and few or no cons?

I'm very comfortable with pyenv, but am extremely open to new stuff

  • NeutralForest a day ago

    Sure, I've basically replaced pyenv, pyenv-virtualenv, and poetry with uv. I can't think of any cons personally, though you might need to dig into the docs at times.

  • __mharrison__ a day ago

    Teaching a course for a corporate client this week for data scientists. The first day (of five) we covered uv. Minds blown.

    "Course was worth it just for uv"

  • emgeee a day ago

    I've used uv to work on the feast feature store project to great success

  • ath3nd a day ago

    I worked in a large-ish org where 20+ python projects, their CI/CD pipelines and their docker images were migrated from `pyenv` + `.python-version` + `requirements.txt` to `uv` in basically a single day.

    If you are comfortable with `pyenv`, the switch to `uv` is basically a walk in the park. The benefit is the speed plus the predictable dependency resolution.

    • ttyprintk 18 hours ago

      Astral ships a Docker image that provides their tox-uv. I saw maybe a 3x speed up setting up environments.

  • rat87 a day ago

    I don't know how complex your project is, but I moved my previous work from pyenv to Rye (uv and Rye have merged; most work is being done on uv, so today I'd probably use uv).

    And I'm currently trying to move my current work to uv. The problems are the possibility of unknown breakage for unknown users of the old project, not any known technical issue.

    I'd highly recommend uv. It's just easier and more flexible. And it downloads third-party pre-compiled Python builds instead of taking the extra time and complexity of compiling locally. It's much nicer, especially when maintaining an environment for a team that just works without them having to know about it.

    One downside of uv is that, unlike pyenv and Rye, it doesn't shim python. The pyenv shims gave me some trouble, but Rye's simpler shims didn't. The workaround is to run stuff with `uv run x.py` instead of `python x.py`.

    • ttyprintk a day ago

      `uv tool install` will create shims in .local/bin

ashvardanian a day ago

I’m enjoying UV a lot as well. If anyone from the Astral team sees this, I’d love to request more functionality or examples around packaging native libraries.

At this point, just thinking about updating CIBuildWheel images triggers PTSD—the GitHub CI pipelines become unbearably slow, even for raw CPython bindings that don’t require LibC or PyBind11. It’s especially frustrating because Python is arguably the ultimate glue language for native libraries. If Astral’s tooling could streamline this part of the workflow, I think we’d see a significant boost in the pace of both development & adoption for native and hardware-accelerated tools.

zahlman a day ago

> Maybe I installed some other things for some reason lost in the sands of time.

FWIW, I was able to confirm that the listed primary dependencies account for everything in the `pip freeze` list. (Initially, `userpath` and `pyrsistent` were missing, but they appeared after pinning back the versions of other dependencies. The only project for which I couldn't get a wheel was `python-hglib`, which turned out to be pure Python with a relatively straightforward `setup.py`.)

randomsolutions 17 hours ago

My biggest issue is using uv envs in VS Code under WSL. Starting up interactive sessions takes forever. It's just too slow, and I can't figure out what the deal is.

lmeyerov a day ago

Are people seeing it work well in GPU/pydata land and creating multiplatform docker images?

In the data science world, conda/mamba was needed because of this kind of thing, but a lot of room for improvement. We basically want lockfile, incremental+fast builds, and multi-arch for these tricky deps.

  • maleldil a day ago

    It works transparently. The lock file is cross-platform by default. When using pytorch, it automatically installs with MPS support on macOS and CUDA on Linux; everything just works. I can't speak for Windows, though.

  • insane_dreamer 16 hours ago

    Yes, because the PyPI cupy/cudnn packages now work seamlessly with JAX. Until not long ago we had to use the conda packages.

  • throwaway314155 a day ago

    Works better than poetry for cuda-versioned pytorch. I don't have overlap with your other domains unfortunately (ML, not data science).

    • lmeyerov a day ago

      Thanks!

      I think the comparison for data work is more with conda, not poetry. AFAICT poetry is more about the "easier" case of pure Python, not native areas like prebuilt platform-dependent binaries. Maybe poetry got better, but I typically see it more as a nice-to-have for local dev and rounding out the build, not the recommended install flow for natively-aligned builds.

      So still curious with folks navigating the 'harder' typical case of the pydata world, getting an improved option here is exciting!

      • throwaway314155 10 hours ago

        That's fair. I guess when you see people champion poetry (less so lately) you hope it works as well as pip/conda despite the complexities of pytorch in particular. Finding that the community simply doesn't use that library is a shock of sorts - like this package manager is great, but "your type ain't welcome".

        In any case I believe uv is trying to be _the_ solution and Id be pretty surprised if your libs weren't well supported, or on the roadmap at least.

o10449366 a day ago

If uv figures out a way to capture the scientific community by adding support for conda-forge that'll be the killshot for other similar projects, imo. Pixi is too half-baked currently and suffers from some questionable design decisions.

  • bitvoid a day ago

    Out of curiosity, which design decisions do you find questionable and what do you feel is half-baked with Pixi? It's been working well for us.

  • kyawzazaw a day ago

    which subfield of scientific community uses that? and for what purpose, if you could summarize

    • tempay a day ago

      The key thing of conda-forge is that it's language (rust/go/c++/ruby/java/...) and platform (linux/macos/win/ppc64le/aarch64/...) agnostic rather than being python only.

      If you want you can depend on a C++ and fortran compiler at runtime and (fairly) reliably expect it to work.

theogravity a day ago

Is there a guide for how to use uv if you're a JS dev coming from pnpm?

I just want to create a monorepo with python that's purely for making libraries (no server / apps).

And is it normal to have a venv for each library package you're building in a uv monorepo?

  • BiteCode_dev a day ago

    If the libraries are meant to be used together, you can get away with one venv. If they should be decoupled, then one venv per lib is better.

    There is not much to know:

    - uv python install <version> if you want a particular version of python to be installed

    - uv init --vcs none [--python <version>] in each directory to initialize the python project

    - uv add [--dev] to add libraries to your venv

    - uv run <cmd> when you want to run a command in the venv

    That's it, really. Any bonus can be learned later.

    • NeutralForest a day ago

      There's also workspaces (https://docs.astral.sh/uv/concepts/projects/workspaces/) if you have common deps and it's possible to act on a specific member of the workspace as well.

      • BiteCode_dev a day ago

        That's one of the bonuses I was thinking about. It's nice if you have a subset of deps you want to share, or if one dep is actually part of the monorepo, but it requires knowing more.

    • theogravity a day ago

      Thanks. Why is the notion of run and tool separate? Coming from JS, we have the package.json#scripts field and everything executes via a `pnpm run <script name>` command.

      • BiteCode_dev a day ago

        Tool ?

        Maybe you mean uv tool install ?

        In that case it's something you don't need right now, uv tool is useful, but it's a bonus. It's to install 3rd party utilities outside of the project.

        There is no equivalent to scripts yet, although they are adding it as we speak.

        uv run executes any command in the context of the venv (which is like node_modules); you don't need to declare commands before calling them.

        e.g. uv run python will start the Python shell.

    • new_user_final a day ago

      uv sync if you clone a github repo

      • BiteCode_dev a day ago

        uv run in the freshly cloned repo will create the venv and install all deps automatically.

        You can even use --extra and --group with uv run like with uv sync. But in a monorepo, those are rare to use.

        • theogravity a day ago

          Thanks for the info.

          I looked at the group documentation, but it's not clear to me why I would want to use it, or where I would use it:

          https://docs.astral.sh/uv/concepts/projects/layout/#default-...

          (I'm a JS dev who has to write a set of python packages in a monorepo.)

          • BiteCode_dev a day ago

            sync is something you would rarely use, it's most useful for scripting.

            uv run is the bread and butter of uv; it will run any command you need in the project and ensure it works by syncing all deps and making sure your command can import stuff and call Python.

            In fact, if you run a Python script, you should do uv run python the_script.py.

            It's so common that uv run the_script.py works as a shortcut.

            I will write a series of articles on uv at bitecode.dev.

            I will write them so that they work for non-Python devs as well.

            • theogravity 11 hours ago

              Did you mean group and not sync?

              Really looking forward to the articles!

77ko a day ago

uv is excellent! The only thing I'm missing is an easy way to update all packages in an env, something like `uv update --all` or `uv update plotly`.

Which would fit in with existing uv commands[1] like `uv add plotly`.

There is an existing `uv lock --upgrade-package requests` but this feels a bit verbose.

[1]: https://docs.astral.sh/uv/guides/projects/#creating-a-new-pr...

stuaxo a day ago

This makes sense for people keen on pyenv.

I'm still very keen on virtualenvwrapper; I hope uv's fast dependency resolution and installs can come to it and to poetry.

moltar a day ago

I wasn't able to figure out how to make a uv-installed Python version the global one when "python" is called, at least in the current shell, as I need that in CI.

  • kstrauser a day ago

    That feature's in preview now. You can run it like:

      uv python install --preview --default 3.13
    
    and then you get Python 3.13 whenever you run `python` outside of an environment that declares something else.
    • leejoramo a day ago

      This is great news. I had hacked together some bash and fish scripts to mostly do this but they still had some rough edges. I missed that uv now had this ready for preview

      • kstrauser a day ago

        I just found that a couple weeks ago.

        I'm an end user, too. I don't have anything to do with uv development. I stumbled across it in a GitHub issue or something and passed along the info.

    • moltar a day ago

      Thank you!! Will try it tomorrow.

      • kstrauser a day ago

        You bet. I was so happy to find that!

BiteCode_dev a day ago

Note that despite the title, the author is not switching from pyenv to uv, but from pip, pyenv, pipx, pip-tools, and pipdeptree to uv, because uv does much more than pyenv alone.

It replaces a whole stack, and does each feature better, faster, with fewer modes of failure.

oblio a day ago

How does this compare to Mise: https://mise.jdx.dev/lang/python.html ?

  • claytonjy a day ago

    mise is higher level. I use it to install uv in projects with other non-Python dependencies (helm, terraform, npm), which I also install with mise.

    Then all the python dependencies are managed with uv.

    For a non-Python project which needs a Python-based CLI tool, I'm not sure if I'd use mise or uv (uvx).

xenophonf a day ago

What does uv offer over bog-standard setuptools, pip, pip-tools, and build?

Right now, the only thing I really want is dependency pinning in wheels but not in pyproject.toml, so I can pip install the source and get the latest and greatest, or pip install a wheel and get the frozen dependencies I used to build the wheel. Right now, if I want the second case, I have to publish the requirements.txt file and add the wheel to it, which works but is kind of awkward.

  • baq a day ago

    It does everything with less surprises and faster. Just try it.

    • xenophonf 17 hours ago

      > Just try it.

      I don't need to be told to RTFM. I was asking for advice. My attention span is my most valuable commodity, and since I'm not really surprised or slowed down by setuptools, etc., it sounds like uv probably isn't worth investigating.

      Thanks.

      • baq 16 hours ago

        It probably is. Just try it is the advice.

        • xenophonf 9 hours ago

          That's unhelpful.

          To answer my own question—and to actually help other people with similar use cases—I read about uv's build process and dependency locking. It does not appear to be able to lock dependencies for built distributions (wheels).

          https://docs.astral.sh/uv/concepts/projects/build/

          https://docs.astral.sh/uv/pip/compile/

          However, it does mention that Python supports multiple build systems, which I didn't know, so this hasn't been a complete waste of my time.

          Thanks!

surfingdino a day ago

So... I am switching a project from pip to uv. I am hoping for things to be "better", but so far it's been a bit of a familiar "why does it not work as described?" journey.

  • zahlman a day ago

    Could you give some detail on things you've found that still don't work as described?

    • surfingdino a day ago

      I could use more guidance on migration for setups where development and testing use Docker. I figured things out eventually. The issue is a lack of good tutorials covering cases other than the happy path.

      • zahlman 21 hours ago

        > I figured things out eventually. The issue here is lack of good tutorials that cover cases other than a happy path.

        I'd like to encourage you to blog about it, then.

jgalt212 a day ago

Because pyenv compiles from source, it's optimized for your own platform. In practice, are these performance differences noticeable?

  • zanie a day ago

    Hi! I work on the Python distributions uv uses. Performance is really important to us and we're on the bleeding edge of performant Python builds. Our distributions use profile guided optimization (PGO) as well as post-link optimizations (BOLT). From my understanding, these are not enabled by pyenv by default because they significantly increase build times. It's possible there are some platform-specific build benefits, but I'd be surprised if it was significant.

    I can set up some benchmarks comparing to pyenv on a couple common platforms – lately I've just been focused on benchmarking changes to CPython itself.

  • zahlman a day ago

    For what it's worth, I didn't notice a difference between my distro-provided Python 3.12 and the one I built from source - and enabling profile-guided optimization made only a slight difference. I haven't tested with the precompiled versions uv uses, so they could be slower for some reason, but I doubt it. On the other hand, my hardware is rather old, so maybe newer machines allow for significant optimizations that the system version wouldn't have. But I still kinda doubt it.

    If performance is important to you, the ancient advice to profile bottlenecks and implement important parts in C where you can, still applies. Or you can try other implementation like PyPy.

globular-toast a day ago

I've stuck with simple tools for all these years: pip, pip-tools, virtualenvwrapper etc. I've tried other stuff like poetry and it's always seemed like hard work. I'm glad I waited for uv. The one thing I wish it supported is having venvs outside of project directories. It's so much nicer to have them all in one place (like ~/.venvs or something) which you can ignore for backups etc. That's the only thing I miss, though.

  • sitic 20 hours ago

    I'm also only missing virtualenvwrapper-like support for central named venvs in uv.

    I'm too used to type virtualenvwrapper's `workon` and `mkvirtualenv` commands, so I've written some lightweight replacement scripts of virtualenvwrapper when using uv. Supports tab completion and implements only the core functionality of virtualenvwrapper:

    https://github.com/sitic/uv-virtualenvwrapper

OutOfHere a day ago

The functionalities of three tooling projects, namely uv, ruff (linter), and pyright (type checker) need to merge and become mandatory for new Python projects. Together they will bring some limited sanity to Python.

  • wiseowise a day ago

    Ruff is already integrated into uv and Astral are working on type checker.

    • maleldil a day ago

      How is ruff integrated? As far as I understand, it's still a separate tool that you need to install.

  • __mharrison__ a day ago

    What benefit does merging provide?

    • OutOfHere a day ago

      In an ideal world they shouldn't have to, but in the real world, it makes it easier for enterprises to adopt without friction. Adopting three tools is a threefold bigger challenge in enterprises, but thinking about it as a single tool makes it more amenable to enterprise adoption where it's needed the most. The merging I suggest is only logical, more like a bundling.

      • __mharrison__ a day ago

        Hmmm, I've never seen that and I feel like I work with some pretty locked down companies.

        • OutOfHere 14 hours ago

          If you can get whatever you want into a production workflow, then it's not really that locked down, is it...

          • __mharrison__ 8 hours ago

            If you can get uv in, ruff should be easy...

whimsicalism a day ago

frankly, the only pain point I have working with uv is that it's too new for the LLMs to know about it

  • ddejohn a day ago

    What do LLMs have to do with package and project management?

    • Kwpolska a day ago

      Some people are too lazy to read docs, so they ask the bullshit generators instead.

rullopat a day ago

[flagged]

  • mrlatinos a day ago

    Please, it's every 4 years.

  • affinepplan a day ago

    uv is worth changing to. it legitimately solves python packaging problems the way no other proposed solution thus far could.

  • fwip a day ago

    I think you might have your dates confused. Pyenv was first released about 8 years ago.

  • alexjplant a day ago

    You're being downvoted (for snark presumably) but you have a point.

    During my tenures as a Python developer I've had to deal with pip, pipx, venv, pipenv, setuptools, conda, and poetry. I'd not heard of pyenv or uv until this thread (or maybe I've touched pyenv and got it confused with one of the 7 other tools I mentioned) and I'm sure there are other dependency/environment management tools floating around that I missed.

    Now that I'm back to Go it's `go get` with some mise tasks. It's a serious breath of fresh air. The Python ecosystem probably won't ever catch up to npm when it comes to cranking out shiny new things but it's definitely more bleeding edge than a lot of other languages.

    • turtlebits a day ago

      In the past 10 years, virtualenv and pip have been perfectly fine for me. They still are. I ignored any new tooling.

      uv is great so far, though I did run into a hiccup where moving a CI pipeline from pip with a requirements.txt file to uv slowed it way down, and I had to revert.

      • __mharrison__ a day ago

        Odd that it slowed down. I wondered what happened. For me and my clients, uv has been 2-4 orders of magnitude faster.

    • zahlman a day ago

      > I'd not heard of pyenv or uv until this thread (or maybe I've touched pyenv and got it confused with one of the 7 other tools I mentioned)

      I must have seen at least a dozen threads here about uv since joining half a year ago. But maybe that's because I actively look for Python discussion (and I'm probably just generally more active here).

      I wish I'd paid more attention a few years ago and thought about the packaging problem more intensely - in an alternate universe I might have released Paper before uv came out, but I didn't even really start considering these issues until mid-2023 (and of course I've had plenty of other things to distract me since then).

      For what it's worth, my general thesis is that most of the problems really boil down to Pip being what it is, and a lot of the rest boil down to Setuptools being what it is.

    • JimDabell 17 hours ago

      > Why in the Python ecosystem you change package manager every 2 week?!?

      setuptools (2006), pip (2008), venv (2011), conda (2012), pipx (2017), pipenv (2017), poetry (2018)

      They don’t have a point. You listed seven tools – most of which aren’t package managers at all – which were created over the course of twelve years. That’s not even remotely like changing package manager every two weeks. That goes far beyond hyperbole, straight into misrepresentation.

    • rat87 a day ago

      The reason there have been so many is because the standard included tools (pip, venv) are not great. And others could still use improvements.

      venv and setuptools aren't really package managers. pipx is only meant for installing dev tools per user (in isolated venvs).

      pyenv does something a bit different from the tools you listed. It's not a dependency manager; it's a Python version manager, like nvm (node version manager). It helps you manage downloading and compiling Python from source, lets you specify the Python version in a .python-version file, and provides a shim to find the right Python for a project (compiling it if it's not already available).

      I tried pipenv and despite being hyped for it, it had a lot of issues. Then I tried poetry which seemed much better but was still sort of slow and got stuck updating lock files sometimes.

      I haven't even tried PDM, or the various conda package managers, since conda is mainly used by scientists with lots of binary dependency needs.

      Then ~~uv~~ Rye came along and seemed to fix things. It replaced pip + pip-tools/pipenv/poetry. It also replaced pipx (install Python tools in isolated venvs, then add them to the user's ~/.local/bin). It also replaced pyenv, but instead of compiling Python, which takes a while and can be troublesome, it downloads portable builds: https://astral.sh/blog/python-build-standalone (which have some downsides/compatibility issues but are usually better than compiling Python). It was also written in Rust, so it avoided the circular venv issues that sometimes come with installing Python packages, since it has a self-contained binary (plus some shims).

      Then uv came along, the projects merged, and most development is now happening in uv. Despite the rye -> uv switch, most things are pretty similar, and I feel a lot of excitement around it. The one big difference is that there are no shims to automatically call the right Python for a project. Instead you need to run uv run script.py

      Astral, the team behind uv, took over those standalone Python builds and has also built the most popular Python formatter/linter these days: ruff (also written in Rust, also fast). They're also looking into adding a better type checker for Python type hints.

      I'd recommend trying it for your next project. I think it could become the de facto packaging/version tool for Python.

      • zahlman a day ago

        `venv` is fine. The work of just creating the virtual environment is hardly anything, and `venv` really can't screw it up. If you create environments `--without-pip`, it's actually faster than `virtualenv` and `uv venv` in my testing (because those are fundamentally doing the same thing with a little extra window dressing). What slows it down is bootstrapping Pip into the new environment, via the standard library `ensurepip`, which requires running zipped un-bytecode-compiled code from a bundled wheel.

        (As it happens, this is the exact topic of the blog post I'm currently in the middle of writing.)

        Pip is indeed not great (understatement - there are many other things about it that I have picked on or will pick on in this series).

        >Venv and setup tools aren't really package managers.

        Setuptools nowadays is a build backend. Long ago (when expectations were much lower), Pip had Setuptools as a dependency, and the combination was about as close to a "package manager" as anyone really cared for. But Pip today really can't be called anything like a "package manager" either - it only installs the packages, and resolves dependencies. It records some basic information about the dependency graph of the things it installed - in the sense that it preserves such information from the wheels it installs - but it doesn't really do any processing of that information. If you ask it to remove packages it's (last I checked) not very smart about that. It doesn't help you figure out what packages you need, and it doesn't help you maintain your `pyproject.toml` file.

        And, of course, it doesn't create or keep track of virtual environments for you. (Pipx does the latter, wrapping Pip and `venv`, but it only installs "application" packages that define an entry point.)

        Poetry and PDM are the only things listed that really belong to the same category as uv. They're not only package managers, but complete workflow tools. (Conda is a package manager, for languages other than Python as well, but it's not meant to take over your entire Python workflow in the same way.) They even wrap themselves around the process of uploading to PyPI (which seems really excessive to me; seriously, `twine` is fine too).