• wise_pancake

    60k rows is generally very usable, even for wide tables in row-oriented formats.

    I’ve had pandas work just fine with 1M+ rows and 100 columns in memory.

    Past 1M rows, move on to something better: Dask, Polars, Spark, or literally any database.

    The first thing I’d do with whatever data they’re running into issues with is rewrite it as partitioned, sorted Parquet.
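    A minimal sketch of that rewrite in pandas, assuming pyarrow is installed; the file name, column names, and partition key here are hypothetical:

        import pandas as pd

        # Hypothetical input: a wide CSV that's become painful to work with.
        df = pd.read_csv("events.csv", parse_dates=["event_date"])

        # Derive a coarse partition key so we don't end up with one tiny file per day.
        df["month"] = df["event_date"].dt.strftime("%Y-%m")

        # Sort so row groups within each partition are ordered on the likely query keys.
        df = df.sort_values(["event_date", "user_id"])

        # Write a Hive-style partitioned Parquet dataset (one subdirectory per month).
        df.to_parquet("events_parquet/", partition_cols=["month"], index=False)

    Sorting before the write matters because Parquet stores min/max statistics per row group, so readers can skip chunks that can’t match a filter.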

    • Onno (VK6FLAB)@lemmy.radio

      My go-to tool of late is DuckDB: it ships binaries for most platforms, works out of the box, reads any number of data formats, and is FAST.
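      As a sketch of the same idea from Python, querying a Hive-partitioned Parquet dataset like the one above (the duckdb package is pip-installable; the paths and columns are hypothetical):

          import duckdb

          # DuckDB scans partitioned Parquet in place; no import or load step needed.
          result = duckdb.sql("""
              SELECT month, COUNT(*) AS n
              FROM read_parquet('events_parquet/**/*.parquet', hive_partitioning = true)
              GROUP BY month
              ORDER BY month
          """).df()

          print(result)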