• Trigg@lemmy.world · +78/−3 · 8 months ago

    Man, updating packages by compiling them is so stupid.

    Oh look, 15 updated packages from the AUR
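    For reference, this is the usual flow with an AUR helper; a sketch assuming yay (paru behaves the same way):

    ```shell
    # Update official repo packages and AUR packages together
    yay -Syu
    # Rebuild/update only the foreign (AUR) packages
    yay -Sua
    ```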

    • funkajunk@lemm.ee · +28/−1 · 8 months ago

      I always go with the binary version if it’s available in the AUR, ain’t nobody got time for that.
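      For anyone new to this: many AUR packages have a prebuilt counterpart marked with a -bin suffix. A sketch, assuming yay and a hypothetical package name:

      ```shell
      # Prebuilt binary package: installs without a local compile step
      yay -S some-package-bin
      # The unsuffixed some-package would instead be compiled locally via makepkg
      ```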

    • BCsven · +9/−4 · 8 months ago

      I mean, yes, if time is an issue, but code compiled on your own hardware is specifically tuned to your machine; some people want that tiny tweak of performance and stability.

        • BCsven · +1/−7 · 8 months ago

          But that was compiled on some other machine. Compiling on your own hardware optimizes it for that specific hardware and whatever that chip supports, etc.

            • BCsven · +1 · edited · 8 months ago

              Ah, thought you meant in the AUR. I'm used to OBS, where you have both binaries and source available (OBS meaning OpenBuildService, not the screen recorder).

      • zueski@lemm.ee · +3 · 8 months ago

        I use both for different purposes. Gentoo's USE flags are the reason I wait for compiles, but only for computers I touch the keyboard with. Everything else gets Arch.

      • adONis@lemmy.world · +1 · 8 months ago

        would you mind elaborating on the benefits? like what does one actually gain in a real-world scenario by having the software tuned to a specific machine?

        disk space aside, given the sheer amount of packages that come with a distro, are we talking about 30% less CPU and RAM usage (give or take), or is it more like squeezing out the last 5% of possible optimization?

        • BCsven · +4 · 8 months ago

          Closer to the 5%. Between the intermediate code and final code generation there is an optimization stage: the compiler can remove redundant code and adjust to the machine. I.e., my understanding is that an old 4700 can have different instruction sets available than the latest Intel chip. Rather than compiling for generic x86, the optimization phase can tailor the code to the machine's hardware. The benefits are like car tuning: at some point you only get marginal gains. But if squeezing out every drop of performance and shaving bytes is your thing, then the compile time may not be seen as wasted.
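          In Gentoo terms, that tuning is what -march=native does; a minimal make.conf sketch (exact flag choice illustrative):

          ```shell
          # /etc/portage/make.conf (fragment)
          # -march=native makes the compiler target the exact CPU it runs on,
          # enabling instruction sets a generic x86-64 build must leave out
          COMMON_FLAGS="-march=native -O2 -pipe"
          CFLAGS="${COMMON_FLAGS}"
          CXXFLAGS="${COMMON_FLAGS}"
          ```

          Running `gcc -march=native -Q --help=target` shows what "native" actually resolves to on a given machine.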

  • msage@programming.dev · +24 · edited · 8 months ago

    Special Fuck You to:

    • clang
    • LibreOffice
    • Firefox
    • llvm

    I only use dwm, so no idea how long it takes to compile KDE or Gnome on Gentoo.

    Everything else is so quick. Just those four take 20-30 minutes each.

    • porl@lemmy.world · +24 · 8 months ago

      Before I had a proper internet connection (had to ask permission to borrow a dial up account) I bought a magazine that had a picture of a cow on it saying that Larry the cow was different. It was a DVD image of the stage one mirror of this new fangled Gentoo thing.

      Learnt from the magazine how to install a bootloader and so on and then “bravely” typed emerge world into the terminal after configuring the list of all the packages I wanted. Including a full desktop (KDE I think but may have been Gnome). And Firefox. And Open Office. And some multimedia stuff I don’t remember.

      On a Pentium II.

      Took a week before I could do the next step :D

  • rottingleaf@lemmy.zip · +19 · 8 months ago

    12 hours, yes? My first Gentoo install took like 3 times that for all the things stupid me wanted to have.

    • fl42v@lemmy.ml · +4/−1 · 8 months ago

      Relatable. Me: wants musl libc and to build stuff with clang (so that it's not GNU/Gentoo). Firefox: wants neither musl nor clang, due to some god-knows-how-old bug.
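      Gentoo's escape hatch for a single stubborn package is a per-package environment; a sketch assuming the stock /etc/portage layout (the env file name is my own):

      ```shell
      # /etc/portage/env/force-gcc.conf
      # Build with GCC even if the global toolchain is clang
      CC="gcc"
      CXX="g++"

      # /etc/portage/package.env
      # Apply that environment only to Firefox:
      #   www-client/firefox force-gcc.conf
      ```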

      • rottingleaf@lemmy.zip · +3 · 8 months ago

        Even on FreeBSD and OpenBSD they use GCC for the things that require it, which kinda highlights the Gentoo philosophy's problem in this regard. Setting USE flags mostly globally seems like a cool idea, but when customization gets down to setting them for every package, one could just as well use FreeBSD ports.
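        For contrast, the two levels look like this (flag names illustrative; real USE flags vary per package):

        ```shell
        # /etc/portage/make.conf — global USE flags, applied to every package
        USE="wayland -gnome"

        # /etc/portage/package.use/firefox — per-package overrides, e.g.:
        #   www-client/firefox hwaccel -telemetry
        ```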

  • Pacmanlives@lemmy.world · +10 · 8 months ago

    I'm rolling a few Gentoo VMs these days, and it's really not that bad to compile things anymore, even on my old-ass (10-year) dual Xeon setup. I remember X taking a few days to a week to compile back in the 2000s.

  • adONis@lemmy.world · +4 · 8 months ago

    Oh man, just today I was messing around with Flatpak, where I tried building webkit2, which took ages (almost an hour, to be specific).

    And I was thinking to myself if that’s what Gentoo feels like.

  • cygon@lemmy.world · +2 · 8 months ago

    I usually compile with --quiet-build=y; it doesn't have to be configure scripts and makefiles blasting into a shell window the whole time. On the rare occasion a build fails, the log is still there in /var/tmp/portage/....
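    To make that the default rather than a per-run flag, a sketch using EMERGE_DEFAULT_OPTS (documented in make.conf(5)):

    ```shell
    # /etc/portage/make.conf
    EMERGE_DEFAULT_OPTS="--quiet-build=y"

    # On failure, the full build log survives under the package's temp dir:
    #   /var/tmp/portage/<category>/<package-version>/temp/build.log
    ```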