OK, maybe you wouldn’t pay three grand for a Project DIGITS PC. But what about a $1,000 Blackwell PC from Acer, Asus, or Lenovo?


Besides, why not use native Linux as the primary operating system on this new chip family? Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn’t. It’s that simple.

Nowadays, Linux runs well with Nvidia chips. Recent benchmarks show that open-source Linux graphics drivers work with Nvidia GPUs as well as Nvidia’s own proprietary drivers do.

Even Linus Torvalds thinks Nvidia has gotten its open-source and Linux act together. In August 2023, Torvalds said, “Nvidia got much more involved in the kernel. Nvidia went from being on my list of companies who are not good to my list of companies who are doing really good work.”

    • qaz@lemmy.world · 3 points · 3 hours ago

      I don’t care why they got their shit together, I’m happy as long as they fix the open source drivers.

  • I Cast Fist@programming.dev · 21 points · 9 hours ago

    Linux, after all, already runs on the Grace Blackwell Superchip. Windows doesn’t.

    And why is that?

    Project DIGITS features the new NVIDIA GB10 Grace Blackwell Superchip, offering a petaflop of AI computing performance for prototyping, fine-tuning and running large AI models.

    With the Grace Blackwell architecture, enterprises and researchers can prototype, fine-tune and test models on local Project DIGITS systems running Linux-based NVIDIA DGX OS, and then deploy them seamlessly on NVIDIA DGX Cloud™, accelerated cloud instances or data center infrastructure.

    Oh, because it’s not a fucking consumer product. It’s for enterprises that need a cheap supercomputer.

  • AnUnusualRelic@lemmy.world · 5 points · 7 hours ago

    Or you can just buy any random potato computer (or assemble it yourself from stuff you found) and still run Linux on it.

  • collapse_already@lemmy.ml · 4 points · 8 hours ago

    Haven’t they been making things like the Jetson AGX for years? I guess this is an announcement of the next generation.

    • GrumpyDuckling@sh.itjust.works · -1 points · 8 hours ago

      It’s a pile of shit compared to any other SBC. It’s difficult to develop or run anything on it because it has an ARM chip.

      • collapse_already@lemmy.ml · 4 points · 8 hours ago

        But ARM is the most deployed microprocessor in the world? I’d much rather write ARM assembly than Intel or PowerPC. For higher-level languages, ARM has good compiler support. Can you explain why you don’t like ARM? I’m genuinely curious, because it is probably my favorite development environment (I mostly write embedded system software).

        • GrumpyDuckling@sh.itjust.works · 0 points · 6 hours ago

          Linux packages don’t work on it unless they’re custom-compiled, the OS is supplied by Nvidia unless you build or compile your own, and support for these will be abandoned when the next one comes out. Minimal performance for the price, in exchange for lower power consumption. Really only useful for image recognition for OEMs doing automated factory quality inspection, robotics, etc., where Internet access is limited.

          • collapse_already@lemmy.ml · 1 point · 5 hours ago

            The AGX that I use has Ubuntu 22.04 LTS. I have been able to update it with apt. For us, it has been a good environment for CUDA. We run a Rust application that uses C++ CUDA image processing on the back end. Sorry people are downvoting you.

  • XNX@slrpnk.net · 2 points · 9 hours ago

    Can’t load the article. Does it mention whether these will be ARM computers?

  • reksas@sopuli.xyz · 2 points · 10 hours ago

    I’m planning on getting a new PC soon. I was planning on avoiding Nvidia because I had read it might be more difficult to get drivers. Does this mean they are going to improve things in general, or just for the newest and likely most expensive stuff? I don’t want to buy the newest possible GPU, since they always have a bloated price for being new, and slightly older ones are likely decent enough too.

    • Petter1@lemm.ee · 1 point · 8 hours ago

      Modern Nvidia GPUs work great, like the GTX 900 series and newer.

      The main problem is Nvidia’s legacy cards, where Nvidia isn’t updating its proprietary drivers and isn’t making them open source. On newer kernels that leaves you with Nouveau, which has fewer features and uses more power, but is Wayland-compatible.

  • But_my_mom_says_im_cool@lemmy.world · -6 points · 8 hours ago

    I fucking wish you could filter out words like “Linux” on Lemmy so I don’t have to hear it anymore. I avoid Linux out of spite to all the Linux bros

  • Chemical Wonka@discuss.tchncs.de · 251 points · 2 days ago

    Don’t forget those who made it happen. Nvidia was “forced” to integrate Linux into its ecosystem

    Nvidia has always been hostile to the Linux community or negligent to say the least

    • CeeBee_Eh@lemmy.world · 14 points · 21 hours ago

      Nvidia was “forced” to integrate Linux into its ecosystem

      100% bullcrap.

      Nvidia’s servers for data processing have always run Linux. And you know what those servers run? It’s not Windows, that’s for sure. So why would they write multiple versions of a driver for the same hardware interface? Their servers use the same drivers that you would use for gaming on a Linux desktop system.

      In fact, no version of Windows is supported on their DGX servers, and AFAIK you can’t even install Windows on it (even if you managed, it wouldn’t be usable).

      Long story short: a vendor we were working with (about 6 or 7 years ago now) was working on the Linux version of their SDK. We wanted to do some preliminary testing on Nvidia’s new T4s, which at that point were only available via Nvidia’s testing datacenter (which we had access to).

      During a call with some of the Nvidia engineers I had to ask the awkward question of “any chance there’s a Windows server we can test on?”. I knew it was a cringe question and I died a little during the 10 second silence until one of the Nvidia guys finally replied with “no one uses Windows for this stuff”. And he said it slowly like the reply to such a question needed to go slow to be understood, because who else would ask that question unless you’re slow in the head?

      Nvidia has always been hostile to the Linux community or negligent to say the least

      People say “hostile”, but I think a better word is arrogant. They wanted to force the industry to use implementations they owned or pioneered, like EGLStreams, instead of open standards. But AMD and Intel have proven that open-source graphics drivers not only work, but benefit from being open, so that the community can scratch their own itches and fix issues faster.

      • priapus@sh.itjust.works · 2 points · 7 hours ago

        Yep, Nvidia has never been hostile towards Linux; they benefit from supporting it. They just don’t care to support the desktop that much, and frankly neither do AMD or Intel. They often take an extremely long time to fix simple bugs that only affect desktop usage. Fortunately, in their case, the drivers can be fixed by other open source contributors.

    • mac@lemm.ee · 7 points · 1 day ago

      Am I missing something here? Nvidia never caved to their demands IIRC

    • Cris@lemmy.world · 88 points · 2 days ago

      Man, I completely forgot about that. That’s honestly wild to think about in retrospect…

      • CeeBee_Eh@lemmy.world · 10 points · 21 hours ago

        It’s not. It had nothing to do with it. Nvidia was all in with Linux as soon as they realized their hardware could be used for data processing and AI. That realization was way more than a decade ago.

        • umbrella@lemmy.ml · 2 points · 8 hours ago

          Their drivers were good for AI and compute way before the leak.

          But suddenly their desktop drivers are open, after hackers leaked their desktop driver code? mmmmmmm…

          • priapus@sh.itjust.works · 4 points · 7 hours ago

            They only open-sourced the kernel drivers, which just makes sense for them to do. The userspace drivers, which the attackers wanted opened, are still very much closed. The leak likely had nothing to do with it.

    • projectmoon@lemm.ee · 27 points · 2 days ago

      Don’t know about “always.” In recent years, like the past 10 years, definitely. But I remember a time when Nvidia was the only reasonable recommendation for a graphics card on Linux, because Radeon was so bad. This was before Wayland, and probably even before AMD bought ATI. And it was certainly long before the amdgpu drivers existed.

      • Dark Arc@social.packetloss.gg · 14 points · 2 days ago

        Yeah it was before AMD did graphics.

        ATI had an atrocious closed source driver. I used it … but it was not good at much of anything.

        • endeavor@sopuli.xyz · 6 points · 2 days ago

          I had an ATI card in my PC when the Pentium 4 was all the rage. I literally spent my teenage years learning English in order to get the dumb games I saved up ages for to work without crashing constantly. It’s shocking how the same terrible card manufacturer is part of the company that now makes the only CPU worth a damn, and great GPUs.

      • rottingleaf@lemmy.world · 2 points · 2 days ago

        Nvidia is still rather nice with FreeBSD, because their official proprietary driver there is, well, fully official, while drivers ported from Linux somewhat lag behind and have problems sometimes.

  • Serge Matveenko@lemmings.world · 11 points · 1 day ago

    Well, it’s still a modified custom distro, and other distros will need to invest extra effort to be able to run on it. So, no actual freedom of choice for users again…

    • sprack@lemmy.world · -1 points · 10 hours ago

      Not true. You can run other distros, but it won’t be ready to go without a decent amount of work.

        • sprack@lemmy.world · 0 points · 9 hours ago

          They said there is no freedom of choice. You are free to choose any distro you want. Hardware manufacturers aren’t under any obligation to provide out-of-the-box support for niche markets.

          • A_Random_Idiot@lemmy.world · 2 points · 9 hours ago

            Folks, I think we’ve got a bot here that’s just spewing pre-prepared lines, regardless of how irrelevant the comment is.

            • sprack@lemmy.world · -1 points · 9 hours ago

              Versus someone that doesn’t understand the market segment this is meant for. People using it for training don’t care about the distro or gfx drivers. It’s an appliance.

  • shortwavesurfer@lemmy.zip · 78 points · 2 days ago

    Honestly, I found that my compute needs were surpassed quite a while ago, so I could easily get away with buying a $300 computer.

    • Snot Flickerman@lemmy.blahaj.zone (OP) · 54 points · 2 days ago

      Honestly, for real, a lot of low-power PCs are really useful once they have crap like Windows off of them and a lightweight Linux distro on them.

      • shortwavesurfer@lemmy.zip · 35 points · 2 days ago

        Exactly. Get yourself a somewhat low-end PC, wipe windows, and install Linux Mint, and you’re pretty much golden.

        • Emi@ani.social · 11 points · 2 days ago

          Did exactly this with an old laptop, and use it mainly for TV and occasional browsing when staying at our hut/cottage. Still a bit slow, but it works.

      • ricecake@sh.itjust.works · 13 points · 2 days ago

        I’ve found my preferences have been creeping up in price again, but only because I want an actually physically lightweight laptop, and those have been getting more available, more Linux-able, and more capable.

        I only need a few hundred dollars’ worth of computer, and anything more can live on a rack somewhere. I’ll pay more than that for my computer to be light enough that I don’t need to think about it.

    • tal@lemmy.today · 19 points · 2 days ago

      Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software simply ran twice as quickly after every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

      In that environment, it was quite important to upgrade the CPU.

      But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

      This is about ten years old now:

      https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

      Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

      Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

      If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

      We can also look at about the twelve years since then, which is even slower:

      https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

      This is using a benchmark to compare the single-threaded performance of the i7 4960X (Intel’s high-end processor back at the start of 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed to get single-threaded performance about (5068/2070)=~2.448 times the 12-year-old processor. That’s (5068/2070)^(1/12)=1.07747, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
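
      If you want to check that arithmetic, here’s a quick Python sketch; the two PassMark single-thread scores and the 12-year span are taken straight from the numbers above, and nothing else is assumed:

          # Annualized single-thread improvement from two benchmark scores.
          old_score = 2070   # Intel i7-4960X, single-thread, early 2013
          new_score = 5068   # Intel Ultra 9 285K, single-thread, ~12 years later
          years = 12

          total = new_score / old_score       # ~2.448x overall
          annual = total ** (1 / years) - 1   # ~0.077, i.e. about 7.7% per year

          print(f"total speedup: {total:.3f}x")
          print(f"per-year improvement: {annual:.1%}")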

      We have still had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike serial compute, parallel compute isn’t a “free” performance improvement: software needs to be rewritten to take advantage of it, many problems are hard to parallelize, and some cannot be parallelized at all.
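
      To put a number on that last point, Amdahl’s law (my framing, not something the comment invokes) gives the best-case speedup when only part of the work can be parallelized, and it shows why extra cores stop helping surprisingly early:

          # Amdahl's law: best-case speedup on n cores when a fraction p
          # of the work parallelizes and (1 - p) must stay serial.
          def speedup(p: float, n: int) -> float:
              return 1 / ((1 - p) + p / n)

          # Even with 95% of the work parallelizable, 1024 cores manage only ~20x:
          for n in (4, 64, 1024):
              print(f"{n:5d} cores -> {speedup(0.95, n):4.1f}x")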

      Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.

      • SreudianFlip@sh.itjust.works · 1 point · 1 day ago

        My line for computational adequacy was crossed with the Core 2 Duo. Any chip since has been fine for everyday administration or household use, and they are still fine running Linux.

        Any Apple silicon including the M1 is now adequate even for high end production, setting a new low bar, and a new watershed.

      • shortwavesurfer@lemmy.zip · 3 points · 2 days ago

        You know, that would explain a lot because I had no idea that there was an authentication pin and that’s total bullshit.

    • SaharaMaleikuhm@feddit.org · 3 points · 1 day ago

      For real, I’m happily using an APU 90% of the time. I barely need a dedicated GPU at all any more. I use Mint btw.

    • QuarterSwede@lemmy.world · 10 points · 2 days ago

      I bought a former office HP EliteDesk 800 G2 16GB for $120 on eBay or Amazon (can’t recall) 2 years ago with the intention of it just being my server. I ended up not unhooking the monitor and leaving it on my desk since it’s plenty fast for my needs. No massive PC gaming rig but it plays Steam indie titles and even 3D modeling and slicing apps at full speed. I just haven’t needed to get anything else.

      • shortwavesurfer@lemmy.zip · 7 points · 2 days ago

        Being blind, I don’t play video games and don’t do any kind of 3D graphics and stuff like that. So many, many computers would fit my specifications.

        Edit: My laptop right now is a Dell Latitude E5400 from like 2014 with eight gigabytes of RAM and a 7200 RPM drive with an Intel Core i5 and it works well enough. Honestly, the only problem with it is that it does not charge the battery. So as soon as it is unplugged from the wall, it just dies. And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery. I figure with it being 10 years old already, at some point I will have to replace it.

        • tal@lemmy.today · 3 points · 2 days ago

          And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

          At least some Dell laptops authenticate to the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

          Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.

          I bet the charger on yours is a barrel charger with that pin down the middle.

          hits Amazon

          Yeah, looks like it.

          https://www.amazon.com/dp/B086VYSZVL?psc=1

          I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

          If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the one you get is one of those (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

          EDIT: Even one of the top reviews on that Amazon page mentions it:

          I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…

        • Cyborganism · 3 points · 2 days ago

          Oh hey, I have question for you then. Are you using any braille system with your computer? Or is it a kind of voice reader thing you have going on? What do you use for reading posts and comments on Lemmy?

          • shortwavesurfer@lemmy.zip · 4 points · 2 days ago

            I use my phone a lot more frequently than I use my computer, and on the phone I primarily use the TalkBack screen reader. I can read and write Braille, of course, and have been able to do so since I was a little kid, but I don’t do it very often, primarily because I’ve always found reading to be slow for me, so I prefer audio. I’m able to better absorb information through audio than through reading it directly, and always have been.

            Edit: I’m not totally blind, so my primary navigation is through memorization of where things are, and for longer posts and the like I use the screen reader. For example, on my home screen I know where I’ve placed my app icons, so I can just easily navigate to them, and in settings I know roughly where the menus I’m looking for are, so I can get to them quickly. I also use the magnification gestures a lot. So, primarily, I navigate with memorization, magnification gestures, and the screen reader for longer stuff.

                • CancerMancer@sh.itjust.works · 1 point · 10 hours ago

                  Can you actually understand mumble rappers?

                  Seriously though I find these accessibility discussions very informative and they make me think about how I develop things and share information. Thank you for sharing.

        • Snot Flickerman@lemmy.blahaj.zone (OP) · 1 point · 2 days ago

          Oh snap I am really sorry to intrude but I have a question for someone like yourself who is an avid PC user and is also blind.

          How do you feel about the prohibitive cost of braille terminals? I am not blind, but I remember seeing the film Sneakers when I was young, with the blind hacker Whistler using a braille terminal. As an adult I looked into them and was shocked that some cost more than a mid-range laptop. Are they even that useful, or are they a relic I recall that has been superseded by more useful assistive technologies?

          • shortwavesurfer@lemmy.zip · 4 points · 2 days ago

            Mind you, I don’t use Braille super often. And the Braille note taker devices are quite expensive. For sure. But just direct Braille displays have come down quite a bit in price. I remember a couple of years ago, a Braille display was launched called the Orbit Reader 20, which is a 20 cell Braille display. And I think it was like $400 or something like that. Compared to the $5,000 that some Braille note-taker devices can cost, $400 is nothing.

              • shortwavesurfer@lemmy.zip · 2 points · 2 days ago

                Same here. It used to be that you had to get them subsidized by government programs such as vocational rehabilitation. But now they are affordable by just saving for a little bit.

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com · 6 points · 2 days ago

      I was that way for the longest time. I was more than content with my 4 core 8 thread 4th Gen. i7 laptop. I only upgraded to an 11th Gen. i9 system because I wanted to play some games on the go.

      But after I upgraded to that system I started to do so much more, and all at once, mostly because I actually could; the old system would cry in pain long before then. But mid last year I finally broke down and bought a 13th Gen. i9 system to replace it, and man do I flog the shit out of this computer. Just having the spare power lying around made me want to do more and more with it.

      • shortwavesurfer@lemmy.zip · 2 points · 2 days ago

        My current laptop is a Dell Latitude E5400, with 4 threads, 8 gigs of RAM, and a 7200 RPM drive, and it works well enough even though it’s 10 years old. Honestly, the only problem with it is that it does not charge the battery; it’s something in the charging circuitry. It works fine when it’s on wall power, but it absolutely will not charge a battery anymore.

        • SynopsisTantilize@lemm.ee · 1 point · 2 days ago

          I’m still dailying my Acer C720 Chromebook with Linux Mint lol. I’m thankful the flimsy charging port hasn’t given out yet. But it’s coming.

  • j4k3@lemmy.world · 31 points · 2 days ago

    NVCC is still proprietary and full of telemetry. You cannot build CUDA without it.