RickyWars1

  • 25 Posts
  • 96 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • Everyone’s hating but honestly fair enough move.

    On the whole, nobody uses Bing or takes it seriously anyway, so I guess they have to find their niche. It's certainly not aimed at us (Lemmy/Fediverse users), who are generally more privacy conscious. If it can attract some mainstream users (e.g. Google users, people like your parents) or stop some users from immediately switching their search engine to Google, then it might be a good decision for them.

    Bing providing the exact same service as Google but worse clearly wasn’t working for them.


  • RickyWars1 to Science Memes@mander.xyz: Flowchart for STEM
    10 points · edited · 4 days ago

    Certainly not limited to IT. One of my professors from many years ago was an aerospace engineer¹. He recounted to us the time he busted his ass on some design for ages and managed to make some huge cost savings. And then, after it was done, he realized that all his extra hard work had really accomplished was helping some executives and stockholders get a bit richer. Not long after that he switched to education.

    ¹ Not in the defense industry



  • Ok then. I’ll echo what some others are saying about 16GB being sufficient. If you were in engineering it might occasionally fall short, but I don’t think that’s the case for comp sci. I’d leave the door open and get one where you can upgrade the RAM, though.

    One thing to look out for is CPU performance. The laptop CPU market is a disaster right now, where you really don’t know what you’ll get; LTT has a recent video on the topic. For most courses it won’t actually matter that much. Some examples of courses where it could make a difference are numerical linear algebra, machine learning (classical, not neural networks), and computer vision (again, classical). In some of these extra RAM might also be helpful, but I’d prioritize a better CPU over the RAM. You can look at CPU benchmarks to get an idea of their performance.

    In terms of GPU… I don’t think you’ll get anything capable enough for training neural networks at this price point, which is the only thing you might need one for in comp sci. But it’ll help with light gaming (though I imagine integrated graphics is good enough for Minecraft these days, but don’t quote me on that).

    Lastly, I would still recommend finding something with decent Linux support even if you don’t want to use it (yet); you may choose to install it down the line. My Dell XPS/Precision has pretty poor Linux support, with buggy trackpad behavior that has caused problems for me in the past. Many comp sci students end up switching to Linux or dual-booting, for good reason.



  • RickyWars1 to PC Gaming: AMD confirms Radeon GPU sales have nosedived
    8 up / 2 down · 3 months ago

    But AMD would rather sell two cards at 1000 each than take the bet of trying to sell four at 750.

    At the same time, though, this might not be unreasonable. I don’t know what the profit margin on these cards is given the R&D, manufacturing costs, and other various overheads, but it might be WAY more worth it to sell two at 1000 than four at 750. It might even be worth it to sell one at 1000 vs four at 750, depending on how slim the margin is after all those costs.
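    The trade-off can be sketched with some back-of-the-envelope arithmetic. Assume a purely hypothetical flat per-unit cost `c` (real margins also carry amortized R&D and overhead, which this ignores): profit is units × (price − c), and the break-even point between the two strategies falls out of 2·(1000 − c) = 4·(750 − c), i.e. c = 500.

    ```python
    # Hypothetical sketch: compare profit from selling 2 cards at $1000
    # vs 4 cards at $750, under an assumed flat per-unit cost c.
    # All numbers illustrative; real margins include R&D and overhead.

    def profit(units: int, price: float, unit_cost: float) -> float:
        """Total profit from selling `units` cards at `price` each."""
        return units * (price - unit_cost)

    # Break-even: 2*(1000 - c) == 4*(750 - c)  ->  2000 - 2c == 3000 - 4c  ->  c == 500
    for c in (400, 500, 600):
        print(f"unit cost {c}: 2 @ 1000 -> {profit(2, 1000, c)}, "
              f"4 @ 750 -> {profit(4, 750, c)}")
    ```

    So under this toy model, whenever the per-unit cost exceeds $500, two cards at $1000 really does beat four at $750, which is the direction the comment above is gesturing at.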