  • I had an upgrade plan for my PC that included a step up to a 4k monitor, but when the time came, it was hard enough just finding a 4k monitor with decent specs that I stopped to think about whether I’d actually benefit from it. I already knew I didn’t need it, but I realized I wouldn’t even gain much from it. I was already using UI scaling on the one 4k monitor I had at work, so that was a wash. And in games, I never found myself wishing the resolution were higher than the 1440p I was already running, but I did find myself wishing the frames came faster or more consistently.

    Part of the plan was a new GPU to handle 4k better (the two upgrades were supposed to justify each other), but I ended up just getting an ultrawide 1440p monitor instead.

    I don’t think I’ll ever bother with higher than 4k for TV or 1440p for PC.
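
    (Back-of-envelope, assuming a typical 3440 × 1440 ultrawide: 4k is 3840 × 2160 ≈ 8.3 million pixels per frame, standard 1440p is 2560 × 1440 ≈ 3.7 million, and the ultrawide is ≈ 5.0 million. So native 4k asks the GPU for roughly 2.2× the per-frame work of standard 1440p, which is exactly the opposite of faster, more consistent frames.)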

  • I bet the answer depends more on management than on the customers or the type of work.

    Like, a good manager who doesn’t take shit from customers will be way better than one who bends over backwards for every complaint.

    Same for managers who are chill as long as things are getting done vs. the one who is more interested in the illusion of work, even while real tasks get neglected (because all their attention goes to making sure people look busy, rather than understanding the work well enough to evaluate results).

  • I’m planning on going back to a restaurant I was at last week… but to try one of the other dishes that looked good (even though the one I had might be the best fish dish I’ve ever eaten). I’ve struggled with trying new things at restaurants in the past, because it’s hard to commit to a mystery dish that might be good over a known dish I’m sure I’ll like (usually burgers or pasta).

    But after trying a tasting menu at a Michelin-starred restaurant (not the one I might go back to today), full of dishes I’d never order on my own where every single one was amazing, I have an easier time taking that risk.

    But I never blamed anyone other than myself for not trying new dishes before that.

    Edit: I went back and had one of the other dishes. Unfortunately it wasn’t as good as the last one, but it was still decent, and now I know I prefer the first dish. There are still others I want to try, though; the first dish might be out of season by the time I get through everything. And nothing horrible happened just because I tried a dish I wouldn’t choose again.

  • Yeah, I don’t get that. I can’t even think of very many shows that I’ve watched through twice, let alone repeatedly. If we limit it to shows where the second watch is just for myself and not to show someone else, Breaking Bad is the only one I can think of.

    And even including shows I’ve rewatched to show someone else, the list only grows to Lost, Malcolm in the Middle, Futurama, Naruto, WandaVision, and Loki.

    Oh, and I guess MXC: I kept the Twitch channel that was nothing but MXC episodes on a loop playing for a while, though that was more because there were enough episodes that it took a long time to start feeling repetitive.

  • Yeah, that use of them makes sense: as a method for churning out hypotheses. But their wording suggests to me that they might not have been created for that purpose (Hanlon’s uses the word “never”), and the vast majority of the time I see people invoke them in discussions, it’s to try to discredit another comment, not to explain why they’re presenting a hypothesis. (In fact, once you have the hypothesis, the brainstorming method that got you there isn’t really relevant anymore; the next step should be finding ways to support or refute it.)

    It’s just frustrating seeing people quote razors as if they were supporting evidence; that’s the pseudologic part.

    I’ll also point out that “pseudoscience” or “pseudologic” doesn’t mean it’s useless, just that it isn’t as profound as many seem to believe it is.

  • > Everything AI boom is likely a lie, and Nvidia bribing Trump to sell H200s to China, at 25% export tariff, is proof of incapacity or unwillingness of US industry to deploy them.

    I’d love for you to be right (I’d like to see Nvidia compete as an underdog, since they’re fairly anticompetitive in their dominant position), but I think this reasoning is flawed.

    Wanting to sell to China just means demand isn’t exceeding supply, or maybe that they have access to more supply they’d tap if they could sell into China, which is a massive market. And even if they don’t have any excess supply, higher demand means they can set higher prices and still expect to sell all their inventory.

    Like, US car companies wanting to sell cars in China doesn’t imply they’re unable to sell cars in the US; it just means they want to sell cars in both markets.

    I agree with the rest of your comment and think it was well said; sorry about the nitpick.

  • On the flip side, it takes longer to type something than to say it. Plus, verbal communication can be two-way even when the talking is mostly on one side, because you can acknowledge that you’re following without interrupting, or interrupt when something is said that you don’t follow.

    I do better with text myself, but communication is something where you need to meet in the middle, assuming you’re open to communicating in the first place. If you just don’t want to communicate, then whichever method is easiest to blow off becomes the preferred one. Which is actually another reason I personally like text: I can ignore it in the moment and get back to it later. Though you can do that with calls too, by asking to schedule a call instead of taking it right then.

  • Yeah, Java’s enforcement of “everything must be a class” put me off the language right from the start. I was already used to C++ at that point, and I hated that I couldn’t just write a quick little test function to check something; it needed a bunch of boilerplate just to get started.
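
    The kind of throwaway check I mean (a made-up example; any few-line experiment works the same way):

        // quick_check.cpp: a self-contained sanity check, no class ceremony required
        #include <iostream>

        bool nearly_equal(double a, double b) {
            return (a > b ? a - b : b - a) < 1e-9;
        }

        int main() {
            std::cout << (0.1 + 0.2 == 0.3) << '\n';           // prints 0: exact float compare fails
            std::cout << nearly_equal(0.1 + 0.2, 0.3) << '\n'; // prints 1: tolerance compare passes
        }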

    I still think C++ has a great balance between object-oriented and procedural programming. Not sure if it’s the best, but I can’t think of ways to improve on it, other than built-in concurrency stuff, like monitor classes with built-in locks that prevent more than one thread from accessing any of an object’s functions at the same time, basically guaranteeing that any code inside it has mutual exclusion. Something like the sketch below.
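
    A rough sketch of the idea (the Monitor name and with_lock interface are just mine for illustration; standard C++ has no built-in monitor type):

        // Wraps a value with a mutex so every access happens under the lock,
        // i.e. at most one thread can touch the object at a time.
        #include <iostream>
        #include <mutex>
        #include <utility>
        #include <vector>

        template <typename T>
        class Monitor {
        public:
            template <typename... Args>
            explicit Monitor(Args&&... args) : obj_(std::forward<Args>(args)...) {}

            // Run any callable against the protected object while holding the
            // lock; obj_ is reachable no other way, so mutual exclusion holds.
            template <typename F>
            auto with_lock(F&& f) {
                std::lock_guard<std::mutex> lock(mtx_);
                return std::forward<F>(f)(obj_);
            }

        private:
            T obj_;
            std::mutex mtx_;
        };

        int main() {
            Monitor<std::vector<int>> shared;
            shared.with_lock([](std::vector<int>& v) { v.push_back(42); });
            std::cout << shared.with_lock([](std::vector<int>& v) { return v.size(); }) << '\n'; // prints 1
        }

    The nice part of routing everything through one accessor is that forgetting to take the lock stops being possible.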