• 0 Posts
  • 410 Comments
Joined 3 years ago
Cake day: June 30th, 2023



  • I’m planning on going back to a restaurant I was at last week, but to try one of the other dishes that looked good (even though the dish I had might be the best fish dish I’ve ever had). I have struggled with trying new things at restaurants in the past, because it’s hard to commit to a mystery dish that might be good over a known dish that I know I’ll like (usually burgers or pasta).

    But after trying a tasting menu at a Michelin-starred restaurant (not the one I might go back to today), which was full of dishes I’d never order on my own yet every single one of them was amazing, I have an easier time taking that risk.

    But I never blamed anyone other than myself for not trying new dishes before that.

    Edit: I went back and had one of the other dishes. Unfortunately it wasn’t as good as the last one, but it was still decent, and now I know I prefer the first dish. There are still others I want to try, though the first dish might be out of season by the time I get through them all. And horrible things didn’t happen just because I tried a dish I wouldn’t choose again.



  • Yeah, I don’t get that. I can’t even think of very many shows that I’ve watched through twice, let alone repeatedly. If we limit it to shows where the second watch is just for myself and not to show someone else, Breaking Bad is the only one I can think of.

    And even expanding to shows watched multiple times to show someone else, the list only grows to Lost, Malcolm in the Middle, Futurama, Naruto, WandaVision, and Loki.

    Oh, and I guess MXC. For a while I had the Twitch channel that was nothing but MXC episodes on a loop playing in the background, though that was more because there were enough episodes that it took a while before it felt repetitive.





  • Yeah, that use of them makes sense, as a method to churn out hypotheses. But their wording suggests to me that they might not have been created for that purpose (Hanlon’s razor uses the word “never”), and the vast majority of the time I see people invoking them in discussions, it’s to try to discredit another comment, not to explain why they’re presenting a hypothesis. In fact, once you have the hypothesis, the brainstorming method used to get there isn’t really relevant anymore; the next step should be finding ways to support or refute that hypothesis.

    It’s just frustrating seeing people quoting razors as if they are supporting evidence, and that is the pseudologic part.

    I’ll also point out that “pseudoscience” or “pseudologic” doesn’t mean it’s useless, just that it isn’t as profound as many seem to believe it is.


  • Everything AI boom is likely a lie, and Nvidia bribing Trump to sell H200s to China, at 25% export tariff, is proof of incapacity or unwillingness of US industry to deploy them.

    I’d love for you to be right (I’d like to see Nvidia compete as an underdog, since they are fairly anticompetitive in their dominant position), but I think this reasoning is flawed.

    Wanting to sell to China just means that demand isn’t exceeding supply, or maybe even that they have access to more supply that they’d use if they could sell to China, which is a massive market. Or even if they don’t have any excess supply, higher demand means they can set higher prices and still expect to sell all inventory.

    Like how US car companies wanting to sell cars in China doesn’t imply that they’re unable to sell cars in the US; it just means they want to sell cars in both China and the US.

    I agree with the rest of your comment and think it was well said, sorry about this nitpick.


  • On the flip side, it takes longer to type text than to say it, plus verbal communication can be two-way even when the talking is mostly on one side: you can acknowledge what you understand without interrupting, or interrupt when something is said that you don’t follow.

    I do better with text myself, but communication is something where you need to meet in the middle, assuming you’re open to communication in the first place. If you just don’t want to communicate, then whatever is easiest to blow off becomes the preferred method. That’s actually another reason I personally like text: I can ignore it in the moment and get back to it later. Though you can do the same with calls by asking to schedule one instead of taking it right then.



  • Yeah, Java’s insistence that everything must be a class put me off the language right from the start. I was already used to C++ at that point and hated that I couldn’t just write a quick little test function to check something; it needed a bunch of boilerplate to even get started.

    I still think C++ has a great balance between object-oriented and procedural programming. Not sure if it’s the best, but I can’t think of ways to improve on it, other than built-in concurrency constructs (like monitor classes with built-in locks that prevent more than one thread from executing any of their member functions at the same time, essentially guaranteeing mutual exclusion for any code inside them).
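    Monitor classes aren’t in the standard library, but you can hand-roll the idea. A minimal sketch (all names illustrative): every public method grabs the same mutex, so only one thread can be inside the object at a time.

    ```cpp
    #include <iostream>
    #include <mutex>
    #include <thread>

    // Monitor-style class: every public method locks the same mutex,
    // so only one thread can execute inside the object at a time.
    class MonitorCounter {
    public:
        void increment() {
            std::lock_guard<std::mutex> guard(mtx_);  // mutual exclusion
            ++count_;
        }
        int get() const {
            std::lock_guard<std::mutex> guard(mtx_);
            return count_;
        }
    private:
        mutable std::mutex mtx_;  // mutable so const methods can lock it
        int count_ = 0;
    };

    int main() {
        MonitorCounter c;
        // Two threads hammer the counter; the per-method lock means
        // no increments are lost.
        std::thread t1([&c] { for (int i = 0; i < 1000; ++i) c.increment(); });
        std::thread t2([&c] { for (int i = 0; i < 1000; ++i) c.increment(); });
        t1.join();
        t2.join();
        std::cout << c.get() << "\n";  // prints 2000
    }
    ```

    A language with real monitor support would generate that locking for you; in C++ you have to remember it on every method yourself.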


  • Sometimes tech presentations make me feel really bad for the person giving them. They’re up there trying their best, but they clearly don’t have the skills to do more than communicate information, yet they still try to make their presentation cool and fun and it just falls flat.

    Anyone can be cool, but not everyone can be cool on demand or on stage.

    Though on the other hand, just because a presenter can pull off the cool factor doesn’t mean what they’re presenting is actually cool. The coolness of a presentation has no correlation with the coolness of what’s being presented, unless that coolness is just information about the product (and even then, they’re probably skipping over the flaws and enshittification).


  • Battery tech has improved a lot since they came out, and (at least for the other two systems mentioned, plus any other device I’ve replaced the battery for) you can often find batteries with better specs than the original, so they last longer between charges (can’t say yet about total lifetime).

    It’s also not a bad idea to check if your batteries have become danger pillows. The controller ones tend to be housed in hard plastic that makes it less obvious, but my Wii U battery did feel like it had a bit of a bulge, leading to a nervous period where I had gotten rid of the old battery but was still waiting for the new one to ship lol.



  • Business logic would be transformations to the data. Like for a spreadsheet, the data layer would handle the reading/writing of files as well as the storage of each cell’s content. The business logic layer would handle evaluating each of the formulas in the cells, and the presentation layer draws it on the screen.

    I think the part where it gets confusing is that each of these layers is pretty tightly coupled to the others. The destination of the presentation layer might change (one might show it in a GUI, another might print it, another might convert it to PDF or HTML), but each of those presentation layers needs to understand the data it’s presenting, so it’s tightly coupled to the data layer. Same with the business logic layer, though it’s tightly coupled on both the input and output sides. The design of the data layer constrains the possibilities of the other two, so it’s hard to draw a clear boundary between the layers; they all need to know how to walk the same data.

    My mental flow chart for this puts the data layer in the middle instead of the business logic: business logic sits to the side with arrows going both ways between it and the data layer, and the presentation layer also accesses the data layer directly. I suppose that’s a different permutation of what you described.

    Though another way to look at it does make sense. For a website, think of the database as the data layer, the server scripts as the business logic layer, and the client side scripts/html/css as the presentation layer. That one also follows the layered approach where the presentation layer is talking with the business logic layer.
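    To make the spreadsheet example concrete, here’s a toy sketch of the three layers (all names are illustrative, and the only “formula” is a trivial sum):

    ```cpp
    #include <iostream>
    #include <map>
    #include <string>

    // Data layer: owns the storage of each cell's content.
    struct DataLayer {
        std::map<std::string, double> cells;  // e.g. "A1" -> 3.0
    };

    // Business logic layer: transforms the data (a trivial SUM "formula").
    double evaluateSum(const DataLayer& data) {
        double sum = 0;
        for (const auto& [name, value] : data.cells) sum += value;
        return sum;
    }

    // Presentation layer: draws the result. This is the swappable part
    // (console here, but could be a GUI, a printer, PDF, HTML, ...).
    void presentToConsole(double result) {
        std::cout << "SUM = " << result << "\n";
    }

    int main() {
        DataLayer data;
        data.cells["A1"] = 3.0;
        data.cells["A2"] = 4.5;
        presentToConsole(evaluateSum(data));  // prints SUM = 7.5
    }
    ```

    Even in this tiny version you can see the coupling: both `evaluateSum` and any presentation code have to know how `DataLayer` stores its cells.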


  • Yeah, well-designed abstraction can help enable more concurrency. That said, concurrency isn’t easy at any point once there’s shared data that needs to be written to during the process. Maybe it’s not so bad if your language has good concurrency support (like monitor classes and such that handle most of the locking behind the scenes), but even then, there’s subtle pitfalls that can add rare bugs or crashes to your program.
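    One example of the kind of subtle pitfall I mean: even when every individual method is locked (monitor-style), composing two calls isn’t atomic. A sketch with illustrative names:

    ```cpp
    #include <iostream>
    #include <mutex>

    class SafeCounter {
    public:
        int get() const { std::lock_guard<std::mutex> g(m_); return n_; }
        void set(int v) { std::lock_guard<std::mutex> g(m_); n_ = v; }
        // The fix: do the whole read-modify-write as one locked operation.
        void incrementIfBelow(int limit) {
            std::lock_guard<std::mutex> g(m_);
            if (n_ < limit) ++n_;
        }
    private:
        mutable std::mutex m_;
        int n_ = 0;
    };

    int main() {
        SafeCounter c;
        // Racy composition (check-then-act): another thread could run
        // between get() and set(), so the limit can be exceeded or an
        // update lost, even though each call locks correctly on its own:
        //   if (c.get() < 10) c.set(c.get() + 1);  // BROKEN under concurrency
        c.incrementIfBelow(10);  // safe: atomic as a single locked method
        std::cout << c.get() << "\n";  // prints 1
    }
    ```

    That’s the kind of bug that passes every single-threaded test and then crashes or corrupts data once in a blue moon in production.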


  • Though on the flip side, remember that however old you are right now, it’s also the youngest you’ll ever be going forward.

    Feeling old in your 20s? Many people are active into their 60s, some keep going strong into their 90s.

    Unless you’re recovering from illness or injury, the current version of your body might be the best version you’ll ever see again.

    Though one suggestion that left my own body far more capable, if you’re the skinny and weak type, do some proper workouts. Proper as in spend the time to learn proper form and also ensure you’re getting enough energy and protein in your diet. You’ll gain strength that will stay with you until you do get really old (assuming your body doesn’t atrophy due to starvation or being bedridden before then).

    E.g., when I first started working out, I couldn’t curl 20 lbs and had to go down to 15. But I was curling 20 a week or two later, and was pushing 40 lbs a year or two after that. Then my workout habit dropped off, eight years have passed, and I can still curl over 30 lbs when I get curious in the dumbbell section of stores that carry them.