

It’s good to encourage reuse, which is eBay’s main thing. I wouldn’t have a reason to buy anything new from them, though.


I got my home server (Lenovo ThinkCentre, i7-6700) for $30, minus RAM or storage, at my local university surplus store a few years ago, and I have no regrets. I added a 256 GB SATA SSD, 16 GB of RAM, and an 8 TB HDD, all refurbished, for about $150 more when that was still cheap.
That one was, but the current one is not


If you’re playing games where trackpads are useful, there’s not really another option, so it’s automatically a good value. I know my Steam Deck experience would have been a whole lot worse without them, and I would probably never consider a gaming handheld without them. But for the gaming I do with an Xbox controller, I currently just use it for some video games at my computer (where I have access to a mouse anyway), and maybe split-screen with family. For that, I think my GameSir Cyclone 2 controller (at half the price) is an unambiguously better deal in the premium controller space. If I were using it mostly for couch gaming, that might put the Steam controller in a better position, or if I were mostly playing games that support whatever haptic trigger features Sony has, that controller might be in a better position.
One of the other more unique features is the tracking in the Steam Frame. It would be cool if they could standardize that sort of thing so it would work with other headsets. I wonder if they’ve considered that.


They can make it illegal to sell consumers certain types of hardware that allow you to install your own operating system. Of course, there is still virtualization. You can run Linux in a web browser nowadays (badly, but if there were a reason to improve it, I’m sure it would be done).


As an amateur computer graphics person: the best way to draw accurate stars is to just pre-render them onto a cubemap. But if you really need that subpixel worth of parallax to be completely accurate for every star, there are a couple of ways I can think of off the top of my head. With any of them, you’d want to make sure you only store position, size, and color, since stars are all spheres anyway. With effort, you can be very flexible with how these are stored (maybe 4 bits of color temperature, 4 bits of size, 3×32 bits of coordinates).
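To make that packing idea concrete, here’s a minimal sketch of a 13-byte star record: three float32 coordinates plus one byte holding the 4-bit color-temperature and size indices. The layout and names are my own illustration, not from any particular engine:

```python
import struct

def pack_star(x, y, z, temp_idx, size_idx):
    """Pack one star into 13 bytes: three float32 coordinates plus one
    byte carrying a 4-bit color-temperature index (high nibble) and a
    4-bit size index (low nibble). Indices point into small lookup
    tables elsewhere."""
    assert 0 <= temp_idx < 16 and 0 <= size_idx < 16
    attrs = (temp_idx << 4) | size_idx
    return struct.pack("<fffB", x, y, z, attrs)

def unpack_star(buf):
    """Inverse of pack_star: returns (x, y, z, temp_idx, size_idx)."""
    x, y, z, attrs = struct.unpack("<fffB", buf)
    return x, y, z, attrs >> 4, attrs & 0xF
```

A GPU version would likely pad this to 16 bytes for alignment, but the nibble-packing idea is the same.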
Worse ideas:
This is not well suited to most of the usual rendering techniques, because most stars are probably going to be much smaller than a pixel. Ray tracing would mean you need to just hit every star by chance (or artificially increase star size and then deal with tons of transparency); hardware rasterization is basically the same, and is additionally inefficient with small triangles. I guess you could just live with only hitting stars by chance and throw TAA at it; there are enough stars that it doesn’t matter if you miss some. That would react badly to parallax, though, and defeat the purpose of rendering every star in the first place.
It’s much more efficient to do a manual splatting thing, where for each star you look at what pixel(s) it will be in. You can also group stars together to cull out of view stars more efficiently. Subpixel occlusion will be wrong, but it probably doesn’t matter.
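A CPU-side sketch of that splatting idea, assuming a standard 4×4 view-projection matrix (all names here are illustrative; a real version would run on the GPU and handle stars straddling pixel boundaries):

```python
import numpy as np

def splat_stars(positions, luminances, view_proj, width, height):
    """Project each star with a 4x4 view-projection matrix and add its
    brightness into the pixel it lands in (additive blend, no occlusion)."""
    img = np.zeros((height, width), dtype=np.float32)
    ones = np.ones((positions.shape[0], 1), dtype=np.float32)
    clip = np.hstack([positions, ones]) @ view_proj.T   # to clip space
    w = clip[:, 3]
    visible = w > 0                                     # cull behind-camera stars
    ndc = clip[visible, :3] / w[visible, None]          # perspective divide
    lum = luminances[visible]
    inside = np.all(np.abs(ndc[:, :2]) <= 1, axis=1)    # frustum cull in x/y
    px = ((ndc[inside, 0] * 0.5 + 0.5) * width).astype(int).clip(0, width - 1)
    py = ((1 - (ndc[inside, 1] * 0.5 + 0.5)) * height).astype(int).clip(0, height - 1)
    np.add.at(img, (py, px), lum[inside])               # accumulate brightness
    return img
```

The grouping mentioned above would replace the per-star frustum test with a per-cluster one, so whole out-of-view clusters are skipped at once.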
This is all just for the viewport, though. Presumably there are other objects in the game besides stars, which need to have reflections on them of the stars. Then that becomes an entirely different problem.
The real answer, though, is that you wouldn’t try to render all of the stars, even if you want parallax. Maybe render some of the closer and larger ones as actual geometry, simplify a ton of stuff in the background, render things as volumes or 2D billboards, have a cubemap for the far distance, etc.
Edit: also, of course, this presumes you know the position, scale, and temperature of every star.
I also like the idea of baking all of the stars into a volume in spherical coordinates, centered around the origin


It’s clear that several of the people in charge of the YouTube livestream have no idea how to do it correctly. I think the difference is just effort. Viewership was tiny compared to Apollo 11, as was the hype leading up to it. It’s clear that NASA could provide far better footage, given that even a single YouTuber (Everyday Astronaut) can beat them. So that aspect is, as you said, because as a society we don’t really care about the Artemis launch. SpaceX does put a fair amount of effort into their livestreams, and you can easily tell by watching them.
For the recorded footage, film often has much higher dynamic range than digital cameras and usually looks a whole lot better when recording a launch up close.
Far shots are limited by atmospheric distortion and physical limits from diffraction for a given aperture size. None of that can change.
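For a sense of scale on the diffraction part, the usual rule of thumb is the Rayleigh criterion, θ ≈ 1.22 λ/D. A quick back-of-the-envelope helper (the numbers are illustrative, not tied to any real tracking camera):

```python
import math

def rayleigh_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh diffraction limit: theta ~ 1.22 * lambda / D radians,
    converted to arcseconds. Smaller is sharper."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600

# e.g. a 0.5 m aperture in green light (550 nm) resolves roughly
# 0.28 arcsec -- before atmospheric seeing makes it worse.
```

In practice atmospheric seeing (often an arcsecond or more) dominates long before the diffraction limit does.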
IDK anything about the quality of the original live broadcast of Apollo 11, so I don’t have anything to compare in that regard.


Yeah, I rewatched the launch from Everyday Astronaut’s livestream, and he actually had better footage; he had a tracking camera showing the booster separation.
Outside of the launch itself, I think it’s mostly because SpaceX has set the standard so high, with tons of high-resolution cameras streaming over Starlink even during reentry.


it’s a reference to this xkcd
edit: as an april fools thing probably


Still very fond of WBOR from when I was following the Internet Roadtrip
No one’s stopping you


Oh, yeah, I didn’t see those. I think my point still stands, though: really, those specular highlights shouldn’t be that bright, but the AI can figure out that it’s plausible for them to be brighter and that it would fit the target style better.


Yeah, probably the main reason it’s getting the little bit of praise it does is that they’re showing it off on games with fairly flat-looking skin shaders. Unfortunately, a problem with this sort of thing is that getting that “2023” image required giving a whole team a huge amount of time to model one man’s face. If you’re Bethesda and you just want to get NPCs into Starfield, it would be a similar amount of work. A bit less, since the first team already gave a talk on it, but still much more work than just getting a diffuse BRDF with some subsurface scattering and calling it good. And you also need a process that can be applied to every single NPC…
And looking at Striking Distance Studios, the company where that 2023 image is from:
“In February 2025, it was reported that most of the studio’s developers had been laid off.”
Yeah, I think it’s safe to say that the work those people put in will never be directly reused.
Another reason the DLSS version looks a bit more realistic there is the specular highlights on the eyes, for example. They probably aren’t reflecting anything real, or else they would be there in the original. But the AI knows that specular highlights add realism and are plausible in this scene, so it puts them there. That’s something an artist could do if given a specific shot and camera angle, but in the general case they can’t really do that without causing problems.


Fun fact that you may or may not have heard before: the light-flicker animation in Half-Life: Alyx is actually the exact same one used in the original Quake. Half-Life 1 was built on the Quake engine, and the same animation was carried over into Source and then Source 2.
https://www.alanzucconi.com/2021/06/15/valve-flickering-lights/
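For the curious, Quake’s light styles are just short strings of letters played back at 10 characters per second, where ‘a’ means off and ‘m’ means normal brightness. A minimal sketch of how such a string gets evaluated (the FLICKER string below is quoted from memory of the Quake source via the linked article, so double-check it there):

```python
def flicker_brightness(style, t, rate=10.0):
    """Evaluate a Quake-style light animation string at time t (seconds).
    Characters 'a'..'z' map to brightness: 'a' = 0 (off), 'm' = 1.0
    (normal), letters past 'm' are brighter than normal. The string
    advances at `rate` characters per second and loops."""
    idx = int(t * rate) % len(style)
    return (ord(style[idx]) - ord('a')) / (ord('m') - ord('a'))

# Quake's style 10 ("FLICKER"), as quoted in the article above:
FLICKER = "mmamammmmammamamaaamammma"
```

Because it’s deterministic and loops, every light using the same style flickers in sync unless the engine offsets the start time per light.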


I think with the straight/gay labels, you’re not going to stop being attracted to someone just because they say they’re a guy or a girl. So really there’s just some set of appearances you find attractive and some you don’t. For most people, those line up pretty well with femininity and masculinity, maybe with a few other restrictions on top. Any label is going to be a simplification; you can’t describe with one word the whole range of people you’re attracted to.



For reference


It does still seem very impressive against other top laptop CPUs.

Although I heard from Jeff Geerling’s review that the Neo often noticeably throttles after a few seconds.
It also has pretty terrible I/O.
I think the biggest attraction is the build quality, screen, etc. Most cheap laptops cheap out on those a lot in my experience, and Apple did not. If you’re not stressing the CPU or GPU, it’ll still feel almost as high quality as any other MacBook.


I have a 2-core, 2-thread, 4 GB RAM Celeron 3855U Chromebook that I installed Plasma on, and it’s usually pretty responsive.


Sounds like that’s planned but maybe not in yet.
You can get a used Quest 2 for like $100; not crazy, but not nothing. Unfortunately it’s not powerful enough to view the more detailed rooms or avatars, or so I’ve heard.