

At the datacenter scale Gaudi 3 was pretty good, at least when it came out.


Intel GPU support?
ZLUDA previously supported Intel GPUs, but not currently. It is possible to revive the Intel backend. The development team is focusing on high‑quality AMD GPU support and welcomes contributions.
Anyway, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.
oneAPI is decent, but apparently fairly cumbersome to work with, and people prefer to write software in CUDA since it’s the industry standard (and the standard in academia)


Intel’s Gaudi 3 datacenter accelerator from late 2024 advertises about 1800 TOPS in FP8, at 3.1 TOPS/W. Google’s mid-2025 TPU v7 advertises 4600 TOPS FP8, at 4.7 TOPS/W. Which is a difference, but not that dramatic of one. The reason the gap is so small is that GPUs are basically TPUs already; anecdotally, almost as much die space is allocated to matrix accelerators as to the actual shader units.
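The quoted numbers can be sanity-checked with quick arithmetic; a throwaway sketch (these are the advertised figures from above, so marketing numbers, not benchmarks):

```python
# Advertised FP8 figures quoted above (marketing numbers, not benchmarks).
gaudi3_tops, gaudi3_tops_per_w = 1800, 3.1
tpu_v7_tops, tpu_v7_tops_per_w = 4600, 4.7

# Implied power draw: TOPS divided by TOPS/W gives watts.
print(f"Gaudi 3 implied power: ~{gaudi3_tops / gaudi3_tops_per_w:.0f} W")
print(f"TPU v7 implied power:  ~{tpu_v7_tops / tpu_v7_tops_per_w:.0f} W")

# The raw-throughput gap is ~2.6x, but the efficiency gap is only ~1.5x.
print(f"Throughput ratio: {tpu_v7_tops / gaudi3_tops:.2f}x")
print(f"Efficiency ratio: {tpu_v7_tops_per_w / gaudi3_tops_per_w:.2f}x")
```

The efficiency gap being much smaller than the throughput gap is the point: the TPU advantage is mostly more silicon and power, not a fundamentally better architecture.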


It’s not even a pivot; they’ve already been focusing on AI. I’m sure they want it to seem like a pivot (to build up hype), since before, just having the hardware and software apparently wasn’t enough: nobody cared when the Gaudi cards came out, nobody uses SYCL or oneDNN, etc.


Although I like a lot of what Valve does (I have a lot of Steam games and Valve games, have a Steam Deck OLED, use SteamVR, etc.), they are a fairly flawed company. Sweeney is so good at shooting himself in the foot, though, that whatever opinion he voices, people will by default believe the opposite (and probably should)


Note that many game studios prioritize consoles first, and whatever room there is to turn the settings up or down is what makes it into the PC version. When benchmarking performance targets for a level, they’re going to use console hardware. They’re selling it for the Xbox Series S, so that’s probably what the devs consider the lower end of a decent experience. That means an A580, RX 6600, or 2060 Super will be more than good enough. The Series S only has 10 GB of shared CPU/GPU RAM, so the game will have to run on fairly limited RAM as well.


Decent, I think, as long as you don’t want to use XeSS
Lead is actually a slight concern, especially with new nozzles or abrasive filaments, as there’s usually a bit of lead in brass


yep, I got through about 50 hours of Subnautica at ~15 fps back in the day.
Even Portal with RTX is kinda playable on the Steam Deck if you use that config one person made to enable FSR 3, make some default settings changes, and put the GPU clock up to full
DM me if you want me to send you the monster-sorting program I made a couple of years ago that has pictures of all of the pages in the Monster Manual in it
As alternatives to WebKit/Chromium/Gecko browsers go, I like Ladybird’s speed of progress and their mentality of doing everything themselves (no external dependencies), but Kling’s political views are concerning. Servo is going slower but still making progress (it fell behind in implementing web standards), and both are kinda terrible in terms of speed AFAIK


Sure, I could definitely see situations where it would be useful, but I’m fairly confident that no current games are doing that. First of all, it is much easier said than done to get real-world data for that type of thing. Even if you manage to find a dataset with positions of various features across various biomes and train an AI model on it, in 99% of cases it will still take far more development time, and probably be far less flexible, than manually setting up rulesets, blending different noise maps, having artists scatter objects in an area, etc. It will probably also have problems generating unusual terrain types, which is a problem if the game is set in a fantasy world with terrain unlike what you would find in the real world. So then you’d need artists to come up with a whole lot of data to train the model with, when they could just be making the terrain directly. I’m sure Google DeepMind, Meta AI, or some team of university researchers could come up with a way to do AI terrain generation very well, but game studios are not typically connected to those sorts of people, even if they technically sit under the same parent company like Microsoft or Meta.
You can get very far with conventional procedural generation techniques: hydraulic erosion, climate simulation, maybe even a model of an ecosystem. And all of those things together would probably still be much more approachable for a game studio than some sort of machine-learning landscape prediction.
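As a toy illustration of the conventional-techniques point, here’s a minimal sketch (all parameters are made up for illustration) that blends a few “noise” octaves into a heightmap and then runs a crude droplet-style hydraulic erosion pass. Real pipelines use proper gradient noise (Perlin/Simplex) and far heavier simulation:

```python
import math
import random

SIZE = 64
random.seed(42)

def base_height(x, y):
    # Blend a few sine "octaves" as a cheap stand-in for fractal noise.
    h = 0.0
    for freq, amp in [(1, 1.0), (2, 0.5), (4, 0.25)]:
        h += amp * math.sin(freq * x * 0.2 + freq) * math.cos(freq * y * 0.2)
    return h

height = [[base_height(x, y) for x in range(SIZE)] for y in range(SIZE)]

def erode(heightmap, droplets=2000, deposit=0.05):
    # Each droplet walks downhill, moving a little material as it goes.
    for _ in range(droplets):
        x, y = random.randrange(SIZE), random.randrange(SIZE)
        for _ in range(30):  # cap each droplet at 30 downhill steps
            neighbors = [(x + dx, y + dy)
                         for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                         if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE]
            nx, ny = min(neighbors, key=lambda p: heightmap[p[1]][p[0]])
            if heightmap[ny][nx] >= heightmap[y][x]:
                break  # local minimum reached; droplet stops
            heightmap[y][x] -= deposit        # pick up material here
            heightmap[ny][nx] += deposit * 0.5  # deposit half downhill
            x, y = nx, ny

erode(height)
```

The whole thing is deterministic given the seed, tweakable with a handful of parameters, and needs no training data, which is roughly why studios reach for this kind of approach first.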


I don’t know of any games that use machine learning for procedural generation, and I would be slightly surprised if there are any. But there is a bit of a distinction there, because that generation has to happen at runtime, so it’s not something an artist could possibly be involved in.
You can download the Arch Wiki on Kiwix (for Android); it’s like 30 megabytes


Are there any alternatives that are decently fast for large files? My computer and my phone both get at least 300 Mbps from the router, and I have yet to find a local file-transfer application that comes anywhere near that for large files (destiny, LocalSend, KDE Connect; I might have tried others, I don’t remember)
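For comparison, a bare TCP transfer with large reads will usually saturate a LAN link; a minimal sketch (loopback demo, arbitrary chunk size, and none of the discovery or encryption layers the apps above add, which is often where their throughput goes):

```python
import os
import socket
import tempfile
import threading

CHUNK = 1 << 20  # 1 MiB reads keep per-syscall overhead low

def serve_file(srv, path):
    conn, _ = srv.accept()
    with conn, open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            conn.sendall(chunk)
    srv.close()

def fetch_file(port, dest):
    with socket.create_connection(("127.0.0.1", port)) as conn, \
         open(dest, "wb") as f:
        while chunk := conn.recv(CHUNK):
            f.write(chunk)

# Loopback demo with a 4 MiB file of random bytes.
fd, src = tempfile.mkstemp(); os.close(fd)
fd, dst = tempfile.mkstemp(); os.close(fd)
with open(src, "wb") as f:
    f.write(os.urandom(4 * 1024 * 1024))

srv = socket.create_server(("127.0.0.1", 0))  # bind before starting the thread
port = srv.getsockname()[1]
t = threading.Thread(target=serve_file, args=(srv, src))
t.start()
fetch_file(port, dst)
t.join()
```

Binding the listening socket before spawning the thread avoids the connect-before-listen race; on a real LAN you’d bind to the machine’s address instead of loopback.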


"You know, the new word is ‘affordability.’ Another word is just ‘groceries.’ It’s sort of an old-fashioned word but it’s very accurate. And they’re coming down"
such an eloquent speaker


I suppose it depends on if you count conservation as philanthropy. Like I said though, it’s not that significant compared to his overall wealth.


But it seems like almost every other storefront operates under those margins for digital sales (not just in gaming)
Notably, Epic Games takes only a 12% cut, and 0% of the first $1 million in sales (effectively 0% for the vast majority of indie games). A cynical take is that they’re just doing this to attract developers to their store, which is almost certainly true, but it doesn’t necessarily mean they’ll take a higher cut if they become dominant. Unfortunately, the Epic Games platform is missing most of the extra features Steam has (built-in streaming, family sharing, input binding, Big Picture mode, etc.)
Tim Sweeney, CEO of Epic Games, is about 80% as wealthy as Gabe Newell and has done much more philanthropy, although it probably represents less than one percent of his net worth.
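The cut structures described above work out roughly like this (a quick sketch; Steam’s tiered cut reductions at higher gross revenue are ignored for simplicity):

```python
# Developer take-home under the cut structures described above:
# Steam's standard 30% vs Epic's 12% with the first $1M waived.

def steam_net(gross):
    return gross * 0.70  # developer keeps 70%

def epic_net(gross):
    waived = min(gross, 1_000_000)           # 0% cut on the first $1M
    return waived + (gross - waived) * 0.88  # 12% cut beyond that

for gross in (250_000, 1_000_000, 10_000_000):
    print(f"${gross:>10,} gross -> Steam ${steam_net(gross):>12,.0f},"
          f" Epic ${epic_net(gross):>12,.0f}")
```

For a game grossing under $1M, the difference is the entire 30%, which is why the waiver matters so much more to indies than the headline 12% does.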


People are saying that The Witcher 3 works really well with the Winulator app (it uses Wine and Box86, which I’ve heard usually performs a tiny bit better than FEX, what Valve is using, at the cost of occasional inaccuracies)

Not disagreeing, but if you just want to run The Witcher 3 on your phone, you can do it right now
Yes, it works out to a ton of power and money, but on the other hand, 2x the computation might only give you a few percent better results. So it’s often about orders of magnitude, because that’s what’s needed for a sufficiently noticeable difference in use.
Basing things on theoretical TOPS is also not particularly equivalent to performance in actual use; it just gives a very general idea of a perfect workload.
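The "2x compute for a few percent" point can be sketched with a toy power-law scaling curve; the exponent here is an assumed illustrative value in the rough ballpark of published LLM scaling-law fits, not a measured number:

```python
# Toy power-law scaling: loss proportional to compute ** -alpha.
# alpha = 0.05 is an assumed illustrative exponent, not a measured fit.
alpha = 0.05

def relative_loss(compute_multiple):
    # Loss relative to the baseline after scaling compute by this factor.
    return compute_multiple ** -alpha

for mult in (2, 10, 100, 1000):
    improvement = (1 - relative_loss(mult)) * 100
    print(f"{mult:>5}x compute -> ~{improvement:.1f}% lower loss")
```

Under this curve, doubling compute shaves off only a few percent, while each further order of magnitude is needed for another visible chunk of improvement, which matches the "orders of magnitude" framing above.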