Off-and-on trying out an account over at @tal@oleo.cafe due to scraping bots bogging down lemmy.today to the point of near-unusability.

  • 20 Posts
  • 1.02K Comments
Joined 2 years ago
Cake day: October 4th, 2023

  • I generally agree that improving mod accessibility to the public is desirable.

    I just stuck maybe a couple hundred mods into Starfield this week using Creations, including a number of paid ones. I’m fine with paid mods, but Bethesda still needs to deal with some basic issues.

    1. While I had fewer problems than I had with installing mods on prior Bethesda games using third-party mod managers, the need to troubleshoot hasn’t gone away. I installed some high-resolution texture mods that crashed Starfield shortly after boot. Bethesda doesn’t detect crashes in that scenario or offer a way to “roll back” to a “safe mode” or anything like that. I poked around a bit, and, as with their prior games, Starfield has a plugins.txt containing a list of loaded modules, and one can just remove the leading asterisk to disable any of them. But that’s going to be unacceptable for general use if you want all players to have access to mods. Either troubleshooting has to be pretty idiot-proof, or it has to not be necessary at all. You definitely can’t put someone in a situation where they effectively can’t access the mod manager.

    2. For more-advanced users, troubleshooting tools still aren’t great. Bethesda would benefit from something that can at least do a binary search for a breaking mod: turn off the latter half of a problematic mod list and see if the problem goes away. If it does, the problem is in the latter half; repeat for that half. If it doesn’t, the problem is in the first half; repeat for that half. Various tools that I’ve used in the past can do this, like git bisect. Conflict Catcher on classic Mac OS had a particularly good implementation that could detect multiple extensions that conflicted with each other; I’ve never seen another tool do this.

    3. Bethesda doesn’t, AFAICT, do adult mods in their own mod repository, which are popular for a number of their prior games. Nexusmods carries things that Creations doesn’t. LoversLab carries things that Nexusmods doesn’t. I appreciate if Microsoft doesn’t want to be in the business of distributing adult mods. However, I am confident that a lot of people would like to use those, as with prior Bethesda games. If one wants a lower bar to use, not requiring use of external mod managers would be desirable. I do think that extending the in-game mod manager to support external mod repositories would lower the barrier there. If Bethesda wants their game to be a platform, then that means more stuff strengthens the platform.

    4. Loading time still increases as mod count rises, as with prior Bethesda games. It can easily take minutes. It should be possible, at bare minimum, to put up a progress bar estimating how long the load will take based on prior loads, if the mod list hasn’t changed. Personally, I’d like to see the load time reduced. If they have to validate content or build an index, do that work once when the mod list changes and cache the result.

    5. It’d be nice to have a “recommends” option. That is, if a mod requires another mod, when installing the first mod, ask the user if they want to install the latter mod. Nexusmods can do this. Bethesda’s Creations can’t — they will keep one from enabling an installed mod with missing dependencies, but the user basically needs to read mod descriptions and install appropriate dependency mods. That’s a barrier to use.

    6. Bethesda’s Creations store just has abysmal filtering options. I get that it’s for a single game, and so it’s hard to amortize costs, but browsing through what’s there is atrocious. You don’t have the ability to apply multiple criteria when searching for mods.

    7. The Creations store always re-downloads the list of Creations, instead of caching it. Exit Creations and go back in and everything gets re-downloaded again. This is obnoxious.

    8. I understand that there are some technical limitations associated with the Creations mod manager. The Dark Mode Terminal mod, for example, says that the Creations release cannot work around a bug associated with changing mod load order that the Nexus release doesn’t have a problem with.

    9. One popular thing to make as mods in many games is skins or cosmetic changes, like to clothing or the like. Fallout: New Vegas and Fallout 4 had “cinematic kills”, where sometimes the camera would pan away, allowing one to see one’s own character. Starfield doesn’t do this, which means that there are few opportunities to see one’s character unless one leaves the camera in third-person (which is generally not great from a gameplay standpoint). This is an issue that I’d also say applies to Cyberpunk 2077’s clothing options — lots of work went into creating many clothing options, but one so rarely actually sees oneself in the game that they have little impact. Ditto for a number of cosmetic options, like hairstyle and the like. I think that it’d be beneficial for cosmetic mods if the developers could work some way of seeing one’s own character more frequently into the game.

    10. For Fallout 76, Bethesda made money by mostly selling cosmetic items used by people who want to build themed player CAMPs. I was never personally very interested in building elaborate CAMPs just for the sake of looks, though clearly there are some people who are. However, my take is that these items were generally quite expensive compared to the cost of assets in the base game, though I’ll admit that I don’t know what volume they sell at. At least for me, the idea of paying for more content and functionality, to keep expanding that aspect of the game, is interesting. Buying cosmetic clutter items isn’t terribly interesting. I’m sure that they gather statistics on what players actually get, and Starfield’s Creations seem to me to have a different focus than the Fallout 76 Creations, so that’s good so far as it goes from my standpoint. My own interest would be in, say, getting new handcrafted cities and quests and the like. Getting a new player home or a different style of couch to put in it doesn’t really interest me nearly as much.

    11. I am pretty convinced that if Bethesda wants to have a low bar to extensively-modded games for the general playerbase, they’re going to need native support for something like Wabbajack. That lets one player or team assemble a curated mod collection, and then lets other users install it en masse. I’m not saying that using mods should end there, but my experience is that, years after a Bethesda game has come out, there are many different mods in similar areas (e.g. relighting mods, say). Some of those are successor mods. Some have advantages and drawbacks. Trying to evaluate what the best “current” choice is for a wide variety of mod types is a large task. Letting someone just choose from a list of “mod collections” and easily install all of the mods in a collection would, I think, greatly lower the bar to getting players able to use many mods at once.
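    The binary search in point 2 is simple enough to sketch. This is a hypothetical illustration, not anything Bethesda ships: `crashes` stands in for launching the game with only the given modules enabled in plugins.txt, and it assumes exactly one breaking mod with no interactions between mods.

```python
def bisect_bad_mod(mods, crashes):
    """Find a single breaking mod by repeated halving.

    mods: ordered list of module names (e.g. from plugins.txt).
    crashes(enabled): hypothetical callback that launches the game with
    only the modules in `enabled` active and reports whether it crashes.
    Assumes exactly one bad mod and no mod-to-mod interactions.
    """
    lo, hi = 0, len(mods)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Launch with only the first half of the remaining suspects enabled
        if crashes(mods[lo:mid]):
            hi = mid  # the culprit is in that first half
        else:
            lo = mid  # it must be in the second half
    return mods[lo]
```

    Each iteration costs one game launch, so a 200-mod list needs about eight launches instead of up to 200.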


  • One thing that annoys me about loading animations designed to conceal the game needing to load is that — especially with PC games, as the game gets played on faster computers — the bottleneck may become the animation completing rather than the actual loading.

    Static loading screens don’t have this problem.

    I kind of like Fallout 4’s approach of putting a single model up that you can rotate and look at while something is loading. It’ll add to the loading time a bit, but at least there’s something the player can fiddle with for a few seconds.





  • And over the past several months, the team and I have spent a great deal of time analyzing your feedback. What came through was the voice of people who care deeply about Windows and want it to be better.

    Today, I’m sharing what we are doing in response. Here are some of the initial changes we will preview in builds with Windows Insiders this month and throughout April.

    More taskbar customization, including vertical and top positions: Repositioning the taskbar is one of the top asks we’ve heard from you. We are introducing the ability to reposition it to the top or sides of your screen, making it easier to personalize your workspace.

    I actually have seen people here complaining about the Windows 11 taskbar, and I believe I recall someone specifically raising this limitation. Like, they probably are addressing things that Windows users care about.


  • He could probably run an NFS server that isn’t a closed box, and have that just use the Synology box as storage for that server. That’d give whatever options Linux and/or the NFS server you want to run have for giving fair prioritization to writes, or increasing cache size (like, say he has bursty load and blows through the cache on the Synology NAS, but a Linux NFS server with more write cache available could potentially just slurp up writes quickly and then more-slowly hand them off to the NAS).

    Honestly, though, I think that a preferable option, if one doesn’t want to mess with client global VM options (which wouldn’t be my first choice, but it sounds like OP is okay with it), is just to crank up the timeout options on the NFS clients, as I mention in my other comment, assuming he just doesn’t want timeout errors to percolate up and doesn’t mind the NAS taking a while to finish whatever it’s doing in some situations. It’s possible that he tried that, but I didn’t see it in his post.

    NFSv4 has leases, and — I haven’t tested it, but it’s plausible to me from a protocol standpoint — it might be possible to set things up such that as long as a lease can be renewed, outstanding file operations don’t time out, even if they’re taking a long time. The Synology NAS might be able to avoid taking too long to renew leases, and thus causing clients to time out on that, as long as it’s reachable, even if it’s doing a lot of writing. That’d still let you know if your NFS server wedged or you lost connectivity to it, because your leases would go away within a bounded amount of time, but it might not put a time limit on how long other operations take to complete. No guarantee; it’s just something that I might go look into if I were hitting this myself.


  • That’s a global VM setting, which is also going to affect your other filesystems mounted by that Linux system, which may or may not be a concern.

    If that is an issue, you might also consider the following — I haven’t tested these, but would expect them to work:

    • Passing the sync mount option on the client for the NFS mount. That will use no writeback caching for that filesystem, which may impact performance more than you want.

    • Increasing the timeo= and/or retrans= NFS mount options on the client. These will avoid having the client time out and decide that the NFS server is taking excessively long (though an operation may still take longer to complete if the NFS server is slow to respond).
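    To make the second bullet concrete, these go in the mount options. A hypothetical /etc/fstab line (server name, export path, and values are placeholders; see nfs(5) for the exact semantics):

```
# Hypothetical example; nas.example.com and the paths are placeholders.
# timeo= is in tenths of a second, so timeo=1200 waits 120s per try;
# retrans=5 retries five times before reporting "server not responding".
nas.example.com:/volume1/data  /mnt/data  nfs  vers=4.1,timeo=1200,retrans=5  0  0
```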




  • I think that kisses having magic powers is just something of a general theme for stories at the time and place that the Brothers Grimm were collecting folklore, not something gender-specific.

    https://en.wikipedia.org/wiki/Sleeping_Beauty

    “Sleeping Beauty” (French: La Belle au bois dormant, or The Beauty Sleeping in the Wood;[1][a] German: Dornröschen, or Little Briar Rose, Italian: La Bella Addormentata), also titled in English as The Sleeping Beauty in the Woods, is a fairy tale about a princess cursed by an evil fairy to sleep for a hundred years before being awakened by a handsome prince.

    The version collected and printed by the Brothers Grimm was one orally transmitted from the Perrault version,[10] while including its own attributes like the thorny rose hedge and the curse.[11]

    There, it’s a prince’s kiss that breaks a curse.

    https://en.wikipedia.org/wiki/The_Frog_Prince

    “The Frog Prince; or, Iron Henry” (German: Der Froschkönig oder der eiserne Heinrich, literally “The Frog King or the Iron Henry”) is a German fairy tale collected by the Brothers Grimm and published in 1812 in Grimm’s Fairy Tales (KHM 1).

    There, it’s a princess’ kiss.

    EDIT: Though I suppose one could take issue with the disproportionate-to-population level of royalty involved in doing all this kissing.

    EDIT2: You know, oddly enough, I’m racking my brain and I can’t think of present-day legends and stories where kisses do magical or supernatural things. There are some characters I can think of where a kiss might have some incidental effect — I’m pretty sure that I vaguely remember there being some Marvel Comics X-Men story where Rogue kisses her boyfriend and puts him in a coma, as an incidental effect of skin-on-skin contact. There are some kiss-adjacent things, like vampire stories where a kiss segues into a bite on the neck. But magical kissing seems to be out-of-vogue today.



  • Oh, this is neat.

    I do kind of wish that there was more of a summary of how it works on the page from a user standpoint. For example, the page links to niri, which the author previously used. That describes the basic niri paradigm right up top:

    Windows are arranged in columns on an infinite strip going to the right. Opening a new window never causes existing windows to resize.

    Every monitor has its own separate window strip. Windows can never “overflow” onto an adjacent monitor.

    Workspaces are dynamic and arranged vertically. Every monitor has an independent set of workspaces, and there’s always one empty workspace present all the way down.

    The workspace arrangement is preserved across disconnecting and connecting monitors where it makes sense. When a monitor disconnects, its workspaces will move to another monitor, but upon reconnection they will move back to the original monitor.

    I mean, you can very quickly skim that and get a rough idea of the way niri would work if you invested the time to download it and get it set up and use it.

    It does say that reka uses river, and maybe that implies certain conventions or functionality, but I haven’t used any river-based window managers, so it doesn’t give me a lot of information.


  • Game streaming services are never going to catch on because the capital needed to build out the infrastructure is ridiculous.

    I don’t know about “never”, but I’ve made similar arguments on here predicated on the cost of building out the bandwidth — I don’t think that we’re likely to get to the point any time soon where computers living in datacenters are a general-purpose replacement for non-mobile gaming, just because of what it costs to move pixels from datacenter to monitor. Any benefit from having a remote GPU doesn’t compare terribly well with the cost of effectively running a monitor-to-computer cable to the nearest datacenter for every computer that might be in concurrent use.

    But…I can think of specific cases where they’re competitive.

    First, where power is your relevant constraint. If you’re using something like a cell phone or other battery-powered device, it’s a way to deal with power limitations. I mean, if you’re using even something like a laptop without wall power, you probably don’t have more than 100 Wh of battery power, absent USB-C and an external powerstation or something, due to airline restrictions on laptop battery size. If you want to be able to play a game for, say, 3 hours, then your power budget (not just for the GPU, but for everything) is something like 30W. You’re not going to beat that limit unless the restrictions on battery size go away (which…maybe they will, as I understand that there are some more-fire-safe battery chemistries out there).

    And cell phone battery constraints are typically even tighter, like 20 Wh. That means that for three hours of gaming, your power budget, because of size constraints on the phone, is maybe about 6 watts.
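    The arithmetic behind those budgets is just capacity divided by session length (the 100 Wh and 20 Wh figures are the rough assumptions from above):

```python
def sustained_watts(battery_wh, hours):
    # Whole-device power budget if the session must run on battery alone
    return battery_wh / hours

laptop_w = sustained_watts(100, 3)  # ~33 W for a 100 Wh laptop pack
phone_w = sustained_watts(20, 3)    # ~6.7 W for a 20 Wh phone battery
```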

    If you want power-intensive rendering on those platforms, remote rendering is your only real option.

    Second, there are (and could be more) video game genres where you need dynamically-generated images, but where latency isn’t really a constraint. Like, a first-person shooter has some real latency constraints. You need to get a frame back in a tightly bounded amount of time, and you have constraints on how many frames per second you need. But if you were dynamically-rendering images for, I don’t know, an otherwise-text-based adventure game, then the acceptable time required to get a new frame illustrating a given scene might expand to seconds. That drastically slashes the bandwidth required.

    What I don’t think is going to happen in the near future is “gaming PC/non-portable video game consoles get moved to the datacenter”.



  • I don’t know what the situation is for commercial games — I don’t know if there’s a marketplace like that — but I do remember someone setting up some repository for free/Creative Commons assets a while back.

    goes looking

    https://opengameart.org/

    It’s not highly-structured in the sense that someone can upload, say, a model in Format X and someone else can upload a patch against that model or something like that with improvements and changes, though. Like, it’s not quite a “GitHub of assets”.

    I haven’t looked at it over time, but I also don’t think that we’ve had an explosion in inter-compatible assets there. Like, it’s not like a community forms around a particular collection of chibi-style sprite artwork at a particular resolution, and then lots of libre games use those assets, the way RPGMaker or something has collections of compatible commercial assets.

    I’m sure that there must be some sort of commercial asset marketplace out there, probably a number, though I don’t know if any span all game asset types or if they permit easily republishing modifications. I know that I’ve occasionally stumbled across a website or two that have individuals sell 3D models.


  • My first question is “why is that the case?”

    Like, is FAIR being (rationally) chosen because people simply cannot afford the private plans on offer, and private plans don’t provide a minimal-enough level of coverage? If so, maybe the problem is actually that we need more availability of housing, that people are financially-stretched too far.

    Or are people irrationally getting too little fire insurance, and FAIR just provides an opportunity to do that? Then you’d think that we should improve the information available about fire insurance plans.

    Or is FAIR providing better cost-for-value, in which case one would want to look at a breakdown of why private plans cost more — like, is the market not competitive?

    My own gut guess is that the most-likely largest culprit is the first: I am comfortable saying that California has a very real undersupply of housing, which makes housing highly unaffordable there and puts people under greater financial pressure. More housing would reduce housing prices, which would let people spend less on housing, be less financially stretched, and, among other things, spend more on insurance for that housing. I don’t know that that’s the dominant factor, but I’m pretty sure that it is a factor.

    searches for an affordability metric

    https://www.affordabilityindex.org/rankings/states/

    On this metric, California ranks #43 out of 50 states plus DC on housing affordability, measuring what housing costs relative to income. That’s not the bottom of the bin, so it’s likely that one can’t chalk it up only to that, but it isn’t great, and I’d bet that it is a substantial factor.

    takes another look at the metric

    I’d also guess that it’s pretty good odds that the ratio being computed (income to price) uses pre-tax income, and California is exceptionally high in absolute cost of housing among the states, second only to Hawaii. Because the income tax system is progressive, each additional dollar of income does less to make housing affordable as income rises; some of it effectively goes towards subsidizing the standard of living in states with lower median incomes. So you’d expect the affordability issues in California to be more severe than the raw ratio suggests: California’s higher incomes do less real-world work against its housing prices than the metric’s ratio implies.

    EDIT: This affordability metric ranks California at #47 on housing affordability out of 50 states plus DC.

    https://www.realtor.com/research/state-report-cards-2025/

    It’s also based on median (I assume pre-tax) income relative to house price.




  • I think that you have two factors here. GDC isn’t specific to PC gaming, and additionally, a lot of titles will see both PC and console releases.

    For a game that is intended to see only a PC release, my guess is that that might affect the system requirements of the game.

    For games that see console releases, this manifests as questions like “will fewer people have consoles?” — because current-gen consoles are very unlikely to change spec, just price. And “is the PlayStation 6 going to be postponed?” is a big deal if you were going to release a game for that hardware.