• rolandtb303@lemmy.ml · 4 minutes ago

    self-contained, offline

    While on one hand I do like the all-in-one UI and services, I feel it could have been done better without hosting a mini web server just to serve everything over localhost.

    Most if not all of the tools here are based on snapshots of online websites running in a browser, with Docker on top. While the intention is good and there are some neat ideas in here, why not just bundle native, offline FOSS programs that already do the job? For instance, CyberChef can be replaced with the respective Linux programs (e.g. base64, hexdump, grep, awk/sed, and gpg, just to name a few; graphical versions of these exist as well, so it’s not like you need the terminal, it’s just the most versatile environment for this type of thing). No need for a web server or anything.
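    The kind of one-liners I mean look something like this (a rough sketch using GNU coreutils; the CyberChef operation names in the comments are just for orientation, and exact flags may vary by distro):

    ```shell
    # CyberChef "To Base64" / "From Base64"
    printf 'hello' | base64          # aGVsbG8=
    printf 'aGVsbG8=' | base64 -d    # hello

    # CyberChef "To Hex" — dump bytes of any input
    printf 'hi' | hexdump -C

    # CyberChef "Filter" — keep only lines matching a pattern
    printf 'foo\nbar\n' | grep 'bar'

    # CyberChef "Find / Replace" — simple substitution
    printf 'foo\n' | sed 's/foo/baz/'
    ```

    All of these compose with pipes, which is exactly the "recipe" model CyberChef mimics in the browser.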

    However, I will say the offline Wikipedia and maps are cool; unfortunately, they’re the only neat things in this project.

    Now let’s get to the point: an AI chatbot. What, does the dev think we have money to burn? Much less if SHTF and NVIDIA RTX GPUs are scrapped for metal (which they should be anyway)? I know it’s local, and that it presumably ships pre-trained so it has a “100% guarantee” of not huffing its own fumes and hallucinating, but given the sheer power draw and the resources it hogs just trying to spit out an answer, a plain search engine could do just as well, and it won’t tie up your GPU while at it. That’s not even getting into the current SSD/GPU/RAM situation.

    The project’s own recommended spec sheet on its front page says 32 GB of RAM; that’s a bit steep. The 1 TB SSD I could kind of see, though if most of the information is text you don’t really need 1 TB. Better safe than sorry, but that’s still pretty expensive at today’s prices. When SHTF, do you really think most people will be rocking killer rigs with 8/16-core CPUs, 32+ GB of RAM, and an RTX GPU? The millionaires and spoiled gamers who already have those, sure, but the masses? They’ll mostly be using laptops with 4-6 cores, 8 GB of RAM, and a mid-range GPU if they’re lucky, or integrated graphics.

    Sure, you can say that having AI in it is somehow beneficial and tout how “everyone is using it”, but don’t get all pissy when your power bank runs out of juice at the worst time, let alone when word gets out, your place gets raided, and your 20-year-old 5090 is turned into scrap. All because you thought AI was good enough.

    All in all, it’s a good premise, but it could be executed way better than snapshotting websites, slapping AI on top, and calling it a day.

  • Jack@slrpnk.net · 1 hour ago

    Sounds like a very cool idea, the implementation… not so much.

    Also, why would you use a chatbot in a survival situation when it hallucinates and can be replaced with a simple search in this case?

  • solrize@lemmy.ml · 2 hours ago

    Too much AI, too much Internet dependence. It needs to be a complete distro including all the data, with a single-click install from a USB stick and no downloading. IDK if Khan Academy even allows that.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP · 2 hours ago

      What internet dependence? It’s completely offline once you install it. Also, nobody is forcing you to use the bundled LLM. Stop perseverating.

  • Señor Mono@feddit.org · 3 hours ago

    They lost me at AI…

    But yeah, having Wikipedia and offline maps stored away (e.g. for offline use on an Android phone) is a great thing. I use a thumb drive instead of an interconnected thingy with a “command centre”. Feeling old now.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP · 2 hours ago

      Actually, having a local agent on the system, with a bunch of data on it, that helps you find things would be great for most people. People really need to get over the knee-jerk reaction to all things AI-related. It’s getting really tiresome.

      • Señor Mono@feddit.org · 44 minutes ago

        People pigeonholing me is also getting tiresome.

        I use coding agents as a tool for software development and to summarize complex matters. That doesn’t mean I like having AI “features” crammed into everything.

        And there are also the valid ethical questions that AI apologists and enthusiasts tend to dream away.

        Edit: mind that this applies especially if the stack is advertised for survival. The more moving parts, the higher the chance of a critical failure.

        • ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP · 37 minutes ago

          You really don’t understand how having an AI agent look through your wiki and quickly pull up relevant links is useful? Also, what higher chance of failure are you talking about? It’s not like it’s an integral part of the system; the whole Ollama component is entirely opt-in. At least try to make some sense here.

          • Señor Mono@feddit.org · 8 minutes ago

            In that particular case, no.

            My setup is searchable ZIM files on a pen drive; I open them with a viewer. No servers involved, no handoff of data. More like a book of last resort.

            Maps are OpenStreetMap downloads in OsmAnd.

            From my perspective, there is no need to over-engineer a basic tool.

            For example, take a look at those TEOTWAWKI preppers. They hoard books, ebooks, literature, and knowledge in the most robust and reliable way possible. No need to add a tech stack that might or might not yield the proper answer, and might or might not hide important facts in its summary.

  • webghost0101@sopuli.xyz · 2 hours ago

    I love the concept, and I have even been working on something similar, but… big buts:

    Recommending Ubuntu? While many are moving away from it.

    AI chat with Ollama as a prominent feature? Controversy aside, this survival computer had better pack some serious hardware, which will draw more of your precious, possibly limited power.

    A note-taking app? Besides, Ubuntu, which it intends to run on, presumably already includes something that works with markdown, and any computer with a terminal can take notes as far as I know.

    Hardware scoring and community leaderboard? wtaf

    Things like offline Wikipedia in Kiwix are indeed pretty cool, but in general the way this software describes itself feels sloppy and based more on vibes than anything thought out.