• 3 Posts
  • 1.12K Comments
Joined 4 years ago
Cake day: January 17th, 2022

  • It’s a pragmatic compromise. The assumption is that Google is not literally evil, just a very large advertising company that subsidizes very cool hardware in order to sell more ads. It’s the same principle as using a rooted Meta Quest when one doesn’t even have a Facebook or WhatsApp account.

    I imagine that everybody in that situation will move to Motorola or the Valve Frame when those become available. Until then, the bet is that the hardware has no backdoors, because so far nobody has disclosed any.

    If you are really into trustable hardware, I recommend checking https://precursor.dev/ and similar initiatives.

    I did mention Linux phones too but again that’s not for everyone.

    IMHO it’s much better to use a deGoogled GrapheneOS Android device today, knowing its limitations, than to use a Googled Android device today, Pixel or not, complaining about all its limitations while waiting for a theoretically better solution that simply isn’t available yet.



  • Yes, I have a PinePhone and a PinePhone Pro, both with postmarketOS, so doing this is as easy as a few sudo apk add packagename or sudo apk del firefox commands.

    Now… if you want a daily driver then, as a few others hinted at, it’s much harder. If deGoogled Android is an acceptable compromise for you, I would instead get a second-hand Pixel 8 or later, install GrapheneOS on it, and remove the browser; that’s pretty much it already, since it doesn’t come with an app store or equivalent. Well, there is the GrapheneOS equivalent, but it had roughly 10 apps at most last time I checked.







  • IMHO the answer depends on:

    • who you are (boring, rando, political dissident, journalist, etc)
    • who you talk to (family, friends, work, etc)
    • what alternatives actually exist

    So… sure, Signal is not perfect, but if you can’t convince your family members to move to DeltaChat, it sure beats using WhatsApp, Telegram, etc.







  • utopiah@lemmy.ml to Privacy@lemmy.ml · Claude: papers please?

    IMHO LLM usage isn’t coherent with independence. That being said, I wrote quite a bit on self-hosting LLMs. There are quite a few tools available, like ollama, itself relying on llama.cpp, that can both work locally and provide an API-compatible replacement for cloud services. As you suggested though, at home one typically doesn’t have the hardware, GPUs with 100+ GB of VRAM, to run the state of the art. There is a middle ground, though, between full cloud (API key, closed source) and open source at home on low-end hardware: running SOTA open models in the cloud. It can be done on any cloud, but it’s much easier to start with dedicated hardware and tooling; for that, HuggingFace is great, but there are multiple options.
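    The “API-compatible replacement” part can be sketched like this. ollama exposes an OpenAI-compatible endpoint on its default port 11434, so swapping cloud for local is mostly a matter of changing the base URL; the model names below are just examples, not a recommendation:

    ```python
    import json

    # Sketch, assuming ollama's default local endpoint. ollama serves an
    # OpenAI-compatible API under /v1, so the same request shape works
    # against either target; only the base URL and model name change.
    LOCAL_BASE_URL = "http://localhost:11434/v1"   # ollama default
    CLOUD_BASE_URL = "https://api.openai.com/v1"   # a cloud provider

    def chat_request(base_url, model, prompt):
        """Build the HTTP request an OpenAI-compatible client would send."""
        return {
            "url": f"{base_url}/chat/completions",
            "body": json.dumps({
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            }),
        }

    # Identical call site for local and cloud -- that's the whole point:
    local = chat_request(LOCAL_BASE_URL, "llama3.2", "Hello")
    cloud = chat_request(CLOUD_BASE_URL, "gpt-4o", "Hello")
    print(local["url"])  # http://localhost:11434/v1/chat/completions
    ```

    In practice you would point any OpenAI-compatible client library at the local base URL instead of hand-building requests; the payload above just shows why the swap is cheap.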

    TL;DR: closed cloud -> open models on clouds -> self-hosted provides a better path to independence, including training.