Do you have any experience with Linux viruses? Have you had your own Linux installations infected with viruses or malware?
You sure though?
What do you want? It should go without saying that I am absolutely sure of my own experience.
In probably 15 years total of running Linux I have not had a single problem with malware or viruses. For part of that time I was also running Windows regularly, and my Windows systems DID become infected with both malware and viruses occasionally, despite my best efforts. And you’re not mentioning the fact that Linux runs on 63% of the server market and those systems are under constant attack.
Reports of Linux system infections are truly rare, and considering the nature of the user community, they would be widely and loudly reported if they were happening.
Do you have any experience in this matter? Have you had your own Linux installations infected, or are you a Windows user questioning what you’re reading? (Perfectly reasonable if the 2nd one’s the case.) Please fill us in on the details.
There can be a period after graceful shutdown where the UPS is still running and the server will not restart if mains power comes back on. Not a likely scenario, but for apps you can’t afford to have down it’s something to consider.
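If you're running NUT, one common mitigation (sketched below with made-up UPS and user names) is to have the shutdown scripts tell the UPS to cut its own output once the OS has halted. That way the outlet actually loses power, and when mains returns the UPS re-energizes it and the server's "restore on AC power" BIOS setting can boot the machine again:

    # /etc/nut/upsmon.conf (illustrative names and paths)
    MONITOR myups@localhost 1 upsmon secret master
    SHUTDOWNCMD "/sbin/shutdown -h +0"
    POWERDOWNFLAG /etc/killpower

    # Late in the halt sequence (most distro NUT packages already hook this in):
    if [ -f /etc/killpower ]; then
        /sbin/upsdrvctl shutdown    # ask the UPS to kill its output power
    fi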
I’ve used Linux Mint and other distros daily for more than 10 years. Never had a virus or malware issue and don’t even run antivirus software.
During that same time I’ve had to help friends remove viruses and malware from their Windows machines dozens of times. The latest Windows disaster I’ve assisted with was a few months ago. A retired friend had her Windows 10 machine hijacked and $8K stolen from her savings account. Making sure the malware was removed required hours of work formatting the drive and reinstalling Windows.
IMO you are far safer with a plain vanilla Linux install than you are with Windows, no matter what steps you take to secure your Windows installation.
Every wifi-enabled device we own that’s connected to the Internet can be precisely located by the companies involved, even when using a VPN.
If you have an Android phone you’ve probably noticed a prompt at some point asking for your permission to transmit precise location information and enable wifi scanning. The wifi SSIDs and MAC addresses your phone sees, along with its GPS location, are sent back to Google. The combination of all that information is almost as unique as a fingerprint. They can use it, along with the signal strength of each AP in the area, to determine your device’s location with precision. (Google used to allow apps like Maps to be used with wifi scanning turned off, but no more.)
Your Google stick can’t tell it’s on a VPN directly, but even without GPS Google can still pinpoint its physical location using their database of SSIDs and MAC addresses, and if they want to they can determine you’re using a VPN by comparing that to the expected location of your IP address. There probably aren’t enough people doing this right now to make it worth the trouble to detect your VPN, but IMO it’s just a matter of time before they decide it is.
I also expect that Google sells that information to every company willing to pay for it, so almost every single wifi enabled device can be precisely located if it can transmit data to the Internet.
We live in a scary time.
No matter where it appears to be on a map, Missouri is in the deep south.
Thanks for that list. No need here for more advanced hardware so I’ll have to put off networking upgrades until I can come up with a reason to justify it.
As a home user, what additional features have you found useful on enterprise networking equipment? Just because what I’m doing is already ridiculously complex doesn’t mean it can’t be more so.
OpenWRT is amazingly flexible and would be a great place to start.
I switched from DD-WRT last year and have been amazed how good OpenWRT is. There are thousands of software packages that allow you to do pretty much anything you can think of on inexpensive hardware. Used Netgear R7800s are available for less than $50 on eBay, or there are plenty of newer hardware options if you want to spend more. Those thousands of downloadable software packages include WireGuard and AdGuard Home, plus there are OpenWRT integrations for Home Assistant. The forum is full of people who are happy to help newcomers.
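For example, on a supported router the extra software is a couple of opkg commands away (package names below are from memory, so double-check them in the package browser):

    opkg update
    opkg install wireguard-tools luci-proto-wireguard   # WireGuard plus LuCI support
    opkg install adguardhome                            # AdGuard Home DNS filtering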
I started by running OpenWRT in a virtual machine to get familiar with the UI and moved on to a live installation. Highly recommended, especially if you enjoy learning.
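If you want to try the same thing, booting the x86/64 image under QEMU gets you a working OpenWRT to explore (the image filename below is just an example; grab the current one from openwrt.org):

    gunzip openwrt-x86-64-generic-ext4-combined.img.gz
    qemu-system-x86_64 -m 256 -nographic \
        -drive file=openwrt-x86-64-generic-ext4-combined.img,format=raw
    # This drops you on OpenWRT's serial console, where you can edit
    # /etc/config/network and install packages just like on real hardware.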
On my Linux Mint laptop Winboat installed quickly and allowed me to install and run the one program I use that requires Windows. The biggest issues were with that same app’s windows when they were rendered on the Linux desktop. They sometimes couldn’t be moved, resized or closed; however, the same app ran just fine on the Winboat Windows desktop itself.
The latest version is identified as an alpha release in the UI, so these problems aren’t surprising. What is surprising is how well so much of this works for an alpha release, particularly how polished the installation process is.
Looking forward to using Winboat when it progresses to the beta.
4 GB isn’t much RAM, but it can be surprisingly useful if you configure zswap. Lots of guides out there. Here’s one of them.
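A minimal sketch of what those guides cover, on a Mint/Ubuntu-style system (the values are just reasonable starting points, and lz4 works if your kernel doesn't have zstd):

    # /etc/default/grub -- append the zswap options to the existing line
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash zswap.enabled=1 zswap.compressor=zstd zswap.max_pool_percent=20"

    sudo update-grub && sudo reboot

    # Verify after rebooting
    cat /sys/module/zswap/parameters/enabled    # should print Y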
I didn’t figure it out either. It was an educated guess and I got lucky.
I had similar issues with Home Assistant initially and had two failures that looked like database corruption in less than 6 months. I decided to give it one last try and switched to MariaDB. That was nearly 3 years ago. Since then it’s been rock solid.
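For anyone wanting to do the same, the switch itself is a single recorder entry in configuration.yaml pointing at the MariaDB instance (host, database name and credentials below are placeholders):

    # configuration.yaml
    recorder:
      db_url: mysql://hauser:changeme@192.168.1.10:3306/homeassistant?charset=utf8mb4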
You had a lucky escape, HA is addictive.
I have a Surface Laptop 4 and have been running Mint exclusively for a couple of years. It’s less well supported than the tablets and the initial installation took a bit of work, but once installed it has worked perfectly. The Linux Surface project has a detailed feature matrix that shows what’s supported for each model.
A single Opteron 6272 is somewhat faster than the N200, but the Opteron’s TDP is 115 watts while the N200’s is only 6 watts. OP’s server with 2 processors is more than 2x as fast as my single-processor laptop, but can require nearly 40x the electricity (2 × 115 W = 230 W versus 6 W). For a home server it’s major overkill.
Sounds like my laptop will be plenty fast for some time to come.
This platform doesn’t use much power to begin with, but I do run TLP using a battery profile despite the fact it’s always plugged in. My intent is to lower the power consumption a bit further and extend battery run time if the AC fails. There’s no noticeable impact on application performance. If you’re running Linux maybe it will work on your hardware.
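For reference, the relevant bits on my machine are just two lines in /etc/tlp.conf (TLP normally switches profiles based on the power source, so you have to pin it to the battery profile):

    # /etc/tlp.conf
    TLP_DEFAULT_MODE=BAT        # use the battery power-saving profile
    TLP_PERSISTENT_DEFAULT=1    # keep that profile even while on AC

    sudo tlp start              # apply without rebooting
    tlp-stat -s                 # check which mode is active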
Tangential question: What kind of server apps require that kind of processing power? I run a server on an Intel N200 laptop with multiple apps and services and it rarely uses more than 12% CPU and 15 watts. I’m wondering if I’m going to eventually run into something that needs a more powerful platform.
If I’m understanding what you want to do, I have this set up on an OpenWRT router with multiple remote endpoints used for different devices. Our phones go to a hosted WireGuard server in one city, PCs to an OpenWRT router in a different location, and IoT devices that aren’t blocked, along with guest devices, access the Internet locally. With some additional work you should also be able to have remote devices connected via WG exit wherever you like.
Policy Based Routing on OpenWRT makes this possible, and it should be doable as long as the devices you want to exit via the remote server are included in that server’s “Allowed IPs” setting. (Maybe there’s a way around that, but I haven’t had to deal with it.)
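As a rough sketch (addresses, keys and interface names below are made up), the remote server’s peer entry has to allow the source addresses you’re routing through it, and a pbr policy on the OpenWRT side steers those clients into the tunnel:

    # Remote WireGuard server: peer entry for the home router
    [Peer]
    PublicKey = <home router public key>
    AllowedIPs = 10.0.0.2/32, 192.168.1.128/25   # tunnel address plus the LAN range routed through it

    # OpenWRT router: /etc/config/pbr (from the "pbr" package)
    config policy
        option name 'pcs_via_remote'
        option src_addr '192.168.1.128/25'
        option interface 'wg_remote'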
As others have said, get something that works with OpenWRT. It’s unbelievably flexible and the OpenWRT forum can be really helpful, both for finding ways to implement things and for solving problems.
Glad you haven’t had any issues. Have a good night.