No.
No.


Very rarely, but probably only in situations where you would too. No, usually I put my HTML in HTML files. They’re usually building blocks… page components, not a full page. I regulate the page flow in PHP, and I don’t like it cluttered up with tons of HTML, inside or outside of echos. I have been known to do stuff like this though:
echo "<div class='whatever-container'>" . $Page->pagecomponents['contents_of_some_html_file'] . "</div>";
If I go and look at $Page, it will show that $this->pagecomponents is set by reading my template files in, so I can grab HTML structures dynamically. If the contents of pagecomponents['component'] are set dynamically (they usually are), there won’t be some ugly <?php ?> tag in the HTML file; my $Page class will handle populating it somehow. The architecture I usually use is: $Validator is instantiated for a page load, then $Data, so whatever user activity $Validator has detected and cleaned up tells $Data what to do with the data backend (which is usually a combination of Maria and Redis). Then $Data gets fed into $Page, which figures out what page to build, looks at all my HTML building blocks, and figures out how to put them together and populate whatever it needs to. So it will usually be something like (very simplistically)
$Validator = new Validator($_GET, $_POST);
$Data = new Data($Validator);
$Page = new Page($Data);
renderPage($Page->Page);
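To make the “reading my template files in” part concrete, here’s a minimal sketch of how a Page class could slurp HTML building blocks into $this->pagecomponents. The directory layout, file extension, and property name are my assumptions for illustration, not the commenter’s actual code:

```php
<?php
// Hypothetical sketch: load every HTML template in a directory into
// an associative array keyed by filename, so page components can be
// grabbed and composed dynamically.
class Page
{
    public array $pagecomponents = [];

    public function __construct(string $templateDir = 'templates')
    {
        foreach (glob($templateDir . '/*.html') as $file) {
            // 'hero.html' becomes $this->pagecomponents['hero']
            $key = basename($file, '.html');
            $this->pagecomponents[$key] = file_get_contents($file);
        }
    }
}
```

With this in place, the echo shown above just concatenates a pre-loaded component into a wrapper div.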


I don’t like reading it captain pedantic. Deal with it. :)


On the one hand, you do have good reasons to use classes.
Rather than piecemeal loading all these functions from every page where a bunch of them aren’t being used, you can create three classes.
$Data has all your database interactions in it, and then you can treat all database interactions as an object. My queries are usually all executed with $Data->runQuery();
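A minimal sketch of what a class like that could look like, funneling everything through one method. The SQLite DSN and method signature here are my assumptions for the example; the real class would point at Maria/Redis:

```php
<?php
// Sketch of a Data class wrapping PDO so every query in the app
// goes through a single entry point, always with bound parameters.
class Data
{
    private PDO $pdo;

    public function __construct(string $dsn)
    {
        $this->pdo = new PDO($dsn);
        $this->pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    }

    // One method for all queries: prepare, bind, execute, fetch.
    public function runQuery(string $sql, array $params = []): array
    {
        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}
```

The win is that parameter binding is no longer something you can forget on any individual page; it’s baked into the only path to the database.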
Since you’re working in raw PHP with no frameworks or libraries, you NEED to validate every input users send, or bots are going to spam the shit out of your database. The way you have things now, you’re probably either calling some function(s) on every form submit (every time $_SERVER['REQUEST_METHOD'] === 'POST') OR you’re just not doing it. When working in raw PHP, I always write a Validator class which sits in between every $_GET and $_POST and makes damn sure whatever is coming in meets a set of criteria that I expect. I’m happy to go into the architecture of this with you if that would be helpful.
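One hedged sketch of that idea: a class that consumes raw $_GET/$_POST in its constructor and only exposes values that pass an explicit rule, so nothing unvalidated ever reaches the rest of the app. The specific fields and rules are made up for illustration:

```php
<?php
// Sketch of a whitelist-style Validator: anything that doesn't
// match a known rule simply never makes it into $this->clean.
class Validator
{
    private array $clean = [];

    public function __construct(array $get, array $post)
    {
        $input = array_merge($get, $post);
        // Illustrative rules; a real app would define one per expected field.
        $rules = [
            'email' => fn($v) => filter_var($v, FILTER_VALIDATE_EMAIL) !== false,
            'page'  => fn($v) => ctype_digit((string)$v),
        ];
        foreach ($rules as $field => $ok) {
            if (isset($input[$field]) && $ok($input[$field])) {
                $this->clean[$field] = $input[$field];
            }
        }
    }

    // Unknown or invalid fields come back as null.
    public function get(string $field): ?string
    {
        return $this->clean[$field] ?? null;
    }
}
```

Usage would be `$Validator = new Validator($_GET, $_POST);` at the top of the request, after which everything else reads only from $Validator.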
I’m assuming you might have something for each page like
include('header.php');
<my page specific PHP is here>
include('footer.php');
Instead, I like to write a page builder class that constructs my pages dynamically based on routing. So then any given page becomes an instance of $Page and you populate it with various methods (like $Page->renderForm('form');). You can also then base the routing logic on your form submissions.
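A toy version of the page-builder idea, routing on the request path and assembling header/body/footer instead of include()ing them on every page. Class name, routes, and renderForm() body are all assumptions for the sketch:

```php
<?php
// Sketch: one class decides which page to assemble based on a route,
// replacing per-page include('header.php') / include('footer.php').
class PageBuilder
{
    private array $parts = [];

    public function __construct(string $route)
    {
        $this->parts[] = '<header>Site header</header>';
        $this->parts[] = match ($route) {
            '/contact' => $this->renderForm('contact'),
            default    => '<main>Home</main>',
        };
        $this->parts[] = '<footer>Site footer</footer>';
    }

    private function renderForm(string $name): string
    {
        return "<form name='$name'><input name='email'></form>";
    }

    public function html(): string
    {
        return implode("\n", $this->parts);
    }
}
```

A single front controller would then do something like `echo (new PageBuilder($_SERVER['REQUEST_URI']))->html();`, and form submissions can feed back into the same routing decision.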
On the other hand… it’s probably fine at this stage to just not use classes and if it works, why fix it?
You probably feel like you don’t have a need for classes because you’re just not comfortable working with them yet, and need more experience thinking through architecture. This is fine. This is normal. This is exactly where you should be, given what you say about your experience level.
SQL injection probably didn’t work because PDO protects you from that to some extent. Doesn’t mean you shouldn’t account for it in your input processing.
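To see why PDO likely saved you here: with a prepared statement and bound parameters, user input is sent separately from the SQL text, so a classic injection payload is stored as plain data instead of being executed. A small self-contained demo (using an in-memory SQLite database as an assumption, since any PDO driver behaves the same way):

```php
<?php
// Demo: a bound parameter neutralizes a classic injection payload.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE users (name TEXT)');

$payload = "x'); DROP TABLE users; --";     // classic injection attempt
$stmt = $pdo->prepare('INSERT INTO users (name) VALUES (:name)');
$stmt->execute([':name' => $payload]);      // stored as a literal string

// The table still exists, and the payload is just a row of data.
$rows = $pdo->query('SELECT name FROM users')->fetchAll(PDO::FETCH_COLUMN);
```

This protects the query itself, but it doesn’t validate that the input makes sense for your app, which is why the Validator layer is still worth having.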
Most of my HTML comes from echo.
Good, it should. I effing HATE reading through code where people are tagging in and out of PHP all the time. It looks so ugly. That’s not a standard best practice, just MY personal practice. IMHO, for HUMAN readability purposes, HTML should either be in echos or template files.
I fricken hate this:
<a> <bunch> <of> <html> <?php run_some_php('here'); ?>
Don’t effing make me read that. I co-run an independent coding shop and whenever we work in PHP, I tell people please not to do that.

Kristi Noem wants us to stop saying the name of murderer Jonathan Ross.


Steve Rogers, when the monkeys flew.


mander.xyz for science articles and memes. startrek.website and tenforward@lemmy.world for star trek memes. cybersecurity@sh.itjust.works for cybersecurity news, but not a lot of discussion.
The self hosting communities on lemmy.ml and lemmy.world are pretty active and good at answering questions.


I have kind of the opposite experience. When I go on reddit, I feel depressed and angry, when I go on Lemmy I laugh and learn stuff. Probably the communities I subscribe to though. I get political and regional news from Reddit (and don’t have an actual reddit account anymore). I get funny science and Star Trek memes from Lemmy and cyber security and tech news.


I haven’t tried.


Effing thank you!


I have neither used Bazzite nor CachyOS. You’re sure you don’t want to try Linux Mint? It’s extremely stable Linux for your grandma. Seriously, my dad’s laptops run Mint, and have for the last 5-6 years. When he gets a new laptop, I go over and install Mint for him (and he doesn’t know what Linux even means, he keeps calling LibreOffice “linux”). He asks me for help with his Windows desktop all the time (which he needs for certain software), but linux “just works” (his words). My son’s gaming computer and our house TV (which is an oldish Dell All-In-One that both my son and my wife need to be able to use) also run Mint.
For me, work computers that need to be stable run Mint, work computers that need to be secure run Qubes and servers run Debian.


When any criticism of the Israeli government or expression of solidarity with the Palestinian people is “antisemitic”, you’ve damaged and weakened the very meaning of the word “antisemitism” and the moral authority of those who wield it.

Perhaps it suggested that centrist, “business as usual” candidates aren’t what Americans want?



I got it and it’s funny, but I think lots of people need that /s tag or they don’t process it correctly.


Thank you. You saved me a Google search.


Hello. My name is Inigo Montoya. You killed my father. Prepare to die.


Artemis is a boondoggle corporate giveaway. Its main purpose is to funnel money into the pockets of big contractors as quickly and efficiently as possible.
I worked on it for a year and a half, and saw so much mismanagement and self-sabotage, I can’t even say. I’ve made multiple posts about it in the past. NASA spent $10 million at least having my team fail to build something that we could have built for probably $2.5 million. Most of that money vanished into the pockets of a giant, evil corporation that mostly builds weapons. I can tell you the guys (and they were all men) that we worked with from that company were laughing all the way to the bank when they canceled our project. Now they’re launching without that component.
I have lots of feelings.


Capitalism is happy to have cheap code that works “well enough” to sell, and mostly prefers it to expensive code that works “really well.”
The future is full of buggy ass code that runs most services and devices, whose main priority is vacuuming up data about its users and everyone and everything around them, and then a few high quality products and services only the rich can afford.

I think this is getting downvoted because the headline sounds like it’s casting shade. It’s not, it’s actually an historical reference to a political movement from 100 years ago.
No. Niche, hipster, “latest hotness” distros sometimes vanish. Debian, Fedora, Gentoo, Kali, Qubes, Mint are all examples of community maintained distros that have been around for a long time.
Since you’re looking for “stability”, I highly recommend Mint.