

That’s me. Best way I can describe it is like a word cloud but no text or dialog. A bunch of concepts with varying importance and strength of connections.


Sort of. There isn’t a Nobel Prize for economics. He received the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, which is awarded at the same ceremony.
Maybe it’s because dogs are often big enough to kill you. My cat gets angry that I didn’t snuggle right and swats me. The cats that are big enough to kill us, people seldom live with, and when they do, attacks are frequent.


GNOME devs making a bad decision!? Never! /S


ls -hal


Linux isn’t going to help much when the applications are using a lot of RAM. Firefox is an absolute RAM hog on Linux or Windows. Linux is just going to use less of the RAM for itself.
I’m going to agree with a lot of the other posters and say Qt with Qt Creator. It’s a tested and well-thought-out toolkit. Its signals-and-slots event system is straightforward and easy to learn.
Whatever route you take, learn Model View Controller (MVC). It gets you in the mindset of keeping your data model separate from the things that use the data and the things that change the data.


Agreed. I wasn’t trying to say they are always better just explain the difference.
I almost exclusively use Linux and it handles this great. .so libraries are stored with a version number, plus a link to the latest. So math3.so and math4.so, with math.so being a link to math4.so. That way, if needed, I can set a program to use math3.so and keep everything else on the latest version.


So the basic purpose of a library is to let code that does some useful thing be easily used in multiple programs. Say, math functions beyond what is in the language itself, or creating network connections.
When you build a program with multiple source files there are several steps. First, each file is compiled into an object file. This is machine code, but wherever you have calls into other files the compiler just inserts a note that basically says “connect this call to this part of another file.” For example, “connect this call to the SquareRoot function in the Math library.”
After that has been done to every file needed, the linker steps in. It grabs all the object files, combines them into one big file, then looks for all the notes that say “connect this call to that function” and replaces them with actual calls to the address where it put that function.
That is static linking. All the code ends up in one big executable. Simple, but it has two big problems. The first is size: doing it this way means every program that takes the square root of something carries a copy of the entire math library, and that adds up. The second is that if there is an error in the math library, every program needs to be rebuilt for the fix to apply.
Enter dynamic linking. With that, the linker replaces the note “connect to the SquareRoot function in the math library” with code that asks the operating system to make the connection.
Then when the program is run the OS gets a list of the libraries needed by the program, finds them, copies them into the memory reserved for that program, and connects them. These are .so files on Linux and .dll on Windows.
Now the OS only needs one copy of math.so, and if there is an error in the library, an update to math.so fixes all the programs that use it.
For GPL vs LGPL this is an important distinction. The main difference between them is how they treat libraries. (There are other differences and this is not legal advice)
So if math.so is GPL and your code uses it via either a static link or a dynamic link, you have to provide a copy of the source code for your entire program with any executable and license it to the recipient under the GPL.
With the LGPL it’s different. If math.so is statically linked it acts similar to the GPL. If it’s dynamically linked you only have to provide the source to build math.so and license it under the LGPL. So you don’t have to give away all your source code, but you do have to provide any changes you made to the math library. If you added a cubeRoot function to the math library, you would need to provide that.


RedHawk: the coked-out shop kid you normally want nothing to do with, but who will also give you a ride to another town at 3 am.
At work, a mix of Red Hat, Fedora, CentOS, and RedHawk. At home, the Mint Debian spin (LMDE). It just works and games run great. I don’t have time to deal with the Red Hat crap if I’m not getting paid.
Honestly it’s better, but still a mess of design choices. For an open source graphics editor, check out Krita.


For NTSC VHS players it wasn’t a component in the VCR that was made for copy protection. They would add garbled color burst signals, which would desync the automatic color burst sync system in the VCR.
CRT TVs didn’t need this component, but some fancy TVs would also have the same problem with Macrovision.
The color burst system was actually a pretty cool invention from the time broadcasts started to add color. They needed to stay compatible with existing black-and-white TVs.
The solution was to not change the black and white image being sent but add the color offset information on a higher frequency and color TVs would combine the signals.
This was easy for CRT as the electron beam would sweep across the screen changing intensity as it hit each black and white pixel.
To display color, each black-and-white pixel was an RGB triangle of pixels. So you would add a small offset to the beam, up or down to make it more or less green, and left or right to adjust the red and blue.
Those adjustment knobs on old TVs were, in part, you manually tuning the beam adjustment to hit the pixels just right.
VCRs didn’t usually have these adjustments, so they needed an automatic system to keep the color synced in the recording.


Or toss a flash bang in the crib.
Honestly, Red Hat only has a big grip on the small to mid-size business side.
Steam Play. I spent nine years with Linux as my main work OS. Then I’d come home and game on Windows. Once Steam Play was mature I set up a dual boot to give it a shot. I think I booted into Windows twice after that.


It was something around 40 TB ×2. We were doing a terrain analysis of the entire Earth. Every morning for 25 days I would install two fresh drives in the cluster doing the data crunching and migrate the filled drives to our file server rack.
The drives were about 80% full and our primary server was mirrored to two other 50 drive servers. At the end of the month the two servers were then shipped to customer locations.


It was clearly an attack. By whom is unknown.
Notably this was in 2003, before git (2005), so the Linux source was in a central BitKeeper repo. A commit with no associated data about who made it should not have been possible.
Here is a more detailed article. https://lwn.net/Articles/57135/
Stop doing what you enjoy and pay attention to what I want.