

Codeberg does actively try to prevent bot scraping.


Is it easier to secure and monitor fewer, bigger reactors, or thousands of small ones? Accidents are still going to happen, and I know which scenario makes more sense to me. Especially in light of Trump's recent push to deregulate nuclear energy, gut the EPA, and roll back pretty much every other kind of sensible management of technology that is great until something goes wrong, at which point it quickly becomes a multi-generational clusterfuck.
Solar, batteries, and long-range transmission infrastructure just make too much sense, I guess.


Darwin just getting ever more creative over time.


Straight out of the NSA ANT catalog, aka LOUDAUTO and others.
This is like that part in Don't Look Up where Jennifer Lawrence's character tells her BF to wait six months before she meets his mother.


Prosody XMPP + Pidgin/(Monal|Xabber) has always worked for me. It is not hard to set up or manage, and it has E2E encryption too.
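For the curious, a rough setup sketch on Debian/Ubuntu; example.org and the account name are placeholders, and the E2E (OMEMO) side is handled by the client:

    sudo apt install prosody
    # edit /etc/prosody/prosody.cfg.lua and add a host, e.g.: VirtualHost "example.org"
    sudo prosodyctl adduser alice@example.org   # create an account (prompts for a password)
    sudo systemctl restart prosody
    # then point Pidgin/Monal/Xabber at alice@example.org and enable OMEMO in the client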


Well … are you a fish? If so, no… if not, yes.
I sincerely hope that this is a troll post because if not, well. Sigh.


Because every single foreign government hacks every other foreign government every single chance they get. If I get any say in the matter, I'd rather keep my list of enemies as small as possible (aka only the US government). Most rational people would agree with that. At least you have some say in holding the US government accountable, in theory anyway.
I feel like every time this topic comes up, people forget all of this, and also forget that China's energy sector, its automotive sector, literally every industry in China is controlled by the PRC/CCP, 100%. Even the US/China joint ventures have to follow rules laid out by the PRC/CCP.
Ignore the idiot posting about this RAT.
If you want to secure your Linux system, use ClamAV and a local firewall like UFW, or even OpenSnitch, for a start (sketch below). Also use your head when adding apps to your system: stick to the official repos from your distro. Things like Arch's AUR, random PPAs on Ubuntu, and any random GitHub project are going to be much riskier by their very nature, so act accordingly.
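A minimal starting sketch, assuming a Debian/Ubuntu box (package names and defaults vary by distro):

    sudo apt install clamav ufw
    sudo ufw default deny incoming    # drop unsolicited inbound traffic
    sudo ufw default allow outgoing
    sudo ufw enable
    sudo freshclam                    # refresh signatures (skip if the clamav-freshclam daemon already runs)
    clamscan -r ~/Downloads           # scan the usual dumping ground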
If you need to do risky stuff, do it in a VM, and network that guest into a private internal network that can only exit through a companion pfSense VM that is dual-homed to the regular LAN and the private internal network. Take a snapshot of the risky guest before you use it in a session and, when you are done, roll back to your clean snapshot.
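If the guest runs under libvirt/KVM, the snapshot dance is two commands ("risky" and "clean" are hypothetical names for the guest and the snapshot):

    virsh snapshot-create-as risky clean   # take the known-good snapshot
    # ... do the risky session in the guest ...
    virsh snapshot-revert risky clean      # roll back to it when done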
Store your passwords in something like KeePass (strong master password!) and then use Syncthing to push copies of the database to at least one other box, local or in the cloud if you really have to.
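A rough sketch with KeePassXC's CLI (the path is hypothetical and flag names may differ between versions; the folder sharing itself is done in Syncthing's web UI):

    keepassxc-cli db-create --set-password ~/Sync/passwords.kdbx   # prompts for the master password
    # then share ~/Sync with the other box from Syncthing's web UI (http://127.0.0.1:8384)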
It seems to just be more attack surface for very little actual gain over JS. At least with JS I have NoScript, uBlock, and some actual say over what loads/runs on my box. For this reason, I usually just disable all WASM/WebGL/WebRTC until I find out that I actually need it, which for me is basically never, or only for very short periods.
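In Firefox that can be done per profile; a sketch, assuming these pref names are still current (PROFILE is a placeholder for your actual profile directory):

    P=~/.mozilla/firefox/PROFILE/user.js
    echo 'user_pref("javascript.options.wasm", false);' >> "$P"        # disable WebAssembly
    echo 'user_pref("webgl.disabled", true);' >> "$P"                  # disable WebGL
    echo 'user_pref("media.peerconnection.enabled", false);' >> "$P"   # disable WebRTC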


Some upgrades require human input, like when updates to core service config files are offered (e.g., "Would you like to update /etc/samba/smb.conf with the maintainer's version, or keep your own?").
In my experience this can occasionally cause background apt processes to hang while they wait for your answer to that kind of question. There is a debconf trick you can try: DEBIAN_FRONTEND=noninteractive. You can create your own cron job, as root, that runs a script with this export, then apt update, then apt dist-upgrade -y.
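A sketch of such a script (the Dpkg::Options flags are my addition; they make dpkg keep your existing config files instead of prompting; apt-get is preferred over apt in scripts):

    #!/bin/sh
    # Unattended upgrades; run from root's crontab, e.g. 0 4 * * * /usr/local/sbin/auto-upgrade.sh
    export DEBIAN_FRONTEND=noninteractive
    apt-get update
    apt-get dist-upgrade -y \
        -o Dpkg::Options::="--force-confdef" \
        -o Dpkg::Options::="--force-confold"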


So if AI is running fuzzers to find bugs, credit should go to the fuzzers, not the AI.
Please stop reposting Anthropic's shitposts. This is pure advertising spam from a disreputable company.


If your machine is a Tuxedo laptop, this thread might interest you. It seems this user was hitting thermal limits, and their laptop would freeze/power off to keep from dying.


Run your workload in a guest VM and limit its resources to whatever you desire. You can also consider cgroups if you already know which processes are causing all of the trouble.
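On a systemd box, the cgroup route doesn't even need a VM; a sketch, where ./workload is a stand-in for the offending process:

    # Run the process in its own transient cgroup with hard caps.
    sudo systemd-run --scope -p MemoryMax=2G -p CPUQuota=50% ./workload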


Ignoring what users want is the tradition in GNOME, and yeah, of course Fedora is gonna do whatever Red Hat/IBM tells them to, including push AI slop.


I would look at these things first.


The content produced by humans was scraped en masse for the explicit purpose of training models, which were then monetized into business products.
I struggle to reconcile that with fair use.
I could see it if the source was EULA'd to strip all rights to what you post to things like Reddit and Stack Overflow, and if somehow those entities were contacted ahead of time and usage was negotiated. You, I, and the web server logs know that this was almost never the case.


This project has never been more relevant in light of the recent acceleration of enshittification over at Microslop. Might be time to donate a few bucks.


It should, but you can test that assumption by trying to ping any other device on the non-guest Wi-Fi (and try the ping in the other direction too).
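Something like this from a guest-network device (192.168.1.10 is a made-up address for a host on the main LAN):

    ping -c 4 192.168.1.10   # should time out if the guest network is properly isolated
    # then repeat from the main LAN toward a device on the guest Wi-Fi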
Check your passkeys. You might still have one in the OS credential manager.