- 10.9.2025: Added an equal sign after the matching SID to prevent false positives. Thank you Stoker for pointing this out.
- 14.9.2025: Instead of Forbidden (F=403), Apache now returns a 422 "Unprocessable Content" error, which is more descriptive. Unfortunately Apache does not offer cutting the connection (444) like NGINX does.
Undeclared automated forum spamming for backlinks started on this forum a few weeks ago. CPU usage and load peaked because the database server had to handle a large number of automated requests, and finally the server became unresponsive until the attackers were blocked at the web server and operating system level. Banning IP addresses and users does not help: as long as spammers can access the forum at all, they will cause excessive load with a huge number of requests even when banned or blocked. Each HTTP request executes many PHP scripts and database queries, which quickly overloads the SQL server, the forum software, the web server, the operating system and finally the hosting. There have been hundreds of requests per second and thousands of lingering connections.

How to block the attackers?
Here is a universal fix to block malicious spam attacks against the phpBB forum software. Before you apply this fix, add bots, crawlers and spiders to the phpBB bot configuration to make sure they will not be assigned SIDs.
NGINX
Add the following configuration to the file
/etc/nginx/sites-available/YOUR_WEBSITE
Code: Select all
map $http_user_agent $denyagent {
    default 1;
    ~*bot 0;
    ~*crawler 0;
    ~*spider 0;
}

server {
    set $botdeny "";
    if ($denyagent) {
        set $botdeny X;
    }
    if ($http_referer = "") {
        set $botdeny "${botdeny}X";
    }
    if ($args ~ "sid=") {
        set $botdeny "${botdeny}X";
    }
    if ($botdeny = XXX) {
        return 444;
    }
}
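The configuration above accumulates one X into $botdeny for each matching condition and drops the request only when all three Xs are present: a non-bot User-Agent, an empty referer, and a sid= parameter in the query string. As a minimal sketch (not part of the NGINX configuration, with made-up request values), the same decision logic looks like this in shell; note that the `case` patterns here are case-sensitive, unlike the `~*` matches above:

```shell
# Sketch of the $botdeny logic: one X per matching condition,
# blocked only when all three Xs accumulate.
botdeny() {
    ua=$1 referer=$2 args=$3
    flags=""
    case "$ua" in
        *bot*|*crawler*|*spider*) ;;        # self-identified bot: no flag
        *) flags="${flags}X" ;;             # ordinary browser User-Agent
    esac
    [ -z "$referer" ] && flags="${flags}X"  # empty referer
    case "$args" in
        *sid=*) flags="${flags}X" ;;        # sid= in the query string
    esac
    if [ "$flags" = "XXX" ]; then echo block; else echo allow; fi
}

botdeny "Mozilla/5.0" "" "sid=abc123"                    # block: all three match
botdeny "Googlebot/2.1" "" "sid=abc123"                  # allow: declared bot
botdeny "Mozilla/5.0" "https://example.com" "sid=abc123" # allow: referer present
```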
Apache
Add the following configuration to the file
YOUR_WEBSITE_ROOT_DIRECTORY/.htaccess
Code: Select all
<IfModule mod_rewrite.c>
    RewriteCond %{HTTP_USER_AGENT} !(bot|crawler|spider) [NC]
    RewriteCond %{HTTP_REFERER} ^$
    RewriteCond %{QUERY_STRING} (sid=) [NC]
    RewriteRule ^(.*)$ - [R=422,L]
</IfModule>
Restarting NGINX to apply configuration changes
After that you need to reload your NGINX configuration so that the changes take effect. If the reload fails, look for the errors in the logs. If your configuration has errors when you start or restart the server, NGINX will stop working until you fix them. I personally found that annoying and wrote the following script, which first tests that the configuration is OK before reloading NGINX and, if not, immediately shows you the configuration errors.
Code: Select all
#!/bin/bash
#
# Test the NGINX configuration and reload NGINX if the configuration is OK
# Public domain, 2025

# Functions
ok() { echo -e '\e[32m'"$1"'\e[m'; } # Green
die() {
    echo -e '\e[1;31m'"$1"'\e[m'
    exit 1
}

# Sanity checks
[ "$(id -u)" != "0" ] && die "Script must be run as ROOT."
[ "$#" != "0" ] && die "Usage: $(basename "$0")"

# Test NGINX configuration
if /usr/sbin/nginx -t; then
    ok "Testing NGINX configuration"
else
    die "Testing NGINX configuration failed"
fi

# Reload NGINX
if /usr/bin/systemctl reload nginx.service; then
    ok "Reloading NGINX configuration"
else
    die "Reloading NGINX configuration failed"
fi

ok "NGINX reloaded, all good"
If you have web server access logging enabled, you can find the sources of the attacks in those logs. I personally prefer to keep logging disabled to guarantee absolute privacy and reduce web server load. From the access log you can find the sources of traffic, the frequency of visits and the User-Agent strings. If you count the number of requests from each IP address, excluding legitimate bots that identify themselves, you will find the sources. The requests that try to post to your forum, log in, register and often clear cookies are the most promising. When hundreds or thousands of requests come in every hour, they cannot originate from human users, so they must be caused by malicious scripts.
Here are some simple *nix commands for the NGINX web server that you can use to find the sources.
Find how many times the top 20 IP addresses have accessed the forum:
Code: Select all
awk '{print $1}' YOUR_HTTP_ACCESS_LOG | sort | uniq -c | sort -nr | head -20
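For example, run against a fabricated three-line access log (hypothetical IP addresses from documentation ranges), the pipeline counts the requests per client and ranks them:

```shell
# Build a small fake access log and count requests per IP address.
log=$(mktemp)
cat > "$log" <<'EOF'
203.0.113.7 - - [10/Sep/2025:12:00:01 +0000] "POST /posting.php?sid=abc HTTP/1.1" 200 512 "-" "Mozilla/5.0"
203.0.113.7 - - [10/Sep/2025:12:00:02 +0000] "POST /posting.php?sid=abc HTTP/1.1" 200 498 "-" "Mozilla/5.0"
198.51.100.9 - - [10/Sep/2025:12:00:03 +0000] "GET /index.php HTTP/1.1" 200 1024 "-" "Mozilla/5.0"
EOF
# First field of each line is the client address; count, then sort by count.
awk '{print $1}' "$log" | sort | uniq -c | sort -nr | head -20
rm -f "$log"
```

Here 203.0.113.7 would appear first with a count of 2, so the heaviest clients always float to the top of the list.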
Find the top 20 IP addresses and their User-Agent strings, excluding self-identified bots, spiders and crawlers:
Code: Select all
grep -i -Ev 'bot|spider|crawler' YOUR_HTTP_ACCESS_LOG | awk -F' - |\"' '{print $1" - "$7}' | sort | uniq -c | sort -nr | head -20
Find the top 20 IP addresses requesting the posting, registration, cookie deletion and login pages:
Code: Select all
grep -e posting -e registration -e delete_cookies -e login YOUR_HTTP_ACCESS_LOG | awk -F' - |\"' '{print $1" - "$7}' | sort | uniq -c | sort -nr | head -20
You can block IP addresses in your web server configuration if you want. The following examples block malicious IP addresses, but that is somewhat futile because the attackers use automated proxy lists that grow all the time. By the time you have managed to block some IP addresses, new ones are already arriving to take your forum down.
The principle is universal and available in all web servers; the following examples are written for NGINX. You could deny access in your web server configuration, but then the web server would send a 403 error page, which still generates traffic and eats your server's resources. A smarter option is to cut the connection immediately by returning error 444 "No Response". Alternatively you can block those IP addresses in your firewall.
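If you prefer the firewall route, the same sources can be dropped before they ever reach the web server. Here is a sketch for nftables, assuming the common inet filter table layout; the table and chain names, and of course the addresses, must be adapted to your own ruleset:

```
# nftables fragment (hypothetical layout): drop traffic from listed sources
table inet filter {
    set spambots {
        type ipv4_addr
        flags interval
        elements = { 47.236.134.202, 47.82.0.0/16, 47.79.0.0/16 }
    }
    chain input {
        type filter hook input priority 0; policy accept;
        ip saddr @spambots drop
    }
}
```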
In NGINX you can do this by editing your web server configuration file
/etc/nginx/sites-available/YOUR_WEBSITE
Add to the beginning of the configuration file a geo block like this with the malicious IP addresses:
Code: Select all
geo $block_spambots {
    default 0;
    47.236.134.202 1;  # aggressive DDoS forum posting bot
    47.82.0.0/16 1;    # aggressive DDoS forum posting bot
    47.79.0.0/16 1;    # aggressive DDoS forum posting bot
    87.120.166.175 1;  # forum registration bot
    147.45.66.176 1;   # forum registration bot
    3.1.218.249 1;     # a bot searching for WordPress vulnerabilities
    172.173.151.173 1; # a bot trying to run cron
}
Then add the following inside the server block:
Code: Select all
if ($block_spambots) {
    return 444;
}
Happy hacking,
Santeri