
NODEBB: Nginx error performance & High CPU

Solved Performance
  • @phenomlab

    I don’t understand everything you’re saying.
    So what can we do?

    We currently have 1.1k users online.

    We’re getting a lot of registrations.

    @phenomlab 362 user registrations in two days, and many users just reading the forum.

  • @DownPW said in NODEBB: Nginx error performance & High CPU:

    I don’t understand everything you’re saying.
    So what can we do?

    My point here is that the traffic, whilst legitimate in the sense that it comes from another site that has closed, could still be nefarious in nature, so you should keep your guard up. However, a number of signups can’t be wrong - particularly if those users are actually posting content rather than firing requests at URLs that don’t exist on your site.

    I see no indication of that, so the comfort level that this is legitimate traffic does increase somewhat, helped by the seemingly legitimate registrations. However, because all the source IP addresses are within the Cloudflare ranges, you have no real way of telling who they are without performing the steps I outlined in the previous post.

    The good news is that your site just got a huge increase in popularity, but with that will always come a need to keep a close eye on activity. It would only take one nefarious actor to potentially bring down your site.

    The nginx configuration you’ve applied will indeed alleviate the stress placed on the server, but it’s a double-edged sword in the sense that it widens the goalposts for any potential attack.

    My advice here would be to not scale these settings too high. Use sane judgement.

    For the NodeBB side, I know they have baked rate limiting into the product, but I’m sure you can change that behaviour.

    Have a look at

    /admin/settings/advanced#traffic-management
    

    You’ll probably need to play with the values here to get a decent balance, but this is where I’d start.

  • @phenomlab

    I think you’re right, Mark, and that’s why I came here looking for your valuable advice and expertise 😉

    Basically, the illegal site that closed was a movie download site. A topic was opened on our forum to talk about it, and many people came looking for answers about why and how it happened.

    You’re actually right that we can’t be sure of anything, and there may be bot attacks or DDoS traffic mixed in with those connections.

    I activated Under Attack mode on Cloudflare just now, as you advised, and we will see, like you said.

    As you advised, I also reset the nginx configuration to its default values and removed the modifications I described above.

    I would like to take advantage of your expertise and get some help from you to properly configure nginx for DDoS protection and high traffic (which exact directives to set, and with which values).

  • @DownPW ok, good. Let’s see what the challenge does to the site traffic. Those who are legitimate users won’t mind performing a one-time additional authentication step, but bots of course will simply stumble at this hurdle.

  • @phenomlab

    The number of users is better (408), but there are a lot of dropped connections; navigation is difficult.

  • I have changed the nginx conf to:

    worker_rlimit_nofile 70000;

    events {
        worker_connections 65535;
        multi_accept on;
    }

    CF is in Under Attack mode.
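    As a quick sanity check on numbers like these (a sketch; the worker count of 4 is an assumed example, not taken from this server):

    ```shell
    # Rough capacity estimate: nginx handles at most
    # worker_processes * worker_connections concurrent connections, and
    # roughly half that many clients when proxying (each client also
    # consumes one upstream connection to NodeBB).
    workers=4           # illustrative value; substitute your worker_processes
    connections=65535   # the worker_connections value from the config above
    echo $(( workers * connections / 2 ))
    ```

    With 65535 connections per worker this headroom is enormous, which is exactly the concern raised later in the thread.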

  • @DownPW I still have access to your Cloudflare tenant, so I’ll have a look shortly.

    EDIT: I’m in now - personally, I would also enable this (and configure it):

    (screenshot)

    (screenshot)

  • @phenomlab I have already activated it, and added a WAF rule for Russia.

    (screenshot)

    With these bot settings:

    (screenshot)

    and these settings for DDoS protection:

    (screenshot)

  • @DownPW said in NODEBB: Nginx error performance & High CPU:

    I have already activated it

    Are you sure? When I checked your tenant it wasn’t active - that’s where I took the screenshot from 😁

  • @phenomlab

    Yep, I activated it afterwards 😉

  • For your information, @phenomlab:

    • I have banned suspicious IP addresses found in /etc/nginx/access.log and the virtual host access.log via iptables, like this: iptables -I INPUT -s IPADDRESS -j DROP
    • Activated the bot option on CF
    • Created country rules (Russia and China) in the CF WAF
    • Left the Under Attack mode option activated on CF
    • I have just changed nginx.conf like this for testing (if you have better values, I’ll take them!):

    worker_rlimit_nofile 70000;

    events {
        worker_connections 65535;
        multi_accept on;
    }

    http {

        ##
        # Basic Settings
        ##

        limit_req_zone $binary_remote_addr zone=flood:10m rate=100r/s;
        limit_req zone=flood burst=100 nodelay;
        limit_conn_zone $binary_remote_addr zone=ddos:10m;
        limit_conn ddos 100;

    100r/s is already a lot!!

    and for the vhost file:

    server {
        .....

        location / {

            limit_req zone=flood; #Test
            limit_conn ddos 100; #Test
    }

    –> If you have other ideas, I’m interested.
    –> If you have better values to use in what I modified, please let me know.
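    A small helper along these lines can speed up the manual iptables step described above (a sketch; `top_ips` is a hypothetical name, and the ban itself still needs root):

    ```shell
    # top_ips LOGFILE [N]: print the N (default 10) noisiest client IPs in an
    # nginx access log (the client IP is the first field of the combined
    # log format). Offenders can then be banned as described above with:
    #   iptables -I INPUT -s <ip> -j DROP
    top_ips() {
      awk '{print $1}' "$1" | sort | uniq -c | sort -rn | head -n "${2:-10}"
    }
    ```

    Note that behind Cloudflare the first field may be a Cloudflare edge IP rather than the real client, so check that the log format records the original visitor address before banning anything.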

  • @DownPW my only preference would be not to set worker_connections so high.

  • @phenomlab

    OK, so what value do you advise?

  • @DownPW you should base it on the output of ulimit - see below

    https://linuxhint.com/what-are-worker-connections-nginx/

    With a value that high you run the risk of overwhelming your server.

  • @phenomlab

    Thanks Mark 😉

    My ulimit is 1024, so I will set it to 1024.
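    For reference, the limit being discussed is the per-process open-file descriptor limit (a sketch):

    ```shell
    # Show the open-file limit that nginx workers inherit by default.
    # worker_connections should not exceed this unless worker_rlimit_nofile
    # is used to raise it for nginx specifically.
    ulimit -n
    ```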

  • @DownPW And the worker_processes value? I’d expect this to be between 1 and 4?

  • @phenomlab

    worker_processes auto;
    
  • @DownPW ok. You should refer to the article I previously provided. You can probably set this to a static value.
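    If a static value is preferred, `worker_processes` is normally matched to the CPU core count, which is what `auto` resolves to anyway (a sketch):

    ```shell
    # `worker_processes auto;` resolves to the number of CPU cores at startup.
    # To pick a static value instead, check the core count directly:
    nproc
    ```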

  • @phenomlab

    OK, I will look into a better worker_processes value.

    I added a request rate limit and a limit_conn_zone to the http block and the vhost block:

    – nginx.conf:

    http {

        # Maximum requests per IP
        limit_req_zone $binary_remote_addr zone=flood:10m rate=100r/s;
        # Maximum connections per IP
        limit_conn_zone $binary_remote_addr zone=ddos:1m;

    – vhost.conf:

            location / {

                limit_req zone=flood burst=100 nodelay;
                limit_conn ddos 10;

    –> I have tested other values for rate and burst, but they caused access problems on the forum. If you have better ones, I’ll take them.

    I also added a proxy_read_timeout to vhost.conf today (the default is 60):

    proxy_read_timeout 180;

    I have deactivated Under Attack mode on CF and changed it to the High level.

    I have added other rules to the CF WAF:

    (screenshot)
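    For readability, the directives discussed in this thread fit together roughly like this (values are the ones quoted above, shown purely as an illustration, not a recommendation):

    ```nginx
    # --- nginx.conf, inside the http { } block: declare the shared zones ---
    limit_req_zone  $binary_remote_addr zone=flood:10m rate=100r/s;
    limit_conn_zone $binary_remote_addr zone=ddos:1m;

    # --- vhost file, inside location / { }: enforce the limits per request ---
    limit_req  zone=flood burst=100 nodelay;
    limit_conn ddos 10;
    proxy_read_timeout 180;
    ```

    Remember to run nginx -t before reloading, since a typo in either file takes the whole server down.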

  • @DownPW what settings do you have in Advanced (in Settings) for rate limits etc.?


