
Virtualmin SQL problem

Solved Performance

Did this solution help you?
Did you find the suggested solution useful? Why not buy me a coffee? It's a nice gesture, and a great way to show your appreciation 💗

Related Topics
  • NodeBB socket with CloudFlare

    Unsolved Performance
    1 Vote
    23 Posts
    2k Views

    @DownPW it’s your only realistic option at this stage.

  • Coding question: fetch vs $.ajax call from Shopify

    Solved Performance
    3 Votes
    4 Posts
    245 Views

    @Panda You should be able to use {% javascript %} as shown in this video - it’s quite the watch, but very educational, and provides insight into how this works - see the screenshot below for an example

    [screenshot]
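    As a rough sketch of the difference being discussed - the endpoint and handling below are made up for illustration and are not taken from the thread - a jQuery $.ajax call and its native fetch equivalent look like this:

    // jQuery style
    $.ajax({
        url: '/products.json', // hypothetical endpoint, purely for illustration
        method: 'GET',
        dataType: 'json',
        success: function (data) {
            console.log(data);
        },
        error: function (xhr, status, err) {
            console.error(err);
        },
    });

    // Native Fetch API equivalent - no jQuery dependency required
    fetch('/products.json')
        .then((res) => {
            if (!res.ok) {
                throw new Error(`HTTP ${res.status}`);
            }
            return res.json();
        })
        .then((data) => console.log(data))
        .catch((err) => console.error(err));

    Either version can sit inside the {% javascript %} tag mentioned above.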

  • SEO and Nodebb

    Performance
    2 Votes
    2 Posts
    317 Views

    @Panda It’s the best it’s ever been, to be honest. I’ve used a myriad of systems in the past - most notably WordPress, and then Flarum (which was absolutely dire for SEO - it didn’t offer SEO out of the box and relied on a third-party extension for it) - and NodeBB easily fares the best - see the example below

    https://www.google.com/search?q=site%3Asudonix.org&oq=site%3Asudonix.org&aqs=chrome..69i57j69i60j69i58j69i60l2.9039j0j3&sourceid=chrome&ie=UTF-8#ip=1

    However, this was not without significant effort on my part once I’d migrated from COM to ORG - see below posts

    https://community.nodebb.org/topic/17286/google-crawl-error-after-site-migration/17?_=1688461250365

    And also

    https://support.google.com/webmasters/thread/221027803?hl=en&msgid=221464164

    It was painful to say the least - as it turns out, there was an issue in NodeBB core that prevented spiders from getting to content, which, as far as I understand, is now fixed. SEO in itself is a dark art - a black box that nobody fully understands - and it essentially boils down to one thing: “content”.

    Google’s algorithm for indexing has also changed dramatically over the years. It now only crawls content it considers valuable, so if it believes your site has nothing to offer, it will simply skip it.

  • build nodebb Warning in entrypoint size limit

    Solved Performance
    0 Votes
    2 Posts
    215 Views

    @eeeee they are nothing to worry about, and can be ignored.
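    For reference, that warning comes from webpack’s performance hints; a minimal sketch of how the threshold can be raised or the hints silenced in a generic webpack.config.js (an illustration of the webpack options, not NodeBB’s own build configuration):

    // webpack.config.js - illustration only
    module.exports = {
        // ...entry, output and loaders as usual...
        performance: {
            hints: 'warning', // or false to silence these messages entirely
            maxEntrypointSize: 512000, // bytes; webpack's default is 250000
            maxAssetSize: 512000,
        },
    };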

  • NodeBB v3.0.0-rc.1

    Performance
    1 Vote
    1 Post
    125 Views
    No one has replied
  • Optimum config for NodeBB under NGINX

    Performance
    3 Votes
    4 Posts
    814 Views

    @crazycells hi - no security reason, or anything specific in this case. However, the nginx.conf I posted was from my Dev environment, which uses this port so that it doesn’t interfere with production.

    And yes, I use clustering on this site with three instances.
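    For context, a clustered NodeBB behind nginx is typically fronted by an upstream block listing each instance; the ports and hostname below are placeholders rather than the values from the thread:

    # Illustrative upstream for three NodeBB instances
    upstream nodebb {
        ip_hash; # keep each client pinned to one instance so socket.io sessions stay intact
        server 127.0.0.1:4567;
        server 127.0.0.1:4568;
        server 127.0.0.1:4569;
    }

    server {
        listen 80;
        server_name forum.example.com; # placeholder hostname

        location / {
            proxy_pass http://nodebb;
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade; # needed for websockets
            proxy_set_header Connection "upgrade";
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }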

  • 5 Votes
    13 Posts
    659 Views
    'use strict';

    const winston = require('winston');
    const user = require('../user');
    const notifications = require('../notifications');
    const sockets = require('../socket.io');
    const plugins = require('../plugins');
    const meta = require('../meta');

    module.exports = function (Messaging) {
        Messaging.notifyQueue = {}; // Only used to notify a user of a new chat message, see Messaging.notifyUser

        Messaging.notifyUsersInRoom = async (fromUid, roomId, messageObj) => {
            let uids = await Messaging.getUidsInRoom(roomId, 0, -1);
            uids = await user.blocks.filterUids(fromUid, uids);

            let data = {
                roomId: roomId,
                fromUid: fromUid,
                message: messageObj,
                uids: uids,
            };
            data = await plugins.hooks.fire('filter:messaging.notify', data);
            if (!data || !data.uids || !data.uids.length) {
                return;
            }

            uids = data.uids;
            uids.forEach((uid) => {
                data.self = parseInt(uid, 10) === parseInt(fromUid, 10) ? 1 : 0;
                Messaging.pushUnreadCount(uid);
                sockets.in(`uid_${uid}`).emit('event:chats.receive', data);
            });

            if (messageObj.system) {
                return;
            }

            // Delayed notifications
            let queueObj = Messaging.notifyQueue[`${fromUid}:${roomId}`];
            if (queueObj) {
                queueObj.message.content += `\n${messageObj.content}`;
                clearTimeout(queueObj.timeout);
            } else {
                queueObj = {
                    message: messageObj,
                };
                Messaging.notifyQueue[`${fromUid}:${roomId}`] = queueObj;
            }

            queueObj.timeout = setTimeout(async () => {
                try {
                    await sendNotifications(fromUid, uids, roomId, queueObj.message);
                } catch (err) {
                    winston.error(`[messaging/notifications] Unable to send notification\n${err.stack}`);
                }
            }, meta.config.notificationSendDelay * 1000);
        };

        async function sendNotifications(fromuid, uids, roomId, messageObj) {
            const isOnline = await user.isOnline(uids);

            uids = uids.filter((uid, index) => !isOnline[index] && parseInt(fromuid, 10) !== parseInt(uid, 10));
            if (!uids.length) {
                return;
            }

            // On this install, room 11 is the global chat room - for it, only notify users
            // who are currently online; for any other room, proceed as normal.
            if (parseInt(roomId, 10) === 11) {
                const onlineUids = await user.getUidsFromSet('users:online', 0, -1);
                uids = uids.filter((uid) => onlineUids.includes(String(uid)));
            }

            const { displayname } = messageObj.fromUser;

            const isGroupChat = await Messaging.isGroupChat(roomId);
            const notification = await notifications.create({
                type: isGroupChat ? 'new-group-chat' : 'new-chat',
                subject: `[[email:notif.chat.subject, ${displayname}]]`,
                bodyShort: `[[notifications:new_message_from, ${displayname}]]`,
                bodyLong: messageObj.content,
                nid: `chat_${fromuid}_${roomId}`,
                from: fromuid,
                path: `/chats/${messageObj.roomId}`,
            });

            delete Messaging.notifyQueue[`${fromuid}:${roomId}`];
            notifications.push(notification, uids);
        }
    };
  • NODEBB: Nginx error performance & High CPU

    Solved Performance
    14 Votes
    69 Posts
    6k Views

    @phenomlab

    Seems to be better with some scaling fixes for Redis in redis.conf - I haven’t seen the message since the changes I made:

    # Increased to the value of /proc/sys/net/core/somaxconn
    tcp-backlog 4096

    # Commented out because background saving can slow down Redis - these are uncommented by default!
    #save 900 1
    #save 300 10
    #save 60 10000
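    For the tcp-backlog line above to take full effect, the kernel’s net.core.somaxconn must be at least as large - otherwise Redis caps the backlog and logs a warning at startup. A quick sketch of checking and raising it on a typical Linux host (4096 simply mirrors the value used above):

    # Check the current kernel limit (Redis cannot use a backlog larger than this)
    cat /proc/sys/net/core/somaxconn

    # Raise it for the running system...
    sudo sysctl -w net.core.somaxconn=4096

    # ...and persist it across reboots
    echo "net.core.somaxconn = 4096" | sudo tee /etc/sysctl.d/99-redis.conf
    sudo sysctl --system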

    If you have other Redis optimizations, I’ll take all your advice.

    https://severalnines.com/blog/performance-tuning-redis/