
Google slow to crawl website (Search Console)

Tips

Related Topics
  • Google looks to AI paywall option

    Discussion
    1
    1 Votes
    1 Posts
    302 Views
    No one has replied
  • Profile Photos

    Solved Customisation
    2
    1 Votes
    2 Posts
    191 Views

    @cagatay the fastest way to do this would be to modify the auto-generated sitemap.xml file so that it does not index users. It might be a pain to do the same thing for users in terms of guest permissions.

    Let me have a look.

    Edit - you can do this with permissions. Go to /admin/manage/privileges, then look on the left where it says "guests", remove the tick from the "view users" permission for guests, then click save.

    You can test this out using an incognito or non-logged-in session. Attempting to view users should then ask you to log in - there is also a quick scripted check below.
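
    As a rough sketch (not from the post above), the same check can be scripted. The base URL and the /api/users route are assumptions here; adjust them for your install, and note that the exact response (401/403 or a redirect to the login page) can vary.

    ```python
    # Verify that guests can no longer list users after removing the
    # "view users" privilege. BASE_URL is a placeholder for your forum.
    import requests

    BASE_URL = "https://your-forum.example"

    # Request the user list without logging in and without following redirects.
    resp = requests.get(f"{BASE_URL}/api/users", allow_redirects=False, timeout=10)

    # With the guest privilege removed, anything other than a plain 200 OK
    # (e.g. 401, 403, or a 3xx redirect to the login page) means guests are blocked.
    print(resp.status_code)
    print("guests blocked" if resp.status_code != 200 else "guests can still view users")
    ```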

  • Am I deferred from the Google search hub?

    Solved Configure
    6
    0 Votes
    6 Posts
    263 Views

    @DownPW Yes, exactly.

  • SEO and Nodebb

    Performance
    2
    2 Votes
    2 Posts
    323 Views

    @Panda It’s the best it’s ever been, to be honest. I’ve used a myriad of systems in the past - most notably WordPress, and then Flarum (which, for SEO, was absolutely dire - it never even had SEO out of the box, and relied on a third-party extension for it) - and NodeBB easily fares the best. See the example below:

    https://www.google.com/search?q=site%3Asudonix.org&oq=site%3Asudonix.org&aqs=chrome..69i57j69i60j69i58j69i60l2.9039j0j3&sourceid=chrome&ie=UTF-8#ip=1

    However, this was not without significant effort on my part once I’d migrated from COM to ORG - see the posts below:

    https://community.nodebb.org/topic/17286/google-crawl-error-after-site-migration/17?_=1688461250365

    And also

    https://support.google.com/webmasters/thread/221027803?hl=en&msgid=221464164

    It was painful to say the least - as it turns out, there was an issue in NodeBB core that prevented spiders from getting to content, which, as far as I understand, is now fixed. SEO in itself is a dark art - a black box that nobody really fully understands - and it essentially boils down to one thing: “content”.

    Google’s algorithm for indexing has also changed dramatically over the years. It now only crawls content that it deems valuable, so if it believes your site has nothing to offer, it will simply skip it.

  • Google Authenticator for 2FA

    Tips
    7
    6 Votes
    7 Posts
    609 Views

    @crazycells yes, this is something I see on a daily basis. Despite how shockingly simple it is to conduct SIM jacking, several USA-based banks seem reluctant to switch to at least TOTP, much as the USA has been extremely slow to adopt chip and PIN - something Europe has been making use of for years.

    And they wonder why cheque and wire fraud is rife in America.

  • Google sued for unauthorised use of NHS data

    Privacy
    1
    1 Votes
    1 Posts
    207 Views
    No one has replied
  • SEO

    General
    19
    7 Votes
    19 Posts
    1k Views

    Your Google ranking is too low for that. Clicks on your website have to increase significantly first. It also helps if you have a lot of links to your website, e.g. as references on other sites. For your posts or pages to rank high in Google and be found, you have to climb in the rankings.

    Sitemap Google refresh

    If you do not use WordPress or another CMS whose sitemap automatically sends a ping to Google when it is updated, you can use the "ping" function to request the update. Send an HTTP GET request like this: http://www.google.com/ping?sitemap=https://example.com/sitemap.xml - a minimal scripted version is sketched below.
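
    As a rough sketch, here is what that ping request could look like in Python using only the standard library. The sitemap URL is the placeholder from the example above; substitute your own.

    ```python
    # Ask Google to re-fetch a sitemap by issuing the HTTP GET "ping" request
    # described above. SITEMAP_URL is a placeholder - use your own sitemap.
    from urllib.parse import quote
    from urllib.request import urlopen

    SITEMAP_URL = "https://example.com/sitemap.xml"
    ping_url = "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe="")

    with urlopen(ping_url, timeout=10) as resp:
        # A 200 response means the request was accepted; anything else
        # (or an HTTPError being raised) means the ping did not go through.
        print(resp.status)
    ```
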
  • nginx seo urls

    Solved Configure
    15
    3 Votes
    15 Posts
    1k Views

    @riekmedia that looks fine to me