
httpd down due to enabling php zip extension

Solved Hosting
  • Will enabling the PHP zip extension cause httpd on a server (System: CentOS 7, Apache) to go down?
    That’s what happened the other day.

  • @ash3t only if the extension isn’t in the specified path within the conf file. Does it start if you remark (comment out) the extension in php.ini?
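
    For illustration, “remarking” the extension just means commenting out its load line and restarting Apache. A minimal sketch, assuming the line lives in php.ini (on cPanel/EasyApache 4 builds it is often a drop-in ini file under the relevant PHP version’s conf.d directory instead, so the path is only an example):

      ; php.ini (or a conf.d drop-in for the relevant PHP version - path varies by build)
      ; prefix the line with a semicolon to disable the extension
      ;extension=zip.so

      # then restart Apache on CentOS 7 and re-test
      systemctl restart httpd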

  • @phenomlab Would it? By enabling, I mean the cPanel admin installed the PHP zip extension.

    After installing, I can see it shows as enabled in php_info.php.

    The httpd outage affects the whole server; around 50 websites are down. The reasons I am asking are:

    • There is no specific error message, report, or log entry about why httpd is down. Accessing the websites just gives a “connection issue”.

    • Things were running normally before, and again after uninstalling the zip extension and restarting httpd.

    Do you know what could potentially cause this and impact the whole server? I thought that if, for example, Flarum needs an extension, you simply turn it on using EasyApache 4 and it won’t cause any damage; each cPanel user can then use the extension without needing to touch the conf file. Thanks for your help. 🙂
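
    For reference, when httpd drops with no obvious error message, a few places are usually worth checking. A quick sketch, assuming a stock CentOS 7 layout (cPanel/EasyApache keeps its Apache logs under its own paths, so adjust accordingly):

      # is the zip extension actually loaded for the CLI?
      php -m | grep -i zip

      # service state and recent errors
      systemctl status httpd
      journalctl -u httpd --since "1 hour ago"
      tail -n 100 /var/log/httpd/error_log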

  • @ash3t if you’ve enabled the extension via cPanel then this should work without issue, and certainly won’t cause websites to go down as a result. However, what may be the case is a change of PHP version.

    Sometimes, inadvertently selecting a different version can mean the default PHP extensions are enabled rather than the ones your website needs to function. I’ve seen this happen several times in cPanel and it’s a known problem.
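
    One quick way to spot this kind of mismatch is to compare the modules each installed PHP version actually loads. A sketch assuming an EasyApache 4 layout, where every version ships its own binary under /opt/cpanel (the version directories below are examples; use whichever are installed):

      # does ea-php74 load zip?
      /opt/cpanel/ea-php74/root/usr/bin/php -m | grep -i zip

      # and ea-php81?
      /opt/cpanel/ea-php81/root/usr/bin/php -m | grep -i zip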

  • @phenomlab Thanks for sharing your experience. Do you mean that changing the PHP version could cause one website to go down? It wouldn’t cause the whole server’s httpd to go down, though, right?

    The thing is that after installing the zip extension, httpd went down, and without finding the cause it’s a bit worrisome to enable zip again, since it didn’t take down just one website but all the websites sharing the same IP.

  • @ash3t said in httpd down due to enabling php zip extension:

    Do you mean that changing the PHP version could cause one website to go down? It wouldn’t cause the whole server’s httpd to go down, though, right?

    Potentially, but this depends on what each website relies on in terms of topology. Can you provide more detail as to what technologies (such as WordPress etc.) are running on these sites?

    @ash3t said in httpd down due to enabling php zip extension:

    The thing is that after installing the zip extension, httpd went down, and without finding the cause it’s a bit worrisome to enable zip again, since it didn’t take down just one website but all the websites sharing the same IP.

    So does the issue resolve itself when you remove the zip PHP extension?

  • @phenomlab Unfortunately, I cannot provide more details. It’s my friend’s server, and as far as I know it runs many small websites. I imagine most of them use WordPress; if not, then just static HTML.

    “So does the issue resolve itself when you remove the zip PHP extension?”
    As far as I know, since there are no error messages, we don’t know the cause yet. I believe the server is up and running now.

    My friend suspected that some malware stored in zip files was pushed onto the server and extracted afterwards. The symptoms were that the server was under very high CPU load and busy handling heavy connections.

    Is there a way to run any security checks for this situation?
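
    For what it’s worth, a basic on-demand scan can be run with ClamAV or Linux Malware Detect (maldet). A minimal sketch, assuming EPEL is available on CentOS 7 and that the site document roots live under /home:

      # ClamAV: install from EPEL, update signatures, scan recursively, report only infected files
      yum install -y epel-release clamav clamav-update
      freshclam
      clamscan -r -i /home

      # Linux Malware Detect (maldet, from rfxn.com) is another common choice on shared hosting
      maldet -a /home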

  • @ash3t that doesn’t sound symptomatic of malware, but is heavily aligned to DDoS (Distributed Denial of Service), which is where the target machine receives thousands of connection requests per second and is overwhelmed, meaning real visitors and sites cannot be served.

    Without any specific monitoring in place, it’s going to be very difficult to determine the exact cause. There are numerous tools that can scan for malicious activity - although much of this depends on the back end technology being used (cPanel, Plesk etc). One of the best products around for protection is imunify360.

    https://bobcares.com/blog/install-imunify360-cpanel/

    It’s not free, but worth every penny.
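
    If the same symptoms reappear, a quick way to see whether the box is simply drowning in connections is to count them per client IP. A rough, IPv4-centric sketch using tools present on a stock CentOS 7 install:

      # TCP connections to the web ports, grouped by remote address, busiest first
      ss -tn '( sport = :80 or sport = :443 )' | awk 'NR>1 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head

      # overall load and the busiest processes
      uptime
      top -bn1 | head -20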

  • @phenomlab Thanks, that’s a relief. I have checked with my friend; the server already has DDoS protection.

    For now, it seems we can’t find a clue about the cause. What would you suggest we keep an eye on as we think about enabling the zip extension again?

  • @ash3t my personal preference here would be to have some form of monitoring - something like SNMP counters using a product such as cacti, LibreNMS, or observium (I have extensive experience with these).

    Taking this route in terms of monitoring means you can draw some form of parallel with a specific time and function. In terms of malware protection, imunify360 really is difficult to beat.

    The only real issue with SNMP is that the community needs to be secured adequately to prevent abuse from external sources. For example, it’s possible to execute commands via a read/write community with a weak community string. For this reason, you’d stick to a read-only community and restrict the accessing hosts to trusted IP addresses only.
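
    As a rough illustration of that last point (the community string and monitoring-host address below are placeholders), net-snmp’s snmpd.conf lets you bind a read-only community to a single source, and firewalld can limit who may reach UDP/161 at all:

      # /etc/snmp/snmpd.conf - answer the read-only community only from the monitoring host
      rocommunity Use-A-Long-Random-String 203.0.113.10
      # deliberately no rwcommunity line: never expose a read/write community

      # firewalld on CentOS 7: restrict SNMP to the monitoring host, then reload and restart
      firewall-cmd --permanent --add-rich-rule='rule family="ipv4" source address="203.0.113.10" port port="161" protocol="udp" accept'
      firewall-cmd --reload
      systemctl restart snmpd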

  • @phenomlab Thanks for your suggestion! As far as I know, my friend has a monitoring system in place now.

  • @ash3t Good news. Thanks.

  • @ash3t I’m going to mark this as solved for the time being. Let me know if this isn’t the case, or if you need any further help.

