How to reduce the number of executions generated by your website?

To achieve good website performance, it is important to keep the number of executions your site generates under control. An excessively high number of executions usually has one of two causes: something irregular happening to your website that should be identified and stopped, or, in the better case, a legitimate peak in your website traffic, which can also be handled.

What is an execution?

A simple example of an execution is a visitor opening your website and loading your index.php file. That counts as one execution, and the more visitors your website has, the more executions it generates. Note that this applies only to dynamically generated content: opening a picture or a static HTML page does not generate a new execution on the server. Executions are counted for scripts written in languages such as PHP, Perl, Python, Ruby, etc.
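
As a quick illustration, here is a minimal PHP script (the file name is just an example): every HTTP request that reaches it runs the PHP interpreter and therefore counts as one execution, while a request for a static image or HTML file is served directly by the web server and does not.

<?php
// index.php - every request to this script starts the PHP interpreter,
// so each page view counts as one execution on the server.
// Static files (images, CSS, plain HTML) are served without running PHP
// and therefore do not add executions.
echo 'Hello, visitor! The time is ' . date('H:i:s');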

Using the Traffic tool to identify irregularities that cause a high number of executions

To reduce the number of executions generated by your website, you first have to identify the source of the issue. Often a high number of executions is not caused by a large number of legitimate visitors, and you can lower it by taking corrective action. The best way to investigate this on a SiteGround hosting account is to go to Site Tools > Statistics > Traffic.

The Behaviour tab provides useful information about which parts of your website are viewed the most. Check this section to identify the pages that generate the executions. Below are the most common execution generators:

  • Chat, calendar, or other modules that constantly refresh their content can generate many executions, and you will see those requests in this section. For example, if your site uses a calendar module and you notice that its calendar.php script has been accessed many times, it is a good idea to disable that module.
  • Application login pages. Very often sites are abused by bots that try to gain administrative access. If you see many requests to your site’s admin login page (the Joomla! administrator folder, the WordPress wp-login.php script, etc.), restrict access to it, for example by allowing only trusted IP addresses or by adding an extra layer of password protection (see the sketch after this list).
  • Comments sections. Sites are also frequently abused by bots that try to post spam comments. If you see many requests to your site’s comment-posting script (WordPress wp-comments-post.php, Joomla! JComments, etc.), add a CAPTCHA that visitors must solve before they can post a comment, which blocks the bots’ requests. There are many CAPTCHA extensions for WordPress, Joomla!, Drupal, etc. that you can use to protect your site’s comments sections and contact forms.
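
One common way to restrict a login page on an Apache-based account is a rule in the site’s .htaccess file. The sketch below assumes WordPress on Apache 2.4; the IP address is a documentation placeholder that you would replace with your own:

# .htaccess sketch: allow the WordPress login script only from a trusted IP.
# 203.0.113.10 is a placeholder address - replace it with your own IP.
<Files "wp-login.php">
    Require ip 203.0.113.10
</Files>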

As you probably know, search engines use bots to index websites on the Internet, and those visits to your website are also recorded in the Behaviour tab. Sometimes bots generate too many executions, and you need to either block them or decrease their crawl rate. If you notice too many requests from certain bots, you can change their crawl rate; how you set it depends on the particular bot.
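
If the bot honors the robots.txt standard, you can slow it down or block it from your site’s root robots.txt file. Below is a minimal sketch; ExampleBot is a placeholder name, and note that Googlebot ignores the Crawl-delay directive, so its crawl rate has to be managed through Google Search Console instead:

# robots.txt in the site root
# Ask well-behaved crawlers to wait 10 seconds between requests.
User-agent: *
Crawl-delay: 10

# Block one particular bot entirely (ExampleBot is a placeholder name).
User-agent: ExampleBot
Disallow: /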

Using the SuperCacher to handle executions generated by legitimate high traffic

In some cases, the number of executions is high simply because your site has become more popular or because you’ve recently launched a new marketing campaign. If that is the case, a possible solution is to configure your site to use our SuperCacher service. To reduce the number of executions, enable the SuperCacher’s Dynamic Caching feature, which is available on GrowBig and higher hosting plans. At the moment this feature supports WordPress, Joomla! and Drupal websites. Once you enable Dynamic Caching, the SuperCacher generates cached copies of your site’s pages. When a new visitor opens one of those pages, it is served from the cache instead of generating a new execution.
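
The SuperCacher itself is enabled and managed from Site Tools, and its internals are not shown here. Purely as an illustration of why caching reduces executions, a generic full-page cache written in PHP could look like the sketch below; the file names and the five-minute lifetime are arbitrary choices. In SiteGround’s case the cached copy is served by the web server before your application code runs at all, which is why no new execution is generated.

<?php
// cache-sketch.php - generic full-page cache illustration, not the
// SuperCacher implementation. A stored copy of the page is served while
// it is still fresh, so the heavy application code below never runs.
$cacheDir  = __DIR__ . '/cache';
$cacheFile = $cacheDir . '/' . md5($_SERVER['REQUEST_URI']) . '.html';
$ttl       = 300; // keep cached copies for 5 minutes (arbitrary value)

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
    readfile($cacheFile); // cache hit: skip regenerating the page
    exit;
}

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true); // create the cache folder on first use
}

ob_start(); // buffer the output while the page is generated
// ... your CMS or application would build the page here ...
echo '<html><body>Generated at ' . date('H:i:s') . '</body></html>';
file_put_contents($cacheFile, ob_get_contents()); // store a copy for next time
ob_end_flush(); // send the freshly generated page to the visitor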

 
