Netregistry has been the leader in Australian web hosting since 1997. We introduced load balanced, clustered cloud web hosting in 1999 and have continued to innovate ever since. We commissioned Load Impact, the global leader in load/stress testing and performance measurement of websites, to benchmark our web hosting against the competition.
Netregistry web hosting outperformed both local and global competitors. In fact, with our proprietary Dynamic Server Allocation® technology, load times decreased as users were added, because our system dynamically added servers to the cluster in response.
Not all web hosting is the same. If you want your website to stay up and stay fast, Netregistry's Cloud Web Hosting is the only choice.
Load testing report Netregistry 2011
Netregistry is a web hosting provider. They wanted to run simple load tests on 6 different web hosting solutions, including their own. To that end, they purchased standard web hosting accounts with 5 of their competitors and set up one account using their own service. They registered the domain names “loadtesting001.com” through “loadtesting006.com” and asked us (loadimpact.com) to run tests against all six domains/services.
| Web hosting provider |
|---|
| Netregistry |
| Crazydomains |
| GoDaddy |
| Webcentral |
| AussieHQ |
| MelbourneIT |
The load tests were executed by Ragnar Lonn <firstname.lastname@example.org>
2. Load test summary
- The Netregistry site was the one that outperformed the rest by a large margin. It served clients flawlessly with practically no errors up to load levels of hundreds of concurrent users.
- Crazydomains displayed good performance.
- GoDaddy, Webcentral and AussieHQ showed lower than average performance.
- MelbourneIT performed worse than all others, logging errors at very low load levels.
Note: in the graph above, the total number of transactions drops after 35 users because response times worsen with increasing load, lowering total throughput.
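The drop in throughput follows from simple queueing arithmetic: in a closed-loop test with a fixed number of simulated users, each user must wait for a response before issuing its next request, so throughput is roughly the user count divided by (response time + think time). A minimal sketch, using illustrative numbers that are not taken from the report:

```python
def throughput(users, avg_response_s, avg_think_s):
    """Transactions per second for a closed-loop load test (Little's law).

    Each simulated user waits for a response, thinks, then requests again,
    so throughput = users / (response time + think time).
    """
    return users / (avg_response_s + avg_think_s)

# Illustrative numbers, not measurements from the report:
fast = throughput(35, 0.5, 15.0)   # healthy site at 35 users
slow = throughput(35, 10.0, 15.0)  # degraded site, same 35 users
print(round(fast, 2), round(slow, 2))  # 2.26 1.4
```

The same 35 users produce fewer transactions when responses slow down, which is exactly the shape seen in the transaction graph.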
Netregistry set up the 6 web hosting accounts and configured 4 different websites on each account:
- Wordpress installation
- Joomla installation
- Magento installation
- “Static” web site
This made a total of 24 individual sites to test. Due to various circumstances, explained later, we decided to focus on the Wordpress and Joomla sites (12 sites in total).
Some things worth noting about the sites:
- The sites were set up by Netregistry. Load Impact staff was not involved in any installation or configuration of the sites or the CMS applications.
- The CMS installations were “basic” and contained almost no information, which means that the databases were more or less empty at the time the tests were run (a populated database would likely result in worse performance than what was recorded in these tests).
- The load testing was conducted from a load generator node in Seattle, USA. All the tested sites displayed similar network round-trip delays to and from the load generator node, between 200 and 250 ms. The exception was the MelbourneIT site (hosted in the US), which had very low delays of only a few milliseconds. To avoid giving it an unfair advantage, we introduced a 200 ms (100 ms in each direction) artificial delay when testing that particular site.
- The data-generating tests were conducted between 10am and 5pm AEST, as Netregistry wanted tests to be run during normal office hours for their Australian customers.
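The 200 ms handicap applied to the US-hosted site can be expressed as a simple adjustment rule. This is a sketch of the idea only; the report does not describe Load Impact's actual mechanism, and the RTT threshold below is an assumption for illustration:

```python
LOW_RTT_THRESHOLD_MS = 50    # assumption: when a site counts as "too close"
ARTIFICIAL_DELAY_MS = 200    # the handicap used in the report (100 ms each way)

def adjusted_load_time(measured_ms, site_rtt_ms):
    """Add the flat 200 ms handicap for sites with unusually low RTT.

    The report only says the delay was applied to the MelbourneIT site;
    the threshold here is a hypothetical generalization.
    """
    if site_rtt_ms < LOW_RTT_THRESHOLD_MS:
        return measured_ms + ARTIFICIAL_DELAY_MS
    return measured_ms

print(adjusted_load_time(300, 5))    # MelbourneIT-like site: 500
print(adjusted_load_time(300, 220))  # Australian-hosted site: 300
```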
We measured two main metrics during the tests:
Average URL load time: the average time it took the simulated users to load a single URL at a given load level during the test. The actual user-experienced load time of a whole web page is much greater than this value: about 40 times the average URL load time for the Joomla tests, and about 5 times for the Wordpress tests.
Transactions: the number of executed HTTP GET transactions. The transaction graphs show how many HTTP transactions succeeded (OK) and how many failed (Error). Errors should always be zero or close to zero; otherwise the user experience is ruined.
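As a hedged illustration, both metrics could be derived from raw per-request logs roughly as follows. The field layout is an assumption for the sketch, not Load Impact's actual data format, and the sample numbers are invented:

```python
def summarize(requests):
    """Compute the report's two metrics for one load level.

    requests: list of (load_time_ms, ok) tuples, one per HTTP GET.
    Failed requests are counted as errors and excluded from the average.
    """
    ok_times = [t for t, ok in requests if ok]
    errors = sum(1 for _, ok in requests if not ok)
    avg_url_load_ms = sum(ok_times) / len(ok_times) if ok_times else 0.0
    return {"avg_url_load_ms": avg_url_load_ms,
            "transactions_ok": len(ok_times),
            "transactions_error": errors}

# Invented sample: three successful requests and one failure.
sample = [(120, True), (180, True), (150, True), (9000, False)]
print(summarize(sample))
```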
3.3. User scenarios, test 1 (January 1, 2010)
- The Joomla and Wordpress tests consisted of a scenario recording where the user would open the first page on the site, wait 10-15 seconds and then click on a page with some content.
- The wait time after finishing the scenario (and running it again) was randomized between 10 and 20 seconds.
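The two-page scenario and its randomized waits can be sketched in Python. The `fetch` and `sleep` callables are injected here to keep the sketch self-contained and testable; they, and the URL path, are assumptions rather than Load Impact's actual scenario format:

```python
import random

def run_scenario(fetch, sleep, base_url):
    """One pass of the recorded two-page user scenario."""
    fetch(base_url)                        # open the first page on the site
    sleep(random.uniform(10, 15))          # read for 10-15 seconds
    fetch(base_url + "/content-page")      # click on a page with some content

def simulate_user(fetch, sleep, base_url, iterations):
    """Loop the scenario, pausing a randomized 10-20 s between passes."""
    for _ in range(iterations):
        run_scenario(fetch, sleep, base_url)
        sleep(random.uniform(10, 20))
```

In a real test, `fetch` would be an HTTP client call such as `urllib.request.urlopen` and `sleep` would be `time.sleep`, with one such loop per simulated concurrent user.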
| Scenario | Page view time avg. in seconds |
|---|---|
| Wordpress (2 pages) | |
| Joomla (2 pages) | |
4. Load test execution
4.1 Preliminary testing (November 26, 2010)
We started running some initial, low-volume tests around 5 am AEST. We tested the Wordpress installations on all sites with 10-50 simulated users, to make sure there were no problems with the scenarios and to see whether any of the sites would slow down even at those low load levels. Some of the sites were already struggling: GoDaddy and MelbourneIT suffered from high error rates, while Webcentral exhibited rapidly increasing response times. As we were interested in finding the limits of all the sites, we adjusted the configured load levels for each site individually in preparation for the real testing later on.
4.2 Starting the real tests (November 26, 2010)
- The real load testing started at 10 am AEST. We first ran low-volume tests (10-30 simulated users) for some of the Joomla installations. We started with low loads here as the Joomla sites had proved to be heavier than the Wordpress sites. These tests ended at noon AEST.
- The initial results for the Joomla sites indicated we could use more load, so between noon and 5 pm AEST we tested both the Joomla and Wordpress sites using 30-150 simulated users.
- Analysing the results, it became obvious that a couple of the sites could not handle anywhere near the load we had subjected them to, while others had no problem at all. The stellar performer was Netregistry, while Crazydomains came in a distant second and the others were way behind. The bad performers would either show excessive load times (be very slow) or excessive amounts of errors (failed transactions) when they were stressed.
4.3 Continuing the tests (November 30, 2010)
- The testing again started at 10 am AEST and continued until 5pm AEST. We ran a number of Wordpress and Joomla tests. Each test was individually configured to use a load level appropriate for the site in question, to start at a level where load times were quite low and there were few or no errors, and then to ramp up to where errors and/or load times got excessive.
- We also ran a few tests of the Magento sites, at load levels of 10-50 clients. However, the Magento sites proved to be much heavier than the other sites, and several of the tests were so slow that they timed out. In addition, Crazydomains and AussieHQ started to filter out the traffic from our load generator node, so we got no useful Magento results from those two sites.
- Considering we were now prevented from testing Crazydomains and AussieHQ, we decided to focus on the Wordpress and Joomla tests where we already had enough data to draw some conclusions.
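The per-site ramp-up described above (start at a level with low load times and few errors, then increase until errors or load times become excessive) can be sketched as a linear schedule. The parameters are illustrative, not Load Impact's actual test configuration:

```python
def ramp_schedule(start_users, end_users, duration_min, steps):
    """Linear ramp of concurrent users over a test run.

    Returns (minute, users) checkpoints, e.g. 10 -> 150 users over 20
    minutes. A sketch only; not Load Impact's configuration format.
    """
    checkpoints = []
    for i in range(steps + 1):
        minute = duration_min * i / steps
        users = round(start_users + (end_users - start_users) * i / steps)
        checkpoints.append((minute, users))
    return checkpoints

# One site might ramp 10 -> 150 users; a weaker site only 10 -> 50.
print(ramp_schedule(10, 150, 20, 4))
```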
4.4 Final test (December 3, 2010)
- We executed a single final test at 11:30 am, testing the Wordpress installation hosted by Netregistry with up to 300 concurrent users.
- Netregistry was the site that outperformed the rest by a large margin. It served clients flawlessly with almost no errors up to load levels of hundreds (200-300) of concurrent users when running both the Wordpress and Joomla tests.
- Crazydomains displayed good performance, supporting up to 100-150 users but overall with higher response times than Netregistry.
- GoDaddy, Webcentral and AussieHQ showed lower than average performance. GoDaddy started showing errors early, while the response times of Webcentral and AussieHQ were rapidly increasing at fairly low load levels.
- MelbourneIT performed worse than all others, logging many errors at very low load levels.
Appendix 1 — graphs of Wordpress tests
Appendix 2 — graphs of Joomla tests