Performance Analysis Of 4 Xoops Article Modules
THE ANALYSIS ORIGIN
The idea for this analysis really stems from the ideals behind the conception of the AMS module: to bring to Xoops a highly functional article management system, aimed squarely at providing a high level of performance and scalability. During the final stages of AMS development I read several posts on the Xoops forums asking about the best ways to get performance from a Xoops environment. This highlighted to me that there really wasn't a great deal of information available for users who wanted to use anything other than the feature set as their primary criterion for choosing which modules to run on their site.
This gave me the idea of creating a series of tests to run a selection of article modules through on several different servers, ranging from low end to high end, with an ever-increasing amount of content. It also gave me an opportunity to display the strengths of AMS, since going on the sheer number of features available to the end user, WF-Section wins over every other module.
So the stage was set. I had a test script developed (thanks Jan ;-)) that would insert a definable number of topics, and articles into each topic, for the four modules I had chosen for the analysis: News 1.2.1, WF-Section 2.07 Beta, Articles 0.17 and AMS. I also had several PCs lying around of varying specifications that were a good representation of computer hardware over the last five years. Next was to carefully define the environment, as without a level playing field the results would lose their accuracy and usefulness as a tool.
THE ANALYSIS ENVIRONMENT
The Test Script
As mentioned above, the test script was written specifically to insert a user-definable number of topics, and a user-definable number of articles per topic, for all four modules used in the analysis. The script was also configured to first remove all existing content when run, so as to be sure each new test was being run on a clean slate. I decided to run a total of eleven tests, raising the number of topics and articles per topic on each new test so as to gather data on each module's scalability. The details of each of these eleven tests are:
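The seeding logic described above can be sketched as follows. This is a minimal illustration, not the actual script (which was written for the XOOPS/MySQL environment); the table and column names here are hypothetical stand-ins for each module's real schema, and SQLite is used only to keep the example self-contained.

```python
import sqlite3

def seed_module(db, topic_tbl, article_tbl, n_topics, n_articles):
    """Wipe existing content, then insert n_topics topics with
    n_articles articles each -- a clean slate for every test run."""
    cur = db.cursor()
    # Remove all existing content first, as the test script did.
    cur.execute(f"DELETE FROM {topic_tbl}")
    cur.execute(f"DELETE FROM {article_tbl}")
    for t in range(1, n_topics + 1):
        cur.execute(f"INSERT INTO {topic_tbl} (id, title) VALUES (?, ?)",
                    (t, f"Topic {t}"))
        for a in range(n_articles):
            cur.execute(f"INSERT INTO {article_tbl} (topic_id, title, body) "
                        "VALUES (?, ?, ?)",
                        (t, f"Article {a} in topic {t}", "Lorem ipsum ..."))
    db.commit()

if __name__ == "__main__":
    # Hypothetical tables standing in for one module's schema.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE news_topics (id INTEGER, title TEXT)")
    db.execute("CREATE TABLE news_stories "
               "(topic_id INTEGER, title TEXT, body TEXT)")
    seed_module(db, "news_topics", "news_stories",
                n_topics=10, n_articles=50)
    print(db.execute("SELECT COUNT(*) FROM news_stories").fetchone()[0])
```

In the real analysis the same routine would be pointed at each of the four modules' tables in turn, with the topic and article counts raised between test runs.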
The Servers - Hardware
To give the analysis more depth, I decided to run the series of tests across three different machines, which I have named the low end server, the mid level server and the high end server. These names are relative to the hardware the average Xoops user runs their Xoops environment on. Below are the hardware specifications of these servers.
Low End Server:
Mid Level Server:
High End Server:
The Servers - Software
Another very important factor in getting relevant results was ensuring that each server's operating environment was kept identical. The first thing I had to do was carefully define what software was to be used, which I broke down into categories:
With this environment set up on each of the three servers, I then created shortcuts to each module's index, topic, article and admin views from each server on the computer I would run the tests from. This ensured I was accessing each section directly, so as not to incur any additional overhead from the Xoops system itself.
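Measuring each view from a separate test machine can be sketched with a small timing helper like the one below. This is an illustration only: the helper takes any zero-argument callable, so the commented-out URL (a hypothetical direct link to one module's index page, bypassing the XOOPS front page) is just an example of how it might be pointed at a real server.

```python
import time
import urllib.request

def time_view(fetch, repeats=5):
    """Call `fetch` `repeats` times and return the average wall-clock
    time in seconds. Averaging over several runs smooths out one-off
    network or caching effects."""
    elapsed = []
    for _ in range(repeats):
        start = time.perf_counter()
        fetch()
        elapsed.append(time.perf_counter() - start)
    return sum(elapsed) / len(elapsed)

# Hypothetical usage against a test server's AMS index view:
# avg = time_view(lambda: urllib.request.urlopen(
#     "http://testserver/modules/AMS/index.php").read())
# print(f"average load time: {avg:.3f}s")
```

The same helper would be run against the index, topic, article and admin views of each module, on each of the three servers, for each of the eleven content sizes.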