
codeBeamer ALM's Official Scalability Performance Test Report 2019

with up to 1,000 concurrent users, 10 million work items, and 35,000 requirements in a single document

The purpose of this document

In February 2019, we carried out a comprehensive performance testing session of codeBeamer ALM with 1,000 users, 10 million artifacts (work items), and 35,000 requirements in a single document (tracker).

The purpose of this document is to give you an overview of both the process and the environment we used to test the performance of our ALM platform, in order to help you interpret and understand the results. These tests may be repeated with the same configuration at any time, and can reasonably be expected to yield similar results.

Testing showed that codeBeamer ALM worked swiftly and smoothly with 10 million work items (artifacts). If your environment requires the management of an even larger volume of work items, please contact us for more performance data.

For this performance test we used a Fujitsu PRIMERGY TX2550 M4 server with 2 x Intel(R) Xeon(R) Gold 6128 CPUs (12 cores in total) and 128 GB RAM, running Linux (CentOS 7) with an Oracle 12c database. These performance tests do not yet cover Windows-based installations or MySQL databases.

Hardware Architecture

Test server

The server used for testing was a Fujitsu PRIMERGY TX2550 M4 server with 2 x Intel(R) Xeon(R) Gold 6128 CPUs and 128 GB of total RAM. The same piece of hardware was used for all the performance tests. The approximate market price of the test server is $10,000.

VMware ESXi 6.7 was running on the server.

Both virtual machines ran on the same ESXi server. For a production environment, we recommend using two separate servers for the codeBeamer and Oracle instances, with at least a 10 Gbit network between them, to achieve optimal performance.

codeBeamer ALM instance

The virtual server of this codeBeamer ALM instance ran with the following configuration:

Database instance

The virtual server of this Oracle instance ran with the following configuration:

Test topology

Software components and versions

Both codeBeamer ALM and Oracle virtual machines ran on CentOS 7.

codeBeamer ALM was configured to run with OpenJDK version 8 and a maximum heap size of 16 GB (out of the 32 GB of available memory).

codeBeamer ALM    9.3.0-final
Server            Apache Tomcat 8.5.37
Java              OpenJDK 1.8.0_191
Database          Oracle Database 12c Enterprise Edition Release 12.1.0.2.0
JDBC driver       oracle.jdbc.driver.OracleDriver, Implementation-Version: 12.1.0.1.0-ProductionBuild-12
OS                CentOS Linux 7

codeBeamer ALM configuration

Tomcat server configuration

ThreadPool configuration

To serve a higher load, we increased the maximum thread count of Tomcat's shared thread pool in server.xml:

 <Executor name="tomcatThreadPool" namePrefix="catalina-exec-" maxThreads="500" />
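
Note that the Executor element only defines the shared pool; it takes effect once a Connector references it by name. A minimal sketch of such a Connector follows (the port, protocol and timeout values are illustrative defaults, not taken from the test configuration):

 <!-- HTTP connector using the shared thread pool defined above -->
 <Connector executor="tomcatThreadPool" port="8080" protocol="HTTP/1.1" connectionTimeout="20000" />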

Database connection configuration

To match the volume of database connections, we increased maxActive to 150 and maxIdle to 50 in the database connection pool settings in my-applicationContext.xml:

<bean id="sharedPoolConfig" class="org.springframework.beans.factory.config.MapFactoryBean" abstract="true">
    <property name="sourceMap">
        <map>
            <!-- Transaction defaults for connections handed out by the pool -->
            <entry key="defaultAutoCommit" value="true"></entry>
            <entry key="defaultReadOnly" value="false"></entry>
            <entry key="defaultTransactionIsolation" value="READ_COMMITTED"></entry>
            <!-- Pool sizing: up to 150 active connections, at most 50 kept idle -->
            <entry key="maxActive" value="150"></entry>
            <entry key="maxIdle" value="50"></entry>
            <entry key="minIdle" value="5"></entry>
            <entry key="initialSize" value="0"></entry>
            <!-- Validate connections when they are borrowed from the pool -->
            <entry key="testOnBorrow" value="true"></entry>
            <entry key="testOnReturn" value="false"></entry>
            <entry key="testWhileIdle" value="false"></entry>
            <!-- Evict connections idle for over 5 minutes, checking every minute -->
            <entry key="minEvictableIdleTimeMillis" value="300000"></entry>
            <entry key="timeBetweenEvictionRunsMillis" value="60000"></entry>
            <!-- Abandoned-connection tracking disabled -->
            <entry key="removeAbandoned" value="false"></entry>
            <entry key="removeAbandonedTimeout" value="300"></entry>
            <entry key="logAbandoned" value="false"></entry>
        </map>
    </property>
</bean>
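
Because this bean is declared abstract, the actual connection pools inherit its entries. If a single setting needs to differ, Spring's bean definition inheritance with collection merging can be used instead of copying the whole map. The sketch below is a hypothetical example of ours (the bean id reportingPoolConfig does not exist in codeBeamer's configuration):

 <!-- Hypothetical child definition: merge="true" keeps all entries from
      sharedPoolConfig and overrides only the pool sizing -->
 <bean id="reportingPoolConfig" parent="sharedPoolConfig">
     <property name="sourceMap">
         <map merge="true">
             <entry key="maxActive" value="50"></entry>
             <entry key="maxIdle" value="10"></entry>
         </map>
     </property>
 </bean>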

Network

For the network connection between the codeBeamer ALM server and the database, we recommend a low-latency connection (less than 1 ms) and at least 10 Gbit of network bandwidth.

During testing, all connections used HTTP ports.

Testing tools

For test planning and execution we used JMeter version 3.3 with Jenkins integration version 2.107.3 (for automatic execution only).

Test data shape and volume

The work items, users, projects, trackers, comments, references (links), history entries and other elements were evenly distributed among the projects. The following table shows the number of artifacts by type: with roughly 10 million work items (requirements, tasks, bugs, etc.), there were over 30 million comments and over 203 million work item history entries in the database repository.

Artifact type                    Count (10M work item data set)
Work Items                       10,170,913
Named Users                      3,196
Projects                         45
Trackers                         585
Comments                         30,512,425
Work Item References             53,012,983
History entries of Work Items    203,402,425

Test cases, scenarios

We used Apache JMeter to simulate the workload and to measure the performance of codeBeamer ALM. A single test script covered all three use cases; in the script, each HTTP request represented a user action. We ran the test with 100, 250, 500, 750, and 1,000 concurrent users respectively, with all users logging in over a period of 15 minutes. Tests were then run for another 45 minutes. User actions were simulated with a one-minute think time per user.
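
As an illustration, this load profile can be expressed in a JMeter thread group roughly as follows. This is a minimal, trimmed sketch assuming JMeter 3.3's JMX test plan format; it is not the actual test script used in these runs:

 <!-- 1,000 users ramped up over 15 minutes (900 s); total run of 60 minutes -->
 <ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="codeBeamer load">
   <stringProp name="ThreadGroup.num_threads">1000</stringProp>
   <stringProp name="ThreadGroup.ramp_time">900</stringProp>
   <boolProp name="ThreadGroup.scheduler">true</boolProp>
   <stringProp name="ThreadGroup.duration">3600</stringProp>
 </ThreadGroup>
 <!-- One-minute think time between the HTTP requests of each simulated user -->
 <ConstantTimer guiclass="ConstantTimerGui" testclass="ConstantTimer" testname="Think time">
   <stringProp name="ConstantTimer.delay">60000</stringProp>
 </ConstantTimer>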

The following list shows the use cases and the percentage of users running each:

Browse trackers and items (70% of users): Login. Visit projects page. Open a project. Go to trackers page. Open a requirements tracker in document view. Open a tasks tracker. Open a bugs tracker. Open a bug, open the related task, open the related task requirement, open the related system requirement, and open the related customer requirement. Open the members page.

Create work items and comments (20% of users): Login. Visit projects page. Open a project. Go to trackers page. Create a requirement, add an association, and add 3 comments to it. Create a task, add an association, and add 3 comments to it. Create a bug, add an association, and add 3 comments to it.

Search for items (10% of users): Login. Enter text in the search field. Scroll to the second page. Open a search result.

Test performance results with up to 10 million work items and 1,000 concurrent users

Stress test with 1,000 concurrent logins

We used a database of 10 million work items (i.e. requirements, bugs, test cases) to measure the response time of login requests. The login stress test simulated 1,000 concurrent user logins within 300 seconds. The response times are shown below.

Concurrent Users    Login response time (10 million work items in the database)
1,000               240 ms

Average response time

The following table summarizes the overall average response time per user request at different concurrency levels for the tested software/hardware configuration. Tests were carried out with 10 million work items (i.e. requirements, bugs, or test cases) and 100-1,000 concurrent users.

Concurrent Users    Average response time (10 million work items)
100                 133 ms
250                 163 ms
500                 162 ms
750                 216 ms
1,000               233 ms

Response Time Diagram

The diagram shows the average response time versus the number of concurrent users for a repository with 10 million (10M) work items (i.e. requirements).

Detailed results of performance test runs

10M Work Items - 1,000 Concurrent Users

Performance test with 35,000 requirements in a single document

Most requirements documents only have a few hundred or a few thousand requirements. In working with our customers, however, we often encounter over 30,000 requirements in a single document. The test scenario below shows our performance test for such documents.

ReqIF file import performance with 35,463 requirements

We used a ReqIF file (55 MB) with 35,463 requirements for the import performance test. The ReqIF file import took 25 minutes.

ReqIF file import with 35K requirements    25 min

Requirement Edit/View performance with 10,000 and 35,000 requirements in a single document

This performance measurement shows how much time it takes to open a requirements document (one document or module) in codeBeamer's "Document View" mode using a web browser. In the two test scenarios, the requirements documents contained 10,000 and 35,000 requirements respectively. As an example, a modern car's entertainment system software might have 35,000 requirements. We used Google Chrome 59.0.3071.86 in this test.

Open 10,000 requirements in Document View mode from the web browser    ~7 sec
Open 35,000 requirements in Document View mode from the web browser    ~15 sec

Figure: "Document View" mode in web browser