Performance Test

This Wiki page describes how the performance of a codeBeamer instance can be measured.


Executing performance tests requires relevant expertise and experience.

  • Performance testing should not be executed on a production codeBeamer instance, otherwise you might lose your data!
  • Do not run the performance tests against codeBeamer with Derby (the database shipped with codeBeamer); use MySQL or Oracle instead!
  • Please note that email notifications are disabled during performance tests to avoid an email flood caused by the frequent changes. You must restart the test codeBeamer instance after load testing!

Requirements

codeBeamer must be running and accessible via a URL, and codeBeamer's system administrator name and password must also be available.
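
Before starting any test, it is worth verifying that the instance answers over HTTP. The sketch below is only an example; the host, port and context path are assumptions and must match your installation:

# Expect an HTTP 200 (or a redirect to the login page) from the codeBeamer base URL
$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/cb/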

Because requests are sent to codeBeamer from several accounts in parallel, the number of required floating ALM licenses can be calculated as follows:

Project count * (Browse user count + Search user count + Create user count)

With the default values this equals 20 * (7+1+2) = 200.
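
If you change the default test data parameters (see the tables below), the required license count can be re-derived with simple shell arithmetic; this is only a convenience check, and the variable names merely mirror the parameters used by the test scripts:

# License estimate for the default test data parameters (prints 200)
$ PROJECT_COUNT=20; BROWSE_USER_COUNT=7; SEARCH_USER_COUNT=1; CREATE_USER_COUNT=2
$ echo $(( PROJECT_COUNT * (BROWSE_USER_COUNT + SEARCH_USER_COUNT + CREATE_USER_COUNT) ))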

JMeter must be installed

JMeter 5.4 requires Java 8 or higher. JMeter is no longer supported as of codebeamer release 22.10-LTS (GINA); for codebeamer release 22.10-LTS (GINA) and newer, use Gatling to test performance.

Most Linux distributions include JMeter; it can also be downloaded from http://jmeter.apache.org/download_jmeter.cgi.

Running the performance tests on the same host as codeBeamer does not significantly distort the results, and has the advantage that network speed does not influence them.

The JMeter tests currently run with JMeter 5.4.x.
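
If your distribution does not ship a suitable JMeter version, a manual installation can look like the sketch below. The mirror URL and the exact 5.4.x archive name are examples; check the download page above for the current binaries:

# Download and unpack a JMeter 5.4.x binary release (URL and version are examples)
$ wget https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-5.4.3.tgz
$ tar -xzf apache-jmeter-5.4.3.tgz
# Make the jmeter command available on the PATH for the test scripts
$ export PATH="$PWD/apache-jmeter-5.4.3/bin:$PATH"
$ jmeter --version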

Performance Monitoring

A Moskito server can be set up for performance monitoring as well. Details about the installation procedure and the security settings of the Moskito server are described on a separate page (the link is available to logged-in users only).

Download Test scripts

The scripts are shipped with codeBeamer and are available under /install_dir/tomcat/webapps/cb/performance-test.zip.
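
For example, the archive can be unpacked into a separate working directory (the paths are examples and depend on your installation directory):

# Extract the shipped test scripts into a working directory
$ mkdir -p ~/cb-performance-test
$ unzip /install_dir/tomcat/webapps/cb/performance-test.zip -d ~/cb-performance-test
$ cd ~/cb-performance-test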

Performance Test Data generation

Default parameters for Test Data

The table below shows configurable parameters and their default values:


Variable             Default Value   Description
PROJECT_COUNT        20              Number of projects
BROWSE_USER_COUNT    7               Number of browse users per project
SEARCH_USER_COUNT    1               Number of search users per project
CREATE_USER_COUNT    2               Number of item creator users per project


To modify the variables, start JMeter with the command below:

$ jmeter -t cb-performance-test-setup.jmx

Now click on Create codeBeamer Performance Test Environment. To change a value, double-click the variable in the Value column.

The File->Save menu item can be used to save the modifications.

Run data generation

Always ensure that the tests run against a freshly installed codeBeamer instance where the installation (including the post-install step) has just been completed. The tests cannot be repeated against the same instance; a new codeBeamer instance is required for every run. The test scripts require that jmeter is on your PATH.

The initial accounts, projects, and some work items are created with the performance-test-setup script:

./performance-test-setup protocol server port sysadmin_name sysadmin_password context_path

example:

$ ./performance-test-setup http localhost 8080 bond 007 /cb/
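
The setup run can take a while; if you want to keep the console output for later inspection, the call can be wrapped as in the sketch below (the log file name is arbitrary):

# Run the setup and capture the console output to a log file
$ ./performance-test-setup http localhost 8080 bond 007 /cb/ 2>&1 | tee performance-test-setup.log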

Please refer to Check Test Results below to analyse the results.

Executing Performance Tests

Default parameters for Performance Test

The table below shows configurable parameters and their default values:


Variable               Default Value   Description
BROWSE_ITEMS_THREADS   70              Number of browse users (PROJECT_COUNT * BROWSE_USER_COUNT)
CREATE_ITEMS_THREADS   20              Number of create users (PROJECT_COUNT * CREATE_USER_COUNT)
SEARCH_THREADS         10              Number of search users (PROJECT_COUNT * SEARCH_USER_COUNT)


To modify the variables, start JMeter with the command below:

$ jmeter -t cb-performance-test.jmx

Now click on Run codeBeamer Performance Test. To change a value, double-click the variable in the Value column.

The File->Save menu item can be used to save the modifications.

Run performance test

Always ensure that the tests run against a freshly installed codeBeamer instance where the installation (including the post-install step) has just been completed. The tests cannot be repeated against the same instance; a new codeBeamer instance is required for every run.

The performance tests can be executed with the performance-test script:

./performance-test protocol server port context_path

example:

$ ./performance-test http localhost 8080 /cb/

Please refer to Check Test Results below to analyse the results.

Check Test Results

The performance test results are stored in files with the .jtl extension. The command below can be used to view the performance test result data:

$ jmeter -t cb-results.jmx

Default result files are:

  • Setup script: cb-performance-test-setup-DATE.jtl
  • Test script: cb-performance-test-DATE.jtl

To load the results:

  • Select Summary Report.
  • Load the result file with the Browse button (to the right of the Filename entry box).

To export the results to CSV, click the Save Table Data button.
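
As an alternative to the Summary Report listener, JMeter can also render a .jtl file into an HTML dashboard from the command line (the dashboard generator expects CSV-formatted results); the result file name below is just the default naming pattern mentioned above:

# Generate an HTML dashboard report from an existing .jtl result file
$ jmeter -g cb-performance-test-DATE.jtl -o report-output
# Open report-output/index.html in a browser to inspect the report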

Intland's Performance Test Result

Test hardware

  • Hardware: AMD EPYC 7402P 24-Core Processor and 16 GB Memory
  • codeBeamer and PostgreSQL were running in two separate Docker containers (a sketch of a comparable setup follows this list)
  • Running on CentOS 8
  • Database: PostgreSQL 12.6.3
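
For reference only, a comparable two-container split can be sketched with plain docker run commands. The image names, tags and environment variables below are assumptions (they are not the exact configuration used for these measurements), and codeBeamer's own database connection settings are omitted; the sketch only mirrors the resource split of 8 CPUs and 8 GB of memory for codeBeamer:

# Hypothetical two-container layout similar to the setup described above
$ docker network create cb-net
# PostgreSQL container (official image; credentials are placeholders)
$ docker run -d --name cb-postgres --network cb-net \
    -e POSTGRES_DB=codebeamer -e POSTGRES_USER=cbroot -e POSTGRES_PASSWORD=secret \
    postgres:12.6
# codeBeamer container limited to 8 CPUs and 8 GB of memory (image name and tag are assumptions)
$ docker run -d --name codebeamer --network cb-net --cpus 8 --memory 8g \
    -p 8080:8080 intland/codebeamer:21.09-lts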

codeBeamer 21.09-LTS Results

PostgreSQL (8 GB codeBeamer, 8 CPU)

URI    Samples    Change    Average (ms)    Change (ms)
Add Attachment 5400 +0 69 +4
Create Item 1800 +0 182 +4
Document Edit View - Document Edit View Page 2450 +392 99 -18
Document Edit View - Inital Document View Center Panel And Tree 2450 +392 617 -82
Document View - Get Inital Document View Center Panel And Tree 2450 +388 316 -45
Document View - Visit Requirements Tracker Page 2450 +388 111 0
Get Documents Tree 2450 +384 22 -1
Get Requirements Branches 2450 +415 59 -5
Get Trackers Dashboard First Time 20 +0 17 -3
Get Trackers Tree 20 +0 9 -7
Get Wiki Tree 2450 +376 28 -27
Intelligent Document View - Show Document View 2450 +401 446 -39
Kanban Board - Show Cardboard 2450 +403 215 -30
Kanban Board - Visit Kanban Board Page 2450 +402 74 -9
Login 100 +0 100 +12
Open a Baseline 2450 +441 79 -48
Project Trackers Page - Get Trackers Dashboard First Time 2450 +384 32 -21
Project Trackers Page - Get Trackers Tree 2450 +388 28 -9
Project Trackers Page - Visit Trackers Page 2450 +384 68 -29
Search 2000 +0 224 +2
Set Load Tes Mode 100 +0 21 +12
Show Bugs Tracker Items 2450 +425 158 -66
Show Tasks Tracker Items 2450 +419 164 -31
Test Coverage - Get Coverage Tree First Node 2450 +406 65 -23
Test Coverage - Visit Coverage Page 2450 +406 119 -13
Traceability Browser - Get Trackers 2450 +409 54 -28
Traceability Browser - Visit Traceability Browser Page 2450 +409 52 -20
Visit Baselines Page 2450 +441 61 -46
Visit Bugs Tracker Page 2450 +425 101 -15
Visit Documents Page 2450 +378 38 -17
Visit Item Create Page 1800 +0 111 +14
Visit Login Page 100 +0 29 -1
Visit Members Page 2450 +446 42 -4
Visit Project Page 90 +0 68 +10
Visit Projects Page 90 +0 38 -7
Visit SCM Repositories Page 2450 +441 36 -4
Visit Tasks Tracker Page 2450 +419 103 -13
Visit Tracker Page 1800 +0 104 0
Visit Trackers Page 20 +0 103 +46
Visit Wiki Page 2450 +376 66 -50
Visit a Branch 2450 +416 118 -16
Visit a Bug 2450 +425 141 -99
Visit a Requirement 2450 +416 140 -37
Visit a Task 2450 +419 159 -84
All URIs 89290 +12614 122 -24