Integrate JMeter with ALM Octane

embrace_titelbild.png

Recently I have been on several engagements where JMeter load/performance testing was part of a continuous integration (CI) pipeline, and on others where it was not part of any CI pipeline. In both cases, we successfully connected ALM Octane with JMeter (with and without a CI pipeline). In the following, I will explain two different approaches to integrating JMeter with ALM Octane.

The easiest way to integrate a testing tool into ALM Octane is through a continuous integration (CI) integration: if the testing tool can talk to the CI server (such as Jenkins) through a framework (such as JUnit), ALM Octane will very likely understand this communication; in some cases the XML produced by the testing tool needs to be transformed into the correct format first. The first approach is therefore to integrate the testing tool with the CI server. From the CI server, the test results are pushed into the ALM Octane workspaces.

If the organization does not use CI pipelines to deliver performance testing, the results can still be pushed to ALM Octane from a valid JUnit XML file using a test result collection tool.

Getting Started!

Before getting started, make sure to download & configure the following:

Understand and decide how JMeter tests will be represented in ALM Octane.

There are different options for how JMeter tests can be represented in ALM Octane. You can choose to have tests represented by the JMeter samplers, which means a low number of automated tests in ALM Octane with a higher number of test runs each (depending on the thread groups). Alternatively, each JMeter thread can represent an automated test, in which case you end up with a high number of tests with a low number of runs each. Tests could also be represented by JMeter test fragments. So there are several options; in this article, however, we will focus on the following two representation options in ALM Octane.

Why it makes sense to push performance test results into ALM Octane

ALM Octane is an Application Lifecycle Management platform which acts as a DevOps design center covering all relevant phases across the lifecycle end to end. Performance and load tests are part of the continuous testing strategy in DevOps. Establishing a delivery pipeline which executes all relevant tests (unit, system, integration, functional, performance, security) is the main goal of organizations transforming to enterprise DevOps. ALM Octane has been built to connect all phases together and deliver applications with high quality and pace.

Performance tests (whether LoadRunner [Professional, Enterprise or Cloud], Gatling, JMeter, etc.) need to be pushed to increase visibility of coverage and avoid broken traceability. Once performance tests are pushed into ALM Octane, they are represented as automated tests. These performance tests can then be assigned to user stories, requirements and/or defects.

embrace_relations

This also makes it possible to link requirements, user stories and/or defects that come into ALM Octane from various other tools, such as Atlassian Jira, Microsoft Azure DevOps or ServiceNow, in case you have a very heterogeneous tool chain. To establish this integration, use Micro Focus Connect Core (https://marketplace.microfocus.com/appdelivery/content/micro-focus-connect-core).

Once everything is connected, enjoy the ALM Octane Dashboard.

embrace_dashboard

Option 1: JMeter Sampler representation as ALM Octane Automated Tests

Each JMeter sampler is pushed as one single automated test. As a result, you have a smaller number of tests, which is the representation that most closely matches typical JMeter usage.

JmeterInOctaneBySampler

Therefore, all thread runs are represented in the Previous Runs tab of the automated run in ALM Octane.

JmeterInOctaneBySampler_results

Option 2: JMeter Thread-Runs representation as ALM Octane Automated Tests

In this scenario, each thread run per sampler is pushed into ALM Octane as its own automated test. As a result, you have a higher number of tests with a lower number of runs each.

threadastestoctane

Integrate JMeter into ALM Octane through a Continuous Integration Pipeline

In this scenario, we will use Jenkins. You need to configure the following:

  • Make sure JMeter is installed on the Jenkins node which will execute the JMeter tests.
  • Install the Micro Focus Automation Tools Plugin for Jenkins (link is above in the Getting Started section).
  • Configure an API client ID and client secret in ALM Octane so it can communicate with Jenkins (how to set up API access: https://admhelp.microfocus.com/octane/en/latest/Online/Content/AdminGuide/how_setup_APIaccess.htm).
  • Python must be installed on the Jenkins node (link is above in the Getting Started section).

 

Configure JMeter to save the result-tree as XML file

In the View Results Tree listener properties, enter a file path to write the results to an XML file, as shown below:

embrace_resultasxml

Make sure to click Configure and select saving the file in XML format.

embrace_saveasxml
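Alternatively, the XML output format can be enforced globally via JMeter's user.properties file, so that non-GUI runs always produce XML results. A minimal sketch (property names taken from the JMeter documentation; verify the defaults against your JMeter version):

```properties
# Write results as XML instead of the default CSV
jmeter.save.saveservice.output_format=xml
# Record all assertion results, so failures reach the XSL transformation
jmeter.save.saveservice.assertion_results=all
# Record active thread counts (na/ng attributes used by the XSL below)
jmeter.save.saveservice.thread_counts=true
# Keep response data for failed samples, for the <error> node
jmeter.save.saveservice.response_data.on_error=true
```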

Save your JMeter Test.

Once you have obtained the API access (client ID and client secret) and the connection to Jenkins is established, create a freestyle job in Jenkins and assign it to a pipeline in ALM Octane. In an older post (https://www.linkedin.com/pulse/automating-gherkin-based-tests-using-alm-octane-cucumber-amir-khan/), you can see how to configure a Jenkins job when working with ALM Octane.

Create a Jenkins Freestyle Job

Create a freestyle project job in Jenkins, which will execute JMeter tests.

jenkins_freestyle project

Create a Pipeline in ALM Octane

Once you have obtained the API access (client ID and client secret) and the connection to Jenkins is established, create a pipeline in ALM Octane.

You can create a pipeline directly from ALM Octane:

octanepipelinemodule

Or from the Jenkins job which will run the JMeter performance tests:

jenkins_almoctane

As a next step, we need to configure the Jenkins job to run the JMeter performance tests.

Configure Jenkins Job Build Steps to run JMeter Tests

This is probably the easiest step in the integration scenario: JMeter can be executed from the command line using a Windows batch command as a build step in Jenkins.

Commandline:

C:\jmeter\bin\jmeter.bat -Jjmeter.save.saveservice.output_format=xml -n -t <path_to_jmeter_test.jmx> -l <result_file>

Example:

buildstep1

As the XML result file JMeter generates does not match the JUnit XML format, we need to transform the XML generated by JMeter into a well-formed JUnit XML file.
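To see what the transformation has to work with: a simplified, illustrative JMeter XML result tree (the sample values here are made up) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<testResults version="1.2">
  <httpSample t="245" ts="1577836800000" s="true" lb="Login"
              rc="200" rm="OK" tn="Users 1-1" na="10" ng="10"/>
  <httpSample t="1034" ts="1577836801000" s="false" lb="Checkout"
              rc="500" rm="Internal Server Error" tn="Users 1-2" na="10" ng="10"/>
</testResults>
```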

First of all, I used and modified an XSL file that converts JMeter XML to JUnit XML from the following community post: https://gist.github.com/beradrian/9933070a26d7c72ce67ee26242ed5a2b

The XSL I use to upload JMeter tests by sampler (option 1 above) is the following:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:output method="xml" indent="yes" encoding="UTF-8"/>

<xsl:template name="millisecs-to-ISO">
<xsl:param name="millisecs"/>

<xsl:param name="JDN" select="floor($millisecs div 86400000) + 2440588"/>
<xsl:param name="mSec" select="$millisecs mod 86400000"/>

<xsl:param name="f" select="$JDN + 1401 + floor((floor((4 * $JDN + 274277) div 146097) * 3) div 4) - 38"/>
<xsl:param name="e" select="4*$f + 3"/>
<xsl:param name="g" select="floor(($e mod 1461) div 4)"/>
<xsl:param name="h" select="5*$g + 2"/>

<xsl:param name="d" select="floor(($h mod 153) div 5) + 1"/>
<xsl:param name="m" select="(floor($h div 153) + 2) mod 12 + 1"/>
<xsl:param name="y" select="floor($e div 1461) - 4716 + floor((14 - $m) div 12)"/>

<xsl:param name="H" select="floor($mSec div 3600000)"/>
<xsl:param name="M" select="floor($mSec mod 3600000 div 60000)"/>
<xsl:param name="S" select="$mSec mod 60000 div 1000"/>

<xsl:value-of select="concat($y, format-number($m, '-00'), format-number($d, '-00'))" />
<xsl:value-of select="concat(format-number($H, 'T00'), format-number($M, ':00'), format-number($S, ':00'))" />
</xsl:template>

<!--
https://jmeter.apache.org/usermanual/listeners.html#attributes

JMeter attribute meanings (if enabled in JMeter):

by          Bytes
sby         Sent Bytes
de          Data encoding
dt          Data type
ec          Error count (0 or 1, unless multiple samples are aggregated)
hn          Hostname where the sample was generated
it          Idle Time = time not spent sampling (milliseconds) (generally 0)
lb          Label
lt          Latency = time to initial response (milliseconds) - not all samplers support this
ct          Connect Time = time to establish the connection (milliseconds) - not all samplers support this
na          Number of active threads for all thread groups
ng          Number of active threads in this group
rc          Response Code (e.g. 200)
rm          Response Message (e.g. OK)
s           Success flag (true/false)
sc          Sample count (1, unless multiple samples are aggregated)
t           Elapsed time (milliseconds)
tn          Thread Name
ts          timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
varname     Value of the named variable
-->

<xsl:template match="/testResults">
<testsuites>
<testsuite>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="id">1</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="name">web.protocol.http</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="package">http.protocol.listener</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="hostname">Jmeter-Executor</xsl:attribute>
<!-- required for JUnit xsd -->
<xsl:attribute name="timestamp">
<xsl:call-template name="millisecs-to-ISO">
<!-- get the timestamp from the first test result and convert it from epoch to ISO 8601 -->
<xsl:with-param name="millisecs" select="*[1]/@ts" />
</xsl:call-template>
</xsl:attribute>
<!-- required for JUnit xsd - count of test results -->
<xsl:attribute name="tests"><xsl:value-of select="count(*)"/></xsl:attribute>
<!-- required for JUnit xsd - count of test failures -->
<xsl:attribute name="failures"><xsl:value-of select="count(*[./assertionResult/failure[text() = 'true']])"/></xsl:attribute>
<!-- required for JUnit xsd - count of test errors -->
<xsl:attribute name="errors"><xsl:value-of select="count(*[./assertionResult/error[text() = 'true']])"/></xsl:attribute>
<!-- required for JUnit xsd - time taken (in seconds) to execute all the tests -->
<xsl:attribute name="time"><xsl:value-of select="sum(*/@t) div 1000"/></xsl:attribute>
<properties></properties>
<xsl:for-each select="*">
<testcase>
<xsl:attribute name="classname"><xsl:value-of select="concat(name(), '.', substring-before(concat(@tn,' '),' '))"/></xsl:attribute>
<xsl:attribute name="component"><xsl:value-of select="@lb"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="@lb"/></xsl:attribute>
<xsl:attribute name="id"><xsl:value-of select="concat(concat(@lb, '.', @tn), '.', @ng)"/></xsl:attribute>
<xsl:attribute name="package"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
<xsl:attribute name="time"><xsl:value-of select="@t div 1000"/></xsl:attribute>
<xsl:attribute name="system-out"><xsl:value-of select="concat('Name: ', @lb, ', Thread: ', @tn, ', Number of active threads: ', @na, ', Number of active thread groups: ', @ng, ', Return Code: ', @rc, ', Return Message: ', @rm)"/></xsl:attribute>
<xsl:attribute name="timestamp">
<xsl:call-template name="millisecs-to-ISO">
<!-- convert this sample's timestamp from epoch to ISO 8601 -->
<xsl:with-param name="millisecs" select="@ts" />
</xsl:call-template>
</xsl:attribute>
<xsl:attribute name="status"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
<xsl:if test="assertionResult/failureMessage">
<failure>
<!-- show only the first failure message (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="message"><xsl:value-of select="assertionResult[./failure = 'true']/failureMessage"/></xsl:attribute>
<!-- show only the first failure type (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="type"><xsl:value-of select="assertionResult[./failure = 'true']/name"/></xsl:attribute>
</failure>
</xsl:if>
<xsl:if test="@s = 'false'">
<xsl:if test="responseData">
<error><xsl:value-of select="responseData"/></error>
</xsl:if>
<failure>
<!-- show only the first failure message (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="message"><xsl:value-of select="concat('Response Code: ', @rc, ', Response Message: ', @rm)"/></xsl:attribute>
<!-- show only the first failure type (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="type"><xsl:value-of select="@rc"/></xsl:attribute>
</failure>
</xsl:if>
</testcase>
</xsl:for-each>
<!-- required for JUnit xsd -->
<system-out></system-out>
<!-- required for JUnit xsd -->
<system-err></system-err>
</testsuite>
</testsuites>
</xsl:template>
</xsl:stylesheet>
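The millisecs-to-ISO template above converts the epoch-milliseconds @ts attribute to an ISO 8601 timestamp using pure XPath 1.0 arithmetic (via the Julian Day Number, since XSLT 1.0 has no date functions). If you want to sanity-check its output, the same conversion can be sketched in a few lines of Python (assuming UTC timestamps):

```python
from datetime import datetime, timezone

def millisecs_to_iso(millisecs):
    """Convert epoch milliseconds (JMeter's @ts attribute) to an
    ISO 8601 UTC timestamp, mirroring the XSL template's result."""
    dt = datetime.fromtimestamp(millisecs / 1000, tz=timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%S")

print(millisecs_to_iso(0))  # 1970-01-01T00:00:00
```

Comparing a few values from your result tree against this function is a quick way to verify the XSL arithmetic survived any copy-and-paste mangling.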

The XSL I use to upload JMeter tests by thread name (option 2 above) is the following:

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:output method="xml" indent="yes" encoding="UTF-8"/>

<xsl:template name="millisecs-to-ISO">
<xsl:param name="millisecs"/>

<xsl:param name="JDN" select="floor($millisecs div 86400000) + 2440588"/>
<xsl:param name="mSec" select="$millisecs mod 86400000"/>

<xsl:param name="f" select="$JDN + 1401 + floor((floor((4 * $JDN + 274277) div 146097) * 3) div 4) - 38"/>
<xsl:param name="e" select="4*$f + 3"/>
<xsl:param name="g" select="floor(($e mod 1461) div 4)"/>
<xsl:param name="h" select="5*$g + 2"/>

<xsl:param name="d" select="floor(($h mod 153) div 5) + 1"/>
<xsl:param name="m" select="(floor($h div 153) + 2) mod 12 + 1"/>
<xsl:param name="y" select="floor($e div 1461) - 4716 + floor((14 - $m) div 12)"/>

<xsl:param name="H" select="floor($mSec div 3600000)"/>
<xsl:param name="M" select="floor($mSec mod 3600000 div 60000)"/>
<xsl:param name="S" select="$mSec mod 60000 div 1000"/>

<xsl:value-of select="concat($y, format-number($m, '-00'), format-number($d, '-00'))" />
<xsl:value-of select="concat(format-number($H, 'T00'), format-number($M, ':00'), format-number($S, ':00'))" />
</xsl:template>

<!--
https://jmeter.apache.org/usermanual/listeners.html#attributes

JMeter attribute meanings (if enabled in JMeter):

by          Bytes
sby         Sent Bytes
de          Data encoding
dt          Data type
ec          Error count (0 or 1, unless multiple samples are aggregated)
hn          Hostname where the sample was generated
it          Idle Time = time not spent sampling (milliseconds) (generally 0)
lb          Label
lt          Latency = time to initial response (milliseconds) - not all samplers support this
ct          Connect Time = time to establish the connection (milliseconds) - not all samplers support this
na          Number of active threads for all thread groups
ng          Number of active threads in this group
rc          Response Code (e.g. 200)
rm          Response Message (e.g. OK)
s           Success flag (true/false)
sc          Sample count (1, unless multiple samples are aggregated)
t           Elapsed time (milliseconds)
tn          Thread Name
ts          timeStamp (milliseconds since midnight Jan 1, 1970 UTC)
varname     Value of the named variable
-->

<xsl:template match="/testResults">
<testsuites>
<testsuite>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="id">1</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="name">web.protocol.http</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="package">http.protocol.listener</xsl:attribute>
<!-- required for JUnit xsd - not available in the JMeter result -->
<xsl:attribute name="hostname">Jmeter-Executor</xsl:attribute>
<!-- required for JUnit xsd -->
<xsl:attribute name="timestamp">
<xsl:call-template name="millisecs-to-ISO">
<!-- get the timestamp from the first test result and convert it from epoch to ISO 8601 -->
<xsl:with-param name="millisecs" select="*[1]/@ts" />
</xsl:call-template>
</xsl:attribute>
<!-- required for JUnit xsd - count of test results -->
<xsl:attribute name="tests"><xsl:value-of select="count(*)"/></xsl:attribute>
<!-- required for JUnit xsd - count of test failures -->
<xsl:attribute name="failures"><xsl:value-of select="count(*[./assertionResult/failure[text() = 'true']])"/></xsl:attribute>
<!-- required for JUnit xsd - count of test errors -->
<xsl:attribute name="errors"><xsl:value-of select="count(*[./assertionResult/error[text() = 'true']])"/></xsl:attribute>
<!-- required for JUnit xsd - time taken (in seconds) to execute all the tests -->
<xsl:attribute name="time"><xsl:value-of select="sum(*/@t) div 1000"/></xsl:attribute>
<properties></properties>
<xsl:for-each select="*">
<testcase>
<xsl:attribute name="classname"><xsl:value-of select="concat(name(), '.', @lb)"/></xsl:attribute>
<xsl:attribute name="component"><xsl:value-of select="@lb"/></xsl:attribute>
<xsl:attribute name="name"><xsl:value-of select="concat(@lb, '_', @tn)"/></xsl:attribute>
<xsl:attribute name="id"><xsl:value-of select="concat(concat(@lb, '.', @tn), '.', @ng)"/></xsl:attribute>
<xsl:attribute name="package"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
<xsl:attribute name="time"><xsl:value-of select="@t div 1000"/></xsl:attribute>
<xsl:attribute name="system-out"><xsl:value-of select="concat('Name: ', @lb, ', Thread: ', @tn, ', Number of active threads: ', @na, ', Number of active thread groups: ', @ng, ', Return Code: ', @rc, ', Return Message: ', @rm)"/></xsl:attribute>
<xsl:attribute name="timestamp">
<xsl:call-template name="millisecs-to-ISO">
<!-- convert this sample's timestamp from epoch to ISO 8601 -->
<xsl:with-param name="millisecs" select="@ts" />
</xsl:call-template>
</xsl:attribute>
<xsl:attribute name="status"><xsl:value-of select="concat(@rm, '.HTTP/', @rc)"/></xsl:attribute>
<xsl:if test="assertionResult/failureMessage">
<failure>
<!-- show only the first failure message (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="message"><xsl:value-of select="assertionResult[./failure = 'true']/failureMessage"/></xsl:attribute>
<!-- show only the first failure type (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="type"><xsl:value-of select="assertionResult[./failure = 'true']/name"/></xsl:attribute>
</failure>
</xsl:if>
<xsl:if test="@s = 'false'">
<xsl:if test="responseData">
<error><xsl:value-of select="responseData"/></error>
</xsl:if>
<failure>
<!-- show only the first failure message (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="message"><xsl:value-of select="concat('Response Code: ', @rc, ', Response Message: ', @rm)"/></xsl:attribute>
<!-- show only the first failure type (if multiple) as the JUnit schema only supports one failure node -->
<xsl:attribute name="type"><xsl:value-of select="@rc"/></xsl:attribute>
</failure>
</xsl:if>
</testcase>
</xsl:for-each>
<!-- required for JUnit xsd -->
<system-out></system-out>
<!-- required for JUnit xsd -->
<system-err></system-err>
</testsuite>
</testsuites>
</xsl:template>
</xsl:stylesheet>

Based on the representation you prefer in ALM Octane, choose (and if needed adapt) the corresponding XSL file and use it to transform the JMeter XML into the desired JUnit XML file.

To do this, save the following Python code into a script file, for example 'convert_jmeter_to_junit.py':

import sys
import lxml.etree as ET

# Arguments: <jmeter_result.xml> <stylesheet.xsl> <junit_output.xml>
xml_jmeter_in = sys.argv[1]
xsl_convert = sys.argv[2]
xml_junit_out = sys.argv[3]

# Parse the JMeter result tree and the XSL stylesheet
jmeter_xml = ET.parse(xml_jmeter_in)
xslt_converter = ET.parse(xsl_convert)

# Apply the XSL transformation
transform_xml = ET.XSLT(xslt_converter)
junit_xml = transform_xml(jmeter_xml)

# Write the well-formed JUnit XML to the output file
with open(xml_junit_out, "w") as f:
    f.write(str(junit_xml))

This Python script is run with the following command line:

python.exe <path_to_python_script> <path_to_jmeter_result-tree.xml> <path_to_xslt_file> <path_to_transformed_jUnit.xml>

Example:

cmdline

As the next step, we need to integrate this transformation of the JMeter XML into well-formed JUnit XML into our Jenkins job configuration.

After the JMeter test execution batch command, we need to add another Windows batch command as a build step in our job.

runpython

Using this batch command, Jenkins will run the Python script we created in the previous step.

runpython (continued)

The Python script generates a JUnit XML file as output, which we need to move or copy into the Jenkins workspace. To do this, another Windows batch command is required, as follows:

copy <generated_junit_xml> <target_path_to_copy_the_junit_xml>

Example:

copyjunit

Configure Jenkins Job Post Build Steps to Analyze Results

First, to have performance trend reporting embedded in the Jenkins build, you can download and install the Performance plugin to analyse JMeter results:

https://wiki.jenkins.io/display/JENKINS/Performance+Plugin

Now, let's take a look at the post-build steps configuration of the Jenkins job.

As the first post-build step, add "Publish JUnit test result report":

publish junit

This step allows the Micro Focus Automation Tools plugin to read the published JUnit results in Jenkins and push them to ALM Octane.

To have trend reporting attached to the job run, also configure the Performance plugin as a post-build step.

performance trend report

Save the Job configuration.

Execute Jenkins Job to run JMeter Tests

This can be performed directly from the ALM Octane Pipelines module,

octanepipeline

or in Jenkins via Build Now.

jenkinbuildnow

In ALM Octane it shows that the pipeline is being executed…

octanerunning pipe

View the Results after Pipeline Run

Once the Jenkins job has finished, it pushes all test results into ALM Octane.

afterpipelinerun

The Overview tab shows a summary of the current build run. In addition, you can use previous build runs to see the complete history of this pipeline.

ALM Octane automatically classifies PROBLEMATIC TESTS by type, such as continuously failing, regression, unstable, etc. This helps to speed up failure analysis.

Using the Dashboard module of ALM Octane, you can configure your performance test reports on progress, execution and coverage.

dashboard

In case you want to drill down into a specific test run, ALM Octane saves a direct link to the Jenkins build the automated run is linked to.

automatedrundetails

From there you are redirected to the Jenkins job run with all the performance trend reporting.

additionaldetailsjenkins1

additionaldetailsjenkins2

You can map all pushed JMeter tests to ALM Octane application modules. This will allow you to understand in which business areas of your application under test performance needs more attention.

hotspot

Integrate JMeter Tests into ALM Octane without any continuous integration server

Now, if you don't have a continuous integration server, this section explains how you can still integrate and push JMeter tests into ALM Octane. For integrating JMeter without a CI server, you will require the following:

  • Python must be installed on the machine from where you want to push JMeter results to ALM Octane (link is above in the Getting Started section).
  • Download the test result collection tool for ALM Octane from GitHub (link is above in the Getting Started section).

Configure the Test Results Collection Tool

Once you have downloaded the test result collection tool from GitHub, you need to create a config.properties file in the same folder from which test-result-collection-tool.jar will run.

Copy the following configuration into a new text file, adjust it to your environment, and save it as config.properties.

# Server URL with protocol and port
server = http://octane-server:8080
# Server sharedspace ID
sharedspace = 1001
# Server workspace ID
workspace = 1002
# Server username
user = amir@microfocus.com
# Proxy host address
proxyhost = proxy.microfocus.com
# Proxy port number
proxyport = 8080
# Proxy username
proxyuser = test

Run JMeter tests to generate results for ALM Octane

Run your JMeter performance tests from the command line or directly in JMeter.

JmeterRuntest

Once the test has completed, transform the JMeter XML into a well-formed JUnit XML file.

We will use the same approach as in the first scenario: run the convert_jmeter_to_junit.py script with Python, passing the JMeter result-tree.xml and the XSL file from the first scenario (jmeter-junit-tests-by-sampler.xsl or jmeter-junit-tests-by-sampler-threadname.xsl), to generate the JUnit XML file.

Simply use the same command line:

cmdline

By now, you should have the generated JUnit XML file ready to be pushed to ALM Octane.
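Before pushing, it can be worth sanity-checking the generated file, for example confirming that the test and failure counts match what JMeter reported. A minimal sketch (the summarize_junit helper is illustrative, not part of the collection tool):

```python
import xml.etree.ElementTree as ET

def summarize_junit(junit_xml):
    """Return overall test/failure counts from a JUnit XML document,
    read from the testsuite attributes the XSL transformation emits."""
    root = ET.fromstring(junit_xml)
    total = failures = 0
    for suite in root.iter("testsuite"):
        total += int(suite.get("tests", 0))
        failures += int(suite.get("failures", 0))
    return {"tests": total, "failures": failures}

sample = """<testsuites>
  <testsuite id="1" name="web.protocol.http" tests="3" failures="1">
    <testcase name="Login" time="0.250"/>
  </testsuite>
</testsuites>"""
print(summarize_junit(sample))  # {'tests': 3, 'failures': 1}
```

In practice you would call summarize_junit on the contents of junit-transformed-by-sampler.xml and compare the numbers against the JMeter summary before running the collection tool.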

Push Junit xml with the Test Results Collection Tool to ALM Octane

To execute the test result collection tool, make sure the prepared config.properties file is ready. Open a command line and enter the following command:

java -jar test-result-collection-tool.jar -t "Performance:JMeter" junit-transformed-by-sampler.xml

This will push the junit-transformed-by-sampler.xml file to ALM Octane.

View Results in ALM Octane

This is a bit different from the first scenario, where we used the Jenkins pipeline to publish and push the JMeter results to ALM Octane. With this approach, you will not have any linkage to ALM Octane's PIPELINE module; all tests are reported directly to the QUALITY module of ALM Octane.

automatedtestsqualitymodule

In this view, you can group all test cases by their properties, for example by classname.

groupby

Conclusion

Track all phases of the agile testing pyramid within ALM Octane and reduce the cost of translation between tools.

ALM Octane is a modern Application Lifecycle Management platform for DevOps, continuous quality management and testing, as well as for agile development with Scrum, Kanban or ScrumBan.

Scale to your enterprise without compromising on:

Quality Management

Use continuous testing to ensure quality while accelerating delivery.

Integrations

Integrate with open source and 3rd party tools to achieve end-to-end visibility.

Continuous Pipelines & Delivery

Get real-time status of your CI/CD ecosystems. View committed changes, identify possible root causes of failures and track commits associated with specific user stories and defects.

Try out ALM Octane Free Trial (90 Days): https://lnkd.in/db84Cfp

Visit the official ALM Octane online help site to learn more: https://lnkd.in/dphsHCk

https://www.linkedin.com/pulse/integrate-jmeter-alm-octane-amir-khan
