Creating reports with Gatling

If you don’t run your load tests in a distributed environment, starting Grafana and Influx DB Docker containers on your local machine might be cumbersome. In that case, it can be handy to generate simulation reports for local load testing purposes instead. The Rhino framework provides a simulation logging feature that writes simulation metrics into a flat file on disk. To enable simulation logging, add the @Logging annotation to the simulation entity.

You can use GatlingSimulationLogFormatter as the logging formatter to generate Gatling-compatible simulation log files:

@Simulation(name = "Reactive Upload Test")
@Logging(file = "/Users/bagdemir/load-testing/sim.log", formatter = GatlingSimulationLogFormatter.class)
public class UploadLoadSimulation {
  // scenario methods omitted
}

Once the simulation completes, you can run Gatling with the -ro (reports-only) option to generate the reports from the log file:

$ ./bin/ -ro /Users/bagdemir/load-testing

Gatling then generates the simulation report:

GATLING_HOME is set to /Users/bagdemir/Downloads/gatling
Parsing log file(s)...
Parsing log file(s) done
Generating reports...

---- Global Information --------------------------------------------------------
> request count                                     209920 (OK=0      KO=209920)
> min response time                                      0 (OK=-      KO=0     )
> max response time                                    634 (OK=-      KO=634   )
> mean response time                                     1 (OK=-      KO=1     )
> std deviation                                          4 (OK=-      KO=4     )
> response time 50th percentile                          1 (OK=-      KO=1     )
> response time 75th percentile                          1 (OK=-      KO=1     )
> response time 95th percentile                          3 (OK=-      KO=3     )
> response time 99th percentile                          5 (OK=-      KO=5     )
> mean requests/sec                                3498.667 (OK=-      KO=3498.667)
---- Response Time Distribution ------------------------------------------------
> t < 800 ms                                             0 (  0%)
> 800 ms < t < 1200 ms                                   0 (  0%)
> t > 1200 ms                                            0 (  0%)
> failed                                            209920 (100%)
---- Errors --------------------------------------------------------------------
>                                                                209920 (100,0%)

Reports generated in 7s.
Please open the following file: /Users/bagdemir/Downloads/gatling/test-run/index.html

Open the index.html file, /Users/bagdemir/Downloads/gatling/test-run/index.html, in your browser to view the report.

Please beware that simulation log file rotation is still an open issue: the log file will not be rotated unless you employ an external tool like logrotate.
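As an illustration of the logrotate workaround, a rotation policy for the simulation log could look like the following. This is a sketch: the rotation thresholds are arbitrary, and the entry path (/etc/logrotate.d/rhino) is an assumption, not something the framework prescribes.

```
# Hypothetical /etc/logrotate.d/rhino entry for the simulation log
/Users/bagdemir/load-testing/sim.log {
    size 100M      # rotate once the log exceeds 100 MB
    rotate 5       # keep at most five rotated files
    compress       # gzip rotated logs
    missingok      # do not fail if the log is absent
}
```

Note that rotating the log mid-run splits the metrics across files, so you would need to point Gatling's -ro option at a folder containing all of the rotated parts.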

Grafana Integration

NOTE: Grafana integration is available as of version 1.5.0.

Before you configure the framework to integrate with Grafana, the Influx DB instance must be configured as a data source in Grafana, so that Grafana is able to visualise metrics from Influx DB. In Grafana, follow the path Configuration -> Data Sources to set up Influx DB as a data source.
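Instead of clicking through the UI, the data source can also be declared with Grafana's provisioning mechanism. The following is a sketch only: the file path, data source name, URL, and database name are assumptions you would adapt to your own setup.

```yaml
# Hypothetical provisioning file, e.g. /etc/grafana/provisioning/datasources/influx.yaml
apiVersion: 1
datasources:
  - name: InfluxDB          # name referenced by the dashboards
    type: influxdb
    access: proxy
    url: http://influxdb:8086   # assumed Influx DB endpoint
    database: rhino             # assumed database holding the simulation measurements
```

Grafana reads this file at startup, so the data source is available before any test run begins.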

Rhino can be configured to set up your Grafana dashboards before the tests are executed, to visualise the metrics stored in Influx DB. It uses the simulation’s execution id, which must be unique for every execution; the framework also creates an Influx DB measurement with this id, otherwise the metrics would be written into an existing measurement table. The simulation id can be set as an environment variable - in a distributed environment, the environment variable is to be injected by a container orchestration platform like Mesos, Kubernetes, etc.:

export SIM_ID="123-abc"
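In a container orchestration platform such as Kubernetes, the same variable would typically be injected via the pod spec. The fragment below is a sketch; the container name, image, and id value are hypothetical.

```yaml
# Hypothetical Kubernetes pod spec fragment injecting the simulation id
containers:
  - name: rhino-load-test
    image: my-registry/rhino-simulation:latest   # hypothetical image
    env:
      - name: SIM_ID
        value: "123-abc"   # must be unique per execution
```

In practice you would generate a fresh id per run (for example from the CI build number) rather than hard-coding it.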

To let the framework create the dashboard, you need to add the following configuration to the properties file:

grafana.token="<token with write access, obtained from Grafana under Configuration -> API Keys>"

To enable the Grafana integration for your simulation, add the @Grafana annotation to your simulation entity:

@Grafana
@Simulation(name = "Reactive Test", durationInMins = 5)
public class RhinoEntity {
  // scenario methods omitted
}

Once the test is started, the framework creates a dashboard with the id $SIM_ID if it does not yet exist; otherwise, the existing dashboard is re-used.

Storing Metrics in Influx DB

Rhino can send simulation metrics to an Influx DB instance over the Influx DB API. To enable the Influx DB integration, add the @InfluxDB annotation to the simulation entity:

@InfluxDB
@Simulation(name = "Server-Status Simulation Without User")
public class BlockingLoadTestWithoutUserSimulation {
  // scenario methods omitted
}

To configure the Influx DB integration, add the connection settings to the properties file:
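The exact property names depend on your Rhino version, so treat the following as a sketch with hypothetical keys; check the framework’s configuration reference for the authoritative names.

```properties
# Hypothetical property names - verify against your Rhino version
db.influx.url=http://localhost:8086
db.influx.dbName=rhino_metrics
db.influx.username=admin
db.influx.password=admin
```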


The metrics will be sent to Influx DB in batches.
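To illustrate what batched delivery means here, the sketch below buffers metric lines and flushes them once a batch is full. This is a generic, self-contained illustration, not Rhino’s actual implementation; the class and method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Illustration only: batched delivery of metric lines (NOT Rhino's internals).
public class MetricBatcher {
    private final int batchSize;
    private final List<String> buffer = new ArrayList<>();
    private int flushes = 0;

    public MetricBatcher(int batchSize) {
        this.batchSize = batchSize;
    }

    // Buffer a metric line; flush once the batch is full.
    public void add(String line) {
        buffer.add(line);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // In a real integration this would write the buffered lines to the
    // Influx DB write endpoint in one request; here we only count flushes.
    public void flush() {
        if (buffer.isEmpty()) {
            return;
        }
        flushes++;
        buffer.clear();
    }

    public int flushCount() {
        return flushes;
    }
}
```

Batching trades a little latency for far fewer HTTP requests against the Influx DB write endpoint, which matters at the request rates a load test produces.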