I have to build a test in JMeter that uses a CSV file as the data source.
The requirement is that the test sends 100 requests, and each request must process 10.000 rows from the CSV.
To summarize, this is how the test should look:

request 1 -> iterate over the first 10.000 rows (1 - 10.000)
request 2 -> iterate over the next 10.000 rows (10.001 - 20.000)
...
request 100 -> iterate over rows 990.001 - 1.000.000
I am new to JMeter, so I hope I explained what I have to do clearly, but in case you need more details please let me know.
So basically: is there an easy way to do this by adding some sort of controller, or does it require a JSR223 sampler with some code in it?
Thank you.
I am testing an application that stores use to mark products as "sold"; the requirement is to check how the app behaves when they are selling 100 batches, each containing 10.000 products.
The test also has a sampler with a SOAP request. There is a node in the SOAP request XML containing the product serial number; initially I tried hardcoding 10.000 serial numbers under this parameter, but the result wasn't the expected one.
I would go for splitting the original "large" CSV file into smaller CSV files containing 10 000 rows each.
Put the following Groovy code into the "Script" area of a JSR223 Sampler:
SampleResult.setIgnore()  // don't record this housekeeping sampler in the results

def largeFile = new File('largefile.csv')
def i = 0
def smallFilePostfix = 0
def smallFile = new File('smallfile' + smallFilePostfix + '.csv')

largeFile.eachLine { line ->
    // after every 10 000 lines, start a new output file
    if (i >= 10000) {
        i = 0
        smallFilePostfix += 1
        smallFile = new File('smallfile' + smallFilePostfix + '.csv')
    }
    i = i + 1
    smallFile << line << System.getProperty('line.separator')
}
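Once the files exist, each request can pick up its own chunk. A minimal sketch of a JSR223 PreProcessor (Groovy) attached to the SOAP sampler, assuming the serial number is the first CSV column and that the `<SerialNumber>` element name is hypothetical (adjust it to your actual SOAP schema):

```groovy
// JSR223 PreProcessor on the SOAP sampler (hypothetical sketch)
def idx = vars.getIteration() - 1                 // 0-based loop iteration
def sb = new StringBuilder()
new File("smallfile${idx}.csv").eachLine { line ->
    def serial = line.split(',')[0]               // assumes serial number is the 1st column
    sb << "<SerialNumber>${serial}</SerialNumber>"
}
vars.put('serialNumbers', sb.toString())          // reference as ${serialNumbers} in the request body
```

With the Thread Group (or a Loop Controller) set to 100 iterations, request N would then read smallfile(N-1).csv.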
That's it: the above code will split largefile.csv into smallfile0.csv, smallfile1.csv, etc., each containing 10 000 lines, which can then be consumed as usual, e.g. via the __CSVRead() function.
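If you prefer to do the splitting outside JMeter, the standard split utility does the same job. A sketch (the -l flag is POSIX; numeric suffixes such as smallfile000.csv would additionally need GNU split's -d and --additional-suffix options):

```shell
# Split largefile.csv into 10 000-line chunks named smallfileaa, smallfileab, ...
split -l 10000 largefile.csv smallfile
```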