This example shows how to create a performance test and a regression test for the fprintf function.
Consider the following unit (regression) test. You can run this test as a performance test by using runperf('fprintfTest') instead of runtests('fprintfTest').
classdef fprintfTest < matlab.unittest.TestCase
    properties
        file
        fid
    end
    methods(TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end
    methods(Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end
        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end
The measured time does not include the time to open and close the file or to run the assertion, because these activities take place inside a TestMethodSetup block and not inside a Test block. However, the measured time does include the time to perform the verifications. A best practice is to define a tighter measurement boundary around the specific operation you want to measure.
Create a performance test in a file named fprintfTest.m in your current working folder. This test is similar to the regression test, with the following modifications:
- The test inherits from matlab.perftest.TestCase instead of matlab.unittest.TestCase.
- The test calls the startMeasuring and stopMeasuring methods to create a measurement boundary around the fprintf function call.
classdef fprintfTest < matlab.perftest.TestCase
    properties
        file
        fid
    end
    methods(TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end
    methods(Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);

            testCase.startMeasuring();
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();

            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end
        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);

            testCase.startMeasuring();
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();

            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end
The measured time for this performance test includes only the call to fprintf, and the testing framework still evaluates the qualifications.
Run the performance test. Depending on your system, you might see warnings that the performance testing framework ran the test the maximum number of times, but did not achieve a 0.05 relative margin of error with a 0.95 confidence level.
results = runperf('fprintfTest');
Running fprintfTest
.......... .......... .......... .......... .....
Done fprintfTest
__________

results = 

  1x2 MeasurementResult array with properties:

    Name
    Valid
    Samples
    TestActivity

Totals:
   2 Valid, 0 Invalid.
The results variable is a 1x2 MeasurementResult array. Each element in the array corresponds to one of the tests defined in the test file.
Display the measurement results for the first test. Your results might vary.
results(1)
ans = 

  MeasurementResult with properties:

            Name: 'fprintfTest/testPrintingToFile'
           Valid: 1
         Samples: [10x7 table]
    TestActivity: [14x12 table]

Totals:
   1 Valid, 0 Invalid.
As indicated by the size of the TestActivity property, the performance testing framework collected 14 measurements. This number includes 4 measurements to warm up the code. The Samples property excludes warm-up measurements.
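To confirm how many of the collected measurements were warm-up runs, you can filter the TestActivity table by its Objective column. This is a sketch that assumes the default column names shown in the TestActivity output later in this example:

activity = results(1).TestActivity;
numWarmups = nnz(activity.Objective == 'warmup')   % warm-up rows
numSamples = nnz(activity.Objective == 'sample')   % measured samples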
Display the sample measurements for the first test.
results(1).Samples
ans = 

                 Name                 MeasuredTime         Timestamp             Host        Platform           Version                       RunIdentifier            
    ______________________________    ____________    ____________________    ___________    ________    _____________________    ____________________________________

    fprintfTest/testPrintingToFile    0.067772        02-Jan-2016 18:24:52    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.085359        02-Jan-2016 18:24:53    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.075863        02-Jan-2016 18:24:53    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.068161        02-Jan-2016 18:24:53    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.067606        02-Jan-2016 18:24:53    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.073692        02-Jan-2016 18:24:54    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.070815        02-Jan-2016 18:24:54    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.067791        02-Jan-2016 18:24:54    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.077599        02-Jan-2016 18:24:54    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
    fprintfTest/testPrintingToFile    0.07438         02-Jan-2016 18:24:55    MY-HOSTNAME    win64       9.0.0.320924 (R2016a)    9b6a0d5c-5fe7-4d26-8479-222792127ebc
Display the mean measured time for the first test. To exclude data collected in the warm-up runs, use the values in the Samples property.
sampleTimes = results(1).Samples.MeasuredTime;
meanTest = mean(sampleTimes)
meanTest =

    0.0729
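The same Samples data supports other summary statistics besides the mean. For example, you might also examine the fastest run and the spread across samples:

sampleTimes = results(1).Samples.MeasuredTime;
minTest = min(sampleTimes)    % fastest observed run
stdTest = std(sampleTimes)    % spread across the sample measurements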
Determine the average time for all the test elements. The fprintfTest test includes two different methods. Compare the time for each method (test element).
Since the performance testing framework returns a Samples table for each test element, concatenate all these tables into one table. Then group the rows by test element Name, and compute the mean MeasuredTime for each group.
fullTable = vertcat(results.Samples);
summaryStats = varfun(@mean,fullTable,...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStats = 

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile    10            0.072904         
    fprintfTest/testBytesToFile       27            0.079338         
Both test methods write the same amount of data to a file. Therefore, some of the difference between the mean values is attributable to calling the fprintf function with an output argument.
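If you want a quick, informal cross-check of that output-argument cost outside the testing framework, you can use the timeit function, which accepts the number of requested output arguments as a second input. This is only a sketch; the file and data sizes are illustrative, and timeit gives no statistical guarantees:

fid = fopen(tempname,'w');
textToWrite = repmat('abcdef',1,5000000);
tNoOutput   = timeit(@() fprintf(fid,'%s',textToWrite), 0)   % no output argument
tWithOutput = timeit(@() fprintf(fid,'%s',textToWrite), 1)   % request nbytes output
fclose(fid);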
Change the statistical objectives defined by the runperf function by constructing and running a time experiment. Construct a time experiment with measurements that reach a sample mean with a 3% relative margin of error within a 97% confidence level, and collect eight warm-up measurements.
Construct an explicit test suite.
suite = testsuite('fprintfTest');
Construct a time experiment with a variable number of sample measurements, and run the tests.
import matlab.perftest.TimeExperiment
experiment = TimeExperiment.limitingSamplingError('NumWarmups',8,...
    'RelativeMarginOfError',0.03,'ConfidenceLevel',0.97);
resultsTE = run(experiment,suite);
Running fprintfTest
.......... .......... .......... ..........
Warning: The target Relative Margin of Error was not met after running the MaxSamples for fprintfTest/testPrintingToFile.
.......... .......... .......... ..........
Warning: The target Relative Margin of Error was not met after running the MaxSamples for fprintfTest/testBytesToFile.
Done fprintfTest
__________
In this example output, the performance testing framework is not able to meet the stricter statistical objectives with the default number of maximum samples. Your results might vary.
Compute the statistics for all the test elements.
fullTableTE = vertcat(resultsTE.Samples);
summaryStatsTE = varfun(@mean,fullTableTE,...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStatsTE = 

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile    32            0.081782         
    fprintfTest/testBytesToFile       32            0.076378         
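You can also estimate the achieved relative margin of error yourself from the collected samples. The following sketch uses a normal-distribution approximation, so its result differs slightly from the framework's internal computation, which is based on a T-distribution:

samples = resultsTE(1).Samples.MeasuredTime;
n = numel(samples);
z = sqrt(2)*erfinv(0.97);    % two-sided 97% normal quantile
relMarginOfError = z*std(samples)/(sqrt(n)*mean(samples))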
Increase the maximum number of samples to 100 and rerun the time experiment.
experiment = TimeExperiment.limitingSamplingError('NumWarmups',2,...
    'RelativeMarginOfError',0.03,'ConfidenceLevel',0.97,'MaxSamples',100);
resultsTE = run(experiment,suite);
Running fprintfTest
.......... .......... .......... .......... .......... ..........
.......... .......... .......... .......... .......... ..
Done fprintfTest
__________
Compute the statistics for all the test elements.
fullTableTE = vertcat(resultsTE.Samples);
summaryStatsTE = varfun(@mean,fullTableTE,...
    'InputVariables','MeasuredTime','GroupingVariables','Name')
summaryStatsTE = 

                 Name                 GroupCount    mean_MeasuredTime
    ______________________________    __________    _________________

    fprintfTest/testPrintingToFile    55            0.07783          
    fprintfTest/testBytesToFile       53            0.079008         
The testing framework achieves the statistical objectives for both tests in approximately 50 samples.
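To see exactly how many samples the framework collected for each test element, you can query the height of each Samples table (one MeasurementResult per element):

numSamples = arrayfun(@(r) height(r.Samples), resultsTE)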
Start a new MATLAB® session. A new session ensures that MATLAB has not run the code contained in your tests.
Measure the first-time cost of your code by creating and running a fixed time experiment with zero warm-up measurements and one sample measurement.
Construct an explicit test suite. Since you are measuring the first-time cost of the function, run a single test. To run multiple tests, save the results and start a new MATLAB session between tests.
suite = testsuite('fprintfTest/testPrintingToFile');
Construct and run the time experiment.
import matlab.perftest.TimeExperiment
experiment = TimeExperiment.withFixedSampleSize(1);
results = run(experiment,suite);
Running fprintfTest
.
Done fprintfTest
__________
Display the results. Observe the TestActivity table to confirm that there are no warm-up samples.
fullTable = results.TestActivity
fullTable = 

                 Name                 Passed    Failed    Incomplete    MeasuredTime    Objective         Timestamp             Host        Platform           Version                           TestResult                               RunIdentifier            
    ______________________________    ______    ______    __________    ____________    _________    ____________________    ___________    ________    _____________________    ________________________________    ____________________________________

    fprintfTest/testPrintingToFile    true      false     false         0.065501        sample       06-Jan-2016 09:53:44    MY-HOSTNAME    win64       9.0.0.323070 (R2016a)    [1x1 matlab.unittest.TestResult]    e0fe27aa-3224-46d1-a05a-542e9b8d9edb
The performance testing framework collects one sample for each test.
See Also
matlab.perftest.TestCase | matlab.perftest.TimeExperiment | matlab.unittest.measurement.MeasurementResult | runperf | testsuite