
SystemTest

Verification of a Sobel Edge Detection Algorithm

This example shows how to use SystemTest to test a digital signal processing (DSP) algorithm model represented in Simulink. The Sobel Edge Detection algorithm is the focus of this example.

This example requires the following products to run:

  • Simulink

  • Computer Vision System Toolbox™

Model-Based Design

As designs become larger and more complicated, it is necessary to describe a design at a high level, enabling the designer to run simulations faster and identify bugs early on.

When implementing a DSP algorithm, a system level engineer often designs the algorithm and verifies that it satisfies the project requirements. This design serves as a baseline or golden reference for the engineers responsible for taking the algorithm to the hardware.

In developing this design, the system engineer generally implements the algorithm to match the behavioral requirements rather than with implementation details in mind. As a result, the downstream development team may need to modify the design to fit it into a real-time system with limited resources such as memory or processing power.

This process of design elaboration may involve converting the double precision design to a fixed point design in order to make the implementation suitable for an FPGA or an ASIC.
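As a runnable sketch of what this elaboration step involves, the snippet below quantizes a double-precision value to a signed fixed-point grid and measures the error the conversion introduces. The word and fraction lengths are illustrative assumptions, not values taken from the example model:

```python
# Hypothetical sketch: rounding a double-precision value to a fixed-point
# grid with a given number of fractional bits. The 8-bit fraction length
# here is an assumption for illustration only.

def to_fixed_point(x, frac_bits=8):
    """Round x to the nearest multiple of 2**-frac_bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

value = 0.7071067811865476            # double-precision input
quantized = to_fixed_point(value)     # fixed-point approximation
error = abs(value - quantized)        # bounded by half an LSB (2**-9 here)
```

With 8 fractional bits, 0.7071... quantizes to 0.70703125 and the rounding error stays below half a least-significant bit, which is the kind of implementation-induced error the verification flow below must bound.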

Representing this design in the form of a Simulink model provides engineers with an executable specification that is at the center of Model-Based Design: http://www.mathworks.com/applications/controldesign/description/mbd.html

Design Verification

In this example, the Sobel Edge Detection algorithm is implemented in Simulink. It is first implemented using the Computer Vision System Toolbox's Edge Detection block, which serves as the golden reference that would have been provided by a system engineer.

A second implementation offers an elaboration of the design in order to make it more realizable in hardware. It represents an alternate implementation that a development team would have modeled and explored in Simulink.

The results from this second design are verified against the golden reference by measuring the error introduced by the alternate implementation. A satellite image is used as the input to the edge detection algorithm.
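For readers unfamiliar with the algorithm under test, the following is a minimal stand-in for the Sobel operator on a grayscale image, assuming the image is a plain 2-D list of pixel intensities. It illustrates the behavior of the Edge Detection block, not its actual implementation:

```python
# Minimal Sobel sketch (assumption: not the Edge Detection block's code).
GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel

def sobel_magnitude(img):
    """Return the gradient magnitude at each interior pixel."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(GX[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(GY[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the magnitude peaks along the boundary columns.
img = [[0, 0, 255, 255]] * 4
mag = sobel_magnitude(img)
```

Thresholding the resulting magnitude yields the binary edge map; the threshold is one of the parameters swept in the tests below.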

demosystest_edge_model

Testing the Algorithm

In order for the elaborated design to satisfy the project's requirements, it must match the results produced by the golden reference within an absolute tolerance level of 10%. The test cases for which this condition must be met are:

  • Threshold values ranging from 365 to 535

  • Noise levels ranging from 0 to 100
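The sweep over these test vectors and the 10% limit check can be sketched as follows. `golden` and `elaborated` are toy placeholders standing in for the two Simulink implementations so the loop is runnable; their formulas are assumptions, not the models' behavior:

```python
# Hypothetical sketch of the test-vector sweep and tolerance check.
def golden(threshold, noise):
    """Placeholder for the golden-reference response."""
    return threshold * 0.01 + noise * 0.001

def elaborated(threshold, noise):
    """Placeholder elaborated design: deviates from the reference by 2%."""
    return golden(threshold, noise) * 1.02

thresholds = range(365, 536, 10)    # test vector: threshold values
noise_levels = range(0, 101, 25)    # test vector: noise levels
TOLERANCE = 0.10                    # 10% tolerance from the requirements

verdicts = []
for t in thresholds:
    for n in noise_levels:
        ref = golden(t, n)
        measured_diff = abs(elaborated(t, n) - ref) / abs(ref)
        verdicts.append(measured_diff <= TOLERANCE)   # limit check
```

In the actual test, SystemTest drives each (threshold, noise) pair into the model under test and performs this comparison with its own elements, described next.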

Using SystemTest, test vectors are created representing this range of threshold and noise level values. The following elements are used to perform the necessary testing:

  • Simulink Element - For each main test iteration, a Simulink element is used to apply the varying threshold and noise level values to the model under test. The measured difference between the two algorithm implementations is read back into SystemTest and assigned to a test variable.

  • Limit Check Element - Using a Limit Check element, the measured difference is checked to determine if it meets the 10% error margin.

  • General Plot Element - For each threshold value being tested, the measured difference is plotted using a Stem plot type within a General Plot element.

To view the test, open it with the systemtest function:

systemtest('demosystest_edge.test');

Open the example in the SystemTest desktop.

Test Reuse to Evaluate a Fixed Point Design

Note that the initial model is focused on a double precision design. To assess how your test results would change with a fixed point design, the only change required is to configure the data type conversion block to use a fixed point data type.

Once this is done, the same SystemTest test can be reused without modification to revalidate the new design.

Saving Results

For each main test iteration, the following information is saved for post-processing by specifying it under Save Results:

  • Measured difference between the two algorithm implementations

  • The pass/fail value determined by the Limit Check element
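A per-iteration record of these two quantities might look like the sketch below. The record layout is an assumption for illustration, not the SystemTest storage format:

```python
# Hypothetical sketch of saving per-iteration results for post-processing.
def record_iteration(threshold, noise_level, measured_diff, tolerance=0.10):
    """Save the measured difference and the pass/fail verdict."""
    return {
        "threshold": threshold,
        "noise_level": noise_level,
        "measured_diff": measured_diff,
        "passed": measured_diff <= tolerance,
    }

results = [
    record_iteration(365, 0, 0.02),     # within the 10% margin
    record_iteration(535, 100, 0.12),   # exceeds the 10% margin
]
failures = [r for r in results if not r["passed"]]
```

Collecting the results this way makes it straightforward to plot the measured differences or to list only the failing (threshold, noise) combinations after the sweep completes.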