Verify Generated Executable Program Results

Verify whether generated executable program results for the model match simulation results.

Configure Model for Verification

  1. Configure the Dashboard Scope block to monitor the values of signals Force: f(t):1 and X. Double-click the Dashboard Scope block. In the Block Parameters dialog box, confirm that:

    • The block is connected to signals Force: f(t):1 and X. To connect a Dashboard block to a signal, in the model canvas, select the signal. In the Block Parameters dialog box, select the signal name.

    • Min is set to -10.

    • Max is set to 10.

    Apply changes and close the dialog box.

  2. Configure the Knob block so that you can use the knob to change the value of the damping gain. Double-click the Knob block. In the Block Parameters dialog box, confirm that:

    • The block is connected to parameter Damping:Gain. To connect a Dashboard block to a parameter, in the model canvas, select the block that uses the parameter. In the Block Parameters dialog box, select the parameter name.

    • Minimum is set to 200.

    • Maximum is set to 600.

    • Tick Interval is set to 100.

    Apply changes and close the dialog box.

  3. Open the Model Configuration Parameters dialog box. On the C Code tab, click Settings.

  4. Configure the model such that Simulink® and the generated executable program log workspace data in the Simulation Data Inspector. Click Data Import/Export. Confirm that the model is configured with these settings:

    • Time is selected and set to tout.

    • States is selected and set to xout.

    • Output is selected and set to yout.

    • Signal logging is selected and set to logsout.

    • Data stores is selected and set to dsmout.

    • Record logged workspace data in Simulation Data Inspector is selected.

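If you prefer to configure the model from the command line, the same Data Import/Export settings can be applied with set_param. This is a sketch that assumes the example model rtwdemo_secondOrderSystem is open; the strings are the standard configuration parameter identifiers for these dialog box settings.

```matlab
% Data Import/Export settings for logging workspace data
model = 'rtwdemo_secondOrderSystem';
open_system(model);

set_param(model, 'SaveTime', 'on',   'TimeSaveName',   'tout');  % Time
set_param(model, 'SaveState', 'on',  'StateSaveName',  'xout');  % States
set_param(model, 'SaveOutput', 'on', 'OutputSaveName', 'yout');  % Output
set_param(model, 'SignalLogging', 'on', 'SignalLoggingName', 'logsout'); % Signal logging
set_param(model, 'DSMLogging', 'on',    'DSMLoggingName',    'dsmout');  % Data stores
set_param(model, 'InspectSignalLogs', 'on'); % Record logged workspace data in SDI
```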
  5. Configure the model for building an executable program. Click Code Generation. Confirm that parameter Generate code only is cleared.

  6. Configure and validate the toolchain for building the executable program. Confirm that parameter Toolchain is set to Automatically locate an installed toolchain. Then, click Validate Toolchain. The Validation Report indicates whether the checks passed.

  7. Configure parameters and signals so that the data is stored in memory and is accessible while the executable program runs. To implement a model efficiently in C code, the code generator does not allocate memory for every parameter, signal, and state in the model. If the model algorithm does not require data to calculate outputs, code generation optimizations eliminate storage for that data. To allocate storage for the data so that you can access it during prototyping, you disable some of these optimizations.

    Click Code Generation > Optimization. Confirm that:

    • Default parameter behavior is set to Tunable. With this setting, block parameters, such as the Gain parameter of a Gain block, are tunable in the generated code.

    • Signal storage reuse is cleared. With this setting, the code generator allocates storage for signal lines. While running the executable program, you can monitor the values of these signals.
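These two optimization settings can also be set programmatically. A sketch, assuming the example model is open; DefaultParameterBehavior and OptimizeBlockIOStorage are the parameter identifiers that correspond to Default parameter behavior and Signal storage reuse, respectively.

```matlab
% Keep parameter and signal data observable in the generated code
model = 'rtwdemo_secondOrderSystem';
set_param(model, 'DefaultParameterBehavior', 'Tunable'); % Default parameter behavior
set_param(model, 'OptimizeBlockIOStorage', 'off');       % Signal storage reuse cleared
```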

  8. Configure the code generator to produce nonfinite data (for example, NaN and Inf) and related operations. Click Code Generation > Interface. Confirm that parameter Support non-finite numbers is selected.

  9. Configure a communication channel. For Simulink® to communicate with an executable program generated from a model, the model must include support for a communication channel. This example uses XCP on TCP/IP as the transport layer. Confirm that External mode is selected and that Transport layer is set to XCP on TCP/IP.

  10. Disable MAT-file logging. This example loads data into the Simulation Data Inspector from the MATLAB® base workspace, so MAT-file logging is not needed. Confirm that parameter MAT-file logging is cleared.

  11. Apply your configuration changes, close the Model Configuration Parameters dialog box, and save the model.
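The remaining code generation settings from steps 5 through 10 can likewise be applied and saved from the command line. A sketch, assuming the example model is open; the parameter identifiers shown correspond to the Generate code only, Toolchain, Support non-finite numbers, and MAT-file logging settings.

```matlab
% Remaining code generation settings, then save the model
model = 'rtwdemo_secondOrderSystem';
set_param(model, 'GenCodeOnly', 'off');    % build an executable, not code only
set_param(model, 'Toolchain', 'Automatically locate an installed toolchain');
set_param(model, 'SupportNonFinite', 'on'); % Support non-finite numbers
set_param(model, 'MatFileLogging', 'off');  % MAT-file logging cleared
save_system(model);
```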

Simulate Model and View Results

  1. From the Simulink Editor, in the Simulation tab, click Run. The clock on the Run button indicates that simulation pacing is enabled. Simulation pacing slows down a simulation so that you can observe system behavior. Visualizing simulations at a slower rate makes it easier to understand the underlying system design and identify design issues while demonstrating near real-time behavior.

    During the simulation, the Dashboard Scope block displays the behavior of signals Force: f(t):1 and X.

  2. In the Simulink Editor, in the Simulation tab, click Data Inspector. The Simulation Data Inspector opens with data from your simulation run imported.

  3. Expand the run (if not already expanded). Then, to plot the data, select data signals X and Force: f(t):1.

Leave these results in the Simulation Data Inspector. Later, you compare the simulation data to the output data produced by the executable program generated from the model.
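You can also inspect the simulation run programmatically through the Simulation Data Inspector API. A sketch, assuming the simulation above produced the most recent run and that the model contains a signal named X:

```matlab
% Retrieve the most recent run from the Simulation Data Inspector
runIDs = Simulink.sdi.getAllRunIDs;   % IDs of all logged runs, oldest first
simRunID = runIDs(end);               % latest run is last in the list
simRun = Simulink.sdi.getRun(simRunID);
fprintf('Run "%s" contains %d signals\n', simRun.Name, simRun.SignalCount);

% Select a signal by name so that it is plotted
sig = getSignalsByName(simRun, 'X');
if ~isempty(sig)
    sig(1).Checked = true;            % check the signal in the SDI view
end
```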

Build and Run Executable Program and View Results

Build and run the model executable program.

  1. Click Monitor & Tune. Simulink:

    1. Builds the executable program. During the build process, Building appears in the bottom-left corner of the Simulink Editor window. When the code generation report appears and the text reads Ready, the process is complete.

      • On Windows®, the code generator creates these files and places them in your current working folder:

        • rtwdemo_secondOrderSystem.exe – Executable program file

        • rtwdemo_secondOrderSystem.pdb – Debugging symbols file for parameters and signals

      • On Linux®, the code generator embeds DWARF format debugging information in the ELF executable program file, rtwdemo_secondOrderSystem, and places the file in your current working folder.

    2. Deploys the executable program as a separate process on your development computer.

    3. Connects the Simulink model to the executable program.

    4. Runs the model executable program code.

Compare Simulation and Executable Program Results

Use the Simulation Data Inspector to compare the executable program results with the simulation results.

  1. In the Simulation Data Inspector, inspect the results of your executable program run, Run 2: rtwdemo_secondOrderSystem.

  2. Click Compare.

  3. Select the data runs that you want to compare. For this example, from the Baseline list, select Run 1: rtwdemo_secondOrderSystem. From the Compare to list, select Run 2: rtwdemo_secondOrderSystem.

  4. In the upper-right corner of the Simulation Data Inspector, click Compare.

  5. The Simulation Data Inspector indicates that the output for X and Force: f(t):1 from the executable program code is out of tolerance from the simulation data output. To see a plot of the results for X, under File Comparisons, select the row for X.

  6. Inspect the comparison plot for Force: f(t):1. Under File Comparisons, select the row for Force: f(t):1.

  7. Determine whether the numerical discrepancies are significant by specifying a global absolute tolerance value. For this tutorial, set Global Abs Tolerance to 1e-12. Then, click Compare. The comparisons for X and Force: f(t):1 are within tolerance.

    For more information about numerical consistency verification and tolerances, see Numerical Consistency of Model and Generated Code Simulation Results.
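The run comparison in steps 3 through 7 can also be performed programmatically with Simulink.sdi.compareRuns, which accepts a global absolute tolerance as a name-value argument. A sketch, assuming the two most recent runs are the simulation run and the executable program run:

```matlab
% Compare the simulation run with the executable program run,
% applying a global absolute tolerance of 1e-12
runIDs = Simulink.sdi.getAllRunIDs;
baselineID  = runIDs(end-1);  % Run 1: simulation
compareToID = runIDs(end);    % Run 2: executable program
diffResult = Simulink.sdi.compareRuns(baselineID, compareToID, ...
    'AbsTol', 1e-12);

% Report the per-signal comparison status
for k = 1:diffResult.Count
    sigResult = getResultByIndex(diffResult, k);
    fprintf('%s: %s\n', sigResult.SignalName, string(sigResult.Status));
end
```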

Next, tune a parameter during program execution.