Change Parameters of Scoreboard in UVM Test Bench

This example shows how to control the parameters of the response checker in a UVM test bench generated from Simulink. Such parameterization helps the design verification engineer reuse the scoreboard under different testing scenarios. Simulink parameters used by the checker are mapped to a UVM configuration object whose values can be set in a derived test class (randomized or not) or overridden directly with a command-line plus argument.

Introduction

See the example Generate Parameterized UVM Test Bench from Simulink for a description of the design and the background on generating a UVM test bench. To generate the default test bench for this example, execute:

% Generate a UVM test bench
design     = 'prm_uvmtb/PulseDetector';
sequence   = 'prm_uvmtb/GenPulse';
scoreboard = 'prm_uvmtb/CheckDetection';
uvmbuild(design, sequence, scoreboard)

The Parameterization of the Scoreboard

In the model, the response checking is parameterized by an error threshold variable used in an assertion checking block. The variable is designated as a parameter to retain in the generated SystemVerilog by defining it as a Simulink.Parameter named pErrorThreshold. That specification gives it a default value of 1.95 percent, which reflects the tolerated error between the golden reference and the actual DUT. (In this test bench, the discrepancy is due to the double-precision versus fixed-point implementations of the detection logic.)
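Before generation, the parameter can be defined in the base workspace as a Simulink.Parameter object. A minimal sketch, assuming the default value described above (any additional storage-class settings that the generation flow may require are not shown):

```matlab
% Sketch: define the tunable threshold as a Simulink.Parameter in the
% base workspace so uvmbuild can retain it in the generated SystemVerilog.
pErrorThreshold = Simulink.Parameter;
pErrorThreshold.Value = 1.95;        % default error tolerance, in percent
pErrorThreshold.DataType = 'double'; % matches the golden-reference precision
```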

In the generated UVM code, the parameter is placed in a configuration object, mw_PulseDetector_scoreboard_cfg_obj,

See mw_PulseDetector_scoreboard_cfg_obj.sv.

which is instantiated in the build_phase of the test, mw_PulseDetector_test:

See mw_PulseDetector_test.sv.

and set via the uvm_config_db in the scoreboard, mw_PulseDetector_scoreboard, in its start_of_simulation_phase:

See mw_PulseDetector_scoreboard.sv.
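The three generated files follow the standard UVM configuration-object pattern. A simplified sketch of that pattern, with class and field names abbreviated (the generated code contains additional machinery):

```systemverilog
// Sketch of the generated pattern, not the generated code verbatim.
class scoreboard_cfg_obj extends uvm_object;
  real pErrorThreshold = 1.95;  // default taken from the Simulink.Parameter
  `uvm_object_utils(scoreboard_cfg_obj)
  function new(string name = "scoreboard_cfg_obj");
    super.new(name);
  endfunction
endclass

// In the test's build_phase: create the object and publish it.
//   cfg = scoreboard_cfg_obj::type_id::create("cfg");
//   uvm_config_db#(scoreboard_cfg_obj)::set(this, "*", "scoreboard_cfg", cfg);

// In the scoreboard's start_of_simulation_phase: retrieve it.
//   if (!uvm_config_db#(scoreboard_cfg_obj)::get(this, "", "scoreboard_cfg", cfg))
//     `uvm_fatal("NOCFG", "scoreboard configuration object not found")
```

Because the value travels through uvm_config_db, a derived test can swap in a different configuration object without touching the scoreboard code.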

Updating the Error Threshold Using a Plus Argument

The test plan originally called for a tolerance of 1.95 percent. If the design specification is later tightened to 0.50 percent to reflect a more safety-critical requirement, you can apply the new tolerance from the command line. In the support scripts, an environment variable supplies the SystemVerilog command-line plus argument.

% Clear environment variables that influence the UVM simulation
setenv EXTRA_UVM_SIM_ARGS
setenv EXTRA_UVM_COMP_ARGS
setenv UVM_TOP_MODULE
% Simulate the UVM test bench using plusarg overrides
cd prm_uvmtb_uvmbuild/uvm_testbench/top
setenv EXTRA_UVM_SIM_ARGS '+SNR_default_inp_val=01000000 +RTWStructParam_pErrorThreshold=0.50'
! vsim -do run_tb_mq.do     % ModelSim/QuestaSim (gui)
! vsim -c -do run_tb_mq.do  % ModelSim/QuestaSim (console)
! ./run_tb_incisive.sh      % Incisive (console)
! ./run_tb_xcelium.sh       % Xcelium (console)
! ./run_tb_vcs.sh           % VCS (console)
cd ../../..

Observe that with an SNR of 1.0 the design sometimes violates the 0.50 percent error threshold.
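One common mechanism for consuming a plus argument such as +RTWStructParam_pErrorThreshold is the $value$plusargs system function. The following is a hedged sketch of that mechanism only; the generated test bench may read the override differently:

```systemverilog
// Sketch: read a real-valued override from the simulator command line.
// This fragment belongs inside a module or class scope.
real threshold = 1.95;  // fall back to the generated default

initial begin
  // $value$plusargs returns nonzero when the plusarg is present and parsed.
  if ($value$plusargs("RTWStructParam_pErrorThreshold=%f", threshold))
    $display("Error threshold overridden to %0.2f percent", threshold);
end
```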

Randomize Error Threshold Using Default Configuration Object in Test

You can generate and set values directly from a test because the scoreboard configuration object is a member of the test class. Here we randomize the threshold in the range [0.050, 0.500] percent. Notice that the threshold is of type real and therefore cannot be randomized directly with the rand qualifier.

See mw_PulseDetector_SCRPRM_param_overrides.sv.
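The restriction on randomizing real variables is commonly worked around by randomizing an integral stand-in and scaling it afterward. A sketch of that idiom under illustrative names (the actual override file defines the full test and configuration handling):

```systemverilog
// Sketch of the idiom: rand applies to integral types, so randomize an
// integer count of thousandths of a percent and convert it to a real.
class threshold_randomizer;
  rand int unsigned thousandths;
  real threshold;  // resulting value, in percent

  // Constrain to [0.050, 0.500] percent, expressed in thousandths.
  constraint c_range { thousandths inside {[50:500]}; }

  // post_randomize runs automatically after a successful randomize() call.
  function void post_randomize();
    threshold = real'(thousandths) / 1000.0;
  endfunction
endclass
```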

To run the UVM simulations using these new classes, we can use the original scripts with extra arguments given through environment variables.

% Simulate the UVM test bench using randomized configuration object setting
cd prm_uvmtb_uvmbuild/uvm_testbench/top
setenv EXTRA_UVM_COMP_ARGS '-f ../../../overrides_SCRPRM/extra_comp_args.f'
setenv EXTRA_UVM_SIM_ARGS '+SNR_default_inp_val=01000000 +UVM_TESTNAME=mw_PulseDetector_test_SCRPRM'
! vsim -do run_tb_mq.do     % ModelSim/QuestaSim (gui)
! vsim -c -do run_tb_mq.do  % ModelSim/QuestaSim (console)
! ./run_tb_incisive.sh      % Incisive (console)
! ./run_tb_xcelium.sh       % Xcelium (console)
! ./run_tb_vcs.sh           % VCS (console)
cd ../../..

Observe that in this run, a threshold of 0.150 percent is chosen and that several frames end up violating this threshold.

Randomize Error Threshold Using Derived Configuration Object

You can also create more sophisticated randomization of the configuration settings by creating a derived scoreboard configuration object. In this example, the test plan requires that the threshold fall in different interesting ranges that reflect different operating environments of the detector.

See mw_PulseDetector_SCRPRM_param_overrides.sv.

To achieve this goal, the derived configuration object adds the following behaviors:

  • add a "threshold_bucket" member declared as randc

  • use that threshold_bucket member to imply threshold ranges

  • print out the randomized values for inspection during simulation

A new test is created which tells the UVM factory to use the new configuration object and to randomize its settings during the build_phase.
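The behaviors above can be sketched as a derived configuration object. The names, bucket count, and ranges here are illustrative assumptions, not the generated code:

```systemverilog
// Illustrative sketch of a derived configuration object with bucketed ranges.
class scoreboard_cfg_obj_rand extends scoreboard_cfg_obj;  // assumed base class
  randc bit [1:0] threshold_bucket;  // randc cycles through every bucket
  rand  int unsigned thousandths;    // integral stand-in for the real value

  // Each bucket implies a different operating range, in thousandths of a percent.
  constraint c_buckets {
    threshold_bucket <= 2;
    threshold_bucket == 0 -> thousandths inside {[50:200]};    // tight
    threshold_bucket == 1 -> thousandths inside {[201:1000]};  // nominal
    threshold_bucket == 2 -> thousandths inside {[1001:1950]}; // relaxed
  }

  // Convert to the real member and print the result for inspection.
  function void post_randomize();
    pErrorThreshold = real'(thousandths) / 1000.0;
    `uvm_info("CFG", $sformatf("bucket=%0d threshold=%0.3f percent",
              threshold_bucket, pErrorThreshold), UVM_LOW)
  endfunction
endclass
```

The test registers a class like this with the UVM factory as an override for the base configuration object, so the scoreboard receives the derived object without modification.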

To run the UVM simulations using these new classes, we can use the original scripts with extra arguments given through environment variables.

% Simulate the UVM test bench using a randomized derived configuration object
cd prm_uvmtb_uvmbuild/uvm_testbench/top
setenv EXTRA_UVM_COMP_ARGS '-f ../../../overrides_SCRPRM/extra_comp_args.f'
setenv EXTRA_UVM_SIM_ARGS '+SNR_default_inp_val=01000000 +UVM_TESTNAME=mw_PulseDetector_test_SCRPRM2'
! vsim -do run_tb_mq.do     % ModelSim/QuestaSim (gui)
! vsim -c -do run_tb_mq.do  % ModelSim/QuestaSim (console)
! ./run_tb_incisive.sh      % Incisive (console)
! ./run_tb_xcelium.sh       % Xcelium (console)
! ./run_tb_vcs.sh           % VCS (console)
cd ../../..

Observe that for this run a threshold of 1.01 percent is chosen and that all frames are under this threshold.

Conclusions and Next Steps

In a full UVM environment, whenever randomization is introduced, it is usually essential to add coverage and to ensure that other parts of the environment know which random values were generated. In these examples, the randomized values are simply printed. Adding coverage for the generated error thresholds and analyzing the actual signal errors over the course of many simulation runs is left as an exercise for the reader.