Using Chronometer to Quickly Analyze your Program Timing

This tutorial shows how to use the VisualGDB Chronometer to quickly analyze the time elapsed between different events without involving heavy code instrumentation. Before you begin, install VisualGDB 5.3 or later.

We will show how to quickly compare the performance of the sinf() function in software floating-point mode vs. hardware floating-point mode.

  1. Start Visual Studio and open the VisualGDB Embedded Project Wizard.
  2. Select “Create a new project with MSBuild -> Embedded binary”.
  3. Select the ARM toolchain and your device. In this tutorial we will use the STM32F4Discovery board with the STM32F407VG microcontroller.
  4. Proceed with the default “LEDBlink (HAL)” sample.
  5. Connect your board to the computer via USB. VisualGDB will automatically detect your ST-Link type and configure OpenOCD to use it.
  6. Press “Finish” to create the project. Build it by pressing Ctrl-Shift-B.
  7. Replace the main loop in the main() function with this:

  8. We will now use the chronometer to count the number of cycles consumed by the call to the sinf() function. Open VisualGDB Project Properties and enable the chronometer. Optionally enter the number of CPU clock ticks per second so that VisualGDB can display actual times instead of raw clock cycle counts.
  9. Set a breakpoint on the line calling sinf() and let it hit.
  10. Step over the current line by pressing F10. Then press F5 to run another iteration of the loop and step over sinf() again. Observe the Chronometer window (you can always reopen it via Debug->Tool Windows if you closed it before). It shows the time elapsed between consecutive debugger events. See how the first call to sinf() with an argument of 0 took ~4 microseconds, while the second call with an argument of 0.01 took ~43 microseconds. This happens due to internal optimizations in the sinf() function. Also note that the breakpoint was hit the second time 501 milliseconds after execution resumed, which corresponds to the HAL_Delay() call.
  11. You can switch the view mode from physical time to clock cycles to see the exact clock cycles elapsed between different events.
  12. Open VisualGDB Project Properties and switch the floating point support to Hardware.
  13. Repeat the steps used to measure the sinf() performance. See how switching to hardware floating point mode makes the second call to sinf() more than 10 times faster.
  14. The chronometer works by reading the ARM Cortex cycle counter register (DWT_CYCCNT) each time the debug session stops. You can see this by searching for 0xE0001004 (the address of the register) in the GDB Session window. VisualGDB also automatically resets the register to 0 at each debug stop to avoid overflows. This approach requires no code instrumentation and hence adds no runtime overhead, although it can only measure the time between events that stop the debugger (such as breakpoints).
  15. The low-level chronometer behavior can be modified by copying the “C:\Program Files (x86)\Sysprogs\VisualGDB\TimestampProviders\DWT.xml” file and adjusting the commands inside it.

    If your device uses a different core or does not support the DWT_CYCCNT cycle counter, you can define your own timestamp provider and select it in VisualGDB Project Properties. We will show a detailed example of that in a later tutorial.
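The DWT-based mechanism from step 14 can be sketched in plain C. The addresses and bits below are the standard ARM Cortex-M debug registers (DEMCR at 0xE000EDFC with its TRCENA bit, DWT_CTRL at 0xE0001000, DWT_CYCCNT at 0xE0001004); the helper names are hypothetical, and the registers are passed in as pointers so the logic can also be exercised off-target:

```c
#include <stdint.h>

/* DEMCR bit 24 (TRCENA) powers up the DWT unit;
   DWT_CTRL bit 0 (CYCCNTENA) starts the cycle counter. */
#define TRCENA    (1u << 24)
#define CYCCNTENA (1u << 0)

void cyccnt_start(volatile uint32_t *demcr,
                  volatile uint32_t *dwt_ctrl,
                  volatile uint32_t *dwt_cyccnt)
{
    *demcr |= TRCENA;       /* enable the DWT unit */
    *dwt_cyccnt = 0;        /* reset to 0, as VisualGDB does at each stop */
    *dwt_ctrl |= CYCCNTENA; /* start counting CPU cycles */
}

/* Unsigned subtraction stays correct across a single 32-bit wrap-around. */
uint32_t cyccnt_elapsed(uint32_t start, uint32_t now)
{
    return now - start;
}
```

On an STM32F407 this would be invoked as `cyccnt_start((volatile uint32_t *)0xE000EDFCu, (volatile uint32_t *)0xE0001000u, (volatile uint32_t *)0xE0001004u);`.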