How to contribute to the xPack OpenOCD
This page is intended for developers who plan to contribute new features or fix bugs in the xPack OpenOCD project; it documents how to build and test the package.
The xPack Build Box
The build scripts in this project use the xPack Build Box (XBB)
tools, which require the usual native development tools
(packed as a Docker image for GNU/Linux builds), complemented with
several binary packages, installed with xpm
as development dependencies.
For those interested in understanding how things work, a good starting point would be to read the XBB page.
The XBB tools are intended for building standalone relocatable distributions, thus are quite complex and perform several post-processing steps to adjust RPATH and validate the resulting binaries.
For the traditional configure && make install
builds specific to Linux,
these scripts are probably too complicated and therefore are not recommended
for inexperienced users.
xPack build configurations
The xPack Framework supports projects with multiple build configurations.
Build configurations are sets of properties, actions and dependencies that apply to a specific build. Build configurations can inherit from other build configurations.
For simple projects, the typical use case is with two configurations, Debug and Release.
For building the binary xPack executables, there is one configuration for each platform:
- win32-x64
- darwin-x64
- darwin-arm64
- linux-x64
- linux-arm64
- linux-arm
In case you wonder where these names come from, they are exactly the Node.js process.platform and process.arch values for each platform.
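These names can be reproduced outside Node.js too. A small sketch (an illustration, not part of the build scripts) that maps the uname output to the same convention:

```shell
# Derive the Node.js-style platform/arch pair from uname, matching
# the naming convention used by the build configuration names above.
platform="unknown"
arch="unknown"
case "$(uname -s)" in
  Linux) platform="linux" ;;
  Darwin) platform="darwin" ;;
  MINGW*|MSYS*|CYGWIN*) platform="win32" ;;
esac
case "$(uname -m)" in
  x86_64) arch="x64" ;;
  aarch64|arm64) arch="arm64" ;;
  armv7l|armv6l) arch="arm" ;;
esac
echo "${platform}-${arch}"
```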
The build configurations are defined in the package.json
file, in the
xpack
section.
{
  "...": "...",
  "xpack": {
    "buildConfigurations": {
      "...": {},
      "win32-x64": {
        "inherit": [
          "common-dependencies",
          "common-actions",
          "common-docker"
        ],
        "devDependencies": {
          "@xpack-dev-tools/gcc": "13.2.0-2.1",
          "@xpack-dev-tools/mingw-w64-gcc": "13.2.0-1.1",
          "@xpack-dev-tools/wine": "8.0.2-1.1"
        },
        "properties": {
          "dockerImage": "ilegeul/ubuntu:amd64-18.04-xbb-v5.2.2"
        },
        "actions": {
          "build": "{{properties.commandBashBuild}} --windows",
          "build-development": "{{properties.commandBashBuild}} --windows --development",
          "build-development-debug": "{{properties.commandBashBuild}} --windows --development --debug",
          "build-development-tests-only": "{{properties.commandBashBuild}} --windows --development --tests-only"
        }
      }
    }
  }
}
To ask xpm to perform a specific action on a given build configuration,
use the --config <name>
option.
For example:
xpm install --config darwin-x64
xpm run build --config darwin-x64
xpm/xPack actions
The xpm actions are extensions of npm scripts, i.e. named sequences
of commands that are invoked via xpm run <name>
to perform specific operations.
The commands are invoked in a sub-shell with an adjusted PATH,
with the xpacks/.bin
folder prepended. This ensures that the locally installed tools are
preferred over the system tools.
Actions can be defined for the entire project or for a specific build configuration.
The actions are defined in the package.json
file, in the
xpack
section, at the top or inside build configurations.
For those who, for various reasons, cannot use xpm, it is perfectly possible to adjust the PATH manually and to invoke the sequence of commands in order; this is just more tedious, since multiple substitutions must be performed to compose the commands.
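The effect of the prepended PATH can be demonstrated with a throwaway folder; the tool name greeter below is purely hypothetical:

```shell
# Simulate a project with a locally installed tool in xpacks/.bin.
mkdir -p /tmp/demo-project/xpacks/.bin
printf '#!/bin/sh\necho "local tool"\n' > /tmp/demo-project/xpacks/.bin/greeter
chmod +x /tmp/demo-project/xpacks/.bin/greeter

# This is essentially what xpm does before running an action:
cd /tmp/demo-project
PATH="$(pwd)/xpacks/.bin:${PATH}"
export PATH

greeter   # resolves to the local tool, not to a system one
```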
Visual Studio Code integration
xpm/xPack actions and build configurations are supported in Visual Studio Code via the xPack C/C++ Managed Build Tools extension.
With this extension installed, xpm/xPack actions can be conveniently invoked with a single mouse click.
Prerequisites
The build scripts run on GNU/Linux and macOS. The Windows binaries are compiled on x64 GNU/Linux, using mingw-w64.
For details on installing the prerequisites, please read the Build Prerequisites page.
Get project sources
The project is hosted on GitHub:
Branches
Apart from the unused master branch, there are three active branches:

- xpack, with the latest stable version (default)
- xpack-development, with the current development version
- website, with the current content of the website
All development is done in the xpack-development
branch, and contributions via
Pull Requests should be directed to this branch.
When new releases are published, the xpack-development
branch is merged
into xpack
.
Pushes to the website
branch trigger a GitHub Action to generate
and publish the project web site.
To clone the stable branch (xpack
), run the following commands in a
terminal (on Windows use the Git Bash console):
rm -rf ~/Work/xpack-dev-tools/openocd-xpack.git && \
git clone https://github.com/xpack-dev-tools/openocd-xpack.git \
~/Work/xpack-dev-tools/openocd-xpack.git
For development purposes, clone the xpack-development
branch.
rm -rf ~/Work/xpack-dev-tools/openocd-xpack.git && \
mkdir -p ~/Work/xpack-dev-tools && \
git clone \
--branch xpack-development \
https://github.com/xpack-dev-tools/openocd-xpack.git \
~/Work/xpack-dev-tools/openocd-xpack.git
To contribute Pull Requests, fork the project and make sure the Copy the master branch only option is disabled.
Use the xpack-development
branch and be sure you contribute the
Pull Requests back to the xpack-development
branch.
Get the writable helper sources (optional, for development purposes)
The project has a dependency on a common helper, which is
normally installed as a read-only dependency; for development
purposes, to be able to make changes to the scripts located inside the helper,
clone the xpack-development
branch and link it to
the user global xPacks store:
rm -rf ~/Work/xpack-dev-tools/xbb-helper-xpack.git && \
mkdir -p ~/Work/xpack-dev-tools && \
git clone \
--branch xpack-development \
https://github.com/xpack-dev-tools/xbb-helper-xpack.git \
~/Work/xpack-dev-tools/xbb-helper-xpack.git && \
xpm link -C ~/Work/xpack-dev-tools/xbb-helper-xpack.git
For more details on how a writable helper can be used via
xpm link
, please see the
XBB documentation.
Other repositories
Other repositories in use are:
- https://github.com/openocd-org/openocd.git - a read-only mirror of the upstream OpenOCD (git://git.code.sf.net/p/openocd/code)
How to build
The builds require dedicated machines for each platform (x64 GNU/Linux, arm64 GNU/Linux, arm GNU/Linux, x64 macOS and arm64 macOS).
Update the repo
git -C ~/Work/xpack-dev-tools/openocd-xpack.git pull
... and the helper (when using a writable helper) ...
git -C ~/Work/xpack-dev-tools/xbb-helper-xpack.git pull
Build the binaries
- Windows
- macOS x64
- macOS arm64
- GNU/Linux x64
- GNU/Linux arm64
- GNU/Linux arm
The Windows builds run on GNU/Linux, using mingw-w64.
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the docker build:
xpm run docker-build --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run docker-build-development --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/win32-x64/deploy
folder:
- xpack-openocd-0.12.0-4-win32-x64.tar.gz
- xpack-openocd-0.12.0-4-win32-x64.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To prepare the native build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the native build:
xpm run build --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run build-development --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/darwin-x64/deploy
folder:
- xpack-openocd-0.12.0-4-darwin-x64.tar.gz
- xpack-openocd-0.12.0-4-darwin-x64.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To prepare the native build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the native build:
xpm run build --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run build-development --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/darwin-arm64/deploy
folder:
- xpack-openocd-0.12.0-4-darwin-arm64.tar.gz
- xpack-openocd-0.12.0-4-darwin-arm64.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the docker build:
xpm run docker-build --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run docker-build-development --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/linux-x64/deploy
folder:
- xpack-openocd-0.12.0-4-linux-x64.tar.gz
- xpack-openocd-0.12.0-4-linux-x64.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the docker build:
xpm run docker-build --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run docker-build-development --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
About 10 minutes later (3 minutes on an Ampere server), the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/linux-arm64/deploy
folder:
- xpack-openocd-0.12.0-4-linux-arm64.tar.gz
- xpack-openocd-0.12.0-4-linux-arm64.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To run the docker build:
xpm run docker-build --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for more verbosity, run the similar development build:
xpm run docker-build-development --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
About 10 minutes later, the output of the build script is a compressed
archive and its SHA signature, created in
the build-assets/build/linux-arm/deploy
folder:
- xpack-openocd-0.12.0-4-linux-arm.tar.gz
- xpack-openocd-0.12.0-4-linux-arm.tar.gz.sha
To rerun the build, invoke the deep-clean action and repeat from install:
xpm run deep-clean --config linux-arm -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Compile with debug info
In some cases it is necessary to run a debug session with the binaries.
For these cases, the build script accepts the --debug
option.
There are also xpm actions that use this option (build-development-debug
and docker-build-development-debug
).
Use a local cache
The XBB build scripts use a local cache: files are downloaded only during the first run, and later runs reuse the cached files.
However, occasionally some servers may not be available, and the builds may fail.
The workaround is to manually download the files from alternate
locations (like
https://github.com/xpack-dev-tools/files-cache/tree/master/libs),
place them in the XBB cache (Work/cache
) and restart the build.
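A manual download can be sketched like this; the archive name libusb-1.0.26.tar.bz2 is only an illustrative example, and the || branch keeps the snippet harmless when the mirror is unreachable:

```shell
# The XBB scripts check this folder before downloading anything.
mkdir -p "${HOME}/Work/cache"

# Fetch a missing archive from the files-cache mirror and place it
# directly into the cache; adjust the file name to the one that failed.
curl -L -f -o "${HOME}/Work/cache/libusb-1.0.26.tar.bz2" \
  "https://github.com/xpack-dev-tools/files-cache/raw/master/libs/libusb-1.0.26.tar.bz2" \
  || echo "download failed; try another location"
```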
Manual tests
For the simplest functional test, plug a common board like the STM32F4DISCOVERY into a USB port, start the program and check whether the CPU is identified.
Note: if this is the first time openocd is executed, on GNU/Linux it is necessary to configure the access rights, otherwise libusb will issue the libusb_open failed: LIBUSB_ERROR_ACCESS error.
sudo cp ~/Downloads/xpack-openocd-0.12.0-4/contrib/60-openocd.rules /etc/udev/rules.d
sudo udevadm control --reload-rules
Then it is possible to start openocd:
$ .../bin/openocd -f "board/stm32f4discovery.cfg"
xPack Open On-Chip Debugger 0.12.0-01004-g9ea7f3d64-dirty
Licensed under GNU GPL v2
For bug reports, read
https://openocd.org/doc/doxygen/bugs.html
Info : The selected transport took over low-level target control. The results might differ compared to plain JTAG/SWD
srst_only separate srst_nogate srst_open_drain connect_deassert_srst
Info : Listening on port 6666 for tcl connections
Info : Listening on port 4444 for telnet connections
Info : clock speed 2000 kHz
Info : STLINK V2J39S0 (API v2) VID:PID 0483:3748
Info : Target voltage: 2.901598
Info : [stm32f4x.cpu] Cortex-M4 r0p1 processor detected
Info : [stm32f4x.cpu] target has 6 breakpoints, 4 watchpoints
Info : starting gdb server for stm32f4x.cpu on 3333
Info : Listening on port 3333 for gdb connections
[stm32f4x.cpu] halted due to breakpoint, current mode: Handler HardFault
xPSR: 0x61000003 pc: 0x080002d6 msp: 0x2001ff78
^C
shutdown command invoked
Note: on recent macOS systems it might be necessary to allow individual programs to run.
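One common workaround (an assumption, not an official step of these scripts) is to remove the quarantine attribute from the unpacked folder; the path below is only an example of the archive location:

```shell
# Clear the Gatekeeper quarantine flag on macOS; a no-op elsewhere.
if [ "$(uname)" = "Darwin" ]; then
  xattr -dr com.apple.quarantine \
    "${HOME}/Downloads/xpack-openocd-0.12.0-4" 2>/dev/null || true
  status="quarantine cleared"
else
  status="not macOS, nothing to do"
fi
echo "$status"
```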
For a more thorough test, run a debug session with
the Eclipse STM32F4DISCOVERY blinky test
available in the xpack-arm-none-eabi-openocd package, which uses
the -f "board/stm32f4discovery.cfg"
configuration file
(import the arm-f4b-fs
project and start the arm-f4b-fs-debug-oocd
launcher).