How to contribute to xPack OpenOCD
This page is intended for developers who plan to contribute new features or bug fixes to the xPack OpenOCD project; it documents how to build and test the package.
The xPack Build Box
The build scripts in this project utilise the xPack Build Box (XBB)
tools, which require the usual native development tools
(packaged as a Docker image for GNU/Linux builds), complemented with
several binary packages, installed with xpm
as development dependencies.
For those interested in understanding how things work, a good starting point would be to read the XBB page.
The XBB tools are intended for building standalone relocatable distributions, thus are quite complex and perform several post-processing steps to adjust RPATH and validate the resulting binaries.
For the traditional configure && make install
builds specific to Linux,
these scripts are probably too complicated and therefore are not recommended
for inexperienced users.
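For reference, a plain native build of the upstream OpenOCD on GNU/Linux follows the usual autotools flow. The sketch below is only an outline; the required development packages and the configure options depend on the distribution and on the desired adapter drivers:

# Clone the read-only mirror of the upstream sources (see Other repositories below).
git clone https://github.com/openocd-org/openocd.git
cd openocd
# Generate the configure script (this also initialises the git submodules).
./bootstrap
./configure --prefix=/usr/local
make -j$(nproc)
sudo make install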
xPack build configurations
The xPack Framework supports projects with multiple build configurations.
Build configurations are sets of properties, actions and dependencies that apply to a specific build. Build configurations can inherit from other build configurations.
For simple projects, the typical use case is with two configurations, Debug and Release.
For building the binary xPack executables, there is one configuration for each platform:
win32-x64
darwin-x64
darwin-arm64
linux-x64
linux-arm64
In case you wonder where these names originate
from, they are exactly the Node.js process.platform
and process.arch
for each platform.
The build configurations are defined in the package.json
file, within the
xpack
section.
{
  "...": "...",
  "xpack": {
    "buildConfigurations": {
      "...": {},
      "win32-x64": {
        "inherit": [
          "common-dependencies",
          "common-actions",
          "common-docker"
        ],
        "devDependencies": {
          "@xpack-dev-tools/gcc": "13.2.0-2.1",
          "@xpack-dev-tools/mingw-w64-gcc": "13.2.0-1.1",
          "@xpack-dev-tools/wine": "8.0.2-1.1"
        },
        "properties": {
          "dockerImage": "ilegeul/ubuntu:amd64-18.04-xbb-v5.2.2"
        },
        "actions": {
          "build": "{{properties.commandBashBuild}} --windows",
          "build-development": "{{properties.commandBashBuild}} --windows --development",
          "build-development-debug": "{{properties.commandBashBuild}} --windows --development --debug",
          "build-development-tests-only": "{{properties.commandBashBuild}} --windows --development --tests-only"
        }
      }
    }
  }
}
To request xpm to perform a specific action on a given build configuration,
utilise the --config <name>
option.
For example:
xpm install --config darwin-x64
xpm run build --config darwin-x64
xpm/xPack actions
The xpm actions are an extension of npm scripts, i.e. named sequences of commands that are invoked via xpm run <name> to perform specific operations.
The commands run in a sub-shell with an adjusted PATH, having the xpacks/.bin folder prepended. This ensures that the locally installed tools are preferred over the system tools.
Actions may be defined for the entire project or for a specific build configuration.
The actions are defined in the package.json
file, within the
xpack
section, at the top or inside build configurations.
For those who, for various reasons, cannot utilise xpm, it is perfectly possible to adjust the PATH manually and to invoke the sequence of commands in order; it is just more tedious, since multiple substitutions must be performed to compose the commands.
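As an illustration only (the actual command is defined by the commandBashBuild property in package.json, and the script path below is hypothetical), the manual equivalent of an action looks like this:

# Prepend the locally installed tools to the PATH, as xpm run would do.
export PATH="$HOME/Work/xpack-dev-tools/openocd-xpack.git/build-assets/xpacks/.bin:$PATH"
# Invoke the expanded command of the desired action; the script name below is
# hypothetical, check the "actions" definitions in package.json for the exact command.
bash scripts/build.sh --windows --development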
Visual Studio Code integration
xpm/xPack actions and build configurations are supported in Visual Studio via the xPack C/C++ Managed Build Tools extension.
With this extension installed, xpm actions may be very conveniently invoked via a single mouse click.
Prerequisites
The build scripts execute on GNU/Linux and macOS. The Windows binaries are compiled on GNU/Linux, utilising mingw-w64.
For details on installing the prerequisites, please read the Build Prerequisites page.
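As a quick reminder (the Build Prerequisites page remains the authoritative reference), xpm itself is a Node.js command line application, installed globally with npm:

# Requires a recent Node.js; installs or updates the xpm CLI globally.
npm install --global xpm@latest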
Obtain project sources
The project is hosted on GitHub:
- https://github.com/xpack-dev-tools/openocd-xpack.git
Branches
This project utilises multiple branches:
- master, not actively used
- xpack, containing the latest stable version (default)
- xpack-development, containing the current development version
- website, containing the current website content; pushes to this branch automatically trigger publication of the main website
- development, containing the current preview website content; pushes to this branch automatically trigger publication of the preview website
All development is conducted in the xpack-development
branch, and contributions via
Pull Requests should be directed to this branch.
When new releases are published, the xpack-development
branch is merged
into xpack
.
To clone the stable branch (xpack
), execute the following commands in a
terminal (on Windows use the Git Bash console):
rm -rf ~/Work/xpack-dev-tools/openocd-xpack.git && \
mkdir -p ~/Work/xpack-dev-tools && \
git clone \
--branch xpack \
https://github.com/xpack-dev-tools/openocd-xpack.git \
~/Work/xpack-dev-tools/openocd-xpack.git
For development purposes, clone the xpack-development
branch.
rm -rf ~/Work/xpack-dev-tools/openocd-xpack.git && \
mkdir -p ~/Work/xpack-dev-tools && \
git clone \
--branch xpack-development \
https://github.com/xpack-dev-tools/openocd-xpack.git \
~/Work/xpack-dev-tools/openocd-xpack.git
Alternatively, if the repository has already been cloned:
git -C ~/Work/xpack-dev-tools/openocd-xpack.git pull
To contribute Pull Requests, fork the project and ensure that the "Copy the master branch only" option is disabled.
Utilise the xpack-development
branch and ensure you contribute the
Pull Requests back to the xpack-development
branch.
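A possible sequence for preparing a Pull Request from a fork is sketched below (replace <your-account> with your GitHub account; the branch name is only an example):

cd ~/Work/xpack-dev-tools/openocd-xpack.git
# Add the fork as an additional remote.
git remote add fork https://github.com/<your-account>/openocd-xpack.git
# Branch off the development branch.
git checkout xpack-development
git checkout -b my-fix
# ... edit and commit the changes ...
git push fork my-fix
# then open the Pull Request against xpack-development on GitHub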
Add links for development
During development, it is convenient to maintain a writable instance of the library to enable changes in parallel with the parent project.
To facilitate the use of a writable instance of this library in other projects, add a link from the user's global xPacks store to this local development folder:
xpm link -C ~/Work/xpack-dev-tools/openocd-xpack.git
And in the projects referring it:
xpm link @xpack-dev-tools/openocd
Get the writable helper sources (optional, for development purposes)
The project depends on a common helper, which is normally installed as a read-only dependency; for development
purposes, to be able to make changes to the scripts located within the helper,
clone the xpack-development
branch and link it to
the user's global xPacks store:
rm -rf ~/Work/xpack-dev-tools/xbb-helper-xpack.git && \
mkdir -p ~/Work/xpack-dev-tools && \
git clone \
--branch xpack-development \
https://github.com/xpack-dev-tools/xbb-helper-xpack.git \
~/Work/xpack-dev-tools/xbb-helper-xpack.git && \
xpm link -C ~/Work/xpack-dev-tools/xbb-helper-xpack.git
For more details on how a writable helper may be utilised via
xpm link
, please refer to the
XBB documentation.
Other repositories
Other repositories in use are:
- https://github.com/openocd-org/openocd.git - a read-only mirror of the upstream OpenOCD (git://git.code.sf.net/p/openocd/code)
How to build
The builds require dedicated machines for each platform (x64 GNU/Linux, arm64 GNU/Linux, x64 macOS and arm64 macOS).
Update the repository
git -C ~/Work/xpack-dev-tools/openocd-xpack.git pull
... and the helper (when using a writable helper) ...
git -C ~/Work/xpack-dev-tools/xbb-helper-xpack.git pull
Build the binaries
- Windows
- macOS x64
- macOS arm64
- GNU/Linux x64
- GNU/Linux arm64
Windows
The Windows builds execute on GNU/Linux, utilising mingw-w64.
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To execute the docker build:
xpm run docker-build --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for increased verbosity, execute the similar development build:
xpm run docker-build-development --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created within
the build-assets/build/win32-x64/deploy
folder:
- xpack-openocd-0.12.0-7-win32-x64.tar.gz
- xpack-openocd-0.12.0-7-win32-x64.tar.gz.sha
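As a quick sanity check, the hash of the archive can be compared with the content of the .sha file; a minimal sketch, assuming the checksum is SHA-256:

cd ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets/build/win32-x64/deploy
# Show the expected checksum.
cat xpack-openocd-0.12.0-7-win32-x64.tar.gz.sha
# Compute the hash locally and compare it with the value above.
shasum -a 256 xpack-openocd-0.12.0-7-win32-x64.tar.gz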
To re-execute the build, invoke the deep-clean action and repeat from installation:
xpm run deep-clean --config win32-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
macOS x64
To prepare the native build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To execute the native build:
xpm run build --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for increased verbosity, execute the similar development build:
xpm run build-development --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
About 5 minutes later, the output of the build script is a compressed
archive and its SHA signature, created within
the build-assets/build/darwin-x64/deploy
folder:
- xpack-openocd-0.12.0-7-darwin-x64.tar.gz
- xpack-openocd-0.12.0-7-darwin-x64.tar.gz.sha
To re-execute the build, invoke the deep-clean action and repeat from installation:
xpm run deep-clean --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
macOS arm64
To prepare the native build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm install --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To execute the native build:
xpm run build --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for increased verbosity, execute the similar development build:
xpm run build-development --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created within
the build-assets/build/darwin-arm64/deploy
folder:
- xpack-openocd-0.12.0-7-darwin-arm64.tar.gz
- xpack-openocd-0.12.0-7-darwin-arm64.tar.gz.sha
To re-execute the build, invoke the deep-clean action and repeat from installation:
xpm run deep-clean --config darwin-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
GNU/Linux x64
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To execute the docker build:
xpm run docker-build --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for increased verbosity, execute the similar development build:
xpm run docker-build-development --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created within
the build-assets/build/linux-x64/deploy
folder:
- xpack-openocd-0.12.0-7-linux-x64.tar.gz
- xpack-openocd-0.12.0-7-linux-x64.tar.gz.sha
To re-execute the build, invoke the deep-clean action and repeat from installation:
xpm run deep-clean --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
GNU/Linux arm64
To prepare the docker build:
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
... or, with the writable helper ...
xpm run install -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run link-deps -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-prepare --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets && \
xpm run docker-link-deps --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
To execute the docker build:
xpm run docker-build --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
or, for increased verbosity, execute the similar development build:
xpm run docker-build-development --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Several minutes later, the output of the build script is a compressed
archive and its SHA signature, created within
the build-assets/build/linux-arm64/deploy
folder:
- xpack-openocd-0.12.0-7-linux-arm64.tar.gz
- xpack-openocd-0.12.0-7-linux-arm64.tar.gz.sha
To re-execute the build, invoke the deep-clean action and repeat from installation:
xpm run deep-clean --config linux-arm64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
Compile with debug information
In some cases it is necessary to execute a debug session with the binaries.
For these cases, the build script accepts the --debug option.
There are also xpm actions that utilise this option (build-development-debug
and docker-build-development-debug
).
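For example, assuming the corresponding install/prepare steps shown above were already performed and these actions are defined for the respective configurations (as in the win32-x64 example earlier), the native debug build on macOS and the docker debug build on GNU/Linux can be invoked like this:

xpm run build-development-debug --config darwin-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets
xpm run docker-build-development-debug --config linux-x64 -C ~/Work/xpack-dev-tools/openocd-xpack.git/build-assets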
Utilise a local cache
The XBB build scripts utilise a local cache, so that files are downloaded only during the first run; subsequent runs reuse the cached files.
However, occasionally some servers may not be available, and the builds may fail.
The workaround is to manually download the files from alternate
locations (such as
https://github.com/xpack-dev-tools/files-cache/tree/master/libs),
place them in the XBB cache (Work/cache
) and restart the build.
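A sketch of manually priming the cache, assuming it is located in ~/Work/cache (the archive name is only a placeholder; use the exact name reported by the failed download):

mkdir -p ~/Work/cache
curl -L -o ~/Work/cache/<archive-name> \
  https://github.com/xpack-dev-tools/files-cache/raw/master/libs/<archive-name>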
Manual tests
For the simplest functional case, plug a common board like the STM32F4DISCOVERY into a USB port, start the program and check if the CPU is identified.
Note: if this is the first time openocd is executed, on GNU/Linux it is necessary to configure the access rights (udev rules), otherwise libusb will issue the libusb_open failed: LIBUSB_ERROR_ACCESS error.
sudo cp ~/Downloads/xpack-openocd-0.12.0-7/contrib/60-openocd.rules /etc/udev/rules.d
sudo udevadm control --reload-rules
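Before starting openocd, it may also help to confirm that the probe is visible on the USB bus; for example, for the on-board ST-LINK/V2 reported in the session below (VID:PID 0483:3748):

lsusb | grep -i 0483:3748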
Then it is possible to start openocd:
$ .../bin/openocd -f "board/stm32f4discovery.cfg"
xPack Open On-Chip Debugger 0.12.0-01004-g9ea7f3d64-dirty
Licensed under GNU GPL v2
For bug reports, read
https://openocd.org/doc/doxygen/bugs.html
Info : The selected transport took over low-level target control. The results might differ compared to plain JTAG/SWD
srst_only separate srst_nogate srst_open_drain connect_deassert_srst
Info : Listening on port 6666 for tcl connections
Info : Listening on port 4444 for telnet connections
Info : clock speed 2000 kHz
Info : STLINK V2J39S0 (API v2) VID:PID 0483:3748
Info : Target voltage: 2.901598
Info : [stm32f4x.cpu] Cortex-M4 r0p1 processor detected
Info : [stm32f4x.cpu] target has 6 breakpoints, 4 watchpoints
Info : starting gdb server for stm32f4x.cpu on 3333
Info : Listening on port 3333 for gdb connections
[stm32f4x.cpu] halted due to breakpoint, current mode: Handler HardFault
xPSR: 0x61000003 pc: 0x080002d6 msp: 0x2001ff78
^C
shutdown command invoked
Note: on recent macOS systems it might be necessary to allow individual programs to run.
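While the server is running, an additional quick check is to connect to the telnet port advertised in the log (4444) and issue a few commands; a minimal sketch, assuming a telnet client is available:

telnet localhost 4444
# at the OpenOCD prompt:
# > targets
# > shutdown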
For a more thorough test, run a debug session with
the Eclipse STM32F4DISCOVERY blinky test
available in the xpack-arm-none-eabi-openocd package, which uses
the -f "board/stm32f4discovery.cfg"
configuration file
(import the arm-f4b-fs
project and start the arm-f4b-fs-debug-oocd
launcher).