Contribute
Introduction
Feel free to contribute to this project, you are very welcome. You can start by filing issues and ideas for improvement, but if you feel like contributing some code you are welcome even more!
Our favorite thoughts from The Zen of Python:
Beautiful is better than ugly.
Simple is better than complex.
Readability counts.
We respect the PEP8 Style Guide for Python Code. Here’s a couple of recommendations to keep in mind when writing code:
Maximum line length is 99 for code and 72 for documentation.
Comments should be complete sentences.
The first word should be capitalized (unless it is an identifier).
When using hanging indent, the first line should be empty.
The closing brace/bracket/parenthesis on multiline constructs is under the first non-whitespace character of the last line (see the example at the end of this list).
When generating user messages use the whole sentence with the first word capitalized and enclose any names in single quotes:
self.warn(f"File '{path}' not found.")
For multiline shell scripts, use multiline strings - ShellScript will perform dedenting correctly:
ShellScript(
    f"""
    mkdir -p {workdir_root};
    setfacl -d -m o:rX {workdir_root}
    """
    )
However, for strings that are too long for a single line and are expected to render as a single line, literal concatenation is preferred:
raise tmt.steps.provision.RebootModeNotSupportedError(
    f"Guest '{self.multihost_name}' does not support soft reboot."
    " Containers can only be stopped and started again (hard reboot)."
    )
See the “maximum line length” point above for the actual limit.
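For illustration, hanging indent with the closing parenthesis placed under the first non-whitespace character of the last line might look like this (a made-up snippet, not taken from the tmt code base):
result = process_data(
    first_argument,
    second_argument,
    )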
Commits
It is challenging to be both concise and descriptive, but that is what a well-written summary should do. Consider the commit message as something that will be pasted into release notes:
The first line should have up to 50 characters.
Complete sentence with the first word capitalized.
Should concisely describe the purpose of the patch.
Do not prefix the message with file or module names.
Other details should be separated by a blank line.
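For example, a commit message following these guidelines might look like this (a made-up example, not taken from the actual history):
Implement retries for the ansible prepare plugin

Occasional network hiccups should no longer break the prepare
step. Retry the playbook execution a couple of times before
giving up and failing the step.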
Why should I care?
It helps others (and yourself) find relevant commits quickly.
The summary line will be reused later (e.g. for rpm changelog).
Some tools do not handle wrapping, so it is then hard to read.
You will make the maintainers happy to read beautiful commits :)
You can get some more context in the Stack Overflow article.
Code Assistants
When a code assistant tool is used during code creation, use the following format for storing the attribution in the commit message:
- Assisted-by
when the tool has partially improved, polished or enhanced the code written by you
- Generated-by
in cases when the majority of the code was generated by the tool itself
For example:
Assisted-by: Claude
Generated-by: Cursor
Develop
In order to experiment, play with the latest bits and develop improvements, it is best to use a virtual environment. Make sure that you have all required packages installed on your box:
make develop
Create a development virtual environment with hatch:
git clone https://github.com/teemtee/tmt
cd tmt
hatch env create dev
Enter the environment by running:
hatch -e dev shell
When interacting from within the development environment with services using internal certificates, you need to export the following environment variable:
export REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt
Install the pre-commit script to run all available checks for
your commits to the project:
pre-commit install
Tests
Every code change should be accompanied by tests covering the new feature or affected code area. It’s possible to write new tests or extend the existing ones.
If writing a test is not feasible for you, explain the reason in
the pull request. If possible, the maintainers will help with
creating needed test coverage. You might also want to add the
help wanted and tests needed labels to bring a bit more
attention to your pull request.
Run the default set of tests directly on your localhost:
tmt run
Run selected tests or plans in verbose mode:
tmt run --verbose plan --name basic
tmt run -v test -n smoke
You might want to set some useful environment variables when
working on tmt tests, for example TMT_FEELING_SAFE to
allow the local provision method or TMT_SHOW_TRACEBACK to
show the full details for all failures. Consider installing the
direnv command which can take care of these for you.
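For example, you might export something like the following in your shell or in a local .envrc file consumed by direnv (the exact values shown are illustrative):
export TMT_FEELING_SAFE=1
export TMT_SHOW_TRACEBACK=1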
Unit Tests
To run unit tests in the hatch environment using pytest and generate a coverage report:
make coverage
To see all available scripts for running tests in hatch test virtual environments:
hatch env show test
To run the ‘unit’ script, for example, run:
hatch run test:unit
When running tests using hatch, there are multiple virtual environments available, each using a different Python interpreter (generally the lowest and highest version supported). To run the tests in all environments, install the required Python versions. For example:
dnf install python3.9 python3.11
Note
When adding new unit tests, do not create class-based tests derived from
the unittest.TestCase class. Such classes do not play well with pytest's
fixtures, see https://docs.pytest.org/en/7.1.x/how-to/unittest.html for
details.
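For instance, a plain test function combined with a fixture plays nicely with pytest (a minimal, made-up sketch, not an existing tmt test):
import pytest


@pytest.fixture
def sample_file(tmp_path):
    # Prepare a temporary file used by the test below.
    path = tmp_path / "example.txt"
    path.write_text("hello")
    return path


def test_sample_file(sample_file):
    # Plain functions receive fixtures directly as arguments.
    assert sample_file.read_text() == "hello"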
Provision Methods
Tests which exercise multiple provision methods should use the
PROVISION_HOW environment variable to select which provision
method should be exercised during their execution. This variable
should usually default to local in the test script so that the
default scenario executes directly on the test runner. If a test
does not support the local provision method, make sure to use
the provision-only tag so that the test in question is excluded
from the regular plans.
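In a beakerlib test script this might look roughly like the following sketch (assuming the usual rlRun helper):
PROVISION_HOW=${PROVISION_HOW:-local}
rlRun "tmt run --all provision --how $PROVISION_HOW"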
The following tags can be used to enable given test under the respective provision method plan:
- provision-artemis
For tests checking the artemis plugin functionality.
- provision-beaker
For tests checking the beaker plugin functionality using the mrack plugin.
- provision-connect
For tests checking the connect plugin functionality.
- provision-container
For tests checking the container provision method using the podman plugin.
- provision-virtual
For tests checking the virtual.testcloud provision method using the testcloud plugin.
- provision-ssh
Tests which are not tied to a specific provision method but should be executed for all provision methods which use ssh to connect to guests.
- provision-only
Used to mark tests which are suitable to be run only under specific provision methods. These will be excluded from regular plans.
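For example, a test which only works under the container provision method might be tagged roughly like this in its fmf metadata (an illustrative sketch):
tag:
  - provision-only
  - provision-container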
Images
Tests which exercise the container provisioning plugin with various guest environments should use the custom-built set of container images rather than using the upstream ones directly. We build custom images to have better control over the initial environment setup, especially when it comes to essential requirements and assumptions tmt makes about the guest setup. The naming scheme also provides better information about the content of these images when compared to the very varied upstream locations.
Naming scheme
All our test images follow a simple naming pattern:
localhost/tmt/tests/BACKEND/DISTRIBUTION/RELEASE/EXTRAS:TAG
- localhost/tmt/tests
To make it clear the image was built locally, it is owned by tmt, and it is not packaging tmt but serves for testing purposes only.
- BACKEND
There are various kinds of “images”: the most well-known ones would be Docker/Podman images, whose names would contain the container flag, and QCOW2 images for VMs which would be labeled with virtual.
- DISTRIBUTION
A lower-cased name of the Linux distribution hosted in the image: fedora, ubuntu, alpine, etc.
- RELEASE
A release of the DISTRIBUTION: 7 for CentOS 7, stream9 for CentOS Stream 9, or 40, rawhide and even coreos for Fedora.
- EXTRAS
Additional flags describing a “flavor” of the image: upstream images are identical to an upstream image, adding no special setup on top of the upstream; unprivileged images come with password-less sudo setup and may be used when unprivileged access is part of the test; ostree images are Fedora CoreOS images that simulate being deployed by ostree.
- TAG
Usually latest as in “the latest image for this distro, release and extra flags”.
Note
So far we do not have much use for other tags besides latest. The stable tag used for Fedora CoreOS images will probably go away in favor of latest.
For example, the following images can be found:
# Latest Alpine, with added Bash to simulate proper essential setup:
localhost/tmt/container/test/alpine
# Various CentOS releases:
localhost/tmt/container/test/centos/7
localhost/tmt/container/test/centos/stream9
# Fedora rawhide, with dnf5 pre-installed:
localhost/tmt/container/test/fedora/rawhide
# Same, but with password-less sudo set up:
localhost/tmt/container/test/fedora/rawhide/unprivileged
To build these images, run the following:
# Build all images...
make images/test
# ... or just a single one:
make images/test/tmt/container/test/fedora/rawhide:latest
Tests that need to use various container images should trigger this command before running the actual test cases:
rlRun "make -C images/test"
To list built container images, run the following:
podman images | grep 'localhost/tmt/tests/' | sort
To remove these images from your local system, run the following:
make clean/images/test
Docs
When submitting a change affecting user experience it’s always good to include respective documentation. You can add or update the Metadata Specification, extend the Examples or write a new chapter for the user Guide.
tmt documentation is written with reStructuredText and built with Sphinx. Various features of both reST and Sphinx are used widely in tmt documentation, from inline markup to references. Feel free to use them as well to link new or updated documentation to relevant parts, to highlight important points, or to provide helpful examples.
A couple of best practices when updating documentation:
When referring to a plugin, its options or documentation, prefer a reference to /plugins/STEP/PLUGIN rather than to the older /spec/plans/STEP/PLUGIN:
# This is good:
:ref:`/plugins/prepare/ansible`

# If the user-facing plugin name differs from the Python one,
# or if you need to capitalize the first letter:
:ref:`Beaker</plugins/provision/beaker>`

# This should be avoided:
:ref:`/spec/plans/prepare/ansible`
Design the plugin docstrings and help texts as if they are to be rendered by Sphinx, i.e. make use of ReST goodies: literals for literals (metavars, values, names of environment variables, commands, keys, etc.), code-block for blocks of code or examples. It leads to better HTML docs, and tmt has a nice CLI renderer as well, therefore there is no need to compromise for the sake of CLI.
Use full sentences, i.e. capital letters at the beginning & a full stop at the end.
Use Python multiline strings rather than joining multiple strings over several lines. It often leads to leading and/or trailing whitespace characters that are easy to miss.
Plugin docstring provides the bulk of its CLI help and HTML documentation. It should describe what the plugin does.
Non-trivial use cases and keys deserve an example or two.
Unless there’s an important difference, describe the plugin’s configuration in terms of fmf rather than CLI. It is easy to map fmf to CLI options, and fmf makes a better example for someone writing fmf files.
When referring to plugin configuration in user-facing docs, speak about “keys”: “playbook key of prepare/ansible plugin”. Keys are mapped 1:1 to CLI options, let’s make sure we avoid polluting docs with “fields”, “settings” and other synonyms.
A metavar should represent the semantic of the expected value, i.e. --file PATH is better than --file FILE, --playbook PATH|URL is better than --playbook PLAYBOOK.
If there is a default value, it belongs to the default= parameter of tmt.utils.field(), and the help text should not mention it because the “Default is …” sentence can be easily added automatically and rendered correctly with show_default=True (see the sketch at the end of this list).
When showing an example of plugin configuration, include also an example for the command line:
Run a single playbook on the guest:

.. code-block:: yaml

    prepare:
        how: ansible
        playbook: ansible/packages.yml

.. code-block:: shell

    prepare --how ansible --playbook ansible/packages.yml
Do not use the :caption: directive of code-block, it is understood by Sphinx only and the docutils package cannot handle it.
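To loosely illustrate the points above about metavars and defaults, a plugin key declaration might look roughly like the following hypothetical sketch (check tmt.utils.field() for the actual set of parameters):
timeout: int = field(
    default=60,
    option="--timeout",
    metavar="SECONDS",
    help="How long to wait for the guest to boot.",
    show_default=True,
    )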
Examples
By default, examples provided in the specification stories are
rendered as yaml. In order to select a different syntax
highlighting schema add # syntax: <format>, for example:
# syntax: shell
Building documentation is then quite straightforward:
make docs
Find the resulting html pages under the docs/_build/html
folder.
Visual themes
Use the TMT_DOCS_THEME variable to easily pick a custom theme.
If specified, make docs will use this theme for documentation
rendering by Sphinx. The theme must be installed manually, make
docs will not do so. The variable expects two strings, separated
by a colon (:): the theme package name and the theme name.
# Sphinx book theme, sphinx-book-theme:
TMT_DOCS_THEME="sphinx_book_theme:sphinx_book_theme" make docs
# Renku theme, renku-sphinx-theme - note that package name
# and theme name are *not* the same string:
TMT_DOCS_THEME="renku_sphinx_theme:renku" make docs
By default, docs/_static/tmt-custom.css provides additional tweaks
to the documentation theme. Use the TMT_DOCS_CUSTOM_HTML_STYLE
variable to include an additional file:
$ cat docs/_static/custom.local.css
/* Make content wider on my wider screen */
.wy-nav-content {
max-width: 1200px !important;
}
TMT_DOCS_CUSTOM_HTML_STYLE=custom.local.css make docs
Note
The custom CSS file specified by TMT_DOCS_CUSTOM_HTML_STYLE
is included before the built-in tmt-custom.css, therefore to
override theme CSS, it is recommended to add the !important flag.
tldr pages
The tldr pages are maintained in the central tldr-pages
repository. To modify existing pages or add new ones, submit your
changes directly there by following their contribution
guidelines.
Translations of existing pages into other languages are welcomed. If you’d like to help translate pages, please follow the same contribution process described above.
Note
Changes made directly to documentation in this repository will not be reflected in the tldr pages collection.
Issues
Before creating a new issue you might want to check the existing issues to prevent filing a duplicate. Important issues affecting many users are marked with the known issue label.
When creating a new issue, do not use any prefixes or file paths in the summary. For categorizing issues we use labels and issue types, which make it easier to filter relevant areas.
All newly identified bugs have to be filed as github issues and
marked with the Bug type. So, before creating a pull request
which is fixing a bug, make sure there is a corresponding issue
filed and mention it in the pull request description. This is
needed because of Functional Safety certification requirements.
Pull Requests
When committing changes, provide a concise summary in the commit’s
first line and any essential details in the commit description so
that the pull request title & description are populated with
reasonable information. This helps reviewers to better and faster
understand your intentions. Keep the pull request description
up-to-date, especially after significant changes have been made to
the implementation, as its content will be used for the final
commit merged into the main branch.
If the pull request addresses an existing issue, mention it using one of the automatically parsed formats so that it is linked to it, for example:
Fix #1234.
When submitting a new pull request which is not completely ready
for merging but you would like to get an early feedback on the
concept, use the GitHub feature to mark it as a Draft rather
than using the WIP prefix in the summary. You might also want
to assign the ci | skip label if there’s no point in running
any tests yet.
By default only a core set of tests is executed against a newly
created pull request and its updates to verify basic sanity of the
change. Once the pull request content is ready for a thorough
testing add the ci | full test label and make sure that the
status | discuss label is not present. All future changes of
the pull request will be tested with the full test coverage. For
documentation-only changes the full test suite is not required.
During the pull request review it is recommended to add new commits with your changes on top of the branch instead of amending the original commit and doing a force push. This will make it easier for the reviewers to see what has recently changed and how exactly you have addressed the review comments.
It’s good to keep the pull request up-to-date with the main
branch. Rebase regularly, or use the /packit build command in a
pull request comment, if there were significant changes on the
default branch; otherwise newly added tests might cause unexpected
and irrelevant failures in your test jobs.
Once the pull request has been successfully reviewed and all tests
passed, please rebase on the latest main branch content. Check
that the pull request description is up-to-date as it will be used
for the final squash commit when merging the changes.
Checklist
The following checklist template is automatically added to the new pull request description to easily track progress of the implementation and prevent forgetting about essential steps to be completed before it is merged. Feel free to remove those which are irrelevant for your change.
Pull Request Checklist
* [ ] implement the feature
* [ ] write the documentation
* [ ] extend the test coverage
* [ ] update the specification
* [ ] adjust plugin docstring
* [ ] modify the json schema
* [ ] mention the version
* [ ] include a release note
The version should be mentioned in the specification and a release
note should be included, in the form of a new story file under the
docs/releases/pending directory, when a new essential feature
is added or an important change is introduced so that users can
easily check whether given functionality is already available in
their package:
.. versionadded:: 1.23
Review
Code review is an essential part of the workflow. It ensures good quality of the code and prevents introducing regressions, but it also brings some additional benefits: by reading code written by others you can learn new stuff and get inspired for your own code. Each completed pull request review helps you, little by little, to get familiar with a larger part of the project code and empowers you to contribute more easily in the future.
Pull requests ready for review can be easily filtered and found at the review tab.
For instructions on how to locally try a change on your laptop, see the Develop section. Basically just enable the development environment and check out the pull request branch, or use the github cli to check out code from a fork repository:
hatch -e dev shell # enable the dev environment
git checkout the-feature # if branch is in the tmt repo
gh pr checkout 1234 # check out branch from a fork
It is also possible to directly install packages freshly built by Packit for given pull request. See the respective Packit check for detailed installation instructions.
Note that you don’t have to always read the whole change. There are several ways to provide feedback on the pull request:
check how the documentation would be rendered in the docs/readthedocs.org pull request check, look for typos, identify wording which is confusing or not clear, point out that documentation is completely missing for some area
remind a forgotten item from the Checklist, for example suggest writing a release note for a new significant feature which should be highlighted to users
verify just the functionality, make sure it works as expected and confirm it in a short comment, provide a simple reproducer when something is broken
review only the newly added test case, verify that the test works as expected and properly verifies the functionality
Even a partial review which happens sooner is beneficial and saves time. Every single comment helps to improve and move the project forward. No question is a dumb question. All feedback counts!
Communication
When discussing changes proposed in a pull request review we encourage you to use the preferred channel to make the overall communication more efficient:
- GitHub comments
Preferred for review process and discussion to have all the information at one place
Asynchronous nature - contributors don’t have to react immediately
- Chat
Useful for quick sync
Do not overuse to prevent too many unnecessary interruptions
Merging
Pull request merging is done by one of the maintainers who have a good overview of the whole code. The maintainer who will take care of the process will assign themselves to the pull request. Before merging it’s good to check the following:
New test coverage added if appropriate, all tests passed
Documentation has been added or updated where appropriate
At least two positive reviews provided by the maintainers
Merge commits are not used, rebase on the latest main instead.
Use the GitHub’s Squash & Merge button which will generate the
commit message from the pull request description. Just make sure
it is sane, the content is up-to-date, and remove the checklist
from it before merging.
Pull requests which should not or cannot be merged are marked with
the status | blocked label. For complex topics which need more
eyes to review and discuss before merging use the status |
discuss label.
Makefile
There are several Makefile targets defined to make the common daily tasks easy & efficient:
- make test
Execute the unit test suite.
- make smoke
Perform quick basic functionality test.
- make coverage
Run the test suite under coverage and report results.
- make docs
Build documentation.
- make packages
Build rpm and srpm packages.
- make images
Build container images.
- make tags
Create or update the Vim tags file for quick searching. You might want to use set tags=./tags; in your .vimrc to enable parent directory search for the tags file as well.
- make clean
Cleanup all temporary files.
Sprints
The team works in biweekly sprints with the following schedule:
Sprint day 0, Thursday … retrospective & planning meeting
Sprint day 3, Tuesday … hacking session, progress check
Sprint day 4, Wednesday … issue triage
Sprint day 6, Friday … all must have items should be done
Sprint day 8, Tuesday … hacking session, pre-release sync
Sprint day 9, Wednesday … issue triage, new version released
At the planning meeting (sprint day 0) the team agrees on issues, tasks and pull requests which should be completed during the sprint. Any additional items which pop up during the sprint:
Are proposed to the team and discussed together
Release lead is consulted and approves the inclusion
Team members sync daily in the chat about the progress and blocking issues. The progress is tracked in the current sprint board. Tentative suggestions for the next sprint can be added by anybody. Those will be reviewed and dropped or approved in the planning meeting.
See the triage project for the current status of the issue triage and the backlog for the list of approved and prioritized issues.
Release
The tmt project is released biweekly. If there are urgent
changes which need to be released quickly, a hotfix release may be
created to address the important problem sooner.
Regular
Make sure there is at least one release note prepared in the
docs/releases/pending directory. If there is none, create at
least a short summary in docs/releases/pending/changes.fmf.
Start a new release using the release script. Provide the full
version as the parameter, for example:
./scripts/release 1.23.0
Create the release pull request using the link provided by the script and follow the release checklist there.
Note
Although the release pull request usually does not contain any functional changes, it is mandatory to make sure that the full test coverage has been successfully executed as a formal proof that the release is safe and stable.
Handle manually what did not go well:
If the automation triggered by publishing the new github release was not successful, publish the fresh code to the pypi repository manually using hatch build && twine upload.
If there was a problem with creating Fedora pull requests, you can trigger them manually using /packit propose-downstream in any open issue.
Hotfix
The following steps should be followed when an important urgent fix needs to be released before the regular schedule (see the command sketch after the list):
Create a new branch from the fedora branch
Use git cherry-pick to apply the selected change
Mention the hotfix release on the release page
Add a Release x.y.z commit, empty if needed: git commit --allow-empty -m "Release x.y.z"
Create a new pull request with the target branch set to fedora
Make sure that tests pass and merge the pull request
Tag the commit and publish the release in the same way as for a regular release
Create a pull request with the hotfix release notes changes
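The first steps might look roughly like this (an illustrative sketch, assuming the upstream remote is called origin and using placeholder values):
git fetch origin
git checkout -b hotfix origin/fedora
git cherry-pick <commit-with-the-fix>
git commit --allow-empty -m "Release x.y.z"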
Releaser
Taking care of a new tmt release is not just about performing
the final steps described above. In this role you should shepherd
the issues and pull requests like sheep so that they make it to
the main branch by the proposed deadline. Here’s a couple of
recommendations which could help you to make the release process
smooth and timely:
continually watch the issues & pull requests and gently push them forward if any of them seems to get stuck
bring attention especially to those with the high priority, the must issues and pull requests should be finished ideally one week before the release deadline
regularly check the pull request progress and highlight those which are waiting for feedback on the review sessions
if there is anything not clear and needs discussion bring it to the chat or raise the topic on the weekly sessions
do not hesitate to contact assignees directly, e.g. on the chat, if there is no update for a longer time, consider also reassigning the issue to another contributor if necessary
if there are pull requests ready for merging but not included in the release, it might make sense to squeeze them in, to make the development more fluent, just make sure they do not slow down important issues