Anuket Project
Release Process Based on CD
Current state
We need processes (and preferably only a single process) that address the needs of our different target audiences:
- OPNFV developers who work with upstream projects
- NFV developers who want to use OPNFV as a standard platform, or to test new hardware platforms
- Service providers who want to test and benchmark different hardware
- (What other groups do we need to serve and what are their needs?)
A more elaborate description of target audiences and their needs can be found here: User stories for OPNFV release artifacts
The current release model, which starts from the latest stable versions of upstream projects, can produce stable releases. The benefits are:
- OPNFV releases have a steady, predictable cadence
- The upstream project versions are already stabilized and most bugs have been fixed
- There is always a well-defined "latest OPNFV release" with which, for example, the OVP program can work
- There is a community of people working with the same released version
- Communication about the OPNFV releases is easier, since there is a well-defined release with a well-defined, tested feature set
The current OVP program assumes a reference platform that contains only released upstream versions and that passes all planned compliance tests.
Issues with current state
Some of the issues with the current OPNFV release model are:
- OPNFV developers who are working with upstream projects cannot use the OPNFV releases for development and testing, since the upstream versions they contain are older than what those developers work on
- If OpenStack moves to annual releases, OPNFV releases could include OpenStack versions that are several releases old
- Installer projects have to support both old and recent versions of OpenStack
- The current state also has the typical issues of the waterfall model: integration happens relatively late, there is a rush to integrate installers and CI testing, and pre-release testing can take an unpredictable amount of time
- As a consequence, the release dates often slip
- Making the releases takes a lot of effort at the end, which can be problematic if the effort available in OPNFV decreases
- Some features that did not quite make it into the main release have been delivered in maintenance releases, which blurs the concept of a release
- Sometimes the maintenance releases have fared worse in testing than the main release, since all effort is already going into the next release
- The baseline for releases (OS packages, OpenStack, and other upstream projects) can change, so a release is not truly "stable": an installer for an OPNFV release pulls in packages from Linux distro repositories, from GitHub, and so on, and these upstream sources can change (as has already happened)
- To make the progress towards a release transparent and predictable, the milestones must be enforced, which takes effort and requires penalties for missing deadlines. This reduces the incentive to participate in a release
- Overall, what is the incentive to participate in a release?
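The baseline-drift issue above could be mitigated by pinning every upstream source to an exact version, commit, or snapshot rather than a floating branch. The following sketch illustrates the idea; the manifest format and component names are hypothetical and not part of any actual OPNFV installer:

```python
# Hypothetical pinned-baseline manifest: every upstream source is fixed
# to an exact commit or dated snapshot, so installing the same release
# twice pulls in exactly the same content. Component names are
# illustrative only.
PINNED_BASELINE = {
    "openstack/nova":    "commit:0f1e2d3c",
    "openstack/neutron": "commit:4b5a6978",
    "linux-distro/base": "snapshot:2018-06-01",
}

def is_reproducible(manifest):
    """A baseline is reproducible only if no entry floats on a branch
    or an unversioned "latest" source."""
    floating = ("branch:", "latest", "master")
    return all(
        not any(pin.startswith(f) for f in floating)
        for pin in manifest.values()
    )

print(is_reproducible(PINNED_BASELINE))          # True
print(is_reproducible({"x": "branch:master"}))   # False
```

The point is that "stable release" only holds if the install-time inputs are frozen; a branch name or distro repository can change underneath a release, as the bullet above notes has already happened.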
Goals for the release process
The release process should meet the requirements of the different stakeholders:
- Serve, from the same process, different audiences that require different trade-offs between recency and stability (e.g. "Developers" and "End Users")
- The maturity of each installed version can vary, but it should be clear for the user how well tested the installed version is
- It should be possible to uniquely identify the installed version
- Every installation of the OPNFV stack should produce the same end result
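The goals of unique identification and reproducible installations are related: an installation identifier can be derived from the exact versions of all installed components, so identical component sets always map to the same identifier. A minimal sketch, with hypothetical component names:

```python
import hashlib
import json

def install_id(components):
    """Derive a stable identifier for an installed stack from the exact
    versions of all its components. The same component set always yields
    the same id; any version difference yields a different id."""
    canonical = json.dumps(sorted(components.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

a = {"openstack": "17.0.10", "ceph": "12.2.7", "odl": "0.8.2"}
b = dict(a, ceph="12.2.8")

print(install_id(a) == install_id(dict(a)))  # True: same versions, same id
print(install_id(a) == install_id(b))        # False: versions differ
```

Such an identifier only stays meaningful if installations are reproducible, which is why the two goals appear together in the list above.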
Alternatives
One variation on the current system, discussed since the beginning of OPNFV, is changing the release cadence, for example:
- 12-month release cadence: this would reduce the overall effort, since releases would come less frequently
- 1-2 month release cadence: the releases would be smaller, so hopefully less stressful
- 12-month releases with 1-month point releases: combines the benefits of both of the above
An approach could be a process based on CI evolution: stepwise qualification through additional testing until an appropriate level of confidence is reached. Artifacts for different audiences could be taken from different stages of this pipeline, i.e. the process would determine when artifacts become available from each stage of the pipeline, and what that implies for the allocation of testing resources to the different pipeline stages at a given time.
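The stepwise-qualification idea can be sketched as a promotion pipeline: an artifact advances to the next stage only after passing that stage's test gate, and each audience consumes artifacts from the stage matching its stability needs. The stage names and gates below are illustrative assumptions, not a defined OPNFV pipeline:

```python
# Illustrative promotion pipeline: stages ordered from least to most
# qualified. Developers could consume early-stage artifacts, while end
# users take only artifacts that survived the full pipeline.
STAGES = ["built", "smoke-tested", "integration-tested", "release-qualified"]

def promote(artifact, gate_results):
    """Advance the artifact through the stages while each stage's gate
    passes; return the highest stage reached."""
    stage = STAGES[0]
    for next_stage in STAGES[1:]:
        if not gate_results.get(next_stage, False):
            break
        stage = next_stage
    return stage

# A build that passed smoke and integration tests but not full
# release qualification stops at "integration-tested":
print(promote("build-1234", {
    "smoke-tested": True,
    "integration-tested": True,
    "release-qualified": False,
}))
```

Under this model, "making a release" becomes a decision about which pipeline stage an artifact must reach before it is published to a given audience, rather than a separate end-phase integration effort.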
Proposal
There are still several open questions, however. The proposal is to set up a working group to define the release process.