On 23/07/15 10:09, Paul Sherwood wrote:
> as part of what I'm doing in GENIVI and AGL, I've undertaken to prepare
> a Statement of Work for public-facing continuous integration and
> automated test infrastructure.
Thanks for sharing! This is an interesting (and HUGE) area...
> My hope is to get the implementation moving quickly, using available
> FOSS solutions as far as possible, and therefore I'll be happy to
> recommend Baserock tooling (eg Trove) if it clearly makes sense and can
> fit with established technology choices (eg Bitbake for builds).
Makes sense. A lot of tools in our problem domain still seem to overlap
with each other rather than interoperate easily. It would be great to be
able to spend time figuring out how they complement each other, what the
best combinations are, and writing and documenting suitable plugins/glue.
> SoW: Continuous Integration and Automated Test
>
> Various open source 'umbrella' projects including GENIVI, AGL and PRPL
> have a need to establish and maintain public-facing infrastructure for
> Continuous Integration and Automated Testing of their software.
I never know what 'architecture' means exactly when talking about
software, but I want to point out that you can't test a design or plan
directly. You have to test an implementation of it. I think that's an
important distinction, because it means that any testing plan needs to
be clear about exactly what implementation is going to be tested, and
how it is produced. (Or, if it supports multiple implementations, what
interfaces they must implement in order to be testable.)
> * collection of source code from upstreams including new versions as
>   required on an ongoing basis
> * recipes and/or automated instructions for building the upstream
>   components
> * recipes and/or automated instructions for installing the upstream
>   components into an integrated 'distro'
Is there a reason you have to specify 'recipes and/or automated
instructions'? I think 'automated instructions' should be enough ;)
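To illustrate what I mean, an 'automated instruction' could be as simple as a machine-readable list of steps plus a runner. Everything below (field names, the use of `echo` as a stand-in build command) is a hypothetical sketch, not any particular tool's recipe format:

```python
import subprocess

# Hypothetical minimal "automated instruction": a machine-readable list
# of build steps plus a runner. Field names and commands are
# illustrative only, not any real tool's recipe format.
recipe = {
    "name": "example-component",
    "repo": "git://example.org/example-component.git",
    "build-steps": [
        ["echo", "configuring example-component"],
        ["echo", "building example-component"],
    ],
}

def run_recipe(recipe):
    """Run each build step in order, failing fast; return captured output."""
    output = []
    for step in recipe["build-steps"]:
        completed = subprocess.run(step, capture_output=True,
                                   text=True, check=True)
        output.append(completed.stdout)
    return output

for line in run_recipe(recipe):
    print(line, end="")
```

The point is only that the instructions are executable without a human in the loop; whether they are called 'recipes' is bikeshedding.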
> * Continuous Integration tooling which executes the build instructions
>   * on user demand, or
>   * triggered by changes to the instructions or components themselves
> * publication of the built/integrated artefacts which are output by the
>   execution of the CI tooling
> * Automated Test tooling which executes sets of tests on deployed
>   instances of the integrated software
> * publication of the results of the automated tests
I think you should be clear about what 'results' means. I think it's
three things: pass/fail information, complete log files, and the actual
built artefacts.
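To make that concrete, a published result could bundle those three things in one record. This is a hypothetical shape; the field names and URLs are made up for illustration:

```python
import json

# Hypothetical shape for one published test result, bundling pass/fail
# status, the complete log, and the built artefact that was tested.
# Field names and URLs are illustrative, not a real schema.
result = {
    "test": "boot-to-prompt",
    "status": "pass",                                   # pass/fail information
    "log_url": "https://ciat.example/logs/1234.txt",    # complete log file
    "artefact_url": "https://ciat.example/artefacts/image-1234.img",
}

print(json.dumps(result, indent=2))
```

Publishing all three together means a failure report is reproducible: you can fetch the exact artefact that failed and read the full log, not just a red/green flag.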
> * demonstration of license compliance
> * mechanism for formal code review prior to merging of changes, either
>   * before build and test, or
>   * after build and test, as a candidate (preferred)
I can't think of anything else to add to this list, looks good.
> * all of the CIAT software tooling must be FOSS
> * any custom work to make the solution fit specific use-cases for
>   GENIVI, AGL, PRPL must be FOSS
> * where possible the solution should align with the upstream developers
>   for its components
> * instructions for creation and setup of a working instance of CIAT
>   infrastructure must be provided
> * minimal manual maintenance - the system should be mostly automated:
>   * minimal effort for license compliance (eg scripted license
>     checking, publish applicable source for all artefacts)
> * developer/integrator engineers can trigger build/test using git push
> * CIAT must support
>   * several automated test frameworks (eg fitnesse, phoronix, LTSI Test)
>   * testing scenarios on multiple architectures (eg ARMv7, x86_64) for
>     both virtual machines (eg QEMU) and actual hardware
Looks good at a glance.
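One concrete possibility for the 'scripted license checking' item: scan sources for SPDX-License-Identifier tags. The SPDX tag convention is real; whether it is the right check for these projects, and the treatment of a missing tag as a failure, are assumptions of this sketch:

```python
import re

# Minimal sketch of scripted license checking: report each file's
# declared SPDX license, flagging files with no tag. The SPDX tag
# convention is real; treating a missing tag as a compliance failure
# is this sketch's assumption, and the input files are made up.
SPDX_RE = re.compile(r"SPDX-License-Identifier:\s*(\S+)")

def check_sources(files):
    """Map each file name to its declared license, or None if untagged."""
    report = {}
    for name, text in files.items():
        match = SPDX_RE.search(text)
        report[name] = match.group(1) if match else None
    return report

# Example input: two in-memory "files", one tagged and one not.
files = {
    "ok.c": "/* SPDX-License-Identifier: GPL-2.0-only */\nint main(void){return 0;}\n",
    "untagged.c": "int main(void){return 0;}\n",
}
print(check_sources(files))
```

A real deployment would walk a source tree rather than a dict, and would probably want to cross-check the declared licenses against a whitelist per project.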
I'd be very interested in seeing the results of the research that is
done into existing software in this area. How about requiring that the
results of that research be published publicly?
Sam Thursfield, Codethink Ltd.
Office telephone: +44 161 236 5575