Qt Quality Assurance Infrastructure / QTQAINFRA-2496

Adding provisioning changes takes too much developer time


Details

    Description

      Currently the provisioning scripts are used to fulfill a couple of goals:
      1. installation of software necessary for Qt compilation (like installation of cmake)
      2. configuration of the systems (like disabling screen savers and indexing)
      3. network bandwidth optimization (like pre-downloading pip packages)
      4. other caches, like the network test server

      When the provisioning process was designed, only the first two use cases were known; we also assumed (wrongly) that the provisioning scripts would not change very often.

      Now, depending on the use case, the integration time differs. For (2) it is long but somewhat reasonable: roughly one Qt5 integration. For (1) it already gets tricky: if the additional software is needed for the whole of Qt5 and the versioning is not too complex, it also takes about one Qt5 integration; but if the provisioning script is complex or has to be changed often (for example for a version bump), a migration path is needed. For example, if a module A depends on a 3rd party library Foo1 and has to update it to Foo2, one needs to:

      • make A work with both Foo1 and Foo2 at the same time (a build-time switch)
      • create & merge a qt5 change that installs Foo2 (hoping that the previous step was correct; if it was not, start over)
      • remove support for Foo1 in module A

      A similar problem happens for cases (3) and (4), but these almost always require a custom solution every time an update is needed. The Qt5 round trip takes literally weeks, and it happens almost every month.

      The source of the issue is that provisioning and the code that depends on it are split across two repositories, so it is impossible to make the change atomic in any way. Atomicity between a module update and a provisioning update would speed up development by avoiding the need for a compatibility path for (1) and (4) (and partially (3), as pip requirements may be out of sync). That saves two Qt5 integrations and at least one module integration.

      Proposed solution 1:
      Implement cross-repository atomic integration in Coin and Gerrit. If we could stage patches from different repositories together, we could stage a Qt5 change together with a module change.
      Advantage: A generic solution that can be used for API updates too.
      Disadvantage: Requires Gerrit modifications.

      Proposed solution 2:
      Allow modules to contribute additional provisioning scripts. Currently only Qt5 contains provisioning scripts; if every module could add them, it would solve the atomicity problem, as well as a problem not mentioned before, locality (currently we install software without indicating which module really depends on it).
      Advantage: As a bonus, it solves the locality problem.

      Option 2: provisioning script computation example:
      Case: We build QtBase

      • We look into qt/qtbase/coin; if it has valid provisioning scripts, we add them to the package
      • We look into qt/qt5/coin; if it has valid provisioning scripts, we add them to the package
      • We look into every submodule of Qt5 excluding QtBase; if they have valid provisioning scripts, we add them to the package
      • We create / re-use tier2
      • We build qtbase.
      • Done

      Case: We build QtSvg

      • We look into qt/qtsvg/coin; if it has valid provisioning scripts, we add them to the package
      • We look into qt/qt5/coin; if it has valid provisioning scripts, we add them to the package
      • We look into every submodule of Qt5 excluding QtSvg; if they have valid provisioning scripts, we add them to the package
      • We create / re-use tier2
      • We build / re-use all qtsvg dependencies
      • We build qtsvg
      • Done
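
      Both cases above follow the same pattern. A minimal sketch of the collection step, assuming a per-repository coin/provisioning directory, shell scripts as the script format, and a collect_provisioning_scripts helper (all assumptions for illustration, not the actual Coin implementation), could look like this:

      # Hypothetical sketch: gather the provisioning scripts for a build of `target`,
      # assuming every repository may ship scripts under <repo>/coin/provisioning.
      from pathlib import Path
      from typing import List

      def valid_scripts(repo: Path) -> List[Path]:
          """Return the provisioning scripts a repository contributes, if any."""
          provisioning_dir = repo / "coin" / "provisioning"
          if not provisioning_dir.is_dir():
              return []
          # What counts as "valid" is a policy decision; here we simply take all *.sh files.
          return sorted(provisioning_dir.glob("**/*.sh"))

      def collect_provisioning_scripts(target: Path, qt5: Path, submodules: List[Path]) -> List[Path]:
          """Order: the module being built, then qt5, then every other submodule."""
          package: List[Path] = []
          package += valid_scripts(target)      # e.g. qt/qtbase/coin or qt/qtsvg/coin
          package += valid_scripts(qt5)         # qt/qt5/coin
          for module in submodules:
              if module != target:              # every submodule of Qt5 excluding the target
                  package += valid_scripts(module)
          return package                        # would go into the tier2 provisioning package

      For the QtBase case this would be called with target=Path("qt/qtbase"); the QtSvg case differs only in the target, the rest of the order stays the same.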

      Side notes (from comments):


      This proposal doesn't mean that we should suddenly move all the provisioning scripts down to submodules. It only enables a way to define such scripts there. One can think about the policy, but stuff like pip, the network test server, and probably Conan should live at the module level, while disabling updates, screen savers etc. should stay at the qt5 level.


      Note that many provisioning changes have no visible impact until explicitly used. As with the network test server, code needs to opt in to use it. Pre-downloading of pip requirements is similar. There is no practical chance of breaking any other module.


      Provisioning is not special; it is code. It should be (auto-)tested and it may contain bugs. The same principles should apply to it as to other parts of the code.
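
      As a hedged illustration only (the checked tool and the test itself are made-up examples, not existing CI checks), such an auto-test could be a small post-provisioning smoke test:

      # Hypothetical post-provisioning smoke test: verify that software the
      # provisioning scripts were supposed to install is actually usable.
      import shutil
      import subprocess
      import unittest

      class ProvisioningSmokeTest(unittest.TestCase):
          def test_cmake_is_installed(self):
              self.assertIsNotNone(shutil.which("cmake"), "cmake missing after provisioning")

          def test_cmake_runs(self):
              result = subprocess.run(["cmake", "--version"], capture_output=True, text=True)
              self.assertEqual(result.returncode, 0, result.stderr)

      if __name__ == "__main__":
          unittest.main()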


      Option 2 does _not_ cause silent breakages; at worst it may cause delayed breakages, which become visible during the Qt5 update, exactly the same as any other API breakage. With the right governance / policies we can _guarantee_ no breakages at all (cases 3 and 4).
