[Gluster-Maintainers] Maintainer meeting minutes: 20th August, 2018

Amar Tumballi atumball at redhat.com
Mon Aug 20 16:43:40 UTC 2018


BJ Link

   - Bridge: https://bluejeans.com/217609845
   - Download: https://bluejeans.com/s/IVExy

Attendance

   - Nithya, Amar, Nigel, Sunny, Ravi, Kaleb, kshlm, Raghavendra M,
   Raghavendra G, Shyam (Partial)
   - Calendar Declines: ppai

Agenda

   - Master lockdown, and other stability initiatives
      - Where do we stand?
         - [Amar] Everything looks good at this point. Not as many random
         failures as there used to be.
      - What needs further attention?
         - There are still some random failures on brick-mux regression.
      - Any metrics?
         - [Atin]
          https://fstat.gluster.org/summary?start_date=2018-08-13&end_date=2018-08-20
          tracks the failures reported since the master lockdown was revoked.
         - [Atin] Overall, things look much more stable after the batch of
          test fixes worked on during this lockdown.
         - The c7 nightly reported green 5 out of 7 times; 1 run failed
          with a core (
          https://lists.gluster.org/pipermail/gluster-devel/2018-August/055298.html),
          and the 18th August run seems to have hit a network failure.
         - The brick-mux nightly hit a couple of issues over the last week:
          1. a crash in tests/basic/ec/ec-5-2.t, and 2. a spurious failure in
          tests/bugs/replicate/bug-1586020-mark-dirty-for-entry-txn-on-quorum-failure.t
          (fix merged today).
         - The line-coverage nightly failed once in
          tests/bugs/replicate/bug-1586020-mark-dirty-for-entry-txn-on-quorum-failure.t
          (fix merged today).
   - The outstanding review queue is quite long and needs review attention.
      - Maintainers should keep track of their lanes and how many patches
      are pending.
   - v5.0 branching
      - Can we get all the features completed by today?
         - What are the features we’re targeting for this release? GitHub
          doesn’t give a clear picture. This is a problem.
      - Will this release be just a stability release?
         - There is nothing wrong with calling it a stability release.
         - If there are no features, and we are still working on Coverity,
          clang, and other code-coverage initiatives, should we delay the
          branching?
         - If we’re calling this a stability release, perhaps we should
         look into memory leaks that have been reported on the lists already.
      - What about clang-format? Planned email content here
      <https://hackmd.io/sP5GsZ-uQpqnmGZmFKuWIg?view>
         - Around branching point is the time to get this done.
         - The messaging around “personal preference” for style vs “project
         preference” needs to be clear.
         - [Nigel] Still need to see the content, and give the proper set
         of tasks.
         - 2 options:
            1. A big change that reformats all files at once, making
            currently posted patches conflict.
            2. Require each contributor to run clang-format on the files
            they touch.
            - Pre-commit hook vs. server-side change.
         - There is a problem with `git review` or direct `git push` compared
          to `./rfc.sh`. We need a smoke job for sure.
         - Should we force everyone to install clang, or make it optional
          for users?
         - AI on Amar to give a date for completing this activity, by EOD
          21st August, 2018.
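         - The pre-commit hook option discussed above could look roughly
          like the following sketch (file names, the `.clang-format` style
          file, and the extension filter are assumptions, not the agreed
          procedure):

```shell
#!/bin/sh
# Hypothetical .git/hooks/pre-commit sketch for option 2: reformat only
# the files a contributor touches. Assumes clang-format is installed
# locally and a .clang-format style file exists at the repo root.
for f in $(git diff --cached --name-only --diff-filter=ACM | grep -E '\.[ch]$'); do
    clang-format -i "$f"   # rewrite in place using the project's style file
    git add "$f"           # re-stage the reformatted file
done
```

          Installing clang-format client-side could stay optional if a
          server-side smoke job re-checks formatting on every patch.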
      - [Kaleb] what is the status of python3? (fedora 29(?) and rhel8 are
      looming.)
         - Seemingly, we’re not 100% python3 ready
         <https://lists.gluster.org/pipermail/gluster-devel/2018-August/055216.html>.
         There are bugs that need fixing.
         - Ship with Python3 and we’ll fix the bugs as we find them.
         - Let’s change all the shebangs to python3.
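         - As a rough illustration of that shebang change (the matched
          shebang variants and the tree-wide path are assumptions, not the
          agreed procedure), a sketch:

```shell
# Hypothetical sketch: rewrite python/python2 shebangs to python3 across
# the tree. Only the first line of each matching .py file is edited.
grep -rlE '^#!/usr/bin/(env )?python2?$' --include='*.py' . \
  | xargs -r sed -i '1s|^#!/usr/bin/\(env \)\?python2\?$|#!/usr/bin/env python3|'
```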
   - GCS:
      - Any updates?
         - CSI driver under review -
         https://github.com/gluster/gluster-csi-driver/pull/11
            - We can land the patch and fix the license later too.
             - Good to go ahead and merge, then take the follow-up patches;
             it would move things faster.
            - Kaushal to review Madhu’s changes on top of the PR, and if
            things look OK, then we can merge the PR.
         - GD2 + Gluster nightly container image -
         https://hub.docker.com/r/gluster/glusterd2-nightly/
             - https://github.com/gluster/glusterd2/tree/master/extras/nightly-container
         - Build pipeline - in progress. Waiting on infra to have all the
         deps installed.
         - Deployment script in-progress
   - Mountpoint
      - Presentations ready?
      - All set for travel?
         - Some delays with visa arrival; a few maintainers who were supposed
          to travel from India will confirm only by the end of the week.
   - Round Table
      - [Amar] Can we disable the bd (block-device) translator in the
       build/code? No users raised concerns in response to the deprecation
       email. This would resolve many issues in Coverity/gcc8/clang etc.
         - Generally, yes
         - [Kaleb] What about 'cluster/stripe' ?
      - [Nigel] Who owns glusterfsd folder? Could not find a module or
      owner in maintainers file. Right now tracked as “other” in Coverity.
         - Generally the same set who are part of ‘libglusterfs’
      - [Kaleb] what is the status of the selinux package?
         - Repo created; no updates since then.
         - AI: Milind to update the status. Latest known status is spec
         file being written.
      - [Nigel] Glusto test runs and failures - We need more dev
      involvement to help debug test failures. More in the next meeting.
         - Focused components should pass for sure.
         - AI: Make a list of components which are priority, so we can
         focus on them first.


---

Regards,
Amar