[Gluster-devel] Reducing merge conflicts
Prashanth Pai
ppai at redhat.com
Fri Jul 15 05:13:59 UTC 2016
Hi,
Here are some observations on how the OpenStack community does things:
1. Their Bitergia equivalent, named Stackalytics (http://stackalytics.com/), lists
contributions with the *default metric being Reviews*. There are other metrics such as
Commits, Patch Sets, Person-day effort, etc.
2. They have a common 'Review Dashboard' (a really long Gerrit query) that everyone
looks at every day. This helps in picking up patches that have passed regression,
easier patches to review, or important ones starred by the community. (A sample
query is sketched right after this list.)
3. A +1 given on the spec design is treated as a gentle promise to review the code
when the feature lands. It is also an ACK from the maintainers of the components
that the feature touches or modifies.
4. Auto-abandon really old patches that have had no activity in the recent past, and
send a reminder mail to the author about the patch. (A sketch of this also follows
the list.)
5. Contributors there are more comfortable with any reviewer uploading a new patch set
and then adding a co-author line to the commit message. This is taken into account
by the contribution metrics tool.
6. Have hackathons to collectively review code of priority features.
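
To make item 2 above concrete: such a dashboard can be little more than a saved
Gerrit search. A minimal sketch, assuming our project is 'glusterfs' and that the
CI label is called 'Verified' (our actual label names may differ):

    # open glusterfs patches that have passed CI
    status:open project:glusterfs label:Verified=+1

    # of those, ones nobody has approved yet
    status:open project:glusterfs label:Verified=+1 -label:Code-Review=+2

    # open patches starred by me
    status:open project:glusterfs is:starred

These can be combined into one long query (or a set of dashboard sections) that
everyone bookmarks.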
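
And for item 4, a periodic job could find and abandon stale patches over Gerrit's
SSH interface. A rough sketch only, assuming review.gluster.org on port 29418, a
3-month cut-off, and a made-up change number; a real job should mail the author
before abandoning anything:

    # list open changes untouched for 3 months or more, as JSON
    ssh -p 29418 review.gluster.org gerrit query --format=JSON \
        "status:open project:glusterfs age:3month"

    # abandon one stale change (change-number,patch-set)
    ssh -p 29418 review.gluster.org gerrit review --abandon 12345,6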
Some other small suggestions:
1. Have coding-guideline deviations and styling issues checked by a Jenkins job
(non-voting, at least to start with), if possible. (A rough sketch follows this list.)
2. Personally, I use 'git blame' to identify reviewers to add, since they have modified
the same parts of the code in the past and it might be easier for them to review the
change. (An example follows the list as well.)
3. Individual contributors can set personal goals for themselves, such as one review
per day. It would be great if this becomes a habit.
4. I see a lot of code that I know very little about. I go through it but stop short
of giving a +1 or -1 because of my lack of familiarity with it. I'm sure there are
others too who are eager to contribute reviews but aren't very well versed with a
specific component. It should be okay to leave review comments (which can also be
questions, for learning) without giving a +1 or -1.
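
To expand on suggestion 1: the non-voting Jenkins job would only need to run a
checkpatch-style script against the change being tested and report the warnings.
A minimal sketch, assuming a kernel-style checkpatch.pl is available to the job
(the path and flags here are placeholders):

    #!/bin/sh
    # Run a style check on the topmost commit of the change being tested.
    # Always exit 0 so the result stays informational; the Gerrit trigger
    # for this job would also be configured as non-voting.
    git format-patch -1 --stdout HEAD > /tmp/change.patch
    ./checkpatch.pl --no-tree /tmp/change.patch || true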
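
For suggestion 2, what I actually run is something like the following (the file
path and line range are just an example):

    # who last touched the lines my patch modifies?
    git blame -L 100,160 xlators/cluster/afr/src/afr.c

    # who has been working on this file recently? good reviewer candidates
    git log --since='1 year ago' --format='%an <%ae>' -- xlators/cluster/afr/src/afr.c |
        sort | uniq -c | sort -rn | head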
Cheers.
-Prashanth Pai
----- Original Message -----
> From: "Vijay Bellur" <vbellur at redhat.com>
> To: "Jeff Darcy" <jdarcy at redhat.com>, "Pranith Kumar Karampuri" <pkarampu at redhat.com>
> Cc: "Gluster Devel" <gluster-devel at gluster.org>
> Sent: Friday, 15 July, 2016 9:01:55 AM
> Subject: Re: [Gluster-devel] Reducing merge conflicts
>
> On 07/14/2016 03:39 PM, Jeff Darcy wrote:
> >> The feedback I got is, "it is not motivating to review patches that are
> >> already merged by maintainer."
> >
> > I can totally understand that. I've been pretty active reviewing lately,
> > and it's an *awful* demotivating grind. On the other hand, it's also
> > pretty demotivating to see one's own hard work "rot" as the lack of
> > reviews forces rebase after rebase. Haven't we all seen that? I'm
> > sure the magnitude of that effect varies across teams and across parts
> > of the code, but I'm equally sure that it affects all of us to some
> > degree.
> >
>
> Yes, we need to avoid this kind of rot. We have 500+ open patches across
> all branches today in the review pipeline. Maybe we could clean
> up/abandon some of these patches that are not relevant anymore? Once we
> have done that, I think it would be useful to have an ageing-based policy
> to auto-abandon patches. That might help us be more proactive in managing
> our review queues.
>
> IMO reviewing code is fun, and it takes some time to get better at. I
> have personally imbibed good coding practices and understood things
> better by reading and reviewing code. Several others [1] [2] also
> emphasize the importance of code reviews to both the developer and the
> project. If there is enough interest in code reviews and how to get
> better at them, some of our more prolific & experienced code reviewers
> could share their thoughts over a hangout session?
>
>
> >
> >> Do you suggest they should change that
> >> behaviour in that case?
> >
> > Maybe. The fact is that all of our maintainers have plenty of other
> > responsibilities, and not all of them prioritize the same way. I know I
> > wouldn't be reviewing so many patches myself otherwise. If reviews are
> > being missed under the current rules, maybe we do need new rules.
> >
> >> let us give equal recognition for:
> >> patches sent
> >> patches reviewed - this one is missing.
> >> helping users on gluster-users
> >> helping users on #gluster/#gluster-dev
> >>
> >> Feel free to add anything more I might have missed out. Maybe new
> >> ideas/design/big-refactor?
> >
> > Also doc, infrastructure work, blog/meetup/conference outreach, etc.
> >
> >> let people do what they like more among these and let us also recognize
> >> them
> >> for all their contributions. Let us celebrate their work in each monthly
> >> newsletter.
> >
>
>
> Some of these aspects are easy to measure and can be automated. It is
> fairly trivial to generate review statistics using gerrit's gsql
> interface. I even have a script/query lying around somewhere for that
> and can dig it up if somebody is interested in improving it to
> automate the process.
>
> -Vijay
>
> [1]
> http://goodmath.scientopia.org/2011/07/06/things-everyone-should-do-code-review/
>
> [2] http://users.encs.concordia.ca/~pcr/paper/Rigby2014TOSEM.pdf
>
>
> _______________________________________________
> Gluster-devel mailing list
> Gluster-devel at gluster.org
> http://www.gluster.org/mailman/listinfo/gluster-devel
>