Re: [tracecompass-dev] Trace compass Gerrit code review report

Hi Matthew, David, Jean and everyone else too, of course.

First of all, thank you very much for your feedback. Many of you answered the survey, many more than I would have expected, so thank you again. Your opinions were really useful to us.

I am glad that you are interested in our code review analytics project. I hope you don't mind if I reply to all of you in the same email.

Some work needs to be done to better classify the review weights, though.

Definitely! So far, we have only been testing the possibility of classifying these different kinds of changes using a machine learning approach, assessing how well it performs and how useful the information collected in this way might be.

Adding weights to the kinds of changes would be interesting for sure. Unfortunately, I still need to find a meaningful way to assign them.
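To make the weighting idea concrete, here is a minimal sketch of how weighted review effort could be computed. The class name, method, and especially the weight values are purely hypothetical illustrations, not part of the actual tool:

```java
import java.util.Map;

// Hypothetical sketch: weight review changes by kind, so that one
// functional change counts for more than several style changes.
public class WeightedEffort {
    // Example weights only; meaningful values would have to be
    // derived empirically, as discussed in this thread.
    static final Map<String, Double> WEIGHTS =
            Map.of("functional", 10.0, "documentation", 2.0, "style", 1.0);

    // Input: number of review changes per kind; output: weighted score.
    public static double score(Map<String, Integer> changeCounts) {
        double total = 0.0;
        for (var e : changeCounts.entrySet()) {
            total += WEIGHTS.getOrDefault(e.getKey(), 1.0) * e.getValue();
        }
        return total;
    }
}
```

With these example weights, one functional change and ten style changes would contribute equally to the score, which matches the intuition that a functional error takes longer to find and report.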

The community as a whole is represented by these charts, and I don't suggest categorizing by person, but I would suggest a year-over-year comparison on longer-lived projects.

Thanks, that was my idea as well. Unfortunately, for this report I had to limit the analysis to just one year because of the time constraints we have to create and send out reports. It is definitely possible to group changes per year and compare them; I will add this in a next "release" of this tool.

Finally, if you could share the results of this work, I would love to see what's up.

Sure, I am glad you are interested in it. Unfortunately, we are still polishing this work to publish it somewhere, so nothing is ready yet. However, I will happily send you a copy of it as soon as it is ready.


Firstly, can a patch count as multiple review changes, or does each patch set count as one review change?
If a patch set can count as multiple review changes, what separates one review change from another? 

Yes, a patch can contain multiple review changes. We defined a review change as a group of contiguous modified code lines. Furthermore, we tried to link the code changes in the old patch with those in the new patch to get a better sense of what really changed.
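For illustration, this grouping of contiguous modified lines could be sketched roughly as follows. The class and method names are hypothetical, not the actual tool's code, and real diffs would of course carry more information than bare line numbers:

```java
import java.util.List;

// Hypothetical sketch: group contiguous modified line numbers into
// "review changes". Each run of consecutive modified lines counts as
// one change; a gap of unmodified lines starts a new one.
public class ChangeGrouper {
    // Input: sorted line numbers modified by a patch set.
    // Output: number of distinct review changes.
    public static int countReviewChanges(List<Integer> modifiedLines) {
        if (modifiedLines.isEmpty()) {
            return 0;
        }
        int changes = 1;
        for (int i = 1; i < modifiedLines.size(); i++) {
            // Non-consecutive line numbers mean an unmodified gap.
            if (modifiedLines.get(i) > modifiedLines.get(i - 1) + 1) {
                changes++;
            }
        }
        return changes;
    }
}
```

Under this definition, modifying lines 50 and 52 yields two changes (because line 51 is untouched), which is exactly the "if"/"else" case discussed below.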

A patch set updates the javadoc of 2 methods. Would that count as 2 documentation changes?

Yes, in this first prototype of the tool, it would.

-A patch set updates the javadoc of 1 method, and makes a functional change. Would that count as 1 documentation change and 1 functional change?

Exactly.

-A patch set changes the condition of an "if" on line 50, and changes the condition of the related "else" on line 52. Would that count as 1 or 2 functional changes?

If there were unmodified lines in between, they would count as two changes. Unfortunately, we still have to define a sort of "logical linker" that is able to tell that multiple code changes should be linked and counted as one. This is definitely on my list of things to do to expand the current tool.

Secondly, if the same review change is made at the same line over multiple patch sets, does it count as one or multiple review changes?

The tool treats each patch as an independent unit, so at this stage it has no means to find this link and count the change as one. Thank you for pointing this out. It is definitely something I could improve while working on what I mentioned above. 

Thirdly, what would rebases and merge-conflict resolutions count as?

Good point. It depends on the specific case, I would say. If you have in mind an example of a review where this happens, I will be happy to check it and tell you what happens in that case.


Thank you very much again for all your help and valuable feedback.

Of course, feel free to ask more questions and reply to these answers if you want. I would be more than happy to discuss this work further with you.

Best,
   Enrico


On 22 Aug 2019, at 20:00, David Piché <david.piche@xxxxxxxxxxxx> wrote:

Hi Enrico,

I have some questions regarding review changes:

Firstly, can a patch count as multiple review changes, or does each patch set count as one review change?
If a patch set can count as multiple review changes, what separates one review change from another? 

Here are some examples:

-A patch set updates the javadoc of 2 methods. Would that count as 2 documentation changes?
-A patch set updates the javadoc of 1 method, and makes a functional change. Would that count as 1 documentation change and 1 functional change?
-A patch set changes the condition of an "if" on line 50, and changes the condition of the related "else" on line 52. Would that count as 1 or 2 functional changes?

Secondly, if the same review change is made at the same line over multiple patch sets, does it count as one or multiple review changes?

Here is an example:

-A patch set updates the javadoc of 1 method. The next patch set re-updates the javadoc, because of some typo. Does it count as 1 or 2 documentation changes?

Thirdly, what would rebases and merge-conflict resolutions count as?

BR,

David Piché

From: tracecompass-dev-bounces@xxxxxxxxxxx <tracecompass-dev-bounces@xxxxxxxxxxx> on behalf of Matthew Khouzam <matthew.khouzam@xxxxxxxxxxxx>
Sent: Thursday, August 22, 2019 1:22 PM
To: tracecompass developer discussions <tracecompass-dev@xxxxxxxxxxx>
Subject: Re: [tracecompass-dev] Trace compass Gerrit code review report
 
Hi Enrico,

You aren't bothering; many developers are on vacation as the school year starts soon. I'm sure you feel the crunch too. 🙂

I have filled out your survey, and I find it very interesting. Some work needs to be done to better classify the review weights, though.

Here is a simple example

class a{
 int b(){
   return "string";
 }
}

This code has one functional error, no javadoc, and 10 style errors.

So the chart implies that I would be spending 10+ times more effort on style than on functionality. In reality, a functional error takes longer to find and longer to report. So I would not be surprised if there is a different weight associated with it.

Now how much is the weight of a functional error or a structural error vs a style issue? I don't know. I also suspect that, as student developers finish their internships, fewer stylistic errors appear in reviews. The community as a whole is represented by these charts, and I don't suggest categorizing by person, but I would suggest a year-over-year comparison on longer-lived projects.

e.g. 
June 2018 vs June 2019. (second month of internships)

-- or --

May 2018 vs May 2019 (Near feature freeze)

Thoughts?

Finally, if you could share the results of this work, I would love to see what's up. I personally love the whole meta aspect of development.

BR
Matthew.

From: tracecompass-dev-bounces@xxxxxxxxxxx <tracecompass-dev-bounces@xxxxxxxxxxx> on behalf of Enrico Fregnan <fregnan@xxxxxxxxxx>
Sent: Thursday, August 22, 2019 1:10:44 PM
To: tracecompass developer discussions <tracecompass-dev@xxxxxxxxxxx>
Subject: Re: [tracecompass-dev] Trace compass Gerrit code review report
 
Hi Matthew,

Thank you for your reply and your help.

Sorry for bothering you; I was not sure how active the mailing list generally is.

I hope you will find the report interesting. This is just a first step, but I would like to keep working on these code review analytics to make them more understandable and actionable.

Thank you very much again for your help.


Best,
  Enrico


On 22 Aug 2019, at 19:05, Matthew Khouzam <matthew.khouzam@xxxxxxxxxxxx> wrote:

Hi Enrico,

Thank you for your interest in Trace Compass. I will reply now; please understand that our developers may be in different time zones, so an immediate response is not always possible.

//Matthew

From: tracecompass-dev-bounces@xxxxxxxxxxx <tracecompass-dev-bounces@xxxxxxxxxxx> on behalf of Enrico Fregnan <fregnan@xxxxxxxxxx>
Sent: Thursday, August 22, 2019 12:53:45 PM
To: tracecompass developer discussions <tracecompass-dev@xxxxxxxxxxx>
Subject: Re: [tracecompass-dev] Trace compass Gerrit code review report
 
Dear developers,

Sorry to bother you again. Unfortunately, I have to start processing the feedback we received soon.

Therefore, it would be great if you could give me your opinion too. This would help me make the report more useful and improve the quality of the information contained in it.

If you like the idea of this kind of code review analytics, please consider helping me. Your opinion is fundamental to help me understand what is already fine and which points should be improved (or whether it makes sense to improve them at all).

Here is the link to the report:




Thank you very much again for your help.

Best,
  Enrico

On 22 Aug 2019, at 11:42, Enrico Fregnan <fregnan@xxxxxxxxxx> wrote:

Dear developers,

I hope you had the chance to look at our report and found it interesting. 

Our goal is to create a Code Review Analytics tool to give the developers and project managers a better idea of what is really going on inside the code review process. 

This could help to further improve it, optimising the effort that developers spend on it. For instance, reducing the number of documentation changes might allow a reviewer to allocate more time to finding functional defects.

Another possible application might be to help optimise the allocation of developers to a particular review.

Here is a link to the report:


I would like to keep improving our tool by adding more actionable information. This is just the first step, so I am interested in hearing your feedback so that I can keep improving it.

Please consider helping me by answering the survey attached to the report if you think this information is useful, or might become useful with the next steps we would like to take in improving the tool.

Thank you very much.

Best,
  Enrico

On 21 Aug 2019, at 23:06, Enrico Fregnan <fregnan@xxxxxxxxxx> wrote:

Dear Trace Compass developers,

I am Enrico, a researcher at the University of Zurich, and I am studying how to evaluate the code review process to understand how it could be improved.

I created a tool that analyses the changes that happen during a review on Gerrit, to measure the effects of code review on code and to understand which kinds of changes reviewers pay attention to.

For example, are you aware that the majority of the changes made during code review in your project in the last year involve documentation issues?

My goal is to give developers insight into what really happens during the code review process, to check whether this matches their expectations or whether the process can be improved (e.g. improving the way reviewers are chosen, revising the code review policies, and so on).

You will find more data about your project at the following link:


https://www.surveygizmo.com/s3/5176724/Code-review-changes-report-Tracecompass


I would like to understand better what already works and what still needs to be improved. I am planning to keep improving the tool by adding more features, but I would need your help to understand which information should be added, removed, etc.

For this reason, the link contains the report and some questions we would like to ask you about it. The report is designed so that you don't have to answer the questions (if you just want to look at the report), but it would be great if we could know what you think about our tool and report. Answering will take you 5-10 minutes.

If you have any question about the report or our research, please don't hesitate to contact me.

Thank you very much for your time and sorry for bothering you. 

Cheers,
Enrico

_______________________________________________
tracecompass-dev mailing list
tracecompass-dev@xxxxxxxxxxx
To change your delivery options, retrieve your password, or unsubscribe from this list, visit
https://www.eclipse.org/mailman/listinfo/tracecompass-dev
