RE: [eclipse.org-eclipsecon-program-committee] Yes / No Polling Analysis


I agree with Don on this one. I think doing a broader review of tutorials is fine, but I would view community interest (how many folks attended) as a higher criterion than yes/no polling results.

Yes/no definitely kept things simple at last year's EC. Gotta love those binary numbers. Flip the bit as you walk out the door...

--Fitz

Brian Fitzpatrick
Eclipse Data Tools Platform PMC Chair
Eclipse Data Tools Platform Connectivity Team Lead
Staff Software Engineer, Sybase, Inc.



"Donald Smith" <donald.smith@xxxxxxxxxxx>
Sent by: eclipse.org-eclipsecon-program-committee-bounces@xxxxxxxxxxx

08/28/2008 07:12 AM

Please respond to
Eclipsecon Program Committee list <eclipse.org-eclipsecon-program-committee@xxxxxxxxxxx>

To
"'Eclipsecon Program Committee list'" <eclipse.org-eclipsecon-program-committee@xxxxxxxxxxx>
cc
Subject
RE: [eclipse.org-eclipsecon-program-committee] Yes / No Polling Analysis





Just to play devil's advocate, I would propose that "yes/no" % is not
necessarily an indicator of quality.  Perhaps it's an indicator of
agreeability of the speaker.  

For the business track, I will be looking more at how many votes were cast
overall as a measure of interest, and only look to correct if the response
was overwhelmingly negative.  In other words, I would recruit someone who was
75-25 last year over someone who was 9-1.  The first talk was obviously more
popular and elicited more response.


- Don

-----Original Message-----
From: eclipse.org-eclipsecon-program-committee-bounces@xxxxxxxxxxx
[mailto:eclipse.org-eclipsecon-program-committee-bounces@xxxxxxxxxxx] On
Behalf Of Peter Kriens
Sent: August 28, 2008 2:03 AM
To: Eclipsecon Program Committee list
Subject: Re: [eclipse.org-eclipsecon-program-committee] Yes / No Polling
Analysis

I am not sure the difference between Tutorials (89%) and Short Talks (96%)
is statistically significant. I also think that anything over 90% is
very good and hard to improve on. When you look at more detailed
reviews, you always find some people who thought it was too
technical and some who thought it was not technical enough. So 5% on
each side of the curve seems quite normal.
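Whether an 89% vs. 96% gap is significant really depends on how many ballots each track drew, which the per-track totals in the attached PDF would settle. As a minimal sketch only, here is a two-proportion z-test with made-up ballot counts (the 100-per-track figures below are purely illustrative, not from the actual data):

```python
import math

def two_prop_z(yes1, n1, yes2, n2):
    """Two-proportion z-statistic for the difference between
    two approval rates, using the pooled standard error."""
    p1, p2 = yes1 / n1, yes2 / n2
    pooled = (yes1 + yes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical counts: 89/100 "yes" for tutorials, 96/100 for short talks.
z = two_prop_z(yes1=89, n1=100, yes2=96, n2=100)
```

With only 100 ballots per track the statistic falls below the usual 1.96 cutoff for 5% significance, so Peter's caution would hold; with several hundred ballots per track the same percentages would clear it. The real answer hinges on the actual counts.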

I agree about the tutorial quality and I am ready to do more reviews.

As a suggestion for next year, I'd like to have orange buckets to indicate
"neutral". Currently neutral and good are bundled together; to find
out what the good stuff was, it would be nice to separate out what was really good.

With the avalanche of iPhones and people with portables, would it be
an idea to have some internet-based feedback on a session (in real
time)?

Kind regards,

                Peter Kriens


On 28 aug 2008, at 07:31, Scott Rosenbaum wrote:

> All,
>
> Last year we went to the yes/no polling for the first time.  We
> actually got a lot of feedback, with almost 7,000 ballots cast over
> the four days and 226 sessions.  I realize that this polling data is
> not very detailed and that the Yes/No nature probably games people
> towards a better approval rating than you might get with a
> different scale, but I still like the data for its simplicity and
> the volume of feedback.
> At a very high level we should be pleased that better than 90% of the
> people liked (+) the session that they attended.   On the other  
> hand, it is a little troublesome to me that 1 in 10 people walked  
> away from a session and decided that they were somehow unsatisfied  
> (-).  This is why you will hear me continuing to harp on quality  
> this year.  I hope to see an improvement in the overall ratings next  
> year, but it will only happen if you work with the submitters in  
> your category to improve their talks and make sure that they are  
> prepared to give great presentations.
>
> The attached PDF provides a fairly good overview of the results of  
> last year's feedback.  One of the things that surprised me and
> really stood out was how well the Short Talks (96% +) did on average
> as compared to the Tutorials (89% +) and Long Talks (90% +).  This  
> is one of the reasons that we have expanded the number of short talk  
> sessions this year and that we will be actively pushing you to  
> recruit and organize more short talks.
>
> One particular fact that stood out was that of the ten lowest rated
> sessions, 8 of them were tutorials (44% - 62% approval).  I think
> the takeaway is that if you are going to advertise a tutorial,
> the presenter had better be prepared to do more than just talk about
> the kewl stuff that they can do.  The demos have to be clear and
> well rehearsed.  There needs to be concrete takeaway content.
> Finally, a tutorial should have some level of hands-on involvement.
> This is one of the reasons that I would like to see more commercial  
> trainers recruited to come and provide tutorials this year.
> Do you feel that we as a program committee can take on a content  
> review of all of the tutorials?  Would you be willing to do that  
> with the tutorials in your categories?  Thoughts?
>
>
> Scott Rosenbaum
>
>
>
> <overall_analysis.pdf>_______________________________________________
> eclipse.org-eclipsecon-program-committee mailing list
> eclipse.org-eclipsecon-program-committee@xxxxxxxxxxx
> https://dev.eclipse.org/mailman/listinfo/eclipse.org-eclipsecon-program-committee



