
Re: [birt-pmc] Performance

Mike,

I was also happy to see that, of the three reporting tools tested, BIRT came out with the best performance.  I was also pleased that BIRT was chosen as one of the tools to be scrutinized; in a world of many open source reporting tools, it is good to know that we are one of the top three.  I would like to see the tests repeated with the new 2.1.1 version of BIRT, since we have made a number of performance enhancements.  The PMC is crafting an official response to their site acknowledging their work.

I do have some questions about the testing methodology and the conclusions drawn from it.  My first thought was that it is curious that every one of the tools tested failed the test, which makes me wonder whether there is a basic issue in the chosen methodology.  Performance testing of reporting tools is extremely difficult to do correctly, particularly when trying to predict peak load: view times, page density, database wait times, and rendering format can all significantly affect the outcome of the tests.

I think that their conclusion that the "sample report can not handle 10 simultaneous active users (CPU overload)" is somewhat misleading.  Looking at the Grinder scripts used to execute the reports shows that the wait (think) time for these scripts was 5 seconds.  Typically, users will need more than 5 seconds to view the contents of a report, particularly one built from a data set of 4,079 discrete records.  It would be interesting to repeat the test with a think time of between 30 seconds and one minute, which is the standard range used when load-testing reporting tools.
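To illustrate why think time matters so much, here is a back-of-the-envelope sketch (not part of the original test; the 2-second response time is an assumed figure, only the 10 users and the 5-second scripted think time come from the article). For a closed population of users who alternate between thinking and waiting on a request, Little's Law gives the request rate as the user count divided by the full cycle time:

```python
def offered_load(users, think_time_s, response_time_s):
    """Requests per second generated by a closed user population (Little's Law).

    Each simulated user cycles: wait `think_time_s`, then occupy the server
    for `response_time_s`.  Throughput = users / (think + response).
    """
    return users / (think_time_s + response_time_s)

# Hypothetical 2 s server response time; 10 users as in the test.
load_5s = offered_load(10, 5, 2)    # the scripted 5 s think time
load_30s = offered_load(10, 30, 2)  # a more realistic 30 s think time

print(f"5 s think time:  {load_5s:.2f} req/s")
print(f"30 s think time: {load_30s:.2f} req/s")
```

Under these assumptions the scripted 5-second think time drives roughly four to five times the request rate that a realistic 30-second think time would, so a "CPU overload" at 10 scripted users does not imply an overload at 10 real users.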

I am also a bit confused by the statement "User invokes MySqlData.rptdesign report directly (without the BIRT Viewer)".  My reading of their testing script indicates that they are submitting a Web Viewer URL to the Tomcat server, which means the reports are in fact running through the BIRT Viewer.  Repeating the test without the overhead of Tomcat and the viewer would provide very interesting information about the performance of the core reporting engine itself.

Report performance is one of the core concerns of the BIRT framework.  Now that we have built out the core functionality, we are turning our focus to performance improvements.  Our hope is to announce performance benchmarking results and improvements as we identify the work and the staff to complete it.

Scott Rosenbaum




Mike Milinkovich wrote:
Scott,

Makes for some pretty interesting reading.

So what's the sense of the PMC? 

I guess the bad news is that the framework gets very loaded with even low
numbers of users. 

But the good news is that (unless I'm reading this wrong) BIRT came out
ahead of JasperReports and JFreeReports as measured by their "Aggregate
Average Response Time".

Mike Milinkovich
Executive Director,
Eclipse Foundation, Inc.
Office: 613-224-9461 x228
Cell: 613-220-3223
mike.milinkovich@xxxxxxxxxxx
 
blog: http://milinkovich.blogspot.com/
 

  
-----Original Message-----
From: birt-pmc-bounces@xxxxxxxxxxx 
[mailto:birt-pmc-bounces@xxxxxxxxxxx] On Behalf Of Scott Rosenbaum
Sent: October 2, 2006 1:43 PM
To: BIRT PMC
Subject: [birt-pmc] Performance

All,

Here is the link to the performance test.  You can also see 
the Jasper and JFreeReports tests on the same site.

http://jroller.com/page/galina?entry=birt_reporting_framework_performance_test


Scott
_______________________________________________
birt-pmc mailing list
birt-pmc@xxxxxxxxxxx
https://dev.eclipse.org/mailman/listinfo/birt-pmc
