Filtering large data sets with parameters [message #184171]
Wed, 09 August 2006 18:23  |
Originally posted by: staylor.accolo.com
Hi all,
I have been playing with BIRT for a few days now. I am trying to take a
large data set and let the person running the report filter it based on a
parameter. For testing purposes I have put a statement in the query's
WHERE clause to keep the data set to around 100 rows (without that
statement the result set would be about 150,000 rows). I am able to
filter this small data set on the report parameter without issue.
However, when I take the statement out of the query's WHERE clause and
try to preview the report, it times out.
It seems that BIRT is pulling down the entire result set before filtering
on the report parameter. Is there a way to make sure that the parameter
entered by the report user is used to filter the data before it is
retrieved?
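Roughly, the test query looks like this (table and column names changed; the extra condition is only there to keep the result set near 100 rows while experimenting):

```sql
-- Hypothetical test query: the date condition is the temporary
-- restriction described above, not part of the real report logic.
SELECT order_id, customer, total
FROM   orders
WHERE  order_date >= '2006-08-01'
```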
Thanks,
Ziggy
Re: Filtering large data sets with parameters [message #184191 is a reply to message #184171]
Thu, 10 August 2006 03:56  |
Originally posted by: jhurley12.gmail.com
Ziggy,
I have found that it is a bit tricky to add parameters that filter the data
before you retrieve it. I don't think there is a way to add a parameter
straight to the select statement from the design tool, but if you add an
intermediate PHP-like step to your interface for accessing the reports, you
can edit the select statement in the .rptdesign file based on user input.
This lets you use the most efficient select before you run the report. I'm
working on Tomcat and use a JSP to write the new .rptdesign file (based on
user input) before I send a redirect calling BIRT to run that new file. It
makes the BIRT parameters and filters useless in some cases, but it
outperforms them, so you might want to consider this approach.
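A minimal sketch of what I mean, assuming a servlet/JSP front end; the token name, file names, and table/column names here are all made up, and real user input must be validated before it goes anywhere near SQL:

```java
// Sketch: rewrite the SQL inside a design file before asking BIRT to
// run it. All names (__REGION__, orders, report_west.rptdesign) are
// hypothetical placeholders, not real BIRT identifiers.
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class DesignRewriter {

    // Splice a validated filter value into the design's query text.
    public static String injectFilter(String designXml, String region) {
        // Whitelist the input; never paste raw user input into SQL.
        if (!region.matches("[A-Za-z ]{1,40}")) {
            throw new IllegalArgumentException("rejected filter value: " + region);
        }
        return designXml.replace("__REGION__", region);
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the queryText property of a real .rptdesign file.
        String template =
            "<property name=\"queryText\">"
          + "select order_id, total from orders where region = '__REGION__'"
          + "</property>";

        String rewritten = injectFilter(template, "West");

        // A JSP would write this out, then redirect to the BIRT viewer
        // pointing at the freshly written design file.
        Files.write(Path.of("report_west.rptdesign"),
                    rewritten.getBytes(StandardCharsets.UTF_8));
        System.out.println(rewritten);
    }
}
```

The whitelist check is the important part: since this bypasses BIRT's own parameter binding, nothing else stands between the user's input and your database.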
I too am working with data of about 200,000 rows and have been facing
similar issues. I was actually hoping to get a discussion going about BIRT
performance with large data sets, because I was hoping BIRT would be able
to handle quite a bit more than 200,000 rows. If anyone has experience
with this, please post about it. I'm using a MySQL data source. I
sometimes get an OutOfMemoryError when running large reports; other times
they just take a few minutes to load. It would be nice to know the limits
of BIRT if anyone has explored this issue and would like to share their
findings. Any hints at optimizing performance would be welcome as well.
Joe