Use the performance profiler to make sure you understand where the time is
being spent. Depending on your scenario, time goes to:
1) Converting raw data to ValueObjects
2) Accessing the XML data
3) Updating any sorts or filters on the data
4) Updating the DataGrid
5) Updating the renderers in the DataGrid
Updating the DataGrid is not usually the problem as the DataGrid only
refreshes visible renderers. If only 20 rows and 20 columns are visible,
then it shouldn't matter if you send 20 rows or 300 rows, the update time
for the DG will be the same. If you have 50 thin columns visible and 60
rows, then you might have to worry about DG update speed.
If you are using default renderers, then there shouldn't be an issue there
either. If you have custom renderers and they are not optimized then there
might be an issue there, but again, it wouldn't matter if it was 20 rows or
300 rows if only 20 rows are visible.
Most folks are getting killed getting XML data and converting it to
ValueObjects. Using RemoteObject/AMF is the recommended option, but you can
also customize the conversion to use pseudothreads.
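In Flex, pseudothreading means splitting a long conversion across frames (typically an ENTER_FRAME or Timer handler) so the UI stays responsive. The idea can be sketched in Java; `RawRecord` and `ValueObject` here are hypothetical stand-ins for your XML records and VOs:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical types standing in for raw XML records and ValueObjects.
record RawRecord(String xml) {}
record ValueObject(String field) {}

public class ChunkedConverter {
    private final List<RawRecord> pending;
    private final List<ValueObject> done = new ArrayList<>();
    private final int chunkSize;

    public ChunkedConverter(List<RawRecord> raw, int chunkSize) {
        this.pending = new ArrayList<>(raw);
        this.chunkSize = chunkSize;
    }

    // Called once per "frame" (in Flex: an ENTER_FRAME or Timer handler).
    // Converts at most chunkSize records, then yields so the UI can render.
    public boolean tick() {
        int n = Math.min(chunkSize, pending.size());
        for (int i = 0; i < n; i++) {
            RawRecord r = pending.remove(pending.size() - 1);
            done.add(new ValueObject(r.xml())); // real parsing goes here
        }
        return !pending.isEmpty(); // true while more work remains
    }

    public List<ValueObject> results() { return done; }
}
```

The chunk size is the knob: small enough that each tick finishes well inside a frame, large enough that the whole set converts in a reasonable number of frames.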
Thanks for your prompt response,
I have already incorporated all of these suggestions, but the issue still exists because a huge result set comes into the Flex layer from the Java/DB layer.
Now my requirement is to show the user only 10-15 rows at first and then keep appending the remaining rows to the DataGrid, so the user doesn't have to wait until the service fetches the entire result set.
LiveCycle DataServices supports paging of data. You can build your own
paging if you have control over the service.
But unless the result is 1MB or more, it shouldn't really matter because the
data is delivered asynchronously. What usually matters more is what happens
once the data arrives. If you need to offload some of that processing, you can
use the pseudothreading approach mentioned above.
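Building your own paging mostly amounts to exposing an offset/limit pair on the service and calling it repeatedly from Flex, appending each page to the DataGrid's dataProvider. A minimal sketch, with a hypothetical `PagedService` whose backing list stands in for the DB/DAO layer (in BlazeDS this would be a remoting destination):

```java
import java.util.Collections;
import java.util.List;

public class PagedService<T> {
    private final List<T> backing; // stands in for the DB/DAO layer

    public PagedService(List<T> backing) { this.backing = backing; }

    // Flex calls this repeatedly: first with offset 0, then offset 15, etc.,
    // appending each page to the DataGrid's dataProvider as it arrives.
    public List<T> fetchPage(int offset, int limit) {
        if (offset >= backing.size()) return Collections.emptyList();
        int end = Math.min(offset + limit, backing.size());
        return backing.subList(offset, end);
    }

    // Lets the client show "n of total" progress or size its scrollbar.
    public int totalCount() { return backing.size(); }
}
```

The last page simply comes back short (or empty), which is the client's signal to stop fetching.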
Can we perform that in BlazeDS? And isn't Messaging a good option for sending the data in chunks on a schedule, so that I eventually receive all of it while the user can already work on the data that has arrived?
I think there are two other options you can try.
1) You can try using a framework like Apache Lucene. You can send your queries to Lucene and fetch back only the desired number of rows (say 10 or 15 at a time). You can also do all your sorting in Lucene, so all you need to pass are the sort field and the sort order, i.e. ascending or descending.
2) You can show a progress bar to tell the user the results are being loaded. Once all your data reaches the client side, it is easy and fast to perform any client-side computations.
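Option 1 boils down to a contract where the client sends only (sort field, direction, offset, limit) and the server returns the next slice already sorted. Independent of Lucene, that contract can be sketched in plain Java; the field-map rows and the `SortedPager` name are illustrative, not a real API:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class SortedPager {
    // Rows are simple field->value maps standing in for your result set.
    public static List<Map<String, String>> page(
            List<Map<String, String>> rows,
            String sortField, boolean ascending,
            int offset, int limit) {
        Comparator<Map<String, String>> cmp =
                Comparator.comparing(r -> r.get(sortField));
        if (!ascending) cmp = cmp.reversed();
        // Sort by the requested field, then return just the requested slice.
        return rows.stream()
                   .sorted(cmp)
                   .skip(offset)
                   .limit(limit)
                   .collect(Collectors.toList());
    }
}
```

With Lucene doing the sorting server side, re-sorting a column in the grid becomes just another small request instead of a client-side pass over the full result set.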
Hope this helps.
(p.s. If this post helps, please mark it so. Thanks! )
I'm not sure how much chunking is available in BlazeDS. My area is really
client side. Check the doc and see.