I am creating a data-centric application where you profile your clients and keep notes on each encounter with a customer. My application uses a PHP web service that fetches all the information from the database and stores it in an ArrayCollection. What if the table it returns has 25 columns and 1,000,000 records? How would this affect the overall application? Is this even the correct practice? I load all the information onto the client machine to make full use of the sorting and filtering capabilities.
I believe that would be a mistake. Transferring that much data would take time: at, say, 40 bytes per field, 25 columns × 1,000,000 rows is around a gigabyte over the wire. Changes to the data by one client would not be reflected in the other clients unless the data is pushed to them again after the change.
You would be much better off using your DB to filter/sort the data to some degree and using Flex to filter/sort the result set if needed. Just use Flex to query your DB as needed, based on what the user is doing.
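For illustration, here is a minimal sketch of what such a server-side endpoint might look like in PHP, letting the database do the filtering, sorting, and paging. The schema (an "encounters" table with id, name, last_contact, notes columns) and the request parameters are made up for the example:

<?php
// Minimal sketch of a paged/filtered query endpoint (hypothetical schema).
// The Flex client asks for one page at a time instead of the whole table.
$pdo = new PDO('mysql:host=localhost;dbname=crm', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);

$page     = max(0, (int)($_GET['page'] ?? 0));
$pageSize = 100;                          // rows per request
$filter   = ($_GET['name'] ?? '') . '%';  // optional name prefix filter

// Let the database do the filtering, sorting, and paging.
$stmt = $pdo->prepare(
    'SELECT id, name, last_contact, notes
       FROM encounters
      WHERE name LIKE :name
   ORDER BY last_contact DESC
      LIMIT :lim OFFSET :off'
);
$stmt->bindValue(':name', $filter);
$stmt->bindValue(':lim', $pageSize, PDO::PARAM_INT);
$stmt->bindValue(':off', $page * $pageSize, PDO::PARAM_INT);
$stmt->execute();

// Return just this page for the client to bind to its grid.
header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The Flex client would then call the service again whenever the user pages, sorts, or types a filter, so only a few hundred rows are ever sitting in the ArrayCollection at once.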
I am doing the same thing with 25 columns, and I summarize the data: one summary load at the start, and then when a user makes selections the data filters. It works great; sometimes it loads kind of slow, but it's very usable. You need all the data in the array to do fancy filtering with fading animated graphs and stuff, but I suggest summarizing as much as you can. I recently built a tracker that tracks every account in California (33,000) with 16 categories and more than 50 items over 26 months (it's a lot of data!). It takes a few seconds on broadband to get the data, but once it's all in, the bar charts interpolate and the pie charts move and stuff. I could not be happier! I'm just telling you what I'm doing; I'm sure you could run more stored procedures and stuff against the data to get it back even faster.
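As a rough sketch of what that kind of summarizing could look like (the account_items table and its columns are invented for the example), an aggregate query returns one row per category per month, so the payload stays small no matter how many raw records there are:

<?php
// Sketch: pre-summarize on the server so only aggregates travel to the client.
// Hypothetical schema: account_items(category, recorded_at, amount).
$pdo = new PDO('mysql:host=localhost;dbname=tracker', 'user', 'pass');

// 16 categories x 26 months = at most 416 rows, regardless of account count.
$stmt = $pdo->query(
    "SELECT category,
            DATE_FORMAT(recorded_at, '%Y-%m') AS month,
            COUNT(*)    AS records,
            SUM(amount) AS total
       FROM account_items
   GROUP BY category, month
   ORDER BY month, category"
);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

The chart components can then bind straight to those few hundred summary rows instead of to the raw data.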