> We need more information than that to display a record,
so we select
> everything out of the database (which is fairly quick),
then JOIN that to the
> verity results (WHERE auctionID = KEY).
Right. And what do you need THOUSANDS of matches for, in one
hit, here?
And what do you need ALL those columns for, when dealing with
all these
thousands of rows? You've not really answered my question as
to "what's
the end result here?" What are you trying to achieve? I don't
mean what
you're doing to aggregate the data, but simply *why*? What is
the
requirement you have here to be engaging in this enterprise
in the first
place? Search screen? Stock control report? What?
Is there any way of optimising how much processing you're
doing?
If - say - you're doing a search results screen, you probably
don't need
1000s of results: you probably need 20. So just ask the DB
for 20: WHERE
id IN (#list of 20 IDs from Verity search results#)
If the user goes "NEXT >>", then grab the next 20
(cache the Verity query
somehow, rather than re-running it).
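By way of a sketch only (the query, table and column names here - searchResults, auction, auctionId and so on - are all invented for illustration, not your actual schema):

```cfml
<!--- Sketch only: run the Verity search once, cache the result,
      and page through it 20 keys at a time. All names invented. --->
<cfparam name="url.page" default="1">
<cfset startRow = (url.page - 1) * 20 + 1>

<!--- Pull just the 20 keys for this page out of the cached resultset --->
<cfset pageIds = "">
<cfloop query="searchResults" startRow="#startRow#" endRow="#startRow + 19#">
    <cfset pageIds = listAppend(pageIds, searchResults.key)>
</cfloop>

<!--- Then ask the DB for only those 20 rows,
      and only the columns you actually display --->
<cfquery name="pageRecords" datasource="#dsn#">
    SELECT  auctionId, title, currentBid
    FROM    auction
    WHERE   auctionId IN (
        <cfqueryparam value="#pageIds#" cfsqltype="cf_sql_integer" list="true">
    )
</cfquery>
```

That way the DB only ever touches 20 rows per request, rather than you dragging thousands over the wire every time.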
Obviously there's some heavy-lifting processing that might
need to process
the whole lot. Can this not be done in a sliding window of
results? Or
could you not pass the list of IDs from the Verity resultset
into the DB
somehow, and do the filtering on the DB engine, rather than
with CF (which
is not very good at bulk data processing, as you're seeing.
It's not what
it's designed for).
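For the heavy-lifting case, one way to push the work onto the DB engine (a sketch, and again every name here - verityKeyStage, auction, the columns - is invented): stage the Verity keys in a table keyed by a per-request ID, join to it, then clean up:

```cfml
<!--- Sketch only, all names invented. Stage the Verity keys in a
      real "staging" table, so the DB engine does the filtering
      rather than CF. Row-by-row inserts are the naive version;
      a bulk insert would be better for thousands of keys. --->
<cfset requestId = createUUID()>

<cfloop query="searchResults">
    <cfquery datasource="#dsn#">
        INSERT INTO verityKeyStage (requestId, auctionId)
        VALUES (
            <cfqueryparam value="#requestId#" cfsqltype="cf_sql_varchar">,
            <cfqueryparam value="#searchResults.key#" cfsqltype="cf_sql_integer">
        )
    </cfquery>
</cfloop>

<!--- The DB is built for exactly this sort of set-based filtering --->
<cfquery name="filtered" datasource="#dsn#">
    SELECT  a.auctionId, a.title, a.currentBid
    FROM    auction a
            INNER JOIN verityKeyStage s ON s.auctionId = a.auctionId
    WHERE   s.requestId = <cfqueryparam value="#requestId#" cfsqltype="cf_sql_varchar">
</cfquery>

<!--- Tidy up this request's keys afterwards --->
<cfquery datasource="#dsn#">
    DELETE FROM verityKeyStage
    WHERE requestId = <cfqueryparam value="#requestId#" cfsqltype="cf_sql_varchar">
</cfquery>
```

Whatever aggregation you're doing in CF could then move into that SELECT (GROUP BY, etc), which is what the DB engine is actually good at.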
I find QoQ to be very flaky for all but the most basic
operations. For
basic stuff it's fine. It does not surprise me that it seems
to leak
memory (or whatever it's doing) and eventually give up the
ghost. This
does not help you, I realise, but as a suggested practice:
don't expect too
much out of QoQ. Try some other method instead.
--
Adam