Using the charting application on CF 7.02, I receive a
"divide by zero" error that appears to bubble up from Java, or at
least from WebCharts3D, since the error page doesn't include the
usual CF error information. It occurs under very specific conditions
(at least in my application), and it's completely reproducible.
There's no way, however, to tell where or how any division
operation is being performed.
It occurs under the following conditions: The line chart has
two series. The X-axis displays date values and the Y-axis a
numeric data value. The query containing the data for the first
series has a "dummy" row, containing only a date value, in order to
force the chart to begin at a specific date. It then has one or
more rows of data, so points are only shown for dates where data
was actually recorded.
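For reference, the first series' query is built roughly like the
sketch below. The query name, column names, and dates are
placeholders, not my actual code; the point is the date-only dummy
row followed by the real data:

```cfml
<cfscript>
// First series: a date-only "dummy" row to force the chart to
// begin at a specific date, followed by the real data points.
qActual = QueryNew("chartDate,dataValue", "date,double");

// Dummy row: only the date is set, so no point is plotted for it
QueryAddRow(qActual);
QuerySetCell(qActual, "chartDate", CreateDate(2006, 1, 1));

// One or more rows of real data
QueryAddRow(qActual);
QuerySetCell(qActual, "chartDate", CreateDate(2006, 1, 15));
QuerySetCell(qActual, "dataValue", 42);
</cfscript>
```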
The second series represents charting of an "ideal" value.
The query structure is built in code, and consists of a date and a
data value for every day within the date range. This produces a
flat line showing the ideal value.
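The second series and the chart itself look roughly like this
(again, all names, dates, and the ideal value are placeholders for
illustration):

```cfml
<cfscript>
// Second series, built in code: one row per day across the date
// range, all with the same "ideal" value, producing a flat line.
startDate  = CreateDate(2006, 1, 1);
endDate    = CreateDate(2006, 1, 31);
idealValue = 50;

qIdeal = QueryNew("chartDate,dataValue", "date,double");
for (d = startDate; DateCompare(d, endDate) lte 0; d = DateAdd("d", 1, d)) {
    QueryAddRow(qIdeal);
    QuerySetCell(qIdeal, "chartDate", d);
    QuerySetCell(qIdeal, "dataValue", idealValue);
}
</cfscript>

<!--- Both series plotted on one line chart --->
<cfchart chartwidth="500" chartheight="300">
    <cfchartseries type="line" query="qActual"
                   itemcolumn="chartDate" valuecolumn="dataValue" />
    <cfchartseries type="line" query="qIdeal"
                   itemcolumn="chartDate" valuecolumn="dataValue" />
</cfchart>
```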
The error occurs only when the first series has only two
rows: the dummy row and a row of real data. If there are multiple
rows of real data, the chart displays correctly.
However, if I remove the second series -- the ideal value --
the first series displays correctly, even when the series query has
only two rows. So clearly, the error comes from some sort of
interaction between the two datasets. But there's no way to tell
what the interaction is, or how to work around it.
I've Googled and searched extensively, but found no reference
to this problem. Any help would be greatly appreciated.