and my project is 32 bpc, which I thought would eliminate banding.
No, it doesn't. You're still rendering to an 8-bit codec. A higher project bit depth merely means the HDR-to-LDR conversion yields more favorable dithering patterns, but that's it. You'd have to use 16-bit image formats/codecs to retain the fidelity. Still, this is kind of circular reasoning: at some point it will get converted to 8-bit, possibly with heavy compression added on top, when it goes to YouTube, DVD or Blu-ray. The more sensible approach is therefore to re-evaluate your design and look for ways to change the gradient, e.g. by adding slightly colored stops in between. Other techniques include, of course, adding tiny bits of noise as a sort of pseudo-random dithering, or using other patterns to similar effect, like some wispy, barely visible Fractal Noise....
What he said....
The only way to maintain the color fidelity is to render to a high-bit-depth codec. I know 8-bit codecs say "millions of colors," but that's only 256 levels per channel, which isn't very many for a gradient.
Let's say the RGB value at the start of your gradient is 40,40,40 and 800 pixels away it's 80,80,80. That's a nice, smooth gradient in float, but it displays badly on an 8-bit monitor and bands when rendered to an 8-bit codec. You've got a change of 40 levels over 800 pixels, which means, in a best-case scenario, 20-pixel-wide bands of flat color. Even if you're working in float (trillions of colors), there's a good chance you'll see banding on your monitor, because it very likely displays only a reduced spectrum of the available color values.
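A quick sketch of that arithmetic, using the hypothetical 40-to-80 gradient above: rounding the float ramp to 8-bit integers produces 41 flat levels, each roughly 20 pixels wide.

```python
from collections import Counter

width = 800
# Float ramp from 40 to 80 across 800 pixels (the example values above).
gradient = [40 + 40 * x / (width - 1) for x in range(width)]

# Simulate rendering to an 8-bit codec: round each float to an integer level.
quantized = [round(v) for v in gradient]

bands = Counter(quantized)          # pixels per quantized level
print(len(bands))                   # 41 distinct levels (40..80 inclusive)
print(max(bands.values()))          # widest flat band, ~20 pixels
```

Each ~20-pixel run of identical pixels is exactly the visible "band" being described.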
If you're delivering an 8-bit product for the web or even broadcast, the product is also going to be highly compressed, which just makes things worse.
Your only solution is to do something to visually break up the banding. Noise works, texture works; just about anything works except a smooth transition between similar color values that aren't far apart.
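To see why noise works, here's a minimal sketch (my own illustration, not any specific plugin): add a tiny random offset to each float pixel before the 8-bit rounding. The plain quantized ramp steps up only 40 times, in hard edges; the dithered one flickers between adjacent levels hundreds of times, which the eye reads as a smooth gradient.

```python
import random

random.seed(1)
width = 800
gradient = [40 + 40 * x / (width - 1) for x in range(width)]  # float ramp 40..80

plain = [round(v) for v in gradient]                          # hard ~20 px bands
# Dither: jitter by up to half a quantization step before rounding.
dithered = [round(v + random.uniform(-0.5, 0.5)) for v in gradient]

def transitions(pixels):
    """Count how often the value changes between neighboring pixels."""
    return sum(1 for a, b in zip(pixels, pixels[1:]) if a != b)

print(transitions(plain))      # exactly 40: one step per band edge
print(transitions(dithered))   # far more: band edges are broken up
```

The same idea scales to 2D: per-pixel noise (or a texture/Fractal Noise layer at very low opacity) trades a few hard contours for fine grain that compresses and displays much more gracefully.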