Regardless of the source video format or output encode settings, your video will be transcoded (recalculated, whatever you wish to call it) for the final output. I've never heard that video will look worse if you encode it at a higher bitrate than the source. Don't believe everything you read on the web.
That said, it's probably true that you won't see any obvious difference in Blu-ray quality between, say, 25Mbps and 30Mbps... your video can only look "so good," and cranking the data rate higher won't make any difference. You might find that it looks great at 20Mbps, which is what I have used with HDV sources for long-form videos.
The video capture defines how much real image information you have. Transcoding to a higher bit rate won't give you any more image information than what you started with. It will give you more data, but not more information. And those aren't interchangeable concepts.
My thought is that you just want to encode such that you don't actually lose any of the image information you captured. You could probably use a max bit rate of 17Mbps if that's the max your camera recorded. It's easy enough to encode some discs at various levels and try them out, see what you think (iterative trials are what BD-RE discs are really good for, IMHO). You'll probably end up where Jeff is -- encoding a little higher than the original capture just to be sure. I do that too. ;-)
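Before burning trial discs, it's worth doing the back-of-the-envelope math on whether a candidate bitrate even fits. Here's a rough sketch (not from the post above; the helper name and the nominal capacity constant are my own assumptions) that estimates the video stream size for a given average bitrate and running time:

```python
# Hypothetical helper: estimate encoded stream size at a candidate average
# bitrate, to sanity-check trial encodes against single-layer BD capacity.
# 25.025 GB is the nominal single-layer figure; usable space is a bit less
# after filesystem overhead, and audio/muxing overhead isn't counted here.

BD25_GB = 25.025  # nominal single-layer Blu-ray capacity (10^9 bytes)

def encoded_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    """Approximate video stream size in GB for a given average bitrate."""
    bits = bitrate_mbps * 1_000_000 * duration_min * 60
    return bits / 8 / 1_000_000_000

# Compare a few trial bitrates for a 2-hour program.
for mbps in (17, 20, 25, 30):
    size = encoded_size_gb(mbps, 120)
    verdict = "fits" if size <= BD25_GB else "too big"
    print(f"{mbps} Mbps -> {size:.1f} GB ({verdict} on a single-layer disc)")
```

For a 2-hour program, 17-25Mbps lands comfortably under single-layer capacity, while 30Mbps (27 GB of video alone) pushes you to a dual-layer disc.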