There are several ways to build a symbolic font. Which one you pick depends on how much you care about Unicode purity vs. ease of input, and on whether the symbols have real Unicode codepoints or not.
1) Lie about the encoding and give things plain old ASCII codepoints.
2) Declare it a "symbol font" (in MS terminology), which, if I understand correctly, maps the glyphs to both ASCII input and a block of the PUA. Mac OS will probably ignore this flag and just pay attention to the Mac cmap. I am far from expert on this option, so hopefully somebody else will chime in if I am mangling how it works.
3) Double-encode glyphs to both ASCII and some higher Unicode codepoints.
4) Only encode glyphs at their proper Unicode codepoints where those exist, and use the PUA where they don't. This ensures you never "lie" about the Unicode codepoints.
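The four options above can be sketched as cmap-style mappings. This is just plain Python with dicts standing in for cmap subtables; the glyph names ("heart", "swash") and the specific codepoints are illustrative, not from any real font:

```python
# Suppose the font has a "heart" symbol with a real Unicode codepoint
# (U+2665 BLACK HEART SUIT) and a decorative "swash" glyph with none.

# Option 1: lie about the encoding -- put symbols at ASCII codepoints.
ascii_only = {0x0041: "heart", 0x0042: "swash"}  # 'A', 'B'

# Option 2: MS symbol font -- the (3,0) cmap covers U+F020..U+F0FF, and
# applications typically remap ASCII input into that range (0xF000 + char).
symbol_cmap = {0xF041: "heart", 0xF042: "swash"}

def symbol_lookup(char_code):
    """Model how an application resolves keyboard input against a symbol cmap."""
    return symbol_cmap.get(0xF000 + char_code)

# Option 3: double-encode -- the same glyph reachable from an ASCII
# codepoint and from its real (or PUA) codepoint.
double_encoded = {
    0x0041: "heart", 0x2665: "heart",  # 'A' and U+2665
    0x0042: "swash", 0xE000: "swash",  # 'B' and a PUA codepoint
}

# Option 4 (Adobe's choice): real codepoints where they exist, PUA otherwise.
unicode_pure = {0x2665: "heart", 0xE000: "swash"}
```

The trade-off is visible in the dicts: options 1-3 let a user type 'A' and get the heart, while option 4 keeps the cmap honest at the cost of harder input.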
Adobe goes with the last option. The one caveat I will add is that you may need to declare the font as supporting codepage 1252 (WinANSI) to get it to behave reasonably under Windows; if it doesn't claim support for ANY codepage, it may not work right with some (or even a lot of) applications.
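For reference, that codepage declaration lives in the OS/2 table's ulCodePageRange1 bitfield, where bit 0 means codepage 1252 (Latin 1); in practice you would set it in your font editor or with a tool like fontTools. A minimal sketch of the bit arithmetic, with the field modeled as a plain integer:

```python
# Bit 0 of the OS/2 table's ulCodePageRange1 field declares support for
# codepage 1252 (Latin 1 / WinANSI), per the OpenType spec.
CP1252_BIT = 0

ul_code_page_range_1 = 0  # no codepages declared: risky under Windows
ul_code_page_range_1 |= 1 << CP1252_BIT  # declare 1252 support

def supports_cp1252(field):
    """Check whether the 1252 bit is set in a ulCodePageRange1 value."""
    return bool(field & (1 << CP1252_BIT))
```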