It sounds like you'd be much better off if you invested in a real web analytics service.
See Google Analytics, or perhaps Omniture.
I know some (if not all) of these services allow you to track custom/dynamic variables (related to conversion tracking, etc.).
You could do it yourself in ColdFusion, but it sounds like you already have a good sense of the caveats (performance, false positives with bots, etc.). Odds are you're going to end up spending a ton of time trying to "roll your own" on this, and as we all know, time is money. If it were me, I'd save myself the time and frustration and put my trust in a well-established analytics provider, even if it cost me a few bucks.
My two cents...
I would not suggest having ColdFusion track site statistics within its own code; there could be performance issues with this. What I would suggest is that you consider using a free or commercial vendor that can give you access to your analyzed data. Pass the important statistics to the analytics server via Meta Tags, then acquire your data through service calls to the vendor's API. Persist the relevant data in a local database and draw your conclusions from it.

This will allow you to use hit-based metrics but, more importantly, also the analyzed visitor-based metrics, which are really what you want. You can then let the analytics continue running by regular means and use the data from previous analysis to influence your site's design programmatically. To do this you need to choose a vendor whose analytics can provide the required options. From my experience I know for sure you can do this with Webtrends, and I believe you can with Google Analytics and Omniture as well.
In particular, I would look into the Webtrends REST API at http://developer.webtrends.com
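To make the "pull it back and persist it" idea concrete, here is a rough ColdFusion sketch of fetching an analyzed report over a REST API and storing the figures locally. The endpoint path, credentials, JSON field names, datasource, and table are all placeholders for illustration, not the real API shape — check developer.webtrends.com for the actual endpoints:

```cfm
<!--- Sketch only: URL, credentials, JSON fields and datasource are placeholders. --->
<cfhttp url="https://analytics.example.com/v3/Reporting/profiles/YOUR_PROFILE/reports/YOUR_REPORT/"
        method="get"
        username="account\apiuser" password="secret"
        result="apiResponse">
    <cfhttpparam type="url" name="format" value="json">
</cfhttp>

<cfset report = deserializeJSON(apiResponse.fileContent)>

<!--- Persist the metrics you care about in a local database so your own
      CF code can query them later and adjust the site programmatically. --->
<cfquery datasource="siteStats">
    INSERT INTO daily_metrics (metric_date, visits, page_views)
    VALUES (
        <cfqueryparam value="#dateFormat(now(), 'yyyy-mm-dd')#" cfsqltype="cf_sql_date">,
        <cfqueryparam value="#report.data.visits#" cfsqltype="cf_sql_integer">,
        <cfqueryparam value="#report.data.pageViews#" cfsqltype="cf_sql_integer">
    )
</cfquery>
```

With the numbers sitting in your own database, the per-request cost to your site is a cheap local query rather than live tracking logic.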
Not sure if this is relevant, but a friend recently asked me about choosing between these vendors: speed considerations and other things that would influence his decision. Much of what follows is specific to his query; I am including what I wrote for him in case it may be of use, but a great deal may be irrelevant. The spider and bot material might apply, though.
The data collection server identifies where to place the log data. The query parameters are stripped off the image request and used to generate the log file that will later be analyzed. Realtime (or near-realtime) statistics are collected at this point, and are generally only available for hit-based stats, as visit metrics require the close of the visit before they can be analyzed.
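As a rough sketch of that mechanism in ColdFusion terms (host names, file names, and fields here are made up for illustration; a real vendor's collector does far more — sessionization, bot rules, geo lookup, etc.):

```cfm
<!--- On each page of your site, the tag emits a 1x1 image request that
      carries the data to be logged as query parameters: --->
<cfoutput>
<img src="http://stats.example.com/collect.cfm?page=#urlEncodedFormat(cgi.script_name)#&ref=#urlEncodedFormat(cgi.http_referer)#"
     width="1" height="1" alt="">
</cfoutput>

<!--- collect.cfm, on the separate collection server, strips those parameters
      off the request, appends them to a log line for later analysis, and
      answers with the transparent gif: --->
<cflog file="pagehits"
       text="#cgi.remote_addr#|#url.page#|#url.ref#|#cgi.http_user_agent#">
<cfcontent type="image/gif" file="#expandPath('1x1.gif')#">
```

The analysis step then runs over the accumulated "pagehits" log offline, which is why visit-level numbers lag behind the hit counts.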
Regarding Spiders, Bots and Automated systems:
There is no perfect way to remove all spider and bot traffic from log statistics. On average, for a non-commercial site, at least 30% of the traffic in your standard web log files will be generated by spiders and bots. On a commercial site, or if you are a government agency, this can skyrocket to 60% or more, as these are often prime targets for automated systems intent on finding a penetration point.
(If it is mission critical to have absolutely NO spider, bot or automated system details in your log file, you would want to implement an in-house solution where the log files are readily available to you, in a format a log scrubber can analyze to remove hits based on the User-Agent string and rules you provide. Webtrends' software solutions would be a good choice for this.)
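A minimal ColdFusion sketch of such a scrubber might look like the following. The bot pattern list is illustrative and nowhere near complete, and it assumes a log format where the User-Agent appears somewhere on each line:

```cfm
<!--- Sketch: filter log lines whose User-Agent matches known bot patterns.
      The pattern list below is a tiny illustrative sample, not a real rule set. --->
<cfset botPattern = "(bot|crawler|spider|slurp|curl|wget)">
<cfset cleanLines = []>

<cfloop file="#expandPath('access.log')#" index="line">
    <cfif NOT reFindNoCase(botPattern, line)>
        <cfset arrayAppend(cleanLines, line)>
    </cfif>
</cfloop>

<cffile action="write"
        file="#expandPath('access_clean.log')#"
        output="#arrayToList(cleanLines, chr(10))#">
```

In practice you would maintain the rule list from a regularly updated source, since new bots appear constantly and many disguise their User-Agent string entirely.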
Regarding where to Collect Data:
What you DO NOT want is to have the CF server hosting your site(s) be the same server that collects the returned data. That would be a very big big big NO NO, for many reasons I will not go into here. Just trust me. Don't do it. If your collection occurs in house, that server needs to be its own standalone system, or systems using round robin for load balancing.
If you are using an On Demand service such as those provided by Webtrends, Google Analytics or Omniture, then this is not a problem: data collection happens on their server farms.
Regarding In House Software solutions:
If you want the ultimate in control and configurability, have the budget and don't mind the added responsibility you perhaps want to consider an In House software solution. This is more costly in terms of Resources and Time but allows you full control over every aspect of your solution. This pretty much rules out Google Analytics as I do not believe they have a Software solution at this time. Webtrends and Omniture both offer in house solutions. These are ideal for Intranets in the corporate environment. I personally would use Webtrends but that is simply because I am more familiar with their software solution. Look around and make your own choice. Most vendors will offer free trials you can play with.
Regarding On Demand Services:
Which On Demand service to choose depends greatly on your level of traffic and the skill of your team who will be implementing the solution (perhaps just you). I have had a lot of experience tagging sites with Webtrends, some with Google Analytics, and only a little with Omniture and other solutions. Below are simply my opinions on each.
Google Analytics Pros: They are great for smaller sites with less than 5 million hits a month. They have some very well-formed tools and great documentation. So if your traffic level is below this, I would suggest using GA.
Cons: What they do not have is a support team that will provide as comprehensive of a support service as some of the commercial vendors. If you want great support you generally have to pay for it.
Also, Google is not really set up to handle very large capacity sites. For example, according to Orbitz, they sent Webtrends 1 billion hits in one day and Webtrends happily accepted and analyzed it without problems. I do not think you would be able to do that with GA.
So if you are certain that growth on the site over the next year will exceed 5 million page views per month AND budget is not the primary concern, you will want to seek a commercial vendor.
Of the commercial solutions the two big boys are Omniture and Webtrends.
Pros: Webtrends solutions are the most powerful overall. They certainly do not have capacity problems, and you can simply do more with their solutions if you know what you are doing. You will pay more for their solutions in the short term, but their packages are more complete. They offer a greater number of options for acquiring your data, both analyzed and raw, outside of their user interface than most vendors do.
Cons: Their user interface leaves much to be desired. (They are working on that, though, and are designing ways for you to build your own UI if you do not like theirs. Hopefully this will materialize soon.)
Omniture Pros: Their UI at the moment is better thought out and easier to get around in for a beginner with less training or experience. Their solution will be less expensive at first and is reasonably powerful.
Cons: The starting price seems low, but they charge you for every feature you would want to add. This is sort of like how PC vendors will sucker you in with a low-priced PC, but once you start adding extra options the price skyrockets.
Word of warning: When pricing any vendor's solution, make sure the quote includes everything you require AND everything you could want. Then, if the price is too high but you still want to use their service, start removing things that are not absolutely required. If you can still live without something, drop it from the quote, don't consider it anymore, and give the vendor feedback regarding why.