
    CPU usage increases when web server is down (Ubuntu)

    n53mat

      Hi All,


      I have a strange bug that has been perplexing me for a few weeks now. It only seems to occur on Linux/Ubuntu, and it involves HTTP requests.


      To give you a little background, we've built a small HTTP status monitor that pings a URL to check whether a server is up or down. The problem is that when we turn the server off, the CPU usage of the application increases indefinitely, irrespective of the number of URLStream objects created. If this is left running over a period of time, the application locks up because of the high CPU usage.


      Steps to reproduce:


      1. Create a URLStream or URLLoader pointed at a domain name that resolves to an IP address but where the web server is unavailable, and make a request (this can be done either by disabling the web server, or by pointing at a fictitious domain and adding a fictitious IP for it to the local hosts file).


      2. Request the URL repeatedly on a cycle of N seconds (see the sketch below).
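
      In code, the polling loop looks roughly like this. It's a minimal sketch rather than the attached project; the class name, constant names and target URL are placeholders:


          package {
              import flash.display.Sprite;
              import flash.events.Event;
              import flash.events.IOErrorEvent;
              import flash.events.TimerEvent;
              import flash.net.URLLoader;
              import flash.net.URLRequest;
              import flash.utils.Timer;

              public class StatusMonitor extends Sprite {
                  // Placeholder target: any name that resolves to an IP
                  // but has no web server listening behind it.
                  private static const TARGET_URL:String = "http://downed-server.example/";
                  private static const N:Number = 10; // polling interval in seconds

                  private var timer:Timer = new Timer(N * 1000);
                  private var loader:URLLoader; // held as a member so it isn't GC'd mid-request

                  public function StatusMonitor() {
                      timer.addEventListener(TimerEvent.TIMER, poll);
                      timer.start();
                  }

                  private function poll(e:TimerEvent):void {
                      loader = new URLLoader();
                      loader.addEventListener(Event.COMPLETE, onComplete);
                      loader.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
                      loader.load(new URLRequest(TARGET_URL));
                  }

                  private function onComplete(e:Event):void {
                      trace("server up");
                  }

                  private function onIOError(e:IOErrorEvent):void {
                      // Fires as expected on Ubuntu too, yet CPU usage
                      // still creeps upward while the server stays down.
                      trace("server down: " + e.text);
                  }
              }
          }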




      With an IOErrorEvent listener attached, the client recognizes that the URL cannot be reached and dispatches the IOErrorEvent as expected. However, if you watch the CPU usage level over a period of time with N at 10 seconds or less, you will notice a gradual and permanent increase in CPU usage (around 0.1% per 4 minutes). If the web server is then re-enabled, the CPU usage drops back to a stable, lower level.
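
      (Watching the process in top is enough to see the trend; something like pidstat -p <pid> 60, from the sysstat package, will log it once a minute if you want numbers over a longer run.)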


      NB: when using the local hosts file approach, the application will need to be restarted before CPU usage returns to a normal level.
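
      For the hosts-file route, one made-up entry is enough, e.g. (the domain name here is a placeholder, and 203.0.113.1 is a reserved TEST-NET address that nothing answers on):


          203.0.113.1    downed-server.example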


      Expected results:


      When this scenario is tested on a Mac or Windows based machine under the same conditions, the CPU usage level remains stable.


      Attached is a simple project to demonstrate this.