
Socket (GetUrl) affected by OSX Lion?

Oct 20, 2011 7:31 AM

Tags: #mac #performance #osx_lion #indesign_cs4 #socket

Hi

 

I've been using the GetUrl function that Kris Coppieters at Rorohiko demonstrated on his blog some years ago, in a post about using state machines. It has worked very well.

My question now is whether anyone knows of anything in the OS X "Lion" update that could affect the speed of socket-based HTTP calls.

 

I ask because a client reported that on three or four of their updated computers, scripts that use the getUrl function had become very slow, with run times of around 15-20 minutes instead of the usual less than a minute.

 

A couple of months passed between the update and the next time the script was needed, so it took a while to even suspect that the Lion upgrade had anything to do with the problem.

 

We found an OS X Snow Leopard computer (one that was not in use and thus never updated), and there the same script ran quickly when I tested it (< 1 minute).

 

Of course there could still be other explanations, such as other programs being installed, or our logging on as a different user on the old machine, but currently I tend to believe that Lion has some part in it.

 

I've noticed a lot of complaints about networking under Lion when searching the net.

 

Anyone else out there, with a Slow Socket Lion?

 
Replies
  • Oct 21, 2011 8:50 PM   in reply to Andreas Jansson

    Hi Andreas,

     

    I'm not sure whether this will solve your problem, but I do have an updated version that dramatically improves the efficiency of transfers that don't have a 'Content-Length' header - I just haven't had time to update the downloadable files on my blog yet. If you send me an e-mail at 'kris at rorohiko.com', I'll send you a copy of the latest version to try - maybe it'll fix the problem for you.

     

    Cheers,

     

    Kris

     
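The 'Content-Length' improvement Kris mentions can be illustrated with a small sketch (plain JavaScript, not his actual code; the function and variable names are my own assumptions). A downloader that finds the header knows exactly how many bytes to expect and can stop reading immediately; without it, an ad-hoc reader often has to poll the socket until a timeout expires, which is slow.

```javascript
// Sketch: extract the byte count a socket-based downloader needs in order
// to know when the response body is complete.
function parseContentLength(rawHeaders) {
    // Header names are case-insensitive; the value is a decimal byte count.
    var match = /^content-length:\s*(\d+)\s*$/im.exec(rawHeaders);
    return match ? parseInt(match[1], 10) : -1; // -1: length unknown
}

var headers =
    "HTTP/1.1 200 OK\r\n" +
    "Content-Type: text/html\r\n" +
    "Content-Length: 1234\r\n";

var expected = parseContentLength(headers); // 1234
```

When the header is absent (or the body is chunked), there is no byte count to work with, which is where timeout-based reading - and the slowness Kris's update addresses - comes in.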
  • Oct 21, 2011 9:02 PM   in reply to Andreas Jansson

    OK, I repackaged the updated script into the download linked from the blog. The download link can be found at:

     

    http://rorohiko.blogspot.com/2008_07_01_archive.html

     

    I'd be interested to hear whether it fixed the problem...

     

    Cheers,

     

    Kris

     
  • John Hawkinson
    Oct 22, 2011 4:57 PM   in reply to RorohikoKris

    Kris: With respect to:

    >>> Edit#2:

    Adjusted the script to give much faster downloads in case the Content-Length header is not present in the web server headers. Also changed the protocol to HTTP/1.0 instead of HTTP/1.1 to sidestep the issue of 'chunked' downloads - support for 'chunked' HTTP is left as an exercise.

    It appears that, if recent questions are any guide, use of HTTP/1.0 is probably insufficient. There were 2 or 3 questions this month on this forum that turned out to be cases where web servers gave confusing/wrong answers with HTTP/1.0 that worked properly with HTTP/1.1. In one case it was especially strange because some URLs on the server worked with 1.0 and others required 1.1.

     

    I suppose it might be sufficient to use HTTP/1.0 with a Host: header, I dunno. But I'd worry about HTTP/1.0...

     
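John's suggestion of HTTP/1.0 plus a Host: header could look like this in the request-building code (a sketch in plain JavaScript; parsedURL and its fields mirror the names in the quoted GetURL snippet, but the rest is an assumption, not the library's actual code):

```javascript
// Sketch: an HTTP/1.0 request that still carries a Host: header, so
// name-based virtual hosts can tell which site is being requested.
// Host is mandatory in HTTP/1.1 but optional in 1.0, which is one reason
// bare 1.0 requests can get confusing answers from some servers.
function buildRequest(parsedURL) {
    return "GET " + parsedURL.path + " HTTP/1.0\r\n" +
           "Host: " + parsedURL.host + "\r\n" +
           "Connection: close\r\n" +  // ask the server not to keep-alive
           "\r\n";                    // blank line ends the headers
}

var request = buildRequest({ host: "rorohiko.blogspot.com", path: "/" });
```

Note the CRLF ("\r\n") line endings, which the HTTP specification requires; whether this combination satisfies a given picky server is, as John says, uncertain.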
  • Oct 23, 2011 4:43 PM   in reply to John Hawkinson

    Hmm... Interesting. A few months back, I dragged GetURL out of my software closet and tried to use it with one particular server, and it fell victim to 'chunked' HTTP, where a large response is split up with an additional protocol layer interspersed.

     

    That threw the script a curve ball, and it did not handle it well. GetURL is pretty much all 'ad hoc' code rather than a full-fledged HTTP library (i.e. I tweak it as needed in each individual project, and the list of what it does not do is much longer than the list of what it does do), so I chose the easy way out: instead of implementing 'chunked' HTTP (which is not all that hard), I figured that since chunked HTTP is not part of the HTTP/1.0 spec, I would simply force the use of HTTP/1.0 - so I simply changed

     

         ...

            "GET /" + parsedURL.path + " HTTP/1.1\n" +

         ...

     

    into

     

         ...

            "GET /" + parsedURL.path + " HTTP/1.0\n" +

         ...

     

    Yup, I know - that's lazy, eh! But it fixed it for me - and given the time constraints that reality always seems to throw at me, I did not do any further 'deep' research into the area.

     

    If anyone is bumping into issues with GetURL and wants me to have a look, you can send me a packet dump - use Wireshark or something similar to 'capture' the packets. That often gives me a good clue as to what is going wrong. No guarantees that I'll have a solution, though - and it might take a wee while before I have time.

     
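For reference, the 'chunked' decoding Kris sidesteps is indeed not all that hard. A minimal sketch in plain JavaScript (my own illustration, not GetURL code; it assumes the whole chunked body is already in memory as a string, which is fine for text but would need byte-wise handling for binary data):

```javascript
// Sketch: decode an HTTP/1.1 chunked transfer body. Each chunk is a
// hexadecimal size on its own line, followed by that many bytes and a
// CRLF; a zero-size chunk marks the end of the body.
function dechunk(body) {
    var result = "";
    var pos = 0;
    while (pos < body.length) {
        var lineEnd = body.indexOf("\r\n", pos);
        if (lineEnd < 0) break;
        // Chunk size is hex; ignore any ";extension" after it.
        var size = parseInt(body.substring(pos, lineEnd).split(";")[0], 16);
        if (!size) break;               // 0 (or unparsable) ends the body
        result += body.substr(lineEnd + 2, size);
        pos = lineEnd + 2 + size + 2;   // skip chunk data and trailing CRLF
    }
    return result;
}

var chunked = "4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n";
var plain = dechunk(chunked); // "Wikipedia"
```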
