losewy

could not complete your request because of a program error

Aug 9, 2012 7:59 AM

Tags: #photoshop_extended_cs6

hey guys...

I have been using Photoshop CS6 Extended for a few days, and I wrote some text in 3D mode. After finishing it I rendered it, and when I try to save my work I get something like "Could not save because there is not enough memory (RAM)". After a few more attempts to save, I get "Could not complete your request because of a program error". It happens every time.

I use Windows 7 with an Intel Core i3 at 2.13 GHz and 4 GB of RAM.

 

thanks

Sam

 
Replies
  • Aug 9, 2012 8:08 AM   in reply to losewy

How much VRAM do you have on the GPU?  I believe 512 MB is the minimum for certain 3D functions.

     
  • Noel Carboni
    Aug 9, 2012 8:56 AM   in reply to losewy

    Check the web site of the maker of your video card for the latest display driver they offer for your hardware and OS.

     

    -Noel

     
  • Noel Carboni
    Aug 9, 2012 5:51 PM   in reply to losewy

    So did you check what I said?

     

    What display driver version do you have?  Is it the latest one released on nVidia.com?

     

    -Noel

     
  • Aug 10, 2012 7:18 AM   in reply to losewy

Having an older version on your PC will have no effect, as they are independent programs.  Best not to uninstall them.

     

I see this is a mobile GPU, so I question whether you really have 2225 MB of VRAM.  Open Photoshop and click Help > System Info.  On the second page, look for Video Card Memory; it will probably be 256 or 512 MB.  If it is 256 MB, that may not be enough memory to work well with 3D.

     
  • Aug 10, 2012 7:36 AM   in reply to losewy

Of the 4 GB of RAM, how much do you have allocated to Photoshop?  It should be less than 70%, or you will starve the rest of the computer.

     
  • Aug 10, 2012 8:02 AM   in reply to losewy

In Photoshop, click Edit > Preferences > Performance.  I think the recommended value for 4 GB is about 55%.  To get the full benefit of CS6 you really need a 64-bit OS and a minimum of 8 GB of RAM; the OS needs about 2 GB and the rest can go to Photoshop.
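    To put rough numbers on that split (illustrative arithmetic only, not an Adobe formula -- a quick sketch in Python):

        # Rough split behind the "about 55% on a 4 GB machine" advice above.
        total_ram_gb = 4.0
        ps_share = 0.55                   # suggested Memory Usage slider value
        ps_gb = total_ram_gb * ps_share   # ~2.2 GB for Photoshop
        os_gb = total_ram_gb - ps_gb      # ~1.8 GB left for Windows and other apps
        print(f"Photoshop: {ps_gb:.1f} GB, everything else: {os_gb:.1f} GB")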

     
  • Aug 10, 2012 8:16 AM   in reply to losewy

With CS6 a lot more reliance is put on the video card, and of course good drivers, so that seems to be where a lot of the problems lie.

     

Whether 4 GB of RAM is enough will depend on how large your images are and what edits you do.  Power users are going for 24 GB+ now.

     
  • Aug 10, 2012 8:44 AM   in reply to losewy

I'm not a computer expert, but I seem to recall reading that this can also be a GPU memory problem where memory does not get released.

     

You might try Edit > Purge and then select the type of buffer you want to purge; Video Cache is one of the choices.

     
  • Noel Carboni
    Aug 10, 2012 9:02 AM   in reply to losewy

    Not to be too pushy about your display driver setup, but could you please go into Help - System Info... in Photoshop, copy what's there, and paste it into a post here (removing the serial number info)?  Maybe something will jump out that's not obvious.

     

    -Noel

     
  • Aug 10, 2012 9:49 AM   in reply to losewy

    Hi Sam,

     

    To confirm, you only run into this problem after rendering out a 3D type extrusion, correct?

     

    IOW, same small photo used each time:

    1) open 1000px x 1000px RGB file, generate type layer and extrude to 3D, render 3D layer, save — error

    2) open 1000px x 1000px RGB file, generate type layer and extrude to 3D, save — no error

    3) open 1000px x 1000px RGB file, generate type layer, save — no error

     

If that's the case then I don't think it's a GPU problem. How many scratch disks do you have active, and how much free space do they have? (Go to Preferences > Performance.)

     

    regards,

    steve

     
  • Aug 13, 2012 8:42 AM   in reply to losewy

    Hi Sam,

     

    Try changing your resolution value to 72 ppi.

     

    regards,

    steve

     
  • Noel Carboni
    Aug 13, 2012 8:58 AM   in reply to SG...

    I wonder if changing the VRAM usage limit in Edit - Preferences - 3D could possibly help...  I've never heard any advice one way or another on manipulating the settings in that dialog.  Pretty much everyone leaves them at default settings I think.

     

    Steve, can you suggest ways to monitor GPU resource usage?
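    (For instance -- just a sketch, assuming an NVIDIA card and the nvidia-smi utility that ships with the driver; this is not a Photoshop feature -- something like this can poll VRAM use from the command line:)

        # Poll GPU memory usage every 5 seconds via nvidia-smi.
        import subprocess, time

        while True:
            out = subprocess.run(
                ["nvidia-smi", "--query-gpu=memory.used,memory.total",
                 "--format=csv,noheader"],
                capture_output=True, text=True, check=True)
            print(out.stdout.strip())    # e.g. "312 MiB, 1536 MiB"
            time.sleep(5)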

     

    -Noel

     
  • Aug 13, 2012 9:38 AM   in reply to losewy

    Have you tried increasing the size of your paging file?

     

    This might help, but Adobe seems to be bad about using paging file space, so it might not.  I've got my paging file on a fast SSD, so paging should happen quickly.

    What does Photoshop do?  It refuses to allow paging and instead opens up a tmp dir on a slow disk.  GRRR....

     

    Fortunately most programs aren't even aware of virtual memory vs. physical memory, let alone go out of their way to break a useful OS feature.

     

    Of course Adobe might claim it will always use more memory than what you have (at least with the images I work on, that's usually true), but instead of forcing things to tmp, they could lower Photoshop's page priority from its current 'highest' (default) setting to the lowest (1).  That would ensure that what they would otherwise have written to tmp gets paged out first, but also the many parts of Photoshop you are not using (are you really using the medical DICOM stuff while focusing on 3D, for example? Do I need the 3D stuff in memory when I am only doing 2D work?).  Letting the OS take care of least-accessed program segments is almost always more efficient than trying to do it yourself -- unless you dynamically profile each user's usage on a given document and have a provably better algorithm, which no vendor has bothered to do yet, AFAIK.

     

    Anyway, even though Photoshop deliberately sidesteps the normal OS paging mechanism for itself (albeit to be a 'polite' program that doesn't force others out of memory), there are costs to not being able to adjust that -- like it just plain not using virtual memory.  That means that although a system might have a 40 GB address space on Windows, on a machine with 12 GB of memory Photoshop will only use 8.4 GB (~70% of the physical 12 GB).

     

    In that scenario, Photoshop would waste 32 GB of fast paging area while going to a slower drive not selected for 'fast paging usage'.  Worse, when disk space gets tight (<7% free), Photoshop all but hangs, as either it refuses to use all of the disk or the OS spends too much time trying to find free extents of the size Photoshop asks for.  Ironically, this is sometimes caused by an earlier Photoshop crash leaving a large tmp file somewhere that doesn't get cleaned up until I run a disk cleanup.

     

    It's unfortunate that Photoshop limits its use of virtual memory due to software limitations -- in my example above it would be the difference between it using 8 GB vs. 32 GB (~70% of physical vs. virtual memory backed by a fast SSD).

     

    Suggestion to try to solve your problem.  I wouldn't run this way all the time (well, maybe not), but it might allow you to save.  (I should try this and see if it helps with performance as well!)

    1) Make sure you have a good-sized swap file -- otherwise this is pointless.  Put it on your fastest HD, in the fastest part if possible (usually the beginning of the disk -- usually something you set up at system setup time).  A new HD or SSD will work too.  But speed is less important than space; speed is convenient, but it's the space that will make or break this.

    2) Get Process Hacker (http://sourceforge.net/projects/processhacker/).  Using it, after you start Photoshop, right-click its process on the main process page, and under Misc you can set its page priority to 1 -- that way, when memory gets tight, you are telling the OS to page out Photoshop's pages to the page file first.

    3) Set Adobe's allowed memory usage to 100% -- oddly, Adobe seems to think 100% on a 48 GB system = 44 GB, so they may be shooting the workability of this solution in the foot (i.e. if Photoshop doesn't use enough memory for Windows to start paging it to the page file, there's no point).

     

    Note -- even if Photoshop has extra code to disable use of all of your system's virtual address space (the default is to use it, but because Adobe limits its memory usage, it doesn't), it is possible that OTHER programs will get paged out while you are working on this file.  In other words, Photoshop may get more memory to use, and at the least it should improve Photoshop's performance (at the expense of background programs -- but if you need to save a file... well, like I said, you may not want to run with 100% Photoshop memory usage all the time).  However, even at 100% usage on my system, it would still only be using 46% of my system's virtual memory.  So that 100% figure is pretty misleading -- especially if your paging file is on a memory-based drive (SSD) that can easily be 5-10x faster than desktop drives.

     

    Good luck -- sorry for the long post, but it's an involved topic and I wanted to try to be complete!
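    P.S. If you want to sanity-check how much of Photoshop's allocation is actually resident in RAM vs. committed overall while you try this, here is a rough sketch (assumes Python with the third-party psutil package installed; the process name may vary by version):

        import psutil

        # Find the Photoshop process and report working set vs. committed size.
        for p in psutil.process_iter(["name", "memory_info"]):
            name = p.info["name"] or ""
            if "photoshop" in name.lower():
                mem = p.info["memory_info"]
                print(f"{name}: working set {mem.rss / 2**20:.0f} MiB, "
                      f"committed {mem.vms / 2**20:.0f} MiB")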

     
  • Aug 13, 2012 11:05 AM   in reply to Astara_

    >>  It refuses to allow paging and instead opens up a tmp dir on a slow disk.

     

    That is incorrect.

     

    Photoshop uses the OS paging file, because it has no choice but to work with the OS VM system.

     

    But since Photoshop can use more image memory than the OS can handle in virtual memory, and needs to use image memory in ways that would perform poorly with OS virtual memory -- Photoshop uses its own paging system in the scratch disks.  Relying entirely on the OS virtual memory system would be much, much slower than the current scratch file system (and would not allow documents as large as Photoshop supports).

     

    Photoshop does not "break" the OS VM system in any way.

     

    Photoshop does not disable the OS virtual memory system in any way.

     

    Photoshop is not "wasting" any space.

     

    Running at 100% is likely to cause problems for some OS allocators that don't do adequate error checking on allocations, and will cause problems for third-party plugins that are sloppy with their allocations (which is why we recommend not running at 100%).

     

    Photoshop's memory limits account for available memory -- and won't include the space taken by the OS, drivers, and the size of the loaded binaries.

     

    If the OS is paging much memory out, Photoshop will automatically free some memory to reduce memory pressure  (and thus reduce the paging).

     

    Paging priority has nothing to do with the situation.

     

    If you have a fast drive, why not use it for Photoshop scratch as others already do?

     

    Photoshop scratch files are either deleted by the OS when Photoshop crashes, or deleted the next time Photoshop launches (a safety measure, in case the files were kept around for some reason after unlinking).  They do not get left around.

     

    Paging Photoshop memory first will more likely result in double paging, causing Photoshop to run even slower.  So using process hacker in that way is probably a bad idea.

     

     

    Now that you have more accurate and complete information - you might want to rethink your approach.

     
  • Noel Carboni
    Aug 13, 2012 12:33 PM   in reply to Chris Cox

    Chris Cox wrote:

     


    Photoshop scratch files are either deleted by the OS when Photoshop crashes, or deleted the next time Photoshop launches (a safety measure, in case the files were kept around for some reason after unlinking).  They do not get left around.

     

     

    Prior to Photoshop CS6 that was not always true.  With Photoshop CS5 I found them to accumulate.  I have been pleasantly surprised so far to find none left around from Photoshop CS6, which itself crashes far less often.

     

    And I can confirm that if you have a kick-butt system drive (I have a RAID array of SSDs) that pointing everything at it really makes Photoshop sing.

     

    -Noel

     
  • Aug 13, 2012 5:44 PM   in reply to Chris Cox

    This is the photoshop dialog:

    photoshop memory dialog.png

     

    Notice it only lets me use a max of 44737 MB, or 43.7 GB, out of 48 GB of physical memory.

     

    That means it will spill to its temp file before it ever fills physical memory -- even at 100%...

     

    Here is my system memory status... I just opened two pics in Photoshop.  I'm sure I don't need to point out where they were opened:

     

    photoshop-mem+temp1.png

     

     

    Notice my Commit limit (page+phys = 1-4).

     

    It hits the peak once in a while due to an Explorer bug eating all of memory...

    Now check this next one out --

     

    You are caught cold... Photoshop didn't delete a temp file that is over a week old -- it's sitting next to the current one, along with a snapshot of its current usage...

    photoshop-mem+temp.png

    Notice Photoshop's current tmp at 45 GB next to an 8-day-old 13 GB file it left around...

     

    So I would say it doesn't delete old tmps...proof above.

     

    Now look at its memory usage: 53 GB private and 55 GB virtual...

     

    (That's with the setting at 100% of my 44 GB limit -- I guess a calculator program it isn't...)

     

    Instead of using more of the paging file -- there are still 40 GB available there -- it is using a 44 GB slow tmp file.

     

    Everything I said is exactly true.

    And you apparently don't know how Photoshop runs on real users' systems.

     

    Now this may have changed dramatically in CS6, but I doubt it... if so, I'm more than happy to try out CS6 once I get a permanent license, though it really sounds like I should wait for the .0.1 release, or maybe until I can afford another 48 GB of memory...

     

    It doesn't allow you to configure it for 70% of your VM space -- only some sub-100 percentage of your physical memory -- so unless other programs use up a lot of memory, my paging file never gets touched.

     

    BTW -- now that you know how PS runs on real systems, do you want to stop telling people that Photoshop pages normally?

     

    I have posted the info above showing it works exactly like I said.

     

    The only reason it is using paging memory now is that I reset its page priority to 1 and set its usage to 100% (really only ~90% of physical).

     

     

    So at its highest usage setting, it barely uses half my VM space -- the rest spills to a tmp file (where old tmp files ARE kept!)...

     
  • Noel Carboni
    Aug 13, 2012 8:34 PM   in reply to Astara_

    The one thing you have wrong, Astara, is that Photoshop does not work as simply as you think regarding going to the scratch file only when filling memory.  It's WAY more complicated than that.  It will write data to the scratch file even before filling its memory limit.  It's just not something you can glance at and add up simply.

     

    Regarding your having programmed Photoshop to use 100% of the RAM it lists, I agree with the strategy of setting Photoshop to use all but a few GB of RAM.  I have been doing that for a long time.

     

    And you absolutely should make your SSD your scratch device - unless it's short on space, which is a bad way to run an SSD.

     

    -Noel

     
  • Aug 13, 2012 9:01 PM   in reply to Noel Carboni

    Regarding your having programmed Photoshop to use 100% of the RAM it lists, I agree with the strategy of setting Photoshop to use all but a few GB of RAM.  I have been doing that for a long time.

    Then why does PS say 71% is recommended?

     

    If the 100% already has a built-in safety margin, of 10%, then what's the point of going with 71%?

     

    As for what to use for a scratch disk -- I set aside a separate partition for my page file, as is recommended for MS OSes, so the swap/page file won't fragment.  It takes a lot less to put a 64 GB SSD there than to try to provide something big enough for scratch space.

     

    Yes, you want your scratch disks to be fast -- but since paging substitutes for actual RAM, you want paging to be as fast as possible.  Writing to a disk is expected to be an order of magnitude or more slower than RAM access -- the better a paging file can emulate RAM speeds, the better your computer will run in general.  That is why I said that Adobe PS goes out of its way to break normal page file usage.

     

    You are providing the proof -- PS prefers scratch disks over the OS's page file.  It should be configurable based on an absolute figure -- not limited to a configuration based on physical memory.

     

    The OS is *supposed* to have algorithms designed by experts to optimize access to frequently needed content -- yet Adobe uses scratch files in preference to the OS's native, dedicated algorithms for this purpose.  And Adobe, a graphics and web application development company, thinks their caching will be better than what an OS can do, given that the OS can see the whole picture?  Such hubris is not warranted.

     
  • Noel Carboni
    Aug 13, 2012 10:20 PM   in reply to Astara_
    Astara_ wrote:

     

    Adobe uses scratch files in preference to the OS's native, dedicated algorithms for this purpose -- and Adobe, a graphics and web application development company, thinks their caching will be better than what an OS can do given the OS can see the whole picture?   Such hubris is not warranted.

     

    You're making IMO overly simplistic assumptions about what Photoshop does.  It's not simple!  I questioned Chris Cox on the issue of Photoshop doing its own memory management quite extensively some time ago.  He said that during every development cycle they actually change out Photoshop's own memory management and try using the OS native memory management, and Photoshop's measured performance simply hasn't been as good.

     

    You also need to understand that it's a major application that must be able to run on a wide variety of operating systems, from XP to Windows 7/8 to a bunch of different OS X variants, and from 32 bit to 64 bit. It may be that some compromises in the design have had to occur to be able to offer an application with that wide a repertoire that is still maintainable in a single code base.

     

     

    Yes, at first glance it seems a bit odd that it works the way it does, and if it were redesigned today for use on one particular OS perhaps it might be completely different, but it is what it is based in large part on its past - it certainly doesn't get completely rewritten every major version release.  It might have made more sense to offer RAM usage in the form of a percentage when the address space size wasn't so open-ended (e.g., 32 bits).  Setting it to 100% in your case makes sense because you're leaving a few gigabytes for the OS and your other apps.  Setting it to 70% leaves a few gigabytes on a smaller system.

     

    What's got you worked up over Photoshop's methods of doing memory management?  Is it not actually performing as well as you'd like?  Under what conditions are you having to wait on swapping and/or scratch data operations?  It sounds to me as though you didn't build your computer to be as well optimized for Photoshop use as it could be, and the use of spinning disks is holding you back.  It's just a matter of economics.

     

    -Noel

     
  • Aug 13, 2012 11:30 PM   in reply to Noel Carboni

    For the same reason I don't spend money on a dedicated gaming system, I don't buy a computer just for Photoshop use.

     

    If they tested those same algorithms on MY hardware -- something meant to be general purpose, and fast when used as such -- they might have come up with different results.

     

    The point is not to roll your own stuff every release (well, I guess redoing the same thing over and over is a good way to keep job security -- especially when it doesn't seem to go toward reliability and features).

     

    What I am complaining about is what the original author of this thread is complaining about -- poor use of resources.

     

    While PS is usually locked up updating multiple layers, my machine sits 70% idle due to its resource allocation.  When I run out of hard disk space on my scratch disks while my swap drives are never touched -- that's really annoying.  The expectation is that programs will use swap before slower scratch disks.  Photoshop breaks the mold.

     

    It does lots of graphics operations, yet I rarely see it do anything needing much GPU power.  The most GPU memory I see used is ~300-400 MB out of 1.5 GB.  So it leaves memory untouched -- graphics and system -- and goes to the scratch disks first.

     

    Instead of using a scratch disk, they should use an in-memory file system that spills to disk when it gets too full.  Anything in that section can have its memory priority set to lowest -- as though it were on disk.  In the best case it gets used before being paged out; in the worst case it goes to the second-fastest store -- virtual memory.  At least that's how computers are designed: L1 cache (16-32 KB of data + code), L2 (256 KB, depending on generation), and L3 further from the CPU core -- larger and slower -- then system memory, then the flash-backed page store.  It would be really insane for every program to use scratch files instead of virtual memory; it's a waste of time for the most part.  You might squeeze more performance out in one config while hurting another, and tomorrow it's the opposite.
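    For reference, rough textbook-style latency figures for that hierarchy (order-of-magnitude ballpark only, not measurements of any particular machine):

        # Approximate access latencies, smallest/fastest to largest/slowest.
        approx_latency_ns = {
            "L1 cache":      1,
            "L2 cache":      4,
            "L3 cache":      30,
            "RAM":           100,
            "SSD page file": 100_000,      # ~0.1 ms
            "HDD scratch":   10_000_000,   # ~10 ms
        }
        for tier, ns in approx_latency_ns.items():
            print(f"{tier:14s} ~{ns:>12,} ns")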

     

    They have time to rewrite the memory allocator with each version, but not to get it rock solid on Nvidia graphics cards -- GPUs that could be dedicated to doing graphical computations?  I question the priorities.

     

    FWIW -- I have a 4-way, 600 MB/s RAID 0 for my scratch drive -- it's just that it can run close to full, and a program that refuses to use any paging in favor of scratch alone just goes against common sense...

     

    I can't help it if people don't know how to configure swap/paging to be the fastest thing on their system.

     

    Ideally they'd make the amount of memory used more configurable -- but they limit it to a range where it doesn't work well.

     

    Watching Adobe fail to save your file (on a network drive) because it can't make enough room in local tmp files, while 60 GB meant for that purpose sits idle, would make most people a little annoyed, don't you think?

     
  • Aug 13, 2012 11:48 PM   in reply to Astara_

    >> Notice it only lets me use a max of 44737 or 43.7GB out of a phys-mem of 48GB.

     

    Yes, because the OS, drivers, and binaries take up some space.

     

    And everything you said is still pretty wrong.

     

    If the scratch file hasn't been deleted - then something is holding on to the file.

    Again, Photoshop uses 2 methods to try and make sure those files don't stick around.

    But it sounds like you're still using CS5, and there may have been a bug in CS5 that caused the scratch file unlinking to fail (I think I recall a bug like that, but it's been years...).

     

    And Photoshop is not supposed to use all of your virtual memory - because that would be slower.  Photoshop is supposed to use the RAM you allow it to, and use the scratch file for everything else it needs.  Using the OS VM would be slower, and much more limited (couldn't open large docs, couldn't use many history states, etc.).

    Please remember that Photoshop can address somewhere around 2 exabytes per scratch volume (and I think we set a limit of 256 scratch volumes).  The OS is hard pressed to address 4 times the installed RAM in its paging file.  Photoshop's scratch usage is frequently larger than what the OS can handle.
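    To put those two ceilings side by side (back-of-the-envelope arithmetic using the figures in this thread, not official specs):

        # 48 GB of installed RAM, paging file roughly capped at 4x installed RAM,
        # vs. ~2 EB per scratch volume across up to 256 volumes.
        installed_ram_gb = 48
        os_paging_ceiling_gb = 4 * installed_ram_gb      # ~192 GB
        scratch_ceiling_gb = 2 * 256 * 10**9             # 2 EB * 256 volumes, in decimal GB
        print(os_paging_ceiling_gb, scratch_ceiling_gb)  # 192 vs. 512,000,000,000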

     

    The OS optimizes its paging file for typical application behavior (read: MS Office) -- which is the reverse of typical image processing behavior (LRU vs. MRU).  And the OS paging is demand based and blocking -- instead of async and predictive (Photoshop knows what it will be processing next, and the OS doesn't).  Even adding VM hinting doesn't make it that much better.

     

    Again, Photoshop frees RAM when the OS is paging heavily, and paging out Photoshop's memory will quickly lead to double paging (and massive slowdowns).

     

    Photoshop's algorithms are also designed by experts (who still have to explain the details to OS experts who haven't thought it all through).

     

     

     

    The original author is complaining about a program error, which may or may not have to do with resource use.

    But you're way off in left field talking about your misunderstanding of a non-problem.

     
  • Noel Carboni
    Aug 14, 2012 9:44 AM   in reply to Chris Cox

    Chris Cox wrote:

     

    If the scratch file hasn't been deleted - then something is holding on to the file.

     

    Chris, you really need to realize that you're wrong on this point.

     

    I know PC systems arguably as well as you do, and I can say with certainty that with Photoshop CS5 there were cases where a Photoshop scratch file would get left laying around, persistent even across reboots and multiple Photoshop sessions, without any "something" holding onto the file.  There have been times through history where I've had to manually delete bunches of them, and I've even reported it to you before.  As I recall you met the report with disbelief before.

     

    http://forums.adobe.com/message/2973880

     

    I'm not trying to break your rocks - for the most part I'm on your side in this conversation - but there are some things that really happen that go against your apparently deep seated beliefs.

     

    The nice thing is that I haven't seen a remnant temp file show up with Photoshop CS6 yet (knocking on wood), though to be honest it crashes FAR less often.  From the tone of your responses there really hasn't been any change, though - and that's a bit disturbing since it probably means the problem will recur.

     

    -Noel

     
  • Noel Carboni
    Aug 14, 2012 9:56 AM   in reply to Astara_

    Astara, you appear to have some misconceptions about how your computer and operating system work, and about how best to configure a high performance general purpose computer system.  Then you seem to be complaining that a particular big application doesn't fit the system you built very well.

     

    As much as you feel you've done to make your system sing, there are better ways to do it and better systems out there, and most importantly there are some on which people don't see Photoshop run into resource problems.  It often just costs more money to break through limitations.

     

    -Noel

     
  • Aug 14, 2012 10:34 AM   in reply to Chris Cox

    The OS optimizes its paging file for typical application behavior (read: MS Office) -- which is the reverse of typical image processing behavior (LRU vs. MRU).  And the OS paging is demand based and blocking -- instead of async and predictive

    What OS are you talking about?  Linux and Win7 both have predictive algorithms, and Linux at least has explicit calls to say what you will and won't need (Win7 may too)... Linux has async I/O and I'd be surprised if Win7 didn't as well.  So the above doesn't apply to modern 64-bit OSes, and hasn't since before the release of CS5.5.

    (photoshop knows what it will be processing next, and the OS doesn't).  Even adding VM hinting doesn't make it that much better.

    Again, on what OS?  Are you talking about tests on XP in 32-bit mode, or even on Vista?  Or did you look at the options available on Win7 64-bit or Linux?  And if you are spending time hand-tuning OS-level algorithms to deal with OS deficiencies with each release, maybe some time should also go toward deciding which OS to run on.  Using OpenGL over DirectX has been shown to offer 30-50% performance boosts in some tests, and native Linux implementations boost that even more compared to Win7.  With Win8, another shim layer is added at boot time -- another level of indirection, slowness, and further removal from the hardware...  Is that really a route you want to go down?

     

    If the OS is that bad, you might consider supporting a more flexible OS that will likely run on more types of hardware.

     

     

    Earlier you claimed:

    Photoshop uses it's own paging system in the scratch disks.

    Photoshop does not disable the OS virtual memory system in any way.

    Photoshop does not "break" the OS VM system in any way.


     

    Yet now you admit that Photoshop detects OS paging and attempts to prevent its normal operation (i.e. 'disables it') by catching resource contention before it hits the OS, as you confirm:

    Again, Photoshop frees RAM when the OS is paging heavily.

     

     

     

     

    and paging out Photoshop's memory will quickly lead to double paging (and massive slowdowns)

     

    If you relied on the OS to page out the areas you were going to write to the tmp files, by assigning page priorities to 'tmp-file destined' info, you'd be giving the OS hints that this particular area of memory is for tmp-file storage and can be paged out if demand requires.  The problem isn't that the OS can't do what you want; it's that you don't tell it what to page out.

     

    You might most easily accomplish this, as I suggested before, by creating a RAM disk out of part of memory and marking that entire area as low priority.

     

    Then at a user config level, allow them to specify how to use the temp disks -- i.e. use them in parallel, or use up this one to X <units> (specifiable as MB/GB/TB or a % of its total, similar to how you can resize images to absolute units, pixels, or percent).
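    A toy example of how such a per-disk limit could be expressed and parsed (purely hypothetical -- Photoshop has no such setting):

        import re

        UNITS = {"MB": 2**20, "GB": 2**30, "TB": 2**40}

        def parse_limit(spec, volume_bytes):
            """Return a byte limit for a spec like '2GB', '500MB', or '25%'."""
            m = re.fullmatch(r"\s*(\d+(?:\.\d+)?)\s*(MB|GB|TB|%)\s*", spec, re.IGNORECASE)
            if not m:
                raise ValueError(f"bad size spec: {spec!r}")
            value, unit = float(m.group(1)), m.group(2).upper()
            if unit == "%":
                return int(volume_bytes * value / 100)
            return int(value * UNITS[unit])

        # e.g. parse_limit("25%", 400 * 2**30) caps scratch at a quarter of a 400 GiB volume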

     

    Photoshop is not "wasting" any space.

     

    Oh?

     

    Then why, when I am at the recommended usage level of 71% of total memory, did I often run into the problem of Photoshop hanging due to insufficient scratch space to complete its operation?  It seems to either stop using scratch, or the OS won't let it have the last 7% of disk space -- not sure which -- but my 400 GB HD got down to 30 GB free with Photoshop using 90 GB in scratch.  I thought 100 GB for scratch would be enough for most of my purposes, but that assumed it would use the 106 GB of memory my system is configured for, NOT the 48 GB that is physical.

     

    Photoshop is wasting two-thirds of my remaining space at that point, and in general at least 33% of what I allocate for Photoshop's tmp files.

     

    If it needs more space, I can give it a third (and slower) scratch file -- that would solve its space needs on a multi-TB network HD.

     

    But that would be very slow, and Photoshop doesn't even allow it as an option, presumably for performance reasons.  That being the case, why use only a scratch disk located on slow storage instead of a faster one located in a private RAM disk that Photoshop creates with low page priority?  Then it would be able to use swap -- which on a normal system is your fastest external device -- rather than the slower scratch disks that Noel works around by throwing expensive disks at the problem, primarily for a program that isn't following the normal rules.

     

    But Noel seems to think it's no problem -- I'm sure he won't mind upgrading my system's scratch disks, since money is just a minor resource issue for him...

     

    Of course if we all owned Crays (well, they'd be slow today, but today's equivalent), then no one would have to worry about program efficiency... but then there is the reality that most customers are on some budget (maybe zero)...

     

     

     

     

     


     
  • Aug 14, 2012 11:25 AM   in reply to Astara_

    Astara - you're still confusing RAM and your paging file.

     

    I'm talking about all currently available OSes.  We've experimented with OS VM hinting - and found that it still isn't close to what we need to keep Photoshop working fast and with enough addressable space.

     

    And you're coming up with ridiculous claims based on my straightforward explanation.

     

    This conversation is over.  Either you're trying to troll us, or you really don't have enough understanding of the OS behaviors and application strategies to engage in a useful conversation.

     
  • Aug 14, 2012 3:59 PM   in reply to Noel Carboni

    I too have encountered situations, without crashes, where the Photoshop TMP gets left behind. It used to happen more often (never THAT often, but enough to notice) with earlier versions of PS, and I do have to admit that I have not noticed any "orphans," since about CS 4. [Can I borrow some of that "wood," Noel?]

     

    In the past it has happened, and I know of no program I have ever had that would interfere with PS being able to clean out that file.

     

    Now, IIRC, if there is a crash, then when PS starts up the next time it should look for the old scratch file and clean that out.  I have seen that function not work on a few occasions too, though it HAS been many years, and many versions, since my last crash with PS open.

     

    I attribute this to the adage: "Sometimes the magic works, and sometimes it does not... " [Chief Dan George]

     

    Hunt

     
  • Aug 14, 2012 4:06 PM   in reply to losewy

    i forgot to mention that i have 1 scratch disk with 53 GB free space

    That is not much room for the PS Scratch Disk.  If one is working with nothing but small files, and doing only simple editing, it might be adequate, but there are other considerations.  At about 70% of capacity, a HDD will begin to have diminished performance, and that diminished performance will get worse and worse.  At 100% capacity (Scratch Disks can grow quickly), the chance of a catastrophic failure is high.  I try never to tempt fate.
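    A quick way to keep an eye on that, if you have Python handy (the drive letter is just an example -- point it at whatever volume holds your scratch file):

        import shutil

        # Report free space and percent used for the scratch volume.
        usage = shutil.disk_usage("C:\\")
        pct_used = usage.used / usage.total * 100
        print(f"{usage.free / 2**30:.0f} GB free, {pct_used:.0f}% used")
        if pct_used > 70:
            print("Above ~70% full -- expect scratch performance to drop off.")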

     

    Good luck, and I would find more HDD space for the Scratch Disks, if it were me.

     

    Hunt

     
  • Aug 14, 2012 4:15 PM   in reply to Chris Cox

    I thoroughly understand the difference between RAM and swap (the paging file).

     

     

    It's been the underpinning of virtual memory since the early 70s, after IBM research showed that "the virtual memory overlay system worked consistently better than the best manually controlled systems."  Now I know that Adobe trumps IBM, but could you point to the research that is hardware independent -- not based on testing on a particular config, because then you hurt everyone who doesn't have your config -- but based on mathematics, computer science, or a wide-ranging cross-hardware study?

     

    I don't think you are the first to think you can do better -- but when you look at all the hardware and software configs Photoshop runs on, you can really only say your measurements measured performance on at most a few configs, and even if on more than one machine, likely for a homogeneous user who only uses the product in a specific way.

     

    Experimenting with VM hinting isn't the same as getting it right or having an adequate VM priority system that allows real control.  When you say hinting, it sounds like you are talking about POSIX... a notorious pile of smelly dung that was designed for the least common denominator.  Designing for that will CLEARLY NOT take advantage of any given OS's hardware or features.

     

    I've made no ridiculous claims that I have not backed up with screenshots of real events.  I have made suggestions of possible solutions, if that's what you are calling "claims"... but not having tested them, I don't see how it would be possible for me to claim anything.

     

    Did you read the Windows Internals book -- especially the VM section (Chapter 9)?  It covers a lot more than "hinting", and if you haven't tried using prioritized memory as well as having control of prefetch (Linux has similar but different mechanisms -- not POSIX), then I would submit you haven't touched the subject.  Hinting is not even close to what modern OSes can do.  Hinting was big 10 years ago -- and I never noticed any benefit from it either, on any application.

     

    You can call someone quoting industry research and references "ridiculous" and trolling -- but only someone with their head buried under a pile of dogma would refuse to consider how they might have biased their research and come to conclusions that are no longer valid.

     


     