If it's random, arbitrary files, it may simply be a file I/O problem. Are any indexing services, virus scanners, or the like running? Does it correctly generate a report when you choose that mode?
A different file each time, but are they all in the same folder?
You might try copying your problem source files to another location, maybe up one or two levels in the directory tree.
Can any of the image sequences be flattened into movies? That can make a huge difference. If the movie is lossless and at least one frame per image (and no interlacing or frame blending blur) you can always unpack it into an image sequence later.
In addition to what Mylenium and Bogiesan suggested (and this may seem like a silly question): do you have enough free space on the volume to which you're trying to collect? Have you tried changing the permissions on all of the files (in a batch, via the command line or the OS)?
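If you want to try the batch permission fix from the command line, here's a rough sketch (the `ae_project` folder is a stand-in for your real collect source, and the demo setup lines just simulate a bad permission):

```shell
#!/bin/sh
# Demo tree standing in for a real AE project folder (hypothetical path).
mkdir -p ae_project/footage
touch ae_project/footage/frame_0001.tif
chmod 000 ae_project/footage/frame_0001.tif   # simulate an unreadable file

# Batch fix: make directories traversable, files group read/write.
find ae_project -type d -exec chmod 775 {} +
find ae_project -type f -exec chmod 664 {} +

ls -l ae_project/footage/frame_0001.tif
```

If this clears the error, you know it was a permissions issue and not the collect feature itself.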
Also, are you trying to collect to a location over the network? Maybe try collecting locally first.
On a separate note, did you ever get that photoJpeg situation resolved?
Thanks for all the ideas. Yes, plenty of space is available on the destination volume -- it's about 80GB of stuff, and I originally was collecting to the same network volume (over 1Gbps Ethernet) from which I'm reading it. I also tried collecting to a local destination (again, plenty of free space on the local volume) to reduce the likelihood of I/O problems on the NIC. Same trouble.
Permissions are all A-OK (the project opens / renders without any trouble).
Copying the source materials local first or turning the image sequences into movies would be a huge problem -- the source items are scattered throughout a very large working job folder (over a dozen different people worked on elements of the job), and there are so many sequences and footage items that tracking them down would take days, maybe weeks.
I have been able to successfully collect the project by connecting to the fileserver over SMB instead of AFP (it's a Solaris fileserver running Xinet KA-Share for Appletalk AFP). Anyone know a reason AFP should be the source of the trouble here? We routinely copy very large filesets to/from the server over AFP without incident.
Rob LaRose | Systems Administrator | Imaginary Forces | 646.486.6868 office | 646.486.4700 fax | www.imaginaryforces.com
Try not to take this personally; it's so easy for folks around here to get ruffled. I am facing a similar issue at my shop, but on a far smaller scale.
The additional information you have provided suggests only one thing to me: you seem to be relying on After Effects, a compositing system, to handle all of your data management for what has become a h-u-g-e project. Your statement that compiling all of the individual elements is too much to even contemplate suggests your shop's file management structures and policies are woefully and dangerously inadequate for the type of work you and your boss like to do and invoice. You've got to come up with a better file system that includes in-process backups and redundancies.
Try locating your AE resources via Bridge. If you can see them in Bridge, AE should pick them up.
Rant Mode <off>
You've wasted a ton of time trying to force AE to do this job for you, and it's not working, so it's time to start hauling your resources manually to a central location so they can be dealt with in a more conventional way. Our network switches are all programmed to ignore AFP logons these days; everything must be accomplished via SMB. AFP caused tremendous file-ownership issues that could not be resolved. It was awful.
Good luck, but I think you're asking more of AE than it can deliver. It's your OS that is trying to resolve the network connections.
I'd take every word of David's advice and pay a penny for it. If you need to copy this much data this often, AE is certainly not your friend. You really should consider using an asset management and tracking system such as GridIron Flow or Alienbrain. Then all the "collecting" can happen outside AE, at a strict file-system level. Also -- take this with a grain of salt -- having to do that kind of massive cleanup is, to me, no good sign in terms of the overall workflow. In my view there is certainly potential there to avoid the "versioning trap" with lots of redundant data scattered in different places...
Not ruffled at all. I really appreciate the advice y'all can offer
here. I wouldn't come askin' questions otherwise. We small-staff
sysadmins especially depend on our colleagues to share their ideas &
revelations with us.
We're certainly on the razor's edge of needing an asset-management system to
help out, but the ones I've looked at (Gridiron Flow and Alienbrain
included) don't seem from their marketing to really be tremendously AE-
friendly. For example, Flow's "package" tool will gather all the
files used in my AE project, but won't update the project itself, so
I'm left with a bunch of broken links which then need to be manually
re-pointed. Add the additional layer of multiple animators working on
separate AE projects that use the same footage items (e.g., multiple
versions of a commercial spot/campaign) and it gets even messier.
And Bridge, as we all know, is not without its own heartbreak (at
least as of CS3). The assets ARE predictably located, so I don't
mean to overstate the difficulty of manually collecting project assets
to one place -- just the number of them, and the fact that many of
them are image sequences (moving 1,000 1MB files takes a lot longer
than moving a single 1GB file) would make it a very tedious
operation. And that is, after all, the whole purpose of AE's "collect"
feature, is it not?
I don't want to downplay the importance of workflow management,
which we certainly depend on, but even workflow management tools for
the most part only "virtually" manage your assets, without regard to
their actual locations. Without a collect/package step, when it's
necessary to deliver a project (not just the output of the project), or
to restore an archived/offlined project from tape, you're in a fix.
So if you've got time, I'd be interested in hearing the techniques/
tools you folks use for managing large multi-designer, multi-animator,
multi-project AE workflows.
> So if you've got time, I'd be interested in hearing the techniques/
> tools you folks use for managing large multi-designer, multi-animator,
> multi-project AE workflows.
Wish it were a simple fix, but you must realize that every shop faces this reality sooner or later. The amount of time wasted on asset chaos is recovered almost instantly after good workflow policies are put into place, but these tools and procedures are grossly oversold to managers who only want to look at return on investment. The bigger picture is known as Total Cost of Adoption, which is a ratio of ROI and cost of implementation, less the cost of lost opportunities. Any hip accountant can figure that out for you.
I had glossed over the CS3 part of your original post, sorry. CS4 offers much more robust versions of Bridge, and Version Cue finally looks like it's getting some respect as something more than its previous status as a total waste of intellectual effort.
We happen to use Canto Cumulus, but I have a supervisor who is convinced Microsoft's SharePoint is the answer to all of our problems. It is a disaster when applied to anything other than Microsoft documents, so forget it.
I am trying to build a system that uses the least number of proprietary tools and imposes the smallest number of onerous tasks upon creatives who just want to get to work. Nonetheless, sophisticated file management requires a manager and that individual must have power to impose and enforce policy. If you cannot rely on your participants to adhere to policies they do not support or understand, you must have someone in charge of wrangling your thousands of files.
You need file naming conventions, file placement requirements, a taxonomy for storage placement, a predefined metadata template and strict metadata requirements, centralized file storage and versioning segregation, backup policies and offsite redundancy, disaster recovery plans and a rehearsal at a hot site.
Well, we're just a small shop, but since it usually falls upon me to administrate, here are a few general tips:
- Hide all resources that are not meant to be accessed by certain people. This can be done easily by using different sub-nets on your network and defining access privileges. Write down the network MAC addresses and use a whitelist/blacklist. A bit of work initially, but a lot safer than working with user names.
- Forget about individual user privileges. At one point or another everyone will bump into situations where they cannot access stuff but need it for their work. Simply ramp up all users to Power User status, but assign individual groups to them which only you can manage. That leaves it up to you to define who can access what, but without having to dig so deep that you always end up with individual adjustments. You merely add or remove users from groups (which is more practical for interns and freelancers, even).
- Quotas can work miracles for you. If a user only has X GB of storage available, he'll quickly learn not to create redundant data by copying files.
- Prevent all local storage. Artists have a tendency to offline files on a whim for experimentation and then forget about it. Give them a "garbage dump" directory on the server instead, where they can store their private data and any non-persistent work files, but with a quota. Use server-side desktops and user home directories if possible, but unfortunately many programs (including Adobe's) cause problems with that, so it's not always an option.
- For each new project, create a pre-defined folder structure. We, for example, use number_clientname_worktype_year for the root, and inside have directories for compositing, 3D stuff, print, client data, etc. Define it from the get-go so that specific structures, like project directories for 3D programs with their sub-folders for textures, are in place, but users cannot create additional folders. When they are forced to use existing folders, they will be much more careful in naming their files.
- Define versioning rules and naming conventions. Now, "significant milestone" is a matter of view, but I at least try to create new version files for those. So if, for example, in the morning I open up a comp project and change a few things that would take more than 10 minutes to reproduce, that is significant and a new version is saved. I don't care much about simple things like adding effects -- I can reproduce them in no time in case of crashes or data loss -- but complex masking, elaborate 3D modeling, dynamics simulations, etc. qualify.
- Of course, back up every day, at least incrementally. If you follow the above rules, this can be as easy as copying the project folder to an external disk.
As for the rest - simply be hard-nosed. Your artists will whine and complain for a while, but they will get used to it and eventually see the wisdom in a regulated system. It's also up to you to brief them accordingly and remind them from time to time.