Hi MohaviRazavi and welcome to the RH community.
I suspect that no one answered your previous post because they weren't sure what you meant by "crushing" a file. I'm assuming you're talking about something similar to compressing, like a Unix utility? To be honest, I have no knowledge of how any compression tool works, and quite frankly I don't want to. All I want to know is that it works.
Is it a problem for your users or are they just curious?
Like Colum, I inferred that crushing was akin to zipping a file: using some form of file compression utility to make it smaller.
As long as the resulting file works, there should be no need for worry or cause for alarm over a size difference. These beasties use different algorithms to compress the data. Perhaps something in the source files was rearranged a smidge, resulting in better compression where there wasn't any before, and that accounts for the decrease in size.
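For what it's worth, here's a quick generic illustration with Python's standard-library compressors (nothing .CSH-specific — I have no idea what algorithm the crush tool actually uses, so the data and numbers here are purely hypothetical). The same input compressed by different algorithms, or after a small rearrangement, comes out at different sizes even though no content is lost:

```python
import bz2
import lzma
import zlib

# Fake "help topic" content, just for demonstration.
lines = [f"topic_{i:03d}: help text for topic {i}\n".encode() for i in range(500)]
data = b"".join(lines)                            # topics in original order
reordered = b"".join(lines[250:] + lines[:250])   # same lines, rotated

assert len(data) == len(reordered)                # identical uncompressed size

# Different algorithms -> different compressed sizes for the same bytes.
sizes = {
    "zlib": len(zlib.compress(data, 9)),
    "bz2":  len(bz2.compress(data)),
    "lzma": len(lzma.compress(data)),
}
print(sizes)

# Rearranging the source can also shift the compressed size, because the
# back-reference (match) opportunities the compressor finds change.
print(len(zlib.compress(data, 9)), len(zlib.compress(reordered, 9)))
```

The point being: two deliveries with identical content can legitimately land at different compressed sizes.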
Just my $.02 worth. I suppose when adjusted for current events, it may be my $.0000002 worth.
Thanks for the input. Yes, I am talking about compressing files into a .CSH file. It's how we deliver help files to our customers. Once, we did deliver a file that did not compile correctly, so when it was "crushed," folders were missing. Now the client is hyper-aware of that and compares the size of each delivery against the previous delivery, which is why they noticed that previous deliveries of a .CSH file might be larger than the current delivery by as much as 200 KB. I see nothing missing in the files themselves, but every time this happens, they ask us why, so I'd like to provide a reasonable answer.
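Since the real worry is missing folders rather than size, one option (a sketch only — the paths and folder layout here are hypothetical, and this works on the uncompressed source trees, not on the .CSH itself) is to compare per-file checksums between the previous and current deliveries. That catches a dropped folder directly, where total size can't:

```python
import hashlib
from pathlib import Path

def tree_digest(root):
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    root = Path(root)
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def compare_deliveries(old_root, new_root):
    """Report files missing from, added to, or changed in the new delivery."""
    old, new = tree_digest(old_root), tree_digest(new_root)
    missing = sorted(set(old) - set(new))   # present before, gone now
    added = sorted(set(new) - set(old))
    changed = sorted(k for k in old.keys() & new.keys() if old[k] != new[k])
    return missing, added, changed
```

An empty `missing` list answers the client's real question; a size delta never will.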
Thanks again for your help.
Rick's suggestion about the compression algorithm is the way I'd go. I might also politely suggest that if I spent more time producing the documentation and less time firefighting non-existent problems, there would be less likelihood of delivering help files that didn't compile properly.
I so hear you. Thanks!