I want to do the following:
1. Record a stream, capturing both the camera and microphone.
2. The quality of the recording should be the same for everyone, regardless of actual bandwidth. To achieve this, I set a large NetStream.bufferTime on the publishing stream, allocating a client-side buffer big enough to accumulate all the recorded data while it is uploaded to the server. If the bandwidth does not allow Flash to send all the data in real time, the local buffer grows but never fills up.
3. When the recording is stopped, both the camera and microphone are detached from the stream and the remaining recorded data in the stream's local buffer is uploaded (can take minutes depending on the upload speed and recording's length and quality) until NetStream.bufferLength reaches 0 (all the data has been uploaded). Then the recorded FLV can be played back.
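For reference, here is a rough ActionScript 3 sketch of the setup described in steps 1–3. This is only an illustration of my approach, not my exact code; the RTMP URL, stream name, buffer value, and the polling timer are placeholders:

```actionscript
// Sketch of the record-with-large-buffer setup (error handling and
// NetStatusEvent checks omitted; URL and names are placeholders).
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://example.com/app");

var cam:Camera = Camera.getCamera();
var mic:Microphone = Microphone.getMicrophone();

var ns:NetStream = new NetStream(nc);
// Large buffer on the *publishing* stream: if upload bandwidth is
// insufficient, data accumulates locally instead of being dropped.
ns.bufferTime = 3600; // seconds; larger than any expected recording
ns.attachCamera(cam);
ns.attachAudio(mic);
ns.publish("myRecording", "record");

// Later, when the user stops recording:
function stopRecording():void {
    ns.attachCamera(null); // detach devices; buffered data keeps uploading
    ns.attachAudio(null);
    var t:Timer = new Timer(250);
    t.addEventListener(TimerEvent.TIMER, function(e:TimerEvent):void {
        if (ns.bufferLength == 0) { // all buffered data has been sent
            t.stop();
            ns.close(); // recording on the server is now complete
        }
    });
    t.start();
}
```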
This works fine when the bandwidth is sufficient. But when the bandwidth is too low for real-time upload, I've found that most of the time the resulting FLV on the server side is corrupted: jerky playback, low frame rates, sudden bursts of speed during playback, invalid playhead time values, etc. I'm now trying to determine whether this is a client-side or server-side issue.
So my question is: if the user's bandwidth does not allow the application to upload a high-quality recorded stream in real time, is it possible, using a high enough NetStream.bufferTime value as described above, to end up with a well-formed FLV on the server side? If not, it looks like I'll need to lower the recording quality to accommodate the user's upload speed.