
How can I add a delay (1 minute) on live streaming?

Guest
Nov 19, 2014


My project needs to read a stream from an IP camera and stream it out to FMS. We need users to view camera feeds that are slightly older than real time.

For example, if it is now 11:30 AM, the user is actually viewing the feed from 11:29.

My goal is for the client to view a camera feed that is 1 minute behind live, which gives us the response time to shut down the camera if an emergency happens.

Can anyone tell me whether I can do this?

Thanks in advance.

Views

2.6K

Community guidelines
Be kind and respectful, give credit to the original source of content, and search for duplicates before posting. Learn more
community guidelines
Adobe Employee
Nov 20, 2014


Did you check the Adobe help guide?

Adobe Media Server 5.0.6 * Application.xml file

See MaxQueueDelay.

Adobe Employee
Nov 20, 2014


Also, if you are looking for per-stream configuration, see Stream.maxQueueDelay in

Adobe Flash Media Server 4.5 * Stream class

Guest
Nov 24, 2014


Hi hparmar,

Please help me out on this; it seems no property that I change in application.xml takes effect.

I am posting below the application.xml located in the livepkgr/ directory. The goal is that when a client views the camera feed, they are always viewing the feed from 1 minute ago, so that we can take action before something happens.

All I have done is set MaxQueueDelay and MinBufferTime to 60000 according to the documentation, but it does not seem to be working for me.

For your information, I am using a trial version to evaluate Adobe Media Server to see if it fits our needs, so I am not sure whether there are restrictions in this version.

Thanks, please help me out. I really need to know whether Adobe Media Server is capable of doing this.

<Application>

    <StreamManager>

        <Live>

            <!-- When enabled, the server will assume that incoming live streams have -->

            <!-- timestamps that are based on an absolute clock, such as a SMPTE time -->

            <!-- signal contained within the encoder's input source.                  -->

            <!-- Default is false.                                                    -->

            <AssumeAbsoluteTime>false</AssumeAbsoluteTime>

            <!-- Allow take over so that encoders that go down (for some reason) and  -->

            <!-- come back up can immediately republish.                              -->

            <PublishTimeout>0</PublishTimeout>

            <!-- This configuration takes care of situation when the timestamps of    -->

            <!-- certain messages in internally or externally published streams don't -->

            <!-- honor absolute clock contract.                                       -->

            <!-- The possible values are listed below:                                -->

            <!-- 0 disables the work around and is default if the configuration is    -->

            <!--   not present.                                                       -->

            <!-- 1 handles the scenario for externally published streams only         -->

            <!-- 2 handles the scenario for internally published streams only         -->

            <!-- 3 handles both of above                                              -->

            <AdjustForZeroTimeStampMessages>2</AdjustForZeroTimeStampMessages>

            <!-- When enabled along with AssumeAbsoluteTime, the server will          -->

            <!-- assume that incoming live streams will always maintain the           -->

            <!-- timestamps based on an absolute clock across server and stream       -->

            <!-- restarts.                                                            -->

            <!-- Default is true.                                                     -->

            <AdjustForRecordingRollover>false</AdjustForRecordingRollover>

         

   <Queue enabled="true">

<!-- Specifies how often the server will flush the message queue by size   -->

<!-- (in bytes).   Setting the value to 0 also disables queuing.  Default is 4096. -->

<MaxQueueSize>4096</MaxQueueSize>

<!-- Specifies how often the server will flush the message queue by time   -->

<!-- (in milliseconds).   The default value is set to 500 milliseconds.    -->

<MaxQueueDelay>60000</MaxQueueDelay>

<!-- Specifies whether the server will flush the message queue when a      -->

<!-- data message arrives.  This is important when streaming only data     -->

<!-- messages, so the server will send out the messages immediately.       -->

<!-- Default is set to true.                                               -->

<FlushOnData>true</FlushOnData>

<!-- When queuing is enabled, messages in the queue can be combined to     -->

<!-- form aggregate messages.  The "enabled" attribute will determine      -->

<!-- whether aggregate messages will be created.  Default is "true".       -->

<AggregateMessages enabled="true">

<!-- This setting determines the maximum size (in bytes) of  -->

<!-- aggregate messages created from the message queue when  -->

<!-- aggregate messages are enabled.  Default is -1,         -->

<!-- which means that the aggregates can be as large as the  -->

<!-- queue size allows.                                      -->

<MaxAggMsgSize>-1</MaxAggMsgSize>

<!-- Maximum duration in milliseconds of an aggregate message while -->

<!-- reading from a live stream. Default is 10000 milliseconds. -->

<!-- <MaxAggMsgDuration>10000</MaxAggMsgDuration > -->

</AggregateMessages>

</Queue>

        </Live>

    </StreamManager>

<Client>

<!-- Use 10 if you have bandwidth values for AMS 2, 2 otherwise. -->

<WindowsPerAck>2</WindowsPerAck>

<Bandwidth override="yes">

<!-- Specified in bytes/sec. The default provides about 20Mbps. -->

<ServerToClient>2500000</ServerToClient>

<!-- Specified in bytes/sec -->

<ClientToServer>2500000</ClientToServer>

</Bandwidth>

<!-- Bandwidth cannot be set higher than the value set here. -->

<BandwidthCap override="no">

<!-- Specified in bytes/sec -->

<ServerToClient>5000000</ServerToClient>

<!-- Specified in bytes/sec -->

<ClientToServer>5000000</ClientToServer>

</BandwidthCap>

    <!-- Configure downstream (server to client) bw detection. -->

    <!-- To disable native bw detection and use script-based   -->

    <!-- bw detection, set the enabled attribute to false. By  -->

    <!-- default, native bw detection is enabled.              -->

    <BandwidthDetection enabled="true">

      <!-- Don't send data to client greater than this rate (in Kbps). -->

      <!-- Default of -1 means don't need to throttle; just send the   -->

      <!-- data at whatever rate is necessary to measure bw.           -->

      <MaxRate>-1</MaxRate>

      <!-- Amount of data sent to client. Server sends a series of random bytes -->

      <!-- to the client, each time sending this much more data. For example,   -->

      <!-- x bytes are sent, followed by 2x bytes, followed by 3x bytes, etc.   -->

      <!-- until <MaxWait> time has elapsed.                                    -->

      <DataSize>16384</DataSize>

      <!-- Specifies how long (in sec.) server will send data to the client.    -->

      <MaxWait>2</MaxWait>

    </BandwidthDetection>

<MsgQueue>

<Live>

<!-- Drop live audio if audio q exceeds time specified. time in milliseconds -->

<MaxAudioLatency>2000</MaxAudioLatency>

<!-- Default buffer length in millisecond for live audio and video queue. -->

<MinBufferTime>60000</MinBufferTime>

</Live>

<Recorded>

<!-- Default buffer length in millisecond for live audio and video, value cannot be set below this by Flash player. -->

<MinBufferTime>2000</MinBufferTime>

</Recorded>

<Server>

<!-- Ratio of the buffer length used by server side stream -->

<!-- to live buffer.  The value is between 0 and 1.  To    -->

<!-- avoid break up of audio, the ratio should not be more -->

<!-- than 0.5 of the live buffer.                          -->

<BufferRatio>0.5</BufferRatio>

<!-- This specifies whether outgoing messages are sent in       -->

<!-- different priorities for server to server connection.  By  -->

<!-- default, prioritization is set to false which means        -->

<!-- messages are sent out through one channel with the same    -->

<!-- priority.  If the value is set to true, messages will be   -->

<!-- sent through multiple channels with different priorities.  -->

<!-- And the priority is based on the message type.             -->

<Prioritization>false</Prioritization>

<!-- The minimum number of video messages to queue at the start -->

<!-- of a stream.  When an H264 stream or other pipelined codec -->

<!-- stream begins it needs some number of messages, at most 64 -->

<!-- to begin playback and this setting ensures that regardless -->

<!-- of the buffer set there are enough messages at the start   -->

<!-- to begin playback quickly.  Making this less than 64, down -->

<!-- to 0 won't queue up extra messages but may cause H264      -->

<!-- content with a low FramesPerSecond to delay before start   -->

<MinQueuedVideo>64</MinQueuedVideo>

</Server>

<Diagnostic>

<!-- Whether diagnostic logging specific to message queue is    -->

<!-- enabled.  Default is false.                                -->

<Enable>false</Enable>

</Diagnostic>

</MsgQueue>

<HTTPTunnel>

<!-- The following two parameters affect the latency observed by -->

<!-- a client tunneling into the server. Low values will reduce  -->

<!-- the latency but increase the network bandwidth overhead.    -->

<!-- Applications desiring low latency may use 128/256 for idle  -->

<!-- post and ack intervals, while those not susceptible to high -->

<!-- latencies may use 1024/2048. The default setting provides   -->

<!-- medium latency and is set at 512/512.                       -->

<!-- This specifies the interval at which the client should send -->

<!-- idle posts to the server to indicate that the player has no -->

<!-- data to send. This is needed to enable the server to send   -->

<!-- downstream data. This interval is specified in milliseconds -->

<!-- and must be in the range 0 to 4064 msec.                    -->

<IdlePostInterval>512</IdlePostInterval>

<!-- This specifies the max time the server may wait before it   -->

<!-- sends back an ack for a client idle post.                   -->

<IdleAckInterval>512</IdleAckInterval>

<!-- This specifies the default mime type header sent on tunnel  -->

<!-- responses. In general the server always uses the mime type  -->

<!-- specified by the incoming requests. The server will use the -->

<!-- following only if it could not determine the mime type from -->

<!-- incoming requests.                                          -->

<MimeType>application/x-fcs</MimeType>

<!-- This specifies the size (in KB) of the write buffer. The    -->

<!-- default is 16Kb.                                            -->

<WriteBufferSize>16</WriteBufferSize>

</HTTPTunnel>

<!-- This specifies the max size of messages for screen-sharing      -->

<!-- video packets.  Messages larger than this limit will be broken  -->

<!-- into multiple messages, which can allow for updates to appear   -->

<!-- faster but also creates partial image updates.  The size is in  -->

<!-- bytes.  Set to zero to not alter screen-sharing messages.       -->

<MaxMessageSizeLosslessVideo>0</MaxMessageSizeLosslessVideo>

<!-- Specifies the RTMP chunk size to use in all streams for this -->

<!-- application.  Stream content breaks into chunks of this size -->

<!-- in bytes.  Larger values reduce CPU usage, but also commit to -->

<!-- larger writes that can delay other content on lower bandwidth -->

<!-- connections.  This can have a minimum value of 128 (bytes) and -->

<!-- a maximum value of 65536 (bytes) with a default of 4096 bytes -->

<!-- Note that older clients may not support chunk sizes larger than -->

<!-- 1024 bytes. If the chunk setting is larger than these clients can -->

<!-- support, the chunk setting will be capped at 1024 bytes. -->

<OutChunkSize>4096</OutChunkSize>

<!-- An application can be configured to deliver aggregate messages to       -->

<!-- clients that support them by setting the "enabled" attribute to "true". -->

<!-- The server will attempt to send aggregate messages to these supported   -->

<!-- clients whenever possible.                                              -->

<!-- When this setting is disabled, aggregate messages will always be broken -->

<!-- up into individual messages before being delivered to clients.          -->

<!--  The default is "true".                      -->

<AggregateMessages enabled="true"></AggregateMessages>

<!-- Controls libconnect.dll access configurations  -->

<Access>

<!-- This controls if readAccess and writeAccess use the format  -->

<!-- where folder level permissions are enforced  -->

<!-- if true, individual files cannot be marked specifically for -->

<!-- read or write access. Default is false, new format allows   -->

<!-- access control at the file level  -->

<FolderAccess>false</FolderAccess>

<!-- This tag specifies a semicolon delimited list of folder     -->

<!-- levels which are given with audio sample access.  By        -->

<!-- default, this tag is disabled and the value is empty which  -->

<!-- means no one will have audio access.  This tag can be       -->

<!-- enabled by setting "enabled" to "true".                     -->

<!-- When enabled, everyone has audio sample access to the       -->

<!-- access folders specified in the list.  If value is set to   -->

<!-- "/", full sample access will be granted.  Users can also    -->

<!-- override this setting by using the access adaptor or server -->

<!-- side script.                                                -->

<AudioSampleAccess enabled="false"></AudioSampleAccess>

<!-- This tag specifies a semicolon delimited list of folder     -->

<!-- levels which are given with video sample access.  By        -->

<!-- default, this tag is disabled and the value is empty which  -->

<!-- means no one will have video access.  This tag can be       -->

<!-- enabled by setting "enabled" to "true".                     -->

<!-- When enabled, everyone has video sample access to the       -->

<!-- access folders specified in the list.  If value is set to   -->

<!-- "/", full sample access will be granted.  Users can also    -->

<!-- override this setting by using the access adaptor or server -->

<!-- side script.                                                -->

<VideoSampleAccess enabled="false"></VideoSampleAccess>

</Access>

</Client>

</Application>

Adobe Employee
Nov 25, 2014


I think you are just setting the value of MaxQueueDelay and not adjusting MaxQueueSize accordingly.

If you set MaxQueueDelay to 60 seconds and you have, say, a 0.8 Mbps stream (i.e. about 0.1 MB per second), then MaxQueueSize should be roughly 60 × 0.1 MB = 6 MB, i.e. 6 × 1024 × 1024 = 6291456 bytes. Also, you may or may not want to turn off the FlushOnData flag, depending on your needs.

I verified this on a standard build of AMS (i.e. the standard config that ships with AMS); if you have made other changes to the AMS config, you will have to take those changes into account as well.

In that case the best thing would be to create such a stream on one application, then stream/remote-publish it (from one server to another server, or to another app on the same server) and apply all your optimizations on that remote server.
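The sizing arithmetic above can be sketched as a small helper (a hypothetical function, not part of AMS; it uses the 1024-based conversion from Adobe's dynamic-streaming article):

```python
def max_queue_size(bitrate_kbps: float, delay_ms: int) -> int:
    """Rough MaxQueueSize (bytes) needed to hold delay_ms of a stream.

    bitrate_kbps is the highest published bit rate, with 1 Kbps taken
    as 1024 bits/s to match the arithmetic used in Adobe's article.
    """
    bytes_per_second = bitrate_kbps * 1024 / 8
    return int(bytes_per_second * delay_ms / 1000)

# A 0.8 Mbps stream (0.8 * 1024 Kbps) buffered for 60 seconds:
print(max_queue_size(0.8 * 1024, 60000))  # 6291456, i.e. the 6 MB above
```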

Guest
Nov 25, 2014


Hi hparmar,

Thanks a lot for your reply.

I actually noticed that MaxQueueDelay needs to work together with MaxQueueSize, from this article: http://www.adobe.com/devnet/adobe-media-server/articles/dynstream_live.html

"Corresponding to that, you should set the MaxQueueSize property with the maximum size in bytes needed to queue up 4000 milliseconds of data. This would be the size of 4000 milliseconds of data of the stream with the highest bit rate that will be streamed by the publisher; for example, if the highest bit rate is going to be 2.4 Mbps, then the MaxQueueSize should be:

2.4 × 1024 × 1024 ÷ 8 bps × 4 sec. = 1,258,292 bytes"

So in my case, the highest bitrate is 1000k, and MaxQueueSize should be roughly 7680000. However, after I set this value, the delay was enormous, several minutes. Do you know what could potentially cause such a big delay?

Actually, I am creating the stream from another application that basically wraps FFMPEG; ffmpeg encodes the camera feed and publishes it to the media server at different bitrates.

Thanks.

Adobe Employee
Nov 25, 2014


For my 650 Kbps stream, I used the following values:

<MaxQueueSize>5120000</MaxQueueSize>
<!-- Specifies how often the server will flush the message queue by time   -->
<!-- (in milliseconds).   The default value is set to 500 milliseconds.    -->
<MaxQueueDelay>60000</MaxQueueDelay>
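As a rough sanity check of these values (my own arithmetic, again taking 1 Kbps as 1024 bits/s): 60 seconds of a 650 Kbps stream is just under 5 MB, so 5120000 leaves a little headroom.

```python
# Bytes needed to buffer 60 seconds of a 650 Kbps stream (1 Kbps = 1024 bits/s).
bytes_needed = int(650 * 1024 / 8 * 60)
print(bytes_needed)             # 4992000
print(5120000 >= bytes_needed)  # True: the configured value has headroom
```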

Guest
Nov 25, 2014


After I set those values, the live stream does not work at all.

On the client side, the video stays frozen as a still image and does not play at all within 10 minutes.

I still do not know what happened.

Guest
Nov 23, 2014


Hi hparmar,

Thanks for your reply!! Much appreciated.

I did what you said: I added MaxQueueDelay and set it to 60000 (1 minute) in application.xml.

This is what I have done:

1. Changed livepkgr/application.xml, that is, set MaxQueueDelay.

2. Restarted the media server.

3. Started streaming the camera feed to the media server.

4. Loaded the stream in videoplayer.html.

5. I can still view the feed after roughly 25 seconds (a delay caused by network conditions?), but not after the 1 minute I set in application.xml.

As I am new to AMS and exploring its possibilities for our new project, I am wondering: is there anything I have done wrong?

Thanks in advance.
