I'm trying to write software that sends video and audio data to a Flash Media Server using the RTMP protocol.
Currently, my program communicates with the Flash Media Server correctly.
The RTMP specification does not describe the format of the payload in video/audio messages, so I muxed raw H.264 and AAC data into video/audio messages and sent them to the server.
The server seems to accept them, but a video player cannot play back the stream sent from the server. The player just says "Loading..."
As a test, I sniffed the network packets between Wirecast and the Flash Media Server and extracted only the video and audio data. I then muxed that data into video/audio messages and sent them to the Flash Media Server. In this case, a video player connected to the server can play the stream correctly.
I inspected the stream sent by Wirecast; it does not appear to be raw H.264, because the packets start with 0x17 instead of an H.264 start code.
Given this situation, I am wondering what kind of container format I should use for the H.264/AAC data sent to the Flash Media Server.
Does anyone have an idea?
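For anyone landing here with the same symptom: the 0x17 observed in the Wirecast capture is the FLV VideoTagHeader byte (FrameType 1, CodecID 7), not H.264 data, and the NAL units that follow carry length prefixes instead of Annex B start codes. A rough Python sketch of that repackaging (the function name is mine, not from any SDK; 4-byte length prefixes assumed):

```python
import struct

def annexb_to_flv_video_body(annexb, keyframe, cts_ms=0):
    """Convert one Annex B H.264 access unit (start-code delimited) into
    the body of an FLV/RTMP video message:
      byte 0:    FrameType (1=key, 2=inter) << 4 | CodecID 7 (AVC) -> 0x17/0x27
      byte 1:    AVCPacketType 1 = coded NAL units
      bytes 2-4: 24-bit signed composition-time offset in ms
      then each NAL unit with a 4-byte big-endian length prefix (no start codes)."""
    # Locate 3- and 4-byte start codes (0x000001 / 0x00000001).
    positions = []  # (offset_of_start_code, start_code_length)
    j = 0
    while j + 2 < len(annexb):
        if annexb[j] == 0 and annexb[j + 1] == 0:
            if annexb[j + 2] == 1:
                positions.append((j, 3)); j += 3; continue
            if j + 3 < len(annexb) and annexb[j + 2] == 0 and annexb[j + 3] == 1:
                positions.append((j, 4)); j += 4; continue
        j += 1

    body = bytearray([(0x10 if keyframe else 0x20) | 0x07, 0x01])
    body += struct.pack(">i", cts_ms)[1:]            # 24-bit composition time
    for k, (pos, sc_len) in enumerate(positions):
        end = positions[k + 1][0] if k + 1 < len(positions) else len(annexb)
        nalu = annexb[pos + sc_len:end]
        body += struct.pack(">I", len(nalu)) + nalu  # length prefix replaces start code
    return bytes(body)
```

This body then goes into an RTMP message of type 9 (video) as-is; the RTMP layer itself stays unchanged.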
Can you try prepending "mp4:" to the stream name? Are you playing a live stream or a recorded file? Try recording the file as MP4; you just need to prepend "mp4:" to the stream name to do that. For example, if your stream is "abc", use "mp4:abc" in both the publish and the play options.
Thank you very much for your answer. =)
I am trying the live-stream case. I added "mp4:" to the stream name both in my test program and in the video player, but nothing seems to have changed.
I compared my H.264 stream with the stream received/sent by Wirecast/UStream and determined that their stream is completely different from mine.
The video/audio packets in their stream seem to be muxed as FLV video/audio packets. I'll try muxing my A/V packets as FLV video/audio packets.
But I am still wondering: is there any way to use a raw H.264/AAC stream instead of FLV-packed data? I don't think FLV can be the only way to mux A/V data into RTMP packets.
Does anyone have advice or ideas?
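One detail worth knowing while reverse-engineering: per the FLV spec, before any coded frames a player also expects an "AVC sequence header" message (AVCPacketType 0) carrying an AVCDecoderConfigurationRecord built from the SPS and PPS. A sketch, assuming exactly one SPS, one PPS, and 4-byte NALU lengths:

```python
import struct

def avc_sequence_header_body(sps, pps):
    """Build the body of the FLV 'AVC sequence header' video message:
    FrameType/CodecID byte 0x17, AVCPacketType=0, zero composition time,
    then an AVCDecoderConfigurationRecord wrapping one SPS and one PPS."""
    record = bytearray()
    record.append(0x01)        # configurationVersion
    record.append(sps[1])      # AVCProfileIndication (copied from SPS)
    record.append(sps[2])      # profile_compatibility
    record.append(sps[3])      # AVCLevelIndication
    record.append(0xFF)        # reserved '111111' + lengthSizeMinusOne=3 (4-byte lengths)
    record.append(0xE1)        # reserved '111' + numOfSequenceParameterSets=1
    record += struct.pack(">H", len(sps)) + sps
    record.append(0x01)        # numOfPictureParameterSets = 1
    record += struct.pack(">H", len(pps)) + pps
    return b"\x17\x00\x00\x00\x00" + bytes(record)
```

Send this once, as the first video message after publishing, before any frames produced from the coded NAL units.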
Ohh... you mean you are sending the raw stream and not packing it into RTMP messages? FMS only understands RTMP and will try to interpret your data as an RTMP chunk stream; if you send a raw stream, it is bound to fail.
Hi NpComplete, thank you for your answer.
Hmm, no. I muxed the raw H.264/AAC stream into RTMP video/audio messages, divided them into RTMP chunks, and sent them to the server.
The problem is that the payload of the RTMP video/audio messages is NOT the usual raw H.264/AAC data.
It seems to be the muxed video/audio data described in the FLV spec, which the RTMP spec does not mention at all.
flv specs: http://www.adobe.com/devnet/flv/
rtmp specs: http://www.adobe.com/devnet/rtmp/
I am still reverse-engineering the Wirecast stream to figure out its muxing format, because it looks like FLV video/audio data but also seems slightly different from it.
Also, thank you for mentioning the FMS/RTMP SDK; I was not aware of it before implementing the RTMP protocol in my program.
I have already implemented the RTMP protocol correctly, so the only remaining problem is the data inside the RTMP video/audio messages.
Even so, just in case, I'll contact Adobe to get the SDK. =)
Do you know the muxing format used inside RTMP video/audio messages for H.264/AAC?
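As far as I can tell from the FLV spec, the audio side is simpler than the video side: each AAC message body is two header bytes plus the payload. A sketch (the 0x12 0x10 AudioSpecificConfig below is the standard value for AAC-LC, 44.1 kHz, stereo; adjust it to match your encoder):

```python
def aac_audio_body(payload, is_sequence_header):
    """Build the body of an FLV/RTMP audio message for AAC.
    First byte 0xAF: SoundFormat=10 (AAC), SoundRate=3, SoundSize=1,
    SoundType=1 -- for AAC the rate/channel bits are informational only;
    the real values come from the AudioSpecificConfig.
    Second byte: AACPacketType 0 = AudioSpecificConfig, 1 = raw AAC frame
    (ADTS headers must be stripped from raw frames)."""
    return bytes([0xAF, 0x00 if is_sequence_header else 0x01]) + payload

# AudioSpecificConfig for AAC-LC, 44100 Hz, stereo:
# 5 bits audioObjectType=2, 4 bits samplingFrequencyIndex=4,
# 4 bits channelConfiguration=2, 3 padding bits -> 0x12 0x10
asc = bytes([0x12, 0x10])
```

As with video, the sequence-header message (AACPacketType 0) goes out once before any raw AAC frames.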
Thanks for your reply.
I just want to send video data to the Flash player. Do you have any specification that describes the format in which video packets are to be sent over RTMP?
I have used the ffplay client, which supports RTMP video rendering. It says "HandShake: client signature does not match!" and "[h264] no frame".
I just formed an RTMP header for the video data, read a frame from a file (MP4) to form the RTMP body, and sent it to Flash or ffplay.
Can you explain this in more detail?
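On the "client signature does not match" error: ffplay is complaining about the handshake echo, not the video data. Below is a sketch of the plain handshake as described in the public RTMP spec; note that FMS can also use an undocumented digest-based variant for H.264 publishing, which this does not implement (`_recv_exact` is my own helper, not a library call):

```python
import os
import socket
import struct

def rtmp_simple_handshake(sock):
    """Perform the plain RTMP client handshake:
    send C0 (version 3) and C1 (4-byte time + 4 zero bytes + 1528 random
    bytes), read S0/S1/S2, then send C2 as an echo of S1."""
    c1 = struct.pack(">II", 0, 0) + os.urandom(1528)
    sock.sendall(b"\x03" + c1)                # C0 + C1
    s0 = _recv_exact(sock, 1)
    assert s0 == b"\x03", "unexpected RTMP version from server"
    s1 = _recv_exact(sock, 1536)
    _s2 = _recv_exact(sock, 1536)             # server's echo of C1
    sock.sendall(s1)                          # C2 = echo of S1

def _recv_exact(sock, n):
    """Read exactly n bytes or raise if the peer closes early."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed during handshake")
        buf += chunk
    return buf
```

If the server rejects this plain variant, the digest handshake is the next thing to look at; it is not part of the published spec.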
Have you solved the problem you faced? Can you provide me the details or any document for this? I'm also facing the same problem you describe.
I have already implemented the RTMP protocol in my server program correctly. The problem is only with the video data.
The required information is in "Video File Format Specification Version 10", which you can get from the Adobe site. The section "FLV File Format" describes how FLV audio and video packets have to be created. These FLV packets can then be sent using RTMP.
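To illustrate that last step, here is a rough sketch of wrapping one such FLV tag body into RTMP chunks (the default chunk size is 128 bytes unless Set Chunk Size is negotiated; the csid value here is an arbitrary choice):

```python
import struct

def rtmp_chunks(body, msg_type, timestamp, stream_id, csid=4, chunk_size=128):
    """Wrap one FLV tag body (msg_type 8 = audio, 9 = video) into an RTMP
    message: a type-0 chunk header on the first chunk, then a one-byte
    type-3 continuation header every `chunk_size` bytes of payload."""
    hdr = bytes([(0 << 6) | csid])           # fmt=0, chunk stream id (2-63)
    hdr += struct.pack(">I", timestamp)[1:]  # 24-bit timestamp
    hdr += struct.pack(">I", len(body))[1:]  # 24-bit message length
    hdr += bytes([msg_type])                 # message type id
    hdr += struct.pack("<I", stream_id)      # message stream id (little-endian!)
    out = bytearray(hdr)
    for i in range(0, len(body), chunk_size):
        if i:
            out.append((3 << 6) | csid)      # fmt=3 continuation header
        out += body[i:i + chunk_size]
    return bytes(out)
```

A common mistake worth flagging: the message stream id is the one little-endian field in an otherwise big-endian header.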
I have another problem. The Flash player is not sending the createStream command to my RTMP server software. The handshake, connect, and Window Acknowledgement Size messages are received from the Flash player, but it stops sending commands after Window Acknowledgement Size. Any help would be greatly appreciated.
I am using the Adobe Flash Media Server Connector for C++.
I managed to stream, on the one hand, a stream with H.264 video only (e.g. rtmp://myserveradress:1935/live/video1) and, on the other hand, a stream with AAC audio only (e.g. rtmp://myserveradress:1935/live/audio1).
Although I am able to stream video and audio simultaneously, I cannot manage to play both streams from the server in this case.
Is it possible to stream audio and video to the same URL (e.g. rtmp://myserveradress:1935/live/mymedia)?
How can I stream a medium that contains both video and audio so that they play back simultaneously?
Thanks for any help.
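In case it helps: a single RTMP stream name can carry both media types. Audio (message type 8) and video (message type 9) messages are simply interleaved in timestamp order on the same message stream, rather than published to two separate URLs. A toy sketch of the interleaving, with made-up `(timestamp_ms, body)` tuples:

```python
def interleave(video_msgs, audio_msgs):
    """Merge (timestamp_ms, body) lists for video and audio into one
    timestamp-ordered sequence tagged with the RTMP message type
    (9 = video, 8 = audio), ready to publish on a single stream name.
    Players expect both media types on one NetStream, in timestamp order."""
    tagged = [(ts, 9, body) for ts, body in video_msgs] + \
             [(ts, 8, body) for ts, body in audio_msgs]
    return sorted(tagged, key=lambda item: item[0])
```

With this ordering, the same message stream id is used for every message, and only the message type byte distinguishes audio from video.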