13 Replies Latest reply: Feb 22, 2014 6:08 AM by Vilius Šumskas

    FMS causes memory to fill and crash linux server

    iGoez

      FMS 4.5 on CentOS 6.2

       

      Issue: an error in the stream (either an encoder problem or packet loss) causes FMS memory usage to ramp up until it hits 100%, crashing the server. The setup is RTMP sent to the server's livepkgr application for HTTP streaming out, using f4m for Flash clients and m3u8 for iOS. Why is this happening? Need help urgently. Thanks!

        • 1. Re: FMS causes memory to fill and crash linux server
          SE_0208 Adobe Employee

          FMS 4.5 is not officially supported on CentOS 6.2, so we haven't tested it on our end. I don't see why there should be an issue on the FMS side; possibly something changed between CentOS 5.5 and 6.2 that is causing this.

           

          Do you see anything in your master/core/edge logs that might indicate what is happening?

          • 2. Re: FMS causes memory to fill and crash linux server
            wujemurray

            We are having a similar problem, although not on CentOS 6.2. We are running on Red Hat:

             

            Red Hat Enterprise Linux Server release 5.8 (Tikanga)

             

            Kernel:

             

            Linux stream1 2.6.18-308.4.1.el5 #1 SMP Wed Mar 28 01:54:56 EDT 2012 x86_64 x86_64 x86_64 GNU/Linux

             

            FMS:

             

            FMS_4_5_0_r297

             

             

             

            The FMS httpd server is consuming all the memory in the system (here is the output from top):

             

              PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+ COMMAND                                                                                                                                                       
              26118 fms       19   0 12.6g 7.3g 1672 S 99.6 93.6   9:46.69 httpd    

             

             

            Our system has 8 GB of physical memory and 10 GB of swap. It consumes all the memory, then all the swap, until the system is thrashed to death.
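To put rough numbers on this, here is a sketch of a combined RAM-plus-swap usage percentage. An assumption on my part: the exact formula behind FMS's "System memory load (NN)" warnings isn't documented in this thread, so treat this as an illustration of the metric, not FMS's actual calculation.

```python
def memory_load_percent(mem_total_kb, mem_free_kb, swap_total_kb, swap_free_kb):
    """Combined RAM+swap usage as a whole-number percentage.

    Assumption: FMS's "System memory load (NN)" warning is a usage
    percentage; the exact formula it uses is not documented here.
    """
    total = mem_total_kb + swap_total_kb
    used = total - (mem_free_kb + swap_free_kb)
    return round(100 * used / total)

# The box above: 8 GB RAM + 10 GB swap, with httpd alone holding ~7.3 GB
# resident, so usage climbs quickly past the 91% warning threshold.
print(memory_load_percent(8 * 1024**2, 512 * 1024, 10 * 1024**2, 1024**2))  # → 92
```

With 8 GB of RAM and 10 GB of swap, a resident httpd in the 7-8 GB range plus normal system usage pushes this figure into the 90s quickly, which matches the warnings in the logs below.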

             

             

            After some time the system will kill off the process:

             

              May  9 10:03:57 stream1 kernel: Out of memory: Killed process 26118, UID 505, (httpd).

             

             

            There is nothing obvious in the master.log:

             

            #Fields: date    time    x-pid    x-status    x-ctx    x-comment
            2012-05-09    08:45:40    31937    (i)2581173    FMS detected IPv6 protocol stack!    -
            2012-05-09    08:45:40    31937    (i)2581173    FMS config <NetworkingIPv6 enable=false>    -
            2012-05-09    08:45:40    31937    (i)2581173    FMS running in IPv4 protocol stack mode!    -
            2012-05-09    08:45:40    31937    (i)2581173    Host: stream1 IPv4: 127.0.0.1    -
            2012-05-09    08:45:40    31937    (i)2571011    Server starting...    -
            2012-05-09    08:45:41    31937    (i)2581413    /opt/adobe/fms/Apache2.2/bin/httpd -f ./conf/httpd.conf -d "/opt/adobe/fms/Apache2.2" -k start returned 0: [Wed May 09 08:45:40 2012] [info] mod_jithttp - FMS installation path: "/opt/adobe/fms/"
            httpd: Could not reliably determine the server's fully qualified domain name, using 127.0.0.1 for ServerName    -
            2012-05-09    08:45:41    31937    (i)2581224    Edge (31960) started, arguments : -edgeports ":1935,80" -coreports "localhost:19350" -conf "./conf/Server.xml" -adaptor "_defaultRoot_" -name "_defaultRoot__edge1" -edgename "edge1".    -
            2012-05-09    08:45:41    31937    (i)2571111    Server started (./conf/Server.xml).    -
            2012-05-09    08:48:08    31937    (i)2581221    Core (32387) started, arguments : -adaptor "_defaultRoot_" -vhost  -app  -inst  -tag  -conf "./conf/Server.xml" -name "_defaultRoot_::::".    -
            2012-05-09    09:34:58    31937    (w)2581171    System memory load (91) is high.    -
            2012-05-09    10:03:59    31937    (i)2581172    System memory load (58) is now below the maximum threshold.    -
            2012-05-09    10:05:58    31937    (w)2581171    System memory load (91) is high.    -
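For anyone watching for these events programmatically, here is a small sketch that pulls the memory-load warnings out of an FMS log. The field layout follows the #Fields header above, and the 2581171/2581172 status codes are taken from the log lines themselves; everything else is illustrative.

```python
import re

# FMS master/edge log lines carry a status code such as (w)2581171
# ("memory load high") or (i)2581172 ("back below threshold").
MEM_EVENT = re.compile(
    r"\((?P<level>[iw])\)(?P<code>258117[12])\s+"
    r"System memory load \((?P<load>\d+)\)"
)

def memory_events(lines):
    """Yield (status code, load percent) for each memory-load event."""
    for line in lines:
        m = MEM_EVENT.search(line)
        if m:
            yield m.group("code"), int(m.group("load"))

sample = [
    "2012-05-09\t09:34:58\t31937\t(w)2581171\tSystem memory load (91) is high.\t-",
    "2012-05-09\t10:03:59\t31937\t(i)2581172\tSystem memory load (58) is now below the maximum threshold.\t-",
]
print(list(memory_events(sample)))  # → [('2581171', 91), ('2581172', 58)]
```

Pointing this at a real master.log or edge.log gives a quick timeline of how fast the load climbs before the OOM killer fires.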

             

             

             

            Or the edge.log:

             

            #Date: 2012-05-09

            #Fields: date    time    x-pid    x-status    x-ctx    x-comment

            2012-05-09    08:45:41    31960    (i)2581173    FMS detected IPv6 protocol stack!    -

            2012-05-09    08:45:41    31960    (i)2581173    FMS config <NetworkingIPv6 enable=false>    -

            2012-05-09    08:45:41    31960    (i)2581173    FMS running in IPv4 protocol stack mode!    -

            2012-05-09    08:45:41    31960    (i)2581173    Host: stream1 IPv4: 127.0.0.1    -

            2012-05-09    08:45:41    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : localhost:19350/v4    -

            2012-05-09    08:45:42    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 80/v4    -

            2012-05-09    08:45:42    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 1935/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 127.0.0.1:19350 (rtmfp-core)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.71:19350 (rtmfp-core)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.70:19350 (rtmfp-core)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 10.0.0.1:19350 (rtmfp-core)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 192.168.122.1:19350 (rtmfp-core)/v4    -

            2012-05-09    08:45:43    31960    (i)2631509    Public rtmfp-core addresses for listener _defaultRoot__edge1 are: 127.0.0.1:19350;128.252.28.71:19350;128.252.28.70:19350;10.0.0.1:19350;192.168.122.1:19350    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 127.0.0.1:1935 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.71:1935 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.70:1935 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 10.0.0.1:1935 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 192.168.122.1:1935 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 127.0.0.1:80 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.71:80 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 128.252.28.70:80 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 10.0.0.1:80 (rtmfp)/v4    -

            2012-05-09    08:45:43    31960    (i)2631174    Listener started ( _defaultRoot__edge1 ) : 192.168.122.1:80 (rtmfp)/v4    -

            2012-05-09    08:48:08    31960    (i)2581252    Registering core (32387).    -

            2012-05-09    09:34:58    31960    (w)2581171    System memory load (91) is high.    -

            2012-05-09    10:04:00    31960    (i)2581172    System memory load (58) is now below the maximum threshold.    -

            2012-05-09    10:05:58    31960    (w)2581171    System memory load (91) is high.    -

             

             

             

             

             

             

            Our exact stream setup is like this:

             

            We are using livepkgr with three streams at 200, 500, and 1000 Kbps. We have written an f4m and an m3u8 manifest for the three streams. We modified the event.xml based on Adobe's recommendations, and we modified the manifest.xml, making sure it matches the f4m and m3u8 bit rates.
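For comparison, a three-rate HLS variant playlist for a setup like this might look as follows. The server name, event name, and stream names here are illustrative, and the /hls-live/ URL layout is the livepkgr one; adjust all of them to your own configuration.

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
http://example.com/hls-live/livepkgr/_definst_/liveevent/livestream1.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=500000
http://example.com/hls-live/livepkgr/_definst_/liveevent/livestream2.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1000000
http://example.com/hls-live/livepkgr/_definst_/liveevent/livestream3.m3u8
```

The BANDWIDTH values are in bits per second, so they should match the 200/500/1000 Kbps encoder settings and the rates declared in the f4m.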

             

             

             

             

            Anything you can help us with would be appreciated.

            • 3. Re: FMS causes memory to fill and crash linux server
              RMonst3r

              Having the same issue on CentOS release 5.7 (Final) with 12 GB RAM.

               

              httpd eats up all resources and kills the server.

               

              Using livepkgr for HDS and HLS.

               

              Any help on this issue would be greatly appreciated.

              • 4. Re: FMS causes memory to fill and crash linux server
                Zeromega

                Hello,

                 

                It seems to be a common issue.

                Same problem for me on a Linux machine with 16 GB of RAM:

                Linux 2.6.18-274.3.1.el5 #1 SMP Tue Sep 6 20:13:52 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux

                In the messages log I have a lot of "Server[17763]: System memory load (91) is high." entries, and then the system crashes... no more services up.

                 

                Please any advice?

                 

                Thanks

                • 5. Re: FMS causes memory to fill and crash linux server
                  Amit Kumar Adobe Employee

                  Please use the latest FMS 4.5.2, as there have been many memory and performance improvements since the 4.5.0.297 build. Let me know if you still see the issue.

                  • 6. Re: FMS causes memory to fill and crash linux server
                    RMonst3r Community Member

                    Will try and let you guys know how it works.

                    • 7. Re: FMS causes memory to fill and crash linux server
                      RMonst3r Community Member

                      Still happening on the latest FMS.

                       

                       

                      Processes on this host with the greatest increase in usage of Memory during the last 5 minute(s) (from June 15 23:36 UTC to June 15 23:40 UTC):



                      httpd    18620    10.32 GB

                       

                       

                      Server crash.

                      • 8. Re: FMS causes memory to fill and crash linux server
                        RMonst3r Community Member

                        I've added RLimitMEM 1000000000 1000000000 to the Apache conf to see if that would prevent the entire machine from crashing.

                         

                        Will see...
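For reference: per Apache's documentation, RLimitMEM limits the memory of processes launched by httpd children (CGI scripts and the like), not the httpd children themselves, which would explain why this cap has no effect on a leaking httpd worker. The directive takes a soft and a hard limit, in bytes:

```
# httpd.conf: soft and hard memory limits, in bytes, applied to
# processes *forked by* httpd children (e.g. CGI), not to httpd itself
RLimitMEM 1000000000 1000000000
```

To cap the httpd processes themselves, an OS-level limit (ulimit on the startup script, or cgroups) would be needed instead.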

                        • 9. Re: FMS causes memory to fill and crash linux server
                          RMonst3r Community Member

                          That didn't work either...

                           

                           

                          http://s18.postimage.org/5djfu9qfd/fmscrash.png

                           

                          I never had a problem with FMS using the live application; only when I started using livepkgr and HLS did this start happening every day.

                           

                          Unfortunately we need HLS/HTML5 streaming. If this cannot be fixed soon, we will have to dump the FMS platform in favor of a competitor.

                          • 10. Re: FMS causes memory to fill and crash linux server
                            ShivenK Adobe Employee

                            Guys,

                             

                            I am looking to reproduce this at my end. I am working with FMS 4.5.2 on CentOS 6.2.

                            A few questions:

                             

                            1. What kind of load does the server have?

                            2. After how long does this occur?

                            3. For how long are the streams published?

                             

                             

                            Thanks,

                            Shiven

                            • 11. Re: FMS causes memory to fill and crash linux server
                              RMonst3r Community Member

                              I put FMS in its own CentOS 5.5 Xen VM and it's behaving now.

                               

                              Can't really experiment with it since it's on a production machine.

                              • 12. Re: FMS causes memory to fill and crash linux server
                                Jeff Dias Community Member

                                I'm using Adobe Media Server 5 on an EC2 instance running CentOS 5.5 and I have the exact same issue.

                                 

                                In my logs there are several messages of high memory usage, then back below the threshold, even when I have fewer than 50 clients.

                                 

                                I use two interactive applications.

                                 

                                My daily load is around 5,000 recorded streams.

                                • 13. Re: FMS causes memory to fill and crash linux server
                                  Vilius Šumskas Community Member

                                  To anyone struggling with a similar issue: make sure you have the licence serial loaded AND the server fully restarted. At least on Linux, restarting the server through the Flash Media Administrator Console leaves some zombie processes running, which blocks loading the licence correctly. Restart the whole OS instead.
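One quick way to check for leftover processes after a Console-initiated restart is to scan /proc for command names. A sketch only: fmsmaster/fmscore/fmsedge/fmsadmin are assumed to be the process names on your install, and this is Linux-only.

```python
import os
import re

# Assumed FMS process names; adjust the pattern for your install.
FMS_NAMES = re.compile(r"fms(master|core|edge|admin)")

def stale_fms_processes():
    """Scan /proc for surviving FMS processes (Linux-only sketch;
    returns an empty list on systems without /proc)."""
    found = []
    if not os.path.isdir("/proc"):
        return found
    for pid in filter(str.isdigit, os.listdir("/proc")):
        try:
            with open(f"/proc/{pid}/comm") as f:
                comm = f.read().strip()
        except OSError:
            continue  # process exited while we were scanning
        if FMS_NAMES.search(comm):
            found.append((int(pid), comm))
    return found

print(stale_fms_processes() or "no stale FMS processes")
```

If anything shows up after a Console restart, that matches the zombie-process behaviour described above, and a full OS reboot is the workaround.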

                                   

                                  ShivenK, you can easily reproduce this by:

                                  1. Install FMS 4.5.7 on Linux. Don't forget to specify an Interactive serial licence.

                                  2. Test hls-vod and hls-live. It should work without issue.

                                  3. Restart the server through Flash Media Administrator Console.

                                  4. Test hls-vod and hls-live again. This time VOD will be limited to 30 minutes, and Live will crash the server after 30 minutes by using all the RAM, even though the serial is visible in the Administrator Console.

                                   

                                  Reproduced multiple times here.