2014-09-03

I'm setting up a simple ffmpeg command line on my laptop to stream from its webcam, but ffmpeg can't stream to a remote client. The command line reads (verbose output at the bottom):

host1> ffmpeg -v verbose \ 
       -f dshow \ 
       -i video="Camera":audio="Microphone" \ 
       -r 30 -g 0 -vcodec h264 -acodec libmp3lame \ 
       -tune zerolatency \ 
       -preset ultrafast \ 
       -f mpegts udp://12.34.56.78:12345 

First off, it works locally. That is, I can view the output with ffplay on the same host:

host1> ffplay -hide_banner -v udp://12.34.56.78:12345 

What does not work is doing the same thing from another machine on the same network. There, ffplay shows a nan progress:

host2> ffplay -hide_banner -v udp://12.34.56.78:12345 
    nan : 0.000 fd= 0 aq= 0KB vq= 0KB sq= 0B f=0/0 

I then used ncat to dump the raw content. However, there is no output at all:

host2> ncat -v -u 12.34.56.78 12345 
Ncat: Version 5.59BETA1 (http://nmap.org/ncat) 
Ncat: Connected to 12.34.56.78:12345. 
(...and nothing happens...) 

Note that since I used ncat across the wire between the two machines with the same port and protocol (UDP), I can rule out firewall issues. This works; the two hosts can chat with each other:

host1> ncat -l -u -p 12345 
host2> ncat -u 12.34.56.78 12345 

Any hints?

I'm using Windows XP with the 64-bit FFmpeg build installed from here. Below is the output of my ffmpeg command:

C:\ffmpeg\bin>ffmpeg -v verbose -f dshow -i video="Integrated Camera":audio="Microphone (Realtek High Definition Audio)" -r 30 -g 0 -vcodec h264 -acodec libmp3lame -tune zerolatency -preset ultrafast -f mpegts udp://12.34.56.78:12345 
ffmpeg version N-66012-g97b8809 Copyright (c) 2000-2014 the FFmpeg developers 
    built on Sep 1 2014 00:21:15 with gcc 4.8.3 (GCC) 
    configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-decklink --enable-zlib 
    libavutil  54. 7.100/54. 7.100 
    libavcodec  56. 1.100/56. 1.100 
    libavformat 56. 3.100/56. 3.100 
    libavdevice 56. 0.100/56. 0.100 
    libavfilter  5. 0.103/5. 0.103 
    libswscale  3. 0.100/3. 0.100 
    libswresample 1. 1.100/1. 1.100 
    libpostproc 53. 0.100/53. 0.100 
Guessed Channel Layout for Input Stream #0.1 : stereo 
Input #0, dshow, from 'video=Integrated Camera:audio=Microphone (Realtek High Definition Audio)': 
    Duration: N/A, start: 171840.657000, bitrate: N/A 
    Stream #0:0: Video: rawvideo, bgr24, 640x480, 30 fps, 30 tbr, 10000k tbn, 30 tbc 
    Stream #0:1: Audio: pcm_s16le, 44100 Hz, 2 channels, s16, 1411 kb/s 
Matched encoder 'libx264' for codec 'h264'. 
[graph 0 input from stream 0:0 @ 0000000000470aa0] w:640 h:480 pixfmt:bgr24 tb:1/10000000 fr:10000000/333333 sar:0/1 sws_param:flags=2 
[auto-inserted scaler 0 @ 0000000004326d00] w:iw h:ih flags:'0x4' interl:0 
[format @ 0000000004325a00] auto-inserting filter 'auto-inserted scaler 0' between the filter 'Parsed_null_0' and the filter 'format' 
[auto-inserted scaler 0 @ 0000000004326d00] w:640 h:480 fmt:bgr24 sar:0/1 -> w:640 h:480 fmt:yuv444p sar:0/1 flags:0x4 
No pixel format specified, yuv444p for H.264 encoding chosen. 
Use -pix_fmt yuv420p for compatibility with outdated media players. 
[graph 1 input from stream 0:1 @ 0000000000460c20] tb:1/44100 samplefmt:s16 samplerate:44100 chlayout:0x3 
[audio format for output stream 0:1 @ 00000000004601a0] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:1' 
[auto-inserted resampler 0 @ 00000000004604a0] ch:2 chl:stereo fmt:s16 r:44100Hz -> ch:2 chl:stereo fmt:s16p r:44100Hz 
[libx264 @ 000000000081bb20] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX 
[libx264 @ 000000000081bb20] profile High 4:4:4 Intra, level 3.0, 4:4:4 8-bit 
[mpegts @ 000000000081abe0] muxrate VBR, pcr every 3 pkts, sdt every 200, pat/pmt every 40 pkts 
Output #0, mpegts, to 'udp://12.34.56.78:12345': 
    Metadata: 
    encoder   : Lavf56.3.100 
    Stream #0:0: Video: h264 (libx264), yuv444p, 640x480, q=-1--1, 30 fps, 90k tbn, 30 tbc 
    Metadata: 
     encoder   : Lavc56.1.100 libx264 
    Stream #0:1: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p 
    Metadata: 
     encoder   : Lavc56.1.100 libmp3lame 
Stream mapping: 
    Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264)) 
    Stream #0:1 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame)) 
Press [q] to stop, [?] for help 
*** 1 dup! 
frame= 241 fps= 31 q=28.0 Lsize= 3439kB time=00:00:08.03 bitrate=3506.4kbits/s dup=1 drop=0 
video:3035kB audio:125kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 8.791966% 
Input file #0 (video=Integrated Camera:audio=Microphone (Realtek High Definition Audio)): 
    Input stream #0:0 (video): 240 packets read (221184000 bytes); 240 frames decoded; 
    Input stream #0:1 (audio): 16 packets read (1411200 bytes); 16 frames decoded (352800 samples); 
    Total: 256 packets (222595200 bytes) demuxed 
Output file #0 (udp://12.34.56.78:12345): 
    Output stream #0:0 (video): 241 frames encoded; 241 packets muxed (3108187 bytes); 
    Output stream #0:1 (audio): 306 frames encoded (352512 samples); 307 packets muxed (128313 bytes); 
    Total: 548 packets (3236500 bytes) muxed 
[libx264 @ 000000000081bb20] frame I:241 Avg QP:27.97 size: 12897 
[libx264 @ 000000000081bb20] mb I I16..4: 100.0% 0.0% 0.0% 
[libx264 @ 000000000081bb20] coded y,u,v intra: 26.3% 0.5% 0.0% 
[libx264 @ 000000000081bb20] i16 v,h,dc,p: 19% 28% 21% 31% 
[libx264 @ 000000000081bb20] kb/s:3095.29 
[dshow @ 0000000000467720] real-time buffer[Integrated Camera] too full (90% of size: 3041280)! frame dropped! 
Received signal 2: terminating. (I pressed CTRL-C) 

I tried to do the same thing, but I keep getting the following error: [dshow @ 00000000003ebb20] real-time buffer [screen capture recorder] too full (275% of size: 3041280)! frame dropped! Can you help me with this? – 2015-04-21 13:27:10

Answer


OK, I got it working. The problem was that my understanding of how FFmpeg and FFplay work was wrong. When we say:

host1> ffmpeg -i INPUT protocol://ip:port 

it does not mean that ffmpeg binds to and listens on ip:port. Rather, it tries to "post" its output to that endpoint.

Similarly,

host2> ffplay -i protocol://ip:port 

means that ffplay actually binds to ip and listens on port for incoming content.

So to get this working, ffmpeg should post to ip:port, where ip:port identifies the remote host and the port that ffplay is listening on, not the local machine's IP address, because ffmpeg is the client here, not the server.


See also multicast, which is what I assumed you were using initially – rogerdpack 2014-09-04 21:31:46


So in this case the 'ip' would be the IP of the host2 machine? If so, then why were you able to get the content on the same machine, given that the same machine has the IP 'localhost' while the client 'FFMPEG' streams data to '12.34.56.78:12345'? – 2015-04-20 11:51:24


@BilalAhmedYaseen Because when testing on the local machine, I specified localhost on both the ffmpeg and the ffplay side. – KFL 2015-08-05 04:29:45