The goal is to stream video over the network from a Raspberry Pi (raspivid / H.264) into an OpenCV application running on a laptop, i.e. to get raspivid's H.264 output into OpenCV via netcat.

The OpenCV capture is set up as follows (C++):

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap;
    cap.open("cam_1"); // cam_1 is a FIFO

    cv::Mat frame;

    while (true) {
        cap >> frame;            // blocks until a frame is available
        if (frame.empty()) break; // stop if the stream ends
        cv::imshow("", frame);
        cv::waitKey(10);
    }
    return 0;
}

The FIFO is created as follows:

mkfifo cam_1 

Once the OpenCV program is running, the netcat listener is started:

ncat --recv-only --keep-open --verbose --listen 5001 > cam_1 

Once the netcat listener is running on the laptop, the stream is started from the Raspberry Pi:

raspivid --verbose --nopreview -b 2000000 --timeout 0 -o - | ncat 192.168.LAPTOP.IP 5001 

Alternatively, for debugging purposes, a local file on the laptop can be streamed into netcat:

cat video.h264 | nc 192.168.LAPTOP.IP 5001 

Both approaches produce the following error:

Unable to stop the stream: Inappropriate ioctl for device
(ERROR)icvOpenAVI_XINE(): Unable to initialize video driver.

Interestingly, if I start the netcat listener on the laptop, kill it with Ctrl+C, and then start it again before starting the video stream, the video displays correctly with either streaming method.

I don't understand why starting the netcat listener, killing it, and then starting it again makes a difference, or what that difference actually is. I thought an EOF or BOF might need to be echoed into the FIFO before the video, but I'm not sure what the syntax for that would be.

I have tried every flavor of netcat.

Comment: https://stackoverflow.com/a/44972255/2836621

Answers

If you touch the FIFO after OpenCV tries to read from it, but before you start streaming, it will work.
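
A rough sketch of that ordering, driven from Python on the laptop (my assumptions: the OpenCV program above is already running and blocked on the FIFO cam_1, and touch and ncat are on the PATH; the ncat flags are copied from the question):

# Sketch of the suggested ordering, not a tested recipe.
import subprocess

FIFO = "cam_1"  # the FIFO created with `mkfifo cam_1` in the question

# 1. The OpenCV program is assumed to be running already and waiting on the FIFO.
# 2. "Touch" the FIFO before any data flows into it, as this answer suggests.
subprocess.run(["touch", FIFO], check=True)

# 3. Start the netcat listener that feeds the FIFO; raspivid on the Pi can
#    then connect to port 5001 and stream, exactly as in the question.
with open(FIFO, "wb") as fifo:
    subprocess.run(
        ["ncat", "--recv-only", "--keep-open", "--verbose", "--listen", "5001"],
        stdout=fifo,
        check=True,
    )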

I just solved this using the following: https://stackoverflow.com/a/48675107/2355051

I ended up adapting this picamera Python recipe.

On the Raspberry Pi (createStream.py):

import io
import socket
import struct
import time
import picamera

# Connect a client socket to the server (here 10.0.0.3:777 -- change the
# address and port to match the machine running processStream.py)
client_socket = socket.socket()
client_socket.connect(('10.0.0.3', 777))

# Make a file-like object out of the connection
connection = client_socket.makefile('wb')
try:
    with picamera.PiCamera() as camera:
        camera.resolution = (1024, 768)
        # Start a preview and let the camera warm up for 2 seconds
        camera.start_preview()
        time.sleep(2)

        # Note the start time and construct a stream to hold image data
        # temporarily (we could write it directly to connection but in this
        # case we want to find out the size of each capture first to keep
        # our protocol simple)
        start = time.time()
        stream = io.BytesIO()
        for foo in camera.capture_continuous(stream, 'jpeg', use_video_port=True):
            # Write the length of the capture to the stream and flush to
            # ensure it actually gets sent
            connection.write(struct.pack('<L', stream.tell()))
            connection.flush()

            # Rewind the stream and send the image data over the wire
            stream.seek(0)
            connection.write(stream.read())

            # Reset the stream for the next capture
            stream.seek(0)
            stream.truncate()
    # Write a length of zero to the stream to signal we're done
    connection.write(struct.pack('<L', 0))
finally:
    connection.close()
    client_socket.close()
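
Both createStream.py above and processStream.py below share the same minimal framing protocol: each JPEG is preceded by its size packed as a 4-byte little-endian unsigned integer ('<L'), and a size of zero marks the end of the stream. A standalone illustration of that framing (the payload bytes below are a placeholder, not a real image):

# Illustration of the length-prefix framing used by both scripts.
import struct

payload = b"\xff\xd8 fake JPEG bytes \xff\xd9"   # placeholder payload, not a real JPEG

# Sender side: 4-byte little-endian length, then the payload itself
frame = struct.pack('<L', len(payload)) + payload

# Receiver side: read the length first, then exactly that many bytes
(length,) = struct.unpack('<L', frame[:4])
assert frame[4:4 + length] == payload

# A zero length tells the receiver to stop
end_marker = struct.pack('<L', 0)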

On the machine that is processing the stream (processStream.py):

import io
import socket
import struct
import cv2
import numpy as np

# Start a socket listening for connections on 0.0.0.0:777 (0.0.0.0 means
# all interfaces)
server_socket = socket.socket()
server_socket.bind(('0.0.0.0', 777))
server_socket.listen(0)

# Accept a single connection and make a file-like object out of it
connection = server_socket.accept()[0].makefile('rb')
try:
    while True:
        # Read the length of the image as a 32-bit unsigned int. If the
        # length is zero, quit the loop
        image_len = struct.unpack('<L', connection.read(struct.calcsize('<L')))[0]
        if not image_len:
            break
        # Construct a stream to hold the image data and read the image
        # data from the connection
        image_stream = io.BytesIO()
        image_stream.write(connection.read(image_len))
        # Rewind the stream and decode the JPEG bytes with OpenCV
        image_stream.seek(0)
        data = np.frombuffer(image_stream.getvalue(), dtype=np.uint8)
        imagedisp = cv2.imdecode(data, 1)

        cv2.imshow("Frame", imagedisp)
        cv2.waitKey(1)  # imshow will not display anything without waitKey
finally:
    cv2.destroyAllWindows()  # clean up the display window
    connection.close()
    server_socket.close()
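
As a side note, to exercise processStream.py without a camera or a Pi, a throwaway sender along these lines can replay a single local JPEG over the same protocol (test.jpg and 10.0.0.3:777 are placeholders for a local image and the listener's address):

# Camera-free test sender: repeats one JPEG over the length-prefixed protocol.
import socket
import struct
import time

with open("test.jpg", "rb") as f:   # any local JPEG will do
    jpeg = f.read()

client_socket = socket.socket()
client_socket.connect(('10.0.0.3', 777))
connection = client_socket.makefile('wb')
try:
    for _ in range(100):                                 # send the same frame 100 times
        connection.write(struct.pack('<L', len(jpeg)))   # 4-byte length prefix
        connection.write(jpeg)                           # JPEG payload
        connection.flush()
        time.sleep(0.1)                                  # roughly 10 frames per second
    connection.write(struct.pack('<L', 0))               # zero length = end of stream
    connection.flush()
finally:
    connection.close()
    client_socket.close()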

This solution gives results similar to the video referenced in my original question. Larger frame resolutions add latency to the feed, but that is tolerable for my application.

You need to run processStream.py first, and then execute createStream.py on the Raspberry Pi. If that does not work, run the following script: sudo