webcam

display cv2.VideoCapture image inside Pygame surface

我的梦境 submitted on 2019-12-03 14:58:37
Question: I'm trying to use OpenCV (cv2) to stream a webcam feed into a pygame surface object. The problem is that the colors aren't displaying correctly. I think it is the type casting, but I'm having trouble understanding the pygame surface documentation well enough to know what it expects. This code demonstrates what I'm talking about:

import pygame
from pygame.locals import *
import cv2
import numpy

color=False#True#False
camera_index = 0
camera=cv2.VideoCapture(camera_index)
camera.set(3,640)
camera.set(4,480)

#This shows an image the way it should be
cv2.namedWindow("w1",cv2.CV_WINDOW_AUTOSIZE)
retval,frame=camera
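The usual culprit here is not type casting but channel order: OpenCV hands back frames as BGR, while pygame surfaces expect RGB, so red and blue end up swapped. A minimal sketch of the swap on a synthetic frame (numpy only, no camera needed; in a real loop you would use cv2.cvtColor(frame, cv2.COLOR_BGR2RGB) and then pygame.surfarray.make_surface(rgb.swapaxes(0, 1)), since surfarray is column-major):

```python
import numpy as np

# Simulate a 2x2 frame the way OpenCV would return it: pure red,
# stored in BGR order, so the red value sits in the LAST channel.
bgr_frame = np.zeros((2, 2, 3), dtype=np.uint8)
bgr_frame[:, :, 2] = 255

# Reverse the channel axis to get RGB (same effect as
# cv2.cvtColor(frame, cv2.COLOR_BGR2RGB), without needing cv2 here).
rgb_frame = bgr_frame[:, :, ::-1]

print(rgb_frame[0, 0].tolist())  # [255, 0, 0]: red now leads, as RGB expects
```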

reading a barcode with a webcam

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-03 14:55:15
Question: Hey, I'm trying to read an EAN-13 barcode from my webcam. I already wrote a class to do that work. I'm taking a picture with my webcam, trimming it to show ONLY the barcode, and reading the barcode with the code tables from Wikipedia. For some reason, the barcode gets trimmed, but the output is always "0-1-1-1-1-1-1-1-1-1-1-1-1". I wonder if I made a silly mistake or misunderstood something? I do not want to use ANY third-party programs! This is my code for now:

public class BarcodeDecoder {
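A constant "0-1-1-1-..." output usually means the scanline never crosses the light/dark edges the decoder expects, i.e. the trimming or thresholding is off rather than the Wikipedia code tables. Independent of the bar decoding, the last digit can always be validated with the standard EAN-13 checksum (weights alternate 1 and 3 over the first 12 digits). Since the Java class above is truncated, here is a self-contained Python sketch of just that check:

```python
def ean13_check_digit(digits12):
    """Check digit for the first 12 digits of an EAN-13 code.

    Even positions (0-based) are weighted 1, odd positions 3; the check
    digit brings the weighted sum up to the next multiple of 10.
    """
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits12))
    return (10 - total % 10) % 10

# 4006381333931 is a commonly cited valid EAN-13; its check digit is 1.
digits = [int(c) for c in "400638133393"]
print(ean13_check_digit(digits))  # 1
```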

Creating synchronized stereo videos using webcams

人盡茶涼 submitted on 2019-12-03 13:54:58
Question: I am using OpenCV to capture video streams from two USB webcams (Microsoft LifeCam Studio) in Ubuntu 14.04. I am using very simple VideoCapture code (source here) and am trying to at least view two videos that are synchronized with each other. I used Android stopwatch apps (UltraChron Stopwatch Lite and Stopwatch Timer) on my Samsung Galaxy S3 mini and realized that the viewed images are out of sync (they show different times on the stopwatch). The frames are in sync maybe 50% of the time. The frame
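OpenCV splits frame acquisition into VideoCapture.grab() and VideoCapture.retrieve() for exactly this situation: calling grab() on both cameras back to back, then retrieve() on each, keeps the two exposures much closer together than two full read() calls. To measure how well two streams line up afterwards, a small hypothetical helper (name and tolerance are illustrative) that pairs near-simultaneous frame timestamps might look like this:

```python
def pair_synced_frames(ts_a, ts_b, tol=0.02):
    """Pair frame indices from two cameras whose capture timestamps
    (in seconds) differ by at most `tol`."""
    if not ts_b:
        return []
    pairs, j = [], 0
    for i, t in enumerate(ts_a):
        # Walk j forward while the next timestamp in ts_b is closer to t.
        while j + 1 < len(ts_b) and abs(ts_b[j + 1] - t) < abs(ts_b[j] - t):
            j += 1
        if abs(ts_b[j] - t) <= tol:
            pairs.append((i, j))
    return pairs

print(pair_synced_frames([0.000, 0.033, 0.066], [0.001, 0.034, 0.100]))
# [(0, 0), (1, 1)] -> camera A's third frame has no partner within 20 ms
```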

Open Source Video Gesture Recognition Library in C# [closed]

末鹿安然 submitted on 2019-12-03 13:40:58
Question: I need an open-source video gesture recognition .NET library/API (in C#). We have a webcam... we move a hand and it causes some events... So I need a motion-detection library/API for navigation. Where can I find such a library?

Answer 1: I've heard that AForge is pretty awesome (check out the AForge.Vision.Motion
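AForge.Vision.Motion works essentially by differencing consecutive frames and thresholding the result. That core idea is library-independent; a tiny sketch in Python/numpy (not the AForge API itself, just the underlying technique):

```python
import numpy as np

def motion_ratio(prev, curr, thresh=25):
    """Fraction of pixels whose grayscale intensity changed by more
    than `thresh` between two frames; a simple motion score."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return float(np.count_nonzero(diff > thresh)) / diff.size

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[:2, :] = 200          # a "hand" enters the top half of the frame
print(motion_ratio(prev, curr))  # 0.5
```

A gesture library layers event dispatch (motion above a threshold in a region fires a handler) on top of exactly this kind of score.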

Capture video from several webcams with getUserMedia

我只是一个虾纸丫 submitted on 2019-12-03 12:27:22
Question: I would like to capture video from multiple webcams connected to my PC. It is easy enough to use one webcam, but how can I get video streams from multiple sources? Is it possible to select which camera to use for a given stream?

navigator.getUserMedia({ video: true }, function (oMedia) {
    var video = document.getElementById('tVideo1');
    video.src = window.URL.createObjectURL(oMedia);
});

Answer 1: I was intrigued by your question, so I started researching 'getusermedia multiple cameras'. After a

How to create virtual webcam in Windows 10?

99封情书 submitted on 2019-12-03 11:31:11
Question: I would like to take video from a webcam, render some text on the frames, do some motion tracking, and pass it on to a virtual webcam so it can be streamed easily. I found some answers on Stack Overflow suggesting that I should use DirectShow. According to the DirectShow documentation, the DirectShow SDK is part of the Windows SDK. So I installed the latest Windows SDK, but it seems that it doesn't include DirectShow, because there are no DirectShow samples under C:\Program Files (x86)\Microsoft SDKs\Windows. (The Stack Overflow answers are also pretty old, dated around 2010.) Can you

Webcam streaming from Mac using FFmpeg

被刻印的时光 ゝ submitted on 2019-12-03 10:03:11
Question: I want to stream my webcam from a Mac using FFmpeg. First I checked the supported devices using:

ffmpeg -f avfoundation -list_devices true -i ""

Output:

[AVFoundation input device @ 0x7fdf1bd03000] AVFoundation video devices:
[AVFoundation input device @ 0x7fdf1bd03000] [0] USB 2.0 Camera #2
[AVFoundation input device @ 0x7fdf1bd03000] [1] FaceTime HD Camera
[AVFoundation input device @ 0x7fdf1bd03000] [2] Capture screen 0
[AVFoundation input device @ 0x7fdf1bd03000] [3] Capture screen 1
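The -list_devices output is meant for humans, but if a script needs to pick a device index automatically, parsing it is straightforward. A small hypothetical parser, run over the exact listing shown above (Python for illustration):

```python
import re

SAMPLE = """\
[AVFoundation input device @ 0x7fdf1bd03000] AVFoundation video devices:
[AVFoundation input device @ 0x7fdf1bd03000] [0] USB 2.0 Camera #2
[AVFoundation input device @ 0x7fdf1bd03000] [1] FaceTime HD Camera
[AVFoundation input device @ 0x7fdf1bd03000] [2] Capture screen 0
[AVFoundation input device @ 0x7fdf1bd03000] [3] Capture screen 1"""

def parse_avfoundation_devices(output):
    """Map device index -> device name from ffmpeg's avfoundation listing."""
    devices = {}
    for line in output.splitlines():
        # Match the "] [N] Name" tail; the header line has no [N] and is skipped.
        m = re.search(r"\]\s+\[(\d+)\]\s+(.+)$", line)
        if m:
            devices[int(m.group(1))] = m.group(2).strip()
    return devices

print(parse_avfoundation_devices(SAMPLE)[1])  # FaceTime HD Camera
```

Note that ffmpeg writes this listing to stderr, not stdout, so a real invocation would capture stderr.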

How to capture still image from webcam on linux

↘锁芯ラ submitted on 2019-12-03 09:22:54
Question: I am trying to write a C++/Qt program for Linux where I take a still photo from a webcam, apply some transformations to it (cropping, resizing, etc.), and save it to a JPEG file. But I have encountered some problems. The main problem is that the standard UVC (USB video device class) Linux driver currently does not support direct still image capture: http://www.ideasonboard.org/uvc/ . So there are two possible ways to capture a still image. You can take one frame from the video stream
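Whichever capture path is used, the post-processing half of the task (cropping, resizing) is plain array arithmetic. A sketch in Python/numpy, with nearest-neighbour scaling standing in for Qt's QImage::scaled or OpenCV's resize:

```python
import numpy as np

def crop(img, x, y, w, h):
    """Cut the w-by-h region whose top-left corner is (x, y)."""
    return img[y:y + h, x:x + w]

def resize_nearest(img, new_h, new_w):
    """Nearest-neighbour resize: pick a source row/column for each
    destination pixel by integer scaling."""
    h, w = img.shape[:2]
    rows = np.arange(new_h) * h // new_h
    cols = np.arange(new_w) * w // new_w
    return img[rows][:, cols]

frame = np.arange(48, dtype=np.uint8).reshape(6, 8)  # stand-in for a captured frame
print(crop(frame, 2, 1, 4, 3).shape)      # (3, 4)
print(resize_nearest(frame, 3, 4).shape)  # (3, 4)
```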

Displaying WebCam video with Qt

百般思念 submitted on 2019-12-03 09:22:39
Question: I'm using Qt 4.5 (2009.03) on Linux GNOME (Ubuntu 9.04) and would like to display video captured by my webcam in a Phonon::VideoWidget of my Qt application. I have a first implementation using the v4l2 API where I do the YUV2-to-RGB conversion and fill a QImage myself. It works well but it is not very efficient. A colleague used GStreamer to do the same thing and it was much, much faster. Since then I found out about Phonon and would like to use it. Everything is configured and set up except
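For reference, the per-pixel conversion being done by hand here is the BT.601 YUV-to-RGB transform. A scalar sketch (which also illustrates why a pixel-by-pixel loop is slow compared to letting GStreamer or the video pipeline do the conversion):

```python
def yuv_to_rgb(y, u, v):
    """BT.601 full-range YUV -> RGB for a single pixel."""
    d = u - 128  # chroma components are stored offset by 128
    e = v - 128

    def clamp(x):
        return max(0, min(255, round(x)))

    r = clamp(y + 1.402 * e)
    g = clamp(y - 0.344136 * d - 0.714136 * e)
    b = clamp(y + 1.772 * d)
    return r, g, b

print(yuv_to_rgb(128, 128, 128))  # (128, 128, 128) -> neutral grey
```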

Command-line streaming webcam with audio from Ubuntu server in WebM format

拟墨画扇 submitted on 2019-12-03 08:39:47
Question: I am trying to stream video and audio from a webcam connected to my headless Ubuntu server (running Maverick 10.10). I want to stream in WebM format (VP8 video + Vorbis audio). Bandwidth is limited, so the stream must stay below 1 Mbps. I have tried using FFmpeg. I am able to record WebM video from the webcam with the following:

ffmpeg -s 640x360 \
  -f video4linux2 -i /dev/video0 -isync -vcodec libvpx -vb 768000 -r 10 -vsync 1 \
  -f alsa -ac 1 -i hw:1,0 -acodec libvorbis -ab 32000 -ar 11025 \
  -f webm /var/www/telemed/test.webm

However, despite experimenting with all manner of vsync and async
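The bitrate budget of that command can be sanity-checked directly: -vb 768000 plus -ab 32000 leaves headroom under the 1 Mbps cap (WebM container overhead is small but not zero, so a little slack matters):

```python
video_bps = 768_000   # from -vb 768000
audio_bps = 32_000    # from -ab 32000
total_bps = video_bps + audio_bps

print(total_bps, total_bps < 1_000_000)  # 800000 True -> 200 kbps of headroom
```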