I'm trying to do the following: receive a video stream using gstreamer and process it with opencv. I've found a few solutions, and one of them is to write the video from gstreamer into a fifo and then read it with opencv.
So, after searching for a while, I found a solution which involves retrieving the data directly from the buffer. The idea is to create a playbin and set an appsink as its "video-sink". Here is a code sample:
    cout << "Creating appsink" << endl;
    appsink = gst_element_factory_make("appsink", "asink");
    gst_app_sink_set_emit_signals((GstAppSink*)appsink, true);  // needed for the "new-buffer" signal
    gst_app_sink_set_drop((GstAppSink*)appsink, true);
    gst_app_sink_set_max_buffers((GstAppSink*)appsink, 1);      // keep only the latest buffer

    // pipeline is the playbin created earlier; mark carries the frame size (frame_w/frame_h)
    g_object_set(G_OBJECT(pipeline), "video-sink", appsink, NULL);
    g_signal_connect(appsink, "new-buffer", G_CALLBACK(Core::GetFrame), (gpointer) mark);

    // callback: wrap the raw buffer data in an IplImage and process it
    gboolean Core::GetFrame(GstAppSink *fks, gpointer mark)
    {
        static bool init = false;
        static IplImage *frame;
        GstBuffer *buf;
        Mark *mk = (Mark*) mark;

        if (!init)
        {
            init = true;
            frame = cvCreateImage(cvSize(mk->frame_w, mk->frame_h), IPL_DEPTH_8U, 1);
        }
        buf = gst_app_sink_pull_buffer(fks);
        frame->imageData = (char*) GST_BUFFER_DATA(buf);
        ProcessFrame(frame);
        gst_buffer_unref(buf);
        return true;
    }
This works. P.S. There is info about this method out there, but I spent a lot of time searching for it, so I decided to post it here to provide at least some keywords to search for.
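In case it helps, here is a minimal, self-contained sketch of how these pieces can be wired together. It is my own condensed version, not the code above: it assumes GStreamer 0.10 (playbin2), uses a placeholder file URI, and pulls buffers in a blocking loop instead of connecting to the "new-buffer" signal:

    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        // playbin2 decodes the stream; appsink hands the decoded buffers to us
        GstElement *pipeline = gst_element_factory_make("playbin2", "player");
        GstElement *appsink  = gst_element_factory_make("appsink", "asink");
        gst_app_sink_set_drop(GST_APP_SINK(appsink), TRUE);
        gst_app_sink_set_max_buffers(GST_APP_SINK(appsink), 1);

        // the URI is only a placeholder for whatever stream you actually receive
        g_object_set(G_OBJECT(pipeline),
                     "uri", "file:///path/to/video.avi",
                     "video-sink", appsink,
                     NULL);

        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        // blocking pull loop: gst_app_sink_pull_buffer() returns NULL on EOS
        while (GstBuffer *buf = gst_app_sink_pull_buffer(GST_APP_SINK(appsink)))
        {
            g_print("got buffer of %u bytes\n", GST_BUFFER_SIZE(buf));
            // ...hand GST_BUFFER_DATA(buf) to OpenCV here...
            gst_buffer_unref(buf);
        }

        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }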
UPDATE. And a bit more information about connecting gstreamer and opencv, this time about converting the buffer into an IplImage. First of all, we need to receive an RGB buffer, to make the conversion as easy as possible. To do this we'll replace the bare appsink with an appsink connected to ffmpegcolorspace (with a capsfilter requesting video/x-raw-rgb in between):
    cout << "Creating appsink" << endl;
    appsink = gst_element_factory_make("appsink", "asink");
    gst_app_sink_set_emit_signals((GstAppSink*)appsink, true);
    gst_app_sink_set_drop((GstAppSink*)appsink, true);
    gst_app_sink_set_max_buffers((GstAppSink*)appsink, 1);

    // here: create ffmpegcolorspace and a capsfilter with caps "video/x-raw-rgb",
    // put them in a bin together with the appsink, link them, add a ghost sink pad
    // to the bin and set the whole bin as the playbin's "video-sink"

    // caps_struct holds the negotiated caps of the RGB frames
    // (one way to obtain it is sketched below)
    gst_structure_get_int(caps_struct, "width", &(mark->frame_w));
    gst_structure_get_int(caps_struct, "height", &(mark->frame_h));
    gst_structure_get_int(caps_struct, "depth", &depth);

    mark->GeneratePoints();
    // for video/x-raw-rgb the depth is 24, so depth/3 gives 8 bits per channel, 3 channels
    frame = cvCreateImage(cvSize(mark->frame_w, mark->frame_h), depth/3, 3);
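The code above does not show where caps_struct comes from. One way to get it (a sketch under GStreamer 0.10, assuming the appsink variable from above is in scope and the pipeline has prerolled) is to pull the preroll buffer and read the caps attached to it:

    // pull the preroll buffer once to learn the negotiated caps (GStreamer 0.10)
    GstBuffer *prebuf = gst_app_sink_pull_preroll((GstAppSink*)appsink);
    GstCaps *caps = GST_BUFFER_CAPS(prebuf);                     // caps describing the RGB frames
    GstStructure *caps_struct = gst_caps_get_structure(caps, 0);
    // ...read width/height/depth from caps_struct here, as above...
    gst_buffer_unref(prebuf);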
    //callback function: the user data is the pre-allocated frame from above
    gboolean Core::GetFrame(GstAppSink *fks, gpointer frame)
    {
        IplImage *frame_temp = (IplImage*) frame;
        IplImage *frame_temp_two = cvCloneImage(frame_temp);   // same-sized image used as a view onto the raw RGB data

        GstBuffer *buf;
        buf = gst_app_sink_pull_buffer(fks);
        frame_temp_two->imageData = (char*) GST_BUFFER_DATA(buf);      // point at the buffer's RGB pixels
        cvConvertImage(frame_temp_two, frame_temp, CV_CVTIMG_SWAP_RB); // copy into frame_temp, swapping RGB -> BGR
        ProcessFrame(frame_temp);
        gst_buffer_unref(buf);
        return true;
    }
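A possible refinement, not part of the code above: cvCloneImage allocates pixel data that is never used (imageData is immediately repointed at the GstBuffer) and the clone is never released, so the callback leaks one image per frame. Here is a sketch of the same conversion that only wraps the buffer in an image header, assuming the usual 4-byte row padding of GStreamer 0.10 RGB buffers:

    // variant of the callback above: wrap the buffer instead of cloning an image
    gboolean Core::GetFrame(GstAppSink *fks, gpointer frame)
    {
        IplImage *frame_temp = (IplImage*) frame;
        GstBuffer *buf = gst_app_sink_pull_buffer(fks);

        // header only: no pixel allocation, imageData points into the GstBuffer
        IplImage *rgb = cvCreateImageHeader(cvGetSize(frame_temp), IPL_DEPTH_8U, 3);
        cvSetData(rgb, GST_BUFFER_DATA(buf), GST_ROUND_UP_4(frame_temp->width * 3));

        cvConvertImage(rgb, frame_temp, CV_CVTIMG_SWAP_RB);   // copy + swap RGB -> BGR
        ProcessFrame(frame_temp);

        cvReleaseImageHeader(&rgb);   // frees only the header, not the GstBuffer
        gst_buffer_unref(buf);
        return true;
    }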
I hope this will help somebody.