Question
I want to stream video captured from the webcam in an ASP.NET Core application. I also need to do some manipulation of the frames, which is why I'm using OpenCVSharp.
Here is what I have so far:
- the HTML in my view — here I don't know what MIME type I should use:
<video id="video" preload="auto">
    <source src="LiveVideo" type="<< don't know the type >>" />
</video>
- my controller — here I also don't know the content type, and the main problem: I don't know how to stream the video captured by OpenCVSharp:
[ApiController]
[Route("[controller]")]
public class LiveVideoController : ControllerBase
{
    [HttpGet]
    public async Task<FileStreamResult> GetVideo()
    {
        // capture frames from the webcam
        // https://github.com/shimat/opencvsharp/wiki/Capturing-Video
        var capture = new VideoCapture(0);
        var stream = await << somehow get the stream >>;
        return new FileStreamResult(stream, << don't know the content type >>);
    }
}
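One conventional way to fill in those blanks (an assumption on my part, not from the original question) is to serve an MJPEG stream with the content type `multipart/x-mixed-replace`. Note that a `<video>` element will not play MJPEG; browsers render such a stream in an `<img src="/LiveVideo">` element instead. A minimal sketch, writing each JPEG-encoded frame straight to the response body:

```csharp
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using OpenCvSharp;

[ApiController]
[Route("[controller]")]
public class LiveVideoController : ControllerBase
{
    private const string Boundary = "frame";

    [HttpGet]
    public async Task GetVideo(CancellationToken cancellationToken)
    {
        // Each part of the multipart response replaces the previous one
        // in the browser, which is what produces the "video" effect.
        Response.ContentType = $"multipart/x-mixed-replace; boundary={Boundary}";

        using var capture = new VideoCapture(0);
        using var frame = new Mat();

        while (!cancellationToken.IsCancellationRequested)
        {
            if (!capture.Read(frame) || frame.Empty())
                break;

            // Encode the raw BGR frame as JPEG.
            Cv2.ImEncode(".jpg", frame, out byte[] jpeg);

            var header = $"--{Boundary}\r\nContent-Type: image/jpeg\r\n" +
                         $"Content-Length: {jpeg.Length}\r\n\r\n";
            await Response.Body.WriteAsync(Encoding.ASCII.GetBytes(header), cancellationToken);
            await Response.Body.WriteAsync(jpeg, cancellationToken);
            await Response.Body.WriteAsync(Encoding.ASCII.GetBytes("\r\n"), cancellationToken);
            await Response.Body.FlushAsync(cancellationToken);
        }
    }
}
```

The `CancellationToken` is bound to the request by ASP.NET Core, so the loop (and the camera) stops when the client disconnects. This is a sketch under those assumptions, not a tested implementation.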
Answer 1:
If anyone needs such exotic behavior in their app, I've found a way. It is possible with Blazor. I read each frame as a byte array, send it to the UI, and convert it to an image.
Here is my Blazor component, LiveVideo.razor:
@page "/live-video"
@using OpenCvSharp

<img src="@_imgSrc" />

@code {
    private string _imgSrc;

    protected override async Task OnInitializedAsync()
    {
        using (var capture = new VideoCapture(0))
        using (var frame = new Mat())
        {
            while (true)
            {
                capture.Read(frame);
                if (frame.Empty())
                    continue;

                // Mat.ToBytes() encodes the frame as PNG by default,
                // so the data URI must declare image/png (not gif).
                var base64 = Convert.ToBase64String(frame.ToBytes());
                _imgSrc = $"data:image/png;base64,{base64}";
                StateHasChanged();
                await Task.Delay(1); // yield so the UI can re-render
            }
        }
    }
}
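A possible refinement of the component above (my own assumption, not part of the original answer): the endless loop keeps the camera open even after the user navigates away, and PNG encoding is relatively expensive for live frames. A sketch that ties the loop to the component's lifetime and switches to JPEG:

```csharp
@page "/live-video"
@using OpenCvSharp
@implements IDisposable

<img src="@_imgSrc" />

@code {
    private string _imgSrc;
    private readonly CancellationTokenSource _cts = new();

    protected override async Task OnInitializedAsync()
    {
        using var capture = new VideoCapture(0);
        using var frame = new Mat();

        while (!_cts.IsCancellationRequested)
        {
            if (!capture.Read(frame) || frame.Empty())
                break;

            // JPEG is cheaper to encode than PNG for a live feed.
            var base64 = Convert.ToBase64String(frame.ToBytes(".jpg"));
            _imgSrc = $"data:image/jpeg;base64,{base64}";
            StateHasChanged();
            await Task.Delay(33); // roughly 30 fps
        }
    }

    // Blazor calls Dispose when the user leaves the page,
    // which stops the loop and releases the webcam.
    public void Dispose() => _cts.Cancel();
}
```

Bear in mind that this still runs the capture on the server, so every connected client would open its own camera handle; it is a sketch for a single-user scenario, not a production design.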
Source: https://stackoverflow.com/questions/57514490/asp-net-core-stream-video-from-the-opencvsharp-capture