How to decode H.264 video frame in Java environment

迷失自我 2020-12-23 22:21

Does anyone know how to decode an H.264 video frame in a Java environment?

My network camera products support the RTP/RTSP Streaming.

The service uses standard RTP/RTSP.

5 Answers
  •  [愿得一人]
    2020-12-23 22:46

    You can use a pure Java library called JCodec ( http://jcodec.org ).
    Decoding one H.264 frame is as easy as:

    ByteBuffer bb = ... // Your frame data is stored in this buffer
    H264Decoder decoder = new H264Decoder();
    Picture out = Picture.create(1920, 1088, ColorSpace.YUV_420); // Allocate output frame of max size (1088 = 1080 rounded up to a multiple of 16)
    Picture real = decoder.decodeFrame(bb, out.getData());
    BufferedImage bi = JCodecUtil.toBufferedImage(real); // If you prefer an AWT image
    

    If you want to read a frame from a container (like MP4) you can use the handy helper class FrameGrab:

    int frameNumber = 150;
    BufferedImage frame = FrameGrab.getFrame(new File("filename.mp4"), frameNumber);
    ImageIO.write(frame, "png", new File("frame_150.png"));
    

    Finally, here's a more complete sample:

    // Relies on static imports of JCodec helpers (NIOUtils.readableFileChannel/writableFileChannel,
    // H264Utils.splitMOVPacket) and String.format; "out" is a format string for the output file names.
    private static void avc2png(String in, String out) throws IOException {
        SeekableByteChannel sink = null;
        SeekableByteChannel source = null;
        try {
            source = readableFileChannel(in);
            sink = writableFileChannel(out);
    
            MP4Demuxer demux = new MP4Demuxer(source);
    
            H264Decoder decoder = new H264Decoder();
    
            Transform transform = new Yuv420pToRgb(0, 0);
    
            MP4DemuxerTrack inTrack = demux.getVideoTrack();
    
            VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
            Picture target1 = Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf,
                    ColorSpace.YUV420);
            Picture rgb = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.RGB);
            ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
            BufferedImage bi = new BufferedImage(ine.getWidth(), ine.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
            AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));
    
            decoder.addSps(avcC.getSpsList());
            decoder.addPps(avcC.getPpsList());
    
            Packet inFrame;
            int totalFrames = (int) inTrack.getFrameCount();
            for (int i = 0; (inFrame = inTrack.getFrames(1)) != null; i++) {
                ByteBuffer data = inFrame.getData();
    
                Picture dec = decoder.decodeFrame(splitMOVPacket(data, avcC), target1.getData());
                transform.transform(dec, rgb);
                _out.clear();
    
                AWTUtil.toBufferedImage(rgb, bi);
                ImageIO.write(bi, "png", new File(format(out, i)));
                if (i % 100 == 0)
                    System.out.println((i * 100 / totalFrames) + "%");
            }
        } finally {
            if (sink != null)
                sink.close();
            if (source != null)
                source.close();
        }
    }
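The sample above allocates its decode target with `(width + 15) & ~0xf`, rounding each dimension up to a multiple of 16 because H.264 codes pictures in 16×16 macroblocks (which is also why the first snippet decodes 1080p into a 1920×1088 buffer). A minimal, dependency-free sketch of that rounding:

```java
public class MacroblockAlign {
    // Round a pixel dimension up to the next multiple of 16 (the H.264
    // macroblock size), mirroring the (x + 15) & ~0xf expression in the sample.
    static int alignTo16(int x) {
        return (x + 15) & ~0xf;
    }

    public static void main(String[] args) {
        System.out.println(alignTo16(1920)); // already aligned: 1920
        System.out.println(alignTo16(1080)); // rounds up to 1088
        System.out.println(alignTo16(720));  // already aligned: 720
    }
}
```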
    
