ÃֽŠ°Ô½Ã±Û(JAVA)
2017.09.25 / 10:03

A library for automatically generating screenshot/thumbnail files from uploaded MP4 videos


jcodec - a pure java implementation of video/audio codecs.


https://github.com/jcodec/jcodec


About

JCodec is a library implementing a set of popular video and audio codecs. Currently JCodec supports:

  • Video codecs

    • H.264 main profile decoder;
    • H.264 baseline profile encoder;
    • VP8 decoder (I frames only);
    • VP8 encoder (I frames only);
    • MPEG 1/2 decoder ( I/P/B frames, interlace );
    • Apple ProRes decoder/encoder;
    • JPEG decoder;
    • PNG decoder/encoder.
  • Audio codecs

    • SMPTE 302M decoder;
    • AAC decoder (JAAD)
    • RAW PCM.
  • Formats:

    • MP4 ( MOV ) demuxer / muxer;
    • MKV ( Matroska ) demuxer / muxer;
    • MPEG PS demuxer;
    • MPEG TS demuxer;
    • WAV demuxer/muxer;
    • MPEG Audio (MP3) demuxer;
    • ADTS demuxer.

JCodec is free software distributed under the FreeBSD License.

Future development

These are just some of the things the JCodec dev team is planning to work on:

  • Video
    • Improve H.264 encoder: CABAC, rate control;
    • Performance optimize H.264 decoder.
  • Audio
    • MP3 decoder;
    • AAC encoder.

Getting started

JCodec can be used in both standard Java and Android. It contains only platform-agnostic Java classes. To use the latest version of JCodec, add the Maven dependency as below:

<dependency>
    <groupId>org.jcodec</groupId>
    <artifactId>jcodec</artifactId>
    <version>0.2.1</version>
</dependency>

Or add the Gradle dependency as below:

dependencies {
    ...
    compile 'org.jcodec:jcodec:0.2.1'
    ...
}

Additionally, if you want to use JCodec with AWT images (BufferedImage), add these Maven dependencies:

<dependency>
    <groupId>org.jcodec</groupId>
    <artifactId>jcodec</artifactId>
    <version>0.2.1</version>
</dependency>
<dependency>
    <groupId>org.jcodec</groupId>
    <artifactId>jcodec-javase</artifactId>
    <version>0.2.1</version>
</dependency>

Or, if you want to use JCodec with Android images (Bitmap), add these Gradle dependencies:

android {
    configurations.all {
        resolutionStrategy.force 'com.google.code.findbugs:jsr305:3.0.2'
    }
}
dependencies {
    ...
    compile 'org.jcodec:jcodec:0.2.1'
    compile 'org.jcodec:jcodec-android:0.2.1'
    ...
}

For the latest and greatest (the 0.2.2-SNAPSHOT), clone this Git project and build it locally like so:

git clone https://github.com/jcodec/jcodec.git
cd jcodec
mvn clean install
(cd javase; mvn clean install)
(cd android; mvn clean install)

If you just need the JARs, download them from here:

There is virtually no documentation right now, but the plan is to catch up on this, so stay tuned. stackoverflow.com contains quite a bit of information at this point.

Also check the 'samples' subfolder. It's a Maven project that contains code samples for the most popular use cases.

Performance / quality considerations

Because JCodec is a pure Java implementation, please adjust your performance expectations accordingly. We make our best effort to write efficient code, but despite this, decoding will typically be an order of magnitude slower than native implementations (such as FFmpeg). We are currently looking into implementing performance-critical parts in OpenCL (or RenderScript for Android), but the ETA is unknown.

Expect the encoded quality/bitrate for H.264 (AVC) to be considerably worse than that of well-known native encoders (such as x264). This is because very little work has gone into the encoder so far, and because encoders usually trade speed for quality; speed is something we don't have to spare in Java, so quality suffers. Again, we may potentially fix that in the future by introducing OpenCL (RenderScript) code, but at this point it's an unknown.

That said, the decode quality should be at industry level. This is because the decoding process is usually specified by the standard, and correct decoder implementations are expected to produce bit-exact output.

Sample code

Getting a single frame from a movie (supports only AVC/H.264 in MP4, ISO BMFF and QuickTime containers):

int frameNumber = 42;
Picture picture = FrameGrab.getFrameFromFile(new File("video.mp4"), frameNumber);

//for JDK (jcodec-javase)
BufferedImage bufferedImage = AWTUtil.toBufferedImage(picture);
ImageIO.write(bufferedImage, "png", new File("frame42.png"));

//for Android (jcodec-android)
Bitmap bitmap = AndroidUtil.toBitmap(picture);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, new FileOutputStream("frame42.png")); 
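
Since this post is about generating thumbnails for uploaded MP4 videos, here is a minimal sketch of the JDK (jcodec-javase) path that scales the grabbed frame down to a fixed-width thumbnail using plain java.awt. It builds on the FrameGrab call from the sample above; the frame number, the 320-pixel target width and the file names are example values, and the org.jcodec import locations assume JCodec 0.2.x and may need adjusting for your version.

import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.jcodec.api.FrameGrab;
import org.jcodec.common.model.Picture;
import org.jcodec.scale.AWTUtil;

// Grab one frame from the uploaded MP4 and convert it to a BufferedImage
Picture picture = FrameGrab.getFrameFromFile(new File("video.mp4"), 42);
BufferedImage frame = AWTUtil.toBufferedImage(picture);

// Scale down to a 320-pixel-wide thumbnail, preserving the aspect ratio
int thumbWidth = 320;
int thumbHeight = frame.getHeight() * thumbWidth / frame.getWidth();
BufferedImage thumb = new BufferedImage(thumbWidth, thumbHeight, BufferedImage.TYPE_INT_RGB);
Graphics2D g = thumb.createGraphics();
g.drawImage(frame.getScaledInstance(thumbWidth, thumbHeight, Image.SCALE_SMOOTH), 0, 0, null);
g.dispose();

ImageIO.write(thumb, "png", new File("thumbnail.png"));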

Get all frames from a video file

File file = new File("video.mp4");
FrameGrab grab = FrameGrab.createFrameGrab(NIOUtils.readableChannel(file));
Picture picture;
while (null != (picture = grab.getNativeFrame())) {
    System.out.println(picture.getWidth() + "x" + picture.getHeight() + " " + picture.getColor());
}

Getting a sequence of frames from a movie (supports only AVC/H.264 in MP4, ISO BMFF and QuickTime containers):

double startSec = 51.632;
int frameCount = 10;
File file = new File("video.mp4");

FrameGrab grab = FrameGrab.createFrameGrab(NIOUtils.readableChannel(file));
grab.seekToSecondPrecise(startSec);

for (int i=0;i<frameCount;i++) {
    Picture picture = grab.getNativeFrame();
    System.out.println(picture.getWidth() + "x" + picture.getHeight() + " " + picture.getColor());
    //for JDK (jcodec-javase)
    BufferedImage bufferedImage = AWTUtil.toBufferedImage(picture);
    ImageIO.write(bufferedImage, "png", new File("frame"+i+".png"));

    //for Android (jcodec-android)
    Bitmap bitmap = AndroidUtil.toBitmap(picture);
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, new FileOutputStream("frame"+i+".png")); 
}

Making a video with a set of images from memory:

SeekableByteChannel out = null;
try {
    out = NIOUtils.writableFileChannel("/tmp/output.mp4");
    // for Android use: AndroidSequenceEncoder
    AWTSequenceEncoder encoder = new AWTSequenceEncoder(out, Rational.R(25, 1));
    for (...) {
        // Generate the image, for Android use Bitmap
        BufferedImage image = ...;
        // Encode the image
        encoder.encodeImage(image);
    }
    // Finalize the encoding, i.e. clear the buffers, write the header, etc.
    encoder.finish();
} finally {
    NIOUtils.closeQuietly(out);
}
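
For reference, here is a concrete, runnable variant of the sketch above, assuming the source frames already exist on disk as numbered PNG files (frame0.png, frame1.png, ...). The file names and the 25 fps frame rate are just example values, and the org.jcodec import locations assume JCodec 0.2.x.

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
import org.jcodec.api.awt.AWTSequenceEncoder;
import org.jcodec.common.io.NIOUtils;
import org.jcodec.common.io.SeekableByteChannel;
import org.jcodec.common.model.Rational;

SeekableByteChannel out = null;
try {
    out = NIOUtils.writableFileChannel("output.mp4");
    AWTSequenceEncoder encoder = new AWTSequenceEncoder(out, Rational.R(25, 1));
    // Feed numbered PNG frames until the next one is missing
    for (int i = 0; new File("frame" + i + ".png").exists(); i++) {
        BufferedImage image = ImageIO.read(new File("frame" + i + ".png"));
        encoder.encodeImage(image);
    }
    // Finalize the encoding: flush buffers and write the MP4 header
    encoder.finish();
} finally {
    NIOUtils.closeQuietly(out);
}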

Contact

Feel free to communicate any questions or concerns to us. Dev team email: jcodecproject@gmail.com






Java Code Examples for org.jcodec.containers.mp4.demuxer.MP4Demuxer

The following are top-voted examples showing how to use org.jcodec.containers.mp4.demuxer.MP4Demuxer. They are extracted from open-source projects.

Example 1
Project: OpenSpaceDVR, File: JCodecUtil.java
public static Format detectFormat(ByteBuffer b) {
    int movScore = MP4Demuxer.probe(b.duplicate());
    int psScore = MPSDemuxer.probe(b.duplicate());
    int tsScore = MTSDemuxer.probe(b.duplicate());

    if (movScore == 0 && psScore == 0 && tsScore == 0)
        return null;

    return movScore > psScore ? (movScore > tsScore ? Format.MOV : Format.MPEG_TS)
            : (psScore > tsScore ? Format.MPEG_PS : Format.MPEG_TS);
}
 

Example 2
Project: OpenSpaceDVR, File: FrameGrab.java
public FrameGrab(SeekableByteChannel in) throws IOException, JCodecException {
    ByteBuffer header = ByteBuffer.allocate(65536);
    in.read(header);
    header.flip();
    Format detectFormat = JCodecUtil.detectFormat(header);

    switch (detectFormat) {
    case MOV:
        MP4Demuxer d1 = new MP4Demuxer(in);
        videoTrack = d1.getVideoTrack();
        break;
    case MPEG_PS:
        throw new UnsupportedFormatException("MPEG PS is temporarily unsupported.");
    case MPEG_TS:
        throw new UnsupportedFormatException("MPEG TS is temporarily unsupported.");
    default:
        throw new UnsupportedFormatException("Container format is not supported by JCodec");
    }
    decodeLeadingFrames();
}
 
Example 3
Project: openolat, File: MovieServiceImpl.java
@Override
public Size getSize(VFSLeaf media, String suffix) {
	File file = null;
	if(media instanceof VFSCPNamedItem) {
		media = ((VFSCPNamedItem)media).getDelegate();
	}
	if(media instanceof LocalFileImpl) {
		file = ((LocalFileImpl)media).getBasefile();
	}
	if(file == null) {
		return null;
	}

	if(suffix.equals("mp4") || suffix.equals("m4v")) {
		try(RandomAccessFile accessFile = new RandomAccessFile(file, "r")) {
			FileChannel ch = accessFile.getChannel();
			FileChannelWrapper in = new FileChannelWrapper(ch);
			MP4Demuxer demuxer1 = new MP4Demuxer(in);
			org.jcodec.common.model.Size size = demuxer1.getMovie().getDisplaySize();
			int w = size.getWidth();
			int h = size.getHeight();
			return new Size(w, h, false);
		} catch (Exception | AssertionError e) {
			log.error("Cannot extract size of: " + media, e);
		}
	} else if(suffix.equals("flv")) {
		try(InputStream stream = new FileInputStream(file)) {
			FLVParser infos = new FLVParser();
			infos.parse(stream);
			if(infos.getWidth() > 0 && infos.getHeight() > 0) {
				int w = infos.getWidth();
				int h = infos.getHeight();
				return new Size(w, h, false);
			}
		} catch (Exception e) {
			log.error("Cannot extract size of: " + media, e);
		}
	}

	return null;
}
 

Example 4
Project: video-watermarking, File: ConverterThread.java
@Override
public void run() {
    File inputFile = new File(inputPath);
    File outputFile = new File(outputPath);

    try {
        MP4Demuxer demuxer = new MP4Demuxer(new AutoFileChannelWrapper(inputFile));
        MP4Muxer muxer = new MP4Muxer(NIOUtils.rwFileChannel(outputFile));


        // Copy all audio tracks from the source file unchanged
        List<AbstractMP4DemuxerTrack> audioTracks = demuxer.getAudioTracks();
        for (AbstractMP4DemuxerTrack audioTrack : audioTracks) {
            FramesMP4MuxerTrack muxerTrack = muxer.addTrack(TrackType.SOUND, (int) audioTrack.getTimescale());
            muxerTrack.addSampleEntry(audioTrack.getSampleEntries()[0]);
            for (int i = 0; i < audioTrack.getFrameCount(); i++) {
                muxerTrack.addFrame((MP4Packet) audioTrack.nextFrame());
            }
        }

        // Decode, process and re-encode the video track frame by frame
        FramesMP4DemuxerTrack videoTrack = (FramesMP4DemuxerTrack) demuxer.getVideoTrack();
        FramesMP4MuxerTrack muxerTrack = muxer.addTrack(TrackType.VIDEO, (int) videoTrack.getTimescale());

        AVCMP4Adaptor decoder = new AVCMP4Adaptor(videoTrack);
        H264Encoder encoder = new H264Encoder();

        Size size = videoTrack.getMeta().getDimensions();
        long frameCount = videoTrack.getFrameCount();
        List<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
        List<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();

        ByteBuffer outBuffer = ByteBuffer.allocate(size.getWidth() * size.getHeight());
        for (int i = 0; i < frameCount; i++) {
            updateProgress(i, frameCount);

            if (Thread.currentThread().isInterrupted()){
                return;
            }

            MP4Packet packet = videoTrack.nextFrame();
            Picture picture;
            try {
                picture = decoder.decodeFrame(packet, decoder.allocatePicture());
            } catch (Exception ex) {
                continue;
            }
            BufferedImage srcImage = ConverterUtils.toBufferedImage(picture);
            core.getBaseImageProvider().reset();
            core.getBaseImageProvider().setSource(srcImage);

            core.processImage();

            BufferedImage precessedImage = core.getCombinedImage();
            Picture modifiedPicture = ConverterUtils.fromBufferedImage(precessedImage, ColorSpace.YUV420J);

            outBuffer.clear();
            outBuffer = encoder.encodeFrame(modifiedPicture, outBuffer);

            spsList.clear();
            ppsList.clear();
            // Pull the SPS/PPS out of the encoded frame and convert the
            // Annex B NAL units into the length-prefixed MOV layout
            H264Utils.wipePS(outBuffer, spsList, ppsList);
            H264Utils.encodeMOVPacket(outBuffer);

            MP4Packet modifiedPacket = new MP4Packet(packet, outBuffer);
            muxerTrack.addFrame(modifiedPacket);
        }

        // Add the codec configuration (avcC sample entry) built from the collected SPS/PPS
        muxerTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        muxer.writeHeader();
    } catch (Exception e) {
        if (e instanceof ClosedByInterruptException) {
            core.setConvertThreadException(new Exception("Interrupted"));
            return;
        }
        core.setConvertThreadException(e);
    } finally {
      releaseButton();
    }
}
 
Example 5
Project: OpenSpaceDVR, File: Remux.java
public void remux(File tgt, File src, File timecode, Handler handler) throws IOException {
    SeekableByteChannel input = null;
    SeekableByteChannel output = null;
    SeekableByteChannel tci = null;
    try {
        input = readableFileChannel(src);
        output = writableFileChannel(tgt);
        MP4Demuxer demuxer = new MP4Demuxer(input);

        
        TimecodeMP4DemuxerTrack tt = null;
        if (timecode != null) {
            tci = readableFileChannel(src);
            MP4Demuxer tcd = new MP4Demuxer(tci);
            tt = tcd.getTimecodeTrack();
        }

        MP4Muxer muxer = WebOptimizedMP4Muxer.withOldHeader(output, Brand.MOV, demuxer.getMovie());

        List<AbstractMP4DemuxerTrack> at = demuxer.getAudioTracks();
        List<PCMMP4MuxerTrack> audioTracks = new ArrayList<PCMMP4MuxerTrack>();
        for (AbstractMP4DemuxerTrack demuxerTrack : at) {
            PCMMP4MuxerTrack att = muxer.addPCMAudioTrack(((AudioSampleEntry) demuxerTrack
                    .getSampleEntries()[0]).getFormat());
            audioTracks.add(att);
            att.setEdits(demuxerTrack.getEdits());
            att.setName(demuxerTrack.getName());
        }

        AbstractMP4DemuxerTrack vt = demuxer.getVideoTrack();
        FramesMP4MuxerTrack video = muxer.addTrack(VIDEO, (int) vt.getTimescale());
        // vt.open(input);
        video.setTimecode(muxer.addTimecodeTrack((int) vt.getTimescale()));
        copyEdits(vt, video, new Rational((int)vt.getTimescale(), demuxer.getMovie().getTimescale()));
        video.addSampleEntries(vt.getSampleEntries());
        MP4Packet pkt = null;
        while ((pkt = (MP4Packet)vt.nextFrame()) != null) {
            if (tt != null)
                pkt = tt.getTimecode(pkt);
            pkt = processFrame(pkt);
            video.addFrame(pkt);

            for (int i = 0; i < at.size(); i++) {
                AudioSampleEntry ase = (AudioSampleEntry) at.get(i).getSampleEntries()[0];
                int frames = (int) (ase.getSampleRate() * pkt.getDuration() / vt.getTimescale());
                MP4Packet apkt = (MP4Packet)at.get(i).nextFrame();
                audioTracks.get(i).addSamples(apkt.getData());
            }
        }

        MovieBox movie = muxer.finalizeHeader();
        if (handler != null)
            handler.handle(movie);
        muxer.storeHeader(movie);

    } finally {
        NIOUtils.closeQuietly(input);
        NIOUtils.closeQuietly(output);
        NIOUtils.closeQuietly(tci);
    }
}
 
Example 6
Project: OpenSpaceDVR, File: TestTool.java
private void doIt(String in) throws Exception {
    SeekableByteChannel raw = null;
    SeekableByteChannel source = null;
    try {
        source = new FileChannelWrapper(new FileInputStream(in).getChannel());

        MP4Demuxer demux = new MP4Demuxer(source);

        H264Decoder decoder = new H264Decoder();

        AbstractMP4DemuxerTrack inTrack = demux.getVideoTrack();

        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        AvcCBox avcC = Box.as(AvcCBox.class, Box.findFirst(ine, LeafBox.class, "avcC"));

        ByteBuffer _rawData = ByteBuffer.allocate(1920 * 1088 * 6);

        decoder.addSps(avcC.getSpsList());
        decoder.addPps(avcC.getPpsList());

        Packet inFrame;

        int sf = 2600;
        AbstractMP4DemuxerTrack dt = (AbstractMP4DemuxerTrack) inTrack;
        dt.gotoFrame(sf);
        while ((inFrame = inTrack.nextFrame()) != null && !inFrame.isKeyFrame())
            ;
        // System.out.println(inFrame.getFrameNo() + " - " +
        // inFrame.isKeyFrame());
        dt.gotoFrame(inFrame.getFrameNo());

        List<Picture> decodedPics = new ArrayList<Picture>();
        int totalFrames = (int) inTrack.getFrameCount(), seqNo = 0;
        for (int i = sf; (inFrame = inTrack.nextFrame()) != null; i++) {
            ByteBuffer data = inFrame.getData();
            List<ByteBuffer> nalUnits = splitMOVPacket(data, avcC);
            _rawData.clear();
            H264Utils.joinNALUnits(nalUnits, _rawData);
            _rawData.flip();

            if (H264Utils.idrSlice(_rawData)) {
                if (raw != null) {
                    raw.close();
                    runJMCompareResults(decodedPics, seqNo);
                    decodedPics = new ArrayList<Picture>();
                    seqNo = i;
                }
                raw = new FileChannelWrapper(new FileOutputStream(coded).getChannel());
                H264Utils.saveStreamParams(avcC, raw);
            }
            raw.write(_rawData);

            decodedPics.add(decoder.decodeFrame(nalUnits,
                    Picture.create((ine.getWidth() + 15) & ~0xf, (ine.getHeight() + 15) & ~0xf, ColorSpace.YUV420)
                            .getData()));
            if (i % 500 == 0)
                System.out.println((i * 100 / totalFrames) + "%");
        }
        if (decodedPics.size() > 0)
            runJMCompareResults(decodedPics, seqNo);
    } finally {
        if (source != null)
            source.close();
        if (raw != null)
            raw.close();
    }
}