hardware-acceleration

How to know whether the Android decoder from MediaCodec.createDecoderByType(type) is a hardware or a software decoder?

倾然丶 夕夏残阳落幕 submitted on 2019-12-01 04:53:30
Question: Is there a way to find out whether the decoder returned by MediaCodec.createDecoderByType(type) is a hardware decoder or a software decoder?

Answer 1: There is no real formal flag for indicating whether a codec is a hardware or a software codec. In practice, though, you can do this:

MediaCodec codec = MediaCodec.createDecoderByType(type);
if (codec.getName().startsWith("OMX.google.")) {
    // Is a software codec
}

(The MediaCodec.getName() method is available since API level 18. For lower API levels
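The name-prefix check above can be widened into a small helper. A minimal sketch, assuming the conventional software-codec naming schemes: "OMX.google." (legacy OMX software codecs, as in the answer) and "c2.android." (newer Codec2 software codecs); some vendor codecs also carry a ".sw" marker. The class name is hypothetical, and the prefixes are conventions, not guarantees — on API level 29+ the formal check is MediaCodecInfo.isSoftwareOnly().

```java
// Heuristic classification of an Android codec name as a software decoder.
// The prefix list is a naming convention, not a guarantee; prefer
// MediaCodecInfo.isSoftwareOnly() on API 29+.
public class CodecNameHeuristic {
    static boolean looksLikeSoftwareDecoder(String name) {
        String n = name.toLowerCase();
        return n.startsWith("omx.google.")   // legacy OMX software codecs
                || n.startsWith("c2.android.") // Codec2 software codecs
                || n.contains(".sw.")          // vendor software markers
                || n.endsWith(".sw");
    }

    public static void main(String[] args) {
        System.out.println(looksLikeSoftwareDecoder("OMX.google.h264.decoder"));    // true
        System.out.println(looksLikeSoftwareDecoder("OMX.qcom.video.decoder.avc")); // false
    }
}
```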

Android WebView Hardware Accelerated Keyboard Glitch

白昼怎懂夜的黑 submitted on 2019-11-30 20:47:50
Question: When a WebView is hardware accelerated, clicking on an input field causes the keyboard to appear, and the HTML is briefly redrawn shifted and duplicated: 1) When the soft keyboard is appearing, the WebView pans its content to the bottom-left, then back to its normal position, causing a brief visible duplication. 2) When changing keyboards (e.g. abc -> numbers), the contents are panned down by the keyboard height and then back to their normal position, causing a brief visible duplication. Tested on two Android 4.0 tablets; if hardware

Hardware accelerated FFmpeg on Android?

亡梦爱人 submitted on 2019-11-30 08:45:10
I compiled an older version of FFmpeg for Android (if I recall correctly it was 0.6.x). FFmpeg decodes a video frame and scales it, then I use OpenGL to draw it on the screen. As far as I can tell, the problem is the decoding and scaling: they're not hardware accelerated. My questions are: Is the latest version of FFmpeg hardware accelerated on ARM (Android) processors? Am I going about this the wrong way, i.e. is there a better way of doing this? Where "this" is playing an H.264 HD video as a live video wallpaper; the framerate needs to be high, so hardware acceleration is desirable whenever

Why is hardware acceleration not working on my View?

我是研究僧i submitted on 2019-11-30 05:08:35
I'm using Facebook's Rebound library to replicate the bouncy animations seen in their chat heads implementation. The problem is, most of the time the animation stutters. A few pictures will explain this better. Here's the buttery-smooth chat heads animation: And here's my attempt (notice how the animation for the white View skips nearly all frames): Once in a while it works smoothly: Below is the code I'm using currently (the entire project is up on Github if you want to set it up quickly). I'm guessing this has something to do with hardware acceleration not being enabled correctly on my View

Why aren't browsers smart enough to hardware accelerate without tricks?

只谈情不闲聊 submitted on 2019-11-30 02:01:58
There are tons of webpages these days recommending you add these rules to your content to make it hardware accelerated:

transform: translate3d(0,0,0);
-webkit-transform: translate3d(0,0,0);

This always struck me as ridiculous. Why should the browser need my help to decide to hardware accelerate? It's going to be faster, right? So why not just do it? Why wait for me to "trick" the browser into it? Another way of asking this question might be: why doesn't every baseline/reset stylesheet include the lines

* {
    transform: translate3d(0,0,0);
    -webkit-transform: translate3d(0,0,0);
}

It's not so much
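Part of the usual answer to "why not accelerate everything" is memory: each composited element is backed by a GPU texture of roughly width × height × 4 bytes (RGBA), so blanket promotion via `* { ... }` is expensive. A back-of-envelope sketch of that cost (the figures are illustrative, not from any browser's actual layer accounting):

```java
// Rough GPU-memory cost of promoting elements to composited layers:
// an RGBA texture costs about width * height * 4 bytes.
public class LayerMemory {
    static long layerBytes(int width, int height) {
        return (long) width * height * 4;
    }

    public static void main(String[] args) {
        long one = layerBytes(1920, 1080);            // one full-screen layer
        System.out.println(one);                      // 8294400 bytes (~8 MB)
        System.out.println(50 * one / (1024 * 1024)); // ~395 MB for 50 such layers
    }
}
```

Fifty full-screen-sized layers would already approach the total RAM of many of the mobile devices these questions date from, which is one reason browsers wait for an explicit hint.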

How do I use hardware accelerated video/H.264 decoding with DirectX 11 and Windows 7?

↘锁芯ラ submitted on 2019-11-29 22:38:50
I've been researching all day and not gotten very far. I'm on Windows 7, using DirectX 11. (My final output is to be a frame of video on a DX11 texture.) I want to decode some very large H.264 video files, and the CPU (using libav) doesn't cut it. I've looked at the hwaccel capabilities of libav using DXVA2, but hit a roadblock when I needed to create an IDirectXVideoDecoder, which can only be created with a D3D9 interface (which I don't have, using DX11). Whenever I've looked up the DXVA documentation, it doesn't reference DX11; was this removed in DX10 or 11? (Can't find any confirmation of this,

Test if Hardware Acceleration has been enabled for a CSS animation?

旧城冷巷雨未停 submitted on 2019-11-29 20:58:53
How can I tell (for testing purposes) if hardware acceleration has been enabled for a CSS animation? I have the following code, which essentially enlarges an element and makes it fullscreen (without using the HTML5 fullscreen API). It runs like a stuttering asthmatic tortoise on most mobiles when using a jQuery animation, so I have used CSS3 instead. Here is the jsFiddle example:

$("#makeFullscreen").on("click", function() {
    var map = $("#map"),
        mapTop = map.offset().top,
        mapLeft = map.offset().left;
    $("#map").css({
        "position": "fixed",
        "top": mapTop,
        "left": mapLeft,
        "width": map.outerWidth
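One practical testing proxy, absent a direct "is accelerated" API, is to record a timestamp on every animation frame (in the browser, via requestAnimationFrame) and look at the average frame interval: a composited animation stays near 16.7 ms at 60 Hz, while a software-rendered repaint-per-frame animation shows much larger gaps. The measurement logic itself is sketched below in Java with hypothetical names; the timestamps would come from the browser.

```java
// Sketch of frame-timing analysis: given per-frame timestamps (ms),
// compute the average frame interval. Near 16.7 ms at 60 Hz suggests
// the animation runs on the compositor; much larger intervals suggest
// the page is repainting (not accelerated) and dropping frames.
public class FrameTiming {
    static double averageIntervalMs(double[] timestampsMs) {
        if (timestampsMs.length < 2) return Double.NaN;
        double total = timestampsMs[timestampsMs.length - 1] - timestampsMs[0];
        return total / (timestampsMs.length - 1);
    }

    public static void main(String[] args) {
        double[] smooth = {0, 16.7, 33.4, 50.1, 66.8};
        double[] janky  = {0, 48, 95, 150, 201};
        System.out.println(averageIntervalMs(smooth)); // ~16.7 ms
        System.out.println(averageIntervalMs(janky));  // ~50 ms
    }
}
```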

How to use hardware acceleration with FFmpeg

心已入冬 submitted on 2019-11-29 19:36:48
I need to have FFmpeg decode my video (e.g. H.264) using hardware acceleration. I'm using the usual way of decoding frames: read packet -> decode frame. And I'd like FFmpeg to speed up decoding. So I've built it with --enable-vaapi and --enable-hwaccel=h264, but I don't really know what I should do next. I've tried to use avcodec_find_decoder_by_name("h264_vaapi"), but it returns nullptr. Anyway, I might want to use other APIs, not just VA-API. How is one supposed to speed up FFmpeg decoding? P.S. I didn't find any examples on the Internet that use FFmpeg with hwaccel. ixSci: After some
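A common pattern when probing for accelerated decoders by name is to try hardware-specific names in preference order and fall back to the plain software decoder when a name is unavailable (i.e. when avcodec_find_decoder_by_name() would return NULL, as it does for the asker). The helper below models only that selection logic in plain Java; the decoder names come from the question, and the availability set stands in for what a given FFmpeg build actually exposes.

```java
import java.util.List;
import java.util.Set;

// Models try-in-order decoder selection: probe hardware-specific
// decoder names first, fall back to the plain software decoder,
// and return null if nothing is available at all.
public class DecoderFallback {
    static String pickDecoder(List<String> preferred, Set<String> available) {
        for (String name : preferred) {
            if (available.contains(name)) return name;
        }
        return null; // no decoder found
    }

    public static void main(String[] args) {
        List<String> order = List.of("h264_vaapi", "h264_vdpau", "h264");
        // Build without VA-API support: only the software decoder exists.
        System.out.println(pickDecoder(order, Set.of("h264")));               // h264
        // Build where the VA-API name resolves:
        System.out.println(pickDecoder(order, Set.of("h264", "h264_vaapi"))); // h264_vaapi
    }
}
```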

Does GDI+ support graphics acceleration?

喜欢而已 submitted on 2019-11-29 16:47:19
I'm trying to write a screensaver for a Windows platform using C++ and the Win APIs. To render graphics I'm using GDI+, but the issue is that rendering PNGs with some small amount of animation (fade-in and fade-out) becomes very "CPU heavy." So I was wondering if there's a way to enable GPU acceleration for the GDI+ APIs? And if it's not possible, is there something I can use from unmanaged code that supports GPU acceleration (apart from OpenGL or DirectX)? Nope. GDI is mostly about manipulation of in-memory bitmaps when it comes down to it. If you want more advanced features, use Direct3D/2D.
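The "CPU heavy" fade is easy to account for: a software alpha blend touches every pixel of the image on every frame. A rough worked estimate, with illustrative resolution and frame-rate figures:

```java
// Rough cost of a software (CPU) alpha-blended fade, as with GDI+:
// every frame re-blends every pixel of the image. Each blend involves
// several multiplies per color channel -- work a GPU does trivially.
public class BlendCost {
    static long pixelBlendsPerSecond(int width, int height, int fps) {
        return (long) width * height * fps;
    }

    public static void main(String[] args) {
        // Full-HD image at 30 fps: ~62 million pixel blends per second.
        System.out.println(pixelBlendsPerSecond(1920, 1080, 30)); // 62208000
    }
}
```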
