CMTime

Converting CMTime to String returns the wrong value

送分小仙女 submitted on 2021-02-09 10:58:47
Question: I want to convert a CMTime into a human-readable String, so I found the code below.

extension CMTime {
    var durationText: String {
        let totalSeconds = CMTimeGetSeconds(self)
        let hours: Int = Int(totalSeconds / 3600)
        let minutes: Int = Int(totalSeconds.truncatingRemainder(dividingBy: 3600) / 60)
        let seconds: Int = Int(totalSeconds.truncatingRemainder(dividingBy: 60))
        if hours > 0 {
            return String(format: "%i:%02i:%02i", hours, minutes, seconds)
        } else {
            return String(format: "%02i:%02i", minutes, seconds)
        }
    }
}

And I
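A frequent cause of a wrong or crashing result with this kind of formatter is calling it on an invalid or indefinite CMTime (for example, an asset duration that has not finished loading), since CMTimeGetSeconds then returns NaN. The following is a minimal defensive sketch, not the accepted answer; the property name safeDurationText is illustrative.

import CoreMedia

extension CMTime {
    // Guard against invalid/indefinite times before formatting.
    var safeDurationText: String {
        guard isValid && !isIndefinite else { return "00:00" }
        let totalSeconds = Int(CMTimeGetSeconds(self).rounded())
        let hours = totalSeconds / 3600
        let minutes = (totalSeconds % 3600) / 60
        let seconds = totalSeconds % 60
        return hours > 0
            ? String(format: "%i:%02i:%02i", hours, minutes, seconds)
            : String(format: "%02i:%02i", minutes, seconds)
    }
}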

Using seconds in AVPlayer seekToTime

我们两清 submitted on 2020-05-25 06:35:18
Question: This should be a simple one. I have an AVPlayer playing a video file, and I want to be able to jump to a specific time, but I'm having some trouble understanding how CMTime works. I need to specify the time in seconds. For example, if I wanted to jump to second 10.8 I'd like to do something like this:

[self.avPlayer.currentItem seekToTime:CMTimeMakeWithSeconds(10.8, 1)];

But I'm not getting the result I want.

Answer 1: http://warrenmoore.net/understanding-cmtime Might give you a clearer idea of
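For reference, the second argument of CMTimeMakeWithSeconds is the timescale, and a timescale of 1 can only represent whole seconds, so 10.8 collapses to 10. A sketch (in Swift rather than the question's Objective-C, with player standing in for self.avPlayer) of a seek that keeps the fractional part:

import AVFoundation

// Illustrative only: a finer preferred timescale preserves 10.8 s instead of
// truncating it to 10 s the way a timescale of 1 does.
func seek(_ player: AVPlayer, toSeconds seconds: Double) {
    let target = CMTimeMakeWithSeconds(seconds, preferredTimescale: 600)
    player.seek(to: target)
}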

AVPlayer seekToTime not working: starting from the beginning every time

时光总嘲笑我的痴心妄想 submitted on 2020-01-25 17:33:30
Question: I'm having trouble with a UISlider and an AVPlayer scrubbing method. Every time the method is invoked the player restarts from 0. I tried debugging, and it seems that the slider value is right, but when I step over it is set to 0, so the player restarts. This is what I've tried:

var desiredTime = CMTimeMake(Int64(self.progressSlider.value), 1)
AudioManager.sharedInstance.audioPlayer.seekToTime(desiredTime)

I've also tried looking at similar questions and found that it was the same when I tried their
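One common explanation for this symptom (an assumption here, not a confirmed diagnosis of this question) is that the slider still has its default 0...1 range, so Int64(self.progressSlider.value) truncates every position below 1.0 to 0. A minimal sketch in modern Swift, assuming the slider's maximumValue has been set to the item's duration in seconds:

import AVFoundation
import UIKit

// Illustrative scrubbing handler (assumed to live in the question's view
// controller); AudioManager.sharedInstance.audioPlayer is the AVPlayer from
// the question.
func progressSliderChanged(_ sender: UISlider) {
    let target = CMTime(seconds: Double(sender.value), preferredTimescale: 600)
    AudioManager.sharedInstance.audioPlayer.seek(to: target)
}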

How to accurately set fast shutter speeds (exposure duration) on an AVCaptureDevice?

亡梦爱人 submitted on 2020-01-15 06:31:47
Question: I'm working on a camera app for iOS 13. For that I use an AVCaptureSession in conjunction with an AVCaptureVideoPreviewLayer. So far everything works fine. Now I want to let the user choose and set a custom shutter speed (exposure duration) from a given array of typical shutter-speed values (in 1/3-stop increments) as an [Int32]:

let shutterSpeedValues: [Int32] = [1, 2, 3, 4, 5, 6, 8, 10, 13, 15, 20, 25, 30, 40, 50, 60, 80, 100, 125, 160, 200, 250, 320, 400, 500, 640, 800, 1000, 1250,
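Each entry presumably denotes the denominator of a fraction of a second (125 meaning 1/125 s), which maps naturally onto CMTime's value/timescale pair. A sketch of applying one of these values with custom exposure; the clamping and the ISO choice are assumptions, not part of the original question:

import AVFoundation

// Illustrative: 1/denominator seconds expressed directly as a CMTime, clamped
// to what the device's active format supports, then applied via the custom
// exposure mode.
func setShutterSpeed(denominator: Int32, on device: AVCaptureDevice) throws {
    let requested = CMTimeMake(value: 1, timescale: denominator)
    let clamped = min(max(requested, device.activeFormat.minExposureDuration),
                      device.activeFormat.maxExposureDuration)
    try device.lockForConfiguration()
    device.setExposureModeCustom(duration: clamped,
                                 iso: AVCaptureDevice.currentISO,
                                 completionHandler: nil)
    device.unlockForConfiguration()
}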

Creating a time range for AVAssetExportSession

安稳与你 submitted on 2020-01-12 08:01:11
Question: I was wondering how to make a time range for AVAssetExportSession from time stamps such as:

NSTimeInterval start = [[NSDate date] timeIntervalSince1970];
NSTimeInterval end = [[NSDate date] timeIntervalSince1970];

The code that I am using for my export session is as follows:

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = videoURL;
exportSession.outputFileType =
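Since both timestamps are epoch values, only their differences are meaningful: the range's start is the offset of start from whenever the asset's own timeline began, and its duration is end minus start. A Swift sketch under that assumption (recordingStart is an illustrative name for the moment the asset started):

import AVFoundation

// Illustrative: build a CMTimeRange for exportSession.timeRange from three
// epoch timestamps. All three parameter names are hypothetical.
func exportRange(recordingStart: TimeInterval,
                 start: TimeInterval,
                 end: TimeInterval) -> CMTimeRange {
    let rangeStart = CMTimeMakeWithSeconds(start - recordingStart, preferredTimescale: 600)
    let duration = CMTimeMakeWithSeconds(end - start, preferredTimescale: 600)
    return CMTimeRangeMake(start: rangeStart, duration: duration)
}

The result would then be assigned to exportSession.timeRange before calling exportAsynchronously(completionHandler:).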

Split CMTimeRange into multiple CMTimeRange chunks

淺唱寂寞╮ submitted on 2019-12-24 00:24:55
Question: Let's assume I have a CMTimeRange constructed from a start time of zero and a duration of 40 seconds. I want to split this CMTimeRange into multiple chunks by a divider of X seconds, so that the total duration of the chunks equals the original duration and each chunk's start time is the end time of the previous chunk. The last chunk holds the leftover seconds. For example, for a video of 40 seconds and a divider of 15 seconds per chunk: First CMTimeRange - start time
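A minimal Swift sketch of that splitting, offered as an interpretation of the intent rather than a quoted answer; for 40 seconds split by 15 it yields chunks of 15, 15 and 10 seconds:

import CoreMedia

// Illustrative: walk a cursor across the range, emitting full-size chunks and
// a shorter final chunk for whatever remains.
func split(_ range: CMTimeRange, chunkSeconds: Double) -> [CMTimeRange] {
    let chunkDuration = CMTime(seconds: chunkSeconds, preferredTimescale: 600)
    var chunks: [CMTimeRange] = []
    var cursor = range.start
    while cursor < range.end {
        let remaining = CMTimeSubtract(range.end, cursor)
        let duration = min(chunkDuration, remaining)
        chunks.append(CMTimeRange(start: cursor, duration: duration))
        cursor = CMTimeAdd(cursor, duration)
    }
    return chunks
}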

How to get a thumbnail image of a video from an ALAsset in iOS?

烈酒焚心 submitted on 2019-12-21 17:13:58
Question: I want to get thumbnail images of every frame from a video and then save these images in a mutable array of images. I want to use these images to play as an animation.

NSURL* assetURL = [self.asset valueForProperty:ALAssetPropertyAssetURL];
NSDictionary* assetOptions = nil;
AVAsset* myAsset = [[AVURLAsset alloc] initWithURL:assetURL options:assetOptions];
self.imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
int duration = CMTimeGetSeconds([myAsset duration]);
for(int i
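A Swift sketch of the same idea using AVAssetImageGenerator, one frame per second for brevity (the original loop is cut off, so the stride and error handling here are assumptions):

import AVFoundation
import UIKit

// Illustrative: synchronously copy one CGImage per second of the asset.
func thumbnails(for asset: AVAsset) -> [UIImage] {
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    let seconds = Int(CMTimeGetSeconds(asset.duration))
    var images: [UIImage] = []
    for second in 0..<seconds {
        let time = CMTime(seconds: Double(second), preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            images.append(UIImage(cgImage: cgImage))
        }
    }
    return images
}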

How to pass a Float as CMTimeMake's Int32 timescale

╄→гoц情女王★ submitted on 2019-12-11 15:20:22
Question: I am working with CMTimeMake in order to add slow- and fast-motion effects to a video, where we divide by the video scale factor for the fast effect and multiply by it for the slow effect. Here it is:

let videoScaleFactor = Int64(2)

// Get the scaled video duration
let scaledVideoDuration = (mode == .Faster)
    ? CMTimeMake(videoAsset.duration.value / videoScaleFactor, videoAsset.duration.timescale)
    : CMTimeMake(videoAsset.duration.value * videoScaleFactor, videoAsset.duration.timescale)

Now as per my
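If the scale factor needs to be fractional (the point of the title), CMTimeMultiplyByFloat64 sidesteps the Int64/Int32 arithmetic entirely. A sketch under that assumption, with illustrative parameter names:

import AVFoundation

// Illustrative: scale an asset's duration by a fractional factor without
// touching the raw value/timescale fields.
func scaledDuration(of asset: AVAsset, factor: Float64, faster: Bool) -> CMTime {
    let multiplier = faster ? 1.0 / factor : factor
    return CMTimeMultiplyByFloat64(asset.duration, multiplier: multiplier)
}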

AVPlayer seekToTime not working properly

邮差的信 submitted on 2019-12-07 08:07:47
Question: I'm having a problem seeking with AVPlayer.seekToTime. I compute the time index that I want to seek to inside a scrollViewDidScroll method like this:

func scrollViewDidScroll(scrollView: UIScrollView) {
    let offsetTime = scrollView.contentOffset.y * 0.1
    self.playerController.player?.seekToTime(CMTime(seconds: Double(offsetTime), preferredTimescale: 10),
                                             toleranceBefore: kCMTimePositiveInfinity,
                                             toleranceAfter: kCMTimeZero)
}

But the video does not flow nicely. For example, when you scroll I
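A common adjustment for smoother scrubbing (offered here as a sketch, not the accepted answer) is to use zero tolerance on both sides, a finer preferred timescale, and to skip issuing a new seek while the previous one is still in flight:

import AVFoundation
import AVKit
import UIKit

// Illustrative scrubbing delegate; the 0.1 offset-to-seconds factor and
// playerController come from the question, the in-flight flag is an addition.
final class ScrubbingDelegate: NSObject, UIScrollViewDelegate {
    let playerController = AVPlayerViewController()
    private var isSeekInProgress = false

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        guard !isSeekInProgress else { return }
        let target = CMTime(seconds: Double(scrollView.contentOffset.y) * 0.1,
                            preferredTimescale: 600)
        isSeekInProgress = true
        playerController.player?.seek(to: target,
                                      toleranceBefore: .zero,
                                      toleranceAfter: .zero) { [weak self] _ in
            self?.isSeekInProgress = false
        }
    }
}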

I cannot get a precise CMTime for generating a still image from a 1.8-second video

若如初见. submitted on 2019-12-06 03:21:59
Question: Every time I try to generate a still frame from my video asset, it is generated at a time of 0.000... seconds; I can see this from my log message. The good thing is that I can get the image at time 0.000... to show up in a UIImageView called "myImageView". I thought the problem was that AVURLAssetPreferPreciseDurationAndTimingKey was not set, but even after I figured out how to do that, it still does not work. Here is what I have: time, actualTime, and generate are declared in the header
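Beyond the asset option, AVAssetImageGenerator itself defaults to loose seek tolerances, which is the usual reason a requested time snaps back to 0 on a very short clip. A sketch combining both settings (the function name and the 600 timescale are illustrative):

import AVFoundation

// Illustrative: precise-timing asset option plus zero tolerances on the
// generator, then log the actual time that was used.
func preciseStill(from url: URL, atSeconds seconds: Double) throws -> CGImage {
    let asset = AVURLAsset(url: url,
                           options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    let generator = AVAssetImageGenerator(asset: asset)
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    var actualTime = CMTime.zero
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    let image = try generator.copyCGImage(at: time, actualTime: &actualTime)
    print("Requested \(seconds)s, generated at \(CMTimeGetSeconds(actualTime))s")
    return image
}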