Multiple videos with AVPlayer

Frontend · Unresolved · 3 answers · 1324 views
忘了有多久 2020-12-22 17:38

I am developing an iOS app for iPad that needs to play videos in part of the screen. I have several video files that need to be played one after another, in an order tha

3 Answers
  •  忘掉有多难
    2020-12-22 18:06

    Update — https://youtu.be/7QlaO7WxjGg

    Here's your answer, using a collection view as an example, which will play 8 videos at a time (note that no manual memory management is necessary; ARC handles it):

        - (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath {
            UICollectionViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:kCellIdentifier forIndexPath:indexPath];
    
            // Enumerate the visible cells' indexPaths to find a match.
            // indexOfObjectPassingTest: returns an index, not a BOOL, so compare
            // against NSNotFound — index 0 is a valid match, not "false".
            NSUInteger match = [self.collectionView.indexPathsForVisibleItems
                 indexOfObjectPassingTest:^BOOL(NSIndexPath * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
                     return (obj.item == indexPath.item);
                 }];
            if (match != NSNotFound) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    [self drawLayerForPlayerForCell:cell atIndexPath:indexPath];
                });
            }
    
            return cell;
        }
    
    - (void)drawPosterFrameForCell:(UICollectionViewCell *)cell atIndexPath:(NSIndexPath *)indexPath {
        [self.imageManager requestImageForAsset:AppDelegate.sharedAppDelegate.assetsFetchResults[indexPath.item]
                                                              targetSize:AssetGridThumbnailSize
                                                             contentMode:PHImageContentModeAspectFill
                                                                 options:nil
                                                           resultHandler:^(UIImage *result, NSDictionary *info) {
                                                               cell.contentView.layer.contents = (__bridge id)result.CGImage;
                                                           }];
    }
    
    - (void)drawLayerForPlayerForCell:(UICollectionViewCell *)cell atIndexPath:(NSIndexPath *)indexPath {
        // Remove any player layer left over from a reused cell
        cell.contentView.layer.sublayers = nil;
        [self.imageManager requestPlayerItemForVideo:(PHAsset *)self.assetsFetchResults[indexPath.item] options:nil resultHandler:^(AVPlayerItem * _Nullable playerItem, NSDictionary * _Nullable info) {
            // Use dispatch_async, not dispatch_sync: the result handler can be
            // invoked on the main queue, and dispatch_sync would deadlock there.
            dispatch_async(dispatch_get_main_queue(), ^{
                if ([info[PHImageResultIsInCloudKey] boolValue]) {
                    [self drawPosterFrameForCell:cell atIndexPath:indexPath];
                } else {
                    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:[AVPlayer playerWithPlayerItem:playerItem]];
                    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
                    playerLayer.borderColor = [UIColor whiteColor].CGColor;
                    playerLayer.borderWidth = 1.0f;
                    playerLayer.frame = cell.contentView.layer.bounds;
                    [cell.contentView.layer addSublayer:playerLayer];
                    [playerLayer.player play];
                }
            });
        }];
    }
    

    The drawPosterFrameForCell: method places a poster image wherever a video cannot be played because the asset is stored in iCloud rather than on the device.

    Anyway, this is the starting point; once you understand how it works, you can do everything you wanted without the memory glitches you described.
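
    One detail worth adding for memory behavior when scrolling: tear each player down as soon as its cell leaves the screen, so off-screen cells don't keep AVPlayer instances alive. A minimal sketch (not from the original answer; it assumes the same controller is also the collection view's delegate):

        - (void)collectionView:(UICollectionView *)collectionView
          didEndDisplayingCell:(UICollectionViewCell *)cell
            forItemAtIndexPath:(NSIndexPath *)indexPath {
            // Copy the sublayers array since we mutate it inside the loop
            for (CALayer *layer in [cell.contentView.layer.sublayers copy]) {
                if ([layer isKindOfClass:[AVPlayerLayer class]]) {
                    AVPlayerLayer *playerLayer = (AVPlayerLayer *)layer;
                    [playerLayer.player pause];
                    playerLayer.player = nil;   // let ARC release the AVPlayer and its item
                    [layer removeFromSuperlayer];
                }
            }
        }

    Cell reuse plus this cleanup keeps the number of live players bounded by the number of visible cells.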
