iOS Development: A Custom Camera Example (WeChat-Style)

There are plenty of custom-camera examples online; this is just a small demo I put together quickly, for reference only.

It uses the following frameworks:

#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

Before using it, you need to add the relevant privacy usage descriptions to Info.plist:

Privacy - Microphone Usage Description
Privacy - Photo Library Usage Description
Privacy - Camera Usage Description
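In the Info.plist XML source, those three entries look like the following; the description strings are placeholders, so replace them with wording that explains your app's actual use:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used to record audio for your videos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Photo library access is used to save your photos and videos.</string>
<key>NSCameraUsageDescription</key>
<string>The camera is used to take photos and record videos.</string>
```

If any of these keys is missing, the app will crash the first time it touches the corresponding hardware or the photo library.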

I modeled this demo on WeChat: tap the button to take a photo, long-press to record a video, and the video plays back as soon as recording finishes. A simple player is wrapped up for that playback:

The .m file:

#import "HAVPlayer.h"
#import <AVFoundation/AVFoundation.h>

@interface HAVPlayer ()

@property (nonatomic, strong) AVPlayer *player; // the player object

@end

@implementation HAVPlayer


- (instancetype)initWithFrame:(CGRect)frame withShowInView:(UIView *)bgView url:(NSURL *)url {
  if (self = [self initWithFrame:frame]) {
    // create the player layer
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.bounds;

    [self.layer addSublayer:playerLayer];
    if (url) {
      self.videoUrl = url;
    }

    [bgView addSubview:self];
  }
  return self;
}

- (void)dealloc {
  [self removeAvPlayerNtf];
  [self stopPlayer];
  self.player = nil;
}

- (AVPlayer *)player {
  if (!_player) {
    _player = [AVPlayer playerWithPlayerItem:[self getAVPlayerItem]];
    [self addAVPlayerNtf:_player.currentItem];

  }

  return _player;
}

- (AVPlayerItem *)getAVPlayerItem {
  AVPlayerItem *playerItem=[AVPlayerItem playerItemWithURL:self.videoUrl];
  return playerItem;
}

- (void)setVideoUrl:(NSURL *)videoUrl {
  _videoUrl = videoUrl;
  [self removeAvPlayerNtf];
  [self nextPlayer];
}

- (void)nextPlayer {
  [self.player seekToTime:CMTimeMakeWithSeconds(0, _player.currentItem.duration.timescale)];
  [self.player replaceCurrentItemWithPlayerItem:[self getAVPlayerItem]];
  [self addAVPlayerNtf:self.player.currentItem];
  if (self.player.rate == 0) {
    [self.player play];
  }
}

- (void)addAVPlayerNtf:(AVPlayerItem *)playerItem {
  // observe the playback status
  [playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
  // observe buffering progress
  [playerItem addObserver:self forKeyPath:@"loadedTimeRanges" options:NSKeyValueObservingOptionNew context:nil];

  // observe the item passed in, not whatever currentItem happens to be
  [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
}

- (void)removeAvPlayerNtf {
  AVPlayerItem *playerItem = self.player.currentItem;
  [playerItem removeObserver:self forKeyPath:@"status"];
  [playerItem removeObserver:self forKeyPath:@"loadedTimeRanges"];
  [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (void)stopPlayer {
  if (self.player.rate == 1) {
    [self.player pause]; // pause if currently playing
  }
}

/**
 * Observe the player state via KVO
 *
 * @param keyPath observed key path
 * @param object observed object
 * @param change change dictionary
 * @param context context
 */
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
  AVPlayerItem *playerItem = object;
  if ([keyPath isEqualToString:@"status"]) {
    AVPlayerStatus status = [[change objectForKey:NSKeyValueChangeNewKey] intValue];
    if (status == AVPlayerStatusReadyToPlay) {
      NSLog(@"Ready to play; total duration: %.2f", CMTimeGetSeconds(playerItem.duration));
    }
  } else if ([keyPath isEqualToString:@"loadedTimeRanges"]) {
    NSArray *array = playerItem.loadedTimeRanges;
    CMTimeRange timeRange = [array.firstObject CMTimeRangeValue]; // the currently buffered range
    float startSeconds = CMTimeGetSeconds(timeRange.start);
    float durationSeconds = CMTimeGetSeconds(timeRange.duration);
    NSTimeInterval totalBuffer = startSeconds + durationSeconds; // total buffered length
    NSLog(@"Buffered: %.2f", totalBuffer);
  }
}

- (void)playbackFinished:(NSNotification *)ntf {
  Plog(@"Playback finished; looping");
  [self.player seekToTime:CMTimeMake(0, 1)];
  [self.player play];
}

@end

WeChat also draws an arc-shaped progress ring around the button during a long press; here is a simple view for that:

The .m file:

#import "HProgressView.h"

@interface HProgressView ()

/**
 * Progress value between 0 and 1.0
 */
@property (nonatomic, assign) CGFloat progressValue;

@property (nonatomic, assign) CGFloat currentTime;

@end

@implementation HProgressView

// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect {
  // Drawing code
  CGContextRef ctx = UIGraphicsGetCurrentContext(); // grab the current context
  Plog(@"width = %f", self.frame.size.width);
  CGPoint center = CGPointMake(self.frame.size.width / 2.0, self.frame.size.width / 2.0); // arc center (the view is assumed square)
  CGFloat radius = self.frame.size.width / 2.0 - 5; // arc radius
  CGFloat startA = -M_PI_2; // start angle (12 o'clock)
  CGFloat endA = -M_PI_2 + M_PI * 2 * _progressValue; // end angle grows with progress

  UIBezierPath *path = [UIBezierPath bezierPathWithArcCenter:center radius:radius startAngle:startA endAngle:endA clockwise:YES];

  CGContextSetLineWidth(ctx, 10); // stroke width
  [[UIColor whiteColor] setStroke]; // stroke color

  CGContextAddPath(ctx, path.CGPath); // add the path to the context

  CGContextStrokePath(ctx); // render
}

- (void)setTimeMax:(NSInteger)timeMax {
  _timeMax = timeMax;
  self.currentTime = 0;
  self.progressValue = 0;
  [self setNeedsDisplay];
  self.hidden = NO;
  [self performSelector:@selector(startProgress) withObject:nil afterDelay:0.1];
}

- (void)clearProgress {
  // cancel any pending startProgress tick so a new recording starts clean
  [NSObject cancelPreviousPerformRequestsWithTarget:self selector:@selector(startProgress) object:nil];
  _currentTime = _timeMax;
  self.hidden = YES;
}

- (void)startProgress {
  _currentTime += 0.1;
  if (_timeMax > _currentTime) {
    _progressValue = _currentTime / _timeMax;
    Plog(@"progress = %f", _progressValue);
    [self setNeedsDisplay];
    [self performSelector:@selector(startProgress) withObject:nil afterDelay:0.1];
  } else {
    [self clearProgress];
  }
}

@end

Next comes the camera controller. Since this was written in a hurry it uses a xib, so don't drop it into a project as-is. Here is the .m file:

#import "HVideoViewController.h"
#import <AVFoundation/AVFoundation.h>
#import "HAVPlayer.h"
#import "HProgressView.h"
#import <Foundation/Foundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);
@interface HVideoViewController ()<AVCaptureFileOutputRecordingDelegate>

// "tap to shoot, hold to record" tip label
@property (strong, nonatomic) IBOutlet UILabel *labelTipTitle;

// movie file output
@property (strong, nonatomic) AVCaptureMovieFileOutput *captureMovieFileOutput;
// still image output
//@property (strong, nonatomic) AVCaptureStillImageOutput *captureStillImageOutput;
// provides input data from the AVCaptureDevice
@property (strong, nonatomic) AVCaptureDeviceInput *captureDeviceInput;
// background task identifier
@property (assign, nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier;

@property (assign, nonatomic) UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier;

@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor

// coordinates the data flow between inputs and outputs
@property (nonatomic) AVCaptureSession *session;

// preview layer that shows the live camera feed
@property (nonatomic) AVCaptureVideoPreviewLayer *previewLayer;

@property (strong, nonatomic) IBOutlet UIButton *btnBack;
// re-record
@property (strong, nonatomic) IBOutlet UIButton *btnAfresh;
// confirm
@property (strong, nonatomic) IBOutlet UIButton *btnEnsure;
// front/back camera switch
@property (strong, nonatomic) IBOutlet UIButton *btnCamera;

@property (strong, nonatomic) IBOutlet UIImageView *bgView;
// remaining recording time; defaults to 60 seconds max
@property (assign, nonatomic) NSInteger seconds;

// URL of the video to be saved
@property (strong, nonatomic) NSURL *saveVideoUrl;

// whether a focus animation is in progress
@property (assign, nonatomic) BOOL isFocus;
@property (strong, nonatomic) IBOutlet NSLayoutConstraint *afreshCenterX;
@property (strong, nonatomic) IBOutlet NSLayoutConstraint *ensureCenterX;
@property (strong, nonatomic) IBOutlet NSLayoutConstraint *backCenterX;

// video playback
@property (strong, nonatomic) HAVPlayer *player;

@property (strong, nonatomic) IBOutlet HProgressView *progressView;

// YES means video recording; NO means a still photo
@property (assign, nonatomic) BOOL isVideo;

@property (strong, nonatomic) UIImage *takeImage;
@property (strong, nonatomic) UIImageView *takeImageView;
@property (strong, nonatomic) IBOutlet UIImageView *imgRecord;

@end

// a press longer than this (in seconds) counts as video; shorter counts as a photo
#define TimeMax 1

@implementation HVideoViewController

-(void)dealloc{
  [self removeNotification];

}

- (void)viewDidLoad {
  [super viewDidLoad];
  // Do any additional setup after loading the view.

  UIImage *image = [UIImage imageNamed:@"sc_btn_take.png"];
  self.backCenterX.constant = -(SCREEN_WIDTH/2/2)-image.size.width/2/2;

  self.progressView.layer.cornerRadius = self.progressView.frame.size.width/2;

  if (self.HSeconds == 0) {
    self.HSeconds = 60;
  }

  [self performSelector:@selector(hiddenTipsLabel) withObject:nil afterDelay:4];
}

- (void)hiddenTipsLabel {
  self.labelTipTitle.hidden = YES;
}

- (void)didReceiveMemoryWarning {
  [super didReceiveMemoryWarning];
  // Dispose of any resources that can be recreated.
}

- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];
  [[UIApplication sharedApplication] setStatusBarHidden:YES];
  [self customCamera];
  [self.session startRunning];
}

-(void)viewDidAppear:(BOOL)animated{
  [super viewDidAppear:animated];
}

-(void)viewDidDisappear:(BOOL)animated{
  [super viewDidDisappear:animated];
  [self.session stopRunning];
}

- (void)viewWillDisappear:(BOOL)animated {
  [super viewWillDisappear:animated];
  [[UIApplication sharedApplication] setStatusBarHidden:NO];
}

- (void)customCamera {

  // create the session that ties inputs and outputs together
  self.session = [[AVCaptureSession alloc] init];
  // use the highest resolution the device supports
  if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
    self.session.sessionPreset = AVCaptureSessionPresetHigh;
  }
  // grab the back camera
  AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
  // and an audio input device
  AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];

  // create the video device input
  NSError *error = nil;
  self.captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
  if (error) {
    Plog(@"Error creating the video device input: %@", error.localizedDescription);
    return;
  }

  // create the audio device input
  error = nil;
  AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
  if (error) {
    NSLog(@"Error creating the audio device input: %@", error.localizedDescription);
    return;
  }

  // movie file output
  self.captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

  // add the inputs to the session
  if ([self.session canAddInput:self.captureDeviceInput]) {
    [self.session addInput:self.captureDeviceInput];
    [self.session addInput:audioCaptureDeviceInput];
    // enable video stabilization if the connection supports it
    AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
      connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }
  }

  // add the movie output to the session
  if ([self.session canAddOutput:self.captureMovieFileOutput]) {
    [self.session addOutput:self.captureMovieFileOutput];
  }

  // create the preview layer that shows the live camera feed
  self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
  self.previewLayer.frame = self.view.bounds;
  self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
  [self.bgView.layer addSublayer:self.previewLayer];

  [self addNotificationToCaptureDevice:captureDevice];
  [self addGenstureRecognizer];
}

- (IBAction)onCancelAction:(UIButton *)sender {
  [self dismissViewControllerAnimated:YES completion:^{
    [Utility hideProgressDialog];
  }];
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
  if ([[touches anyObject] view] == self.imgRecord) {
    Plog(@"start recording");
    // get the video connection from the movie output
    // (the original used AVMediaTypeAudio here, but videoOrientation belongs to the video connection)
    AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (![self.captureMovieFileOutput isRecording]) {
      // start a background task if multitasking is supported
      if ([[UIDevice currentDevice] isMultitaskingSupported]) {
        self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
      }
      if (self.saveVideoUrl) {
        [[NSFileManager defaultManager] removeItemAtURL:self.saveVideoUrl error:nil];
      }
      // keep the recording orientation in sync with the preview layer
      connection.videoOrientation = [self.previewLayer connection].videoOrientation;
      NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
      NSLog(@"save path is: %@", outputFilePath);
      NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
      NSLog(@"fileUrl: %@", fileUrl);
      [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    } else {
      [self.captureMovieFileOutput stopRecording];
    }
  }
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
  if ([[touches anyObject] view] == self.imgRecord) {
    Plog(@"touch ended");
    if (!self.isVideo) {
      [self performSelector:@selector(endRecord) withObject:nil afterDelay:0.3];
    } else {
      [self endRecord];
    }
  }
}

- (void)endRecord {
  [self.captureMovieFileOutput stopRecording]; // stop recording
}

- (IBAction)onAfreshAction:(UIButton *)sender {
  Plog(@"re-record");
  [self recoverLayout];
}

- (IBAction)onEnsureAction:(UIButton *)sender {
  Plog(@"Confirm: save or send from here");
  if (self.saveVideoUrl) {
    WS(weakSelf)
    [Utility showProgressDialogText:@"Processing video..."];
    ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
    [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:self.saveVideoUrl completionBlock:^(NSURL *assetURL, NSError *error) {
      Plog(@"outputUrl:%@", weakSelf.saveVideoUrl);
      [[NSFileManager defaultManager] removeItemAtURL:weakSelf.saveVideoUrl error:nil];
      if (weakSelf.lastBackgroundTaskIdentifier != UIBackgroundTaskInvalid) {
        [[UIApplication sharedApplication] endBackgroundTask:weakSelf.lastBackgroundTaskIdentifier];
      }
      if (error) {
        Plog(@"Error saving the video to the photo album: %@", error.localizedDescription);
        [Utility showAllTextDialog:KAppDelegate.window Text:@"Failed to save the video to the album"];
      } else {
        if (weakSelf.takeBlock) {
          weakSelf.takeBlock(assetURL);
        }
        Plog(@"Video saved to the album.");
        [weakSelf onCancelAction:nil];
      }
    }];
  } else {
    // still photo
    UIImageWriteToSavedPhotosAlbum(self.takeImage, self, nil, nil);
    if (self.takeBlock) {
      self.takeBlock(self.takeImage);
    }

    [self onCancelAction:nil];
  }
}

// switch between the front and back cameras
- (IBAction)onCameraAction:(UIButton *)sender {
  Plog(@"switch camera");
  AVCaptureDevice *currentDevice = [self.captureDeviceInput device];
  AVCaptureDevicePosition currentPosition = [currentDevice position];
  [self removeNotificationFromCaptureDevice:currentDevice];
  AVCaptureDevice *toChangeDevice;
  AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront; // front
  if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
    toChangePosition = AVCaptureDevicePositionBack; // back
  }
  toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
  [self addNotificationToCaptureDevice:toChangeDevice];
  // create the input for the new device
  AVCaptureDeviceInput *toChangeDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:toChangeDevice error:nil];

  // wrap session configuration changes in beginConfiguration/commitConfiguration
  [self.session beginConfiguration];
  // remove the old input
  [self.session removeInput:self.captureDeviceInput];
  // add the new one
  if ([self.session canAddInput:toChangeDeviceInput]) {
    [self.session addInput:toChangeDeviceInput];
    self.captureDeviceInput = toChangeDeviceInput;
  }
  // commit the configuration
  [self.session commitConfiguration];
}

- (void)onStartTranscribe:(NSURL *)fileURL {
  if ([self.captureMovieFileOutput isRecording]) {
    self.seconds--;
    if (self.seconds > 0) {
      if (self.HSeconds - self.seconds >= TimeMax && !self.isVideo) {
        self.isVideo = YES; // held longer than TimeMax, so this is a video recording
        self.progressView.timeMax = self.seconds;
      }
      [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0];
    } else {
      if ([self.captureMovieFileOutput isRecording]) {
        [self.captureMovieFileOutput stopRecording];
      }
    }
  }
}

#pragma mark - AVCaptureFileOutputRecordingDelegate
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
  Plog(@"recording started...");
  self.seconds = self.HSeconds;
  [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
  Plog(@"recording finished.");
  [self changeLayout];
  if (self.isVideo) {
    self.saveVideoUrl = outputFileURL;
    if (!self.player) {
      self.player = [[HAVPlayer alloc] initWithFrame:self.bgView.bounds withShowInView:self.bgView url:outputFileURL];
    } else {
      if (outputFileURL) {
        self.player.videoUrl = outputFileURL;
        self.player.hidden = NO;
      }
    }
  } else {
    // still photo: grab a frame from the short clip instead
    self.saveVideoUrl = nil;
    [self videoHandlePhoto:outputFileURL];
  }
}

- (void)videoHandlePhoto:(NSURL *)url {
  AVURLAsset *urlSet = [AVURLAsset assetWithURL:url];
  AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlSet];
  imageGenerator.appliesPreferredTrackTransform = YES; // rotate the capture to the correct orientation
  NSError *error = nil;
  // CMTime describes a movie timestamp: the first argument is the value and the second the
  // timescale (units per second), so CMTimeMake(0, 30) asks for frame 0 at 30 fps
  CMTime time = CMTimeMake(0, 30);
  CMTime actualTime; // the time the thumbnail was actually generated at
  CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
  if (error) {
    Plog(@"Failed to capture a frame from the video: %@", error.localizedDescription);
  }
  CMTimeShow(actualTime);
  UIImage *image = [UIImage imageWithCGImage:cgImage];

  CGImageRelease(cgImage);
  if (image) {
    Plog(@"frame captured");
  } else {
    Plog(@"frame capture failed");
  }

  self.takeImage = image;

  [[NSFileManager defaultManager] removeItemAtURL:url error:nil];

  if (!self.takeImageView) {
    self.takeImageView = [[UIImageView alloc] initWithFrame:self.view.frame];
    [self.bgView addSubview:self.takeImageView];
  }
  self.takeImageView.hidden = NO;
  self.takeImageView.image = self.takeImage;
}

#pragma mark - Notifications

// register for notifications
- (void)setupObservers
{
  NSNotificationCenter *notification = [NSNotificationCenter defaultCenter];
  [notification addObserver:self selector:@selector(applicationDidEnterBackground:) name:UIApplicationWillResignActiveNotification object:[UIApplication sharedApplication]];
}

// bail out of recording when the app goes to the background
- (void)applicationDidEnterBackground:(NSNotification *)notification {
  [self onCancelAction:nil];
}

/**
 * Add notifications for a capture device
 */
- (void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice {
  // subject-area-change monitoring must be enabled before the notification will fire
  [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
    captureDevice.subjectAreaChangeMonitoringEnabled = YES;
  }];
  NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
  // capture area changed
  [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
- (void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice {
  NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
  [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
/**
 * Remove all notifications
 */
- (void)removeNotification {
  NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
  [notificationCenter removeObserver:self];
}

- (void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession {
  NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
  // session runtime error
  [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

/**
 * Device connected
 *
 * @param notification the notification object
 */
- (void)deviceConnected:(NSNotification *)notification {
  NSLog(@"Device connected...");
}
/**
 * Device disconnected
 *
 * @param notification the notification object
 */
- (void)deviceDisconnected:(NSNotification *)notification {
  NSLog(@"Device disconnected.");
}
/**
 * Capture area changed
 *
 * @param notification the notification object
 */
- (void)areaChange:(NSNotification *)notification {
  NSLog(@"Capture area changed...");
}

/**
 * Session runtime error
 *
 * @param notification the notification object
 */
- (void)sessionRuntimeError:(NSNotification *)notification {
  NSLog(@"The session hit a runtime error.");
}

/**
 * Get the camera at the given position
 *
 * @param position camera position
 *
 * @return the camera device, or nil if none matches
 */
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
  NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
  for (AVCaptureDevice *camera in cameras) {
    if ([camera position] == position) {
      return camera;
    }
  }
  return nil;
}

/**
 * Central helper for changing device properties
 *
 * @param propertyChange block that performs the property change
 */
- (void)changeDeviceProperty:(PropertyChangeBlock)propertyChange {
  AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
  NSError *error;
  // always call lockForConfiguration: before changing device properties,
  // and unlockForConfiguration once done
  if ([captureDevice lockForConfiguration:&error]) {
    // continuous auto white balance
    if ([captureDevice isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance]) {
      [captureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
    }
    // fire the flash automatically based on lighting conditions
    if ([captureDevice isFlashModeSupported:AVCaptureFlashModeAuto]) {
      [captureDevice setFlashMode:AVCaptureFlashModeAuto];
    }

    propertyChange(captureDevice);
    [captureDevice unlockForConfiguration];
  } else {
    NSLog(@"Error while setting device properties: %@", error.localizedDescription);
  }
}

/**
 * Set the flash mode
 *
 * @param flashMode flash mode
 */
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
  [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
    if ([captureDevice isFlashModeSupported:flashMode]) {
      [captureDevice setFlashMode:flashMode];
    }
  }];
}
/**
 * Set the focus mode
 *
 * @param focusMode focus mode
 */
- (void)setFocusMode:(AVCaptureFocusMode)focusMode {
  [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
    if ([captureDevice isFocusModeSupported:focusMode]) {
      [captureDevice setFocusMode:focusMode];
    }
  }];
}
/**
 * Set the exposure mode
 *
 * @param exposureMode exposure mode
 */
- (void)setExposureMode:(AVCaptureExposureMode)exposureMode {
  [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
    if ([captureDevice isExposureModeSupported:exposureMode]) {
      [captureDevice setExposureMode:exposureMode];
    }
  }];
}
/**
 * Set focus and exposure at a point
 *
 * @param point the point of interest
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
  [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
//    if ([captureDevice isFocusPointOfInterestSupported]) {
//      [captureDevice setFocusPointOfInterest:point];
//    }
//    if ([captureDevice isExposurePointOfInterestSupported]) {
//      [captureDevice setExposurePointOfInterest:point];
//    }
    if ([captureDevice isExposureModeSupported:exposureMode]) {
      [captureDevice setExposureMode:exposureMode];
    }
    if ([captureDevice isFocusModeSupported:focusMode]) {
      [captureDevice setFocusMode:focusMode];
    }
  }];
}

/**
 * Add a tap gesture; tapping focuses at the tapped point
 */
- (void)addGenstureRecognizer {
  UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
  [self.bgView addGestureRecognizer:tapGesture];
}

- (void)tapScreen:(UITapGestureRecognizer *)tapGesture {
  if ([self.session isRunning]) {
    CGPoint point = [tapGesture locationInView:self.bgView];
    // convert the UI coordinate into a camera coordinate
    CGPoint cameraPoint = [self.previewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposureMode:AVCaptureExposureModeContinuousAutoExposure atPoint:cameraPoint];
  }
}

/**
 * Position and animate the focus cursor
 *
 * @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point {
  if (!self.isFocus) {
    self.isFocus = YES;
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.25, 1.25);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:0.5 animations:^{
      self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
      [self performSelector:@selector(onHiddenFocusCurSorAction) withObject:nil afterDelay:0.5];
    }];
  }
}

- (void)onHiddenFocusCurSorAction {
  self.focusCursor.alpha = 0;
  self.isFocus = NO;
}

// called when a capture (photo or video) finishes
- (void)changeLayout {
  self.imgRecord.hidden = YES;
  self.btnCamera.hidden = YES;
  self.btnAfresh.hidden = NO;
  self.btnEnsure.hidden = NO;
  self.btnBack.hidden = YES;
  if (self.isVideo) {
    [self.progressView clearProgress];
  }
  self.afreshCenterX.constant = -(SCREEN_WIDTH/2/2);
  self.ensureCenterX.constant = SCREEN_WIDTH/2/2;
  [UIView animateWithDuration:0.25 animations:^{
    [self.view layoutIfNeeded];
  }];

  self.lastBackgroundTaskIdentifier = self.backgroundTaskIdentifier;
  self.backgroundTaskIdentifier = UIBackgroundTaskInvalid;
  [self.session stopRunning];
}

// called when the user chooses to re-shoot
- (void)recoverLayout {
  if (self.isVideo) {
    self.isVideo = NO;
    [self.player stopPlayer];
    self.player.hidden = YES;
  }
  [self.session startRunning];

  if (!self.takeImageView.hidden) {
    self.takeImageView.hidden = YES;
  }
//  self.saveVideoUrl = nil;
  self.afreshCenterX.constant = 0;
  self.ensureCenterX.constant = 0;
  self.imgRecord.hidden = NO;
  self.btnCamera.hidden = NO;
  self.btnAfresh.hidden = YES;
  self.btnEnsure.hidden = YES;
  self.btnBack.hidden = NO;
  [UIView animateWithDuration:0.25 animations:^{
    [self.view layoutIfNeeded];
  }];
}


@end

Using it is simple:

- (IBAction)onCameraAction:(UIButton *)sender {
  // this demo uses a xib, so adapt it to your own needs; it is meant as a starting point,
  // not something to drop into a project unchanged
  HVideoViewController *ctrl = [[NSBundle mainBundle] loadNibNamed:@"HVideoViewController" owner:nil options:nil].lastObject;
  ctrl.HSeconds = 30; // maximum recording length in seconds
  ctrl.takeBlock = ^(id item) {
    if ([item isKindOfClass:[NSURL class]]) {
      NSURL *videoURL = item;
      // the video URL

    } else {
      // the photo (UIImage)

    }
  };
  [self presentViewController:ctrl animated:YES completion:nil];
}
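One caveat about the save path above: ALAssetsLibrary has been deprecated since iOS 9. If you target newer systems, the usual replacement is the Photos framework. The following is a minimal sketch of that swap, assuming the same `saveVideoUrl` flow as in `onEnsureAction:`; the method name `saveVideoToPhotosLibrary:` is my own, and error/UI handling is only illustrative:

```objc
#import <Photos/Photos.h>

// Hedged sketch: save the recorded movie file with PHPhotoLibrary
// instead of the deprecated ALAssetsLibrary.
- (void)saveVideoToPhotosLibrary:(NSURL *)fileURL {
  [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // queue a change request that creates a new video asset from the file
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:fileURL];
  } completionHandler:^(BOOL success, NSError *error) {
    // the handler is called on an arbitrary queue; hop to main for UI work
    dispatch_async(dispatch_get_main_queue(), ^{
      if (success) {
        NSLog(@"Video saved to the photo library.");
      } else {
        NSLog(@"Save failed: %@", error.localizedDescription);
      }
    });
  }];
}
```

The completion handler no longer hands you an asset URL the way ALAssetsLibrary did, so if the `takeBlock` callback needs a reference to the saved asset you would fetch it via `PHAsset` afterwards.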

The demo download is here: iosCamera_jb51.rar

That's it. It's all fairly simple, but I hope it helps. Thanks!

