Changing camera and device orientation
Problem:
I have an app that is basically a single view; when the user taps a button, the camera preview starts.
I want to allow all orientations while the camera is not showing, but while the camera is active I need to force the app into portrait mode, because otherwise the video appears rotated. Once the camera is dismissed, the app should be able to rotate again.
Do you know how I can fix the video orientation?
Or, how can I force the app into portrait mode? I know that in earlier iOS versions you could use [UIDevice setOrientation:], but it has been deprecated in recent iOS releases.
How can I do this for both iOS 5 and iOS 6?
What I have already tried:
[self presentViewController:[UIViewController new] animated:NO
                 completion:^{ [self dismissViewControllerAnimated:NO completion:nil]; }];
and, in the shouldAutorotateToInterfaceOrientation: method:
if (state == CAMERA) {
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
} else {
    return YES;
}
This works and forces the app into portrait. But once the camera is dismissed it no longer rotates properly.
What I mean is, after the camera is dismissed, this is what happens:
- The app is in portrait.
- If I try to rotate it, the device rotates but the app does not. I can see the iPad status bar (time, battery, etc.) rotate, but the app stays put.
- If I go back to portrait and then rotate the device again, it works fine.
Do you know what the problem might be?
Thanks in advance.
Answer
I think I found a solution to the problem of changing the camera and the device orientation at the same time.
I put this code where I initialize the camera, and also in the shouldAutorotateToInterfaceOrientation: method, where I allow all orientations:
AVCaptureVideoOrientation newOrientation;
// Note: statusBarOrientation is a UIInterfaceOrientation, not a UIDeviceOrientation.
UIInterfaceOrientation deviceOrientation = [UIApplication sharedApplication].statusBarOrientation;
NSLog(@"deviceOrientation %ld", (long)deviceOrientation);
switch (deviceOrientation)
{
    case UIInterfaceOrientationPortrait:
        NSLog(@"UIInterfaceOrientationPortrait");
        newOrientation = AVCaptureVideoOrientationPortrait;
        break;
    case UIInterfaceOrientationLandscapeRight:
        NSLog(@"UIInterfaceOrientationLandscapeRight");
        newOrientation = AVCaptureVideoOrientationLandscapeRight;
        break;
    case UIInterfaceOrientationLandscapeLeft:
        NSLog(@"UIInterfaceOrientationLandscapeLeft");
        newOrientation = AVCaptureVideoOrientationLandscapeLeft;
        break;
    default:
        NSLog(@"default");
        newOrientation = AVCaptureVideoOrientationPortrait;
        break;
}
// iOS 6+: the orientation is set on the preview layer's connection.
if ([self.prevLayer respondsToSelector:@selector(connection)]) {
    if ([self.prevLayer.connection isVideoOrientationSupported]) {
        self.prevLayer.connection.videoOrientation = newOrientation;
    } else {
        NSLog(@"video orientation is not supported on this connection");
    }
} else {
    // iOS 5: the orientation is set on the preview layer itself (deprecated in iOS 6).
    if ([self.prevLayer isOrientationSupported]) {
        self.prevLayer.orientation = newOrientation;
    } else {
        NSLog(@"orientation is not supported on this preview layer");
    }
}
Answer
Try the following code for the orientation; I think it may solve your problem.
- (void)encodeVideoOrientation:(NSURL *)anOutputFileURL
{
    CGAffineTransform rotationTransform;
    CGAffineTransform rotateTranslate;
    CGSize renderSize;
    switch (self.recordingOrientation)
    {
        // set these 3 values based on orientation
    }
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:anOutputFileURL options:nil];
    AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:sourceVideoTrack
                                    atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:sourceAudioTrack
                                    atTime:kCMTimeZero error:nil];
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
    [layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.renderScale = 1.0;
    videoComposition.renderSize = renderSize;
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    videoComposition.instructions = [NSArray arrayWithObject:instruction];
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                         presetName:AVAssetExportPresetMediumQuality];
    NSString *videoName = @"export.mov";
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }
    assetExport.outputFileType = AVFileTypeMPEG4;
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;
    assetExport.videoComposition = videoComposition;
    [assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         switch (assetExport.status)
         {
             case AVAssetExportSessionStatusCompleted:
                 // export complete
                 NSLog(@"Export Complete");
                 break;
             case AVAssetExportSessionStatusFailed:
                 // export error (see assetExport.error)
                 NSLog(@"Export Failed");
                 NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                 break;
             case AVAssetExportSessionStatusCancelled:
                 // export cancelled
                 NSLog(@"Export Cancelled");
                 break;
         }
     }];
}
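The switch at the top of encodeVideoOrientation: is left empty in the answer. As an illustration only (these exact values are my assumption for a 1920x1080 landscape source track, and are not part of the original answer), the portrait case could look like this:

```objc
case UIInterfaceOrientationPortrait:
    // Rotate the frame 90 degrees, then translate so it lands back at the origin:
    // a source point (x, y) maps to (1080 - y, x), i.e. into a 1080x1920 frame.
    rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
    rotateTranslate   = CGAffineTransformTranslate(rotationTransform, 0, -1080);
    renderSize        = CGSizeMake(1080, 1920); // width and height swapped
    break;
```

Each remaining case would set the same three variables for its own orientation.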
Hope this helps you.
Thank you very much for the quick answer. Sorry, I am a newbie to iOS programming and I don't understand how to use your code. Where do I have to put it? What does it do? Thanks again. – 2013-05-09 12:15:17