Welcome to the linkAR technical documentation.


There are some cases in which it is necessary to load the camera outside of the library. This section explains the main steps you need to follow to achieve it.

1- Notable Files


1.1- ViewController
Initializes the library and adds images into it. It also initializes the own camera (the Capture Manager), creates POIs, and adds them to the library.

1.2- MyCameraCaptureSession.m
Initializes the AVCaptureSession and adds the video input.

2- Own Camera

In the ViewController.m we have:

- (void)viewDidLoad {
    [super viewDidLoad];
    // 1- Init ARBrowser viewController
    _arlibCtrl = [[ARglLibController alloc] initWithAppKey:@"D653qNqWnTJR+/KU94gw4HaAlypBkWtL5NXuui+hRg==" useDefaultCamera:FALSE];
}

1- Initializes the ARBrowser. Notice that we set useDefaultCamera:FALSE.

The API_KEY is only valid for the sample package: arlab.HelloARBrowserOwnCamera.

In viewDidAppear: it is necessary to add the library view:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self.view addSubview:_arlibCtrl.view];
}

In the ARBrowser 2D iOS Programming Guide, you can see how to add the frameworks, initialize the ARBrowser, add POIs…

2.1.- Initializing the own camera:

In ViewController.h we have:

#import "MyCameraCaptureSession.h"

@property (nonatomic, strong) MyCameraCaptureSession *captureManager;

In ViewController.m we have:

- (void)viewDidLoad {
    [super viewDidLoad];
    // 1- Initialize Capture Manager.
    [self setCaptureManager:[[MyCameraCaptureSession alloc] init]];
    [[self captureManager] addVideoInput];
    [[self captureManager] addVideoPreviewLayer];
    // 2- Add Video Preview Layer and set the frame
    CGRect layerRect = [[[self view] layer] bounds];
    [[[self captureManager] previewLayer] setBounds:layerRect];
    [[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                                  CGRectGetMidY(layerRect))];
    [[[self view] layer] addSublayer:[[self captureManager] previewLayer]];
    // 3- Start CaptureSession
    [[[self captureManager] captureSession] startRunning];
}

1- Initializes the Capture Manager.
2- Adds the Video Preview Layer and sets the frame.
3- Starts the CaptureSession.
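Note that on iOS 7 and later the system requires the user's authorization before the camera delivers frames. The sample does not cover this, so the following is only an illustrative sketch (not part of the original sample) using the standard AVFoundation authorization API, which you would run before starting the capture session:

```objc
// Request camera access before starting the capture session (iOS 7+).
// Illustrative addition; not part of the original sample code.
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                         completionHandler:^(BOOL granted) {
    if (granted) {
        // Safe to start the session; hop back to the main queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            [[[self captureManager] captureSession] startRunning];
        });
    } else {
        NSLog(@"Camera access denied by the user");
    }
}];
```

If access is denied, the preview layer will simply stay black, so it is worth logging or surfacing the denial to the user.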

2.2.- File to manage the camera:

In MyCameraCaptureSession.h we have:

#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
@interface MyCameraCaptureSession : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureVideoPreviewLayer *previewLayer;
@property (strong, nonatomic) AVCaptureSession *captureSession;

- (void)addVideoInput;
- (void)addVideoPreviewLayer;

@end

In MyCameraCaptureSession.m we have:

// Initializes AVCaptureSession.
- (id)init {
	if ((self = [super init])) {
		[self setCaptureSession:[[AVCaptureSession alloc] init]];
	}
	return self;
}

// Add preview layer.
- (void)addVideoPreviewLayer {
	[self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]]];
	[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

// Add video as device input.
- (void)addVideoInput {
	AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	if (videoDevice) {
		NSError *error;
		AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
		if (!error) {
			if ([[self captureSession] canAddInput:videoIn]) {
				[[self captureSession] addInput:videoIn];
			} else {
				NSLog(@"Couldn't add video input");
			}
		} else {
			NSLog(@"Couldn't create video input");
		}
	} else {
		NSLog(@"Couldn't create video capture device");
	}
}
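The MyCameraCaptureSession header declares conformance to AVCaptureVideoDataOutputSampleBufferDelegate, but the sample does not show the corresponding output wiring. As an illustrative sketch only (the method name, queue label, and frame handling are our own assumptions, not part of the original sample), attaching an AVCaptureVideoDataOutput and receiving frames could look like this; how the frames are then handed to the AR library depends on the linkAR API and is not shown here:

```objc
// Add a video data output so this object receives camera frames.
// Illustrative sketch; addVideoDataOutput and the queue name are hypothetical.
- (void)addVideoDataOutput {
	AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
	// Drop frames that arrive while the delegate is still busy.
	[videoOut setAlwaysDiscardsLateVideoFrames:YES];
	dispatch_queue_t queue = dispatch_queue_create("com.example.videoQueue", NULL);
	[videoOut setSampleBufferDelegate:self queue:queue];
	if ([[self captureSession] canAddOutput:videoOut]) {
		[[self captureSession] addOutput:videoOut];
	}
}

// Delegate callback: invoked once per captured frame on the queue above.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
	   fromConnection:(AVCaptureConnection *)connection {
	// Process or forward the frame here (e.g. to the AR library).
}
```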