Tuesday, October 28, 2014

How to create Photo Editing Extension in Swift?


In the first Extension tutorial, you learned the basics of extensions and how they work.

In this tutorial, you will take a deep dive into the Photo Editing extension. This post shows how to create a Photo Editing extension using Swift.

A Photo Editing extension allows the user to edit photos and videos inside the Photos app. The user can open your extension from the Photos app, make some edits, and return the edited image to the Photos app. Just make sure the functionality your extension provides is well suited to photo editing.

Create a Photo Editing Extension-

Create a project with the “Single View Application” template, select Swift as the language, and set the project name to “PhotoExtensionSample”.




Add Target-

In the Xcode project navigator, select the project and add a new target. Choose Photo Editing Extension under Application Extension and name it “Photo Extension”. Also make sure that you have selected Swift as the language for both the project and the target.



This target comprises 3 files-
1) PhotoEditingViewController
2) MainInterface.storyboard
3) Info.plist (see the sketch below)
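
For reference, the extension's Info.plist declares the photo-editing extension point. A sketch of the NSExtension entry as generated by the Xcode 6 template (exact attribute values may differ by template version):

<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>PHSupportedMediaTypes</key>
        <array>
            <string>Image</string>
        </array>
    </dict>
    <key>NSExtensionMainStoryboard</key>
    <string>MainInterface</string>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.photo-editing</string>
</dict>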

PhotoEditingViewController is a subclass of UIViewController that adopts the PHContentEditingController protocol and includes its life-cycle methods. Your PhotoEditingViewController will include the following members-

1) func canHandleAdjustmentData(adjustmentData: PHAdjustmentData?) -> Bool
2) func startContentEditingWithInput(contentEditingInput: PHContentEditingInput?, placeholderImage: UIImage)
3) func finishContentEditingWithCompletionHandler(completionHandler: ((PHContentEditingOutput!) -> Void)!)
4) var shouldShowCancelConfirmation: Bool
5) func cancelContentEditing()



Run Extension-

Now, if you run your extension, Xcode gives you the option to choose a host app. Choose the Photos app. It will launch Photos, and you can select any photo to edit.



From the photo edit screen, tap the button just before the Done button. It will show all the Photo Editing extensions; choose your extension and it will launch. At this point your extension just shows Hello World. If you tap the cancel or done button, control returns to the host app.




Design User Interface-

Let's make some modifications to MainInterface.storyboard. Add an image view and a slider to the view with appropriate constraints.

Now define the outlet and action method for the slider in the PhotoEditingViewController class, along with the stored properties used in the rest of this post.
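
A minimal sketch of those declarations, using the names that appear in the rest of this post (the file also imports Photos and PhotosUI):

@IBOutlet var editingImageView: UIImageView?
@IBOutlet var slider: UISlider?

// Kept from startContentEditingWithInput for use during the edit session.
var input: PHContentEditingInput?
var beginImage: CIImage?
var filter: CIFilter!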


Implement Life Cycle Methods-

When control passes to the extension, it is the developer's responsibility to grab the image and allow the user to edit it. The system calls the startContentEditingWithInput method and provides the content for editing.

func startContentEditingWithInput(contentEditingInput: PHContentEditingInput?, placeholderImage: UIImage) {
    // Present content for editing, and keep the contentEditingInput for use when closing the edit session.
    // If you returned YES from canHandleAdjustmentData:, contentEditingInput has the original image and adjustment data.
    // If you returned NO, the contentEditingInput has past edits "baked in".
    input = contentEditingInput
    beginImage = CIImage(image: input?.displaySizeImage)
    filter = CIFilter(name: "CISepiaTone")
    filter.setValue(beginImage, forKey: kCIInputImageKey)
    filter.setValue(0.5, forKey: kCIInputIntensityKey)
    let outputImage : CIImage = filter.outputImage
    editingImageView?.image = UIImage(CIImage: outputImage)
}

PHContentEditingInput provides the content for editing. Once you get the image, you can perform some editing operations and pass the edited image back to the Photos app. In this method, I have created a sepia filter and assigned the input image with an intensity of 0.5. Changing the slider value changes this intensity and regenerates the output image.

@IBAction func sliderValueChanged(sender: UISlider) {
    let sliderValue = sender.value
    filter.setValue(sliderValue, forKey: kCIInputIntensityKey)
    let outputImage = filter.outputImage
    editingImageView?.image = UIImage(CIImage: outputImage)
}

When the user taps the Done button, the system calls the finishContentEditingWithCompletionHandler method. In this method, you create a PHContentEditingOutput, apply the user's changes to the full-size image, and call the completion block with the output.

func finishContentEditingWithCompletionHandler(completionHandler: ((PHContentEditingOutput!) -> Void)!) {
    // Update UI to reflect that editing has finished and output is being rendered.
    // Render and provide output on a background queue.
    dispatch_async(dispatch_get_global_queue(CLong(DISPATCH_QUEUE_PRIORITY_DEFAULT), 0)) {
        // Create editing output from the editing input.
        let output = PHContentEditingOutput(contentEditingInput: self.input)
        // Archive the slider value so the edit can be resumed later.
        let archivedData = NSKeyedArchiver.archivedDataWithRootObject(self.filter.valueForKey(kCIInputIntensityKey))
        let newAdjustmentData = PHAdjustmentData(formatIdentifier: self.formatIdentifier,
            formatVersion: self.formatVersion,
            data: archivedData)
        output.adjustmentData = newAdjustmentData
        // Apply the filter to the full-size image and write the JPEG data.
        let fullSizeImage = CIImage(contentsOfURL: self.input?.fullSizeImageURL)
        UIGraphicsBeginImageContext(fullSizeImage.extent().size)
        self.filter.setValue(fullSizeImage, forKey: kCIInputImageKey)
        UIImage(CIImage: self.filter.outputImage).drawInRect(fullSizeImage.extent())
        let outputImage = UIGraphicsGetImageFromCurrentImageContext()
        let jpegData = UIImageJPEGRepresentation(outputImage, 1.0)
        UIGraphicsEndImageContext()
        jpegData.writeToURL(output.renderedContentURL, atomically: true)
        // Call completion handler to commit edit to Photos.
        completionHandler?(output)
        // Clean up temporary files, etc.
    }
}

Here we also need to set the adjustmentData on the output. PHAdjustmentData is initialised with a formatIdentifier, a formatVersion, and data. The format identifier should be unique, so use your bundle identifier as the format identifier, and pass the slider value in archived form as the adjustment data.

When the user tries to edit an image that has previously been edited, the system calls canHandleAdjustmentData. In this method, you check whether your extension can handle the adjustment data and return the appropriate Bool value.

let formatIdentifier = "com.mindfire.PhotoExtensionSample"
let formatVersion    = "1.0"

func canHandleAdjustmentData(adjustmentData: PHAdjustmentData?) -> Bool {
    // Inspect the adjustmentData to determine whether your extension can work with past edits.
    // (Typically, you use its formatIdentifier and formatVersion properties to do this.)
    return adjustmentData?.formatIdentifier == formatIdentifier &&
        adjustmentData?.formatVersion == formatVersion
}

After this method, the system calls the startContentEditingWithInput method. Here you can resume editing if the image was previously edited by your extension. The contentEditingInput parameter has an adjustmentData property, which returns the adjustment data you set in the finishContentEditingWithCompletionHandler method.

func startContentEditingWithInput(contentEditingInput: PHContentEditingInput?, placeholderImage: UIImage) {
    // Present content for editing, and keep the contentEditingInput for use when closing the edit session.
    // If you returned YES from canHandleAdjustmentData:, contentEditingInput has the original image and adjustment data.
    // If you returned NO, the contentEditingInput has past edits "baked in".
    input = contentEditingInput
    beginImage = CIImage(image: input?.displaySizeImage)
    filter = CIFilter(name: "CISepiaTone")
    filter.setValue(beginImage, forKey: kCIInputImageKey)
    if let adjustmentData = contentEditingInput?.adjustmentData {
        // Resume a previous edit by restoring the archived intensity value.
        let value = NSKeyedUnarchiver.unarchiveObjectWithData(adjustmentData.data) as NSNumber
        filter.setValue(value, forKey: kCIInputIntensityKey)
        slider?.value = value.floatValue
    }
    else {
        filter.setValue(0.5, forKey: kCIInputIntensityKey)
    }
    let outputImage : CIImage = filter.outputImage
    editingImageView?.image = UIImage(CIImage: outputImage)
}

Now, run your extension and it will look like this-




Discard Changes-

If you tap the cancel button, it will directly discard all the changes. If you want confirmation from the user before the changes are discarded, return true from the shouldShowCancelConfirmation property, as sketched below.
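
A minimal sketch, assuming 0.5 is the default intensity set in startContentEditingWithInput, so confirmation is requested only when the user has actually moved the slider:

var shouldShowCancelConfirmation: Bool {
    // Ask for confirmation only when the intensity differs from the assumed default of 0.5.
    if let value = slider?.value {
        return value != 0.5
    }
    return false
}

func cancelContentEditing() {
    // Clean up temporary state here; Photos discards the edit.
}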




You can download the sample app from GitHub.



Friday, October 17, 2014

How to initialise View controller in Swift?


Initialising a view controller in Swift is different from Objective-C: in Swift, we have to implement the required designated initialiser methods, and they do not return any value.

If you create a subclass of UIViewController using Swift, your file will look like this-

import UIKit

class DetailViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }

    /*
    // MARK: - Navigation

    // In a storyboard-based application, you will often want to do a little preparation before navigation
    override func prepareForSegue(segue: UIStoryboardSegue!, sender: AnyObject!) {
        // Get the new view controller using segue.destinationViewController.
        // Pass the selected object to the new view controller.
    }
    */

}

You have to implement init(coder aDecoder: NSCoder!) and init(nibName nibNameOrNil: String?, bundle nibBundleOrNil: NSBundle?) to initialise the view controller manually.

After adding the initialiser methods, your ViewController class will look like this-

class DetailViewController: UIViewController {

    required init(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override init(nibName nibNameOrNil: String?, bundle nibBundleOrNil: NSBundle?) {
        super.init(nibName: nibNameOrNil, bundle: nibBundleOrNil)
        // Here you can init your properties
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}

If you do not add these methods to your view controller file, the app will crash with the error “fatal error: use of unimplemented initializer”.

This error occurs because Swift's initialising behaviour differs from Objective-C's. In Objective-C the superclass initialisers are inherited automatically, but in Swift, once you define your own designated initialiser, you must also implement the required initialiser manually.

Also, a Swift initialiser does not return a value, unlike an Objective-C initialiser (which returns self).

Note-
1) If you are using a storyboard, add init(coder aDecoder: NSCoder!) to your view controller class.
2) If you are initialising the view controller programmatically, add init(nibName nibNameOrNil: String?, bundle nibBundleOrNil: NSBundle?) to your view controller class, as in the usage sketch below.
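
For example, creating the view controller programmatically exercises the second initialiser (a sketch; the nil arguments simply mean "no nib"):

// Goes through init(nibName:bundle:), which we implemented above.
let detailViewController = DetailViewController(nibName: nil, bundle: nil)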


Thursday, October 16, 2014

Create custom view using IBDesignable & IBInspectable in Xcode 6


With Xcode 6, Apple has added two new attributes for Interface Builder: IBDesignable and IBInspectable. They allow custom views to be configured and rendered directly in Interface Builder.

IBDesignable tells Interface Builder that it can load and render the view. The view must live in a framework for live rendering to work.

IBInspectable exposes properties declared in the custom view class in Interface Builder's inspector.

You can use these two attributes to expand the functionality of your custom view and preview it directly in IB.

In this post we will create a custom view that renders in Interface Builder.

1) Open Xcode 6 and create a single view application, selecting Swift as the language.

2) Add a new target to the project, select Framework & Library, choose Cocoa Touch Framework, and name it "CustomView".





3) Create a new Swift file "CustomView", a subclass of UIView, and add it to the CustomView framework. Your Swift file will look like this-

import UIKit

class CustomView: UIView {

}

4) Now prefix your class with the @IBDesignable keyword to inform Interface Builder that the class renders itself.

@IBDesignable class CustomView: UIView {

}

5) Add the properties that you want to expose in the Attributes inspector of Interface Builder. Here I have added 3 properties prefixed with @IBInspectable and implemented didSet observers to update the UI. This allows Interface Builder to read and write the values of these properties in the inspector.

@IBDesignable class CustomView: UIView {

    @IBInspectable var borderColor: UIColor = UIColor.clearColor() {
        didSet {
            layer.borderColor = borderColor.CGColor
        }
    }

    @IBInspectable var borderWidth: CGFloat = 0 {
        didSet {
            layer.borderWidth = borderWidth
        }
    }

    @IBInspectable var cornerRadius: CGFloat = 0 {
        didSet {
            layer.cornerRadius = cornerRadius
        }
    }
}

6) Now open the storyboard, add a view, and change its class to CustomView.



7) Open the Attributes inspector; it will show all the properties. You can change their values and see the results immediately.
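
The same properties also work in code, since the didSet observers run on every assignment. A quick usage sketch (the variable name badge is just for illustration):

let badge = CustomView(frame: CGRectMake(0, 0, 100, 100))
badge.borderColor = UIColor.redColor()   // didSet updates layer.borderColor
badge.borderWidth = 2
badge.cornerRadius = 8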



You can download the sample app from GitHub.


Before Xcode 6, we did all of this programmatically. Now, using these two keywords, we can customise a view easily, reduce code, and make the custom view more reusable.


Note- In Objective-C, you can use IB_DESIGNABLE and IBInspectable to render custom views in Interface Builder.



Wednesday, October 15, 2014

Introduction to Extension



Extensions are a new feature of iOS 8 and OS X Yosemite (v10.10) that allow third-party apps to communicate with other apps. They let developers extend an app's functionality and content beyond the application itself and make it available to users while they are using other applications.

Some examples of Extensions are-

1) Show the app as a widget on the Today screen.
2) Use photo filters within the iOS Photos app.
3) Provide a custom keyboard that users can use in place of the iOS system keyboard.

An extension is different from an app: it is designed for a specific task and ends itself when that task is completed. Your application acts as a container for your extensions.

Note- The memory limit for a running app extension is lower than the memory limit for a foreground app.


Types of Extension-

There are 7 types of extension, each of which is tied to an area of the system known as an extension point. Each extension point defines a usage policy and provides the APIs developers use when they create an extension for that area. Since each extension point is associated with usage policies and specific APIs, developers have to choose the appropriate extension point for the type of functionality they want to provide. Of the 7 extension points, 6 are available in iOS 8 and 4 in OS X Yosemite.

1) Today- A Today extension, also called a widget, shows a quick update or performs a quick task in the Today view of Notification Centre. It is available in both iOS and OS X.

2) Share- If you want to share images or content, you can use a Share extension to post the content to a website. It is available in both iOS and OS X.

3) Action- This extension lets the user manipulate or view content within the context of another app. It is available in iOS and OS X.

4) Photo Editing- This extension provides the ability to edit a photo or video within the Photos app. It is available in iOS only.

5) Finder Sync- This extension modifies the behaviour of the Finder. Unlike the other extension points, it does not add features to other apps. It is available in OS X only.

6) Document Provider- This extension provides access to, and manages, a repository of files. It is available in iOS only.

7) Custom Keyboard- This extension replaces the iOS system keyboard with a custom keyboard. It is available in iOS only.


Extension’s life cycle-

An extension's life cycle and environment are different from an app's. The system instantiates the extension when a host app requests it. The host app defines the context provided to the extension and starts the extension's life cycle when it sends a request in response to a user action. The extension terminates soon after it completes the request it received from the host app.




eg- Suppose the user finds some content in Safari to share and taps the share button; Safari then shows all the available extensions. Here, Safari is the host app, and by choosing an extension the user makes Safari request that extension to share the selected content. When the task is completed, the extension sends its response back to the host app.


How an App extension communicates-

The host app communicates with the extension: it sends a request, and after completing the request the extension sends the response back to the host app.



There is no direct communication between an app extension and its containing app, and the host app does not communicate with the extension's containing app either. However, indirect communication is available through well-defined APIs: an extension can open its containing app using the openURL method, and both can access shared resources using NSUserDefaults, as sketched below.
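
A minimal sketch of the NSUserDefaults side, assuming an App Group has been enabled for both the containing app and the extension (the identifier "group.com.example.shared" is a placeholder):

// In the extension: write a value into the shared container.
let shared = NSUserDefaults(suiteName: "group.com.example.shared")
shared?.setObject("sepia", forKey: "lastUsedFilter")
shared?.synchronize()

// In the containing app: read it back.
let lastFilter = shared?.stringForKey("lastUsedFilter")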





How to create Extension-

You can create an app extension by adding a new target to the app. You can add multiple extensions to a single application.


How to deliver Extension to the users-

To deliver an extension to users, you submit the application containing the extension to the App Store. When a user installs the app, the extension is automatically installed on the device.


Limitations-

1) The HealthKit and EventKit UI frameworks are not available.
2) Extensions cannot perform long-running background tasks.
3) Extensions cannot access the device's cameras or microphones (although they may access existing media files).
4) Extensions cannot receive AirDrop data (although they can transmit data via AirDrop).
5) Extensions cannot access the sharedApplication object, so they cannot use any of its methods.


Note- APIs that are not available to extensions are marked with the unavailability macro "NS_EXTENSION_UNAVAILABLE".


Monday, October 13, 2014

How to add overlay to Video?



Sometimes you want to add an overlay to a captured video, for example a copyright label or a watermark image.

You can use the AVFoundation framework, which provides the ability to edit captured video. Using this framework, we can add a watermark image and text over the captured video.

You can use the method below by passing the image and the video URL. Pass the image you want to add as an overlay to the captured video. The method exports a new video with the watermark image drawn over it.


- (void)createWatermark:(UIImage *)image video:(NSURL *)videoURL
{
    if (videoURL == nil)
        return;

    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:clipVideoTrack
                                    atTime:kCMTimeZero error:nil];
    [compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];

    // Create the layer with the watermark image.
    CALayer *aLayer = [CALayer layer];
    aLayer.contents = (id)image.CGImage;
    aLayer.frame = CGRectMake(50, 100, image.size.width, image.size.height);
    aLayer.opacity = 0.9;

    // Sort the layers in the proper order.
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize videoSize = [videoTrack naturalSize];
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:aLayer];

    // Create the text layer for the copyright label.
    CATextLayer *titleLayer = [CATextLayer layer];
    titleLayer.backgroundColor = [UIColor clearColor].CGColor;
    titleLayer.string = @"Dummy text";
    titleLayer.font = CFBridgingRetain(@"Helvetica");
    titleLayer.fontSize = 28;
    titleLayer.shadowOpacity = 0.5;
    titleLayer.alignmentMode = kCAAlignmentCenter;
    titleLayer.frame = CGRectMake(0, 50, videoSize.width, videoSize.height / 6);
    [parentLayer addSublayer:titleLayer];

    // Create the video composition and add the instructions to insert the layers.
    AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    // Instruction covering the whole composition.
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVAssetTrack *mixVideoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mixVideoTrack];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    videoComp.instructions = [NSArray arrayWithObject:instruction];

    // Export the video.
    _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.videoComposition = videoComp;
    NSLog(@"created exporter. supportedFileTypes: %@", _assetExport.supportedFileTypes);

    NSString *videoName = @"NewWatermarkedVideo.mov";
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (_assetExport.status)
        {
            case AVAssetExportSessionStatusUnknown:
                NSLog(@"Unknown");
                break;
            case AVAssetExportSessionStatusWaiting:
                NSLog(@"Waiting");
                break;
            case AVAssetExportSessionStatusExporting:
                NSLog(@"Exporting");
                break;
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Created new watermarked video");
                // UI updates must happen on the main queue.
                dispatch_async(dispatch_get_main_queue(), ^{
                    _playButton.hidden = NO;
                });
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed- %@", _assetExport.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Cancelled");
                break;
        }
    }];
}

You can download the sample app from GitHub.