Read Photo Library
On iOS, if we only need to read (or capture) one image or one video at a time, UIImagePickerController is enough. But often we need to read multiple photos or videos from the photo library at once, and then we have to find another way. Fortunately, Apple provides a corresponding interface: the AssetsLibrary framework.
Before we start coding, there are a few classes to know:
ALAssetsLibrary: represents the entire photo library. We can create an instance of it, which acts as a handle to the photo library.
ALAssetsGroup: a group (album) in the photo library. Through an ALAssetsLibrary instance we can get handles to all the groups.
ALAsset: an instance of ALAsset represents a single asset, that is, a photo or a video. Through it we can obtain the corresponding thumbnail, the original image, and so on.
Another thing you need to know is blocks. Apple introduced them in iOS 4.0 and has been using them in more and more of its APIs since, and they really are convenient to use. I won't say much about them here; I'll cover them in detail in a later post on this blog.
For the task in this article, reading the groups and reading each asset are both asynchronous, but thanks to blocks we can handle it all in a single method.
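For readers who have not met blocks yet, a very small illustration (the names here are made up for the example): a block is an anonymous function literal that can capture variables from its surrounding scope, which is why the asynchronous enumeration callbacks below can all live in one method.

```objc
// A block is an inline, anonymous function that can capture
// variables from the enclosing scope.
int base = 10;
int (^addToBase)(int) = ^(int value) {
    return base + value; // 'base' is captured by the block
};
NSLog(@"%d", addToBase(5)); // logs 15
```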
Let's look at the code below:
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init]; // A handle to the entire photo library
NSMutableArray *mediaArray = [[NSMutableArray alloc] init]; // Array for collecting the media
[assetsLibrary enumerateGroupsWithTypes:ALAssetsGroupAll usingBlock:^(ALAssetsGroup *group, BOOL *stop) { // Enumerate all groups
    // Note: the enumeration finishes with one final call in which group is nil
    [group enumerateAssetsUsingBlock:^(ALAsset *result, NSUInteger index, BOOL *stop) { // Enumerate the assets in this group
        NSString *assetType = [result valueForProperty:ALAssetPropertyType];
        if ([assetType isEqualToString:ALAssetTypePhoto]) {
            NSLog(@"Photo");
        } else if ([assetType isEqualToString:ALAssetTypeVideo]) {
            NSLog(@"Video");
        } else if ([assetType isEqualToString:ALAssetTypeUnknown]) {
            NSLog(@"Unknown AssetType");
        }
        NSDictionary *assetUrls = [result valueForProperty:ALAssetPropertyURLs];
        NSUInteger assetCounter = 0;
        for (NSString *assetURLKey in assetUrls) {
            NSLog(@"Asset URL %lu = %@", (unsigned long)assetCounter, [assetUrls objectForKey:assetURLKey]);
            assetCounter++; // advance the counter for each URL
        }
        NSLog(@"Representation Size = %lld", [[result defaultRepresentation] size]);
    }];
} failureBlock:^(NSError *error) {
    NSLog(@"Enumerate the asset groups failed.");
}];
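As mentioned earlier, an ALAsset can also give us its thumbnail or the original image. A minimal sketch of what that could look like inside the asset enumeration block above (converting to UIImage is just one common approach):

```objc
// Inside the asset enumeration block, given an ALAsset *result:
UIImage *thumbnail = [UIImage imageWithCGImage:[result thumbnail]];

ALAssetRepresentation *representation = [result defaultRepresentation];
UIImage *fullImage =
    [UIImage imageWithCGImage:[representation fullResolutionImage]
                        scale:[representation scale]
                  orientation:(UIImageOrientation)[representation orientation]];
```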
Save a Picture or Video to the Photo Library
For a video, the file is transferred into the photo library through the path of a temporary file.
Let's look at the corresponding APIs directly:
// These methods can be used to add photos or videos to the saved photos album.
// With a UIImage, the API user can use -[UIImage CGImage] to get a CGImageRef, and cast -[UIImage imageOrientation] to ALAssetOrientation.
- (void)writeImageToSavedPhotosAlbum:(CGImageRef)imageRef orientation:(ALAssetOrientation)orientation completionBlock:(ALAssetsLibraryWriteImageCompletionBlock)completionBlock;
// The API user will have to specify the orientation key in the metadata dictionary to preserve the orientation of the image
- (void)writeImageToSavedPhotosAlbum:(CGImageRef)imageRef metadata:(NSDictionary *)metadata completionBlock:(ALAssetsLibraryWriteImageCompletionBlock)completionBlock __OSX_AVAILABLE_STARTING(__MAC_NA,__IPHONE_4_1);
// If there is a conflict between the metadata in the image data and the metadata dictionary, the image data metadata values will be overwritten
- (void)writeImageDataToSavedPhotosAlbum:(NSData *)imageData metadata:(NSDictionary *)metadata completionBlock:(ALAssetsLibraryWriteImageCompletionBlock)completionBlock __OSX_AVAILABLE_STARTING(__MAC_NA,__IPHONE_4_1);
- (void)writeVideoAtPathToSavedPhotosAlbum:(NSURL *)videoPathURL completionBlock:(ALAssetsLibraryWriteVideoCompletionBlock)completionBlock;
The first three all store pictures. From the parameters we can see that the first one uses the orientation we pass in, while the second preserves the image's orientation through the metadata dictionary. The first two take the image as a CGImageRef before saving it; the third takes NSData, so the image information is preserved completely, and it also accepts metadata. If the metadata embedded in the image data conflicts with the metadata dictionary, the dictionary's values win and the image's own metadata is overwritten.
The last one is the API for storing videos. As you can see, the parameter is an NSURL, which is simply a file URL pointing at a local temporary file.
Choose the appropriate image-saving API according to your needs. For example, if what we have is a UIImage instance, the first or second one is more convenient; if we read image data from a local temporary file, it is easier to use the third one directly.
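For the NSData case, a sketch of the third API might look like this (imagePath is a made-up placeholder for wherever your local file lives, and passing nil metadata keeps whatever the image data itself carries):

```objc
NSData *imageData = [NSData dataWithContentsOfFile:imagePath]; // imagePath is hypothetical
ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
[assetsLibrary writeImageDataToSavedPhotosAlbum:imageData
                                       metadata:nil // or a metadata dictionary to override the embedded values
                                completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save image data fail:%@", error);
    } else {
        NSLog(@"Save image data succeed, URL:%@", assetURL);
    }
}];
```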
Here is a simple example:
- (void)saveImage:(UIImage *)image {
    ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
    [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage]
                                    orientation:(ALAssetOrientation)[image imageOrientation]
                                completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Save image fail:%@", error);
        } else {
            NSLog(@"Save image succeed.");
        }
    }];
}
Saving a video is a little more trouble. You need to write the video to a local file first, get the path of that temporary file, and then call the fourth API above to write it into the photo library.
Regarding writing temporary files, I have written an article about reading and writing files before; you can go and have a look.
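As a rough sketch of that temporary-file step (videoData and the file name are assumptions for illustration; see the earlier article for the details):

```objc
// Write in-memory video data to a temporary file, then save that file.
NSString *tmpPath =
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"temp.mov"];
NSError *writeError = nil;
BOOL ok = [videoData writeToFile:tmpPath
                         options:NSDataWritingAtomic
                           error:&writeError];
if (ok) {
    // tmpPath can now be handed to writeVideoAtPathToSavedPhotosAlbum:
}
```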
Here I present a demo that writes a video from the project's resource bundle into the photo library, so that you can import videos into the simulator, which is sometimes convenient for testing.
The main code is as follows:
- (void)save:(NSString *)urlString {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:urlString]
                                completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Save video fail:%@", error);
        } else {
            NSLog(@"Save video succeed.");
        }
    }];
}
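To import a bundled video as described, we can call save: with the resource's path; the resource name "demo.mov" here is a made-up placeholder for whatever video you add to the project:

```objc
NSString *path = [[NSBundle mainBundle] pathForResource:@"demo"
                                                 ofType:@"mov"];
if (path) {
    [self save:path];
}
```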