25

The following code also loads images that are located in iCloud or in Photo Stream. How can we limit the search to images in the camera roll only?

var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil)
William Falcon

8 Answers

21

After adding the Camera Roll and Photo Stream albums, Apple introduced the following PHAssetCollectionSubtype types in iOS 8.1:

  1. PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) - fetches the Photo Stream album.

  2. PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) - fetches the Camera Roll album.

Haven't tested if this is backward-compatible with iOS 8.0.x though.
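
For reference, a minimal Swift sketch of that fetch (modern Swift API names; it assumes photo library authorization has already been granted):

import Photos

// Fetch the "Camera Roll" smart album, then the image assets it contains.
let collections = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                          subtype: .smartAlbumUserLibrary,
                                                          options: nil)
if let cameraRoll = collections.firstObject {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
    let assets = PHAsset.fetchAssets(in: cameraRoll, options: options)
    print("Camera Roll contains \(assets.count) image assets")
}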

StatusReport
  • Hmm.. `PHAssetCollectionSubtypeSmartAlbumUserLibrary` lists 14,850 items on my iPad. That's the whole Photos collection, not what's on my iPad... – Grimxn Apr 29 '16 at 13:55
  • If you have iCloud Photo Library enabled, it will list all the assets that are on the device plus the ones that are available via the iCloud Photo Library. AFAIK there's no (documented) way to know which of the assets are stored locally on the device and which are available via the cloud and not synchronized yet. – StatusReport Apr 30 '16 at 09:01
  • Indeed, that will work, but I don't think you'll want to iterate over all your assets and request them in order to figure out whether they're local to the device or not. – StatusReport Apr 30 '16 at 13:58
  • I get the same result when using iCloud Photo Library. It returns all images in the "all photos" collection regardless of whether they are actually on the device or not. It's possible to check whether an image is on the device using requestContentEditingInputWithOptions or requestImageDataForAsset, but this is asynchronous and seems crazy to do for every asset in the list. Why don't they just have a way to specify only local assets in fetchAssetsInAssetCollection? – Tom Kincaid Jul 28 '17 at 13:26
  • I'm not aware of a public API that reveals that - they seem to try to keep it transparent to the client. If you're willing to use [undocumented APIs](https://github.com/nst/iOS-Runtime-Headers/blob/master/Frameworks/Photos.framework/PHAsset.h) there are plenty of properties that may be `nil` if the asset is not downloaded. – StatusReport Jul 30 '17 at 06:30
5

Through some experimentation, we discovered a hidden property not listed in the documentation (assetSource). Basically, you do a regular fetch request and then use a predicate to filter for the ones from the camera roll. The value should be 3 for camera-roll assets.

Sample code:

// fetch all assets, then sub-fetch only the range we need
let fetchOptions = PHFetchOptions()
var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)

// copy the fetch result into a mutable array so it can be filtered
var results = NSMutableArray()
assets.enumerateObjectsUsingBlock { (obj, idx, stop) -> Void in
    results.addObject(obj)
}

// assetSource == 3 corresponds to camera-roll assets (undocumented)
var cameraRollAssets = results.filteredArrayUsingPredicate(NSPredicate(format: "assetSource == %@", argumentArray: [3]))
results = NSMutableArray(array: cameraRollAssets)
William Falcon
  • This appears to be the most stable solution at the moment. The only other "solution" I could come up with was to call PHAsset.requestContentEditingInputWithOptions, examine the PHContentEditingInput object returned for each asset to obtain its URL, and then filter out all whose URLs don't contain "/DCIM/". But that is extremely slow and brittle compared to this. – cwilper Sep 24 '14 at 11:36
  • This appears to have been "fixed", and now raises an exception if you try to refer to `assetSource` in the GM. – Rizwan Sattar Sep 25 '14 at 23:55
  • @RizwanSattar the API has been hidden but the property is still available in the current 8.0.2 build. The problem is that it's now private, undocumented API - not worth trusting it. – Oliver Atkinson Sep 26 '14 at 10:25
  • For anyone working in Objective-C, `assets` would be a `PHFetchResult` object and `obj` is an object of type `PHAsset`. – CalZone Nov 03 '15 at 18:48
  • I get "Unsupported predicate in fetch options: assetSource" – Tom Kincaid Jul 28 '17 at 13:41
5

If, like me, you were searching for Objective-C code for the new Photos framework and only kept finding deprecated AssetsLibrary code, this will help you.

Swift

  func getAllPhotosFromCameraRoll() -> [UIImage] {
    // TODO: Add `NSPhotoLibraryUsageDescription` to info.plist
    PHPhotoLibrary.requestAuthorization { print($0) } // TODO: Move this line of code to somewhere before attempting to access photos

    var images = [UIImage]()

    let requestOptions: PHImageRequestOptions = PHImageRequestOptions()
    requestOptions.resizeMode = .exact
    requestOptions.deliveryMode = .highQualityFormat
    requestOptions.isSynchronous = true

    let fetchResult: PHFetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    let manager: PHImageManager = PHImageManager.default()

    for i in 0..<fetchResult.count {
      let asset = fetchResult.object(at: i)
      manager.requestImage(
        for: asset,
        targetSize: PHImageManagerMaximumSize,
        contentMode: .default,
        options: requestOptions,
        resultHandler: { (image: UIImage?, info: [AnyHashable: Any]?) -> Void in
          if let image = image {
            images.append(image)
          }
        })
    }

    return images
  }

Objective-C

Global Variables:

NSArray *imageArray;
NSMutableArray *mutableArray;

The method below will help you:

-(void)getAllPhotosFromCamera
{
    imageArray=[[NSArray alloc] init];
    mutableArray =[[NSMutableArray alloc]init];

    PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
    requestOptions.resizeMode   = PHImageRequestOptionsResizeModeExact;
    requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
    requestOptions.synchronous = true;
    PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];

    NSLog(@"%d",(int)result.count);

    PHImageManager *manager = [PHImageManager defaultManager];
    NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];

    // assets contains PHAsset objects.

    __block UIImage *ima;
    for (PHAsset *asset in result) {
        // Do something with the asset

        [manager requestImageForAsset:asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeDefault
                              options:requestOptions
                        resultHandler:^void(UIImage *image, NSDictionary *info) {
                            ima = image;

                            [images addObject:ima];
                        }];


    }

    imageArray = [images copy];  // You can also use the NSMutableArray `images` directly
}
karan
4

If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to specify that the image must be local.

networkAccessAllowed Property

A Boolean value that specifies whether Photos can download the requested image from iCloud.

networkAccessAllowed

Discussion

If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download’s progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler’s info dictionary indicates that the image is not available unless you enable network access.
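
For reference, a minimal Swift sketch of that approach. Only networkAccessAllowed comes from this answer; the deliveryMode and isSynchronous settings are additions here to keep the result handler to a single call:

import Photos
import UIKit

// Request an image with iCloud access disabled; if nothing comes back and
// PHImageResultIsInCloudKey is set, the original is only in iCloud.
func requestLocalImage(for asset: PHAsset) -> UIImage? {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = false      // never download from iCloud
    options.deliveryMode = .highQualityFormat   // no degraded interim results
    options.isSynchronous = true                // handler runs before return

    var result: UIImage?
    PHCachingImageManager().requestImage(for: asset,
                                         targetSize: PHImageManagerMaximumSize,
                                         contentMode: .default,
                                         options: options) { image, info in
        let inCloud = (info?[PHImageResultIsInCloudKey] as? NSNumber)?.boolValue ?? false
        if inCloud {
            result = nil      // not available on the device
        } else {
            result = image    // a local representation was available
        }
    }
    return result
}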

Grimxn
4

This can help. You can use your own data model instead of the AlbumModel I used.

func getCameraRoll() -> AlbumModel {
    var cameraRollAlbum: AlbumModel!

    // The Camera Roll is the "user library" smart album
    let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)

    cameraRoll.enumerateObjects({ (collection, _, _) in
        let fetchOptions = PHFetchOptions()
        fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)

        if assets.count > 0 {
            cameraRollAlbum = AlbumModel(name: collection.localizedTitle!, count: assets.count, collection: collection, assets: assets)
        }
    })
    return cameraRollAlbum
}
jamesthakid
0

Here is an Objective-C version based on Apple's sample code.

-(NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array{
    PHFetchResult *fetchResult = array[1];
    int index = 0;
    unsigned long pictures = 0;
    for(int i = 0; i < fetchResult.count; i++){
        unsigned long temp = 0;
        temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
        if(temp > pictures ){
            pictures = temp;
            index = i;
        }
    }
    PHCollection *collection = fetchResult[index];

    if (![collection isKindOfClass:[PHAssetCollection class]]) {
        // return;
    }
    // Configure the AAPLAssetGridViewController with the asset collection.
    PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
    PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
    self.assetsFetchResults = assetsFetchResult;
    self.assetCollection = assetCollection;
    self.numberOfPhotoArray = [NSMutableArray array];
    for (int i = 0; i<[assetsFetchResult count]; i++) {
        PHAsset *asset = assetsFetchResult[i];
        [self.numberOfPhotoArray addObject:asset];
    }
    NSLog(@"%lu",(unsigned long)[self.numberOfPhotoArray count]);
    return self.numberOfPhotoArray;
}

Where you can grab the following details:

PHFetchResult *fetchResult = self.sectionFetchResults[1];
PHCollection *collection = fetchResult[6];

  • value 1,6 – camera images
  • value 1,0 – screenshots
  • value 1,1 – hidden
  • value 1,2 – selfies
  • value 1,3 – recently added
  • value 1,4 – videos
  • value 1,5 – recently deleted
  • value 1,7 – favorites

Apple demo link

Declare your properties:

@property (nonatomic, strong) NSArray *sectionFetchResults;
@property (nonatomic, strong) PHFetchResult *assetsFetchResults;
@property (nonatomic, strong) PHAssetCollection *assetCollection;
@property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
karthikeyan
  • It may help Objective-C developers. Who knows whether an Objective-C developer will come across this link? – karthikeyan Jun 29 '16 at 07:24
  • The demo doesn't explicitly refer to each of the collections by index (eg 6 for camera images) as you have; what if they change the order in which they're returned? – Dave Nottage Jul 31 '16 at 06:53
  • It appears the updated answer just returns the count of the collection with the most assets. What if there's a collection that has more than the camera roll? – Dave Nottage Jul 31 '16 at 08:39
0

I've been banging my head over this too. I've found no way to filter for only on-device assets with fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I'm able to use requestContentEditingInputWithOptions or requestImageDataForAsset to determine whether an asset is on the device or not, but this is asynchronous and uses far too many resources to do for every asset in the list. There must be a better way.

PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];

for (int i=0; i<[fetchResult count]; i++) {

    PHAsset *asset = fetchResult[i];

    [asset requestContentEditingInputWithOptions:nil
                               completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
                                   if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
                                       NSLog(@"asset is in cloud");
                                   } else {
                                       NSLog(@"asset is on device");
                                   }
                               }];

}
Tom Kincaid
-1

If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. This only returns true if the full original is available on the device.

Admittedly this is also fragile, but testing shows it works for all of the assetSource types (photostream, iTunes sync, etc).
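
In Swift, the same check is a one-liner; a minimal sketch (the Objective-C selector maps to canPerform(_:)):

import Photos

// Heuristic from this answer: if the asset's content can be edited,
// the full original should be available on the device.
func hasFullOriginalOnDevice(_ asset: PHAsset) -> Bool {
    return asset.canPerform(.content)
}

As the comment below notes, this heuristic may not hold when iCloud Photo Library is enabled.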

scosman
  • Doesn't work for me when using iCloud photo library. It returns true whether or not the image is actually on the device. – Tom Kincaid Jul 28 '17 at 14:14