feat(mobile): add playbackStyle to native sync API #26541

Merged
mertalev merged 9 commits into immich-app:main from LeLunZ:feature/sync-playback-style
Feb 28, 2026
Conversation

@LeLunZ (Collaborator) commented Feb 26, 2026

As requested, smaller PRs for #26148

The first step, which is easy to decouple from all the other changes, is the native sync of the playbackStyle.
After this PR I will continue with:

  • PR2: Persist playbackStyle in local asset database
  • PR3: And finally Animated image rendering support :D

Adds a playbackStyle field to PlatformAsset in the pigeon sync API so native platforms can communicate the asset's playback style (image, video, animated, livePhoto) to Flutter during sync.

What's important to know:

  • On iOS there is now a playbackStyle value on PHAsset that we can use, but on Android there is no such playback-style value, so we have to "guess" when the playbackStyle is animated (unless you have other ideas). Currently this happens via the MIME type "image/gif".
  • iOS also has another playback style called videoLooping. I wasn't sure whether we also want to report that back, so for now I didn't include it and it just reports the playback style video instead.
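The Android-side "guess" described above could be sketched as a small pure helper. This is a hypothetical illustration (the enum and function names are not from the PR; the shipped code keys off the MediaStore query and may differ in detail):

```kotlin
// Hypothetical sketch: without a platform playback-style value on Android,
// infer the style from the MediaStore MIME type. "image/gif" is the only
// MIME type currently treated as animated.
enum class GuessedPlaybackStyle { IMAGE, VIDEO, ANIMATED }

fun guessPlaybackStyle(mimeType: String?): GuessedPlaybackStyle {
    if (mimeType == null) return GuessedPlaybackStyle.IMAGE // default when unknown
    return when {
        mimeType == "image/gif" -> GuessedPlaybackStyle.ANIMATED
        mimeType.startsWith("video/") -> GuessedPlaybackStyle.VIDEO
        else -> GuessedPlaybackStyle.IMAGE
    }
}
```

As the bullet notes, this is purely a heuristic: an animated WebP, for example, would be misclassified as a static image by a MIME-only check.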

Adds a `playbackStyle` field to `PlatformAsset` in the pigeon sync API so
native platforms can communicate the asset's playback style (image, video,
animated, livePhoto) to Flutter during sync.

- Add `playbackStyleValue` computed property to `PHAsset` extension (iOS)
- Populate `playbackStyle` in `toPlatformAsset()` and the full-sync path
- Update generated Dart/Kotlin/Swift files
Copilot AI review requested due to automatic review settings February 26, 2026 17:45
@LeLunZ (Collaborator, Author) commented Feb 26, 2026

@mertalev

Copilot AI (Contributor) left a comment


Pull request overview

Adds playbackStyle to the mobile native-sync Pigeon API so iOS/Android can report whether a synced asset is an image, video, animated image, or Live Photo to Flutter during sync.

Changes:

  • Extend PlatformAsset (Pigeon) with a playbackStyle field and regenerate platform bindings (Dart/Swift/Kotlin).
  • Populate playbackStyle on iOS using PHAsset.playbackStyle.
  • Infer playbackStyle on Android using MediaStore MIME_TYPE (currently treating image/gif as animated).

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.

Summary per file:

  • mobile/pigeon/native_sync_api.dart: Adds playbackStyle to the Pigeon PlatformAsset definition (defaulting to image).
  • mobile/lib/platform/native_sync_api.g.dart: Regenerated Dart bindings to serialize/deserialize playbackStyle.
  • mobile/ios/Runner/Sync/PHAssetExtensions.swift: Maps PHAsset.playbackStyle to the integer playbackStyle value in PlatformAsset.
  • mobile/ios/Runner/Sync/MessagesImpl.swift: Updates the placeholder PlatformAsset used for id-only comparisons to include playbackStyle.
  • mobile/ios/Runner/Sync/Messages.g.swift: Regenerated Swift bindings to include playbackStyle.
  • mobile/android/app/src/main/kotlin/app/alextran/immich/sync/MessagesImplBase.kt: Adds MIME_TYPE to the MediaStore query and derives playbackStyle (video/gif/static).
  • mobile/android/app/src/main/kotlin/app/alextran/immich/sync/Messages.g.kt: Regenerated Kotlin bindings to include playbackStyle.


@LeLunZ LeLunZ force-pushed the feature/sync-playback-style branch from b0c106b to f5cd4a9 Compare February 26, 2026 17:58
@LeLunZ (Collaborator, Author) commented Feb 26, 2026

and actually something else I just saw:

in mobile/android/app/src/main/kotlin/app/alextran/immich/sync/MessagesImplBase.kt

There is a check if the file is an image:

val duration = if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE) 0
          else c.getLong(durationColumn) / 1000

But the mediaType is already a remapped value? Should the comparison really be made against the MediaStore enum? Shouldn't we use the value we defined a few lines above?

Should we include this small change here?

@LeLunZ LeLunZ force-pushed the feature/sync-playback-style branch from 0417135 to 3aedd0a Compare February 26, 2026 18:08
@mertalev (Member) commented

and actually something else I just saw:

in mobile/android/app/src/main/kotlin/app/alextran/immich/sync/MessagesImplBase.kt

There is a check if the file is an image:


val duration = if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE) 0

          else c.getLong(durationColumn) / 1000

But the mediaType is already a remapped value? Should the comparison really be made against the MediaStore enum? Shouldn't we use the value we defined a few lines above?

Should we include this small change here?

It's nicer to use enums than raw ints. We could probably define our own enum and use that instead of 0, 1, 2 etc.
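The suggestion above could look something like this. This is a hypothetical sketch (the enum name and the wrapping are not part of the PR); the raw values mirror the documented MediaStore constants:

```kotlin
// Hypothetical sketch: wrap the raw MediaStore media-type ints in our own
// enum so comparisons read as names instead of magic numbers.
enum class MediaKind(val raw: Int) {
    IMAGE(1),  // MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE
    VIDEO(3),  // MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO
    OTHER(-1); // anything we don't specifically handle

    companion object {
        fun fromRaw(raw: Int): MediaKind =
            values().firstOrNull { it.raw == raw } ?: OTHER
    }
}
```

A duration check would then read as `if (kind == MediaKind.IMAGE) 0 else ...` instead of comparing against a raw int or the MediaStore constant directly.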

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@LeLunZ (Collaborator, Author) commented Feb 26, 2026

It's nicer to use enums than raw ints. We could probably define our own enum and use that instead of 0, 1, 2 etc.

@mertalev Should I include that in here?

@mertalev (Member) commented

It's nicer to use enums than raw ints. We could probably define our own enum and use that instead of 0, 1, 2 etc.

@mertalev Should I include that in here?

I'm hesitant to change existing sync code in this PR, so let's leave it for now.

@LeLunZ (Collaborator, Author) commented Feb 26, 2026

@mertalev I just want to highlight that these changes here are only to get some initial values for assets that weren't synced to the server yet.

For remote assets we have a duration that we can later use in the flutter code, just like @meesfrensel commented.

@mertalev (Member) commented Feb 27, 2026

@mertalev I just want to highlight that these changes here are only to get some initial values for assets that weren't synced to the server yet.

For remote assets we have a duration that we can later use in the flutter code, just like @meesfrensel commented.

Some people don't want to back up everything, so making it work for local-only assets is still valuable. Live images also only work for backed up assets at the moment (and don't work offline) because we haven't implemented local playback and we don't even know if an asset is live or not until the server tells us. This metadata is a useful first step to improving that situation.

Supposedly, it's available as an internal _special_format column that can be queried, and XMP is available from the DB as well.

enum class PlaybackStyle {
    IMAGE,
    IMAGE_ANIMATED,
    LIVE_PHOTO,
    VIDEO,
    VIDEO_LOOPING,
    UNKNOWN,
}

  fun getPlaybackStyle(contentResolver: ContentResolver, uri: Uri): PlaybackStyle {
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
          contentResolver.query(
              uri,
              arrayOf(MediaStore.Files.FileColumns.MEDIA_TYPE, "_special_format"),
              null, null, null
          )?.use { cursor ->
              if (cursor.moveToFirst()) {
                  val mediaType = cursor.getInt(0)
                  val specialFormat = cursor.getInt(1)

                  return when {
                      specialFormat == 2 -> PlaybackStyle.LIVE_PHOTO
                      specialFormat == 1 || specialFormat == 3 -> PlaybackStyle.IMAGE_ANIMATED
                      mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO -> PlaybackStyle.VIDEO
                      mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE -> PlaybackStyle.IMAGE
                      else -> PlaybackStyle.UNKNOWN
                  }
              }
          }
          return PlaybackStyle.UNKNOWN
      }

      // Pre-API 33 fallback
      val columns = mutableListOf(MediaStore.Files.FileColumns.MEDIA_TYPE)
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) columns.add(MediaStore.MediaColumns.XMP)

      var mediaType = 0
      var xmp: String? = null

      contentResolver.query(uri, columns.toTypedArray(), null, null, null)?.use { cursor ->
          if (cursor.moveToFirst()) {
              mediaType = cursor.getInt(0)
              if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
                  xmp = cursor.getBlob(1)?.toString(Charsets.UTF_8)
              }
          }
      }

      // API 26-29: get XMP from file
      if (xmp == null && Build.VERSION.SDK_INT < Build.VERSION_CODES.R) {
          contentResolver.openInputStream(uri)?.use { stream ->
              xmp = ExifInterface(stream).getAttribute(ExifInterface.TAG_XMP)
          }
      }

      if (xmp != null && ("GCamera:MotionPhoto" in xmp!! || "Camera:MotionPhoto" in xmp!!)) {
          return PlaybackStyle.LIVE_PHOTO
      }

      if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO) {
          return PlaybackStyle.VIDEO
      }

      if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE) {
          val mimeType = contentResolver.getType(uri)
          if (mimeType == "image/gif") return PlaybackStyle.IMAGE_ANIMATED
          if (mimeType == "image/webp") {
              contentResolver.openInputStream(uri)?.use { stream ->
                  val header = ByteArray(21)
                  if (stream.read(header) == 21 && "ANIM" in String(header, Charsets.US_ASCII)) {
                      return PlaybackStyle.IMAGE_ANIMATED
                  }
              }
          }
          return PlaybackStyle.IMAGE
      }

      return PlaybackStyle.UNKNOWN
  }

This is just from Claude; I haven't tested this and the code could probably be simplified, but based on this it should be possible to get this metadata on Android. It would be annoying if we had this column but then couldn't trust whether it's accurate because it isn't populated properly on Android.

@LeLunZ (Collaborator, Author) commented Feb 27, 2026

So it seems that's part of an SDK extension which isn't part of the build?
I just saw that there are also FORMAT_GIF/MOTION_PHOTO/ANIMATED_WEBP/NONE constants, so we can work with this.

If we don't want to add compileSdkExtension 21 to the build, we can also just use the raw values instead of the typed constants.

@mertalev (Member) commented Feb 27, 2026

The column itself is an extension of S, which is SDK 31, so it would still need a runtime check like SdkExtensions.getExtensionVersion(Build.VERSION_CODES.S) >= 21.
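The gate described above can be factored into a pure function so the decision logic is testable on its own. This is a hypothetical sketch; on device the two arguments would come from Build.VERSION.SDK_INT and SdkExtensions.getExtensionVersion(Build.VERSION_CODES.S):

```kotlin
// Hypothetical sketch: _special_format is only usable when the device is on
// SDK 31 (Android S) or newer AND the S SDK extension is at version 21+.
fun specialFormatAvailable(sdkInt: Int, sExtensionVersion: Int): Boolean =
    sdkInt >= 31 && sExtensionVersion >= 21
```

Keeping the SDK check separate from the extension-version check matters because SdkExtensions itself can only be queried on newer platforms, so the SDK comparison must short-circuit first.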

@LeLunZ (Collaborator, Author) commented Feb 27, 2026

@mertalev Okay, so I debugged this approach a bit on emulators:

API 36.1 (special format)

  • detected animated WebP
  • detected GIF
  • detected a single motion photo out of 3 different ones downloaded from here
  • doesn't detect Apple Live Photos downloaded from immich onto Android

XMP for Motion Photo detection and ANIM in the header for animated WebP

  • the XMP detection found 2 out of 3 motion photos
  • checking for ANIM in the header didn't work for WebP detection.
  • also doesn't detect Apple Live Photos downloaded from immich onto Android

So the question is: should we always use XMP as a fallback even when special format is available, so that we detect more motion photos?


The WebP fallback worked when I used this code:

ctx.contentResolver.openInputStream(uri)?.use { stream ->
    val header = ByteArray(64)
    if (stream.read(header) < 30) return PlaybackStyle.IMAGE

    val riff = String(header, 0, 4, Charsets.US_ASCII)
    val webp = String(header, 8, 4, Charsets.US_ASCII)
    if (riff != "RIFF" || webp != "WEBP") return PlaybackStyle.IMAGE

    var offset = 12
    while (offset + 8 <= header.size) {
        val chunk = String(header, offset, 4, Charsets.US_ASCII)
        val size =
            (header[offset + 4].toInt() and 0xFF) or
                ((header[offset + 5].toInt() and 0xFF) shl 8) or
                ((header[offset + 6].toInt() and 0xFF) shl 16) or
                ((header[offset + 7].toInt() and 0xFF) shl 24)

        if (chunk == "VP8X") {
            val flags = header[offset + 8].toInt() and 0xFF
            return when {
                (flags and 0x02) != 0 -> PlaybackStyle.ANIMATED // Animation flag
                else -> PlaybackStyle.IMAGE
            }
        }

        offset += 8 + size + (size % 2) // chunks are padded to an even size
    }
}
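The core of the snippet above is reading one flags byte from the VP8X chunk. As a pure, testable sketch over an in-memory header (hypothetical helper, not from the PR; it assumes the VP8X chunk is the first chunk after the 12-byte RIFF/WEBP preamble, which is the case for extended-format WebP):

```kotlin
// Hypothetical sketch: detect the animation bit (0x02) in the VP8X flags
// byte of an extended-format WebP header held in memory.
// Layout: bytes 0-3 "RIFF", 8-11 "WEBP", 12-15 "VP8X", 16-19 chunk size,
// byte 20 = first byte of the VP8X payload (the flags byte).
fun isAnimatedWebp(header: ByteArray): Boolean {
    if (header.size < 21) return false
    val riff = String(header, 0, 4, Charsets.US_ASCII)
    val webp = String(header, 8, 4, Charsets.US_ASCII)
    val chunk = String(header, 12, 4, Charsets.US_ASCII)
    if (riff != "RIFF" || webp != "WEBP" || chunk != "VP8X") return false
    val flags = header[20].toInt() and 0xFF
    return (flags and 0x02) != 0
}
```

This matches the WebP container spec, where bit 0x02 of the VP8X flags byte is the Animation flag; a simple substring search for "ANIM" in the first bytes fails because the ANIM chunk appears after the VP8X chunk, not necessarily within the bytes read.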

@LeLunZ (Collaborator, Author) commented Feb 27, 2026

Seems like specialFormat == 2 -> PlaybackStyle.LIVE_PHOTO doesn't work for HEIC motion photos on my emulator.

@mertalev (Member) commented

Do you have a physical Android to test with? The emulator might be more limited.

@LeLunZ (Collaborator, Author) commented Feb 27, 2026

Do you have a physical Android to test with? The emulator might be more limited.

No, sadly I don't have an Android phone.

But I found the problem. I downloaded the "HEIC" file called Galaxy S20 - MP, no HDR, Only Samsung versionless trailer.heic, which is per se a motion photo, but only in the Samsung gallery; it seems that photo doesn't follow the standard.
The other HEIC photos have the tag included and are being recognised.

So the only motion photos not recognised on Android are these non-standard versions and Apple Live Photos.

To correctly detect Apple Live Photos we would have to:

  1. Read the XMP/EXIF com.apple.content-identifier
  2. Query MediaStore for a video file with the same base name (e.g., IMG_1234.HEIC → IMG_1234.MOV)
  3. Read that file's metadata to confirm the identifier matches

And then we could also tell on Android whether an asset is an iOS Live Photo.
But I think this doesn't fit well into the current pipeline.
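Step 2 of that pairing idea, deriving the candidate sibling-video name from the image name, could be sketched as a tiny helper (hypothetical, not part of this PR; a real implementation would also need the MediaStore query and the identifier comparison from steps 1 and 3):

```kotlin
// Hypothetical sketch for step 2: derive the candidate paired-video filename
// from a Live Photo's image filename (IMG_1234.HEIC -> IMG_1234.MOV).
fun pairedVideoName(imageName: String): String? {
    val dot = imageName.lastIndexOf('.')
    if (dot <= 0) return null // no extension, no candidate to look up
    return imageName.substring(0, dot) + ".MOV"
}
```

Even with a name match, the com.apple.content-identifier in both files would still have to agree before treating the pair as a Live Photo, since unrelated files can share a base name.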

@mertalev (Member) commented Feb 27, 2026

I don't think we need to go that far, just sticking to what follows the spec and is supported by the OS is fine.

@LeLunZ (Collaborator, Author) commented Feb 27, 2026

Okay nice, then I think this PR has everything. If you still find issues let me know :D

@LeLunZ (Collaborator, Author) commented Feb 28, 2026

Messed up my commit, 636f38e also includes most of the other changes in MessagesImplBase.kt

@mertalev mertalev enabled auto-merge (squash) February 28, 2026 03:06
@mertalev mertalev merged commit d6c724b into immich-app:main Feb 28, 2026
51 of 52 checks passed
babbitt pushed a commit to babbitt/immich that referenced this pull request Mar 1, 2026
* feat(mobile): add playbackStyle to native sync API

Adds a `playbackStyle` field to `PlatformAsset` in the pigeon sync API so
native platforms can communicate the asset's playback style (image, video,
animated, livePhoto) to Flutter during sync.

- Add `playbackStyleValue` computed property to `PHAsset` extension (iOS)
- Populate `playbackStyle` in `toPlatformAsset()` and the full-sync path
- Update generated Dart/Kotlin/Swift files

* fix(tests): add playbackStyle to local asset test cases

* fix(tests): update playbackStyle to use integer values in local sync tests

* feat(mobile): extend playbackStyle enum to include videoLooping

* Update PHAssetExtensions.swift

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix(playback): simplify playbackStyleValue implementation by removing iOS version check

* feat(android): implement proper playbackStyle detection

* add PlatformAssetPlaybackStyle enum

* linting

---------

Co-authored-by: Mert <101130780+mertalev@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>