feat(mobile): add playbackStyle to native sync API #26541
mertalev merged 9 commits into immich-app:main from
Conversation
Adds a `playbackStyle` field to `PlatformAsset` in the pigeon sync API so native platforms can communicate the asset's playback style (image, video, animated, livePhoto) to Flutter during sync.
- Add `playbackStyleValue` computed property to `PHAsset` extension (iOS)
- Populate `playbackStyle` in `toPlatformAsset()` and the full-sync path
- Update generated Dart/Kotlin/Swift files
Pull request overview
Adds playbackStyle to the mobile native-sync Pigeon API so iOS/Android can report whether a synced asset is an image, video, animated image, or Live Photo to Flutter during sync.
Changes:
- Extend `PlatformAsset` (Pigeon) with a `playbackStyle` field and regenerate platform bindings (Dart/Swift/Kotlin).
- Populate `playbackStyle` on iOS using `PHAsset.playbackStyle`.
- Infer `playbackStyle` on Android using the MediaStore `MIME_TYPE` (currently treating `image/gif` as animated).
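The Android-side inference can be sketched as plain Kotlin; the function name and the raw wire values below are illustrative, not the PR's actual code:

```kotlin
// Hypothetical sketch of the Android inference: MediaStore only exposes
// MEDIA_TYPE and MIME_TYPE, so "animated" is guessed from the MIME type
// (currently only image/gif). The raw int values are illustrative.
fun inferPlaybackStyle(isVideo: Boolean, mimeType: String?): Int = when {
    isVideo -> 1                 // video
    mimeType == "image/gif" -> 2 // animated image
    else -> 0                    // plain image
}
```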
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| mobile/pigeon/native_sync_api.dart | Adds playbackStyle to the Pigeon PlatformAsset definition (defaulting to image). |
| mobile/lib/platform/native_sync_api.g.dart | Regenerated Dart bindings to serialize/deserialize playbackStyle. |
| mobile/ios/Runner/Sync/PHAssetExtensions.swift | Maps PHAsset.playbackStyle to the integer playbackStyle value in PlatformAsset. |
| mobile/ios/Runner/Sync/MessagesImpl.swift | Updates the placeholder PlatformAsset used for id-only comparisons to include playbackStyle. |
| mobile/ios/Runner/Sync/Messages.g.swift | Regenerated Swift bindings to include playbackStyle. |
| mobile/android/app/src/main/kotlin/app/alextran/immich/sync/MessagesImplBase.kt | Adds MIME_TYPE to the MediaStore query and derives playbackStyle (video/gif/static). |
| mobile/android/app/src/main/kotlin/app/alextran/immich/sync/Messages.g.kt | Regenerated Kotlin bindings to include playbackStyle. |
And actually something else I just saw: there is a check whether the file is an image, but the mediaType is already a remapped value. Should the comparison really be made against the enum? Shouldn't we use the value we defined a few lines above? Should we include this small change here?
It's nicer to use enums than raw ints. We could probably define our own enum and use that instead of 0, 1, 2, etc.
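A minimal sketch of that suggestion; the PR later added a `PlatformAssetPlaybackStyle` enum, but this is a hypothetical shape with illustrative raw values, not its actual definition:

```kotlin
// Hypothetical sketch: an enum that carries its wire value instead of
// passing bare ints around. Raw values here are illustrative only.
enum class PlaybackStyle(val raw: Int) {
    IMAGE(0),
    VIDEO(1),
    IMAGE_ANIMATED(2),
    LIVE_PHOTO(3);

    companion object {
        // Falls back to IMAGE for unknown wire values.
        fun fromRaw(raw: Int): PlaybackStyle =
            values().firstOrNull { it.raw == raw } ?: IMAGE
    }
}
```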
@mertalev Should I include that in here?
I'm hesitant to change existing sync code in this PR, so let's leave it for now.
@mertalev I just want to highlight that these changes here are only to get some initial values for assets that weren't synced to the server yet. For remote assets we have a duration that we can later use in the Flutter code, just like @meesfrensel commented.
Some people don't want to back up everything, so making it work for local-only assets is still valuable. Live images also only work for backed-up assets at the moment (and don't work offline) because we haven't implemented local playback and we don't even know if an asset is live or not until the server tells us. This metadata is a useful first step to improving that situation. Supposedly, it's available as an internal MediaStore column:

```kotlin
import android.content.ContentResolver
import android.net.Uri
import android.os.Build
import android.provider.MediaStore
import androidx.exifinterface.media.ExifInterface

enum class PlaybackStyle {
  IMAGE,
  IMAGE_ANIMATED,
  LIVE_PHOTO,
  VIDEO,
  VIDEO_LOOPING,
  UNKNOWN,
}

fun getPlaybackStyle(contentResolver: ContentResolver, uri: Uri): PlaybackStyle {
  if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
    // API 33+: query the internal, undocumented _special_format column.
    contentResolver.query(
      uri,
      arrayOf(MediaStore.Files.FileColumns.MEDIA_TYPE, "_special_format"),
      null, null, null
    )?.use { cursor ->
      if (cursor.moveToFirst()) {
        val mediaType = cursor.getInt(0)
        val specialFormat = cursor.getInt(1)
        return when {
          specialFormat == 2 -> PlaybackStyle.LIVE_PHOTO // motion photo
          specialFormat == 1 || specialFormat == 3 -> PlaybackStyle.IMAGE_ANIMATED // gif / animated webp
          mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO -> PlaybackStyle.VIDEO
          mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE -> PlaybackStyle.IMAGE
          else -> PlaybackStyle.UNKNOWN
        }
      }
    }
    return PlaybackStyle.UNKNOWN
  }

  // Pre-API 33 fallback: read MEDIA_TYPE (and XMP on API 30+) from MediaStore.
  val columns = mutableListOf(MediaStore.Files.FileColumns.MEDIA_TYPE)
  if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) columns.add(MediaStore.MediaColumns.XMP)
  var mediaType = 0
  var xmp: String? = null
  contentResolver.query(uri, columns.toTypedArray(), null, null, null)?.use { cursor ->
    if (cursor.moveToFirst()) {
      mediaType = cursor.getInt(0)
      if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
        xmp = cursor.getBlob(1)?.toString(Charsets.UTF_8)
      }
    }
  }
  // API 26-29: no XMP column, so read it from the file itself.
  if (xmp == null && Build.VERSION.SDK_INT < Build.VERSION_CODES.R) {
    contentResolver.openInputStream(uri)?.use { stream ->
      xmp = ExifInterface(stream).getAttribute(ExifInterface.TAG_XMP)
    }
  }
  if (xmp != null && ("GCamera:MotionPhoto" in xmp!! || "Camera:MotionPhoto" in xmp!!)) {
    return PlaybackStyle.LIVE_PHOTO
  }
  if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_VIDEO) {
    return PlaybackStyle.VIDEO
  }
  if (mediaType == MediaStore.Files.FileColumns.MEDIA_TYPE_IMAGE) {
    val mimeType = contentResolver.getType(uri)
    if (mimeType == "image/gif") return PlaybackStyle.IMAGE_ANIMATED
    if (mimeType == "image/webp") {
      // Naive check: look for an ANIM chunk tag near the start of the file.
      contentResolver.openInputStream(uri)?.use { stream ->
        val header = ByteArray(21)
        if (stream.read(header) == 21 && "ANIM" in String(header, Charsets.US_ASCII)) {
          return PlaybackStyle.IMAGE_ANIMATED
        }
      }
    }
    return PlaybackStyle.IMAGE
  }
  return PlaybackStyle.UNKNOWN
}
```

This is just from Claude; I haven't tested it and the code could probably be simplified, but based on this it should be possible to get this metadata on Android. It would be annoying if we had this column but then couldn't trust whether it's accurate because it isn't populated properly on Android.
So it seems that's part of the SDK Extension, which isn't part of the build? If we don't want to add
The column itself is an extension of S, which is SDK 31, so it would still need a runtime check like |
@mertalev Okay, so I debugged this approach a bit on emulators:
- API 36.1: special format
- XMP for Motion Photo detection, and `ANIM` in the header for animated WebP

So the question is: should we always use XMP as a fallback even if the special format is available, so we detect more motion photos? The WebP fallback worked when I used this code:

```kotlin
// Inside a function returning PlaybackStyle; walks RIFF chunks and checks
// the VP8X animation flag instead of naively searching for "ANIM".
ctx.contentResolver.openInputStream(uri)?.use { stream ->
  val header = ByteArray(64)
  if (stream.read(header) < 30) return PlaybackStyle.IMAGE
  val riff = String(header, 0, 4, Charsets.US_ASCII)
  val webp = String(header, 8, 4, Charsets.US_ASCII)
  if (riff != "RIFF" || webp != "WEBP") return PlaybackStyle.IMAGE
  var offset = 12
  while (offset + 8 <= header.size) {
    val chunk = String(header, offset, 4, Charsets.US_ASCII)
    val size =
      (header[offset + 4].toInt() and 0xFF) or
        ((header[offset + 5].toInt() and 0xFF) shl 8) or
        ((header[offset + 6].toInt() and 0xFF) shl 16) or
        ((header[offset + 7].toInt() and 0xFF) shl 24)
    if (chunk == "VP8X") {
      val flags = header[offset + 8].toInt() and 0xFF
      return when {
        (flags and 0x02) != 0 -> PlaybackStyle.IMAGE_ANIMATED // animation flag
        else -> PlaybackStyle.IMAGE
      }
    }
    offset += 8 + size + (size % 2) // chunks are padded to even sizes
  }
}
```
Seems like
Do you have a physical Android to test with? The emulator might be more limited.
No, I sadly don't have an Android phone. But I found the problem. I downloaded the "Heic" file called Galaxy S20 - MP, no HDR, Only Samsung versionless trailer.heic, which is per se a motion photo, but only in the Samsung gallery; it seems that photo does not follow a standard. So the only motion photos not recognised on Android are these non-standard versions and Apple Live Photos. To correctly detect Apple Live Photos we would have to:
And then we can also tell on Android if there is an iOS Live Photo.
I don't think we need to go that far; just sticking to what follows the spec and is supported by the OS is fine.
Okay nice, then I think this PR has everything. If you still find issues let me know :D |
Messed up my commit, 636f38e also includes most of the other changes in MessagesImplBase.kt
* feat(mobile): add playbackStyle to native sync API

  Adds a `playbackStyle` field to `PlatformAsset` in the pigeon sync API so native platforms can communicate the asset's playback style (image, video, animated, livePhoto) to Flutter during sync.
  - Add `playbackStyleValue` computed property to `PHAsset` extension (iOS)
  - Populate `playbackStyle` in `toPlatformAsset()` and the full-sync path
  - Update generated Dart/Kotlin/Swift files
* fix(tests): add playbackStyle to local asset test cases
* fix(tests): update playbackStyle to use integer values in local sync tests
* feat(mobile): extend playbackStyle enum to include videoLooping
* Update PHAssetExtensions.swift

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
* fix(playback): simplify playbackStyleValue implementation by removing iOS version check
* feat(android): implement proper playbackStyle detection
* add PlatformAssetPlaybackStyle enum
* linting

---------

Co-authored-by: Mert <101130780+mertalev@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
As requested, smaller PRs for #26148
The first step, which is easy to decouple from all the other changes: the native sync of the playbackStyle.
After this PR I will continue with:
Adds a `playbackStyle` field to `PlatformAsset` in the pigeon sync API so native platforms can communicate the asset's playback style (image, video, animated, livePhoto) to Flutter during sync.

What's important to know:
- On iOS we use the `playbackStyleValue` that `PHAsset` provides, but on Android there is no such thing as a playback style value, so we have to "guess" whether the playbackStyle is animated (if you don't have other ideas). Currently this happens via a MIME type of "image/gif".
- iOS also has `videoLooping`. I wasn't sure if we also want to report that back, so I currently didn't include it and it just reports playback style video for it.
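The videoLooping point can be sketched as follows; `IosPlaybackStyle` is a hypothetical stand-in mirroring the cases of Apple's `PHAsset.PlaybackStyle`, and the string values are illustrative, not the PR's wire format:

```kotlin
// Hypothetical sketch: videoLooping is collapsed to plain video, as the
// PR description says, since it isn't forwarded as its own style (yet).
enum class IosPlaybackStyle { UNSUPPORTED, IMAGE, IMAGE_ANIMATED, LIVE_PHOTO, VIDEO, VIDEO_LOOPING }

fun toSyncedStyle(style: IosPlaybackStyle): String = when (style) {
    IosPlaybackStyle.IMAGE_ANIMATED -> "animated"
    IosPlaybackStyle.LIVE_PHOTO -> "livePhoto"
    IosPlaybackStyle.VIDEO, IosPlaybackStyle.VIDEO_LOOPING -> "video"
    else -> "image" // IMAGE and UNSUPPORTED fall back to plain image
}
```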