
✨ Migrate to new-arch (react-native 0.74, Fabric/TurboModules/CodeGen/bridgeless) #2614

Open
2 tasks done
mrousavy opened this issue Feb 27, 2024 · 46 comments
Labels
🤖 android Issue affects the Android platform ✨ feature Proposes a new feature or enhancement Fund 🍏 ios Issue affects the iOS platform

Comments

@mrousavy
Owner

mrousavy commented Feb 27, 2024

What feature or enhancement are you suggesting?

VisionCamera needs to migrate to the new architecture, using Fabric for the View and TurboModules + CodeGen for the native methods. This also implies compatibility with react-native 0.74, as the build currently fails there. (related to New Arch)

Implementation plan

Since VisionCamera is a bit more complicated, there are a few blockers I have encountered:

  1. ❌ I have both a View (CameraView) and two Modules (CameraModule + DevicesModule) in the codebase. Currently, the create-react-native-library template does not have a template for multiple CodeGen specs (views + modules).
  2. ❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the <Camera> view directly, instead of converting it to a callback within the TurboModules/Native Modules system, because the Frame Processor function is a Worklet. This is how it currently works (see the sketch after this list):
  3. ❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API.
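
For reference, a rough TypeScript sketch of the JS side of blocker 2 as it works today on the old architecture (simplified; the proxy shape and function names here are illustrative, the real code lives in Camera.tsx):

import type { Component } from 'react';
import { findNodeHandle } from 'react-native';

// Proxy object that the native side installs into the JS global scope via JSI
// at startup (illustrative shape, not the exact library API).
declare const VisionCameraProxy: {
  setFrameProcessor(viewTag: number, frameProcessor: (frame: unknown) => void): void;
  removeFrameProcessor(viewTag: number): void;
};

function attachFrameProcessor(cameraRef: Component | null, frameProcessor: (frame: unknown) => void): void {
  // Resolve the native view tag of the mounted <Camera> instance...
  const viewTag = findNodeHandle(cameraRef);
  if (viewTag == null) throw new Error('Camera view is not mounted');
  // ...then hand the raw function over to native, where it is received as a
  // jsi::Function and converted into a callable Worklet (bypassing the usual
  // bridge/TurboModule callback conversion).
  VisionCameraProxy.setFrameProcessor(viewTag, frameProcessor);
}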

And then a few things I wanted to wait for (which aren't blockers) before switching to the new arch are:

  1. I want to pass multiple callbacks to a native function. This was a limitation with the Bridge, but should now work with TurboModules I think? In my case startRecording() takes both onRecordingFinished and onRecordingError callbacks, and also returns a Promise which resolves once the recording has actually been started.
  2. I couldn't find a clear script to trigger CodeGen (something like npx react-native codegen) to generate both iOS and Android specs.
  3. New arch doesn't have first class Swift support as far as I know?
  4. New arch doesn't support custom "hybrid" objects (so e.g. I could return an in-memory UIImage/Image instance in takePhoto(), instead of writing it to a file and returning a string as that's the only supported type)

What Platforms would this feature/enhancement affect?

iOS, Android

Alternatives/Workarounds

Currently only the renderer interop layer can be used, but there are some issues:

Additional information

Upvote & Fund

  • We're using Polar.sh so you can upvote and help fund this issue.
  • We receive the funding once the issue is completed & confirmed by you.
  • Thank you in advance for helping prioritize & fund our backlog.
Fund with Polar
@mrousavy mrousavy added 📢 help wanted Extra attention is needed 🍏 ios Issue affects the iOS platform 🤖 android Issue affects the Android platform ✨ feature Proposes a new feature or enhancement labels Feb 27, 2024
@mrousavy mrousavy pinned this issue Feb 27, 2024
@cipolleschi

Hi @mrousavy, thanks for this issue, it is super helpful!

  1. ❌ I have both a View (CameraView) and two Modules (CameraModule + DevicesModule) in the codebase. Currently, the create-react-native-library template does not have a template for multiple CodeGen specs (views + modules).

This should not be a problem. In the codegenConfig field of the vision-camera package, you can specify all as the type, and CodeGen should generate the native code for both modules and components.
In this case, the jsSrcs field should point to a parent folder that contains all the specs. For example:

react-native-camera-module
'-> js
    '-> Module
        '-> NativeCameraModule.js
    '-> NativeComponent
        '-> CameraViewComponent.js

and you pass just js as the folder in jsSrcs.
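
To make this concrete, here is an illustrative sketch of what the two spec files in that tree could look like (all names are made up; if I remember correctly the codegenConfig key for the folder is jsSrcsDir, used together with "type": "all"):

// js/Module/NativeCameraModule.ts (illustrative TurboModule spec)
import type { TurboModule } from 'react-native';
import { TurboModuleRegistry } from 'react-native';

export interface Spec extends TurboModule {
  getAvailableCameraDevices(): Promise<string[]>;
}

export default TurboModuleRegistry.getEnforcing<Spec>('CameraModule');

// js/NativeComponent/CameraViewNativeComponent.ts (illustrative Fabric component spec;
// as far as I know codegen only picks up component specs whose file names end in NativeComponent)
import type { ViewProps } from 'react-native';
import codegenNativeComponent from 'react-native/Libraries/Utilities/codegenNativeComponent';

export interface NativeProps extends ViewProps {
  isActive?: boolean;
}

export default codegenNativeComponent<NativeProps>('CameraView');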

  1. ❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently:
    JS: Camera.tsx (uses findNodeHandle(..) + a global setFrameProcessor(..) func)
    iOS: VisionCameraProxy.mm (uses UIManager::viewForReactTag to find the view)
    Android: VisionCameraProxy.kt (uses UIManagerHelper.getUIManager to find the View)
  2. ❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API.
    iOS: VisionCameraInstaller::install
    Android: VisionCameraProxy.kt.

I don't have strong guidance here, but a colleague is working on 3., and perhaps with his work you'll be able to make 2. work as well? Not sure about this.

  1. I want to pass multiple callbacks to a native function. This was a limitation with the Bridge, but should now work with TurboModules I think? In my case startRecording() takes both onRecordingFinished and onRecordingError callbacks, and also returns a Promise which resolves once the recording has actually been started.

I have read an internal proposal to make this happen, but I don't have an update on the work here.
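
For illustration, the kind of TurboModule spec this would require could look like the sketch below. This is hypothetical: as discussed above, declaring two callbacks plus a Promise on one method is exactly what is not officially supported yet.

// Hypothetical spec (illustrative names); whether codegen accepts two callback
// parameters plus a Promise return type is the open question above.
import type { TurboModule } from 'react-native';
import { TurboModuleRegistry } from 'react-native';

export interface Spec extends TurboModule {
  startRecording(
    options: Object,
    onRecordingFinished: (video: Object) => void,
    onRecordingError: (error: Object) => void
  ): Promise<void>; // resolves once the recording has actually been started
}

export default TurboModuleRegistry.getEnforcing<Spec>('CameraModule');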

  1. I couldn't find a clear script to trigger CodeGen (something like npx react-native codegen) to generate both iOS and Android specs.

In 0.74, we have npx react-native codegen, but you need to call it twice, once for iOS and once for Android.
It should be easy to add a

"script" : {
  "run-codegen": "scripts/execute-codegen.sh",
  }

to react-native-vision-camera where you run:

npx react-native codegen <ios params>
npx react-native codegen <android params>

and then you can call

yarn run-codegen
  1. New arch doesn't have first class Swift support as far as I know?

No, and it will not have it in the short term. To make it work properly, we would need to radically change the ReactCommon file structure, and that is a long effort which we can't prioritize at the moment.

Anyway, it is possible to work around the issue: you should be able to create a thin Objective-C++ layer that:

  1. implements the module/component specs.
  2. holds a Swift object.
  3. forwards all invocations to the held Swift object.

It would be actually very interesting to try this out in a library that is as complex as yours: we might be able to generalize the wrapper somehow... 🤔

  1. New arch doesn't support custom "hybrid" objects (so e.g. I could return an in-memory UIImage/Image instance in takePhoto(), instead of writing it to a file and returning a string as that's the only supported type)

I think Expo managed to achieve something like this for some of their packages. There are probably ways to do it, but I feel that they might depend heavily on your use case.

Currently only the renderer interop layer can be used, but there are some issues:
#2613

We did a lot of work on interop layers for 0.74. The issue author does not mention which version of React Native they are using. It is likely that we have already fixed those issues in 0.74 (or that they will be fixed soon).

@mrousavy
Owner Author

Thanks so much for your detailed answer @cipolleschi! This is really helpful :)


@fabOnReact

fabOnReact commented Feb 28, 2024

  1. For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently:
    JS: Camera.tsx (uses findNodeHandle(..) + a global setFrameProcessor(..) func)
    iOS: VisionCameraProxy.mm (uses UIManager::viewForReactTag to find the view)
    Android: VisionCameraProxy.kt (uses UIManagerHelper.getUIManager to find the View)

I believe there is an example implementation in TextInput.js, which uses the codegen command setTextAndSelection instead of UIManager + findNodeHandle.

Codegen Commands replace setNativeProps, which was used for functionality like react-native-camera's zoom prop. Re-rendering the entire component every time the zoom changes would slow the react-native app down, so instead we would use setNativeProps({zoom: newZoomValue}).

It is sometimes necessary to make changes directly to a component without using state/props to trigger a re-render of the entire subtree. When using React in the browser for example, you sometimes need to directly modify a DOM node, and the same is true for views in mobile apps. setNativeProps is the React Native equivalent to setting properties directly on a DOM node.

More info here facebook/react-native#42701 (comment)
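
For illustration, such a view command could be declared in a codegen spec roughly like this (a sketch with made-up names; codegenNativeCommands is, as far as I know, the same API TextInput uses for setTextAndSelection):

// Illustrative codegen view-command spec (names are made up):
import type { ElementRef } from 'react';
import type { HostComponent, ViewProps } from 'react-native';
import type { Double } from 'react-native/Libraries/Types/CodegenTypes';
import codegenNativeCommands from 'react-native/Libraries/Utilities/codegenNativeCommands';

type CameraViewType = HostComponent<ViewProps>;

interface NativeCommands {
  // Codegen'd replacement for the old setNativeProps({ zoom }) pattern:
  setZoom: (viewRef: ElementRef<CameraViewType>, zoom: Double) => void;
}

export const Commands: NativeCommands = codegenNativeCommands<NativeCommands>({
  supportedCommands: ['setZoom'],
});

Calling Commands.setZoom(viewRef, zoom) then goes through the dispatchCommand path described below.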

The codegen command calls the Fabric renderer's dispatchCommand, which is registered via createFromHostFunction so that JavaScript can call into the native Android/iOS implementation.

  1. Function::createFromHostFunction() registers a new function whose methodName is dispatchCommand
/// A function which has this type can be registered as a function
/// callable from JavaScript using Function::createFromHostFunction().
/// When the function is called, args will point to the arguments, and
/// count will indicate how many arguments are passed.  The function
/// can return a Value to the caller, or throw an exception.  If a C++
/// exception is thrown, a JS Error will be created and thrown into
/// JS; if the C++ exception extends std::exception, the Error's
/// message will be whatever what() returns. Note that it is undefined whether
/// HostFunctions may or may not be called in strict mode; that is `thisVal`
/// can be any value - it will not necessarily be coerced to an object or
/// or set to the global object.
  2. The HostFunction for the methodName dispatchCommand is passed as the last argument of createFromHostFunction

https://github.com/facebook/react-native/blob/8317325fb2bf6563a9314431c42c90ff68fb35fb/packages/react-native/ReactCommon/jsi/jsi/jsi.h#L100-L111

  /// Create a function which, when invoked, calls C++ code. If the
  /// function throws an exception, a JS Error will be created and
  /// thrown.
  /// \param name the name property for the function.
  /// \param paramCount the length property for the function, which
  /// may not be the number of arguments the function is passed.

https://github.com/facebook/react-native/blob/8ff05b5a18db85ab699323d1745a5f17cdc8bf6c/packages/react-native/ReactCommon/react/renderer/uimanager/UIManagerBinding.cpp#L603-L618

[uiManager, methodName, paramCount](
    jsi::Runtime& runtime,
    const jsi::Value& /*thisValue*/,
    const jsi::Value* arguments,
    size_t count) -> jsi::Value {
  validateArgumentCount(runtime, methodName, paramCount, count);


  auto shadowNode = shadowNodeFromValue(runtime, arguments[0]);
  if (shadowNode) {
    uiManager->dispatchCommand(
        shadowNode,
        stringFromValue(runtime, arguments[1]),
        commandArgsFromValue(runtime, arguments[2]));
  }
  return jsi::Value::undefined();
});
  3. The dispatchCommand HostFunction calls FabricMountingManager::dispatchCommand:


@philIip

philIip commented Feb 29, 2024

❌ For the Frame Processor runtime I need access to the jsi::Value/jsi::Function that the user passes to the view directly, instead of converting it to a callback within the TurboModules/Native Modules system because the Frame Processor function is a Worklet. This is how this works currently:

hiiiii thanks so much for this post; reading this and the code i'm not totally understanding what the core issue is - is it that the UIManager APIs themselves are not supported / behaving as expected in the new architecture?

another question, i'm assuming that the problem you're implying with the worklet is that it needs to be run on a specific thread, but the callback conversion forces it to always be run on JS thread? the piece that i'm missing here is where the native module block conversion is happening? it seems like everything in your code pointers is pure JSI at the moment and doesn't involve the native module infra.

❌ For the Frame Processor runtime I need access to the jsi::Runtime as early as possible (on demand). Currently I get the jsi::Runtime from the RCTCxxBridge/CatalystInstance, which is a kinda private and unsafe API.

right now we made this backward compatible, but i will be writing some docs on what the future-looking APIs are for this!

@mrousavy
Owner Author

hiiiii thanks so much for this post; reading this and the code i'm not totally understanding what the core issue is - is it that the UIManager APIs themselves are not supported / behaving as expected in the new architecture?

Hey @philIip thanks for tuning in! 👋 so the idea is that VisionCamera's View uses the RN architecture for almost everything:

  • device
  • format
  • isActive
  • resizeMode
  • ...etc

But there is one prop which the user passes to the Camera that I cannot use the RN architecture for, and that is the frameProcessor. The reason is that I do some special transformations to that function, and need to get it as the pure jsi::Function on the native side to convert it into a callable Worklet.
If I don't manually get it as a jsi::Function and instead use the bridge for this, it will convert it to an RCTCallback or something, which is not what I want.

This then runs on the Camera Thread and is synchronously invoked with a jsi::HostObject as a parameter.

But this is only one specific prop of the Camera (frameProcessor); the rest should use the RN architecture as normal.

The RN architecture simply doesn't have a mechanism for Worklets, or custom jsi::HostObjects currently, so I need to write that manually in JSI/C++.
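
For context, the user-facing side of that prop looks roughly like this (simplified from the VisionCamera docs):

import React from 'react';
import { Camera, useCameraDevice, useFrameProcessor } from 'react-native-vision-camera';

function ScannerScreen() {
  const device = useCameraDevice('back');
  // The 'worklet' directive is why this function has to reach native as a raw
  // jsi::Function: it gets re-hosted on the Frame Processor runtime and is then
  // called synchronously with a Frame (a jsi::HostObject) for every camera frame.
  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    console.log(`Frame: ${frame.width}x${frame.height}`);
  }, []);

  if (device == null) return null;
  return <Camera style={{ flex: 1 }} device={device} isActive={true} frameProcessor={frameProcessor} />;
}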

another question, i'm assuming that the problem you're implying with the worklet is that it needs to be run on a specific thread, but the callback conversion forces it to always be run on JS thread? the piece that i'm missing here is where the native module block conversion is happening? it seems like everything in your code pointers is pure JSI at the moment and doesn't involve the native module infra.

Exactly, that and that I cannot pass a custom jsi::HostObject ("Frame") to the callback.

This is where I eventually convert the jsi::Function to a normal Objective-C/Swift method:

@philIip

philIip commented Mar 1, 2024

ok i see, so bear with me just want to make sure i'm getting it right! this worklet scenario is not properly supported in the old or new architecture, but you've figured out a way to make it work doing all this custom JSI stuff.

is it a blocker because this is something you feel strongly should be baked into the framework, or is there actually a gap in the old / new architecture here? i think that's the last part i'm missing

@mrousavy
Owner Author

mrousavy commented Mar 1, 2024

No worries! Yes this is something that does not work in old or new arch, and I don't think it even should be part of core.
It's very custom stuff.

The reason I thought it was a blocker is that the workaround in the old arch was using the RCTCxxBridge/CatalystInstance to get the Runtime, and this no longer works in the new arch - so I want to use a Cxx TurboModule to get access to the runtime directly, but I have my doubts that codegen will work in this "hybrid" mode (generate everything for C++, but also bridge 90% of the props over to Java/Swift, while the others remain in C++ JSI).

@joacub

joacub commented Apr 5, 2024

hi @mrousavy, thanks for this great project. I'm looking for this as we are using the scanner in a very small application for Cuba, and the phones there are very old ones; I think this will be really good for the overall performance of the scanner. Do you have any timeline for when this will be migrated to the new arch? react-native-worklets is a blocker for us to use the new arch.

@mrousavy
Owner Author

mrousavy commented Apr 8, 2024

Not yet, I asked the Meta guys a question here: reactwg/react-native-new-architecture#167 (reply in thread) and I am waiting for a reply - that'll decide how easy it's gonna be to migrate over to new arch.



@mrousavy mrousavy removed the 📢 help wanted Extra attention is needed label May 2, 2024

@404-html

404-html commented May 6, 2024

Do you guys plan to migrate both V3 and V4 to the new arch?

@mrousavy
Owner Author

mrousavy commented May 6, 2024

It's only me doing the migration - and no, only V4 will be migrated. If you are on V3, I'd recommend switching over to V4, because there are tons of improvements.

@coder-xiaomo

Thanks to the author for this open source project, which has helped me a lot. I would like to know: is migrating v4 to the new architecture a big workload, and will it take a lot of time and energy?

I am preparing to migrate to react native 0.74.1, but this project does not support the new architecture at present. Roughly how long might the migration take: weeks, months, or even longer?

Thanks again for this open source project, it has helped me a lot!

@MinskLeo

MinskLeo commented May 7, 2024

Thanks for the author's effort. It would be really good to have react-native-vision-camera on 0.74+.


@idrisssakhi

idrisssakhi commented May 11, 2024

A small patch for those having a build issue. Not sure if the frame processor will continue working after this patch, waiting for your inputs. But the camera will work, and the build issues will be resolved.

diff --git a/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt b/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
index d697befe..8de418b0 100644
--- a/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
+++ b/node_modules/react-native-vision-camera/android/src/main/java/com/mrousavy/camera/frameprocessors/VisionCameraProxy.kt
@@ -7,12 +7,14 @@ import com.facebook.jni.HybridData
 import com.facebook.proguard.annotations.DoNotStrip
 import com.facebook.react.bridge.ReactApplicationContext
 import com.facebook.react.bridge.UiThreadUtil
+import com.facebook.react.common.annotations.FrameworkAPI
 import com.facebook.react.turbomodule.core.CallInvokerHolderImpl
 import com.facebook.react.uimanager.UIManagerHelper
 import com.mrousavy.camera.core.ViewNotFoundError
 import com.mrousavy.camera.react.CameraView
 import java.lang.ref.WeakReference
 
+@OptIn(FrameworkAPI::class)
 @Suppress("KotlinJniMissingFunction") // we use fbjni.
 class VisionCameraProxy(private val reactContext: ReactApplicationContext) {
   companion object {




@chawkip

chawkip commented May 14, 2024

A small patch for those having a build issue. Not sure if the frame processor will continue working after this patch, waiting for your inputs. But the camera will work, and the build issues will be resolved.

Works like a charm! Thanks 🙏🏻



@mrousavy
Owner Author

JFYI; I just released react-native-vision-camera 4.3.2 which works on react-native 0.74 on both the old and the new architecture.

It is a temporary workaround, as it still uses the compatibility layer and is not a true TurboModule/Fabric view.

For now this works fine in my tests, but in the future VisionCamera will definitely need to be migrated to a TurboModule/Fabric view, which is what this issue here is about.


@balazsgerlei

JFYI; I just released react-native-vision-camera 4.3.2 which works on react-native 0.74 on both the old and the new architecture.

It is a temporary workaround, as it still uses the compatibility layer and is not a true TurboModule/Fabric view.

For now this works fine in my tests, but in the future VisionCamera will definitely need to be migrated to a TurboModule/Fabric view, which is what this issue here is about.

If I understand correctly, supporting both old and new architectures is only temporary, right? May I ask what prevents keeping support for both?

It's quite painful that no React Native camera library seems to support both architectures; e.g. if one would like to have a sample app for a library that supports both architectures, one cannot have even quick QR code scanning in that sample without a camera library that supports both...

@mrousavy
Owner Author

If I understand correctly, supporting both old and new architectures is only temporary, right?

Well essentially every library out there only supports both new and old arch temporarily - at some point in the not so distant future, the old bridge/old arch will be completely gone from react native.
VisionCamera might just drop support for the old arch sooner than other libraries.

May I ask what prevents keeping support for both?

Because I am maintaining VisionCamera (and a bunch of other libraries) alone and it is already a huge effort. Maintaining multiple architectures makes things twice as complicated, and I choose to keep things simple.
If the time comes when I drop old-arch support, you can always stick to an older version of VisionCamera, until you upgrade to new arch as well.

Also, unlike a few other simpler libraries, VisionCamera has a very complex JSI part (Frame Processors, with synchronous access to the Frame object and raw data), which I plan to simplify a lot by moving it to new arch. This makes the entire structure simpler and easier to maintain, but won't be backwards compatible.

And lastly, I am currently prototyping a different custom React Native Framework (basically an alternative to TurboModules), and I might migrate VisionCamera over to that.

The benefit is much better performance, simpler codebase (for the ObjC bridge and also especially for Frame Processors), and custom objects support in takePhoto() or prepareRecording().
More on that soon, maybe I'll post updates on Twitter. But this is really exciting :)

It's quite painful that no react native Camera library seem to support both architectures

What? VisionCamera supports both old and new architecture(*1) in the latest stable release(s).

if one would like to have a sample app for a library that supports both architectures cannot have even a quick qr code scanning in that sample without a library that supports both...

I'm not sure if I understand what you're saying. VisionCamera supports both old and new architecture(*1), and the example app has a QR code scanner embedded in it. Did you try it?

You can also simply enable the new arch flag in the example to build it with the new arch.

  • *1: New architecture is currently just the compatibility layer built by Meta. It is not a true codegen'd TurboModule/Fabric view yet.

@balazsgerlei

I'm not sure if I understand what you're saying. VisionCamera supports both old and new architecture(*1), and the example app has a QR code scanner embedded in it. Did you try it?

Sure, sorry for not being clear - I meant that, except for the current release of react-native-vision-camera, nothing seems to support both architectures (for QR code reading), and it seems like this will be temporary too.

I have not yet upgraded to 4.x in the library sample app that I based my use-case example on, but I surely will; my comment was basically an inquiry into whether this state will remain or will no longer be the case (soon).

Thanks for shedding some light on your feature plans!

@mrousavy
Owner Author

Well yes, I will eventually drop support for the old arch - probably a bit sooner than other libraries, but that is due to the complexity that will be required to maintain both archs. My plan for the new arch (maybe VisionCamera 5.x.x) will be to use the new foundation that I am writing, which is essentially just a bare C++ module that calls into Swift and Kotlin - so it will by design not support the old architecture unless I write a bridging layer for it, which I consider unnecessary if I release this in e.g. 4 months. V4 will support the old arch (and the new arch with the compat layer), V5 then only the new arch.

@jeanraymonddaher

will be awesome once we have this. Thank you !!

@zzz08900
Contributor

zzz08900 commented Aug 19, 2024

Just noticed RN 0.75 came with an official way of accessing jsi::Runtime in TurboModules. Does that benefit RNVC?

Here's the link
https://reactnative.dev/blog/2024/08/12/release-0.75

@mrousavy
Owner Author

Hey - yep they introduced this specifically for modules like react-native-vision-camera, but I likely won't need it anymore because I'm building Nitro Modules now.


@mrousavy
Owner Author

Update: I recently released Nitro Modules, and I'm currently thinking about my implementation strategy here.

I'll likely use Nitro because VisionCamera has a bunch of native objects it passes around (like frameProcessor={...}, device={...}, or format={...}) as well as instances it creates imperatively (prepareRecordingSession(): RecordingSession, takePhoto(): Image).
This is not possible with Turbo/Fabric, but very well supported in Nitro - at least as of right now.

However Nitro does not have first-class view support, and I'd have to somehow bridge that over to a Turbo/Fabric view. (e.g. like this).
