@@ -113,6 +113,10 @@ bazel to build the iOS application. The content of the
 5.  `Main.storyboard` and `Launch.storyboard`
 6.  `Assets.xcassets` directory.
 
+Note: In newer versions of Xcode, you may see additional files `SceneDelegate.h`
+and `SceneDelegate.m`. Make sure to copy them too and add them to the `BUILD`
+file mentioned below.
+
 Copy these files to a directory named `HelloWorld` at a location that can access
 the MediaPipe source code. For example, the source code of the application that
 we will build in this tutorial is located in
@@ -247,6 +251,12 @@ We need to get frames from the `_cameraSource` into our application
 `MPPInputSourceDelegate`. So our application `ViewController` can be a delegate
 of `_cameraSource`.
 
+Update the interface definition of `ViewController` accordingly:
+
+```
+@interface ViewController () <MPPInputSourceDelegate>
+```
+
 To handle camera setup and process incoming frames, we should use a queue
 different from the main queue. Add the following to the implementation block of
 the `ViewController`:
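The added code itself falls just outside this hunk. A minimal sketch, assuming a serial queue created in `viewDidLoad` (the ivar name, queue label, and QoS class here are assumptions, not the tutorial's exact lines):

```
@implementation ViewController {
  // Sketch: processes camera setup and incoming frames off the main queue.
  // The name _videoQueue is an assumption.
  dispatch_queue_t _videoQueue;
}

- (void)viewDidLoad {
  [super viewDidLoad];

  // A serial queue keeps frame handling ordered; the QoS class and label
  // below are assumptions for this sketch.
  dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
      DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
  _videoQueue = dispatch_queue_create("com.example.videoQueue", qosAttribute);
}

@end
```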
@@ -288,6 +298,12 @@ utility called `MPPLayerRenderer` to display images on the screen. This utility
 can be used to display `CVPixelBufferRef` objects, which is the type of the
 images provided by `MPPCameraInputSource` to its delegates.
 
+In `ViewController.m`, add the following import line:
+
+```
+#import "mediapipe/objc/MPPLayerRenderer.h"
+```
+
 To display images on the screen, we need to add a new `UIView` object called
 `_liveView` to the `ViewController`.
 
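Beyond what this hunk shows, hooking `MPPLayerRenderer` up to the new `_liveView` typically amounts to a few lines in `viewDidLoad`. In this sketch the `_renderer` ivar and the scale-mode choice are assumptions, not quoted from the tutorial:

```
// Sketch: attach the renderer's layer to _liveView so camera frames can be
// drawn on screen. _renderer is assumed to be an MPPLayerRenderer* ivar.
_renderer = [[MPPLayerRenderer alloc] init];
_renderer.layer.frame = _liveView.layer.bounds;
[_liveView.layer addSublayer:_renderer.layer];
// Fill the view, cropping the image if the aspect ratios differ (assumed mode).
_renderer.frameScaleMode = MPPFrameScaleModeFillAndCrop;
```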
@@ -411,6 +427,12 @@ Objective-C++.
 
 ### Use the graph in `ViewController`
 
+In `ViewController.m`, add the following import line:
+
+```
+#import "mediapipe/objc/MPPGraph.h"
+```
+
 Declare static constants with the name of the graph, the input stream and the
 output stream:
 
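The declarations themselves fall outside this hunk. For the edge detection graph used in this tutorial they plausibly look like the following; the specific names are assumptions consistent with MediaPipe's mobile GPU examples:

```
// Sketch: names assumed, matching MediaPipe's mobile GPU examples.
static NSString* const kGraphName = @"mobile_gpu";
static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
```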
@@ -549,6 +571,12 @@ method to receive packets on this output stream and display them on the screen:
 }
 ```
 
+Update the interface definition of `ViewController` with `MPPGraphDelegate`:
+
+```
+@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>
+```
+
 And that is all! Build and run the app on your iOS device. You should see the
 results of running the edge detection graph on a live video feed. Congrats!
 
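The closing brace visible at the top of the hunk above belongs to that delegate method. As a sketch (the body is assumed, not quoted from the tutorial; it relies on Objective-C++ for `std::string`):

```
// Sketch of the MPPGraphDelegate callback. MediaPipe invokes it off the main
// thread, so rendering is dispatched back to the main queue.
- (void)mediapipeGraph:(MPPGraph*)graph
    didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
              fromStream:(const std::string&)streamName {
  if (streamName == kOutputStream) {
    // Retain the buffer across the async hop; release it after rendering.
    CFRetain(pixelBuffer);
    dispatch_async(dispatch_get_main_queue(), ^{
      [_renderer renderPixelBuffer:pixelBuffer];
      CFRelease(pixelBuffer);
    });
  }
}
```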
@@ -560,5 +588,5 @@ appropriate `BUILD` file dependencies for the edge detection graph.
 
 [Bazel]:https://bazel.build/
 [`edge_detection_mobile_gpu.pbtxt`]:https://github.com/google/mediapipe/tree/master/mediapipe/graphs/edge_detection/edge_detection_mobile_gpu.pbtxt
-[common]:(https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/common)
-[helloworld]:(https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/helloworld)
+[common]:https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/common
+[helloworld]:https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/helloworld