
What are you using ROS# for? #20

Open
MartinBischoff opened this issue Nov 24, 2017 · 49 comments

@MartinBischoff
Collaborator

MartinBischoff commented Nov 24, 2017

We are very curious what you use ROS# for!

Let us use this issue to present our projects, whether finished, ongoing, or just a first idea.

I'll start! Here is our first public project:

teleoperation

A TurtleBot 2 is teleoperated via the touchpad controllers of the HTC Vive.
A 3D point cloud of the camera image is rendered in VR, alongside the mapping data and the moving TurtleBot. The wheels of the TurtleBot are animated via JointState messages (see the sketch below). Users can teleport to different positions in the scene.
Furthermore, users can switch between teleoperation and simulation mode in real time. In simulation mode, RosBridgeClient disconnects and users control the URDF simulation model by applying a torque to the wheels. With the VR glasses on, it feels quite similar. The only differences are that the camera image is missing and the map no longer updates. 😏
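
For those wondering how the wheel animation works: here is a minimal sketch using ROS#'s UnitySubscriber pattern. The joint names, wheel transforms, and spin axis are assumptions for a TurtleBot 2 style setup, not the exact code of our project.

```csharp
using RosSharp.RosBridgeClient;
using UnityEngine;

// Minimal JointState subscriber: caches the wheel angles received from ROS
// and applies them to the URDF wheel transforms on Unity's main thread.
// Requires a RosConnector on the same GameObject; set Topic to e.g. "/joint_states".
public class WheelStateSubscriber : UnitySubscriber<MessageTypes.Sensor.JointState>
{
    public Transform LeftWheel;   // wheel links of the imported URDF model
    public Transform RightWheel;

    private float leftAngleDeg;
    private float rightAngleDeg;

    protected override void ReceiveMessage(MessageTypes.Sensor.JointState message)
    {
        // Called on the websocket thread, so only cache values here.
        for (int i = 0; i < message.name.Length; i++)
        {
            if (message.name[i] == "wheel_left_joint")   // Kobuki-style joint names (assumption)
                leftAngleDeg = (float)message.position[i] * Mathf.Rad2Deg;
            else if (message.name[i] == "wheel_right_joint")
                rightAngleDeg = (float)message.position[i] * Mathf.Rad2Deg;
        }
    }

    private void Update()
    {
        // Spin around the local x-axis; the correct axis depends on the URDF.
        LeftWheel.localRotation = Quaternion.Euler(leftAngleDeg, 0, 0);
        RightWheel.localRotation = Quaternion.Euler(rightAngleDeg, 0, 0);
    }
}
```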

@lucascoelhof

Hi all,

I'm a master's student at UFMG, in Brazil. My master's thesis project is about robot swarm teleoperation, and since we had an Oculus Rift lying around, we thought it was a good idea to use it. However, as you may know, the Oculus Rift does not have a fully functional Linux driver and there are no plans to officially support one, so I made numerous attempts to somehow get something running in ROS that I could visualize on the Oculus Rift. I probably lost six months trying different approaches.
In one of those attempts I started using Unity, and while looking for C# libraries for ROS I stumbled upon ROS#. It solved all my problems!
You guys had a URDF importer (something I hadn't even realized I needed, but it was extremely useful) and had also implemented rosbridge with some message types that I needed. In a few HOURS I could implement what I had been trying to accomplish for months. 😄

Since then, I'm using ROS# to transfer data between Unity and ROS. From Unity, I need the Oculus poses and button states, and ROS sends back the positions of my dear little robots and some other visualization data.
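
For anyone curious, publishing a tracked pose from Unity follows the standard ROS# UnityPublisher pattern; here is a minimal sketch. The topic, frame id, and controller transform are placeholders, not my actual code.

```csharp
using RosSharp;                    // Unity2Ros() coordinate conversions
using RosSharp.RosBridgeClient;
using UnityEngine;

// Minimal pose publisher: sends the tracked controller pose to ROS each frame.
// Requires a RosConnector on the same GameObject; set Topic in the inspector.
public class ControllerPosePublisher : UnityPublisher<MessageTypes.Geometry.PoseStamped>
{
    public Transform Controller;        // e.g. the Oculus Touch anchor (placeholder)
    public string FrameId = "unity";    // placeholder frame id

    private MessageTypes.Geometry.PoseStamped message;

    protected override void Start()
    {
        base.Start();
        message = new MessageTypes.Geometry.PoseStamped();
        message.header.frame_id = FrameId;
    }

    private void Update()
    {
        // Unity is left-handed (y-up), ROS is right-handed (z-up);
        // ROS# ships Unity2Ros() extensions for the conversion.
        Vector3 p = Controller.position.Unity2Ros();
        Quaternion q = Controller.rotation.Unity2Ros();

        message.pose.position.x = p.x;
        message.pose.position.y = p.y;
        message.pose.position.z = p.z;
        message.pose.orientation.x = q.x;
        message.pose.orientation.y = q.y;
        message.pose.orientation.z = q.z;
        message.pose.orientation.w = q.w;
        Publish(message);
    }
}
```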

@jeremyfix
Contributor

Hi,

I like to test control and visual processing algorithms on robotic platforms. While we can certainly test them on real platforms, it is clearly easier to test them on simulated robots.

To achieve this, I used to work with V-REP interfaced with ROS. However, it turned out that for a particular project I needed realistic shading, light reflections, etc. With my limited knowledge, I think this is not easily achieved with V-REP. I then turned to game engines like Unreal Engine and, more recently, Unity.

Unfortunately, I initially dismissed Unity, wrongly thinking it was not possible to use without a license. I therefore invested some time trying to achieve my purpose with Unreal Engine, especially trying to use Microsoft AirSim (yes, I needed a drone as well). But Unreal Engine really requires a beast, not a computer :) I don't even know how easily it would have interfaced with the ROS world, but the time it takes to compile the engine definitely makes for very slow progress. More recently, colleagues suggested using Unity, and it turns out that 1) it is very quick to set up and run even on moderately decent hardware, and 2) combined with ROS#, it seems to fit my needs perfectly.

So, for visually controlled robots, I can benefit from Unity's 3D rendering capabilities while still developing/using algorithms on the ROS side. And it is also more fun to play with nice-looking simulations :)

@Jasonthefirst

I am a student at KIT Karlsruhe trying to control a KUKA robot with the HoloLens. Right now we are trying to use the ROS# library, but UWP is putting up resistance. So far we have only used the ROS# URDF importer to get our robot into Unity, because it works better than the old URDF importer. After the import we have to delete ROS#, because otherwise we can't compile the project.
The project goal is to control a robot virtually: only if the robot can perform a movement without colliding in the virtual world should it move in the real world.

@MartinBischoff
Collaborator Author

MartinBischoff commented Dec 14, 2017

Hi @Jasonthefirst , thanks for sharing your project with us! We also had a small project with the HoloLens this year. Unfortunately without ROS#, though 😉

Let's discuss platform requirements in issue #21. Or, alternatively, please open a new issue if more appropriate.

I find it fascinating what different types of applications, robots, and VR/AR equipment you guys use ROS# with. Looking forward to hearing more stories!

@MartinBischoff
Collaborator Author

MartinBischoff commented Apr 16, 2018

We are very proud and happy that parts of ROS# have already found their way into Process Simulate, a Siemens simulation software for production lines.

My colleagues just sent me the link to this YouTube video illustrating the import of a URDF model.

@jeremyfix
Contributor

Hi @MartinBischoff , here is a follow-up with a particular example of Unity3D + ROS# in which we simulate underground environments explored by drones.

You can have a look at it here: https://youtu.be/XajgNfNJ1VI

The full code is not yet provided; for the moment, only the code of the drone is available at https://github.com/jeremyfix/ros-and-unity/tree/master/DronePrefab

@MartinBischoff
Collaborator Author

Very impressive @jeremyfix ! Thank you for sharing this info here.
I am happy to see what you can do with ROS#. Are you using this setup to test any (autonomous) functions of the drone? Is the similarity between real and virtual camera data of importance for you here?

All: please do not hesitate to tell us about your applications here. They needn't be as elaborate as the applications above. We are also happy to read about any planned or ongoing tryouts...

@jeremyfix
Contributor

@MartinBischoff I do indeed use this setup for testing autonomous navigation underground. This navigation is based on optic flow. To test this ability easily, we needed a simulated environment with 3D shapes, textures, and lighting, so as to be as faithful as possible to what we may encounter in real underground environments.

@NishanthJKumar

Hi all,

I'm an undergraduate at Brown University in Providence, and my lab currently uses ROS# for a variety of projects, including VR and AR visualization and teleoperation of a Rethink Robotics Baxter and a Kinova Robotics Movo.

@MartinBischoff, I'm currently attempting to implement something similar to your TurtleBot teleoperation project, but on a different robot. Is the Unity project for this online somewhere, or is there a tutorial on how to integrate a depth image with the RGB camera? That'd be very helpful!

Our lab will post links to videos and images of the project soon! Excited to share more with all of you.

@MartinBischoff
Collaborator Author

MartinBischoff commented Jun 19, 2018

Hi @NishanthJKumar ! Thanks for sharing this info about your project here!
I am looking forward to seeing the videos and images!

We did not publish the code for the TurtleBot application as open source. It would be too much effort to make the code readable, document it, fix bugs, and administer it properly. I hope you understand.

For the 3D point cloud we proceeded as follows:

  • We used the OpenCV library to uncompress the CompressedImage data.
  • We used a script like this to get the 3D points from the two 2D images.
  • We then put this data into a mesh with a simple shader that renders point data only.
  • By updating only parts of the mesh (e.g. 20%) in each frame, you can improve the performance of your application (see the sketch below).
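
To make the last two bullets concrete, here is a minimal sketch of the mesh-based point rendering, assuming the decompressed RGB and depth images have already been back-projected into point and color arrays (the names and component setup are placeholders, not our project's code):

```csharp
using UnityEngine;

// Renders a point cloud by writing vertices into a mesh drawn with point topology.
// A simple vertex-color shader on the MeshRenderer then draws one point per vertex.
[RequireComponent(typeof(MeshFilter))]
public class PointCloudMesh : MonoBehaviour
{
    private Mesh mesh;

    private void Awake()
    {
        mesh = new Mesh();
        mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32; // allow >65k points
        GetComponent<MeshFilter>().mesh = mesh;
    }

    // Call this with the output of the depth/RGB back-projection step.
    public void SetPoints(Vector3[] points, Color[] colors)
    {
        int[] indices = new int[points.Length];
        for (int i = 0; i < indices.Length; i++)
            indices[i] = i;

        mesh.Clear();
        mesh.vertices = points;
        mesh.colors = colors;
        mesh.SetIndices(indices, MeshTopology.Points, 0);
    }
}
```

For the partial-update trick in the last bullet, keep the arrays allocated and rewrite only a slice (e.g. 20%) of the vertices and colors each frame before reassigning them to the mesh.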

@MahirGulzar

MahirGulzar commented Jun 22, 2018

Hello everyone,

I am a master's student in Computer Science at the University of Tartu, Estonia. I am currently doing an internship at an Estonian robotics organization. My task here is to integrate ROS with Unity to display statistics and other operations on a HoloLens device.

So far it was going well with the ROS# library, but I had to abandon it. The problem arises with sockets: the current version of ROS# doesn't work with UWP. In fact, System.Net.Sockets does not work with UWP at all. After digging deeper, I found out that UWP supports Windows.Networking for sockets, but Unity cannot recognize this. So right now I am trying to find a workaround to get past this issue.

By the way, I followed your thread on the ROS# upgrade, in which you use IProtocol to pass the socket type on connection, but the issue still persisted.
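
For reference, this is roughly what the UWP-native side looks like with Windows.Networking.Sockets: a minimal, untested sketch of connecting to rosbridge with MessageWebSocket (wiring it into ROS#'s IProtocol abstraction is left out here):

```csharp
#if ENABLE_WINMD_SUPPORT
using System;
using Windows.Networking.Sockets;
using Windows.Storage.Streams;

// Connects to rosbridge with the UWP-native MessageWebSocket instead of System.Net.Sockets.
public class UwpRosBridgeSocket
{
    private MessageWebSocket socket;
    private DataWriter writer;

    public async void Connect(string uri) // e.g. "ws://192.168.0.1:9090"
    {
        socket = new MessageWebSocket();
        socket.Control.MessageType = SocketMessageType.Utf8; // rosbridge speaks JSON
        socket.MessageReceived += OnMessageReceived;
        await socket.ConnectAsync(new Uri(uri));
        writer = new DataWriter(socket.OutputStream);
    }

    public async void Send(string json)
    {
        writer.WriteString(json);
        await writer.StoreAsync();
    }

    private void OnMessageReceived(MessageWebSocket sender,
        MessageWebSocketMessageReceivedEventArgs args)
    {
        using (DataReader reader = args.GetDataReader())
        {
            reader.UnicodeEncoding = UnicodeEncoding.Utf8;
            string json = reader.ReadString(reader.UnconsumedBufferLength);
            // hand the JSON off to the rosbridge message handler here
        }
    }
}
#endif
```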

@kwunh

kwunh commented Jul 13, 2018

Hello,

I'm currently working on a project where I have to implement a way to teleoperate a Baxter robot using an HTC Vive, all on a single Linux (Ubuntu) machine. Honestly, a really daunting task, as it's my first year of coding; but I was mainly using Java before, so C# is thankfully not too far off. Anyway, I have Unity working on my machine and have learned the basics of ROS; now I want to start communicating between the two.

This seems to be exactly the tool I need, so thank you so much for providing this! But I just want to make sure this is actually possible on a single Linux machine, as the documentation only mentions a Windows build.

I've seen posts from people both successfully and unsuccessfully using ROS# on a Linux machine... I won't get an opportunity to test this until Monday, but I'd like to know what other people's experiences with ROS# on Linux have been like. Using Ubuntu 16.04, by the way. Thanks for any input! :)

@vbschettino

Hi @MartinBischoff ,

First, thanks for making this awesome project available! I'm trying to achieve something very similar to what you've done with the TurtleBot and Vive, but all on Linux. I already have the correct setup for integrating Vive, Unity, and ROS using ROS#, but only for the standard message types provided in the Unity package. Could you, or someone else who has achieved this, elaborate a bit more on how you handled the point cloud/RGB-D data coming from ROS and used meshes to render it? I'm a beginner in Unity and couldn't quite get the idea.

@feofeona

Hi everyone,

I am a researcher at National Chengchi University in Taiwan. We are currently developing AI for drones, and my task is to build a ground control station to monitor the drones and supply other team members with visual representations of data.

I am actually a game designer and quite familiar with Unity3D, but I lack domain knowledge of drones and ROS. I had been dealing with QGroundControl for the past few weeks, but progress was slow, so I was searching for an alternative solution until I found ROS#.

I am using Unity3D on Ubuntu 16.04 and trying to communicate with my Intel Aero RTF drone with ROS onboard. Any suggestions are welcome :D

@samiamlabs

Hi.

I'm planning to make a new version of an automatic forklift project I worked on a year ago. See: http://minireach.readthedocs.io/en/latest/demo.html

Multiple fully simulated trucks in Gazebo were rather slow; hopefully Unity works better...

@MischaRo
Contributor

Hello,
thank you @MartinBischoff for making ROS# possible. Together with @awesome-manuel, we are working on a project to bring a tool for visualizing nodes, topics, services... (similar to RViz) into 3D space using the HoloLens and other AR/VR devices. For this, we are currently working with the ROS# UWP support project of @dwhit. Our goal is to create a user-friendly and intuitive interface to the underlying ROS processes that is applicable independently of the particular use case. The information could also be displayed in space relative to a real robot, to give a better understanding of the data and the robot's functions. Later in the project, we also plan to create a use case with a real robot to show off the different functionalities of the interface. We are really looking forward to working with ROS# :)

@MartinBischoff
Collaborator Author

Hi @MischaRo and @awesome-manuel ! In pull request #135 you mentioned an appveyor.yml that you set up to build ROS# with AppVeyor. If you think it is of interest to others, please feel free to upload it, e.g. in a new public GitHub project of yours.

@awesome-manuel
Contributor

We see two use cases for AppVeyor:

  1. Build (and test) feedback for pull requests
  2. Automated build and distribution of DLLs, instead of manually building and pushing them into the repository

We can do both in our own repo, but the first case in particular makes more sense in your repo.

@Y0rk-code

Hi,

What am I using ROS# for?...

I have some basic Unity and ROS experience.
So the main goal is learning, and maybe using it someday in a project interfacing ROS<->Unity.

I managed to import the Niryo One URDF. Next step: move the Unity robot from ROS.
https://niryo.com/forums/topic/ros/

I like this ROS# project, thank you very much!

niryo_unity

@bsaund
Contributor

bsaund commented Feb 10, 2019

Hello,
I am using ROS# in a system to teleoperate a physical robot using an HTC Vive.
https://www.youtube.com/watch?v=EahUsJKVfw8

ROS# was a great tool for connecting Unity to our ROS ecosystem.

@MartinBischoff
Collaborator Author

Impressive video @bsaund . I'm happy to read this positive feedback about ROS#.

@stevensu1838

stevensu1838 commented Jun 25, 2019

@bsaund
Hey buddy, I am a user of ROS# as well. Can you please share your project? May I learn from you how to convert a point cloud and display it in Unity? Any suggestions would be appreciated.

@stevensu1838

stevensu1838 commented Jun 25, 2019


Hi @MartinBischoff
The script you linked above is not working anymore. Can you please re-link it? Cheers

@MartinBischoff
Collaborator Author

Link corrected (also here).
Please use this issue only for posting the cool things you are doing (or planning to do) with ROS#.

@bsaund
Contributor

bsaund commented Jun 25, 2019

@stevensu1838
See my project: https://github.com/UM-ARM-Lab/unity_victor_teleop/wiki/Connecting-to-the-Kinect
The Kinect portion should be general, though some pieces of the full project are specific to my robot. Please post issues if you need further clarification.

@MartinBischoff
Collaborator Author

Here we applied ML-Agents to train the Shadow Hand URDF to catch a falling ball.

Further info on this project, which we already carried out in 2018, can be found here.

Special thanks to all students involved in this project
... and to all supporters of the underlying open source software.

@Rhybo

Rhybo commented Nov 8, 2019

ROS# was the backbone for my thesis entitled "Observational Oversight for Understanding Trust in Interactive Human and AI Systems."

I developed a 3D user interface in Unity that used the output of simulated UxVs with on-board ROS processes to instantiate them in the virtual environment in the correct orientation (odometry), accurately place them in the virtual world with the help of the CoordinateSharp repo and Mapbox, and issue positional change commands, all while having a near real-time depiction of the UxV positions within the environment.

ROS# made the generation of custom messages effortless and the integration with Unity easy.

image

@MartinBischoff
Collaborator Author

@Rhybo thank you for this info and this positive feedback about ROS#.
I'm glad that ROS# was a help in your thesis.
I just read the abstract to understand a bit more of what you were doing. It's impressive to see ROS# in operation for applications we never thought of when making it open source.

@LukasSeipel

Hi there!
Thanks for ROS# !
I am using it for my master's thesis, in which I want to control a mobile robot via an AR user interface.
I made a short video about it:
https://youtu.be/Nl5RxR_Plu0

All the best

@MartinBischoff
Collaborator Author

Thanks for this post and for sharing the video @LukasSeipel !

@EricVoll

Hi!

We (a bunch of ETH Zürich students) are working together with Microsoft on our project. Mainly, we are building a (more or less) plug-and-play HoloLens 1/2 data visualization and interaction tool for robots.
For that, we extended parts of the URDF parser and the (Unity-side) GameObject generator. The adjusted version can now also synchronize dynamic URDFs with the existing robot GameObject.

We also set up a REST server in a Docker container and can now request different robots at runtime via POST requests. We actually created Docker containers with the file_server, ROS, rosbridge, and everything needed to start working with ROS#.

Our project deadlines are coming up in a few weeks, and I'll post some videos then :)

I think we made a few changes you could be interested in:

  • URDF parser extensions
  • Unity GameObject synchronizer for robot objects
  • Docker containers running all the "Ubuntu stuff"

This is our ROS# fork and this is the general project repository. (Both repositories are not cleaned up yet; that will happen in the next few weeks.)

Thank you very much for all your work! This repository enabled us to do a cool project 🙂

@MartinBischoff
Collaborator Author

Hi @luchspeter ! Thanks for the info and for sharing your work with the general public. I am looking forward to seeing the videos of your project.

@EricVoll

EricVoll commented Jun 16, 2020

Here is the video of our project (with a bit of a delay): https://youtu.be/_RWhjLkz5sk
We are currently waiting for a HoloLens 2, which should arrive in a few weeks, so we can make a better video. 🙂

@berkayalpcakal
Contributor

Hello! I would like to share a project we have done for the course "Medical Augmented Reality" at TU Munich.

Here we instantiate a robotic arm hologram on the HoloLens, set target points for the arm by tracking markers, send the environment mesh obtained by the HoloLens to ROS, let MoveIt plan a collision-free trajectory, and watch the hologram robot execute the trajectory.

Check out the videos!
https://drive.google.com/drive/folders/1Juz5tdw5dTWx5OKgjLzZ4U2TZSK-6dLV?usp=sharing

Here is the repo:
https://github.com/berkayalpcakal/ROS-HoloLens-Integration

Software Framework

demo

@glennliu

Hi

I'm using ROS# for my HoloLens interface. I built an AR interface on the HoloLens to remotely interact with an autonomous drone. The hologram contains the 3D occupancy map from the drone, real-time odometry, and a target drone for manipulation.

holo_ar

The AR interface was tested with an autonomous drone flying on our HKUST campus.
Paper link (accepted by IROS 2020): https://arxiv.org/abs/2008.02234
Video link: https://www.youtube.com/watch?v=a_rlThegoB0

Glen

@saltyseabastard

I work for Bastian Solutions on the R&D team in Boise, ID. I am using RosSharp to create a shared deterministic navigation interface for our various automated vehicles. Thanks for all your hard work, everyone!

@MartinBischoff
Collaborator Author

Thanks @saltyseabastard for dropping a line here! Happy to read this software is of use to you.

@EricVoll

EricVoll commented Sep 22, 2020

Hi 👋

I was told that I am allowed to do a little advertisement for my fork. I currently maintain the UWP support fork of this repository over at https://github.com/EricVoll/ros-sharp/
This ROS# version can be deployed to UWP apps and, with that, to HoloLens 1 & 2.
img
I recently updated the fork from the previous maintainer, and the UWP fork is now almost up to date with the original ROS# and should also contain all features. It is, of course, fully compatible with MRTK (versions 2.3 & 2.4 tested).

Thanks again to @MartinBischoff for your work.

@MartinBischoff
Collaborator Author

MartinBischoff commented Oct 21, 2020

Recently, we initiated a new open source project on github/siemens that we named Evaluation Framework.
It consists of three self-contained tools that facilitate exploring many design variants and finding the best compromise solution.

Evaluation Framework can also be used in combination with ROS# as demonstrated here.

Thanks go to Michael Dyck for his commitment to this project.

@MartinBischoff
Collaborator Author

MartinBischoff commented Feb 18, 2021

Check out this article for new ideas on what you can do with ROS#.

@petpetpeter

petpetpeter commented Apr 20, 2021

Hello! I'm here to thank you for your contributions to ROS# and to share my project, "ARCore and ROS integration with ROS#".

For anyone wondering whether this tool is suitable for Android devices, please check out the video.
https://www.youtube.com/watch?v=28zOEsvyGYM

Android setup:
https://github.com/petpetpeter/ARrosCore

@Rive4

Rive4 commented May 17, 2021

I just wanted to thank you for this amazing repository and to share that we have been working with it to train a reinforcement learning neural network with which we enabled a platform to get close to a target position.

LabVR

And as a doctoral student, I am also working with the HoloLens 2 adapted fork in order to program robotic arms in an easier way (still work in progress).

@MartinBischoff
Collaborator Author

Thank you for your feedback and for sharing this info @Rive4 ! A beautiful scene image and a nice AI application.
Feel free to share further info here, e.g. when you have finished your thesis or published a demo video.

@WangDongBUAA

Hi @MartinBischoff , I am impressed by your project, and lately I took on a project involving VR and the TurtleBot 2 robot. I would like to know how to use VR to control the robot in real time.

@simongle99

Hi, I'm a French student and I'm currently working on a sonar. For my project I'm supposed to export data from ROS to Unity, so I chose to use ROS#. However, there is no script to collect this kind of data.
I'm not really familiar with C#, so I decided to "copy" an existing script and understand every part of it.
I created a new message type for my sonar in order to get and set data, but when I declare the constructor I get an error (CS0824). Here you can find a screenshot of my script creating the new message type:
image
Could someone explain to me why this doesn't work?
Thanks in advance

@EricVoll

@simongle99 this is not the right place to ask this question; I'd rather open a new issue for it :)
But the quick answer: remove the extern keyword and actually implement the constructor.
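
To make that concrete, here is a minimal sketch of a hand-written ROS# message class with an implemented (non-extern) constructor. The message name and fields are hypothetical, and the RosMessageName convention may differ between ROS# versions:

```csharp
using RosSharp.RosBridgeClient;

// Hypothetical sonar message mirroring a ROS msg like "my_sonar_msgs/SonarScan".
public class SonarScan : Message
{
    public const string RosMessageName = "my_sonar_msgs/SonarScan";

    public double range;
    public double bearing;

    // CS0824 appears when a constructor is declared 'extern' without a body.
    // Give the constructor a body instead:
    public SonarScan()
    {
        range = 0.0;
        bearing = 0.0;
    }

    public SonarScan(double range, double bearing)
    {
        this.range = range;
        this.bearing = bearing;
    }
}
```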

@simongle99

simongle99 commented May 11, 2022 via email

@oliviasculley

Hey! It looks like I forgot to mention my project NERVV, which I developed in 2020 while at Purdue University's Jun Laboratory. While I don't have any demo videos handy, NERVV lets you easily set up machines in Unity, parse input data to create a virtual twin, and even output this data to control machines with ROS!

Using a framework like NERVV to set up machines gives some benefits, like quickly getting input and output data flowing, but it also lets you build other cool things that don't depend on a specific machine or setup. For instance, I was able to implement virtual twins with features like (additional) collision detection, continuous interpolation, or inverse kinematics that worked on any of the NERVV machines.

Then, in the NERVV example project, I used that to build a VR demo app that lets you view live machine data, adjust any NERVV machine's axes manually, or activate IK on any machine to approximate your VR controller's location and orientation! All of this was made possible by ROS#, which was really easy to use, so thank you so much!

Unfortunately, I don't have any cool videos of the VR controllers in use on hand, but you can see some of the virtual-twinned machines in this photo, and these were being streamed live, I promise 😅
an example of the nervv example project

@memrecakal
Collaborator

Hi all, I would like to share my graduation project. Using ROS# (with ROS 2), I designed, simulated, and prototyped a multi-purpose remote-controlled underwater vehicle. The Unity engine is used for the simulation and the remote-control Android app, and MATLAB (with the ROS Toolbox) is used for the MIMO controller.

unity_sim_demo_compressed.mp4

Here is the repo link: https://github.com/memrecakal/Design-of-a-Multi-Purpose-Remote-Controlled-Underwater-Vehicle-Using-ROS-2-Unity-and-MATLAB-/tree/main
