- Overview
- Demo
- Motivation
- Technical Aspect
- Installation And Run
- Directory Tree
- To Do
- Bug / Feature Request
- Technologies Used
- Credits
This project is a hand tracking application built using OpenCV and MediaPipe. The application detects hands in real time from a webcam feed, tracks the positions of hand landmarks, and lets the user draw and erase on the screen with hand gestures.
Here's a brief demonstration of the hand tracking application in action:
The motivation behind this project was to explore computer vision and hand tracking technologies, and to demonstrate them in a fun, interactive application.
The application uses the following technologies and libraries:
- OpenCV: An open-source computer vision and machine learning software library.
- MediaPipe: A cross-platform framework developed by Google for building multimodal applied machine learning pipelines.
- NumPy: A library for the Python programming language, adding support for large, multi-dimensional arrays and matrices.
The application follows these steps:
- Capture video from a webcam.
- Use MediaPipe to detect and track hand landmarks in the video frames.
- Implement different modes for drawing and erasing on the screen based on hand gestures.
- Overlay the drawing on the video feed and display it to the user.
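The repository's own VirtualPainter.py and HandTrackingModule.py are not reproduced here, but a minimal sketch of this pipeline, built directly on the libraries above, could look like the following. The gesture mapping (index finger up to draw, index and middle fingers up to erase) is an assumption for illustration; only the MediaPipe landmark numbering itself (8 = index fingertip, 6 = index PIP joint, 12 = middle fingertip, 10 = middle PIP joint) is standard.

```python
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)          # default webcam
canvas = None                      # persistent drawing layer
prev_point = None                  # last fingertip position while drawing

with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.7,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)                 # mirror for natural interaction
        if canvas is None:
            canvas = np.zeros_like(frame)

        # MediaPipe expects RGB input
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            h, w, _ = frame.shape
            # Landmark 8 is the index fingertip; a finger counts as "up"
            # when its tip is above its middle (PIP) joint in the image.
            ix, iy = int(lm[8].x * w), int(lm[8].y * h)
            index_up = lm[8].y < lm[6].y
            middle_up = lm[12].y < lm[10].y

            if index_up and not middle_up:
                # Drawing mode: only the index finger is up
                if prev_point is not None:
                    cv2.line(canvas, prev_point, (ix, iy), (255, 0, 255), 5)
                prev_point = (ix, iy)
            elif index_up and middle_up:
                # Erasing mode: index and middle fingers up, paint black
                cv2.circle(canvas, (ix, iy), 30, (0, 0, 0), -1)
                prev_point = None
            else:
                prev_point = None

            mp_draw.draw_landmarks(frame, results.multi_hand_landmarks[0],
                                   mp_hands.HAND_CONNECTIONS)
        else:
            prev_point = None

        # Overlay the drawing on the live frame and display it
        output = cv2.addWeighted(frame, 1.0, canvas, 0.7, 0)
        cv2.imshow("Virtual Painter", output)
        if cv2.waitKey(1) & 0xFF == ord('q'):      # press q to quit
            break

cap.release()
cv2.destroyAllWindows()
```

Keeping the strokes on a separate canvas and blending it into each frame with cv2.addWeighted is what makes the drawing persist while the webcam image keeps changing underneath it.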
- Clone the repository or download the source code.
- Install the required packages by running the following command:
```bash
pip install -r requirements.txt
```
- Run the application with the following command:
```bash
python VirtualPainter.py
```
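The contents of requirements.txt are not reproduced in this README; based on the libraries listed above it would contain roughly the following (opencv-python is the usual PyPI package name for OpenCV, and the version pins are illustrative assumptions):

```text
opencv-python>=4.5
mediapipe>=0.8
numpy>=1.21
```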
```
│   app.py
│   HandTrackingModule.py
│   README.md
│   requirements.txt
└───Header
        # Images for header will be stored here
```
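The Header folder presumably holds the toolbar graphics (for example colour and eraser selectors) that the painter overlays across the top of the frame. The repository's loading code is not shown here; a rough sketch of how such images could be loaded and pasted onto each frame with OpenCV:

```python
import os
import cv2

HEADER_DIR = "Header"   # assumed path, matching the directory tree above

# Load every image in the Header folder in a stable (sorted) order.
overlays = [cv2.imread(os.path.join(HEADER_DIR, name))
            for name in sorted(os.listdir(HEADER_DIR))
            if name.lower().endswith((".png", ".jpg", ".jpeg"))]
header = overlays[0]    # currently selected header image

def apply_header(frame, header_img):
    """Paste the header image across the full width of the top of the frame."""
    h = header_img.shape[0]
    resized = cv2.resize(header_img, (frame.shape[1], h))
    frame[0:h, :] = resized
    return frame
```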
- Improve hand tracking accuracy and responsiveness.
- Add support for more gestures and interactions.
- Optimize performance for lower-end hardware.
If you find a bug or have a feature request, please open an issue here.
- MediaPipe
- OpenCV
- NumPy