
Virtual-Mouse

Real Time Implementation of Virtual Mouse using Computer Vision

DEPENDENCIES:

1) MediaPipe
2) OpenCV
3) math (Python standard library)
4) NumPy
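The third-party packages can typically be installed with pip, e.g. pip install mediapipe opencv-python numpy (the exact versions used by the author are not pinned in the repository); math ships with Python, so it needs no separate installation.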

WHAT IS MEDIAPIPE?

MediaPipe is a framework for building machine learning pipelines that process time-series data such as video and audio. This cross-platform framework runs on desktop/server, Android, iOS, and embedded devices such as the Raspberry Pi and Jetson Nano.

WHAT IS MEDIAPIPE HANDS?

MediaPipe includes MediaPipe Hands, a high-fidelity hand and finger tracking solution. It employs machine learning to infer 21 3D landmarks of a hand from a single frame. Internally it uses an ML pipeline of two models working together: a palm detection model that operates on the full image and returns an oriented hand bounding box, and a hand landmark model that operates on the cropped image region defined by the palm detector and returns high-fidelity 3D hand keypoints, as shown below.

[Image: the 21 hand landmarks detected by MediaPipe Hands]
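The following is a minimal, self-contained sketch (not the repository's own code) of how MediaPipe Hands can be used together with OpenCV to detect the 21 landmarks in a live webcam feed; values such as max_num_hands and min_detection_confidence are illustrative.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                # Each hand has 21 landmarks with normalized x, y (and relative z).
                mp_draw.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("MediaPipe Hands", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()
```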

WHAT'S INCLUDED IN THIS REPOSITORY?

1) handTrackingModule.py

This module detects hands, identifies their landmarks, and returns the positions of those landmarks.

2) virtual_mouse.py

This is the main script for the real-time implementation of the virtual mouse; it builds on handTrackingModule.py. A sketch of the core idea follows below.
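The repository's exact mouse-control code is not reproduced here, but the core idea can be sketched as follows: map the index fingertip landmark to screen coordinates and trigger a click when the index and middle fingertips come close together. pyautogui, the 640x480 camera resolution, and the 100-pixel frame margin are assumptions made for illustration, not details taken from the source.

```python
import math
import numpy as np
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()
CAM_W, CAM_H = 640, 480          # assumed webcam resolution
FRAME_MARGIN = 100               # active region inside the camera frame

def move_cursor(index_x, index_y):
    # Interpolate from the camera's active region to the full screen.
    screen_x = np.interp(index_x, (FRAME_MARGIN, CAM_W - FRAME_MARGIN), (0, SCREEN_W))
    screen_y = np.interp(index_y, (FRAME_MARGIN, CAM_H - FRAME_MARGIN), (0, SCREEN_H))
    pyautogui.moveTo(screen_x, screen_y)

def maybe_click(index_xy, middle_xy, threshold=40):
    # Click when the index and middle fingertips are closer than the threshold (in pixels).
    if math.hypot(index_xy[0] - middle_xy[0], index_xy[1] - middle_xy[1]) < threshold:
        pyautogui.click()
```

The interpolation step is what keeps cursor movement smooth across the whole screen even though the hand only moves within a small region of the camera frame.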

VIDEO DEMO:

https://drive.google.com/file/d/1MfvZ1jSLdncAgq_Wnb-7ruPF7NvlRuc3/view?usp=sharing
