Basic Hand Tracking in Python and OpenCV Using MediaPipe
2022-04-05

Today I would like to share how to track hands in Python and OpenCV using the MediaPipe library. Let's take a look.

OpenCV is an open-source library for image processing and computer vision. It can be used for tasks like face detection, object tracking, landmark detection, and much more, and it supports multiple languages, including Python, Java, and C++.

MediaPipe is Google's open-source framework for media processing. It is cross-platform and runs on Android, iOS, and the web.

[Figure: demo screenshot of detected hand landmarks]

First, we import the required libraries in lines 1 to 3.

Next, in line 6, we create a VideoCapture object to capture video from the webcam.

If no camera is found, an info message is displayed in line 9.

Then, in lines 14 to 16, we call MediaPipe's methods and create a Hands() object to detect hand landmarks.

Next, in line 21, while the camera is open, we run the hand-tracking loop.

In lines 22 and 23, we read an image from the camera and convert it to RGB, because hands.process(imgRGB) in line 25 accepts only RGB images.

In line 26, we get the hand landmarks easily with the help of MediaPipe. If we print the landmarks in line 27, we will see that they are coordinates.

Then we draw each hand's landmarks using MediaPipe's drawing_utils in lines 29 to 31.

We also show the camera's frame rate. We calculate the FPS from cTime (current time) and pTime (previous time) using the time library in lines 33 and 35. Before that, cTime and pTime are initialized to 0 in lines 18 and 19.

Then we write the FPS value onto the captured frame in line 37.

Finally, in lines 39 and 40, we display the live capture.

There we go.

[Figure: demo animation of live hand tracking with FPS overlay]

That is all for now. For further details, here are the reference links:

https://docs.opencv.org/4.x/d6/d00/tutorial_py_root.html
https://google.github.io/mediapipe/getting_started/python.html

Hope you enjoy it.

By Asahi
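Appendix: the original code listing that the line numbers above refer to is not reproduced here, so below is a sketch that reconstructs the script the walkthrough describes, step by step, in the same order. Variable and function names (e.g. `compute_fps`, `main`) are my own choices, not taken from the original listing; the MediaPipe and OpenCV calls (`mp.solutions.hands.Hands`, `hands.process`, `draw_landmarks`, `cv2.VideoCapture`, `cv2.putText`) are the standard APIs for this task.

```python
import time

import cv2
import mediapipe as mp


def compute_fps(c_time, p_time):
    """Frames per second from two consecutive frame timestamps."""
    return 1.0 / (c_time - p_time) if c_time > p_time else 0.0


def main():
    # Capture video from the default webcam (device index 0).
    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        print("Cannot open camera")
        return

    # Create the MediaPipe Hands() object and the drawing helpers.
    mp_hands = mp.solutions.hands
    mp_draw = mp.solutions.drawing_utils
    hands = mp_hands.Hands()  # defaults: detects up to 2 hands

    p_time = 0.0  # previous-frame timestamp, used for the FPS overlay
    while cap.isOpened():
        success, img = cap.read()
        if not success:
            break

        # Hands.process() expects RGB, but OpenCV captures BGR.
        img_rgb = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
        results = hands.process(img_rgb)

        # Draw each detected hand's 21 landmarks and their connections.
        if results.multi_hand_landmarks:
            for hand_lms in results.multi_hand_landmarks:
                mp_draw.draw_landmarks(img, hand_lms,
                                       mp_hands.HAND_CONNECTIONS)

        # Compute the frame rate and write it onto the frame.
        c_time = time.time()
        fps = compute_fps(c_time, p_time)
        p_time = c_time
        cv2.putText(img, str(int(fps)), (10, 70),
                    cv2.FONT_HERSHEY_PLAIN, 3, (255, 0, 255), 3)

        # Output the live capture; press q to quit.
        cv2.imshow("Hand Tracking", img)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```

Running this opens a window showing the webcam feed with the landmark skeleton drawn over any detected hands and the FPS in the corner, matching the demo above.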
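A note on the coordinates printed in line 27: MediaPipe hand landmarks are normalized, with x and y in [0, 1] relative to the image width and height (z is a relative depth with the wrist as origin). To draw or measure at a specific landmark yourself, you scale them to pixels. The helper below is my own illustration, not part of MediaPipe:

```python
def to_pixel(x_norm, y_norm, width, height):
    """Convert a MediaPipe normalized landmark position (x, y in [0, 1])
    to integer pixel coordinates for a width x height image."""
    return int(x_norm * width), int(y_norm * height)


# Example: the center of a 640x480 frame.
print(to_pixel(0.5, 0.5, 640, 480))  # (320, 240)
```

Given a landmark `lm` from `results.multi_hand_landmarks`, you would call `to_pixel(lm.landmark[0].x, lm.landmark[0].y, w, h)` where `w` and `h` come from `img.shape`.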