This week's goal was to get the first portion of the code up and running. This portion focused on getting hand detection to function on the laptop. To achieve this, further research was done on MediaPipe (https://google.github.io/mediapipe/solutions/hands.html).
MediaPipe is a module created by Google that provides a set of different ML (machine learning) solutions, such as object tracking, face detection, and iris tracking. Within this Python module, one solution is dedicated solely to hand tracking, and another tracks the user's whole body. For the ViBRA, the hand tracking solution was the main focus, although we will later need to pull just the elbow landmarks out of the full-body tracking solution. Week 7 consisted of writing the code to access the hand tracking solution and attempting to get it to function; that attempt was unsuccessful, and the code did not run as expected. Jumping into week 8, the focus was concentrated on getting the code to run with the correct output; see the sketch and video below.
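For reference, here is a minimal sketch of what a hand-tracking script of this kind looks like, based on the linked MediaPipe Hands documentation. The confidence thresholds, camera index, and window name are illustrative assumptions, not values taken from our actual code.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

# Open the laptop's default webcam (index 0 is an assumption).
cap = cv2.VideoCapture(0)

with mp_hands.Hands(
        max_num_hands=2,
        min_detection_confidence=0.5,  # illustrative threshold
        min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        success, frame = cap.read()
        if not success:
            break

        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        # Draw the 21 landmarks for each detected hand onto the frame.
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                mp_drawing.draw_landmarks(
                    frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)

        cv2.imshow('MediaPipe Hands', frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break

cap.release()
cv2.destroyAllWindows()
```

When a script like this runs correctly, the webcam feed shows the hand landmarks drawn over each detected hand, which is the kind of output shown in the video below.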

Link to video: https://screenrec.com/share/KwRxST9Ezm