December 8, 2022 06:16 pm GMT

SignLanguage - Learn ASL Practically - (MongoDB Atlas Hackathon 2022 Submission)

What I built

SignLanguage is a platform where users can practically learn American Sign Language (ASL) using machine learning and access videos for more than 20,000 ASL phrases.


Category Submission: Search No More

App Link

https://signlanguage.webdrip.in/

App Previews & Features

Homepage

The homepage displays an overview of the entire application. Users can look up videos for phrases, learn the alphabet, and play games.

Signlanguage Homepage

Phrases Dictionary

SignLanguage includes around 20,000 phrases from which users can learn ASL.

Phrases


Lightning Fast Fuzzy Search

Users can search across the 20,000+ videos in the ASL phrases dictionary using a fuzzy search feature implemented with MongoDB Atlas Search.

Fuzzy Search

Fuzzy Search play video

The video below shows how quickly MongoDB can search through over 20,000 video documents.
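Under the hood, typo tolerance in Atlas Search comes from the `fuzzy` option of the `text` operator. Below is a minimal sketch of such a pipeline stage, assuming the `searchVideos` index used in this project; the exact options (`maxEdits`, `prefixLength`) are illustrative choices, not necessarily the ones deployed.

```javascript
// Build an Atlas Search pipeline stage with typo tolerance.
// `fuzzy.maxEdits: 2` allows up to two character edits per term,
// so a query like "hlelo" can still match "hello".
function buildFuzzySearchStage(term) {
  return {
    $search: {
      index: "searchVideos",
      text: {
        query: term,
        path: { wildcard: "*" },
        fuzzy: { maxEdits: 2, prefixLength: 1 },
      },
    },
  };
}

// This object would be the first entry of the aggregation pipeline,
// e.g. collection.aggregate([buildFuzzySearchStage("hello")]).
```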

Game Of ASL

Users can play games, built with the MediaPipe and TensorFlow machine learning libraries, that help them learn ASL and validate their learning.

Learn Numbers

Learn the numbers 0-10 in sequence with the help of artificial intelligence.

Random Numbers

Show your hands on screen, and the AI will try to predict the number from 0-10 based on your hand signs.

Link to Source Code

Permissive License

MIT License

Background

Developing software applications or products that solve real-world problems has always intrigued me. When I started learning ML and explored its possibilities, I wanted to make a project that would be part of something bigger than itself, i.e. helping the community and reaching people in an easily accessible manner. That was my motivation for developing SignLanguage.

YouTube is a great source of knowledge, but its algorithm doesn't promote short videos or videos that fail to draw engagement. SignLanguage solves this problem by collecting these resources and making them easily accessible through our web app. The web app has around 20,000 curated video phrases and features like fuzzy search, which help users learn American Sign Language (ASL) with ease.

SignLanguage also has a number of games that users can play to practice their ASL fundamentals while having fun.

Tech Stack & Libraries Used

The app is built using Eleventy (11ty) for the frontend, MongoDB Realm as the backend, and MediaPipe and TensorFlow.js for machine learning.

  • Eleventy (11ty)
  • MongoDB
  • MongoDB Realm Functions
  • MongoDB Realm HTTPS Endpoints
  • MongoDB Atlas Search
  • TensorFlow
  • MediaPipe

How I built it

Collecting 20,000+ ASL words was a difficult task, so I wanted a backend that was simple and quick to set up. I chose MongoDB Realm, since I was already using MongoDB to store my data.

Creating 20,000 Documents

The first time I saw my data, I wondered if I could even store this data into MongoDB and how difficult it would be. However, MongoDB Compass makes importing JSON data extremely easy.

MongoDB Compass

MongoDB Atlas Search

Creating a Search Index for my data collection was a breeze, and I was impressed with how fast and precise the results were.

atlas search

MongoDB Realm HTTPS Endpoints

After adding the data and creating the search index, I wanted to create an API that the frontend could consume. MongoDB makes building APIs easy through its HTTPS Endpoints and Realm Functions services.

I built a search endpoint called /searchVideo, backed by a Realm function, that returns results from Atlas Search.

```javascript
exports = function(request, response) {
  let collection = context.services
    .get("mongodb-atlas")
    .db("SignLanguage")
    .collection("videos");
  const { searchVid } = request.query;

  let pipeline = [
    {
      $search: {
        index: "searchVideos",
        text: {
          query: searchVid,
          path: {
            wildcard: "*",
          },
        },
      },
    },
  ];

  return collection.aggregate(pipeline);
};
```
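From the frontend, this endpoint is just an HTTPS URL with a `searchVid` query parameter. A small sketch of calling it; the base URL below is a placeholder, since the real Realm endpoint URL is deployment-specific, and `buildSearchUrl` is a helper name of my own.

```javascript
// Placeholder for the deployment-specific Realm HTTPS endpoint base URL.
const BASE_URL = "https://example-realm-endpoint.example.com";

// Build the /searchVideo request URL with the search term as `searchVid`.
function buildSearchUrl(base, term) {
  const url = new URL("/searchVideo", base);
  url.searchParams.set("searchVid", term);
  return url.toString();
}

// Usage in the browser (uncomment in a real app):
// fetch(buildSearchUrl(BASE_URL, "hello"))
//   .then((res) => res.json())
//   .then((videos) => console.log(videos));
```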

I also built two simple API endpoints, videoApi and alphabetSong, which return paginated data from the database.

```javascript
exports = function(request, response) {
  const { start } = request.query;

  let data = context.services
    .get("mongodb-atlas")
    .db("SignLanguage")
    .collection("videos")
    .find()
    .skip(parseInt(start))
    .limit(12)
    .toArray();

  return data;
};
```
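The paging contract here is a `start` offset with a fixed page size of 12, which the frontend can mirror when building page links. A tiny sketch; `startForPage` is a helper name of my own, not part of the project's code.

```javascript
const PAGE_SIZE = 12; // matches the .limit(12) in the Realm function

// Convert a 1-based page number into the `start` offset the endpoint expects.
function startForPage(page) {
  return (page - 1) * PAGE_SIZE;
}

// e.g. page 3 skips the first 24 documents, so the request would be
// .../videoApi?start=24
```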

realm function

While creating the ASL game, one of the issues I experienced was gathering a dataset for the deep learning model. I had to collect the dataset myself, because the datasets available on Kaggle either did not yield accurate predictions or were of poor quality.

The machine learning workflow is shown below.


A brief overview of how the classification model is created:

  • First, key points on the hands are predicted using the OpenCV Python package and Google's MediaPipe Hands model.
  • Next, a custom TensorFlow model is created that classifies the keypoints.
  • Finally, this model is converted to a TensorFlow.js model, which can run in the browser to predict hand signs.
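Before the keypoints reach the classifier, they are typically normalized so the model is invariant to where the hand appears in the frame. The sketch below shows one common preprocessing step (wrist-relative coordinates scaled to [-1, 1]); the exact preprocessing in this project may differ, and `normalizeLandmarks` is a name of my own.

```javascript
// MediaPipe Hands returns 21 landmarks per hand; landmark 0 is the wrist.
// Translate all points relative to the wrist, then scale by the largest
// absolute offset so every coordinate lands in [-1, 1].
function normalizeLandmarks(landmarks) {
  const [wrist] = landmarks;
  const relative = landmarks.map(([x, y]) => [x - wrist[0], y - wrist[1]]);
  const maxAbs = Math.max(...relative.flat().map(Math.abs)) || 1;
  return relative.map(([x, y]) => [x / maxAbs, y / maxAbs]);
}

// The flattened result would then be fed to the TensorFlow.js classifier,
// e.g. model.predict(tf.tensor([normalizeLandmarks(points).flat()]));

const demo = normalizeLandmarks([[100, 200], [110, 220], [90, 180]]);
console.log(demo[0]); // [0, 0] — the wrist always maps to the origin
```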

https://media.giphy.com/media/xUPOqo6E1XvWXwlCyQ/giphy.gif

Thanks!

Hope you liked my project! If you have any feedback, please feel free to comment below.


Original Link: https://dev.to/narottam04/signlanguage-learn-asl-practically-mongodb-atlas-hackathon-2022-submission-2cl7
