
Theoretical Answer Evaluation System

TAES evaluates an answer against a given answer key using a Deep Averaging Network and regex string matching. The idea is to vectorize the text and compare the resulting vectors with cosine similarity to check answers.

Theoretical Answer Evaluation marks textual answers available in digital form against a provided answer key. Using cosine similarity, the system can identify whether each point in the answer key matches a point in the submitted answer, regardless of the order in which the points appear. It also performs preliminary plagiarism and grammar checks. The study examines how a human checker with no domain knowledge of a subject can still mark a paper in that subject, and mimics this process using machine learning.

The system enables automated checking of subjective answers without human assistance. It finds similar text passages with high accuracy, awards marks based on the number of key points found, and deducts marks for grammatical errors. This makes it especially valuable in the time of a pandemic, as it allows subjective-type tests with automated checking. On a dataset of 1379 sentences, the system's scores agreed with human-given scores approximately 8 out of 10 times, and it is faster than any human checker, needing only about a second for a passage of roughly 200 words.
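The matching step described above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual implementation: the `vectorize` function below is a toy bag-of-words stand-in for the Deep Averaging Network embeddings, and the `score_answer` helper and its 0.5 threshold are assumptions chosen for the example.

```python
import math
import re
from collections import Counter


def vectorize(text):
    # Toy stand-in for a Deep Averaging Network: a bag-of-words count
    # vector. A real DAN would average learned word embeddings and pass
    # the result through feed-forward layers to produce a dense vector.
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens)


def cosine_similarity(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def score_answer(key_points, answer_sentences, threshold=0.5):
    # Award one mark per answer-key point that is similar enough to
    # any sentence in the answer, in any order (hypothetical scoring
    # scheme for illustration; threshold is an assumed value).
    marks = 0
    for point in key_points:
        point_vec = vectorize(point)
        if any(cosine_similarity(point_vec, vectorize(s)) >= threshold
               for s in answer_sentences):
            marks += 1
    return marks
```

For example, `score_answer(["plants make oxygen"], ["During the day plants make oxygen", "They also need water"])` awards one mark, because the first answer sentence is close enough to the key point even though extra sentences are present.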

TAES Demo
Click here to go to the project capsule.
