An LLM-powered agentic testing framework that identifies semantic differences between functionally similar code snippets through adversarial test generation and execution. Built on LangGraph, the tool implements a stateful directed graph in which two agents work adversarially: one generates targeted test cases based on semantic analysis, while the other evaluates coverage gaps and challenges assumptions. The agents iterate until they reach high confidence that all behavioral differences have been identified. Unlike traditional diff tools, which compare syntax, this approach detects behavioral differences by executing both code versions in isolated subprocesses, producing confidence-scored semantic diff reports along with reusable test cases.
Read my article about building an adversarial AI testing framework.
- Python
- LangGraph
- Gemini
- LLMs
- Agentic AI
- Testing
- Code Analysis
- State Management
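The core execution-and-diff step can be sketched in plain Python. This is a minimal illustration under my own assumptions, not the framework's actual code: the LangGraph state graph and the two agents are omitted, and `run_snippet`/`behavioral_diff` are hypothetical helper names.

```python
import subprocess
import sys

def run_snippet(source: str, test_input: str) -> str:
    """Execute a code snippet in an isolated subprocess (hypothetical helper)."""
    result = subprocess.run(
        [sys.executable, "-c", source],
        input=test_input,
        capture_output=True,
        text=True,
        timeout=5,
    )
    return result.stdout.strip()

def behavioral_diff(snippet_a: str, snippet_b: str, test_cases: list[str]) -> list[dict]:
    """Run both versions on each test case and record diverging outputs."""
    diffs = []
    for case in test_cases:
        out_a = run_snippet(snippet_a, case)
        out_b = run_snippet(snippet_b, case)
        if out_a != out_b:
            diffs.append({"input": case, "a": out_a, "b": out_b})
    return diffs

# Two "functionally similar" snippets that disagree on negative input:
# floor division rounds toward -inf, int() truncation rounds toward zero.
A = "n = int(input()); print(n // 2)"
B = "n = int(input()); print(int(n / 2))"
print(behavioral_diff(A, B, ["4", "-3"]))  # diverges only on "-3"
```

In the full framework, the generator agent would propose the `test_cases` list and the evaluator agent would score how much of the input space a divergence report covers.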
This project is a production-ready sentiment analysis API built with FastAPI, powered by a custom-trained DistilBERT model for fast and efficient text classification. Trained on IMDB movie reviews, the model provides high-quality sentiment predictions. The API offers real-time analysis with a focus on both accuracy and performance. Docker containerization ensures easy deployment, and Prometheus monitoring tracks key metrics like request count, response time, and error rates. The CI/CD pipeline, managed with GitHub Actions, automates testing and code validation. This project demonstrates MLOps best practices for training, deploying, and maintaining machine learning models in production.
Read my article about it on Towards Data Science.
- Python
- FastAPI
- Prometheus
- HuggingFace
- LLMs
- MLOps
- AI
- CI/CD
- NLP
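The request-count, latency, and error-rate tracking can be sketched in plain Python. This is a hedged stand-in for illustration only: the real service would use `prometheus_client` counters and histograms behind FastAPI middleware, and `Metrics`/`predict_sentiment` here are hypothetical simplifications.

```python
import time
from collections import defaultdict

class Metrics:
    """Minimal stand-in for the Prometheus metrics the API tracks
    (request count, response time, error rate)."""
    def __init__(self):
        self.request_count = defaultdict(int)
        self.error_count = 0
        self.latencies = []

    def observe(self, endpoint, handler, *args):
        """Time a handler call, counting requests and errors."""
        start = time.perf_counter()
        try:
            return handler(*args)
        except Exception:
            self.error_count += 1
            raise
        finally:
            self.request_count[endpoint] += 1
            self.latencies.append(time.perf_counter() - start)

def predict_sentiment(text):  # hypothetical stand-in for the DistilBERT model
    return "positive" if "good" in text else "negative"

metrics = Metrics()
print(metrics.observe("/predict", predict_sentiment, "a good movie"))
print(metrics.request_count["/predict"], metrics.error_count)
```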
This project lets users of the popular mood tracking app
analyze trends in their moods and identify the impact of
activities on their mood.
The work requires heavy data manipulation,
time-series analysis, statistical analysis, frequent
pattern mining, and association rule mining.
- Python
- Data Mining
- Time-Series
- Forecasting
- Streamlit
- mlxtend
- scipy
- plotly
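The association-mining idea can be sketched in standard-library Python. This is a simplified, hedged illustration (the project itself uses mlxtend's Apriori implementation); the log data and the restriction to single-item rules are my own assumptions.

```python
from itertools import combinations
from collections import Counter

def association_rules(logs, min_support=0.4, min_confidence=0.7):
    """Mine simple one-to-one (antecedent -> consequent) rules from daily logs.
    Each log is a set of items, e.g. activities plus a mood label."""
    n = len(logs)
    counts = Counter()
    for log in logs:
        for size in (1, 2):
            for itemset in combinations(sorted(log), size):
                counts[itemset] += 1
    rules = []
    for (a, b), pair_count in [(k, v) for k, v in counts.items() if len(k) == 2]:
        support = pair_count / n  # fraction of days containing both items
        if support < min_support:
            continue
        for ante, cons in ((a, b), (b, a)):
            confidence = pair_count / counts[(ante,)]
            if confidence >= min_confidence:
                rules.append((ante, cons, round(support, 2), round(confidence, 2)))
    return rules

# Hypothetical daily logs: activities alongside a mood label
logs = [
    {"exercise", "good_mood"},
    {"exercise", "good_mood"},
    {"exercise", "good_mood"},
    {"late_night", "bad_mood"},
    {"late_night", "good_mood"},
]
print(association_rules(logs))
```

Each rule carries its support (how often the pair co-occurs) and confidence (how often the consequent follows the antecedent), the same quantities mlxtend reports.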
I studied Mathematics and Computer Science at Emory University for my undergraduate degree and am now pursuing a Master's degree in Computer Science at the Georgia Institute of Technology. Below are some selected courses:
As a student at GaTech, I am studying Computer Science with a specialization in machine learning. While at GaTech I have taken several fascinating courses that demand both strong programming skill and solid theoretical foundations.
- Reinforcement Learning – During this course, I implemented several state-of-the-art RL algorithms and applied and fine-tuned them to a variety of environments. These include:
- DDPG Actor-Critic methods for continuous space problems in OpenAI Gym environments,
- Counterfactual Credit Assignment for Multi-Agent RL,
- DQN with YOLO for self-driving vehicles.
- Python
- PyTorch
- Deep Learning
- Neural Networks
- CNN
- Reinforcement Learning
- Supervised Learning
- Experimental Design
- Matplotlib
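The value-update loop underlying the deep methods above (DQN, DDPG) can be illustrated with tabular Q-learning on a toy chain environment. This is a hedged sketch of the general idea, far simpler than the coursework's PyTorch implementations; the environment and hyperparameters are my own choices.

```python
import random

def q_learning(n_states=4, episodes=300, alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning on a toy chain: move left/right, reward 1 at the end."""
    q = [[0.0, 0.0] for _ in range(n_states)]  # actions: 0 = left, 1 = right
    rng = random.Random(0)
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: q[s][x])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # temporal-difference update toward the bootstrapped target
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

q = q_learning()
# The learned greedy policy should move right in every non-terminal state
print([max((0, 1), key=lambda a: q[s][a]) for s in range(3)])
```

DQN replaces the Q-table with a neural network, and DDPG extends the same bootstrapped target to continuous action spaces via an actor-critic pair.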
- Network Science – This course focuses on the study of networks and techniques for analyzing them. I implemented large-scale analyses of social, biological, and technological datasets. Methods include:
- Community detection algorithms (e.g. modularity maximization, clustering) to uncover hidden structures and functional modules,
- Computing and interpreting key network metrics: centrality measures (degree, betweenness, PageRank), clustering coefficients, shortest paths, and resilience,
- Modelling and simulating spreading processes on networks, including epidemic models and information diffusion.
- Python
- NetworkX
- Matplotlib
- GNN
- Data Analysis
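One of the centrality measures above, PageRank, can be sketched by power iteration in standard-library Python. This is a minimal illustration of the idea (coursework used NetworkX's implementation); the graph and the dangling-node handling are my own assumptions.

```python
def pagerank(adj, damping=0.85, iters=100):
    """Power-iteration PageRank on an adjacency dict {node: [out-neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # each node keeps the teleportation share, then receives link shares
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in adj.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for u in outs:
                    new[u] += share
            else:  # dangling node: spread its rank uniformly
                for u in nodes:
                    new[u] += damping * rank[v] / n
        rank = new
    return rank

# A hub ("a") that every other node links to should earn the highest rank
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))
```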
Health Informatics – This course examines the integration of information technology and healthcare, focusing on the use, analysis, and exchange of health data to improve patient outcomes and healthcare delivery. I worked with diverse health data sources and standards, including Electronic Health Records (EHRs), FHIR (Fast Healthcare Interoperability Resources), and common data elements (CDEs), to ensure data interoperability and consistency across systems.
Methods include:
- Preprocessing and harmonizing health data from EHRs and other clinical sources using standardized data elements (such as those defined in USCDI and CDE repositories),
- Implementing and utilizing FHIR for interoperable health data exchange and integration,
- Applying machine learning models for disease prediction, patient risk stratification, and clinical decision support,
- Analyzing and visualizing health data to support research, quality improvement, and patient care.
- Python
- JavaScript
- APIs
- FHIR
- EHRs
- Common Data Elements (CDEs)
- Health Data Standards
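FHIR resources are exchanged as JSON, so the interoperability work above starts with parsing them. A minimal sketch, assuming a FHIR R4 `Patient` resource with illustrative values (the `name` and `birthDate` fields follow the standard; `summarize_patient` is a hypothetical helper):

```python
import json

# A minimal FHIR R4 Patient resource (illustrative values)
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def summarize_patient(resource: dict) -> str:
    """Pull a display name and birth date out of a Patient resource."""
    assert resource["resourceType"] == "Patient"
    name = resource["name"][0]  # FHIR allows multiple names; take the first
    display = " ".join(name.get("given", []) + [name.get("family", "")])
    return f"{display} (born {resource['birthDate']})"

print(summarize_patient(json.loads(patient_json)))
```

Real pipelines would fetch such resources from a FHIR server's REST API and validate them before harmonizing fields into common data elements.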
As a student at Emory University, I received a degree in Mathematics and Computer Science with a focus on building robust systems and data-driven applications. My coursework provided a strong foundation in both theoretical concepts and practical programming skills, preparing me to tackle complex challenges across the computing landscape.
- Systems Programming – In this course, I gained hands-on experience designing and implementing low-level software components that interact closely with hardware and operating systems. Key projects and skills include:
- Developing multi-threaded applications in C for process synchronization and inter-process communication,
- Building custom shell environments and file system utilities,
- Debugging and profiling system-level code for performance and reliability.
- C
- Linux/Unix
- Shell Scripting
- Concurrency
- Systems Programming
- Debugging
- Machine Learning – This course provided a comprehensive introduction to modern machine learning techniques and their applications. I implemented and evaluated a variety of models and algorithms, including:
- Supervised learning methods such as linear regression, logistic regression, and decision trees,
- Unsupervised learning techniques including clustering (K-means, hierarchical),
- Model evaluation using cross-validation, ROC curves, and hyperparameter tuning.
- Python
- Supervised-Learning
- Unsupervised-Learning
- scikit-learn
- Pandas
- Data Analysis
- Machine Learning
- Model Evaluation
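The cross-validation step mentioned above can be sketched in standard-library Python. This is a simplified illustration of how k-fold index splits work (coursework used scikit-learn's `KFold`); the fold sizes mimic its convention of distributing the remainder over the first folds.

```python
def k_fold_indices(n_samples, k):
    """Split sample indices into k contiguous (train, validation) folds."""
    # first (n_samples % k) folds get one extra sample
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n_samples))
        folds.append((train, val))
        start += size
    return folds

for train, val in k_fold_indices(10, 3):
    print(len(train), len(val))
```

Each sample lands in exactly one validation fold, so averaging a model's score over the k folds estimates generalization without touching a held-out test set.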
- Database Systems – This course explored the design, implementation, and optimization of relational database systems. I worked on projects involving:
- Designing normalized relational schemas and writing complex SQL queries,
- Implementing transaction management and concurrency control.
- SQL
- PostgreSQL
- Database Design
- Data Modeling
- Transaction Management
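The schema-design and transaction ideas above can be sketched with Python's built-in `sqlite3` module standing in for PostgreSQL. The tables and data here are illustrative, not from the coursework:

```python
import sqlite3

# In-memory database with a small normalized schema:
# student names live in one table, enrollments reference them by key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE enrollments (
        student_id INTEGER NOT NULL REFERENCES students(id),
        course TEXT NOT NULL,
        PRIMARY KEY (student_id, course)
    );
""")

# The context manager wraps the inserts in a transaction:
# both rows commit together, or neither does on error.
with conn:
    conn.execute("INSERT INTO students VALUES (1, 'Ada')")
    conn.execute("INSERT INTO enrollments VALUES (1, 'Database Systems')")

rows = conn.execute("""
    SELECT s.name, e.course
    FROM students s JOIN enrollments e ON e.student_id = s.id
""").fetchall()
print(rows)
```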