I’m a Computer Science student at Fisk University and a Software Engineer with internship experience at Microsoft and Amazon. I’m passionate about building scalable systems, machine learning, and making technology accessible.
Currently working on AI/ML projects, including RAG systems and neural networks built from scratch. Previously built distributed systems handling trillions of documents at Microsoft and notification infrastructure serving 1.5M+ employees at Amazon.
Email / LinkedIn / GitHub / (626) 879-3494
Experience
Microsoft - Software Engineer Intern
May 2025 - July 2025 | Redmond, WA
Reduced SharePoint Online document recovery latency from 14 days to 1 day. Built failover mechanisms for distributed systems handling document deduplication. Resolved live-site issues, eliminating 60,000+ errors per deployment unit.
Microsoft - Software Engineer Intern
May 2024 - July 2024 | Redmond, WA
Built background maintenance jobs managing the lifecycle of 10+ trillion documents. Optimized SQL operations, removing 1M+ stale records. Fixed production localization issues affecting 190M+ OneDrive and SharePoint users.
Amazon - Software Development Engineer Intern
May 2023 - Aug 2023 | Seattle, WA
Designed a scalable push notification system serving 1.5M+ employees with 99.9% availability. Reduced cloud infrastructure costs by 45% and improved UUID retrieval latency by 51% through cross-functional collaboration.
Fisk University Library - Junior Developer
Nov 2023 - Present | Nashville, TN
Containerized the Omeka-S CMS deployment, reducing onboarding time by 90%. Automated AWS infrastructure provisioning with Terraform. Managing 150,000+ historic Rosenwald Project records.
Projects
Omeka-S RAG Agent (Sept 2024)
Built a Retrieval-Augmented Generation (RAG) chatbot for Omeka-S documentation using LangChain, Llama 3, and ChromaDB. Implemented hallucination checking and integrated Tavily search for supplementary context.
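A minimal sketch of the retrieve-then-verify loop behind this kind of RAG agent, using ChromaDB's default embedding model. The `generate()` helper is a hypothetical stand-in for the Llama 3 call (the real project routes this through LangChain), and the yes/no grounding check is a simplified version of the hallucination-checking step:

```python
import chromadb


def generate(prompt: str) -> str:
    """Hypothetical placeholder for a Llama 3 completion call (e.g. via LangChain)."""
    raise NotImplementedError


client = chromadb.Client()
docs = client.create_collection(name="omeka_docs")

# Index documentation chunks; ChromaDB embeds them with its default model.
docs.add(
    ids=["install-1", "modules-1"],
    documents=[
        "Omeka S is installed by unzipping the release and configuring database.ini.",
        "Modules are installed by placing them in the modules/ directory.",
    ],
)


def answer(question: str, k: int = 2) -> str:
    # Retrieve the k chunks most similar to the question to ground the answer.
    hits = docs.query(query_texts=[question], n_results=k)
    context = "\n".join(hits["documents"][0])

    draft = generate(f"Answer using only this context:\n{context}\n\nQ: {question}")

    # Simplified hallucination check: ask the model whether the draft is supported.
    verdict = generate(
        f"Context:\n{context}\n\nAnswer:\n{draft}\n\nIs the answer supported by the context? yes/no"
    )
    return draft if verdict.strip().lower().startswith("yes") else "The docs don't seem to cover that."
```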
pd-detect.ai (Aug 2024)
Full-stack diagnostic platform for Parkinson's disease detection, achieving 92% accuracy. Features real-time voice analysis using PyTorch neural networks and personalized medical recommendations via a Llama 3-powered chatbot.
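For context, here is a hedged sketch of the kind of PyTorch model that sits behind the voice analysis: a small feed-forward classifier over pre-extracted acoustic features (e.g. jitter, shimmer, MFCC statistics). The feature count, layer sizes, and single training step are illustrative assumptions, not the platform's actual architecture:

```python
import torch
import torch.nn as nn

class VoiceClassifier(nn.Module):
    def __init__(self, n_features: int = 22):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64),
            nn.ReLU(),
            nn.Dropout(0.2),
            nn.Linear(64, 1),  # single logit: probability of a positive screen
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = VoiceClassifier()
features = torch.randn(8, 22)                 # batch of 8 hypothetical voice-feature vectors
labels = torch.randint(0, 2, (8, 1)).float()  # 0 = control, 1 = positive screen
loss = nn.BCEWithLogitsLoss()(model(features), labels)
loss.backward()                               # one illustrative training step
```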
NanoGPT Implementation (Sept 2024)
From-scratch implementation of a 10.8M-parameter GPT model in PyTorch. Trained on a Shakespeare dataset, demonstrating a deep understanding of transformer architecture and attention mechanisms.
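The heart of that implementation is causal self-attention. Below is a minimal single-head version in PyTorch; the dimensions (`n_embd=64`, `block_size=128`) are illustrative placeholders, not the 10.8M-parameter model's actual configuration:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    def __init__(self, n_embd: int = 64, head_size: int = 16, block_size: int = 128):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask so each position attends only to earlier tokens.
        self.register_buffer("mask", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        # Scaled dot-product attention scores, masked to be causal.
        att = (q @ k.transpose(-2, -1)) * k.shape[-1] ** -0.5
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return att @ v  # weighted aggregation of value vectors

x = torch.randn(4, 32, 64)            # (batch, tokens, embedding dim)
out = CausalSelfAttentionHead()(x)    # -> shape (4, 32, 16)
```

A full GPT block stacks several such heads, adds a feed-forward layer, and wraps both in residual connections and layer norm.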
What I’m Working On
- Implementing transformer models from scratch to deepen understanding of modern AI architectures
- Building RAG systems for enterprise documentation to improve developer productivity
- Contributing to open-source projects in the AI/ML space
- Exploring applications of ML in healthcare diagnostics
Recent Thoughts
The intersection of distributed systems and machine learning presents fascinating challenges. At Microsoft, I learned how to build systems that scale to trillions of documents. Now I’m exploring how to make these systems intelligent through AI integration.
Building GPT from scratch taught me that understanding fundamentals is crucial - you can’t effectively use tools you don’t understand. The same principle that applied to managing SharePoint’s massive infrastructure applies to neural networks: start with first principles, then scale.
Last updated: September 2025