Hello World!
I'm Vaishnavi
Master of Computer Science, University of Massachusetts Amherst
I am an engineering professional with a strong background in C++ and Python, seeking full-time opportunities starting January 2025. I hold a Master's degree in Computer Science from the University of Massachusetts Amherst. My interests lie in scalable distributed systems, deep learning, and ML infrastructure, and I have pursued projects and coursework in these areas. Previously, I worked as a Software Engineer, developing and testing LTE PHY layer protocols before transitioning to the verification of 5G transceivers.
Education

University of Massachusetts Amherst
Feb 2023 - Dec 2024
Master of Computer Science (Minor: Data Science) - GPA: 4.0

National Institute of Technology Karnataka
Aug 2014 - May 2018
Bachelor of Technology in Electrical and Electronics Engineering - CGPA: 9.26/10
Relevant Coursework
- Advanced Natural Language Processing
- Distributed and Operating Systems
- Advanced Algorithms
- Neural Networks
- Machine Learning
Technical Skills
Programming Languages
Databases and Technologies
Machine Learning Frameworks
Blogs
Work Experience

Software Engineer
Aug 2019 - May 2022
- Designed and implemented C/C++-based functional test frameworks for evaluating multiple embedded subsystems, ensuring high system reliability and precise feature validation.
- Developed scalable Python-based orchestration to manage regression test workflows, reducing manual validation time by 40%.
- Migrated signal-processing logic from MATLAB to modular Python packages and integrated it into a versioned local cloud test system, reducing datapath errors by 60%.

Software Engineer
Sep 2018 - Apr 2019
- Enhanced LTE PHY layer protocols in CAT-M1 and NB-IoT devices by developing Python and C++ feature tests.
- Identified corner cases to boost performance by up to 90% in key tests such as throughput and SINR for both FDD and TDD systems.
- Assisted developers by adding implementations for 5G NR downlink channels in C++, reducing feature implementation time by 20 days.

Software Engineering Intern
May 2017 - July 2017
- Established a continuous monitoring system in Python for critical solar panel health parameters, using a Raspberry Pi, Arduino, and Zigbee as the central components.
- Integrated Firebase cloud uploads for real-time updates, displaying the information on the RPi's LCD screen at boot-up and improving overall health monitoring by 60%.
Projects
Python, AWS, gRPC
- Built a distributed, fault-tolerant LLM serving system with gRPC and FastAPI-based control APIs, supporting autoscaling, model caching, dynamic model loading, and Redis-backed persistence.
- Deployed the system on AWS with Docker containers and Prometheus-based monitoring, achieving a 70% reduction in operational overhead.
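The model-caching and dynamic-loading idea behind this serving system can be sketched with a small LRU cache; the `ModelCache` class and its loader are illustrative assumptions, not the project's actual API:

```python
from collections import OrderedDict

class ModelCache:
    """Toy LRU cache sketching dynamic model loading and eviction
    (class and method names are illustrative, not the real system's)."""

    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader          # callable: model_name -> model object
        self._cache = OrderedDict()   # model_name -> loaded model

    def get(self, name):
        if name in self._cache:
            self._cache.move_to_end(name)      # mark as most recently used
            return self._cache[name]
        model = self.loader(name)              # lazy, on-demand load
        self._cache[name] = model
        if len(self._cache) > self.capacity:   # evict least recently used
            self._cache.popitem(last=False)
        return model
```

In the real system the loader would pull weights from object storage and the cache state would be mirrored to Redis; here it is just a callable.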
Flask, Docker, Python
- Developed a microservices-based scalable stock trading application with dedicated Catalog, Frontend, and Order services.
- Implemented a thread-per-request concurrency model with locking, enabling the system to efficiently handle 10 simultaneous clients.
- Designed a cache consistency mechanism on the frontend service using a server push model, improving average latency per lookup request by 5ms.
- Implemented replicated, fault tolerant order servers by performing leader elections and synchronizing recovered replicas.
- Created Docker images for the services and deployed the application on an AWS EC2 instance.
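The thread-per-request model with locking can be sketched as follows: each simulated client runs on its own thread, and a lock serializes updates to shared order state. The `OrderService` class and its fields are illustrative, not the project's actual code:

```python
import threading

class OrderService:
    """Toy order service: a lock protects the shared order log so
    concurrent requests cannot produce duplicate order IDs."""

    def __init__(self):
        self._lock = threading.Lock()
        self.next_order_id = 0
        self.orders = []

    def place_order(self, symbol, qty):
        with self._lock:                  # critical section
            order_id = self.next_order_id
            self.next_order_id += 1
            self.orders.append((order_id, symbol, qty))
            return order_id

def simulate(service, n_clients=10, orders_per_client=100):
    """One thread per client, mirroring the thread-per-request model."""
    def client():
        for _ in range(orders_per_client):
            service.place_order("AAPL", 1)
    threads = [threading.Thread(target=client) for _ in range(n_clients)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

Without the lock, two threads could read the same `next_order_id` before either increments it, yielding duplicate IDs.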
Python, GCP
- Implemented a GPTQ-based 4-bit quantization pipeline with on-the-fly dequantization, reducing model size with minimal degradation in performance.
- Added custom quantized linear layers and packed state compression, cutting the memory footprint by 50% for a 70 GB model.
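The packed 4-bit representation with on-the-fly dequantization can be illustrated with a toy pack/unpack pair, assuming signed int4 values stored two per byte; this is a simplification for illustration, not the exact layout the pipeline uses:

```python
def pack_int4(values):
    """Pack signed 4-bit integers (-8..7) two per byte: low nibble
    holds the even-index value, high nibble the odd-index value."""
    assert len(values) % 2 == 0
    out = bytearray()
    for lo, hi in zip(values[0::2], values[1::2]):
        out.append((lo & 0xF) | ((hi & 0xF) << 4))
    return bytes(out)

def unpack_int4(packed):
    """On-the-fly dequantization side: recover the signed int4 values
    (a real kernel would also apply per-group scales and zero points)."""
    def sign_extend(x):
        return x - 16 if x >= 8 else x
    vals = []
    for b in packed:
        vals.append(sign_extend(b & 0xF))
        vals.append(sign_extend(b >> 4))
    return vals
```

Two weights per byte is what yields the roughly 4x size reduction over fp16 before scales are accounted for.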
PyTorch, LLM
- Manually curated a dataset of 400 quotes for quote interpretation by sampling the top 10 most frequent labels from a pool of 30k quotes drawn from the Quotes-500k dataset.
- Fine-tuned and evaluated BERT, RoBERTa, GPT-2, and T5 for multi-label classification on quotes, obtaining an accuracy of 70%.
- Performed zero-shot, one-shot, and few-shot prompting for quote interpretation on T5 (3B) and Gemma (2B) models.
Python, LLM
- Implemented key features of a compact version of Llama2 in accordance with its official documentation.
- Performed sentence completion task given a prefix using temperature sampling for temperatures 0 and 1.
- Fine-tuned the model for sentence classification on the SST and CFIMDB datasets using standard fine-tuning and LoRA fine-tuning.
- Observed a 3% increase in performance on the CFIMDB dataset using LoRA fine-tuning.
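The LoRA update behind the fine-tuning above computes y = xW + (α/r)·xAB, where the base weight W stays frozen and only the low-rank factors A (d×r) and B (r×k) are trained. A pure-Python sketch with list-based matrices (helpers are illustrative, not the project's code):

```python
def matmul(A, B):
    """Tiny dense matmul on lists of lists (illustrative only)."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def lora_forward(x, W, A, B, alpha=1.0):
    """y = x W + (alpha/r) * x A B, the LoRA low-rank update.
    W is frozen; only A and B would receive gradients in training."""
    r = len(A[0])                       # rank of the update
    scale = alpha / r
    base = matmul(x, W)                 # frozen pretrained path
    delta = matmul(matmul(x, A), B)     # trainable low-rank path
    return [[b + scale * d for b, d in zip(rb, rd)]
            for rb, rd in zip(base, delta)]
```

With A initialized to zeros (as is standard), the adapted layer starts out exactly equal to the frozen layer, which is why LoRA training is stable from step one.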
Python
- Developed from-scratch implementations of KNN, Random Forest, and Neural Networks with hyperparameter tuning and cross-validation applied to the Digits, Titanic, Loan, and Parkinson’s datasets.
- Achieved high classification accuracy, with the best-performing models reaching over 90% accuracy on their respective datasets.
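The core of a from-scratch KNN classifier is a distance sort plus a majority vote over the k nearest neighbors; a minimal sketch (function name is illustrative):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    top_labels = [y for _, y in dists[:k]]     # k closest labels
    return Counter(top_labels).most_common(1)[0][0]
```

Cross-validation over k (and the distance metric) is the main hyperparameter tuning step for this model.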
Neural Network Compression
- Explored knowledge transfer techniques such as logit matching and feature matching for image classification on the CIFAR-100 and ImageNet datasets. Experimented with custom architectures to assess trade-offs across evaluation metrics such as accuracy, compression ratio, inference time, and memory usage.
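Logit matching typically minimizes the KL divergence between temperature-softened teacher and student distributions, scaled by T². A minimal sketch of that soft-target loss (a standard formulation, not necessarily the exact loss used in these experiments):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax: larger T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Logit-matching term: KL(teacher_T || student_T) * T^2.
    The T^2 factor keeps gradient magnitudes comparable across T."""
    p = softmax(teacher_logits, T)   # softened teacher targets
    q = softmax(student_logits, T)   # softened student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * T * T
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient.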