Hi, I'm Mohamad Zamini.

Self-driven and passionate about advancing machine learning, I am focused on pushing the boundaries of Large Language Models (LLMs) through innovative research and development. As a final-year PhD Candidate with hands-on experience in optimizing LLMs during my recent internship, I am committed to solving complex real-world problems with cutting-edge AI technology. My ambition is to contribute to the future of AI by developing scalable and efficient models that can transform industries and enhance human-computer interaction.

About

I am a PhD candidate in Computer Science with a focus on optimizing Multimodal Large Language Models (MLLMs) to enhance their reasoning capabilities. My work involves accelerating LLMs through advanced techniques such as pruning, ensuring performance is maintained or improved. I have hands-on experience with foundational models, having previously interned at Numenta Inc., and I am currently developing innovative approaches like Mixture of Depths (MoD), Mixture of Experts (MoE) for resamplers, and attention pruning to push the boundaries of MLLM efficiency and scalability.

  • Programming Languages: Python, C++
  • Databases: MySQL, MongoDB, PostgreSQL
  • Libraries & Frameworks: PyTorch, TensorFlow, Hugging Face Transformers, Keras, NumPy, Pandas, OpenCV
  • Model Optimization & Deployment: ONNX, TensorRT, TorchServe, FastAPI
  • Tools & Platforms: Git, Docker, Kubernetes, AWS, GCP, Azure, JIRA, Weights & Biases (wandb)

Seeking a challenging position that leverages my expertise in Machine Learning and Software Engineering, offering opportunities for professional development, innovative experiences, and personal growth.

Experience

Data Science Intern
  • Built LLM-driven analytics agent enabling natural-language querying and multi-turn reasoning over large-scale telemetry logs.
  • Automated weekly monitoring of retention and engagement metrics using SHAP, ANOVA, and anomaly-aware delta detection.
  • Designed scalable data pipelines supporting session-level behavioral modeling.
  • Tools: Python, PyTorch
May 2025 - Aug 2025 | Redmond, WA
Machine Learning Engineer Intern
  • Fine-tuned Mistral and LLaMA models with activation sparsity and attention sparsity for efficient inference.
  • Developed dynamic context pruning, KWTA mechanisms, and KV caching optimizations.
  • Tools: Python, PyTorch, Accelerate; Models: GPT, LLaMA
July 2024 - Sept 2024 | Redwood City, CA
Digital Innovation Intern
  • Designed semantic compression system using deep autoencoders for high-dimensional data.
  • Built ML models for geothermal data analysis and improved accuracy through algorithmic optimization.
June 2022 - Aug 2022 | Atlanta, GA
NLP Engineer
  • Fine-tuned BART for Persian text summarization.
  • Implemented matrix factorization for topic modeling.
  • Implemented a BiLSTM-CRF for sequence tagging.
  • Tools: Python, Scikit-learn, NLTK
June 2018 - Aug 2019 | Tehran, Iran

Projects

Explainability analysis

SHAP vs. LIME vs. ELI5

Accomplishments
  • Tools: Python, PyTorch
  • Uses model interpretability tools such as SHAP (SHapley Additive exPlanations), LIME (Local Interpretable Model-agnostic Explanations), and ELI5 to explain the model's predictions. These tools provide insight into how the model makes decisions and highlight the importance of individual features in predicting strokes.
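As a minimal sketch of the model-agnostic idea behind these tools, the snippet below computes permutation importance: shuffle one feature at a time and measure how much the predictions move. The model and feature values here are illustrative stand-ins, not the project's actual stroke classifier.

```python
import random

# Hypothetical toy model: scores risk from (age, bmi, glucose),
# with age weighted most heavily -- a stand-in for a real classifier.
def model(age, bmi, glucose):
    return 0.7 * age + 0.1 * bmi + 0.2 * glucose

def permutation_importance(model, rows, n_repeats=10, seed=0):
    """Shuffle one column at a time; report the mean absolute
    shift in predictions caused by breaking that feature's link."""
    rng = random.Random(seed)
    base = [model(*r) for r in rows]
    importances = []
    for j in range(len(rows[0])):
        shifts = []
        for _ in range(n_repeats):
            col = [r[j] for r in rows]
            rng.shuffle(col)
            permuted = [model(*(r[:j] + (v,) + r[j + 1:]))
                        for r, v in zip(rows, col)]
            shifts.append(sum(abs(p - b) for p, b in zip(permuted, base))
                          / len(rows))
        importances.append(sum(shifts) / n_repeats)
    return importances

rows = [(70, 30, 150), (45, 22, 90), (60, 28, 130), (30, 20, 85)]
imp = permutation_importance(model, rows)  # [age, bmi, glucose]
```

With these weights, age dominates and BMI barely matters, which is exactly the kind of feature ranking SHAP, LIME, and ELI5 surface with more principled attributions.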
Causal Inference

Causal Inference with Bayesian Networks

Accomplishments
  • Tools: Python, PyTorch
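A toy illustration of the core idea, using a made-up three-node network where Z confounds both X and Y: conditioning on X (observational) re-weights the confounder, while the do-operator (interventional, via backdoor adjustment over Z) does not. All probabilities are illustrative, not from the project.

```python
# Tiny Bayesian network: Z -> X, Z -> Y, X -> Y (all binary).
P_Z = {0: 0.5, 1: 0.5}                     # P(Z=z)
P_X_given_Z = {0: 0.2, 1: 0.8}             # P(X=1 | Z=z)
P_Y_given_XZ = {(1, 1): 0.9, (1, 0): 0.7,  # P(Y=1 | X=x, Z=z)
                (0, 1): 0.4, (0, 0): 0.2}

def p_y_given_x(x):
    """Observational P(Y=1 | X=x): Z is re-weighted by seeing X."""
    num = sum(P_Z[z] * (P_X_given_Z[z] if x else 1 - P_X_given_Z[z])
              * P_Y_given_XZ[(x, z)] for z in (0, 1))
    den = sum(P_Z[z] * (P_X_given_Z[z] if x else 1 - P_X_given_Z[z])
              for z in (0, 1))
    return num / den

def p_y_do_x(x):
    """Interventional P(Y=1 | do(X=x)): backdoor adjustment over Z."""
    return sum(P_Z[z] * P_Y_given_XZ[(x, z)] for z in (0, 1))

obs_effect = p_y_given_x(1) - p_y_given_x(0)  # inflated by confounding
causal_effect = p_y_do_x(1) - p_y_do_x(0)     # the true effect of X
```

The observational association comes out larger than the causal effect, which is the confounding bias that backdoor adjustment removes.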
Bi-directional Autoregressive Transformer from Scratch

A simple bi-directional autoregressive Transformer built from scratch.

Accomplishments
  • Tools: Python, PyTorch
  • Implemented the tokenizer and the model from scratch, then tuned them as needed.
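The tokenizer half of such a from-scratch project can be sketched as a minimal character-level tokenizer; the class name and corpus here are illustrative, not the project's actual code.

```python
class CharTokenizer:
    """Character-level tokenizer: maps each character seen in the
    training corpus to an integer id and back."""

    def __init__(self, corpus):
        chars = sorted(set(corpus))
        self.stoi = {ch: i for i, ch in enumerate(chars)}  # string -> id
        self.itos = {i: ch for ch, i in self.stoi.items()}  # id -> string

    def encode(self, text):
        return [self.stoi[ch] for ch in text]

    def decode(self, ids):
        return "".join(self.itos[i] for i in ids)

tok = CharTokenizer("hello world")
ids = tok.encode("hello")  # list of integer ids, one per character
```

Round-tripping `decode(encode(text))` recovers the input exactly, which is the invariant the model's training pipeline relies on.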
Image Captioning

ViT + GPT-2 image captioning

Accomplishments
  • Incorporated Convolutional Neural Networks (CNNs) to extract image features and a Long Short-Term Memory (LSTM) network to extract question embeddings.
Attention-based Graph Neural Network

Multi-Label Text Classification using Attention-based Graph Neural Network.

Accomplishments
  • Built a multi-label text classifier with an attention-based graph neural network.
GPT-2 for writing Python code

Explores how to fine-tune GPT-2 to build a Python question-answering model in the style of ChatGPT.

Accomplishments
  • Developed a simple chatbot.

Skills

Languages and Databases

Python
HTML5
CSS3
MySQL
PostgreSQL
Shell Scripting

Libraries

NumPy
Pandas
OpenCV
scikit-learn
matplotlib
NLTK

Frameworks

Django
Flask
Bootstrap
Keras
TensorFlow
PyTorch

Other

Git
AWS
Docker

Education

University of Wyoming

Laramie, WY

Degree: PhD in Computer Science
Area of Study: Causal Reasoning for Improving Generalization in Visual Question Answering

    Relevant Coursework:

    • Intro to Artificial Intelligence
    • Machine Learning
    • High Performance Computing & Paradigms
    • Advanced Image Processing
    • Neural and Fuzzy Systems

Tarbiat Modares University

Tehran, Iran

Degree: Master of Information Technology
CGPA: 3.68/4.0

    Relevant Coursework:

    • Artificial Neural Networks
    • Neural and Fuzzy Systems

Contact