I'm Nevyn Duarte, a Data Scientist and Machine Learning Engineer with a passion for building intelligent systems that solve complex, real-world problems. My work spans quantitative finance, large-scale data engineering, and production AI systems.
Professional Background
Currently, I serve as an AI Engineer at Bridges AI Consulting, where I design and build end-to-end AI platforms for large-scale content understanding and generation. My focus is on creating scalable ingestion and transformation layers for unstructured data streams, applying machine learning techniques to extract actionable signals, and implementing production-oriented workflows that emphasize reliability and operational efficiency.
Prior to Bridges AI, I worked at M Science (Jefferies) as a Quantitative Equity Research Associate, where I developed predictive equity models using PySpark and SQL on Databricks. I analyzed millions of transaction and job-posting data points, produced data-driven research reports that accelerated report cycles from quarterly to monthly, and automated data extraction workflows that improved team efficiency by 20%.
My experience also includes roles at Citco Fund Services, where I supported risk and valuation models for hedge fund portfolios exceeding $10B in assets under management, and at AMD, where I designed adaptive machine learning algorithms for yield analysis across 100k+ wafer samples.
Education
I'm currently completing my Master of Science in Data Science at the University of Colorado Boulder, with a focus on deep learning, parallel computing, and advanced statistical modeling. My coursework includes probabilistic modeling, unsupervised machine learning, natural language processing, and generative AI.
I hold a Bachelor of Science in Mathematics from the University of Texas at Austin, where I was recognized with the UT CNS Award for Excellence in Computer Science for my research contributions to the Building-Wide Intelligence (BWI) robotics project.
Research & Interests
My research interests center on the intersection of machine learning theory and practical applications. I'm particularly drawn to problems in predictive modeling for financial markets, autonomous systems, and natural language understanding. I actively study academic papers and textbooks to inform how I design models, evaluate risk, and build production-ready systems.
At UT Austin, I contributed to robotics research on trajectory-based person-following systems, using DeepSORT tracking and Google's triplet loss function and programming robot navigation in Python and C++ with ROS.
Technical Expertise
My technical toolkit includes Python, PyTorch, TensorFlow, PySpark, and SQL for data science and machine learning. I have extensive experience with cloud platforms (AWS, Databricks), containerization (Docker), and data visualization tools (Tableau, Power BI). For software engineering, I work with Go, C++, and JavaScript, and have built production systems using REST APIs, serverless architectures, and distributed computing frameworks.
Let's Connect
I'm always interested in discussing challenging problems in ML engineering, quantitative finance, and AI systems. Whether you're working on something interesting or just want to connect, feel free to reach out.