Below is a list of academic projects I have undertaken. These projects span multiple domains, including Economics, Statistics, and Machine Learning/AI.
This work-in-progress project is an economic analysis of the SEC's 2023 cybersecurity filing regulation changes. Together with Dr. Taylor Wright, I am researching changes in the distribution of filing timings before and after the regulation change. This research will uncover whether companies' strategic timing of cybersecurity disclosures is being mitigated by SEC regulation.
This project was initially presented at the Faculty of Math and Science Undergraduate Research Symposium at Brock University on July 31, 2025, and is being presented with new results at the Faculty of Social Science Research Symposium, where it will be the only undergraduate project among otherwise Master's and PhD candidate presentations. There are additional plans to present the full thesis in the Summer of 2026.
This project is a comprehensive Python package for building, analyzing, and learning Bayesian networks: probabilistic graphical models that represent relationships between variables through directed acyclic graphs. The package provides a robust framework for structure learning, supporting over 40 different algorithms, including constraint-based methods (PC, FCI, SGS), score-based approaches (Hill Climbing, GES, FGES), and hybrid techniques (MMHC, M3HC). Beyond structure learning, the package includes capabilities for causal inference through intervention and counterfactual engines, multiple scoring functions for model evaluation, and utilities for data transformation, visualization, and integration with graph and probabilistic modeling libraries such as NetworkX and pgmpy.
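As a rough illustration of the score-based workflow this kind of package supports, here is a minimal sketch using pgmpy (one of the libraries it integrates with) rather than the package's own API. The synthetic dataset, the variable names, and the choice of hill climbing are assumptions made for the example, and it presumes a pgmpy release that exposes `HillClimbSearch`.

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch

# Toy synthetic dataset with one known dependency (rain -> wet_grass); names are placeholders
rng = np.random.default_rng(0)
rain = rng.integers(0, 2, 2000)
wet_grass = np.where(rng.random(2000) < 0.9, rain, 1 - rain)   # wet_grass follows rain 90% of the time
data = pd.DataFrame({"rain": rain, "wet_grass": wet_grass, "noise": rng.integers(0, 2, 2000)})

# Score-based structure learning: greedy hill climbing over candidate DAGs
est = HillClimbSearch(data)
learned_dag = est.estimate()          # uses the library's default scoring function
print(list(learned_dag.edges()))      # expected to recover an edge between rain and wet_grass
```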
My contributions to this project were extensive, spanning algorithm implementation, code infrastructure improvements, and project maintenance. I implemented nine new structure learning algorithms, including A* (a globally optimal heuristic search algorithm), OBS (order-based search with aggressive pruning), ASOBS (a variant starting from empty graphs), MINOBS and IINOBS (memory-efficient variants using crossover and local search), OptOrd (a global search using decomposable score functions), SC (the Sparse Candidate algorithm, optimized for sparse networks), and PCB (an enhanced PC algorithm with a backwards phase for better edge orientations). Additionally, I developed the Akaike Information Criterion (AIC) scoring function to support these algorithms and fixed critical bugs in seven existing algorithms (FCI, Hill Climbing, PC-Stable, SGS, Structural EM, and others) that were non-functional due to import issues and infrastructure problems. This work required a deep understanding of graph theory, probability theory, search algorithms, and algorithm complexity analysis.
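To show the idea behind an AIC scoring function for discrete Bayesian networks, here is a simplified, self-contained sketch of a decomposable AIC local score. The function name, the penalty convention, and the pandas-based counting are illustrative choices, not the project's actual implementation.

```python
import numpy as np
import pandas as pd

def aic_local_score(data: pd.DataFrame, node: str, parents: list[str]) -> float:
    """Illustrative decomposable AIC local score for one discrete node:
    maximum log-likelihood of the node given its parents, minus the number
    of free parameters in its conditional probability table (CPT).
    Conventions vary between implementations (e.g. a factor of 2 or a sign flip)."""
    r = data[node].nunique()                                   # number of states of the node
    q = int(np.prod([data[p].nunique() for p in parents])) if parents else 1

    log_lik = 0.0
    if parents:
        # Sum the multinomial log-likelihood within each observed parent configuration
        for _, values in data.groupby(parents)[node]:
            counts = values.value_counts().to_numpy()
            log_lik += float(np.sum(counts * np.log(counts / counts.sum())))
    else:
        counts = data[node].value_counts().to_numpy()
        log_lik = float(np.sum(counts * np.log(counts / counts.sum())))

    num_params = (r - 1) * q                                   # free CPT parameters
    return log_lik - num_params                                # AIC-style score: LL - |params|
```

Because the total network score is a sum of these per-node terms, a search algorithm only has to recompute the local scores of nodes whose parent sets change, which is what makes moves like single-edge additions cheap to evaluate during hill climbing or order-based search.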
Beyond algorithm development, I significantly improved the project's infrastructure and maintainability. I created a comprehensive CONTRIBUTING.md guide with setup instructions, testing requirements, and code style guidelines; established proper project standards with pyproject.toml for build system configuration; and developed a systematic algorithm benchmarking framework with test dataset generators covering various network structures. I also standardized and updated test files, created a unified test runner for comparing all algorithms, fixed dependency management in requirements.txt, and enhanced the README with better documentation. In total, I added 30+ new files, improved 20+ existing files, and contributed over 3,300 lines of code, transforming the project from a partially broken codebase with 16 algorithms (7 of them non-functional) into a production-ready package with 24 functioning algorithms, prepared for PyPI distribution. This work demonstrated my ability not only to implement complex algorithms from the research literature but also to maintain and improve large-scale software projects with attention to software engineering best practices.
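For a flavour of what a benchmarking setup of this kind can look like, here is a generic sketch of a test dataset generator with a known ground-truth structure and a simple structural-Hamming-style comparison. The function names, the chain structure, and the simplified metric are hypothetical and not taken from the project's benchmark code.

```python
import numpy as np
import pandas as pd

def generate_chain_dataset(n_rows: int = 2000, seed: int = 0) -> tuple[pd.DataFrame, set]:
    """Hypothetical dataset generator: samples binary data from a known
    chain A -> B -> C so a learned structure can be compared to ground truth."""
    rng = np.random.default_rng(seed)
    a = rng.integers(0, 2, n_rows)
    b = np.where(rng.random(n_rows) < 0.9, a, 1 - a)          # B copies A 90% of the time
    c = np.where(rng.random(n_rows) < 0.9, b, 1 - b)          # C copies B 90% of the time
    return pd.DataFrame({"A": a, "B": b, "C": c}), {("A", "B"), ("B", "C")}

def shd_undirected(learned_edges: set, true_edges: set) -> int:
    """Simplified structural-Hamming-style distance: counts edges that are
    missing from or extra in the learned skeleton, ignoring orientation."""
    skeleton = lambda edges: {frozenset(e) for e in edges}
    return len(skeleton(learned_edges) ^ skeleton(true_edges))

data, true_edges = generate_chain_dataset()
# A benchmark harness would run each structure learning algorithm on `data`
# and report shd_undirected(learned_edges, true_edges) alongside runtime.
```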