As a top-priority (P0) project for Shopify's AppUI team, I tackled slow-loading embedded apps and the lack of performance data. The project aimed to establish a baseline, measure improvements, and identify optimization areas within the app loading pipeline.
Situation: We needed to monitor and improve the load speed of embedded applications.
Task: Develop a robust reporting system for embedded app load performance.
Action: We faced numerous challenges with data collection due to misaligned technical terminology among stakeholders. We re-evaluated the app loading process, identified data gaps, and collaborated with engineers to re-implement data collection, ensuring alignment on goals and terminology.
Sample report (the actual report is confidential).
Result: Through meticulous iteration and data cleaning, I built a reporting system that tracked load speed by app, day, and user, with aggregates across dimensions such as location, API version, device and browser type, plan type, and session load count. The results revealed that upgrading to newer versions could improve load speeds tenfold, with further gains possible through background asset loading.
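The dimensional roll-ups behind that reporting can be sketched in a few lines. This is a minimal illustration with toy load times and hypothetical field names; the real data, schema, and volumes are confidential:

```python
from collections import defaultdict
from statistics import median

# Toy load events with hypothetical field names and values.
events = [
    {"app": "a1", "api_version": "2021-10", "load_ms": 3200},
    {"app": "a2", "api_version": "2021-10", "load_ms": 2800},
    {"app": "a1", "api_version": "2022-01", "load_ms": 400},
    {"app": "a2", "api_version": "2022-01", "load_ms": 310},
]

# Roll up load speed along one dimension; the real system aggregated by
# app, day, user, location, API version, device, browser, and plan type.
buckets = defaultdict(list)
for e in events:
    buckets[e["api_version"]].append(e["load_ms"])

medians = {version: median(ms) for version, ms in buckets.items()}
print(medians)
```

In this toy data the newer API version loads roughly ten times faster, mirroring the shape of the real finding.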
During my time on the Shopify data team, I led a crucial project focused on enhancing the effectiveness of developer documentation. This involved optimizing the search engine that powered the documentation, ensuring engineers could easily test and implement improvements.
Situation: The developer documentation's search engine was a key area for improvement. Our objective was to accurately measure its performance and provide the engineering team with a robust tool to test and monitor improvements.
Task: My primary responsibility was to define and implement metrics that would effectively evaluate search engine performance, along with an automated A/B testing system that allowed engineers to deploy and monitor their enhancements efficiently.
Action: We collected and modeled diverse data from across Shopify, including click data, user logins, page views, user profiles, shop profiles, and API calls. From this, we developed a comprehensive set of metrics, including Click-Through Rate (CTR), Mean Reciprocal Rank (MRR), Time on Page, Bounce Rate, and Pogo Stick Rate, capturing both whether users clicked a result and how highly that result ranked. I then created detailed reporting and an automated tool that enabled the engineering team to deploy and monitor A/B tests for continuous improvement.
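The two headline metrics are straightforward to compute from a click log. A minimal sketch with toy data and illustrative field names (the real pipeline joined many more sources):

```python
from statistics import mean

# Toy search log: for each search, the ranks of clicked results in click
# order (empty list = abandoned search). Field names are illustrative.
searches = [
    {"clicked_ranks": [1]},
    {"clicked_ranks": [3]},
    {"clicked_ranks": []},
    {"clicked_ranks": [2, 5]},
]

# Click-Through Rate: share of searches with at least one click.
ctr = sum(1 for s in searches if s["clicked_ranks"]) / len(searches)

# Mean Reciprocal Rank: average of 1/rank of the first clicked result
# (0 for abandoned searches), rewarding clicks near the top of the page.
mrr = mean(
    1 / s["clicked_ranks"][0] if s["clicked_ranks"] else 0.0
    for s in searches
)

print(ctr, mrr)  # 0.75 and (1 + 1/3 + 0 + 1/2) / 4
```

CTR alone treats a click at rank 10 the same as a click at rank 1, which is why MRR complements it as a rank-aware signal.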
Result: Through these efforts, we achieved a 2% improvement in our North Star metric, the click-through rate. For Shopify, this translated into a significantly better developer experience, a critical component of their strategic goals. For an e-commerce business, this type of improvement can lead to thousands of dollars in immediate revenue.
At Shopify, I played a significant role in a major project to migrate our databases. This extensive undertaking transformed our approach to data storage, code functionality, task scheduling, and reporting.
Situation: Shopify undertook a major project to modernize its data infrastructure, entailing a complete technological overhaul. This transformation included migrating from Presto to Google Cloud Storage, transitioning from Kimball/Ross dimensional modeling to Domain-Driven Design, adopting DBT over PySpark, replacing our proprietary scheduling system with Airflow, and switching from Mode to Looker Studio.
Task: My core responsibility as a member of the data team was to inventory and prioritize data models and reporting utilized by my product teams. This involved either redesigning and building new models to align with the latest architecture or eliminating obsolete ones.
A database design sketched in Figma, a common approach when designing data models.
Action: I led the comprehensive migration of 12 models over six months, meticulously redesigning each in collaboration with product teams and in-house data engineers. I utilized DBT for model construction and completely reconstructed reporting within Looker Studio.
Result: During an intensive migration, I became an expert in DBT and Looker Studio and proficient in Google Cloud Storage and Domain-Driven Design for data modeling. The successful migration of critical models ensured business continuity and unlocked advanced analytical capabilities. My skills in data migration, modeling, and reporting are directly transferable and highly beneficial for any business aiming to establish robust and scalable data foundations.
This project focused on optimizing bidding increments for online auctions to significantly increase revenue and merchant take-homes. By analyzing current bidding patterns, the goal was to identify and implement changes that would yield a substantial financial return with minimal investment.
Situation: At MaxSold, there was an opportunity to enhance auction revenue by refining the stepwise bidding function. The hypothesis was that a small increase in bidding increments earlier in the auction process could lead to a significant rise in average lot value, translating into millions of dollars in additional revenue annually. This initiative was identified as having the largest potential impact on the company's bottom line and for its merchants, with the smallest associated investment.
Task: The primary task was to optimize the bidding increment scheme to increase the average lot value by at least $1, which was projected to generate approximately $1.2 million per year in additional revenue. This required designing and executing an A/B test to compare the performance of the existing bidding structure against a new, optimized one.
Action: I independently designed and implemented an A/B test to evaluate the new bidding increment scheme. The test ran for one month, carefully managing challenges such as the high variability of auctions, which demanded large sample sizes, and outliers such as unusually high-value or promoted auctions. I also worked around the inability to fully automate updates, given data erasure risks and the lack of an engineering team. Validation metrics included the number of lots per auction, while average lot value and average gross merchandise value (GMV) served as the key test metrics.
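The sample-size pressure from high-variance auctions can be illustrated with a standard normal-approximation power calculation. The standard deviation below is an assumed, illustrative figure, not MaxSold's actual variance:

```python
from math import ceil
from statistics import NormalDist

# Rough per-arm sample size for detecting a $1 lift in average lot value.
# sd is assumed for illustration; real auction variance and outliers
# drove the actual sample-size requirements.
alpha, power = 0.05, 0.80
mde = 1.0   # minimum detectable effect: $1 lift in average lot value
sd = 40.0   # assumed standard deviation of lot value, in dollars

z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
z_beta = NormalDist().inv_cdf(power)
n_per_arm = ceil(2 * (sd / mde) ** 2 * (z_alpha + z_beta) ** 2)
print(n_per_arm)  # tens of thousands of lots per arm
```

Because the required sample grows with the square of the standard deviation, a handful of outlier lots can dramatically lengthen a test, which is why trimming or flagging promoted and unusually high-value auctions mattered.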
Result: The original hypothesis proved correct, leading to a successful implementation of the new bidding increment scheme. This change resulted in an estimated increase of $5 in average lot value, translating to over $1 million per year in pure profit for MaxSold and approximately $3 million in total revenue. This optimization also directly benefited merchants by increasing their take-home earnings from auctions. The project successfully navigated technical and logistical challenges, demonstrating a significant positive impact on the company's financial performance.