September 2024 marks my third year as a Product Designer at Meta. In those three years I have collaborated with incredible minds, developed hard and soft skills, and made a positive impact on my team and company.
Impact:
Insights & Metrics: Meta is a metric-driven company. I've learned to gather insights and data, weigh them against both our design and business goals, and make calculated decisions to improve our experiences.
Design Leadership: At Meta I have mentored junior designers, project-managed teams, and led design sprints and immersion sessions in London, Thailand, and Brazil.
Org Contribution: Improving our systems is one of my main focuses on any team I join.
I've created internal tools to improve our abysmal product testing experiences.
Driven visual roadmaps that have given partner teams a less abstract view of our team's plans for the year.
Conducted brainstorming sessions to make room for all XFN contributions to our six-month roadmap.
Introduced a focus week with 90 XFN participants that led to the creation of project one-pagers and 20+ audits of our experiences.
Design & Development: Shipped products and experiments that have unblocked $72 million in yearly revenue.
Education: I've continued to learn through Meta's training portal, improving my skills in design leadership, design strategy, and designing with AI.
Collaboration:
At Meta we collaborate cross-functionally with Content Designers, Project Managers, Software Engineers, Data Scientists, and Data Engineers.
The following case studies may be abridged to protect Meta intellectual property.
Integrity Entry Points - Case Study 1
Problem: Our advertisers spend most of their time on surfaces like Ads Manager, LWI, and Meta Business Suite. When they are enforced upon and their accounts are banned, they are directed to an entirely new experience that can feel foreign and sometimes overwhelming.
Solution: In 2022 we built Integrity Entry Points to meet our users where they are and guide them through a resolution process, so they can get back to advertising on our platforms with minimal disruption.
My Role
My role as the Product Designer started long before the project kickoff. A senior software engineer discerned there was a better way for our company to approach the remediation process when an advertiser's account was restricted. Frustratingly, he was never able to gain buy-in from leadership or our cross-functional partners to start this effort. After several one-on-one meetings and brainstorming sessions, the engineer and I developed a formal strategy and approach to get the project funded.
Brainstorm
Together we shaped an ambiguous concept into a clear vision, created visuals to support it, and outlined potential features and engineering lift. We built a deck and presented our new strategy to our organization during our monthly meeting to gain buy-in from both leadership and our XFN. We secured a small investment for an MVP, which included the support of a content designer, a junior software engineer, and a data scientist. With this help we set in motion a project that had been neglected for years!
Our new strategy would allow advertisers to resolve their issues on the platform they were used to instead of navigating to a new surface. We strongly believed this would reduce churn and allow advertisers to continue on the platform with minimal disruption!
Our Goal
For this project we wanted to improve the remediation process for restricted advertisers. To measure success and continue our work, we needed to align with our organization's top-line metrics. Our main goal was to reduce the time it takes for an advertiser to resolve their issue. Our secondary goal was to increase the number of appeals from our advertisers. We also paid attention to drop-off rate and click-through rate.
Keys to Success
With our goal in mind, there were a few things that I determined were necessary to ultimately lead to a successful project:
Organization: During kickoff I created a project document to maintain organization. Our project doc housed all relevant links, listed all stakeholders, and maintained notes from our weekly standup. I also created a RACI chart to establish accountability.
Project Management: Because our project manager was on medical leave, I stepped in to lead team meetings, set deadlines, and provide space for weekly updates.
Prioritization: Our insights showed that Ads Manager was the platform advertisers used most frequently, so we prioritized design efforts there. We would approach the LWI and Meta Business Suite teams in the following half.
Communication: While building our experience, I shared our progress across multiple organizations, specifically with the teams behind the different platforms it could potentially live on.
Design Process
Quality Reviews
Due to the pace of our business, it was common for teams at Meta to launch projects and experiments that didn’t meet the ideal level of product quality. In 2022 a senior designer introduced our design team to Quality Reviews. Quality Reviews were a way for us to conduct a heuristic evaluation of our projects based on industry wide standards.
I brought that practice to the Integrity Entry Points project before launching our experiment. I sourced a multidisciplinary team of XFN and walked them through the experience we designed; the team then scored our experiment and determined whether there were any launch-blocking errors. I summarized and shared the results. Once all launch-blocking problems were resolved, we launched our experiment.
The image above is merely a visual representation of Quality Reviews, which were more expansive and conducted through Google Sheets.
Results
Our efforts raised the number of resolved cases across Meta by 0.2% and unblocked $24 million in yearly revenue. Our daily incremental appeal overturns were up 1.5%, which meant roughly 630 wrongfully enforced advertisers a day getting back to the platform and continuing their business.
The experiment was a success for our team and was adopted by several other teams over the next two halves. I acted as a liaison, onboarding new teams to our process and providing design support and consultation.
ACE Internal Testing Tool - Case Study 2
Context: The After Conversion Experiences (ACE) team supported advertisers and sellers by surfacing their policy compliance, page health, and enforcement scores. Because ACE operated in a compliance-heavy environment, the dashboards changed frequently. The team had a strong engineering culture and relied on internal tooling, but as the product team grew to include designers, researchers, and content strategists, it became clear that the existing workflows did not support design needs.
The absence of a reliable way to test the advertiser experience created friction across the organization and directly affected design quality, empathy, and onboarding.
Problem: In 2023 the product team (designers, content designers, researchers) could not reliably test end-to-end advertiser journeys. They depended on engineers to manipulate test accounts or manufacture artificial states that lacked real user feedback, which meant scenarios were inaccurate and incomplete. Cross-functional partners were frequently asked for screenshots of live pages, often ten or more at a time, which disrupted their work and slowed decision making. New hires could not see the advertiser journey as it really unfolded, which made onboarding inefficient. The lack of a realistic testing method caused frustration, slowed iteration, and contributed to attrition on the design team.
Solution: We created an interactive internal dashboard as an expansion of the Account Quality Viewer. The tool allowed product teams to generate realistic enforcement scenarios across all stages, including before enforcement, during enforcement, after enforcement, and the review request process. It gave ACE and other teams the ability to experience the platform exactly as advertisers and sellers did. The tool became a reliable way to audit flows, train new hires, and validate design decisions without relying on engineering support.
My Role
I served as the product designer and driver of the project. After a senior content designer raised the issue, I partnered with her to frame the problem and develop vision work. I pitched the concept to ACE leadership and secured investment. With the support of two engineers and a product marketing manager, I led the design and delivery of the tool. My focus was on defining the experience, ensuring it reflected real advertiser end to end journeys, and aligning the team on outcomes that would scale beyond ACE.
Constraints
The team was small and resources were limited, so we had to design a solution that was simple, flexible, and fast to deliver. Frequent regulatory changes meant the dashboard had to adapt quickly to new enforcement scenarios. Balancing accuracy with speed was critical.
Results
The tool became a trusted resource within ACE and across other product teams. Designers and researchers could self-serve testing scenarios, reducing reliance on engineering. This increased productivity across the design team, allowing us to deliver vision work much faster. Leadership credited the project with improving empathy, onboarding, and overall product quality. By addressing a systemic gap, the tool helped stabilize the design team, reduced frustration, and strengthened collaboration between design and engineering.