PayPal - Privacy Center
Overview
PayPal needed to redesign their Privacy Center before rolling it out to 300M+ users. The goal was to build trust in PayPal.
I led foundational research to identify trust barriers. As a result of my research, trust scores increased by 10%.
View the live PayPal Privacy Center at this link: https://www.paypal.com/us/privacy-center/home
Framing the Research
Research Strategy Realignment
When I inherited the project, the team had already redesigned the Privacy Center without first conducting user research.
Their approach was based on internal assumptions about what users needed, and they planned to validate it by comparing trust score surveys between the old and new versions.
However, this meant measuring success without first understanding whether the redesign actually met user needs. Instead of rushing into measurement, I guided the team to step back and identify what they wanted to learn:
In a Privacy Center, what builds or erodes trust?
Where do users struggle?
How well do they understand the information?
These questions required observing user behavior and gathering rich feedback about our users' experiences – insights that couldn't be captured through simple survey scores. This shift in focus – from measuring trust scores to understanding user experience – led us to adopt a more effective mixed-methods research approach.
Methodology & Stakeholder Buy-in
Why I chose a mixed-methods approach
I proposed a two-phased approach:
Qualitative Interviews – To uncover user behaviors, pain points, and trust drivers.
Quantitative Survey – To validate qualitative findings at scale and measure changes in trust.
I advocated for qualitative interviews first to uncover nuanced insights. Trust is subjective and influenced by subtle design elements like wording and layout, which can't be fully captured by a survey. Interviews allowed us to observe user behaviors and explore the why behind user trust, providing the necessary context for any later survey.
After gathering insights from the interviews, we planned to follow up with a survey to evaluate those insights across a larger sample. This two-phase approach balanced depth (qualitative) with breadth (quantitative), ensuring we understood both the why and the how many behind user behavior.
Stakeholder Buy-in
Some stakeholders were skeptical of using interviews, thinking a survey or A/B test would be more efficient. I explained that while quantitative methods could show if trust increased, they wouldn’t explain why it increased. Interviews provided the critical context behind the trust shifts, such as identifying vague terms or hidden controls that could erode confidence. By emphasizing this distinction, I helped the team see the value of qualitative insights in uncovering the underlying reasons for user behavior.
User Research & Insights
I conducted 12 moderated interviews with PayPal users across privacy sensitivity levels. Each session combined natural exploration, task-based scenarios, and think-aloud feedback.
To keep the team aligned and engaged, I held daily “breakdown sessions” where I shared insights with designers and product managers. These sessions:
Highlighted user pain points, making the research feel tangible.
Built early buy-in, so stakeholders understood key insights before the final analysis.
Created space for discussion about how the findings could influence design decisions.
After the interviews, I led a collaborative affinity mapping session to synthesize and prioritize insights. We clustered feedback into themes, focusing on areas of confusion, trust concerns, and navigation struggles. We then ranked the themes by frequency, severity, and potential for design improvement.
This process ensured that our recommendations were both user-driven and actionable. Key themes were presented to senior leadership with supporting video clips, making the findings more compelling and driving immediate design changes.
Key Findings & Design Impact
Made data protection clearer
Protection of personal and financial information was a primary reason for using PayPal. However, as participants went through the Privacy Center, they didn’t receive clear confirmation that their data was safe.
"I want to know my financial information isn’t being sold or shared with third parties."
Solution: Added explicit statements like “We never share your full financial information” near sensitive data entry points.
Example of highlighting the protection of financial information
Used simpler, more precise language
Vague terms like "data" caused confusion: participants weren't sure whether it included their personal or financial information.
They wanted more specific definitions of what information is collected, how it is used, and who has access to it.
"Does ‘data’ mean my name and email, or does it include my credit card information too?"
Solution: Clearly defined personal vs. financial data upfront, using concise language and real-world examples.
Clear and upfront definitions of common terms
Clear examples of what data is shared when you use PayPal
Improved visibility of privacy controls
Participants felt that privacy centers often conceal information from users, which erodes trust. Having controls presented upfront made them feel that PayPal was being transparent rather than hiding important details.
"When it's clear how I can control my data, it feels like the company is being honest and transparent. Privacy Centers that are hard to understand just make me feel like they’re hiding something."
Solution: Redesigned navigation to make privacy controls more prominent and easier to access.
New section: 'How you can manage your privacy settings'
Live-Test Results
These design changes were implemented in the revamped Privacy Center and later tested via a survey led by a senior UXR. Although I was not directly involved in the survey execution, I worked closely with the senior UXR to ensure that the survey reflected the key insights from the qualitative research. The survey results showed:
A 10% increase in internal trust metrics, such as comfort with data sharing and the feeling that PayPal has users' best interests at heart.
Compared with the old content, the redesigned content was recognized more often as building trust and flagged less often as having a negative impact.
These results showed how the design changes, driven by user research, achieved our goal of improving trust.
Takeaways
Starting with the right questions – rather than jumping to metrics – led to deeper insights about user trust. By taking time to understand user behaviors and pain points through interviews, we uncovered specific design improvements that wouldn't have been visible through survey data alone.
Daily stakeholder involvement amplifies research impact. By holding regular breakdown sessions and involving the team in synthesis, insights were quickly translated into design decisions. This collaborative approach helped bridge the gap between research findings and implementation.
While we achieved our goal of increasing trust metrics, which was valuable to senior leadership, future research should tie trust to business outcomes. Measuring how increased trust affects customer support calls, product adoption, or user retention would better demonstrate the ROI of privacy-focused design improvements. This data could strengthen the case for continued investment in trust and privacy initiatives.