Testing & Iteration
Your MVP is live—now what? The real work begins: watching how users behave, collecting feedback, and rapidly iterating based on what you learn.
Metrics Check
What metric measures how many visitors become customers? (Answer: conversion rate, covered in the list below.)
Key Metrics to Track
- 👀 Traffic: How many people visit your page or product?
- 🎯 Conversion: The % who take the action you want (sign up, buy)
- 📊 Engagement: Are users actually using your product?
- 🔄 Retention: Do users come back?
- 😊 NPS: Would they recommend you? (0-10 scale; see the sketch after this list)
- 💬 Feedback: What are users saying?
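To make the numbers concrete, here's a minimal sketch of computing two of these metrics from raw data. The counts and survey scores are invented for the example; the NPS formula itself (% promoters minus % detractors) is the standard one.

```python
# A rough sketch of turning raw counts into two of the metrics above.
# All counts and survey scores here are made-up examples.

def conversion_rate(visitors: int, conversions: int) -> float:
    """Percentage of visitors who took the desired action."""
    return 100 * conversions / visitors if visitors else 0.0

def nps(scores: list[int]) -> float:
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

print(conversion_rate(visitors=250, conversions=30))  # 12.0
print(nps([10, 9, 8, 7, 3, 10]))                      # 33.33...
```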
The "One Metric That Matters"
Don't track everything—focus on one key metric that indicates if your core hypothesis is working. For an early MVP, this is usually:
Are users completing the core action that delivers value?
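As a toy illustration, the sketch below boils a hypothetical event log down to just that one number. The user names and event labels are made up.

```python
# A toy event log reduced to the single number that tests the hypothesis.
# User names and event labels are hypothetical.

events = [
    ("alice", "signed_up"), ("alice", "matched"), ("alice", "meeting_completed"),
    ("bob",   "signed_up"), ("bob",   "matched"),
    ("carol", "signed_up"),
]

core_action = "meeting_completed"
core_users = {user for user, event in events if event == core_action}
print(f"One metric that matters: {len(core_users)} users completed a meeting")
```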
Collecting Feedback
Quantitative data (numbers) tells you WHAT is happening. Qualitative feedback (conversations) tells you WHY.
- User interviews: 15-30 min calls with early users
- Surveys: Quick feedback forms (keep them short!)
- Session recordings: Watch how users actually use your product
- Support messages: Every complaint is a learning opportunity
Interpret the Data
Scenario: your mentor-matching MVP made 30 matches last month, but only 8 of them turned into actual meetings. In interviews, students said they got matched but didn't know what to do next. What should you iterate on?
🌟 Fix the Leak First!
The data shows a clear drop-off: 30 matches → 8 meetings. The feedback explains why: unclear next steps. Add a clear email with "Here's your mentor, here's how to book a call" before worrying about more sign-ups.
👍 Maybe, But...
"Find better mentors" sounds plausible, but the problem might not be mentor quality: students said they didn't know what to do next. Fix the process before blaming people.
💡 Wrong Focus
More sign-ups won't help if 70% drop off before the core value (meeting). Fix the funnel leak before pouring more water in.
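The same reasoning can be mechanized: the sketch below scans a funnel for its leakiest step. The sign-up count (50) is an assumed figure; the 30 matches → 8 meetings drop comes from the scenario above.

```python
# A minimal funnel scan that flags the worst step. The sign-up count (50)
# is an assumption; 30 matches -> 8 meetings comes from the scenario above.

funnel = [("signed_up", 50), ("matched", 30), ("meeting_completed", 8)]

worst_step, worst_rate = None, 1.0
for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
    rate = count / prev_count  # share of users who survive this step
    print(f"{prev_name} -> {name}: {rate:.0%} continue")
    if rate < worst_rate:
        worst_step, worst_rate = f"{prev_name} -> {name}", rate

print(f"Fix this leak first: {worst_step} ({1 - worst_rate:.0%} drop-off)")
```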
Prediction Game: Conversion Rate
A new mentor matching landing page went live. Traffic came from posting in student Facebook groups (highly targeted). The page offers free mentor matching.
What % of visitors signed up?
The answer: 12%. Targeted traffic (student groups) + a free offer + a relevant product = 12% conversion, while cold traffic from ads typically converts at 2-5%. Context matters enormously when predicting metrics!
The Iteration Cycle
Weekly Iteration Loop
- Monday: Review last week's metrics and feedback (a review sketch follows this list)
- Tuesday-Thursday: Build and test one improvement
- Friday: Launch the improvement
- Weekend: Collect new data
- Repeat!
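Here's a minimal sketch of the Monday step, comparing the key metric week over week. The function and the numbers are illustrative, not part of any real tooling.

```python
# A sketch of the Monday review step: compare the key metric week over
# week and name the trend. The numbers are illustrative.

def weekly_review(last_week: int, this_week: int,
                  metric: str = "meetings completed") -> str:
    trend = ("improving" if this_week > last_week
             else "flat" if this_week == last_week
             else "declining")
    print(f"{metric}: {last_week} -> {this_week} ({trend})")
    return trend

weekly_review(4, 7)  # meetings completed: 4 -> 7 (improving)
```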
Pivot vs Persevere
- If your core hypothesis is wrong, you should pivot.
- If metrics are improving with each iteration, you should persevere.
- Small changes based on feedback are called iterations.
Define Your Success Metrics
Planning exercise: define how you'll measure whether your MVP is working. For example:
Example Metrics Plan
- One Metric: # of mentor meetings completed
- Success: 20 meetings in the first month = validated. Fewer than 5 = major problem.
- Secondary: 1) Sign-up rate 2) Match rate 3) User satisfaction (1-10)
- Feedback: Short survey after the first meeting + 3 user calls per week
- Check: Review metrics every Sunday night
- Decision: After 4 weeks, decide: if <5 meetings despite fixes → pivot; if 10+ → persevere and scale. (This rule is sketched in code below.)
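The four-week decision rule can be written down explicitly. This is a sketch using the example plan's own thresholds (<5 pivot, 10+ persevere); the function name is hypothetical.

```python
# A sketch encoding the decision rule above. The thresholds come from the
# example plan; the function itself is illustrative, not a real API.

def month_end_decision(meetings_completed: int) -> str:
    if meetings_completed < 5:
        return "pivot: the core hypothesis looks wrong"
    if meetings_completed >= 10:
        return "persevere: validated, keep iterating and scale"
    return "keep iterating: promising, but not yet validated"

print(month_end_decision(8))  # keep iterating: promising, but not yet validated
```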
🎯 Key Takeaways
- Focus on "one metric that matters" for your core hypothesis
- Quantitative data shows WHAT, qualitative shows WHY
- Fix funnel leaks before adding more traffic
- Iterate weekly: review → build → launch → learn
- Know when to pivot (hypothesis wrong) vs persevere (improving)