
Campaigns & A/B Testing

Comprehensive guide to creating and managing campaigns with Resync.

Overview

Campaigns are Resync's A/B testing system: they let you test different versions of your content, measure the results, and make data-driven decisions. Resync handles variant assignment, impression tracking, conversion tracking, and statistical analysis automatically.

Key Concepts

Campaign

A time-bound experiment that tests different versions (variants) of your content against each other.

Variant

A specific version of content in a campaign. Each variant is a Content Block created in your Resync dashboard. Campaigns test different Content Blocks against each other to see which performs better.

Impression

When a user is assigned a variant and sees the content. Tracked automatically when getVariant() is called or when a ResyncCampaignView renders the campaign.

Conversion

When a user completes the goal action (e.g., purchase, signup, click). You track conversions by logging specific events.

Statistical Significance

A measure of confidence that the difference between variants is real and not due to chance. Resync calculates this automatically.


Creating a Campaign

In the Dashboard

  1. Navigate to Campaigns

    • Go to your Resync dashboard
    • Click Campaigns in the sidebar
    • Click Create New Campaign
  2. Basic Information

    Campaign Name: "Checkout Flow Test"
    Description: "Testing simplified vs detailed checkout"
  3. Schedule

    Start Date: 2025-12-01 00:00:00
    End Date: 2025-12-31 23:59:59
  4. Goal Event

    Goal Event ID: evt_purchase_completed
    Description: "User completes a purchase"
  5. Variants (Must be Content Blocks)

    First, create your Content Blocks in the dashboard:

    • Content Block 1: "CheckoutFlowOriginal" (3-step checkout)
    • Content Block 2: "CheckoutFlowSimplified" (1-step checkout)

    Then add them as campaign variants:

    Variant A (Control)

    Name: "Original Checkout"
    Content Block: CheckoutFlowOriginal (select from dropdown)
    Traffic Allocation: 50%

    Variant B (Treatment)

    Name: "Simplified Checkout"
    Content Block: CheckoutFlowSimplified (select from dropdown)
    Traffic Allocation: 50%

    Important: Each variant MUST be a Content Block. You cannot use custom code or arbitrary content - only Content Blocks created in your dashboard.

  6. Audience Targeting (Optional)

    Target Audience: "All Users" or select specific audience
  7. Save

    • Click Save as Draft to review later
    • Click Activate to go live (the campaign starts immediately if the start date has already passed, otherwise at the scheduled start date)

Via API

POST /api/v1/apps/:appId/campaigns

{
  "name": "Checkout Flow Test",
  "description": "Testing simplified vs detailed checkout",
  "status": "draft",
  "startDate": "2025-12-01T00:00:00Z",
  "endDate": "2025-12-31T23:59:59Z",
  "goalEventId": "evt_purchase_completed",
  "variants": [
    {
      "name": "Original Checkout",
      "contentViewId": 101,
      "trafficAllocation": 50
    },
    {
      "name": "Simplified Checkout",
      "contentViewId": 102,
      "trafficAllocation": 50
    }
  ],
  "audienceId": null
}
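
If you are scripting campaign creation, the request can be sent with any HTTP client. The sketch below is illustrative only: the base URL, environment variable, and bearer-token auth header are assumptions, so substitute whatever your Resync workspace actually uses.

// Hypothetical helper for the endpoint above; adjust the base URL and auth to your setup.
const API_BASE = 'https://api.example-resync-host.com'; // assumption: your Resync API host
const API_KEY = process.env.RESYNC_API_KEY;             // assumption: key kept in an env var

async function createCampaign(appId: number, payload: unknown) {
  const response = await fetch(`${API_BASE}/api/v1/apps/${appId}/campaigns`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${API_KEY}`, // assumption: bearer-token auth
    },
    body: JSON.stringify(payload),
  });

  if (!response.ok) {
    throw new Error(`Campaign creation failed with status ${response.status}`);
  }
  return response.json();
}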

Campaign Data Structure

Complete Campaign Object

interface Campaign {
  // Basic Information
  id: number;
  appId: number;
  name: string;
  description?: string;

  // Status & Scheduling
  status: 'draft' | 'active' | 'paused' | 'completed';
  startDate: Date;
  endDate: Date;
  timezone?: string;

  // Goal Configuration
  goalEventId: string;
  goalEvent?: Event;

  // Variants
  variants: CampaignVariant[];

  // Targeting
  audienceId?: number;
  audience?: Audience;

  // Analytics
  analytics: CampaignAnalytics;

  // Metadata
  createdAt: Date;
  updatedAt: Date;
  createdBy?: number;
}

interface CampaignVariant {
  id: number;
  campaignId: number;
  name: string;
  contentViewId: number;
  contentView?: ContentView;
  trafficAllocation: number; // 0-100
  isControl: boolean;

  // Statistics
  impressions: number;
  conversions: number;
  conversionRate: number;
}

interface CampaignAnalytics {
  totalImpressions: number;
  totalConversions: number;
  averageConversionRate: number;

  // Statistical Analysis
  statisticalSignificance: boolean;
  pValue?: number;
  confidenceLevel?: number;

  // Winner Information
  winningVariantId?: number;
  winningVariant?: CampaignVariant;

  // Detailed Statistics
  variants: VariantStatistics[];
}

interface VariantStatistics {
  variantId: number;
  variantName: string;
  impressions: number;
  conversions: number;
  conversionRate: number;

  // Comparison to Control
  uplift?: number; // Percentage improvement
  pValue?: number;

  // Confidence Intervals
  lowerBound?: number;
  upperBound?: number;
}
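
As a quick illustration of how these types fit together, the hypothetical helper below summarizes a CampaignAnalytics object. It is only a sketch of one way to consume the data, not part of the SDK.

// Illustrative helper built on the interfaces above (not part of the SDK).
function summarizeCampaign(analytics: CampaignAnalytics): string {
  if (!analytics.statisticalSignificance || !analytics.winningVariant) {
    return `No significant winner yet (${analytics.totalConversions} conversions so far).`;
  }
  const winner = analytics.winningVariant;
  return `${winner.name} leads with a ${winner.conversionRate.toFixed(1)}% conversion rate (p = ${analytics.pValue}).`;
}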

Campaign Status Lifecycle

Status Flow

DRAFT → ACTIVE → COMPLETED
  ↓        ↓
ARCHIVED  PAUSED → ACTIVE

Status Definitions

1. DRAFT

When: Campaign is being created or edited

Behavior:

  • Not visible to users
  • No impressions or conversions tracked
  • Can be edited freely
  • Not scheduled

Actions:

  • Edit campaign details
  • Add/remove variants
  • Activate campaign

2. ACTIVE

When: Campaign is live and running

Behavior:

  • Users are assigned variants
  • Impressions tracked automatically
  • Conversions tracked when goal event occurs
  • Cannot edit variants or goals (only pause/stop)

Auto-Start:

  • Campaign automatically becomes ACTIVE when startDate is reached
  • Triggered by Campaign Scheduler (runs every minute)

Actions:

  • Pause campaign
  • Complete campaign early
  • View analytics

3. PAUSED

When: Campaign is temporarily stopped

Behavior:

  • No new variant assignments
  • No new impressions tracked
  • Existing data preserved
  • Can be resumed

Actions:

  • Resume (back to ACTIVE)
  • Complete campaign
  • View analytics

4. COMPLETED

When: Campaign has ended

Behavior:

  • No new assignments or tracking
  • All data preserved
  • Statistical analysis finalized
  • Winner may be declared

Auto-Complete:

  • Campaign automatically becomes COMPLETED when endDate is reached
  • Triggered by Campaign Scheduler

Winner Declaration:

  • If statistical significance is reached, winner is declared
  • Webhook campaign.winner_declared is triggered

Actions:

  • View final results
  • Archive campaign
  • Duplicate for new test

5. ARCHIVED

When: Campaign is no longer needed

Behavior:

  • Hidden from main campaign list
  • Data preserved
  • Cannot be reactivated
  • Can be unarchived
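
The lifecycle above can be summarized as a small transition table. The sketch below is only an illustration of the documented flow; note that the Campaign interface earlier lists just four status values, so 'archived' is included here purely for completeness, and the server remains the source of truth.

type CampaignStatus = 'draft' | 'active' | 'paused' | 'completed' | 'archived';

// Transitions implied by the lifecycle above (illustrative, not the server's rules).
const ALLOWED_TRANSITIONS: Record<CampaignStatus, CampaignStatus[]> = {
  draft: ['active', 'archived'],
  active: ['paused', 'completed'],
  paused: ['active', 'completed'],
  completed: ['archived'],
  archived: ['completed'], // assumption: unarchiving restores the completed campaign
};

function canTransition(from: CampaignStatus, to: CampaignStatus): boolean {
  return ALLOWED_TRANSITIONS[from].includes(to);
}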

Variant Assignment

How Assignment Works

  1. User Requests Variant

    const variant = await Resync.getVariant('checkout_flow_test');
  2. Assignment Algorithm

    User Identifier (userId or sessionId)
    → Hash Function
    → Consistent Assignment (same user = same variant)
    → Traffic Allocation Applied
    → Variant ID Returned
  3. Assignment Properties

    • Consistent: Same user always gets same variant
    • Random: But appears random across users
    • Respects Traffic: Based on traffic allocation %
    • Immediate: No delay or database lookup
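
Conceptually, the assignment flow above boils down to hashing the user identifier into a bucket and walking the traffic-allocation ranges. The sketch below is a simplified illustration of that idea, not Resync's actual algorithm; the SDK does this for you.

// Simplified illustration of deterministic, traffic-weighted assignment.
// Not Resync's implementation; shown only to explain the behavior.
function assignVariant(
  userKey: string,                      // userId or sessionId
  campaignId: number,
  variants: { id: number; trafficAllocation: number }[],
): number {
  // Cheap deterministic hash of campaign + user (FNV-1a style).
  let hash = 2166136261;
  for (const ch of `${campaignId}:${userKey}`) {
    hash ^= ch.charCodeAt(0);
    hash = Math.imul(hash, 16777619);
  }
  const bucket = (hash >>> 0) % 100; // 0-99; the same user always lands in the same bucket

  // Walk the allocation ranges: 0-49 → A, 50-99 → B for a 50/50 split.
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.trafficAllocation;
    if (bucket < cumulative) return v.id;
  }
  return variants[variants.length - 1].id; // fallback if allocations don't sum to 100
}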

Traffic Allocation

Control how traffic is split between variants.

There are two possible variant assignment types:

Weighted Rollout

Each user is randomly assigned to a variant based on the percentage weights you set. For example, with 50% Control and 50% Variant A, roughly half of users will always see the Control content and the other half will always see Variant A. The same user always sees the same variant.

// Equal Split (50/50)
Variant A: 50%
Variant B: 50%

// Unequal Split (75/25) - More conservative
Variant A (Control): 75%
Variant B (Treatment): 25%

// Multi-variant (33/33/34)
Variant A: 33%
Variant B: 33%
Variant C: 34%

Round Robin

Variants are rotated in sequence for each new user or impression. Each variant gets equal exposure regardless of weight settings. For example, with Control and Variant A, the first user sees Control, the second sees Variant A, the third sees Control again, and so on in a repeating pattern.
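
For comparison, round robin is just a rotating counter over the variant list. A minimal sketch of the idea:

// Illustrative round-robin assignment: each new impression gets the next variant in sequence.
function makeRoundRobinAssigner(variantIds: number[]) {
  let counter = 0;
  return function nextVariant(): number {
    const id = variantIds[counter % variantIds.length];
    counter += 1;
    return id;
  };
}

// Usage: with [controlId, variantAId], impressions alternate control, A, control, A, ...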


Tracking Impressions

Automatic Impression Tracking

Impressions are tracked automatically when a ResyncCampaignView component is rendered:

// Rendering this component automatically tracks an impression
<ResyncCampaignView name='Summer Sales' />

When Impressions Are Counted

Counted:

  • First time user sees a campaign
  • User sees the variant content
  • Campaign is ACTIVE

Not Counted:

  • Campaign is DRAFT, PAUSED, or COMPLETED
  • User already has an impression for this campaign
  • API call fails

Viewing Impressions

In Dashboard:

Campaign → Analytics Tab

Variant A: 5,234 impressions
Variant B: 5,189 impressions
Total: 10,423 impressions

Tracking Conversions

What is a Conversion?

A conversion occurs when a user completes your campaign's goal event. This is how you measure success.

Setting Up Conversions

  1. Define Goal Event (when creating campaign):

    goalEventId: "evt_purchase_completed"
  2. Track Goal Event in Your App:

    // When user completes purchase
    Resync.logEvent({
      eventId: 'evt_purchase_completed',
      metadata: {
        amount: 99.99,
        productId: 'prod_123'
      }
    });
  3. Conversion Automatically Recorded:

    • Resync matches event to campaign goal
    • Checks if user has an impression
    • Records conversion for user's assigned variant
    • Updates campaign statistics

Conversion Requirements

For a conversion to be counted:

Required:

  • Event ID matches campaign's goalEventId
  • User has an impression for this campaign
  • User was assigned a variant
  • Event occurs during campaign active period

Not Counted:

  • Event occurs before impression
  • Event occurs after campaign ends
  • Wrong event ID
  • User not assigned to campaign
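
Putting these requirements together, the attribution check is conceptually similar to the sketch below. The field and type names are illustrative, not Resync's internal schema.

// Conceptual sketch of conversion attribution; names are illustrative only.
interface LoggedEvent { eventId: string; userKey: string; occurredAt: Date; }
interface ImpressionRecord { userKey: string; variantId: number; seenAt: Date; }

function isConversion(
  event: LoggedEvent,
  campaign: { goalEventId: string; startDate: Date; endDate: Date },
  impression: ImpressionRecord | undefined,
): boolean {
  return (
    event.eventId === campaign.goalEventId &&                      // event matches the campaign goal
    impression !== undefined &&                                    // user has an impression (and a variant)
    event.occurredAt.getTime() >= impression.seenAt.getTime() &&   // conversion happens after the impression
    event.occurredAt.getTime() >= campaign.startDate.getTime() &&
    event.occurredAt.getTime() <= campaign.endDate.getTime()       // within the active period
  );
}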

Example: E-commerce Conversion Flow (React Native)

import { ResyncCampaignView } from 'resync-react-native';
import Resync from 'resync-react-native';

export default function HomeScreen() {
  // 1. User lands on the Homepage → impression tracked automatically
  // 2. Variant assigned and its Content Block rendered automatically,
  //    e.g. a promo banner with a CTA to shop

  return (
    <ResyncCampaignView name="Premium Sales Promo" />
  );
}

// Then, somewhere else in your app:

// 3. User completes purchase → track the conversion
export default function CheckoutScreen() {
  const completePurchase = async (orderId, amount) => {
    const success = await processPayment(); // your existing payment logic

    if (success) {
      // This triggers conversion tracking
      await Resync.logEvent({
        eventId: 'evt_purchase_completed',
        logId: orderId,
        metadata: {
          amount: amount,
          currency: 'USD',
          orderId: orderId
        }
      });
      // → Conversion tracked automatically
    }
  };
}

How it works:

  • Create two Content Blocks in the dashboard: "CheckoutOriginal" and "CheckoutSimplified"
  • Add them as variants in your campaign
  • ResyncCampaignView automatically assigns user to a variant and renders that Content Block
  • You just track the conversion event

Multiple Conversion Events

You can log multiple events during a campaign; only the goal event counts as a conversion, but the others give funnel insights:

// Primary Goal
goalEventId: "evt_purchase_completed"

// Also track secondary events for insights:
Resync.logEvent({ eventId: 'evt_add_to_cart' });
Resync.logEvent({ eventId: 'evt_checkout_started' });
Resync.logEvent({ eventId: 'evt_purchase_completed' }); // ← Conversion

// View funnel in analytics:
// Impressions: 10,000
// Add to Cart: 3,000 (30%)
// Checkout Started: 1,500 (15%)
// Purchase Completed: 450 (4.5%) ← Conversion Rate

Viewing Conversions

In Dashboard:

Campaign → Analytics Tab

Variant A:
Impressions: 5,234
Conversions: 157
Conversion Rate: 3.0%

Variant B:
Impressions: 5,189
Conversions: 208
Conversion Rate: 4.0%

Uplift: +33% (statistically significant)

Statistics & Analytics

Key Metrics

1. Impressions

Number of users who saw each variant.

Variant A: 5,234 impressions
Variant B: 5,189 impressions

2. Conversions

Number of users who completed the goal action.

Variant A: 157 conversions
Variant B: 208 conversions

3. Conversion Rate

Percentage of users who converted.

Variant A: 157 / 5,234 = 3.0%
Variant B: 208 / 5,189 = 4.0%

4. Uplift

Improvement compared to control variant.

Uplift = (Treatment Rate - Control Rate) / Control Rate
= (4.0% - 3.0%) / 3.0%
= 33.3% improvement

5. Statistical Significance

Confidence that the difference is real, not due to chance.

P-value: 0.02 (2%)
Confidence Level: 95%
Result: SIGNIFICANT ✅

Interpretation: We are 95% confident that Variant B
performs better than Variant A.

Statistical Analysis

Resync uses Bayesian statistics to determine:

  1. P-value: Probability that difference is due to chance

    • < 0.05: Significant (95% confidence)
    • < 0.01: Highly significant (99% confidence)
  2. Confidence Intervals: Range of likely conversion rates

    Variant A: 2.5% - 3.5% (95% CI)
    Variant B: 3.5% - 4.5% (95% CI)
    No overlap → Significant difference
  3. Sample Size Requirements

    Minimum per variant:
    - 100 conversions (recommended)
    - 1,000 impressions (minimum)
    - 1-2 weeks runtime (minimum)
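
Resync runs this analysis for you, but if you want to sanity-check the numbers, a standard two-proportion z-test gives a close approximation. The sketch below is a generic statistical calculation, not Resync's exact method.

// Generic two-proportion z-test approximation for sanity-checking results.
// Resync's own analysis may differ; this is not its implementation.
function compareVariants(
  control: { impressions: number; conversions: number },
  treatment: { impressions: number; conversions: number },
) {
  const p1 = control.conversions / control.impressions;
  const p2 = treatment.conversions / treatment.impressions;
  const uplift = ((p2 - p1) / p1) * 100;

  // Pooled standard error under the null hypothesis of equal rates.
  const pooled = (control.conversions + treatment.conversions) /
                 (control.impressions + treatment.impressions);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / control.impressions + 1 / treatment.impressions));
  const z = (p2 - p1) / se;

  return { uplift, z, significantAt95: Math.abs(z) > 1.96 };
}

// With the numbers from this guide:
// compareVariants({ impressions: 5234, conversions: 157 }, { impressions: 5189, conversions: 208 })
// → uplift ≈ 33.6%, |z| ≈ 2.8, significant at the 95% level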

Analytics Dashboard

Campaign Overview:

Status: ACTIVE
Running: 14 days
Ends: 16 days

Total Impressions: 10,423
Total Conversions: 365
Average Conversion Rate: 3.5%

Winning Variant: B (+33% uplift)
Statistical Significance: YES ✅
P-value: 0.02

Variant Breakdown:

┌────────────────────────────────────────────────────┐
│ Variant A (Control) - Original Checkout │
├────────────────────────────────────────────────────┤
│ Impressions: 5,234 │
│ Conversions: 157 │
│ Conversion Rate: 3.0% │
│ Confidence: 2.5% - 3.5% │
└────────────────────────────────────────────────────┘

┌────────────────────────────────────────────────────┐
│ Variant B (Treatment) - Simplified Checkout ⭐ │
├────────────────────────────────────────────────────┤
│ Impressions: 5,189 │
│ Conversions: 208 │
│ Conversion Rate: 4.0% │
│ Confidence: 3.5% - 4.5% │
│ Uplift: +33% │
│ P-value: 0.02 │
│ Significance: YES ✅ │
└────────────────────────────────────────────────────┘

Accessing Analytics via API

GET /api/v1/campaigns/123/analytics

{
  "campaignId": 123,
  "status": "active",
  "totalImpressions": 10423,
  "totalConversions": 365,
  "averageConversionRate": 3.5,
  "statisticalSignificance": true,
  "pValue": 0.02,
  "confidenceLevel": 95,
  "winningVariantId": 102,
  "variants": [
    {
      "variantId": 101,
      "variantName": "Original Checkout",
      "impressions": 5234,
      "conversions": 157,
      "conversionRate": 3.0,
      "confidenceInterval": {
        "lower": 2.5,
        "upper": 3.5
      }
    },
    {
      "variantId": 102,
      "variantName": "Simplified Checkout",
      "impressions": 5189,
      "conversions": 208,
      "conversionRate": 4.0,
      "uplift": 33.3,
      "pValue": 0.02,
      "confidenceInterval": {
        "lower": 3.5,
        "upper": 4.5
      }
    }
  ]
}
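
A monitoring script might poll this endpoint and react once a winner is declared. The sketch below is illustrative; pass in whatever base URL and credentials your workspace uses (bearer-token auth is assumed here).

// Illustrative polling helper for the analytics endpoint above.
async function checkForWinner(campaignId: number, apiBase: string, apiKey: string) {
  const res = await fetch(`${apiBase}/api/v1/campaigns/${campaignId}/analytics`, {
    headers: { Authorization: `Bearer ${apiKey}` }, // assumption: bearer-token auth
  });
  const analytics = await res.json();

  if (analytics.statisticalSignificance && analytics.winningVariantId) {
    const winner = analytics.variants.find(
      (v: { variantId: number }) => v.variantId === analytics.winningVariantId,
    );
    console.log(`Winner: ${winner?.variantName} (+${winner?.uplift}% uplift, p = ${analytics.pValue})`);
  } else {
    console.log('No statistically significant winner yet.');
  }
}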

Best Practices

1. Set Clear Success Metrics

Define what success looks like before starting:

Campaign: "Checkout Flow Test"

Primary Metric:
- Purchase completion rate
- Goal: +10% improvement
- Minimum: 100 conversions per variant

Secondary Metrics:
- Time to checkout
- Cart abandonment rate
- Average order value

2. Run Tests Long Enough

Minimum Duration:
- 1-2 weeks (to capture weekly patterns)
- 100+ conversions per variant
- Achieve statistical significance

Don't Stop Early:
❌ "Variant B is winning after 2 days!"
→ Too early, need more data
✅ "After 2 weeks with 200 conversions,
Variant B is winning with 95% confidence"

3. Consider Sample Size

Required Sample Size Calculator:

Baseline Conversion Rate: 3%
Minimum Detectable Effect: 20% improvement
(3% → 3.6%)

Required per variant:
- Impressions: ~8,500
- Conversions: ~255
- Duration: 2-3 weeks (at 500 users/day)
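
The figures above come from a standard sample-size formula for comparing two proportions. The sketch below shows one common approximation so you can plug in your own baseline; it is generic statistics, not a Resync feature, and different calculators (lower power, one-sided tests) will give different totals.

// Generic sample-size approximation for a two-proportion test (not a Resync API).
// zAlpha ≈ 1.96 for 95% confidence (two-sided), zBeta ≈ 0.84 for 80% power.
function requiredImpressionsPerVariant(
  baselineRate: number,           // e.g. 0.03 for 3%
  relativeLift: number,           // e.g. 0.20 for a 20% improvement
  zAlpha = 1.96,
  zBeta = 0.84,
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;

  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));

  return Math.ceil((numerator ** 2) / ((p2 - p1) ** 2));
}

// requiredImpressionsPerVariant(0.03, 0.20) → ≈ 13,900 per variant with these defaults;
// calculators using lower power or one-sided tests give smaller figures such as the ~8,500 above.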

4. Track the Full Funnel

// Track intermediate steps
Resync.logEvent({ eventId: 'evt_view_product' });
Resync.logEvent({ eventId: 'evt_add_to_cart' });
Resync.logEvent({ eventId: 'evt_view_cart' });
Resync.logEvent({ eventId: 'evt_checkout_started' });
Resync.logEvent({ eventId: 'evt_purchase_completed' }); // Conversion

// Analyze where users drop off
Funnel Analysis:
- Product Views: 10,000
- Add to Cart: 3,000 (30%)
- View Cart: 2,500 (25%)
- Checkout Started: 1,500 (15%)
- Purchase Completed: 450 (4.5%)

Troubleshooting

Users Not Seeing Variants

Possible Causes:

  1. Campaign is not ACTIVE
  2. Campaign hasn't started yet
  3. User is not in target audience
  4. API/SDK not initialized

Solutions:

  1. Check campaign status
  2. Check user audience membership

Conversions Not Tracking

Problem: Events logged but no conversions in analytics

Possible Causes:

  1. Event ID doesn't match campaign goal
  2. User has no impression
  3. Event logged before impression
  4. Campaign has ended

Solutions:

  1. Verify event ID matches
  2. Ensure impression before conversion
  3. Check timing

No Statistical Significance

Problem: Test running for weeks but no clear winner

Possible Causes:

  1. Sample size too small
  2. Variants too similar
  3. High variance in data

Solutions:

  1. Check sample size
  2. Check effect size
  3. Increase traffic
    • Current: 500 users/day
    • Action: Promote experiment to more users

Unequal Variant Distribution

Problem: One variant getting more traffic than expected

Possible Causes:

  1. Traffic allocation settings incorrect
  2. Users seeing multiple variants (cookies cleared)
  3. Bot traffic

Solutions:

// 1. Verify allocation
Campaign settings:
Variant A: 50%
Variant B: 50%

Actual distribution:
Variant A: 5,234 (50.2%)
Variant B: 5,189 (49.8%)
Result: Normal variance

// 2. Set user ID consistently
await Resync.loginUser(user.id); // Use actual user ID


Need Help?