PostHog Integrations: Connect Your Stack | OpsBlu Docs

Connect PostHog with data warehouses, CDPs, messaging tools, and your entire stack

Overview

PostHog is built to play well with your existing stack. Whether you're using Segment for customer data, Slack for team communication, BigQuery for data warehousing, or Salesforce for CRM, PostHog integrates seamlessly to create a unified analytics ecosystem.

The platform offers three main types of integrations:

  1. Data pipelines: Export PostHog data to warehouses and BI tools
  2. Customer data platforms: Ingest data from Segment, RudderStack, and mParticle
  3. Apps and notifications: Send insights to Slack, Teams, webhooks, and more

Because PostHog is open-source, you can also build custom integrations using the API, webhooks, or by forking the platform itself. This flexibility makes PostHog adaptable to virtually any tech stack.

Deployment Options

PostHog Cloud

PostHog Cloud is the fastest way to get started. No infrastructure to manage, automatic scaling, and all features included.

Setup:

  1. Sign up at app.posthog.com
  2. Create a project and get your API key
  3. Add the PostHog snippet to your app
  4. Start tracking events immediately
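Step 3's snippet can be as small as the following sketch. `YOUR_PROJECT_API_KEY` is a placeholder for the key from your project settings:

```javascript
// Minimal initialization sketch for posthog-js (run once at app startup).
// Replace the placeholder key with your own project API key.
import posthog from 'posthog-js';

posthog.init('YOUR_PROJECT_API_KEY', {
  api_host: 'https://app.posthog.com', // or your EU host, if applicable
});

// Events are captured with a name plus optional properties.
posthog.capture('app_loaded', { source: 'docs_example' });
```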

Pricing:

  • Free tier: 1 million events/month, 5,000 session recordings, unlimited feature flags
  • Usage-based: Pay only for what you use beyond free tier
  • Transparent pricing: No hidden fees or surprise bills

Benefits:

  • Managed infrastructure and automatic updates
  • Global CDN for fast SDK loading
  • Built-in DDoS protection and security
  • 99.9% uptime SLA
  • EU and US data hosting options

Best for:

  • Teams that want to move fast without DevOps overhead
  • Startups and scale-ups focused on product development
  • Companies comfortable with cloud-hosted analytics

Self-Hosted

Self-hosting gives you complete control. All data stays on your infrastructure, and you own the entire stack.

Deployment Options:

  • Docker Compose: For development and small teams
  • Kubernetes: For production and scale
  • AWS, GCP, Azure: Using PostHog's deployment guides
  • On-premises: For air-gapped or highly regulated environments

Setup (Docker Compose):

# Clone PostHog
git clone https://github.com/PostHog/posthog.git
cd posthog

# Start services
docker-compose up -d

# PostHog will be available at http://localhost:8000

Setup (Kubernetes via Helm):

# Add PostHog Helm repo
helm repo add posthog https://posthog.github.io/charts-clickhouse
helm repo update

# Install PostHog
helm install posthog posthog/posthog \
  --set cloud=false \
  --set ingress.hostname=posthog.yourdomain.com

# Check deployment
kubectl get pods -n posthog

System Requirements:

  • Minimum: 4GB RAM, 2 CPU cores, 20GB storage
  • Recommended: 16GB RAM, 4+ CPU cores, 100GB+ SSD storage
  • Scale: PostHog can handle billions of events with proper infrastructure

Benefits:

  • Complete data ownership and privacy
  • No limits on events, users, or features
  • Customizable and extendable (it's open-source)
  • No recurring SaaS fees
  • Compliance-friendly (HIPAA, SOC 2, GDPR)

Considerations:

  • You manage updates, scaling, and backups
  • Requires DevOps expertise
  • Infrastructure costs (servers, storage, bandwidth)

Best for:

  • Enterprises with strict data governance requirements
  • Healthcare, finance, or regulated industries
  • High-volume products where self-hosting is cost-effective
  • Teams with existing infrastructure and DevOps capacity

Which Should You Choose?

| Factor | PostHog Cloud | Self-Hosted |
| --- | --- | --- |
| Setup time | Minutes | Hours to days |
| Maintenance | Zero | Ongoing |
| Data control | Shared | Full ownership |
| Compliance | SOC 2, GDPR | You control |
| Cost at scale | Higher (usage-based) | Lower (infra only) |
| Updates | Automatic | Manual |
| Best for | Speed & simplicity | Control & compliance |

Rule of thumb:

  • Start with PostHog Cloud for speed
  • Self-host if you have specific compliance needs or hit scale where costs favor self-hosting
  • You can migrate from Cloud to self-hosted (or vice versa) later

SDK & Platform Support

PostHog offers official SDKs for virtually every platform and language.

Web & Mobile SDKs

JavaScript (Browser):

npm install posthog-js
# or
yarn add posthog-js

React:

npm install posthog-js
# Includes React hooks and context

React Native:

npm install posthog-react-native

iOS (Swift/Objective-C):

pod 'PostHog', '~> 3.0'

Android (Kotlin/Java):

implementation 'com.posthog:posthog-android:3.+'

Flutter:

dependencies:
  posthog_flutter: ^3.0.0

Server-Side SDKs

Node.js:

npm install posthog-node

Python:

pip install posthog

Ruby:

gem install posthog-ruby

PHP:

composer require posthog/posthog-php

Go:

go get github.com/posthog/posthog-go

Java:

<dependency>
  <groupId>com.posthog.java</groupId>
  <artifactId>posthog</artifactId>
  <version>3.0.0</version>
</dependency>

Rust:

cargo add posthog-rs

Elixir:

{:posthog, "~> 0.1"}

Framework Integrations

Next.js:

// pages/_app.js
import posthog from 'posthog-js';
import { PostHogProvider } from 'posthog-js/react';

if (typeof window !== 'undefined') {
  posthog.init('YOUR_API_KEY', {
    api_host: 'https://app.posthog.com',
    loaded: (posthog) => {
      if (process.env.NODE_ENV === 'development') posthog.debug();
    }
  });
}

export default function App({ Component, pageProps }) {
  return (
    <PostHogProvider client={posthog}>
      <Component {...pageProps} />
    </PostHogProvider>
  );
}

Vue.js:

// main.js
import { createApp } from 'vue';
import posthog from 'posthog-js';
import App from './App.vue';

const app = createApp(App);

app.config.globalProperties.$posthog = posthog.init(
  'YOUR_API_KEY',
  {
    api_host: 'https://app.posthog.com'
  }
);

app.mount('#app');

Svelte:

// src/lib/posthog.js (a shared module, imported from your Svelte components)
import posthog from 'posthog-js';

posthog.init('YOUR_API_KEY', {
  api_host: 'https://app.posthog.com'
});

export { posthog };

Data Warehouse Integrations

Export PostHog data to your data warehouse for custom analysis, reporting, and ML pipelines.

BigQuery

Setup:

  1. Go to Data Pipeline > Destinations in PostHog
  2. Click New Destination > BigQuery
  3. Provide your Google Cloud project ID and dataset name
  4. Authenticate with Google Cloud
  5. Select which events to export

What gets exported:

  • Events with all properties
  • Person properties
  • Session data
  • Feature flag states

Frequency:

  • Real-time export (events appear within minutes)
  • Batched for efficiency

Schema:

-- Events table
SELECT
  timestamp,
  event,
  distinct_id,
  properties
FROM `your_project.posthog.events`
WHERE DATE(timestamp) = CURRENT_DATE()

Snowflake

Setup:

  1. Create a Snowflake database and warehouse
  2. Generate Snowflake credentials (account, user, password, role)
  3. In PostHog: Data Pipeline > Destinations > Snowflake
  4. Enter Snowflake connection details
  5. Select events to export

Benefits:

  • Join PostHog analytics with sales, support, and marketing data
  • Build custom dashboards in Looker, Tableau, or Metabase
  • Run ML models on user behavior data

Redshift

Setup:

  1. Provision a Redshift cluster
  2. Create a database and schema for PostHog data
  3. In PostHog: Data Pipeline > Destinations > Redshift
  4. Provide connection string and credentials
  5. Configure export schedule

PostgreSQL / MySQL

Setup:

  1. In PostHog: Data Pipeline > Destinations > PostgreSQL (or MySQL)
  2. Provide database connection details
  3. PostHog creates tables automatically
  4. Events flow continuously

Use cases:

  • Custom reporting queries
  • Integration with internal tools
  • Backups and archival

Customer Data Platforms

Segment

PostHog integrates with Segment in both directions:

Send data FROM Segment TO PostHog:

// Segment tracks the event
analytics.track('Button Clicked', {
  button_name: 'signup_cta'
});

// PostHog receives it automatically (if Segment destination is configured)

Setup:

  1. In Segment: Add PostHog as a destination
  2. Enter your PostHog API key and project URL
  3. Enable the destination
  4. All Segment events flow to PostHog

Send data FROM PostHog TO Segment:

// Track in PostHog
posthog.capture('feature_used', {
  feature_name: 'advanced_reports'
});

// Configure export to Segment in PostHog settings

Benefits:

  • Single source of truth for customer data
  • Use Segment's extensive destination catalog
  • Migrate from other analytics tools without changing instrumentation

RudderStack

RudderStack is an open-source Segment alternative.

Setup:

  1. Add PostHog as a destination in RudderStack
  2. Configure connection settings
  3. Events tracked via RudderStack appear in PostHog

Why RudderStack + PostHog:

  • Both are open-source
  • Both support self-hosting
  • Complete control over your data pipeline

mParticle

Setup:

  1. In mParticle: Configure PostHog as an output
  2. Provide PostHog API credentials
  3. Map mParticle events to PostHog events

Messaging & Notifications

Slack

Get real-time alerts and insights delivered to Slack.

Setup:

  1. In PostHog: Apps > Slack
  2. Click Install and authorize with Slack
  3. Create subscriptions for specific insights

Use cases:

  • Alert when errors spike
  • Notify when experiments reach significance
  • Daily digest of key metrics
  • User activity notifications (e.g., "New enterprise signup!")

Example:

PostHog Alert:

Errors increased by 230% in the last hour
Event: "api_error"
Count: 147 (normal: 45)

View in PostHog: https://app.posthog.com/...

Microsoft Teams

Same functionality as Slack, delivered to Teams channels.

Setup:

  1. In PostHog: Apps > Microsoft Teams
  2. Generate a webhook URL in Teams
  3. Configure PostHog to send alerts to that webhook

Discord

Setup:

  1. Create a Discord webhook in your server settings
  2. In PostHog: Apps > Webhook
  3. Enter Discord webhook URL
  4. Configure event triggers

CRM & Marketing Automation

Salesforce

Sync PostHog cohorts and events to Salesforce for enriched customer profiles.

Setup:

  1. In PostHog: Apps > Salesforce
  2. Authenticate with Salesforce
  3. Map PostHog properties to Salesforce fields

Use cases:

  • Enrich lead records with product usage data
  • Trigger sales outreach based on product behavior
  • Segment customers by engagement level

Example mapping:

  • PostHog plan → Salesforce Subscription_Tier__c
  • PostHog total_logins → Salesforce Login_Count__c
  • PostHog cohort "Power Users" → Salesforce campaign
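In code form, a property mapping like this is just a lookup table. The sketch below is illustrative, not the Salesforce app's actual implementation; the field names mirror the example mapping above:

```javascript
// Hypothetical mapping from PostHog person properties to Salesforce fields.
const FIELD_MAP = {
  plan: 'Subscription_Tier__c',
  total_logins: 'Login_Count__c',
};

// Keep only mapped properties, renamed to their Salesforce field names.
function toSalesforceFields(personProperties) {
  const out = {};
  for (const [key, value] of Object.entries(personProperties)) {
    if (FIELD_MAP[key]) out[FIELD_MAP[key]] = value;
  }
  return out;
}

const fields = toSalesforceFields({ plan: 'enterprise', total_logins: 42, other: true });
// fields: { Subscription_Tier__c: 'enterprise', Login_Count__c: 42 }
```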

HubSpot

Setup:

  1. In PostHog: Apps > HubSpot
  2. Authenticate with HubSpot
  3. Select which events and properties to sync

Use cases:

  • Trigger email campaigns based on product usage
  • Score leads using engagement data
  • Personalize marketing based on feature adoption

Intercom

Setup:

  1. In PostHog: Apps > Intercom
  2. Provide Intercom API key
  3. PostHog sends events and user properties to Intercom

Benefits:

  • Support teams see user behavior in Intercom
  • Trigger messages based on product actions
  • Enrich customer profiles with analytics data

API & Webhooks

PostHog API

The PostHog API gives you programmatic access to everything: events, persons, cohorts, insights, feature flags, and more.

Authentication:

curl https://app.posthog.com/api/projects/ \
  -H "Authorization: Bearer YOUR_API_KEY"

Note: endpoints under /api/ expect a personal API key in the Authorization header; the /capture/ and /decide/ endpoints instead take your project API key in the request body.

Common operations:

Capture events:

curl -X POST https://app.posthog.com/capture/ \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "YOUR_PROJECT_API_KEY",
    "event": "button_clicked",
    "distinct_id": "user_123",
    "properties": {
      "button_name": "signup_cta"
    }
  }'

Get feature flag value:

curl -X POST https://app.posthog.com/decide/ \
  -H "Content-Type: application/json" \
  -d '{
    "api_key": "YOUR_PROJECT_API_KEY",
    "distinct_id": "user_123"
  }'

Query insights:

curl https://app.posthog.com/api/projects/PROJECT_ID/insights/ \
  -H "Authorization: Bearer YOUR_API_KEY"

Create cohort:

curl -X POST https://app.posthog.com/api/projects/PROJECT_ID/cohorts/ \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Power Users",
    "filters": {
      "properties": {
        "type": "AND",
        "values": [
          {
            "key": "login_count",
            "value": 10,
            "operator": "gte"
          }
        ]
      }
    }
  }'

Webhooks

PostHog can send webhooks when specific events occur.

Setup:

  1. In PostHog: Apps > Webhook
  2. Enter webhook URL
  3. Configure trigger conditions

Supported triggers:

  • New event matches filter
  • User enters cohort
  • Experiment reaches significance
  • Insight threshold crossed

Payload example:

{
  "event": "purchase_completed",
  "distinct_id": "user_123",
  "properties": {
    "order_id": "ORDER_456",
    "revenue": 99.99
  },
  "timestamp": "2024-01-20T14:30:00Z"
}
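A receiver for a payload like the one above can start with a small validation step before any side effects. This is a sketch, not an official PostHog integration; the field names follow the example payload, and the validation logic is illustrative:

```javascript
// Hypothetical webhook handler core: validate and normalize an incoming payload.
// Field names follow the example payload; adjust to your own configuration.
function parseWebhookPayload(body) {
  const { event, distinct_id: distinctId, properties = {}, timestamp } = body;
  if (!event || !distinctId) {
    throw new Error('Invalid webhook payload: "event" and "distinct_id" are required');
  }
  return { event, distinctId, properties, receivedAt: timestamp || null };
}

const sample = {
  event: 'purchase_completed',
  distinct_id: 'user_123',
  properties: { order_id: 'ORDER_456', revenue: 99.99 },
  timestamp: '2024-01-20T14:30:00Z',
};

const parsed = parseWebhookPayload(sample);
```

In a real endpoint you would call this inside your HTTP framework's request handler, returning 400 on validation failure so PostHog's delivery logs show the error.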

Use cases:

  • Trigger custom workflows in Zapier or Make
  • Send data to internal systems
  • Update external databases
  • Fire cloud functions (AWS Lambda, Google Cloud Functions)

Community Apps

PostHog has a growing app ecosystem built by the community.

Popular apps:

  • GeoIP Enrichment: Add location data to events
  • Property Filter: Filter out specific events or properties
  • Schema Enforcer: Validate event structure
  • Downsampling: Reduce event volume for high-traffic apps
  • PII Manager: Automatically detect and redact sensitive data

Install apps:

  1. Go to Apps in PostHog
  2. Browse available apps
  3. Click Install on any app
  4. Configure settings

Build your own: PostHog apps are just JavaScript. You can build custom transformations, integrations, or enrichments.

// Example: add a timezone property to every event.
// Note: apps run on the PostHog server, so this records the server's
// timezone; capture a client-side property if you need the user's.
export function processEvent(event) {
  event.properties = event.properties || {};
  event.properties.timezone = Intl.DateTimeFormat().resolvedOptions().timeZone;
  return event;
}

Migration from Other Platforms

Migrating from Amplitude

Import historical data:

  1. Export events from Amplitude (CSV or API)
  2. Transform to PostHog event format
  3. Use PostHog batch import API
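Step 2 (transforming the export format) usually amounts to renaming a few fields. A hedged sketch, assuming common Amplitude export fields (`event_type`, `user_id`, `device_id`, `event_time`, `event_properties`); verify these against your actual export:

```javascript
// Hypothetical transform: one Amplitude export row -> one PostHog event.
// Amplitude field names here are assumptions based on its typical export.
function amplitudeToPostHog(row) {
  return {
    event: row.event_type,
    distinct_id: row.user_id || row.device_id, // fall back to device for anonymous events
    timestamp: row.event_time,
    properties: { ...(row.event_properties || {}) },
  };
}

const row = {
  event_type: 'Button Clicked',
  user_id: 'user_123',
  event_time: '2024-01-20 14:30:00',
  event_properties: { button_name: 'signup' },
};

const posthogEvent = amplitudeToPostHog(row);
```

Transformed events can then be sent in chunks via the batch import API rather than one request per event.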

Switch tracking:

  1. Install PostHog SDK alongside Amplitude
  2. Gradually replace Amplitude calls with PostHog
  3. Run both in parallel during transition
  4. Remove Amplitude once confident

Migrating from Mixpanel

Import data:

  1. Use Mixpanel export API to get events
  2. Map Mixpanel event structure to PostHog
  3. Batch import to PostHog

Code migration:

// Mixpanel
mixpanel.track('Button Clicked', { button_name: 'signup' });

// PostHog (very similar)
posthog.capture('button_clicked', { button_name: 'signup' });

Migrating from Google Analytics

Considerations:

  • GA focuses on pageviews; PostHog focuses on events
  • You'll need to rethink your tracking strategy
  • PostHog offers more granular behavioral insights

Approach:

  1. Define key events that map to GA goals
  2. Implement PostHog event tracking
  3. Run both GA and PostHog in parallel
  4. Validate data matches expectations
  5. Sunset GA once PostHog is trusted

Best Practices

Integration Strategy

Do:

  • Start with one integration and validate before adding more
  • Document which data flows where
  • Set up monitoring for integration health
  • Test integrations in staging before production
  • Use PostHog's debug mode during setup

Don't:

  • Create circular data flows (A → B → A)
  • Send sensitive data to third parties without consent
  • Over-integrate (keep it simple)
  • Forget to handle integration failures gracefully

Data Consistency

  • Use the same distinct_id across all platforms
  • Maintain consistent event naming conventions
  • Sync user properties bidirectionally where needed
  • Validate data quality regularly
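Consistent event naming is easiest to enforce in code rather than by convention alone. A small sketch that normalizes names to snake_case before capture; the convention itself is a team choice, not a PostHog requirement:

```javascript
// Normalize event names like "Button Clicked" or "buttonClicked" to "button_clicked"
// so the same action never appears under two spellings.
function toSnakeCase(name) {
  return name
    .trim()
    .replace(/([a-z0-9])([A-Z])/g, '$1_$2') // split camelCase boundaries
    .replace(/[\s-]+/g, '_')                // spaces and hyphens -> underscores
    .toLowerCase();
}

const a = toSnakeCase('Button Clicked'); // 'button_clicked'
const b = toSnakeCase('buttonClicked');  // 'button_clicked'
```

Wrapping your capture calls in a helper that applies this keeps every SDK and server emitting the same names.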

Performance

  • Batch events when possible (PostHog SDKs do this automatically)
  • Use server-side tracking for high-volume events
  • Monitor integration latency
  • Set appropriate timeouts for external APIs
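PostHog's SDKs batch for you, but if you forward events to an external API yourself, the same pattern is simple to sketch: buffer events and flush when the buffer fills (flush-on-interval and retry handling are omitted here for brevity):

```javascript
// Minimal illustrative batcher: collects events and hands full batches to `send`.
class EventBatcher {
  constructor(send, flushAt = 20) {
    this.send = send;       // function that receives an array of events
    this.flushAt = flushAt; // flush once this many events are buffered
    this.buffer = [];
  }

  capture(event) {
    this.buffer.push(event);
    if (this.buffer.length >= this.flushAt) this.flush();
  }

  flush() {
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    this.send(batch); // one request per batch instead of one per event
  }
}

// Usage: four events with flushAt = 3 produce one full batch plus a final partial one.
const sent = [];
const batcher = new EventBatcher((batch) => sent.push(batch), 3);
['a', 'b', 'c', 'd'].forEach((e) => batcher.capture({ event: e }));
batcher.flush(); // drain the remainder on shutdown
```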

Additional Resources: