Overview
PostHog is built to play well with your existing stack. Whether you're using Segment for customer data, Slack for team communication, BigQuery for data warehousing, or Salesforce for CRM, PostHog integrates seamlessly to create a unified analytics ecosystem.
The platform offers three main types of integrations:
- Data pipelines: Export PostHog data to warehouses and BI tools
- Customer data platforms: Ingest data from Segment, RudderStack, and mParticle
- Apps and notifications: Send insights to Slack, Teams, webhooks, and more
Because PostHog is open-source, you can also build custom integrations using the API, webhooks, or by forking the platform itself. This flexibility makes PostHog adaptable to virtually any tech stack.
Deployment Options
PostHog Cloud
PostHog Cloud is the fastest way to get started. No infrastructure to manage, automatic scaling, and all features included.
Setup:
- Sign up at app.posthog.com
- Create a project and get your API key
- Add the PostHog snippet to your app
- Start tracking events immediately
Pricing:
- Free tier: 1 million events/month, 5,000 session recordings, unlimited feature flags
- Usage-based: Pay only for what you use beyond free tier
- Transparent pricing: No hidden fees or surprise bills
Benefits:
- Managed infrastructure and automatic updates
- Global CDN for fast SDK loading
- Built-in DDoS protection and security
- 99.9% uptime SLA
- EU and US data hosting options
Best for:
- Teams that want to move fast without DevOps overhead
- Startups and scale-ups focused on product development
- Companies comfortable with cloud-hosted analytics
Self-Hosted
Self-hosting gives you complete control. All data stays on your infrastructure, and you own the entire stack.
Deployment Options:
- Docker Compose: For development and small teams
- Kubernetes: For production and scale
- AWS, GCP, Azure: Using PostHog's deployment guides
- On-premises: For air-gapped or highly regulated environments
Setup (Docker Compose):
# Clone PostHog
git clone https://github.com/PostHog/posthog.git
cd posthog
# Start services
docker-compose up -d
# PostHog will be available at http://localhost:8000
Setup (Kubernetes via Helm):
# Add PostHog Helm repo
helm repo add posthog https://posthog.github.io/charts-clickhouse
helm repo update
# Install PostHog
helm install posthog posthog/posthog \
--set cloud=false \
--set ingress.hostname=posthog.yourdomain.com
# Check deployment
kubectl get pods -n posthog
System Requirements:
- Minimum: 4GB RAM, 2 CPU cores, 20GB storage
- Recommended: 16GB RAM, 4+ CPU cores, 100GB+ SSD storage
- Scale: PostHog can handle billions of events with proper infrastructure
Benefits:
- Complete data ownership and privacy
- No limits on events, users, or features
- Customizable and extendable (it's open-source)
- No recurring SaaS fees
- Compliance-friendly (HIPAA, SOC 2, GDPR)
Considerations:
- You manage updates, scaling, and backups
- Requires DevOps expertise
- Infrastructure costs (servers, storage, bandwidth)
Best for:
- Enterprises with strict data governance requirements
- Healthcare, finance, or regulated industries
- High-volume products where self-hosting is cost-effective
- Teams with existing infrastructure and DevOps capacity
Which Should You Choose?
| Factor | PostHog Cloud | Self-Hosted |
|---|---|---|
| Setup time | Minutes | Hours to days |
| Maintenance | Zero | Ongoing |
| Data control | PostHog-managed | Full ownership |
| Compliance | SOC 2, GDPR | You control |
| Cost at scale | Higher (usage-based) | Lower (infra only) |
| Updates | Automatic | Manual |
| Best for | Speed & simplicity | Control & compliance |
Rule of thumb:
- Start with PostHog Cloud for speed
- Self-host if you have specific compliance needs or hit scale where costs favor self-hosting
- You can migrate from Cloud to self-hosted (or vice versa) later
SDK & Platform Support
PostHog offers official SDKs for virtually every platform and language.
Web & Mobile SDKs
JavaScript (web):
npm install posthog-js
# or
yarn add posthog-js
React:
npm install posthog-js
# Includes React hooks and context via posthog-js/react
React Native:
npm install posthog-react-native
iOS (Swift/Objective-C):
pod 'PostHog', '~> 3.0'
Android (Kotlin/Java):
implementation 'com.posthog:posthog-android:3.+'
Flutter:
dependencies:
  posthog_flutter: ^3.0.0
Server-Side SDKs
Node.js:
npm install posthog-node
Python:
pip install posthog
Ruby:
gem install posthog-ruby
PHP:
composer require posthog/posthog-php
Go:
go get github.com/posthog/posthog-go
Java:
<dependency>
<groupId>com.posthog.java</groupId>
<artifactId>posthog</artifactId>
<version>3.0.0</version>
</dependency>
Rust:
cargo add posthog-rs
Elixir:
{:posthog, "~> 0.1"}
Framework Integrations
Next.js:
// pages/_app.js
import posthog from 'posthog-js';
import { PostHogProvider } from 'posthog-js/react';
if (typeof window !== 'undefined') {
posthog.init('YOUR_API_KEY', {
api_host: 'https://app.posthog.com',
loaded: (posthog) => {
if (process.env.NODE_ENV === 'development') posthog.debug();
}
});
}
export default function App({ Component, pageProps }) {
return (
<PostHogProvider client={posthog}>
<Component {...pageProps} />
</PostHogProvider>
);
}
Vue:
// main.js
import { createApp } from 'vue';
import posthog from 'posthog-js';
const app = createApp(App);
app.config.globalProperties.$posthog = posthog.init(
'YOUR_API_KEY',
{
api_host: 'https://app.posthog.com'
}
);
app.mount('#app');
Svelte:
// App.svelte
import posthog from 'posthog-js';
posthog.init('YOUR_API_KEY', {
api_host: 'https://app.posthog.com'
});
export { posthog };
Data Warehouse Integrations
Export PostHog data to your data warehouse for custom analysis, reporting, and ML pipelines.
BigQuery
Setup:
- Go to Data Pipeline > Destinations in PostHog
- Click New Destination > BigQuery
- Provide your Google Cloud project ID and dataset name
- Authenticate with Google Cloud
- Select which events to export
What gets exported:
- Events with all properties
- Person properties
- Session data
- Feature flag states
Frequency:
- Real-time export (events appear within minutes)
- Batched for efficiency
Schema:
-- Events table
SELECT
timestamp,
event,
distinct_id,
properties
FROM `your_project.posthog.events`
WHERE DATE(timestamp) = CURRENT_DATE()
Snowflake
Setup:
- Create a Snowflake database and warehouse
- Generate Snowflake credentials (account, user, password, role)
- In PostHog: Data Pipeline > Destinations > Snowflake
- Enter Snowflake connection details
- Select events to export
Benefits:
- Join PostHog analytics with sales, support, and marketing data
- Build custom dashboards in Looker, Tableau, or Metabase
- Run ML models on user behavior data
Redshift
Setup:
- Provision a Redshift cluster
- Create a database and schema for PostHog data
- In PostHog: Data Pipeline > Destinations > Redshift
- Provide connection string and credentials
- Configure export schedule
PostgreSQL / MySQL
Setup:
- In PostHog: Data Pipeline > Destinations > PostgreSQL (or MySQL)
- Provide database connection details
- PostHog creates tables automatically
- Events flow continuously
Use cases:
- Custom reporting queries
- Integration with internal tools
- Backups and archival
Customer Data Platforms
Segment
PostHog integrates with Segment in both directions:
Send data FROM Segment TO PostHog:
// Segment tracks the event
analytics.track('Button Clicked', {
button_name: 'signup_cta'
});
// PostHog receives it automatically (if Segment destination is configured)
Setup:
- In Segment: Add PostHog as a destination
- Enter your PostHog API key and project URL
- Enable the destination
- All Segment events flow to PostHog
Send data FROM PostHog TO Segment:
// Track in PostHog
posthog.capture('feature_used', {
feature_name: 'advanced_reports'
});
// Configure export to Segment in PostHog settings
Benefits:
- Single source of truth for customer data
- Use Segment's extensive destination catalog
- Migrate from other analytics tools without changing instrumentation
RudderStack
RudderStack is an open-source Segment alternative.
Setup:
- Add PostHog as a destination in RudderStack
- Configure connection settings
- Events tracked via RudderStack appear in PostHog
Why RudderStack + PostHog:
- Both are open-source
- Both support self-hosting
- Complete control over your data pipeline
mParticle
Setup:
- In mParticle: Configure PostHog as an output
- Provide PostHog API credentials
- Map mParticle events to PostHog events
Messaging & Notifications
Slack
Get real-time alerts and insights delivered to Slack.
Setup:
- In PostHog: Apps > Slack
- Click Install and authorize with Slack
- Create subscriptions for specific insights
Use cases:
- Alert when errors spike
- Notify when experiments reach significance
- Daily digest of key metrics
- User activity notifications (e.g., "New enterprise signup!")
Example:
PostHog Alert:
Errors increased by 230% in the last hour
Event: "api_error"
Count: 147 (normal: 45)
View in PostHog: https://app.posthog.com/...
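Under the hood, an alert like this is a threshold comparison between current and baseline volume. A minimal sketch of that logic (the function name and the 200% threshold are illustrative, not PostHog's actual implementation):

```python
def error_spike(current_count, baseline_count, threshold_pct=200):
    """Return (spiked, pct_increase) comparing current volume to a baseline."""
    if baseline_count == 0:
        return current_count > 0, float("inf") if current_count else 0.0
    pct = (current_count - baseline_count) / baseline_count * 100
    return pct >= threshold_pct, pct

# The numbers from the alert above: 147 errors vs a normal baseline of 45
spiked, pct = error_spike(147, 45)
print(f"spiked={spiked}, increase={pct:.0f}%")  # prints spiked=True, increase=227%
```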
Microsoft Teams
Same functionality as Slack, delivered to Teams channels.
Setup:
- In PostHog: Apps > Microsoft Teams
- Generate a webhook URL in Teams
- Configure PostHog to send alerts to that webhook
Discord
Setup:
- Create a Discord webhook in your server settings
- In PostHog: Apps > Webhook
- Enter Discord webhook URL
- Configure event triggers
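The body sent to that Discord webhook URL is ordinary JSON with a `content` field holding the message (Discord rejects content over 2,000 characters). A sketch of building such a payload yourself, with an illustrative function name:

```python
import json

def discord_alert_payload(event_name, count, link):
    """Build the JSON body a Discord webhook accepts: a 'content' field
    holding the message text, truncated to Discord's 2000-char limit."""
    message = f"PostHog alert: {event_name} fired {count} times. {link}"
    return json.dumps({"content": message[:2000]})

body = discord_alert_payload("api_error", 147, "https://app.posthog.com/...")
print(body)
```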
CRM & Marketing Automation
Salesforce
Sync PostHog cohorts and events to Salesforce for enriched customer profiles.
Setup:
- In PostHog: Apps > Salesforce
- Authenticate with Salesforce
- Map PostHog properties to Salesforce fields
Use cases:
- Enrich lead records with product usage data
- Trigger sales outreach based on product behavior
- Segment customers by engagement level
Example mapping:
- PostHog plan → Salesforce Subscription_Tier__c
- PostHog total_logins → Salesforce Login_Count__c
- PostHog cohort "Power Users" → Salesforce campaign
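In code, a mapping like this is just a property-to-field dictionary. A sketch mirroring the example (the Salesforce field API names `Subscription_Tier__c` and `Login_Count__c` are placeholders for your org's schema):

```python
# Illustrative map of PostHog person properties to Salesforce custom fields.
FIELD_MAP = {
    "plan": "Subscription_Tier__c",
    "total_logins": "Login_Count__c",
}

def to_salesforce_fields(person_properties):
    """Translate PostHog person properties into a Salesforce field-update
    dict, silently skipping properties that have no mapping."""
    return {
        sf_field: person_properties[ph_prop]
        for ph_prop, sf_field in FIELD_MAP.items()
        if ph_prop in person_properties
    }

print(to_salesforce_fields({"plan": "enterprise", "total_logins": 42, "email": "a@b.co"}))
# prints {'Subscription_Tier__c': 'enterprise', 'Login_Count__c': 42}
```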
HubSpot
Setup:
- In PostHog: Apps > HubSpot
- Authenticate with HubSpot
- Select which events and properties to sync
Use cases:
- Trigger email campaigns based on product usage
- Score leads using engagement data
- Personalize marketing based on feature adoption
Intercom
Setup:
- In PostHog: Apps > Intercom
- Provide Intercom API key
- PostHog sends events and user properties to Intercom
Benefits:
- Support teams see user behavior in Intercom
- Trigger messages based on product actions
- Enrich customer profiles with analytics data
API & Webhooks
PostHog API
The PostHog API gives you programmatic access to everything: events, persons, cohorts, insights, feature flags, and more.
Authentication:
curl https://app.posthog.com/api/projects/ \
-H "Authorization: Bearer YOUR_API_KEY"
Common operations:
Capture events:
curl -X POST https://app.posthog.com/capture/ \
-H "Content-Type: application/json" \
-d '{
"api_key": "YOUR_PROJECT_API_KEY",
"event": "button_clicked",
"distinct_id": "user_123",
"properties": {
"button_name": "signup_cta"
}
}'
Get feature flag value:
curl -X POST https://app.posthog.com/decide/ \
-H "Content-Type: application/json" \
-d '{
"api_key": "YOUR_PROJECT_API_KEY",
"distinct_id": "user_123"
}'
Query insights:
curl https://app.posthog.com/api/projects/PROJECT_ID/insights/ \
-H "Authorization: Bearer YOUR_API_KEY"
Create cohort:
curl -X POST https://app.posthog.com/api/projects/PROJECT_ID/cohorts/ \
-H "Authorization: Bearer YOUR_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"name": "Power Users",
"filters": {
"properties": {
"type": "AND",
"values": [
{
"key": "login_count",
"value": 10,
"operator": "gte"
}
]
}
}
}'
Webhooks
PostHog can send webhooks when specific events occur.
Setup:
- In PostHog: Apps > Webhook
- Enter webhook URL
- Configure trigger conditions
Supported triggers:
- New event matches filter
- User enters cohort
- Experiment reaches significance
- Insight threshold crossed
Payload example:
{
"event": "purchase_completed",
"distinct_id": "user_123",
"properties": {
"order_id": "ORDER_456",
"revenue": 99.99
},
"timestamp": "2024-01-20T14:30:00Z"
}
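On the receiving end, your endpoint parses that JSON and routes on the event name. A minimal handler sketch for the payload above (the function name and tuple return shape are illustrative):

```python
import json

def handle_posthog_webhook(raw_body):
    """Parse a PostHog webhook body and pull out the fields we route on.
    Returns (event, revenue); revenue is None when the property is absent."""
    payload = json.loads(raw_body)
    event = payload["event"]
    revenue = payload.get("properties", {}).get("revenue")
    return event, revenue

body = ('{"event": "purchase_completed", "distinct_id": "user_123", '
        '"properties": {"order_id": "ORDER_456", "revenue": 99.99}, '
        '"timestamp": "2024-01-20T14:30:00Z"}')
print(handle_posthog_webhook(body))  # prints ('purchase_completed', 99.99)
```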
Use cases:
- Trigger custom workflows in Zapier or Make
- Send data to internal systems
- Update external databases
- Fire cloud functions (AWS Lambda, Google Cloud Functions)
Community Apps
PostHog has a growing app ecosystem built by the community.
Popular apps:
- GeoIP Enrichment: Add location data to events
- Property Filter: Filter out specific events or properties
- Schema Enforcer: Validate event structure
- Downsampling: Reduce event volume for high-traffic apps
- PII Manager: Automatically detect and redact sensitive data
Install apps:
- Go to Apps in PostHog
- Browse available apps
- Click Install on any app
- Configure settings
Build your own: PostHog apps are just JavaScript. You can build custom transformations, integrations, or enrichments.
// Example: Add timezone to all events
export function processEvent(event) {
event.properties.timezone = Intl.DateTimeFormat().resolvedOptions().timeZone;
return event;
}
Migration from Other Platforms
Migrating from Amplitude
Import historical data:
- Export events from Amplitude (CSV or API)
- Transform to PostHog event format
- Use PostHog batch import API
Switch tracking:
- Install PostHog SDK alongside Amplitude
- Gradually replace Amplitude calls with PostHog
- Run both in parallel during transition
- Remove Amplitude once confident
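The "transform to PostHog event format" step is mostly field renaming. A sketch under the assumption that each Amplitude export row carries `event_type`, `user_id`, `event_properties`, and an epoch-milliseconds `time` field; verify these names against your actual export before relying on them:

```python
from datetime import datetime, timezone

def amplitude_to_posthog(row):
    """Map one exported Amplitude event to a PostHog capture payload.
    Field names here follow Amplitude's assumed export shape; adjust to
    match what your export actually contains."""
    return {
        "event": row["event_type"],
        "distinct_id": row["user_id"],
        "properties": row.get("event_properties", {}),
        "timestamp": datetime.fromtimestamp(
            row["time"] / 1000, tz=timezone.utc
        ).isoformat(),
    }

row = {"event_type": "Button Clicked", "user_id": "u1",
       "event_properties": {"button_name": "signup"}, "time": 1705759800000}
print(amplitude_to_posthog(row))
```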
Migrating from Mixpanel
Import data:
- Use Mixpanel export API to get events
- Map Mixpanel event structure to PostHog
- Batch import to PostHog
Code migration:
// Mixpanel
mixpanel.track('Button Clicked', { button_name: 'signup' });
// PostHog (very similar)
posthog.capture('button_clicked', { button_name: 'signup' });
Migrating from Google Analytics
Considerations:
- GA focuses on pageviews; PostHog focuses on events
- You'll need to rethink your tracking strategy
- PostHog offers more granular behavioral insights
Approach:
- Define key events that map to GA goals
- Implement PostHog event tracking
- Run both GA and PostHog in parallel
- Validate data matches expectations
- Sunset GA once PostHog is trusted
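For the first step, it helps to write the goal-to-event mapping down explicitly so both systems can be compared event-for-event during the parallel run. A sketch with hypothetical goal and event names:

```python
# Hypothetical mapping of GA goal names to their replacement PostHog events.
GA_GOAL_TO_EVENT = {
    "Signup Goal": "signup_completed",
    "Checkout Goal": "purchase_completed",
    "Contact Form Goal": "contact_form_submitted",
}

def posthog_event_for_goal(goal_name):
    """Look up the PostHog event replacing a GA goal; None means the goal
    has no counterpart yet and still needs instrumentation."""
    return GA_GOAL_TO_EVENT.get(goal_name)

print(posthog_event_for_goal("Signup Goal"))  # prints signup_completed
```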
Best Practices
Integration Strategy
Do:
- Start with one integration and validate before adding more
- Document which data flows where
- Set up monitoring for integration health
- Test integrations in staging before production
- Use PostHog's debug mode during setup
Don't:
- Create circular data flows (A → B → A)
- Send sensitive data to third parties without consent
- Over-integrate (keep it simple)
- Forget to handle integration failures gracefully
Data Consistency
- Use the same distinct_id across all platforms
- Maintain consistent event naming conventions
- Sync user properties bidirectionally where needed
- Validate data quality regularly
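A consistent naming convention can be enforced in code before events are sent. A sketch that normalizes everything to snake_case, so variants of the same name collapse into one event (the function is illustrative, not part of any PostHog SDK):

```python
import re

def normalize_event_name(name):
    """Normalize an event name to snake_case so 'Button Clicked',
    'button-clicked', and 'buttonClicked' all collapse to one event."""
    name = re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name)  # split camelCase
    name = re.sub(r"[\s\-]+", "_", name)                  # spaces/hyphens -> _
    return re.sub(r"_+", "_", name).strip("_").lower()

for raw in ["Button Clicked", "button-clicked", "buttonClicked"]:
    print(normalize_event_name(raw))  # prints button_clicked three times
```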
Performance
- Batch events when possible (PostHog SDKs do this automatically)
- Use server-side tracking for high-volume events
- Monitor integration latency
- Set appropriate timeouts for external APIs
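The batching the SDKs do automatically amounts to buffering events and flushing once a size threshold is hit. A stripped-down sketch of that pattern, where `send` stands in for the network call (this mimics SDK behavior; it is not PostHog's actual implementation):

```python
class EventBatcher:
    """Buffer events and flush them in batches instead of one request each."""

    def __init__(self, send, batch_size=10):
        self.send = send          # callable that ships one batch (the "network call")
        self.batch_size = batch_size
        self.buffer = []

    def capture(self, event):
        self.buffer.append(event)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(self.buffer)
            self.buffer = []

batches = []
batcher = EventBatcher(batches.append, batch_size=3)
for i in range(7):
    batcher.capture({"event": "tick", "n": i})
batcher.flush()  # drain the remainder on shutdown
print([len(batch) for batch in batches])  # prints [3, 3, 1]
```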