99ersstudio

Content Engine

Self-hosted marketing-ops platform. 12 platform publishers, GTM + schema + redirects + attribution, MOCK_MODE-first, OAuth tokens Fernet-encrypted at rest.

12 publishers · 2 phases shipped · 25+ service modules

The problem

An agency or in-house marketing team that wants to publish a single piece of content to TikTok, YouTube, LinkedIn, Instagram, Facebook, Xing, GMB, an email list, the company website, an industry portal, and a press release feed currently buys five SaaS tools, glues them with Zapier, and still has to log into each platform to authorise OAuth. The data ends up in five silos, the OAuth tokens live in five vendor databases, and switching providers means re-doing every integration. We wanted one self-hosted box that owns the OAuth tokens, the SEO intelligence, the AI generation, the platform-specific variants, the publishing schedule, the attribution, and the A/B tests — and that boots green from a `docker compose up` with zero real API keys.

How we built it

  1. Shipped a `MOCK_MODE=True` default that stubs every external API — OpenAI, Claude, all twelve publishers, OAuth flows — so the stack boots, the UI renders, and end-to-end pipelines run on a fresh laptop without anyone touching a credential. Each integration gets its real implementation by flipping a single env var.
  2. Built one publisher contract (`backend/app/services/publishers/base.py`) and twelve concrete adapters — among them `tiktok`, `youtube`, `linkedin`, `instagram`, `facebook`, `xing`, `gmb`, `press`, `email`, `website`, and a generic `industry_portal` for niche destinations. Adding a thirteenth platform is one file and one row in the publisher registry.
  3. Stored every OAuth token Fernet-encrypted at rest with a `TOKEN_ENCRYPTION_KEY` env var, logged every OpenAI / platform API call to `api_usage_log` for cost tracking, and modeled content with a state machine (`draft → review → approved → scheduled → published`) so nothing ships without an explicit gate.
  4. Bundled the long tail an agency actually needs but rarely buys: GTM container generator, schema.org JSON-LD generator, redirect manager, sitemap engine, internal linker, hashtag generator, attribution engine, A/B test engine, landing-page engine, thumbnail generator, video compiler, QR generator, invoice + report generators, and an XLSX importer for client onboarding.
  5. Ran the whole thing on Docker Compose behind Caddy (auto-HTTPS), with ports deliberately offset from a parallel SEOMAX install (Postgres 5433, Redis 6380, API 8001, Caddy 81 / 444) so both stacks can coexist on one box. PostgreSQL 16 with JSONB for `platform_variants`, `seo_metadata`, and raw API responses; async SQLAlchemy 2.0 throughout.
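The MOCK_MODE-first pattern from step 1 can be sketched as a small client factory. This is an illustrative reconstruction, not the project's actual code — the class and function names (`MockOpenAIClient`, `get_llm_client`) are assumed:

```python
import os

# Mock-first default: a fresh checkout boots green with zero credentials.
MOCK_MODE = os.getenv("MOCK_MODE", "true").lower() == "true"

class MockOpenAIClient:
    """Returns canned completions so end-to-end pipelines run offline."""
    def complete(self, prompt: str) -> str:
        return f"[mock completion for: {prompt[:40]}]"

class RealOpenAIClient:
    """Placeholder for the real integration, enabled by flipping one env var."""
    def __init__(self, api_key: str):
        self.api_key = api_key
    def complete(self, prompt: str) -> str:
        raise NotImplementedError("real API call lives here")

def get_llm_client():
    # Flipping MOCK_MODE=false is the only change needed to go live.
    if MOCK_MODE:
        return MockOpenAIClient()
    return RealOpenAIClient(api_key=os.environ["OPENAI_API_KEY"])
```

The same shape repeats for each publisher and OAuth flow: every call site asks a factory for a client and never knows whether it got the stub or the real thing.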
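The "one file and one row in the registry" claim from step 2 hinges on a shared publisher contract. A minimal sketch, assuming an abstract base with self-registering subclasses (the names `BasePublisher`, `PublishResult`, and `PUBLISHER_REGISTRY` are illustrative, not the real identifiers):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class PublishResult:
    ok: bool
    platform: str
    external_id: str | None = None

# platform slug -> adapter class; one row per concrete publisher
PUBLISHER_REGISTRY: dict[str, type] = {}

class BasePublisher(ABC):
    platform: str = ""

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        if cls.platform:  # concrete adapters register themselves on import
            PUBLISHER_REGISTRY[cls.platform] = cls

    @abstractmethod
    def publish(self, variant: dict) -> PublishResult:
        """Push one platform-specific content variant; return the remote id."""

class TikTokPublisher(BasePublisher):
    platform = "tiktok"
    def publish(self, variant: dict) -> PublishResult:
        # MOCK_MODE implementation: pretend the upload succeeded.
        return PublishResult(ok=True, platform=self.platform, external_id="mock-123")
```

With this shape, a new platform really is one file: define the subclass, and `__init_subclass__` adds the registry row automatically.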
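The Fernet-at-rest scheme from step 3 is standard `cryptography` usage; a sketch with assumed helper names (`encrypt_token`, `decrypt_token` are illustrative):

```python
import os
from cryptography.fernet import Fernet  # third-party: pip install cryptography

def _fernet() -> Fernet:
    # TOKEN_ENCRYPTION_KEY is a urlsafe-base64 32-byte key,
    # generated once (e.g. via Fernet.generate_key()) and kept out of the DB.
    return Fernet(os.environ["TOKEN_ENCRYPTION_KEY"].encode())

def encrypt_token(plaintext: str) -> bytes:
    """Encrypt an OAuth token before it is written to Postgres."""
    return _fernet().encrypt(plaintext.encode())

def decrypt_token(ciphertext: bytes) -> str:
    """Decrypt a stored token just-in-time for an API call."""
    return _fernet().decrypt(ciphertext).decode()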
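The content state machine from step 3 (`draft → review → approved → scheduled → published`) can be enforced with a plain transition table. A minimal sketch — the rejection path from `review` back to `draft` is an assumption, not stated in the source:

```python
from enum import Enum

class ContentState(str, Enum):
    DRAFT = "draft"
    REVIEW = "review"
    APPROVED = "approved"
    SCHEDULED = "scheduled"
    PUBLISHED = "published"

# Every legal move; anything absent here is rejected, so nothing
# reaches "published" without passing the review and approval gates.
ALLOWED = {
    ContentState.DRAFT: {ContentState.REVIEW},
    ContentState.REVIEW: {ContentState.APPROVED, ContentState.DRAFT},  # assumed reject path
    ContentState.APPROVED: {ContentState.SCHEDULED},
    ContentState.SCHEDULED: {ContentState.PUBLISHED},
    ContentState.PUBLISHED: set(),
}

def transition(current: ContentState, target: ContentState) -> ContentState:
    if target not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {target.value}")
    return target
```

Guarding every state change through one function is what makes "nothing ships without an explicit gate" checkable rather than aspirational.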

Outcome

Phase 2 complete: AI Content Generation + Quality Engine. 109 files, German Fachartikel templates, SEO optimizer, hashtag generator, twelve publishers, every endpoint wired to Celery for async work. Full-stack `docker compose up --build` brings up Postgres + Redis + Caddy + FastAPI + Next.js with healthchecks gating the API on a ready database. Alembic migrations + a one-shot bootstrap that creates the first admin user.

Stack


Python 3.12 · FastAPI · Celery + Redis · async SQLAlchemy 2.0 · PostgreSQL 16 (JSONB, UUID PKs) · Next.js 15 + shadcn/ui + Recharts + next-intl · Docker Compose · Caddy reverse proxy with auto-HTTPS · Fernet for token encryption.

Next up

Phase 3: scheduled-posting system, sandbox testing with real customer data, spreadsheet import workflow, and the video workflow. Then real OAuth onboarding flows for each of the twelve publishers.
