Build a TikTok Content Factory with n8n + RenderIO

March 8, 2026 · RenderIO

A content factory produces TikTok-ready videos at scale

A single TikTok video is simple to make. Running 10 accounts with 3 posts per day each is a production problem. That's 30 unique, TikTok-optimized videos every day.

You can hire a video editor. Or you can build a factory.

A content factory is an automated pipeline. Drop source videos in, get TikTok-ready variations out. No manual editing. No per-video effort. n8n orchestrates the workflow. RenderIO handles the processing.

Use the RenderIO n8n node

RenderIO has a partner-verified community node on the n8n marketplace. Install from Settings → Community Nodes → search "renderio". The node supports chained commands (up to 10 sequential steps) and parallel execution — both useful for content factory pipelines.

The workflows below use HTTP Request nodes for granular control, but the same FFmpeg commands and file mappings work with the native node.

Factory architecture

Content Library → Daily Selector → Processing Pipeline → Quality Gate → Distribution Queue

  • Content Library: A folder or database of source videos.

  • Daily Selector: Picks today's videos based on schedule and account assignment.

  • Processing Pipeline: Resize, make unique, watermark, strip metadata.

  • Quality Gate: Verify output meets TikTok requirements.

  • Distribution Queue: Store processed videos for posting.

Node-by-node breakdown

Phase 1: Content selection

Node 1: Schedule Trigger

  • Runs daily at 5:00 AM

  • Gives the factory time to process before posting hours

Node 2: Fetch today's content (HTTP Request or Google Sheets)

Pull from a content calendar. Each row has:

  • source_video_url: The original video

  • accounts: Which accounts should post this video (comma-separated)

  • scheduled_date: When to post

// Code node: filter for today's content
const today = new Date().toISOString().split('T')[0];
const allContent = $input.all();

return allContent
  .filter(item => item.json.scheduled_date === today)
  .map(item => ({
    json: {
      sourceUrl: item.json.source_video_url,
      accounts: item.json.accounts.split(',').map(a => a.trim()),
      title: item.json.title
    }
  }));

Node 3: Expand accounts (Code node)

Turn each content item into one item per account:

const items = $input.all();
const expanded = [];

for (const item of items) {
  for (const account of item.json.accounts) {
    expanded.push({
      json: {
        sourceUrl: item.json.sourceUrl,
        title: item.json.title,
        account,
        jobId: `${account}_${Date.now()}`
      }
    });
  }
}

return expanded;

If you have 3 source videos each assigned to 10 accounts, this produces 30 items.

Phase 2: Base processing

Node 4: Split in Batches (size: 5)

Process 5 at a time to balance speed with rate limits.

Node 5: Resize to TikTok (HTTP Request)

{
  "ffmpeg_command": "-i {{in_video}} -vf \"scale=1080:1920:force_original_aspect_ratio=decrease,pad=1080:1920:(ow-iw)/2:(oh-ih)/2:black\" -c:v libx264 -crf 22 -preset fast -c:a aac -movflags +faststart {{out_video}}",
  "input_files": { "in_video": "{{ $json.sourceUrl }}" },
  "output_files": { "out_video": "base_{{ $json.jobId }}.mp4" }
}

Nodes 6-8: Poll for completion

Standard Wait → Check Status → IF loop.
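
The IF branch boils down to a small status check. A minimal sketch, assuming the status endpoint returns a `status` field with `SUCCESS`/`FAILED`/`PROCESSING` values (a hypothetical response shape — confirm against the RenderIO API docs):

```javascript
// Decide the polling route for one job.
// The { status } shape is an assumption, not the documented RenderIO response.
function routeJob(response) {
  if (response.status === 'SUCCESS') return { done: true, failed: false };
  if (response.status === 'FAILED')  return { done: true, failed: true };
  return { done: false, failed: false }; // still processing: loop back to Wait
}
```

Cap the loop (say, 20 iterations) so a stuck job can't hold the whole workflow open.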

Phase 3: Make unique

Each account needs a distinct variation. Generate unique parameters based on the account name for deterministic but varied results:

Node 9: Generate unique params (Code node)

const item = $input.first().json;
const baseUrl = item.output_files.out_video.storage_url;

// Deterministic randomization based on account name
const seed = item.account.split('').reduce((acc, c) => acc + c.charCodeAt(0), 0);
const crop = 2 + (seed % 8);                                   // crop 2-9 px per edge
const brightness = ((seed % 5) * 0.01 - 0.02).toFixed(3);      // -0.02 to +0.02
const pitch = (1.0 + ((seed % 3) * 0.005 - 0.005)).toFixed(4); // ±0.5% pitch shift
const crf = 21 + (seed % 4);                                   // CRF 21-24
const noise = 3 + (seed % 4);                                  // noise strength 3-6

return [{
  json: {
    ...item,
    baseUrl,
    crop,
    brightness,
    pitch,
    crf,
    noise
  }
}];

Node 10: Apply uniqueness + watermark (HTTP Request)

Combine uniqueness modifications with account-specific watermark in a single FFmpeg command:

{
  "ffmpeg_command": "-i {{in_video}} -i {{logo}} -filter_complex \"[0:v]crop=iw-{{ $json.crop }}:ih-{{ $json.crop }}:{{ $json.crop }}/2:{{ $json.crop }}/2,scale=1080:1920:force_original_aspect_ratio=decrease,pad=1080:1920:(ow-iw)/2:(oh-ih)/2:black,eq=brightness={{ $json.brightness }},noise=alls={{ $json.noise }}:allf=t[bg];[1:v]scale=100:-1,format=rgba,colorchannelmixer=aa=0.3[wm];[bg][wm]overlay=W-w-20:H-h-20\" -af \"asetrate=44100*{{ $json.pitch }},aresample=44100\" -c:v libx264 -crf {{ $json.crf }} -map_metadata -1 {{out_video}}",
  "input_files": {
    "in_video": "{{ $json.baseUrl }}",
    "logo": "https://example.com/logos/{{ $json.account }}.png"
  },
  "output_files": {
    "out_video": "final_{{ $json.jobId }}.mp4"
  }
}

This single command does everything:

  1. Crops by a unique amount

  2. Rescales to 1080x1920

  3. Adjusts brightness

  4. Adds noise

  5. Overlays account-specific watermark at 30% opacity

  6. Shifts audio pitch

  7. Re-encodes at a unique CRF

  8. Strips metadata

One API call per final video.

Nodes 11-13: Poll for completion

Phase 4: Quality gate

Node 14: Verify output (Code node)

const item = $input.first().json;

const checks = {
  hasOutput: !!item.output_files?.out_video,
  statusOk: item.status === 'SUCCESS'
};

const passed = Object.values(checks).every(v => v);

return [{
  json: {
    ...item,
    qualityChecks: checks,
    passed
  }
}];
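
The gate can go further if the job response carries output metadata. The sketch below adds a duration check against TikTok's 3-second minimum; `metadata.duration_seconds` is an assumed field name, not the documented RenderIO response:

```javascript
// Extended gate sketch with a duration check.
// metadata.duration_seconds is an assumption about the response shape.
function gateChecks(job) {
  const dur = job.metadata && job.metadata.duration_seconds;
  return {
    hasOutput: !!(job.output_files && job.output_files.out_video),
    statusOk: job.status === 'SUCCESS',
    // TikTok rejects clips under 3 s; the upper bound varies by account type.
    durationOk: typeof dur === 'number' && dur >= 3,
  };
}
const gatePassed = job => Object.values(gateChecks(job)).every(Boolean);
```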

Node 15: IF passed

  • True: Continue to distribution

  • False: Log to error sheet, send alert

Phase 5: Distribution

Node 16: Store in distribution queue (Google Sheets or database)

Write to a "Ready to Post" sheet:

| Job ID | Account | Video URL | Title | Date | Posted |
| --- | --- | --- | --- | --- | --- |
| acc1_170... | account_1 | https://storage... | Recipe tip | 2026-02-22 | No |
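
A Code node can shape each passing item into that row before the Sheets append. A sketch — the `storage_url` field is the same assumed RenderIO response shape as in Node 9:

```javascript
// Build one "Ready to Post" row for a processed video.
function toQueueRow(job) {
  return {
    jobId: job.jobId,
    account: job.account,
    videoUrl: job.output_files?.out_video?.storage_url ?? '', // assumed response shape
    title: job.title,
    date: new Date().toISOString().split('T')[0],
    posted: 'No',
  };
}
```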

Node 17: Back to Split in Batches

Loop continues until all items are processed.

Node 18: Summary notification (Slack)

After all batches complete:

Factory run complete:
- 30 videos processed
- 28 passed quality gate
- 2 failed (logged to error sheet)
- Total processing time: 12 minutes
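
A Code node just before the Slack node can assemble that message from the quality-gate output (a sketch; the processing-time figure would come from comparing a timestamp you record at trigger time):

```javascript
// Aggregate per-video results into the Slack summary text.
function buildSummary(items, minutes) {
  const passedCount = items.filter(i => i.passed).length;
  return [
    'Factory run complete:',
    `- ${items.length} videos processed`,
    `- ${passedCount} passed quality gate`,
    `- ${items.length - passedCount} failed (logged to error sheet)`,
    `- Total processing time: ${minutes} minutes`,
  ].join('\n');
}
```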

Optimization: Cache the base resize

If the same source video goes to 10 accounts, don't resize it 10 times. Resize once, then generate 10 variations from the resized base.

Add a deduplication step after Node 3:

// Group by sourceUrl so each source is resized only once
const groups = {};
for (const item of $input.all()) {
  const url = item.json.sourceUrl;
  if (!groups[url]) groups[url] = { title: item.json.title, accounts: [] };
  groups[url].accounts.push(item.json.account);
}

// Output one item per unique source (keep the title for the distribution queue)
return Object.entries(groups).map(([url, group]) => ({
  json: { sourceUrl: url, title: group.title, accounts: group.accounts }
}));

Resize each unique source once. Then expand back to per-account items for the uniqueness step.
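
The fan-out mirrors Node 3, but carries the resized base URL instead of the raw source. A sketch, where `baseUrl` comes from the resize job's response (the same assumed `storage_url` shape as in Node 9):

```javascript
// Expand one resized base into one item per account for the uniqueness step.
function expandBase(group) {
  return group.accounts.map(account => ({
    json: {
      baseUrl: group.baseUrl, // resized output of this unique source
      account,
      jobId: `${account}_${Date.now()}`,
    },
  }));
}
```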

In the 3-source, 10-account example above, that's 3 resize jobs instead of 30. Cutting the redundant resizes typically shaves 60-80% off total processing time when the same video goes to many accounts.

Daily operations

Once the factory is running:

  • Morning: Factory processes overnight content. Review the quality gate failures.

  • Midday: Post scheduled content from the distribution queue.

  • Evening: Add tomorrow's content to the calendar.

The factory handles the tedious middle part. You handle creative selection and posting.

Scaling the factory

| Scale | Videos/Day | Processing Time | RenderIO Plan |
| --- | --- | --- | --- |
| Starter | 10 | ~5 min | Starter |
| Growth | 30 | ~15 min | Pro |
| Scale | 100 | ~45 min | Growth |
| Enterprise | 500+ | ~3 hrs | Enterprise |

Get started

  1. Sign up at renderio.dev

  2. Build Phase 1-2 first (select and resize)

  3. Test with 3 videos and 2 accounts

  4. Add uniqueness and watermarking

  5. Scale to your full account portfolio

The Growth plan at $29/mo covers 1,000 commands. Scale to Business ($99/mo, 20,000 commands) for higher volume. For product-specific content automation, see automating TikTok product content.