Runway output isn't TikTok-ready
Runway Gen-3 Alpha generates impressive video. But the raw output has problems for TikTok:
16:9 aspect ratio instead of TikTok's 9:16
AI metadata embedded in the file
Too-clean visuals with zero sensor noise
720p or 1080p landscape instead of 1080x1920 portrait
Variable audio levels (or no audio at all)
Upload raw Runway output to TikTok and you'll get black bars, low distribution, and potentially an AI content flag. Not what you want after spending generation credits.
The fix is a post-processing pipeline. Five FFmpeg steps, one API call. If you're using Kling instead, the Kling post-processing guide covers its specific artifacts (texture flickering, color drift) with tuned filter values.
The problems with Runway Gen-3 output
Aspect ratio mismatch
Runway Gen-3 outputs at 1280x768 (exactly 5:3) or 1920x1080 (16:9). TikTok displays at 1080x1920 (9:16). The aspect ratio is effectively flipped from wide to tall.
You can't just rotate the video. You need to either crop to the center (losing the sides) or add a blurred background fill.
Metadata exposure
Runway embeds generation metadata in the output file.
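You can see what's in there with ffprobe (assuming the downloaded clip is `input.mp4`):

```shell
# Dump container-level tags (encoder string, creation time, any vendor fields)
ffprobe -v error -show_entries format_tags -of default input.mp4
```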
Some platforms read this. Even if TikTok doesn't actively penalize it today, there's no reason to leave it in.
Visual sterility
Runway video is mathematically perfect. No noise, no grain, no lens imperfections. Human eyes are tuned to detect this. On a platform where 99% of content comes from phone cameras, Runway output looks alien.
Step-by-step FFmpeg pipeline
1. Strip metadata
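A minimal pass with FFmpeg's `-map_metadata -1` flag, assuming the Runway download is `input.mp4`:

```shell
# -map_metadata -1 drops all global metadata; -c copy avoids a re-encode,
# so this step is nearly instant
ffmpeg -i input.mp4 -map_metadata -1 -c copy clean.mp4
```

Any per-stream tags that survive stream copy get rewritten anyway during the final encode in step 5.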
2. Convert to 9:16 with blurred background
The best approach for landscape-to-portrait conversion uses a blurred, zoomed version of the video as the background:
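One way to build that with `-filter_complex` -- the blur strength here is illustrative, not a tuned value:

```shell
# bg: zoom-crop the frame to fill 1080x1920, then blur it
# fg: scale the original to 1080 wide, preserving aspect ratio
# overlay: center the sharp foreground over the blurred background
ffmpeg -i clean.mp4 -filter_complex \
  "[0:v]scale=1080:1920:force_original_aspect_ratio=increase,crop=1080:1920,boxblur=20:5[bg]; \
   [0:v]scale=1080:-2[fg]; \
   [bg][fg]overlay=(W-w)/2:(H-h)/2" \
  -c:a copy vertical.mp4
```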
This creates a full-bleed 1080x1920 video with the original content centered and a blurred version filling the background.
3. Add grain and color correction
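A sketch using the `noise` and `eq` filters -- the saturation and contrast values are reasonable starting points, not Runway-specific tuning:

```shell
# allf=t+u: temporal, uniform noise (re-randomized per frame, like sensor grain)
# eq: slight desaturation and contrast bump to mimic phone-camera processing
ffmpeg -i vertical.mp4 \
  -vf "noise=alls=16:allf=t+u,eq=saturation=0.95:contrast=1.05" \
  -c:a copy graded.mp4
```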
noise=alls=16 matches indoor phone camera grain. The slight desaturation and contrast bump simulate a phone camera's processing.
4. Normalize audio
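FFmpeg's `loudnorm` filter handles the variable-level case; -14 LUFS is a common target for short-form platforms:

```shell
# Single-pass loudnorm is fine for short clips; skip this step
# (or mux in a silent track) if the clip has no audio at all
ffmpeg -i graded.mp4 -af "loudnorm=I=-14:TP=-1:LRA=11" -c:v copy normalized.mp4
```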
5. Optimize for mobile playback
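The final encode, with broadly compatible H.264 settings (the CRF and bitrate values are typical choices, not TikTok requirements):

```shell
# yuv420p + H.264 for maximum device compatibility; faststart for instant playback
ffmpeg -i normalized.mp4 \
  -c:v libx264 -preset medium -crf 20 -pix_fmt yuv420p \
  -c:a aac -b:a 128k \
  -movflags +faststart final.mp4
```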
faststart moves the moov atom to the beginning so mobile playback starts immediately.
Single-command version
All five steps combined:
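A single invocation chaining the same filters, with the same caveats on the illustrative values:

```shell
ffmpeg -i input.mp4 -map_metadata -1 -filter_complex \
  "[0:v]scale=1080:1920:force_original_aspect_ratio=increase,crop=1080:1920,boxblur=20:5[bg]; \
   [0:v]scale=1080:-2[fg]; \
   [bg][fg]overlay=(W-w)/2:(H-h)/2,noise=alls=16:allf=t+u,eq=saturation=0.95:contrast=1.05[v]" \
  -map "[v]" -map "0:a?" -af "loudnorm=I=-14:TP=-1:LRA=11" \
  -c:v libx264 -preset medium -crf 20 -pix_fmt yuv420p \
  -c:a aac -b:a 128k -movflags +faststart tiktok_ready.mp4
```

The `0:a?` map makes the audio stream optional, so the command still succeeds on clips Runway generated without sound.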
Batch process with RenderIO API
One video is fine locally. But if you're generating 20-50 Runway clips per day for a content operation, you need automation.
Processing 100 videos in parallel
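A sketch of what the fan-out can look like. The endpoint URL, payload fields, and preset name here are illustrative assumptions, not RenderIO's documented API -- check the actual API reference for the real shapes:

```shell
# Submit one processing job per clip; each runs independently server-side
for clip in runway_clips/*.mp4; do
  curl -s -X POST "https://api.renderio.example/v1/jobs" \
    -H "Authorization: Bearer $RENDERIO_API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"input_url\": \"https://cdn.example.com/$(basename "$clip")\",
         \"preset\": \"tiktok-vertical\"}" &
done
wait  # all submissions run concurrently
```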
Each job runs independently on Cloudflare's edge network. 100 videos finish in roughly the time it takes to process 1-2 locally.
Set up webhooks for completion
Instead of polling, configure a webhook to get notified when processing finishes:
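Something along these lines -- again, the endpoint and event names are assumptions rather than RenderIO's documented API:

```shell
# Register a callback URL for job lifecycle events
curl -s -X POST "https://api.renderio.example/v1/webhooks" \
  -H "Authorization: Bearer $RENDERIO_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://yourapp.example.com/hooks/renderio",
       "events": ["job.completed", "job.failed"]}'
```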
RenderIO supports webhook configuration with automatic retries and a dead letter queue. No lost notifications.
Cost comparison
| Approach | Volume | Monthly cost | Time |
| --- | --- | --- | --- |
| Manual in Premiere | 100 videos/day | $0 (your time) | 8+ hours |
| Local FFmpeg script | 100 videos/day | $0 (your CPU) | 2-3 hours |
| RenderIO API | 100 videos/day | $49/mo (Pro plan, 5,000 commands) | Minutes |
With the Business plan, per-command pricing drops to $0.005. For smaller volumes, the Growth plan at $29/mo covers 1,000 commands -- enough to process hundreds of Runway outputs daily.