A series of generative environment tests exploring how ComfyUI pipelines, multi-model orchestration, and structured evaluation can produce cinematic frames that hold up to feature film standards. Built to test reproducibility, art direction, and integration with traditional CG workflows rather than chase one-off results.

The Pipeline

The node graph is not a single workflow. It is a modular look-development system built to explore cinematic worldbuilding at a scale traditional shot-by-shot iteration rarely allows.


The project began as a Younger Dryas film-trailer concept: an ice age world of glacial valleys, burning forests, megafauna, and human survival at the edge of extinction.


Rather than generate isolated images, the pipeline was structured so environmental conditions, atmosphere, lighting logic, and compositional framing existed as independent randomized branches. These systems recombined into structured prompt variations while maintaining a consistent cinematic throughline.
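The branch-recombination idea can be sketched outside the node graph as a few independent option pools sampled per seed and joined into one compound prompt. The pool names and contents below are illustrative assumptions, not the actual branch contents of the ComfyUI graph:

```python
import random

# Hypothetical branch pools -- stand-ins for the independent randomized
# branches in the real ComfyUI graph (names and options are assumptions).
BRANCHES = {
    "environment": ["glacial valley", "burning pine forest", "tundra ridge"],
    "atmosphere":  ["dense ice fog", "ash-laden haze", "clear arctic air"],
    "lighting":    ["low golden sun", "overcast diffuse light", "firelight glow"],
    "framing":     ["wide establishing shot", "low-angle telephoto", "aerial view"],
}

# Constant throughline shared by every frame.
BASE = "cinematic film still, Younger Dryas ice age"

def build_prompt(seed: int) -> str:
    """Sample one option per branch and assemble a compound prompt.

    Each seed deterministically maps to one recombination, so any
    frame can be regenerated exactly from its seed.
    """
    rng = random.Random(seed)
    picks = [rng.choice(options) for options in BRANCHES.values()]
    return ", ".join([BASE, *picks])

print(build_prompt(42))
```

Because each branch is sampled independently, adding options to one pool multiplies the variation space without disturbing the shared cinematic base.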


The system operated autonomously, generating large batches of candidate frames for review and refinement. Selected images moved through a separate upscale and finishing pipeline designed for cinematic continuity, reproducibility, and integration with traditional CG workflows.


The result is a scalable environment look-development process focused less on one-off imagery and more on structured cinematic exploration.

Autonomous look-development pipeline. A master prompt system built in ComfyUI using conditional logic and seed-triggered variable groupings to generate structured variation at scale. Environmental conditions, lighting states, atmospheric density, and compositional framing were each governed by independent randomization branches that assembled into coherent compound prompts at runtime. The pipeline ran overnight, generating thousands of cinematic frames across a broad visual world while maintaining a narrative and aesthetic throughline. Output was reviewed in bulk, and hero candidates were pushed through a separate refinement and upscaling pass.
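One way to make seed-triggered groupings reproducible is to derive an independent sub-seed per branch from the master seed, so each branch's draw is stable even as other pools change. This is a sketch of that idea, not the graph's actual wiring; the pool names are assumptions:

```python
import hashlib
import random

def branch_rng(master_seed: int, branch_name: str) -> random.Random:
    """Derive an independent, reproducible RNG for one branch by hashing
    the master seed together with the branch name."""
    digest = hashlib.sha256(f"{master_seed}:{branch_name}".encode()).digest()
    return random.Random(int.from_bytes(digest[:8], "big"))

# Hypothetical pools; the real groupings live inside the ComfyUI graph.
POOLS = {
    "lighting":   ["low sun", "overcast", "firelight"],
    "atmosphere": ["ice fog", "ash haze", "clear air"],
}

def sample_frame_variables(master_seed: int) -> dict:
    """One deterministic draw per branch for a given master seed."""
    return {name: branch_rng(master_seed, name).choice(pool)
            for name, pool in POOLS.items()}

print(sample_frame_variables(1234))
```

With this layout, rerunning a master seed reproduces the exact frame variables, which is what makes overnight batches auditable after the fact.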

I used flux_krea first because, in my view, it produces more creative results, then ran the output through a flux_fp16 pass with the KodakPorta800 LoRA to force photorealism. Random seed generators introduced fluctuation across results during the look-dev phase.