Spatio-temporal filtered motion DAGs for path-tracing
Motion blur is an important effect in photo-realistic rendering. Distribution ray tracing can simulate motion blur well by integrating light over both the spatial and the temporal domain. However, extending the problem into the temporal dimension entails many challenges, particularly in cinematic multi-bounce path tracing of complex scenes, where heavy-weight geometry with complex lighting and even off-screen elements contribute to the final image. In this paper, we propose the Motion DAG (Directed Acyclic Graph), a novel data structure that filters an entire animation sequence of an object in both the spatial and the temporal domain. A Motion DAG interleaves a temporal interval binary tree, which filters temporally consecutive data, with a sparse voxel octree (SVO), which simplifies spatially nearby data. Motion DAGs are generated in a pre-process and can easily be integrated into a conventional physically based path tracer. Our technique targets motion blur of small objects, for which coarse representations are sufficient; in this scenario, our results show that both memory consumption and render time can be reduced significantly.
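To make the interleaving of the two hierarchies concrete, the following is a minimal, hypothetical sketch of such a structure: a temporal interval binary tree over the shutter interval whose leaves each hold the root of a sparse voxel octree valid for that sub-interval. All names, the node layout, and the stored quantities (here a single filtered occupancy value) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a Motion DAG: a temporal interval binary tree whose
# leaves hold sparse voxel octree (SVO) roots. Node layout and stored data
# are illustrative assumptions; the real structure also shares identical
# subtrees (hence a DAG) and stores filtered appearance data.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SVONode:
    """Spatial node: up to 8 children; leaves hold filtered voxel data."""
    children: List[Optional["SVONode"]] = field(default_factory=lambda: [None] * 8)
    occupancy: float = 0.0  # filtered coverage of this voxel (placeholder)

@dataclass
class TemporalNode:
    """Temporal node: covers the time interval [t0, t1), split at the midpoint."""
    t0: float
    t1: float
    left: Optional["TemporalNode"] = None   # covers [t0, mid)
    right: Optional["TemporalNode"] = None  # covers [mid, t1)
    svo: Optional[SVONode] = None           # spatial hierarchy for a leaf interval

def build(t0: float, t1: float, depth: int) -> TemporalNode:
    """Recursively build the interval tree; each temporal leaf gets an SVO root."""
    node = TemporalNode(t0, t1)
    if depth == 0:
        node.svo = SVONode()
        return node
    mid = 0.5 * (t0 + t1)
    node.left = build(t0, mid, depth - 1)
    node.right = build(mid, t1, depth - 1)
    return node

def lookup(node: TemporalNode, t: float) -> SVONode:
    """Descend to the temporal leaf whose interval contains time t."""
    while node.svo is None:
        mid = 0.5 * (node.t0 + node.t1)
        node = node.left if t < mid else node.right
    return node.svo

root = build(0.0, 1.0, depth=3)  # 8 temporal leaves over the shutter interval
svo = lookup(root, 0.7)          # SVO valid for the sub-interval containing t=0.7
```

During rendering, a path tracer would sample a time t for each ray, look up the corresponding temporal leaf, and then traverse its SVO instead of the full animated geometry.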