RotoTexture: Automated Tools for Texturing Raw Video

Hui Fang and John C. Hart. IEEE Transactions on Visualization and Computer Graphics 12(6), Nov. 2006, pp. 1580-1589.

We propose a video editing system that allows a user to apply a time-coherent texture to a surface depicted in the raw video from a single uncalibrated camera, supporting both the mapping of a texture image onto the surface and the synthesis of surface texture from a texture swatch. Our system avoids the construction of a 3-D shape model and instead uses a recovered normal field to deform the texture so that it plausibly adheres to the undulations of the depicted surface. The texture mapping method uses non-linear least-squares optimization of a spring model to control the behavior of the texture image as it is deformed to match the evolving normal field through the video. The texture synthesis method uses a coarse optical flow to advect clusters of pixels corresponding to patches of similarly oriented surface points. These clusters are organized into a minimum advection tree to account for the dynamic visibility of clusters. We take a rather crude approach to normal recovery and optical flow estimation, yet the results are robust and plausible for nearly diffuse surfaces such as faces and t-shirts.
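The spring-model idea behind the texture mapping can be illustrated in miniature. The sketch below is a hypothetical 1-D analogue, not the paper's implementation: each texture segment's rest length is the texel spacing foreshortened by the surface tilt (the cosine of the angle between the recovered normal and the view direction), and vertex positions are relaxed by gradient descent on the quadratic spring energy. The function name and parameters are illustrative assumptions.

```python
import math

def deform_strip(cos_tilts, spacing=1.0, iters=1000, lr=0.1):
    """Hypothetical 1-D sketch of a spring-model texture deformation.

    Each segment's rest length is the texture spacing foreshortened by
    the surface tilt (cos_tilts[i] = cosine of the angle between the
    segment's normal and the view direction). Vertex positions are
    relaxed by gradient descent on the spring energy
        E = sum_i (len_i - rest_i)^2,
    a stand-in for the non-linear least-squares solve in the paper.
    """
    rest = [spacing * max(c, 1e-3) for c in cos_tilts]  # foreshortened rest lengths
    x = [spacing * i for i in range(len(rest) + 1)]     # initial uniform positions
    for _ in range(iters):
        grad = [0.0] * len(x)
        for i, r in enumerate(rest):
            g = 2.0 * ((x[i + 1] - x[i]) - r)  # dE/d(len_i)
            grad[i] -= g
            grad[i + 1] += g
        for i in range(1, len(x)):  # pin the first vertex in place
            x[i] -= lr * grad[i]
    return x

# A flat region (cos tilt = 1) followed by a tilted region (cos tilt = 0.5):
# the texture compresses over the tilted segments.
positions = deform_strip([1.0, 1.0, 0.5, 0.5])
```

In 2-D the same energy is summed over a grid of springs and minimized per frame, which is what lets the texture stretch and compress with the surface instead of sliding over it.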

rototexture.pdf (8.17 MB)
rototexture.mpg (25.65 MB)
