Abstract

The Bidirectional Texture Function (BTF) is becoming widely used for accurate representation of real-world material appearance. In this paper a novel BTF compression model is proposed. The model resamples input BTF data into a parametrization that allows the decomposition of individual view- and illumination-dependent texels into a set of multidimensional conditional probability density functions. These functions are in turn compressed using a novel multi-level vector quantization algorithm, which produces a set of index and scale code-books for the individual dimensions. BTF reconstruction from the model is then based on fast chained indexing into the nested stored code-books. In the proposed model, luminance and chromaticity are treated separately to achieve further compression. The model achieves low distortion at compression ratios from 1:233 to 1:2040, depending on the variability of the BTF sample. These results compare well with several other BTF compression methods, whose predefined compression ratios are usually below 1:200. We carried out a psychophysical experiment comparing our method with the LPCA method. BTF synthesis from the model was implemented on a standard GPU and yields interactive framerates. The proposed method also allows the fast importance sampling required by eye-path tracing algorithms in image synthesis.
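The chained-indexing reconstruction described above can be pictured roughly as follows. This is a minimal two-level sketch with hypothetical array names and shapes, not the paper's actual data layout or code-book sizes: a top-level index selects a level-1 entry whose stored indices and scale factors then select and weight vectors from a level-2 code-book.

```python
import numpy as np

# Hypothetical two-level code-book hierarchy (names and shapes are illustrative).
rng = np.random.default_rng(0)
level2_codebook = rng.random((256, 8))            # 256 code vectors of length 8
level1_indices  = rng.integers(0, 256, (64, 4))   # each level-1 entry chains to 4 level-2 entries
level1_scales   = rng.random((64, 4))             # per-chain scale factors

def reconstruct(top_index: int) -> np.ndarray:
    """Reconstruct one view/illumination-dependent texel slice by chained indexing:
    the top-level index selects a level-1 entry, whose stored indices and scales
    pick and weight level-2 code vectors."""
    idx = level1_indices[top_index]      # indices into the level-2 code-book
    scale = level1_scales[top_index]     # matching scale factors
    # Gather and scale the referenced code vectors, then flatten them
    # into the reconstructed reflectance values.
    return (level2_codebook[idx] * scale[:, None]).reshape(-1)

# Usage: decode one texel slice from its top-level index.
sample = reconstruct(top_index=17)
print(sample.shape)  # (32,) -- 4 chained entries x 8 values each
```

Because decoding is only a handful of array lookups and multiplications per texel, a scheme of this kind maps naturally onto GPU texture fetches, which is consistent with the interactive framerates reported above.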