Book |
|
Although the field of texture processing is now well established, research in this area remains largely restricted to texture analysis and to simple, approximate static textures.
This comprehensive text/reference presents a survey of the state of the art in multidimensional, physically correct visual texture modeling. Starting from basic principles and building upon the fundamentals to the latest advanced methods, the book brings together research from computer vision, pattern recognition, computer graphics, and virtual and augmented reality. The text assumes a graduate-level understanding of statistics and probability theory, and a knowledge of basic computer graphics principles, but is accessible to newcomers to the field. Researchers, lecturers, students and practitioners will all find this book an invaluable reference on the rapidly developing new field of texture modeling.
Topics and features:
Reviews the entire process of texture synthesis, including material appearance representation, measurement, analysis, compression, modeling, editing, visualization, and perceptual evaluation
Explains the derivation of the most common representations of visual texture, discussing their properties, advantages, and limitations
Describes a range of techniques for the measurement of visual texture, including BRDF, SVBRDF, BTF and BSSRDF
Investigates the visualization of textural information, from texture mapping and mip-mapping to illumination- and view-dependent data interpolation
Examines techniques for perceptual validation and analysis, covering both standard pixel-wise similarity measures and also methods of visual psychophysics
Reviews the applications of visual textures, from visual scene analysis in image processing and medical applications, to high-quality visualizations for cultural heritage and the automotive industry.
|
@book{haindl_filip13visual,
title = {Visual Texture},
author = {Haindl, M. and Filip, J.},
series = {Advances in Computer Vision and Pattern Recognition},
publisher={Springer-Verlag},
address = {London},
isbn = {978-1-4471-4901-9},
year={2013},
pages = {284}}
|
|
Preprints |
|
Filip J., Dechterenko F., Schmidt F., Lukavsky J., Vilimovska V., Kotera J., Fleming R. W.:
Material Fingerprinting: Identifying and Predicting Perceptual Attributes of Material Appearance
,
arXiv 2410.13615, October 2024
[bib]
[preprint]
|
Materials exhibit an extraordinary range of visual appearances. Characterising and quantifying appearance is important not only for basic research on perceptual mechanisms, but also for computer graphics and a wide range of industrial applications. While methods exist for capturing and representing the optical properties of materials and how they vary across surfaces (Haindl & Filip, 2013), the representations are typically very high-dimensional, and how these representations relate to subjective perceptual impressions of material appearance remains poorly understood. Here, we used a data-driven approach to characterising the perceived appearance characteristics of 30 samples of wood veneer using a ‘visual fingerprint’ that describes each sample as a multidimensional feature vector, with each dimension capturing a different aspect of the appearance. Fifty-six crowd-sourced participants viewed triplets of movies depicting different wood samples as the sample rotated. Their task was to report which of the two match samples was subjectively most similar to the test sample. In another online experiment, 45 participants rated ten wood-related appearance characteristics for each of the samples. The results reveal a consistent embedding of the samples across both experiments, and a set of nine perceptual dimensions capturing aspects including the roughness, directionality and spatial scale of the surface patterns. We also showed that a weighted linear combination of eleven image statistics, inspired by the rating characteristics, predicts the perceptual dimensions well.
|
@misc{filip24material,
title={Material Fingerprinting: Identifying and Predicting Perceptual Attributes
of Material Appearance},
author={Jiri Filip and Filip Dechterenko and Filipp Schmidt and Jiri Lukavsky
and Veronika Vilimovska and Jan Kotera and Roland W. Fleming},
year={2024},
eprint={2410.13615},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2410.13615},
}
|
Journal Papers (impacted) |
|
Filip J., Lukavsky J., Dechterenko F., Schmidt F., Fleming, R. W.:
Perceptual Dimensions of Wood Materials
,
Journal of Vision, Vol.24, Number 5, May 2024, pages 12-12.
[bib]
[pdf]
[preprint + data]
|
Materials exhibit an extraordinary range of visual appearances. Characterising and quantifying appearance is important not only for basic research on perceptual mechanisms, but also for computer graphics and a wide range of industrial applications. While methods exist for capturing and representing the optical properties of materials and how they vary across surfaces (Haindl & Filip, 2013), the representations are typically very high-dimensional, and how these representations relate to subjective perceptual impressions of material appearance remains poorly understood. Here, we used a data-driven approach to characterising the perceived appearance characteristics of 30 samples of wood veneer using a ‘visual fingerprint’ that describes each sample as a multidimensional feature vector, with each dimension capturing a different aspect of the appearance. Fifty-six crowd-sourced participants viewed triplets of movies depicting different wood samples as the sample rotated. Their task was to report which of the two match samples was subjectively most similar to the test sample. In another online experiment, 45 participants rated ten wood-related appearance characteristics for each of the samples. The results reveal a consistent embedding of the samples across both experiments, and a set of nine perceptual dimensions capturing aspects including the roughness, directionality and spatial scale of the surface patterns. We also showed that a weighted linear combination of eleven image statistics, inspired by the rating characteristics, predicts the perceptual dimensions well.
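The final step described above, a weighted linear combination of image statistics predicting perceptual dimensions, amounts to an ordinary least-squares fit. A minimal sketch, assuming a hypothetical feature matrix `stats` (the paper's eleven statistics are not reproduced here):

```python
import numpy as np

def fit_statistic_weights(stats, ratings):
    """Least-squares weights mapping per-sample image statistics
    (n_samples x n_statistics) to one perceptual dimension."""
    X = np.column_stack([stats, np.ones(len(stats))])  # append intercept column
    w, *_ = np.linalg.lstsq(X, ratings, rcond=None)
    return w

def predict_dimension(stats, w):
    """Predicted perceptual-dimension values for a set of samples."""
    X = np.column_stack([stats, np.ones(len(stats))])
    return X @ w
```

With 30 samples and eleven statistics the system is comfortably overdetermined, so a plain least-squares solve suffices; regularization would only become necessary with many more statistics than samples.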
|
@article{filip24perceptual,
title = {Perceptual Dimensions of Wood Materials},
author = {Filip, J. and Lukavsk{\'y}, J. and D{\v e}cht{\v e}renko, F.
and Schmidt, F. and Fleming, R. W.},
journal = {Journal of Vision},
volume = {24},
number = {5},
month = {May},
year = {2024},
publisher = {Association for Research in Vision and Ophthalmology},
DOI = {10.1167/jov.24.5.12},
ISSN = {1534-7362},
pages = {12-12},
url = {https://doi.org/10.1167/jov.24.5.12},
eprint = {https://arvojournals.org/arvo/content\_public/journal/jov/938673/i1534-7362-24-5-12\_1716555985.57742.pdf}
}
|
|
Filip J., Vilímovská, V.:
Characterization of Wood Materials Using Perception-Related Image Statistics.
Journal of Imaging Science and Technology, 67(5), 2023, pp. 1-9
[bib]
[pdf]
|
An efficient computational characterization of real-world materials is one of the challenges in image understanding. An automatic assessment of materials, with performance similar to that of a human observer, usually relies on complicated image filtering derived from models of human perception. However, these models become too complicated when a real material is observed in the form of dynamic stimuli. This study tackles the challenge from the other side. First, we collected human ratings of the most common visual attributes for videos of wood samples and analyzed their relationship to selected image statistics. In our experiments on a set of sixty wood samples, we found that such image statistics can perform surprisingly well in the discrimination of individual samples, with reasonable correlation to human ratings. We also showed that these statistics can be effective in the discrimination of images of the same material taken under different illumination and viewing conditions.
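As an illustration of the kind of statistics involved, the sketch below computes three simple candidates from a grayscale image: mean lightness, contrast, and a gradient-based directionality index. The statistics actually used in the study differ and are not reproduced here.

```python
import numpy as np

def wood_image_statistics(img):
    """Illustrative perception-related image statistics (not the paper's
    exact set): mean lightness, contrast, and a directionality index
    computed from gradient orientations."""
    img = np.asarray(img, dtype=float)
    gy, gx = np.gradient(img)
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)
    # Directionality: magnitude-weighted concentration of gradient
    # orientation; angles are doubled so opposite directions coincide.
    w = mag.sum() + 1e-12
    c = (mag * np.cos(2 * ang)).sum() / w
    s = (mag * np.sin(2 * ang)).sum() / w
    return {
        "mean": float(img.mean()),
        "contrast": float(img.std()),
        "directionality": float(np.hypot(c, s)),  # 0 = isotropic, 1 = one dominant orientation
    }
```

A strongly striped texture (e.g. straight wood grain) yields a directionality index near 1, while isotropic noise yields a value near 0, which is the kind of distinction the rated attributes capture.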
|
@article{filip23characterization,
title = {Characterization of Wood Materials Using Perception-Related Image Statistics},
author = {Filip, J. and Vilimovsk{\'a}, V.},
journal = {Journal of Imaging Science and Technology},
volume = {67},
number = {5},
year = {2023},
publisher = {Society for Imaging Science and Technology},
DOI = {10.2352/J.ImagingSci.Technol.2023.67.5.050408},
pages = {1--9}
}
|
|
Filip J., Vávra R., Maile, F. J.:
Waviness analysis of glossy surfaces based on deformation of a light source reflection.
Journal of Coatings Technology and Research, Volume 20, 2023, pp. 1703-1712
[bib]
[pdf]
|
The evaluation of waviness, also known as orange peel, is essential for the quality control of materials in industrial fields working with high-gloss materials, e.g., coatings, automotive, and metal fabrication. This paper presents an affordable non-contact method for waviness analysis based on a single image of the light source reflected from the surface under study. The spatial perturbations along the contour of the light source reflection are compared to the ideal contour and analyzed in the Fourier domain to obtain standard features, which have been compared to a commercial ripple characterization device. Three additional method-specific features are proposed and evaluated. Our method has been tested on a set of ten orange peel standards, ten effect and three solid coating samples, and shows promising performance in the waviness characterization of glossy surfaces.
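The core of the method, deviations of the reflection contour from an ideal contour analyzed in the Fourier domain, can be sketched as follows for a circular light source. The function name and the feature definition are illustrative assumptions, not the paper's.

```python
import numpy as np

def contour_waviness_spectrum(radii, ideal_radius):
    """Spectrum of contour deviations sampled at uniform angles around
    the reflection: entry k is proportional to the amplitude of the
    deviation component with k cycles per revolution."""
    deviation = np.asarray(radii, dtype=float) - ideal_radius
    return np.abs(np.fft.rfft(deviation)) / len(deviation)
```

A smooth surface gives a near-flat spectrum; orange peel shows up as energy concentrated at the spatial frequencies of the surface waviness.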
|
@article{filip23waviness,
title = {Waviness analysis of glossy surfaces based on deformation of
a light source reflection},
author = {Filip, J. and V{\'a}vra, R. and Maile, F. J.},
journal = {Journal of Coatings Technology and Research},
volume = {20},
year = {2023},
publisher = {Springer},
DOI = {10.1007/s11998-023-00775-6},
pages = {1703--1712}
}
|
|
Filip J., Vávra R., Kolafová M., Maile, F. J.:
Assessment of sparkle and graininess in effect coatings using a high-resolution gonioreflectometer and psychophysical studies.
Journal of Coatings Technology and Research, Volume 18, Issue 6, 2021, pp. 1511-1530
[bib]
[pdf]
|
The aim of this article is to propose a model to automatically predict visual judgement of sparkle and graininess of special effect pigments used in industrial coatings.
Many applications in the paint and coatings, printing, and plastics industries rely on multi-angle color measurements with the aim of properly characterizing the appearance, i.e., the color and texture of the manufactured surfaces. However, when it comes to surfaces containing effect pigments, these methods are in many cases insufficient, and texture characterization methods in particular are needed. There are two attributes related to texture that are commonly used: (1) diffuse coarseness or graininess and (2) sparkle or glint impression.
In this paper, we analyzed visual perception of both texture attributes using two different psychophysical studies of 38 samples painted with effect coatings including different effect pigments, and 31 test persons.
Our previous work has shown a good agreement between a study using physical samples with one that uses high-resolution photographs of these sample surfaces.
We have also compared the perceived (1) graininess and (2) sparkle with the performance of two commercial instruments that are capable of capturing both attributes. Results have shown a good correlation between the instruments' readings and the psychophysical studies.
Finally, we implemented computational models predicting these texture attributes that have a high correlation with the instrument readings as well as the psychophysical data. By linear scaling of the predicted data using instruments readings, one can use the proposed model for the prediction of graininess and both static and dynamic sparkle values.
|
@article{filip21assessment,
title = {Assessment of sparkle and graininess in effect coatings using
a high-resolution gonioreflectometer and psychophysical studies},
author = {Filip, J. and V{\'a}vra, R. and Kolafov{\'a}, M. and Maile, F. J.},
journal = {Journal of Coatings Technology and Research},
volume = {18},
number = {6},
year = {2021},
publisher = {Springer},
DOI = {10.1007/s11998-021-00518-5},
pages = {1511--1530}
}
|
|
Filip J., Vavra R., Maile F. J., Kolafova M.:
Framework for Capturing and Editing of Anisotropic Effect Coatings, Computer Graphics Forum, Volume 40 Issue 1, 2020, pp.68-80
[bib]
[pdf]
Wiley - top cited article 2021-2022
|
Coatings are used today for products ranging from automotive production to electronics and everyday items. Product design is taking on an increasingly important role, where effect pigments come to the fore, offering a coated surface extra optical characteristics. Individual effect pigments have strong anisotropic, azimuthally dependent behavior, typically suppressed by the coating application process, which randomly orients pigment particles, resulting in an isotropic appearance. One exception is a pigment that allows control of the azimuthal orientation of flakes using a magnetic field. We investigate visual texture effects due to such an orientation in a framework allowing efficient capturing, modelling and editing of its appearance. We captured spatially-varying BRDFs of four coatings containing magnetic effect pigments. As per-pixel non-linear fitting cannot preserve coating sparkle effects, we suggest a novel method of anisotropy modelling based on image shifting in an angular domain. The model can be utilized for a fast transfer of desired anisotropy to any isotropic effect coating, while preserving important spatially-varying visual features of the original coating. The anisotropic behavior was fitted by a parametric model allowing for editing of the coating appearance. This framework allows exploration of anisotropic effect coatings and their appearance transfer to standard effect coatings in a virtual environment.
|
@article{filip20framework,
title = {Framework for Capturing and Editing of Anisotropic Effect Coatings},
author = {Filip, J. and Vavra, R. and Maile, F.J. and Kolafova, M.},
journal = {Computer Graphics Forum},
volume = {40},
number = {1},
year = {2020},
publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
DOI = {10.1111/cgf.14119},
pages = {68--80}
}
|
|
Filip J., Vávra R.:
Image-based Appearance Acquisition of Effect Coatings.
Computational Visual Media, Volume 5, Issue 1, pp 73–89, March 2019
[bib]
[pdf]
|
Paint manufacturers strive to introduce unique visual effects to coatings in order to visually communicate functional properties of products using value-added, customized design. However, these effects often feature complex angularly dependent spatially-varying behavior, thus representing a challenge in digital reproduction.
In this paper we analyze several approaches to capturing spatially-varying appearance of effect coatings.
We compare a baseline approach based on bidirectional texture function (BTF) with four variants of half-difference parameterization. Through a psychophysical study we identify minimal sampling along individual dimensions of this parametrization.
We conclude that bivariate representations preserve the visual fidelity of effect coatings while, in contrast to BTF, better characterizing near-specular behavior and significantly reducing the number of captured images.
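The half-difference parameterization referred to above (commonly attributed to Rusinkiewicz) re-expresses an illumination/view pair through the half vector; a bivariate representation then keeps only the two elevation angles. A minimal sketch, assuming unit direction vectors with the surface normal along +z:

```python
import numpy as np

def _normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def half_diff_angles(wi, wo):
    """Elevation angles of the half-difference parameterization:
    theta_h is the half-vector elevation above the normal (+z),
    theta_d the angle between the incoming direction and the half vector."""
    wi, wo = _normalize(wi), _normalize(wo)
    h = _normalize(wi + wo)
    theta_h = np.arccos(np.clip(h[2], -1.0, 1.0))
    theta_d = np.arccos(np.clip(np.dot(wi, h), -1.0, 1.0))
    return float(theta_h), float(theta_d)
```

For mirror geometries theta_h is zero regardless of elevation, which is why a dense sampling of theta_h near zero captures the near-specular behavior that matters for effect coatings.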
|
@article{filip19effect,
title = {{Image-based Appearance Acquisition of Effect Coatings}},
author = {Filip, J. and V{\'a}vra, R.},
journal = {Computational Visual Media},
volume = {5},
number = {1},
year = {2019},
publisher = {Springer},
DOI = {10.1007/s41095-019-0134-3},
pages = {73--89}
}
|
|
Filip J., Kolafová M.:
Perceptual Attributes Analysis of Real-World Materials.
ACM Transactions on Applied Perception, Volume 16, Issue 1, January 2019
[bib]
[pdf]
[appendix]
[issue cover]
|
Material appearance is often represented by a bidirectional reflectance distribution function (BRDF). Although the concept of the BRDF is widely used in computer graphics and related applications, the number of actual captured BRDFs is limited due to a time and resources demanding measurement process. Several BRDF databases have already been provided publicly, yet subjective properties of underlying captured material samples, apart from single photographs, remain unavailable for users. In this paper we analyzed material samples, used in the creation of the UTIA BRDF database, in a psychophysical study with nine subjects and assessed its twelve visual, tactile, and subjective attributes. Further, we evaluated the relationship between the attributes and six material categories. We consider the presented perceptual analysis as valuable and complementary information to the database; that could aid users to select appropriate materials for their applications.
|
@article{filip18perceptual,
title = {Perceptual Attributes Analysis of Real-World Materials},
author = {Filip, J. and Kolafova, M.},
journal = {ACM Transactions on Applied Perception},
volume = {16},
number = {1},
year = {2018},
publisher = {ACM},
url = {https://dl.acm.org/citation.cfm?id=3301412&picked=formats&preflayout=flat}
}
|
|
Vávra R., Filip J.:
Adaptive slices for acquisition of anisotropic BRDF.
Computational Visual Media, 4(1), 2018
[bib]
[pdf]
|
BRDF continues to be used as a fundamental tool for representing material appearance in computer graphics.
In this paper we present a practical adaptive method for acquisition of the anisotropic BRDF. It is based on a sparse adaptive measurement of the complete four-dimensional BRDF space by means of one-dimensional slices which form a sparse four-dimensional structure in the BRDF space and which can be measured by continuous movements of a light source and a sensor. Such a sampling approach is advantageous especially for gonioreflectometer-based measurement devices where the mechanical travel of a light source and a sensor creates a significant time constraint.
In order to evaluate our method, we perform adaptive measurements of three materials and we simulate adaptive measurements of thirteen others.
We achieve a four-times lower reconstruction error in comparison with regular non-adaptive BRDF measurements, given the same number of measured samples.
Our method is almost twice as accurate as a previous adaptive method, and it requires two to five times fewer samples to achieve the same results as alternative approaches.
|
@article{vavra18adaptive,
title = {{Adaptive slices for acquisition of anisotropic BRDF}},
author = {V{\'a}vra, R. and Filip, J.},
journal = {Computational Visual Media},
volume = {4},
number = {1},
year = {2018},
publisher = {Springer},
DOI = {10.1007/s41095-017-0099-z},
pages = {1--15}
}
|
|
Filip J., Vávra R., Maile, F. J.:
Optical analysis of coatings including diffractive pigments using a high-resolution gonioreflectometer.
Journal of Coatings Technology and Research, Volume 16, Issue 2, 2019, pp. 555-572
[bib]
[pdf]
|
The aim of this article is to demonstrate a new way of measuring and understanding the appearance of pigment flake orientation and texture in special effect pigments for use in industrial coatings.
We have used diffractive pigments and analyzed the relative orientation of the particles in the coating layers by evaluating their behavior in two common industry applications – solventborne and powder coatings. We have measured the interference color by taking readings with a high-resolution gonioreflectometer, in order to test the viability of automatic diffractive pigment evaluation. The results were analyzed using both psychophysical (i.e. human) and computational (i.e. mechanical) methods.
Our subsequent psychophysical and computational analysis of the visual differences that diffractive pigments present in both solventborne (1) and powder coating (2) systems, for in-plane and out-of-plane geometries, revealed that solventborne liquid paint systems better preserve the appearance of the original diffraction gratings. This is due to the enhanced orientation of the anisotropic pigment particles. The powder coating surfaces investigated, on the other hand, preserved higher intensity, and thus visibility, of randomly oriented solitary flakes, creating a greater sparkle contrast. We confirmed our findings by capturing and visualizing coating appearance by means of a bidirectional texture function. We then compared the diffractive pigment evaluation results with the readings of other state-of-the-art measuring devices.
We believe that our work provides valuable information on flake orientation, and also compares pigment performance in a range of industrial coating systems, which may enable industrial companies to improve paint spraying processes.
|
@article{filip18optical,
title = {Optical analysis of coatings including diffractive pigments using
a high-resolution gonioreflectometer},
author = {Filip, J. and V{\'a}vra, R. and Maile, F. J.},
journal = {Journal of Coatings Technology and Research},
volume = {16},
number = {2},
year = {2019},
publisher = {Springer},
DOI = {10.1007/s11998-018-0137-5},
pages = {555--572}
}
|
|
Filip J., Kolafová M., Havlíček M., Vávra R., Haindl M., Rushmeier H.:
Evaluating Physical and Rendered Material Appearance.
The Visual Computer (Proceedings of Computer Graphics International - CGI), 34(6-8), pp.805-816, 2018
[bib]
[pdf]
|
Many representations and rendering techniques have been proposed for presenting material appearance in computer graphics. One outstanding problem is evaluating their accuracy. In this paper, we propose assessing accuracy by comparing human judgements of material attributes made when viewing a computer graphics rendering to those made when viewing a physical sample of the same material. We demonstrate this approach using 16 diverse physical material samples distributed to researchers at the MAM 2014 workshop. We performed two psychophysical experiments. In the first experiment we examined how consistently subjects rate a set of twelve visual, tactile and subjective attributes of individual physical material specimens. In the second experiment, we asked subjects to assess the same attributes for identical materials rendered as BTFs under point-light and environment illuminations. By analyzing obtained data, we identified which material attributes and material types are judged consistently and to what extent the computer graphics representation conveyed the experience of viewing physical material appearance.
|
@article{filip18evaluating,
title = {{Evaluating Physical and Rendered Material Appearance}},
author = {Filip, J. and Kolafov{\'{a}}, M. and Havl{\'\i}{\v c}ek, M. and
V{\'a}vra, R. and Haindl, M. and Rushmeier, H.},
journal = {The Visual Computer (Computer Graphics International 2018)},
volume = {34},
number = {6-8},
year = {2018},
publisher = {Springer},
DOI = {10.1007/s00371-018-1545-3},
pages = {805--816}
}
|
|
Vavra R., Filip J.:
Minimal Sampling for Effective Acquisition of Anisotropic BRDFs.
Computer Graphics Forum,
(proceedings of Pacific Graphics 2016), pp.299-309, 2016
[bib]
[pdf]
|
BRDFs are commonly used for material appearance representation in applications ranging from gaming and the movie industry to product design and specification. Most applications rely on isotropic BRDFs due to their better availability as a result of their easier acquisition process. Anisotropic BRDFs, on the other hand, are more challenging to measure and process due to their structure-dependent anisotropic highlights. This paper therefore simplifies the measurement of anisotropic BRDFs by representing such a BRDF as a collection of isotropic BRDFs.
Our method relies on an anisotropic BRDF database decomposition into training isotropic slices forming a linear basis, where appropriate sparse samples are identified using numerical optimization.
When an unknown anisotropic BRDF is measured, these samples are repeatedly captured in a small set of azimuthal directions.
All collected samples are then used for an entire measured BRDF reconstruction from a linear isotropic basis.
Typically, fewer than 100 samples are sufficient to capture the main visual features of complex anisotropic materials, and we provide a minimal set of directional samples to be regularly measured at each sample rotation. We conclude that even simple setups relying on five bidirectional samples (a maximum of five stationary sensors/lights) in combination with eight rotations (a rotation stage for the specimen) can yield a promising reconstruction of anisotropic behavior.
Next, we outline an extension of the proposed approach to adaptive sampling of anisotropic BRDFs to gain even better performance.
Finally, we show that our method allows using standard geometries, including industrial multi-angle reflectometers, for the fast measurement of anisotropic BRDFs.
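The reconstruction step, expanding sparse measurements in a linear isotropic basis, reduces to a least-squares fit of basis coefficients on the measured entries, then an expansion to the full BRDF vector. A minimal sketch with a hypothetical precomputed `basis` matrix (the paper's basis comes from a decomposed anisotropic BRDF database):

```python
import numpy as np

def reconstruct_from_basis(basis, sample_idx, sample_values):
    """Fit basis coefficients on the sparsely measured entries, then
    expand to the full vector. `basis` is (n_entries, n_basis);
    `sample_idx` indexes the measured entries of the full vector."""
    coeffs, *_ = np.linalg.lstsq(basis[sample_idx], sample_values, rcond=None)
    return basis @ coeffs
```

This is why very few well-chosen samples suffice: the unknowns are only the basis coefficients, not the full four-dimensional BRDF.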
|
@article{vavra16minimal,
title = {{Minimal Sampling for Effective Acquisition of Anisotropic {BRDF}s}},
author = {Vavra, Radomir and Filip, Jiri},
journal = {Computer Graphics Forum (PACIFIC GRAPHICS 2016)},
volume = {35},
number = {7},
year = {2016},
publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
DOI = {10.1111/cgf.13027},
pages = {299--309}
}
|
|
Havran V., Filip J., Myszkowski K.:
Perceptually Motivated BRDF Comparison using Single Image.
Computer Graphics Forum,
Volume 35, Issue 4, pp.1-12, (proceedings of EGSR 2016), June 2016 (IF 1.642)
[bib]
[pdf]
[data+code]
|
Surface reflectance of real-world materials is now widely represented by the bidirectional reflectance distribution function (BRDF) and also by spatially varying representations such as the SVBRDF and the bidirectional texture function (BTF). The raw surface reflectance measurements are typically compressed or fitted by analytical models, which always introduces a certain loss of accuracy. Evaluating that loss requires a distance function between a reference surface reflectance and its approximate version. Although some past techniques tried to reflect the perceptual sensitivity of human vision, they optimized neither the illumination and viewing conditions nor the surface shape. In this paper, we suggest a new image-based methodology for comparing different anisotropic BRDFs. We use optimization techniques to generate a novel surface which has extensive coverage of incoming and outgoing light directions, while preserving the features and frequencies that are important for material appearance judgments. A single rendered image of such a surface, along with simultaneously optimized lighting and viewing directions, leads to the computation of a meaningful BRDF difference by means of standard image difference predictors. A psychophysical experiment revealed that our surface provides richer information on material properties than the standard surfaces often used in computer graphics, e.g., a sphere or blob.
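Once the optimized surface, lighting, and view are fixed, the BRDF difference is simply an image-difference predictor applied to two single renderings. The sketch below uses plain RMSE as a stand-in for the perceptual predictors employed in the paper:

```python
import numpy as np

def image_brdf_distance(img_ref, img_test):
    """Image-based BRDF distance between a rendering with the reference
    BRDF and one with its approximation; RMSE here stands in for the
    standard image-difference predictors used in the paper."""
    a = np.asarray(img_ref, dtype=float)
    b = np.asarray(img_test, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

The key contribution is not the predictor itself but the optimized surface and viewing conditions that make a single image pair informative about the whole BRDF.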
|
@article{havran16perceptually,
title = {{Perceptually Motivated {BRDF} Comparison using Single Image}},
author = {Havran, Vlastimil and Filip, Jiri and Myszkowski, Karol},
journal = {Computer Graphics Forum},
volume = {35},
number = {4},
year = {2016},
publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
pages = {1--12},
DOI = {10.1111/cgf.12944}
}
|
|
Filip J., Vavra R., Havlicek M., Krupicka M.:
Predicting Visual Perception of Material Structure in Virtual Environments,
Computer Graphics Forum, Volume 36, Issue 1, pp.89-100, January 2017 (IF 1.642)
[bib]
[pdf]
[supplementary material] [stimulus] [scene1]
[scene2]. |
One of the most accurate yet still practical representations of material appearance is the bidirectional texture function (BTF). The BTF can be viewed as an extension of the bidirectional reflectance distribution function (BRDF) with additional spatial information that includes local visual effects such as shadowing, inter-reflections, subsurface scattering, etc. However, the shift from BRDF to BTF represents not only a huge leap in the realism of material reproduction, but also high memory and computational costs stemming from the storage and processing of massive BTF data.
In this work we argue that each opaque material, regardless of its surface structure, can be safely substituted by a BRDF without the introduction of a significant perceptual error when viewed from an appropriate distance. Therefore, we ran a set of psychophysical studies over 25 materials to determine so-called critical viewing distances, i.e., the minimal distances at which the material spatial structure (texture) cannot be visually discerned.
Our analysis determined such typical distances for several material categories often used in interior design applications.
Furthermore, we propose a combination of computational features that can predict such distances without the need for a psychophysical study.
We show that our work can significantly reduce rendering costs in applications that process complex virtual scenes.
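For intuition only: a back-of-the-envelope critical viewing distance follows from requiring the dominant texture feature to subtend the eye's resolution limit (about one arcminute). The paper determines such distances psychophysically and predicts them from computational features; the acuity-based sketch below is an assumption, not the paper's model.

```python
import math

def critical_viewing_distance(feature_size_mm, acuity_arcmin=1.0):
    """Distance (mm) at which a surface feature of the given size
    subtends the assumed acuity limit; beyond roughly this distance the
    texture detail is no longer discernible and a BRDF can stand in."""
    theta = math.radians(acuity_arcmin / 60.0)
    return feature_size_mm / (2.0 * math.tan(theta / 2.0))
```

A 1 mm feature under one-arcminute acuity gives a distance of roughly 3.4 m, the right order of magnitude for interior-design viewing scenarios.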
|
@article{filip15predicting,
author = {Filip, J. and V{\'a}vra, R. and Havlicek, M. and Krupicka, M.},
title = {Predicting Visual Perception of Material Structure
in Virtual Environments},
journal = {Computer Graphics Forum},
volume = {36},
number = {1},
year = {2017},
month = {January},
pages = {89--100},
DOI = {10.1111/cgf.12789},
url = {http://staff.utia.cas.cz/filip/projects/15CGF} }
|
|
Filip J., Havlicek M., Vavra, R.:
Adaptive Highlights Stencils for Modeling of Multi-Axial BRDF Anisotropy (extended version),
The Visual Computer 33(1), Springer, pp.5–15, 2017 (IF 0.957)
[bib]
[pdf]
[supplementary material]
[code Matlab, shader]. |
Directionally dependent, anisotropic material appearance is widely represented using the bidirectional reflectance distribution function (BRDF).
In practice, this function requires either the reconstruction of unknown values by interpolating between sparse measured samples, or fidelity-preserving compression forming a compact representation of dense measurements.
Both properties can be, to a certain extent, preserved by means of analytical BRDF models. Unfortunately, the number of anisotropic BRDF models is limited, and moreover, most require either a demanding iterative optimization procedure dependent on proper initialization or user-set parameters. Most of these approaches are challenged by the fitting of complex anisotropic BRDFs.
In contrast, we approximate BRDF anisotropic behavior by means of highlight stencils and derive a novel BRDF model that independently adapts such stencils to each anisotropic mode present in the BRDF.
Our model allows for the fast direct fitting of parameters without the need of any demanding optimization. Furthermore, it achieves an encouraging, expressive visual quality as compared to rival solutions that rely on a similar number of parameters. We thereby ascertain that our method represents a promising approach to the analysis and modeling of complex anisotropic BRDF behavior.
|
@article{filip15adaptiveVC,
title = {Adaptive Highlights Stencils for Modeling of
Multi-Axial {BRDF} Anisotropy},
author = {Filip, J. and Havl{\'\i}{\v c}ek, M. and V{\'a}vra, R.},
journal = {The Visual Computer},
volume = {33},
number = {1},
ISSN = {0178-2789},
DOI = {10.1007/s00371-015-1148-1},
year = {2017},
month = {January},
publisher = {Springer Berlin Heidelberg},
pages = {5--15}}
|
|
Filip J., Vavra R., Krupicka M.: Rapid Material Appearance Acquisition Using Consumer Hardware. Sensors 2014, 14(10), 19785-19805; doi:10.3390/s141019785 (IF 2.245)
[bib]
[pdf]
[movie device]
[movie TVBTF]
|
A photo-realistic representation of material appearance can be achieved by means of bidirectional texture function (BTF) capturing a material's appearance for varying illumination, viewing directions, and spatial pixel coordinates. BTF captures many non-local effects in material structure such as inter-reflections, occlusions, shadowing, or scattering.
The acquisition of BTF data is usually time and resource-intensive due to the high dimensionality of BTF data. This results in expensive, complex measurement setups and/or excessively long measurement times.
We propose an approximate BTF acquisition setup based on a simple, affordable mechanical gantry containing a consumer camera and two LED lights.
It captures a very limited subset of material surface images by shooting several video sequences.
A psychophysical study comparing captured and reconstructed data with the reference BTFs of seven tested materials revealed that the results of our method show promising visual quality.
The speed of the setup was demonstrated on the measurement of human skin and on the measurement and modeling of a time-varying glue desiccation process.
As it allows for fast, inexpensive acquisition of approximate BTFs, this method can be beneficial to visualization applications demanding less accuracy, where BTF utilization has previously been limited.
|
@Article{filip14portable,
AUTHOR = {Filip, Jiri and Vavra, Radomir and Krupicka, Mikulas},
TITLE = {Rapid Material Appearance Acquisition Using Consumer Hardware},
JOURNAL = {Sensors},
VOLUME = {14},
YEAR = {2014},
NUMBER = {10},
PAGES = {19785--19805},
URL = {http://www.mdpi.com/1424-8220/14/10/19785},
ISSN = {1424-8220},
DOI = {10.3390/s141019785}}
|
|
Filip, J. and Vavra, R.: Template-Based Sampling of Anisotropic BRDFs.
Computer Graphics Forum (Proceedings of Pacific Graphics 2014), Volume 33, Issue 7, pp. 91-99, October 2014 (IF 1.642)
[bib]
[preprint]
[appendix] |
BRDFs are commonly used to represent a given material's appearance in computer graphics and related fields. Although BRDFs have recently been extensively measured, compressed, and fitted by a variety of analytical models, most research has focused primarily on simplified isotropic BRDFs.
In this paper, we present a unique database of 150 BRDFs representing a wide range of materials, the majority exhibiting anisotropic behavior.
Since time-consuming BRDF measurement represents a major obstacle in the digital material appearance reproduction pipeline, we tested several approaches for estimating a very limited set of samples capable of high-quality appearance reconstruction.
Initially, we aligned all measured BRDFs according to the locations of their anisotropic highlights.
We then propose an adaptive sampling method based on analysis of the measured BRDFs: for each BRDF, a unique sampling pattern is computed for a predefined count of samples.
Further, template-based methods are introduced, based on reuse of the precomputed sampling patterns. This approach enables a more efficient measurement of unknown BRDFs while preserving visual fidelity for the majority of tested materials.
Our method exhibits better performance and stability than competing sparse sampling approaches, especially for higher numbers of samples.
|
@article{filip14template,
author = {Filip, J. and V{\'a}vra, R.},
title = {Template-Based Sampling of Anisotropic {BRDF}s},
journal = {Computer Graphics Forum},
volume = {33},
number = {7},
year = {2014},
month = {October},
conference = {Pacific Graphics 2014},
pages = {91--99},
DOI = {10.1111/cgf.12477},
url = {http://staff.utia.cas.cz/filip/projects/14PG} }
|
|
Filip J., Vavra R.: Fast Method of Sparse Acquisition and Reconstruction of View and Illumination
Dependent Datasets.
Computers & Graphics (Elsevier), vol. 37, no. 5, pp. 376-388, August 2013 (IF 1.000)
[bib],
[preprint] |
Although computer graphics uses measured view and illumination dependent data to achieve realistic digital reproduction of real-world material properties, the extent of their utilization is currently limited by a complicated acquisition process.
Due to the high dimensionality of such data, the acquisition process is demanding on time and resources. Proposed is a method of approximate reconstruction of the data from a very sparse dataset, obtained quickly using inexpensive hardware. This method does not impose any restrictions on input datasets and can handle anisotropic, non-reciprocal view and illumination direction-dependent data. The method's performance was tested on a number of isotropic and anisotropic apparent BRDFs, and the results were encouraging. The method performs better than the uniform sampling of a comparable sample count and has three main benefits: the sparse data acquisition can be done quickly using inexpensive hardware, the measured material does not need to be extracted or removed from its environment, and the entire process of data reconstruction from the sparse samples is quick and reliable.
Finally, the ease of sparse dataset acquisition was verified in measurement experiments with three materials, using a simple setup of a consumer camera and a single LED light. The proposed method has also shown promising performance when applied to sparse measurement and reconstruction of BTFs, mainly for samples with a lower surface height variation. Our approach demonstrates solid performance across a wide range of view and illumination dependent datasets, therefore creating a new opportunity for development of time and cost-effective portable acquisition setups.
|
@Article{filip13fast,
title = {Fast Method of Sparse Acquisition and Reconstruction of View
and Illumination Dependent Datasets},
author = {Filip, J. and V{\'a}vra, R.},
journal = {Computers \& Graphics},
publisher = {Elsevier},
volume = {37},
number = {5},
year = {2013},
pages = {376--388}}
|
|
Havran V., Filip J., Myszkowski K.: Bidirectional Texture Function Compression based on Multi-Level Vector Quantization.
Computer Graphics Forum, vol. 29, no. 1, pp. 175-190, March 2010 (IF 1.681)
[bib], [preprint] [web] |
The Bidirectional Texture Function (BTF) is becoming widely used for
accurate representation of real-world material appearance. In this
paper a novel BTF compression model is proposed. The model resamples
input BTF data into a parametrization, allowing decomposition of
individual view and illumination dependent texels into a set of
multidimensional conditional probability density functions.
These functions are compressed in turn using a novel multi-level
vector quantization algorithm. The result of this algorithm is a set
of index and scale code-books for individual dimensions. BTF
reconstruction from the model is then based on fast chained indexing
into the nested stored code-books. In the proposed model, luminance
and chromaticity are treated separately to achieve further compression.
The proposed model achieves low distortion and compression ratios of 1:233 to 1:2040,
depending on BTF sample variability. These results compare well with several other BTF compression methods with predefined compression ratios, usually smaller than 1:200. We carried out a psychophysical experiment comparing our method with the LPCA method. BTF synthesis from the model was implemented on a standard GPU, yielding interactive frame rates. The proposed method allows the fast importance sampling required by eye-path tracing algorithms in image synthesis.
|
@article{havran10bidirectional,
author = {Havran, V. and Filip, J. and Myszkowski, K.},
title = {Bidirectional Texture Function Compression based on
Multi-Level Vector Quantization},
journal = {Computer Graphics Forum},
publisher = {The Eurographics Association and Blackwell Publishing},
volume = {29},
number = {1},
year = {2010},
month = {March},
pages = {175--190}}
|
|
Filip J., Haindl M.: Bidirectional Texture Function Modeling: A State of the Art Survey.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE TPAMI), vol. 31, no. 11, pp. 1921-1940, October 2009 (IF 4.378)
[bib] |
An ever-growing number of real world computer vision applications require
classification, segmentation, retrieval, or realistic rendering
of genuine materials. However, the appearance of real
materials dramatically changes with illumination and viewing
variations. Thus, the only reliable representation of material visual properties
requires capturing its reflectance over as wide a range of light and camera position combinations as possible.
This is the principle of the most advanced recent texture representation, the
Bidirectional Texture Function (BTF).
Multispectral BTF is a seven-dimensional function that depends on view and illumination directions as well as on planar texture coordinates.
BTF is typically obtained by measurement of thousands of
images covering many combinations of illumination and viewing angles. However,
the large size of such measurements has prohibited
their practical exploitation in any sensible application until
recently. During the last few years the first BTF measurement, compression, modeling and rendering methods have emerged. In this paper we categorize,
critically survey, and psychophysically compare such approaches published in this newly arising and important computer vision and graphics area.
|
@article{filip08bidirectional,
author = {Filip, J. and Haindl, M.},
title = {Bidirectional Texture Function Modeling: A State of the Art
Survey},
journal = {IEEE Transactions on Pattern Analysis and Machine
Intelligence},
publisher = {IEEE Press},
volume = {31},
number = {11},
year = {2009},
month = {October},
pages = {1921--1940} }
|
|
Filip J., Chantler M.J., Haindl M.: On Uniform Resampling and Gaze Analysis of Bidirectional Texture Functions.
ACM Transactions on Applied Perception, vol. 6, no. 3, Article 18, August 2009, 15 pp. (IF 1.447)
[bib] [cover image] |
The use of illumination- and view-dependent texture information is currently the best way to
capture the appearance of real-world materials accurately. One example is the
Bidirectional Texture Function. The main disadvantage of these data is their
massive size.
In this paper we employ perceptually-based methods to allow more efficient
handling of these data. In the first step, we analyse different uniform resampling schemes by means of a psychophysical study
with eleven subjects, comparing the original data with renderings of versions uniformly
resampled over the hemisphere of illumination- and view-dependent
textural measurements. We have found that down-sampling in view and illumination azimuthal angles is less
apparent than in elevation angles and that illumination directions can be
down-sampled more than view directions without loss of visual accuracy. In the second step, we analysed subjects' gaze
fixations during the experiment. The gaze analysis confirmed results from the
experiment and revealed that subjects fixated at locations aligned with the direction of the
main gradient in the rendered stimuli. As this gradient was mostly
aligned with the illumination gradient, we conclude that subjects observed
materials mainly in the direction of the illumination gradient.
Our results provide interesting insights into the human perception of real materials
and show promising consequences for the development of more efficient compression and rendering algorithms using these kinds of massive data.
|
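This entry's [bib] link has no BibTeX record in the listing; a reconstruction from the citation details given above (the citation key and the article-number pagination are assumptions, all other fields are taken from the header):
@article{filip09uniform,
title = {On Uniform Resampling and Gaze Analysis of Bidirectional
Texture Functions},
author = {Filip, J. and Chantler, M.J. and Haindl, M.},
journal = {ACM Transactions on Applied Perception},
volume = {6},
number = {3},
year = {2009},
month = {August},
pages = {18:1--18:15}}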
|
|
Filip J., Chantler M.J., Green P.R., Haindl M.: A Psychophysically Validated Metric for Bidirectional Texture Data Reduction.
ACM Transactions on Graphics 27(5) (proceedings of SIGGRAPH Asia 2008), Article 138, December 2008, 11 pp. (IF 3.619)
[bib] [preprint] [web] |
Bidirectional Texture Functions (BTF) are commonly thought to provide the
most realistic perceptual experience of materials from rendered images. The key to providing efficient compression of BTFs is the decision as to how much of the data should be preserved. We use psychophysical experiments to show that this decision depends critically upon the material concerned. Furthermore, we develop a BTF-derived metric that enables us to automatically set a material's compression parameters in such a way as to provide users with a predefined perceptual quality.
We investigate the correlation of three different BTF metrics with
psychophysically derived data. Eight materials were presented to eleven naive
observers who were asked to judge the perceived quality of BTF renderings as the
amount of preserved data was varied. The metric showing the highest correlation with the thresholds set by the observers was the mean variance of individual BTF images. This metric was then used to automatically determine the material-specific compression parameters used in a vector quantisation scheme. The results were successfully validated in an experiment with six additional materials and eighteen observers.
We show that using the psychophysically reduced BTF data significantly
improves performance of a PCA-based compression method. On average, we were able to increase the compression ratios, and decrease processing times, by a factor of four without any differences being perceived.
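The metric identified above, the mean variance of individual BTF images, is simple to compute; a minimal sketch, assuming the BTF is stored as a stack of grayscale images (the function name and array layout are illustrative, not from the paper):

```python
import numpy as np

def mean_image_variance(btf: np.ndarray) -> float:
    """Mean of per-image pixel variances over a BTF image stack.

    btf: array of shape (n_images, height, width), one grayscale
    image per illumination/view combination.
    """
    return float(np.mean([img.var() for img in btf]))

# Illustrative use: a flat material (constant images) scores 0,
# while a textured material scores higher, so it would be assigned
# less aggressive compression.
flat = np.full((4, 8, 8), 0.5)
textured = np.random.default_rng(0).random((4, 8, 8))
assert mean_image_variance(flat) == 0.0
assert mean_image_variance(textured) > 0.0
```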
|
@article{filip08psychophysically,
author = {Filip, J. and Chantler, M.J. and Green, P.R. and Haindl, M.},
title = {A Psychophysically Validated Metric for Bidirectional Texture
Data Reduction},
journal = {ACM Transactions on Graphics (Proceedings of SIGGRAPH
Asia 2008)},
publisher = {ACM Press},
volume = {27},
number = {5},
year = {2008},
month = {December},
conference = {ACM SIGGRAPH Asia 2008},
pages = {138},
url = {http://staff.utia.cas.cz/filip/projects/pertex} }
|
|
Haindl M., Filip J.:
Extreme Compression and Modeling of Bidirectional Texture Function.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE TPAMI),
IEEE Press, Volume 29, Issue 10, October 2007, pp. 1859-1865. (IF 3.547)
[bib] |
The most advanced recent representation of realistic real-world materials
in virtual reality applications is the Bidirectional Texture Function (BTF), which describes rough texture appearance for varying illumination and viewing
conditions. Such a function can be represented by thousands of measurements
(images) per material sample. The resulting BTF size excludes its
direct rendering in graphical applications and some compression of
these huge BTF data spaces is obviously inevitable. In this paper
we present a novel, fast probabilistic model-based algorithm for
realistic BTF modeling allowing an extreme compression
with the possibility of a fast hardware implementation.
Its ultimate aim is to create a visual impression of
the same material without a pixel-wise correspondence to the
original measurements. The analytical step of the algorithm starts
with a BTF space segmentation and a range map estimation by
photometric stereo of the BTF surface, followed by the spectral and
spatial factorization of selected sub-space color texture
images. Single mono-spectral band-limited factors are independently modeled by their dedicated spatial probabilistic model. During rendering, the sub-space images of arbitrary size are synthesized and both color (possibly multi-spectral) and range information is combined in a bump-mapping filter
according to the view and illumination directions.
The presented model offers a huge BTF compression ratio unattainable by any alternative sampling-based BTF synthesis method. Simultaneously this model can
be used to reconstruct missing parts of the BTF measurement space.
|
@Article{haindl07extreme,
title = {Extreme Compression and Modeling of Bidirectional Texture Function},
author = {Haindl, M. and Filip, J.},
journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence},
publisher = {IEEE Press},
year = {2007},
month = {October},
volume = {29},
number = {10},
pages = {1859--1865},
url = {http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?isnumber=4293197&arnumber=4293214&count=23&index=16} }
|
|
Filip J., Haindl M.:
BTF Modelling Using BRDF Texels. International Journal of
Computer Mathematics (IJCM), Taylor & Francis, Volume 84, Issue 9, September
2007, pp. 1267-1283. (IF 0.423)
[bib] |
The highest fidelity representations of realistic real-world materials currently
comprise Bidirectional Texture Functions (BTF).
The BTF is a six-dimensional function depending on view and illumination directions as well as on planar texture coordinates. The huge size of such measurements, typically in the form of thousands of images covering all possible combinations of illumination and viewing angles, has prohibited their practical exploitation, and some compression and modelling method for these enormous BTF data spaces is obviously inevitable.
The two proposed approaches combine BTF spatial clustering with cluster index modelling by means of efficient Markov random field models. The methods allow the generation of a seamless cluster index of arbitrary size to cover the surfaces of large virtual 3D objects. Both methods represent the original BTF data using a set of local, spatially dependent Bidirectional Reflectance Distribution Function (BRDF) values, which are combined according to the synthesised cluster index and illumination/viewing directions by means of two types of Markov random field models. The BTF data compression ratio of both methods is about 1:200,
and their synthesis is very fast.
|
@Article{filip07BTF,
title = {{BTF} modelling using {BRDF} texels},
author = {Filip, J. and Haindl, M.},
journal = {International Journal of Computer Mathematics (IJCM)},
publisher = {Taylor and Francis},
year = {2007},
month = {September},
volume = {84},
number = {9},
pages = {1267--1283},
url = {http://www.informaworld.com/smpp/content~content=a781734939?jumptype=alert&alerttype=author,email} }
|
Journal Papers (other) |
|
Maile F. J., Filip J.:
Innovative ternary system (ZTS) based on ultra-thin effect pigments. The realisation and characterisation of gold alloy shades.
In Pitture e Vernici (European Coatings + Formulation Applications), issue 4, May-June, 2024
[bib]
[pdf]
|
Various effect pigments like pearlescents and metal pigments made of gold or copper-zinc alloys have long been used to decorate surfaces in a wide variety of applications, although there is a difference in color formation. Colors in metallic elements and their alloys can be explained using band theory whereas the formation of (interference) colors in pearlescent pigments occurs through thin layers of higher refractive index deposited on semi-transparent substrates with platelet-like morphology. Furthermore, compared to the physics of the gold-silver-copper alloys and their object surfaces, particle properties such as scattering at pigment edges and particle orientation must be taken into account when processing effect pigments in coatings, printing and plastics applications to achieve gold color shades on objects, since these significantly influence the final appearance.
|
@article{maile24innovative,
title = {Innovative ternary system (ZTS) based on ultra-thin effect pigments.
The realisation and characterisation of gold alloy shades},
author = {Maile, F. J. and Filip, J.},
journal = {Pitture e Vernici (European Coatings + Formulation Applications)},
number = {4},
month = {May},
year = {2024},
publisher = {Pitture e Vernici},
pages = {10--15}
}
|
|
Maile F. J., Hubert A., Filip J.:
Gilded surfaces: innovative mixing system with ultra-thin effect pigments (ZTS - Zenexo(R) Ternary System).
In European Coatings Journal, 2022
[bib]
[pdf]
|
Effect pigments based on ultra-thin pigment (UTP) technology can be used to create gold shades with exceptional hiding power and colour gamut for coating, printing and plastic applications [1]. Inspired by the ternary plot known from the metal alloys Au-Ag-Cu [2], three UTP-based effect pigments (YY-YS-OO) have been mixed using the innovative Zenexo ternary system (ZTS) to imitate gold shades. This paper looks at the new system and the device used to characterise appearance of samples prepared using the ZTS approach. Laboratory processes can be simplified and significant costs saved by combining both innovations.
|
@article{maile22gilded,
title = {Gilded Surfaces: innovative mixing system with ultra-thin
effect pigments (ZTS - Zenexo(R) Ternary System)},
author = {Maile, F. J. and Hubert, A. and Filip, J.},
journal = {European Coatings Journal},
month = {December},
year = {2022},
publisher = {Vincentz Network},
pages = {1--7}
}
|
|
Maile F. J., Filip J.:
New applications for polychromatic effect pigments.
Asia Pacific Coatings Journal, Volume. 29, Number 2, (2016), p. 35-38
[bib] |
Pigments that generate special effects like angle-dependent color or decorative texture have a growing economic significance and can be found in various industrial products and end-user applications. In decorative uses, special effect pigments provide three major advantages: (a) they can create the illusion of optical depth, which can, for example, be observed when applying pearlescent pigments in car paints; (b) they can generate subtle to startling angle-dependent, eye-catching color effects, which can, for example, be used in car paints or decorative printing; (c) they have the ability to imitate the effect of natural pearls in buttons, plastic bottles, and many other decorative objects.
This paper provides an insight into our recently discovered new applications for MultiFlect and Alegrace Spectacular polychromatic effect pigments.
These effect pigment generations allow the creation of surfaces which exhibit polychromatic light sparks under directed lighting in various coating, printing or cosmetic applications. When used in interior architecture e.g. in coil or powder coatings, different surface textures and interference color effects can be perceived when exposed to different spectra of light using LEDs, i.e. to create semi-smart surfaces.
|
@article{maile16new,
author = {Maile, F. J. and Filip, J.},
title = {New applications for polychromatic effect pigments},
journal = {Asia Pacific Coatings Journal},
volume = {29},
number = {2},
year = {2016},
month = {April},
pages = {35--38}}
|
|
Haindl M., Filip J., Vavra R.:
Digital Material Appearance: the Curse of Tera-Bytes. ERCIM News (2012), 90, pp. 49-50
[bib] |
@Article{haindl12ERCIM,
title = {Digital Material Appearance: the Curse of Tera-Bytes},
author = {Haindl, M. and Filip, J. and V{\'a}vra, R.},
journal = {ERCIM News},
number = {90},
year = {2012},
pages = {49--50}}
|
|
Haindl M., Filip J., Hatka M.:
Realistic Material Appearance Modelling. ERCIM News (2010), 81, pp. 13-14
[bib] |
@Article{filip10ERCIM,
title = {Realistic Material Appearance Modelling},
author = {Haindl, M. and Filip, J. and Hatka, M.},
journal = {ERCIM News},
year = {2010},
number = {81},
pages = {13--14},
url = {http://ercim-news.ercim.eu/en81/special/realistic-material-appearance-modelling} }
|
|
Filip J., Haindl M., Chetverikov D.:
Fast Synthesis of Dynamic Colour Textures. ERCIM News (2006), 66, pp. 53-54
[bib] |
@Article{filip06ERCIM,
title = {Fast Synthesis of Dynamic Colour Textures},
author = {Filip, J. and Haindl, M. and Chetverikov, D.},
journal = {ERCIM News},
year = {2006},
number = {66},
pages = {53--54},
url = {http://www.ercim.org/publication/Ercim_News/enw66/haindl.html} }
|
|
Haindl M., Filip J.:
Modelling of Authentic Reflectance Behaviour in Virtual Environments. ERCIM News (2005), 62, pp. 49-50.
[bib] |
@Article{haindl05ERCIM,
title = {Modelling of Authentic Reflectance Behaviour in
Virtual Environments},
author = {Haindl, M. and Filip, J.},
journal = {ERCIM News},
year = {2005},
number = {62},
pages = {49--50},
url = {http://www.ercim.org/publication/Ercim_News/enw62/haindl.html} }
|
Conference Papers (peer-reviewed) |
|
Filip J., Maile F. J., Vávra R.:
Angle-Dependent Analysis and Modeling of Absorption and Structural Colors in Feathers, In proceedings of CVCS2024: the 12th Colour and Visual Computing Symposium, 2024
[bib]
[pdf] [shader] Best paper award
|
Bird feathers represent one of the most challenging examples of material appearance, driven by a combination of structural properties at different scales. At the microscopic scale, feather appearance is dictated by the anisotropic behavior of directional feather parts. At the nanoscale, some species exhibit structural color due to light diffraction on structures comparable in size to the wavelength of light. In this paper, we capture feathers from four different species, analyze various aspects of their appearance, and suggest an efficient approach for modeling it.
|
@inproceedings {filip24angle,
editor = {Hardeberg, Jon Yngve and Rushmeier, Holly},
title = {Angle-Dependent Analysis and Modeling of Absorption and Structural Colors in Feathers},
author = {Filip, Jiri and Maile, Frank J. and Vavra, Radomir},
booktitle = {CVCS2024: the 12th Colour and Visual Computing Symposium},
year = {2024},
}
|
|
Filip J., Vítek M.:
Psychophysical Insights into Anisotropic Highlights of 3D Printed Objects, In proceedings of MAM2024 - MANER Conference London, 2024
[bib]
[pdf]
[data] |
3D printing has been extensively used for over two decades by various practitioners and professionals in the industry. This technique, which involves adding material from melted filament layer by layer based on CAD model geometry, imparts a unique appearance to the printed objects. The layering structure generates specific directional reflectance patterns on printed surfaces, leading to anisotropic highlights. Due to slight inaccuracies in the printing setup, the appearance of individual layers is not seamless and exhibits sparkle-like effects along the highlight. In this paper, we conducted a psychophysical experiment to analyze human perception of the printed objects, focusing on the intensity and width of the anisotropic highlights. We discovered that the contrast near the highlights and the variability of pixel intensities along the highlights are highly correlated with human ratings. Lastly, we present a straightforward method utilizing these computational features to enhance the visualization of 3D printed objects.
|
@inproceedings {filip24psychophysical,
title = {{Psychophysical Insights into Anisotropic Highlights of 3D Printed Objects}},
author = {Filip, Jiri and V{\'\i}tek, Martin},
booktitle = {Workshop on Material Appearance Modeling: Joint MAM - MANER Conference
- Material Appearance Network for Education and Research},
editor = {Hardeberg, Jon Yngve and Rushmeier, Holly},
year = {2024},
publisher = {The Eurographics Association},
ISSN = {2309-5059},
ISBN = {978-3-03868-264-6},
DOI = {10.2312/mam.20241177}
}
|
|
Filip J., Dechterenko F.:
Exploration of Core Material Appearance Features.
IS&T/Appamat International Workshop on Material Appearance, 2023
[bib]
[pdf]
|
Digital representations of materials are widely used in various applications. However, understanding
their visual properties from a human vision perspective and automatically interpreting the visual
properties of captured materials remains an ongoing research challenge [1].
To identify the most crucial appearance attributes of real materials, we conducted a user study
involving 210 materials, including fabric, leather, wood, plastic, metal, and paper (see Figure 1, left
panel). For each material, we recorded a video showcasing both its specular and non-specular
appearances. The materials were grouped into three separate movies each showing 70 randomly
selected materials. Participants were then asked to identify and rank at least five most visually
distinguishing features, in the order of importance, that set apart the materials within each video –
essentially, the features that make materials different. We collected a total of 451 valid text responses
from 32 participants. Subsequently, a manual semantic clustering based on keyword occurrence
revealed the 15 most frequently mentioned attributes (as illustrated in Figure 1, right panel).
In the second validation study, we tasked six participants with clustering all 451 responses according
to 15 predefined attributes. The interrater agreement was notably high, as evidenced by Fleiss' Kappa
score of 0.786. Among the 451 responses evaluated, we identified 198 instances in which all six raters
reached a unanimous consensus (43.9%). If we lowered the threshold to require agreement from at
least three raters, we found 254 cases with such consensus (56.3%). Additionally, for a two-rater
agreement criterion, we observed agreement in 396 cases (87.8%). This yielded a situation
where 16 out of 32 participants had their responses completely integrated within the rating system, while
the remaining 16 participants exhibited a range of 5.6% - 26.7% divergence.
Our findings reveal that the most prominent attributes include common visual features such as color
variability, saturation, roughness, brightness, shininess, texture, and pattern. Interestingly, participants
frequently mentioned tactile and subjective attributes like warmth, hardness, naturalness, and
attractiveness.
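The Fleiss' kappa agreement score reported above can be computed from an item-by-category table of rater counts; a minimal sketch of the standard formula (the toy data below are illustrative, not the study's responses):

```python
import numpy as np

def fleiss_kappa(counts) -> float:
    """Fleiss' kappa for an (items x categories) matrix of rater counts.

    Each row sums to the same number of raters n. Returns 1.0 for
    perfect agreement and values near 0 for chance-level agreement.
    """
    m = np.asarray(counts, dtype=float)
    n = m.sum(axis=1)[0]                      # raters per item
    p_cat = m.sum(axis=0) / m.sum()           # overall category proportions
    p_item = (np.square(m).sum(axis=1) - n) / (n * (n - 1))
    p_bar, p_exp = p_item.mean(), np.square(p_cat).sum()
    return float((p_bar - p_exp) / (1 - p_exp))

# Two items, two categories, six raters: perfect agreement gives kappa = 1.
assert abs(fleiss_kappa([[6, 0], [0, 6]]) - 1.0) < 1e-9
# A maximally split set of items pulls agreement below chance (kappa < 0).
assert fleiss_kappa([[3, 3], [3, 3]]) < 0
```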
|
@inproceedings{filip23exploration,
title = {Exploration of Core Material Appearance Features},
author = {Filip, J. and D{\v e}cht{\v e}renko, F.},
booktitle = {IS\&T/Appamat International Workshop on Material Appearance},
year = {2023},
url = {https://gdr-appamat.cnrs.fr/3220-2/},
publisher = {CNRS},
}
|
|
Filip J., Kolafová M., Vávra R.:
Perceived Effects of Reflective Translucency in 3D Printing Filaments.
In proceedings of The 16th International Conference on
Signal Image Technology & Internet Based Systems, Workshop on Appearance and Imaging, 2022
[bib]
[pdf]
[data] |
3D printing is becoming a standard for rapid prototyping and the fabrication of customized parts. The appearance of printed parts is highly affected by the type of printing filament and its properties. This paper analyses the reflective translucency of seventeen 3D prints and their original filaments. We performed a psychophysical study to obtain perceived translucency data, i.e. the discriminability of a background based on its reflection when passing through a translucent material. These data are then compared with translucency measurements using a commercial device. Finally, we captured BRDF data of the filaments and used them to determine appropriate translucency measurement conditions.
|
@inproceedings {filip22perceived,
title = {Perceived Effects of Reflective Translucency in 3D Printing Filaments},
author = {Filip, Jiri and Kolafová, Martina and Vavra, Radomir},
booktitle = {The 16th International Conference on
Signal Image Technology \& Internet Based Systems, IWAI workshop},
year = {2022},
publisher = {IEEE},
}
|
|
Filip J., Kolafova M., Vavra R.:
Perceived Effects of Static and Dynamic Sparkle in Captured Effect Coatings. In proceedings of the 15th International Conference on
Signal Image Technology & Internet Based Systems (International Workshop on Appearance and Imaging), 2019
[bib]
[pdf]
|
Quality control applications in the coating industry characterize visual properties of coatings containing effect pigments using glint impression, often denoted as sparkle. They rely on a collection of static images capturing sparkle properties of pigment flakes.
However, visual characteristics of pigment flakes are highly correlated to their material properties and their orientations in coating layers. Thus, while two effect coatings can exhibit similar static sparkle behavior, their dynamic sparkle behavior may be very distinct.
In this paper, we analyzed the perception of static and dynamic sparkle using two psychophysical studies on 38 effect coatings and 31 human subjects.
First, we have shown a good agreement between the perception of sparkle in real specimens and in photographs. Second, we observed significant differences in perceived static and dynamic sparkle. Our results demonstrate a need for a multiangle recording of sparkle when assessing effect pigment visual characteristics.
|
@inproceedings {filip19perceived,
title = {Perceived Effects of Static and Dynamic Sparkle
in Captured Effect Coatings},
author = {Filip, Jiri and Kolafová, Martina and Vavra, Radomir},
booktitle = {The 15th International Conference on
Signal Image Technology \& Internet Based Systems},
year = {2019},
publisher = {IEEE},
}
|
|
Filip J., Kolafova M., Vavra R.:
A Psychophysical Analysis of Fabricated Anisotropic Appearance. In proceedings of Pacific Graphics short papers, 2019
[bib]
[pdf]
|
Many materials change surface appearance when observed under fixed viewing and lighting directions while being rotated around their surface normal. Such distinct anisotropic behavior manifests itself as changes in textural color and intensity. These effects are due to structural elements introducing azimuthally-dependent behavior. However, each material and finishing technique has its own unique anisotropic properties, which are often difficult to control.
To avoid this problem, we study controlled anisotropic appearance introduced by means of 3D printing. Our work aims to link the perception of directionality with the perception of the anisotropic reflectance effects it causes. We simulate two types of structure-based anisotropic effects, which are related to directional principles found in real-world materials. For each type, we create a set of test surfaces by controlling the printed anisotropy level and assess them in a psychophysical study to identify a perceptual scale of anisotropy. The generality of these scales is then verified by capturing the appearance of the anisotropic surfaces using a bidirectional texture function and analysing it on 3D objects. Eventually, we relate the perceptual scale of anisotropy to a computational feature obtained directly from the anisotropic highlights observed in the captured reflectance data. The feature is validated using a psychophysical study analyzing the visibility of anisotropic reflectance effects.
|
@inproceedings {filip19psychophysical,
title = {A Psychophysical Analysis of Fabricated Anisotropic Appearance},
author = {Filip, Jiri and Kolafová, Martina and Vavra, Radomir},
booktitle = {Pacific Graphics Short Papers},
year = {2019}
}
|
|
Filip J., Kolafova M.:
On Visual Attractiveness of Anisotropic Effect Coatings, In proceedings of Workshop on Material Appearance Modeling, 2019
[bib]
[pdf]
|
With the global trend in customer preference towards achromatic car colors, color designers in the coating industry strive to create novel, design-critical appearances based on novel effect pigments. At a microscopic scale, the pigment particles allow the creation of specific optical effects such as sparkle under directed lighting, along with a specific texture under diffuse lighting, while at a macroscopic scale they create the appearance of angle-dependent color and a strong luminance contrast. Although individual particles in effect coatings exhibit anisotropic behavior, the majority of effect coatings exhibit an isotropic appearance at a macroscopic scale due to a random orientation of the particles, which is explained by the manufacturing process of the coating. This paper demonstrates the visual appearances achievable by using anisotropic effect coatings based on magnetic pigments. In a psychophysical study, we assessed the visual attractiveness of these coatings on a car-like shape for different viewing angles.
|
@inproceedings {filip19onvisual,
booktitle = {Workshop on Material Appearance Modeling},
editor = {Reinhard Klein and Holly Rushmeier},
title = {{On Visual Attractiveness of Anisotropic Effect Coatings}},
author = {Filip, Jiri and Kolafová, Martina},
year = {2019},
publisher = {The Eurographics Association}
}
|
|
Filip J., Vávra R., Maile F.J., Eibon B.:
Image-based Discrimination and Spatial Non-uniformity Analysis of Effect Coatings.
In proceedings of 8th International Conference on Pattern Recognition Applications and Methods, February 2019
[bib]
[pdf]
|
Various industries are striving for novel, more reliable, yet still efficient approaches to coating characterization. The majority of industrial applications use portable instruments for the characterization of effect coatings. These typically capture a limited set of in-plane geometries and have a limited ability to reliably characterize the gonio-apparent behavior typical of such coatings. The instruments rely mostly on color and reflectance characteristics without using texture information across the coating plane. In this paper, we propose an image-based method that counts effect pigments and measures their active area. First, we captured the appearance of eight effect coatings featuring four different pigment materials, in in-plane and out-of-plane geometries, using a gonioreflectometer with a fixed viewing angle and varying illumination angles. Our analysis has shown that the proposed method is able to clearly distinguish pigment materials and coating applications in both in-plane and out-of-plane geometries. Finally, we show an application of our method to the analysis of spatial non-uniformity, i.e., cloudiness or mottling, across a coated panel.
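The counting step above can be sketched as a simple bright-region analysis of a coating image: threshold, label connected regions, report their count and covered area. The `count_glints` helper and its relative-threshold rule are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def count_glints(image, rel_threshold=0.5):
    """Count sparkle glints as bright 4-connected pixel regions and report
    their total active area as a fraction of the coating image.
    `rel_threshold` is a hypothetical knob, not taken from the paper."""
    mask = image > rel_threshold * image.max()
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    n_glints = 0
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not visited[i, j]:
                n_glints += 1                 # a new glint region starts here
                stack = [(i, j)]              # flood-fill the whole region
                visited[i, j] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
    return n_glints, mask.sum() / mask.size

# Synthetic example: a dark coating image with three isolated bright flakes.
img = np.zeros((64, 64))
img[10, 10] = img[30, 40] = img[50, 20] = 1.0
n, area = count_glints(img)
```

Running the flood fill per illumination angle would then yield the angle-dependent pigment statistics discussed above.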
|
@inproceedings{filip19image,
title = {Image-based Discrimination and Spatial Non-uniformity Analysis
of Effect Coatings},
author = {Filip, J. and Vavra, R. and Maile, F.J. and Eibon, B.},
booktitle = {8th International Conference on Pattern Recognition
Applications and Methods},
venue = {Prague},
year = {2019},
publisher = {IAPR}
}
|
|
Filip J., Maile F. J.:
In the Search of an Ideal Measurement Geometry for Effect Coatings.
In proceedings of Pigment and Color Science Forum, 2018
[bib]
[pdf]
|
The capture, visualization, and analysis of effect coatings have become highly important nowadays, as digital appearance content is necessary for facilitating quick communication of material appearance properties across various industries.
The contribution of this paper is twofold. First, we discuss the development of measurement geometries used in the industry, discuss limitations related to small instruments, and finally compare three selected instruments in terms of luminance and color.
Second, we present an application of a minimal sampling approach to reconstruct dense in-plane geometries based on a sparse set of measured geometries. As this approach relies on the properties of a constructed PCA basis, we show how one can optimize the sparse directions for specific pigment types or coating systems. As the span of our tested coatings is limited, we present this optimization method as an application showcase rather than as reference directions.
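The minimal-sampling reconstruction can be illustrated with a small PCA sketch: build a basis from dense training curves, then solve for basis coefficients using only the sparse geometries. The curve family, component count, and sparse-angle choices below are all hypothetical stand-ins for measured data.

```python
import numpy as np

rng = np.random.default_rng(0)
angles = np.linspace(-60, 60, 90)            # dense in-plane aspecular angles

# Hypothetical training set: smooth gonio curves of varying amplitude/width.
train = np.stack([a * np.exp(-(angles / s) ** 2)
                  for a, s in zip(rng.uniform(1, 5, 20),
                                  rng.uniform(10, 40, 20))])

mean = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
basis = Vt[:3]                               # PCA basis, keep 3 components

# Reconstruct a new coating's dense curve from 5 sparse geometries only.
sparse_idx = [5, 25, 45, 65, 85]
measured = 3.0 * np.exp(-(angles / 25.0) ** 2)   # ground truth for the demo
coeff, *_ = np.linalg.lstsq(basis[:, sparse_idx].T,
                            (measured - mean)[sparse_idx], rcond=None)
dense = mean + coeff @ basis                 # dense reconstruction
```

Optimizing which five indices go into `sparse_idx` for a given pigment family is the kind of direction optimization the paper showcases.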
|
@inproceedings{filip18inthesearch,
title = {In the Search of an Ideal Measurement Geometry for Effect Coatings},
author = {Filip, J. and Maile, F. J.},
booktitle = {Proceedings of Pigment and Color Science Forum},
venue = {Boston},
year = {2018},
publisher = {SMITHERS Rapra}
}
|
|
Filip J., Kolafova M.:
Effects of Surface Anisotropy on Perception of Car Body Attractiveness, In proceedings of Pacific Graphics short papers, 2018
[bib]
[pdf]
|
In the automotive industry, effect coatings are used to introduce customized product designs, visually communicating the unique impression of a car.
Industrial effect coating systems primarily achieve a globally isotropic appearance, i.e., a surface appearance that does not change when the material rotates around its normal. In contrast, anisotropic appearance exhibits variable behavior due to oriented structural elements.
This paper studies to what extent anisotropic appearance improves the visual impression of a car body beyond a standard isotropic one.
We ran several psychophysical studies identifying the proper alignment of an anisotropy axis over a car body, showing that regardless of the illumination conditions, subjects always preferred an anisotropy axis orthogonal to the car body orientation. The majority of subjects also found the anisotropic appearance more visually appealing than the isotropic one.
|
@inproceedings {filip18effects,
author = {Filip, J. and Kolafov\'{a}, M.},
title = {Effects of Surface Anisotropy on Perception of Car Body Attractiveness},
booktitle = {Proceedings of the 26th Pacific Conference on Computer Graphics and Applications: Short Papers},
series = {PG '18},
year = {2018},
isbn = {978-3-03868-073-4},
location = {Kowloon, Hong Kong},
pages = {17--20},
numpages = {4},
url = {https://doi.org/10.2312/pg.20181270},
doi = {10.2312/pg.20181270},
publisher = {Eurographics Association},
address = {Goslar, Germany},
}
|
|
Filip J., Kolafova M.:
Perception of Car Shape Orientation and Anisotropy Alignment, In proceedings of Workshop on Material Appearance Modeling, 2018
[bib]
[pdf]
|
Color designers introduce customized product designs, visually communicating the unique impression of a car. They always carefully observe the harmony of color and body shape to obtain the desired visual impression.
This paper studies to what extent anisotropic appearance improves the visual impression of a car body beyond a standard isotropic one. To address this challenge, we ran several psychophysical studies identifying the proper alignment of an anisotropy axis over a car body. We have shown that subjects preferred an anisotropy axis orthogonal to the car body orientation and that the majority of subjects found the anisotropic appearance more visually appealing than the isotropic one.
|
@inproceedings {filip18perception,
booktitle = {Workshop on Material Appearance Modeling},
editor = {Reinhard Klein and Holly Rushmeier},
title = {{Perception of Car Shape Orientation and Anisotropy Alignment}},
author = {Filip, Jiri and Kolafová, Martina},
year = {2018},
publisher = {The Eurographics Association},
ISSN = {2309-5059},
ISBN = {978-3-03868-055-0},
DOI = {10.2312/mam.20181197}
}
|
|
Vavra R., Filip J.:
Adaptive Measurement of Anisotropic Material Appearance.
Proceedings of Pacific Graphics Short Papers, pp. 1-6, 2017
[bib]
[pdf]
|
We present a practical adaptive method for the acquisition of anisotropic BRDFs. It is based on a sparse adaptive measurement of the complete four-dimensional BRDF space by means of one-dimensional slices, which form a sparse four-dimensional structure in the BRDF space and which can be measured by continuous movements of a light source and a sensor. Such a sampling approach is advantageous especially for gonioreflectometer-based measurement devices, where the mechanical travel of the light source and sensor creates a significant time constraint. To evaluate our method, we perform adaptive measurements of three materials and simulate adaptive measurements of ten others. We achieve a four-times lower reconstruction error in comparison with regular non-adaptive BRDF measurements given the same count of measured samples. Our method is almost twice as accurate as a previous adaptive method, and it requires two to five times fewer samples to achieve the same results as alternative approaches.
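The 1D-slice idea above can be sketched as follows: each slice keeps both elevations fixed and sweeps the light and sensor azimuths continuously, so a gonioreflectometer can record it in a single motion. The lockstep parameterization and the slice counts below are illustrative assumptions, not the paper's adaptive scheme.

```python
import numpy as np

def azimuthal_slice(theta_i, theta_v, phi_offset, n=36):
    """One 1D slice of the 4D BRDF domain: elevations are fixed while the
    light and sensor azimuths move in lockstep (sensor trailing the light
    by `phi_offset`). Returns (n, 4) rows of (theta_i, phi_i, theta_v,
    phi_v) in radians. The lockstep rule is a simplifying assumption."""
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    return np.stack([np.full(n, theta_i), t,
                     np.full(n, theta_v),
                     (t + phi_offset) % (2 * np.pi)], axis=1)

# A sparse 4D structure: slices at two elevations and two azimuth offsets.
slices = [azimuthal_slice(ti, tv, off)
          for ti in (np.pi / 6, np.pi / 3)
          for tv in (np.pi / 6, np.pi / 3)
          for off in (0.0, np.pi)]
samples = np.concatenate(slices)             # 8 slices x 36 directions each
```

An adaptive variant would then add further slices where neighboring slices disagree most, which is the intuition behind the measurement-time savings reported above.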
|
@inproceedings {vavra17adaptive,
title = {{Adaptive Measurement of Anisotropic Material Appearance}},
author = {Vavra, Radomir and Filip, Jiri},
booktitle = {Pacific Graphics Short Papers},
editor = {Jernej Barbic and Wen-Chieh Lin and Olga Sorkine-Hornung},
year = {2017},
publisher = {The Eurographics Association},
ISBN = {978-3-03868-051-2},
DOI = {10.2312/pg.20171316},
pages = {1--6}
}
|
|
Filip J., Maile F. J.:
Appearance Acquisition and Analysis of Effect Coatings.
In proceedings of Pigment and Color Science Forum, 2017
[bib]
[pdf]
|
The capture, visualization, and analysis of effect coatings have become highly important nowadays, as digital appearance content is necessary for facilitating quick communication of material appearance properties across various industries.
This paper briefly overviews our recent contributions to: (1) rapid and realistic appearance measurement and visualization of effect coatings, (2) characterization of effect coatings allowing instant discrimination between different coating systems and effect pigments, (3) a non-invasive automatic texture-based particle orientation analysis method.
|
@inproceedings{filip17appearance,
title = {Appearance Acquisition and Analysis of Effect Coatings},
author = {Filip, J. and Maile, F. J.},
booktitle = {Proceedings of Pigment and Color Science Forum},
venue = {Alicante},
year = {2017},
publisher = {SMITHERS}
}
|
|
Filip J., Vavra R., Maile F. J.:
BRDF Measurement of Highly-Specular Materials using a Goniometer.
In proceedings of 33rd Spring Conference on Computer Graphics (SCCG 2017), pp. 131-137, 2017
[bib]
[pdf]
Honorable mention
|
Visually accurate capture of the appearance of highly specular surfaces is of great research interest to the coating industry, which strives to introduce highly reflective products while minimizing production and quality-assessment costs and avoiding environmental issues related to the production process. An efficient measurement of such surfaces is challenging due to their narrow specular peak of unknown shape and typically very high dynamic range. Such behavior places higher requirements on the capabilities of a measuring device and impacts the length of the measurement process. In this paper, we rely on material probes with a predefined curved shape featuring slight local inhomogeneities. This makes a goniometric device an appropriate means of appearance capture. To shorten the typically long measurement time required by these approaches, we introduce a method of material appearance acquisition by means of the isotropic BRDF using relatively sparse sampling adapted to each measured material individually.
|
@inproceedings{filip17brdf,
title = {{BRDF} Measurement of Highly-Specular Materials using a Goniometer},
author = {Filip, J. and Vavra, R. and Maile, F. J.},
booktitle = {Proceedings of the 33rd Spring Conference on
Computer Graphics (SCCG 2017)},
venue = {Mikulov},
year = {2017},
publisher = {Brno University of Technology},
pages = {131--137}
}
|
|
Filip J., Vavra R., Haindl M.:
Capturing Material Visualization Data Using Goniometers.
Proceedings of the 4th CIE Expert Symposium on Colour and Visual Appearance,
CIE x043:2016, pp. 121-127, 2016
[bib]
|
Reproduction of the appearance of real-world materials in virtual environments has been one of the ultimate challenges of computer graphics. The required material representations depend on the complexity of the material's appearance. They start with a bidirectional reflectance distribution function (BRDF) describing the distribution of energy reflected in the viewing direction when illuminated from a specific direction. As the BRDF cannot capture a material's spatial structure, it has been extended to the more general bidirectional texture function (BTF), capturing non-local effects in rough material structures, such as occlusions, masking, sub-surface scattering, or inter-reflections. A monospectral BTF is a six-dimensional function representing the material appearance at each surface point for variable illumination and view directions, parameterized by elevation and azimuthal angles. This paper describes the application of gonioreflectometers for capturing BTFs for material visualization purposes. It starts with the reference measurement setup and continues with an introduction to portable measurement approaches for capturing approximate BTFs. Finally, we discuss future challenges in material acquisition.
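The six-dimensional BTF described above can be sketched as a tabulated structure: one texture per measured (illumination, view) direction pair, with a nearest-neighbor lookup over the four angular dimensions. The `TabulatedBTF` class and its angular distance rule are illustrative assumptions, not a real measurement format.

```python
import numpy as np

class TabulatedBTF:
    """Toy tabulated monospectral BTF: one texture per measured
    (illumination, view) direction pair, each direction given as
    elevation/azimuth angles in radians."""

    def __init__(self, dirs, textures):
        self.dirs = np.asarray(dirs, dtype=float)          # (n, 4)
        self.textures = np.asarray(textures, dtype=float)  # (n, H, W)

    def sample(self, x, y, th_i, ph_i, th_v, ph_v):
        """BTF(x, y, th_i, ph_i, th_v, ph_v): pick the nearest measured
        direction pair, then read the texel at (x, y)."""
        q = np.array([th_i, ph_i, th_v, ph_v])
        k = np.argmin(np.linalg.norm(self.dirs - q, axis=1))
        return self.textures[k, y, x]

# Two measured pairs: near-normal vs. grazing illumination, fixed view.
btf = TabulatedBTF(dirs=[[0.0, 0.0, 0.0, 0.0],
                         [1.0, 0.0, 0.0, 0.0]],
                   textures=[np.full((2, 2), 0.2),
                             np.full((2, 2), 0.8)])
```

Real BTF data uses dense angular grids and barycentric interpolation between direction pairs rather than this nearest-neighbor shortcut.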
|
@inproceedings{filip16capturing,
title = {Capturing Material Visualization Data Using Goniometers},
author = {Filip, Jiri and Vavra, Radomir and Haindl, Michal},
booktitle = {Proceedings of the 4th CIE Expert Symposium
on Colour and Visual Appearance},
volume = {CIE x043:2016},
venue = {Prague},
year = {2016},
publisher = {CIE},
pages = {121--127}
}
|
|
Filip J., Vavra R.:
Using Reflectors for Analysis and Acquisition of Anisotropic BRDF.
Proceedings of the 4th CIE Expert Symposium on Colour and Visual Appearance,
CIE x043:2016, pp.155-162, 2016
[bib]
|
BRDFs are commonly used for material appearance representation in applications ranging from gaming and the movie industry to product design and specification. Most applications rely on isotropic BRDFs due to their better availability, a result of their easier acquisition process. Anisotropic BRDFs, on the other hand, are more challenging to measure and process due to their structure-dependent anisotropic highlights. This paper describes approaches to BRDF analysis and acquisition using ellipsoidal and parabolic reflectors commonly used in the illumination industry. First, we show how reflectors can be used for the detection of anisotropic properties such as anisotropy axes and highlight width. Second, we outline an approach to anisotropic BRDF capture using a reflector that allows fast and convenient measurement without the need to extract the measured sample.
|
@inproceedings{filip16using,
title = {Using Reflectors for Analysis and Acquisition of Anisotropic BRDF},
author = {Filip, Jiri and Vavra, Radomir},
booktitle = {Proceedings of the 4th CIE Expert Symposium
on Colour and Visual Appearance},
volume = {CIE x043:2016},
venue = {Prague},
year = {2016},
publisher = {CIE},
pages = {155--162}
}
|
|
Filip J., Havran V., Myszkowski K.:
Gaze Analysis of BRDF Distortions.
4th Eurographics Workshop on Material Appearance Modelling (MAM), June 2016
[bib]
|
BRDFs are currently used as a standard representation of material surface reflectance properties, either in the form of tabulated measurements or a parametric model. However, the compression of tabulated representations, as well as their fitting to such a parametric model, typically introduces some visual degradations. We describe our analysis of human gaze behavior in two types of standard visual experiments, where the common task is to compare a pair of BRDFs. The analysis was carried out across six different isotropic/anisotropic materials and three application-relevant BRDF degradation models.
|
@inproceedings{filip16gaze,
title = {{Gaze Analysis of {BRDF} Distortions}},
author = {Filip, Jiri and Havran, Vlastimil and Myszkowski, Karol},
booktitle = {Proceedings of 4th Eurographics Workshop on Material
Appearance Modelling (MAM)},
venue = {Dublin},
year = {2016},
publisher = {The Eurographics Association}
}
|
|
Vavra R., Filip J.:
BRDF Interpolation using Anisotropic Stencils,
In Proceedings of IS&T Electronic Imaging (Measuring, Modeling, and Reproducing Material Appearance 2016), pp. 1-6,
2016
[bib]
[pdf]
|
Fast and reliable measurement of material appearance is crucial for many applications ranging from virtual prototyping to visual quality control. The most common appearance representation is the BRDF, capturing illumination- and viewing-dependent reflectance. One approach to rapid BRDF measurement captures its subspace, using so-called slices, by continuous movements of a light and a camera in azimuthal directions while their elevations are fixed. This records a set of slices in the BRDF space while the remaining data are unknown. We present a novel approach to BRDF reconstruction based on the concept of anisotropic stencils, which interpolate values along the predicted locations of anisotropic highlights. Our method marks an improvement over the original linear interpolation method, and we therefore conclude that it is a promising variant of interpolation from such sparse yet very effective measurements.
|
@inproceedings{vavra16brdf,
title = {{BRDF} Interpolation using Anisotropic Stencils},
author = {Vavra, R. and Filip, J.},
booktitle = {Proceedings of IS&T Electronic Imaging (Measuring, Modeling, and Reproducing Material Appearance 2016)},
venue = {San Francisco},
year = {2016},
month = {February},
DOI = {10.2352/ISSN.2470-1173.2016.9.MMRMA-356},
pages = {1-6}}
|
|
Filip J.:
Analyzing and Predicting Anisotropic Effects of BRDFs,
In proceedings of ACM SIGGRAPH Symposium on Applied Perception 2015, pp. 22-32
[bib]
[pdf] [supplementary material]. |
The majority of materials we encounter in the real world have variable reflectance when rotated around a surface normal. This view- and illumination-dependent, azimuthally variable behavior is known as visual anisotropy. Such behavior can be represented by a four-dimensional anisotropic BRDF that characterizes the anisotropic appearance of homogeneous materials. Unfortunately, most past research has been devoted to simplistic three-dimensional isotropic BRDFs.
In this paper, we analyze and categorize basic types of BRDF anisotropy and use a psychophysical study to assess under which conditions an isotropic appearance can be used without loss of detail in material appearance. To this end, we tested the human impression of material anisotropy on various shapes and under two illuminations. We conclude that subjects' sensitivity to anisotropy declines with increasing complexity of 3D geometry and increasing uniformity of the illumination environment. Finally, we derive and perceptually validate a computationally efficient measure of material visual anisotropy.
|
@inproceedings{filip15analyzing,
title = {Analyzing and Predicting Anisotropic Effects of {BRDF}s},
author = {Filip, J.},
booktitle = {Proceedings of the ACM SIGGRAPH Symposium on Applied
Perception (SAP)},
year = {2015},
month = {September},
pages = {22--32}}
|
|
Filip J., Somol P.:
Materials Classification using Sparse Gray-Scale Bidirectional Reflectance Measurements,
In proceedings of 16th International
Conference on Computer Analysis of Images and Patterns (CAIP), LNCS 9257, Part II, pp. 289-299, September 2015
[bib]
[pdf] [poster]. |
Material recognition applications typically use texture- or color-based features; however, such measurements are unavailable or too expensive in many application fields. Therefore, bidirectional reflectance measurements are used, i.e., measurements dependent on both illumination and viewing directions. But even the measurement of such BRDF data is very time- and resource-demanding.
In this paper, we use a dependency-aware feature selection method to identify a very sparse set of the most discriminative bidirectional reflectance samples that can reliably distinguish between three types of materials from a BRDF database -- fabric, wood, and leather.
We conclude that ten gray-scale samples, primarily at high illumination and viewing elevations, are sufficient to identify the type of material with an accuracy of over 96%. We analyze the estimated placement of the bidirectional samples for discrimination between different types of materials. The stability of these directional samples is very high, as was verified by an additional leave-one-out classification experiment.
We consider this work a step towards an automatic method of material classification based on only a few reflectance measurements.
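The idea of picking a few discriminative reflectance samples can be sketched with a much simpler criterion than the dependency-aware method used in the paper: rank each bidirectional sample by a Fisher score and keep the top ones. The synthetic data and the Fisher-score stand-in below are assumptions for illustration only.

```python
import numpy as np

def fisher_scores(X, y):
    """Rank reflectance samples (columns of X) by a simple Fisher score:
    between-class variance over within-class variance. A generic stand-in
    for the dependency-aware selection used in the paper."""
    mu = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2
        den += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return num / (den + 1e-12)

# Synthetic data: 3 material classes, 20 bidirectional samples per
# measurement; only samples 0 and 7 carry class information.
rng = np.random.default_rng(1)
y = np.repeat([0, 1, 2], 30)
X = rng.normal(size=(90, 20))
X[:, 0] += y * 3.0
X[:, 7] -= y * 3.0
top2 = np.argsort(fisher_scores(X, y))[::-1][:2]
```

A classifier restricted to the selected columns then plays the role of the ten-sample material identifier described above.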
|
@inproceedings{filip15materials,
title = {Materials Classification using Sparse Gray-Scale
Bidirectional Reflectance Measurements},
author = {Filip, J. and Somol, P.},
booktitle = {Computer Analysis of Images and Patterns, LNCS 9257},
venue = {Valletta, Malta},
year = {2015},
month = {September},
conference = {16th International Conference on Computer
Analysis of Images and Patterns, CAIP},
pages = {289--299}}
|
|
Filip J., Havlicek M.:
Adaptive Highlights Stencils for Modeling of Multi-Axial BRDF Anisotropy,
In proceedings of Computer Graphics International (CGI'15), June 2015
[bib]
[pdf]. |
The directionally dependent, anisotropic material appearance phenomenon is widely represented using the bidirectional reflectance distribution function (BRDF).
In practice, this function requires either the reconstruction of unknown values between sparse measured samples or fidelity-preserving compression that forms a compact representation of dense measurements.
Both properties can be, to a certain extent, preserved by means of analytical BRDF models. Unfortunately, the number of anisotropic BRDF models is limited, and moreover, most of them require either a demanding iterative optimization procedure dependent on proper initialization or user-set parameters. Most of these approaches are challenged by the fitting of complex anisotropic BRDFs. In contrast, we approximate anisotropic BRDF behavior by means of highlight stencils and derive a novel BRDF model that adapts such stencils to each type of anisotropy present in the BRDF independently. Our model allows for the fast, direct fitting of parameters without the need for any demanding optimization. Furthermore, it achieves an encouraging visual quality in comparison to rival solutions that rely on a similar number of parameters, and it allows GPU implementation. We believe that our method represents a promising approach to the analysis and modeling of complex anisotropic BRDF behavior.
|
@inproceedings{filip15adaptive,
title = {Adaptive Highlights Stencils for Modeling of
Multi-Axial {BRDF} Anisotropy},
author = {Filip, J. and Havlicek, M.},
booktitle = {Computer Graphics International},
venue = {Strasbourg, France},
year = {2015},
month = {June}}
|
|
Filip J., Vavra R.:
Anisotropic Materials Appearance Analysis Using Ellipsoidal Mirror.
In proceedings of IS&T/SPIE Conference on Measuring, Modeling, and Reproducing Material Appearance, paper 9398-25, February 2015
[bib]
[pdf]. |
Many real-world materials exhibit significant changes in appearance when rotated around a surface normal. The presence of this behavior is often referred to as visual anisotropy. The anisotropic appearance of spatially homogeneous materials is commonly characterized by a four-dimensional BRDF. Unfortunately, for simplicity, most past research has been devoted to three-dimensional isotropic BRDFs. In this paper, we introduce an innovative, fast, and inexpensive image-based approach to detecting the extent of anisotropy, its main axes, and the width of the corresponding anisotropic highlights. The method uses only an off-the-shelf ellipsoidal reflector and a compact camera and does not rely on any moving parts. We compare the obtained results with a material microgeometry scan and show how the results correspond to, e.g., the microstructure of individual threads in fabric materials. Precise knowledge of a material's behavior can be used for designing a material-dependent sampling pattern so the material can be measured much more accurately in the same amount of time using a regular gonioreflectometer.
|
@inproceedings{filip15anisotropic,
title = {Anisotropic materials appearance analysis using ellipsoidal mirror},
author = {Filip, J. and V{\'a}vra, R.},
booktitle = {IS&T/SPIE Conference on Measuring, Modeling,
and Reproducing Material Appearance, paper 9398-25},
venue = {San Francisco, CA},
year = {2015},
month = {February}}
|
|
Havlicek M., Haindl M., Filip J.:
Dynamic Textures Modeling Using Temporal Mixing Coefficients Reduction.
In proceedings of MUSCLE International Workshop on Computational Intelligence for Multimedia Understanding, November 2014
[bib]. |
Real-world materials often change their appearance over time. If these variations are spatially and temporally homogeneous, then the material's visual appearance can be represented by a dynamic texture, which is a natural extension of the classic texture concept, including time as an extra dimension. In this article, we present a possible way to handle multispectral dynamic textures based on a combination of eigen-analysis of the input data and subsequent processing of the temporal mixing coefficients. The proposed method exhibits good overall performance, offers extremely fast synthesis that is not restricted in the temporal dimension, and simultaneously enables significant compression of the original measured visual data.
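The eigen-analysis pipeline above can be sketched as: PCA of the frames yields spatial eigen-images and temporal mixing coefficients, and a one-step linear predictor on the reduced coefficients then synthesizes arbitrarily long sequences. The random stand-in frames, kept-component count, and the simple predictor are assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(2)
T, P, k = 50, 100, 4                 # frames, pixels per frame, kept components

frames = rng.normal(size=(T, P))     # stand-in for measured texture frames
mean = frames.mean(axis=0)
U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
coeff = U[:, :k] * S[:k]             # temporal mixing coefficients (T, k)
basis = Vt[:k]                       # spatial eigen-images (k, P)

# One-step linear predictor on the reduced coefficients: z_{t+1} ~ z_t A.
A, *_ = np.linalg.lstsq(coeff[:-1], coeff[1:], rcond=None)

def synthesize(n_frames, z0):
    """Roll the predictor forward; synthesis length is unrestricted."""
    out, z = [], z0
    for _ in range(n_frames):
        z = z @ A
        out.append(mean + z @ basis)
    return np.stack(out)

video = synthesize(200, coeff[-1])
```

Storing only `mean`, `basis`, `A`, and a seed coefficient is what yields the compression of the original measured frames.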
|
@inproceedings{havlicek14dynamic,
title = {Dynamic Textures Modeling Using Temporal Mixing Coefficients Reduction},
author = {Havlicek, M. and Haindl, M. and Filip, J.},
booktitle = {Proceedings of the MUSCLE International Workshop on
Computational Intelligence for Multimedia Understanding},
venue = {Paris, France},
year = {2014},
month = {November}}
|
|
Filip J., Vavra R., Havlicek M.:
Effective Acquisition of Dense Anisotropic BRDF.
In proceedings of the 22nd International Conference on Pattern Recognition, ICPR 2014, pp. 2047--2052
[bib]
[poster]. |
The development of novel analytical BRDF models, as well as of adaptive BRDF sampling approaches, relies on appropriate BRDF measurements of real materials. The quality of measurements is even more critical when it comes to accurately representing anisotropic materials, where the character of the anisotropy is unknown (the locations of anisotropic highlights, their width, shape, etc.).
As there is currently a lack of dense yet noise-free anisotropic BRDF datasets, we introduce such unique measurements of three anisotropic fabric materials. In this paper, we discuss a method of dense BRDF data acquisition, post-processing, and missing-value interpolation, and we analyze the properties of the datasets. Our results are compared with photographs, with dense data fitted and generated by two state-of-the-art anisotropic BRDF models, and with alternative available measurements.
|
@InProceedings{filip14effective,
author = {Filip, J. and Vavra, R. and Havlicek, M.},
title = {Effective Acquisition of Dense Anisotropic {BRDF}},
booktitle = {Proceedings of the 22nd International Conference on
Pattern Recognition, ICPR 2014},
location = {Stockholm, Sweden},
month = {August},
country = {Sweden},
year = {2014},
pages = {2047--2052}}
|
|
Filip J., Vavra R., Haindl M., Havran V., Zid P., Krupicka M.:
BRDF Slices: Accurate Adaptive Anisotropic Appearance Acquisition.
In proceedings of the 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2013, pp.4321--4326
[bib]
[poster]. |
In this paper, we introduce unique, publicly available dense anisotropic BRDF measurements. We use this dense data as a reference for the performance evaluation of the proposed sparse angular BRDF sampling and interpolation approach. The method is based on sampling BRDF subspaces at fixed elevations by means of several adaptively represented, uniformly distributed, perpendicular slices. Although the proposed method requires only sparse sampling of a material, the interpolation provides a very accurate reconstruction, visually and computationally comparable to the densely measured reference. Due to the simple slice measurement and the method's robustness, it allows for highly accurate acquisition of BRDFs that, in comparison with standard uniform angular sampling, is considerably faster yet uses far fewer samples.
|
@InProceedings{filip13brdf,
author = {Filip, J. and Vavra, R. and Haindl, M. and Zid, P.
and Krupicka, M. and Havran, V.},
title = {{BRDF} Slices: Accurate Adaptive Anisotropic Appearance Acquisition},
booktitle = {Proceedings of the 26th IEEE Conference on Computer
Vision and Pattern Recognition, CVPR 2013},
location = {Portland, OR},
month = {June},
country = {USA},
year = {2013},
pages = {4321--4326},
url = {http://btf.utia.cas.cz/?brdf_dwn}}
|
|
Filip J., Grim J., Haindl M.:
A Probabilistic Approach to Rough Texture Compression and Rendering.
In MUSCLE International Workshop on Computational Intelligence for Multimedia Understanding, October 2013, pp. 8-12
[bib], [slides]. |
Rough textures describe the general visual appearance of real-world materials with regard to view and illumination directions. As the massive size and dimensionality of such representations are a main limitation of their broader use, efficient parameterization and compression methods are needed.
Our method is based on estimating the joint probability density of the nine included spatial, directional, and spectral variables in the form of a Gaussian mixture of product components. Our reflectance prediction formula can be expressed analytically as a simple continuous function of the input variables and allows fast analytic evaluation for arbitrary spatial and directional values without the need for lengthy interpolation from a finite grid of angular measurements.
The method achieves a high compression ratio that increases linearly with texture spatial resolution.
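The analytic prediction can be sketched as a conditional mean under a Gaussian mixture with product (diagonal) components, where the last mixture dimension plays the role of the reflectance value. The toy one-dimensional input and all parameters below are invented for illustration; the actual model works over nine variables.

```python
import numpy as np

def predict_reflectance(x, weights, means, sigmas):
    """Conditional-mean reflectance prediction from a Gaussian mixture
    with product (diagonal) components. The joint density covers the
    input variables plus one reflectance variable (last dimension).
    x: inputs (d,); weights: (m,); means, sigmas: (m, d+1)."""
    d = len(x)
    # Per-component likelihood of the inputs: product of 1D Gaussians.
    resp = weights * np.prod(
        np.exp(-0.5 * ((x - means[:, :d]) / sigmas[:, :d]) ** 2)
        / (sigmas[:, :d] * np.sqrt(2 * np.pi)), axis=1)
    resp /= resp.sum()
    # Conditional mean: responsibility-weighted reflectance means.
    return resp @ means[:, d]

# Two components: a bright lobe near x=0 and a dark lobe near x=1.
w = np.array([0.5, 0.5])
mu = np.array([[0.0, 0.9],      # (input mean, reflectance mean)
               [1.0, 0.1]])
sg = np.full((2, 2), 0.2)
```

Because the prediction is a closed-form function of the inputs, it can be evaluated at arbitrary spatial and directional coordinates, which is the continuity property stressed above.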
|
@inproceedings{filip13probabilistic,
title = {A Probabilistic Approach to Rough Texture Compression
and Rendering},
author = {Filip, J. and Grim, J. and Haindl, M.},
booktitle = {Proceedings of the MUSCLE International Workshop on
Computational Intelligence for Multimedia Understanding},
venue = {Antalya, Turkey},
year = {2013},
month = {October},
pages = {8--12}}
|
|
Somol P., Grim J., Filip J., Pudil P.:
On Stopping Rules in Dependency-Aware Feature Ranking.
In proceedings of the 18th Iberoamerican Congress on Pattern Recognition, CIARP 2013, November 2013, pp. 286-293
[bib] [poster]. |
Feature selection in very-high-dimensional or small-sample problems is particularly prone to computational and robustness complications. It is common to resort to feature ranking approaches only, or to randomization techniques.
A recent novel take on the randomization idea, in the form of Dependency-Aware Feature Ranking (DAF), has shown great potential in tackling these problems well.
Its original definition, however, leaves several technical questions open. In this paper, we address one of them: how to define stopping rules for the randomized computation that stands at the core of the DAF method. We define stopping rules that are easier to interpret and show that the number of randomly generated probes does not need to be extensive.
|
@InProceedings{somol13stopping,
author = "Somol, P. and Grim, J. and Filip, J. and Pudil, P.",
title = "On Stopping Rules in Dependency-Aware Feature Ranking",
booktitle = "Proceedings of the 18th Iberoamerican Congress
on Pattern Recognition, CIARP 2013",
editor = {Jos\'{e} Ruiz-Shulcloper and Gabriella Sanniti di Baja},
publisher = {Springer},
series = {Lecture Notes in Computer Science},
volume = {8258},
location = "Havana",
month = "November",
country = {Cuba},
year = "2013",
pages = {286--293},
url = {http://link.springer.com/chapter/10.1007%2F978-3-642-41822-8_36}
}
|
|
Filip J.:
Restoring Illumination and View Dependent Data from Sparse Samples.
In proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012), pp. 1391--1394
[bib] [poster]. |
Capturing a material's appearance with respect to illumination and viewing directions is crucial for achieving a realistic visual experience of digitized materials. The capturing process is time-demanding or requires a specific shape of the captured material. Therefore, we propose a method for reconstructing such data from very sparse measurements, whose placement allows continuous and fast acquisition, from which future acquisition setups can benefit. The model was tested on a number of BRDF samples and showed promising performance in terms of whole data-space reconstruction speed and visual quality. Despite using a very sparse dataset, the method proved useful also for the reconstruction of bidirectional texture functions.
|
@InProceedings{filip12restoring,
author = "Filip, J.",
title = "Restoring Illumination and View Dependent Data from Sparse Samples",
booktitle = "Proceedings of the 21st International Conference
on Pattern Recognition, ICPR 2012",
location = "Tsukuba, Japan",
month = "November",
country = {Japan},
year = "2012",
pages = {1391--1394}}
|
|
Filip J., Haindl M., Stancik J.:
Predicting Environment Illumination Effects on Material Appearance.
In proceedings of the 21st International Conference on Pattern Recognition (ICPR 2012), pp. 2075--2078
[bib] (oral presentation). |
Environment illumination is key to achieving a realistic visualization of material appearance. One way to achieve such illumination is to approximate it by rendering the material surface lit by a finite set of point light sources.
In this paper we employed visual psychophysics to identify the minimal number of point light sources approximating realistic illumination. Furthermore, we analyzed the stimuli images and the correlation of their statistics with the obtained psychophysical data. Finally, image statistics were identified which can predict such a minimal environment representation for three tested materials, depending on the visual properties of the illumination environment.
|
@InProceedings{filip12predicting,
author = "Filip, J. and Haindl, M. and Stancik, J.",
title = "Predicting Environment Illumination Effects on Material
Appearance",
booktitle = "Proceedings of the 21st International Conference
on Pattern Recognition, ICPR 2012",
location = "Tsukuba, Japan",
month = "November",
country = {Japan},
year = "2012",
pages = {2075--2078}}
|
|
Vavra R., Filip J.:
Registration of Multi-View Images of Planar Surfaces.
In proceedings of the 11th Asian Conference on Computer Vision (ACCV 2012),
[bib] [poster]. |
This paper presents a novel image-based registration method for high-resolution multi-view images of a planar material surface.
Contrary to standard registration approaches, this method aligns images based on the true plane of the material's surface rather than a plane defined by registration marks. It combines camera calibration with iterative fitting of the desired position and slant of the surface plane, image re-registration, and evaluation of the surface alignment. To optimize image compression performance, we use the error of a compression method as the function evaluating registration quality. The proposed method shows encouraging results on example visualizations of view- and illumination-dependent textures.
Compared to a standard multi-view data registration approach, it provides better alignment of multi-view images and thus allows more detailed visualization using the same compressed parameterization size.
|
@InProceedings{vavra12registration,
author = "V{\'a}vra, R. and Filip, J.",
title = "Registration of Multi-View Images of Planar Surfaces",
booktitle = "Proceedings of the 11th Asian Conference on Computer
Vision, ACCV",
location = "Daejeon, Korea",
month = "November",
country = {Korea},
year = "2012"}
|
|
Filip J., Haindl M.:
User Study of Viewing and Illumination Dependent Material Appearance. In: Predicting Perceptions: Proceedings of the 3rd International Conference on Appearance. Lulu Press, Edinburgh UK, pp. 34-38. ISBN 978-1-4716-6869-2, April, 2012
[bib] [poster]. |
Our research focuses on the way people view real materials with respect to their orientation as well as the illumination direction. We performed a user study with fifteen naive subjects using novel interactive stimuli in which subjects could arbitrarily change the orientation of a planar surface and of the directional illumination. Seven real materials were represented by means of illumination- and view-dependent textures. The study comprised two experiments, free-view and task-oriented; user behavior across the different samples, together with answers to a questionnaire, was recorded and analysed.
|
@inproceedings{filip12user,
title = {User Study of Viewing and Illumination Dependent Material
Appearance},
author = {Filip, J. and Haindl, M.},
booktitle = {Proceedings of Predicting Perceptions 2012 --
the 3rd International Conference on Appearance},
venue = {Edinburgh, UK},
year = {2012},
month = {April},
pages = {34--38},
url = {http://opendepot.org/1048/1/User_Study_of_Viewing_and_
Illumination_Dependent_Material_Appearance.pdf} }
|
|
Filip J., Vacha P., Haindl M.:
Analysis of Human Gaze Interactions with Texture and Shape. MUSCLE International Workshop on Computational Intelligence for Multimedia Understanding, Springer LNCS 7252, December, 2011
[bib] (oral presentation). |
Understanding human perception of textured materials is one of the most difficult tasks of computer vision. In this paper we designed a strictly controlled psychophysical experiment with stimuli featuring different combinations of shape, illumination direction and surface texture. The appearance of five tested materials was represented by measured view- and illumination-dependent Bidirectional Texture Functions. Twelve subjects participated in a visual search task: to find which of four identical three-dimensional objects had its texture modified. We investigated the effect of shape and texture on the subjects' attention. We are not looking at low-level salience, as the task is to make a high-level quality judgment. Our results revealed several interesting aspects of human perception of different textured materials and surface shapes.
|
@inproceedings{filip12analysis,
title = {Analysis of Human Gaze Interactions with Texture and Shape},
author = {Filip, J. and Vacha, P. and Haindl, M.},
booktitle = {Proceedings of the MUSCLE International Workshop
on Computational Intelligence for Multimedia Understanding,
Springer LNCS 7252},
venue = {Pisa, Italy},
year = {2011},
month = {December},
pages = {160--172}}
|
|
Haindl M., Filip J.:
Advanced Textural Representation of Materials Appearance. SIGGRAPH Asia 2011 Courses, Hong Kong, China, December 2011, 85 pages.
[bib] |
A multidimensional visual texture is the appropriate paradigm for the physically correct representation of material visual properties. The course presents recent advances in texture modelling methodology applied in computer vision, pattern recognition, computer graphics, and virtual/augmented reality applications. Contrary to previous courses on material appearance (e.g. [2]), we focus on materials whose nature allows the exploitation of texture modelling approaches. This course builds on our recent tutorial held at CVPR 2010 [1].
The topic is introduced in the wider and complete context of pattern recognition and image processing. It comprehends modelling of multi-spectral images and videos, which can be accomplished either by multi-dimensional mathematical models or by sophisticated sampling methods from the original measurements. The key aspects of the topic, i.e., different multi-dimensional data models with their corresponding benefits and drawbacks, optimal model selection, parameter estimation and model synthesis techniques, are discussed. These methods produce compact parametric sets that allow not only faithful reproduction of material appearance but are also vital for visual scene analysis, e.g., texture segmentation, classification, retrieval, etc.
Special attention is devoted to the most advanced recent trend towards Bidirectional Texture Function (BTF) modelling [2], used for materials that do not obey the Lambertian law and whose reflectance has a non-trivial illumination and viewing direction dependency. BTFs currently represent the best known effectively applicable textural representation of most real-world materials' visual properties. The techniques covered include efficient Markov random field-based algorithms [3], intelligent sampling algorithms, spatially-varying reflectance models, and the challenges of their possible implementation on GPUs. The introduced approaches are categorized and compared in terms of visual quality, analysis and synthesis speed, texture compression rate, and their suitability for GPU implementation.
The course also deals with proper data measurement, visualization of texture models in virtual scenes, visual quality evaluation feedback [4], as well as a description of key industrial and research applications. We discuss which type of material representation is appropriate for a required application, what its limits and possible modelling options are, and what the biggest challenges in realistic modelling of materials are.
This introductory course provides a useful overview for the steadily growing number of researchers, lecturers, industry practitioners, and students interested in this new and progressive computer graphics area.
|
@inproceedings{haindl11advanced,
title = {Advanced textural representation of materials appearance},
author = {Haindl, M. and Filip, J.},
booktitle = {SA '11: SIGGRAPH Asia 2011 Courses},
editor = {Sander, P.},
venue = {Hong Kong, China},
year = {2011},
month = {December},
pages = {35}}
|
|
Filip J., Haindl M., Chantler M.J.:
Gaze-Motivated Compression of Illumination and View Dependent Textures. In proceedings of the 20th International Conference on Pattern Recognition (ICPR), IEEE, Los Alamitos 2010, pp.862-864
[bib] (oral presentation). |
Illumination- and view-dependent textures provide ample information on the appearance of real materials at the cost of enormous data storage requirements. Hence, past research has focused mainly on compression and modelling of these data; however, few papers have explicitly addressed the way in which humans perceive the compressed data.
We analyzed human gaze information to determine appropriate texture statistics. These statistics were then exploited in a pilot illumination- and view-direction-dependent data compression algorithm. Our results showed that taking local texture variance into account can increase the compression of current methods more than twofold, while preserving the original realistic appearance and allowing fast data reconstruction.
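As a hedged illustration of the variance idea (not the authors' algorithm), a toy allocator might distribute a fixed sample or bit budget across texture blocks in proportion to their local variance, so flat regions receive fewer view/illumination samples than busy ones:

```python
import numpy as np

def variance_budget(texture, block=8, total_samples=1000):
    """Split a greyscale texture into block x block tiles and
    distribute a sample (or bit) budget proportionally to each
    tile's variance; every tile keeps at least one sample."""
    H, W = texture.shape
    h, w = H // block, W // block
    tiles = texture[:h * block, :w * block].reshape(h, block, w, block)
    var = tiles.var(axis=(1, 3))
    weights = var / var.sum()
    return np.maximum(1, np.round(weights * total_samples)).astype(int)
```

A real codec would feed these per-block budgets into its quantizer or sample-selection stage; the function name and allocation rule are illustrative assumptions.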
|
@inproceedings{filip10gaze,
title = {Gaze-Motivated Compression of Illumination and View
Dependent Textures},
author = {Filip, J. and Haindl, M. and Chantler, M.J.},
booktitle = {Proceedings of the 20th International Conference
on Pattern Recognition (ICPR)},
venue = {Istanbul, Turkey},
year = {2010},
month = {August},
conference = {ICPR 2010},
pages = {862--864}}
|
|
Filip J., Vacha P., Haindl M., Green P.R.:
A Psychophysical Evaluation of Texture Degradation Descriptors. Proceedings of IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition (SSPR & SPR 2010), LNCS 6218, pp. 423-433, Cesme, Izmir, Turkey, August 18-20, 2010.
[bib], [poster]. |
Delivering a realistic appearance of materials digitally is one of the most difficult tasks of computer vision. An accurate representation of surface texture can be obtained by means of view- and illumination-dependent textures. However, this kind of appearance representation produces massive datasets, so their compression is inevitable. For optimal visual performance of compression methods, their parameters should be tuned to a specific material.
We propose a set of statistical descriptors motivated by textural features, and psychophysically evaluate their performance on three subtle artificial degradations of texture appearance. We tested five types of descriptors on five different textures and a combination of thirteen surface shapes and two illuminations. We found that descriptors based on a two-dimensional causal auto-regressive model have the highest correlation with the psychophysical results, and so can be used for automatic detection of subtle changes in rendered textured surfaces in accordance with human vision.
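A minimal sketch of a 2D causal auto-regressive descriptor of the kind evaluated here, assuming a simple least-squares fit over a four-pixel causal neighbourhood (the paper's actual descriptors and estimation procedure differ):

```python
import numpy as np

def car_features(img, neigh=((0, -1), (-1, 0), (-1, -1), (-1, 1))):
    """Fit a 2D causal auto-regressive (CAR) model by least squares
    and return its parameters plus the prediction-error variance
    as a compact texture descriptor."""
    H, W = img.shape
    rows, cols = np.mgrid[1:H, 1:W - 1]        # interior pixels
    y = img[rows, cols].ravel()
    X = np.stack([img[rows + dr, cols + dc].ravel() for dr, dc in neigh],
                 axis=1)
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ theta
    return np.concatenate([theta, [resid.var()]])
```

Comparing such feature vectors between an original and a degraded rendering gives a scalar distance that can be correlated with psychophysical responses.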
|
@inproceedings{filip10psychophysical,
title = {A Psychophysical Evaluation of Texture Degradation Descriptors},
author = {Filip, J. and Vacha, P. and Haindl, M. and Green, P.R.},
booktitle = {Proceedings of IAPR International Workshop on Structural,
Syntactic, and Statistical Pattern Recognition (SSPR & SPR 2010),
LNCS 6218},
venue = {Cesme, Izmir, Turkey},
year = {2010},
month = {August},
conference = {SSPR & SPR 2010},
pages = {423--433}}
|
|
Filip J., Haindl M.:
Fast and Reliable PCA-Based Temporal Segmentation of Video Sequences. In: Proceedings of the 19th International Conference on Pattern Recognition (ICPR), IEEE, Los
Alamitos 2008, pp.1-4
[bib] (oral presentation). |
With the significantly increasing number of archived movie sequences, the need for their automatic indexing and annotation is rising. Robust and fast temporal segmentation of video sequences is one of the challenging research topics in this area. In this paper we propose a new temporal segmentation method for video sequences based on a PCA approach.
Contrary to standard approaches based on histogram or motion-field analysis, the proposed method does not require any such complex analysis.
The method starts with sparse greyscale sampling and eigen-analysis of the input sequence. A sum of absolute derivatives of the temporal mixing coefficients of the main eigen-images is then used as a cut-detection feature, while dissolve transitions are detected by means of the coefficients' specific behaviour. The functionality of the method was successfully tested on a number of sequences ranging from an artificial set of similar dynamic textures to professional documentary movies.
Although the results may not be unexpected, we believe the proposed method provides a novel, very fast and reliable way of detecting movie cuts.
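The pipeline described above, eigen-analysis of sparsely sampled greyscale frames followed by thresholding the summed absolute derivatives of the temporal mixing coefficients, can be sketched as follows (the outlier threshold is an assumption, and the paper's dissolve detection is omitted):

```python
import numpy as np

def detect_cuts(frames, k=3, thresh=None):
    """frames: (T, P) array of sparsely sampled greyscale pixels per
    frame. Eigen-analysis via SVD; the summed absolute derivative of
    the first k temporal mixing coefficients is the cut-detection
    feature. Returns indices of frames that start a new shot."""
    X = frames - frames.mean(axis=0)
    U, S, _ = np.linalg.svd(X, full_matrices=False)
    coeffs = U[:, :k] * S[:k]                 # temporal mixing coefficients
    feature = np.abs(np.diff(coeffs, axis=0)).sum(axis=1)
    if thresh is None:                        # simple outlier rule (assumption)
        thresh = feature.mean() + 3 * feature.std()
    return np.where(feature > thresh)[0] + 1
```

On a sequence with a hard cut, the feature spikes at the transition frame while staying near zero elsewhere, so a global outlier threshold suffices.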
|
@inproceedings{filip08fast,
author = {Filip, J. and Haindl, M.},
title = {Fast and Reliable PCA-Based Temporal Segmentation
of Video Sequences},
booktitle = {19th International Conference on Pattern Recognition
(ICPR)},
year = {2008},
month = {December},
pages = {1--4}}
|
|
Filip J., Chantler M.J., Haindl M.:
On Optimal Resampling of View and Illumination Dependent Textures. Proceedings of 5th Symposium on Applied Perception in Graphics and Visualization (APGV) 2008, pp.131-134
[bib] [cover image] (oral presentation). |
The use of illumination- and view-dependent textural information is one way to capture the realistic appearance of genuine materials. One example of such data is the bidirectional texture function. The main disadvantage of these data, which makes their further application very difficult, is their massive size.
Perceptually-based methods can determine an optimal uniform resampling of these data that allows a considerable reduction of the number of view- and illumination-dependent samples. In this paper we propose to achieve this goal by means of a psychophysical study, comparing the rendering of the original data with the rendering of a uniformly resampled version over the hemisphere of illumination- and view-dependent textural measurements. The resampling was done separately for elevation and azimuthal angles as well as in illumination and view space. Our results show promising consequences for compression and modeling algorithms using this kind of massive data.
|
@inproceedings{filip08optimal,
author = {Filip, J. and Chantler, M.J. and Haindl, M.},
title = {On optimal resampling of view and illumination dependent textures},
booktitle = {Fifth Symposium on Applied Perception in Graphics
and Visualization},
year = {2008},
month = {August},
conference = {Symposium on Applied Perception in Graphics and
Visualization},
pages = {131--134},
url = {http://staff.utia.cas.cz/filip/papers/filip08optimal.pdf} }
|
|
Green P., Filip J., Clarke A., Chantler M.:
Effects of light environment on camouflage against textured surfaces. AVA Meeting at University of Cambridge (Animal Camouflage: Questions, Answers and New Directions) 2008 (oral presentation). |
The image of a camouflaged animal on a background surface provides the input to a predator’s visual system and is the starting point for neural processing that will determine whether the prey is detected or not. The spatial pattern of luminance in the image arises from variation in both the reflectance and the surface relief of the animal and its background. It is also a function of the illumination of the scene, and will vary with the azimuth and elevation of the light source, and the amount of ambient light present. The effects of these variables of illumination on the image of a camouflaged animal and its background will be important determinants of its visibility to a predator. Graphics techniques such as rendering of surface height maps and bidirectional texture functions provide methods for exploring these effects in either real or synthetic textured target and background surfaces. Some examples are described, and their implications for the effectiveness of different types of camouflage under varying natural illumination conditions are discussed.
|
|
Filip J., Haindl M.:
BTF Modelling Using BRDF Texels. In:
Advances in Machine Vision, Image Processing, and Pattern Analysis. (Zheng, N.,
Jiang, X., Lan, X. eds.). (Lecture Notes in Computer Science. 4153). Springer, Heidelberg 2006,
pp. 475-484.
[bib] (oral presentation). |
The highest fidelity representations of realistic real-world materials currently used comprise Bidirectional Texture Functions (BTF). The BTF is a six dimensional function depending on view and illumination directions as
well as on planar texture coordinates. The huge size of such measurements, typically in the form of thousands of images covering all possible combinations of illumination and viewing angles, has prohibited their practical exploitation and obviously some compression and modelling method of these enormous BTF data
spaces is inevitable. The proposed approach combines BTF spatial
clustering with cluster index modelling by means of an efficient
Markov random field model. This method allows generating a seamless cluster index of arbitrary size to cover the surfaces of large virtual 3D objects. The method represents the original BTF data using a set of local spatially dependent Bidirectional Reflectance Distribution Function (BRDF) values, which are combined according to the synthesised cluster index and the illumination and viewing directions. BTF data compression using this method is about 1:100, and synthesis is very fast.
|
@inproceedings{filip06BTF,
title = {BTF Modelling Using BRDF Texels},
author = {Filip, J. and Haindl, M.},
booktitle = {Proceedings of International Workshop on Intelligent
Computing in Pattern Analysis/Synthesis. (Lecture Notes in Computer
Science 4153)},
publisher = {Springer-Verlag},
address = {Berlin Heidelberg},
volume = {1},
pages = {475--484},
venue = {Xi'an, China},
month = {August},
year = {2006},
isbn = {978-3-540-37597-5},
book_pages = {475--484},
url = {http://staff.utia.cas.cz/filip/papers/filip06BTF.pdf} }
|
|
Filip J., Haindl M.,
Chetverikov D.: Fast Synthesis of Dynamic Colour
Textures. In: Proceedings of the 18th IAPR International Conference on
Pattern Recognition. (Tang Y., Wang P., Lorette G., Yeung D. eds.). IEEE, Los
Alamitos 2006, Volume 4, pp.25-28
[bib] (oral presentation, accept. rate 14.0%). |
The textural appearance of many real-world materials is not static but evolves in time. If such progress is spatially and temporally homogeneous, these materials can be represented by means of a \emph{dynamic texture} (DT). DT modelling is a challenging problem which can add new quality to computer graphics applications. We propose a novel hybrid method for colour DT modelling. The method is based on eigen-analysis of the DT images and subsequent preprocessing and modelling of the temporal interpolation eigen-coefficients using a causal auto-regressive model. The proposed method shows good performance for most of the tested DTs, depending mainly on the properties of the original sequence. Moreover, the method significantly compresses the original data and enables extremely fast synthesis of an artificial sequence, which can easily be performed by means of contemporary graphics hardware.
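A hedged sketch of the hybrid scheme: eigen-analysis of the frames, a causal auto-regressive model of the temporal coefficients (simplified here to first order), and synthesis of a new sequence. The function name, AR order and noise model are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def synthesize_dt(frames, k=4, t_out=60, seed=0):
    """Eigen-analysis of a dynamic-texture sequence, an AR(1) model
    of the temporal mixing coefficients, and synthesis of a new
    sequence of t_out frames. frames: (T, P) array."""
    rng = np.random.default_rng(seed)
    mean = frames.mean(axis=0)
    U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
    C = U[:, :k] * S[:k]                      # temporal coefficients (T, k)
    A, *_ = np.linalg.lstsq(C[:-1], C[1:], rcond=None)  # C[t] ~ C[t-1] @ A
    noise_std = (C[1:] - C[:-1] @ A).std(axis=0)
    out = np.empty((t_out, k))
    out[0] = C[0]
    for t in range(1, t_out):
        out[t] = out[t - 1] @ A + rng.normal(0.0, noise_std)
    return out @ Vt[:k] + mean                # back to pixel space
```

Storing only the mean frame, `k` eigen-images, `A` and `noise_std` is what yields the compression the abstract mentions, and synthesis is a cheap recurrence plus one matrix product per frame.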
|
@inproceedings{filip06fast,
title = {Fast Synthesis of Dynamic Colour Textures},
author = {Filip, J. and Haindl, M. and Chetverikov, D.},
booktitle = {Proceedings of 18th International Conference
on Pattern Recognition},
publisher = {IEEE Computer Society Press},
volume = {4},
pages = {25--28},
venue = {Hong Kong, China},
month = {August},
year = {2006},
isbn = {0-7695-2521-0},
book_pages = {25--28},
url = {http://staff.utia.cas.cz/filip/papers/filip06fast.pdf} }
|
|
Filip J., Haindl M.:
Efficient
Image-Based Bidirectional Texture Function Model. In: Texture
2005. Proceedings. (Chantler M., Drbohlav O. ed.). Heriot-Watt univ., Edinburgh 2005,
pp. 7-12.
[bib] (oral presentation). |
Recent advances in computer hardware and virtual modelling make it possible to respect the view and illumination dependencies of natural surface materials. The corresponding texture representation in the form of the Bidirectional Texture Function (BTF) enables significant improvements in the realism of virtual models at the expense of an immense increase in the material sample data space. Thus the introduction of a fast compression, modelling and rendering method for BTF data is inevitable.
In this paper we introduce a generalisation of our polynomial extension of the Lafortune model, computed for every original BTF measurement pixel, allowing seamless BTF texture enlargement. This nonlinear reflectance model is further extended using a parameter clustering technique to achieve a higher compression ratio.
The presented method offers BTF modelling in excellent visual quality, as was tested on a variety of BTF measurements. The method gives a BTF compression ratio of 1:200 as well as a fast graphics hardware implementation.
|
@InProceedings{filip05efficient,
title = {Efficient Image Based Bidirectional Texture Function Model},
author = {Filip, J. and Haindl, M.},
booktitle = {Texture 2005: Proceedings of the 4th International Workshop
on Texture Analysis and Synthesis},
editor = {Chantler, M. and Drbohlav, O.},
publisher = {Heriot-Watt University},
address = {Edinburgh},
month = {October},
isbn = {1-904410-13-8},
year = {2005},
pages = {7-12},
url = {http://staff.utia.cas.cz/filip/papers/filip05efficient.pdf} }
|
|
Haindl M., Filip J.:
A Fast Probabilistic Bidirectional Texture
Function Model. In: Image Analysis and Recognition. (Campilho A., Kamel
M. eds.). (Lecture Notes in Computer Science. 3212). Springer, Heidelberg 2004,
pp. 298-305.
[bib] (oral presentation). |
The bidirectional texture function (BTF) describes texture appearance variations due to varying illumination and viewing conditions. This function is acquired by a large number of measurements for all possible combinations of illumination and viewing positions, hence some compressed representation of these huge BTF texture data spaces is obviously inevitable. In this paper we present a novel efficient probabilistic model-based method for multispectral BTF texture compression which simultaneously allows its efficient modelling. This representation model is capable of seamless BTF space enlargement and direct implementation inside the graphics card processing unit. The analytical step of the algorithm starts with BTF texture surface estimation followed by the spatial factorization of an input multispectral texture image. Single band-limited factors are independently modelled by their dedicated 3D causal autoregressive (CAR) models. We estimate an optimal contextual neighbourhood and parameters for each CAR model. Finally, the synthesized multiresolution multispectral texture pyramid is collapsed into a fine-resolution synthetic smooth texture of the required size. The resulting BTF is combined in a displacement-map filter of the rendering hardware using both multispectral and range information. The presented model offers an immense BTF texture compression ratio which cannot be achieved by any sampling-based BTF texture synthesis method.
|
@InProceedings{haindl04fast,
title = {{A} fast probabilistic {B}idirectional {T}exture {F}unction model},
author = {Haindl, M. and Filip, J.},
booktitle = {Image Analysis and Recognition},
chapter= {2},
editor = {Campilho, A. and Kamel, M.},
publisher = {Springer},
address = {Heidelberg},
month = {September},
year = {2004},
pages = {298--305},
url = {http://staff.utia.cas.cz/filip/papers/haindl04fast.pdf} }
|
|
Filip J., Haindl M.:
Non-Linear Reflectance Model for
Bidirectional Texture Function Synthesis. In: Proceedings of the 17th
IAPR International Conference on Pattern Recognition. (Kittler J., Petrou M.,
Nixon M. eds.). IEEE, Los Alamitos 2004, pp. 80-83.
[bib] (oral presentation, accept. rate 18.0%). |
Rough texture modelling involves a huge image data-set: the Bidirectional Texture Function (BTF). This 6-dimensional function depends on planar texture coordinates as well as on the view and illumination angles. We propose a new non-linear reflectance model, based on an improvement of the Lafortune reflectance model, which restores all BTF database images independently for each view position and thereby significantly reduces the stored BTF data size. The extension consists in introducing several spectral parameters for each BTF image, which are linearly estimated in a second estimation step according to the original data. The model parameters are computed for every surface reflectance field contained in the original BTF data. This technique allows BTF data compression at a ratio of 1:15 while the synthesised images are almost indiscernible from the originals. The method is universal and easily implementable in graphics hardware for the purpose of real-time BTF rendering.
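The underlying Lafortune model evaluates, for illumination direction u and view direction v, a sum of weighted-dot-product lobes raised to specular exponents. A minimal evaluation sketch of that base model (the paper's per-pixel polynomial and spectral extensions are not reproduced here):

```python
import numpy as np

def lafortune_brdf(u, v, C, n):
    """Evaluate a K-lobe Lafortune model for unit illumination
    direction u and view direction v: sum over lobes of
    max(0, C_x u_x v_x + C_y u_y v_y + C_z u_z v_z) ** n.
    C: (K, 3) per-lobe axis weights, n: (K,) specular exponents."""
    dots = (np.asarray(C, float) * np.asarray(u, float)
            * np.asarray(v, float)).sum(axis=1)
    return float((np.clip(dots, 0.0, None) ** np.asarray(n, float)).sum())
```

Because the lobe is a small closed-form expression of u and v, per-pixel parameter sets map naturally onto fragment-shader evaluation, which is why the abstract highlights graphics-hardware implementability.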
|
@InProceedings{filip04nonlinear,
title = {{N}on-linear reflectance model for {B}idirectional {T}exture
{F}unction synthesis},
author = {Filip, J. and Haindl, M.},
booktitle = {Proceedings of the 17th IAPR International Conference
on Pattern Recognition},
chapter= {1},
editor = {Kittler, J. and Petrou, M. and Nixon, M.},
publisher = {IEEE},
address = {Los Alamitos},
month = {August},
year = {2004},
pages = {80--83},
url = {http://staff.utia.cas.cz/filip/papers/filip04nonlinear.pdf} }
|
|
Haindl M., Filip J.,
Arnold M.: BTF Image Space Utmost
Compression and Modelling Method. In: Proceedings of the 17th IAPR
International Conference on Pattern Recognition. (Kittler J., Petrou M., Nixon
M. eds.). IEEE, Los Alamitos 2004, pp. 194-197.
[bib] (poster, accept. rate 35.1%). |
The bidirectional texture function (BTF) describes rough texture
appearance variations due to varying illumination and viewing conditions. Such a
function consists of thousands of measurements (images) per sample. The resulting
BTF size excludes its direct rendering in graphical applications, and some
compression of these huge BTF data spaces is obviously inevitable. In
this paper we present a novel fast probabilistic model-based algorithm for
realistic BTF modelling allowing such an efficient
compression with possibility of direct implementation inside the graphics card.
The analytical step of the algorithm starts with the BTF space segmentation and range map
estimation of the BTF surface followed by the spectral and spatial factorisation
of selected sub-space multispectral texture images. Single monospectral
band-limited factors are independently modelled by their dedicated causal
autoregressive models (CAR). During rendering the corresponding sub-space images
of arbitrary size are synthesised and both multispectral and range information
is combined in a bump mapping filter of
the rendering hardware according to view and illumination positions.
The presented model offers a huge BTF compression ratio unattainable by any
alternative sampling-based BTF synthesis method. Simultaneously, this model can
be used to reconstruct missing parts of the BTF measurement space.
|
@InProceedings{haindl04BTF,
title = {{BTF} image space utmost compression and modelling method},
author = {Haindl, M. and Filip, J. and Arnold, M.},
booktitle = {Proceedings of the 17th IAPR International Conference
on Pattern Recognition},
chapter= {3},
editor = {Kittler, J. and Petrou, M. and Nixon, M.},
publisher = {IEEE},
address = {Los Alamitos},
month = {August},
year = {2004},
pages = {194--197},
url = {http://staff.utia.cas.cz/filip/papers/haindl04BTF.pdf} }
|
|
Haindl M., Filip J.:
Fast BTF Texture Modelling. In:
Texture 2003. Proceedings. (Chantler M. ed.). Heriot-Watt univ., Edinburgh 2003,
pp. 47-52.
[bib] (poster). |
This paper presents a novel fast model-based algorithm for realistic multispectral BTF texture modelling potentially capable of direct implementation inside the graphical card processing unit.
The algorithm starts with range map estimation of the BTF texture followed by the spectral and spatial factorisation of an input multispectral texture image. Single orthogonal monospectral band-limited factors are independently modelled by their dedicated Gaussian Markov random field models (GMRF). We estimate an optimal contextual neighbourhood and parameters for each GMRF. Finally, single synthesised band-limited factors are collapsed into the fine-resolution monospectral images and, using the inverse Karhunen-Loeve transformation, we obtain the smooth multispectral texture. Both multispectral and range information is combined in a bump-mapping or alternatively a displacement-mapping filter of the rendering hardware. The presented model offers a huge BTF texture compression ratio which cannot be achieved by any other sampling-based BTF texture synthesis method.
|
@InProceedings{haindl03fast,
title = {{F}ast {B}{T}{F} texture modelling},
author = {Haindl, M. and Filip, J.},
booktitle = {Texture 2003. Proceedings},
editor = {Chantler, M.},
publisher = {IEEE Press},
address = {Edinburgh},
month = {October},
year = {2003},
pages = {47--52},
url = {http://staff.utia.cas.cz/filip/papers/haindl03fast.pdf} }
|
|
Haindl M., Filip J.:
Fast Restoration of Colour Movie Scratches. In: Proceedings of the 16th International Conference on Pattern Recognition. (Kasturi R., Laurendeau D., Suen C. eds.). IEEE Computer Society, Los Alamitos 2002, pp. 269-272.
[bib] (oral presentation, accept. rate 20.2%). |
This paper presents a new type of scratch removal algorithm based on causal adaptive multidimensional, multitemporal prediction. The predictor uses information available in the neighbourhood of missing multispectral pixels, exploiting the spectral, temporal, and spatial correlation of video data, but no information from the failed pixels themselves.
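A minimal spatial-only sketch of this kind of causal adaptive prediction is shown below. The paper's predictor is multidimensional and multitemporal; this toy version works on a single monochrome frame with three causal neighbours, and all names and the test frame are hypothetical:

```python
import numpy as np

def fit_causal_predictor(frame, mask):
    """Least-squares fit of a causal predictor x[r,c] ~ w . (left, up, up-left),
    trained only on pixels whose whole causal neighbourhood is uncorrupted."""
    X, y = [], []
    h, w = frame.shape
    for r in range(1, h):
        for c in range(1, w):
            if mask[r, c] and mask[r, c - 1] and mask[r - 1, c] and mask[r - 1, c - 1]:
                X.append([frame[r, c - 1], frame[r - 1, c], frame[r - 1, c - 1]])
                y.append(frame[r, c])
    w_hat, *_ = np.linalg.lstsq(np.asarray(X), np.asarray(y), rcond=None)
    return w_hat

def restore(frame, mask):
    """Replace corrupted pixels (mask == False) with causal predictions,
    reusing already-restored neighbours as the scan proceeds."""
    w_hat = fit_causal_predictor(frame, mask)
    out = frame.copy()
    h, wd = frame.shape
    for r in range(1, h):
        for c in range(1, wd):
            if not mask[r, c]:
                out[r, c] = w_hat @ np.array([out[r, c - 1], out[r - 1, c], out[r - 1, c - 1]])
    return out

# Toy demo: a smooth gradient frame with a vertical "scratch" zeroed out.
rr, cc = np.mgrid[0:32, 0:32]
clean = rr + 0.5 * cc
corrupted = clean.copy()
mask = np.ones(clean.shape, dtype=bool)
corrupted[3:29, 10] = 0.0
mask[3:29, 10] = False
restored = restore(corrupted, mask)
```

On such a smooth frame the fitted predictor reconstructs the scratch exactly; real footage additionally needs the spectral and temporal terms and the adaptive parameter updates described in the paper.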
|
@InProceedings(haindl02fast,
title = {{F}ast restoration of colour movie scratches},
author = {Haindl, M. and Filip, J.},
booktitle = {Proceedings of the 16th International Conference on Pattern
Recognition},
chapter= {3},
editor = {Kasturi, R. and Laurendeau, D. and Suen, C.},
publisher = {IEEE Computer Society},
address = {Los Alamitos},
month = {August},
year = {2002},
pages = {269-272},
url = {http://staff.utia.cas.cz/filip/papers/haindl02fast.pdf} )
|
Dissertation Thesis |
|
Filip J.:
Colour Rough Textures
Modelling. Dissertation Thesis, FEL, CTU in Prague (2005), 133 pp.
[bib] [web] |
The constantly increasing computational power of graphics hardware finally enables
fast and realistic rendering of virtual reality models whose realisation was
until recently impossible. Such realistic models require, among other things, natural-looking textures covering the virtual objects of a rendered scene.
Applications of these advanced texture models in virtual reality systems now
allow photorealistic approximation of material appearance for
such complex tasks as visual safety simulations or interior design in
the automotive/aerospace industries and architecture.
For such advanced applications, smooth textures lit by reflectance models, possibly combined with bump mapping, cannot offer a correct and realistic reproduction of material appearance. This is due to the inherent complexity of many materials, whose rough structure produces visual effects such as self-shadowing, masking, interreflection, or subsurface scattering. One way to capture these material attributes is to use a much more complex representation of rough or 3D texture called the Bidirectional Texture Function (BTF). The BTF is a six-dimensional function
depending on view and illumination directions as well as on planar texture
coordinates. This function is acquired as several thousand images taken for
varying light and camera positions. However, the huge size of measured BTF data prevents its use in any fast application, so fast compression, modelling, and rendering methods for BTF data are indispensable.
In this thesis we review the BTF acquisition, modelling, and rendering methods published so far, survey problems concerning BTF mapping and rendering implementation and surface height approximation, and finally propose two novel BTF modelling approaches realised in several BTF modelling methods.
The first proposed approach introduces a probabilistic BTF model based on Markov
random field modelling of BTF subspaces. These subspaces are obtained using BTF
segmentation, and the regular material pattern is introduced into the model by
means of surface height simulation.
The second approach is based on a polynomial extension of the one-lobe Lafortune model
computed in every pixel of the original BTF measurements. The model is further extended by parameter clustering to achieve a higher compression ratio, and the BTF sample is enlarged by means of an image tiling technique.
The first method offers slightly compromised visual quality for some materials
but achieves a compression ratio unbeatable by any other BTF compression or
modelling method, while the second approach enables BTF modelling of excellent
quality with a moderate compression ratio. Both models enable fast hardware implementation and were tested on several distinct BTF materials; their
properties are discussed and the obtained results are compared with the original BTF measurements.
Although the BTF modelling methods show excellent performance, several
problems still limit their wide use in virtual reality systems. These
limitations are the subject of extensive further research in the
computer vision and computer graphics communities.
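For context, the second approach builds on the standard one-lobe Lafortune model, which evaluates a generalized cosine lobe per pixel. The sketch below shows that basic lobe only; the thesis's polynomial extension and parameter clustering are omitted, and the numbers are purely illustrative:

```python
import numpy as np

def lafortune_lobe(omega_i, omega_o, rho_d, C, n):
    """Standard one-lobe Lafortune reflectance:
    rho(w_i, w_o) = rho_d + (Cx*ix*ox + Cy*iy*oy + Cz*iz*oz)^n,
    with unit direction vectors in the local surface frame (z = normal)."""
    dot = (C[0] * omega_i[0] * omega_o[0]
           + C[1] * omega_i[1] * omega_o[1]
           + C[2] * omega_i[2] * omega_o[2])
    return rho_d + max(dot, 0.0) ** n

# Mirror-like lobe: C = (-1, -1, 1) peaks in the specular direction.
wi = np.array([0.0, 0.0, 1.0])   # light along the normal
wo = np.array([0.0, 0.0, 1.0])   # viewed along the normal
val = lafortune_lobe(wi, wo, rho_d=0.1, C=(-1.0, -1.0, 1.0), n=20.0)
```

In a per-pixel BTF fit, the parameters `rho_d`, `C`, and `n` would be estimated independently for every texel from the measured reflectance samples.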
|
@phdthesis{ filipj05colour,
author = {Filip, J.},
title = {Colour Rough Textures Modelling},
school = {Czech Technical University in Prague, Faculty
of Electrical Engineering, Department of Cybernetics,
Technicka 2, 166 27 Prague 6, Czech Republic},
year = {2005},
url = {http://ro.utia.cz/demos/dt_jf/dt_jf.html} }
|
Research Reports |
|
Filip J., Vavra R., Krupicka M.: A Portable Setup for Fast Material Appearance Acquisition. (Research Report No. 2342). ÚTIA AV CR, Praha 2014, 7 pp.
[bib] |
A photo-realistic representation of material appearance can be achieved by means of bidirectional texture function (BTF) capturing a material's appearance for varying illumination, viewing directions, and spatial pixel coordinates. BTF captures many non-local effects in material structure such as inter-reflections, occlusions, shadowing, or scattering.
The acquisition of BTF data is usually time and resource-intensive due to the high dimensionality of BTF data. This results in expensive, complex measurement setups and/or excessively long measurement times.
We propose an approximate BTF acquisition setup based on a simple, affordable mechanical gantry containing a consumer camera and two LED lights.
It captures a very limited subset of material surface images by shooting several video sequences.
A psychophysical study comparing the captured and reconstructed data with the reference BTFs of seven tested materials revealed that the results of our method show promising visual quality.
As it allows fast, inexpensive acquisition of approximate BTFs, this method can be beneficial to visualization applications demanding less accuracy, where BTF utilization has previously been limited.
|
@techreport{filip14portable,
title = {A Portable Setup for Fast Material Appearance Acquisition},
author = {Filip, J. and Vavra, R. and Krupicka, M.},
institution = {UTIA AS CR},
number = {2342},
year = {2014},
month = {December}}
|
|
Vavra R., Filip J., Somol P.: A Comparison of Adaptive Sampling and Interpolation of 2D BRDF Subspaces. (Research Report
No. 2339). ÚTIA AV CR, Praha 2013, 47 pp.
[bib] |
This report comprises an overview of interpolation and sampling methods for the Bidirectional Reflectance Distribution Function (BRDF). We analyzed 2D BRDF subspaces of eleven materials, compared the performance of five interpolation methods and three different sampling patterns, and evaluated twelve adaptive sampling strategies. Finally, based on knowledge of the entire data, we estimated sub-optimal sampling patterns and compared them, as a reference, with the other tested sampling approaches.
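One of the simplest scattered-data interpolation strategies one could apply to a sparsely sampled 2D BRDF subspace is inverse-distance weighting. The numpy sketch below is a generic illustration, not one of the report's exact methods; the function name, weighting exponent, and sample values are hypothetical:

```python
import numpy as np

def idw_interpolate(sample_pts, sample_vals, query_pts, power=2.0, eps=1e-9):
    """Inverse-distance weighting over scattered samples of a 2D BRDF
    subspace: each queried angle pair gets a weighted mean of the measured
    reflectances, with weights 1/d^power to the sampled locations."""
    d = np.linalg.norm(query_pts[:, None, :] - sample_pts[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)          # eps keeps exact hits finite
    return (w @ sample_vals) / w.sum(axis=1)

# Toy subspace: four samples at unit-square corners, two queries.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([1.0, 2.0, 3.0, 4.0])
queries = np.array([[0.5, 0.5], [0.0, 0.0]])
interp = idw_interpolate(pts, vals, queries)
```

A query equidistant from all samples returns their mean, while a query at a sample location returns (almost exactly) that sample's value.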
|
@techreport{vavra13comparison,
title = {A Comparison of Adaptive Sampling and Interpolation of
2D BRDF Subspaces},
author = {Vavra, R. and Filip, J. and Somol, P.},
institution = {UTIA AS CR},
number = {2339},
year = {2013},
month = {December}}
|
|
Filip J.: Towards Effective Measurement and Interpolation of Bidirectional Texture Functions. (Research Report
No. 2298). ÚTIA AV CR, Praha 2011, 8 pp.
[bib] |
Bidirectional texture function (BTF) is acquired by taking thousands of material surface images for different illumination and viewing directions. This function, provided it is measured accurately,
is typically exploited for visualization of material appearance in applications demanding visual accuracy. However, accurate measurement of the BTF is a time- and resource-demanding task. While all known measurement systems sample the illumination and viewing directions uniformly, we believe that, to be more effective, the sampling should be tailored to the reflectance properties of the materials to be measured.
Hence, we introduce a novel method of sparse BTF sampling. The method starts by collecting information about the material's visual behaviour through measurement and analysis of a small initial subset of reflectance samples. This information is fed into our heuristic algorithm, which produces a sparse, material-dependent sampling that is subsequently used for BTF measurement and interpolation. The algorithm was tested in a simulated measurement experiment with ten BTF samples: their estimated image subsets were selected, the remaining images were interpolated, and the results were compared computationally and psychophysically with the measured data. On average, the number of sampling points was less than half of the original measurement points, and for most materials the produced BTF renderings were perceptually indiscernible from the originals.
|
@techreport{filip11towards,
title = {Towards Effective Measurement and Interpolation of
Bidirectional Texture Functions},
author = {Filip, J.},
institution = {UTIA AS CR},
number = {2298},
year = {2011},
month = {April}}
|
|
Havran V., Filip J., Myszkowski K.: Bidirectional Texture Function Compression Based on Multi-Level Vector Quantization - Supplemental Material. (Research Report No. 2265). ÚTIA AV CR, Praha 2009, 58 pp.
[bib] [web] |
|
@techreport{havran09bidirectional,
author = {Havran, V. and Filip, J. and Myszkowski, K.},
title = {{Bidirectional Texture Function Compression Based on
the Multilevel Vector Quantization - Supplemental Material}},
year = {2009},
type = {Research Report},
institution = {Institute of Information Theory and Automation,
Academy of Sciences of the Czech Republic},
number = {2265},
pages = {1--59},
address = {Prague, Czech Republic},
month = {December},
}
|
|
Haindl M., Filip J.,
Somol P.: Texturing Library - Reference Manual. (Research Report
No. 2142). ÚTIA AV CR, Praha 2005, 209 pp. |
|
Meseth J., Degener
P., Klein R., Haindl M., Filip J., Somol P.: Texture Synthesis on
Surfaces. (Research Report No. 2141). ÚTIA AV CR, Praha 2005, 30
pp. |
|
Meseth J., Müller G., Klein R., Haindl M., Filip J., Somol P., Zid P.: Specification and Prototype Description of Texture Mapping and Synthesis. (Research Report No. 2107). UTIA AV CR, Praha 2004, 21 pp. |
|
Haindl M., Filip J., Somol P., Havlicek V.: RealReflect Library - Reference Manual. (Research Report No. 2121). UTIA AV CR, Praha 2004, 200 pp. |
|
Haindl M., Filip J., Somol P.: Advances in BTF Modelling. (Research Report No. 2119). UTIA AV CR, Praha 2004, 36 pp. |
|
Haindl M., Klein R., Filip J., Somol P., Meseth J.: Specification and Prototype Description of BTF Database and Model. (Research Report No. 2076). UTIA AV CR, Praha 2003, 26 pp. |
|
Haindl M., Filip J., Somol P., Meseth J.: BTF Parametric Database. (Research Report No. 2090). UTIA AV CR, Praha 2003, 20 pp. |
|
Haindl M., Filip J., Somol P.: Specification and Prototype Description of Texture Model. (Research Report No. 2075). UTIA AV CR, Praha 2002, 21 pp.
|
|
Haindl M., Filip J.: A Fast Model-Based Restoration of Colour Movie Scratches. (Research Report No. 2031). UTIA AV CR, Praha 2001, 22 pp.
|
Diploma Thesis |
| Filip J.:
Colour Movies Scratch Restoration. Diploma thesis FEL CVUT, Prague (2002), 65 pp. |
Digital restoration of scratches in image sequences is essential
for recovering old movies as well as for online processing in
scanning and duplicating machines.
The first part of this thesis reviews digital movie
representation, followed by a survey of contemporary methods for motion
estimation and scratch restoration in image sequences; the results of
two simple motion estimation methods are discussed.
The main contribution of this thesis is a new algorithm
for scratch restoration in multi-spectral image sequences based
on causal adaptive multidimensional prediction by 3D and 3.5D causal
autoregressive models. The predictor uses information available in the
surroundings of a corrupted pixel, exploiting the spectral, spatial, and
temporal correlation in multispectral image data, and adaptively updates
its parameters. The model assumes white Gaussian noise in each spectral
layer, but the layers can be mutually correlated. Experimental results show
that the proposed method easily outperforms all of the mentioned classical
scratch restoration methods.
|
Tutorials/Courses Presented |
| Filip J., Frisvad J. R., Tsesmelis T., Giachetti A., Gregersen S. K. S.:
Methods for photographic radiometry, modeling of light transport and material appearance (one-day tutorial). Presented at conference 3DV 2018, Verona, Italy, September 8, 2018. |
| Haindl M., Filip J.:
Advanced Nature Exteriors Modelling (half-day tutorial). Presented at conference ICPR 2012, Tsukuba, Japan, November 2012. |
| Haindl M., Filip J.:
Advanced Textural Representation of Materials Appearance (half-day course). Presented at conference SIGGRAPH Asia 2011, Hong Kong, China, December 2011. [bib] |
| Haindl M., Filip J.:
Advanced Material Appearance Modelling (half-day tutorial). Presented at conference SCIA 2011, Ystad, Sweden, May 2011. |
| Haindl M., Filip J.:
Bidirectional Texture Function Modeling (half-day tutorial). Presented at conference CVPR 2010, San Francisco, USA, August 2010. |
Invited and seminar talks |
|
In the search of an ideal measurement geometry for effect coatings
, Pigment and Color Science Forum, Boston, USA, October 5, 2018. |
|
Texture-based measurements for visualization and characterization of effect coatings, Merck KGaA, Darmstadt, June 29, 2018. |
|
Digital Material Appearance: from Interior Materials to Effect Coatings, Justus-Liebig-Universität Gießen, Abteilung Allgemeine Psychologie, Gießen, March 5, 2018. |
|
Effect coatings appearance acquisition and analysis
, Pigment and Color Science Forum, Alicante, Spain, October 5, 2017. |
|
Visual perception and automotive industry (T. Dauser, B. Eibon, F. Maile, J. Filip), The 6th conference of PRISM network (Perceptual Representation of Illumination, Shape & Material), Schloss Rauischholzhausen, Germany, October 21, 2016. |
|
Material Appearance Measurement and Visualization, Pigment and Color Science Forum, Prague, Czech Republic, October 8, 2015. |
|
Anisotropic Material Appearance: Acquisition and Modelling, Seminar talk Max Planck Institute for Informatics, Saarbrucken, Germany, July 29, 2015. |
|
How Real Can Virtual Reality Be? (Jak reálná může být virtuální realita?), Seminar talk for PhD students, Thomas Bata University, Zlín, Czech Republic, May 11, 2015. |
|
Adaptive Acquisition of Anisotropic Appearance, Seminar: Challenges in Digital Material Appearance, Computer Graphics, University of Bonn, Germany, January 28, 2015. |
|
Appearance Capturing and Modelling using Bidirectional Texture Functions
, Special Seminar in Computer Graphics, Computer Graphics Group, Faculty of Mathematics and Physics, Charles University, Prague, Czech Republic, November 24, 2011. |
|
Accurate Materials Appearance Representation using Bidirectional Texture Functions: Measurement, Compression, Modelling and Perception, Spring 2009 Pattern Recognition and Computer Vision Colloquium, Centre of machine perception CTU, Prague, Czech Republic, April 23, 2009. |
Awards |
|
Award of the Czech Academy of Sciences – for important research results in mathematical modelling of material appearance, for the year 2011. Team led by prof. M. Haindl, with members J. Filip,
J. Grim, V. Havlíček, M. Hatka. |
|
Otto Wichterle Premium – award for young scientists of the Academy of Sciences of the Czech Republic, June 2, 2010 |
|
Copyright notice: papers on this page are the author's version of the
work. They are posted here for your personal use. Not for redistribution. The
definitive versions were published as indicated above.
|
|