    Texel-Att: Representing and Classifying Element-based Textures by Attributes
    Abstract:
    Element-based textures are textures formed by nameable elements, the texels [1], distributed according to specific statistical distributions; they are of primary importance in many sectors, namely the textile, fashion and interior design industries. State-of-the-art texture descriptors fail to properly characterize element-based textures, so we present Texel-Att to fill this gap. Texel-Att is the first fine-grained, attribute-based representation and classification framework for element-based textures. It first individuates texels, characterizing them with individual attributes; subsequently, texels are grouped and characterized through layout attributes, which give the Texel-Att representation. Texels are detected by a Mask R-CNN trained on a brand-new element-based texture dataset, ElBa, containing 30K texture images with 3M fully annotated texels. Examples of individual and layout attributes are exhibited to give a glimpse of the achievable level of granularity. In the experiments, we present detection results showing that texels can be precisely individuated, even on textures in the wild; to this end, we individuate the element-based classes of the Describable Textures Dataset (DTD), where almost 900K texels have been manually annotated, leading to the Element-based DTD (E-DTD). Subsequently, classification and ranking results demonstrate the expressivity of Texel-Att on ElBa and E-DTD, outperforming alternative features and relative attributes and, in some cases, doubling the best performance. Finally, we report interactive search results on ElBa and E-DTD: with Texel-Att on the E-DTD dataset we are able to find the desired texture within 10 iterations in 90% of cases, against the 71% obtained with a combination of the finest existing attributes so far. Dataset and code are available at this https URL
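To make the individual-plus-layout idea concrete, here is a minimal sketch of how detected texel bounding boxes could be summarized into layout attributes. The attribute names and the box-based aggregation below are illustrative assumptions, not the paper's actual schema:

```python
from statistics import mean, pstdev

def layout_attributes(boxes):
    """Aggregate detected texel boxes (x, y, w, h) into simple layout
    statistics: element count, mean size, size regularity, and spatial
    spread. Hypothetical attribute set, for illustration only."""
    if not boxes:
        return {"count": 0}
    sizes = [w * h for (_, _, w, h) in boxes]
    xs = [x + w / 2 for (x, _, w, _) in boxes]
    ys = [y + h / 2 for (_, y, _, h) in boxes]
    return {
        "count": len(boxes),
        "mean_size": mean(sizes),
        # 1.0 for perfectly uniform texel sizes, smaller as sizes vary
        "size_regularity": 1.0 / (1.0 + pstdev(sizes)),
        "x_spread": max(xs) - min(xs),
        "y_spread": max(ys) - min(ys),
    }

# A perfectly regular 3x3 grid of identical texels:
grid = [(10 * i, 10 * j, 4, 4) for i in range(3) for j in range(3)]
attrs = layout_attributes(grid)
```

On a regular grid of identical texels, the size-regularity score reaches its maximum of 1.0; irregular layouts would score lower.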
    We introduce a structural-analysis method for describing textures in natural scenes. The proposed texture analysis system automatically extracts the elements in a texture image, measures their picture properties, classifies them into discriminable classes (one ground and some figures), and produces a description of the elements in each class and of their placement rules. This description can be used for learning and recognition of texture images. We also present a new idea for evaluating texture analyzers. In order to see directly what essential features or structures of the given textures each analyzer extracts, we propose an analysis-by-synthesis method: the synthesis program in our system reconstructs the texture image on the basis of its description. Comparing the reconstructed image with the original one, we can evaluate what information is preserved and what is lost in the description, and can improve our algorithms accordingly.
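The element-extraction step described above can be sketched, in a deliberately simplified form, as connected-component labeling of a binarized texture followed by per-element measurements (the original system extracts much richer properties and placement rules):

```python
from collections import deque

def extract_elements(img):
    """Label 4-connected foreground components in a binary image
    (a list of rows of 0/1) and measure the area and centroid of each."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    elements = []
    for y in range(h):
        for x in range(w):
            if img[y][x] and not seen[y][x]:
                # Flood-fill one element with BFS
                q, pixels = deque([(y, x)]), []
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                area = len(pixels)
                elements.append({
                    "area": area,
                    "centroid": (sum(p[0] for p in pixels) / area,
                                 sum(p[1] for p in pixels) / area),
                })
    return elements

texture = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
els = extract_elements(texture)
```

The measured areas and centroids are exactly the kind of per-element "picture properties" that could then be clustered into ground and figure classes.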
    Citations (11)
    This paper proposes the problem of unsupervised extraction of texture elements, called texels, which repeatedly occur in the image of a frontally viewed, homogeneous, 2.1D, planar texture, and presents a solution. 2.1D texture here means that the physical texels are thin objects lying along a surface that may partially occlude one another. The image texture is represented by the segmentation tree whose structure captures the recursive embedding of regions obtained from a multiscale image segmentation. In the segmentation tree, the texels appear as subtrees with similar structure, with nodes having similar photometric and geometric properties. A new learning algorithm is proposed for fusing these similar subtrees into a tree-union, which registers all visible texel parts, and thus represents a statistical, generative model of the complete (unoccluded) texel. The learning algorithm involves concurrent estimation of texel tree structure, as well as the probability distributions of its node properties. Texel detection and segmentation are achieved simultaneously by matching the segmentation tree of a new image with the texel model. Experiments conducted on a newly compiled dataset containing 2.1D natural textures demonstrate the validity of our approach.
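As a toy illustration of comparing texels by their segmentation-tree representation (a drastic simplification of the tree-union model above, which handles differing structures and learns probability distributions over node properties), subtrees with identical shape can be compared by the distance between their node properties:

```python
def node_features(tree):
    """Preorder list of node feature vectors (e.g. mean colour, area)."""
    feats = [tree["feat"]]
    for child in tree.get("children", []):
        feats.extend(node_features(child))
    return feats

def subtree_distance(a, b):
    """Mean Euclidean distance between corresponding node features of
    two subtrees, assuming identical structure (a toy stand-in for the
    paper's tree matching)."""
    fa, fb = node_features(a), node_features(b)
    assert len(fa) == len(fb), "structures must match in this toy version"
    dists = [sum((x - y) ** 2 for x, y in zip(u, v)) ** 0.5
             for u, v in zip(fa, fb)]
    return sum(dists) / len(dists)

# Hypothetical texel subtrees: feat = (mean brightness, region area)
texel_a = {"feat": (0.8, 10.0), "children": [{"feat": (0.7, 4.0)}]}
texel_b = {"feat": (0.8, 10.0), "children": [{"feat": (0.7, 4.0)}]}
texel_c = {"feat": (0.1, 50.0), "children": [{"feat": (0.2, 30.0)}]}
```

Repeated texels (like `texel_a` and `texel_b`) score a distance near zero, which is the cue the learning algorithm exploits when fusing similar subtrees.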
    Citations (62)
    Texture atlas parameterization provides an effective way to map a variety of colour and data attributes from 2D texture domains onto polygonal surface meshes. Most of the existing literature focuses on how to build seamless texture atlases for continuous photometric detail, but little effort has been devoted to devising efficient techniques for encoding self-repeating, discontinuous signals such as building facades. We present a perception-based scheme for generating space-optimized texture atlases specifically designed for intentionally non-bijective parameterizations. Our scheme combines within-chart tiling support with intelligent packing and perceptual measures for assigning texture space in accordance with the amount of information content of the image and its saliency. We demonstrate our optimization scheme in the context of real-time navigation through a gigatexel urban model of a European city. Our scheme achieves significant compression ratios and speed-up factors with visually indistinguishable results. We developed a technique that generates space-optimized texture atlases for the particular encoding of discontinuous signals projected onto geometry. The scene is partitioned using a texture atlas tree that contains a texture atlas for each node. The leaf nodes of the tree contain scene geometry. The level of detail is controlled by traversing the tree and selecting the appropriate texture atlas for a given viewer position and orientation. In a preprocessing step, the textures associated with each texture atlas node of the tree are packed. Textures are resized according to a given user-defined texel size and the size of the geometry they are projected onto. We also use perceptual measures to assign texture space in accordance with image detail. We also explore different techniques for supporting texture wrapping of discontinuous signals, which involved the development of efficient techniques for compressing texture coordinates via the GPU. Our approach supports texture filtering and DXTC compression without noticeable artifacts. We have implemented a prototype version of our space-optimized texture atlas technique and used it to render the 3D city model of Barcelona at interactive frame rates. The whole model was composed of more than three million triangles and contained more than twenty thousand different textures representing the building facades with an average original resolution of 512 pixels per texture. Our scheme achieves up to 100:1 compression ratios and speed-up factors of 20 with visually indistinguishable results.
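The detail-driven assignment of texture space can be sketched as follows; the variance-times-projected-area score below is a stand-in assumption for the paper's perceptual measures, not its actual metric:

```python
def allocate_atlas_space(textures, budget):
    """Split a total texel budget among textures in proportion to a
    simple detail score (grey-level variance x projected area).
    A toy stand-in for perception-based texture space assignment."""
    def variance(px):
        m = sum(px) / len(px)
        return sum((p - m) ** 2 for p in px) / len(px)
    # Floor at a tiny epsilon so perfectly flat textures still divide safely
    scores = [max(variance(t["pixels"]), 1e-9) * t["area"] for t in textures]
    total = sum(scores)
    return [round(budget * s / total) for s in scores]

# A flat wall carries almost no information; a busy facade carries a lot.
flat_wall = {"pixels": [100, 100, 100, 100], "area": 4.0}
busy_facade = {"pixels": [0, 255, 10, 240], "area": 4.0}
alloc = allocate_atlas_space([flat_wall, busy_facade], budget=1024)
```

Under this scheme nearly the whole budget goes to the high-detail facade, which is the intuition behind the large compression ratios reported above.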
    Citations (0)
    Patterns and textures are defining characteristics of many natural objects: a shirt can be striped, the wings of a butterfly can be veined, and the skin of an animal can be scaly. Aiming at supporting this analytical dimension in image understanding, we address the challenging problem of describing textures with semantic attributes. We identify a rich vocabulary of forty-seven texture terms and use them to describe a large dataset of patterns collected in the wild. The resulting Describable Textures Dataset (DTD) provides the basis for seeking the best texture representation for recognizing describable texture attributes in images. We port the Improved Fisher Vector (IFV) from object recognition to texture recognition and show that, surprisingly, it outperforms specialized texture descriptors not only on our problem, but also on established material recognition datasets. We also show that the describable attributes are excellent texture descriptors, transferring between datasets and tasks; in particular, combined with IFV, they significantly outperform the state of the art by more than 8 percent on both the FMD and KTH-TIPS2-b benchmarks. We also demonstrate that they produce intuitive descriptions of materials and Internet images.
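For a rough flavour of this style of descriptor encoding, here is a much-simplified, hard-assignment cousin of the Improved Fisher Vector (the real IFV uses GMM soft assignments plus both first- and second-order statistics, so treat this only as a sketch of the residual-encoding idea):

```python
def encode_first_order(descriptors, centres):
    """Encode local descriptors as L2-normalised first-order residuals
    against a small codebook: each descriptor is hard-assigned to its
    nearest centre and contributes its deviation from that centre."""
    dim = len(centres[0])
    enc = [[0.0] * dim for _ in centres]
    for d in descriptors:
        # Hard-assign to the nearest centre (squared Euclidean distance)
        k = min(range(len(centres)),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(d, centres[i])))
        for j in range(dim):
            enc[k][j] += d[j] - centres[k][j]
    flat = [v for row in enc for v in row]
    norm = sum(v * v for v in flat) ** 0.5 or 1.0
    return [v / norm for v in flat]

centres = [(0.0, 0.0), (1.0, 1.0)]  # toy 2-word codebook
code = encode_first_order([(0.1, 0.0), (0.9, 1.0)], centres)
```

The resulting fixed-length vector can be fed to a linear classifier regardless of how many local descriptors the image produced, which is what makes this family of encodings convenient for texture recognition.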
    Citations (4)
    In this thesis, we present an approach to finding a procedural representation of a texture that replicates a given texture image, which we call image-based procedural texture matching. Procedural representations are frequently used for many aspects of computer-generated imagery; however, the ability to use procedural textures is limited by the difficulty inherent in finding a suitable procedural representation to match a desired texture. More importantly, the process of determining an appropriate set of parameters necessary to approximate the sample texture is a difficult task for a graphic artist. The textural characteristics of many real-world objects change over time, so we are also interested in how textured objects in a graphical animation could be made to change automatically. We would like this automatic texture transformation to be based on different texture samples in a time-dependent manner. This notion, a natural extension of procedural texture matching, involves the creation of a smoothly varying sequence of texture images while allowing the graphic artist to control various characteristics of the texture sequence. Given a library of procedural textures, our approach uses a perceptually motivated texture similarity measure to identify which procedural textures in the library may produce a suitable match. Our work assumes that at least one procedural texture in the library is capable of approximating the desired texture. Because an exhaustive search of all of the parameter combinations for each procedural texture is not computationally feasible, we perform a two-stage search on the candidate procedural textures. First, a global search is performed over pre-computed samples from the given procedural texture to locate promising parameter settings. Secondly, these parameter settings are optimised using a local search method to refine the match to the desired texture.
The characteristics of a procedural texture generally do not vary uniformly for uniform parameter changes. That is, in some areas of the parameter domain of a procedural texture (the set of all valid parameter settings for the given procedural texture) small changes may produce large variations in the resulting texture, while in other areas the same changes may produce no variation at all. In this thesis, we present an adaptive random sampling algorithm which captures the texture range (the set of all images a procedural texture can produce) of a procedural texture by maintaining a sampling density which is consistent with the amount of change occurring in that region of the parameter domain. Texture transformations may not always be contained to a single procedural texture, and we therefore describe an approach to finding transitional points from one procedural texture to another. We present an algorithm for finding a path through the texture space formed from combining the texture range of the relevant procedural textures and their transitional points. Several examples of image-based texture matching, and texture transformations are shown. Finally, potential limitations of this work as well as future directions are discussed.
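The two-stage search can be sketched on a toy one-parameter "procedural texture"; the ramp renderer and the greedy hill-climbing refinement below are illustrative assumptions, not the thesis's actual algorithm:

```python
def match_parameters(render, target, grid, refine_steps=50, step=0.05):
    """Two-stage parameter search: a coarse global scan over pre-computed
    grid samples, then greedy local refinement of the best candidate.
    `render(p)` returns an image (here just a list of floats) for p."""
    def loss(p):
        return sum((a - b) ** 2 for a, b in zip(render(p), target))
    best = min(grid, key=loss)           # stage 1: global search
    for _ in range(refine_steps):        # stage 2: local refinement
        best = min([best - step, best, best + step], key=loss)
    return best

# A toy 'procedural texture': a brightness ramp whose slope is the parameter.
def render(p):
    return [p * x for x in range(8)]

target = [0.72 * x for x in range(8)]    # texture we want to replicate
p = match_parameters(render, target, grid=[0.0, 0.5, 1.0])
```

The coarse grid picks the nearest sampled slope (0.5), and the local stage walks it to within one step of the true value, mirroring the global-then-local structure described above.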
    Citations (0)
    Procedural textures have been widely used as they can be easily generated from various mathematical models. However, the model parameters are not perceptually meaningful or uniform for non-expert users; it is therefore difficult for general users to obtain a desired texture by tuning the parameters. In order to satisfy users' requirements, we propose a novel procedural texture retrieval scheme that can return textures according to commonly used perceptual dimensions. We establish a procedural texture database that includes abundant textures so as to meet the diverse demands of users. All textures in the database are projected into a perceptual space after we construct the mapping model. First, we extract the salient features of the input texture; then we calculate the Euclidean distance between the input texture and each texture in the database. Experimental results show that our method can effectively retrieve textures that are perceptually consistent with users' input.
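Once every texture is mapped into the perceptual space, retrieval reduces to a nearest-neighbour query under Euclidean distance. A minimal sketch, with made-up feature axes (the paper's actual perceptual dimensions and mapping model differ):

```python
def retrieve(query_feat, database, k=2):
    """Return the names of the k database textures whose perceptual
    features are closest (Euclidean distance) to the query."""
    def dist(entry):
        return sum((a - b) ** 2
                   for a, b in zip(entry["feat"], query_feat)) ** 0.5
    return [e["name"] for e in sorted(database, key=dist)[:k]]

# Hypothetical perceptual axes, e.g. (coarseness, contrast, regularity)
db = [
    {"name": "fine_dots",    "feat": (0.1, 0.8, 0.9)},
    {"name": "coarse_weave", "feat": (0.9, 0.4, 0.7)},
    {"name": "smooth_grad",  "feat": (0.2, 0.1, 0.2)},
]
hits = retrieve((0.15, 0.75, 0.85), db)
```

A user who asks for a fine, high-contrast, regular pattern gets back the perceptually closest entries first, without ever touching the underlying model parameters.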
    We present a user-centered approach to image-based texture synthesis. The synthesis uses a user-defined brush as the generation primitive. This allows synthesis of textures at the texel level and further integration of the method into most existing digital image editing software. We treat texture images as probability density estimators from which new images with perceptually similar appearance and structure can be sampled. We model the input texture sample as a Markov random field and propose several algorithms based on the locality and stationarity principles assumed from the theory.
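The non-parametric sampling idea behind MRF-based texture synthesis can be shown in one dimension: grow the output by looking up the current neighbourhood in the exemplar and copying what follows it. Real methods such as this one use 2-D neighbourhoods and randomized matching; the deterministic 1-D version below is only a sketch:

```python
def synthesize(sample, length, k=2):
    """Grow a texture row by repeatedly finding the current k-symbol
    context in the exemplar and copying the symbol that follows it.
    Locality: only the last k symbols matter (the Markov assumption).
    Stationarity: the same exemplar statistics apply everywhere."""
    out = list(sample[:k])               # seed with the start of the sample
    while len(out) < length:
        ctx = out[-k:]
        nxt = None
        for i in range(len(sample) - k):
            if list(sample[i:i + k]) == ctx:
                nxt = sample[i + k]      # first match; real methods sample
                break                    # randomly among all matches
        if nxt is None:                  # unseen context: fall back
            nxt = sample[0]
        out.append(nxt)
    return out

stripes = ["a", "a", "b", "a", "a", "b"]  # a tiny periodic exemplar
row = synthesize(stripes, length=9)
```

The synthesized row extends the exemplar's stripe pattern indefinitely, which is exactly the "perceptually similar appearance and structure" property the abstract describes.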
    Citations (2)