    Generation of 3D Texture Using Multiple 2D Models Analysis
    Citations: 0
    References: 10
    Abstract:
    Solid (3D) texturing is commonly used in computer graphics to produce more realistic images. It is often more attractive than conventional 2D texture mapping but remains more complex in some respects. Its major difficulty is generating 3D texture in a general and efficient way. The well‐known traditional procedural methods generally use a simplified mathematical model of a natural texture, and no reliable way is given to choose the model parameters that directly characterise the produced 3D texture. As a result, 3D texture generation with these methods becomes a largely experimental process. Our recently published method for automatic 3D texture generation avoids this problem by using the spectral analysis of one 2D model texture. The resulting 3D texture is of good quality, but one open problem remains: the appearance of the produced texture cannot be fully controlled over the entire 3D space by a single 2D spectral analysis. This can be a serious limitation for textures that vary significantly along different directions. In this paper we present a new and more powerful analytical approach for automatic 3D texture generation. Unlike our previous method, this approach is not based exclusively on the spectral analysis of a single 2D model: it uses two or three 2D models corresponding to different slices of a 3D texture block, so the appearance of the produced 3D texture can be controlled more effectively over the entire 3D space. In addition, we present a more efficient 3D texture antialiasing scheme that is well adapted to this new method.
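    As a rough, self-contained illustration of the idea in this abstract (steering a 3D texture block with the spectral analyses of two or three 2D model slices), the Python sketch below measures the amplitude spectra of axis-aligned 2D models, combines them into a 3D amplitude spectrum, attaches random phases and inverts the FFT. The combination rule (geometric mean), the random-phase model and all names are assumptions made here for illustration; this is not the authors' algorithm and it ignores the antialiasing the paper discusses.

```python
import numpy as np

def slice_amplitude(model_2d, size):
    """Normalised amplitude spectrum of one 2D model texture
    (cropped or zero-padded to size x size)."""
    img = np.asarray(model_2d, dtype=float)
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img, s=(size, size))))
    return spec / (spec.max() + 1e-12)

def synthesize_block(model_xy, model_xz, model_yz, size=64, seed=0):
    """Toy 3D texture block whose spectrum is steered by three 2D slice spectra."""
    a_xy = slice_amplitude(model_xy, size)   # constrains one frequency plane
    a_xz = slice_amplitude(model_xz, size)   # constrains a second plane
    a_yz = slice_amplitude(model_yz, size)   # constrains the third plane

    # Combine the three 2D spectra into one 3D amplitude spectrum by a
    # geometric mean, each spectrum broadcast along its missing frequency axis.
    amp3d = (a_xy[:, :, None] * a_xz[:, None, :] * a_yz[None, :, :]) ** (1.0 / 3.0)

    # Attach random phases and invert to obtain a spatial 3D texture block.
    rng = np.random.default_rng(seed)
    phase = np.exp(2j * np.pi * rng.random((size, size, size)))
    block = np.real(np.fft.ifftn(np.fft.ifftshift(amp3d * phase)))

    # Normalise to [0, 1] so the block can be used directly as a solid texture.
    return (block - block.min()) / (block.max() - block.min() + 1e-12)
```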
    Keywords:
    Texture
    Texture compression
    Texture Synthesis
    Texture filtering
    Bidirectional texture function
    Texture atlas
    Projective texture mapping
    Displacement mapping
    Aliasing
    In computer graphics, textures represent the detailed appearance of object surfaces, such as colors and patterns. Example-based texture synthesis constructs a larger visual pattern from a small example texture image. In this paper, we present a simple and efficient method that synthesizes a large-scale texture in real time from a given example texture by simply tiling and deforming it. Unlike most existing techniques, our method performs no search operations and can compute texture values at any given point (random access). In addition, it requires only enough storage for a single example texture. Our method is suitable for synthesizing irregular and near-stochastic textures. We also propose methods to efficiently synthesize and map 3D solid textures onto 3D meshes.
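    The two properties emphasised here, random access and storage of only the example texture, can be illustrated with a small sketch in which each tile of the plane receives a deterministic hash-based offset before lookup. The deformation model and the function names are assumptions of this sketch, not the paper's actual formulation.

```python
import numpy as np

def _hash01(ix, iy, seed=0):
    """Deterministic pseudo-random value in [0, 1) per integer tile coordinate."""
    h = (ix * 374761393 + iy * 668265263 + seed * 2654435761) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def sample(example, x, y, deform=0.25, seed=0):
    """Random-access texture value at continuous point (x, y).

    The example is conceptually tiled over the plane; each tile gets a small
    deterministic offset (the 'deformation') so the repetition is less visible.
    """
    h, w = example.shape[:2]
    tx, ty = int(np.floor(x / w)), int(np.floor(y / h))    # which tile
    dx = (_hash01(tx, ty, seed) - 0.5) * deform * w        # per-tile offset
    dy = (_hash01(tx, ty, seed + 1) - 0.5) * deform * h
    u = int(np.floor(x + dx)) % w                          # toroidal lookup
    v = int(np.floor(y + dy)) % h
    return example[v, u]
```

    Because the value at (x, y) depends only on the stored example and on a hash of the tile indices, any point can be evaluated independently and in constant time, which is what random access means in this context.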
    Citations (10)
    Mapping from parameter space to texture space separates textures from objects, enabling control of textured images without modifying the object or texture. Although conceptually straightforward, γ-mapping turns out to be very useful in practical texture mapping implementations. It logically separates the texture from the object and makes the texture mapping process flexible and easier to use. By modifying the γ-mapping, texture placement on the object is efficiently controlled without modifying either the texture image or the underlying geometry. We concentrate on efficiency issues related to γ-mapping, as well as its practical applications to parametric surfaces.
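    Read this way, γ-mapping is an intermediate map from surface parameters (u, v) to texture coordinates (s, t) that can be edited without touching either the texture image or the geometry. The sketch below uses a plain affine map for that stage; this is only an assumed stand-in for the more general mappings the paper considers.

```python
import numpy as np

def gamma_affine(scale=(1.0, 1.0), rotate=0.0, translate=(0.0, 0.0)):
    """Return a parameter->texture map (u, v) -> (s, t) as an affine transform."""
    c, s = np.cos(rotate), np.sin(rotate)
    def gamma(u, v):
        su, sv = u * scale[0], v * scale[1]
        return (c * su - s * sv + translate[0],
                s * su + c * sv + translate[1])
    return gamma

def texture_lookup(texture, gamma, u, v):
    """Sample the texture at surface parameters (u, v) through the gamma map."""
    s, t = gamma(u, v)
    h, w = texture.shape[:2]
    x = int(np.floor(s * w)) % w    # wrap texture coordinates
    y = int(np.floor(t * h)) % h
    return texture[y, x]
```

    Re-placing the texture on the object then amounts to constructing a new `gamma` and leaving both the texture image and the geometry unchanged.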
    Citations (9)
    Texture synthesis from samples developed after texture mapping and procedural texture synthesis. This approach avoids noticeable seams between texture patches and minimizes the stretch and distortion of the pattern when tiling a texture over surfaces, and it is important in computer graphics, computer vision and image processing. A new approach is presented for patch-based texture synthesis: it applies the max-flow principle and a sequential strategy to build the graph structure for the GraphCut. The method is simple, gives good experimental results, and is especially suited to near-regular texture images.
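    The seam-finding step described here, cutting the overlap between two patches with a minimum s-t cut, can be sketched as follows for grayscale patches, using networkx for the max-flow/min-cut computation. The edge cost and the pinned left/right columns are simplifying assumptions rather than the exact graph construction of the paper.

```python
import numpy as np
import networkx as nx

def graphcut_seam(overlap_a, overlap_b):
    """Label each pixel of an overlap region as coming from patch A (0) or B (1).

    Builds a grid graph over the overlap, weights edges by colour differences,
    pins the left column to A and the right column to B, and takes the min cut.
    """
    a = np.asarray(overlap_a, dtype=float)
    b = np.asarray(overlap_b, dtype=float)
    h, w = a.shape
    g = nx.Graph()
    src, sink = "A", "B"

    def cost(p, q):
        # Seam cost between neighbouring pixels p and q.
        return abs(a[p] - b[p]) + abs(a[q] - b[q]) + 1e-6

    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                g.add_edge((y, x), (y, x + 1), capacity=cost((y, x), (y, x + 1)))
            if y + 1 < h:
                g.add_edge((y, x), (y + 1, x), capacity=cost((y, x), (y + 1, x)))
    for y in range(h):
        g.add_edge(src, (y, 0), capacity=float("inf"))       # left column forced to A
        g.add_edge((y, w - 1), sink, capacity=float("inf"))  # right column forced to B

    _, (a_side, _) = nx.minimum_cut(g, src, sink)
    labels = np.ones((h, w), dtype=np.uint8)
    for node in a_side:
        if node != src:
            labels[node] = 0
    return labels
```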
    Citations (0)
    The goal of texture synthesis is to generate an arbitrarily large high‐quality texture from a small input sample. Generally, it is assumed that the input image is given as a flat, square piece of texture, thus it has to be carefully prepared from a picture taken under ideal conditions. Instead we would like to extract the input texture from any surface from within an arbitrary photograph. This introduces several challenges: Only parts of the photograph are covered with the texture of interest, perspective and scene geometry introduce distortions, and the texture is non‐uniformly sampled during the capture process. This breaks many of the assumptions used for synthesis. In this paper we combine a simple novel user interface with a generic per‐pixel synthesis algorithm to achieve high‐quality synthesis from a photograph. Our interface lets the user locally describe the geometry supporting the textures by combining rational Bézier patches. These are particularly well suited to describe curved surfaces under projection. Further, we extend per‐pixel synthesis to account for arbitrary texture sparsity and distortion, both in the input image and in the synthesis output. Applications range from synthesizing textures directly from photographs to high‐quality texture completion.
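    Because the interface describes the supporting geometry with rational Bézier patches, a generic evaluator is sketched below. This is standard rational-Bézier evaluation with names chosen here; it is not code from the paper, but it shows why non-uniform weights let such patches mimic surfaces seen under perspective projection.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t) ** (n - i)

def rational_bezier_patch(ctrl, weights, u, v):
    """Evaluate a rational Bezier patch at (u, v) in [0, 1]^2.

    ctrl:    (n+1, m+1, d) array of control points
    weights: (n+1, m+1) array of positive weights; non-uniform weights are what
             allow the patch to reproduce projectively distorted surfaces.
    """
    ctrl = np.asarray(ctrl, dtype=float)
    weights = np.asarray(weights, dtype=float)
    n, m = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    num = np.zeros(ctrl.shape[2])
    den = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            w = weights[i, j] * bernstein(n, i, u) * bernstein(m, j, v)
            num += w * ctrl[i, j]
            den += w
    return num / den
```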
    Computer graphics applications often use textures to decorate virtual objects without modeling geometric details. These textures can be seamlessly generated from sample textures using texture synthesis algorithms. However, most existing texture synthesis algorithms are designed for rectangular domains and cannot easily be extended to general surfaces. In this paper, we present a novel method for texture synthesis and texture growing on triangle meshes of arbitrary topology. Using sampled texture patches as the building blocks for synthesis, visually seamless textures can be synthesized for a wide variety of patterns ranging from regular to stochastic. The synthesis process generates an initial texture block from the input patches and then iteratively searches the sampled patches to select, at each step, the patch that optimizes the smoothness of the overlapping boundaries of the synthesized texture. The proposed scheme is shown to be effective and efficient for synthesizing and growing a wide range of textures on surfaces.
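    The selection step described above, picking the candidate patch whose overlap best matches the texture synthesised so far, can be sketched as a masked sum-of-squared-differences search. The scoring function and array layout below are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def best_patch(candidates, target_overlap, mask):
    """Pick the candidate patch whose masked overlap best matches the target.

    candidates:     (k, h, w) stack of candidate patches
    target_overlap: (h, w) pixels already synthesised along the boundary
    mask:           (h, w) boolean array, True where the overlap is defined
    """
    candidates = np.asarray(candidates, dtype=float)
    target = np.asarray(target_overlap, dtype=float)
    diffs = (candidates - target) ** 2         # broadcast over the k candidates
    scores = (diffs * mask).sum(axis=(1, 2))   # SSD restricted to the overlap
    return int(np.argmin(scores)), float(scores.min())
```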
    Textures can describe a wide variety of natural phenomena with random variations over repeating patterns. Examples of textures include images, motions, and surface geometry. Since reproducing the realism of the physical world is a major goal for computer graphics, textures are important for rendering synthetic images and animations. However, because textures are so diverse it is difficult to describe and reproduce them under a common framework. In this thesis, we present new methods for synthesizing textures. The first part of the thesis is concerned with a basic algorithm for reproducing image textures. The algorithm is easy to use and requires only a sample texture as input. It generates textures with perceived quality equal to or better than those produced by previous techniques, but runs orders of magnitude faster. The algorithm is derived from Markov Random Field texture models and generates textures through a deterministic searching process. Because of the use of this deterministic searching, our algorithm can avoid the computational demand of probability sampling and can be directly accelerated by a point searching algorithm such as tree-structured vector quantization. The second part of the thesis concerns various extensions and applications of our texture synthesis algorithm. The applications include constrained synthesis, in which artifacts in files or photographs are removed by replacing them with synthesized texture backgrounds; motion texture synthesis, in which texture synthesis is applied to generate repetitive motion textures such as 3D temporal textures and 1D articulated motion signals; surface texture synthesis, in which textures are grown directly over manifold surfaces; multiple-source texture synthesis, in which new textures are generated from multiple sources; and order-independent texture synthesis, in which a new texture can be generated on demand in an arbitrary traversal order without producing inconsistent results. We conclude this thesis by analyzing the algorithm behavior and discussing future work.
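    The basic per-pixel algorithm from the first part of the thesis can be sketched as a greedy loop that copies, for each output pixel, the sample pixel with the most similar causal neighbourhood. The search below is exhaustive for clarity (the thesis accelerates it, for example with tree-structured vector quantization), and the neighbourhood shape and initialisation are simplifying assumptions.

```python
import numpy as np

def synthesize(sample, out_shape, k=2, seed=0):
    """Greedy per-pixel synthesis: each output pixel copies the sample pixel
    whose causal (already-generated) neighbourhood matches best."""
    sample = np.asarray(sample, dtype=float)
    sh, sw = sample.shape
    oh, ow = out_shape
    rng = np.random.default_rng(seed)
    out = rng.choice(sample.ravel(), size=out_shape)   # random initialisation

    def neighbourhood(img, y, x, h, w):
        # Causal L-shaped neighbourhood of radius k, with toroidal wrap.
        offs = [(dy, dx) for dy in range(-k, 1) for dx in range(-k, k + 1)
                if (dy, dx) < (0, 0)]
        return np.array([img[(y + dy) % h, (x + dx) % w] for dy, dx in offs])

    for y in range(oh):
        for x in range(ow):
            target = neighbourhood(out, y, x, oh, ow)
            best, best_d = (0, 0), np.inf
            for sy in range(sh):                       # exhaustive search
                for sx in range(sw):
                    cand = neighbourhood(sample, sy, sx, sh, sw)
                    d = np.sum((cand - target) ** 2)
                    if d < best_d:
                        best, best_d = (sy, sx), d
            out[y, x] = sample[best]
    return out
```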
    Citations (84)
    We present an extension to texture mapping that supports the representation of 3-D surface details and view motion parallax. The results are correct for viewpoints that are static or moving, far away or nearby. Our approach is very simple: a relief texture (texture extended with an orthogonal displacement per texel) is mapped onto a polygon using a two-step process: First, it is converted into an ordinary texture using a surprisingly simple 1-D forward transform. The resulting texture is then mapped onto the polygon using standard texture mapping. The 1-D warping functions work in texture coordinates to handle the parallax and visibility changes that result from the 3-D shape of the displacement surface. The subsequent texture-mapping operation handles the transformation from texture to screen coordinates.
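    The first step, the 1-D forward warp in texture coordinates, can be illustrated for a single texture row as below. The shift model used here (displacement times a view-dependent factor, with overwriting in displacement order and naive hole filling) is a deliberately simplified stand-in for the paper's actual pre-warping equations, which depend on the full viewing configuration.

```python
import numpy as np

def prewarp_row(colors, displacement, view_factor):
    """Forward-warp one texture row so later texture mapping sees correct parallax.

    colors:        (w,) colour values of the relief texture row
    displacement:  (w,) orthogonal displacement per texel
    view_factor:   scalar relating displacement to horizontal texel shift
                   (depends on the viewing configuration)
    """
    w = len(colors)
    warped = np.zeros(w)
    filled = np.zeros(w, dtype=bool)
    # Splat texels so more displaced texels are written last and win occlusion.
    order = np.argsort(displacement)
    for i in order:
        target = int(round(i + view_factor * displacement[i]))
        if 0 <= target < w:
            warped[target] = colors[i]
            filled[target] = True
    # Naive hole filling: copy the nearest filled texel to the left.
    for j in range(1, w):
        if not filled[j] and filled[j - 1]:
            warped[j] = warped[j - 1]
            filled[j] = True
    return warped
```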
    Citations (271)