Optimized Designs for Control Rod Absorbers with a High-Fidelity Simulation Method Implemented in the RMC Code
Citations: 0 | References: 0 | Related Papers: 10
    Abstract:
A high-fidelity simulation method for control rod absorbers is implemented in the Monte Carlo code RMC. For rare-earth absorbers and B4C, which have been widely researched, detailed calculations and analyses of the neutronic and depletion characteristics are conducted. Using this method, three kinds of optimized control rod absorbers are proposed with higher reactivity worth and favorable burnup stability. In these new designs, the composite absorbers possess higher reactivity worth with an average reactivity loss of only 20% at the end of cycle (EOC). The enriched absorbers have better neutron-absorbing capacity and more stable radiation resistance, and the reactivity loss of enriched B4C drops from 90% to only 7.42% at EOC. For mixed absorbers, the best properties are found for the Dy and B4C mixture when the proportion of Dy is about 60%. The coupling of ZrH2 and Hf achieves better control efficiency, although the reactivity loss increases at EOC.
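As background for the quantities quoted above, control rod worth is conventionally expressed as the reactivity difference between rod-out and rod-in states, and burnup stability as the relative loss of that worth over a cycle. The sketch below illustrates the arithmetic with hypothetical keff values; none of the numbers come from the paper.

```python
# Hedged sketch: computing control rod worth and end-of-cycle (EOC)
# reactivity loss from eigenvalues. The keff values below are
# illustrative placeholders, NOT results from the paper.

def reactivity(keff):
    """Reactivity rho = (keff - 1) / keff."""
    return (keff - 1.0) / keff

def rod_worth(keff_out, keff_in):
    """Static rod worth: reactivity difference between rod-out and rod-in."""
    return reactivity(keff_out) - reactivity(keff_in)

# Hypothetical beginning-of-cycle (BOC) and EOC eigenvalues.
worth_boc = rod_worth(keff_out=1.02000, keff_in=0.98000)   # fresh absorber
worth_eoc = rod_worth(keff_out=1.02000, keff_in=0.98290)   # depleted absorber

loss_pct = 100.0 * (worth_boc - worth_eoc) / worth_boc
print(f"BOC worth: {worth_boc * 1e5:.0f} pcm")
print(f"EOC worth: {worth_eoc * 1e5:.0f} pcm")
print(f"Reactivity loss over the cycle: {loss_pct:.1f}%")
```

With these placeholder values the printed loss comes out to a few percent, comparable in spirit to the 7.42% EOC figure quoted for enriched B4C.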
    Keywords:
Code
    High fidelity
    Citations (0)
Quantum simulation promises to address many challenges in fields ranging from quantum chemistry to material science and high-energy physics, and could be implemented on noisy intermediate-scale quantum devices. A challenge in building good digital quantum simulators is the fidelity of the engineered dynamics given a finite set of elementary operations. Here we present a framework for optimizing the order of operations based on a geometric picture, thus abstracting away the operation details and achieving computational efficiency. Based on this geometric framework, we provide two alternative second-order Trotter expansions: one with optimal fidelity at short timescales, and a second that is robust at long timescales. Thanks to the improved fidelity at different timescales, the two expansions we introduce can form the basis for experimentally constrained digital quantum simulation.
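As an illustration of what a second-order Trotter expansion is, the sketch below compares a standard Strang splitting against the exact propagator for a toy two-qubit Hamiltonian. This shows the generic construction only; the paper's geometrically optimized operation orderings are not reproduced here.

```python
# Hedged sketch: second-order (Strang) Trotter step vs. the exact
# propagator for H = A + B on two qubits. Toy Hamiltonian, not the
# paper's optimized expansion.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Split Hamiltonian: A couples the qubits, B holds the local fields.
A = np.kron(X, X)
B = np.kron(Z, I2) + np.kron(I2, Z)
H = A + B

def strang_step(dt):
    """Second-order Trotter: exp(-iA dt/2) exp(-iB dt) exp(-iA dt/2)."""
    half = expm(-0.5j * A * dt)
    return half @ expm(-1j * B * dt) @ half

t, n_steps = 1.0, 10
U_trotter = np.linalg.matrix_power(strang_step(t / n_steps), n_steps)
U_exact = expm(-1j * H * t)

# Fidelity proxy: |Tr(U_exact^dagger U_trotter)| / dim.
fidelity = abs(np.trace(U_exact.conj().T @ U_trotter)) / H.shape[0]
print(f"Trotter fidelity over t={t}: {fidelity:.6f}")
```

Doubling n_steps should roughly quarter the infidelity, the expected second-order scaling.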
    High fidelity
    Ranging
    Basis (linear algebra)
… accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
    High fidelity
    Compass
    Citations (0)
The objective of this work is to develop a data-driven proxy to high-fidelity numerical flow simulations using digital images. The proposed model can capture the flow field and permeability in a large variety of digital porous media, based on solid grain geometry and pore size distribution, through detailed analyses of the local pore geometry and the local flow fields. To develop the model, detailed pore-space geometry and simulation data from 3500 two-dimensional high-fidelity Lattice Boltzmann simulation runs are used to train the proxy, which predicts solutions with high accuracy in much less computational time. The proposed methodology harnesses the enormous amount of data generated by high-fidelity flow simulations to decode often under-utilized patterns in simulations and to accurately predict solutions to new cases. The developed model captures the physics of the problem and enhances the predictive capability of the simulations at a much lower cost. These predictive models do not spatio-temporally reduce the order of the problem; they possess the same numerical resolution as their Lattice Boltzmann equivalents, with the great advantage that their solutions can be obtained at a significant reduction in computational cost (speed and memory).
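To make the proxy idea concrete, here is a minimal sketch of the workflow: regress a flow property on pore-geometry descriptors instead of running the full solver. The features, the Kozeny-Carman-like synthetic target, and the random-forest model are all stand-in assumptions, not the paper's architecture or data.

```python
# Hedged sketch of a data-driven surrogate for flow simulation.
# Synthetic placeholder data; the paper trains on ~3500 Lattice
# Boltzmann runs over real digital pore geometries instead.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
porosity = rng.uniform(0.05, 0.4, n)
mean_pore_radius = rng.uniform(1.0, 10.0, n)   # arbitrary units
tortuosity = rng.uniform(1.1, 3.0, n)

# Kozeny-Carman-like synthetic target standing in for LBM permeability.
k = porosity**3 * mean_pore_radius**2 / (tortuosity * (1 - porosity)**2)
X = np.column_stack([porosity, mean_pore_radius, tortuosity])

X_tr, X_te, y_tr, y_te = train_test_split(X, k, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"Held-out R^2: {model.score(X_te, y_te):.3f}")
```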
    Lattice Boltzmann methods
    High fidelity
    Citations (1)
    Tomographic imaging can now be routinely performed over three orders of magnitude in length scale with correspondingly high data fidelity. This capability, coupled with the development of advanced computational algorithms for image interpretation, three-dimensional visualization, and structural characterization and computation of physical properties on image data, allows for a new numerical laboratory approach to the study of real complex materials: the Virtual Materials Laboratory. Numerical measurements performed directly on images can, in many cases, be performed with similar accuracy to equivalent laboratory measurements, but also on traditionally intractable materials. These emerging capabilities and their impact on a range of scientific disciplines and industry are explored here.
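As a minimal example of a "numerical measurement" performed directly on image data, the sketch below computes porosity and a crude specific-surface proxy from a segmented 3-D volume. The random array stands in for a real tomogram.

```python
# Hedged sketch: simple property measurements on a segmented 3-D image
# (1 = pore, 0 = solid). A random placeholder volume replaces a real
# tomogram; real pipelines segment measured data first.
import numpy as np

rng = np.random.default_rng(1)
img = (rng.random((64, 64, 64)) < 0.3).astype(int)  # synthetic tomogram

porosity = img.mean()

# Count pore/solid face transitions along each axis as a surface proxy.
faces = sum(np.abs(np.diff(img, axis=ax)).sum() for ax in range(3))
specific_surface = faces / img.size  # interface area per voxel volume

print(f"Porosity: {porosity:.3f}")
print(f"Specific surface (voxel units): {specific_surface:.3f}")
```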
    High fidelity
    Characterization
    Diffusion models have recently achieved astonishing performance in generating high-fidelity photo-realistic images. Given their huge success, it is still unclear whether synthetic images are applicable for knowledge distillation when real images are unavailable. In this paper, we extensively study whether and how synthetic images produced from state-of-the-art diffusion models can be used for knowledge distillation without access to real images, and obtain three key conclusions: (1) synthetic data from diffusion models can easily lead to state-of-the-art performance among existing synthesis-based distillation methods, (2) low-fidelity synthetic images are better teaching materials, and (3) relatively weak classifiers are better teachers. Code is available at https://github.com/zhengli97/DM-KD.
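For reference, the distillation step studied here follows the standard Hinton-style recipe: the student matches the teacher's temperature-softened logits on the (synthetic) images. The sketch below shows that loss with toy stand-in models; it is not the authors' exact training setup.

```python
# Hedged sketch of knowledge distillation on synthetic images.
# Toy models and random tensors replace the diffusion-generated
# data and pretrained networks used in the paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between temperature-softened distributions."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

# Placeholder batch standing in for diffusion-generated images.
images = torch.randn(8, 3, 64, 64)
teacher = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 100))
student = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 64 * 64, 100))

with torch.no_grad():
    t_logits = teacher(images)      # teacher is frozen during distillation
loss = kd_loss(student(images), t_logits)
loss.backward()                     # gradients flow into the student only
print(f"KD loss: {loss.item():.4f}")
```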
    Synthetic data
Code
    High fidelity
    Citations (4)
    We present fast, realistic image generation on high-resolution, multimodal datasets using hierarchical variational autoencoders (VAEs) trained on a deterministic autoencoder's latent space. In this two-stage setup, the autoencoder compresses the image into its semantic features, which are then modeled with a deep VAE. With this method, the VAE avoids modeling the fine-grained details that constitute the majority of the image's code length, allowing it to focus on learning its structural components. We demonstrate the effectiveness of our two-stage approach, achieving a FID of 9.34 on the ImageNet-256 dataset which is comparable to BigGAN. We make our implementation available online.
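A minimal sketch of the two-stage setup described above: a deterministic autoencoder compresses images to a latent code, and a small VAE then models that latent space with the usual reparameterization and KL terms. All shapes and layer sizes are illustrative assumptions, not the paper's architecture.

```python
# Hedged sketch of a two-stage generative setup: deterministic
# autoencoder (stage 1) plus a VAE over its latents (stage 2).
# Layer sizes are illustrative placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Autoencoder(nn.Module):
    def __init__(self, dim=3 * 32 * 32, z=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(dim, z))
        self.dec = nn.Sequential(nn.Linear(z, dim), nn.Unflatten(1, (3, 32, 32)))
    def forward(self, x):
        return self.dec(self.enc(x))

class LatentVAE(nn.Module):
    def __init__(self, z=128, h=64):
        super().__init__()
        self.mu, self.logvar, self.dec = nn.Linear(z, h), nn.Linear(z, h), nn.Linear(h, z)
    def forward(self, z):
        mu, logvar = self.mu(z), self.logvar(z)
        z_sample = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return self.dec(z_sample), kl

x = torch.randn(4, 3, 32, 32)        # placeholder image batch
ae, vae = Autoencoder(), LatentVAE()
with torch.no_grad():
    z = ae.enc(x)                    # stage 1: compress to semantic features
recon, kl = vae(z)
loss = F.mse_loss(recon, z) + kl     # stage 2: model the latent space
print(f"Stage-2 loss: {loss.item():.4f}")
```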
    Autoencoder
Code
    High fidelity
    Citations (2)
With the advent of new thumbprint identification techniques, accurate personal identification is now easy and cheaper, with approximately zero false acceptance rates. This paper focuses on developing an advanced feature for thumbprint-based identification systems with the help of soft computing and 2D transformation, which makes the technique more flexible and faithful. The thumbprint images of individuals were scanned with an H3 T&A terminal to collect self-generated datasets. The thumbprints of the self-generated and standard datasets were trained to form a refined set which includes linear and angular displacements of thumbprint images. The new features of the refined datasets were stored in the database for further identification. In the proposed technique, the minutiae coordinates and orientation angles of the thumbprint of a person to be identified are computed and merged together for comparison. The minutiae coordinates and orientation angles of a person are compared with the trained-set minutiae values stored in the database at different linear and angular rotations for identity verification. The proposed technique was tested on fifty persons' self-generated datasets and the standard FVC2002, FVC2004 and CASIA databases. In the experimentation and result analysis we observed that the proposed technique accurately identifies a person on the basis of minutiae features of a thumbprint with low FNMR (False Non-Match Rate) values.
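To make the matching step concrete, the sketch below compares minutiae (x, y, orientation) of a probe print against a stored template under a sweep of candidate rotations, as the abstract describes. The tolerances, the scoring rule, and the synthetic minutiae are assumptions for illustration; production matchers use more robust pairing.

```python
# Hedged sketch of minutiae matching under a rotation sweep.
# Synthetic minutiae; real systems extract them from scanned prints.
import numpy as np

def rotate(minutiae, theta):
    """Rotate minutiae positions by theta and shift their orientations."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    xy = minutiae[:, :2] @ R.T
    ang = (minutiae[:, 2] + theta) % (2 * np.pi)
    return np.column_stack([xy, ang])

def match_score(probe, template, d_tol=5.0, a_tol=0.2):
    """Fraction of probe minutiae with a nearby, similarly oriented match."""
    hits = 0
    for m in probe:
        d = np.linalg.norm(template[:, :2] - m[:2], axis=1)
        da = np.abs((template[:, 2] - m[2] + np.pi) % (2 * np.pi) - np.pi)
        hits += np.any((d < d_tol) & (da < a_tol))
    return hits / len(probe)

rng = np.random.default_rng(2)
template = np.column_stack([rng.uniform(0, 100, (20, 2)),
                            rng.uniform(0, 2 * np.pi, 20)])
probe = rotate(template, np.deg2rad(12))   # same print, rotated 12 degrees

# Sweep candidate angular rotations and keep the best alignment score.
best = max(match_score(rotate(probe, np.deg2rad(-a)), template)
           for a in range(-30, 31, 2))
print(f"Best match score: {best:.2f}")   # 1.00 when the rotation is recovered
```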
    Minutiae
    Identification
Feature
    Citations (3)
    NSWC PCD has developed a high-fidelity 3-D finite-element (FE) modeling system that computes acoustic color templates (target strength vs. frequency and aspect angle) of single or multiple realistic objects (e.g., target + clutter) in littoral environments. High-fidelity means that 3-D physics is used in all solids and fluids, including even thin shells, so that solutions include not only all propagating waves but also all evanescent waves, the latter critically affecting the former. Although novel modeling techniques have accelerated the code by several orders of magnitude, it takes about one day to compute an acoustic color template. However, NSWC PCD wants to be able to compute thousands of templates quickly, varying target/environment features by small amounts, in order to develop statistically robust classification algorithms. To accomplish this, NSWC PCD is implementing a radically different FE technology that has already been developed and verified. It preserves all the 3-D physics but promises to accelerate the code another two to three orders of magnitude. Porting the code to an HPC center will accelerate it another one to two orders of magnitude, bringing performance to seconds per template. The talk will briefly review the existing system and then describe the new technology.
    Porting
    High fidelity
Code
    Template
    Citations (0)
    Benchmarks are introduced for evaluating the performance of numerical simulations of space deployable structures. These benchmarks embody the key challenges of interest to future large space deployable structures, including large angle motion, contact between flexible bodies, and the presence of both soft and stiff mechanical components. The benchmarks were used in companion studies to evaluate the ADAMS multibody dynamics code, the LS-Dyna nonlinear finite element code, and the Sierra large-scale parallel nonlinear finite element code. In the past, only multibody codes would have been considered for this application. This study found that all three codes could be used for these benchmarks, a finding that may lead to larger scale, higher fidelity simulations in the future.
    Citations (6)