Boosting Piezocatalytic Performance of BaTiO3 by Tuning Defects at Room Temperature
Abstract:
Defect engineering is a widely employed method for adjusting the electronic structure and properties of oxide materials. However, controlling defects at room temperature remains a significant challenge because of the considerable thermal stability of oxides. In this work, a facile room-temperature lithium reduction strategy is used to implant oxygen defects into perovskite BaTiO3 (BTO) nanoparticles and thereby enhance their piezocatalytic properties. As a potential application, the piezocatalytic performance of the defective BTO is examined: the reaction rate constant increases to 0.1721 min⁻¹, roughly a fourfold enhancement over pristine BTO. The effect of oxygen vacancies on the piezocatalytic performance is discussed in detail. This work gives a deeper understanding of vibration catalysis and provides a promising strategy for designing efficient multi-field catalytic systems.

Keywords: Boosting; Thermal Stability
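A note on the quoted rate constant: in piezocatalytic dye-degradation studies the rate constant is commonly obtained by fitting the concentration decay to pseudo-first-order kinetics, $\ln(C_0/C_t) = k\,t$, where $C_0$ is the initial dye concentration and $C_t$ the concentration after vibration time $t$. That fitting model is a common assumption in this field rather than something stated in the abstract; taking the stated fourfold ratio at face value, $k \approx 0.1721\ \mathrm{min^{-1}}$ for defective BTO corresponds to roughly $0.043\ \mathrm{min^{-1}}$ for pristine BTO.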
Related Papers:

Boosting has been shown to improve the performance of classifiers in many situations, including when data are imbalanced. There are, however, two possible implementations of boosting, and it is unclear which should be used. Boosting by reweighting is typically used but can only be applied to base learners designed to handle example weights; boosting by resampling, on the other hand, can be applied to any base learner. In this work, we empirically evaluate the differences between these two boosting implementations on imbalanced training data. Using 10 boosting algorithms, 4 learners, and 15 datasets, we find that boosting by resampling performs as well as, or significantly better than, boosting by reweighting (which is often the default implementation). We therefore conclude that, in general, boosting by resampling is preferred over boosting by reweighting.
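A minimal sketch of the two implementations being compared, assuming a scikit-learn decision stump as the base learner (the datasets, learners, and boosting algorithms used in the study above are not reproduced here):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)

    def fit_by_reweighting(X, y, w):
        # Boosting by reweighting: hand the current boosting weights straight
        # to a base learner that accepts per-example weights.
        return DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)

    def fit_by_resampling(X, y, w):
        # Boosting by resampling: draw a bootstrap sample according to the
        # boosting weights, so any base learner can be used unchanged.
        idx = rng.choice(len(y), size=len(y), replace=True, p=w / w.sum())
        return DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx])

Either fitted learner can then be dropped into the usual boosting round that evaluates the weighted error and updates w.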
We study the reasons for the high efficiency of methods based on compositions of classifiers, such as boosting. It has been shown that one of the main sources of this efficiency is the exploitation of independence among features. To investigate the performance of a method, we run it directly on the underlying distributions. We compare the approximating capabilities of boosting and splines, and we also show the relation between complexity and margin.
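For reference, using the standard definition from the boosting literature (the paper's exact notation may differ), the margin of a weighted vote $F(x) = \sum_t \alpha_t h_t(x)$ on a labeled example $(x, y)$ with $y \in \{-1, +1\}$ is $\operatorname{margin}(x, y) = y \, \sum_t \alpha_t h_t(x) \big/ \sum_t \alpha_t$; larger positive values mean the vote is correct with a larger share of its total weight, which is the quantity the complexity-margin relation above refers to.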
Changes in the data distribution (concept drift) make online learning a challenge that is attracting progressively more attention. This paper proposes the Boosting-like Online Learning Ensemble (BOLE), based on heuristic modifications to Adaptable Diversity-based Online Boosting (ADOB), which is itself a modified version of Oza and Russell's Online Boosting. More precisely, we empirically investigate the effects of (a) weakening the requirements for experts to vote and (b) changing the concept drift detection method used internally, with the aim of improving ensemble accuracy. BOLE was tested against the original and other modified versions of both boosting methods, as well as three renowned ensembles, on well-known artificial and real-world datasets, and it statistically surpassed the accuracies of both boosting methods and of the three ensembles. Accuracy improved in most of the tested situations, and the gains were most evident on the datasets with more concept drifts, where they were very high.
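For context, a minimal sketch of the Oza and Russell Online Boosting update that ADOB and BOLE build on; the SGD base learner, ensemble size, and binary labels are illustrative assumptions, and the drift-detection and voting changes described above are not included:

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    M = 5                                    # ensemble size (assumed)
    experts = [SGDClassifier(loss="log_loss") for _ in range(M)]
    lam_sc = np.zeros(M)                     # accumulated weight of correct predictions
    lam_sw = np.zeros(M)                     # accumulated weight of wrong predictions
    classes = np.array([0, 1])               # assumed binary stream

    def learn_one(x, y):
        """Oza-Russell online boosting update for one streamed example (x, y)."""
        lam = 1.0
        for m, h in enumerate(experts):
            k = rng.poisson(lam)             # how many times this expert sees the example
            for _ in range(k):
                h.partial_fit(x.reshape(1, -1), [y], classes=classes)
            try:
                correct = h.predict(x.reshape(1, -1))[0] == y
            except Exception:                # expert has never been fitted yet
                correct = False
            if correct:
                lam_sc[m] += lam
                lam *= (lam_sc[m] + lam_sw[m]) / (2 * lam_sc[m])
            else:
                lam_sw[m] += lam
                lam *= (lam_sc[m] + lam_sw[m]) / (2 * lam_sw[m])

Feeding a stream of (x, y) pairs through learn_one trains all experts online; the modifications in ADOB and BOLE change when experts are allowed to vote and which drift detector resets them.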
Boosting is a general method for improving the accuracy of any given learning algorithm. This short paper introduces the boosting algorithm AdaBoost, and explains the underlying theory of boosting, including an explanation of why boosting often does not suffer from overfitting. Some examples of recent applications of boosting are also described.
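A minimal sketch of the discrete AdaBoost loop the paper introduces, assuming binary labels in {-1, +1} and scikit-learn decision stumps as the weak learners (an illustrative choice, not one made by the paper):

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost_fit(X, y, n_rounds=50):
        """Discrete AdaBoost; y must contain labels in {-1, +1}."""
        n = len(y)
        w = np.full(n, 1.0 / n)             # start with uniform example weights
        stumps, alphas = [], []
        for _ in range(n_rounds):
            h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = h.predict(X)
            err = np.sum(w[pred != y])      # weighted training error
            if err == 0:                    # perfect weak learner: keep it and stop
                stumps.append(h); alphas.append(1.0)
                break
            if err >= 0.5:                  # no better than chance: stop
                break
            alpha = 0.5 * np.log((1 - err) / err)
            w *= np.exp(-alpha * y * pred)  # up-weight mistakes, down-weight correct
            w /= w.sum()
            stumps.append(h); alphas.append(alpha)
        return stumps, alphas

    def adaboost_predict(stumps, alphas, X):
        score = sum(a * h.predict(X) for h, a in zip(stumps, alphas))
        return np.sign(score)

The exponential reweighting concentrates later rounds on the examples the current ensemble gets wrong, and the resulting growth of vote margins is the usual explanation, mentioned in the abstract above, for why AdaBoost often does not overfit.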
As shown in the bibliography, training an ensemble of networks is an interesting way to improve performance with respect to a single network. The two key factors in designing an ensemble are how to train the individual networks and how to combine them to give a single output. Boosting is a well-known methodology for building an ensemble. Some boosting methods use a specific combiner (the Boosting Combiner) based on the accuracy of each network. Although the Boosting Combiner provides good results on boosting ensembles, the simple Output Average combiner worked better in three new boosting methods we proposed in previous papers. In this paper, we study the performance of sixteen different combination methods for ensembles previously trained with Adaptive Boosting and Average Boosting, in order to see which combiner fits these ensembles best. The results show that the accuracy of ensembles trained with these original boosting methods can be improved by using an appropriate alternative combiner. In fact, the Output Average and the Weighted Average on low/medium-sized ensembles provide the best results in most cases.
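A minimal sketch of the two best-performing combiners mentioned above, the Output Average and a Weighted Average; the per-network class probabilities and validation accuracies used as inputs are assumptions for illustration, not quantities defined by the paper:

    import numpy as np

    def output_average(probs):
        """probs: array of shape (n_networks, n_samples, n_classes)."""
        return np.argmax(probs.mean(axis=0), axis=1)

    def weighted_average(probs, accs):
        """Weight each network's output by, e.g., its validation accuracy."""
        w = np.asarray(accs, dtype=float)
        w = w / w.sum()
        combined = np.tensordot(w, probs, axes=(0, 0))   # (n_samples, n_classes)
        return np.argmax(combined, axis=1)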
This chapter contains sections titled: Introduction; Hypothesis Boosting Problem; Learn; Boosting by Majority; AdaBoost; BrownBoost; AdaBoost for Feature Selection; Conclusion; References.