We propose the multi-level network Lasso, which aims to overcome key limitations of existing personalized learning methods, such as ignoring sample homogeneity or heterogeneity and over-parameterization. The multi-level network Lasso learns both a sample-common model and sample-specific models that are succinct and interpretable, in the sense that model parameters are shared across neighboring samples based on only a subset of relevant features. To apply personalized learning in multi-task scenarios, we further extend the multi-level network Lasso to multi-task personalized learning by learning underlying task groups in the feature subspace. Additionally, we investigate a family of multi-level network Lassos based on the $\ell_{p}$ quasi-norm ($0<p<1$).
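As a sketch in our own notation (the symbols below are illustrative, not taken from the paper), a multi-level formulation of this kind splits each sample's parameter vector into a common part and a sample-specific part, penalizes differences across neighboring samples, and applies a quasi-norm for feature-level sparsity:

```latex
\min_{u,\,\{v_i\}} \; \sum_{i=1}^{n} \ell\!\left(y_i,\; x_i^{\top}(u + v_i)\right)
 \;+\; \lambda_1 \!\!\sum_{(i,j)\in E}\!\! w_{ij}\,\lVert v_i - v_j \rVert_2
 \;+\; \lambda_2 \sum_{i=1}^{n} \lVert v_i \rVert_p^p ,
 \qquad 0 < p < 1,
```

where $u$ is the sample-common model, $v_i$ are the sample-specific deviations, $E$ is a sample similarity graph with edge weights $w_{ij}$, and $\lVert \cdot \rVert_p^p$ with $0<p<1$ is the quasi-norm family mentioned in the abstract.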
Traditional learning usually assumes that all samples share the same global model, which fails to preserve critical local information for heterogeneous data. This can be tackled by detecting sample clusters and learning sample-specific models, but existing approaches are limited to sample-level clustering and sample-specific feature selection. In this paper, we propose the multi-level sparse network lasso (MSN Lasso) for flexible local learning. It multiplicatively decomposes model parameters into two components: one is shared across elements within a specially designed group, and the other is element-specific. By treating the pairwise difference of any two local models as a group, it achieves clustering at both the sample level and the feature level. Similarly, by treating each feature as a group, it enables both across-sample and sample-specific feature selection. Theoretical analysis reveals its equivalence to a jointly regularized local model and helps develop an efficient algorithm. Extensive experiments on various datasets show that MSN Lasso outperforms alternatives and achieves higher flexibility.
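A minimal numerical sketch of the multiplicative decomposition (the variable names and toy data are ours, not the paper's code): each local model is the elementwise product of a component shared across samples and a sample-specific one, so zeros in either component yield across-sample or sample-specific feature selection, respectively.

```python
import numpy as np

# Hypothetical illustration of an MSN-Lasso-style multiplicative
# decomposition: each local model w_i is the elementwise product of a
# shared component (theta) and a sample-specific one (gamma[i]).
rng = np.random.default_rng(0)
d, n = 5, 3                      # features, samples
theta = rng.normal(size=d)       # shared across all samples
gamma = rng.normal(size=(n, d))  # one row per sample

# A zero in theta switches a feature off for every sample
# (across-sample feature selection) ...
theta[0] = 0.0
# ... while a zero in gamma[i] switches it off for sample i only
# (sample-specific feature selection).
gamma[1, 2] = 0.0

W = gamma * theta  # local models, shape (n, d); broadcast over rows
```

Sparsity-inducing penalties on the shared and the element-specific components then control these two selection mechanisms separately.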
From 1963 to 1964, three herpetological explorations were carried out on Hainan Island. 37 species and subspecies of amphibians, belonging to 13 genera, 7 families and 2 orders, were collected. 3 new species (Nectophryne scalptus, Rana fragilis and Philautus ocellatus), a new record for China (Bufo galeatus Guenther) and 7 new Hainan records (table 1) have been found, and 5 topotypes have been collected. Micrixalus torrentis Smith actually belongs to the genus Staurois. Kaloula pulchra hainana Gressitt is verified as a valid subspecies. Megophrys hasseltii (Tschudi) is a species of Vibrissaphora.

As the natural conditions of the different parts of Hainan do not vary much, there is no significant regional herpetological fauna. The Hainan herpetofauna, as a whole, conspicuously belongs to the South China Region. 20 out of 37 species of amphibians (54%) from Hainan are confined in distribution to South China, and among them 8 species are endemic to Hainan. Hainan may thus be considered a subregion of the South China Region.

The diagnoses of the 3 new species are given in English as follows:

1. Nectophryne scalptus Liu et Hu, sp. nov. (Color plate, fig. 1; Text fig. 1)

Holotype: No. 64III0604, adult male; type locality: Xin-min Ziang, Wuzhi Shan, Hainan, altitude 750 m; May 20, 1964; collected by Xiau Han-chi.

Allotype: No. 64III0605, adult female; collected with the holotype.

Paratypes: 21, 3, from Wuzhi Shan and Dali of Diaulo Shan, Hainan; altitude 350-1,400 m; from April 23 to June 12, 1964.

Diagnosis: Nectophryne scalptus is a distinct new species which differs from all other species of this genus in having: 1) a dark triangular mark in the interorbital region and two wide dark -shaped marks on the dorsum; 2) wrinkled skin, scale-like in appearance, in the thoracic region; 3) a small triangular dermal fold covering the anal opening at the rear of the dorsum. This genus is new to China.

2. Rana fragilis Liu et Hu, sp. nov. (Color plate, figs. 2-3; Text fig. 2)

Holotype: No. 64III3963, adult male; type locality: Jingko Ling, Baisha Hsien, Hainan, altitude 780 m; August 27, 1964; collected by Wang I-sheng.

Allotype: No. 64III0673, adult female; Wuzhi Shan, Hainan, altitude 800 m; May 4, 1964.

Paratypes: 10, 32, and tadpoles of different stages including metamorphosed individuals, from Jingko Ling, Wuzhi Shan, Dali of Diaulo Shan, Gianfen Ling and Dan Hsieh, Hainan; altitude 290-900 m; from April 25 to August 27, 1964.

Diagnosis: The adult of Rana fragilis is similar to that of Rana kuhlii Dumeril et Bibron, but Rana fragilis can be readily distinguished from the latter species by the most peculiar mouth-parts of its tadpole. The lower labial lip of the tadpole is broadly expanded and divided into several lobes. There are many small scattered papillae on the free surface of this broad lip, the tooth formula for the lower lip being constantly I:1-1. The skin of Rana fragilis is very delicate: a mere scratch by hard pebbles or a forceful grasp may cause the skin to break.

3. Philautus ocellatus Liu et Hu, sp. nov. (Color plate, fig. 4)

Holotype: No. 64III1371, adult male; type locality: Wuzhi Shan, Hainan, altitude 700 m; May 14, 1964; collected by Fei Liang.

Allotype: No. 64III1370, adult female; collected with the holotype.

Paratypes: 5, 3, from Wuzhi Shan, Dali of Diaulo Shan and Gianfen Ling, Hainan; altitude 400-700 m; from April 25 to June 24, 1964.

Diagnosis: This new Philautus resembles P. parvulus (Boulenger) in size and in color pattern on the sides, but differs from the latter in: 1) tympanum very distinct; 2) nostril nearer to the tip of the snout than to the eye; 3) legs long, with the tibio-tarsal articulation reaching beyond the anterior corner of the eye or between the eye and the snout. In comparison with P. hazelae Taylor, the body and finger discs of the new species are smaller, and it has no tubercle at the distal part of the tibia.
Multi-task learning (MTL) improves generalization by sharing information among related tasks. Structured sparsity-inducing regularization has been widely used in MTL to learn interpretable and compact models, especially in high-dimensional settings. These methods have achieved much success in practice; however, key limitations remain: generalization is limited by method-specific sparsity constraints on the parameters; most formulations are restricted to matrix form, ignoring high-order feature interactions among tasks; and they take varied forms, each requiring a different optimization algorithm. Inspired by the Generalized Lasso, we propose the Generalized Group Lasso (GenGL) to overcome these limitations. In GenGL, a linear operator is introduced to make the penalty adaptable to diverse sparsity settings and to handle hierarchical sparsity and multi-component decomposition in general tensor form, leading to enhanced flexibility and expressivity. Based on GenGL, we propose a novel framework for Structured Sparse MTL (SSMTL) that unifies a number of existing MTL methods, and we implement two new variants of it in shallow and deep architectures, respectively. An efficient optimization algorithm is developed to solve the unified problem, and its effectiveness is validated by synthetic and real-world experiments.
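As an illustration of the kind of building block solvers for group-lasso-type penalties rely on (our sketch, not GenGL's actual algorithm, which is not reproduced here), the proximal operator of a term $\lambda\lVert z\rVert_2$ is block soft-thresholding, which an ADMM-style splitting could apply group by group to the transformed parameters $Dw$:

```python
import numpy as np

def group_soft_threshold(z: np.ndarray, lam: float) -> np.ndarray:
    """Prox of lam * ||z||_2: shrink the whole block toward zero,
    zeroing it out entirely when its norm falls below lam."""
    norm = np.linalg.norm(z)
    if norm <= lam:
        return np.zeros_like(z)
    return (1.0 - lam / norm) * z

# A block survives only if its norm exceeds the threshold:
# ||[3, 4]|| = 5, so lam = 2.5 shrinks it by half ...
shrunk = group_soft_threshold(np.array([3.0, 4.0]), lam=2.5)   # [1.5, 2.0]
# ... while lam = 6.0 zeros it out entirely.
zeroed = group_soft_threshold(np.array([3.0, 4.0]), lam=6.0)   # [0.0, 0.0]
```

Zeroing whole blocks at once is what produces the structured (group-level) sparsity that distinguishes these penalties from the plain Lasso.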
Multi-Label Classification (MLC) assigns multiple relevant labels to each sample simultaneously, while multi-view MLC aims to apply MLC to heterogeneous data represented by multiple feature subsets. In recent years, a variety of methods have been proposed for these problems and have achieved great success in a wide range of applications. MLC captures global label correlations by building a single model shared by all samples but ignores sample-specific local structures, while Personalized Learning (PL) preserves sample-specific information by learning local models but ignores the global structure. Integrating PL with MLC is a straightforward way to overcome these limitations, but it still faces three key challenges: 1) capturing both local and global structures in a unified model; 2) efficiently preserving full-order interactions between labels, samples and features (or multi-view features in heterogeneous data); 3) learning a concise and interpretable model where only a fraction of the interactions are associated with multiple labels. In this paper, we propose a novel Multi-Label Personalized Classification (MLPC) method and its multi-view extension to handle these challenges. For 1), it integrates local and global components to preserve sample-specific information and the global structure shared across samples, respectively. For 2), a multilinear model is developed to capture full-order label-feature-sample interactions, and over-parameterization is avoided by tensor factorization. For 3), exclusive sparsity regularization penalizes the factorization by promoting intra-group competition, thereby eliminating irrelevant and redundant interactions during Exclusive Sparse Tensor Factorization (ESTF). Moreover, theoretical analysis generalizes the proposed ESTF and reveals the equivalence between MLPC and a family of jointly regularized counterparts.
We develop an alternating algorithm to solve the optimization problem, and demonstrate its effectiveness based on comprehensive experiments on both synthetic and real-world benchmark datasets.
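A toy sketch (our notation and shapes; the actual ESTF details are not reproduced here) of how a CP-style tensor factorization can keep full-order label-feature-sample interactions while avoiding over-parameterization:

```python
import numpy as np

# Hypothetical CP-style factorization of a label x feature x sample
# interaction tensor, as one way to avoid over-parameterization.
L, D, N, R = 4, 6, 10, 3         # labels, features, samples, rank
rng = np.random.default_rng(1)
A = rng.normal(size=(L, R))      # label factors
B = rng.normal(size=(D, R))      # feature factors
C = rng.normal(size=(N, R))      # sample factors

# Full-order interactions: W[l, d, n] = sum_r A[l, r] * B[d, r] * C[n, r]
W = np.einsum('lr,dr,nr->ldn', A, B, C)

# The factorization stores (L + D + N) * R parameters
# instead of the L * D * N entries of the full tensor.
n_factor_params = (L + D + N) * R   # 60
n_full_params = L * D * N           # 240
```

An exclusive-sparsity penalty would then be imposed on the factor matrices to promote intra-group competition among interactions, rather than on the full tensor directly.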