LSH-GAN: in-silico generation of cells for small sample high dimensional scRNA-seq data
Abstract:
A fundamental problem in the downstream analysis of scRNA-seq data is the unavailability of enough cell samples compared to the feature size. This is mostly due to the budgetary constraints of single-cell experiments, or simply because of the small number of available patient samples. Here, we present an improved version of the generative adversarial network (GAN), called LSH-GAN, to address this issue by producing new realistic cell samples. We update the training procedure of the generator of the GAN using locality sensitive hashing, which speeds up sample generation and thus keeps the standard procedures of downstream analysis feasible. LSH-GAN outperforms the benchmarks for realistic generation of quality cell samples. Experimental results show that samples generated by LSH-GAN improve the performance of downstream analyses such as feature (gene) selection and cell clustering.
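The abstract does not spell out how the hashing step enters the generator's training loop, so the following is only a minimal sketch of one plausible reading: random-projection LSH buckets the cells and a reduced, diversity-preserving subsample is drawn for training. The function and parameter names (lsh_subsample, n_planes, per_bucket) are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: LSH-based subsampling of cells, assuming the goal is a smaller
# but diverse training set. Not the paper's actual procedure.
import numpy as np

def lsh_subsample(X, n_planes=12, per_bucket=2, seed=0):
    """Bucket cells (rows of X) with signed random projections and keep a few per bucket."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], n_planes))
    codes = (X @ planes > 0).astype(np.uint8)      # one binary hash code per cell
    buckets = {}
    for i, code in enumerate(codes):
        buckets.setdefault(code.tobytes(), []).append(i)
    keep = []
    for members in buckets.values():
        take = min(per_bucket, len(members))
        keep.extend(rng.choice(members, size=take, replace=False))
    return X[np.sort(np.array(keep))]

# Usage on a cells-by-genes expression matrix (rows = cells), names hypothetical:
# X_small = lsh_subsample(expression_matrix, n_planes=12, per_bucket=2)
```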
An analytical approach to the unavailability estimation of protected connections in optical networks
Most existing methods for unavailability estimation of protected connections in optical networks using shared backup are approximate. This paper presents an analytical approach to the unavailability estimation. The failure state probabilities used by the approach are derived analytically, and the unavailability values computed from these analytical state probabilities are viable.
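The analytical state probabilities themselves are not reproduced in this abstract, so the sketch below only illustrates the standard two-state availability arithmetic that such approaches refine: steady-state unavailability U = MTTR / (MTTF + MTTR) per path, and a protected connection failing only when both paths are down under an independence assumption. All numbers are made up for illustration.

```python
# Minimal sketch of textbook availability arithmetic (not the paper's model,
# which accounts for shared-backup failure states).
def unavailability(mttf_hours, mttr_hours):
    """Steady-state unavailability U = MTTR / (MTTF + MTTR)."""
    return mttr_hours / (mttf_hours + mttr_hours)

u_working = unavailability(mttf_hours=4000.0, mttr_hours=8.0)
u_backup = unavailability(mttf_hours=4000.0, mttr_hours=8.0)

# Protected connection fails only if both paths are down (independence assumed).
u_protected = u_working * u_backup
print(f"single path U = {u_working:.6f}, protected U = {u_protected:.2e}")
```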
The combination of MinHash-based signatures and locality-sensitive hashing (LSH) schemes has been effectively used for finding approximate matches in very large audio and image retrieval systems. In this study, we introduce the idea of permutation-grouping to intelligently design the hash functions that are used to index the LSH tables. This helps to overcome the inefficiencies introduced by hashing real-world data that is noisy, structured, and, most importantly, not independently and identically distributed. Through extensive tests, we find that permutation-grouping dramatically increases the efficiency of the overall retrieval system by lowering the number of low-probability candidates that must be examined by 30-50%.
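The permutation-grouping scheme itself is not reproduced here; the sketch below only shows the plain MinHash-plus-banded-LSH pipeline it builds on, with a contiguous band split standing in for the grouping the paper optimizes. Helper names and parameters (n_perms, band_size) are illustrative.

```python
# Baseline MinHash signatures + banded LSH candidate generation (sketch only).
import random
import zlib

def minhash_signature(item_set, n_perms=64, seed=0):
    """One MinHash value per random affine permutation (a*h(x) + b) mod prime."""
    rng = random.Random(seed)
    prime = 2**31 - 1
    params = [(rng.randrange(1, prime), rng.randrange(0, prime)) for _ in range(n_perms)]
    hashes = [zlib.crc32(str(x).encode()) for x in item_set]
    return [min((a * h + b) % prime for h in hashes) for a, b in params]

def lsh_candidate_pairs(signatures, band_size=4):
    """Band each signature into contiguous groups of `band_size` rows and bucket them."""
    tables = {}
    for item_id, sig in signatures.items():
        for start in range(0, len(sig), band_size):
            key = (start, tuple(sig[start:start + band_size]))
            tables.setdefault(key, set()).add(item_id)
    return {frozenset(ids) for ids in tables.values() if len(ids) > 1}

docs = {"a": {"cat", "dog", "fish"}, "b": {"cat", "dog", "bird"}, "c": {"car", "road"}}
sigs = {name: minhash_signature(items) for name, items in docs.items()}
print(lsh_candidate_pairs(sigs, band_size=2))  # very likely includes the pair {'a', 'b'}
```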
Locality-Sensitive Hashing (LSH) and its variants are well-known methods for solving the c-approximate NN search problem in high-dimensional space. Traditionally, several LSH functions are concatenated to form a "static" compound hash function for building a hash table. In this paper, we propose to use a base of m single LSH functions to construct "dynamic" compound hash functions, and define a new LSH scheme called Collision Counting LSH (C2LSH). If the number of LSH functions under which a data object o collides with a query object q is greater than a pre-specified collision threshold l, then o can be regarded as a good candidate for a c-approximate NN of q. This is the basic idea of C2LSH.
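As a rough illustration of the collision-counting idea only (not the paper's implementation, and without its virtual rehashing or parameter analysis), the sketch below builds a base of m single E2LSH-style functions h(v) = floor((a·v + b) / w) and accepts an object as a candidate when it collides with the query under at least l of them; m, w, and l are illustrative values.

```python
# Collision-counting candidate test with m single LSH functions (sketch only).
import numpy as np

def make_hashers(dim, m=20, w=4.0, seed=0):
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((m, dim))
    b = rng.uniform(0.0, w, size=m)
    return lambda v: np.floor((a @ v + b) / w).astype(int)   # m hash values per vector

def candidates(data, query, hashers, l=12):
    hq = hashers(query)
    out = []
    for i, v in enumerate(data):
        collisions = int(np.sum(hashers(v) == hq))            # count agreeing hash functions
        if collisions >= l:                                    # collision-threshold test
            out.append(i)
    return out

rng = np.random.default_rng(1)
data = rng.standard_normal((1000, 32))
query = data[0] + 0.05 * rng.standard_normal(32)
h = make_hashers(dim=32)
print(candidates(data, query, h)[:10])  # index 0 (source of the perturbed query) is very likely listed
```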
This paper addresses the problem of improving fly hashing [1], a high-dimensional hash function based on the fruit fly olfactory circuit. The encoding in fly hashing uses only sparse addition operations instead of the usual costly dense multiplications, and thus results in efficient computation, which is important for near-duplicate detection tasks in large-scale search systems. However, its firing-rate-based winner-take-all (WTA) circuit is neither biologically plausible nor energy saving, and once this circuit is taken into consideration, the theoretical locality-sensitivity results are no longer strong. To improve fly hashing, we propose a locality-sensitive hash function based on random projection and a threshold-based spike-threshold-surface (STS) circuit, both of which are biologically plausible and can be computed very efficiently in hardware. We also present a strong theoretical analysis of the proposed hash function, and the experimental results support our proofs. In addition, we performed experiments on the SIFT, GloVe, and MNIST datasets and obtained search precision as high as fly hashing while consuming less time.
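The proposed STS circuit is not reproduced here; the sketch below only shows the baseline fly-hash encoding being improved on (a sparse binary random projection followed by a winner-take-all step), plus a simple fixed-threshold binarization as a stand-in for a threshold-based readout. Expansion factor, sparsity, and k are illustrative.

```python
# Baseline fly-hash style encoding: sparse 0/1 projection, then WTA top-k,
# with a crude threshold variant for comparison (sketch only, not the paper's STS circuit).
import numpy as np

def fly_hash(x, expansion=20, sparsity=0.1, k=32, seed=0):
    rng = np.random.default_rng(seed)
    d = x.size
    proj = (rng.random((expansion * d, d)) < sparsity).astype(float)  # sparse binary projection
    y = proj @ x                                                      # only additions in effect
    wta = np.zeros_like(y, dtype=np.uint8)
    wta[np.argsort(y)[-k:]] = 1                  # WTA: keep the k largest activations
    thresh = (y > y.mean()).astype(np.uint8)     # simple threshold-based binarization
    return wta, thresh

x = np.random.default_rng(2).standard_normal(128)
wta_code, thr_code = fly_hash(x)
print(wta_code.sum(), thr_code.sum())            # sparsity of the two codes
```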
The high usability of Web services is constrained by service unavailability, which includes binding unavailability, invoking unavailability, and executing unavailability. The main cause is environmental change, which renders Web services unavailable. This paper presents reflection technologies to improve the self-adaptive and robust abilities of Web services. Reflection layers are deployed on the client and server sides to catch traps, i.e., states of Web service unavailability. The reflection layers can adjust the internal structures and states of services, which avoids service unavailability in time and allows services to adapt to environmental change. The methods and processing procedures of the reflection layers are introduced to solve the problems of service unavailability. Experiments and analysis show that using reflection technology to deal with service unavailability is effective.
With the increase of public safety awareness, video anomaly detection has attracted researchers' attention. In this paper, a novel approach is proposed to detect anomalies in video. It is based on Locality Sensitive Hashing (LSH), which maps similar data to the same bucket with high probability and non-similar data to the same bucket with low probability, so that abnormal videos, which are not similar to normal videos, can be detected. To improve the probability of similar data mapping into the same bucket, a Genetic Algorithm (GA) is used to optimize the entire hash function group while maintaining its diversity. The algorithm achieves an AUC of 0.78 on the UCSD Ped1 dataset and an AUC of 0.94 on the UCSD Ped2 dataset, which confirms its effectiveness.
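The GA optimization of the hash-function group is not reproduced here; the sketch below only illustrates the underlying bucket test, assuming features of normal frames are hashed into several random-projection LSH tables and a test feature is flagged as anomalous when few tables place it in an occupied bucket. The table count, plane count, and vote threshold are illustrative, and a real system would tune them (the role the GA plays in the paper).

```python
# Bucket-occupancy anomaly test over several random-projection LSH tables (sketch only).
import numpy as np

def build_tables(normal_feats, n_tables=8, n_planes=16, seed=0):
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_tables, normal_feats.shape[1], n_planes))
    tables = []
    for t in range(n_tables):
        codes = (normal_feats @ planes[t] > 0)          # sign codes of normal features
        buckets = {}
        for code in codes:
            key = code.tobytes()
            buckets[key] = buckets.get(key, 0) + 1
        tables.append(buckets)
    return planes, tables

def is_anomalous(feat, planes, tables, min_votes=4):
    hits = 0
    for t, buckets in enumerate(tables):
        key = (feat @ planes[t] > 0).tobytes()
        hits += key in buckets                          # vote if some normal frame shares the bucket
    return hits < min_votes

rng = np.random.default_rng(3)
normal = rng.standard_normal((500, 64))
planes, tables = build_tables(normal)
print(is_anomalous(normal[0] + 0.01, planes, tables),   # likely False: near-duplicate of a normal frame
      is_anomalous(rng.standard_normal(64) * 5, planes, tables))  # likely True: unrelated feature
```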