Kernel collaborative online algorithms for multi-task learning
2019
In many real-time applications, we often face classification, regression, or clustering problems that involve multiple tasks. Conventional machine learning approaches solve these tasks independently, ignoring the task relatedness. In multi-task learning (MTL), related tasks are learned simultaneously by extracting and exploiting the information shared across tasks. Learning related tasks together effectively increases the sample size available to each task and improves generalization performance, so MTL is especially beneficial when the training set for each task is small. This paper describes multi-task learning using a kernel online learning approach. Since many real-world applications are online in nature, efficient online learning techniques are much needed; because online learning processes one data point at a time, such techniques can be applied effectively to large data sets. The MTL model we develop involves a global function and a task-specific function for each task. The cost function used to find each task-specific function incorporates the global model, thereby drawing in the necessary information from the other tasks; this modeling strategy improves the generalization capacity of the model. Finding the global and task-specific functions is formulated as two separate problems: at each arrival of new data, the global vector is solved first, and its information is then used to update the task-specific vector. The update rule for the task-specific function approximates the global components using task-specific components by means of projection. We applied the developed framework to real-world problems, and the results were found to be promising.
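To make the two-step online update concrete, the following is a minimal sketch, assuming a Gaussian kernel, a squared loss, and a coupling weight `lam` that pulls each task toward the shared function; the class name `KernelOnlineMTL`, its method names, and the specific loss are illustrative assumptions, not the paper's exact formulation, and the paper's projection of global components onto task-specific components is only indicated by a comment.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) kernel."""
    return np.exp(-gamma * np.linalg.norm(x - z) ** 2)

class KernelOnlineMTL:
    """Hypothetical sketch of a collaborative online kernel MTL learner.

    Maintains one global kernel expansion shared by all tasks plus one
    expansion per task. On each example the global model is updated
    first; its prediction is then used in the task-specific update,
    which regularizes every task toward the shared function.
    """

    def __init__(self, n_tasks, eta=0.1, lam=0.5, kernel=rbf):
        self.eta = eta          # learning rate
        self.lam = lam          # coupling strength toward the global model
        self.kernel = kernel
        self.global_sv = []     # (x, alpha) pairs of the global expansion
        self.task_sv = [[] for _ in range(n_tasks)]

    def _predict(self, sv, x):
        return sum(a * self.kernel(xi, x) for xi, a in sv)

    def predict(self, task, x):
        # A task's prediction combines the shared and task-specific parts.
        return self._predict(self.global_sv, x) + self._predict(self.task_sv[task], x)

    def partial_fit(self, task, x, y):
        # Step 1: update the global function on the new example
        # (functional gradient of a squared loss).
        g = self._predict(self.global_sv, x)
        self.global_sv.append((x, -self.eta * (g - y)))

        # Step 2: update the task-specific function, penalizing both the
        # prediction error and the deviation from the fresh global model.
        # The paper additionally approximates the global components in the
        # task-specific dictionary by projection; omitted here for brevity.
        g = self._predict(self.global_sv, x)
        f = g + self._predict(self.task_sv[task], x)
        grad = (f - y) + self.lam * (f - g)   # loss term + coupling term
        self.task_sv[task].append((x, -self.eta * grad))
```

In a practical run the expansions would also need a budget or sparsification step to keep memory bounded, which is exactly where the projection step described in the abstract would enter; the sketch leaves the expansions unbounded for clarity.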