Jubatus
Developer(s) | Nippon Telegraph and Telephone, Preferred Infrastructure
---|---
Stable release | 0.4.3 / April 19, 2013
Written in | C++
Operating system | Linux
Type | Machine learning
License | GNU Lesser General Public License 2.1
Website | jubat.us
Jubatus is an open-source online machine learning and distributed computing framework developed jointly by Nippon Telegraph and Telephone and Preferred Infrastructure. Its features include classification, recommendation, regression, anomaly detection and graph mining. It supports many client languages, including C++, Java, Ruby and Python, and it uses Iterative Parameter Mixture[1][2] for distributed machine learning.
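The Iterative Parameter Mixture strategy cited above trains a local model on each worker's data shard and then averages the parameters across workers after every pass. The following is a simplified illustration of that idea using a perceptron-style learner; it is not Jubatus's actual implementation, and all function and variable names are hypothetical.

```python
import numpy as np

def train_epoch(w, X, y, lr=0.1):
    """One perceptron-style pass over a local data shard (labels in {-1, +1})."""
    w = w.copy()
    for xi, yi in zip(X, y):
        if yi * np.dot(w, xi) <= 0:   # misclassified: nudge toward correct side
            w += lr * yi * xi
    return w

def iterative_parameter_mixture(shards, dim, epochs=10):
    """Each 'worker' trains on its shard; weights are averaged (mixed) each round."""
    w = np.zeros(dim)
    for _ in range(epochs):
        local = [train_epoch(w, X, y) for X, y in shards]
        w = np.mean(local, axis=0)    # the parameter mixture step
    return w
```

In a real distributed setting each shard would live on a separate node and the averaging step would be a network synchronization; the averaged model is then redistributed as the starting point for the next round, which is what makes the scheme *iterative*.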
Notable features
Jubatus supports:
- Multi-classification algorithms: Passive Aggressive[3][4][5], Confidence Weighted Learning[6][7][8], AROW[9] and Normal Herd[10]
- Recommendation algorithms
- Regression algorithms: Passive Aggressive
- Feature extraction methods for natural language
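The Passive Aggressive family listed above, due to Crammer et al.[5], makes the smallest weight change that restores a unit margin on each mistake. A minimal sketch of one binary-classification update (the function name is illustrative, not part of the Jubatus API):

```python
import numpy as np

def pa_update(w, x, y):
    """One Passive Aggressive update for a binary label y in {-1, +1}.

    If the hinge loss is positive, move w just far enough along y*x
    that the example ends up exactly on the margin (y * w.x == 1).
    """
    loss = max(0.0, 1.0 - y * np.dot(w, x))   # hinge loss of the current model
    if loss > 0.0:
        tau = loss / np.dot(x, x)             # closed-form minimal step size
        w = w + tau * y * x
    return w
```

The "passive" part is that correctly classified examples with sufficient margin leave the weights untouched; the "aggressive" part is that any margin violation is corrected in full in a single step.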
References
- ^ Ryan McDonald, K. Hall and G. Mann, Distributed Training Strategies for the Structured Perceptron, North American Association for Computational Linguistics (NAACL), 2010.
- ^ Gideon Mann, R. McDonald, M. Mohri, N. Silberman, and D. Walker, Efficient Large-Scale Distributed Training of Conditional Maximum Entropy Models, Neural Information Processing Systems (NIPS), 2009.
- ^ Crammer, Koby; Dekel, Ofer; Shalev-Shwartz, Shai; Singer, Yoram (2003). Online Passive-Aggressive Algorithms. Proceedings of the Sixteenth Annual Conference on Neural Information Processing Systems (NIPS).
- ^ Koby Crammer and Yoram Singer. Ultraconservative online algorithms for multiclass problems. Journal of Machine Learning Research, 2003.
- ^ Koby Crammer, Ofer Dekel, Joseph Keshet, Shai Shalev-Shwartz, Yoram Singer, Online Passive-Aggressive Algorithms. Journal of Machine Learning Research, 2006.
- ^ Mark Dredze, Koby Crammer and Fernando Pereira, Confidence-Weighted Linear Classification, Proceedings of the 25th International Conference on Machine Learning (ICML), 2008.
- ^ Koby Crammer, Mark Dredze and Fernando Pereira, Exact Convex Confidence-Weighted Learning, Proceedings of the Twenty Second Annual Conference on Neural Information Processing Systems (NIPS), 2008.
- ^ Koby Crammer, Mark Dredze and Alex Kulesza, Multi-Class Confidence Weighted Algorithms, Empirical Methods in Natural Language Processing (EMNLP), 2009.
- ^ Koby Crammer, Alex Kulesza and Mark Dredze, Adaptive Regularization Of Weight Vectors, Advances in Neural Information Processing Systems, 2009.
- ^ Koby Crammer and Daniel D. Lee, Learning via Gaussian Herding, Neural Information Processing Systems (NIPS), 2010.