
Publications of Prof. Dr. Ingo Steinwart

Book

I. Steinwart and A. Christmann, Support Vector Machines. New York: Springer, 2008. [ final ]

Preprints

I. Steinwart, Measuring the capacity of sets of functions in the analysis of ERM, Tech. Rep. 2014-008, Fakultät für Mathematik und Physik, Universität Stuttgart, 2014. [ preprint.pdf ]

I. Steinwart, Convergence types and rates in generic Karhunen-Loève expansions with applications to sample path properties, Tech. Rep. 2014-007, Fakultät für Mathematik und Physik, Universität Stuttgart, 2014. [ preprint.pdf ]

M. Eberts and I. Steinwart, Optimal learning rates for localized SVMs, Tech. Rep. 2014-002, Fakultät für Mathematik und Physik, Universität Stuttgart, 2014. [ preprint.pdf ]

I. Steinwart, Fully adaptive density-based clustering, Tech. Rep. 2013-016, Fakultät für Mathematik und Physik, Universität Stuttgart, 2013. [ preprint.pdf ]

L. Bornn, M. Anghel, and I. Steinwart, Forecasting with historical data or process knowledge under misspecification: A comparison, tech. rep., Fakultät für Mathematik und Physik, Universität Stuttgart, 2012. [ preprint.pdf ]

Accepted

I. Steinwart, C. Pasin, R. Williamson, and S. Zhang, Elicitation and identification of properties, in JMLR Workshop and Conference Proceedings Volume XX: Proceedings of the 27th Conference on Learning Theory 2014 (M. F. Balcan and C. Szepesvári, eds.), 2014. [ final | preprint.pdf ]

2014

H. Hang and I. Steinwart, Fast learning from α-mixing observations, J. Multivariate Anal., vol. 127, pp. 184-199, 2014. [ final | preprint.pdf ]

2013

I. Steinwart, Some remarks on the statistical analysis of SVMs and related methods, in Empirical Inference - Festschrift in Honor of Vladimir N. Vapnik (B. Schölkopf, Z. Luo, and V. Vovk, eds.), ch. 4, pp. 25-36, Berlin: Springer, 2013. [ final | preprint.pdf ]

M. Eberts and I. Steinwart, Optimal regression rates for SVMs using Gaussian kernels, Electron. J. Stat., vol. 7, pp. 1-42, 2013. [ final | preprint.pdf ]

2012

B. Sriperumbudur and I. Steinwart, Consistency and rates for clustering with DBSCAN, in JMLR Workshop and Conference Proceedings Volume 22: Proceedings of the 15th International Conference on Artificial Intelligence and Statistics 2012 (N. Lawrence and M. Girolami, eds.), pp. 1090-1098, 2012. [ final | preprint.pdf ]

I. Steinwart and C. Scovel, Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs, Constr. Approx., vol. 35, pp. 363-417, 2012. [ final | preprint.pdf ]

2011

M. Eberts and I. Steinwart, Optimal learning rates for least squares SVMs using Gaussian kernels, in Advances in Neural Information Processing Systems 24 (J. Shawe-Taylor, R. Zemel, P. Bartlett, F. Pereira, and K. Weinberger, eds.), pp. 1539-1547, 2011. [ final | preprint.pdf ]

I. Steinwart, Adaptive density level set clustering, in JMLR Workshop and Conference Proceedings Volume 19: Proceedings of the 24th Conference on Learning Theory 2011 (S. Kakade and U. von Luxburg, eds.), pp. 703-738, 2011. [ final | preprint.pdf ]

I. Steinwart and A. Christmann, Estimating conditional quantiles with the help of the pinball loss, Bernoulli, vol. 17, pp. 211-225, 2011. [ final | preprint.pdf ]

I. Steinwart, D. Hush, and C. Scovel, Training SVMs without offset, J. Mach. Learn. Res., vol. 12, pp. 141-202, 2011. [ final | preprint.pdf ]

2010

A. Christmann and I. Steinwart, Universal kernels on non-standard input spaces, in Advances in Neural Information Processing Systems 23 (J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R. Zemel, and A. Culotta, eds.), pp. 406-414, 2010. [ final | preprint.pdf ]

I. Steinwart, J. Theiler, and D. Llamocca, Using support vector machines for anomalous change detection, in 2010 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 3732-3735, 2010. [ preprint.pdf ]

C. Scovel, D. Hush, I. Steinwart, and J. Theiler, Radial kernels and their reproducing kernel Hilbert spaces, J. Complexity, vol. 26, pp. 641-660, 2010. [ final | preprint.pdf ]

2009

I. Steinwart and A. Christmann, Sparsity of SVMs that use the ε-insensitive loss, in Advances in Neural Information Processing Systems 21 (D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, eds.), pp. 1569-1576, 2009. [ final | preprint.pdf ]

I. Steinwart and A. Christmann, Fast learning from non-i.i.d. observations, in Advances in Neural Information Processing Systems 22 (Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, eds.), pp. 1768-1776, 2009. [ final | preprint.pdf ]

I. Steinwart, D. Hush, and C. Scovel, Optimal rates for regularized least squares regression, in Proceedings of the 22nd Annual Conference on Learning Theory (S. Dasgupta and A. Klivans, eds.), pp. 79-93, 2009. [ final | preprint.pdf ]

A. Christmann, A. van Messem, and I. Steinwart, On consistency and robustness properties of support vector machines for heavy-tailed distributions, Stat. Interface, vol. 2, pp. 311-327, 2009. [ final ]

I. Steinwart, Oracle inequalities for SVMs that are based on random entropy numbers, J. Complexity, vol. 25, pp. 437-454, 2009. [ final | preprint.pdf ]

I. Steinwart, Two oracle inequalities for regularized boosting classifiers, Stat. Interface, vol. 2, pp. 271-284, 2009. [ final ]

I. Steinwart, D. Hush, and C. Scovel, Learning from dependent observations, J. Multivariate Anal., vol. 100, pp. 175-194, 2009. [ final | preprint.pdf ]

I. Steinwart and M. Anghel, An SVM approach for forecasting the evolution of an unknown ergodic dynamical system from observations with unknown noise, Ann. Statist., vol. 37, pp. 841-875, 2009. [ final ]

2008

I. Steinwart and A. Christmann, Support Vector Machines. New York: Springer, 2008. [ final ]

I. Steinwart and A. Christmann, How SVMs can estimate quantiles and the median, in Advances in Neural Information Processing Systems 20 (J. Platt, D. Koller, Y. Singer, and S. Roweis, eds.), (Cambridge, MA), pp. 305-312, MIT Press, 2008. [ final | preprint.pdf ]

A. Christmann and I. Steinwart, Consistency of kernel based quantile regression, Appl. Stoch. Models Bus. Ind., vol. 24, pp. 171-183, 2008. [ final ]

2007

N. List, D. Hush, C. Scovel, and I. Steinwart, Gaps in support vector optimization, in Proceedings of the 20th Annual Conference on Learning Theory (N. Bshouty and C. Gentile, eds.), (New York), pp. 336-348, Springer, 2007. [ final | preprint.pdf ]

I. Steinwart, D. Hush, and C. Scovel, An oracle inequality for clipped regularized risk minimizers, in Advances in Neural Information Processing Systems 19 (B. Schölkopf, J. Platt, and T. Hoffman, eds.), (Cambridge, MA), pp. 1321-1328, MIT Press, 2007. [ final | preprint.pdf ]

A. Christmann, I. Steinwart, and M. Hubert, Robust learning from bites for data mining, Comput. Statist. Data Anal., vol. 52, pp. 347-361, 2007. [ final | preprint.pdf ]

A. Christmann and I. Steinwart, Consistency and robustness of kernel-based regression in convex risk minimization, Bernoulli, vol. 13, pp. 799-819, 2007. [ final ]

D. Hush, C. Scovel, and I. Steinwart, Stability of unstable learning algorithms, Mach. Learn., vol. 67, pp. 197-206, 2007. [ final ]

C. Scovel, D. Hush, and I. Steinwart, Approximate duality, J. Optim. Theory Appl., vol. 135, pp. 429-443, 2007. [ final | preprint.pdf ]

I. Steinwart and C. Scovel, Fast rates for support vector machines using Gaussian kernels, Ann. Statist., vol. 35, pp. 575-607, 2007. [ final ]

I. Steinwart, How to compare different loss functions, Constr. Approx., vol. 26, pp. 225-287, 2007. [ final | preprint.pdf ]

2006

I. Steinwart, D. Hush, and C. Scovel, Function classes that approximate the Bayes risk, in Proceedings of the 19th Annual Conference on Learning Theory (G. Lugosi and H. U. Simon, eds.), (New York), pp. 79-93, Springer, 2006. [ final | preprint.pdf ]

I. Steinwart, D. Hush, and C. Scovel, A new concentration result for regularized risk minimizers, in High Dimensional Probability IV (E. Giné, V. Koltchinskii, W. Li, and J. Zinn, eds.), (Beachwood, OH), pp. 260-275, Institute of Mathematical Statistics, 2006. [ final | preprint.pdf ]

D. Hush, P. Kelly, C. Scovel, and I. Steinwart, QP algorithms with guaranteed accuracy and run time for support vector machines, J. Mach. Learn. Res., vol. 7, pp. 733-769, 2006. [ final ]

I. Steinwart, D. Hush, and C. Scovel, An explicit description of the reproducing kernel Hilbert spaces of Gaussian RBF kernels, IEEE Trans. Inform. Theory, vol. 52, pp. 4635-4643, 2006. [ final ]

2005

D. Hush, P. Kelly, C. Scovel, and I. Steinwart, Provably fast algorithms for anomaly detection, in International Workshop on Data Mining Methods for Anomaly Detection at KDD 2005, pp. 27-31, 2005. [ final | preprint.pdf ]

I. Steinwart, D. Hush, and C. Scovel, Density level detection is classification, in Advances in Neural Information Processing Systems 17 (L. K. Saul, Y. Weiss, and L. Bottou, eds.), (Cambridge, MA), pp. 1337-1344, MIT Press, 2005. [ final | preprint.pdf ]

I. Steinwart and C. Scovel, Fast rates for support vector machines, in Proceedings of the 18th Annual Conference on Learning Theory (P. Auer and R. Meir, eds.), (New York), pp. 279-294, Springer, 2005. [ final | preprint.pdf ]

C. Scovel, D. Hush, and I. Steinwart, Learning rates for density level detection, Anal. Appl., vol. 3, pp. 356-371, 2005. Electronic version of an article published in Analysis and Applications, DOI: 10.1142/S0219530505000625, © World Scientific Publishing Company, http://www.worldscinet.com/aa. [ preprint.pdf ]

I. Steinwart, Consistency of support vector machines and other regularized kernel machines, IEEE Trans. Inform. Theory, vol. 51, pp. 128-142, 2005. [ final ]

I. Steinwart, D. Hush, and C. Scovel, A classification framework for anomaly detection, J. Mach. Learn. Res., vol. 6, pp. 211-232, 2005. [ final ]

2004

I. Steinwart, Sparseness of support vector machines – some asymptotically sharp bounds, in Advances in Neural Information Processing Systems 16 (S. Thrun, L. Saul, and B. Schölkopf, eds.), (Cambridge, MA), pp. 1069-1076, MIT Press, 2004. [ final | preprint.pdf ]

I. Steinwart and C. Scovel, When do support vector machines learn fast?, in 16th International Symposium on Mathematical Theory of Networks and Systems, 2004. [ preprint.pdf ]

A. Christmann and I. Steinwart, On robustness properties of convex risk minimization methods for pattern recognition, J. Mach. Learn. Res., vol. 5, pp. 1007-1034, 2004. [ final ]

I. Steinwart, Entropy of convex hulls – some Lorentz norm results, J. Approx. Theory, vol. 128, pp. 42-52, 2004. [ final | preprint.pdf ]

2003

K. Mittmann and I. Steinwart, On the existence of continuous modifications of vector-valued random fields, Georgian Math. J., vol. 10, pp. 311-317, 2003. [ final ]

I. Steinwart, Entropy numbers of convex hulls and an application to learning algorithms, Arch. Math., vol. 80, pp. 310-318, 2003. [ final | preprint.pdf ]

I. Steinwart, On the optimal parameter choice for ν-support vector machines, IEEE Trans. Pattern Anal. Mach. Intell., vol. 25, pp. 1274-1284, 2003. [ final ]

I. Steinwart, Sparseness of support vector machines, J. Mach. Learn. Res., vol. 4, pp. 1071-1105, 2003. [ final ]

2002

J. Creutzig and I. Steinwart, Metric entropy of convex hulls in type p spaces – the critical case, Proc. Amer. Math. Soc., vol. 130, pp. 733-743, 2002. [ final ]

I. Steinwart, Support vector machines are universally consistent, J. Complexity, vol. 18, pp. 768-791, 2002. [ final | preprint.pdf ]

2001

I. Steinwart, On the influence of the kernel on the consistency of support vector machines, J. Mach. Learn. Res., vol. 2, pp. 67-93, 2001. [ final ]

2000

I. Steinwart, Entropy of C(K)-valued operators, J. Approx. Theory, vol. 103, pp. 302-328, 2000. [ final | preprint.pdf ]

Theses

I. Steinwart, Entropy of C(K)-valued operators and some applications. PhD thesis, Friedrich-Schiller-Universität Jena, Fakultät für Mathematik und Informatik, 2000. [ final ]

I. Bartels, Gewichtete Normungleichungen für Operatoren zwischen Räumen Bochner-integrierbarer Funktionen (Weighted norm inequalities for operators between spaces of Bochner-integrable functions), Master's thesis, Carl von Ossietzky Universität Oldenburg, Fachbereich Mathematik, 1997. [ final ]

Unpublished

I. Steinwart, Which data-dependent bounds are suitable for SVM's?, 2002. [ preprint.pdf ]