Number of found documents: 1097

Nonparametric Bootstrap Techniques for Implicitly Weighted Robust Estimators
Kalina, Jan
2018 - English
The paper is devoted to highly robust statistical estimators based on implicit weighting, which have the potential to find econometric applications. Two particular methods are considered: a robust correlation coefficient based on the least weighted squares regression and the minimum weighted covariance determinant estimator, where the latter allows one to estimate the mean and covariance matrix of multivariate data. New tools are proposed that allow hypotheses about these robust estimators to be tested or their variance to be estimated. The techniques considered in the paper include resampling approaches with and without replacement, namely permutation tests, bootstrap variance estimation, and bootstrap confidence intervals. The performance of the newly described tools is illustrated on numerical examples. These reveal that the robust procedures are suitable also for non-contaminated data, as their confidence intervals are not much wider than those for standard maximum likelihood estimators. While resampling without replacement turns out to be more suitable for hypothesis testing, bootstrapping with replacement yields reliable confidence intervals but not corresponding hypothesis tests. Keywords: robust statistics; econometrics; correlation coefficient; multivariate data. Full text is available at an external website.
Ústav informatiky, 2018
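
A minimal sketch of the two resampling schemes described in the record above, using Spearman's rank correlation as a stand-in robust estimator (the paper's least weighted squares correlation is not implemented here); all data and parameter choices are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def spearman(x, y):
    # Spearman rank correlation via Pearson correlation of ranks (no tie handling)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

def permutation_test(x, y, n_perm=2000):
    # resampling WITHOUT replacement: permute y to break the pairing with x
    observed = spearman(x, y)
    exceed = sum(abs(spearman(x, rng.permutation(y))) >= abs(observed)
                 for _ in range(n_perm))
    return observed, (exceed + 1) / (n_perm + 1)

def bootstrap_ci(x, y, n_boot=2000, level=0.95):
    # resampling WITH replacement: percentile bootstrap interval for the correlation
    n = len(x)
    stats = [spearman(x[idx], y[idx])
             for idx in (rng.integers(0, n, size=n) for _ in range(n_boot))]
    return np.quantile(stats, [(1 - level) / 2, (1 + level) / 2])

x = rng.normal(size=50)
y = 0.6 * x + rng.normal(scale=0.8, size=50)
y[0] += 8.0                                   # one outlying observation
r, p = permutation_test(x, y)
lo, hi = bootstrap_ci(x, y)
print(f"r = {r:.3f}, permutation p = {p:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")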

How to down-weight observations in robust regression: A metalearning study
Kalina, Jan; Pitra, Zbyněk
2018 - English
Metalearning is becoming an increasingly important methodology for transferring knowledge from a database of available training data sets to a new (independent) data set. The concept of metalearning is becoming popular in statistical learning, and there is an increasing number of metalearning applications also in the analysis of economic data sets. Still, not much attention has been paid to its limitations and disadvantages. To this end, we use various linear regression estimators (including highly robust ones) over a set of 30 data sets with an economic background and perform a metalearning study over them, as well as over the same data sets after an artificial contamination. We focus on comparing the prediction performance of the least weighted squares estimator with various weighting schemes. A broader spectrum of classification methods is applied, and a support vector machine turns out to yield the best results. As the results of a leave-one-out cross-validation are very different from the results of autovalidation, we conclude that metalearning is highly unstable and its results should be interpreted with care. We also discuss the possible limitations of the metalearning methodology in general. Keywords: metalearning; robust statistics; linear regression; outliers. Available at various institutes of the ASCR
Ústav informatiky, 2018
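
A hedged sketch of the leave-one-dataset-out metalearning scheme outlined in the record above: meta-features of each data set are used to train a support vector machine that predicts which regression estimator performs best. The meta-features, estimator labels, and data below are placeholders, not the authors' actual study.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_datasets = 30

# placeholder meta-features (e.g. sample size, dimension, share of outliers) ...
meta_features = rng.normal(size=(n_datasets, 3))
# ... and a placeholder label of the best-performing estimator on each data set
best_estimator = rng.choice(["OLS", "LWS", "LTS"], size=n_datasets)

correct = 0
for i in range(n_datasets):                     # leave-one-dataset-out cross-validation
    train = np.delete(np.arange(n_datasets), i)
    clf = SVC(kernel="rbf").fit(meta_features[train], best_estimator[train])
    correct += clf.predict(meta_features[[i]])[0] == best_estimator[i]
print(f"leave-one-out metalearning accuracy: {correct / n_datasets:.2f}")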

Sparse Test Problems for Nonlinear Least Squares
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
This report contains a description of subroutines which can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in analytic form. Keywords: large-scale optimization; least squares; test problems. Available in the NRGL digital repository
Ústav informatiky, 2018
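
To illustrate the kind of sparse nonlinear least-squares test problem the report collects, here is a generic chained Rosenbrock residual function, a standard textbook example and not necessarily one of the report's own subroutines, solved with a general-purpose least-squares routine.

import numpy as np
from scipy.optimize import least_squares

def chained_rosenbrock_residuals(x):
    # residuals r with f(x) = sum(r**2); each row of the Jacobian has at most two nonzeros
    return np.concatenate([10.0 * (x[1:] - x[:-1] ** 2), 1.0 - x[:-1]])

n = 100
x0 = np.full(n, -1.2)
x0[1::2] = 1.0                                  # common alternating starting point
sol = least_squares(chained_rosenbrock_residuals, x0, method="trf")
print(f"final cost {sol.cost:.2e}, max deviation from 1: {np.max(np.abs(sol.x - 1.0)):.2e}")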

Problems for Nonlinear Least Squares and Nonlinear Equations
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
This report contains a description of subroutines which can be used for testing large-scale optimization codes. These subroutines can easily be obtained from the web page http://www.cs.cas.cz/~luksan/test.html. Furthermore, all test problems contained in these subroutines are presented in analytic form. Keywords: large-scale optimization; least squares; nonlinear equations; test problems. Available in the NRGL digital repository
Ústav informatiky, 2018
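
A companion sketch for the nonlinear-equations side of this report: the standard Broyden tridiagonal system, a common sparse test problem, solved with a matrix-free Newton-Krylov method. Again a generic example chosen for illustration, not taken from the report itself.

import numpy as np
from scipy.optimize import root

def broyden_tridiagonal(x):
    # F_i(x) = (3 - 2 x_i) x_i - x_{i-1} - 2 x_{i+1} + 1, with x_0 = x_{n+1} = 0
    f = (3.0 - 2.0 * x) * x + 1.0
    f[1:] -= x[:-1]
    f[:-1] -= 2.0 * x[1:]
    return f

n = 500
sol = root(broyden_tridiagonal, -np.ones(n), method="krylov")
print(sol.success, np.linalg.norm(broyden_tridiagonal(sol.x)))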

Numerical solution of generalized minimax problems
Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan
2018 - English
Keywords: Numerical optimization; nonlinear approximation; nonsmooth optimization; generalized minimax problems; recursive quadratic programming methods; interior point methods; smoothing methods; algorithms; numerical experiments. Available in the digital repository of the ASCR
Ústav informatiky, 2018

A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Vlček, Jan; Lukšan, Ladislav
2018 - English
Keywords: Unconstrained minimization; variable metric methods; limited-memory methods; the repeated BFGS update; global convergence; numerical results. Available in the digital repository of the ASCR
Ústav informatiky, 2018

Robust Metalearning: Comparing Robust Regression Using A Robust Prediction Error
Peštová, Barbora; Kalina, Jan
2018 - English
The aim of this paper is to construct a classification rule for predicting the best regression estimator for a new data set, based on a database of 20 training data sets. The estimators considered here include some popular methods of robust statistics. The methodology used for constructing the classification rule can be described as metalearning. Nevertheless, standard approaches to metalearning should be robustified when working with data sets contaminated by outlying measurements (outliers). Therefore, our contribution can also be described as a robustification of the metalearning process by using a robust prediction error. In addition to performing the metalearning study by means of both standard and robust approaches, we examine two particular situations in detail. The results of this detailed investigation show that the knowledge obtained by a metalearning approach based on standard principles is prone to great variability and instability, which makes it hard to believe that the results are not just a consequence of mere chance. This aspect of metalearning does not seem to have been previously analyzed in the literature. Keywords: metalearning; robust regression; outliers; robust prediction error. Full text is available at an external website.
Ústav informatiky, 2018
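
A minimal sketch of what a robust prediction error can look like, in the spirit of the record above: a trimmed mean of squared residuals that down-weights outlying prediction errors. The paper's exact robust error measure may differ, and the data below are made up.

import numpy as np

def trimmed_mse(y_true, y_pred, trim=0.2):
    # mean of the smallest (1 - trim) fraction of squared prediction errors
    sq = np.sort((np.asarray(y_true) - np.asarray(y_pred)) ** 2)
    keep = int(np.ceil((1.0 - trim) * len(sq)))
    return sq[:keep].mean()

y_true = np.array([1.0, 2.0, 3.0, 4.0, 100.0])  # last observation is an outlier
y_pred = np.array([1.1, 1.9, 3.2, 3.8, 4.0])
print(f"ordinary MSE: {np.mean((y_true - y_pred) ** 2):.2f}, trimmed MSE: {trimmed_mse(y_true, y_pred):.3f}")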

Semigroup Structure of Sets of Solutions to Equation X^s = X^m
Porubský, Štefan
2017 - English
Using an idempotent semigroup approach, we describe the semigroup and group structure of the set of solutions to the equation X^m = X^s in successive steps: over a periodic commutative semigroup, over multiplicative semigroups of factor rings of residually finite commutative rings, and finally over multiplicative semigroups of factor rings of residually finite commutative principal ideal domains. The analysis is done through the use of the maximal subsemigroups and groups corresponding to an idempotent of the corresponding semigroup and, in the case of residually finite PIDs, employs the available analysis of the Euler-Fermat theorem as given in [11]. In particular, the case when this set of solutions is a union of groups is handled. As a simple application, we point out a previously unnoticed group structure of the set of solutions to x^n = x connected with the message space of RSA cryptosystems and Fermat pseudoprimes. Keywords: set of solutions; idempotent; maximal semigroup corresponding to an idempotent; maximal group corresponding to an idempotent; equation X^s = X^m; finite commutative ring with identity element; residually finite commutative principal ideal domains. Available on request at various institutes of the ASCR
Ústav informatiky, 2017
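
A small computational illustration of the theme of the record above: enumerating the solutions of x^n = x in Z_m and checking by brute force that they are closed under multiplication, i.e. form a multiplicative subsemigroup. The parameters m = 15, n = 3 are an arbitrary toy choice, not taken from the paper.

# solutions of x^n ≡ x (mod m) and a closure check under multiplication mod m
m, n = 15, 3
solutions = {x for x in range(m) if pow(x, n, m) == x}
closed = all((a * b) % m in solutions for a in solutions for b in solutions)
print(sorted(solutions), "closed under multiplication:", closed)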

An adaptive recursive multilevel approximate inverse preconditioning: Computation of the Schur complement
Kopal, Jiří; Rozložník, Miroslav; Tůma, Miroslav
2017 - English
Available in the digital repository of the ASCR
Ústav informatiky, 2017

The Computational Power of Neural Networks and Representations of Numbers in Non-Integer Bases.
Šíma, Jiří
2017 - English
We briefly survey the basic concepts and results concerning the computational power of neural networks, which essentially depends on the information content of the weight parameters. In particular, recurrent neural networks with integer, rational, and arbitrary real weights are classified within the Chomsky and finer complexity hierarchies. We then refine the analysis between integer and rational weights by investigating an intermediate model: integer-weight neural networks with an extra analog rational-weight neuron (1ANN). We present a representation theorem which characterizes the classification problems solvable by 1ANNs, using so-called cut languages. Our analysis reveals an interesting link to an active research field on non-standard positional numeral systems with non-integer bases. Within this framework, we introduce the new concept of quasi-periodic numbers, which is used to classify the computational power of 1ANNs within the Chomsky hierarchy. Keywords: neural network; Chomsky hierarchy; beta-expansion; cut language. Available at various institutes of the ASCR
Ústav informatiky, 2017
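
A brief sketch of a greedy beta-expansion, the representation of numbers in a non-integer base mentioned in the record above; the base and the expanded number are illustrative choices only.

import math

def greedy_beta_expansion(x, beta, n_digits=20):
    # greedy digits d_k with x = sum_k d_k * beta**(-k), digits in {0, ..., ceil(beta) - 1}
    digits = []
    for _ in range(n_digits):
        x *= beta
        d = math.floor(x)
        digits.append(d)
        x -= d
    return digits

phi = (1 + math.sqrt(5)) / 2                    # golden-ratio base
print(greedy_beta_expansion(0.5, phi))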

About project

NRGL provides central access to information on grey literature produced in the Czech Republic in the fields of science, research and education. You can find more information about grey literature and NRGL at the service website.

Send your suggestions and comments to nusl@techlib.cz

Provider

http://www.techlib.cz
