Parameter estimation in generalized partial linear models with Tikhonov regularization
dc.contributor.advisor | Karasözen, Bülent | |
dc.contributor.advisor | Weber, Gerhard Wilhelm | |
dc.contributor.author | Kayhan, Belgin | |
dc.date.accessioned | 2020-12-10T09:06:59Z | |
dc.date.available | 2020-12-10T09:06:59Z | |
dc.date.submitted | 2010 | |
dc.date.issued | 2018-08-06 | |
dc.identifier.uri | https://acikbilim.yok.gov.tr/handle/20.500.12812/223885 | |
dc.description.abstract | Regression analysis refers to techniques for modeling and analyzing several variables in statistical learning. There are various types of regression models. In our study, we analyze Generalized Partial Linear Models (GPLMs), which decompose the input variables into two sets and additively combine a classical linear model with a nonlinear model part. By separating the linear submodel from the nonlinear one, Tikhonov regularization, an inverse-problem method, is applied to the nonlinear submodel separately, within the entire GPLM. This particular representation of the submodels provides both better accuracy and better stability (regularity) under noise in the data. We aim to smooth the nonparametric part of the GPLM by using a modified form of Multivariate Adaptive Regression Splines (MARS), which is very useful for high-dimensional problems and does not impose any specific relationship between the predictor and dependent variables. Instead, it estimates the contribution of the basis functions so that both the additive and interaction effects of the predictors are allowed to determine the dependent variable. The MARS algorithm has two steps: the forward and backward stepwise algorithms. In the first one, the model is built by adding basis functions until a maximum level of complexity is reached. The backward stepwise algorithm then removes the least significant basis functions from the model. In this study, we propose to use a penalized residual sum of squares (PRSS) instead of the backward stepwise algorithm and construct the PRSS for MARS as a Tikhonov regularization problem. In addition, we provide numerical examples with two data sets, one with interaction effects and one without. As well as studying the regularization of the nonparametric part, we also discuss the regularization of the parametric part theoretically. Furthermore, we make a comparison between Infinite Kernel Learning (IKL) and Tikhonov regularization by using two data sets, the difference between them consisting in the (non-)homogeneity of the data. The thesis concludes with an outlook on future research. | en_US |
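The abstract's central tool, Tikhonov regularization of a least-squares fit, can be sketched in a few lines. The function name, the ridge-style identity penalty, and the synthetic data below are illustrative assumptions only; they are not the thesis's actual PRSS construction for the MARS basis functions, which uses a problem-specific penalty matrix.

```python
import numpy as np

def tikhonov_solve(X, y, lam, L=None):
    """Solve min_beta ||X beta - y||^2 + lam * ||L beta||^2 in closed form.

    With L = I this is standard ridge regression; the thesis replaces the
    backward stepwise step of MARS with a penalty of this general shape.
    """
    n_features = X.shape[1]
    if L is None:
        L = np.eye(n_features)  # default: penalize the coefficients directly
    # Normal equations of the penalized problem:
    # (X^T X + lam * L^T L) beta = X^T y
    A = X.T @ X + lam * (L.T @ L)
    return np.linalg.solve(A, X.T @ y)

# Synthetic noisy linear data: increasing lam shrinks the coefficients,
# trading a slightly worse fit for stability under noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=50)

beta_ols = tikhonov_solve(X, y, lam=0.0)   # unregularized least squares
beta_reg = tikhonov_solve(X, y, lam=10.0)  # Tikhonov-regularized estimate
```

In the eigenbasis of `X.T @ X`, each component of the regularized solution is the ordinary least-squares component scaled by a factor strictly below one, which is the stability (regularity) gain under noise that the abstract refers to.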
dc.language | English | |
dc.language.iso | en | |
dc.rights | info:eu-repo/semantics/openAccess | |
dc.rights | Attribution 4.0 United States | tr_TR |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.subject | Matematik | tr_TR |
dc.subject | Mathematics | en_US |
dc.subject | İstatistik | tr_TR |
dc.subject | Statistics | en_US |
dc.title | Parameter estimation in generalized partial linear models with Tikhonov regularization | |
dc.title.alternative | Genelleştirilmiş parçalı doğrusal modellerde Tikhonov düzenleme ile parametre tahmini | |
dc.type | masterThesis | |
dc.date.updated | 2018-08-06 | |
dc.contributor.department | Bilimsel Hesaplama Anabilim Dalı | |
dc.subject.ytm | Parameter estimation | |
dc.subject.ytm | Optimization problem | |
dc.subject.ytm | Generalized linear models | |
dc.subject.ytm | Optimization | |
dc.identifier.yokid | 384234 | |
dc.publisher.institute | Uygulamalı Matematik Enstitüsü | |
dc.publisher.university | ORTA DOĞU TEKNİK ÜNİVERSİTESİ | |
dc.identifier.thesisid | 275854 | |
dc.description.pages | 115 | |
dc.publisher.discipline | Diğer |