Articles (Estadística e Investigación Operativa)

Permanent URI for this collection: https://hdl.handle.net/11441/10844

Recent Submissions

Showing 1 - 20 of 415
  • Open Access Article
    Probability, Statistics and Estimation
    (MDPI, 2025-01-23) Gómez, Yolanda M.; Barranco Chamorro, Inmaculada; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM153: Estadística e Investigación Operativa
  • Open Access Article
    Dealing with Imprecision in Performance Evaluation Processes Using Indicators: A Fuzzy Distance-Based Approach
    (Springer, 2015-11-16) Gálvez Ruiz, David; Pino Mejías, José Luis; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328: Métodos Cuantitativos en Evaluación
    This paper presents an approach for dealing with imprecision in performance evaluation analysis focused particularly on Public Policy but applicable to other fields. While most previous research builds fuzzy indicators by assessing a membership function as the indicator itself, with direct subjective evaluation, this new approach is based on the fuzzification of crisp indicators which are subject to imprecision, vagueness or an incomplete perception of reality. These indicators may be quantitative or qualitative yet approximate expressions from real data in order to define results and systematize the information. The evaluation process is based on those expressions and uses fuzzy distances as a performance evaluation tool to show the degree of efficacy related to both crisp and fuzzy targets set as the reference value in the evaluation process.
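    A minimal illustrative sketch (not taken from the paper): a crisp indicator subject to imprecision is fuzzified into a triangular fuzzy number and its efficacy is measured against a fuzzy target with a vertex-type distance; the spreads, the target and the choice of distance are assumptions made only for illustration.
    ```python
    from dataclasses import dataclass
    from math import sqrt

    @dataclass
    class TriangularFuzzyNumber:
        a: float  # lower end of the support
        b: float  # peak (most plausible value)
        c: float  # upper end of the support

    def fuzzify(crisp_value: float, spread: float) -> TriangularFuzzyNumber:
        """Turn an imprecise crisp indicator into a triangular fuzzy number
        by attaching a symmetric spread (the spread is an assumed choice)."""
        return TriangularFuzzyNumber(crisp_value - spread, crisp_value, crisp_value + spread)

    def vertex_distance(x: TriangularFuzzyNumber, y: TriangularFuzzyNumber) -> float:
        """A common vertex-type distance between two triangular fuzzy numbers."""
        return sqrt(((x.a - y.a) ** 2 + (x.b - y.b) ** 2 + (x.c - y.c) ** 2) / 3.0)

    # Observed indicator (e.g., coverage rate of a programme) against a fuzzy target.
    observed = fuzzify(0.72, spread=0.05)
    target = TriangularFuzzyNumber(0.75, 0.80, 0.85)
    print(f"distance to target: {vertex_distance(observed, target):.4f}")
    ```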
  • Open Access Article
    Generalized modified slash distribution with applications
    (Taylor & Francis, 2019-01-07) Reyes, Jimmy; Barranco Chamorro, Inmaculada; Gómez, Héctor W.; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM153: Estadística e Investigación Operativa
    In this paper a generalization of the modified slash distribution is introduced. This model is based on the quotient of two independent random variables, whose distributions are a normal and a one-parameter gamma, respectively. The resulting distribution is a new model whose kurtosis is greater than that of other slash distributions. The probability density function, its properties, moments, and kurtosis coefficient are obtained. Inference based on moment and maximum likelihood methods is carried out. The multivariate version is also introduced. Applications to two datasets are considered. They illustrate the applicability of our results, along with the fact that the new model can provide a better fit to symmetric data with heavy tails than other slash extensions previously introduced in the literature.
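    A minimal simulation sketch (not the authors' code) of the quotient construction described above: a standard normal divided by an independent one-parameter gamma variate. The gamma parameterization and the value of the tail parameter are assumptions made only for illustration.
    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_quotient(n: int, q: float) -> np.ndarray:
        """Draw from X = Z / W with Z ~ N(0, 1) and W ~ Gamma(shape=q, rate=1),
        independent (parameterization assumed for illustration)."""
        z = rng.standard_normal(n)
        w = rng.gamma(shape=q, scale=1.0, size=n)
        return z / w

    # Under this parameterization E[W^(-4)] is finite only for q > 4,
    # so q = 5 keeps the kurtosis coefficient finite while the tails stay heavy.
    x = sample_quotient(100_000, q=5.0)

    m2 = np.mean((x - x.mean()) ** 2)
    m4 = np.mean((x - x.mean()) ** 4)
    print("sample excess kurtosis:", m4 / m2 ** 2 - 3.0)
    ```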
  • Open Access Article
    Modelling zero-inflated count data with a special case of the generalised Poisson distribution
    (Cambridge University Press, 2019-09-04) Calderín Ojeda, Enrique; Gómez Déniz, Emilio; Barranco Chamorro, Inmaculada; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM153: Estadística e Investigación Operativa
    A one-parameter version of the generalised Poisson distribution provided by Consul and Jain (1973) is considered in this paper. The distribution is unimodal with a zero vertex and over-dispersed. A generalised linear model related to this distribution is also presented. Its parameters can be estimated by using a Fisher-Scoring algorithm which is equivalent to iteratively reweighted least squares. Due to its flexibility and capacity to describe highly skewed data with an excessive number of zeros, the model is suitable to be applied in insurance settings as an alternative to the negative binomial and zero-inflated model.
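    The sketch below illustrates the Fisher-scoring/IRLS equivalence mentioned above in its most familiar setting, an ordinary Poisson log-linear model on simulated data; it is a generic illustration, not the estimation routine for the one-parameter generalised Poisson model of the paper.
    ```python
    import numpy as np

    def irls_poisson(X: np.ndarray, y: np.ndarray, n_iter: int = 25) -> np.ndarray:
        """Fisher scoring for a Poisson GLM with log link, written as
        iteratively reweighted least squares (with the canonical link,
        Fisher scoring and Newton-Raphson coincide)."""
        beta = np.zeros(X.shape[1])
        for _ in range(n_iter):
            eta = X @ beta
            mu = np.exp(eta)
            w = mu                       # Fisher weights: Var(Y_i) = mu_i
            z = eta + (y - mu) / mu      # working response
            beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        return beta

    rng = np.random.default_rng(1)
    X = np.column_stack([np.ones(500), rng.normal(size=500)])
    y = rng.poisson(np.exp(X @ np.array([0.3, 0.8])))
    print(irls_poisson(X, y))            # close to the true coefficients [0.3, 0.8]
    ```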
  • Open Access Article
    Discovering potential biomarkers for Ochratoxin A production by Penicillium nordicum in dry-cured meat matrices through untargeted metabolomics
    (Elsevier, 2024-01-30) Garrido Rodríguez, David; Andrade, María J.; Delgado, Josué; Cebrián, Eva; Barranco Chamorro, Inmaculada; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM153: Estadística e Investigación Operativa
    Ochratoxin A (OTA) is a mycotoxin produced by Penicillium nordicum, a species that is a notable producer in dry-cured meat products. This toxin is one of the main fungal contaminants found in a variety of foods, including ripened meat products. Owing to its harmful effects on health, it is essential to develop strategies, from different approaches, to identify whether products are contaminated and to predict mycotoxin production during maturation so that the necessary corrective measures can be established. In this study, targeted and untargeted metabolomic analyses were conducted to identify differential metabolites produced by three P. nordicum strains. These metabolites could be related to OTA production in agar-ham and dry-fermented sausage agar media under different water activity conditions. A Q Exactive™ Plus Hybrid Quadrupole-Orbitrap™ mass spectrometer was used to identify potential biomarkers and predictor variables associated with OTA, employing six comparisons covering various aspects before and after OTA emergence, two analysis platforms (RStudio and MetaboAnalyst) and two approaches (imputation and non-imputation). The analysis followed two strategies, a univariate and a multivariate statistical analysis. As a result, several metabolites were identified as significantly linked to OTA production, including Asenjonamide C, (S,S)-Anacine, SPF-32629 A and Pseudouridine, which could potentially serve as biomarkers for OTA production. The study also highlights the importance of using bioinformatic platforms, such as RStudio and MetaboAnalyst, for comprehensive analysis when integrating multiple outputs. After pinpointing metabolites associated with OTA production, which deserve further study in hams subjected to industrial ripening, this work paves the way for OTA prevention and corrective actions within the Hazard Analysis and Critical Control Points (HACCP) framework.
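    A schematic of the kind of univariate screening step described above, run on simulated log-intensities only: Welch t-tests per metabolite between producing and non-producing conditions with Benjamini-Hochberg control of the false discovery rate, preceded by a simple half-minimum imputation of missing values. The group sizes, the imputation rule and the 5% threshold are assumptions, not choices taken from the paper.
    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # Simulated log-intensities: 200 metabolites x (6 producing + 6 non-producing) samples
    n_metab, n_per_group = 200, 6
    producing = rng.normal(10.0, 1.0, size=(n_metab, n_per_group))
    non_producing = rng.normal(10.0, 1.0, size=(n_metab, n_per_group))
    producing[:10] += 2.0                              # a few truly differential metabolites

    # Crude imputation of simulated missing values by half the minimum observed intensity
    producing[rng.random(producing.shape) < 0.05] = np.nan
    producing = np.where(np.isnan(producing), 0.5 * np.nanmin(producing), producing)

    _, p = stats.ttest_ind(producing, non_producing, axis=1, equal_var=False)

    # Benjamini-Hochberg adjustment of the per-metabolite p-values
    order = np.argsort(p)
    adj = np.minimum.accumulate((p[order] * n_metab / np.arange(1, n_metab + 1))[::-1])[::-1]
    p_adj = np.empty_like(adj)
    p_adj[order] = adj
    print("candidate biomarkers (FDR < 0.05):", int((p_adj < 0.05).sum()))
    ```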
  • Open Access Article
    Statistical analysis of solid lipid nanoparticles produced by high-pressure homogenization: a practical prediction approach
    (Springer Nature, 2013) Durán Lobato, María Matilde; Enguix González, Alicia; Fernández Arévalo, María Mercedes; Martín Banderas, Lucía; Universidad de Sevilla. Departamento de Farmacia y Tecnología Farmacéutica; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Junta de Andalucía
    Lipid nanoparticles (LNPs) are a promising carrier for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among the LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of the high-pressure homogenization technique (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNP by HPH. Spherical LNPs with mean size ranging from 65 nm to 11.623 μm, negative zeta potential under –30 mV, and smooth surface were produced. Manageable equations based on commonly used parameters in the pharmaceutical field were obtained. The lipid-to-emulsifier ratio (R_L/S) was shown to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. In addition, the homogenization pressure was found to ultimately determine LNP size for a given R_L/S, while the number of passes applied mainly determined polydispersion. α-Tocopherol was used as a model drug to illustrate the release properties of LNP as a function of particle size, which was optimized by the regression models. This study is intended as a first step to optimize production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.
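    A toy regression sketch, in the spirit of the "manageable equations" mentioned above but fitted to simulated data: the log of the mean particle size is modelled from the lipid-to-emulsifier ratio R_L/S, the homogenization pressure and the number of passes. The functional form, units and coefficients are assumptions, not the paper's fitted model.
    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 60

    r_ls = rng.uniform(1.0, 10.0, n)          # lipid-to-emulsifier ratio R_L/S
    pressure = rng.uniform(200.0, 1000.0, n)  # homogenization pressure (bar, assumed)
    passes = rng.integers(1, 6, n)            # number of homogenization cycles

    # Simulated response: log mean size grows with R_L/S, shrinks with pressure and passes
    log_size = 5.0 + 0.15 * r_ls - 0.002 * pressure - 0.05 * passes + rng.normal(0, 0.1, n)

    # Ordinary least squares fit of the assumed linear predictor
    X = np.column_stack([np.ones(n), r_ls, pressure, passes])
    coef, *_ = np.linalg.lstsq(X, log_size, rcond=None)
    print(dict(zip(["intercept", "R_L/S", "pressure", "passes"], coef.round(4))))
    ```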
  • Open Access Article
    E-commerce shipping through a third-party supply chain
    (Elsevier, 2020-05-09) Ponce López, Diego; Contreras, Iván; Laporte, Gilbert; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM331: Métodos y Modelos de la Estadística y la Investigación Operativa
    We consider an e-commerce retailer who must ship orders from a warehouse to a set of customers with delivery deadlines. As is often the case, the retailer uses a third-party service provider to ensure its distribution. The retailer can enter the supply chain of the service provider at various levels. Entering it at a higher level entails lower sorting costs for the retailer, but higher delivery costs, and longer delivery times. The customer orders arrive at various moments over a rolling planning horizon. This means that the retailer must also make consolidation decisions. We model and solve the static and dynamic cases of this problem. The static case is modeled as an integer linear program and solved by CPLEX. We develop and compare four shipping policies for the dynamic case. Extensive computational results based on real location data from California and Texas are reported.
  • Open Access Article
    Portfolio problems with two levels decision-makers: Optimal portfolio selection with pricing decisions on transaction costs
    (Elsevier, 2019-12-26) Leal Palazón, Marina; Ponce López, Diego; Puerto Albandoz, Justo; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM331: Métodos y Modelos de la Estadística y la Investigación Operativa
    This paper presents novel bilevel leader-follower portfolio selection problems in which the financial intermediary becomes a decision-maker. This financial intermediary decides on the unit transaction costs for investing in some securities, maximizing its benefits, and the investor chooses his optimal portfolio, minimizing risk and ensuring a given expected return. Hence, transaction costs become decision variables in the portfolio problem, and two levels of decision-makers are incorporated: the financial intermediary and the investor. These situations give rise to general Nonlinear Programming formulations at both levels of the decision process. We present different bilevel versions of the problem: financial intermediary-leader, investor-leader, and social welfare; in addition, we analyze their properties. Moreover, we develop Mixed Integer Linear Programming formulations for some of the proposed problems and effective algorithms for some others. Finally, we report on some computational experiments performed on data taken from the Dow Jones Industrial Average, and analyze and compare the results obtained by the different models.
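    A stylised statement of the intermediary-leader structure described above, in illustrative notation: the risk measure ρ, the placement of the transaction-cost term and the follower's constraints are assumptions, not the paper's exact formulation.
    ```latex
    % Leader: the intermediary chooses unit transaction costs c to maximize its revenue.
    % Follower: the investor chooses portfolio weights x minimizing risk plus costs,
    % subject to a minimum expected return R_0 and a budget constraint.
    \max_{c \ge 0} \; c^{\top} x^{*}(c)
    \quad \text{s.t.} \quad
    x^{*}(c) \in \arg\min_{x \ge 0}
      \Big\{ \rho(x) + c^{\top} x \;:\; \mu^{\top} x \ge R_0,\ \mathbf{1}^{\top} x = 1 \Big\}
    ```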
  • Open Access Article
    Continuous location under the effect of ‘refraction’
    (Springer, 2016-03-08) Blanco, Víctor; Puerto Albandoz, Justo; Ponce López, Diego; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM331: Métodos y Modelos de la Estadística y la Investigación Operativa
    In this paper we address the problem of locating a new facility on a d-dimensional space when the distance measure (ℓp- or polyhedral-norms) is different at each one of the sides of a given hyperplane H. We relate this problem with the physical phenomenon of refraction, and extend it to any finite dimensional space and different distances at each one of the sides of any hyperplane. An application to this problem is the location of a facility within or outside an urban area where different distance measures must be used. We provide a new second order cone programming formulation, based on the ℓp-norm representation given in Blanco et al. (Comput Optim Appl 58(3):563–595, 2014), that allows one to solve the problem in any finite dimensional space with second order cone or semidefinite programming tools. We also extend the problem to the case where the hyperplane is considered as a rapid transit medium (a different third norm is also considered over H) that allows the demand to travel, whenever it is convenient, through H to reach the new facility. Extensive computational experiments run in Gurobi are reported in order to show the effectiveness of the approach. Some extensions of these models are also presented.
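    A stylised statement of the objective described above (illustrative notation only; the exact model and its second order cone reformulation are in the paper): demand points on the same side of the hyperplane H as the facility are reached with that side's norm, while points on the other side are reached through a crossing point on H, with each leg measured in its own norm.
    ```latex
    % Weber-type objective under 'refraction': two norms separated by the hyperplane H.
    \min_{x \in \mathbb{R}^d} \; \sum_{a \in \mathcal{A}} \omega_a \, d_H(x, a),
    \qquad
    d_H(x, a) =
    \begin{cases}
      \|x - a\|_{(i)}, & x \text{ and } a \text{ on the same side } i \text{ of } H,\\[2pt]
      \displaystyle \min_{y \in H} \|a - y\|_{(i)} + \|y - x\|_{(j)}, & a \text{ on side } i,\ x \text{ on side } j \neq i.
    \end{cases}
    ```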
  • Open Access Article
    A comparative study of formulations and solution methods for the discrete ordered p-median problem
    (Elsevier, 2016-06-06) Labbé, Martine; Ponce López, Diego; Puerto Albandoz, Justo; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM331: Métodos y Modelos de la Estadística y la Investigación Operativa
    This paper presents several new formulations for the Discrete Ordered Median Problem (DOMP) based on its similarity with some scheduling problems. Some of the new formulations present a considerably smaller number of constraints to define the problem with respect to some previously known formulations. Furthermore, the lower bounds provided by their linear relaxations improve the ones obtained with previous formulations in the literature even when strengthening is not applied. We also present a polyhedral study of the assignment polytope of our tightest formulation showing its proximity to the convex hull of the integer solutions of the problem. Several resolution approaches, among which we mention a branch and cut algorithm, are compared. Extensive computational results on two families of instances, namely randomly generated and from Beasley's OR-library, show the power of our methods for solving DOMP.
  • Open Access Article
    A fresh view on the Discrete Ordered Median Problem based on partial monotonicity
    (Elsevier, 2020-04-14) Marín, Alfredo; Ponce López, Diego; Puerto Albandoz, Justo; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM331: Métodos y Modelos de la Estadística y la Investigación Operativa
    This paper presents new results for the Discrete Ordered Median Problem (DOMP). It exploits properties of k-sum optimization to derive specific formulations for the monotone DOMP (MDOMP), that arises when the λ weights are non-decreasing monotone, and new formulations for the general non-monotone DOMP. The main idea in our approach is to express ordered weighted averages as telescopic sums whose terms are k-sums, with positive and negative coefficients. Formulations of k-sums with positive coefficients derive from the linear programming representations obtained by Ogryczak and Tamir (2003) and Blanco, Ali, and Puerto (2014). Valid formulations for k-sums with negative coefficients are more elaborate, and we present four different approaches, all of them based on mixed integer programming formulations. An extensive computational experience based on a collection of well-known instances shows the usefulness of the new formulations to solve difficult problems such as trimmed and anti-trimmed mean.
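    The telescopic identity underlying this approach can be written out explicitly. In the sketch below x_(1) ≥ … ≥ x_(n) denotes the values sorted in non-increasing order, S_k(x) is the sum of the k largest values (a k-sum) and λ_{n+1} := 0 is a notational convention; when λ is not monotone, some coefficients λ_k − λ_{k+1} are negative, which is exactly the case requiring the more elaborate formulations.
    ```latex
    \sum_{i=1}^{n} \lambda_i \, x_{(i)}
      \;=\; \sum_{k=1}^{n} \big(\lambda_k - \lambda_{k+1}\big)\, S_k(x),
    \qquad
    S_k(x) = \sum_{i=1}^{k} x_{(i)}, \qquad \lambda_{n+1} := 0 .
    ```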
  • Open Access Article
    Identification problem in plug-flow chemical reactors using the adjoint method
    (Elsevier, 2016-11-25) Bermúdez, A.; Esteben, N.; Ferrín, J.L.; Rodríguez Calo, J.F.; Sillero Denamiel, María Remedios; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM329 Optimización
    The aim of this work is to solve identification problems in plug-flow chemical reactors. For this purpose an adjoint-based algorithm for parameter identification problems in systems of partial differential equations is presented. The adjoint method allows us to calculate the gradient of the objective function and the constraint functions with respect to the unknown parameters, significantly reducing the computation time. The identification task leads to a minimization problem, in which an objective function is defined to quantify the mismatch between the observed data and the numerical solution of the parameterized chemical model. To solve the initial and boundary-value problem we use finite-difference schemes; more precisely, we propose a second-order BDF method initialized with a first-order one. The proposed algorithm was implemented in a computer program and some numerical results are shown. The efficiency of the adjoint method, compared with the classical formula of incremental quotients, is also presented.
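    A minimal sketch of the time-stepping scheme mentioned above (a second-order BDF method started with one first-order backward-Euler step), applied to a simple linear test equation rather than to the plug-flow reactor model; the test problem and step sizes are assumptions.
    ```python
    import numpy as np

    def bdf2(lmbda: float, y0: float, t_end: float, n_steps: int) -> np.ndarray:
        """Solve y' = lmbda * y with BDF2, using one backward-Euler (BDF1) step
        to start the two-step method (linear test problem, so the implicit
        equations can be solved in closed form)."""
        dt = t_end / n_steps
        y = np.empty(n_steps + 1)
        y[0] = y0
        y[1] = y[0] / (1.0 - dt * lmbda)                 # BDF1 start-up step
        for n in range(1, n_steps):
            # BDF2: (3 y_{n+1} - 4 y_n + y_{n-1}) / (2 dt) = f(t_{n+1}, y_{n+1})
            y[n + 1] = (4.0 * y[n] - y[n - 1]) / (3.0 - 2.0 * dt * lmbda)
        return y

    y = bdf2(lmbda=-2.0, y0=1.0, t_end=1.0, n_steps=200)
    print("numerical:", y[-1], " exact:", np.exp(-2.0))
    ```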
  • Open Access Article
    On sparse ensemble methods: An application to short-term predictions of the evolution of COVID-19
    (Elsevier, 2021-12-01) Benítez Peña, Sandra; Carrizosa Priego, Emilio José; Guerrero, Vanesa; Jiménez Gamero, María Dolores; Martín Barragán, Belén; Molero Río, Cristina; Ramírez Cobo, Josefa; Romero Morales, María Dolores; Sillero Denamiel, María Remedios; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM329 Optimización
    Since the seminal paper by Bates and Granger in 1969, a vast number of ensemble methods that combine different base regressors to generate a unique one have been proposed in the literature. The resulting ensemble regressor may have better accuracy than its components, but at the same time it may overfit, it may be distorted by base regressors with low accuracy, and it may be too complex to understand and explain. This paper proposes and studies a novel Mathematical Optimization model to build a sparse ensemble, which trades off the accuracy of the ensemble and the number of base regressors used. The latter is controlled by means of a regularization term that penalizes regressors with a poor individual performance. Our approach is flexible enough to incorporate desirable properties one may have on the ensemble, such as controlling the performance of the ensemble in critical groups of records, or the costs associated with the base regressors involved in the ensemble. We illustrate our approach with real data sets arising in the COVID-19 context.
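    A small illustrative sketch of the trade-off described above (not the paper's Mathematical Optimization model): combination weights over base regressors are fitted by constrained least squares with a penalty proportional to each regressor's individual validation error, so poorly performing regressors are pushed towards zero weight. The penalty form, the simplex constraint and the solver are assumptions.
    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)

    # Simulated validation data: predictions of 5 base regressors (columns) and the target
    n, m = 200, 5
    y = rng.normal(size=n)
    P = np.column_stack([y + rng.normal(0.0, s, n) for s in (0.2, 0.3, 0.4, 1.5, 2.0)])

    indiv_err = np.mean((P - y[:, None]) ** 2, axis=0)   # individual performance of each regressor
    gamma = 1.0                                          # accuracy / sparsity trade-off

    def objective(w):
        return np.mean((y - P @ w) ** 2) + gamma * np.dot(indiv_err, w)

    res = minimize(objective, x0=np.full(m, 1.0 / m), method="SLSQP",
                   bounds=[(0.0, 1.0)] * m,
                   constraints=({"type": "eq", "fun": lambda w: w.sum() - 1.0},))
    print("ensemble weights:", res.x.round(3))           # weak regressors get weights near zero
    ```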
  • Open Access Article
    Bayesian Influence Diagnostics in Radiocarbon Dating
    (Taylor & Francis, 2012-08-27) Fernández Ponce, José María; Palacios Rodríguez, Fátima; Rodríguez Griñolo, María del Rosario; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328. Métodos cuantitativos en evaluación
    Linear models constitute the primary statistical technique for any experimental science. A major topic in this area is the detection of influential subsets of data, that is, of observations that are influential in terms of their effect on the estimation of parameters in linear regression or of the total population parameters. Numerous studies on radiocarbon dating propose a consensus value and remove possible outliers after the corresponding testing. An influence analysis for the consensus value is developed in this article from a Bayesian perspective.
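    A toy case-deletion sketch of Bayesian influence (illustrative only, not the article's model): in a conjugate normal-normal setting with known data variance, the Kullback-Leibler divergence between each case-deleted posterior of the mean and the full-data posterior flags influential observations. The prior, the known variance and the divergence choice are assumptions.
    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    y = rng.normal(10.0, 1.0, size=30)
    y[0] = 18.0                              # an artificially influential observation

    sigma2, mu0, tau2 = 1.0, 0.0, 100.0      # known data variance and a vague normal prior

    def posterior(data):
        """Posterior mean and variance of mu for N(mu, sigma2) data and N(mu0, tau2) prior."""
        prec = 1.0 / tau2 + len(data) / sigma2
        return (mu0 / tau2 + data.sum() / sigma2) / prec, 1.0 / prec

    def kl_normal(m1, v1, m2, v2):
        """Kullback-Leibler divergence KL( N(m1, v1) || N(m2, v2) )."""
        return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

    m_full, v_full = posterior(y)
    influence = [kl_normal(*posterior(np.delete(y, i)), m_full, v_full) for i in range(len(y))]
    print("most influential case:", int(np.argmax(influence)))   # flags index 0
    ```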
  • Open Access Article
    Generalized Pareto processes for simulating space-time extreme events: an application to precipitation reanalyses
    (Springer, 2020-10-03) Palacios Rodríguez, Fátima; Toulemonde, G.; Carreau, J.; Opitz, T.; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328. Métodos cuantitativos en evaluación
    To better manage the risks of destructive natural disasters, impact models can be fed with simulations of extreme scenarios to study the sensitivity to temporal and spatial variability. We propose a semi-parametric stochastic framework that enables simulations of realistic spatio-temporal extreme fields using a moderate number of observed extreme space-time episodes to generate an unlimited number of extreme scenarios of any magnitude. Our framework draws sound theoretical justification from extreme value theory, building on generalized Pareto limit processes arising as limits for event magnitudes exceeding a high threshold. Specifically, we exploit asymptotic stability properties by decomposing extreme event episodes into a scalar magnitude variable (that is resampled), and an empirical profile process representing space-time variability. For illustration on hourly gridded precipitation data in Mediterranean France, we calculate various risk measures using extreme event simulations for yet unobserved magnitudes, and we highlight contrasted behavior for different definitions of the magnitude variable.
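    A schematic of the lifting-and-resampling idea described above (not the authors' implementation): observed extreme space-time episodes are decomposed into a scalar magnitude and a normalised profile, new magnitudes are drawn from a generalized Pareto distribution fitted to threshold exceedances, and resampled profiles are rescaled by the simulated magnitudes. The magnitude functional (the episode maximum), the threshold and the fake data are assumptions.
    ```python
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(6)

    # Fake observed extreme episodes: 50 episodes on a 10 x 10 spatial grid x 6 time steps
    episodes = rng.gamma(2.0, 1.0, size=(50, 10, 10, 6)) ** 2

    # Decompose each episode into a scalar magnitude and a space-time profile
    magnitudes = episodes.max(axis=(1, 2, 3))
    profiles = episodes / magnitudes[:, None, None, None]

    # Fit a generalized Pareto distribution to magnitude exceedances over a high threshold
    u = np.quantile(magnitudes, 0.5)
    xi, _, sigma = genpareto.fit(magnitudes[magnitudes > u] - u, floc=0.0)

    def simulate_episode():
        """Recombine a resampled profile with a freshly simulated magnitude."""
        profile = profiles[rng.integers(len(profiles))]
        magnitude = u + genpareto.rvs(xi, loc=0.0, scale=sigma, random_state=rng)
        return magnitude * profile

    print("simulated episode maximum:", simulate_episode().max())
    ```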
  • Open Access Article
    On multivariate extensions of the conditional Value-at-Risk measure
    (Elsevier, 2015-03-12) Di Bernardino, Elena; Fernández Ponce, E.; Palacios Rodríguez, Fátima; Rodríguez Griñolo, María del Rosario; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328. Métodos cuantitativos en evaluación
    CoVaR is a systemic risk measure proposed by Adrian and Brunnermeier (2011) that is able to measure a financial institution’s contribution to systemic risk and its contribution to the risk of other financial institutions. CoVaR stands for conditional Value-at-Risk, i.e. it indicates the Value at Risk for a financial institution that is conditional on a certain scenario. In this paper, two alternative extensions of the classic univariate Conditional Value-at-Risk are introduced in a multivariate setting. The two proposed multivariate CoVaRs are constructed from level sets of multivariate distribution functions (resp. of multivariate survival distribution functions). These vector-valued measures have the same dimension as the underlying risk portfolio. Several characterizations of these new risk measures are provided in terms of the copula structure and stochastic orderings of the marginal distributions. Interestingly, these results are consistent with existing properties of univariate risk measures. Furthermore, comparisons between existing risk measures and the proposed multivariate CoVaR are developed. Illustrations are given in the class of Archimedean copulas. The estimation procedure for the proposed multivariate CoVaRs is illustrated on simulation studies and real insurance data.
  • Open Access Article
    Estimation of extreme quantiles conditioning on multivariate critical layers
    (Wiley, 2016-02-08) Di Bernardino, Elena; Palacios Rodríguez, Fátima; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328. Métodos cuantitativos en evaluación
    Let T_i := [X_i | X ∈ ∂L(α)], for i = 1,…,d, where X = (X_1,…,X_d) is a risk vector and ∂L(α) is the associated multivariate critical layer at level α ∈ (0,1). The aim of this work is to propose a non-parametric extreme estimation procedure for the (1 − p_n)-quantile of T_i for a fixed α and when p_n → 0, as the sample size n → +∞. An extrapolation method is developed under the Archimedean copula assumption for the dependence structure of X and the von Mises condition for the marginal X_i. The main result is the central limit theorem for our estimator for p = p_n → 0, when n tends towards infinity. A set of simulations illustrates the finite-sample performance of the proposed estimator. We finally illustrate how the proposed estimation procedure can help in the evaluation of extreme multivariate hydrological risks. Copyright © 2016 John Wiley & Sons, Ltd.
  • Open Access Article
    Estimation of extreme Component-wise Excess design realization: a hydrological application
    (Springer, 2017-02-09) Di Bernardino, Elena; Palacios Rodríguez, Fátima; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa; Universidad de Sevilla. FQM328. Métodos cuantitativos en evaluación
    The classic univariate risk measure in environmental sciences is the Return Period (RP). The RP is traditionally defined as “the average time elapsing between two successive realizations of a prescribed event”. The notion of design quantile related with RP is also of great importance. The design quantile represents the “value of the variable(s) characterizing the event associated with a given RP”. Since an individual risk may strongly be affected by the degree of dependence amongst all risks, the need for the provision of multivariate design quantiles has gained ground. In contrast to the univariate case, the design quantile definition in the multivariate setting presents certain difficulties. In particular, Salvadori, G., De Michele, C. and Durante F. define in the paper called “On the return period and design in a multivariate framework” (Hydrol Earth Syst Sci 15:3293–3305, 2011) the design realization as the vector that maximizes a weight function given that the risk vector belongs to a given critical layer of its joint multivariate distribution function. In this paper, we provide the explicit expression of the aforementioned multivariate risk measure in the Archimedean copula setting. Furthermore, this measure is estimated by using Extreme Value Theory techniques and the asymptotic normality of the proposed estimator is studied. The performance of our estimator is evaluated on simulated data. We conclude with an application on a real hydrological data-set.
  • Open Access Article
    Smooth copula-based generalized extreme value model and spatial interpolation for extreme rainfall in Central Eastern Canada
    (Wiley, 2023-02-11) Palacios Rodríguez, Fátima; Di Bernardino, Elena; Mailhot, Mélina; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa
    This paper proposes a smooth copula-based Generalized Extreme Value (GEV) model to map and predict extreme rainfall in Central Eastern Canada. The considered data contains a large portion of missing values, and one observes several nonconcomitant record periods at different stations. The proposed two-step approach combines GEV parameters' smooth functions in space through the use of spatial covariates and a flexible hierarchical copula-based model to take into account dependence between the recording stations. The hierarchical copula structure is detected via a clustering algorithm implemented with an adapted version of the copula-based dissimilarity measure recently introduced in the literature. Finally, we compare the classical GEV parameter interpolation approaches with the proposed smooth copula-based GEV modeling approach.
  • Open Access Article
    On the exact reproduction number in SIS epidemic models with vertical transmission
    (Springer, 2023-08-22) Gómez Corral, A.; Palacios Rodríguez, Fátima; Rodríguez Bernal, M.T.; Universidad de Sevilla. Departamento de Estadística e Investigación Operativa
    This paper proposes a bi-variate competition process to describe the spread of epidemics of SIS type through both horizontal and vertical transmission. The interest is in the exact reproduction number, which is seen to be the stochastic version of the well-known basic reproduction number. We characterize the probability distribution function of the exact reproduction number by decomposing it into two random contributions, allowing us to distinguish between infectious person-to-person contacts and infections of newborns with infective parents. Numerical examples are presented to illustrate our analytical results.
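    A rough simulation sketch of the quantity discussed above, with illustrative assumptions throughout: one tagged infective is followed until recovery in a finite pool of susceptibles, and the infections it causes are tallied separately as horizontal (person-to-person contacts) and vertical (infected newborns). The rates, the susceptible pool and the bookkeeping rules are assumptions, not the paper's bi-variate competition process.
    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def exact_reproduction_sample(beta=0.02, b=0.1, p=0.5, gamma=1.0, s0=50):
        """Count infections caused by one tagged infective before its recovery,
        split into horizontal and vertical contributions (all rates assumed)."""
        s, horizontal, vertical = s0, 0, 0
        while True:
            rate_contact = beta * s       # person-to-person infection by the tagged case
            rate_birth = b * p            # birth of an infected newborn to the tagged case
            total = rate_contact + rate_birth + gamma
            u = rng.random() * total      # which competing event happens first
            if u < rate_contact:
                horizontal += 1
                s -= 1
            elif u < rate_contact + rate_birth:
                vertical += 1
            else:                         # recovery ends the infectious period
                return horizontal, vertical

    samples = np.array([exact_reproduction_sample() for _ in range(20_000)])
    print("mean exact reproduction number:", samples.sum(axis=1).mean())
    print("mean horizontal / vertical contributions:", samples.mean(axis=0))
    ```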