Physical Sciences and Mathematics Commons

2007

Articles 2641 - 2670 of 6758

Full-Text Articles in Physical Sciences and Mathematics

Type I Error Rates Of The Kenward-Roger Adjusted Degree Of Freedom F-Test For A Split-Plot Design With Missing Values, Miguel A. Padilla, James Algina May 2007

Journal of Modern Applied Statistical Methods

The Type I error rate of the Kenward-Roger (KR) test, implemented by PROC MIXED in SAS, was assessed through a simulation study for a one between- and one within-subjects factor split-plot design with ignorable missing values and covariance heterogeneity. The KR test controlled the Type I error well under all of the simulation factors, with all estimated Type I error rates between .040 and .075. The best control was for testing the between-subjects main effect (error rates between .041 and .057) and the worst control was for the between-by-within interaction (.040 to .075). The simulated factors had very small effects …


Application Of A New Procedure For Power Analysis And Comparison Of The Adjusted Univariate And Multivariate Tests In Repeated Measures Designs, Sean W. Mulvenon, M. Austin Betz, Kening Wang, Bruno D. Zumbo May 2007

Journal of Modern Applied Statistical Methods

A relationship between the multivariate and univariate noncentrality parameters in repeated measures designs was developed to assess the relative power of the univariate and multivariate approaches. An application is provided examining the use of repeated measures designs to evaluate student achievement in a K-12 school system.


Comparison Of The T Vs. Wilcoxon Signed-Rank Test For Likert Scale Data And Small Samples, Gary E. Meek, Ceyhun Ozgur, Kenneth Dunning May 2007

Journal of Modern Applied Statistical Methods

The one-sample t-test is compared with the Wilcoxon signed-rank test for identical data sets representing various Likert scales. An empirical approach is used with simulated data. Comparisons are based on observed error rates for 27,850 data sets. Recommendations are provided.
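
As a rough illustration of the kind of comparison described above (not the authors' simulation design), the following Python sketch simulates symmetric 5-point Likert responses under the null hypothesis and tallies empirical Type I error rates for both tests; the response probabilities, sample size, and replication count are assumptions.

    # Minimal sketch: empirical Type I error of the one-sample t-test vs. the
    # Wilcoxon signed-rank test on simulated 5-point Likert data. The response
    # probabilities, sample size, and replication count are illustrative only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    levels = np.array([1, 2, 3, 4, 5])
    probs = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # symmetric, so mean = median = 3
    null_value = 3.0
    n, reps, alpha = 15, 5000, 0.05

    reject_t = reject_w = runs_w = 0
    for _ in range(reps):
        x = rng.choice(levels, size=n, p=probs)
        _, p_t = stats.ttest_1samp(x, null_value)
        reject_t += p_t < alpha
        d = x - null_value
        if np.any(d != 0):            # Wilcoxon drops zero differences
            runs_w += 1
            _, p_w = stats.wilcoxon(d)
            reject_w += p_w < alpha

    print(f"t-test Type I error:   {reject_t / reps:.3f}")
    print(f"Wilcoxon Type I error: {reject_w / runs_w:.3f}")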


Another Look At The Confidence Intervals For The Noncentral T Distribution, Bruno Lecoutre May 2007

Journal of Modern Applied Statistical Methods

An alternative approach to the computation of confidence intervals for the noncentrality parameter of the Noncentral t distribution is proposed. It involves the percent points of a statistical distribution. This conceptual improvement renders the technical process for deriving the limits more comprehensible. Accurate approximations can be derived and easily used.
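
For context, a minimal Python sketch of the conventional construction (numerically inverting the noncentral t cdf to bound the noncentrality parameter) is given below; it is the standard approach the article takes as its starting point, not the author's alternative derivation, and the example inputs are arbitrary.

    # Conventional interval for the noncentrality parameter of a noncentral t
    # distribution, obtained by inverting the cdf numerically.
    import numpy as np
    from scipy import stats, optimize

    def nct_ncp_ci(t_obs, df, level=0.95):
        """Confidence interval for the noncentrality parameter given an observed t."""
        alpha = 1.0 - level
        def lower_eq(delta):   # P(T <= t_obs | delta) = 1 - alpha/2
            return stats.nct.cdf(t_obs, df, delta) - (1 - alpha / 2)
        def upper_eq(delta):   # P(T <= t_obs | delta) = alpha/2
            return stats.nct.cdf(t_obs, df, delta) - alpha / 2
        span = abs(t_obs) + 10 * max(1.0, abs(t_obs))   # generous search bracket
        lo = optimize.brentq(lower_eq, -span, span)
        hi = optimize.brentq(upper_eq, -span, span)
        return lo, hi

    print(nct_ncp_ci(t_obs=2.5, df=20))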


The Effects Of Heteroscedasticity On Tests Of Equivalence, Jamie A. Gruman, Robert A. Cribbie, Chantal A. Arpin-Cribbie May 2007

Journal of Modern Applied Statistical Methods

Tests of equivalence, which are designed to assess the similarity of group means, are becoming more popular, yet very little is known about the statistical properties of these tests. Monte Carlo methods are used to compare the test of equivalence proposed by Schuirmann with modified tests of equivalence that incorporate a heteroscedastic error term. It was found that the latter were more accurate than the Schuirmann test in detecting equivalence when sample sizes and variances were unequal.
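
A minimal sketch of one way to build such a heteroscedastic equivalence test, Schuirmann-style two one-sided tests with a Welch error term and Welch-Satterthwaite degrees of freedom, is shown below; the equivalence bounds and data are illustrative assumptions, and this is not claimed to be the exact modified test studied in the article.

    # Schuirmann's TOST with a Welch-type (heteroscedastic) error term.
    import numpy as np
    from scipy import stats

    def welch_tost(x, y, low, high):
        nx, ny = len(x), len(y)
        diff = np.mean(x) - np.mean(y)
        vx, vy = np.var(x, ddof=1) / nx, np.var(y, ddof=1) / ny
        se = np.sqrt(vx + vy)
        # Welch-Satterthwaite degrees of freedom
        df = (vx + vy) ** 2 / (vx ** 2 / (nx - 1) + vy ** 2 / (ny - 1))
        p_lower = stats.t.sf((diff - low) / se, df)    # H0: diff <= low
        p_upper = stats.t.cdf((diff - high) / se, df)  # H0: diff >= high
        return max(p_lower, p_upper), df               # equivalence concluded if small

    rng = np.random.default_rng(2)
    x = rng.normal(10.0, 1.0, 20)
    y = rng.normal(10.1, 3.0, 40)
    p, df = welch_tost(x, y, low=-1.0, high=1.0)
    print(f"TOST p-value = {p:.3f} (Welch df = {df:.1f})")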


Approximate Bayesian Confidence Intervals For The Mean Of An Exponential Distribution Versus Fisher Matrix Bounds Models, Vincent A. R. Camara May 2007

Journal of Modern Applied Statistical Methods

The aim of this article is to obtain and compare confidence intervals for the mean of an exponential distribution. Considering the squared-error and the Higgins-Tsokos loss functions, respectively, approximate Bayesian confidence intervals for the parameters of an exponential population are derived. Using exponential data, the obtained approximate Bayesian confidence intervals are then compared to the ones obtained with the Fisher matrix bounds method. It is shown that the proposed approximate Bayesian approach relies only on the observations. The Fisher matrix bounds method, which uses the z-table, does not always yield the best confidence intervals, and the proposed approach often performs better.
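
For orientation only, the sketch below computes an interval for an exponential mean from a conjugate gamma posterior under a noninformative prior; it illustrates the "relies only on the observations" point, but it is not the article's Higgins-Tsokos-loss construction, and the prior choice is an assumption.

    # Bayesian interval for the mean of an exponential distribution via the
    # conjugate gamma posterior for the rate parameter.
    import numpy as np
    from scipy import stats

    def exp_mean_credible_interval(x, level=0.95):
        x = np.asarray(x, dtype=float)
        n, total = x.size, x.sum()
        # Noninformative prior p(lambda) ~ 1/lambda  =>  lambda | x ~ Gamma(n, rate = total)
        alpha = 1.0 - level
        lam_lo = stats.gamma.ppf(alpha / 2, a=n, scale=1.0 / total)
        lam_hi = stats.gamma.ppf(1 - alpha / 2, a=n, scale=1.0 / total)
        return 1.0 / lam_hi, 1.0 / lam_lo   # interval for the mean 1/lambda

    rng = np.random.default_rng(3)
    data = rng.exponential(scale=5.0, size=25)
    print(exp_mean_credible_interval(data))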


A Comparison Of Eight Shrinkage Formulas Under Extreme Conditions, David A. Walker May 2007

Journal of Modern Applied Statistical Methods

The performance of various shrinkage formulas for estimating the population squared multiple correlation coefficient (ρ²) was compared under extreme conditions often found in educational research, with small sample sizes of 10, 15, 20, 25, and 30 and regressor variates ranging from 2 to 4. A new formula for estimating ρ², Adj R²_DW, was examined in terms of its performance under various conditions of N, p, and ρ², along with its bias properties and standard error estimates. The two shrinkage formulas that performed most consistently were the Claudy (Adj R²_C) and Walker (Adj R²_DW) formulas.
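
A hedged harness in this spirit is sketched below, using the familiar Ezekiel/Wherry adjustment as a stand-in because the Claudy and Walker formulas are not reproduced here; the population model, ρ² value, and replication count are assumptions.

    # Bias of a shrinkage (adjusted R-squared) formula under small samples.
    import numpy as np

    def ezekiel_adj_r2(r2, n, p):
        return 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)

    def simulate_bias(n, p, rho2, reps=5000, seed=0):
        rng = np.random.default_rng(seed)
        # Population model: y = b * x1 + noise, scaled so the population R^2 is rho2.
        b = np.sqrt(rho2 / (1.0 - rho2))
        bias = []
        for _ in range(reps):
            X = rng.standard_normal((n, p))
            y = b * X[:, 0] + rng.standard_normal(n)
            design = np.column_stack([np.ones(n), X])
            beta, *_ = np.linalg.lstsq(design, y, rcond=None)
            resid = y - design @ beta
            r2 = 1.0 - resid.var() / y.var()
            bias.append(ezekiel_adj_r2(r2, n, p) - rho2)
        return np.mean(bias)

    for n in (10, 15, 20, 25, 30):
        print(n, round(simulate_bias(n=n, p=3, rho2=0.25), 4))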


Better Binomial Confidence Intervals, James F. Reed Iii May 2007

Journal of Modern Applied Statistical Methods

The construction of a confidence interval for a binomial parameter is a basic analysis in statistical inference. Most introductory statistics textbook authors present the binomial confidence interval based on the asymptotic normality of the sample proportion and an estimated standard error: the Wald method. For the one-sample binomial confidence interval, the Clopper-Pearson exact method has been regarded as definitive, as it eliminates both overshoot and zero-width intervals. The Clopper-Pearson exact method is the most conservative and is unquestionably a better alternative to the Wald method. Other viable alternatives include Wilson's score, the Agresti-Coull method, and the Borkowf …
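
The intervals named above (except the Borkowf variant) can be computed directly; the short sketch below uses the standard textbook formulas, with the count, sample size, and confidence level as illustrative inputs.

    # Wald, Wilson score, Agresti-Coull, and Clopper-Pearson binomial intervals.
    import numpy as np
    from scipy import stats

    def binomial_cis(x, n, level=0.95):
        z = stats.norm.ppf(1 - (1 - level) / 2)
        p = x / n
        wald = (p - z * np.sqrt(p * (1 - p) / n), p + z * np.sqrt(p * (1 - p) / n))

        # Wilson score interval
        center = (p + z**2 / (2 * n)) / (1 + z**2 / n)
        half = z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
        wilson = (center - half, center + half)

        # Agresti-Coull: add z^2/2 successes and failures, then use the Wald form
        n_ac = n + z**2
        p_ac = (x + z**2 / 2) / n_ac
        ac = (p_ac - z * np.sqrt(p_ac * (1 - p_ac) / n_ac),
              p_ac + z * np.sqrt(p_ac * (1 - p_ac) / n_ac))

        # Clopper-Pearson (exact), via beta quantiles
        lo = 0.0 if x == 0 else stats.beta.ppf((1 - level) / 2, x, n - x + 1)
        hi = 1.0 if x == n else stats.beta.ppf(1 - (1 - level) / 2, x + 1, n - x)
        return {"wald": wald, "wilson": wilson,
                "agresti_coull": ac, "clopper_pearson": (lo, hi)}

    for name, ci in binomial_cis(x=3, n=20).items():
        print(f"{name:16s} ({ci[0]:.3f}, {ci[1]:.3f})")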


Beta-Weibull Distribution: Some Properties And Applications To Censored Data, Carl Lee, Felix Famoye, Olugbenga Olumolade May 2007

Journal of Modern Applied Statistical Methods

Some properties of a four-parameter beta-Weibull distribution are discussed. The beta-Weibull distribution is shown to have bathtub, unimodal, increasing, and decreasing hazard functions. The distribution is applied to censored data sets on bus-motor failures and to a censored data set from a head-and-neck cancer clinical trial. A simulation is conducted to compare the beta-Weibull distribution with the exponentiated Weibull distribution.
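
A minimal sketch of the beta-generated construction commonly used to define the beta-Weibull family (apply the Weibull cdf, treat it as Beta(a, b)) is given below; the parameterization and example values are assumptions, not taken from the article.

    # Beta-Weibull density and random variates via the beta-generated construction:
    # X = G^{-1}(U) with U ~ Beta(a, b) and G the Weibull(c, lam) cdf.
    import numpy as np
    from scipy import stats
    from scipy.special import beta as beta_fn

    def weibull_cdf(x, c, lam):
        return 1.0 - np.exp(-(x / lam) ** c)

    def weibull_pdf(x, c, lam):
        return (c / lam) * (x / lam) ** (c - 1) * np.exp(-(x / lam) ** c)

    def beta_weibull_pdf(x, a, b, c, lam):
        G = weibull_cdf(x, c, lam)
        return weibull_pdf(x, c, lam) * G ** (a - 1) * (1.0 - G) ** (b - 1) / beta_fn(a, b)

    def beta_weibull_rvs(a, b, c, lam, size, rng):
        u = stats.beta.rvs(a, b, size=size, random_state=rng)   # U ~ Beta(a, b)
        return lam * (-np.log1p(-u)) ** (1.0 / c)               # inverse Weibull cdf

    rng = np.random.default_rng(4)
    sample = beta_weibull_rvs(a=2.0, b=0.5, c=1.5, lam=3.0, size=5, rng=rng)
    print(sample)
    print(beta_weibull_pdf(sample, 2.0, 0.5, 1.5, 3.0))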


On The Product Of Maxwell And Rice Random Variables, M. Shakil, B. M. Golam Kibria May 2007

Journal of Modern Applied Statistical Methods

The distributions of the product of independent random variables arise in many applied problems. These have been extensively studied by many researchers. In this paper, the exact distributions of the product |XY| have been derived when X and Y are Maxwell and Rice random variables respectively, and are distributed independently of each other. The associated cdfs, pdfs, and kth moments have been given.
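
A quick Monte Carlo cross-check of such product moments is easy to set up and is sketched below; the Maxwell scale and Rice shape parameters are illustrative, and the simulation is only a sanity check, not the article's exact derivation.

    # Monte Carlo moments of |XY| with X ~ Maxwell and Y ~ Rice, independent.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 1_000_000
    x = stats.maxwell.rvs(scale=1.0, size=n, random_state=rng)
    y = stats.rice.rvs(b=2.0, scale=1.0, size=n, random_state=rng)   # b is the Rice shape
    prod = np.abs(x * y)

    for k in (1, 2, 3):
        print(f"E[|XY|^{k}] approx {np.mean(prod ** k):.4f}")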


Optimal Lp-Metric For Minimizing Powered Deviations In Regression, Stan Lipovetsky May 2007

Journal of Modern Applied Statistical Methods

Least squares and least absolute deviations are well-known minimization criteria in regression modeling. In this work, a criterion based on the generalized mean of powered deviations is suggested. If the parameter of the generalized mean equals one or two, the fitting corresponds to least absolute or least squared deviations, respectively. Varying the power parameter yields an optimum value of the objective with the minimum possible residual error. Estimation of the most favorable value of the generalized mean parameter shows that it hardly depends on the data. The optimal power is consistently found to be close to 1.7, …
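
A minimal sketch of fitting by powered absolute deviations and sweeping the power p is shown below; the data, the grid of p values, and the use of mean squared residual as a comparison measure are assumptions for illustration.

    # Regression by minimizing sum |y - Xb|^p, with a simple sweep over p.
    import numpy as np
    from scipy.optimize import minimize

    def fit_lp(X, y, p):
        X1 = np.column_stack([np.ones(len(y)), X])
        beta0, *_ = np.linalg.lstsq(X1, y, rcond=None)          # OLS starting values
        obj = lambda b: np.sum(np.abs(y - X1 @ b) ** p)
        res = minimize(obj, beta0, method="Nelder-Mead")
        return res.x

    rng = np.random.default_rng(6)
    X = rng.standard_normal((100, 2))
    y = 1.0 + X @ np.array([2.0, -1.0]) + rng.standard_normal(100)
    X1 = np.column_stack([np.ones(100), X])

    for p in (1.0, 1.5, 1.7, 2.0):
        beta = fit_lp(X, y, p)
        mse = np.mean((y - X1 @ beta) ** 2)
        print(f"p = {p:.1f}  mean squared residual = {mse:.4f}  coefficients = {np.round(beta, 3)}")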


A Spline-Based Lack-Of-Fit Test For Independent Variable Effect, Chin-Shang Li, Wanzhu Tu May 2007

Journal of Modern Applied Statistical Methods

In regression analysis of count data, independent variables are often modeled by their linear effects under the assumption of log-linearity. In reality, the validity of such an assumption is rarely tested, and its use is at times unjustifiable. A lack-of-fit test is proposed for the adequacy of a postulated functional form of an independent variable within the framework of semiparametric Poisson regression models based on penalized splines. It offers added flexibility in accommodating the potentially non-loglinear effect of the independent variable. A likelihood ratio test is constructed for the adequacy of the postulated parametric form, for example log-linearity, of the …
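
A rough stand-in for this comparison, an unpenalized B-spline basis in a Poisson GLM tested against the log-linear fit with a likelihood ratio statistic, is sketched below; the simulated data and basis size are assumptions, and the article's penalized-spline test differs in its details.

    # Lack-of-fit check: log-linear Poisson regression vs. a B-spline expansion.
    import numpy as np
    import statsmodels.api as sm
    from patsy import dmatrix
    from scipy.stats import chi2

    rng = np.random.default_rng(7)
    n = 400
    x = rng.uniform(0, 3, n)
    y = rng.poisson(np.exp(0.3 + 0.8 * np.sqrt(x)))   # truth is not log-linear in x

    X_lin = sm.add_constant(x)
    X_spl = dmatrix("bs(x, df=6)", {"x": x}, return_type="dataframe")

    fit_lin = sm.GLM(y, X_lin, family=sm.families.Poisson()).fit()
    fit_spl = sm.GLM(y, X_spl, family=sm.families.Poisson()).fit()

    lr = 2 * (fit_spl.llf - fit_lin.llf)
    df = fit_spl.df_model - fit_lin.df_model
    print(f"LR = {lr:.2f} on {df} df, p = {chi2.sf(lr, df):.4f}")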


Practical Unit-Root Analysis Using Information Criteria: Simulation Evidence, Kosei Fukuda May 2007

Journal of Modern Applied Statistical Methods

An information-criterion-based model selection method for detecting a unit root is proposed. The simulation results suggest that the performance of the proposed method is usually comparable to, and sometimes better than, that of the conventional unit-root tests. The advantages of the proposed method in practical applications are also discussed.


A Comparison Of One-High-Threshold And Two-High-Threshold Multinomial Models Of Source Monitoring, Mahesh Menon, Todd S. Woodward May 2007

Journal of Modern Applied Statistical Methods

A data simulation study comparing the one-high-threshold (1HT) and two-high-threshold (2HT) multinomial models suggested that 2HT models are more likely to misestimate the underlying parameter values, due to inflation of some parameters (b and d), and deflation of others (D).


Examining Cronbach Alpha, Theta, Omega Reliability Coefficients According To Sample Size, Ilker Ercan, Berna Yazici, Deniz Sigirli, Bulent Ediz, Ismet Kan May 2007

Journal of Modern Applied Statistical Methods

How different reliability coefficients behave as the sample size changes is examined. It is concluded that estimates obtained with the Cronbach alpha and theta coefficients are not related to sample size; even estimates obtained from small samples can represent the population parameter. The omega coefficient, however, requires large sample sizes.
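
A minimal sketch of the Cronbach alpha computation used in such sample-size experiments is given below (theta and omega, which require a principal-component or factor model, are not shown); the simulated item data are an assumption.

    # Cronbach's alpha from an items-by-respondents data matrix.
    import numpy as np

    def cronbach_alpha(items):
        """items: 2-D array, rows = respondents, columns = scale items."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(8)
    true_score = rng.normal(size=(200, 1))
    data = true_score + rng.normal(scale=1.0, size=(200, 6))   # six parallel-ish items
    print(f"alpha = {cronbach_alpha(data):.3f}")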


Estimation Of Risk For Developing Cardiac Problem In Patients Of Type 2 Diabetes As Obtained By The Technique Of Density Estimation, Ajit Mukherjee, Ajit Mathur, Rakesh Mittal May 2007

Journal of Modern Applied Statistical Methods

High levels of cholesterol and triglyceride are known to be strongly associated with the development of cardiac problems in patients with type 2 diabetes. In a hospital-based study, patients with positive ECG findings were compared with those without. The observations on cholesterol and triglyceride were used to estimate the risk of developing cardiac problems. The technique of density estimation with an Epanechnikov kernel was used to estimate bivariate probability density functions for the cholesterol and triglyceride observations of the two groups. Using the odds form of Bayes' rule, estimates of the posterior odds were computed.
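
The density-estimation step can be sketched as below: Epanechnikov-kernel density estimates for the two groups and Bayes' rule in odds form. The simulated data, bandwidth, and prior odds are assumptions, not values from the study.

    # Bivariate Epanechnikov-kernel density estimates and posterior odds.
    import numpy as np
    from sklearn.neighbors import KernelDensity

    rng = np.random.default_rng(9)
    cases = rng.multivariate_normal([230.0, 190.0], [[900, 300], [300, 900]], size=150)
    controls = rng.multivariate_normal([200.0, 150.0], [[900, 300], [300, 900]], size=300)

    kde_case = KernelDensity(kernel="epanechnikov", bandwidth=25.0).fit(cases)
    kde_ctrl = KernelDensity(kernel="epanechnikov", bandwidth=25.0).fit(controls)

    prior_odds = len(cases) / len(controls)
    new_patient = np.array([[240.0, 200.0]])       # (cholesterol, triglyceride)
    likelihood_ratio = np.exp(kde_case.score_samples(new_patient)
                              - kde_ctrl.score_samples(new_patient))[0]
    posterior_odds = prior_odds * likelihood_ratio
    print(f"posterior odds of being ECG-positive: {posterior_odds:.2f}")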


Multinomial Logistic Regression Model For The Inferential Risk Age Groups For Infection Caused By Vibrio Cholerae In Kolkata, India, Krishnan Rajendran, Thandavarayan Ramamurthy, Dipika Sur May 2007

Journal of Modern Applied Statistical Methods

Multinomial logistic regression (MLR) modeling is an effective approach for categorical outcomes, compared with discriminant function analysis and log-linear models, for profiling the individual categories of a dependent variable. To explore the yearly change in the inferential age groups of acute diarrhoeal patients infected with Vibrio cholerae during 1996-2000 by MLR, systematic sampling data were generated from an active surveillance study. Among 1330 V. cholerae-infected cases, the predominant age category was up to 5 years, accounting for 478 (30.5%) cases. The independent variables V. cholerae O1 (p < 0.001) and non-O1, non-O139 (p < 0.001) were significantly associated with the under-5-years age group. The inferential age group for V. cholerae O139 was > 40 years. Infection mediated by V. cholerae O1 showed a significantly decreasing year-wise trend in Exp(B) from …
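
A tiny sketch of the multinomial logistic modeling step is shown below with synthetic indicator predictors and an age-group outcome; the data-generating values are assumptions and do not reproduce the surveillance data.

    # Multinomial logistic regression of an age-group category on serogroup indicators.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(10)
    n = 600
    o1 = rng.integers(0, 2, n)          # hypothetical indicator: V. cholerae O1 infection
    o139 = rng.integers(0, 2, n)        # hypothetical indicator: V. cholerae O139 infection
    # Outcome categories: 0 = "<= 5 years", 1 = "6-40 years", 2 = "> 40 years"
    logits = np.column_stack([0.8 * o1, np.zeros(n), 0.8 * o139])
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    age_group = np.array([rng.choice(3, p=p) for p in probs])

    X = sm.add_constant(np.column_stack([o1, o139]))
    fit = sm.MNLogit(age_group, X).fit(disp=False)
    print(fit.summary())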


Jmasm27: An Algorithm For Implementing Gibbs Sampling For 2pno Irt Models (Fortran), Yanyan Sheng, Todd C. Headrick May 2007

Journal of Modern Applied Statistical Methods

A Fortran 77 subroutine is provided for implementing the Gibbs sampling procedure for a normal ogive IRT model for binary item response data, with a choice of uniform or normal prior distributions for the item parameters. The subroutine requires the user to have access to the IMSL library. The source code is available at http://www.siu.edu/~epse1/sheng/Fortran/, along with a stand-alone executable file.
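
Since the Fortran/IMSL source may not be convenient for every reader, a Python re-sketch of the Albert-style data-augmentation Gibbs sampler for the two-parameter normal ogive model is given below; it uses flat priors on the item parameters (rather than the subroutine's uniform or normal options) and imposes no positivity constraint on the discriminations, so it is an approximation of the general approach, not a port of the published routine.

    # Gibbs sampler for the 2PNO model P(Y_ij = 1) = Phi(a_j * theta_i - b_j),
    # with theta_i ~ N(0, 1) and flat priors on (a_j, b_j).
    import numpy as np
    from scipy.stats import truncnorm, norm

    def gibbs_2pno(Y, n_iter=500, burn=200, seed=0):
        rng = np.random.default_rng(seed)
        n, J = Y.shape
        theta, a, b = np.zeros(n), np.ones(J), np.zeros(J)
        keep_a, keep_b = [], []
        for it in range(n_iter):
            # 1. Latent responses Z_ij ~ N(a_j*theta_i - b_j, 1), truncated by Y_ij.
            eta = np.outer(theta, a) - b
            lo = np.where(Y == 1, -eta, -np.inf)     # bounds for the standard-normal error
            hi = np.where(Y == 1, np.inf, -eta)
            Z = eta + truncnorm.rvs(lo, hi, size=eta.shape, random_state=rng)
            # 2. Abilities theta_i | rest, with N(0, 1) prior.
            prec = 1.0 + np.sum(a ** 2)
            mean = (Z + b) @ a / prec
            theta = rng.normal(mean, np.sqrt(1.0 / prec))
            # 3. Item parameters (a_j, b_j) | rest: Bayesian regression of Z_.j on [theta, -1].
            X = np.column_stack([theta, -np.ones(n)])
            XtX_inv = np.linalg.inv(X.T @ X)
            chol = np.linalg.cholesky(XtX_inv)
            for j in range(J):
                beta_hat = XtX_inv @ (X.T @ Z[:, j])
                a[j], b[j] = beta_hat + chol @ rng.standard_normal(2)
            if it >= burn:
                keep_a.append(a.copy())
                keep_b.append(b.copy())
        return np.mean(keep_a, axis=0), np.mean(keep_b, axis=0)

    # Tiny demo with simulated data (not the article's data).
    rng = np.random.default_rng(1)
    n, J = 500, 10
    theta_true = rng.standard_normal(n)
    a_true = rng.uniform(0.5, 2.0, J)
    b_true = rng.normal(0, 1, J)
    Y = rng.binomial(1, norm.cdf(np.outer(theta_true, a_true) - b_true))
    a_hat, b_hat = gibbs_2pno(Y)
    print(np.round(np.column_stack([a_true, a_hat, b_true, b_hat]), 2))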


Mathematics In Volume I Of Scripta Universitatis, Shlomo S. Sawilowsky May 2007

Journal of Modern Applied Statistical Methods

Immanuel Velikovsky’s journal, Scripta Universitatis, edited by Albert Einstein and first published in 1923, played a significant role in the establishment of the library, and hence, Hebrew University in Jerusalem. The inaugural issue contained an article by the French mathematician Jacques Hadamard. Excerpts from Velikovsky’s diary pertaining to the rationale for the creation of the journal, and the interest in Jewish scholars such as Hadamard, are translated here.


Statistical Pronouncements V, Shlomo S. Sawilowsky May 2007

Theoretical and Behavioral Foundations of Education Faculty Publications

No abstract provided.


Existence Of Minimizers And Necessary Conditions In Set-Valued Optimization With Equilibrium Constraints, Truong Q. Bao, Boris S. Mordukhovich May 2007

Mathematics Research Reports

In this paper we study set-valued optimization problems with equilibrium constraints (SOPECs) described by parametric generalized equations of the form 0 ∈ G(x) + Q(x), where both G and Q are set-valued mappings between infinite-dimensional spaces. Such models particularly arise from certain optimization-related problems governed by set-valued variational inequalities and first-order optimality conditions in nondifferentiable programming. We establish general results on the existence of optimal solutions under appropriate assumptions of the Palais-Smale type and then derive necessary conditions for optimality in the models under consideration by using advanced tools of variational analysis and generalized differentiation.


Bound Optimization For Parallel Quadratic Sieving Using Large Prime Variations, Andrew G. West May 2007

Andrew G. West

The Quadratic Sieve (QS) factorization algorithm is a powerful means to perform prime decompositions that combines number theory, linear algebra, and brute processing power. Created by Carl Pomerance in 1985, it is the second fastest general purpose factorization method as of this writing, behind only the Number Field Sieve.

We describe an efficient QS implementation which is accessible to an undergraduate audience. The majority of papers on this topic rely on complex mathematical notation as their primary means of explanation. Instead, we attempt to combine math, discussion, and examples to promote understanding. Additionally, few authors ever present implementation level detail. …
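
To give a flavor of the congruence-of-squares idea at the heart of QS, the toy sketch below collects relations x^2 - n that factor over a small prime base and combines a subset whose exponents are all even. A real quadratic sieve finds smooth values by sieving and solves the parity system with GF(2) linear algebra, whereas this toy brute-forces subsets and only works for tiny n; the factor base and example number are illustrative.

    # Toy congruence-of-squares factoring (the core idea behind the quadratic sieve).
    import math
    from itertools import combinations, count

    def factor_over_base(m, base):
        """Exponent vector of m over the factor base, or None if m is not base-smooth."""
        exps = []
        for p in base:
            e = 0
            while m % p == 0:
                m //= p
                e += 1
            exps.append(e)
        return exps if m == 1 else None

    def toy_congruence_of_squares(n, base=(2, 3, 5, 7, 11, 13), max_relations=25):
        relations = []                               # pairs (x, exponent vector of x*x - n)
        for x in count(math.isqrt(n) + 1):
            exps = factor_over_base(x * x - n, base)
            if exps is None:
                continue
            relations.append((x, exps))
            # Look for a subset whose combined exponents are all even.
            for r in range(1, len(relations) + 1):
                for subset in combinations(relations, r):
                    total = [sum(col) for col in zip(*(e for _, e in subset))]
                    if any(t % 2 for t in total):
                        continue
                    lhs = math.prod(xv for xv, _ in subset) % n
                    rhs = math.prod(p ** (t // 2) for p, t in zip(base, total)) % n
                    g = math.gcd(abs(lhs - rhs), n)
                    if 1 < g < n:
                        return g
            if len(relations) >= max_relations:
                return None                          # give up; enlarge the base or sieve further

    print(toy_congruence_of_squares(1649))           # 1649 = 17 * 97; combines x = 41 and x = 43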


Subsidies For Energy Efficiency Improvements: Theory And Practice, Theodoros Zachariadis May 2007

Theodoros Zachariadis

No abstract provided.


Estimating The Effect Of Vigorous Physical Activity On Mortality In The Elderly Based On Realistic Individualized Treatment And Intention-To-Treat Rules, Oliver Bembom, Mark J. Van Der Laan May 2007

Oliver Bembom

The effect of vigorous physical activity on mortality in the elderly is difficult to estimate using conventional approaches to causal inference that define this effect by comparing the mortality risks corresponding to hypothetical scenarios in which all subjects in the target population engage in a given level of vigorous physical activity. A causal effect defined on the basis of such a static treatment intervention can only be identified from observed data if all subjects in the target population have a positive probability of selecting each of the candidate treatment options, an assumption that is highly unrealistic in this case since …


Analyzing Sequentially Randomized Trials Based On Causal Effect Models For Realistic Individualized Treatment Rules, Oliver Bembom, Mark J. Van Der Laan May 2007

Oliver Bembom

In this paper, we argue that causal effect models for realistic individualized treatment rules represent an attractive tool for analyzing sequentially randomized trials. Unlike a number of methods proposed previously, this approach does not rely on the assumption that intermediate outcomes are discrete or that models for the distributions of these intermediate outcomes given the observed past are correctly specified. In addition, it generalizes the methodology for performing pairwise comparisons between individualized treatment rules by allowing the user to posit a marginal structural model for all candidate treatment rules simultaneously. If only a small number of candidate treatment rules are …


Evaluating A Group Sequential Design In The Setting Of Nonproportional Hazards, Daniel L. Gillen, Scott S. Emerson May 2007

UW Biostatistics Working Paper Series

Group sequential methods have been widely described and implemented in a clinical trial setting where parametric and semiparametric models are deemed suitable. In these situations, the evaluation of the operating characteristics of a group sequential stopping rule remains relatively straightforward. However, in the presence of nonproportional hazards survival data, nonparametric methods are often used, and the evaluation of stopping rules is no longer a trivial task. Specifically, nonparametric test statistics do not necessarily correspond to a parameter of clinical interest, thus making it difficult to characterize alternatives at which operating characteristics are to be computed. We describe an approach for …


Coarse-Graining Schemes And A Posteriori Error Estimates For Stochastic Lattice Systems, Ma Katsoulakis, P Plechac, L Rey-Bellet, Dk Tsagkarogiannis May 2007

Luc Rey-Bellet

The primary objective of this work is to develop coarse-graining schemes for stochastic many-body microscopic models and quantify their effectiveness in terms of a priori and a posteriori error analysis. In this paper we focus on stochastic lattice systems of interacting particles at equilibrium. The proposed algorithms are derived from an initial coarse-grained approximation that is directly computable by Monte Carlo simulations, and the corresponding numerical error is calculated using the specific relative entropy between the exact and approximate coarse-grained equilibrium measures. Subsequently we carry out a cluster expansion around this first – and often inadequate – approximation and obtain …


A Sum Frequency Generation Vibrational Spectroscopic Study Of The Adsorption And Reactions Of C6 Hydrocarbons At High Pressures On Pt(100), Kaitlin M. Bratlie, Gabor A. Somorjai May 2007

Kaitlin M. Bratlie

Sum frequency generation (SFG) vibrational spectroscopy was used to investigate the adsorption geometries and surface reactions of various C6 hydrocarbons (n-hexane, 2-methylpentane, 3-methylpentane, and 1-hexene) on Pt(100). At 300 K and in the presence of excess hydrogen, n-hexane, 3-methylpentane, and 2-methylpentane adsorb molecularly on Pt(100) mostly in “flat-lying” conformations. Upon heating the metal surface to 450 K, the molecules underwent dehydrogenation to form new surface species in “standing-up” conformations, such as hexylidyne and metallacyclic species.


Growth By Molecular Beam Epitaxy Of Self-Assembled Inas Quantum Dots On Inalas And Ingaas Lattice-Matched To Inp, Paul J. Simmonds, H W. Li, H E. Beere, P See, A J. Shields, D A. Ritchie May 2007

Paul J. Simmonds

The authors report the results of a detailed study of the effect of growth conditions, for molecular beam epitaxy, on the structural and optical properties of self-assembled InAs quantum dots (QDs) on In0.524Al0.476As. InAs QDs both buried in, and on top of, In0.524Al0.476As were analyzed using photoluminescence (PL) and atomic force microscopy. InAs QD morphology and peak PL emission wavelength both scale linearly with deposition thickness in monolayers (MLs). InAs deposition thickness can be used to tune QD PL wavelength by 170 nm/ML, over a range of almost 700 nm. Increasing growth …


Direct Analysis Of Solid Corrosion Products By Laser Ablation Icp-Ms: Method Development And The Interaction Of Aqueous Uranium, Gadolinium And Neodymium With Iron Shot And Iron (Iii) Oxide, James Cizdziel, Klaus J. Stetzenbach, Amy J. Smiecinski May 2007

Publications (YM)

The purpose of this report is to summarize the work and present conclusions of Project Activity Task ORD-RF-03 conducted under cooperative agreement number DE-FC28-04RW12237 between the U.S. Department of Energy and the Nevada System of Higher Education (NSHE). The work was conducted in the Harry Reid Center for Environmental Studies of the University of Nevada Las Vegas from October 1, 2004 to September 30, 2006. The purpose of the study was to develop a method using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) for the direct analysis of iron corrosion products, to evaluate its capabilities, advantages, and limitations, and …