
Physical Sciences and Mathematics Commons


Brigham Young University


Articles 2461 - 2490 of 2906

Full-Text Articles in Physical Sciences and Mathematics

Optimization By Varied Beam Search In Hopfield Networks, Tony R. Martinez, Xinchuan Zeng May 2002

Faculty Publications

This paper shows that the performance of the Hopfield network for solving optimization problems can be improved by a varied beam search algorithm. The algorithm varies the beam search size and beam intensity during the network relaxation process. It consists of two stages: increasing the beam search parameters in the first stage and then decreasing them in the second stage. The purpose of using such a scheme is to provide the network with a better chance to find more and better solutions. A large number of simulation results based on 200 randomly generated city distributions of the 10-city traveling salesman …
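
The two-stage schedule described in the abstract can be pictured with a short sketch. The linear ramps, parameter names, and bounds below are our own illustrative assumptions, not the authors' formulation:

```python
# Hypothetical two-stage beam schedule: grow the beam parameters during the
# first half of relaxation, then shrink them back during the second half.
def varied_beam_size(step, total_steps, b_min=2, b_max=8):
    half = total_steps / 2.0
    if step <= half:
        frac = step / half                      # stage 1: ramp up
    else:
        frac = (total_steps - step) / half      # stage 2: ramp down
    return round(b_min + frac * (b_max - b_min))

# Example: beam size sampled every 10 steps of a 100-step relaxation.
print([varied_beam_size(s, 100) for s in range(0, 101, 10)])
```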


Probabilistic Connections In Relaxation Networks, Dan A. Ventura May 2002

Faculty Publications

This paper reports results from studying the behavior of Hopfield-type networks with probabilistic connections. As the probabilities decrease, network performance degrades. In order to compensate, two network modifications - input persistence and a new activation function - are suggested, and empirical results indicate that the modifications significantly improve network performance.
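
As a rough illustration of what "probabilistic connections" means operationally, the sketch below runs one Hopfield-style update in which each weight participates only with probability p, and keeps re-injecting the external input (one plausible reading of "input persistence"). The function name, the persistence term, and the threshold activation are our assumptions, not the paper's definitions:

```python
import numpy as np

def probabilistic_update(W, state, external, p=0.5, persistence=0.3, rng=None):
    """One synchronous update of a Hopfield-style network whose connections
    each participate only with probability p (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(W.shape) < p              # connections present this step
    net = (W * mask) @ state + persistence * external
    return np.where(net >= 0, 1.0, -1.0)        # standard bipolar threshold
```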


Rapid Heterogeneous Ad Hoc Connection Establishment: Accelerating Bluetooth Inquiry Using IrDA, Trevor Clifton, Derek D. Joos, Charles D. Knutson, Ryan W. Woodings Mar 2002

Faculty Publications

Bluetooth device discovery is a time-intensive phase of the Bluetooth connection-establishment procedure. In this paper we propose a technique that integrates existing IrDA technology with Bluetooth technology to improve the ad hoc connection establishment time of Bluetooth devices. We accomplish this improvement by first establishing an IrDA connection between two devices equipped with both Bluetooth and IrDA capabilities and then exchanging Bluetooth device discovery information via the established IrDA connection. As a result of this cooperative exchange, the devices are able to bypass the time-intensive Bluetooth device discovery procedure. Our research shows that IrDA-assisted Bluetooth connection establishment is up to …


A Comparison Of Coalescent Estimation Software, Kristen Piggott Shepherd Mar 2002

Theses and Dissertations

Coalescent theory is a method often used by population geneticists in order to make inferences about evolutionary parameters. The coalescent is a stochastic model that approximates ancestral relationships among genes. An understanding of the coalescent pattern of a sample of sequences, along with some knowledge of the mutations that have occurred, provides information about the evolutionary forces that have acted on the population. Processes such as migration, recombination, variable population size, or natural selection are the forces that affect the genealogies and lead to genetic variability in a sample. Coalescent theory provides a statistical description of the variability in the …


Pattern Classification Using A Quantum System, Dan A. Ventura Mar 2002

Faculty Publications

We consider and compare three approaches to quantum pattern classification, presenting empirical results from simulations.


Negative Band Gap Bowing In Epitaxial InAs/GaAs Alloys And Predicted Band Offsets Of The Strained Binaries And Alloys On Various Substrates, Gus L. W. Hart, Kwiseon Kim, Alex Zunger Feb 2002

Faculty Publications

We use pseudopotential theory to provide (1) the band offsets of strained GaAs and InAs on various substrates and (2) the energies Ev(x) and Ec(x) of the valence and conduction bands of the InxGa1-xAs alloy, as a function of composition. Results are presented for both the bulk alloy and for the alloy strained on InP or GaAs. We predict that while the band gap Eg(x) bows downward for relaxed bulk alloys, it bows upward for strained epitaxial alloys. The calculated alloy offsets are used to discuss electron and hole localization in this system.


Pair Attribute Learning: Network Construction Using Pair Features, Tony R. Martinez, Eric K. Henderson Jan 2002

Faculty Publications

We present the Pair Attribute Learning (PAL) algorithm for the selection of relevant inputs and network topology. Correlations on training instance pairs are used to drive network construction of a single-hidden layer MLP. Results on nine learning problems demonstrate 70% less complexity, on average, without a significant loss of accuracy.


The Algebra And Geometry Of Curve And Surface Inversion, Thomas W. Sederberg, Eng-Wee Chionh, Kent Ridge Jan 2002

Faculty Publications

An inversion equation takes the Cartesian coordinates of a point on a parametric curve or surface and returns the parameter value(s) of that point. A 2-D curve inversion equation has the form t = f(x,y)/g(x,y). This paper shows that practical insight into inversion can be obtained by studying the geometry of the implicit curves f(x,y) = 0 and g(x,y) = 0. For example, the relationship between the singular locus of the parametric curve and the lowest possible degree of an inversion equation can be understood in this way. Also, insight is given into what parameter value will be returned if …
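
As a concrete illustration (ours, not the paper's), take the parabola parametrized by (x(t), y(t)) = (t, t²); one inversion equation of the form above, together with its two implicit curves, is:

```latex
% Worked example for the parabola (x(t), y(t)) = (t, t^2):
\[
  t \;=\; \frac{f(x,y)}{g(x,y)} \;=\; \frac{y}{x},
  \qquad f(x,y) = y,\quad g(x,y) = x .
\]
% The curves f = 0 (the line y = 0) and g = 0 (the line x = 0) intersect at the
% origin, which the parabola reaches at t = 0, exactly where this particular
% inversion formula becomes indeterminate (0/0).
```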


A New Occurrence Of Archaeoscyphia Pulchra (Bassler) From The Ordovician Of Western Canada, J. Keith Rigby, Godfrey S. Nowlan, Peter A. Rowlands Jan 2002

Faculty Publications

A few specimens of the ornate anthaspidellid demosponge, Archaeoscyphia pulchra (Bassler), have been collected from the Lower Ordovician Outram Formation or Skoki Formation, from a saddle at the head of South Rice Brook in northeastern British Columbia. This is the first report of the flanged-appearing annulate, steeply obconical sponge in western Canada, although it has been reported from the Mingan Islands of Quebec and was initially described from Nevada, in the western United States. The taxon has also been reported as other species of Archaeoscyphia from Ordovician rocks of Missouri and from the San Juan region of Argentina.


Peppering Knowledge Sources With Salt: Boosting Conceptual Content For Ontology Generation, Deryle W. Lonsdale, Yihong Ding, David W. Embley, Alan Melby Jan 2002

Faculty Publications

This paper describes work done to explore the common ground between two different ongoing research projects: the standardization of lexical and terminological resources, and the use of conceptual ontologies for information extraction and data integration. Specifically, this paper explores improving the generation of extraction ontologies through use of a comprehensive terminology database that has been represented in a standardized format for easy tool-based implementation. We show how, via the successful integration of these two distinct efforts, it is possible to leverage large-scale terminological and conceptual information having relationship-rich semantic resources in order to reformulate, match, and merge retrieved information of …


Origins Of Nonstoichiometry And Vacancy Ordering In Sc1-x□xS, Gus L. W. Hart, Alex Zunger Dec 2001

Faculty Publications

Whereas nearly all compounds AnBm obey Dalton's rule of integer stoichiometry (n:m, both integer), there is a class of systems, exemplified by the rocksalt structure Sc1-x□xS, that exhibits large deviations from stoichiometry via vacancies, even at low temperatures. By combining first-principles total energy calculations with lattice statistical mechanics, we scan an astronomical number of possible structures, identifying the stable ground states. Surprisingly, all have the same motifs: (111) planes with (112) vacancy rows arranged in (110) columns. Electronic structure calculations of the ground states (identified out of ~3 × 10^6 structures) reveal the remarkable origins of nonstoichiometry.


A Confidence Measure For Boundary Detection And Object Selection, William A. Barrett, Eric N. Mortensen Dec 2001

Faculty Publications

We introduce a confidence measure that estimates the assurance that a graph arc (or edge) corresponds to an object boundary in an image. A weighted, planar graph is imposed onto the watershed lines of a gradient magnitude image and the confidence measure is a function of the cost of fixed-length paths emanating from and extending to each end of a graph arc. The confidence measure is applied to automate the detection of object boundaries and thereby reduces (often greatly) the time and effort required for object boundary definition within a user-guided image segmentation environment.


Fast Focal Length Solution In Partial Panoramic Image Stitching, William A. Barrett, Kirk L. Duffin Dec 2001

Faculty Publications

Accurate estimation of effective camera focal length is crucial to the success of panoramic image stitching. Fast techniques for estimating the focal length exist, but are dependent upon a close initial approximation or the existence of a full circle panoramic image sequence. Numerical solutions of the focal length demonstrate strong coupling between the focal length and the angles used to position each component image about the common spherical center. This paper demonstrates that parameterizing panoramic image positions using spherical arc length instead of angles effectively decouples the focal length from the image position. This new parameterization does not require an …


Houghing The Hough: Peak Collection For Detection Of Corners, Junctions And Line Intersections, William A. Barrett, Kevin D. Petersen Dec 2001

Faculty Publications

We exploit the Accumulator Array of the Hough Transform by finding collections of (2 or more) peaks through which a given sinusoid will pass. Such sinusoids identify points in the original image where lines intersect. Peak collection (or line aggregation) is performed by making a second pass through the edge map, but instead of laying points down in the accumulator array (as with the original Hough Transform), we compute the line integral over each sinusoid that corresponds to the current edge point. If a sinusoid passes through greater than or equal to 2 peaks, we deposit that sum/integral into a …
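
A rough sketch of that second pass is given below, under our own assumptions about data layout: `edges` is an iterable of (x, y) edge pixels, `acc` is the standard rho-theta accumulator, and `peak_mask` marks its detected peaks. Depositing the line integral at the edge pixel itself is our reading of the truncated text, not the authors' exact procedure:

```python
import numpy as np

def second_pass(edges, acc, peak_mask, image_shape, rho_res=1.0):
    """Second pass over the edge map: for each edge point, integrate the
    accumulator along its sinusoid and keep the sum if the sinusoid crosses
    two or more detected peaks (illustrative sketch)."""
    n_rho, n_theta = acc.shape
    thetas = np.deg2rad(np.arange(n_theta))
    theta_idx = np.arange(n_theta)
    intersection_map = np.zeros(image_shape, dtype=float)
    for x, y in edges:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)        # sinusoid of (x, y)
        rho_idx = np.round(rhos / rho_res).astype(int) + n_rho // 2
        ok = (rho_idx >= 0) & (rho_idx < n_rho)
        cells = (rho_idx[ok], theta_idx[ok])
        if peak_mask[cells].sum() >= 2:                       # crosses 2+ peaks
            intersection_map[y, x] = acc[cells].sum()         # line integral
    return intersection_map
```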


Image Magnification Using Level-Set Reconstruction, Bryan S. Morse, Duane Schwartzwald Dec 2001

Faculty Publications

Image magnification is a common problem in imaging applications, requiring interpolation to “read between the pixels”. Although many magnification/interpolation algorithms have been proposed in the literature, all methods must suffer to some degree the effects of imperfect reconstruction―false high-frequency content introduced by the underlying original sampling. Most often, these effects manifest themselves as jagged contours in the image. This paper presents a method for constrained smoothing of such artifacts that attempts to produce smooth reconstructions of the image’s level curves while still maintaining image fidelity. This is similar to other iterative reconstruction algorithms and to Bayesian restoration techniques, but instead …


Using SSM Proxies To Provide Efficient Multiple-Source Multicast Delivery, Daniel Zappala, Aaron Fabbri Nov 2001

Faculty Publications

We consider the possibility that single-source multicast (SSM) will become a universal multicast service, enabling large-scale distribution of content from a few well-known sources to a general audience. Operating under this assumption, we explore the problem of building the traditional IP model of any-source multicast on top of SSM. Toward this end, we design an SSM proxy service that allows any sender to efficiently deliver content to a multicast group. We demonstrate the performance improvements this service offers over standard SSM and describe extensions for access control, dynamic proxy discovery, and multicast proxy distribution.


Modeling The Effects Of Transforming Growth Factor-Beta On Extracellular Matrix Alignment In Dermal Wound Repair, J. C. Dallon, J. A. Sherratt, P. K. Maini Oct 2001

Faculty Publications

We present a novel mathematical model for collagen deposition and alignment during dermal wound healing, focusing on the regulatory effects of TGF-β. Our work extends a previously developed model which considers the interactions between fibroblasts and extracellular matrix, composed of collagen and a fibrin-based blood clot, by allowing fibroblasts to orient the collagen matrix, and produce and degrade the extracellular matrix, while the matrix can direct the fibroblasts and control their speed. Here we extend the model by allowing a time-varying concentration of TGF-β to alter the properties of the fibroblasts. Thus we are able to simulate experiments …


Statistics And Supermetallicity: The Metallicity Of Mu Leonis, B. J. Taylor Oct 2001

Faculty Publications

For the often-studied "SMR" giant µ Leo, Smith & Ruck (2000) have recently found that [Fe/H] ≈ +0.3 dex. Their conclusion is tested here in a "statistical" paradigm, in which statistical principles are used to select published high-dispersion µ Leo data and assign error bars to them. When data from Smith & Ruck and from Takeda et al. (1998) are added to a data base compiled in 1999, it is found that conclusions from an earlier analysis (Taylor 1999c) are essentially unchanged: the mean value of [Fe/H] ≈ +0.23 ± 0.025 dex, and values ≤ +0.2 dex …


Modeling IrDA Performance: The Effect Of IrLAP Negotiation Parameters On Throughput, Scott V. Hansen, Charles D. Knutson, Michael G. Robertson, Franklin E. Sorenson Oct 2001

Faculty Publications

The Infrared Data Association's (IrDA) infrared data transmission protocol is a widely used mechanism for short-range wireless data communications. In order to provide flexibility for connections between devices of potentially disparate capabilities, IrDA devices negotiate the values of several transmission parameters based on the capabilities of the devices establishing the connection. This paper describes the design and implementation of a software tool, Irdaperf, to model IrDA performance based on negotiated transmission parameters. Using Irdaperf, we demonstrate that for fast data rates, maximizing window size and data size are key factors for overcoming the negative effects of a relatively long link …


Chemical Bonding, Elasticity, And Valence Force Field Models: A Case Study For α-Pt2Si And PtSi, Gus L. W. Hart, J. E. Klepeis, O. Beckstein, O. Pankratov Sep 2001

Faculty Publications

We have carried out a detailed study of the chemical bonding for two room-temperature stable platinum silicide phases, tetragonal α-Pt2Si and orthorhombic PtSi. These elements of the bonding are further analyzed by constructing valence force field models using the results from recent first principles calculations of the six (nine) independent, nonzero elastic constants of α-Pt2Si (PtSi). The resulting volume-, radial-, and angular-dependent force constants provide insight into the relative strength of various bonding elements as well as the trends observed in the elastic constants themselves. The valence force field analysis yields quantitative information about the nature of the chemical bonding that is not …


Poynting's Theorem And Luminal Total Energy Transport In Passive Dielectric Media, Scott Glasgow, Michael Ware, Justin Peatross Sep 2001

Faculty Publications

Without approximation the energy density in Poynting's theorem for the generally dispersive and passive dielectric medium is demonstrated to be a system total dynamical energy density. Thus the density in Poynting's theorem is a conserved form that by virtue of its positive definiteness prescribes important qualitative and quantitative features of the medium-field dynamics by rendering the system dynamically closed. This fully three-dimensional result, applicable to anisotropic and inhomogeneous media, is model independent, relying solely on the complex-analytic consequences of causality and passivity. As direct applications of this result, we show (1) that a causal medium responds to a virtual, instantaneous …


Improving Cluster Utilization Through Set Based Allocation Policies, Quinn O. Snell, Julio C. Facelli, Brian D. Haymore, David B. Jackson Sep 2001

Faculty Publications

While clusters have already proven themselves in the world of high performance computing, some clusters are beginning to exhibit resource inefficiencies due to increasing hardware diversity. Much of the success of clusters lies in the use of commodity components built to meet various hardware standards. These standards have allowed a great level of hardware backwards compatibility that is now resulting in a condition referred to as hardware 'drift' or heterogeneity. The hardware heterogeneity introduces problems when diverse compute nodes are allocated to a parallel job, as most parallel jobs are not self-balancing. This paper presents a new method that allows …


Statistics And Supermetallicity: The Metallicity Of NGC 6791, B. J. Taylor Aug 2001

Faculty Publications

For the old galactic cluster NGC 6791, Peterson & Green (1998a) and Chaboyer et al. (1999) have found that [Fe/H] ≈ +0.4 dex. A second look at that conclusion is taken in this paper. Zero-point problems are reviewed for a high-dispersion analysis done by Peterson & Green, and it is found that accidental errors have not been determined rigorously for the results of that analysis. It is also noted that in a color-magnitude analysis performed by Chaboyer et al., the important metallicity range between 0.0 and +0.3 dex is not explored and hence is not ruled out. Moreover, …


Livelock Avoidance For Meta-Schedulers, Mark J. Clement, John Jardine, Quinn O. Snell Aug 2001

Faculty Publications

Meta-scheduling, a process which allows a user to schedule a job across multiple sites, has a potential for livelock. Current systems avoid livelock by locking down resources at multiple sites and allowing a metascheduler to control the resources during the lock down period or by limiting job size to that which will fit on one site. The former approach leads to poor utilization; the latter poses limitations on job size. This research uses BYU's Meta-scheduler (YMS) which allows jobs to be scheduled across multiple sites without the need for locking down the nodes. YMS avoids livelock through exponential back-off. This …
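
Exponential back-off is a standard mechanism; the generic sketch below is only meant to illustrate it (the YMS-specific policy is not reproduced here, and `try_reserve` is a hypothetical callback):

```python
import random
import time

def reserve_with_backoff(try_reserve, max_attempts=6, base=1.0):
    """Retry a resource reservation with randomized exponential delays so that
    competing schedulers do not keep colliding (generic illustration)."""
    for attempt in range(max_attempts):
        if try_reserve():
            return True
        time.sleep(random.uniform(0, base * 2 ** attempt))   # back off and retry
    return False
```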


Improved Hopfield Networks By Training With Noisy Data, Fred Clift, Tony R. Martinez Jul 2001

Faculty Publications

A new approach to training a generalized Hopfield network is developed and evaluated in this work. Both the weight symmetricity constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with Back-Propagation Through Time, using noisy versions of the memorized patterns. Training in this way is referred to as Noisy Associative Training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets …


Improving The Hopfield Network Through Beam Search, Tony R. Martinez, Xinchuan Zeng Jul 2001

Faculty Publications

In this paper we propose a beam search mechanism to improve the performance of the Hopfield network for solving optimization problems. The beam search readjusts the top M (M > 1) activated neurons to more similar activation levels in the early phase of relaxation, so that the network has the opportunity to explore more alternative, potentially better solutions. We evaluated this approach using a large number of simulations (20,000 for each parameter setting), based on 200 randomly generated city distributions of the 10-city traveling salesman problem. The results show that the beam search has the capability of significantly improving the network …
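
A minimal sketch of the readjustment step as we read it from the abstract: pull the activations of the top M neurons toward their common mean so they remain competitive early in relaxation. The blending factor and the function name are our own illustrative choices, not the authors':

```python
import numpy as np

def readjust_top_m(activations, m=3, alpha=0.5):
    """Move the M most active neurons toward their shared mean activation
    (illustrative sketch of the beam-search readjustment)."""
    a = np.asarray(activations, dtype=float).copy()
    top = np.argsort(a)[-m:]                 # indices of the top-M neurons
    a[top] = (1 - alpha) * a[top] + alpha * a[top].mean()
    return a
```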


Speed Training: Improving The Rate Of Backpropagation Learning Through Stochastic Sample Presentation, Timothy L. Andersen, Tony R. Martinez, Michael E. Rimer Jul 2001

Faculty Publications

Artificial neural networks provide an effective empirical predictive model for pattern classification. However, using complex neural networks to learn very large training sets is often problematic, imposing prohibitive time constraints on the training process. We present four practical methods for dramatically decreasing training time through dynamic stochastic sample presentation, a technique we call speed training. These methods are shown to be robust in retaining generalization accuracy over a diverse collection of real-world data sets. In particular, the SET technique achieves a training speedup of 4278% on a large OCR database with no detectable loss in generalization.
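
One generic way to realize dynamic stochastic sample presentation is sketched below; this is our illustration only and does not reproduce the paper's SET criterion. Each epoch, an example is presented with a probability that grows with its most recent error, so well-learned examples are skipped most of the time (`error_of` and `train_on` are hypothetical callbacks):

```python
import random

def stochastic_epoch(samples, error_of, train_on, p_floor=0.05):
    """Present each (input, target) pair with probability proportional to its
    current error, never dropping below p_floor (illustrative sketch)."""
    for x, target in samples:
        err = error_of(x, target)            # e.g. |output - target| in [0, 1]
        if random.random() < max(p_floor, err):
            train_on(x, target)              # run one backpropagation step
```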


The Need For Small Learning Rates On Large Problems, Tony R. Martinez, D. Randall Wilson Jul 2001

Faculty Publications

In gradient descent learning algorithms such as error backpropagation, the learning rate parameter can have a significant effect on generalization accuracy. In particular, decreasing the learning rate below that which yields the fastest convergence can significantly improve generalization accuracy, especially on large, complex problems. The learning rate also directly affects training speed, but not necessarily in the way that many people expect. Many neural network practitioners currently attempt to use the largest learning rate that still allows for convergence, in order to improve training speed. However, a learning rate that is too large can be as slow as a learning …
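
For reference, the learning rate discussed here is the step size η in the textbook backpropagation weight update (a standard equation, not quoted from the paper):

```latex
\[
  \Delta w_{ij} \;=\; -\,\eta \,\frac{\partial E}{\partial w_{ij}},
\]
% where E is the training error and w_{ij} is a network weight; the abstract's
% argument concerns how the choice of \eta trades off convergence speed against
% generalization accuracy on large problems.
```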


Lazy Training: Improving Backpropagation Learning Through Network Interaction, Timothy L. Andersen, Tony R. Martinez, Michael E. Rimer Jul 2001

Faculty Publications

Backpropagation, similar to most high-order learning algorithms, is prone to overfitting. We address this issue by introducing interactive training (IT), a logical extension to backpropagation training that employs interaction among multiple networks. This method is based on the theory that centralized control is more effective for learning in deep problem spaces in a multi-agent paradigm. IT methods allow networks to work together to form more complex systems while not restraining their individual ability to specialize. Lazy training, an implementation of IT that minimizes misclassification error, is presented. Lazy training discourages overfitting and is conducive to higher accuracy in multiclass problems …


Optimal Artificial Neural Network Architecture Selection For Bagging, Timothy L. Andersen, Tony R. Martinez, Michael E. Rimer Jul 2001

Faculty Publications

This paper studies the performance of standard architecture selection strategies, such as cost/performance and CV based strategies, for voting methods such as bagging. It is shown that standard architecture selection strategies are not optimal for voting methods and tend to underestimate the complexity of the optimal network architecture, since they only examine the performance of the network on an individual basis and do not consider the correlation between responses from multiple networks.