Open Access. Powered by Scholars. Published by Universities.®
Physical Sciences and Mathematics Commons™
Articles 52111 - 52140 of 58247
Full-Text Articles in Physical Sciences and Mathematics
Open Source Software: A History, David Bretthauer
Published Works
In the 30 years from 1970-2000, open source software began as an assumption without a name or a clear alternative. It has evolved into a sophisticated movement which has produced some of the most stable and widely used software packages ever produced. This paper traces the evolution of three operating systems: GNU, BSD, and Linux, as well as the communities which have evolved with these systems and some of the commonly-used software packages developed using the open source model. It also discusses some of the major figures in open source software, and defines both “free software” and “open source software.”
Night Out Itinerary Creator, Jennifer Hood
Honors Capstone Projects and Theses
No abstract provided.
Active Information Retrieval, Tommi Jaakkola, Hava Siegelmann
Hava Siegelmann
In classical large information retrieval systems, the system responds to a user initiated query with a list of results ranked by relevance. The users may further refine their query as needed. This process may result in a lengthy correspondence without conclusion. We propose an alternative active learning approach, where the system responds to the initial user’s query by successively probing the user for distinctions at multiple levels of abstraction. The system’s initiated queries are optimized for speedy recovery and the user is permitted to respond with multiple selections or may reject the query. The information is in each case unambiguously …
On Detecting Service Violations And Bandwidth Theft In Qos Network Domains, Ahsan Habib, Sonia Fahmy, Srinivas R. Avasarala, Venkatesh Prabhakar, Bharat Bhargava
Department of Computer Science Technical Reports
No abstract provided.
Automated Online News Classification With Personalization, Chee-Hong Chan, Aixin Sun, Ee Peng Lim
Research Collection School Of Computing and Information Systems
Classification of online news, in the past, has often been done manually. In our proposed Categorizor system, we have experimented with an automated approach to classify online news using the Support Vector Machine (SVM). SVM has been shown to deliver good classification results when ample training documents are given. In our research, we have applied SVM to personalized classification of online news.
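For readers unfamiliar with the technique, the classification step can be sketched as a linear SVM trained by hinge-loss subgradient descent over bag-of-words features. This is a minimal illustration on a hypothetical four-headline corpus, not the authors' Categorizor system:

```python
# Minimal linear SVM sketch: bag-of-words features, hinge-loss
# subgradient descent. Corpus, vocabulary, and labels are hypothetical.

def vectorize(doc, vocab):
    """Bag-of-words count vector over a fixed vocabulary."""
    words = doc.lower().split()
    return [words.count(w) for w in vocab]

def train_svm(X, y, epochs=200, lr=0.01, lam=0.01):
    """Subgradient descent on the regularized hinge loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:   # point violates the margin: move toward yi*xi
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:            # correctly classified: only shrink (regularize)
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Hypothetical training set: sports (+1) vs. finance (-1) headlines.
docs = ["team wins league match", "coach praises team victory",
        "stocks fall as market dips", "bank reports market profits"]
labels = [1, 1, -1, -1]
vocab = sorted({w for d in docs for w in d.lower().split()})
X = [vectorize(d, vocab) for d in docs]
w, b = train_svm(X, labels)

print(predict(w, b, vectorize("team match victory", vocab)))
print(predict(w, b, vectorize("market stocks profits", vocab)))
```

With linearly separable toy data like this, the learned weights become positive on sports terms and negative on finance terms, so the two probe headlines fall on opposite sides of the hyperplane.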
Dual Heuristic Programming For Fuzzy Control, George G. Lendaris, Thaddeus T. Shannon, Larry J. Schultz, Steven Hutsell, Alec Rogers
Systems Science Faculty Publications and Presentations
Overview material for the Special Session (Tuning Fuzzy Controllers Using Adaptive Critic Based Approximate Dynamic Programming) is provided. The Dual Heuristic Programming (DHP) method of Approximate Dynamic Programming is described and used to design a fuzzy control system. DHP and related techniques have been developed in the neurocontrol context but can be equally productive when used with fuzzy controllers or neuro-fuzzy hybrids. This technique is demonstrated by designing a temperature controller for a simple water bath system. In this example, we take advantage of the TSK model framework to initialize the tunable parameters of our plant model with reasonable …
Meeting Medical Terminology Needs: The Ontology-Enhanced Medical Concept Mapper, Gondy Leroy, Hsinchun Chen
CGU Faculty Publications and Research
This paper describes the development and testing of the Medical Concept Mapper, a tool designed to facilitate access to online medical information sources by providing users with appropriate medical search terms for their personal queries. Our system is valuable for patients whose knowledge of medical vocabularies is inadequate to find the desired information, and for medical experts who search for information outside their field of expertise. The Medical Concept Mapper maps synonyms and semantically related concepts to a user's query. The system is unique because it integrates our natural language processing tool, i.e., the Arizona (AZ) Noun Phraser, with human-created …
Modeling Intersections Of Geospatial Lifelines, Ramaswam Hariharan
Electronic Theses and Dissertations
Modeling moving objects involves spatio-temporal reasoning. The continuous movements of objects in space-time captured as discrete samples form geospatial lifelines. Existing lifeline models can represent the movement of objects between samples from most likely location to all possible locations. This thesis builds on a model called lifeline bead and necklace that captures all the possible locations of moving objects. Beads are 3-dimensional representations of an object's movements and a series of beads form a necklace. The extent of finding the possible locations is constrained by the speed of movement of the objects. Intersections of lifelines occur when two or more …
Image Compression By Wavelet Transform., Panrong Xiao
Electronic Theses and Dissertations
Digital images are widely used in computer applications. Uncompressed digital images require considerable storage capacity and transmission bandwidth. Efficient image compression solutions are becoming more critical with the recent growth of data intensive, multimedia-based web applications.
This thesis studies image compression with wavelet transforms. As a necessary background, the basic concepts of graphical image storage and currently used compression algorithms are discussed. The mathematical properties of several types of wavelets, including Haar, Daubechies, and biorthogonal spline wavelets are covered and the Embedded Zerotree Wavelet (EZW) coding algorithm is introduced. The last part of the thesis analyzes the compression results to …
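The core idea of wavelet-based compression can be illustrated with the simplest case the thesis covers, a one-level 1-D Haar transform: store pairwise averages and differences, then zero out near-zero differences. The row of pixel values below is hypothetical; the thesis itself treats 2-D transforms, Daubechies and biorthogonal wavelets, and EZW coding.

```python
# One-level 1-D Haar transform sketch: averages (low-pass) plus
# differences (high-pass); thresholding small differences is the
# lossy compression step. The "image row" is hypothetical.

def haar_forward(signal):
    """Pairwise averages followed by pairwise differences."""
    avgs = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    diffs = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return avgs + diffs

def haar_inverse(coeffs):
    """Exact inverse: each (average, difference) pair restores two samples."""
    half = len(coeffs) // 2
    out = []
    for s, d in zip(coeffs[:half], coeffs[half:]):
        out.extend([s + d, s - d])
    return out

def threshold(coeffs, t):
    """Zero out coefficients below t in magnitude (the lossy step)."""
    return [c if abs(c) >= t else 0.0 for c in coeffs]

row = [10.0, 10.0, 10.0, 12.0, 50.0, 52.0, 51.0, 51.0]  # one image row
coeffs = haar_forward(row)
compressed = threshold(coeffs, t=1.5)
restored = haar_inverse(compressed)

print(coeffs)      # averages [10, 11, 51, 51], differences [0, -1, -1, 0]
print(restored)    # close to the original row
```

Without thresholding the inverse reconstructs the row exactly; with it, the small differences vanish and most coefficients become zero, which is what entropy coders such as EZW exploit.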
Deploying Integrated Web-Based Spatial Applications Within An Oracle Database Environment, James Carswell
Conference papers
In this paper, we describe the architectural and functional characteristics of e-Spatial™ technology, comprising an innovative software package that represents a timely alternative to traditional and complex proprietary GIS application packages. The two main components of the package, developed by e-Spatial Solutions, are the iSMART™ database development technology and the i-Spatial™ Information Server (iSIS), both implemented within an Oracle 9i Spatial database environment. This technology allows users to build and deploy spatially enabled or standard Internet applications without requiring any application-specific source code. It can be deployed on any Oracle supported hardware platform and on any device that supports …
Metaxpath, Curtis Dyreson, Michael H. Böhen, Christian S. Jensen
Curtis Dyreson
This paper presents the METAXPath data model and query language. METAXPath extends XPath with support for XML metadata. XPath is a specification language for locations in an XML document; it serves as the basis for XML query languages like XSLT and the XML Query Algebra.
The METAXPath data model is a nested XPath tree. Each level of metadata induces a new level of nesting. The data model separates metadata and data into different data spaces, supports meta-metadata and enables sharing of metadata common to a group of nodes without duplication. The METAXPath query language has a level shift operator to …
A Framework For Multilingual Information Processing, Steven Edward Atkin
Theses and Dissertations
Recent and continuing rapid increases in computing power now enable more of humankind's written communication to be represented as digital data. The most recent and obvious changes in multilingual information processing have been the introduction of larger character sets encompassing more writing systems. Yet the very richness of larger collections of characters has made the interpretation and processing of text more difficult. The many competing motivations (satisfying the needs of linguists, computer scientists, and typographers) for standardizing character sets threaten the purpose of information processing: accurate and facile manipulation of data. Existing character sets are constructed without a consistent strategy …
Real-Time Distributed Scheduling In Non-Guaranteed Networks Within A Quality Of Service Framework, Hisham Nabil El-Zahhar
Archived Theses and Dissertations
No abstract provided.
Slowstart Congestion Control For Fine-Grained Layered Multicast, Khalid Shaheen
Archived Theses and Dissertations
No abstract provided.
Fast Implementation Of Depth Contours Using Topological Sweep, Kim Miller, Suneeta Ramaswami, Peter Rousseeuw, Toni Sellarès, Diane Souvaine, Ileana Streinu, Anja Struyf
Computer Science: Faculty Publications
The concept of location depth was introduced in statistics as a way to extend the univariate notion of ranking to a bivariate configuration of data points. It has been used successfully for robust estimation, hypothesis testing, and graphical display. These require the computation of depth regions, which form a collection of nested polygons. The center of the deepest region is called the Tukey median. The only available implemented algorithms for the depth contours and the Tukey median are slow, which limits their usefulness. In this paper we describe an optimal algorithm which computes all depth contours in O(n²) time …
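For orientation, the location depth (Tukey depth) of a query point is the minimum, over all closed halfplanes whose boundary passes through that point, of the number of data points the halfplane contains; depth contours are the nested regions of points whose depth is at least k. The brute-force sketch below is ours, not the paper's optimal algorithm, and the ring of points is hypothetical:

```python
import math

# Brute-force bivariate Tukey depth. The count only changes when the
# halfplane normal crosses a direction perpendicular to some data point,
# so it suffices to test slightly-perturbed normals at those breakpoints.
# O(n^2) per query; the paper's topological-sweep method is far faster.

def tukey_depth(p, points, eps=1e-6):
    px, py = p
    angles = [math.atan2(y - py, x - px) for (x, y) in points if (x, y) != p]
    best = len(points)
    for a in angles:
        for boundary in (a + math.pi / 2 - eps, a + math.pi / 2 + eps,
                         a - math.pi / 2 - eps, a - math.pi / 2 + eps):
            nx, ny = math.cos(boundary), math.sin(boundary)  # halfplane normal
            count = sum(1 for (x, y) in points
                        if nx * (x - px) + ny * (y - py) >= -1e-9)
            best = min(best, count)
    return best

# Hypothetical data: a ring of 8 points around the origin.
ring = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
        for k in range(8)]

print(tukey_depth((0.0, 0.0), ring))   # the center is deepest
print(tukey_depth((2.0, 0.0), ring))   # a point outside has depth 0
```

On this symmetric ring every halfplane through the center contains at least four of the eight points, while some halfplane through the outside point contains none, which is exactly the ranking the depth contours encode.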
Houghing The Hough: Peak Collection For Detection Of Corners, Junctions And Line Intersections, William A. Barrett, Kevin D. Petersen
Faculty Publications
We exploit the Accumulator Array of the Hough Transform by finding collections of (2 or more) peaks through which a given sinusoid will pass. Such sinusoids identify points in the original image where lines intersect. Peak collection (or line aggregation) is performed by making a second pass through the edge map, but instead of laying points down in the accumulator array (as with the original Hough Transform), we compute the line integral over each sinusoid that corresponds to the current edge point. If a sinusoid passes through two or more peaks, we deposit that sum/integral into a …
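Our reading of the two-pass scheme can be sketched as follows: pass 1 builds a standard (rho, theta) accumulator from edge points; pass 2 retraces each point's sinusoid and counts the peaks it crosses, so that points whose sinusoid crosses two or more peaks flag line intersections. The edge points and threshold below are hypothetical, and this simplification counts peaks rather than computing the full line integral the paper describes:

```python
import math

# Pass 1: standard (rho, theta) Hough accumulation over edge points.
# Pass 2: retrace a point's sinusoid and count accumulator peaks it
# crosses; crossing >= 2 peaks marks a line intersection or junction.

THETAS = [math.radians(t) for t in range(0, 180, 5)]

def hough_accumulate(points):
    acc = {}
    for (x, y) in points:
        for i, th in enumerate(THETAS):
            rho = round(x * math.cos(th) + y * math.sin(th))
            acc[(rho, i)] = acc.get((rho, i), 0) + 1
    return acc

def junction_score(point, acc, peak_thresh):
    """Second pass for one point: count peaks its sinusoid crosses."""
    x, y = point
    peaks = 0
    for i, th in enumerate(THETAS):
        rho = round(x * math.cos(th) + y * math.sin(th))
        if acc.get((rho, i), 0) >= peak_thresh:
            peaks += 1
    return peaks

# Hypothetical edge map: y = x and y = -x + 10, intersecting at (5, 5).
line1 = [(t, t) for t in range(11)]
line2 = [(t, 10 - t) for t in range(11)]
acc = hough_accumulate(line1 + line2)

print(junction_score((5, 5), acc, peak_thresh=8))  # crosses both line peaks
print(junction_score((0, 0), acc, peak_thresh=8))  # endpoint: one peak only
```

The intersection point's sinusoid passes through the accumulator peaks of both lines, while a point on only one line crosses a single peak, which is the discrimination the paper builds on.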
Mobile-Agent Versus Client/Server Performance: Scalability In An Information-Retrieval Task, Robert S. Gray, David Kotz, Ronald A. Peterson, Joyce Barton, Daria Chacon, Peter Gerken, Martin Hofmann, Jeffrey Bradshaw, Maggie Breedy, Renia Jeffers, Niranjan Suri
Dartmouth Scholarship
Building applications with mobile agents often reduces the bandwidth required for the application, and improves performance. The cost is increased server workload. There are, however, few studies of the scalability of mobile-agent systems. We present scalability experiments that compare four mobile-agent platforms with a traditional client/server approach. The four mobile-agent platforms have similar behavior, but their absolute performance varies with underlying implementation choices. Our experiments demonstrate the complex interaction between environmental, application, and system parameters.
Image Magnification Using Level-Set Reconstruction, Bryan S. Morse, Duane Schwartzwald
Faculty Publications
Image magnification is a common problem in imaging applications, requiring interpolation to “read between the pixels”. Although many magnification/interpolation algorithms have been proposed in the literature, all methods must suffer to some degree the effects of imperfect reconstruction: false high-frequency content introduced by the underlying original sampling. Most often, these effects manifest themselves as jagged contours in the image. This paper presents a method for constrained smoothing of such artifacts that attempts to produce smooth reconstructions of the image’s level curves while still maintaining image fidelity. This is similar to other iterative reconstruction algorithms and to Bayesian restoration techniques, but instead …
A Confidence Measure For Boundary Detection And Object Selection, William A. Barrett, Eric N. Mortensen
Faculty Publications
We introduce a confidence measure that estimates the assurance that a graph arc (or edge) corresponds to an object boundary in an image. A weighted, planar graph is imposed onto the watershed lines of a gradient magnitude image and the confidence measure is a function of the cost of fixed-length paths emanating from and extending to each end of a graph arc. The confidence measure is applied to automate the detection of object boundaries and thereby reduces (often greatly) the time and effort required for object boundary definition within a user-guided image segmentation environment.
Fast Focal Length Solution In Partial Panoramic Image Stitching, William A. Barrett, Kirk L. Duffin
Faculty Publications
Accurate estimation of effective camera focal length is crucial to the success of panoramic image stitching. Fast techniques for estimating the focal length exist, but are dependent upon a close initial approximation or the existence of a full circle panoramic image sequence. Numerical solutions of the focal length demonstrate strong coupling between the focal length and the angles used to position each component image about the common spherical center. This paper demonstrates that parameterizing panoramic image positions using spherical arc length instead of angles effectively decouples the focal length from the image position. This new parameterization does not require an …
Development Of A Systems Engineering Model Of The Chemical Separations Process: Quarterly Progress Report 8/16/01- 11/15/01, Yitung Chen, Randy Clarksean, Darrell Pepper
Separations Campaign (TRP)
The AAA program is developing technology for the transmutation of nuclear waste to address many of the long-term disposal issues. An integral part of this program is the proposed chemical separations scheme.
Two activities are proposed in this Phase I task: the development of a systems engineering model and the refinement of the Argonne code AMUSE (Argonne Model for Universal Solvent Extraction). The detailed systems engineering model is the start of an integrated approach to the analysis of the materials separations associated with the AAA Program. A second portion of the project is to streamline and improve an integral part …
Oral History Interview With William Mcgee, Philip L. Frana
Philip L Frana
William McGee is a retired senior programmer at the IBM Santa Teresa Laboratory. McGee received the AB degree in physics from the University of California at Berkeley in 1949, and the MA degree in physics from Columbia University in 1951. From 1951 to 1959, McGee managed the numerical analysis unit at the General Electric Hanford Atomic Products Operation in Richland, Washington. Between 1959 and 1964 he led systems programming and research at Ramo Wooldridge Corporation in Canoga Park, California. McGee joined ...
The Vacuum Buffer, Voicu Popescu
Link Foundation Modeling, Simulation and Training Fellowship Reports
Image-based rendering (IBR) techniques have the potential of alleviating some of the bottlenecks of traditional geometry-based rendering such as modeling difficulty and prohibitive cost of photorealism. One of the most appealing IBR approaches uses images enhanced with per-pixel depth and creates new views by 3D warping (IBRW). Modeling a scene with depth images lets one automatically capture intricate details, which are hard to model conventionally. Also, rendering from such representations has the potential of being efficient since it seems that the number of samples that need to be warped is independent of the scene complexity and is just a fraction …
Wright State University College Of Engineering And Computer Science Bits And Pcs Newsletter, Volume 18, Number 3, November 2001, College Of Engineering And Computer Science, Wright State University
BITs and PCs Newsletter
An eight page newsletter created by the Wright State University College of Engineering and Computer Science that addresses the current affairs of the college.
Developing Haptic And Visual Perceptual Categories For Reaching And Grasping With A Humanoid Robot, Jefferson Coelho, Justus Piater, Roderic Grupen
Roderic Grupen
Properties of the human embodiment sensorimotor apparatus and neurological structure participate directly in the growth and development of cognitive processes against enormous worst case complexity. It is our position that relationships between morphology and perception over time lead to increasingly comprehensive models that describe the agent's relationship to the world. We are applying insight derived from neuroscience, neurology, and developmental psychology to the design of advanced robot architectures. To investigate developmental processes, we have begun to approximate the human sensorimotor configuration and to engage sensory and motor subsystems in developmental sequences. Many such sequences have been documented in studies of infant …
Parameter Synthesis Of Higher Kinematic Planars, Min-Ho Kyung, Elisha Sacks
Department of Computer Science Technical Reports
No abstract provided.
A Round Trip Time And Timeout Aware Traffic Conditioner For Differentiated Services Networks, Ahsan Habib, Bharat Bhargava, Sonia Fahmy
Department of Computer Science Technical Reports
No abstract provided.
Knowledge Discovery In Biological Datasets Using A Hybrid Bayes Classifier/Evolutionary Algorithm, Michael L. Raymer, Leslie A. Kuhn, William F. Punch
Kno.e.sis Publications
A key element of bioinformatics research is the extraction of meaningful information from large experimental data sets. Various approaches, including statistical and graph theoretical methods, data mining, and computational pattern recognition, have been applied to this task with varying degrees of success. We have previously shown that a genetic algorithm coupled with a k-nearest-neighbors classifier performs well in extracting information about protein-water binding from X-ray crystallographic protein structure data. Using a novel classifier based on the Bayes discriminant function, we present a hybrid algorithm that employs feature selection and extraction to isolate salient features from large biological data sets. The …
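The k-nearest-neighbors component mentioned above can be sketched on hypothetical 2-D feature vectors; the genetic-algorithm wrapper the authors use to select and weight features is represented here only by an optional weight vector:

```python
import math
from collections import Counter

# k-nearest-neighbors sketch with optional per-feature weights (the
# knob a genetic algorithm could tune). Data and labels are hypothetical
# stand-ins for protein-site feature vectors, not the paper's data.

def knn_predict(query, data, labels, k=3, weights=None):
    """Majority vote among the k nearest (weighted-Euclidean) neighbors."""
    w = weights or [1.0] * len(query)
    dists = [
        (math.sqrt(sum(wi * (a - b) ** 2 for wi, a, b in zip(w, x, query))), y)
        for x, y in zip(data, labels)
    ]
    dists.sort(key=lambda t: t[0])
    vote = Counter(y for _, y in dists[:k])
    return vote.most_common(1)[0][0]

# Hypothetical: two clusters of 2-D feature vectors.
data = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
        (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
labels = ["bound", "bound", "bound", "free", "free", "free"]

print(knn_predict((0.2, 0.2), data, labels))   # near the "bound" cluster
print(knn_predict((0.9, 0.9), data, labels))   # near the "free" cluster
```

A feature-selection wrapper such as the authors' genetic algorithm would search over the `weights` vector, scoring each candidate by the classifier's cross-validated accuracy.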
Profile Combinatorics For Fragment Selection In Comparative Protein Structure Modeling, Deacon Sweeney, Travis E. Doom, Michael L. Raymer
Kno.e.sis Publications
Sequencing of the human genome was a great stride towards modeling cellular complexes, massive systems whose key players are proteins and DNA. A major bottleneck limiting the modeling process is structure and function annotation for the new genes. Contemporary protein structure prediction algorithms represent the sequence of every protein of known structure with a profile to which the profile of a protein sequence of unknown structure is compared for recognition. We propose a novel approach to increase the scope and resolution of protein structure profiles. Our technique locates equivalent regions among the members of a structurally similar fold family, and …