Master's theses
Permanent URI for this community: https://laurentian.scholaris.ca/handle/10219/2095
Browsing Master's theses by Issue Date
Now showing 1 - 20 of 112
Item: Generating random shapes for Monte Carlo accuracy testing of pairwise comparisons (Laurentian University of Sudbury, 2013-10-08). Almowanes, Abdullah.
This thesis shows highly encouraging results: the gain in accuracy reached 18.4% when the pairwise comparisons method was used instead of the direct method for comparing random shapes. The thesis describes a heuristic for generating random but nice shapes, called placated shapes. Random but visually nice shapes are often needed for cognitive experiments and processes. These shapes are produced by applying a Gaussian blur to randomly generated polygons. Afterwards, a threshold is applied to transform the pixels from different shades of gray to black and white. This transformation produces placated shapes whose areas are easier to estimate. Randomly generated placated shapes are used in a Monte Carlo experiment to test the accuracy of cognitive processes that use pairwise comparisons. An online questionnaire was implemented in which participants were asked to estimate the areas of five shapes using a provided unit of measure; they were also asked to compare the shapes in pairs. Such a Monte Carlo experiment had never before been conducted for the 2D case. The results obtained are of considerable importance.
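As an informal illustration of the blur-and-threshold idea described in this abstract, the following Python sketch rasterizes a random polygon, applies a Gaussian blur, and thresholds the result back to black and white. The image size, blur width, and threshold are arbitrary assumptions for the example, not values taken from the thesis.

```python
import numpy as np
from PIL import Image, ImageDraw
from scipy.ndimage import gaussian_filter

def placated_shape(n_vertices=8, size=256, sigma=8.0, seed=None):
    """Rasterize a random polygon, blur it, and threshold it back to black/white."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(0.15 * size, 0.85 * size, size=(n_vertices, 2))
    d = pts - pts.mean(axis=0)
    order = np.argsort(np.arctan2(d[:, 1], d[:, 0]))   # sort by angle -> simple polygon
    pts = [tuple(p) for p in pts[order]]
    img = Image.new("L", (size, size), 0)
    ImageDraw.Draw(img).polygon(pts, fill=255)
    blurred = gaussian_filter(np.asarray(img, dtype=float), sigma=sigma)
    return blurred > 127.5                              # boolean mask: the "placated" shape

shape = placated_shape(seed=42)
print("area in pixels:", int(shape.sum()))
```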
Item: Three dimensional reconstruction of objects based on digital fringe projection (Laurentian University of Sudbury, 2013-10-09). Talebi, Reza.
Three-dimensional reconstruction of small objects has been one of the most challenging problems over the last decade. Computer graphics researchers and photography professionals have been working on improving 3D reconstruction algorithms to fit the high demands of various real-life applications. In this thesis, we implemented a 3D scanner system based on the fringe projection method. Two different methods were implemented and used as the unwrapping solution in the fringe projection method, and a parameterization tool was created to generate different fringe patterns for the distinctive needs of the method. In our first practical implementation (based on phase shifting and multi-wavelength techniques), the number of pictures used in the phase shifting method was decreased, and the effects of reducing the fringe patterns on the precision of the 3D model were investigated. The optical arrangement and calibration of the system were studied, and numerous suggestions were proposed to improve the precision of the system. An evaluation method was also implemented based on calibration techniques, and the error rates of both the surface and the height of the 3D model, compared with the original object, were calculated.

Item: Practical approaches to complex role assignment problems in role-based collaboration (Laurentian University of Sudbury, 2013-10-09). Feng, Luming.
Group role assignment (GRA) is an important task in Role-Based Collaboration (RBC). The complexity of group role assignment becomes very high as constraints are introduced. According to recent studies, considerable effort has been put into research on complex group role assignment problems. Some of these problems are clearly defined and initial solutions have been proposed; however, some of these solutions cannot guarantee an optimal result, or their time complexity is very high. In fact, many real-world collaboration problems involve many types of constraints. Therefore, to make them practical, the accuracy and efficiency of the algorithms should be improved.
Roles are the center of a role-based collaboration mechanism and play an essential part in the whole process of a collaboration system; without roles, there would be no collaboration. One important function of a role is that it defines the features or requirements of a position, which can be used to filter or assess candidates. The definition of roles greatly influences the evaluation results of candidates, which in turn influence the RBC algorithms significantly. Based on previous research, role-based evaluation is associated with multiple attribute decision making (MADM), and role-based evaluation methods can be adopted from MADM methods. Selecting an appropriate method for a specific problem is difficult and domain oriented; therefore, a dynamic evaluation model that can be expanded by domain experts and adapted to many cases is required, and at present there is limited research related to this requirement. This thesis first focuses on two complex role-based collaboration problems: the first is the group role assignment problem with constraints of conflicting agents, and the second is an agent training problem for a sustainable group. Practical solutions to these problems are proposed and solved with IBM ILOG CPLEX. Simulations are conducted to demonstrate the performance of these solutions, from which I compare their performance with the initial solutions and indicate the improvements achieved. Secondly, this thesis clarifies the difficulties of connecting evaluation methods with real-world requirements. To overcome these difficulties, I introduce an additional parameter, propose a dynamic evaluation model, and provide four synthesis methods to meet the requirements of a co-operation project funded by NSERC (Natural Sciences and Engineering Research Council of Canada). The contributions of this thesis include: clarifying the complexity of two complex role-based collaboration problems; proposing better solutions and verifying their efficiency and practicability; discussing the difficulties of connecting evaluation methods with real-world problems; introducing an additional parameter to improve the accuracy of evaluation for some problems; and proposing a role-based evaluation model that meets the requirements of adaptability and expandability.
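The group role assignment problem with conflicting agents described above can be written as a small integer program. The sketch below uses the open-source PuLP modeler rather than the IBM ILOG CPLEX solver used in the thesis, the data are invented, and the conflict constraint is interpreted here as "two conflicting agents may not fill the same role"; all of these are assumptions made only for illustration.

```python
import pulp

# Hypothetical data: qualification matrix Q[agent][role], role requirements L[role],
# and pairs of conflicting agents (illustrative values, not from the thesis).
Q = [[0.8, 0.4, 0.6],
     [0.5, 0.9, 0.3],
     [0.7, 0.6, 0.8],
     [0.2, 0.7, 0.5]]
L = [1, 1, 2]                 # number of agents required by each role
conflicts = [(0, 2)]          # agents 0 and 2 must not share a role

agents, roles = range(len(Q)), range(len(L))
x = pulp.LpVariable.dicts("x", (agents, roles), cat="Binary")

prob = pulp.LpProblem("group_role_assignment", pulp.LpMaximize)
prob += pulp.lpSum(Q[i][j] * x[i][j] for i in agents for j in roles)
for j in roles:                                   # fill every role exactly
    prob += pulp.lpSum(x[i][j] for i in agents) == L[j]
for i in agents:                                  # each agent takes at most one role
    prob += pulp.lpSum(x[i][j] for j in roles) <= 1
for (a, b) in conflicts:                          # keep conflicting agents apart
    for j in roles:
        prob += x[a][j] + x[b][j] <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([(i, j) for i in agents for j in roles if x[i][j].value() > 0.5])
```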
Item: A self-learning audio player that uses a rough set and neural net hybrid approach (Laurentian University of Sudbury, 2013-10-16). Zuo, Hongming.
A self-learning audio player was built to learn a user's habits by analyzing the operations the user performs when listening to music. The self-learning component is intended to provide a better music experience for the user by generating a special playlist based on the prediction of the user's favorite songs. Rough set core characteristics are used throughout the learning process to capture the dynamics of changing user interactions with the audio player. The engine is evaluated with simulation data, where the simulation process ensures the data contain specific predetermined patterns. Evaluation results show the predictive power and stability of the hybrid engine for learning a user's habits, and the increased intelligence achieved by combining rough sets and a neural network (NN) when compared with using a NN by itself.

Item: Application of advanced diagonalization methods to quantum spin systems (Laurentian University of Sudbury, 2014-05-13). Wang, Jieyu.
Quantum spin models play an important role in theoretical condensed matter physics and quantum information theory. One numerical technique that is frequently used in studies of quantum spin systems is exact diagonalization, in which numerical methods are used to find the lowest eigenvalues and associated eigenvectors of the Hamiltonian matrix of the quantum system. The computational problem is thus to determine the lowest eigenpairs of an extremely large, sparse matrix. Although many sophisticated iterative techniques for the determination of a small number of lowest eigenpairs can be found in the literature, most exact diagonalization studies of quantum spin systems have employed the Lanczos algorithm. In contrast, other methods have been applied very successfully to the similar problem of electronic structure calculations. The well-known VASP code, for example, uses a Block Davidson method as well as the residual minimization method with direct inversion in the iterative subspace (RMM-DIIS). The Davidson algorithm is closely related to the Lanczos method but usually needs fewer iterations. The RMM-DIIS method was originally proposed by Pulay and later modified by Wood and Zunger; it is particularly interesting if more than one eigenpair is sought, since it does not require orthogonalization of the trial vectors at each step. In this work I study the efficiency of the Lanczos, Block Davidson and RMM-DIIS methods when applied to basic quantum spin models like the spin-1/2 Heisenberg chain, ladder and dimerized ladder. I have implemented all three methods and applied them to the different models, and I compare the three algorithms based on the number of iterations needed to achieve convergence and the required computational time. An Intel Many Integrated Core system with an Intel Xeon Phi 5110P coprocessor, which integrates 60 cores with 4 hardware threads per core, was used for the RMM-DIIS method, and the achieved parallel speedups were compared with those obtained on a conventional multi-core system.
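To make the exact-diagonalization setting concrete, here is a minimal sketch that builds the sparse Hamiltonian of a small open spin-1/2 Heisenberg chain and asks SciPy's eigsh (an implicitly restarted Lanczos solver) for the lowest eigenpairs. The chain length and coupling are arbitrary choices, and this says nothing about the Block Davidson or RMM-DIIS implementations studied in the thesis.

```python
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# Single-site spin-1/2 operators S^z, S^+, S^- and the 2x2 identity
sz = sp.csr_matrix([[0.5, 0.0], [0.0, -0.5]])
splus = sp.csr_matrix([[0.0, 1.0], [0.0, 0.0]])
sminus = sp.csr_matrix([[0.0, 0.0], [1.0, 0.0]])
one = sp.identity(2, format="csr")

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site chain via Kronecker products."""
    out = None
    for k in range(n):
        m = op if k == i else one
        out = m if out is None else sp.kron(out, m, format="csr")
    return out

def heisenberg_chain(n, J=1.0):
    """H = J * sum_i [ Sz_i Sz_{i+1} + (S+_i S-_{i+1} + S-_i S+_{i+1}) / 2 ], open chain."""
    H = sp.csr_matrix((2 ** n, 2 ** n))
    for i in range(n - 1):
        H = H + J * (site_op(sz, i, n) @ site_op(sz, i + 1, n)
                     + 0.5 * (site_op(splus, i, n) @ site_op(sminus, i + 1, n)
                              + site_op(sminus, i, n) @ site_op(splus, i + 1, n)))
    return H

H = heisenberg_chain(10)
values, vectors = eigsh(H, k=3, which="SA")   # three lowest eigenpairs via Lanczos
print(values)
```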
Item: Credibility modeling with applications (Laurentian University of Sudbury, 2014-05-16). Khapaeva, Tatiana.
The purpose of this thesis is to show how the theory and practice of credibility can benefit statistical modeling. The task was, fundamentally, to derive models that could provide the best estimate of the losses for any given class and also to assess the variability of the losses, both from a class perspective and from an aggregate perspective. The model fitting and diagnostic tests are carried out using standard statistical packages. A case study that predicts the number of deaths due to cancer is considered, utilizing data furnished by the Colorado Department of Public Health and Environment. Several credibility models are used, including Bayesian, Bühlmann and Bühlmann-Straub approaches, which are useful in a wide range of actuarial applications.
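For readers unfamiliar with credibility theory, the following sketch computes the classical Bühlmann credibility factor Z = n/(n + k) and the resulting blended estimates from the standard nonparametric estimators. The numbers are invented and are not the Colorado cancer data used in the thesis.

```python
import numpy as np

def buhlmann_credibility(X):
    """X: r x n array of losses for r classes over n periods (equal-size Buhlmann model)."""
    r, n = X.shape
    class_means = X.mean(axis=1)
    grand_mean = X.mean()
    v_hat = X.var(axis=1, ddof=1).mean()           # estimate of expected process variance
    a_hat = class_means.var(ddof=1) - v_hat / n    # estimate of variance of hypothetical means
    k = v_hat / a_hat
    Z = n / (n + k)                                # credibility factor
    premiums = Z * class_means + (1 - Z) * grand_mean
    return Z, premiums

# Illustrative data only: 3 classes observed over 4 periods
X = np.array([[12., 15., 11., 14.],
              [22., 19., 25., 23.],
              [ 8., 10.,  9., 11.]])
Z, premiums = buhlmann_credibility(X)
print(round(Z, 3), premiums.round(2))
```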
Item: A comparative study of D2L's performance with a purpose-built e-learning user interface for visual- and hearing-impaired students (2014-08-29). Farhan, Wejdan.
An e-learning system in an academic setting is an efficient tool for all students, especially for students with physical impairments. This thesis discusses an e-learning system through the design and development of an e-learning user interface for students with visual and hearing impairment. The tools and features in the user interface required to make the learning process easy and effective for students with such disabilities are presented. Further, an integration framework is proposed to integrate the new tools and features into the existing e-learning system Desire-To-Learn (D2L). The tools and features added to the user interface were tested by selected visually- and hearing-impaired participants from Laurentian University's student population. Two questionnaires were filled out to assess the usability of both the D2L e-learning user interface at Laurentian University and the new e-learning user interface designed for students with visual and hearing impairment. After collecting and analyzing the data, the results for usability factors such as effectiveness, ease of use, and accessibility showed that the participants were not completely satisfied with the existing D2L e-learning system but were satisfied with the proposed new user interface. The results also showed that the tools and features proposed for students with visual and hearing impairment can be integrated into the existing D2L e-learning system.

Item: New computational approaches for the transportation models (Laurentian University of Sudbury, 2014-09-11). Almaatani, Dalia Essa.
The Transportation model (TP) is one of the oldest practical problems in mathematical programming. This model and its relevant extensions play important roles in Operations Research for finding optimal solutions to several planning problems in business and industry. Several methods have been developed to solve these models, the best known of which is Vogel's Approximation Method (VAM). A modified version of VAM is proposed to obtain near-optimal solutions, or the optimum in some defined cases. The Modified Vogel Method (MVM) consists of iteratively constructing a reduced cost matrix before applying VAM. Besides MVM, another approach has been developed, the Zero Case Penalty, which takes a different approach to computing the penalties. Throughout the thesis, the results of method-comparison studies and comparative analyses are presented. Furthermore, special classes, the unbalanced TP and the transshipment models, were studied and solved with different approaches. Additionally, we provide an application of MVM to the Traveling Salesman Problem.
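Since the thesis builds on Vogel's Approximation Method, a compact sketch of classical VAM may help; the Modified Vogel Method and Zero Case Penalty variants are not reproduced here. The cost, supply, and demand values are a generic textbook-style example, not data from the thesis.

```python
import numpy as np

def vogel(cost, supply, demand):
    """Vogel's Approximation Method for a balanced transportation problem.
    cost: m x n matrix; supply: length m; demand: length n. Returns an allocation matrix."""
    cost = np.asarray(cost, dtype=float)
    supply, demand = list(supply), list(demand)
    m, n = cost.shape
    alloc = np.zeros((m, n))
    active_rows, active_cols = set(range(m)), set(range(n))

    def penalty(values):
        vals = sorted(values)
        return vals[1] - vals[0] if len(vals) > 1 else vals[0]

    while active_rows and active_cols:
        # penalty = difference between the two smallest remaining costs in a row/column
        row_pen = {i: penalty([cost[i, j] for j in active_cols]) for i in active_rows}
        col_pen = {j: penalty([cost[i, j] for i in active_rows]) for j in active_cols}
        i_best = max(row_pen, key=row_pen.get)
        j_best = max(col_pen, key=col_pen.get)
        if row_pen[i_best] >= col_pen[j_best]:
            i = i_best
            j = min(active_cols, key=lambda c: cost[i, c])
        else:
            j = j_best
            i = min(active_rows, key=lambda r: cost[r, j])
        q = min(supply[i], demand[j])              # allocate as much as possible
        alloc[i, j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            active_rows.discard(i)
        if demand[j] == 0:
            active_cols.discard(j)
    return alloc

cost = [[19, 30, 50, 10], [70, 30, 40, 60], [40, 8, 70, 20]]
alloc = vogel(cost, supply=[7, 9, 18], demand=[5, 8, 7, 14])
print(alloc, (alloc * np.array(cost)).sum())
```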
Item: Computer-interpretable guidelines using GLIF with Windows Workflow Foundation (Laurentian University of Sudbury, 2014-09-22). Minor, Ryan.
Modern medicine increasingly uses evidence based medicine (EBM), which has become an integral part of medical training and, ultimately, of practice. Davis et al. [6] describe the "clinical care gap", where actual day-to-day clinical practice differs from EBM, leading to poor outcomes. This thesis researches the GLIF specification and implements the foundation for a GLIF-based guideline system using Windows Workflow Foundation 4.0. No public-domain, computer-implementable guideline system currently exists. The guideline system developed allows a guideline implementer to create a guideline visually using certain medical-related tasks, and to test and debug it before implementation. Chapter 5 of this thesis shows how to implement a guideline, the Group A Streptococcal Disease Surveillance Protocol for Ontario Hospitals, which is of fundamental importance for Ontario hospitals. The workflow approach allows developers to create custom tasks should the need arise. The Workflow Foundation provides a powerful set of base classes to implement clinical guidelines.

Item: Finding patterns in student and medical office data using rough sets (Laurentian University of Sudbury, 2014-10-08). Alenezi, Anwar.
Data were obtained from King Khaled General Hospital in Saudi Arabia. In this project, I try to discover patterns in these data by using the algorithms implemented in an experimental tool called Rough Set Graphic User Interface (RSGUI). Several algorithms are available in RSGUI, each of which is based on rough set theory. My objective is to find short, meaningful predictive rules. First, we need to find a minimum set of attributes that fully characterizes the data. Some of the rules generated from this minimum set are obvious, and therefore uninteresting; others are surprising, and therefore interesting. The usual measures of the strength of a rule, such as the length of the rule, certainty, and coverage, were considered. In addition, a measure of the interestingness of the rules was developed based on questionnaires administered to human subjects. There were bugs in the RSGUI Java code; in particular, the Inductive Learning Algorithm (ILA) missed some cases that were subsequently resolved in ILA2 but not updated in RSGUI. I fixed the ILA issue in RSGUI, so ILA now runs well and gives good results for all cases encountered in the hospital administration and student records data.

Item: Condition monitoring of a fan using neural networks (Laurentian University of Sudbury, 2015-02-24). Zhang, Bo.

Item: Classification approaches for microarray gene expression data analysis (2015-03-13). Almoeirfi, Makkeyah.
Microarray technology is among the vital technological advancements in bioinformatics. Microarray data are usually noisy and high-dimensional, so carefully prepared data are a requirement for microarray data analysis. Classification of biological samples is the most commonly performed analysis on microarray data. This study focuses on determining the confidence level used for classifying a sample of an unknown gene based on microarray data. A support vector machine (SVM) classifier was applied, and the results were compared with other classifiers, including k-nearest neighbor (KNN) and neural network (NN). Four microarray datasets were used in the research: a leukemia dataset, a prostate dataset, a colon dataset, and a breast dataset. Additionally, the study analyzed two different SVM kernels, the radial kernel and the linear kernel. The analysis was conducted by varying the percentages of the dataset distribution between training and test sets in order to ensure that the best split of the data provided the best results. The 10-fold cross-validation method (LOOCV) and the L1 and L2 regularization techniques were used to address over-fitting as well as feature selection in classification. The ROC curve and a confusion matrix were applied in performance assessment. The k-nearest neighbor and neural network classifiers were trained with the same sets of data and the results were compared. The results showed that SVM exceeded the other classifiers in performance and accuracy: for each dataset, the support vector machine with the linear kernel was the best-performing method, since it yielded better results than the other methods. The highest accuracy on the colon data was 83% with the SVM classifier, while the accuracy of NN on the same data was 77% and of KNN 72%. The leukemia data had the highest accuracy of 97% with SVM, 85% with NN, and 91% with KNN. For the breast data, the highest accuracy was 73% with SVM-L2, while the accuracy was 56% with NN and 47% with KNN. Finally, the highest accuracy on the prostate data was 80% with SVM-L1, while the accuracy was 75% with NN and 66% with KNN. SVM showed the highest accuracy, as well as the largest area under the curve, compared with k-nearest neighbor and neural network across the different tests.
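A minimal sketch of the kind of pipeline this abstract describes: a linear-kernel SVM and a KNN baseline evaluated with 10-fold cross-validation on a synthetic stand-in for a microarray matrix. The data, split, and parameters are illustrative assumptions, not the leukemia, prostate, colon, or breast datasets analyzed in the thesis.

```python
import numpy as np
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a microarray matrix: 60 samples x 2000 genes, 2 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))
y = rng.integers(0, 2, size=60)
X[y == 1, :10] += 1.5          # make a few genes informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

svm = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
# an L1-regularized linear variant could use LinearSVC(penalty="l1", dual=False)

for name, clf in [("SVM-linear", svm), ("KNN", knn)]:
    scores = cross_val_score(clf, X_tr, y_tr, cv=10)        # 10-fold cross-validation
    clf.fit(X_tr, y_tr)
    print(name, scores.mean().round(3), clf.score(X_te, y_te).round(3))
```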
Item: Educational data mining using fuzzy sets to facilitate usability and user experience - an approach to integrate artificial intelligence and human-computer interaction (2015-03-27). Jahan, Sheikh Shushmita.
Artificial Intelligence (AI) and Human-Computer Interaction (HCI) share the common goal of enhancing the effectiveness of a system and making it easier for people to use. AI accomplishes this by demonstrating intelligent behavior on a machine, whereas HCI involves the design approach required to obtain usability and good user experience. This study integrates AI and HCI techniques in a real-world application, complementing the aims of each field. A web-based system was developed for a school board in Eastern Canada by following the user-centered approach of HCI. In the course of designing a good interface, it was found that fuzzy inference, from AI, was going on in users' minds when they formed conceptual models to understand the application. The interface was evaluated by applying heuristic evaluation, cognitive walkthroughs, and user feedback. It was shown that usability and user experience can be improved by employing fuzzy set techniques; therefore, fuzzy set modeling can serve as a user-centered method for HCI design. Furthermore, the data gathering techniques of HCI helped to define the cognitive processes that could be replicated with the aid of fuzzy sets.

Item: Testing the reliability of predictive models on three different devices (2015-04-07). Alsalmi, Basim.
Nowadays, websites on the Internet, in all their categories, are the first source of information. Some users find certain websites' layout and design not always user friendly, which can lead to poor performance, ambiguity, and the loss of potential customers. Website designs use the principles of user interface (UI) laws and usability guidelines to be consistent and convenient for users. Owing to the evolution of technology, society now uses various devices with a wide range of features, such as screen size, to obtain information; examples include personal computers (PC), smartphones (SP), and tablets (TB). Since user interface laws were formulated in the 20th century and most were applicable to PC websites, this thesis tests and investigates the reliability of predictive models on three devices: PC, SP, and TB. User interfaces were designed with five tasks, each task representing one of the user interface laws. Human-Computer Interaction (HCI) techniques, user interface design, and evaluation methods were followed to test the reliability of the predictive models on the different devices.

Item: Survival analysis approaches for prostate cancer (2015-04-15). Alhasawi, Eman.
Survival time became an essential outcome of clinical trials during the latter half of the 20th century. The present study carries out a survival analysis for patients with prostate cancer. The data were obtained from Memorial Sloan Kettering, where each sample was collected from recipients of radical prostatectomy. The Kaplan-Meier method was used to estimate the survival function and median survival time for the primary and metastatic prostate tumor groups. Results showed that the metastatic tumors have a poorer survival rate than the primary tumors, which suggests that patients with a primary tumor have a higher probability of surviving. The log-rank test was used to test the difference in the survival curves; the difference in survival between the patients in the two tumor groups was significant, with a p-value of 4.44e-15. The second approach was based on the Cox proportional hazards model and parametric models, with residual-based criteria used to judge the goodness of fit of the candidate models. The Cox proportional hazards (PH) model quantifies the effect of covariates on the hazard function; from the Cox PH model, the influence of standard clinical prognostic factors on the hazard rate of prostate cancer patients was assessed. These prognostic factors are prostate-specific antigen (PSA) level at diagnosis, tumor size, secondary Gleason grade, and Gleason score, which is helpful in determining treatment. The Gleason score [HR 4.835, 95% CI 2.7847-8.3937, p=2.20E-08] was the most significant progression-associated prognosticator and proved to be an effective criterion related to death from prostate cancer. The Accelerated Failure Time (AFT) model was applied to the data with four distributions, and the AFT model with the Weibull distribution was chosen as the best model for our data based on the AIC.
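To ground the survival-analysis workflow, here is a hedged sketch using the lifelines package: Kaplan-Meier curves per group, a log-rank test, and a Cox proportional hazards fit. The synthetic cohort below merely imitates the structure of the data; none of the variables or numbers correspond to the Memorial Sloan Kettering data or to the thesis results.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Synthetic stand-in cohort: months to event, event indicator, tumor group, PSA level.
rng = np.random.default_rng(1)
n = 200
group = rng.integers(0, 2, n)                      # 0 = primary, 1 = metastatic
time = rng.exponential(scale=np.where(group == 0, 80, 30))
event = (rng.uniform(size=n) < 0.7).astype(int)
df = pd.DataFrame({"time": time, "event": event, "metastatic": group,
                   "psa": rng.normal(10 + 5 * group, 3, n)})

kmf = KaplanMeierFitter()
for g, label in [(0, "primary"), (1, "metastatic")]:
    sub = df[df.metastatic == g]
    kmf.fit(sub.time, event_observed=sub.event, label=label)
    print(label, "median survival:", kmf.median_survival_time_)

res = logrank_test(df.time[df.metastatic == 0], df.time[df.metastatic == 1],
                   event_observed_A=df.event[df.metastatic == 0],
                   event_observed_B=df.event[df.metastatic == 1])
print("log-rank p-value:", res.p_value)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])             # hazard ratios and p-values
```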
Item: On Supporting Group Decision Making by a Hybrid Method of the Method of Pairwise Comparisons and the Delphi Method (Laurentian University of Sudbury, 2015-05-28). Duncan, Grant G. O.
Ranking and prioritization are important endeavors that take place in a multitude of industrial, commercial, and professional domains. The Delphi method is a systematic procedure for aggregating experts' opinions, particularly those predicated on qualitative or otherwise judgmental information. The method of pairwise comparisons provides an elegant means by which criteria or alternatives can be compared, ranked, and quantified, and it provides methods for the identification and resolution of inconsistencies. This thesis examines and proposes a means by which the two methods can be combined, in order to provide an easy-to-use system (realized as a reference implementation) for ranking and prioritizing alternatives that is broadly applicable to a wide range of ranking and prioritization problems.

Item: Somewhat homomorphic encryption scheme for secure range query process in a cloud environment (2015-06-04). Wei, Shaobo.
With the development of cloud computing, many service models based on the cloud have appeared, such as infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and database as a service (DaaS). DaaS in particular faces many security issues and cannot yet be fully secured; this research area of cloud computing is called cloud security. One of the problems is that it is difficult to execute queries on encrypted data in a cloud database without any information leakage. This thesis proposes a secure range query process, based on a somewhat homomorphic encryption scheme, to improve secure database functionality. There is no sensitive information leakage in the secure range query process. The data stored in the cloud database are integers encrypted bit by bit in their binary form, and a homomorphic "greater-than" algorithm is used in the process to compare two integers. Efficiency, security, and the maximum noise that can be controlled in the process are covered in the security and efficiency analysis, and the parameter settings of the process are also discussed. Results of the proposed method were analyzed through experiments that test the practicability of the secure range query process under relatively practical parameter settings.
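The bitwise comparison at the heart of such a scheme can be sketched in the clear: the function below evaluates "a > b" using only XOR and AND gates, the operations a somewhat homomorphic scheme can perform on encrypted bits. This is a plaintext model of the comparison circuit only, under that assumption; it is not the encryption scheme or the exact algorithm from the thesis.

```python
# XOR plays the role of homomorphic addition and AND of homomorphic multiplication.

def to_bits(x, width):
    """Binary expansion, most-significant bit first."""
    return [(x >> (width - 1 - i)) & 1 for i in range(width)]

def greater_than(a_bits, b_bits):
    """Return 1 if a > b, using only XOR and AND gates over bits."""
    gt = 0
    prefix_equal = 1                      # all more-significant bits equal so far
    for a, b in zip(a_bits, b_bits):
        diff = a & (b ^ 1)                # a_i = 1 and b_i = 0 at this position
        gt ^= prefix_equal & diff         # terms are mutually exclusive, so XOR acts as OR
        prefix_equal &= (a ^ b ^ 1)       # still equal after this bit?
    return gt

width = 8
for a, b in [(45, 17), (17, 45), (99, 99)]:
    print(a, b, greater_than(to_bits(a, width), to_bits(b, width)))
```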
Item: The design and evaluation of novel prototypes to visualize web browsing history (Laurentian University of Sudbury, 2015-08-10). Makkena, Anudeep.
Mainstream web browsers support users in revisiting web pages by providing them with a history tool. Research shows that this history tool is severely underutilized. One possible reason is the manner in which the pages are displayed: a linear list of textual links. This thesis investigates the redesign of the history tool by introducing visualization to display the visited web pages. Three distinct visual prototypes were designed, ranging from a traditional scientific visualization method to a concrete visualization that incorporates a metaphor and knowledge transfer from the real world. The low-fidelity prototypes were evaluated by participants, and the best performing design was implemented as a high-fidelity prototype. Further evaluation with participants was conducted, and the results were compared against the performance of participants using the traditional history tool of linear textual links.

Item: Predicting Alzheimer's disease by segmenting and classifying 3D-brain MRI images using clustering technique and SVM classifiers (2015-08-31). Matoug, Sofia.
Alzheimer's disease (AD) is the most common form of dementia affecting seniors aged 65 and over. When AD is suspected, the diagnosis is usually confirmed with behavioural assessments and cognitive tests, often followed by a brain scan. Advanced medical imaging and pattern recognition techniques are good tools for creating a learning database in the first step and for predicting the class label of incoming data in order to assess the development of the disease, i.e., the conversion from the prodromal stage (mild cognitive impairment) to Alzheimer's disease. Advanced medical imaging such as volumetric MRI can detect changes in the size of brain regions due to the loss of brain tissue; measuring the regions that atrophy during the progression of Alzheimer's disease can help neurologists detect and stage the disease. In this thesis, we diagnose Alzheimer's disease from MRI images: we segment brain MRI images to extract the brain chambers, extract features from the segmented area, and train a classifier to differentiate between normal and AD brain tissues. We discuss an automatic scheme that reads volumetric MRI, extracts the middle slices of the brain region, performs 2-dimensional (volume slices) and volumetric segmentation methods in order to segment gray matter, white matter, and cerebrospinal fluid (CSF), generates a feature vector that characterizes this region, creates a database that contains the generated data, and finally classifies the images based on the extracted features. For our results, we used the MRI data sets from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. We assessed the performance of the classifiers by using results from the clinical tests.

Item: Estimating the win probability in a hockey game (2016-04-15). Yang, Shudan.
When a hockey game is being played, its data arrive continuously, so it is possible to use stream mining methods to estimate the win probability (WP) of a team once the game begins. Based on 8 seasons of NHL data from 2003-2014, we provide three methods to estimate the win probability in a hockey game. The first model is a statistics-based win probability calculation built from a summary of the historical data. The second model is based on data mining classification techniques: we applied several classification algorithms to our data, compared the results, and chose the best algorithm to build the win probability model; Naive Bayes, SVM, VFDT, and Random Tree classifiers were compared on the hockey dataset. The last model uses a stream mining technique and is a real-time prediction model that can be interpreted as a training-update-training model. Every 20 events in a hockey game are grouped into a window. We use the last window as the training data set to obtain decision tree rules for classifying the current window; a parameter calculated from the rules trained on these two windows then tells us which rule is better for training the next window. In our models the variables time, lead size, number of shots, number of misses, and number of penalties are combined to calculate the win probability. Our WP estimates can provide useful evaluations of plays, predictions of game results and, in some cases, guidance for coaching decisions.
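As a loose illustration of the windowed, train-then-update idea described above, the sketch below trains a small decision tree on the previous 20-event window and scores the current one on a synthetic event stream. The feature construction, labels, and window handling are assumptions made for the example, not the thesis's actual NHL features or evaluation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a play-by-play stream: each event carries a game state
# (time remaining, lead size, shot count) plus a noisy "home team wins" label.
rng = np.random.default_rng(2)
n_events = 400
time_left = np.linspace(60, 0, n_events)
lead = np.cumsum(rng.choice([-1, 0, 0, 0, 1], size=n_events))
shots = np.cumsum(rng.random(n_events) < 0.3)
X = np.column_stack([time_left, lead, shots])
y = (lead + rng.normal(0, 1, n_events) > 0).astype(int)

WINDOW = 20
clf = DecisionTreeClassifier(max_depth=3)
for start in range(WINDOW, n_events - WINDOW + 1, WINDOW):
    prev = slice(start - WINDOW, start)        # previous window: training data
    cur = slice(start, start + WINDOW)         # current window: to be classified
    clf.fit(X[prev], y[prev])
    proba = clf.predict_proba(X[cur])
    if 1 in clf.classes_:                      # probability assigned to the "win" class
        wp = proba[:, list(clf.classes_).index(1)]
    else:
        wp = np.zeros(WINDOW)
    acc = clf.score(X[cur], y[cur])            # how well last window's rules fit this one
    print(f"events {start}-{start + WINDOW}: mean WP {wp.mean():.2f}, accuracy {acc:.2f}")
```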