Insertion/deletion detection and bit-resynchronisation using the Viterbi algorithm
- Authors: Santos, Marco Paulo Ferreira dos
- Date: 2009-02-26T12:17:53Z
- Subjects: Algorithms , Decoders (electronics)
- Type: Thesis
- Identifier: uj:8152 , http://hdl.handle.net/10210/2159
- Description: M.Ing.
- Full Text:
Subjective analysis of image coding errors
- Authors: El-Hesnawi, Mohamed Rahoma
- Date: 2009-02-26T12:18:47Z
- Subjects: Visual pathways , Image processing , Image analysis , Algorithms
- Type: Thesis
- Identifier: uj:8156 , http://hdl.handle.net/10210/2162
- Description: D.Ing. , The rapid growth in the use of digital images, and the necessity to compress them, has created the need for the development of image quality metrics. Subjective evaluation is the most accurate of the image quality evaluation methods, but it is time consuming, tedious and expensive. At the same time, widely used objective measures such as the mean squared error have proven not to assess image quality the way a human observer does. Since the human observer is the final receiver of most visual information, taking into account the way humans perceive visual information will be greatly beneficial for the development of an objective image quality metric that reflects the subjective evaluation of distorted images. Many past attempts have been made to develop distortion metrics that model the processes of the human visual system (HVS), and many promising results have been achieved. However, most of these metrics were developed with the use of simple visual stimuli, and most of these models were based on visibility threshold measures, which are not representative of the distortion introduced in complex natural compressed images. In this thesis, a new image quality metric based on the HVS properties as related to image perception is proposed. This metric provides an objective image quality measure for the subjective quality of coded natural images with suprathreshold degradation. The proposed model specifically takes into account the structure of natural images by analyzing them into their different components, namely the edges, texture and background (smooth) components, as these components influence the formation of perception in the HVS differently. Hence the HVS sensitivity to errors in images depends on whether these errors lie in more active areas of the image, such as strong edges or texture, or in less active areas such as the smooth regions. These components are then summed to obtain the combined image, which represents the way the HVS is postulated to perceive the image. Extensive subjective evaluation was carried out for the different image components and the combined image, obtained for the coded images at different qualities. The objective measure (RMSE) for these images was also calculated. A transformation between the subjective and objective quality measures was performed, from which an objective metric that can predict the human perception of image quality was developed. The metric was shown to provide an accurate prediction of image quality, agreeing well with the prediction provided by the expensive and lengthy process of subjective evaluation. Furthermore, it has the desired properties of the RMSE of being easier and cheaper to implement. This metric will therefore be useful for evaluating error mechanisms present in proposed coding schemes. (A toy sketch of the component weighting follows this record.)
- Full Text:
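The component decomposition described in the abstract above can be illustrated with a short sketch: classify pixels as edge, texture or smooth from local gradient activity, then weight the squared errors of each component before combining them. The gradient-based classification, thresholds and weights are illustrative assumptions, not the thesis's actual model.

```python
# A toy component-weighted error metric, assuming a gradient-based split
# into edge/texture/smooth regions; thresholds and weights are hypothetical.
import numpy as np

def component_weighted_rmse(original, coded,
                            edge_thresh=30.0, texture_thresh=10.0,
                            weights=(1.5, 1.2, 0.8)):
    gy, gx = np.gradient(original.astype(float))
    activity = np.hypot(gx, gy)             # local gradient magnitude
    edge = activity >= edge_thresh          # most active: strong edges
    texture = (activity >= texture_thresh) & ~edge
    smooth = ~(edge | texture)              # least active: background
    err2 = (original.astype(float) - coded.astype(float)) ** 2
    w_e, w_t, w_s = weights
    weighted = (w_e * err2[edge].sum() + w_t * err2[texture].sum()
                + w_s * err2[smooth].sum())
    return np.sqrt(weighted / err2.size)

# Example: a synthetic image degraded by simulated coding noise.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
coded = img + rng.normal(0, 5, size=img.shape)
print(component_weighted_rmse(img, coded))
```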
Evolutionary generation of plant models
- Authors: Venter, Johannes
- Date: 2011-09-05T09:52:32Z
- Subjects: Three-dimensional imaging in biology , Algorithms , L systems
- Type: Thesis
- Identifier: uj:7190 , http://hdl.handle.net/10210/3824
- Description: M.Sc. (Computer Science) , Modelling the geometry of a 3D plant for use in a virtual environment can be highly laborious, and hence modelling a large collection of variations of the same plant can be a difficult task. Procedural rule-based methods, such as L-Systems, that generate plant geometry indirectly are powerful techniques for the modelling of plants. However, such methods often require expert knowledge and skill in order to be used effectively. This dissertation explores a method for the modelling of procedurally generated plants using an evolutionary algorithm. The model is based on gene expression programming, and uses a hybrid of automated and interactive fitness evaluation. In the model, organisms are represented with linear genomes that can be expressed as L-Systems. The L-Systems can in turn be interpreted as geometry for 3D plants. Several automated fitness functions are presented to rate plants based on various topological and geometric attributes. These fitness functions are used in conjunction with user-based, interactive fitness evaluation in order to provide a comparison of different organisms. The model discussed in this dissertation offers advantages over previous approaches to modelling plants with evolutionary algorithms, and allows a user to quickly generate a population of varied plants without requiring knowledge of the underlying L-Systems. (A minimal L-System expansion is sketched after this record.)
- Full Text:
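The rule-rewriting core into which such evolved genomes are decoded fits in a few lines. This is a textbook bracketed L-System, not one of the dissertation's evolved plants: `F` draws a segment, `+`/`-` turn the turtle, and `[`/`]` push and pop branch state.

```python
# Expand a context-free L-System: every symbol with a rule is rewritten in
# parallel on each pass; symbols without a rule are copied through unchanged.
def expand(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Classic plant-like L-System (Lindenmayer); a turtle interpreter would turn
# the resulting string into branch geometry.
rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}
print(expand("X", rules, 3)[:72] + "...")
```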
A modified hiding high utility item first algorithm (HHUIF) with item selector (MHIS) for hiding sensitive itemsets
- Authors: Selvaraj, Rajalakshmi , Kuthadi, Venu Madhav
- Date: 2013
- Subjects: Utility mining , Data mining , Algorithms , Hiding high utility item first algorithm
- Type: Article
- Identifier: uj:5419 , ISSN 1349-4198 , http://hdl.handle.net/10210/10971
- Description: In privacy preserving data mining, utility mining plays an important role. In privacy preserving utility mining, some sensitive itemsets are concealed from the database according to certain privacy policies. Hiding sensitive itemsets from adversaries is becoming an important issue nowadays, yet only very few methods are available in the literature to hide sensitive itemsets in a database. One of the existing privacy preserving utility mining methods utilizes two algorithms, HHUIF and MSICF, to conceal the sensitive itemsets so that adversaries cannot mine them from the modified database. To accomplish the hiding process, this method finds the sensitive itemsets and modifies the frequency of the high-valued utility items. However, the method performs poorly if the utility values of the items are the same: items with the same utility value decrease the hiding performance for the sensitive itemsets, and the method also incurs computational complexity due to the frequency modification of each item. To solve this problem, this paper proposes a modified HHUIF algorithm with Item Selector (MHIS). The proposed MHIS algorithm is a modified version of the existing HHUIF algorithm. The MHIS algorithm computes the sensitive itemsets by utilizing a user-defined utility threshold value. To hide the sensitive itemsets, the frequency value of the items is changed; if the utility values of the items are the same, the MHIS algorithm selects the appropriate items and then modifies the frequency values of the selected items. The proposed MHIS reduces the computational complexity as well as improves the hiding performance of the itemsets. The algorithm is implemented and the resultant itemsets are compared against the itemsets obtained from conventional privacy preserving utility mining algorithms. (A rough sketch of the hiding step follows this record.)
- Full Text:
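A rough sketch of the hiding step outlined above, under stated assumptions: in each transaction supporting a sensitive itemset, the item contributing the highest utility (quantity times unit profit) has its quantity decremented until the itemset's total utility falls below the user-defined threshold. The tie-breaking "item selector" is reduced here to preferring the item with lower support; the paper's actual MHIS selection rule may differ.

```python
# Hypothetical HHUIF/MHIS-style hiding: lower quantities of high-utility
# items until the sensitive itemset's utility drops below the threshold.
def hide_sensitive(db, profits, sensitive, threshold, support_count):
    """db: list of {item: quantity} transactions; profits: {item: unit profit}."""
    def itemset_utility():
        return sum(t[i] * profits[i]
                   for t in db if sensitive <= t.keys() for i in sensitive)

    while itemset_utility() >= threshold:
        supporting = [t for t in db if sensitive <= t.keys()]
        if not supporting:
            break
        # Transaction holding the single highest-utility sensitive item.
        t = max(supporting,
                key=lambda tr: max(tr[i] * profits[i] for i in sensitive))
        # MHIS-style choice: highest utility, ties broken by lower support.
        victim = max(sensitive,
                     key=lambda i: (t[i] * profits[i], -support_count[i]))
        t[victim] -= 1
        if t[victim] == 0:
            del t[victim]         # transaction no longer supports the itemset
    return db

db = [{"a": 3, "b": 2}, {"a": 1, "b": 4, "c": 2}, {"b": 1}]
profits = {"a": 5, "b": 2, "c": 8}
support = {"a": 2, "b": 3, "c": 1}
print(hide_sensitive(db, profits, {"a", "b"}, threshold=10, support_count=support))
```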
A new data stream mining algorithm for interestingness-rich association rules
- Authors: Kuthadi, Venu Madhav
- Date: 2013
- Subjects: Data stream , Data stream parameters-tuned interestingness , Algorithm parameters-tuned interestingness , Frequency supporters-tuned interestingness , Algorithms
- Type: Article
- Identifier: uj:5418 , ISSN 0887-4417 , http://hdl.handle.net/10210/10970
- Description: Frequent itemset mining and association rule generation is a challenging task in data streams. Although various algorithms have been proposed to solve the issue, it has been found that frequency alone does not decide the significance (interestingness) of the mined itemsets, and hence of the association rules. This has driven algorithms that mine association rules based on utility, i.e. the proficiency of the mined rules. However, few algorithms in the literature deal with utility, as most of them deal with reducing the complexity of frequent itemset/association rule mining. Also, those few algorithms consider only the overall utility of the association rules and not the consistency of the rules throughout a defined number of periods. To solve this issue, this paper proposes an enhanced association rule mining algorithm. The algorithm introduces a new weightage validation into conventional association rule mining to validate the utility and its consistency in the mined association rules. The utility is validated by an integrated calculation of the cost/price efficiency of the itemsets and their frequency. The consistency validation is performed at every defined number of windows using the probability distribution function, assuming that the weights are normally distributed. The rules thus validated are frequent and utility-efficient, and their interestingness is maintained throughout the entire time period. The algorithm is implemented and the resultant rules are compared against the rules that can be obtained from conventional mining algorithms. (A hedged sketch of the two validations follows this record.)
- Full Text:
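The two validations described above can be sketched under explicit assumptions: a per-rule weight combining price efficiency with frequency, and a consistency test over a fixed number of stream windows that fits a normal distribution to the per-window weights. The combination formula and the central-probability acceptance band are illustrative guesses, not the paper's exact definitions.

```python
# Hedged sketch: weightage validation plus a normal-distribution consistency
# check over stream windows (standard-library statistics module only).
from statistics import NormalDist, mean, stdev

def rule_weight(price_efficiency, support):
    # Assumed combination: utility (price efficiency) scaled by frequency.
    return price_efficiency * support

def is_consistent(window_weights, band=0.90):
    """Accept if every per-window weight lies inside the central `band`
    probability mass of the normal fitted to the weights."""
    mu, sigma = mean(window_weights), stdev(window_weights)
    if sigma == 0:
        return True                       # perfectly stable rule
    dist = NormalDist(mu, sigma)
    lo, hi = dist.inv_cdf((1 - band) / 2), dist.inv_cdf((1 + band) / 2)
    return all(lo <= w <= hi for w in window_weights)

# Per-window (price efficiency, support) pairs for one candidate rule.
windows = [(1.2, 0.30), (1.1, 0.28), (1.3, 0.31),
           (1.2, 0.29), (1.1, 0.30), (1.2, 0.30)]
weights = [rule_weight(pe, s) for pe, s in windows]
print(is_consistent(weights))             # True: weights stay inside the band
```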
Flows in networks : an algorithmic approach
- Authors: Marcon, Alister Justin
- Date: 2013-05-01
- Subjects: Graph theory , Network analysis (Planning) , Algorithms , Mathematical optimization , System analysis
- Type: Thesis
- Identifier: uj:7507 , http://hdl.handle.net/10210/8364
- Description: M.Sc. (Mathematics) , In Chapter 1, we consider the relevant theory pertaining to graphs and digraphs that will be used in the study of flows in networks. Warshall’s algorithm for reachability is also considered, since it will allow us to ascertain whether certain paths exist in a given instance. In Chapter 2, we explore flows and cuts in networks. We define the basic concepts of source, sink, intermediate vertices, capacity, costs and lower bounds. Feasible flows are defined, as well as the value of a flow. Cuts in capacitated networks are explored, and further theory relating the value of a flow to cuts is given. We consider the problem of determining a maximal flow. In particular, we consider augmentations of the flow, which allow us to give a characterization of a maximal flow. The important Max-flow Min-cut theorem is also considered. After having explored the relevant theory, we move on to methods of finding a maximal flow for a given s-t network that has a capacity on each of its arcs. Firstly, we consider zero-one and unit-capacity networks, since these play a role in the applications of maximal flows in Chapter 4. We then compile the relevant theory and algorithms in order to implement two augmenting-path-finding algorithms. (An augmenting-path sketch follows this record.)
- Full Text:
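The augmenting-path method described above admits a compact realisation: repeatedly find a shortest s-t path with positive residual capacity by breadth-first search and push the bottleneck flow along it (the Edmonds-Karp form of Ford-Fulkerson). This is a generic illustration, not either of the dissertation's two specific implementations.

```python
# Edmonds-Karp maximal flow: BFS augmenting paths in the residual network.
from collections import deque

def max_flow(capacity, s, t):
    """capacity: {u: {v: c}} arc capacities; returns the maximal flow value."""
    res = {}                               # residual capacities
    for u, arcs in capacity.items():
        for v, c in arcs.items():
            res.setdefault(u, {})
            res.setdefault(v, {})
            res[u][v] = res[u].get(v, 0) + c   # forward residual capacity
            res[v].setdefault(u, 0)            # zero-capacity reverse arc
    total = 0
    while True:
        parent, queue = {s: None}, deque([s])
        while queue and t not in parent:   # BFS for a shortest augmenting path
            u = queue.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return total                   # no augmenting path: flow is maximal
        path, v = [], t
        while parent[v] is not None:       # walk back from t to s
            path.append((parent[v], v))
            v = parent[v]
        delta = min(res[u][v] for u, v in path)   # bottleneck capacity
        for u, v in path:
            res[u][v] -= delta
            res[v][u] += delta             # augment along the path
        total += delta

net = {"s": {"a": 4, "b": 2}, "a": {"b": 1, "t": 3}, "b": {"t": 3}}
print(max_flow(net, "s", "t"))             # 6
```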
Laser based mapping of an unknown environment
- Authors: Corregedor, Antonio Rodrigues
- Date: 2014-03-17
- Subjects: Laser based mapping , Algorithms , Simulation methods & models , Mappings (Mathematics) , Environmental mapping , Machine learning
- Type: Thesis
- Identifier: uj:4341 , http://hdl.handle.net/10210/9690
- Description: M.Ing. (Electrical and Electronic Engineering) , This dissertation deals with the mapping of an unknown environment. Mapping of an environment can be accomplished by asking the question “What is in my world?” whilst moving through the environment. Once the objects occupying the ‘world’ have been discovered, the locations of these objects are stored somewhere (for example on paper), so that the environment can be navigated at a later stage. In the context of robots, a map provides the robot with a certain degree of “intelligence”. Several different types of applications are available for robots with “intelligence”, ranging from mining applications, to search and rescue situations, to surveillance and reconnaissance applications. The research hypothesis posed by this dissertation is as follows: produce a human-readable map for an unknown, defined, structured environment using a single laser range finder (LRF). The focus was on mapping environments resembling mine tunnels. In mine tunnel environments, sensors such as wheel odometers can fail. This failure makes it advantageous to be able to create a map of the environment with the data obtained solely from the LRF. For this dissertation, the following restrictions were placed on the environment being mapped: it had to be structured (i.e. the environment could be described by simple geometric primitives such as lines); it had to be static (the only entity allowed to move in the environment was the LRF to obtain data); and the environment had to be defined (i.e. have a starting and ending point). During the course of this Master's research, it was discovered that in order to create a human-readable map, one has to accurately localise the sensor in the environment whilst mapping. This is a typical problem in mapping and is referred to as the ‘simultaneous localisation and mapping (SLAM) problem’. This dissertation shows results when mapping was done with, and without, accurate localisation. The final approach used to create the human-readable map consisted of determining scan-matched odometry (based on feature matching and an ICP algorithm). The scan-matched odometry is incorporated into a grid-based SLAM technique that utilises a particle filter to accurately determine the position of the sensor in the environment, in order to create a human-readable map of the environment. The algorithm as described was able to close loops (i.e. the mapping algorithm was able to handle the sensor returning to its starting point) and it produced satisfactory results for the types of environments required by the scope of this dissertation. (A compact ICP scan-matching sketch follows this record.)
- Full Text:
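The scan-matching core mentioned above can be sketched as a single ICP iteration: pair each point of the new scan with its nearest neighbour in the reference scan, then solve for the rigid 2D transform with the standard SVD (Kabsch) construction. A real mapper iterates this to convergence, feeds the result in as odometry, and wraps everything in a particle filter; this fragment, with brute-force matching and synthetic data, only illustrates the alignment step.

```python
# One ICP iteration: nearest-neighbour correspondences + SVD rigid transform.
import numpy as np

def icp_step(src, ref):
    """Return (R, t) moving src (Nx2) toward ref (Mx2)."""
    # Brute-force nearest neighbour in the reference scan for each point.
    d2 = ((src[:, None, :] - ref[None, :, :]) ** 2).sum(-1)
    matched = ref[d2.argmin(axis=1)]
    # Optimal rigid transform between the matched, centred point sets.
    mu_s, mu_r = src.mean(0), matched.mean(0)
    H = (src - mu_s).T @ (matched - mu_r)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_r - R @ mu_s
    return R, t

# Example: a scan rotated by 5 degrees should be (mostly) rotated back.
theta = np.radians(5)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
ref = np.random.default_rng(1).uniform(0, 10, (100, 2))
src = ref @ rot.T
R, t = icp_step(src, ref)
print(np.round(R, 3), np.round(t, 3))
```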
Die implementering en eksperimentele evaluering van enkele diskrete optimeringsalgoritmes [The implementation and experimental evaluation of selected discrete optimisation algorithms]
- Authors: Meyer, Thomas William Saymoir
- Date: 2014-04-03
- Subjects: Mathematical optimization , Algorithms
- Type: Thesis
- Identifier: uj:10500 , http://hdl.handle.net/10210/10003
- Description: M.Sc. (Computer Science) , Chapter 1 is a summary in which the problems discussed in this study, as well as the relationship between them, are shown. The algorithmic notation used when discussing the problems is also defined. In Chapter 2 the three classes of algorithms for finding a minimum spanning tree, i.e. the algorithms of Prim, Kruskal and Sollin, are discussed. PASCAL implementations of all the algorithms are presented, and we report on our computational experience with these implementations. It was found that the implementation of the Prim algorithm was very efficient, while implementations of the Kruskal algorithm also gave good results. Implementations of the Sollin algorithm were less efficient, because of the complex data structures involved. Algorithms for finding an optimum arborescence or branching in a network have independently been proposed by Edmonds, Chu & Liu, as well as Bock. Tarjan, as well as Camerini et al., subsequently discussed aspects of the efficient implementation of the algorithm. In Chapter 3 we draw attention to the related work of Fulkerson by reformulating the Edmonds-Fulkerson algorithm and giving a simple proof of its correctness. We also discuss and present an implementation of the algorithm and report on our computational experience. From the results presented it is clear that the number of cycles encountered during the first phase of the algorithm has a significant effect on its efficiency... (A Python rendering of the Prim algorithm follows this record.)
- Full Text:
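As a companion to the chapter summary above, here is a modern Python rendering of the Prim algorithm whose PASCAL implementation the thesis found most efficient: grow the tree from an arbitrary root, repeatedly adding the cheapest edge leaving the tree, with a binary heap standing in for the priority handling. This is a generic textbook version, not a translation of the thesis's code.

```python
# Prim's minimum spanning tree with a binary heap of candidate edges.
import heapq

def prim(graph, root):
    """graph: {u: {v: weight}} undirected; returns the list of MST edges."""
    visited, tree = {root}, []
    heap = [(w, root, v) for v, w in graph[root].items()]
    heapq.heapify(heap)
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue                      # stale entry: v already in the tree
        visited.add(v)
        tree.append((u, v, w))
        for x, wx in graph[v].items():    # new candidate edges leaving the tree
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return tree

g = {"a": {"b": 1, "c": 4}, "b": {"a": 1, "c": 2}, "c": {"a": 4, "b": 2}}
print(prim(g, "a"))                       # [('a', 'b', 1), ('b', 'c', 2)]
```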
Differential power analysis of an AES software implementation
- Authors: Moabalobelo, Phindile Terrence
- Date: 2014-04-16
- Subjects: Computer security , Data encryption (Computer science) , Algorithms , Cryptography
- Type: Thesis
- Identifier: uj:10801 , http://hdl.handle.net/10210/10308
- Description: M.Ing. (Electrical and Electronic Engineering) , Please refer to full text to view abstract
- Full Text:
Expressed sequence tag clustering using commercial gaming hardware
- Authors: Van Deventer, Charl
- Date: 2014-04-16
- Subjects: Bioinformatics , Soft computing , Intelligent agents (Computer software) , Algorithms
- Type: Thesis
- Identifier: uj:10799 , http://hdl.handle.net/10210/10306
- Description: M.Ing. (Electrical And Electronic Engineering) , Bioinformatics is one of the most rapidly advancing sciences today. It is a scientific domain that attempts to apply modern computing and information technologies to the field of biology, the study of life itself. It involves documenting and analysing genetics, proteins, viruses, bacteria and cancer, as well as hereditary traits and diseases, and researching cures and treatments for whole ranges of health threats. The growth of bioinformatics, and developments both theoretical and experimental in biology, can largely be linked to the IT explosion, which gives the field more powerful processing options at much cheaper cost, limited only by the steady yet significant improvements promised by Moore's Law [3]. This IT explosion has also caused significant advances in the high consumer-demand region of computer graphics hardware, or GPUs (Graphics Processing Units). Consumer demand has actually advanced GPUs far faster than classical CPUs (Central Processing Units), outpacing CPU performance improvements by a large margin. As of early 2010, the fastest available PC processor (Intel Core i7 980 XE) has a theoretical performance of 107.55 GFLOPS [4], while GPUs with TFLOPS (1000 GFLOPS) of performance have been commercially available since 2008 (ATI HD4800). While typically used only for graphical rendering, modern innovations have greatly increased GPU flexibility and have given rise to the field of GPGPU (General Purpose GPU), which allows graphics processors to be applied to non-graphics applications. By utilizing GPU processing power to solve bioinformatics problems, the field can theoretically be boosted once again, increasing the amount of computational power available to scientists by an order of magnitude or more...
- Full Text:
Incorporation of the first derivative of the objective function into the linear training of a radial basis function neural network for approximation via strict interpolation
- Authors: Prentice, Justin Steven Calder
- Date: 2014-07-23
- Subjects: Interpolation , Approximation theory , Numerical integration , Algorithms , Neural networks (Computer science)
- Type: Thesis
- Identifier: uj:11837 , http://hdl.handle.net/10210/11569
- Description: D.Phil. (Applied mathematics) , Please refer to full text to view abstract
- Full Text:
Groepteoretiese algoritmes en die grafiekisomorfie-probleem [Group-theoretic algorithms and the graph isomorphism problem]
- Authors: Barnard, Andries
- Date: 2014-09-01
- Subjects: Group theory , Algorithms
- Type: Thesis
- Identifier: uj:12179 , http://hdl.handle.net/10210/11920
- Description: M.Sc. (Mathematics) , Please refer to full text to view abstract
- Full Text:
An analysis of robust fuzzy-PID control for AGC systems : focusing on PID tuning and design
- Authors: Thakur, Priyanka
- Date: 2015
- Subjects: PID controllers , Fuzzy systems , Automatic control , Control theory , Algorithms
- Language: English
- Type: Masters (Thesis)
- Identifier: http://hdl.handle.net/10210/213546 , uj:21166
- Description: Abstract: Please refer to full text to view abstract , M.Ing. (Electrical and Electronic Engineering)
- Full Text:
Online computational algorithms for portfolio-selection problems
- Authors: Nkomo, Raphael Ndem
- Date: 2015
- Subjects: Algorithms , Portfolio management
- Language: English
- Type: Doctoral (Thesis)
- Identifier: http://hdl.handle.net/10210/493032 , uj:45051
- Description: Abstract: This thesis contributes to the problem of equity portfolio management using computational intelligence methodologies. The focus is on generating automated financial reasoning, with a basis in computational finance research, through searching a space of semantically meaningful propositions. In comparison with classical financial modelling, our proposed algorithms allow continual adaptation to changing market conditions and nonlinear solution representations in most cases. When compared with other computational intelligence approaches, the focus is on a holistic design that integrates financial research with machine learning. The major aim of the thesis is to develop portfolio allocation techniques for learning investment decision-making that can easily adapt to changes in market processes, with both speed and accuracy. We evaluate the algorithms developed in an out-of-sample trading framework using historical data sets. The testing is designed to be realistic, for instance considering factors such as transaction costs, stock splits and data snooping. To demonstrate the robustness of our approach we perform extensive historical simulations using previously untested real market datasets. On all data sets considered, our proposed algorithms significantly outperform existing portfolio allocation techniques, sometimes in a spectacular way, without any additional computational demand or modelling complexity. Before proceeding any further, we stress that setting up abstract and complex mathematical models is neither the intention nor the scope of this thesis. Our aim rather is to investigate empirically, and possibly capture, any existing nonlinearities or non-stochasticities that are apparent in the dynamics of cross-sectional returns of stock prices. In doing so we utilise some novel techniques, mostly based on methodologies that have been used successfully in the physical sciences, where the deterministic dynamics of phenomena are more easily detected. Our intention is to provide an additional empirical analysis framework that could shed new light on the investigation of the nature of financial time-series data generating processes. , Ph.D. (Economics and Financial Sciences)
- Full Text:
A study on Hough transform-based fingerprint alignment algorithms
- Authors: Mlambo, Cynthia Sthembile
- Date: 2015-06-26
- Subjects: Fingerprints - Identification , Algorithms , Image processing - Digital techniques
- Type: Thesis
- Identifier: uj:13636 , http://hdl.handle.net/10210/13816
- Description: M.Ing. (Electrical and Electronic Engineering) , Please refer to full text to view abstract
- Full Text:
Complexity aspects of certain graphical parameters
- Authors: Lategan, Frans Adriaan
- Date: 2015-10-07
- Subjects: Graph theory , Algorithms
- Type: Thesis
- Identifier: uj:14236 , http://hdl.handle.net/10210/14689
- Description: M.Sc. (Mathematics) , Please refer to full text to view abstract
- Full Text:
Faster than the speed of law : evaluating the challenges faced in regulating algorithmic and high frequency trading
- Authors: Abrahams, Niyaaz
- Date: 2016
- Subjects: Electronic trading of securities , Algorithms , Program trading (Securities) , Investments , Stocks - Mathematical models , Investment analysis
- Language: English
- Type: Masters (Thesis)
- Identifier: http://ujcontent.uj.ac.za:8080/10210/367483 , http://hdl.handle.net/10210/86891 , uj:19540
- Description: Abstract: This minor dissertation explores the highly technical world of algorithmic and high frequency trading. It provides a brief overview of the key concepts, benefits and market concerns surrounding these technologies. The dissertation looks at the multitude of challenges faced in attempting to regulate and investigate high frequency trading. Further, the current Financial Markets Act is evaluated to determine the extent of its effectiveness in light of these new technologies. The dissertation then looks at the regulatory developments made in the European Union and determines whether South African regulation should follow suit. It finds that the perceived benefits of high frequency trading do not adequately outweigh the detrimental effects that these systems could cause. The Financial Markets Act is wholly insufficient in dealing with the new risks posed by these systems, and it is therefore recommended that urgent regulatory changes be implemented. While much investment has been made in developing this technology, regulation has not developed at the same pace. It is concluded that without the implementation of better regulation, South African financial markets regulation will remain too slow to keep up with rapidly advancing financial markets. , LL.M. (Banking law)
- Full Text:
Detecting emotions from speech using machine learning techniques
- Authors: Roy, Tanmoy
- Date: 2019
- Subjects: Artificial intelligence , Automatic speech perception - Technological innovations , Automatic speech recognition , Algorithms , Speech processing systems
- Language: English
- Type: Doctoral (Thesis)
- Identifier: http://hdl.handle.net/10210/437516 , uj:37992
- Description: D.Phil. (Electronic Engineering)
- Full Text:
Fake narratives, dominant discourses : the role and influence of algorithms on the online South African land reform debate
- Authors: Jansen Van Vuuren, Anna‐Marie , Celik, Turgay
- Date: 2019
- Subjects: Algorithms , Bias , Data
- Language: English
- Type: Conference proceedings
- Identifier: http://hdl.handle.net/10210/403838 , uj:33856 , Citation: Jansen Van Vuuren, A.M. & Celik, T. 2019. Fake narratives, dominant discourses : the role and influence of algorithms on the online South African land reform debate.
- Description: Abstract: In this paper the authors explore the discourse surrounding algorithmic processes by examining the way a search engine’s results influenced an online debate about land reform in South Africa. The article begins by reflecting on how the rise of the internet has contributed to issues around fake news. It then discusses the incident that serves as the case study for this paper: in June 2018, Twitter users criticised the search engine Google for displaying only photos of white people when one types the words “squatter camps in South Africa” into the Google Image search bar. In the debate that followed, many online users accused Google of propagating a biased narrative to destabilise land reform. The paper’s purpose is to explain the role of algorithms to readers from the fields of humanities and social sciences without a technical background in computer science. Through exploring the “squatter camps in South Africa” case study, the authors intend to reveal to the reader the power online search engines have in shaping the public debate. Yet, by conducting their own searches on different international search engines using the same key words, they show that it is not in fact the algorithms that are biased, but rather the online data generated by internet users themselves.
- Full Text:
Multi-period portfolio optimization : a differential evolution copula-based approach
- Authors: Mba, Jules Clement
- Date: 2019
- Subjects: Copulas (Mathematical statistics) , Data encryption (Computer science) , Econometrics , Algorithms , Finance - Mathematical models
- Language: English
- Type: Masters (Thesis)
- Identifier: http://hdl.handle.net/10210/295977 , uj:32240
- Description: Abstract: Please refer to full text to view abstract. , M.Com. (Financial Economics)
- Full Text: