A regional, 3-D computer-based sedimentological model of the Permian Witbank coalfield, South Africa
- Authors: Grodner, Mark William
- Date: 2009-01-28T09:41:56Z
- Subjects: Geological modeling , Sedimentology , Computer simulation , Witbank (South Africa)
- Type: Thesis
- Identifier: uj:14839 , http://hdl.handle.net/10210/1962
- Description: M.Sc. , The objective of this work is to establish a regional, three-dimensional sedimentological model of the clastic strata of the Vryheid Formation and of the pre-existing basin-floor topography of the Witbank Coalfield, South Africa. The study area extends from 25°30’S to 26°30’S and from 28°30’E to 30°E. Computer modelling is used to examine the three-dimensional distribution of the sedimentary rocks. The geological model presented in this thesis is based on 1190 borehole logs collected from three mining companies involved in coal extraction in the study area, namely Anglo American Coal Corporation, Duiker Mining Limited and Ingwe Coal Corporation Limited. These borehole logs were converted to a common data format for inclusion in the 3-D model and positioned correctly in space using a digital elevation model (a small illustrative sketch of this step follows this record). The primary objective of this research was to visualize the interburden and overburden sedimentary rocks within the study area, in order to understand the distribution, and hence the origin, of these rocks. As commercially available modelling tools have several restrictions with respect to the current work, Geovision cc. was contracted to develop the tools needed for the geological visualization of the data. Using the three-dimensional model, the distribution of the various facies and facies associations can be defined and the depositional history of the basin understood. These characteristics include the delineation of the general trend of the basement topography. The distribution of the lower glacial and paraglacial sedimentary rocks within the study area, confined to steep-sided palaeovalleys and marked by rapid facies and thickness changes from diamictite through argillite to minor sandstones and conglomerates, is shown in 3-D. Evidence of lobate deltas and sandy bedload river deposits between the No. 2 and No. 4 Seams is presented in 3-D. The 3-D characteristics of the rocks between the No. 4 and No. 5 Seams, proposed to represent a period of deltaic progradation during an overall marine transgression, are described. By using a 3-D model it is possible to evaluate the distribution of sedimentary rocks through both space and time. These palaeoenvironmental interpretations facilitate a better understanding of the genesis of the Witbank Coalfield, and the enhanced geological modelling can in turn support improved mine planning and mining techniques.
- Full Text:
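A minimal sketch of the borehole-positioning step described above. All names here are hypothetical (the thesis's actual tooling was custom-built by Geovision cc. and is not reproduced): downhole depths from a log are converted to absolute elevations by looking the collar up in a digital elevation model.

```python
# Illustrative sketch: anchoring borehole logs in 3-D space with a DEM.
# Borehole, collar_elevation and the nearest-node lookup are assumptions,
# not the thesis's actual data model.
from dataclasses import dataclass

@dataclass
class Borehole:
    x: float          # easting of the collar (m)
    y: float          # northing of the collar (m)
    depths: list      # downhole depth to the top of each logged unit (m)
    lithologies: list # lithology code for each logged unit

def collar_elevation(dem: dict, x: float, y: float, cell: float = 100.0) -> float:
    """Nearest-node DEM lookup: snap the collar to the closest grid node."""
    key = (round(x / cell) * cell, round(y / cell) * cell)
    return dem[key]

def position_log(bh: Borehole, dem: dict):
    """Convert downhole depths to absolute elevations for the 3-D model."""
    z0 = collar_elevation(dem, bh.x, bh.y)
    return [(bh.x, bh.y, z0 - d, lith) for d, lith in zip(bh.depths, bh.lithologies)]

# Usage: a two-unit log at a collar sitting on a flat 1500 m DEM node.
dem = {(0.0, 0.0): 1500.0}
bh = Borehole(x=10.0, y=-20.0, depths=[0.0, 35.0],
              lithologies=["sandstone", "diamictite"])
print(position_log(bh, dem))  # unit tops at 1500.0 m and 1465.0 m elevation
```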
Chaotic neural network swarm optimization
- Authors: Sun, Y-X
- Date: 2007
- Subjects: Artificial intelligence , Chaos theory , Computer simulation , Convergence of numerical methods , Global optimization , Hopfield neural networks
- Language: Chinese
- Type: Article
- Identifier: http://hdl.handle.net/10210/18234 , uj:15975 , ISSN: 1671-5497 , Citation: Sun, Y-X. et al. 2007. Chaotic neural network swarm optimization. Engineering village, 37(9):113-116.
- Description: The single-particle structure of particle swarm optimization is analyzed and found to have some properties of a Chaos-Hopfield neural network. A new particle swarm optimization model is presented: a deterministic Chaos-Hopfield neural network swarm, in contrast to existing models with stochastic parameters. Its search orbits exhibit an evolution process of inverse period bifurcation from chaos to periodic orbits and then to a sink. In this process, the initial chaos-like search broadens the search scope, and the inverse period bifurcation determines the stability and convergence of the search. The convergence is analyzed theoretically. Finally, numerical simulation demonstrates the basic procedure of the proposed model and verifies its efficiency (a hedged illustrative sketch follows this record).
- Full Text:
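The abstract describes replacing the stochastic parameters of particle swarm optimization with deterministic chaotic dynamics. The sketch below is not the paper's Chaos-Hopfield formulation (the article is in Chinese and only the abstract is reproduced here); it is a generic illustration, under that assumption, in which a logistic map stands in for the usual random coefficients.

```python
# Illustrative sketch of a deterministic, chaos-driven particle update.
def logistic(z):          # chaotic map on (0, 1), fully chaotic at r = 4
    return 4.0 * z * (1.0 - z)

def sphere(x):            # toy objective: global minimum 0 at x = 0
    return sum(xi * xi for xi in x)

def chaotic_pso(dim=2, particles=10, iters=200):
    xs = [[(i + 1) * 0.37 + j * 0.11 for j in range(dim)] for i in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [list(x) for x in xs]
    gbest = min(pbest, key=sphere)
    z = 0.7                              # seed of the deterministic chaotic sequence
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters        # decaying inertia damps the orbit to a sink
        for i in range(particles):
            for j in range(dim):
                z = logistic(z)          # deterministic stand-in for rand()
                c1, c2 = 2.0 * z, 2.0 * (1.0 - z)
                vs[i][j] = (w * vs[i][j]
                            + c1 * (pbest[i][j] - xs[i][j])
                            + c2 * (gbest[j] - xs[i][j]))
                xs[i][j] += vs[i][j]
            if sphere(xs[i]) < sphere(pbest[i]):
                pbest[i] = list(xs[i])
        gbest = min(pbest, key=sphere)
    return gbest, sphere(gbest)

print(chaotic_pso())  # settles near the origin on this toy problem
```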
Digital environment evolution modelling and simulation
- Authors: Bengis, Merrick Kenna
- Date: 2020
- Subjects: Computer science , Computer simulation , Information technology
- Language: English
- Type: Doctoral (Thesis)
- Identifier: http://hdl.handle.net/10210/458387 , uj:40713
- Description: Ph.D. (Computer Science and Software Engineering) , Abstract: The concurrent growth of the human population and advancement in technology, together with ever-changing social interaction, has led to the creation of a large, abstract and complex entity known as the Digital Environment. The Digital Environment, which is continually growing and ever-evolving, is now almost unrecognisable from what it started off as nearly 50 years ago. The human population has grown rapidly in the past century, reaching nearly 8 billion people in 2019, already double its 1975 level. This has created a world with more people than ever before, all of whom have a need to communicate with others, share information and form communities. Technology also experienced unprecedented advancements in this time, with important inventions such as electricity, computational machines, and communication networks. These technologies matured and allowed people around the world to communicate as if they were next to each other, facilitated by the advent of the Internet. Presently, people all around the world are creating, sharing, and consuming information, forming online communities, and growing the physical footprint of the Internet and all connected devices. The intersection of these developments formed the Digital Environment: an amalgamation of the physical, digital and cyber worlds. It is evident how rapidly and completely the Digital Environment has evolved in the past few decades, so what is in store for the future? Can people prepare for what the Digital Environment is to become and possibly even change its course? This thesis proposes a novel model for the simulation and prediction of the evolution of the Digital Environment: the Digital Environment Evolution Modelling and Simulation model, or DEEv-MoS. The DEEv-MoS model proposes a method that makes use of well-developed and commonly used fields of research to create a holistic simulation of the Digital Environment and its many parts. Through the use of intelligent agents, entity component systems and machine learning, accurate simulations can be run to determine how the future digital landscape will grow and change (a minimal entity-component-system sketch follows this record). This allows researchers to further understand what the future holds and prepare for any eventualities, whether they are positive or negative...
- Full Text:
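The thesis names entity component systems (ECS) as one building block of DEEv-MoS. The following is a minimal, hypothetical ECS sketch (none of these names come from the thesis): entities are bare integer IDs, components are plain data, and systems are functions run over every entity holding a given component.

```python
# Minimal ECS sketch; a stand-in for the kind of simulation substrate the
# abstract describes, not the DEEv-MoS implementation itself.
from itertools import count

_next_id = count()
components: dict[str, dict[int, dict]] = {"device": {}, "user": {}}

def spawn(kind: str, **data) -> int:
    """Create an entity (just an integer ID) carrying one component."""
    eid = next(_next_id)
    components[kind][eid] = data
    return eid

def growth_system(rate: float) -> None:
    """A 'system': pure logic applied to every entity with a component."""
    for dev in components["device"].values():
        dev["bandwidth"] *= 1.0 + rate   # crude stand-in for network growth

# Usage: two devices, one simulated year of 10% bandwidth growth.
spawn("device", bandwidth=100.0)
spawn("device", bandwidth=250.0)
growth_system(0.10)
print([round(d["bandwidth"], 1) for d in components["device"].values()])  # [110.0, 275.0]
```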
Interactive speech-driven facial animation
- Authors: Hodgkinson, Warren
- Date: 2008-07-18T13:41:13Z
- Subjects: Computer animation , Speech processing systems , Three dimensional imaging , Computer simulation , Signal processing (digital techniques)
- Type: Thesis
- Identifier: uj:7330 , http://hdl.handle.net/10210/807
- Description: One of the fastest-developing areas in the entertainment industry is digital animation. Television programmes and movies frequently use 3D animations to enhance or replace actors and scenery. With the increase in computing power, research is also being done on applying these animations in an interactive manner. Two of the biggest obstacles to the success of these undertakings are control (manipulating the models) and realism. This text describes many ways of improving control and realism so that interactive animation becomes possible. Specifically, lip-synchronisation (driven by human speech) and various modelling and rendering techniques are discussed (a minimal energy-driven sketch follows this record). A prototype showing that interactive animation is feasible is also described. , Mr. A. Hardy Prof. S. von Solms
- Full Text:
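A hedged sketch of the simplest form of speech-driven animation the abstract alludes to: mapping short-time audio energy to a single mouth-open parameter. Real lip-synchronisation maps phonemes to visemes; this energy-only version, with invented parameter values, is a deliberately minimal stand-in.

```python
# Map per-frame RMS energy of a mono speech signal to a jaw-open value in [0, 1].
import math

def frame_rms(samples, frame_len=160):
    """Split a mono signal into frames and return per-frame RMS energy."""
    out = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        out.append(math.sqrt(sum(s * s for s in frame) / frame_len))
    return out

def mouth_open(rms_values, ceiling=0.35):
    """Normalise RMS into [0, 1] so it can drive a jaw joint each frame."""
    return [min(r / ceiling, 1.0) for r in rms_values]

# Usage: a quiet 440 Hz burst followed by a louder one (synthetic 8 kHz audio).
tone = [0.05 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(160)]
loud = [0.45 * math.sin(2 * math.pi * 440 * t / 8000) for t in range(160)]
print(mouth_open(frame_rms(tone + loud)))  # ~0.10 (slight opening), then ~0.91
```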
Markov chain Monte Carlo methods for finite element model updating
- Authors: Joubert, Daniel Johannes
- Date: 2015
- Subjects: Finite Element Method , Markov processes , Monte Carlo method , Computer simulation
- Language: English
- Type: Masters (Thesis)
- Identifier: http://hdl.handle.net/10210/57283 , uj:16375
- Description: Abstract: Finite Element model updating is a computational tool aimed at aligning the computed dynamic properties of a Finite Element (FE) model, i.e. its eigenvalues and eigenvectors, with experimental modal data of rigid body structures. FE models generally have very many degrees of freedom, often several thousand, yet the Finite Element Method (FEM) is only able to accurately predict a few of the natural frequencies and mode shapes (eigenvalues and eigenvectors). To check the validity of the FEM, a chosen number of natural frequencies and mode shapes are experimentally measured; these are often in disagreement with the computed FEM results. FE model updating therefore covers a variety of methods for computing physically accurate modal frequencies for structures, accounting for the random behavior of material properties under dynamic conditions; such behavior can be termed stochastic. The author applies two methods introduced in recent years, plus one new algorithm, to further investigate the effectiveness of multivariate Gaussian mixture models and Bayesian analysis in the model updating context. The focus is largely on Markov Chain Monte Carlo methods, whereby all inference on uncertainties is based on the posterior probability distribution obtained from Bayes’ theorem. Observations are obtained sequentially, providing on-line inference in approximating the posterior probability. The following chapters describe in detail the theory and mathematics underlying the simulated algorithms. The three algorithms are the standard Metropolis Hastings (MH), Adaptive Metropolis Hastings (AMH), and Monte Carlo Dynamically Weighted Importance Sampling (MCDWIS). Metropolis Hastings is a well-known Markov Chain Monte Carlo (MCMC) sampling method (a minimal sketch follows this record); the desired result is a good acceptance rate and good correlation between the computed stochastic parameters. The AMH algorithm adaptively scales the covariance matrix ‘on the fly’ to achieve convergence to a Gaussian target distribution; of interest here is the adaptation of the scaling factor and the covariance matrix. MCDWIS combines Importance Sampling theory with Dynamic Weighting theory in a population control scheme, namely the Adaptive Pruned Enriched Population Control Scheme (APEPCS). The motivation for applying MCDWIS lies in the complexity of computing normalizing constants in higher-dimensional or multimodal systems; in addition, the dynamic weighting step with APEPCS allows further control over weighted samples and population size. The performance of the MCDWIS simulation... , M.Ing. Mechanical Engineering
- Full Text:
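A generic random-walk Metropolis-Hastings sampler, the baseline algorithm the dissertation builds on. The one-dimensional Gaussian target below is a placeholder of my choosing; in the thesis the posterior comes from Bayes’ theorem applied to FE parameters and measured modal data.

```python
# Random-walk Metropolis-Hastings on a 1-D Gaussian posterior (placeholder target).
import math, random

def log_target(theta):
    """Log of an (unnormalised) Gaussian posterior, mean 2.0, std 0.5."""
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

def metropolis_hastings(n_samples=5000, step=0.4, theta0=0.0, seed=1):
    rng = random.Random(seed)
    theta, samples, accepted = theta0, [], 0
    for _ in range(n_samples):
        proposal = theta + rng.gauss(0.0, step)       # symmetric random-walk proposal
        # Accept with probability min(1, pi(proposal) / pi(theta)):
        if math.log(rng.random()) < log_target(proposal) - log_target(theta):
            theta, accepted = proposal, accepted + 1
        samples.append(theta)
    return samples, accepted / n_samples

samples, rate = metropolis_hastings()
burn = samples[1000:]                                  # discard burn-in
print(round(sum(burn) / len(burn), 2), round(rate, 2)) # mean ~2.0, healthy acceptance rate
```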
Meeting national response time targets for priority 1 incidents in an urban emergency medical services system : more ambulances won’t help
- Authors: Stein, Christopher , Wallis, Lee , Adetunji, Olufemi
- Date: 2015
- Subjects: Emergency medical services , Response time , Computer simulation
- Language: English
- Type: Article
- Identifier: http://hdl.handle.net/10210/69116 , uj:17820 , Citation: Stein, C., Wallis, L. & Adetunji, O. 2015. Meeting national response time targets for priority 1 incidents in an urban emergency medical services system : more ambulances won’t help.
- Description: Abstract: Objective: To determine the effect of increased emergency vehicle numbers on response time performance for priority 1 incidents in an urban Emergency Medical Services system, using discrete-event computer simulation (a toy simulation sketch follows this record). Method: A simulation model was created, based on input data from part of the Emergency Medical Services operations in Cape Town. Two versions of the model were used, one with both primary response vehicles and ambulances and one with ambulances only. Both models were run in seven scenarios. The first scenario used the actual number of emergency vehicles in the real system; each subsequent scenario increased vehicle numbers by adding the baseline number to the cumulative total. Results: The model using only ambulances had shorter response times and a greater number of responses meeting national response time targets than the model using both primary response vehicles and ambulances. In both cases an improvement in response times and in the number of responses meeting national targets was observed with the first incremental addition of vehicles. Thereafter the improvements rapidly diminished, eventually becoming negligible with each successive increase in vehicle numbers. The national response time target for urban areas was never met, even with a seven-fold increase in vehicle numbers. Conclusion: This study showed that adding emergency vehicles to an urban Emergency Medical Services system improves response times for priority 1 incidents, but alone cannot deliver the magnitude of improvement needed to meet the national response time targets.
- Full Text:
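A toy discrete-event simulation in the spirit of the study's method: calls arrive at random, the first idle ambulance responds, and mean response time is tracked as the fleet grows. All parameter values are invented for illustration and do not come from the Cape Town data set; the fixed travel-time floor is what makes extra vehicles stop helping, mirroring the study's conclusion.

```python
# Doubling the fleet repeatedly: queueing delay vanishes, travel time remains.
import random

def simulate(n_ambulances, n_calls=2000, mean_gap=10.0, travel=8.0, job=60.0, seed=7):
    """Mean response time (minutes) over n_calls priority 1 incidents."""
    rng = random.Random(seed)
    free_at = [0.0] * n_ambulances       # time each vehicle next becomes idle
    t, responses = 0.0, []
    for _ in range(n_calls):
        t += rng.expovariate(1.0 / mean_gap)        # next call arrives
        i = min(range(n_ambulances), key=lambda k: free_at[k])
        start = max(t, free_at[i])                  # call queues if all vehicles busy
        responses.append(start - t + travel)        # queueing delay + travel time
        free_at[i] = start + job                    # vehicle tied up for the whole job
    return sum(responses) / len(responses)

for n in (6, 12, 24, 48):                           # doubling the fleet each time
    print(n, round(simulate(n), 1))                 # gains shrink toward the travel floor
```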
Optimisering van die bedryf van besproeiingskanaalstelsels [Optimisation of the operation of irrigation canal systems]
- Authors: Benade, Nico
- Date: 2014-03-10
- Subjects: Irrigation engineering , Computer software - Development , Computer simulation
- Type: Thesis
- Identifier: uj:4246 , http://hdl.handle.net/10210/9604
- Description: M.Ing. , An optimization system, consisting of a water office database, monitoring stations, a communication system and a simulation model, is described. The main objective of the optimization system is to minimize management-related distribution losses in irrigation canals. The optimization system can be implemented in parts, or as a whole, depending on the requirements of the user. This property makes it flexible and facilitates systematic implementation on an irrigation scheme. The water office database, which was developed on an IBM PC, promotes computerization of the water register and facilitates compilation of water accounts. Input hydrographs can also be recalculated at short notice. The monitoring stations consist mainly of water loggers and sensors which record water depth as a function of time. These stations can be connected telemetrically to a computer in the water office. The telemetric connection makes it possible to monitor canal operation from the water office and can be used as an aid in water loss control. The recording stations play an important role in the calibration of the simulation model. The simulation model, also developed on an IBM PC, simulates unsteady non-uniform flow of water in irrigation canals by solving the St Venant equations, discretized with the aid of the Preissmann scheme (the equations are reproduced after this record). The model can simulate a number of water take-offs, restricted to a maximum of 1300 take-offs per canal. Changing slope, changing roughness, manual and upstream-controlled sluices, pressure-controlled and manual turnouts, weirs, transition losses, discharge and water depth as a function of time at the end of the canal, free overflows, any change in cross-section and any losses in the form of seepage and evaporation can also be taken into account. The five types of cross-section that can be handled are trapezoidal, rectangular, circular, triangular and parabolic. Flow in irregular cross-sections of rivers can be simulated by storing cross-section properties in table format. Water flow in pipelines and rectangular culverts can also be simulated over short distances. The output of the computer program at each node is available in the form of hydrographs, with a choice of output to a printer or screen. The time-dependent variables that can be examined are discharge, water depth, velocity and cross-sectional area of flow.
- Full Text:
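For reference, the one-dimensional St Venant equations the simulation model solves, in standard conservation form (A: flow area, Q: discharge, h: water depth, q: lateral inflow per unit length, S_0: bed slope, S_f: friction slope); the Preissmann scheme discretises both equations on a four-point implicit stencil:

```latex
\begin{align}
  \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} &= q,\\
  \frac{\partial Q}{\partial t}
    + \frac{\partial}{\partial x}\!\left(\frac{Q^2}{A}\right)
    + gA\,\frac{\partial h}{\partial x} &= gA\,\left(S_0 - S_f\right).
\end{align}
```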
Risks in traditional computer systems development
- Authors: Du Toit, Anton
- Date: 2014-04-14
- Subjects: Risk perception - Data processing , Computer simulation , Management - Data processing
- Type: Thesis
- Identifier: uj:10610 , http://hdl.handle.net/10210/10132
- Description: M.Com. (Accounting) , Please refer to full text to view abstract
- Full Text:
The conceptual design and evaluation of research reactors utilizing a Monte Carlo and diffusion based computational modeling tool
- Authors: Govender, Nicolin
- Date: 2012-08-06
- Subjects: Materials testing reactors , Monte Carlo method , Computer simulation
- Type: Thesis
- Identifier: uj:8934 , http://hdl.handle.net/10210/5406
- Description: M.Sc. , Due to the demand for medical isotopes, new Materials Testing Reactors (MTRs) are being considered and built globally. Different countries have varying design requirements, resulting in a plethora of different designs. South Africa is also considering a new MTR dedicated to medical radio-isotope production. A neutronic analysis of the various designs is used to evaluate the viability of each; most safety and utilization parameters can be calculated from the neutron flux. The code systems used to perform these analyses are either stochastic or deterministic in nature. In performing such an analysis, tracking the depletion of isotopes is essential to ensure that the modeled macroscopic cross-sections are as close as possible to those of the actual reactor. Stochastic methods are currently too slow for depletion analysis, but are very accurate and flexible. Deterministic methods, on the other hand, are much faster, but are generally not as accurate or flexible because of the approximations made in solving the Boltzmann transport equation (the diffusion approximation is shown after this record). The aim of this work is therefore to synergistically use a deterministic (diffusion) code to obtain an equilibrium material distribution for a given design and a stochastic (Monte Carlo) code to evaluate the neutronics of the resulting core model - a hybrid approach to conceptual core design. A comparison between the hybrid approach and the diffusion code demonstrates the limitations and strengths of the diffusion-based calculational path for various core designs. To facilitate the described process, and to implement it in a consistent manner, a computational tool termed COREGEN has been developed. This tool creates neutronics models of conceptual reactor cores for both the Monte Carlo and diffusion codes in order to implement the described hybrid approach. The system uses the Monte Carlo based MCNP code system, developed at Los Alamos National Laboratory, as the stochastic solver, and the nodal-diffusion-based OSCAR-4 code system, developed at Necsa, as the deterministic solver. Given basic input for a core design, COREGEN generates detailed OSCAR-4 and MCNP input models. An equilibrium core obtained by running OSCAR-4 is then used in the MCNP model. COREGEN analyzes the most important core parameters with both codes and provides comparisons. In this work, various MTR designs are evaluated against the primary requirement of isotope production. A heavy-water-reflected core with 20 isotope production rigs was found to be the most promising candidate. Based on the comparison of the various parameters between Monte Carlo and diffusion for the various cores, the diffusion-based OSCAR-4 system compares well with Monte Carlo in the neutronic analysis of cores with in-core irradiation positions (average error 4.5% in assembly power). However, for the heavy-water-reflected cores with ex-core rigs, the diffusion method differs significantly from the Monte Carlo solution in the rig positions (average error 17.0% in assembly power), and parameters obtained from OSCAR must be used with caution in these ex-core regions. The solution of the deterministic approach in in-core regions corresponded to the stochastic approach within 7% (in assembly-averaged power) for all core designs.
- Full Text:
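For reference, the diffusion approximation underlying the deterministic OSCAR-4 path, shown here in steady-state, one-group eigenvalue form for brevity (OSCAR-4 itself is a multi-group nodal code; the one-group form is an illustrative simplification). Here D is the diffusion coefficient, \Sigma_a the absorption cross section, \nu\Sigma_f the fission neutron production cross section, \phi the scalar neutron flux and k the multiplication factor:

```latex
\begin{equation}
  -\nabla \cdot D(\mathbf{r})\, \nabla \phi(\mathbf{r})
  + \Sigma_a(\mathbf{r})\, \phi(\mathbf{r})
  = \frac{1}{k}\, \nu\Sigma_f(\mathbf{r})\, \phi(\mathbf{r})
\end{equation}
```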