Finally, the methodology will be validated against established procedures for such studies, and care will be taken to adhere to the practices of international research studies as far as possible.

Data Collection Technique

Data will be collected from secondary sources, including online libraries, peer-reviewed journals, and prior research closely related to the topic. Because secondary data can be skewed, and such bias would affect the outcome of any research built on it, care will be taken to select secondary data that is not skewed.
Accordingly, secondary sources of data, especially those available on the World Wide Web, will be used to develop a comprehensive picture of the topic. The inquiry will therefore involve selecting articles from published, peer-reviewed books and journals, as well as credible web sources that contain information on Simulated Annealing and the Genetic Algorithm. Furthermore, all cited articles and books will be up to date, having been published from 1990 onwards. Similarly, websites used in the study will include certified agency websites, news articles from credible media companies, and materials from other authoritative web sources.
Limiting the research to secondary sources will facilitate the analysis of numerous articles from diverse sources and authors on the subject of Simulated Annealing and the Genetic Algorithm.

Report Summary: As a problem statement, it can be highlighted that global optimisation techniques are currently attracting a great deal of research interest, and new application areas are being explored. Simulated Annealing and the Genetic Algorithm involve a large amount of computation during analysis. The data is entered in the form of algorithms or probabilistic patterns for the purpose of computation.
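To make the probabilistic-pattern idea concrete, the following is a minimal sketch of simulated annealing in Python. The function names, the cooling schedule, and the example objective f(x) = (x - 3)^2 are illustrative assumptions, not taken from the source; they only show the standard accept-worse-moves-with-shrinking-probability mechanism.

```python
import math
import random

def simulated_annealing(cost, start, step, temp=10.0, cooling=0.95, iters=2000):
    """Minimise `cost` starting from `start`; `step` proposes a neighbour."""
    current = start
    current_cost = cost(current)
    best, best_cost = current, current_cost
    for _ in range(iters):
        candidate = step(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Always accept improvements; accept worse moves with a
        # probability exp(-delta / temp) that shrinks as temp cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        temp *= cooling  # geometric cooling schedule
    return best, best_cost

# Illustrative example: minimise f(x) = (x - 3)^2 starting from x = 0.
random.seed(42)
x, fx = simulated_annealing(lambda x: (x - 3) ** 2, 0.0,
                            lambda x: x + random.uniform(-1, 1))
```

The occasional acceptance of worse candidates is what lets the method escape local minima, which is the distinguishing feature relative to plain hill climbing.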
The main purpose of global optimisation techniques is to reduce the computation time for large amounts of data and to obtain a better solution. Since they can use GPU memory, it is possible to process large and complex data sets, such as the genetic variability of a cattle population in a given location. Some models may take a large amount of time to process certain information, but global optimisation techniques can reduce processing time through changes to the code. To process data using these techniques, one must decompose the data into small units and arrange the data in a manner that the algorithms can handle.
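The decompose-then-process pattern described above can be sketched in a few lines of Python. The `chunk` helper and the chunk size are hypothetical names chosen for illustration; the point is only that each unit can be processed independently and the partial results combined afterwards.

```python
def chunk(data, size):
    """Decompose a flat data set into fixed-size units."""
    return [data[i:i + size] for i in range(0, len(data), size)]

# Each unit could be handed to a separate worker, thread, or GPU kernel.
units = chunk(list(range(10)), size=4)   # [[0,1,2,3], [4,5,6,7], [8,9]]

partial_sums = [sum(u) for u in units]   # process each unit independently
total = sum(partial_sums)                # combine the partial results
```

In a real GPU setting the per-unit step would be a kernel launch rather than a Python `sum`, but the decomposition and recombination structure is the same.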
Global optimisation techniques allow convenient access to a pool of computing resources that can be configured with minimal interaction. They are revolutionizing computational work by allowing practitioners to tap into extremely powerful computing resources through the server. The key lesson is that, given a problem decomposition decision, programmers will typically have to select from a variety of algorithms. Some of these algorithms achieve different trade-offs while maintaining the same numerical accuracy.
Others sacrifice some level of accuracy to achieve much more scalable running times. The cutoff strategy is perhaps the most popular of these: although cutoff is usually introduced in the context of electrostatic potential map calculation, it is used in many domains, including ray tracing in graphics and collision detection in games (Kirk and Hwu, 2014, p. 91). Computational thinking skills allow an algorithm designer to work around such roadblocks and reach a good solution.

Literature Review: Genetic Algorithms

A genetic algorithm is good at determining an optimal solution, such as determining the minimum number of tests.
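The cutoff strategy can be illustrated with a small Python sketch of an electrostatic potential calculation in 2D. The function, the grid, and the atom positions are hypothetical; the essential idea from the text is that atoms beyond a cutoff radius are simply skipped, trading a small loss of accuracy for a large reduction in work.

```python
import math

def potential_with_cutoff(grid_points, atoms, cutoff):
    """Approximate the potential at each grid point, ignoring atoms
    beyond the cutoff radius (accuracy traded for scalable runtime)."""
    potentials = []
    for gx, gy in grid_points:
        v = 0.0
        for ax, ay, charge in atoms:
            r = math.hypot(gx - ax, gy - ay)
            if r == 0.0 or r > cutoff:
                continue  # skip coincident and distant atoms
            v += charge / r
        potentials.append(v)
    return potentials

atoms = [(0.0, 0.0, 1.0), (10.0, 0.0, 1.0)]
# With cutoff 5.0, only the nearby atom contributes at point (1, 0):
vals = potential_with_cutoff([(1.0, 0.0)], atoms, cutoff=5.0)
# vals[0] == 1.0  (charge 1 at distance 1; the atom 9 units away is skipped)
```

Without the cutoff test, every atom contributes to every grid point, and the cost grows as the product of the two counts; with it, only a bounded neighbourhood matters per point.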
An algorithm is selected and used for the purposes of matrix multiplication, division, and subtraction. The genetic algorithm relies on the in-built spatial sensitivity of the array to obtain some of the computational information from the data. In genetic algorithm techniques, several reduced data sets are obtained simultaneously by using an array of receiver coils, such as a four-channel phased-array coil, in order to increase the sampling speed.
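The selection, crossover, and mutation loop of a genetic algorithm can be sketched as follows. This is a minimal illustrative implementation, not the specific algorithm discussed in the literature; the parameter values and the OneMax example objective (maximising the number of 1-bits) are assumptions chosen to keep the example self-contained.

```python
import random

def genetic_algorithm(fitness, length, pop_size=30, generations=60,
                      mutation_rate=0.02):
    """Evolve bit strings toward higher `fitness` via tournament
    selection, single-point crossover, and bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def select():
            # Tournament of size two: keep the fitter of two parents.
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(), select()
            cut = random.randrange(1, length)        # single-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (random.random() < mutation_rate)
                     for bit in child]               # bit-flip mutation
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Illustrative example: maximise the count of 1s in a 20-bit string.
random.seed(0)
best = genetic_algorithm(sum, length=20)
```

The same loop structure applies to problems such as test-set minimisation mentioned above: only the encoding of a candidate solution and the fitness function change.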