John W
Posts: 96 | Joined: 6/18/2011 | Location: Sydney, NSW, Australia

When training GAs I often notice that some of the re-initialization intervals produce fantastic rules, others are so-so, and others actually worsen the training result.

Here's a simple example: training for 300 iterations and re-initializing every 100 iterations. In the first 100 iterations the rules formed did little to improve the equity curve. The rules generated in iterations 101-200 caused a dramatic lift in performance, and iterations 201-300 caused a performance decline.

Wouldn't it be great if the settings used by the GA in the 101-200 period could be extended for more iterations, since those settings appear to be finding valuable rules and getting good training results?

Question - would adding an 'n' iteration extension period (e.g. 2X the interval size) using the settings from the 'best' re-initialization interval result in better rules? In this simple example, after the 300 iterations had been run, the interval 101-200 settings would be applied again to see whether further valuable rules could be gleaned by running another 200 iterations.

Or could the GA automatically use the best interval's settings and then decide for itself when to terminate - i.e. when those settings no longer generate rules that improve results at a rapid rate?

[Edited by John W on 2/1/2020 10:39 PM]
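To make the idea concrete, here is a minimal sketch of the "extend the best interval, stop when gains slow down" logic. Everything in it is an assumption for illustration: `run_ga_interval` is a stand-in for a real GA run (each iteration just adds a fixed fitness gain), and the `decay` factor models diminishing returns as the settings exhaust their useful rules. The function and parameter names are hypothetical, not part of any actual GA trainer.

```python
def run_ga_interval(gain_per_iter, start_fitness, iterations):
    """Stand-in for one GA re-initialization interval.

    A real GA would evolve trading rules here; this stub just adds a
    fixed per-iteration fitness gain so the selection logic below is
    easy to follow.
    """
    return start_fitness + gain_per_iter * iterations


def train_with_best_extension(interval_gains, interval=100, chunk=100,
                              decay=0.5, min_rate=0.05, max_extensions=10):
    """Run each re-initialization interval with its own settings, pick the
    settings whose interval improved fitness fastest, then keep extending
    with those settings in chunks until the improvement rate per iteration
    drops below `min_rate` (or `max_extensions` chunks have been run)."""
    fitness = 0.0
    rates = []
    # Phase 1: the normal re-initialization schedule.
    for gain in interval_gains:
        before = fitness
        fitness = run_ga_interval(gain, fitness, interval)
        rates.append((fitness - before) / interval)

    # Phase 2: pick the best interval's settings and extend them.
    best = max(range(len(rates)), key=lambda i: rates[i])
    # Assumed model of diminishing returns: each extension chunk yields
    # a decayed fraction of the previous chunk's gain rate.
    gain = interval_gains[best] * decay
    extensions = 0
    while gain >= min_rate and extensions < max_extensions:
        fitness = run_ga_interval(gain, fitness, chunk)
        gain *= decay
        extensions += 1
    return best, fitness, extensions


# Mirror the example in the post: interval 1 does little, interval 2
# lifts performance dramatically, interval 3 declines.
best, fitness, extensions = train_with_best_extension([0.02, 1.0, -0.3])
print(best, extensions, round(fitness, 2))
```

With those made-up gain rates, the second interval (index 1) is selected and is extended for a few chunks before the decayed improvement rate falls below the termination threshold, which is exactly the automatic-stop behavior the post asks about.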