Thanks, your post saved me a lot of time. I was also interested in using XGBoost as the model in RL, so I started looking for ways to implement that. I was very surprised to read in your post that GBTs cannot be trained partially/incrementally, so I did some research. Short version (summarized from hcho3's answer here): it's a fundamental limitation of the GBT algorithm itself, not of any specific library (XGBoost/LightGBM/etc.). Tree construction requires the whole dataset up front to find optimal splits. One way to sort of do this (in a very limited fashion) is `process_type: update`, but it only modifies the leaf values, not the tree structure. That basically makes it a no-go for RL, where you need a good initial starting point. There is some discussion here as well.

