I'm becoming concerned that we should stop advising people to optimize the Solr indexes. The common advice from the Solr community is "don't optimize unless you have a specific good reason." Lucene is designed to manage its index segments well in most cases, and I'm not convinced that we are a special case. Are we? Why?
There are known cases, not all that uncommon, in which optimizing aggressively can create a segment which is too big to merge: a segment larger than the merge policy's maximum merged-segment size will never be selected for natural merging, so it accumulates deleted documents forever and becomes increasingly less efficient than average. The problem has been addressed in Solr 7.5, but it's still a good idea in most cases to just let Lucene manage its segments as designed.
We should remove the optimization code from DSpace altogether. If you really know what you are doing, and thus probably have a good reason, you can use Solr's admin console to optimize a core when it needs it. If you aren't comfortable with doing that, you probably shouldn't be monkeying with optimization.
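For anyone who does have a good reason, it can also be done from the command line rather than the admin console. A sketch, assuming a Solr instance on the default port and a hypothetical core name (`statistics` here stands in for whichever core actually needs it):

```shell
# Ask Solr to force-merge (optimize) the segments of one core.
# Requires a running Solr; adjust host, port, and core name to your install.
curl 'http://localhost:8983/solr/statistics/update?optimize=true&waitSearcher=true'
```

`optimize=true` on the update handler is the same operation the admin console triggers, so the point stands either way: this is an explicit, deliberate act by an administrator, not something DSpace should do automatically.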
I note that there is prior discussion of Solr optimization in
DS-615. I will also note that there have been nine years of Lucene development since then. Do we understand why folks back then saw improvements, and are those reasons still valid?