You may have heard the phrase 'less is more'. I believe it applies well to predictive modeling and programming in general. Perhaps it also applies to the transmission of information more generally. As the quip often attributed to Mark Twain goes: I did not have time to write a short letter, so I wrote a long one instead.
Even in the sciences, compact equations (E = mc²) are elegant and seem to demonstrate that we have understood something about our reality in a profound way. Larger, messier representations of reality convey a feeling that more work needs to be done to distill out irrelevant details and get to a deeper state of understanding.
Predictive modeling is no exception, in my opinion. More complex models are not 'better'. In fact, they are sub-optimal if simpler models with fewer assumptions are possible with little or no loss of performance. Most of the time one would (and should) trade off some performance for simplicity. And by the way, this does not apply only to the final model form (e.g. 10 predictors instead of 5); it also applies to modeling techniques (e.g. black-box methods vs. transparent and easier-to-understand approaches).
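As a minimal sketch of this trade-off, assuming scikit-learn and a synthetic dataset (the dataset parameters and model choices here are illustrative assumptions, not a recommendation): compare a transparent model against a black-box one under the same cross-validation, and ask whether the performance gap justifies the extra complexity.

```python
# Illustrative only: synthetic data and default-ish settings, assumed for
# the sake of the example. The point is the comparison, not the numbers.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

# Synthetic classification problem with a handful of informative features.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, random_state=0)

# A simple, transparent model vs. a more complex black-box model.
simple = LogisticRegression(max_iter=1000)
black_box = GradientBoostingClassifier(random_state=0)

# Same 5-fold cross-validation for both; report mean accuracy.
simple_acc = cross_val_score(simple, X, y, cv=5).mean()
black_box_acc = cross_val_score(black_box, X, y, cv=5).mean()

print(f"logistic regression (transparent): {simple_acc:.3f}")
print(f"gradient boosting (black box):     {black_box_acc:.3f}")
# If the gap is small, the simpler, more explainable model is
# arguably the better choice.
```

If the black-box model buys only a point or two of accuracy, the simpler model is often the right call: it is easier to explain, audit, and maintain.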
This is 2019 – and our predictive modeling efforts are still built for review and use by other humans. When the time comes that our algorithms are for the sole consumption of our machine overlords, you can chuck the principle of parsimony and be as opaque as you please. Until then, keep simplifying your methods and models relentlessly, stopping only at the point where the degradation in performance becomes significant for your application. You will build better models, communicate them more effectively, and increase the odds that you have understood something real about the world.