I'm not involved with the science behind the project or its computing, but my understanding is that the longer runs are more scientifically useful.
I want to point out that, for the electricity used, older cards simply don't compute as efficiently as current-generation cards. Consider the electricity spent per computational result: by its nature, this project computes the same thing thousands and thousands of times (or more) to gather enough statistical information to predict a protein's behavior, so it is very power intensive. Collectively, the 10,000 or so GPUs running this project are drawing a few megawatts of power.
Since this is the nature of grid computing, perhaps it's better to let more modern hardware run: an RTX 2070, for example, will compute roughly five times faster than an old GTX 680 while using a fraction of the power.
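The efficiency argument comes down to energy per completed result, not raw speed. The sketch below illustrates the arithmetic; the throughput ratio is the post's ~5x claim, and the power draws are assumed round numbers near each card's published board power, not measured figures.

```python
# Rough energy-per-result comparison for two GPUs.
# All numbers are illustrative assumptions: the post claims a ~5x
# speedup for an RTX 2070 over a GTX 680; the wattages below are
# approximate board powers, not measured draw under this workload.

def wh_per_unit(watts: float, units_per_hour: float) -> float:
    """Energy (watt-hours) consumed per completed work unit."""
    return watts / units_per_hour

# Assumed: GTX 680 at 195 W finishing 1 unit/hour,
# RTX 2070 at 175 W finishing 5 units/hour (the ~5x claim).
old = wh_per_unit(195, 1.0)   # energy cost per unit on the old card
new = wh_per_unit(175, 5.0)   # energy cost per unit on the new card

print(f"GTX 680:  {old:.0f} Wh per work unit")
print(f"RTX 2070: {new:.0f} Wh per work unit")
print(f"Energy ratio: {old / new:.1f}x")
```

Under these assumptions the older card burns over five times the electricity for the same scientific result, which is the core of the argument.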
Check where your local municipality or state generates its power. California, for example, publishes its real-time supply mix at http://www.caiso.com/TodaysOutlook/Pages/supply.aspx. If your grid is all hydro, nuclear, or solar, go for it; but if you're somewhere like California, which generates 70-90% of its electricity from natural gas, the power used to run these older cards contributes directly to CO2 emissions. It may be better to retire those older cards, especially since most of the computing being done here is a brute-force method of gathering statistics and probabilities.
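To make the emissions point concrete, here is a back-of-envelope estimate. The emissions factor is an assumption (roughly 0.4 kg CO2 per kWh is a commonly cited figure for natural-gas generation); substitute your own grid's number from your local supply-mix data.

```python
# Back-of-envelope CO2 estimate for running an older card 24/7.
# The 0.4 kg CO2/kWh factor is an assumed value for gas-heavy
# generation; the 195 W draw is an assumed figure for a GTX 680.

def co2_kg(card_watts: float, hours: float, kg_co2_per_kwh: float) -> float:
    """Estimated CO2 in kg for running a card at a constant draw."""
    kwh = card_watts * hours / 1000.0
    return kwh * kg_co2_per_kwh

# Assumed: a 195 W card running 24 h/day for a 30-day month.
monthly = co2_kg(195, 24 * 30, 0.4)
print(f"~{monthly:.0f} kg CO2 per month")
```

On these assumptions a single old card accounts for tens of kilograms of CO2 per month, which is why replacing it with a card that does the same work on a fifth of the energy matters at the scale of thousands of GPUs.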