For the past couple of years, we had not been focused on applications at all, but that has changed a bit recently. In Numenta's 2018 Year in Review, @Subutai talked about how we are now looking for ways to apply components of our theory within ML frameworks. The most obvious place to start is with SDRs. The way the Spatial Pooler (SP) creates sparsity retains semantic meaning, which makes it more valuable than something like dropout: instead of zeroing out random connections, the SP is learning which units should stay active.
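To make the contrast concrete, here is a minimal sketch in a PyTorch-style setup. The `KWinners` module below is my own simplified illustration of structured sparsity (keep only the strongest k activations per sample), not Numenta's actual SP or their nupic.torch implementation; the point is only that which units survive depends on the input, whereas dropout's survivors are random.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KWinners(nn.Module):
    """Keep only the k largest activations per sample; zero out the rest.
    A simplified, illustrative stand-in for learned, input-driven sparsity."""
    def __init__(self, k):
        super().__init__()
        self.k = k

    def forward(self, x):
        # Indices of the top-k activations in each row of the batch
        topk = torch.topk(x, self.k, dim=1).indices
        mask = torch.zeros_like(x).scatter(1, topk, 1.0)
        return x * mask

batch = torch.randn(4, 128)

# Dropout: zeroes *random* units, so the surviving pattern carries
# no information about the input.
dropped = F.dropout(batch, p=0.5, training=True)

# k-winners: similar inputs tend to keep the same strong units active,
# so the sparse pattern itself still encodes something about the input.
sparse = KWinners(k=16)(batch)
```

In a real SP the active columns are also shaped by learning over time, which this static top-k sketch does not capture.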