I was looking at the OPF model (link). I had this query: basically, the TM predicts the next set of neurons that is going to fire. In the link above, during training, we are sending SDRs where every SDR is a combination of a timestamp and a consumption value. So the next prediction by the TM is going to be a set of neurons (also an SDR) which will itself be a combination of some timestamp (time of day, weekday or not) and a predicted consumption value… right?
If that's so, how are we getting the predicted consumption value, and how are we inferring which timestamp it corresponds to?
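To make my question concrete, here is a toy sketch of what I understand the input side to look like (this is not the real NuPIC encoder API, just an illustration of concatenating a time encoding and a scalar encoding into one SDR — the widths and ranges are made up):

```python
# Toy sketch, NOT the real NuPIC encoders: combining a timestamp field and a
# consumption field into a single input SDR by concatenating each field's bits.

def encode_scalar(value, min_v, max_v, n_bits=40, w=5):
    """Simple scalar encoder: a run of `w` active bits whose start position
    tracks where `value` falls within [min_v, max_v]."""
    span = max_v - min_v
    start = int((value - min_v) / span * (n_bits - w))
    return [1 if start <= i < start + w else 0 for i in range(n_bits)]

def encode_input(hour_of_day, consumption):
    # Each field gets its own encoding; the model sees the concatenation.
    time_bits = encode_scalar(hour_of_day, 0, 23, n_bits=40, w=5)
    value_bits = encode_scalar(consumption, 0.0, 100.0, n_bits=40, w=5)
    return time_bits + value_bits

sdr = encode_input(hour_of_day=14, consumption=37.5)
print(len(sdr), sum(sdr))  # total width and number of active bits
```

If the input really is such a concatenation, then a "predicted SDR" presumably has bits in both the time region and the value region, which is exactly what I'm asking about: how does one value get read back out of it?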
And can someone please explain `multiStepBestPredictions`? I don't get the point of a "best" prediction; I thought there was only one prediction at any point in time by the TM. Is it related to the case where there might be multiple simultaneous predictions (like the "eats" word in the HTM School episode)? If so, does that mean `multiStepBestPredictions` makes sense only in case of ambiguous predictions (when there can be more than one prediction)?
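My current guess (please correct me if this is wrong) is that the model reports a distribution of candidate values per step count, and `multiStepBestPredictions` just picks the most likely one. A toy sketch of that picking logic, with made-up numbers and a dict shape that only mirrors what I think the OPF inference output looks like:

```python
# Hypothetical numbers; the shape is my guess at the OPF inference dict:
# step count -> {candidate value: likelihood}.
multi_step_predictions = {
    1: {35.2: 0.6, 41.0: 0.3, 28.7: 0.1},
    5: {33.0: 0.6, 45.5: 0.4},
}

# "Best" = the candidate with the highest likelihood for each step count.
multi_step_best = {
    step: max(dist, key=dist.get)
    for step, dist in multi_step_predictions.items()
}
print(multi_step_best)  # prints {1: 35.2, 5: 33.0}
```

If that's right, then "best" would make sense even when there is only one candidate, it just becomes trivial in that case.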
And what's a 5-step prediction? Is it that the model predicts the next timestep, feeds that prediction back in as input to get the prediction for the timestep after, and so on five times, then gives the output?
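In other words, is it the iterated scheme below? This is only a sketch of the loop I'm describing, with a stand-in one-step predictor (a toy linear rule, obviously not a real TM):

```python
# Sketch of the feed-predictions-back-in scheme I'm asking about.

def predict_next(x):
    # Stand-in for "the model predicts the next input from the current one".
    return 0.5 * x + 1.0

def predict_n_steps(x, n=5):
    # Feed each prediction back in as the next input, n times.
    for _ in range(n):
        x = predict_next(x)
    return x

print(predict_n_steps(10.0, n=5))
```

Or does 5-step prediction instead mean the model is trained to map the current state directly to the value 5 steps ahead, without this feedback loop?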
Are 1-step and 5-step predictions even comparable?
And can I know what's `aggregation` in the