I’m still journeying through the wilderness of the CLA Classifier code trying to wrap my head around it… and I’ve got another question.

Lines 129-145 of `BitHistory`, for reference, are as follows:

```
// This is to prevent errors associated with infinite rescale if too large
if(denom == 0 || dcNew > DUTY_CYCLE_UPDATE_INTERVAL) {
    double exp = Math.pow((1.0 - classifier.alpha), (iteration - lastTotalUpdate));
    double dcT = 0;
    for(int i = 0; i < stats.size(); i++) {
        dcT *= exp;
        stats.set(i, dcT);
    }
    // Reset time since last update
    lastTotalUpdate = iteration;
    // Add alpha since now exponent is 0
    dc = stats.get(bucketIdx) + classifier.alpha;
} else {
    dc = dcNew;
}
```

As I understand it, this block of code exists to guard against brobdingnagian-sized duty cycle values (i.e. duty cycles that are larger than `Integer.MAX_VALUE`). If a duty cycle this large is calculated, then the above block of code is entered and it proceeds to perform some arcane black magic on all of the buckets’ duty cycle values. The end result is that each of the duty cycles in `stats` (the `TDoubleList` that holds the duty cycle values for each bucket) is scaled by `exp` such that they are made smaller, but still retain the same ratios to one another. This means that we must also scale down the brobdingnagian duty cycle that caused us to enter this if-statement in the first place. That’s what line 142 is for:

```
dc = stats.get(bucketIdx) + classifier.alpha;
```

…which just takes the previous value for that bucket and adds alpha to it (a result of the formula used to calculate the duty cycles, found on line 121).
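A quick sketch of why multiplying every bucket by the same `exp` factor shrinks the magnitudes but preserves the ratios (all numbers below are made up for illustration, not taken from the classifier):

```java
double alpha = 0.001;  // assumed learning rate, not the classifier's actual value
long iteration = 10_000, lastTotalUpdate = 0;
double exp = Math.pow(1.0 - alpha, iteration - lastTotalUpdate);

double a = 4.0e9, b = 2.0e9;  // two hypothetical oversized duty cycles
double aScaled = a * exp, bScaled = b * exp;

System.out.println(aScaled < a && bScaled < b);  // prints true: both shrink
System.out.println(a / b == aScaled / bScaled);  // prints true: ratio preserved
```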

So, this whole mess just keeps the duty cycles from exceeding the `Integer.MAX_VALUE` threshold, while preserving the ratios between the duty cycles themselves. Great.

Now my question: **Why are the duty cycle values for all of the buckets (besides the currently active one) just reset to 0?** I’m probably overlooking something terribly obvious, but that is what lines 131-136 appear to be doing:

```
double exp = Math.pow((1.0 - classifier.alpha), (iteration - lastTotalUpdate));
double dcT = 0;
for(int i = 0; i < stats.size(); i++) {
    dcT *= exp;
    stats.set(i, dcT);
}
```

Doesn’t this skew the ratios between the duty cycles? Aren’t there easier ways to reset a list to 0? And why calculate `exp` if we don’t even use it?
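To convince myself I wasn’t misreading it, I ran the loop on a toy list (the values are made up, and a plain `List<Double>` stands in for the `TDoubleList`):

```java
import java.util.ArrayList;
import java.util.List;

List<Double> stats = new ArrayList<>(List.of(3.0, 6.0, 9.0));  // made-up duty cycles
double exp = 0.5;  // assumed decay factor; its value never actually matters below

// The loop exactly as it appears in the Java version:
double dcT = 0;
for (int i = 0; i < stats.size(); i++) {
    dcT *= exp;        // dcT starts at 0, so this is always 0 * exp = 0
    stats.set(i, dcT);
}
System.out.println(stats);  // prints [0.0, 0.0, 0.0]: every bucket zeroed regardless of exp
```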

Also, the Python version appears to be doing it as I described above - not just resetting them all to 0 (lines 139-143 of the Python version):

```
exp = ((1.0 - self._classifier.alpha) **
       (iteration - self._lastTotalUpdate))
for (bucketIdxT, dcT) in enumerate(self._stats):
    dcT *= exp
    self._stats[bucketIdxT] = dcT
```

I am *not* a python expert at all, but I think the above code is looping over each bucket with `enumerate` and reading its current value before scaling it, so the equivalent Java code would be:

```
double exp = Math.pow((1.0 - classifier.alpha), (iteration - lastTotalUpdate));
for(int i = 0; i < stats.size(); i++) {
    double dcT = stats.get(i);
    dcT *= exp;
    stats.set(i, dcT);
}
```
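Running that corrected loop on the same toy values (again, made up) scales each bucket by `exp` while keeping the ratios intact:

```java
import java.util.ArrayList;
import java.util.List;

List<Double> stats = new ArrayList<>(List.of(3.0, 6.0, 9.0));  // made-up duty cycles
double exp = 0.5;  // assumed decay factor

for (int i = 0; i < stats.size(); i++) {
    double dcT = stats.get(i);  // read the current value first, as the Python version does
    dcT *= exp;
    stats.set(i, dcT);
}
System.out.println(stats);  // prints [1.5, 3.0, 4.5]: each halved, ratios 1:2:3 preserved
```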

Again, I’m probably just overlooking something incredibly obvious, or don’t understand what the code is supposed to be doing. That wouldn’t be anything new. I’d really appreciate it if any of you HTM wizards could help me out with understanding this.

Just to recap, I essentially have two questions:

- Is my understanding of lines 129-145 of `BitHistory` correct?
- Why do the Java and Python implementations differ? Well, not that they do, but what is it that I’m misunderstanding here that is causing me to think they do?