# Off by 1? 121.0 != 120

I have read many of Numenta’s research papers and the book “On Intelligence”, and I must say, I’m hooked. So I tried doing what I do well: coding.

I am having a bizarre issue. I started with the Algorithms API tutorial, got it working, and understand what happens. I then tried to modify it to learn the sequence “boy”, feeding in the word one letter at a time. At each iteration, I simply convert the character to its Unicode code point.

Now, I can see that the sequence is learned, but I have trouble converting the output back to a character. I am using a classifier to translate active cells back to a scalar value, and then back to a character. The code below is executed in a loop.

```python
string = "boy"
symbol = ord(string[count % 3])
count += 1

# nothing here should be surprising, regular nupic operations
# a lot of boilerplate code was left out for brevity,
# but it is a simple copy-paste from the Algorithms API tutorial
symbolBits = numpy.zeros(scalarEncoder.getWidth())
scalarEncoder.encodeIntoArray(symbol, symbolBits)
encoding = numpy.concatenate([symbolBits])
activeColumns = numpy.zeros(spParams["columnCount"])
sp.compute(encoding, True, activeColumns)
activeColumnIndices = numpy.nonzero(activeColumns)[0]
tm.compute(activeColumnIndices, learn=True)
activeCells = tm.getActiveCells()
bucketIdx = scalarEncoder.getBucketIndices(symbol)[0]
classifierResult = classifier.compute(
    recordNum=count,
    patternNZ=activeCells,
    classification={
        "bucketIdx": bucketIdx,
        "actValue": symbol
    },
    learn=True,
    infer=True
)

# take the most confident 1-step-ahead prediction
oneStepConfidence, oneStep = sorted(
    zip(classifierResult[1], classifierResult["actualValues"]),
    reverse=True
)[0]

# this is where things get weird
print "start"
print oneStep
print int(oneStep)
print "end"
```

Shockingly, this is what I see printed:

```
start
98.0
98
end
start
111.0
111
end
start
121.0
121
end
start
121.0
121
end
start
121.0
121
end
start
121.0
121
end
start
121.0
120
end
start
121.0
120
end
start
98.0
98
end
start
111.0
110
end
```

I can see that the oneStep values are correct, but since they are floats, I am attempting to convert them to int. At first it works fine, but then I end up with wrong int values. So instead of getting “boyboyboy”, I am getting “bnxbnxbnx” (off by 1). I know that ints have less precision than floats, but 121.0 should always convert to 121.

I cannot reproduce this issue in a standalone Python shell: simply converting 121.0 to an int always gives me 121 back.

What am I missing here?


Hi @Rafal,

Welcome to the forums!

What you’re seeing are the pitfalls of floating-point arithmetic. Python stores only 53 bits of precision in a float, so behaviour like this is expected. In this case, the value being assigned to the oneStep variable is actually some small amount less than 121.0, e.g. 120.9999999999. You typically end up with a value like this after dividing two numbers.
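A classic illustration of this, independent of NuPIC (standard IEEE 754 double behaviour, nothing specific to your code):

```python
# 0.1 and 0.2 have no exact binary representation,
# so their sum is not exactly 0.3
print(0.1 + 0.2 == 0.3)  # False
print(repr(0.1 + 0.2))   # 0.30000000000000004
```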

Then, the way the value is rounded for printing is slightly different from what happens when it’s cast to an int: print rounds to a small number of significant digits, so 120.9999999999 displays as 121.0, while int() truncates toward zero and gives you 120.

See this example (Python 2.7.15):

```
>>> f=120.9999999999
>>> print f
121.0
>>> print int(f)
120
>>>
```

See this link for more detail: https://docs.python.org/2/tutorial/floatingpoint.html
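The practical fix is to round to the nearest integer instead of truncating. A minimal sketch (the variable name is just for illustration):

```python
f = 120.9999999999         # what the classifier may actually have returned

# int() truncates toward zero, so the tiny error drops you to 120
print(int(f))              # 120

# round to the nearest integer first, then convert
print(int(round(f)))       # 121

# and back to the character you encoded
print(chr(int(round(f))))  # y
```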

Happy coding, and I recommend the HTM school videos if you haven’t watched them yet.


Thank you, I see! I had a hunch that this was related to precision. And yes, HTM School is in my YT library. Great content.
