I’m now sending data through the system in 10 dimensions and getting predictions for my data. After 10,000 timesteps at 1-minute resolution, using SDRClassifiers via AUTO_CLASSIFY, I’m seeing roughly a 10% difference between the values sent and the values predicted 1 minute ahead of time, and it reaches a good level of prediction relatively quickly.
However, I need to make predictions hours or days in advance. So a few questions come to mind:
- (most importantly) How can I set up the Network so that the classifier gets other step sizes beyond the default of 1?
Also, while I’m at it:
- this is called a “classifier” but it seems to be doing what I would call regression. Is there a way to make it perform actual classification on the data?
- I presume that if I want to make predictions of shapes in the data, I should send a series of points in each dimension, lagged by the width of the shape I’m looking for? (See the sketch just below.)
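To make that last question concrete, here’s a toy Scala sketch of the kind of lagging I have in mind - the window width, data, and field names are all made up:

// Toy sketch of the lagging in question 3: turn one stream into lagged
// fields spanning the width of the shape of interest.
val windowWidth = 5
val readings: Seq[Double] = Seq(1.0, 2.0, 3.5, 3.0, 2.0, 1.5, 1.0, 2.5)

// One multi-field row per timestep; "lag0" is the newest point in the
// window, "lag4" the oldest.
val laggedRows: Iterator[Map[String, Double]] =
  readings.sliding(windowWidth).map { window =>
    window.reverse.zipWithIndex.map { case (v, i) => s"lag$i" -> v }.toMap
  }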
Well, at least for question 1, I have a workaround. I noticed the makeClassifiers method has default (package-private) access, so I created a class locally in the same package to override it. In case it helps anyone else, here it is:
package org.numenta.nupic.network

import gnu.trove.list.TIntList
import org.numenta.nupic.Parameters
import org.numenta.nupic.algorithms.{CLAClassifier, SDRClassifier}
import org.numenta.nupic.encoders.MultiEncoder
import org.numenta.nupic.util.NamedTuple

// Lives in org.numenta.nupic.network so it can override the
// package-private makeClassifiers method.
final case class ClassifyingLayer[T](name: String, p: Parameters, config: Option[ClassifierConfig] = None)
    extends Layer[T](name, null, p) {

  override def makeClassifiers(encoder: MultiEncoder): NamedTuple = {
    val tuple = super.makeClassifiers(encoder)
    import scala.collection.JavaConverters._
    config.map { cfg =>
      // Rebuild each default classifier from the supplied config (which can
      // carry multiple step sizes), leaving any other values untouched.
      val newValues = tuple.values().asScala.map {
        case _: SDRClassifier =>
          new SDRClassifier(cfg.steps, cfg.alpha, cfg.actValueAlpha, cfg.verbosity)
        case _: CLAClassifier =>
          new CLAClassifier(cfg.steps, cfg.alpha, cfg.actValueAlpha, cfg.verbosity)
        case v => v
      }.toSeq
      new NamedTuple(tuple.keys, newValues: _*)
    }.getOrElse(tuple)
  }
}

final case class ClassifierConfig(steps: TIntList, alpha: Double = 0.001, actValueAlpha: Double = 0.3, verbosity: Int = 0)
I then create it using the apply method, thus:
private lazy val layer23: Layer[_] =
  ClassifyingLayer("Layer 2/3", p, Some(ClassifierConfig(new TIntArrayList(Seq(1, 2).toArray))))
    .alterParameter(KEY.AUTO_CLASSIFY, true)
    .add(Anomaly.create())
    .add(new TemporalMemory())
    .add(new SpatialPooler())
    .add(MultiEncoder.builder().build())
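Once that layer is wired into a network and fed data, reading out both horizons looks roughly like this - a sketch that assumes a network variable, a field named "value", and the usual observe()/Classification API:

import org.numenta.nupic.network.Inference
import rx.Subscriber

// Sketch: read the 1-step and 2-step predictions configured above.
network.observe().subscribe(new Subscriber[Inference]() {
  override def onNext(inf: Inference): Unit = {
    val c = inf.getClassification("value") // "value" is an assumed field name
    println(s"t+1 -> ${c.getMostProbableValue(1)}, t+2 -> ${c.getMostProbableValue(2)}")
  }
  override def onError(e: Throwable): Unit = e.printStackTrace()
  override def onCompleted(): Unit = ()
})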
Hi @keith.nordstrom,
Like most frameworks just starting out, HTM.java began by being able to execute “basic” functionality, with an eye toward evolving the code’s capacity as time goes on - which is why this version only supports a step size of “1” (and also why God made PRs and community developer pools). (Although God made Java, I’m almost certain the Devil gets collaboration points for Scala.) LOL!
Anyway, if a case is made for opening up the access modifiers in certain areas of the code, that’s not a problem - I’m not a stickler for those kinds of things, and I’m not sure I believe in holding peeps’ hands to prevent them from shooting themselves in the foot. People should be able to do what they need with code, imho!
I like the concept of what you’re doing - providing a “config” object which can be used internally to set properties that are otherwise out of scope. That’s cool. Anyway, I haven’t heard of a “formal classifier” in NuPIC in the application sense - here it is meant to be a mapping of the original input values to the eventual SDR output, together with some storage of metrics - and that’s all?
@cogmission Understood - no criticism implied. I’m happy to contribute if this is functionality that would help your project; I just needed to keep this experiment in motion, and when I dug into the code I was pretty sure the answer to my question above was going to be negative.
As a rule I prefer sending well-typed config objects through an API, because they can be validated. At the risk of sounding critical, the big untyped hash map of parameters in the HTM approach (starting with the Python and ported to Java) is a nerve-wracking thing from a production standpoint. I’ve spent a fair amount of time digging through NullPointerExceptions and ClassCastExceptions while trying to get this running - things I thought I’d left behind when we abandoned Java a few years back.
IMO, if the Devil was involved with either language, it was in allowing the concept of null to proliferate.
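To illustrate what I mean, a typed config can reject bad values at construction time instead of surfacing later as an NPE deep inside a run - a toy example, not htm.java API:

// Toy illustration: validation happens once, at construction.
final case class StepConfig(steps: Seq[Int], alpha: Double = 0.001) {
  require(steps.nonEmpty && steps.forall(_ > 0), s"steps must be positive, got $steps")
  require(alpha > 0.0 && alpha < 1.0, s"alpha must be in (0, 1), got $alpha")
}

StepConfig(Seq(1, 60))      // fine
// StepConfig(Seq(-5), 2.0) // fails fast with IllegalArgumentException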
Thanks for the quick responses, they’re appreciated.
In this pull request I’ve made it available to do multi-step predictions:
I’ve added a KEY for the steps:
// added to Parameters.KEY:
INFERRED_STEPS("inferenceSteps", int[].class),
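The new key can then be set like any other parameter - a sketch assuming this PR, with p being the network’s Parameters instance:

// Predict 1 step and 60 steps ahead (1 minute and 1 hour at 1-minute data):
p.set(KEY.INFERRED_STEPS, Array(1, 60))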
And I modified makeClassifiers too:
NamedTuple makeClassifiers(MultiEncoder encoder) {
    Map<String, Class<? extends Classifier>> inferredFields =
        (Map<String, Class<? extends Classifier>>) params.get(KEY.INFERRED_FIELDS);
    int[] steps = (int[]) params.get(KEY.INFERRED_STEPS);
    if(inferredFields == null || inferredFields.entrySet().size() == 0) {
        throw new IllegalStateException(
            "KEY.AUTO_CLASSIFY has been set to \"true\", but KEY.INFERRED_FIELDS is null or\n\t" +
            // ... (rest of the message elided in the diff) ...
            "value in Parameters).");
    }
    String[] names = new String[encoder.getEncoders(encoder).size()];
    Classifier[] ca = new Classifier[names.length];
    int i = 0;
    // ... (loop header and first branch elided in the diff) ...
        LOGGER.info("Not classifying \"" + et.getName() + "\" input field");
    }
    else if(CLAClassifier.class.isAssignableFrom(fieldClassifier)) {
        LOGGER.info("Classifying \"" + et.getName() + "\" input field with CLAClassifier");
        ca[i] = new CLAClassifier(new TIntArrayList(steps), 0.001, 0.3, 0);
    }
    else if(SDRClassifier.class.isAssignableFrom(fieldClassifier)) {
        LOGGER.info("Classifying \"" + et.getName() + "\" input field with SDRClassifier");
        ca[i] = new SDRClassifier(new TIntArrayList(steps), 0.001, 0.3, 0);
    }
    else {
        throw new IllegalStateException(
            // ... (rest elided in the diff) ...
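With the patch applied, the client-side wiring would look something like this - a Scala sketch against this PR as posted, where "value" is a made-up field name:

import org.numenta.nupic.Parameters
import org.numenta.nupic.Parameters.KEY
import org.numenta.nupic.algorithms.{Classifier, SDRClassifier}
import scala.collection.JavaConverters._

val p: Parameters = Parameters.getAllDefaultParameters()
p.set(KEY.AUTO_CLASSIFY, true)
// Which fields to classify, and with which classifier implementation:
p.set(KEY.INFERRED_FIELDS,
  Map[String, Class[_ <: Classifier]]("value" -> classOf[SDRClassifier]).asJava)
// The key added in this PR: 1 minute, 1 hour and 1 day ahead at 1-minute data:
p.set(KEY.INFERRED_STEPS, Array(1, 60, 1440))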
Hi @Matheus_Araujo,
Hang in there with me, I’m going to get to all of them! Thanks for all of your hard work!
Cheers,
David
I’m the one who should be grateful!
Your work writing htm.java is amazing!
@Matheus_Araujo - thank you my friend, you are too kind!