Add caching #86
Conversation
LGTM
```clojure
([f lru-threshold]
 #?(:clj (memo/lru f :lru/threshold lru-threshold)
    :cljs (memoizee f #js {"max" lru-threshold
                           "normalizer" js/JSON.stringify})))
```
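To illustrate what the `:cljs` branch relies on — an LRU-bounded cache keyed on the stringified arguments — here is a minimal plain-JS sketch. This is an illustration of the technique, not the `memoizee` library itself; `lruMemoize` is a hypothetical helper:

```javascript
// Minimal LRU memoizer keyed on JSON.stringify of the arguments,
// playing the same role as memoizee's "max" and "normalizer" options.
function lruMemoize(fn, threshold) {
  const cache = new Map(); // Map preserves insertion order, so the
                           // first key is always the least recently used.
  return (...args) => {
    const key = JSON.stringify(args); // the "normalizer" step
    if (cache.has(key)) {
      const value = cache.get(key);
      cache.delete(key); // refresh recency by re-inserting at the end
      cache.set(key, value);
      return value;
    }
    const value = fn(...args);
    cache.set(key, value);
    if (cache.size > threshold) {
      cache.delete(cache.keys().next().value); // evict least recent
    }
    return value;
  };
}

let calls = 0;
const slowAdd = (a, b) => { calls += 1; return a + b; };
const cachedAdd = lruMemoize(slowAdd, 2);
cachedAdd(1, 2); // computes
cachedAdd(1, 2); // cache hit; calls is still 1
```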
I wonder if we'll run into trouble using `js/JSON.stringify` with some of the parameters passed to these functions, like model (which can be a special type, e.g. a `reify` instance) or cljs data types, whose stringified form looks pretty weird (though maybe that's ok, if it's more performant than using `clj->js` first...)

It might be worth doing some simple benchmarks before committing to an approach, as some of these conversions can be surprisingly costly.
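For what it's worth, the trouble with opaque values is real in plain JS: `JSON.stringify` drops values it cannot serialize, so two distinct opaque arguments can collapse to the same cache key. A quick sketch (the `Model` class here is a hypothetical stand-in for a reified model object):

```javascript
// Functions are not serializable: JSON.stringify returns undefined,
// so any two function arguments would yield the same (missing) key.
const a = () => 1;
const b = () => 2;
JSON.stringify(a); // undefined

// Objects with no enumerable own properties all serialize to "{}",
// so two different model instances would collide as cache keys.
class Model {}
const m1 = new Model();
const m2 = new Model();
JSON.stringify(m1); // "{}"
```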
My predictions are that it should be fine, but it won't hurt to check. Will have to wait until I get back from vacation, though.
The serialized weirdness should be ok as long as they never result in accidentally identical string representations. If anything, I think the opposite is true. There are probably things that could share keys, but won't. Luckily, that just means some cache misses.
I would be very surprised if clj->js + JSON/stringify was faster than JSON/stringify on the original, but no need to guess! I'll try it out when I get back. We'll see what my posteriors are then.
Well, it turns out I am very surprised. Unfortunately, I don't think it will work out to use clj->js first.
I ran a quick-and-dirty test in cljs:
```clojure
(let [n 100000
      m {"x" 0 :foo {:bar 1 :moop/floop "asdf"} :baz [1 2 3]}
      a (reify IAtom)]
  (js/console.log "JSON.stringify:")
  (time
   (dotimes [_ n]
     (js/JSON.stringify m)
     (js/JSON.stringify :foo)
     (js/JSON.stringify a)))
  (js/console.log "clj->js, then JSON.stringify:")
  (time
   (dotimes [_ n]
     (-> m clj->js (js/JSON.stringify))
     (-> :foo clj->js (js/JSON.stringify))
     (-> a clj->js (js/JSON.stringify))))
  (js/console.log "clj->js w/ str keyword-fn, then JSON.stringify:")
  (time
   (dotimes [_ n]
     (-> m (clj->js :keyword-fn str) (js/JSON.stringify))
     (-> :foo (clj->js :keyword-fn str) (js/JSON.stringify))
     (-> a (clj->js :keyword-fn str) (js/JSON.stringify))))
  (js/console.log "bean/->js, then JSON.stringify:")
  (time
   (dotimes [_ n]
     (-> m (bean/->js :key->prop str) (js/JSON.stringify))
     (-> :foo (bean/->js :key->prop str) (js/JSON.stringify))
     (-> a (bean/->js :key->prop str) (js/JSON.stringify)))))
```
and got:
```
JSON.stringify:
"Elapsed time: 1274.645467 msecs"
clj->js, then JSON.stringify:
"Elapsed time: 1208.113983 msecs"
clj->js w/ str keyword-fn, then JSON.stringify:
"Elapsed time: 1443.455173 msecs"
bean/->js, then JSON.stringify:
"Elapsed time: 1506.302295 msecs"
```
Using clj->js before JSON.stringify is slightly faster, but unfortunately it can't distinguish between string and keyword keys. Having duplicate string and keyword keys would cause other problems anyway, so collapsing them might be safe. But I'd rather err on the side of correctness, especially since these timings aren't too far off from each other.
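To make the string/keyword collapse concrete: `clj->js` converts keyword keys via their name, so (assuming the usual conversion) the two distinct cljs maps `{:x 1}` and `{"x" 1}` both come out as the same JS object shape, and therefore the same stringified cache key. In plain JS terms:

```javascript
// Hypothetical results of converting {:x 1} (keyword key) and
// {"x" 1} (string key) with clj->js — both become the same shape:
const fromKeywordKey = { x: 1 };
const fromStringKey = { x: 1 };

// Both serialize to the identical cache key, so the two distinct
// cljs maps would collide in a cache keyed on the converted value.
const collided =
  JSON.stringify(fromKeywordKey) === JSON.stringify(fromStringKey);
```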
We can always revisit our caching strategy, if necessary.
Add caching, tests, and cache scalar pdf/prob/condition/constrain/mutual-info fns in clj. Also bump up inferenceql.inference version to avoid arrow constructor bug.
Notes
CLJS does not have core.cache or core.memoize, and in JS there were surprisingly few comprehensive options for caching, so I went with one that seemed popular, well-maintained, and flexible enough to handle custom keys.
The Clojure `hash` fn hashes both 0 and nil to the same value. Thus `hash` alone cannot distinguish between `{:x 0}` and `{:x nil}`. (Hash maps have fallbacks to handle collisions and disambiguate between 0 and nil, but these don't work with most JS cache implementations, afaict.) There are two popular solutions to this problem: