core.cache

FIFOCache StackOverflowError with high :threshold due to concat

Details

  • Type: Defect
  • Status: Open
  • Priority: Critical
  • Resolution: Unresolved
  • Affects Version/s: None
  • Fix Version/s: None
  • Component/s: None
  • Labels: None
  • Environment: CentOS

Description

FIFOCache internally keeps a sequence of keys, limited in size to :threshold.

Each element entered into the FIFOCache causes the existing sequence of keys to be concat'd with [new-key].

It is trivial to cause a StackOverflowError under these circumstances, as described by Stuart Sierra here: http://stuartsierra.com/2015/04/26/clojure-donts-concat

For example:

(require '[clojure.core.cache :as cache])

(def a-cache (atom (cache/fifo-cache-factory {} :threshold 5000)))
=> #'user/a-cache

(doseq [a-key (range 5000)]
  (swap! a-cache cache/miss (keyword (str a-key)) a-key))
StackOverflowError   clojure.lang.LazySeq.sval (LazySeq.java:42)
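
The same failure mode can be reproduced without core.cache at all; the following is a minimal sketch of the repeated-concat pattern (the exact count needed to overflow depends on the JVM's thread stack size):

;; Each lazy concat wraps the previous sequence in another thunk, so
;; realizing the result has to descend through every level of nesting.
(def ks
  (reduce (fn [acc k] (concat acc [k]))
          []
          (range 5000)))

(dorun ks)
;; => StackOverflowError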

I've recently experienced this while working with a FIFOCache as part of a Storm topology.

I'm happy to work on a patch if that would help.
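
One possible direction (a sketch only, not a statement about core.cache's actual internals or the eventual patch) would be to keep the FIFO key order in a clojure.lang.PersistentQueue, where adding a key is a plain conj and evicting the oldest is peek/pop, so no nested lazy seqs are ever built. The helper name fifo-miss below is hypothetical:

(import 'clojure.lang.PersistentQueue)

(defn fifo-miss
  "Hypothetical helper: conj k onto the key queue q, evicting the oldest
  key once the queue grows past threshold. Returns [new-queue evicted-key]."
  [q k threshold]
  (let [q' (conj q k)]
    (if (> (count q') threshold)
      [(pop q') (peek q')]
      [q' nil])))

;; 100000 insertions with a threshold of 5000 stay well within the stack:
(count
  (reduce (fn [q k] (first (fifo-miss q k 5000)))
          PersistentQueue/EMPTY
          (range 100000)))
;; => 5000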

Activity

There are no comments yet on this issue.
