Affects Version/s: None
Fix Version/s: None
This patch uses `io/reader` in conjunction with `with-open` to prevent
"too many open files" errors. These occur in a browser REPL when
evaluating a file that references a couple of other files multiple times.
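A minimal sketch of the pattern the patch applies (the helper name here is hypothetical; the actual patch touches the ClojureScript REPL internals): `with-open` guarantees the reader, and its underlying file descriptor, is closed when the body exits, whether normally or via an exception.

```clojure
(require '[clojure.java.io :as io])

;; Hypothetical helper illustrating the fix: `with-open` closes the
;; reader (releasing its file descriptor) when the body finishes,
;; so repeated evaluation cannot leak descriptors.
(defn read-file-safely [path]
  (with-open [rdr (io/reader path)]
    ;; Fully realize the contents before the reader is closed.
    (slurp rdr)))
```

Without `with-open`, each call to `io/reader` that is never explicitly `.close`d leaves a descriptor open until the reader is garbage-collected, which is exactly the slow leak described below.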
From the Google Group discussion at: https://groups.google.com/forum/#!topic/clojurescript/r2iGPh2Lv0U
Hello Clojure Scripters,
with my current REPL setup I get some really annoying "Too many
open files" errors. I'm not sure if it's a problem with
ClojureScript itself, Austin the browser REPL, nREPL, or my own
misconfiguration of something.
When I connect to the browser REPL via Austin and eval a whole
ClojureScript file for the first time, a lot of Ajax requests are
sent over the wire and my main namespace is compiled and
shipped to the browser. So far so good; my Java process is at
around 18676 open files. I don't care yet.
Compiling the same file again and again increases the open file count:
18676, 19266, 22750, 21352, 33097, 62913, 64398, 64398, 64398,
64398, 64398, up to 171977, where some ulimit is reached and I get
an exception like this:
java.io.FileNotFoundException: .repl/5614/request/routes.js (Too many open files)
and my ClojureScript REPL hangs and I have to do a
cider-restart. OK, maybe I shouldn't eval whole files this often
over the XHR connection, but this doesn't seem right.
I used the command `lsof -n | grep java | wc -l` to watch the
above numbers while evaluating the file again and again.
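The monitoring step above can be wrapped in a small shell helper (a sketch; it assumes `lsof` is installed and counts descriptors for a single PID rather than grepping all Java processes):

```shell
#!/bin/sh
# Count open file descriptors for a given PID (assumes lsof is available).
fd_count() {
  lsof -n -p "$1" 2>/dev/null | wc -l
}

# Example: check the current shell's descriptor count. To watch the
# leaking Java process instead, pass its PID, e.g. from `pgrep java`.
fd_count $$
```

Running this in a loop while re-evaluating the file makes the growth pattern in the numbers above easy to reproduce.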
Has anyone had a similar problem, know how to solve it, or
have any ideas how to track this one down?
Thanks for your help, Roman.
|Assignee||David Nolen [ dnolen ]|
|Priority||Major [ 3 ]||Blocker [ 1 ]|
|Resolution||Completed [ 1 ]|
|Status||Open [ 1 ]||Resolved [ 5 ]|
|Status||Resolved [ 5 ]||Closed [ 6 ]|