clojure.contrib.sql runs out of memory on very large datasets


  • Type: Defect
  • Status: In Progress
  • Resolution: Unresolved
  • Affects Version/s: None
  • Fix Version/s: None
  • Component/s: None
  • Labels:


If you have a query that produces a very large result set, you will run out of memory because, by default, with-query-results realizes the entire result set in memory (even though it creates a lazy seq via resultset-seq — the JDBC driver still fetches all rows up front).
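For reference, the underlying workaround with plain JDBC interop looks roughly like the sketch below. The function and row-processing names are placeholders for illustration, not part of the report:

```clojure
;; Sketch: stream a large result set using raw JDBC interop.
;; stream-rows and process-row are hypothetical names.
(defn stream-rows [^java.sql.Connection conn]
  (.setAutoCommit conn false)                 ; required by PostgreSQL for cursor-based fetching
  (let [stmt (.prepareStatement conn "SELECT id, data FROM nodes")]
    (.setFetchSize stmt 1000)                 ; fetch rows in batches instead of all at once
    (with-open [rs (.executeQuery stmt)]
      (doseq [row (resultset-seq rs)]
        (process-row row)))))                 ; process each row as it streams in
```

With a non-zero fetch size (and autocommit off on PostgreSQL), the driver holds only one batch of rows in memory at a time, so the lazy seq from resultset-seq is actually lazy in practice.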

This issue has been discussed previously here:

The fix is simple: just call (.setFetchSize stmt 1). But with-query-results currently gives you no way to do that. I propose adding an optional parameter after sql-params: a map of attributes to set on the PreparedStatement before executing it.

So you can do this:

(with-connection {...}
  (.setAutoCommit (sql/connection) false) ;; needed for postgres
  (with-query-results results ["SELECT id, data FROM nodes"]
    {:fetch-size 1000}
    (doseq [r results]
      ;; process each row here without realizing the whole seq
      (process-row r))))
The new code in clojure.contrib.sql is very simple, but it depends on a new contrib namespace called clojure.contrib.bean. Here is more info on my proposed fix:
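One way the attribute map could be applied is by translating each kebab-case keyword into its JavaBean setter name and invoking it reflectively. The sketch below uses hypothetical helper names (attr->setter-name, set-statement-attributes!) and is not the actual clojure.contrib.bean API:

```clojure
(ns example.statement-attrs
  (:require [clojure.string :as str]))

;; Convert a kebab-case keyword like :fetch-size to the
;; corresponding JavaBean setter name, e.g. "setFetchSize".
(defn attr->setter-name [k]
  (->> (str/split (name k) #"-")
       (map str/capitalize)
       (apply str "set")))

;; Hypothetical helper: apply a map of attributes to a statement
;; object reflectively, one setter call per map entry.
(defn set-statement-attributes! [stmt attrs]
  (doseq [[k v] attrs]
    (clojure.lang.Reflector/invokeInstanceMethod
      stmt (attr->setter-name k) (object-array [v]))))
```

So {:fetch-size 1000} would become a (.setFetchSize stmt 1000) call, and the same mechanism generalizes to any other PreparedStatement property.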





