Answer by Evan Carroll for PostgreSQL - Working with array of thousands of elements

"Is there a better way of doing this?"

Yes, use a temp table. There is nothing wrong with creating an indexed temp table when your query is that insane.

BEGIN;
  CREATE TEMP TABLE myitems ( item_id int PRIMARY KEY );
  -- the PRIMARY KEY already creates a unique index on item_id,
  -- so no separate CREATE INDEX is needed
  INSERT INTO myitems(item_id) VALUES (1), (2); -- and on and on
COMMIT;

ANALYZE myitems;  -- give the planner row-count statistics for the temp table

SELECT item_id, other_stuff, ...
FROM (
  SELECT
      -- partitioned row number, since we only want N rows per item_id
      ROW_NUMBER() OVER (PARTITION BY item_id ORDER BY start_date) AS r,
      item_id, other_stuff, ...
  FROM mytable
  INNER JOIN myitems USING (item_id)
  WHERE end_date > $2
  ORDER BY item_id ASC, start_date ASC, allowed ASC
) x
WHERE x.r <= 12;
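If the 15,000 ids are already bound as an array parameter, as in the original query, you can skip the long VALUES list entirely. A minimal sketch, assuming the array arrives as $1, that loads the temp table with unnest:

  INSERT INTO myitems(item_id)
  SELECT DISTINCT x  -- DISTINCT guards the PRIMARY KEY against duplicate ids
  FROM unnest($1::int[]) AS t(x);

unnest is standard PostgreSQL; the rest of the transaction stays exactly as above.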

But even better than that...

"500k different item_id" ... "int array can contain up to 15,000 elements"

You're individually selecting 3% of your database (15,000 of 500,000 item_ids). I have to wonder whether you'd be better off creating groups/tags etc. in the schema itself, along the lines of the sketch below. I have never personally had to send 15,000 different IDs into a query.
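As a sketch of that idea (the item_tags table and its columns are hypothetical, not part of the original schema), tagging moves the membership list out of the query and into the database:

  -- each tag groups many item_ids; assign items to a tag once, up front
  CREATE TABLE item_tags (
      tag_id  int NOT NULL,
      item_id int NOT NULL,
      PRIMARY KEY (tag_id, item_id)
  );
  INSERT INTO item_tags(tag_id, item_id) VALUES (1, 42), (1, 43); -- and so on

  -- the query then binds one tag_id instead of 15,000 item_ids
  SELECT m.item_id, m.other_stuff
  FROM mytable m
  JOIN item_tags USING (item_id)
  WHERE item_tags.tag_id = $1
    AND m.end_date > $2;

The composite primary key supports the tag_id lookup, so filtering a tag's members is a plain index scan rather than a 15,000-element array comparison.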

