@recaptime-dev's working patches + fork for Phorge, a community fork of Phabricator. (Upstream dev and stable branches are at upstream/main and upstream/stable respectively.) hq.recaptime.dev/wiki/Phorge

Require rows passed to "loadAllFromArray()" have unique keys

Summary:
See PHI1809, which identified a bug in Project search where queries with a large number of slugs could paginate improperly.
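The failure mode behind PHI1809 can be sketched without the real schema: joining a project table against its slug table yields one result row per matching slug, so a project with several slugs appears several times unless the query groups by project ID, and the duplicate rows then throw off pagination. The sketch below is illustrative Python, not Phabricator's actual SQL or PHP; all names in it are hypothetical.

```python
# Hypothetical data standing in for a project table and its slug table.
projects = [{"id": 7, "name": "Ops"}]
slugs = [
    {"project_id": 7, "slug": "ops"},
    {"project_id": 7, "slug": "operations"},
]

# Naive join: one result row per matching slug, so the same project
# row is duplicated once per slug.
joined = [p for s in slugs for p in projects if p["id"] == s["project_id"]]
assert len(joined) == 2  # same project, twice

# GROUP BY-style deduplication keyed on project ID restores one row
# per project, which is what pagination assumes.
grouped = {p["id"]: p for p in joined}
assert len(grouped) == 1
```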

This change detects problems in this category: cases where multiple rows with the same ID are passed to "loadAllFromArray()". It's likely that all cases it detects are cases where a GROUP BY is missing.
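The shape of the new check can be shown in a short sketch: when materializing rows into a map keyed by ID, two rows sharing an ID indicate a broken query (usually a missing GROUP BY), so the load fails loudly instead of silently overwriting the earlier row. This is illustrative Python, not the actual PHP in LiskDAO; the function name is hypothetical.

```python
def load_all_from_rows(rows, id_key="id"):
    """Build an ID-keyed map of rows, refusing duplicate IDs."""
    result = {}
    for row in rows:
        row_id = row[id_key]
        if row_id in result:
            # Mirrors the new exception: duplicate IDs mean the
            # underlying query is probably missing a GROUP BY.
            raise ValueError(
                'Rows include two or more rows with the same ID ("%s"). '
                'Rows must have unique IDs. An underlying query may be '
                'missing a GROUP BY.' % row_id)
        result[row_id] = row
    return result
```

Without the check, the second duplicate would simply replace the first in the map, hiding the bad query and producing fewer results than the pager expects.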

Since this might have some false positives or detect some things which aren't fundamentally problematic, I'm planning to hold it until the next release.

Test Plan:
- Reverted D21399, then created a project with multiple slugs and queried for one of them via "project.search". Hit this new exception.
- Browsed around a bit, didn't immediately catch any collateral damage.

Differential Revision: https://secure.phabricator.com/D21400

+12 -1
src/infrastructure/storage/lisk/LiskDAO.php
@@ -653,7 +653,18 @@
   foreach ($rows as $row) {
     $obj = clone $this;
     if ($id_key && isset($row[$id_key])) {
-      $result[$row[$id_key]] = $obj->loadFromArray($row);
+      $row_id = $row[$id_key];
+
+      if (isset($result[$row_id])) {
+        throw new Exception(
+          pht(
+            'Rows passed to "loadAllFromArray(...)" include two or more '.
+            'rows with the same ID ("%s"). Rows must have unique IDs. '.
+            'An underlying query may be missing a GROUP BY.',
+            $row_id));
+      }
+
+      $result[$row_id] = $obj->loadFromArray($row);
     } else {
       $result[] = $obj->loadFromArray($row);
     }