| author | Daniel Sanders <daniel_l_sanders@apple.com> | 2019-03-22 20:16:35 +0000 |
|---|---|---|
| committer | Daniel Sanders <daniel_l_sanders@apple.com> | 2019-03-22 20:16:35 +0000 |
| commit | ef8761fd3b0f0b24b7e9347ad4ae686fc9ed8f94 (patch) | |
| tree | 0419f137c9865eafa74c4210d671562c1f87af1e /llvm/lib/Transforms | |
| parent | 462446fd9a5d85458f1c61de353ce679ff5dbf9b (diff) | |
Fix non-determinism in Reassociate caused by address coincidences
Summary:
Between building the pair map and querying it, a few places erase and create
Values. Rarely, a newly created Value receives the same address as a
just-erased Value that is still a key in the pair map. These coincidences
must be detected to avoid non-determinism.
Thanks to Roman Tereshin for the test case.
Reviewers: rtereshin, bogner
Reviewed By: rtereshin
Subscribers: mgrang, llvm-commits
Tags: #llvm
Differential Revision: https://reviews.llvm.org/D59401
llvm-svn: 356803
Diffstat (limited to 'llvm/lib/Transforms')
| -rw-r--r-- | llvm/lib/Transforms/Scalar/Reassociate.cpp | 23 |
1 file changed, 18 insertions(+), 5 deletions(-)
diff --git a/llvm/lib/Transforms/Scalar/Reassociate.cpp b/llvm/lib/Transforms/Scalar/Reassociate.cpp
index 6da95712d64..34066aea27d 100644
--- a/llvm/lib/Transforms/Scalar/Reassociate.cpp
+++ b/llvm/lib/Transforms/Scalar/Reassociate.cpp
@@ -2217,8 +2217,15 @@ void ReassociatePass::ReassociateExpression(BinaryOperator *I) {
         if (std::less<Value *>()(Op1, Op0))
           std::swap(Op0, Op1);
         auto it = PairMap[Idx].find({Op0, Op1});
-        if (it != PairMap[Idx].end())
-          Score += it->second;
+        if (it != PairMap[Idx].end()) {
+          // Functions like BreakUpSubtract() can erase the Values we're using
+          // as keys and create new Values after we built the PairMap. There's a
+          // small chance that the new nodes can have the same address as
+          // something already in the table. We shouldn't accumulate the stored
+          // score in that case as it refers to the wrong Value.
+          if (it->second.isValid())
+            Score += it->second.Score;
+        }

         unsigned MaxRank = std::max(Ops[i].Rank, Ops[j].Rank);
         if (Score > Max || (Score == Max && MaxRank < BestRank)) {
@@ -2287,9 +2294,15 @@ ReassociatePass::BuildPairMap(ReversePostOrderTraversal<Function *> &RPOT) {
           std::swap(Op0, Op1);
         if (!Visited.insert({Op0, Op1}).second)
           continue;
-        auto res = PairMap[BinaryIdx].insert({{Op0, Op1}, 1});
-        if (!res.second)
-          ++res.first->second;
+        auto res = PairMap[BinaryIdx].insert({{Op0, Op1}, {Op0, Op1, 1}});
+        if (!res.second) {
+          // If either key value has been erased then we've got the same
+          // address by coincidence. That can't happen here because nothing is
+          // erasing values but it can happen by the time we're querying the
+          // map.
+          assert(res.first->second.isValid() && "WeakVH invalidated");
+          ++res.first->second.Score;
+        }
       }
     }
   }

