| author | Tim Northover <tnorthover@apple.com> | 2016-02-17 23:07:04 +0000 |
|---|---|---|
| committer | Tim Northover <tnorthover@apple.com> | 2016-02-17 23:07:04 +0000 |
| commit | 7687bcee4a5a83fc01141f47cc17a64fbdb96207 | |
| tree | bdeb29a14966664a8c70e94f1a50dfaecc070668 | llvm/lib/Target |
| parent | 053ac453b9f8e6045fd915abde13663636383429 | |
AArch64: always clear kill flags up to last eliminated copy
After r261154, we only cleared kill flags if the known-zero register was
originally live-in to the basic block. But we have to clear them even when it
is not live-in, once more than one COPY has been eliminated; otherwise the
user of the first COPY may still be marked <kill>.
E.g.
BB#N:
%X0 = COPY %XZR
STRXui %X0<kill>, <fi#0>
%X0 = COPY %XZR
STRXui %X0<kill>, <fi#1>
We can eliminate both copies, and X0 is not live-in (so the old check skipped
the fix-up), but we must still clear the <kill> on the first store.
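For illustration, this is roughly what the block above should look like after the pass runs (a sketch inferred from this description, not actual pass output): both COPYs are gone, X0 is added as a live-in of the block, and the <kill> on the first store is cleared because the second store still reads X0.
BB#N:                      (X0 added as a live-in)
STRXui %X0, <fi#0>         (<kill> cleared)
STRXui %X0<kill>, <fi#1>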
Unfortunately, I've been unable to come up with a non-fragile test for this.
I've only seen it in the wild with regalloc-created spills, and attempts to
reproduce that in a reasonable way run afoul of COPY coalescing; even volatile
asm clobbers were moved around. It should fix the aarch64 bot, though.
llvm-svn: 261175
Diffstat (limited to 'llvm/lib/Target')
| -rw-r--r-- | llvm/lib/Target/AArch64/AArch64RedundantCopyElimination.cpp | 14 |
1 file changed, 7 insertions, 7 deletions
diff --git a/llvm/lib/Target/AArch64/AArch64RedundantCopyElimination.cpp b/llvm/lib/Target/AArch64/AArch64RedundantCopyElimination.cpp
index b52c19a026d..8def8f32d70 100644
--- a/llvm/lib/Target/AArch64/AArch64RedundantCopyElimination.cpp
+++ b/llvm/lib/Target/AArch64/AArch64RedundantCopyElimination.cpp
@@ -149,15 +149,15 @@ bool AArch64RedundantCopyElimination::optimizeCopy(MachineBasicBlock *MBB) {
   // CBZ/CBNZ. Conservatively mark as much as we can live.
   CompBr->clearRegisterKills(SmallestDef, TRI);
 
-  // Clear any kills of TargetReg between CompBr and MI.
-  if (std::any_of(TargetRegs.begin(), TargetRegs.end(),
-                  [&](unsigned Reg) { return MBB->isLiveIn(Reg); })) {
-    for (MachineInstr &MMI :
-         make_range(MBB->begin()->getIterator(), LastChange->getIterator()))
-      MMI.clearRegisterKills(SmallestDef, TRI);
-  } else
+  if (std::none_of(TargetRegs.begin(), TargetRegs.end(),
+                   [&](unsigned Reg) { return MBB->isLiveIn(Reg); }))
     MBB->addLiveIn(TargetReg);
 
+  // Clear any kills of TargetReg between CompBr and the last removed COPY.
+  for (MachineInstr &MMI :
+       make_range(MBB->begin()->getIterator(), LastChange->getIterator()))
+    MMI.clearRegisterKills(SmallestDef, TRI);
+
   return true;
 }

