| | | |
|---|---|---|
| author | River Riddle <riverriddle@google.com> | 2020-01-11 08:54:04 -0800 |
| committer | River Riddle <riverriddle@google.com> | 2020-01-11 08:54:39 -0800 |
| commit | 2bdf33cc4c733342fc83081bc7410ac5e9a24f55 (patch) | |
| tree | 3306d769c2bbabda1060928e0cea79d021ea9da2 /mlir/docs/Tutorials | |
| parent | 1d641daf260308815d014d1bf1b424a1ed1e7277 (diff) | |
[mlir] NFC: Remove Value::operator* and Value::operator-> now that Value is properly value-typed.
Summary: These were temporary methods used to simplify the transition.
Reviewed By: antiagainst
Differential Revision: https://reviews.llvm.org/D72548
Diffstat (limited to 'mlir/docs/Tutorials')
| -rw-r--r-- | mlir/docs/Tutorials/Toy/Ch-3.md | 6 |
| -rw-r--r-- | mlir/docs/Tutorials/Toy/Ch-4.md | 4 |

2 files changed, 5 insertions(+), 5 deletions(-)
````diff
diff --git a/mlir/docs/Tutorials/Toy/Ch-3.md b/mlir/docs/Tutorials/Toy/Ch-3.md
index 6ff9d3cb299..a535d1c95c6 100644
--- a/mlir/docs/Tutorials/Toy/Ch-3.md
+++ b/mlir/docs/Tutorials/Toy/Ch-3.md
@@ -92,7 +92,7 @@ struct SimplifyRedundantTranspose : public mlir::OpRewritePattern<TransposeOp> {
     // Look through the input of the current transpose.
     mlir::Value transposeInput = op.getOperand();
     TransposeOp transposeInputOp =
-        llvm::dyn_cast_or_null<TransposeOp>(transposeInput->getDefiningOp());
+        llvm::dyn_cast_or_null<TransposeOp>(transposeInput.getDefiningOp());
     // If the input is defined by another Transpose, bingo!
     if (!transposeInputOp)
       return matchFailure();
@@ -194,7 +194,7 @@ An example is a transformation that eliminates reshapes when
 they are redundant, i.e. when the input and output shapes are identical.
 
 ```tablegen
-def TypesAreIdentical : Constraint<CPred<"$0->getType() == $1->getType()">>;
+def TypesAreIdentical : Constraint<CPred<"$0.getType() == $1.getType()">>;
 def RedundantReshapeOptPattern : Pat<
   (ReshapeOp:$res $arg), (replaceWithValue $arg),
   [(TypesAreIdentical $res, $arg)]>;
@@ -208,7 +208,7 @@ optimize Reshape of a constant value by reshaping the constant
 in place and eliminating the reshape operation.
 
 ```tablegen
-def ReshapeConstant : NativeCodeCall<"$0.reshape(($1->getType()).cast<ShapedType>())">;
+def ReshapeConstant : NativeCodeCall<"$0.reshape(($1.getType()).cast<ShapedType>())">;
 def FoldConstantReshapeOptPattern : Pat<
   (ReshapeOp:$res (ConstantOp $arg)),
   (ConstantOp (ReshapeConstant $arg, $res))>;
diff --git a/mlir/docs/Tutorials/Toy/Ch-4.md b/mlir/docs/Tutorials/Toy/Ch-4.md
index 2df009ddc2d..d449fe5c712 100644
--- a/mlir/docs/Tutorials/Toy/Ch-4.md
+++ b/mlir/docs/Tutorials/Toy/Ch-4.md
@@ -82,7 +82,7 @@ struct ToyInlinerInterface : public DialectInlinerInterface {
     // Replace the values directly with the return operands.
     assert(returnOp.getNumOperands() == valuesToRepl.size());
     for (const auto &it : llvm::enumerate(returnOp.getOperands()))
-      valuesToRepl[it.index()]->replaceAllUsesWith(it.value());
+      valuesToRepl[it.index()].replaceAllUsesWith(it.value());
   }
 };
 ```
@@ -310,7 +310,7 @@ inferred as the shape of the inputs.
 
 ```c++
 /// Infer the output shape of the MulOp, this is required by the shape inference
 /// interface.
-void MulOp::inferShapes() { getResult()->setType(getOperand(0)->getType()); }
+void MulOp::inferShapes() { getResult().setType(getOperand(0).getType()); }
 ```
 
 At this point, each of the necessary Toy operations provide a mechanism by which
````

