mirror of
https://github.com/leanprover/lean4.git
synced 2026-04-13 23:54:10 +00:00
Compare commits: `fix_map_su…` → `reducePow_…` (93 commits)
`.github/workflows/nix-ci.yml` (vendored, 4 changes)
```diff
@@ -103,7 +103,7 @@ jobs:
       continue-on-error: true
     - name: Build manual
       run: |
-        nix build $NIX_BUILD_ARGS --update-input lean --no-write-lock-file ./doc#{lean-mdbook,leanInk,alectryon,test,inked} -o push-doc
+        nix build $NIX_BUILD_ARGS --update-input lean --no-write-lock-file ./doc#{lean-mdbook,leanInk,alectryon,inked} -o push-doc
         nix build $NIX_BUILD_ARGS --update-input lean --no-write-lock-file ./doc
         # https://github.com/netlify/cli/issues/1809
         cp -r --dereference ./result ./dist
@@ -146,5 +146,3 @@ jobs:
     - name: Fixup CCache Cache
       run: |
         sudo chown -R $USER /nix/var/cache
-    - name: CCache stats
-      run: CCACHE_DIR=/nix/var/cache/ccache nix run .#nixpkgs.ccache -- -s
```
`RELEASES.md` (296 changes)
@@ -14,10 +14,289 @@ Development in progress.

v4.10.0
----------

Release candidate, release notes will be copied from branch `releases/v4.10.0` once completed.

### Language features, tactics, and metaprograms

* `split` tactic:
  * [#4401](https://github.com/leanprover/lean4/pull/4401) improves the strategy `split` uses to generalize discriminants of matches and adds the `trace.split.failure` trace class for diagnosing issues.

* `rw` tactic:
  * [#4385](https://github.com/leanprover/lean4/pull/4385) prevents the tactic from claiming pre-existing goals are new subgoals.
  * [dac1da](https://github.com/leanprover/lean4/commit/dac1dacc5b39911827af68247d575569d9c399b5) adds configuration for ordering new goals, as for `apply`.

* `simp` tactic:
  * [#4430](https://github.com/leanprover/lean4/pull/4430) adds `dsimproc`s for `if` expressions (`ite` and `dite`).
  * [#4434](https://github.com/leanprover/lean4/pull/4434) improves heuristics for unfolding. Equational lemmas now have priorities, so more-specific equational lemmas are tried before a possible catch-all.
  * [#4481](https://github.com/leanprover/lean4/pull/4481) fixes an issue where function-valued `OfNat` numeric literals would become denormalized.
  * [#4467](https://github.com/leanprover/lean4/pull/4467) fixes an issue where `dsimp` theorems might not apply to literals.
  * [#4484](https://github.com/leanprover/lean4/pull/4484) fixes the source position of the warning for deprecated simp arguments.
  * [#4258](https://github.com/leanprover/lean4/pull/4258) adds docstrings for the `dsimp` configuration.
  * [#4567](https://github.com/leanprover/lean4/pull/4567) improves the accuracy of the used simp lemmas reported by `simp?`.
  * [fb9727](https://github.com/leanprover/lean4/commit/fb97275dcbb683efe6da87ed10a3f0cd064b88fd) adds (but does not yet implement) the simp configuration option `implicitDefEqProofs`, which will enable including `rfl`-theorems in proof terms.

* `omega` tactic:
  * [#4360](https://github.com/leanprover/lean4/pull/4360) makes the tactic generate error messages lazily, improving its performance when used in tactic combinators.

* `bv_omega` tactic:
  * [#4579](https://github.com/leanprover/lean4/pull/4579) works around changes to the definition of `Fin.sub` in this release.

* [#4490](https://github.com/leanprover/lean4/pull/4490) sets up groundwork for a tactic index in the generated documentation, as there was in Lean 3. See the PR description for details.

* **Commands**
  * [#4370](https://github.com/leanprover/lean4/pull/4370) makes the `variable` command fully elaborate binders during validation, fixing an issue where some errors would be reported only at the next declaration.
  * [#4408](https://github.com/leanprover/lean4/pull/4408) fixes a discrepancy in universe parameter order between `theorem` and `def` declarations.
  * [#4493](https://github.com/leanprover/lean4/pull/4493) and [#4482](https://github.com/leanprover/lean4/pull/4482) fix a discrepancy in the elaborators for `theorem`, `def`, and `example`, making `Prop`-valued `example`s and other definition commands elaborate like `theorem`s.
  * [8f023b](https://github.com/leanprover/lean4/commit/8f023b85c554186ae562774b8122322d856c674e), [3c4d6b](https://github.com/leanprover/lean4/commit/3c4d6ba8648eb04d90371eb3fdbd114d16949501), and [0783d0](https://github.com/leanprover/lean4/commit/0783d0fcbe31b626fbd3ed2f29d838e717f09101) change the `#reduce` command to be able to control what gets reduced. For example, `#reduce (proofs := true) (types := false) e` reduces proofs but not types in the expression `e`. By default, neither proofs nor types are reduced.
  * [#4489](https://github.com/leanprover/lean4/pull/4489) fixes an elaboration bug in `#check_tactic`.
  * [#4505](https://github.com/leanprover/lean4/pull/4505) adds support for `open _root_.<namespace>`.
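As a small sketch of the `open _root_.<namespace>` feature (the namespaces below are made up for illustration): inside a namespace, a plain `open` resolves names relative to the current namespace first, while the `_root_` prefix forces resolution from the top level.

```lean
namespace Util
def helper : Nat := 1
end Util

namespace Lib.Util
def helper : Nat := 2
end Lib.Util

namespace Lib
open _root_.Util  -- opens the top-level `Util`, not `Lib.Util`
#check helper     -- refers to `_root_.Util.helper`
end Lib
```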

* **Options**
  * [#4576](https://github.com/leanprover/lean4/pull/4576) adds the `debug.byAsSorry` option. Setting `set_option debug.byAsSorry true` causes all `by ...` terms to elaborate as `sorry`.
  * [7b56eb](https://github.com/leanprover/lean4/commit/7b56eb20a03250472f4b145118ae885274d1f8f7) and [d8e719](https://github.com/leanprover/lean4/commit/d8e719f9ab7d049e423473dfc7a32867d32c856f) add the `debug.skipKernelTC` option. Setting `set_option debug.skipKernelTC true` turns off kernel typechecking. This is meant for temporarily working around kernel performance issues; it compromises soundness, since invalid proofs produced by buggy tactics will not be caught while the option is set.
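As an illustrative sketch of these debugging options in use (the theorem is made up):

```lean
-- Elaborate every `by` block as `sorry`, e.g. to quickly check that
-- statements themselves elaborate without running the proofs:
set_option debug.byAsSorry true in
theorem demo : 2 + 2 = 4 := by rfl

-- Temporarily skip kernel typechecking (unsound; performance debugging only):
set_option debug.skipKernelTC true
```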

* [#4301](https://github.com/leanprover/lean4/pull/4301) adds a linter to flag situations where a local variable's name is one of the argumentless constructors of its type. This can arise when a user either doesn't open a namespace or doesn't add a dot or leading qualifier, as in the following:

  ```lean
  inductive Tree (α : Type) where
    | leaf
    | branch (left : Tree α) (val : α) (right : Tree α)

  def depth : Tree α → Nat
    | leaf => 0
  ```

  With this linter, the `leaf` pattern is highlighted as a local variable whose name overlaps with the constructor `Tree.leaf`.

  The linter can be disabled with `set_option linter.constructorNameAsVariable false`.

  Additionally, the error message that occurs when a name in a pattern that takes arguments isn't valid now suggests similar names that would be valid. This means that the following definition:

  ```lean
  def length (list : List α) : Nat :=
    match list with
    | nil => 0
    | cons x xs => length xs + 1
  ```

  now results in the following warning:

  ```
  warning: Local variable 'nil' resembles constructor 'List.nil' - write '.nil' (with a dot) or 'List.nil' to use the constructor.
  note: this linter can be disabled with `set_option linter.constructorNameAsVariable false`
  ```

  and error:

  ```
  invalid pattern, constructor or constant marked with '[match_pattern]' expected

  Suggestion: 'List.cons' is similar
  ```

* **Metaprogramming**
  * [#4454](https://github.com/leanprover/lean4/pull/4454) adds the public `Name.isInternalDetail` function for filtering declarations using naming conventions for internal names.

* **Other fixes or improvements**
  * [#4416](https://github.com/leanprover/lean4/pull/4416) sorts the output of `#print axioms` for determinism.
  * [#4528](https://github.com/leanprover/lean4/pull/4528) fixes the error message range for the cdot focusing tactic.

### Language server, widgets, and IDE extensions

* [#4443](https://github.com/leanprover/lean4/pull/4443) makes the watchdog more resilient against badly behaving clients.

### Pretty printing

* [#4433](https://github.com/leanprover/lean4/pull/4433) restores fallback pretty printers when context is not available, and documents `addMessageContext`.
* [#4556](https://github.com/leanprover/lean4/pull/4556) introduces the `pp.maxSteps` option and sets the default value of `pp.deepTerms` to `false`. Together, these keep excessively large or deep terms from overwhelming the Infoview.
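As a sketch, these pretty-printer options can be tuned per file when the new defaults hide too much (the values below are illustrative, not the defaults):

```lean
set_option pp.deepTerms true   -- print deep terms in full again
set_option pp.maxSteps 10000   -- illustrative step budget for the pretty printer
```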

### Library

* [#4560](https://github.com/leanprover/lean4/pull/4560) splits the `GetElem` class into `GetElem` and `GetElem?`. This enables removing `Decidable` instance arguments from `GetElem.getElem?` and `GetElem.getElem!`, improving their rewritability. See the docstrings for these classes for more information.
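As a sketch, the three indexing operators these classes provide, shown here for `List`:

```lean
-- `xs[i]` requires a proof that the index is in bounds,
-- `xs[i]?` returns an `Option`, and `xs[i]!` panics when out of bounds.
example (xs : List Nat) (h : 2 < xs.length) : Nat := xs[2]
example (xs : List Nat) : Option Nat := xs[2]?
example (xs : List Nat) : Nat := xs[2]!
```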
* `Array`
  * [#4389](https://github.com/leanprover/lean4/pull/4389) makes `Array.toArrayAux_eq` a `simp` lemma.
  * [#4399](https://github.com/leanprover/lean4/pull/4399) improves the robustness of the proof of `Array.reverse_data`.
* `List`
  * [#4469](https://github.com/leanprover/lean4/pull/4469) and [#4475](https://github.com/leanprover/lean4/pull/4475) improve the organization of the `List` API.
  * [#4470](https://github.com/leanprover/lean4/pull/4470) improves the `List.set` and `List.concat` API.
  * [#4472](https://github.com/leanprover/lean4/pull/4472) upstreams lemmas about `List.filter` from Batteries.
  * [#4473](https://github.com/leanprover/lean4/pull/4473) adjusts `@[simp]` attributes.
  * [#4488](https://github.com/leanprover/lean4/pull/4488) makes `List.getElem?_eq_getElem` a simp lemma.
  * [#4487](https://github.com/leanprover/lean4/pull/4487) adds missing `List.replicate` API.
  * [#4521](https://github.com/leanprover/lean4/pull/4521) adds lemmas about `List.map`.
  * [#4500](https://github.com/leanprover/lean4/pull/4500) changes `List.length_cons` to use `as.length + 1` instead of `as.length.succ`.
  * [#4524](https://github.com/leanprover/lean4/pull/4524) fixes the statement of `List.filter_congr`.
  * [#4525](https://github.com/leanprover/lean4/pull/4525) changes binder explicitness in `List.bind_map`.
  * [#4550](https://github.com/leanprover/lean4/pull/4550) adds `maximum?_eq_some_iff'` and `minimum?_eq_some_iff?`.
* [#4400](https://github.com/leanprover/lean4/pull/4400) switches the normal forms for indexing `List` and `Array` to `xs[n]` and `xs[n]?`.
* `HashMap`
  * [#4372](https://github.com/leanprover/lean4/pull/4372) fixes linearity in `HashMap.insert` and `HashMap.erase`, leading to a 40% speedup in a replace-heavy workload.
* `Option`
  * [#4403](https://github.com/leanprover/lean4/pull/4403) generalizes the type of `Option.forM` from `Unit` to `PUnit`.
  * [#4504](https://github.com/leanprover/lean4/pull/4504) removes the simp attribute from `Option.elim` and instead adds it to the individual reduction lemmas, making unfolding less aggressive.
* `Nat`
  * [#4242](https://github.com/leanprover/lean4/pull/4242) adds missing theorems for the `n + 1` and `n - 1` normal forms.
  * [#4486](https://github.com/leanprover/lean4/pull/4486) makes `Nat.min_assoc` a simp lemma.
  * [#4522](https://github.com/leanprover/lean4/pull/4522) moves `@[simp]` from `Nat.pred_le` to `Nat.sub_one_le`.
  * [#4532](https://github.com/leanprover/lean4/pull/4532) changes various `Nat.succ n` to `n + 1`.
* `Int`
  * [#3850](https://github.com/leanprover/lean4/pull/3850) adds complete div/mod simprocs for `Int`.
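As an illustrative sketch (assuming the simprocs fire on literal operands), such simprocs let `simp` close div/mod goals on integer literals directly:

```lean
example : (10 : Int) / 3 = 3 := by simp
example : (10 : Int) % 3 = 1 := by simp
```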
* `String`/`Char`
  * [#4357](https://github.com/leanprover/lean4/pull/4357) makes the byte size interface `Nat`-valued, with functions `Char.utf8Size` and `String.utf8ByteSize`.
  * [#4438](https://github.com/leanprover/lean4/pull/4438) upstreams `Char.ext` from Batteries and adds some `Char` documentation to the manual.
* `Fin`
  * [#4421](https://github.com/leanprover/lean4/pull/4421) adjusts `Fin.sub` to be more performant in definitional equality checks.
* `Prod`
  * [#4526](https://github.com/leanprover/lean4/pull/4526) adds missing `Prod.map` lemmas.
  * [#4533](https://github.com/leanprover/lean4/pull/4533) fixes binder explicitness in lemmas.
* `BitVec`
  * [#4428](https://github.com/leanprover/lean4/pull/4428) adds a missing `simproc` for `BitVec` equality.
  * [#4417](https://github.com/leanprover/lean4/pull/4417) adds `BitVec.twoPow` and lemmas, toward bitblasting multiplication for LeanSAT.
* `Std` library
  * [#4499](https://github.com/leanprover/lean4/pull/4499) introduces `Std`, a library situated between `Init` and `Lean`, providing functionality not in the prelude both to Lean's implementation and to external users.
* **Other fixes or improvements**
  * [#3056](https://github.com/leanprover/lean4/pull/3056) standardizes on using `(· == a)` over `(a == ·)`.
  * [#4502](https://github.com/leanprover/lean4/pull/4502) fixes errors reported by running the library through the Batteries linters.

### Lean internals

* [#4391](https://github.com/leanprover/lean4/pull/4391) makes `getBitVecValue?` recognize `BitVec.ofNatLt`.
* [#4410](https://github.com/leanprover/lean4/pull/4410) adjusts the `instantiateMVars` algorithm to zeta-reduce `let` expressions while beta reducing instantiated metavariables.
* [#4420](https://github.com/leanprover/lean4/pull/4420) fixes the occurs check for metavariable assignments to also take metavariable types into account.
* [#4425](https://github.com/leanprover/lean4/pull/4425) fixes `forEachModuleInDir` to iterate over each Lean file exactly once.
* [#3886](https://github.com/leanprover/lean4/pull/3886) adds support for building Lean core oleans using Lake.
* **Defeq and WHNF algorithms**
  * [#4387](https://github.com/leanprover/lean4/pull/4387) improves the performance of `isDefEq` by eta reducing lambda-abstracted terms during metavariable assignments, since these are beta reduced during metavariable instantiation anyway.
  * [#4388](https://github.com/leanprover/lean4/pull/4388) removes redundant code in `isDefEqQuickOther`.
* **Typeclass inference**
  * [#4530](https://github.com/leanprover/lean4/pull/4530) fixes the handling of metavariables when caching results in `synthInstance?`.
* **Elaboration**
  * [#4426](https://github.com/leanprover/lean4/pull/4426) makes the "don't know how to synthesize implicit argument" error report the name of the argument more reliably.
  * [#4497](https://github.com/leanprover/lean4/pull/4497) fixes a name resolution bug for generalized field notation (dot notation).
  * [#4536](https://github.com/leanprover/lean4/pull/4536) blocks the implicit lambda feature for `(e :)` notation.
  * [#4562](https://github.com/leanprover/lean4/pull/4562) makes it an error for two functions in a `where`/`let rec` block to have the same name.
* Recursion principles
  * [#4549](https://github.com/leanprover/lean4/pull/4549) refactors `findRecArg`, extracting `withRecArgInfo`. Errors are now reported in parameter order rather than in the order the arguments are tried (non-indices are tried first). For every argument, it now says why the argument wasn't tried, even when the reason is obvious (e.g., it is part of the fixed prefix or is `Prop`-typed).
* Porting core C++ to Lean
  * [#4474](https://github.com/leanprover/lean4/pull/4474) takes a step in refactoring `constructions` toward a future port to Lean.
  * [#4498](https://github.com/leanprover/lean4/pull/4498) ports `mk_definition_inferring_unsafe` to Lean.
  * [#4516](https://github.com/leanprover/lean4/pull/4516) ports the `recOn` construction to Lean.
  * [#4517](https://github.com/leanprover/lean4/pull/4517), [#4653](https://github.com/leanprover/lean4/pull/4653), and [#4651](https://github.com/leanprover/lean4/pull/4651) port the `below` and `brecOn` constructions to Lean.
* Documentation
  * [#4501](https://github.com/leanprover/lean4/pull/4501) adds a more detailed docstring for `PersistentEnvExtension`.
* **Other fixes or improvements**
  * [#4382](https://github.com/leanprover/lean4/pull/4382) removes the `@[inline]` attribute from `NameMap.find?`, which caused respecialization at each call site.
  * [5f9ded](https://github.com/leanprover/lean4/commit/5f9dedfe5ee9972acdebd669f228f487844a6156) improves the output of `trace.Elab.snapshotTree`.
  * [#4424](https://github.com/leanprover/lean4/pull/4424) removes the "you might need to open '{dir}' in your editor" message, which is now handled by Lake and the VS Code extension.
  * [#4451](https://github.com/leanprover/lean4/pull/4451) improves the performance of `CollectMVars` and `FindMVar`.
  * [#4479](https://github.com/leanprover/lean4/pull/4479) adds missing `DecidableEq` and `Repr` instances for intermediate structures used by the `BitVec` and `Fin` simprocs.
  * [#4492](https://github.com/leanprover/lean4/pull/4492) adds tests for a previous `isDefEq` issue.
  * [9096d6](https://github.com/leanprover/lean4/commit/9096d6fc7180fe533c504f662bcb61550e4a2492) removes `PersistentHashMap.size`.
  * [#4508](https://github.com/leanprover/lean4/pull/4508) fixes `@[implemented_by]` for functions defined by well-founded recursion.
  * [#4509](https://github.com/leanprover/lean4/pull/4509) adds additional tests for the `apply?` tactic.
  * [d6eab3](https://github.com/leanprover/lean4/commit/d6eab393f4df9d473b5736d636b178eb26d197e6) fixes a benchmark.
  * [#4563](https://github.com/leanprover/lean4/pull/4563) adds a workaround for a bug in `IndPredBelow.mkBelowMatcher`.
  * **Cleanup:** [#4380](https://github.com/leanprover/lean4/pull/4380), [#4431](https://github.com/leanprover/lean4/pull/4431), [#4494](https://github.com/leanprover/lean4/pull/4494), [e8f768](https://github.com/leanprover/lean4/commit/e8f768f9fd8cefc758533bc76e3a12b398ed4a39), [de2690](https://github.com/leanprover/lean4/commit/de269060d17a581ed87f40378dbec74032633b27), [d3a756](https://github.com/leanprover/lean4/commit/d3a7569c97123d022828106468d54e9224ed8207), [#4404](https://github.com/leanprover/lean4/pull/4404), [#4537](https://github.com/leanprover/lean4/pull/4537).

### Compiler, runtime, and FFI

* [d85d3d](https://github.com/leanprover/lean4/commit/d85d3d5f3a09ff95b2ee47c6f89ef50b7e339126) fixes the criterion for tail calls in the ownership calculation.
* [#3963](https://github.com/leanprover/lean4/pull/3963) adds validation of UTF-8 at the C++-to-Lean boundary in the runtime.
* [#4512](https://github.com/leanprover/lean4/pull/4512) fixes a missing unboxing in the interpreter when loading an initialized value.
* [#4477](https://github.com/leanprover/lean4/pull/4477) exposes the compiler flags for the bundled C compiler (clang).

### Lake

* [#4384](https://github.com/leanprover/lean4/pull/4384) deprecates `inputFile` and replaces it with `inputBinFile` and `inputTextFile`. Unlike `inputBinFile` (and `inputFile`), `inputTextFile` normalizes line endings, which helps ensure text file traces are platform-independent.
* [#4371](https://github.com/leanprover/lean4/pull/4371) simplifies the dependency resolution code.
* [#4439](https://github.com/leanprover/lean4/pull/4439) touches up the Lake configuration DSL and makes other improvements: string literals can now be used instead of identifiers for names, French quotes are avoided in the `lake new` and `lake init` templates, the `exe` template now uses `Main` for the main module, the `math` template error when `lean-toolchain` fails to download is improved, and unknown configuration fields are downgraded from an error to a warning to improve cross-version compatibility.
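A minimal sketch of a Lake configuration using string literals for names (the package, library, and executable names below are made up):

```lean
import Lake
open Lake DSL

-- String literals can now be used where identifiers (or French-quoted
-- names) were previously required:
package "my-package"

lean_lib "MyLib"

@[default_target]
lean_exe "my-exe" where
  root := `Main
```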
* [#4496](https://github.com/leanprover/lean4/pull/4496) tweaks the `require` syntax and updates the docs. A `require` in TOML for a package name such as `doc-gen4` no longer needs French quotes.
* [#4485](https://github.com/leanprover/lean4/pull/4485) fixes a bug where package versions in indirect dependencies would take precedence over those in direct dependencies.
* [#4478](https://github.com/leanprover/lean4/pull/4478) fixes a bug where Lake incorrectly included the module dynamic library in a platform-independent trace.
* [#4529](https://github.com/leanprover/lean4/pull/4529) fixes some issues with bad import errors. A bad import in an executable no longer prevents the executable's root module from being built. This also fixes a problem where the location of a transitive bad import would not be shown. The root module of the executable now respects `nativeFacets`.
* [#4564](https://github.com/leanprover/lean4/pull/4564) fixes a bug where non-identifier script names could not be entered on the CLI without French quotes.
* [#4566](https://github.com/leanprover/lean4/pull/4566) addresses a few issues with precompiled libraries.
  * Fixes a bug where Lake would always precompile the package of a module.
  * If a module is precompiled, it now precompiles its imports. Previously, it would only do this if it was itself imported.
* [#4495](https://github.com/leanprover/lean4/pull/4495), [#4692](https://github.com/leanprover/lean4/pull/4692), and [#4849](https://github.com/leanprover/lean4/pull/4849) add a new type of `require` that fetches package metadata from a registry API endpoint (e.g. Reservoir) and then clones the Git package using the information provided. To require such a dependency, the new syntax is:

  ```lean
  require <scope> / <pkg-name> [@ git <rev>]
  -- Examples:
  require "leanprover" / "doc-gen4"
  require "leanprover-community" / "proofwidgets" @ git "v0.0.39"
  ```

  Or in TOML:

  ```toml
  [[require]]
  name = "<pkg-name>"
  scope = "<scope>"
  rev = "<rev>"
  ```

  Unlike with Git dependencies, Lake can make use of the richer information provided by the registry to determine the default branch of the package. This means that for repositories of packages like `doc-gen4`, which have a default branch that is not `master`, Lake will now use that default branch (in `doc-gen4`'s case, `main`).

  Lake also supports configuring the registry endpoint via an environment variable: `RESERVOIR_API_URL`. Thus, any server providing an interface similar to Reservoir's can be used as the registry. Further configuration options paralleling Cargo's [Alternative Registries](https://doc.rust-lang.org/cargo/reference/registries.html) and [Source Replacement](https://doc.rust-lang.org/cargo/reference/source-replacement.html) will come in the future.

### DevOps/CI

* [#4427](https://github.com/leanprover/lean4/pull/4427) uses Namespace runners for CI for `leanprover/lean4`.
* [#4440](https://github.com/leanprover/lean4/pull/4440) fixes speedcenter tests in CI.
* [#4441](https://github.com/leanprover/lean4/pull/4441) fixes an issue where workflow changes would break CI for unrebased PRs.
* [#4442](https://github.com/leanprover/lean4/pull/4442) fixes the Wasm release-ci.
* [6d265b](https://github.com/leanprover/lean4/commit/6d265b42b117eef78089f479790587a399da7690) fixes `github.event.pull_request.merge_commit_sha` sometimes not being available.
* [16cad2](https://github.com/leanprover/lean4/commit/16cad2b45c6a77efe4dce850dcdbaafaa7c91fc3) optimizes CI to not fetch the complete history.
* [#4544](https://github.com/leanprover/lean4/pull/4544) causes releases to be marked as prerelease on GitHub.
* [#4446](https://github.com/leanprover/lean4/pull/4446) switches Lake to using `src/lake/lakefile.toml` to avoid needing to load a version of Lake to build Lake.
* Nix
  * [5eb5fa](https://github.com/leanprover/lean4/commit/5eb5fa49cf9862e99a5bccff8d4ca1a062f81900) fixes `update-stage0-commit` for Nix.
  * [#4476](https://github.com/leanprover/lean4/pull/4476) adds gdb to the Nix shell.
  * [e665a0](https://github.com/leanprover/lean4/commit/e665a0d716dc42ba79b339b95e01eb99fe932cb3) fixes `update-stage0` for Nix.
  * [4808eb](https://github.com/leanprover/lean4/commit/4808eb7c4bfb98f212b865f06a97d46c44978a61) fixes `cacheRoots` for Nix.
  * [#3811](https://github.com/leanprover/lean4/pull/3811) adds a platform-dependent flag to the lib target.
  * [#4587](https://github.com/leanprover/lean4/pull/4587) adds linking of `-lStd` back into the Nix build flags on Darwin.

### Breaking changes

* `Char.csize` is replaced by `Char.utf8Size` ([#4357](https://github.com/leanprover/lean4/pull/4357)).
* Library lemmas are now stated in terms of `(· == a)` rather than `(a == ·)` ([#3056](https://github.com/leanprover/lean4/pull/3056)).
* The normal forms for indexing into `List` and `Array` are now `xs[n]` and `xs[n]?` rather than functions like `List.get` ([#4400](https://github.com/leanprover/lean4/pull/4400)).
* Sometimes terms created via a sequence of unifications will be more eta-reduced than before, and proofs will require adaptation ([#4387](https://github.com/leanprover/lean4/pull/4387)).
* The `GetElem` class has been split into two; see the docstrings for `GetElem` and `GetElem?` for more information ([#4560](https://github.com/leanprover/lean4/pull/4560)).

v4.9.0
----------

### Language features, tactics, and metaprograms

@@ -40,6 +319,8 @@ v4.9.0
* [#4395](https://github.com/leanprover/lean4/pull/4395) adds a conservative fix for whitespace handling to avoid incremental reuse leading to goals in front of the text cursor being shown.
* [#4407](https://github.com/leanprover/lean4/pull/4407) fixes non-incremental commands in macros blocking further incremental reporting.
* [#4436](https://github.com/leanprover/lean4/pull/4436) fixes incremental reporting when there are nested tactics in terms.
* [#4459](https://github.com/leanprover/lean4/pull/4459) adds incrementality support for the `next` and `if` tactics.
* [#4554](https://github.com/leanprover/lean4/pull/4554) disables incrementality for tactics in terms in tactics.
* **Functional induction**
  * [#4135](https://github.com/leanprover/lean4/pull/4135) ensures that the names used for functional induction are reserved.
  * [#4327](https://github.com/leanprover/lean4/pull/4327) adds support for structural recursion on reflexive types.
@@ -85,7 +366,7 @@ v4.9.0
  When `index := false`, only the head function is taken into account, as in Lean 3. This feature can help users diagnose tricky simp failures or issues in code from libraries developed using Lean 3 and then ported to Lean 4.

  In the following example, it will report that `foo` is a problematic theorem.
  ```lean
  opaque f : Nat → Nat → Nat
@@ -105,7 +386,7 @@ v4.9.0
  opaque f : Nat → Nat → Nat

  @[simp] theorem foo : f x (no_index (x, y).2) = y := by sorry

  example : f a b ≤ b := by
    simp -- `foo` is still applied with `index := true`
  ```
@@ -123,6 +404,8 @@ v4.9.0
* [#4267](https://github.com/leanprover/lean4/pull/4267) causes signature elaboration errors to show even if there are parse errors in the body.
* [#4368](https://github.com/leanprover/lean4/pull/4368) improves error messages when numeric literals fail to synthesize an `OfNat` instance, including special messages warning when the expected type of the numeral can be a proposition.
* [#4643](https://github.com/leanprover/lean4/pull/4643) fixes an issue leading to nested error messages and info trees vanishing, where snapshot subtrees were not restored on reuse.
* [#4657](https://github.com/leanprover/lean4/pull/4657) calculates error suppression per snapshot, letting elaboration errors appear even when there are later parse errors ([RFC #3556](https://github.com/leanprover/lean4/issues/3556)).
* **Metaprogramming**
  * [#4167](https://github.com/leanprover/lean4/pull/4167) adds `Lean.MVarId.revertAll` to revert all free variables.
  * [#4169](https://github.com/leanprover/lean4/pull/4169) adds `Lean.MVarId.ensureNoMVar` to ensure the goal's target contains no expression metavariables.
@@ -239,6 +522,8 @@ v4.9.0
* [#4192](https://github.com/leanprover/lean4/pull/4192) fixes the restoration of info trees when the auto-bound implicit feature is activated, fixing a pretty printing error in hovers and strengthening the unused variable linter.
* [dfb496](https://github.com/leanprover/lean4/commit/dfb496a27123c3864571aec72f6278e2dad1cecf) fixes `declareBuiltin` to allow it to be called multiple times per declaration.
* [#4569](https://github.com/leanprover/lean4/pull/4569) fixes an issue introduced in a merge conflict, where the interrupt exception was swallowed by some `tryCatchRuntimeEx` uses.
* [b056a0](https://github.com/leanprover/lean4/commit/b056a0b395bb728512a3f3e83bf9a093059d4301) adapts kernel interruption to the new cancellation system.
* Cleanup: [#4112](https://github.com/leanprover/lean4/pull/4112), [#4126](https://github.com/leanprover/lean4/pull/4126), [#4091](https://github.com/leanprover/lean4/pull/4091), [#4139](https://github.com/leanprover/lean4/pull/4139), [#4153](https://github.com/leanprover/lean4/pull/4153).
* Tests: [030406](https://github.com/leanprover/lean4/commit/03040618b8f9b35b7b757858483e57340900cdc4), [#4133](https://github.com/leanprover/lean4/pull/4133).
|
||||
|
||||
@@ -249,6 +534,7 @@ v4.9.0
|
||||
* [#3915](https://github.com/leanprover/lean4/pull/3915) documents the runtime memory layout for inductive types.
|
||||
|
||||
### Lake
|
||||
* [#4518](https://github.com/leanprover/lean4/pull/4518) makes trace reading more robust. Lake now rebuilds if trace files are invalid or unreadable and is backwards compatible with previous pure numeric traces.
|
||||
* [#4057](https://github.com/leanprover/lean4/pull/4057) adds support for docstrings on `require` commands.
|
||||
* [#4088](https://github.com/leanprover/lean4/pull/4088) improves hovers for `family_def` and `library_data` commands.
|
||||
* [#4147](https://github.com/leanprover/lean4/pull/4147) adds default `README.md` to package templates
|
||||
@@ -298,12 +584,14 @@ v4.9.0
|
||||
* [#4333](https://github.com/leanprover/lean4/pull/4333) adjusts workflow to update Batteries in manifest when creating `lean-pr-testing-NNNN` Mathlib branches.
|
||||
* [#4355](https://github.com/leanprover/lean4/pull/4355) simplifies `lean4checker` step of release checklist.
|
||||
* [#4361](https://github.com/leanprover/lean4/pull/4361) adds installing elan to `pr-release` CI step.
|
||||
* [#4628](https://github.com/leanprover/lean4/pull/4628) fixes the Windows build, which was missing an exported symbol.
|
||||
|
||||
### Breaking changes
|
||||
While most changes could be considered to be a breaking change, this section makes special note of API changes.
|
||||
|
||||
* `Nat.zero_or` and `Nat.or_zero` have been swapped ([#4094](https://github.com/leanprover/lean4/pull/4094)).
|
||||
* `IsLawfulSingleton` is now `LawfulSingleton` ([#4350](https://github.com/leanprover/lean4/pull/4350)).
|
||||
* The `BitVec` literal notation is now `<num>#<term>` rather than `<term>#<term>`, and it is global rather than scoped. Use `BitVec.ofNat w x` rather than `x#w` when `x` is a not a numeric literal ([0d3051](https://github.com/leanprover/lean4/commit/0d30517dca094a07bcb462252f718e713b93ffba)).
|
||||
* `BitVec.rotateLeft` and `BitVec.rotateRight` now take the shift modulo the bitwidth ([#4229](https://github.com/leanprover/lean4/pull/4229)).
|
||||
* These are no longer simp lemmas:
|
||||
`List.length_pos` ([#4172](https://github.com/leanprover/lean4/pull/4172)),
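
As a quick illustration of the renamed and changed APIs above, a minimal Lean sketch (lemma statements abbreviated here; consult the library for the authoritative forms):

```lean
-- After the swap, `Nat.zero_or` concerns `0 ||| n` and `Nat.or_zero`
-- concerns `n ||| 0`.
example (n : Nat) : 0 ||| n = n := Nat.zero_or n
example (n : Nat) : n ||| 0 = n := Nat.or_zero n

-- The `BitVec` literal notation is now `<num>#<term>`:
example : BitVec 8 := 5#8
-- For a non-literal `x`, use `BitVec.ofNat` instead of `x#w`:
example (x : Nat) : BitVec 8 := BitVec.ofNat 8 x

-- Rotations now reduce the shift modulo the bitwidth, so rotating an
-- 8-bit vector by 9 behaves like rotating it by 1.
#eval (0x81#8).rotateLeft 9 == (0x81#8).rotateLeft 1
```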

@@ -5,7 +5,7 @@ Some notes on how to debug Lean, which may also be applicable to debugging Lean

## Tracing

In `CoreM` and derived monads, we use `trace![traceCls] "msg with {interpolations}"` to fill the structured trace viewable with `set_option trace.traceCls true`.
In `CoreM` and derived monads, we use `trace[traceCls] "msg with {interpolations}"` to fill the structured trace viewable with `set_option trace.traceCls true`.
New trace classes have to be registered using `registerTraceClass` first.
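
For illustration, registering and filling a (hypothetical) trace class might look like:

```lean
import Lean
open Lean

-- `debug.myPass` is a made-up class name for this sketch.
initialize registerTraceClass `debug.myPass

def myPass (e : Expr) : CoreM Unit := do
  -- Shown when `set_option trace.debug.myPass true` is set.
  trace[debug.myPass] "visiting {e}"
```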

Notable trace classes:
@@ -22,7 +22,9 @@ Notable trace classes:

In pure contexts or when execution is aborted before the messages are finally printed, one can instead use the term `dbg_trace "msg with {interpolations}"; val` (`;` can also be replaced by a newline), which will print the message to stderr before evaluating `val`. `dbgTraceVal val` can be used as a shorthand for `dbg_trace "{val}"; val`.
Note that if the return value is not actually used, the trace code is silently dropped as well.
In the language server, stderr output is buffered and shown as messages after a command has been elaborated, unless the option `server.stderrAsMessages` is deactivated.

By default, such stderr output is buffered and shown as messages after a command has been elaborated, which is necessary to ensure deterministic ordering of messages under parallelism.
If Lean aborts the process before it can finish the command or takes too long to do that, using `-DstderrAsMessages=false` avoids this buffering and shows `dbg_trace` output (but not `trace`s or other diagnostics) immediately.
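
A small sketch of `dbg_trace` in pure code:

```lean
def addTraced (x y : Nat) : Nat :=
  dbg_trace "adding {x} and {y}"  -- printed to stderr before evaluation
  x + y

-- The trace only fires because the result is actually used here.
#eval addTraced 2 3
```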

## Debuggers


@@ -5,8 +5,11 @@ See below for the checklist for release candidates.

We'll use `v4.6.0` as the intended release version as a running example.

- One week before the planned release, ensure that (1) someone has written the release notes and (2) someone has written the first draft of the release blog post.
If there is any material in `./releases_drafts/`, then the release notes are not done. (See the section "Writing the release notes".)
- One week before the planned release, ensure that
(1) someone has written the release notes and
(2) someone has written the first draft of the release blog post.
If there is any material in `./releases_drafts/` on the `releases/v4.6.0` branch, then the release notes are not done.
(See the section "Writing the release notes".)
- `git checkout releases/v4.6.0`
(This branch should already exist, from the release candidates.)
- `git pull`
@@ -144,33 +147,31 @@ We'll use `v4.7.0-rc1` as the intended release version in this example.
- You can monitor this at `https://github.com/leanprover/lean4/actions/workflows/ci.yml`, looking for the `v4.7.0-rc1` tag.
- This step can take up to an hour.
- (GitHub release notes) Once the release appears at https://github.com/leanprover/lean4/releases/
- Edit the release notes on Github to select the "Set as a pre-release" box.
- If release notes have been written already, copy the section of `RELEASES.md` for this version into the Github release notes
and use the title "Changes since v4.6.0 (from RELEASES.md)".
- Otherwise, in the "previous tag" dropdown, select `v4.6.0`, and click "Generate release notes".
- Verify that the release is marked as a prerelease (this should have been done automatically by the CI release job).
- In the "previous tag" dropdown, select `v4.6.0`, and click "Generate release notes".
This will add a list of all the commits since the last stable version.
- Delete anything already mentioned in the hand-written release notes above.
- Delete "update stage0" commits, and anything with a completely inscrutable commit message.
- Briefly rearrange the remaining items by category (e.g. `simp`, `lake`, `bug fixes`),
but for minor items don't put any work in expanding on commit messages.
- (How we want the release notes to look is evolving: please update this section if it looks wrong!)
- Next, we will move a curated list of downstream repos to the release candidate.
- This assumes that there is already a *reviewed* branch `bump/v4.7.0` on each repository
containing the required adaptations (or no adaptations are required).
The preparation of this branch is beyond the scope of this document.
- This assumes that for each repository either:
* There is already a *reviewed* branch `bump/v4.7.0` containing the required adaptations.
The preparation of this branch is beyond the scope of this document.
* The repository does not need any changes to move to the new version.
- For each of the target repositories:
- Checkout the `bump/v4.7.0` branch.
- Verify that the `lean-toolchain` is set to the nightly from which the release candidate was created.
- `git merge origin/master`
- Change the `lean-toolchain` to `leanprover/lean4:v4.7.0-rc1`
- In `lakefile.lean`, change any dependencies which were using `nightly-testing` or `bump/v4.7.0` branches
back to `master` or `main`, and run `lake update` for those dependencies.
- Run `lake build` to ensure that dependencies are found (but it's okay to stop it after a moment).
- `git commit`
- `git push`
- Open a PR from `bump/v4.7.0` to `master`, and either merge it yourself after CI, if appropriate,
or notify the maintainers that it is ready to go.
- Once this PR has been merged, tag `master` with `v4.7.0-rc1` and push this tag.
- If the repository does not need any changes (i.e. `bump/v4.7.0` does not exist) then create
a new PR updating `lean-toolchain` to `leanprover/lean4:v4.7.0-rc1` and running `lake update`.
- Otherwise:
- Checkout the `bump/v4.7.0` branch.
- Verify that the `lean-toolchain` is set to the nightly from which the release candidate was created.
- `git merge origin/master`
- Change the `lean-toolchain` to `leanprover/lean4:v4.7.0-rc1`
- In `lakefile.lean`, change any dependencies which were using `nightly-testing` or `bump/v4.7.0` branches
back to `master` or `main`, and run `lake update` for those dependencies.
- Run `lake build` to ensure that dependencies are found (but it's okay to stop it after a moment).
- `git commit`
- `git push`
- Open a PR from `bump/v4.7.0` to `master`, and either merge it yourself after CI, if appropriate,
or notify the maintainers that it is ready to go.
- Once the PR has been merged, tag `master` with `v4.7.0-rc1` and push this tag.
- We do this for the same list of repositories as for stable releases, see above.
As above, there are dependencies between these, and so the process above is iterative.
It greatly helps if you can merge the `bump/v4.7.0` PRs yourself!
@@ -187,6 +188,8 @@ We'll use `v4.7.0-rc1` as the intended release version in this example.
Please also make sure that whoever is handling social media knows the release is out.
- Begin the next development cycle (i.e. for `v4.8.0`) on the Lean repository, by making a PR that:
- Updates `src/CMakeLists.txt` to say `set(LEAN_VERSION_MINOR 8)`
- Replaces the "release notes will be copied" text in the `v4.6.0` section of `RELEASES.md` with the
finalized release notes from the `releases/v4.6.0` branch.
- Replaces the "development in progress" in the `v4.7.0` section of `RELEASES.md` with
```
Release candidate, release notes will be copied from branch `releases/v4.7.0` once completed.
@@ -222,12 +225,15 @@ Please read https://leanprover-community.github.io/contribute/tags_and_branches.
* This can either be done by the person managing this process directly,
or by soliciting assistance from authors of files, or generally helpful people on Zulip!
* Each repo has a `bump/v4.7.0` branch which accumulates reviewed changes adapting to new versions.
* Once `nightly-testing` is working on a given nightly, say `nightly-2024-02-15`, we:
* Once `nightly-testing` is working on a given nightly, say `nightly-2024-02-15`, we will create a PR to `bump/v4.7.0`.
* For Mathlib, there is a script in `scripts/create-adaptation-pr.sh` that automates this process.
* For Batteries and Aesop it is currently manual.
* For all of these repositories, the process is the same:
* Make sure `bump/v4.7.0` is up to date with `master` (by merging `master`, no PR necessary).
* Create from `bump/v4.7.0` a `bump/nightly-2024-02-15` branch.
* In that branch, `git merge --squash nightly-testing` to bring across changes from `nightly-testing`.
* In that branch, `git merge nightly-testing` to bring across changes from `nightly-testing`.
* Sanity check changes, commit, and make a PR to `bump/v4.7.0` from the `bump/nightly-2024-02-15` branch.
* Solicit review, merge the PR into `bump/v4,7,0`.
* Solicit review, merge the PR into `bump/v4.7.0`.
* It is always okay to merge in the following directions:
`master` -> `bump/v4.7.0` -> `bump/nightly-2024-02-15` -> `nightly-testing`.
Please remember to push any merges you make to intermediate steps!
@@ -239,7 +245,7 @@ The exact steps are a work in progress.
Here is the general idea:

* The work is done right on the `releases/v4.6.0` branch sometime after it is created but before the stable release is made.
The release notes for `v4.6.0` will be copied to `master`.
The release notes for `v4.6.0` will later be copied to `master` when we begin a new development cycle.
* There can be material for release notes entries in commit messages.
* There can also be pre-written entries in `./releases_drafts`, which should all be incorporated into the release notes and then deleted from the branch.
See `./releases_drafts/README.md` for more information.

138
doc/flake.lock
generated
@@ -18,12 +15,15 @@
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
},
"locked": {
"lastModified": 1656928814,
"narHash": "sha256-RIFfgBuKz6Hp89yRr7+NR5tzIAbn52h8vT6vXkYjZoM=",
"lastModified": 1710146030,
"narHash": "sha256-SZ5L6eA7HJ/nmkzGG7/ISclqe6oZdOZTNoesiInkXPQ=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "7e2a3b3dfd9af950a856d66b0a7d01e3c18aa249",
"rev": "b1d9ab70662946ef0850d488da1c9019f3a9752a",
"type": "github"
},
"original": {
@@ -35,13 +38,12 @@
"lean": {
"inputs": {
"flake-utils": "flake-utils",
"lean4-mode": "lean4-mode",
"nix": "nix",
"nixpkgs": "nixpkgs_2"
"nixpkgs": "nixpkgs",
"nixpkgs-old": "nixpkgs-old"
},
"locked": {
"lastModified": 0,
"narHash": "sha256-YnYbmG0oou1Q/GE4JbMNb8/yqUVXBPIvcdQQJHBqtPk=",
"narHash": "sha256-saRAtQ6VautVXKDw1XH35qwP0KEBKTKZbg/TRa4N9Vw=",
"path": "../.",
"type": "path"
},
@@ -50,22 +52,6 @@
"type": "path"
}
},
"lean4-mode": {
"flake": false,
"locked": {
"lastModified": 1659020985,
"narHash": "sha256-+dRaXB7uvN/weSZiKcfSKWhcdJVNg9Vg8k0pJkDNjpc=",
"owner": "leanprover",
"repo": "lean4-mode",
"rev": "37d5c99b7b29c80ab78321edd6773200deb0bca6",
"type": "github"
},
"original": {
"owner": "leanprover",
"repo": "lean4-mode",
"type": "github"
}
},
"leanInk": {
"flake": false,
"locked": {
@@ -83,22 +69,6 @@
"type": "github"
}
},
"lowdown-src": {
"flake": false,
"locked": {
"lastModified": 1633514407,
"narHash": "sha256-Dw32tiMjdK9t3ETl5fzGrutQTzh2rufgZV4A/BbxuD4=",
"owner": "kristapsdz",
"repo": "lowdown",
"rev": "d2c2b44ff6c27b936ec27358a2653caaef8f73b8",
"type": "github"
},
"original": {
"owner": "kristapsdz",
"repo": "lowdown",
"type": "github"
}
},
"mdBook": {
"flake": false,
"locked": {
@@ -115,65 +85,13 @@
"type": "github"
}
},
"nix": {
"inputs": {
"lowdown-src": "lowdown-src",
"nixpkgs": "nixpkgs",
"nixpkgs-regression": "nixpkgs-regression"
},
"locked": {
"lastModified": 1657097207,
"narHash": "sha256-SmeGmjWM3fEed3kQjqIAO8VpGmkC2sL1aPE7kKpK650=",
"owner": "NixOS",
"repo": "nix",
"rev": "f6316b49a0c37172bca87ede6ea8144d7d89832f",
"type": "github"
},
"original": {
"owner": "NixOS",
"repo": "nix",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1653988320,
"narHash": "sha256-ZaqFFsSDipZ6KVqriwM34T739+KLYJvNmCWzErjAg7c=",
"lastModified": 1710889954,
"narHash": "sha256-Pr6F5Pmd7JnNEMHHmspZ0qVqIBVxyZ13ik1pJtm2QXk=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "2fa57ed190fd6c7c746319444f34b5917666e5c1",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-22.05-small",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs-regression": {
"locked": {
"lastModified": 1643052045,
"narHash": "sha256-uGJ0VXIhWKGXxkeNnq4TvV3CIOkUJ3PAoLZ3HMzNVMw=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "215d4d0fd80ca5163643b03a33fde804a29cc1e2",
"type": "github"
},
"original": {
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "215d4d0fd80ca5163643b03a33fde804a29cc1e2",
"type": "github"
}
},
"nixpkgs_2": {
"locked": {
"lastModified": 1657208011,
"narHash": "sha256-BlIFwopAykvdy1DYayEkj6ZZdkn+cVgPNX98QVLc0jM=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "2770cc0b1e8faa0e20eb2c6aea64c256a706d4f2",
"rev": "7872526e9c5332274ea5932a0c3270d6e4724f3b",
"type": "github"
},
"original": {
@@ -183,6 +101,23 @@
"type": "github"
}
},
"nixpkgs-old": {
"flake": false,
"locked": {
"lastModified": 1581379743,
"narHash": "sha256-i1XCn9rKuLjvCdu2UeXKzGLF6IuQePQKFt4hEKRU5oc=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "34c7eb7545d155cc5b6f499b23a7cb1c96ab4d59",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-19.03",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"alectryon": "alectryon",
@@ -194,6 +129,21 @@
"leanInk": "leanInk",
"mdBook": "mdBook"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",

@@ -17,7 +17,7 @@
};

outputs = inputs@{ self, ... }: inputs.flake-utils.lib.eachDefaultSystem (system:
with inputs.lean.packages.${system}; with nixpkgs;
with inputs.lean.packages.${system}.deprecated; with nixpkgs;
let
doc-src = lib.sourceByRegex ../. ["doc.*" "tests(/lean(/beginEndAsMacro.lean)?)?"];
in {
@@ -44,21 +44,6 @@
mdbook build -d $out
'';
};
# We use a separate derivation instead of `checkPhase` so we can push it but not `doc` to the binary cache
test = stdenv.mkDerivation {
name ="lean-doc-test";
src = doc-src;
buildInputs = [ lean-mdbook stage1.Lean.lean-package strace ];
patchPhase = ''
cd doc
patchShebangs test
'';
buildPhase = ''
mdbook test
touch $out
'';
dontInstall = true;
};
leanInk = (buildLeanPackage {
name = "Main";
src = inputs.leanInk;

@@ -27,9 +27,9 @@ Setting up a basic parallelized release build:
git clone https://github.com/leanprover/lean4
cd lean4
cmake --preset release
make -C build/release -j$(nproc) # see below for macOS
make -C build/release -j$(nproc || sysctl -n hw.logicalcpu)
```
You can replace `$(nproc)`, which is not available on macOS and some alternative shells, with the desired parallelism amount.
You can replace `$(nproc || sysctl -n hw.logicalcpu)` with the desired parallelism amount.

The above commands will compile the Lean library and binaries into the
`stage1` subfolder; see below for details.

113
flake.lock
generated
@@ -1,21 +1,5 @@
{
"nodes": {
"flake-compat": {
"flake": false,
"locked": {
"lastModified": 1673956053,
"narHash": "sha256-4gtG9iQuiKITOjNQQeQIpoIB6b16fm+504Ch3sNKLd8=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "35bb57c0c8d8b62bbfd284272c928ceb64ddbde9",
"type": "github"
},
"original": {
"owner": "edolstra",
"repo": "flake-compat",
"type": "github"
}
},
"flake-utils": {
"inputs": {
"systems": "systems"
@@ -34,71 +18,18 @@
"type": "github"
}
},
"lean4-mode": {
"flake": false,
"locked": {
"lastModified": 1709737301,
"narHash": "sha256-uT9JN2kLNKJK9c/S/WxLjiHmwijq49EgLb+gJUSDpz0=",
"owner": "leanprover",
"repo": "lean4-mode",
"rev": "f1f24c15134dee3754b82c9d9924866fe6bc6b9f",
"type": "github"
},
"original": {
"owner": "leanprover",
"repo": "lean4-mode",
"type": "github"
}
},
"libgit2": {
"flake": false,
"locked": {
"lastModified": 1697646580,
"narHash": "sha256-oX4Z3S9WtJlwvj0uH9HlYcWv+x1hqp8mhXl7HsLu2f0=",
"owner": "libgit2",
"repo": "libgit2",
"rev": "45fd9ed7ae1a9b74b957ef4f337bc3c8b3df01b5",
"type": "github"
},
"original": {
"owner": "libgit2",
"repo": "libgit2",
"type": "github"
}
},
"nix": {
"inputs": {
"flake-compat": "flake-compat",
"libgit2": "libgit2",
"nixpkgs": "nixpkgs",
"nixpkgs-regression": "nixpkgs-regression"
},
"locked": {
"lastModified": 1711102798,
"narHash": "sha256-CXOIJr8byjolqG7eqCLa+Wfi7rah62VmLoqSXENaZnw=",
"owner": "NixOS",
"repo": "nix",
"rev": "a22328066416650471c3545b0b138669ea212ab4",
"type": "github"
},
"original": {
"owner": "NixOS",
"repo": "nix",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1709083642,
"narHash": "sha256-7kkJQd4rZ+vFrzWu8sTRtta5D1kBG0LSRYAfhtmMlSo=",
"lastModified": 1710889954,
"narHash": "sha256-Pr6F5Pmd7JnNEMHHmspZ0qVqIBVxyZ13ik1pJtm2QXk=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "b550fe4b4776908ac2a861124307045f8e717c8e",
"rev": "7872526e9c5332274ea5932a0c3270d6e4724f3b",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "release-23.11",
"ref": "nixpkgs-unstable",
"repo": "nixpkgs",
"type": "github"
}
@@ -120,44 +51,10 @@
"type": "github"
}
},
"nixpkgs-regression": {
"locked": {
"lastModified": 1643052045,
"narHash": "sha256-uGJ0VXIhWKGXxkeNnq4TvV3CIOkUJ3PAoLZ3HMzNVMw=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "215d4d0fd80ca5163643b03a33fde804a29cc1e2",
"type": "github"
},
"original": {
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "215d4d0fd80ca5163643b03a33fde804a29cc1e2",
"type": "github"
}
},
"nixpkgs_2": {
"locked": {
"lastModified": 1710889954,
"narHash": "sha256-Pr6F5Pmd7JnNEMHHmspZ0qVqIBVxyZ13ik1pJtm2QXk=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "7872526e9c5332274ea5932a0c3270d6e4724f3b",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixpkgs-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"flake-utils": "flake-utils",
"lean4-mode": "lean4-mode",
"nix": "nix",
"nixpkgs": "nixpkgs_2",
"nixpkgs": "nixpkgs",
"nixpkgs-old": "nixpkgs-old"
}
},

61
flake.nix
@@ -1,38 +1,21 @@
|
||||
{
|
||||
description = "Lean interactive theorem prover";
|
||||
description = "Lean development flake. Not intended for end users.";
|
||||
|
||||
inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
|
||||
# old nixpkgs used for portable release with older glibc (2.27)
|
||||
inputs.nixpkgs-old.url = "github:NixOS/nixpkgs/nixos-19.03";
|
||||
inputs.nixpkgs-old.flake = false;
|
||||
inputs.flake-utils.url = "github:numtide/flake-utils";
|
||||
inputs.nix.url = "github:NixOS/nix";
|
||||
inputs.lean4-mode = {
|
||||
url = "github:leanprover/lean4-mode";
|
||||
flake = false;
|
||||
};
|
||||
# used *only* by `stage0-from-input` below
|
||||
#inputs.lean-stage0 = {
|
||||
# url = github:leanprover/lean4;
|
||||
# inputs.nixpkgs.follows = "nixpkgs";
|
||||
# inputs.flake-utils.follows = "flake-utils";
|
||||
# inputs.nix.follows = "nix";
|
||||
# inputs.lean4-mode.follows = "lean4-mode";
|
||||
#};
|
||||
|
||||
outputs = { self, nixpkgs, nixpkgs-old, flake-utils, nix, lean4-mode, ... }@inputs: flake-utils.lib.eachDefaultSystem (system:
|
||||
outputs = { self, nixpkgs, nixpkgs-old, flake-utils, ... }@inputs: flake-utils.lib.eachDefaultSystem (system:
|
||||
let
|
||||
pkgs = import nixpkgs {
|
||||
inherit system;
|
||||
# for `vscode-with-extensions`
|
||||
config.allowUnfree = true;
|
||||
};
|
||||
pkgs = import nixpkgs { inherit system; };
|
||||
# An old nixpkgs for creating releases with an old glibc
|
||||
pkgsDist-old = import nixpkgs-old { inherit system; };
|
||||
# An old nixpkgs for creating releases with an old glibc
|
||||
pkgsDist-old-aarch = import nixpkgs-old { localSystem.config = "aarch64-unknown-linux-gnu"; };
|
||||
|
||||
lean-packages = pkgs.callPackage (./nix/packages.nix) { src = ./.; inherit nix lean4-mode; };
|
||||
lean-packages = pkgs.callPackage (./nix/packages.nix) { src = ./.; };
|
||||
|
||||
devShellWithDist = pkgsDist: pkgs.mkShell.override {
|
||||
stdenv = pkgs.overrideCC pkgs.stdenv lean-packages.llvmPackages.clang;
|
||||
@@ -58,41 +41,15 @@
|
||||
GDB = pkgsDist.gdb;
|
||||
});
|
||||
in {
|
||||
packages = lean-packages // rec {
|
||||
debug = lean-packages.override { debug = true; };
|
||||
stage0debug = lean-packages.override { stage0debug = true; };
|
||||
asan = lean-packages.override { extraCMakeFlags = [ "-DLEAN_EXTRA_CXX_FLAGS=-fsanitize=address" "-DLEANC_EXTRA_FLAGS=-fsanitize=address" "-DSMALL_ALLOCATOR=OFF" "-DSYMBOLIC=OFF" ]; };
|
||||
asandebug = asan.override { debug = true; };
|
||||
tsan = lean-packages.override {
|
||||
extraCMakeFlags = [ "-DLEAN_EXTRA_CXX_FLAGS=-fsanitize=thread" "-DLEANC_EXTRA_FLAGS=-fsanitize=thread" "-DCOMPRESSED_OBJECT_HEADER=OFF" ];
|
||||
stage0 = (lean-packages.override {
|
||||
# Compressed headers currently trigger data race reports in tsan.
|
||||
# Turn them off for stage 0 as well so stage 1 can read its own stdlib.
|
||||
extraCMakeFlags = [ "-DCOMPRESSED_OBJECT_HEADER=OFF" ];
|
||||
}).stage1;
|
||||
};
|
||||
tsandebug = tsan.override { debug = true; };
|
||||
stage0-from-input = lean-packages.override {
|
||||
stage0 = pkgs.writeShellScriptBin "lean" ''
|
||||
exec ${inputs.lean-stage0.packages.${system}.lean}/bin/lean -Dinterpreter.prefer_native=false "$@"
|
||||
'';
|
||||
};
|
||||
inherit self;
|
||||
packages = {
|
||||
# to be removed when Nix CI is not needed anymore
|
||||
inherit (lean-packages) cacheRoots test update-stage0-commit ciShell;
|
||||
deprecated = lean-packages;
|
||||
};
|
||||
defaultPackage = lean-packages.lean-all;
|
||||
|
||||
# The default development shell for working on lean itself
|
||||
devShells.default = devShellWithDist pkgs;
|
||||
devShells.oldGlibc = devShellWithDist pkgsDist-old;
|
||||
devShells.oldGlibcAArch = devShellWithDist pkgsDist-old-aarch;
|
||||
|
||||
checks.lean = lean-packages.test;
|
||||
}) // rec {
|
||||
templates.pkg = {
|
||||
path = ./nix/templates/pkg;
|
||||
description = "A custom Lean package";
|
||||
};
|
||||
|
||||
defaultTemplate = templates.pkg;
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
@@ -2,7 +2,7 @@
|
||||
stdenv, lib, cmake, gmp, git, gnumake, bash, buildLeanPackage, writeShellScriptBin, runCommand, symlinkJoin, lndir, perl, gnused, darwin, llvmPackages, linkFarmFromDrvs,
|
||||
... } @ args:
|
||||
with builtins;
|
||||
rec {
|
||||
lib.warn "The Nix-based build is deprecated" rec {
|
||||
inherit stdenv;
|
||||
sourceByRegex = p: rs: lib.sourceByRegex p (map (r: "(/src/)?${r}") rs);
|
||||
buildCMake = args: stdenv.mkDerivation ({
|
||||
|
||||
@@ -1,5 +1,5 @@
|
||||
{ lean, lean-leanDeps ? lean, lean-final ? lean, leanc,
|
||||
stdenv, lib, coreutils, gnused, writeShellScriptBin, bash, lean-emacs, lean-vscode, nix, substituteAll, symlinkJoin, linkFarmFromDrvs,
|
||||
stdenv, lib, coreutils, gnused, writeShellScriptBin, bash, substituteAll, symlinkJoin, linkFarmFromDrvs,
|
||||
runCommand, darwin, mkShell, ... }:
|
||||
let lean-final' = lean-final; in
|
||||
lib.makeOverridable (
|
||||
@@ -197,19 +197,6 @@ with builtins; let
|
||||
then map (m: m.module) header.imports
|
||||
else abort "errors while parsing imports of ${mod}:\n${lib.concatStringsSep "\n" header.errors}";
|
||||
in mkMod mod (map (dep: if modDepsMap ? ${dep} then modCandidates.${dep} else externalModMap.${dep}) deps)) modDepsMap;
|
||||
makeEmacsWrapper = name: emacs: lean: writeShellScriptBin name ''
|
||||
${emacs} --eval "(progn (setq lean4-rootdir \"${lean}\"))" "$@"
|
||||
'';
|
||||
makeVSCodeWrapper = name: lean: writeShellScriptBin name ''
|
||||
PATH=${lean}/bin:$PATH ${lean-vscode}/bin/code "$@"
|
||||
'';
|
||||
printPaths = deps: writeShellScriptBin "print-paths" ''
|
||||
echo '${toJSON {
|
||||
oleanPath = [(depRoot "print-paths" deps)];
|
||||
srcPath = ["."] ++ map (dep: dep.src) allExternalDeps;
|
||||
loadDynlibPaths = map pathOfSharedLib (loadDynlibsOfDeps deps);
|
||||
}}'
|
||||
'';
|
||||
expandGlob = g:
|
||||
if typeOf g == "string" then [g]
|
||||
else if g.glob == "one" then [g.mod]
|
||||
@@ -257,48 +244,4 @@ in rec {
|
||||
-o $out/bin/${executableName} \
|
||||
${lib.concatStringsSep " " allLinkFlags}
|
||||
'') {};
|
||||
|
||||
lean-package = writeShellScriptBin "lean" ''
|
||||
LEAN_PATH=${modRoot}:$LEAN_PATH LEAN_SRC_PATH=$LEAN_SRC_PATH:${src} exec ${lean-final}/bin/lean "$@"
|
||||
'';
|
||||
emacs-package = makeEmacsWrapper "emacs-package" lean-package;
|
||||
vscode-package = makeVSCodeWrapper "vscode-package" lean-package;
|
||||
|
||||
link-ilean = writeShellScriptBin "link-ilean" ''
|
||||
dest=''${1:-.}
|
||||
mkdir -p $dest/build/lib
|
||||
ln -sf ${iTree}/* $dest/build/lib
|
||||
'';
|
||||
|
||||
makePrintPathsFor = deps: mods: printPaths deps // mapAttrs (_: mod: makePrintPathsFor (deps ++ [mod]) mods) mods;
|
||||
print-paths = makePrintPathsFor [] (mods' // externalModMap);
|
||||
# `lean` wrapper that dynamically runs Nix for the actual `lean` executable so the same editor can be
|
||||
# used for multiple projects/after upgrading the `lean` input/for editing both stage 1 and the tests
|
||||
lean-bin-dev = substituteAll {
|
||||
name = "lean";
|
||||
dir = "bin";
|
||||
src = ./lean-dev.in;
|
||||
isExecutable = true;
|
||||
srcRoot = fullSrc; # use root flake.nix in case of Lean repo
|
||||
inherit bash nix srcTarget srcArgs;
|
||||
};
|
||||
lake-dev = substituteAll {
|
||||
name = "lake";
|
||||
dir = "bin";
|
||||
src = ./lake-dev.in;
|
||||
isExecutable = true;
|
||||
srcRoot = fullSrc; # use root flake.nix in case of Lean repo
|
||||
inherit bash nix srcTarget srcArgs;
|
||||
};
|
||||
lean-dev = symlinkJoin { name = "lean-dev"; paths = [ lean-bin-dev lake-dev ]; };
|
||||
emacs-dev = makeEmacsWrapper "emacs-dev" "${lean-emacs}/bin/emacs" lean-dev;
|
||||
emacs-path-dev = makeEmacsWrapper "emacs-path-dev" "emacs" lean-dev;
|
||||
vscode-dev = makeVSCodeWrapper "vscode-dev" lean-dev;
|
||||
|
||||
devShell = mkShell {
|
||||
buildInputs = [ nix ];
|
||||
shellHook = ''
|
||||
export LEAN_SRC_PATH="${srcPath}"
|
||||
'';
|
||||
};
|
||||
})
|
||||
|
||||
@@ -1,9 +1,6 @@
{ src, pkgs, nix, ... } @ args:
{ src, pkgs, ... } @ args:
with pkgs;
let
  nix-pinned = writeShellScriptBin "nix" ''
    ${nix.packages.${system}.default}/bin/nix --experimental-features 'nix-command flakes' --extra-substituters https://lean4.cachix.org/ --option warn-dirty false "$@"
  '';
  # https://github.com/NixOS/nixpkgs/issues/130963
  llvmPackages = if stdenv.isDarwin then llvmPackages_11 else llvmPackages_15;
  cc = (ccacheWrapper.override rec {
@@ -42,40 +39,9 @@ let
    inherit (lean) stdenv;
    lean = lean.stage1;
    inherit (lean.stage1) leanc;
    inherit lean-emacs lean-vscode;
    nix = nix-pinned;
  }));
  lean4-mode = emacsPackages.melpaBuild {
    pname = "lean4-mode";
    version = "1";
    commit = "1";
    src = args.lean4-mode;
    packageRequires = with pkgs.emacsPackages.melpaPackages; [ dash f flycheck magit-section lsp-mode s ];
    recipe = pkgs.writeText "recipe" ''
      (lean4-mode
       :repo "leanprover/lean4-mode"
       :fetcher github
       :files ("*.el" "data"))
    '';
  };
  lean-emacs = emacsWithPackages [ lean4-mode ];
  # updating might be nicer by building from source from a flake input, but this is good enough for now
  vscode-lean4 = vscode-utils.extensionFromVscodeMarketplace {
    name = "lean4";
    publisher = "leanprover";
    version = "0.0.63";
    sha256 = "sha256-kjEex7L0F2P4pMdXi4NIZ1y59ywJVubqDqsoYagZNkI=";
  };
  lean-vscode = vscode-with-extensions.override {
    vscodeExtensions = [ vscode-lean4 ];
  };
in {
  inherit cc lean4-mode buildLeanPackage llvmPackages vscode-lean4;
  lean = lean.stage1;
  stage0print-paths = lean.stage1.Lean.print-paths;
  HEAD-as-stage0 = (lean.stage1.Lean.overrideArgs { srcTarget = "..#stage0-from-input.stage0"; srcArgs = "(--override-input lean-stage0 ..\?rev=$(git rev-parse HEAD) -- -Dinterpreter.prefer_native=false \"$@\")"; });
  HEAD-as-stage1 = (lean.stage1.Lean.overrideArgs { srcTarget = "..\?rev=$(git rev-parse HEAD)#stage0"; });
  nix = nix-pinned;
  inherit cc buildLeanPackage llvmPackages;
  nixpkgs = pkgs;
  ciShell = writeShellScriptBin "ciShell" ''
    set -o pipefail
@@ -83,5 +49,4 @@ in {
    # prefix lines with cumulative and individual execution time
    "$@" |& ts -i "(%.S)]" | ts -s "[%M:%S"
  '';
  vscode = lean-vscode;
} // lean.stage1.Lean // lean.stage1 // lean
} // lean.stage1
@@ -1,45 +0,0 @@
A new linter flags situations where a local variable's name is one of
the argumentless constructors of its type. This can arise when a user either
doesn't open a namespace or doesn't add a dot or leading qualifier, as
in the following:

````
inductive Tree (α : Type) where
  | leaf
  | branch (left : Tree α) (val : α) (right : Tree α)

def depth : Tree α → Nat
  | leaf => 0
````

With this linter, the `leaf` pattern is highlighted as a local
variable whose name overlaps with the constructor `Tree.leaf`.

The linter can be disabled with `set_option linter.constructorNameAsVariable false`.

Additionally, the error message for an invalid name in a pattern that takes arguments now suggests similar names that would be valid. This means that the following definition:

```
def length (list : List α) : Nat :=
  match list with
  | nil => 0
  | cons x xs => length xs + 1
```

now results in the following warning:

```
warning: Local variable 'nil' resembles constructor 'List.nil' - write '.nil' (with a dot) or 'List.nil' to use the constructor.
note: this linter can be disabled with `set_option linter.constructorNameAsVariable false`
```

and error:

```
invalid pattern, constructor or constant marked with '[match_pattern]' expected

Suggestion: 'List.cons' is similar
```

#4301
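For illustration, the corrected form that the warning asks for looks like this (a sketch, not part of the original release note):

```lean
-- Dot notation (or a `List.` prefix) resolves the patterns to the
-- constructors `List.nil` and `List.cons`, so no warning is emitted.
def length (list : List α) : Nat :=
  match list with
  | .nil => 0
  | .cons _ xs => length xs + 1
```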
@@ -1204,12 +1204,12 @@ def Prod.map {α₁ : Type u₁} {α₂ : Type u₂} {β₁ : Type v₁} {β₂

/-! # Dependent products -/

theorem PSigma.exists {α : Sort u} {p : α → Prop} : (PSigma (fun x => p x)) → Exists (fun x => p x)
theorem Exists.of_psigma_prop {α : Sort u} {p : α → Prop} : (PSigma (fun x => p x)) → Exists (fun x => p x)
  | ⟨x, hx⟩ => ⟨x, hx⟩

@[deprecated PSigma.exists (since := "2024-07-27")]
@[deprecated Exists.of_psigma_prop (since := "2024-07-27")]
theorem ex_of_PSigma {α : Type u} {p : α → Prop} : (PSigma (fun x => p x)) → Exists (fun x => p x) :=
  PSigma.exists
  Exists.of_psigma_prop

protected theorem PSigma.eta {α : Sort u} {β : α → Sort v} {a₁ a₂ : α} {b₁ : β a₁} {b₂ : β a₂}
    (h₁ : a₁ = a₂) (h₂ : Eq.ndrec b₁ h₁ = b₂) : PSigma.mk a₁ b₁ = PSigma.mk a₂ b₂ := by
@@ -108,7 +108,7 @@ def swap (a : Array α) (i j : @& Fin a.size) : Array α :=
  a'.set (size_set a i v₂ ▸ j) v₁

/--
Swaps two entries in an array, or panics if either index is out of bounds.
Swaps two entries in an array, or returns the array unchanged if either index is out of bounds.

This will perform the update destructively provided that `a` has a reference
count of 1 when called.
@@ -7,6 +7,7 @@ prelude
import Init.Data.Nat.MinMax
import Init.Data.Nat.Lemmas
import Init.Data.List.Monadic
import Init.Data.List.Nat.Range
import Init.Data.Fin.Basic
import Init.Data.Array.Mem
import Init.TacticsExtra
@@ -336,6 +337,10 @@ theorem not_mem_nil (a : α) : ¬ a ∈ #[] := nofun

/-- # get lemmas -/

theorem lt_of_getElem {x : α} {a : Array α} {idx : Nat} {hidx : idx < a.size} (_ : a[idx] = x) :
    idx < a.size :=
  hidx

theorem getElem?_mem {l : Array α} {i : Fin l.size} : l[i] ∈ l := by
  erw [Array.mem_def, getElem_eq_data_getElem]
  apply List.get_mem
@@ -505,6 +510,13 @@ theorem size_eq_length_data (as : Array α) : as.size = as.data.length := rfl
  simp only [mkEmpty_eq, size_push] at *
  omega

@[simp] theorem data_range (n : Nat) : (range n).data = List.range n := by
  induction n <;> simp_all [range, Nat.fold, flip, List.range_succ]

@[simp]
theorem getElem_range {n : Nat} {x : Nat} (h : x < (Array.range n).size) : (Array.range n)[x] = x := by
  simp [getElem_eq_data_getElem]

set_option linter.deprecated false in
@[simp] theorem reverse_data (a : Array α) : a.reverse.data = a.data.reverse := by
  let rec go (as : Array α) (i j hj)
@@ -707,13 +719,22 @@ theorem mapIdx_spec (as : Array α) (f : Fin as.size → α → β)
  unfold modify modifyM Id.run
  split <;> simp

theorem get_modify {arr : Array α} {x i} (h : i < arr.size) :
    (arr.modify x f).get ⟨i, by simp [h]⟩ =
    if x = i then f (arr.get ⟨i, h⟩) else arr.get ⟨i, h⟩ := by
  simp [modify, modifyM, Id.run]; split
  · simp [get_set _ _ _ h]; split <;> simp [*]
theorem getElem_modify {as : Array α} {x i} (h : i < as.size) :
    (as.modify x f)[i]'(by simp [h]) = if x = i then f as[i] else as[i] := by
  simp only [modify, modifyM, get_eq_getElem, Id.run, Id.pure_eq]
  split
  · simp only [Id.bind_eq, get_set _ _ _ h]; split <;> simp [*]
  · rw [if_neg (mt (by rintro rfl; exact h) ‹_›)]

theorem getElem_modify_self {as : Array α} {i : Nat} (h : i < as.size) (f : α → α) :
    (as.modify i f)[i]'(by simp [h]) = f as[i] := by
  simp [getElem_modify h]

theorem getElem_modify_of_ne {as : Array α} {i : Nat} (hj : j < as.size)
    (f : α → α) (h : i ≠ j) :
    (as.modify i f)[j]'(by rwa [size_modify]) = as[j] := by
  simp [getElem_modify hj, h]

/-! ### filter -/

@[simp] theorem filter_data (p : α → Bool) (l : Array α) :
@@ -20,6 +20,8 @@ We define many of the bitvector operations from the
of SMT-LIBv2.
-/

set_option linter.missingDocs true

/--
A bitvector of the specified width.

@@ -34,14 +36,14 @@ structure BitVec (w : Nat) where
  O(1), because we use `Fin` as the internal representation of a bitvector. -/
  toFin : Fin (2^w)

@[deprecated (since := "2024-04-12")]
protected abbrev Std.BitVec := _root_.BitVec

/--
Bitvectors have decidable equality. This should be used via the instance `DecidableEq (BitVec n)`.
-/
-- We manually derive the `DecidableEq` instances for `BitVec` because
-- we want to have builtin support for bit-vector literals, and we
-- need a name for this function to implement `canUnfoldAtMatcher` at `WHNF.lean`.
def BitVec.decEq (a b : BitVec n) : Decidable (a = b) :=
  match a, b with
def BitVec.decEq (x y : BitVec n) : Decidable (x = y) :=
  match x, y with
  | ⟨n⟩, ⟨m⟩ =>
    if h : n = m then
      isTrue (h ▸ rfl)
@@ -67,9 +69,9 @@ protected def ofNat (n : Nat) (i : Nat) : BitVec n where
instance instOfNat : OfNat (BitVec n) i where ofNat := .ofNat n i
instance natCastInst : NatCast (BitVec w) := ⟨BitVec.ofNat w⟩

/-- Given a bitvector `a`, return the underlying `Nat`. This is O(1) because `BitVec` is a
/-- Given a bitvector `x`, return the underlying `Nat`. This is O(1) because `BitVec` is a
(zero-cost) wrapper around a `Nat`. -/
protected def toNat (a : BitVec n) : Nat := a.toFin.val
protected def toNat (x : BitVec n) : Nat := x.toFin.val

/-- Return the bound in terms of toNat. -/
theorem isLt (x : BitVec w) : x.toNat < 2^w := x.toFin.isLt
@@ -121,18 +123,18 @@ section getXsb
@[inline] def getMsb (x : BitVec w) (i : Nat) : Bool := i < w && getLsb x (w-1-i)

/-- Return most-significant bit in bitvector. -/
@[inline] protected def msb (a : BitVec n) : Bool := getMsb a 0
@[inline] protected def msb (x : BitVec n) : Bool := getMsb x 0

end getXsb
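As a quick sanity check of the indexing conventions above (an illustrative sketch, not part of the diff): `getLsb` counts from the least-significant end, `getMsb` from the most-significant end, and `msb` is `getMsb x 0`.

```lean
-- At width 4, the value 0b1000 has its only set bit at LSB index 3,
-- which is MSB index 0.
example : (0b1000#4).getLsb 3 = true := by decide
example : (0b1000#4).getMsb 0 = true := by decide
example : (0b1000#4).msb = true := by decide
```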

section Int

/-- Interpret the bitvector as an integer stored in two's complement form. -/
protected def toInt (a : BitVec n) : Int :=
  if 2 * a.toNat < 2^n then
    a.toNat
protected def toInt (x : BitVec n) : Int :=
  if 2 * x.toNat < 2^n then
    x.toNat
  else
    (a.toNat : Int) - (2^n : Nat)
    (x.toNat : Int) - (2^n : Nat)
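The two's complement reading can be illustrated concretely (illustrative checks, not part of the diff):

```lean
-- Values below 2^(n-1) are non-negative; values at or above it wrap
-- around to negatives: at width 4, 9 maps to 9 - 16 = -7.
example : (7#4).toInt = 7 := by decide
example : (9#4).toInt = -7 := by decide
```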

/-- The `BitVec` with value `(2^n + (i mod 2^n)) mod 2^n`. -/
protected def ofInt (n : Nat) (i : Int) : BitVec n := .ofNatLt (i % (Int.ofNat (2^n))).toNat (by
@@ -213,7 +215,7 @@ instance : Neg (BitVec n) := ⟨.neg⟩
/--
Return the absolute value of a signed bitvector.
-/
protected def abs (s : BitVec n) : BitVec n := if s.msb then .neg s else s
protected def abs (x : BitVec n) : BitVec n := if x.msb then .neg x else x

/--
Multiplication for bit vectors. This can be interpreted as either signed or unsigned multiplication
@@ -260,12 +262,12 @@ sdiv 5#4 -2 = -2#4
sdiv (-7#4) (-2) = 3#4
```
-/
def sdiv (s t : BitVec n) : BitVec n :=
  match s.msb, t.msb with
  | false, false => udiv s t
  | false, true  => .neg (udiv s (.neg t))
  | true,  false => .neg (udiv (.neg s) t)
  | true,  true  => udiv (.neg s) (.neg t)
def sdiv (x y : BitVec n) : BitVec n :=
  match x.msb, y.msb with
  | false, false => udiv x y
  | false, true  => .neg (udiv x (.neg y))
  | true,  false => .neg (udiv (.neg x) y)
  | true,  true  => udiv (.neg x) (.neg y)

/--
Signed division for bit vectors using SMTLIB rules for division by zero.
@@ -274,40 +276,40 @@ Specifically, `smtSDiv x 0 = if x >= 0 then -1 else 1`

SMT-Lib name: `bvsdiv`.
-/
def smtSDiv (s t : BitVec n) : BitVec n :=
  match s.msb, t.msb with
  | false, false => smtUDiv s t
  | false, true  => .neg (smtUDiv s (.neg t))
  | true,  false => .neg (smtUDiv (.neg s) t)
  | true,  true  => smtUDiv (.neg s) (.neg t)
def smtSDiv (x y : BitVec n) : BitVec n :=
  match x.msb, y.msb with
  | false, false => smtUDiv x y
  | false, true  => .neg (smtUDiv x (.neg y))
  | true,  false => .neg (smtUDiv (.neg x) y)
  | true,  true  => smtUDiv (.neg x) (.neg y)

/--
Remainder for signed division rounding to zero.

SMT_Lib name: `bvsrem`.
-/
def srem (s t : BitVec n) : BitVec n :=
  match s.msb, t.msb with
  | false, false => umod s t
  | false, true  => umod s (.neg t)
  | true,  false => .neg (umod (.neg s) t)
  | true,  true  => .neg (umod (.neg s) (.neg t))
def srem (x y : BitVec n) : BitVec n :=
  match x.msb, y.msb with
  | false, false => umod x y
  | false, true  => umod x (.neg y)
  | true,  false => .neg (umod (.neg x) y)
  | true,  true  => .neg (umod (.neg x) (.neg y))

/--
Remainder for signed division rounded to negative infinity.

SMT_Lib name: `bvsmod`.
-/
def smod (s t : BitVec m) : BitVec m :=
  match s.msb, t.msb with
  | false, false => umod s t
def smod (x y : BitVec m) : BitVec m :=
  match x.msb, y.msb with
  | false, false => umod x y
  | false, true =>
    let u := umod s (.neg t)
    (if u = .zero m then u else .add u t)
    let u := umod x (.neg y)
    (if u = .zero m then u else .add u y)
  | true, false =>
    let u := umod (.neg s) t
    (if u = .zero m then u else .sub t u)
  | true, true => .neg (umod (.neg s) (.neg t))
    let u := umod (.neg x) y
    (if u = .zero m then u else .sub y u)
  | true, true => .neg (umod (.neg x) (.neg y))

end arithmetic
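The difference between the two signed-remainder conventions shows up at mixed signs (illustrative width-4 checks, not part of the diff; `-(1#4)` denotes `15#4`):

```lean
-- srem rounds toward zero, so the remainder takes the sign of the
-- dividend; smod rounds toward negative infinity, so it takes the
-- sign of the divisor: 7 = (-4) * (-2) + (-1).
example : BitVec.srem (7#4) (-(2#4)) = 1#4 := by decide
example : BitVec.smod (7#4) (-(2#4)) = -(1#4) := by decide
```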

@@ -371,8 +373,8 @@ end relations

section cast

/-- `cast eq i` embeds `i` into an equal `BitVec` type. -/
@[inline] def cast (eq : n = m) (i : BitVec n) : BitVec m := .ofNatLt i.toNat (eq ▸ i.isLt)
/-- `cast eq x` embeds `x` into an equal `BitVec` type. -/
@[inline] def cast (eq : n = m) (x : BitVec n) : BitVec m := .ofNatLt x.toNat (eq ▸ x.isLt)

@[simp] theorem cast_ofNat {n m : Nat} (h : n = m) (x : Nat) :
    cast h (BitVec.ofNat n x) = BitVec.ofNat m x := by
@@ -389,7 +391,7 @@ Extraction of bits `start` to `start + len - 1` from a bit vector of size `n` to
new bitvector of size `len`. If `start + len > n`, then the vector will be zero-padded in the
high bits.
-/
def extractLsb' (start len : Nat) (a : BitVec n) : BitVec len := .ofNat _ (a.toNat >>> start)
def extractLsb' (start len : Nat) (x : BitVec n) : BitVec len := .ofNat _ (x.toNat >>> start)

/--
Extraction of bits `hi` (inclusive) down to `lo` (inclusive) from a bit vector of size `n` to
@@ -397,12 +399,12 @@ yield a new bitvector of size `hi - lo + 1`.

SMT-Lib name: `extract`.
-/
def extractLsb (hi lo : Nat) (a : BitVec n) : BitVec (hi - lo + 1) := extractLsb' lo _ a
def extractLsb (hi lo : Nat) (x : BitVec n) : BitVec (hi - lo + 1) := extractLsb' lo _ x
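For instance (an illustrative check, not part of the diff), extracting bits 3 down to 2 of `0b1011#4` keeps the two high bits:

```lean
-- hi = 3, lo = 2: the result has width 3 - 2 + 1 = 2 and value 0b10.
example : BitVec.extractLsb 3 2 (0b1011#4) = 0b10#2 := by decide
```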

/--
A version of `zeroExtend` that requires a proof, but is a noop.
-/
def zeroExtend' {n w : Nat} (le : n ≤ w) (x : BitVec n) : BitVec w :=
  x.toNat#'(by
    apply Nat.lt_of_lt_of_le x.isLt
    exact Nat.pow_le_pow_of_le_right (by trivial) le)
@@ -411,8 +413,8 @@ def zeroExtend' {n w : Nat} (le : n ≤ w) (x : BitVec n) : BitVec w :=
`shiftLeftZeroExtend x n` returns `zeroExtend (w+n) x <<< n` without
needing to compute `x % 2^(w+n)`.
-/
def shiftLeftZeroExtend (msbs : BitVec w) (m : Nat) : BitVec (w+m) :=
  let shiftLeftLt {x : Nat} (p : x < 2^w) (m : Nat) : x <<< m < 2^(w+m) := by
def shiftLeftZeroExtend (msbs : BitVec w) (m : Nat) : BitVec (w + m) :=
  let shiftLeftLt {x : Nat} (p : x < 2^w) (m : Nat) : x <<< m < 2^(w + m) := by
    simp [Nat.shiftLeft_eq, Nat.pow_add]
    apply Nat.mul_lt_mul_of_pos_right p
    exact (Nat.two_pow_pos m)
@@ -500,24 +502,24 @@ instance : Complement (BitVec w) := ⟨.not⟩

/--
Left shift for bit vectors. The low bits are filled with zeros. As a numeric operation, this is
equivalent to `a * 2^s`, modulo `2^n`.
equivalent to `x * 2^s`, modulo `2^n`.

SMT-Lib name: `bvshl` except this operator uses a `Nat` shift value.
-/
protected def shiftLeft (a : BitVec n) (s : Nat) : BitVec n := BitVec.ofNat n (a.toNat <<< s)
protected def shiftLeft (x : BitVec n) (s : Nat) : BitVec n := BitVec.ofNat n (x.toNat <<< s)
instance : HShiftLeft (BitVec w) Nat (BitVec w) := ⟨.shiftLeft⟩

/--
(Logical) right shift for bit vectors. The high bits are filled with zeros.
As a numeric operation, this is equivalent to `a / 2^s`, rounding down.
As a numeric operation, this is equivalent to `x / 2^s`, rounding down.

SMT-Lib name: `bvlshr` except this operator uses a `Nat` shift value.
-/
def ushiftRight (a : BitVec n) (s : Nat) : BitVec n :=
  (a.toNat >>> s)#'(by
    let ⟨a, lt⟩ := a
def ushiftRight (x : BitVec n) (s : Nat) : BitVec n :=
  (x.toNat >>> s)#'(by
    let ⟨x, lt⟩ := x
    simp only [BitVec.toNat, Nat.shiftRight_eq_div_pow, Nat.div_lt_iff_lt_mul (Nat.two_pow_pos s)]
    rw [←Nat.mul_one a]
    rw [←Nat.mul_one x]
    exact Nat.mul_lt_mul_of_lt_of_le' lt (Nat.two_pow_pos s) (Nat.le_refl 1))

instance : HShiftRight (BitVec w) Nat (BitVec w) := ⟨.ushiftRight⟩
@@ -525,15 +527,24 @@ instance : HShiftRight (BitVec w) Nat (BitVec w) := ⟨.ushiftRight⟩
/--
Arithmetic right shift for bit vectors. The high bits are filled with the
most-significant bit.
As a numeric operation, this is equivalent to `a.toInt >>> s`.
As a numeric operation, this is equivalent to `x.toInt >>> s`.

SMT-Lib name: `bvashr` except this operator uses a `Nat` shift value.
-/
def sshiftRight (a : BitVec n) (s : Nat) : BitVec n := .ofInt n (a.toInt >>> s)
def sshiftRight (x : BitVec n) (s : Nat) : BitVec n := .ofInt n (x.toInt >>> s)
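The contrast between the logical and arithmetic right shifts shows up on values with the most-significant bit set (illustrative width-4 checks, not part of the diff):

```lean
-- ushiftRight fills the high bits with zeros; sshiftRight replicates
-- the sign bit, matching two's complement division: -8 >>> 2 = -2.
example : BitVec.ushiftRight (0b1000#4) 2 = 0b0010#4 := by decide
example : BitVec.sshiftRight (0b1000#4) 2 = 0b1110#4 := by decide
```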

instance {n} : HShiftLeft (BitVec m) (BitVec n) (BitVec m) := ⟨fun x y => x <<< y.toNat⟩
instance {n} : HShiftRight (BitVec m) (BitVec n) (BitVec m) := ⟨fun x y => x >>> y.toNat⟩

/--
Arithmetic right shift for bit vectors. The high bits are filled with the
most-significant bit.
As a numeric operation, this is equivalent to `a.toInt >>> s.toNat`.

SMT-Lib name: `bvashr`.
-/
def sshiftRight' (a : BitVec n) (s : BitVec m) : BitVec n := a.sshiftRight s.toNat

/-- Auxiliary function for `rotateLeft`, which does not take into account the case where
the rotation amount is greater than the bitvector width. -/
def rotateLeftAux (x : BitVec w) (n : Nat) : BitVec w :=
@@ -583,11 +594,9 @@ instance : HAppend (BitVec w) (BitVec v) (BitVec (w + v)) := ⟨.append⟩
-- TODO: write this using multiplication
/-- `replicate i x` concatenates `i` copies of `x` into a new vector of length `w*i`. -/
def replicate : (i : Nat) → BitVec w → BitVec (w*i)
  | 0, _ => 0
  | 0, _ => 0#0
  | n+1, x =>
    have hEq : w + w*n = w*(n + 1) := by
      rw [Nat.mul_add, Nat.add_comm, Nat.mul_one]
    hEq ▸ (x ++ replicate n x)
    (x ++ replicate n x).cast (by rw [Nat.mul_succ]; omega)
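The behaviour of `replicate` on a small input (an illustrative check, not part of the diff; the result width is `w * i`):

```lean
-- Two copies of the two-bit pattern 01 give the four-bit pattern 0101.
example : BitVec.replicate 2 (0b01#2) = 0b0101#4 := by decide
```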

/-!
### Cons and Concat

@@ -28,6 +28,8 @@ https://github.com/mhk119/lean-smt/blob/bitvec/Smt/Data/Bitwise.lean.

-/

set_option linter.missingDocs true

open Nat Bool

namespace Bool
@@ -287,18 +289,18 @@ theorem sle_eq_carry (x y : BitVec w) :
A recurrence that describes multiplication as repeated addition.
It is useful for bitblasting multiplication.
-/
def mulRec (l r : BitVec w) (s : Nat) : BitVec w :=
  let cur := if r.getLsb s then (l <<< s) else 0
def mulRec (x y : BitVec w) (s : Nat) : BitVec w :=
  let cur := if y.getLsb s then (x <<< s) else 0
  match s with
  | 0 => cur
  | s + 1 => mulRec l r s + cur
  | s + 1 => mulRec x y s + cur

theorem mulRec_zero_eq (l r : BitVec w) :
    mulRec l r 0 = if r.getLsb 0 then l else 0 := by
theorem mulRec_zero_eq (x y : BitVec w) :
    mulRec x y 0 = if y.getLsb 0 then x else 0 := by
  simp [mulRec]

theorem mulRec_succ_eq (l r : BitVec w) (s : Nat) :
    mulRec l r (s + 1) = mulRec l r s + if r.getLsb (s + 1) then (l <<< (s + 1)) else 0 := rfl
theorem mulRec_succ_eq (x y : BitVec w) (s : Nat) :
    mulRec x y (s + 1) = mulRec x y s + if y.getLsb (s + 1) then (x <<< (s + 1)) else 0 := rfl
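Unfolding the recurrence on a small case (an illustrative check, not part of the diff): running it over all bits of the second operand recovers ordinary multiplication, as the recurrence lemmas below predict.

```lean
-- Partial products for 3 * 5 at width 4: bit 0 of 5 contributes 3,
-- bit 2 contributes 3 <<< 2 = 12, so the sum is 15 = 3 * 5.
example : BitVec.mulRec (3#4) (5#4) 3 = 3#4 * 5#4 := by decide
```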

/--
Recurrence lemma: truncating to `i+1` bits and then zero extending to `w`
@@ -324,29 +326,29 @@ theorem zeroExtend_truncate_succ_eq_zeroExtend_truncate_add_twoPow (x : BitVec w
  by_cases hi : x.getLsb i <;> simp [hi] <;> omega

/--
Recurrence lemma: multiplying `l` with the first `s` bits of `r` is the
same as truncating `r` to `s` bits, then zero extending to the original length,
Recurrence lemma: multiplying `x` with the first `s` bits of `y` is the
same as truncating `y` to `s` bits, then zero extending to the original length,
and performing the multiplication. -/
theorem mulRec_eq_mul_signExtend_truncate (l r : BitVec w) (s : Nat) :
    mulRec l r s = l * ((r.truncate (s + 1)).zeroExtend w) := by
theorem mulRec_eq_mul_signExtend_truncate (x y : BitVec w) (s : Nat) :
    mulRec x y s = x * ((y.truncate (s + 1)).zeroExtend w) := by
  induction s
  case zero =>
    simp only [mulRec_zero_eq, ofNat_eq_ofNat, Nat.reduceAdd]
    by_cases r.getLsb 0
    case pos hr =>
      simp only [hr, ↓reduceIte, truncate, zeroExtend_one_eq_ofBool_getLsb_zero,
        hr, ofBool_true, ofNat_eq_ofNat]
    by_cases y.getLsb 0
    case pos hy =>
      simp only [hy, ↓reduceIte, truncate, zeroExtend_one_eq_ofBool_getLsb_zero,
        ofBool_true, ofNat_eq_ofNat]
      rw [zeroExtend_ofNat_one_eq_ofNat_one_of_lt (by omega)]
      simp
    case neg hr =>
      simp [hr, zeroExtend_one_eq_ofBool_getLsb_zero]
    case neg hy =>
      simp [hy, zeroExtend_one_eq_ofBool_getLsb_zero]
  case succ s' hs =>
    rw [mulRec_succ_eq, hs]
    have heq :
      (if r.getLsb (s' + 1) = true then l <<< (s' + 1) else 0) =
      (l * (r &&& (BitVec.twoPow w (s' + 1)))) := by
      (if y.getLsb (s' + 1) = true then x <<< (s' + 1) else 0) =
      (x * (y &&& (BitVec.twoPow w (s' + 1)))) := by
      simp only [ofNat_eq_ofNat, and_twoPow]
      by_cases hr : r.getLsb (s' + 1) <;> simp [hr]
      by_cases hy : y.getLsb (s' + 1) <;> simp [hy]
    rw [heq, ← BitVec.mul_add, ← zeroExtend_truncate_succ_eq_zeroExtend_truncate_add_twoPow]

theorem getLsb_mul (x y : BitVec w) (i : Nat) :
@@ -403,12 +405,8 @@ theorem shiftLeftRec_eq {x : BitVec w₁} {y : BitVec w₂} {n : Nat} :
  induction n generalizing x y
  case zero =>
    ext i
    simp only [shiftLeftRec_zero, twoPow_zero, Nat.reduceAdd, truncate_one]
    suffices (y &&& 1#w₂) = zeroExtend w₂ (ofBool (y.getLsb 0)) by simp [this]
    ext i
    by_cases h : (↑i : Nat) = 0
    · simp [h, Bool.and_comm]
    · simp [h]; omega
    simp only [shiftLeftRec_zero, twoPow_zero, Nat.reduceAdd, truncate_one,
      and_one_eq_zeroExtend_ofBool_getLsb]
  case succ n ih =>
    simp only [shiftLeftRec_succ, and_twoPow]
    rw [ih]
@@ -431,4 +429,128 @@ theorem shiftLeft_eq_shiftLeftRec (x : BitVec w₁) (y : BitVec w₂) :
  · simp [of_length_zero]
  · simp [shiftLeftRec_eq]

/- ### Arithmetic shift right (sshiftRight) recurrence -/

/--
`sshiftRightRec x y n` shifts `x` arithmetically/signed to the right by the first `n` bits of `y`.
The theorem `sshiftRight_eq_sshiftRightRec` proves the equivalence of `(x.sshiftRight y)` and `sshiftRightRec`.
Together with the equations `sshiftRightRec_zero` and `sshiftRightRec_succ`,
this allows us to unfold `sshiftRight` into a circuit for bitblasting.
-/
def sshiftRightRec (x : BitVec w₁) (y : BitVec w₂) (n : Nat) : BitVec w₁ :=
  let shiftAmt := (y &&& (twoPow w₂ n))
  match n with
  | 0 => x.sshiftRight' shiftAmt
  | n + 1 => (sshiftRightRec x y n).sshiftRight' shiftAmt

@[simp]
theorem sshiftRightRec_zero_eq (x : BitVec w₁) (y : BitVec w₂) :
    sshiftRightRec x y 0 = x.sshiftRight' (y &&& 1#w₂) := by
  simp only [sshiftRightRec, twoPow_zero]

@[simp]
theorem sshiftRightRec_succ_eq (x : BitVec w₁) (y : BitVec w₂) (n : Nat) :
    sshiftRightRec x y (n + 1) = (sshiftRightRec x y n).sshiftRight' (y &&& twoPow w₂ (n + 1)) := by
  simp [sshiftRightRec]

/--
If `y &&& z = 0`, `x.sshiftRight (y ||| z) = (x.sshiftRight y).sshiftRight z`.
This follows as `y &&& z = 0` implies `y ||| z = y + z`,
and thus `x.sshiftRight (y ||| z) = x.sshiftRight (y + z) = (x.sshiftRight y).sshiftRight z`.
-/
theorem sshiftRight'_or_of_and_eq_zero {x : BitVec w₁} {y z : BitVec w₂}
    (h : y &&& z = 0#w₂) :
    x.sshiftRight' (y ||| z) = (x.sshiftRight' y).sshiftRight' z := by
  simp [sshiftRight', ← add_eq_or_of_and_eq_zero _ _ h,
    toNat_add_of_and_eq_zero h, sshiftRight_add]

theorem sshiftRightRec_eq (x : BitVec w₁) (y : BitVec w₂) (n : Nat) :
    sshiftRightRec x y n = x.sshiftRight' ((y.truncate (n + 1)).zeroExtend w₂) := by
  induction n generalizing x y
  case zero =>
    ext i
    simp [twoPow_zero, Nat.reduceAdd, and_one_eq_zeroExtend_ofBool_getLsb, truncate_one]
  case succ n ih =>
    simp only [sshiftRightRec_succ_eq, and_twoPow, ih]
    by_cases h : y.getLsb (n + 1)
    · rw [zeroExtend_truncate_succ_eq_zeroExtend_truncate_or_twoPow_of_getLsb_true h,
        sshiftRight'_or_of_and_eq_zero (by simp), h]
      simp
    · rw [zeroExtend_truncate_succ_eq_zeroExtend_truncate_of_getLsb_false (i := n + 1)
        (by simp [h])]
      simp [h]

/--
Show that `x.sshiftRight y` can be written in terms of `sshiftRightRec`.
This can be unfolded in terms of `sshiftRightRec_zero_eq`, `sshiftRightRec_succ_eq` for bitblasting.
-/
theorem sshiftRight_eq_sshiftRightRec (x : BitVec w₁) (y : BitVec w₂) :
    (x.sshiftRight' y).getLsb i = (sshiftRightRec x y (w₂ - 1)).getLsb i := by
  rcases w₂ with rfl | w₂
  · simp [of_length_zero]
  · simp [sshiftRightRec_eq]

/- ### Logical shift right (ushiftRight) recurrence for bitblasting -/

/--
`ushiftRightRec x y n` shifts `x` logically to the right by the first `n` bits of `y`.

The theorem `shiftRight_eq_ushiftRightRec` proves the equivalence
of `(x >>> y)` and `ushiftRightRec`.

Together with the equations `ushiftRightRec_zero` and `ushiftRightRec_succ`,
this allows us to unfold `ushiftRight` into a circuit for bitblasting.
-/
def ushiftRightRec (x : BitVec w₁) (y : BitVec w₂) (n : Nat) : BitVec w₁ :=
  let shiftAmt := (y &&& (twoPow w₂ n))
  match n with
  | 0 => x >>> shiftAmt
  | n + 1 => (ushiftRightRec x y n) >>> shiftAmt

@[simp]
theorem ushiftRightRec_zero (x : BitVec w₁) (y : BitVec w₂) :
    ushiftRightRec x y 0 = x >>> (y &&& twoPow w₂ 0) := by
  simp [ushiftRightRec]

@[simp]
theorem ushiftRightRec_succ (x : BitVec w₁) (y : BitVec w₂) :
    ushiftRightRec x y (n + 1) = (ushiftRightRec x y n) >>> (y &&& twoPow w₂ (n + 1)) := by
  simp [ushiftRightRec]

/--
If `y &&& z = 0`, `x >>> (y ||| z) = x >>> y >>> z`.
This follows as `y &&& z = 0` implies `y ||| z = y + z`,
and thus `x >>> (y ||| z) = x >>> (y + z) = x >>> y >>> z`.
-/
theorem ushiftRight'_or_of_and_eq_zero {x : BitVec w₁} {y z : BitVec w₂}
    (h : y &&& z = 0#w₂) :
    x >>> (y ||| z) = x >>> y >>> z := by
  simp [← add_eq_or_of_and_eq_zero _ _ h, toNat_add_of_and_eq_zero h, shiftRight_add]

theorem ushiftRightRec_eq (x : BitVec w₁) (y : BitVec w₂) (n : Nat) :
    ushiftRightRec x y n = x >>> (y.truncate (n + 1)).zeroExtend w₂ := by
  induction n generalizing x y
  case zero =>
    ext i
    simp only [ushiftRightRec_zero, twoPow_zero, Nat.reduceAdd,
      and_one_eq_zeroExtend_ofBool_getLsb, truncate_one]
  case succ n ih =>
    simp only [ushiftRightRec_succ, and_twoPow]
    rw [ih]
    by_cases h : y.getLsb (n + 1) <;> simp only [h, ↓reduceIte]
    · rw [zeroExtend_truncate_succ_eq_zeroExtend_truncate_or_twoPow_of_getLsb_true h,
        ushiftRight'_or_of_and_eq_zero]
      simp
    · simp [zeroExtend_truncate_succ_eq_zeroExtend_truncate_of_getLsb_false, h]

/--
Show that `x >>> y` can be written in terms of `ushiftRightRec`.
This can be unfolded in terms of `ushiftRightRec_zero`, `ushiftRightRec_succ` for bitblasting.
-/
theorem shiftRight_eq_ushiftRightRec (x : BitVec w₁) (y : BitVec w₂) :
    x >>> y = ushiftRightRec x y (w₂ - 1) := by
  rcases w₂ with rfl | w₂
  · simp [of_length_zero]
  · simp [ushiftRightRec_eq]

end BitVec
|
||||
|
||||
@@ -8,6 +8,8 @@ import Init.Data.BitVec.Lemmas
|
||||
import Init.Data.Nat.Lemmas
|
||||
import Init.Data.Fin.Iterate
|
||||
|
||||
set_option linter.missingDocs true
|
||||
|
||||
namespace BitVec
|
||||
|
||||
/--

@@ -12,6 +12,8 @@ import Init.Data.Nat.Lemmas
import Init.Data.Nat.Mod
import Init.Data.Int.Bitwise.Lemmas

set_option linter.missingDocs true

namespace BitVec

/--

@@ -21,7 +23,7 @@ theorem ofFin_eq_ofNat : @BitVec.ofFin w (Fin.mk x lt) = BitVec.ofNat w x := by
simp only [BitVec.ofNat, Fin.ofNat', lt, Nat.mod_eq_of_lt]

/-- Prove equality of bitvectors in terms of nat operations. -/
theorem eq_of_toNat_eq {n} : ∀ {i j : BitVec n}, i.toNat = j.toNat → i = j
theorem eq_of_toNat_eq {n} : ∀ {x y : BitVec n}, x.toNat = y.toNat → x = y
| ⟨_, _⟩, ⟨_, _⟩, rfl => rfl

@[simp] theorem val_toFin (x : BitVec w) : x.toFin.val = x.toNat := rfl
@@ -226,12 +228,12 @@ theorem toNat_ge_of_msb_true {x : BitVec n} (p : BitVec.msb x = true) : x.toNat
/-! ### toInt/ofInt -/

/-- Prove equality of bitvectors in terms of nat operations. -/
theorem toInt_eq_toNat_cond (i : BitVec n) :
i.toInt =
if 2*i.toNat < 2^n then
(i.toNat : Int)
theorem toInt_eq_toNat_cond (x : BitVec n) :
x.toInt =
if 2*x.toNat < 2^n then
(x.toNat : Int)
else
(i.toNat : Int) - (2^n : Nat) :=
(x.toNat : Int) - (2^n : Nat) :=
rfl

theorem msb_eq_false_iff_two_mul_lt (x : BitVec w) : x.msb = false ↔ 2 * x.toNat < 2^w := by
@@ -258,13 +260,13 @@ theorem toInt_eq_toNat_bmod (x : BitVec n) : x.toInt = Int.bmod x.toNat (2^n) :=
omega

/-- Prove equality of bitvectors in terms of nat operations. -/
theorem eq_of_toInt_eq {i j : BitVec n} : i.toInt = j.toInt → i = j := by
theorem eq_of_toInt_eq {x y : BitVec n} : x.toInt = y.toInt → x = y := by
intro eq
simp [toInt_eq_toNat_cond] at eq
apply eq_of_toNat_eq
revert eq
have _ilt := i.isLt
have _jlt := j.isLt
have _xlt := x.isLt
have _ylt := y.isLt
split <;> split <;> omega

theorem toInt_inj (x y : BitVec n) : x.toInt = y.toInt ↔ x = y :=
@@ -505,6 +507,13 @@ theorem or_assoc (x y z : BitVec w) :
x ||| y ||| z = x ||| (y ||| z) := by
ext i
simp [Bool.or_assoc]
instance : Std.Associative (α := BitVec n) (· ||| ·) := ⟨BitVec.or_assoc⟩

theorem or_comm (x y : BitVec w) :
x ||| y = y ||| x := by
ext i
simp [Bool.or_comm]
instance : Std.Commutative (fun (x y : BitVec w) => x ||| y) := ⟨BitVec.or_comm⟩

/-! ### and -/

@@ -536,11 +545,13 @@ theorem and_assoc (x y z : BitVec w) :
x &&& y &&& z = x &&& (y &&& z) := by
ext i
simp [Bool.and_assoc]
instance : Std.Associative (α := BitVec n) (· &&& ·) := ⟨BitVec.and_assoc⟩

theorem and_comm (x y : BitVec w) :
x &&& y = y &&& x := by
ext i
simp [Bool.and_comm]
instance : Std.Commutative (fun (x y : BitVec w) => x &&& y) := ⟨BitVec.and_comm⟩

/-! ### xor -/

@@ -566,6 +577,13 @@ theorem xor_assoc (x y z : BitVec w) :
x ^^^ y ^^^ z = x ^^^ (y ^^^ z) := by
ext i
simp [Bool.xor_assoc]
instance : Std.Associative (fun (x y : BitVec w) => x ^^^ y) := ⟨BitVec.xor_assoc⟩

theorem xor_comm (x y : BitVec w) :
x ^^^ y = y ^^^ x := by
ext i
simp [Bool.xor_comm]
instance : Std.Commutative (fun (x y : BitVec w) => x ^^^ y) := ⟨BitVec.xor_comm⟩

/-! ### not -/

@@ -731,6 +749,31 @@ theorem getLsb_shiftLeft' {x : BitVec w₁} {y : BitVec w₂} {i : Nat} :
getLsb (x >>> i) j = getLsb x (i+j) := by
unfold getLsb ; simp

theorem ushiftRight_xor_distrib (x y : BitVec w) (n : Nat) :
(x ^^^ y) >>> n = (x >>> n) ^^^ (y >>> n) := by
ext
simp

theorem ushiftRight_and_distrib (x y : BitVec w) (n : Nat) :
(x &&& y) >>> n = (x >>> n) &&& (y >>> n) := by
ext
simp

theorem ushiftRight_or_distrib (x y : BitVec w) (n : Nat) :
(x ||| y) >>> n = (x >>> n) ||| (y >>> n) := by
ext
simp

@[simp]
theorem ushiftRight_zero_eq (x : BitVec w) : x >>> 0 = x := by
simp [bv_toNat]

/-! ### ushiftRight reductions from BitVec to Nat -/

@[simp]
theorem ushiftRight_eq' (x : BitVec w₁) (y : BitVec w₂) :
x >>> y = x >>> y.toNat := by rfl

/-! ### sshiftRight -/

theorem sshiftRight_eq {x : BitVec n} {i : Nat} :
@@ -774,7 +817,7 @@ theorem sshiftRight_eq_of_msb_true {x : BitVec w} {s : Nat} (h : x.msb = true) :
· rw [Nat.shiftRight_eq_div_pow]
apply Nat.lt_of_le_of_lt (Nat.div_le_self _ _) (by omega)

theorem getLsb_sshiftRight (x : BitVec w) (s i : Nat) :
@[simp] theorem getLsb_sshiftRight (x : BitVec w) (s i : Nat) :
getLsb (x.sshiftRight s) i =
(!decide (w ≤ i) && if s + i < w then x.getLsb (s + i) else x.msb) := by
rcases hmsb : x.msb with rfl | rfl
@@ -795,6 +838,41 @@ theorem getLsb_sshiftRight (x : BitVec w) (s i : Nat) :
Nat.not_lt, decide_eq_true_eq]
omega

/-- The msb after arithmetic shifting right equals the original msb. -/
theorem sshiftRight_msb_eq_msb {n : Nat} {x : BitVec w} :
(x.sshiftRight n).msb = x.msb := by
rw [msb_eq_getLsb_last, getLsb_sshiftRight, msb_eq_getLsb_last]
by_cases hw₀ : w = 0
· simp [hw₀]
· simp only [show ¬(w ≤ w - 1) by omega, decide_False, Bool.not_false, Bool.true_and,
ite_eq_right_iff]
intros h
simp [show n = 0 by omega]

@[simp] theorem sshiftRight_zero {x : BitVec w} : x.sshiftRight 0 = x := by
ext i
simp

theorem sshiftRight_add {x : BitVec w} {m n : Nat} :
x.sshiftRight (m + n) = (x.sshiftRight m).sshiftRight n := by
ext i
simp only [getLsb_sshiftRight, Nat.add_assoc]
by_cases h₁ : w ≤ (i : Nat)
· simp [h₁]
· simp only [h₁, decide_False, Bool.not_false, Bool.true_and]
by_cases h₂ : n + ↑i < w
· simp [h₂]
· simp only [h₂, ↓reduceIte]
by_cases h₃ : m + (n + ↑i) < w
· simp [h₃]
omega
· simp [h₃, sshiftRight_msb_eq_msb]

/-! ### sshiftRight reductions from BitVec to Nat -/

@[simp]
theorem sshiftRight_eq' (x : BitVec w) : x.sshiftRight' y = x.sshiftRight y.toNat := rfl

/-! ### signExtend -/

/-- Equation theorem for `Int.sub` when both arguments are `Int.ofNat` -/
@@ -857,15 +935,15 @@ theorem append_def (x : BitVec v) (y : BitVec w) :
(x ++ y).toNat = x.toNat <<< n ||| y.toNat :=
rfl

@[simp] theorem getLsb_append {v : BitVec n} {w : BitVec m} :
getLsb (v ++ w) i = bif i < m then getLsb w i else getLsb v (i - m) := by
@[simp] theorem getLsb_append {x : BitVec n} {y : BitVec m} :
getLsb (x ++ y) i = bif i < m then getLsb y i else getLsb x (i - m) := by
simp only [append_def, getLsb_or, getLsb_shiftLeftZeroExtend, getLsb_zeroExtend']
by_cases h : i < m
· simp [h]
· simp [h]; simp_all

@[simp] theorem getMsb_append {v : BitVec n} {w : BitVec m} :
getMsb (v ++ w) i = bif n ≤ i then getMsb w (i - n) else getMsb v i := by
@[simp] theorem getMsb_append {x : BitVec n} {y : BitVec m} :
getMsb (x ++ y) i = bif n ≤ i then getMsb y (i - n) else getMsb x i := by
simp [append_def]
by_cases h : n ≤ i
· simp [h]
@@ -1549,4 +1627,54 @@ theorem zeroExtend_truncate_succ_eq_zeroExtend_truncate_or_twoPow_of_getLsb_true
simp [hx]
· by_cases hik' : k < i + 1 <;> simp [hik, hik'] <;> omega

/-- Bitwise and of `(x : BitVec w)` with `1#w` equals zero extending `x.lsb` to `w`. -/
theorem and_one_eq_zeroExtend_ofBool_getLsb {x : BitVec w} :
(x &&& 1#w) = zeroExtend w (ofBool (x.getLsb 0)) := by
ext i
simp only [getLsb_and, getLsb_one, getLsb_zeroExtend, Fin.is_lt, decide_True, getLsb_ofBool,
Bool.true_and]
by_cases h : (0 = (i : Nat)) <;> simp [h] <;> omega

@[simp]
theorem replicate_zero_eq {x : BitVec w} : x.replicate 0 = 0#0 := by
simp [replicate]

@[simp]
theorem replicate_succ_eq {x : BitVec w} :
x.replicate (n + 1) =
(x ++ replicate n x).cast (by rw [Nat.mul_succ]; omega) := by
simp [replicate]

/--
If a number `w * n ≤ i < w * (n + 1)`, then `i - w * n` equals `i % w`.
This is true by subtracting `w * n` from the inequality, giving
`0 ≤ i - w * n < w`, which uniquely identifies `i % w`.
-/
private theorem Nat.sub_mul_eq_mod_of_lt_of_le (hlo : w * n ≤ i) (hhi : i < w * (n + 1)) :
i - w * n = i % w := by
rw [Nat.mod_def]
congr
symm
apply Nat.div_eq_of_lt_le
(by rw [Nat.mul_comm]; omega)
(by rw [Nat.mul_comm]; omega)

@[simp]
theorem getLsb_replicate {n w : Nat} (x : BitVec w) :
(x.replicate n).getLsb i =
(decide (i < w * n) && x.getLsb (i % w)) := by
induction n generalizing x
case zero => simp
case succ n ih =>
simp only [replicate_succ_eq, getLsb_cast, getLsb_append]
by_cases hi : i < w * (n + 1)
· simp only [hi, decide_True, Bool.true_and]
by_cases hi' : i < w * n
· simp [hi', ih]
· simp only [hi', decide_False, cond_false]
rw [Nat.sub_mul_eq_mod_of_lt_of_le] <;> omega
· rw [Nat.mul_succ] at hi ⊢
simp only [show ¬i < w * n by omega, decide_False, cond_false, hi, Bool.false_and]
apply BitVec.getLsb_ge (x := x) (i := i - w * n) (ge := by omega)

end BitVec
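A quick sketch of what the new `Std.Associative`/`Std.Commutative` instances in this hunk enable: `ac_rfl` can now close goals about `|||`, `&&&`, and `^^^` up to associativity and commutativity. This is an illustrative example, assuming the instances above are in scope:

```lean
-- Sketch: with the `Std.Associative`/`Std.Commutative` instances for `(· ||| ·)`,
-- `ac_rfl` normalizes both sides modulo associativity and commutativity.
example (x y z : BitVec 8) : (x ||| y) ||| z = z ||| (y ||| x) := by
  ac_rfl
```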

@@ -438,6 +438,24 @@ Added for confluence between `if_true_left` and `ite_false_same` on
-/
@[simp] theorem eq_true_imp_eq_false : ∀(b:Bool), (b = true → b = false) ↔ (b = false) := by decide

/-! ### forall -/

theorem forall_bool' {p : Bool → Prop} (b : Bool) : (∀ x, p x) ↔ p b ∧ p !b :=
⟨fun h ↦ ⟨h _, h _⟩, fun ⟨h₁, h₂⟩ x ↦ by cases b <;> cases x <;> assumption⟩

@[simp]
theorem forall_bool {p : Bool → Prop} : (∀ b, p b) ↔ p false ∧ p true :=
forall_bool' false

/-! ### exists -/

theorem exists_bool' {p : Bool → Prop} (b : Bool) : (∃ x, p x) ↔ p b ∨ p !b :=
⟨fun ⟨x, hx⟩ ↦ by cases x <;> cases b <;> first | exact .inl ‹_› | exact .inr ‹_›,
fun h ↦ by cases h <;> exact ⟨_, ‹_›⟩⟩

@[simp]
theorem exists_bool {p : Bool → Prop} : (∃ b, p b) ↔ p false ∨ p true :=
exists_bool' false

/-! ### cond -/
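A small usage sketch for the new `forall_bool` lemma, assuming the declarations above: a universally quantified goal over `Bool` reduces to the two cases.

```lean
-- Sketch: `forall_bool` splits a quantifier over `Bool` into its two instances.
example (p : Bool → Prop) (h₁ : p false) (h₂ : p true) : ∀ b, p b :=
  forall_bool.mpr ⟨h₁, h₂⟩
```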

@@ -191,6 +191,121 @@ def foldlM {β : Type v} {m : Type v → Type w} [Monad m] (f : β → UInt8 →
def foldl {β : Type v} (f : β → UInt8 → β) (init : β) (as : ByteArray) (start := 0) (stop := as.size) : β :=
Id.run <| as.foldlM f init start stop

/-- Iterator over the bytes (`UInt8`) of a `ByteArray`.

Typically created by `arr.iter`, where `arr` is a `ByteArray`.

An iterator is *valid* if the position `i` is *valid* for the array `arr`, meaning `0 ≤ i ≤ arr.size`.

Most operations on iterators return arbitrary values if the iterator is not valid. The functions in
the `ByteArray.Iterator` API should rule out the creation of invalid iterators, with two exceptions:

- `Iterator.next iter` is invalid if `iter` is already at the end of the array (`iter.atEnd` is
`true`)
- `Iterator.forward iter n`/`Iterator.nextn iter n` is invalid if `n` is strictly greater than the
number of remaining bytes.
-/
structure Iterator where
/-- The array the iterator is for. -/
array : ByteArray
/-- The current position.

This position is not necessarily valid for the array, for instance if one keeps calling
`Iterator.next` when `Iterator.atEnd` is true. If the position is not valid, then the
current byte is `(default : UInt8)`. -/
idx : Nat
deriving Inhabited

/-- Creates an iterator at the beginning of an array. -/
def mkIterator (arr : ByteArray) : Iterator :=
⟨arr, 0⟩

@[inherit_doc mkIterator]
abbrev iter := mkIterator

/-- The size of an array iterator is the number of bytes remaining. -/
instance : SizeOf Iterator where
sizeOf i := i.array.size - i.idx

theorem Iterator.sizeOf_eq (i : Iterator) : sizeOf i = i.array.size - i.idx :=
rfl

namespace Iterator

/-- Number of bytes remaining in the iterator. -/
def remainingBytes : Iterator → Nat
| ⟨arr, i⟩ => arr.size - i

@[inherit_doc Iterator.idx]
def pos := Iterator.idx

/-- The byte at the current position.

On an invalid position, returns `(default : UInt8)`. -/
@[inline]
def curr : Iterator → UInt8
| ⟨arr, i⟩ =>
if h:i < arr.size then
arr[i]'h
else
default

/-- Moves the iterator's position forward by one byte, unconditionally.

It is only valid to call this function if the iterator is not at the end of the array, *i.e.*
`Iterator.atEnd` is `false`; otherwise, the resulting iterator will be invalid. -/
@[inline]
def next : Iterator → Iterator
| ⟨arr, i⟩ => ⟨arr, i + 1⟩

/-- Decreases the iterator's position.

If the position is zero, this function is the identity. -/
@[inline]
def prev : Iterator → Iterator
| ⟨arr, i⟩ => ⟨arr, i - 1⟩

/-- True if the iterator is past the array's last byte. -/
@[inline]
def atEnd : Iterator → Bool
| ⟨arr, i⟩ => i ≥ arr.size

/-- True if the iterator is not past the array's last byte. -/
@[inline]
def hasNext : Iterator → Bool
| ⟨arr, i⟩ => i < arr.size

/-- True if the position is not zero. -/
@[inline]
def hasPrev : Iterator → Bool
| ⟨_, i⟩ => i > 0

/-- Moves the iterator's position to the end of the array.

Note that `i.toEnd.atEnd` is always `true`. -/
@[inline]
def toEnd : Iterator → Iterator
| ⟨arr, _⟩ => ⟨arr, arr.size⟩

/-- Moves the iterator's position several bytes forward.

The resulting iterator is only valid if the number of bytes to skip is less than or equal to
the number of bytes left in the iterator. -/
@[inline]
def forward : Iterator → Nat → Iterator
| ⟨arr, i⟩, f => ⟨arr, i + f⟩

@[inherit_doc forward, inline]
def nextn : Iterator → Nat → Iterator := forward

/-- Moves the iterator's position several bytes back.

If asked to go back more bytes than available, stops at the beginning of the array. -/
@[inline]
def prevn : Iterator → Nat → Iterator
| ⟨arr, i⟩, f => ⟨arr, i - f⟩

end Iterator
end ByteArray
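A usage sketch for the new iterator API, assuming the declarations above (`iter`, `atEnd`, `curr`, `next`): walk a `ByteArray` and sum its bytes.

```lean
-- Sketch: fold over a ByteArray via the new Iterator API.
-- `partial` because termination is by the decreasing `remainingBytes`,
-- which we do not prove here.
partial def sumBytes (it : ByteArray.Iterator) (acc : Nat := 0) : Nat :=
  if it.atEnd then acc
  else sumBytes it.next (acc + it.curr.toNat)

#eval sumBytes (ByteArray.mk #[1, 2, 3]).iter -- should evaluate to 6, assuming the API above
```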

def List.toByteArray (bs : List UInt8) : ByteArray :=

@@ -322,8 +322,8 @@ protected def pow (m : Int) : Nat → Int
| 0 => 1
| succ n => Int.pow m n * m

instance : HPow Int Nat Int where
hPow := Int.pow
instance : NatPow Int where
pow := Int.pow

instance : LawfulBEq Int where
eq_of_beq h := by simp [BEq.beq] at h; assumption
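The hunk above swaps the bespoke `HPow Int Nat Int` instance for a `NatPow Int` instance; `^` on an `Int` base with a `Nat` exponent still elaborates, now through the generic `Monoid.Pow`-style path. A minimal check, assuming the instance above:

```lean
-- Sketch: `^` with an Int base and Nat exponent still works via `NatPow Int`.
example : ((-2 : Int) ^ 3) = -8 := by decide
```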

@@ -354,7 +354,7 @@ theorem erase_eq_iff [LawfulBEq α] {a : α} {l : List α} :
rw [erase_of_not_mem]
simp_all

theorem Nodup.erase_eq_filter [BEq α] [LawfulBEq α] {l} (d : Nodup l) (a : α) : l.erase a = l.filter (· != a) := by
theorem Nodup.erase_eq_filter [LawfulBEq α] {l} (d : Nodup l) (a : α) : l.erase a = l.filter (· != a) := by
induction d with
| nil => rfl
| cons m _n ih =>
@@ -367,13 +367,13 @@ theorem Nodup.erase_eq_filter [BEq α] [LawfulBEq α] {l} (d : Nodup l) (a : α)
simpa [@eq_comm α] using m
· simp [beq_false_of_ne h, ih, h]

theorem Nodup.mem_erase_iff [BEq α] [LawfulBEq α] {a : α} (d : Nodup l) : a ∈ l.erase b ↔ a ≠ b ∧ a ∈ l := by
theorem Nodup.mem_erase_iff [LawfulBEq α] {a : α} (d : Nodup l) : a ∈ l.erase b ↔ a ≠ b ∧ a ∈ l := by
rw [Nodup.erase_eq_filter d, mem_filter, and_comm, bne_iff_ne]

theorem Nodup.not_mem_erase [BEq α] [LawfulBEq α] {a : α} (h : Nodup l) : a ∉ l.erase a := fun H => by
theorem Nodup.not_mem_erase [LawfulBEq α] {a : α} (h : Nodup l) : a ∉ l.erase a := fun H => by
simpa using ((Nodup.mem_erase_iff h).mp H).left

theorem Nodup.erase [BEq α] [LawfulBEq α] (a : α) : Nodup l → Nodup (l.erase a) :=
theorem Nodup.erase [LawfulBEq α] (a : α) : Nodup l → Nodup (l.erase a) :=
Nodup.sublist <| erase_sublist _ _

end erase

@@ -84,7 +84,7 @@ theorem cons_subset_cons {l₁ l₂ : List α} (a : α) (s : l₁ ⊆ l₂) : a
@[simp] theorem subset_nil {l : List α} : l ⊆ [] ↔ l = [] :=
⟨fun h => match l with | [] => rfl | _::_ => (nomatch h (.head ..)), fun | rfl => Subset.refl _⟩

theorem map_subset {l₁ l₂ : List α} {f : α → β} (h : l₁ ⊆ l₂) : map f l₁ ⊆ map f l₂ :=
theorem map_subset {l₁ l₂ : List α} (f : α → β) (h : l₁ ⊆ l₂) : map f l₁ ⊆ map f l₂ :=
fun x => by simp only [mem_map]; exact .imp fun a => .imp_left (@h _)

theorem filter_subset {l₁ l₂ : List α} (p : α → Bool) (H : l₁ ⊆ l₂) : filter p l₁ ⊆ filter p l₂ :=

@@ -411,11 +411,11 @@ theorem dropWhile_replicate (p : α → Bool) :
split <;> simp_all

theorem take_takeWhile {l : List α} (p : α → Bool) n :
(l.takeWhile p).take n = (l.takeWhile p).take n := by
induction l with
| nil => rfl
(l.takeWhile p).take n = (l.take n).takeWhile p := by
induction l generalizing n with
| nil => simp
| cons x xs ih =>
by_cases h : p x <;> simp [takeWhile_cons, h, ih]
by_cases h : p x <;> cases n <;> simp [takeWhile_cons, h, ih, take_succ_cons]

@[simp] theorem all_takeWhile {l : List α} : (l.takeWhile p).all p = true := by
induction l with

@@ -203,6 +203,10 @@ theorem mod_add_div (m k : Nat) : m % k + k * (m / k) = m := by
| base x y h => simp [h]
| ind x y h IH => simp [h]; rw [Nat.mul_succ, ← Nat.add_assoc, IH, Nat.sub_add_cancel h.2]

theorem mod_def (m k : Nat) : m % k = m - k * (m / k) := by
rw [Nat.sub_eq_of_eq_add]
apply (Nat.mod_add_div _ _).symm

@[simp] protected theorem div_one (n : Nat) : n / 1 = n := by
have := mod_add_div n 1
rwa [mod_one, Nat.zero_add, Nat.one_mul] at this
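A concrete instance of the new `mod_def` lemma, assuming the declaration above: with `m = 17` and `k = 5`, `17 / 5 = 3`, so both sides are `2`.

```lean
-- Sketch: `mod_def` unfolds `%` as subtraction of the rounded-down multiple.
example : 17 % 5 = 17 - 5 * (17 / 5) := Nat.mod_def 17 5
```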

@@ -5,9 +5,18 @@ Author: Leonardo de Moura
-/
prelude
import Init.SimpLemmas
import Init.NotationExtra

instance [BEq α] [BEq β] [LawfulBEq α] [LawfulBEq β] : LawfulBEq (α × β) where
eq_of_beq {a b} (h : a.1 == b.1 && a.2 == b.2) := by
cases a; cases b
refine congr (congrArg _ (eq_of_beq ?_)) (eq_of_beq ?_) <;> simp_all
rfl {a} := by cases a; simp [BEq.beq, LawfulBEq.rfl]

@[simp]
protected theorem Prod.forall {p : α × β → Prop} : (∀ x, p x) ↔ ∀ a b, p (a, b) :=
⟨fun h a b ↦ h (a, b), fun h ⟨a, b⟩ ↦ h a b⟩

@[simp]
protected theorem Prod.exists {p : α × β → Prop} : (∃ x, p x) ↔ ∃ a b, p (a, b) :=
⟨fun ⟨⟨a, b⟩, h⟩ ↦ ⟨a, b, h⟩, fun ⟨a, b, h⟩ ↦ ⟨⟨a, b⟩, h⟩⟩
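A usage sketch for the new `@[simp]` lemma `Prod.forall`, assuming the declarations above: it turns a goal quantified over pairs into a goal quantified over the two components.

```lean
-- Sketch: rewrite a quantifier over `Nat × Nat` into two Nat quantifiers.
example : ∀ p : Nat × Nat, p.1 + p.2 = p.2 + p.1 := by
  simp only [Prod.forall]
  intro a b
  exact Nat.add_comm a b
```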

@@ -78,7 +78,10 @@ end Elab.Tactic.Ext
end Lean

attribute [ext] Prod PProd Sigma PSigma
attribute [ext] funext propext Subtype.eq
attribute [ext] funext propext Subtype.eq Array.ext

@[ext] protected theorem PUnit.ext (x y : PUnit) : x = y := rfl
protected theorem Unit.ext (x y : Unit) : x = y := rfl

@[ext] protected theorem Thunk.ext : {a b : Thunk α} → a.get = b.get → a = b
| {..}, {..}, heq => congrArg _ <| funext fun _ => heq

@@ -109,4 +109,4 @@ A more restrictive but efficient max sharing primitive.
Remark: it optimizes the number of RC operations, and the strategy for caching results.
-/
@[extern "lean_sharecommon_quick"]
def ShareCommon.shareCommon' (a : α) : α := a
def ShareCommon.shareCommon' (a : @& α) : α := a

@@ -435,6 +435,12 @@ Note that EOF does not actually close a handle, so further reads may block and r

end Handle

/--
Resolves a pathname to an absolute pathname with no '.', '..', or symbolic links.

This function coincides with the [POSIX `realpath` function](https://pubs.opengroup.org/onlinepubs/9699919799/functions/realpath.html),
see there for more information.
-/
@[extern "lean_io_realpath"] opaque realPath (fname : FilePath) : IO FilePath
@[extern "lean_io_remove_file"] opaque removeFile (fname : @& FilePath) : IO Unit
/-- Remove given directory. Fails if not empty; see also `IO.FS.removeDirAll`. -/
@@ -464,31 +470,23 @@ def withFile (fn : FilePath) (mode : Mode) (f : Handle → IO α) : IO α :=
def Handle.putStrLn (h : Handle) (s : String) : IO Unit :=
h.putStr (s.push '\n')

partial def Handle.readBinToEnd (h : Handle) : IO ByteArray := do
partial def Handle.readBinToEndInto (h : Handle) (buf : ByteArray) : IO ByteArray := do
let rec loop (acc : ByteArray) : IO ByteArray := do
let buf ← h.read 1024
if buf.isEmpty then
return acc
else
loop (acc ++ buf)
loop ByteArray.empty
loop buf

partial def Handle.readToEnd (h : Handle) : IO String := do
let rec loop (s : String) := do
let line ← h.getLine
if line.isEmpty then
return s
else
loop (s ++ line)
loop ""
partial def Handle.readBinToEnd (h : Handle) : IO ByteArray := do
h.readBinToEndInto .empty

def readBinFile (fname : FilePath) : IO ByteArray := do
let h ← Handle.mk fname Mode.read
h.readBinToEnd

def readFile (fname : FilePath) : IO String := do
let h ← Handle.mk fname Mode.read
h.readToEnd
def Handle.readToEnd (h : Handle) : IO String := do
let data ← h.readBinToEnd
match String.fromUTF8? data with
| some s => return s
| none => throw <| .userError s!"Tried to read from handle containing non UTF-8 data."

partial def lines (fname : FilePath) : IO (Array String) := do
let h ← Handle.mk fname Mode.read
@@ -594,6 +592,28 @@ end System.FilePath

namespace IO

namespace FS

def readBinFile (fname : FilePath) : IO ByteArray := do
-- Requires metadata so defined after metadata
let mdata ← fname.metadata
let size := mdata.byteSize.toUSize
let handle ← IO.FS.Handle.mk fname .read
let buf ←
if size > 0 then
handle.read mdata.byteSize.toUSize
else
pure <| ByteArray.mkEmpty 0
handle.readBinToEndInto buf

def readFile (fname : FilePath) : IO String := do
let data ← readBinFile fname
match String.fromUTF8? data with
| some s => return s
| none => throw <| .userError s!"Tried to read file '{fname}' containing non UTF-8 data."

end FS

def withStdin [Monad m] [MonadFinally m] [MonadLiftT BaseIO m] (h : FS.Stream) (x : m α) : m α := do
let prev ← setStdin h
try x finally discard <| setStdin prev
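A caller-side sketch of the reworked `IO.FS.readBinFile` (which now preallocates from the file's metadata before falling back to `readBinToEndInto`); `"data.bin"` is a hypothetical file name for illustration:

```lean
-- Sketch: reading a binary file with the reworked API; the preallocation is
-- internal, so call sites are unchanged.
def main : IO Unit := do
  let bytes ← IO.FS.readBinFile "data.bin" -- hypothetical path
  IO.println s!"read {bytes.size} bytes"
```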

@@ -68,6 +68,7 @@ noncomputable def recursion {C : α → Sort v} (a : α) (h : ∀ x, (∀ y, r y
induction (apply hwf a) with
| intro x₁ _ ih => exact h x₁ ih

include hwf in
theorem induction {C : α → Prop} (a : α) (h : ∀ x, (∀ y, r y x → C y) → C x) : C a :=
recursion hwf a h
|
||||
|
||||
|
||||
@@ -53,7 +53,7 @@ structure AttributeImpl extends AttributeImplCore where
|
||||
erase (decl : Name) : AttrM Unit := throwError "attribute cannot be erased"
|
||||
deriving Inhabited
|
||||
|
||||
builtin_initialize attributeMapRef : IO.Ref (HashMap Name AttributeImpl) ← IO.mkRef {}
|
||||
builtin_initialize attributeMapRef : IO.Ref (Std.HashMap Name AttributeImpl) ← IO.mkRef {}
|
||||
|
||||
/-- Low level attribute registration function. -/
|
||||
def registerBuiltinAttribute (attr : AttributeImpl) : IO Unit := do
|
||||
@@ -296,7 +296,7 @@ end EnumAttributes
|
||||
-/
|
||||
|
||||
abbrev AttributeImplBuilder := Name → List DataValue → Except String AttributeImpl
|
||||
abbrev AttributeImplBuilderTable := HashMap Name AttributeImplBuilder
|
||||
abbrev AttributeImplBuilderTable := Std.HashMap Name AttributeImplBuilder
|
||||
|
||||
builtin_initialize attributeImplBuilderTableRef : IO.Ref AttributeImplBuilderTable ← IO.mkRef {}
|
||||
|
||||
@@ -307,7 +307,7 @@ def registerAttributeImplBuilder (builderId : Name) (builder : AttributeImplBuil
|
||||
|
||||
def mkAttributeImplOfBuilder (builderId ref : Name) (args : List DataValue) : IO AttributeImpl := do
|
||||
let table ← attributeImplBuilderTableRef.get
|
||||
match table.find? builderId with
|
||||
match table[builderId]? with
|
||||
| none => throw (IO.userError ("unknown attribute implementation builder '" ++ toString builderId ++ "'"))
|
||||
| some builder => IO.ofExcept <| builder ref args
|
||||
|
||||
@@ -317,7 +317,7 @@ inductive AttributeExtensionOLeanEntry where
|
||||
|
||||
structure AttributeExtensionState where
|
||||
newEntries : List AttributeExtensionOLeanEntry := []
|
||||
map : HashMap Name AttributeImpl
|
||||
map : Std.HashMap Name AttributeImpl
|
||||
deriving Inhabited
|
||||
|
||||
abbrev AttributeExtension := PersistentEnvExtension AttributeExtensionOLeanEntry (AttributeExtensionOLeanEntry × AttributeImpl) AttributeExtensionState
|
||||
@@ -348,7 +348,7 @@ private def AttributeExtension.addImported (es : Array (Array AttributeExtension
|
||||
let map ← es.foldlM
|
||||
(fun map entries =>
|
||||
entries.foldlM
|
||||
(fun (map : HashMap Name AttributeImpl) entry => do
|
||||
(fun (map : Std.HashMap Name AttributeImpl) entry => do
|
||||
let attrImpl ← mkAttributeImplOfEntry ctx.env ctx.opts entry
|
||||
return map.insert attrImpl.name attrImpl)
|
||||
map)
|
||||
@@ -378,7 +378,7 @@ def getBuiltinAttributeNames : IO (List Name) :=
|
||||
|
||||
def getBuiltinAttributeImpl (attrName : Name) : IO AttributeImpl := do
|
||||
let m ← attributeMapRef.get
|
||||
match m.find? attrName with
|
||||
match m[attrName]? with
|
||||
| some attr => pure attr
|
||||
| none => throw (IO.userError ("unknown attribute '" ++ toString attrName ++ "'"))
|
||||
|
||||
@@ -396,7 +396,7 @@ def getAttributeNames (env : Environment) : List Name :=
|
||||
|
||||
def getAttributeImpl (env : Environment) (attrName : Name) : Except String AttributeImpl :=
|
||||
let m := (attributeExtension.getState env).map
|
||||
match m.find? attrName with
|
||||
match m[attrName]? with
|
||||
| some attr => pure attr
|
||||
| none => throw ("unknown attribute '" ++ toString attrName ++ "'")
|
||||
|
||||
|
||||
@@ -26,9 +26,9 @@ instance : Hashable Key := ⟨getHash⟩
|
||||
end OwnedSet
|
||||
|
||||
open OwnedSet (Key) in
|
||||
abbrev OwnedSet := HashMap Key Unit
|
||||
def OwnedSet.insert (s : OwnedSet) (k : OwnedSet.Key) : OwnedSet := HashMap.insert s k ()
|
||||
def OwnedSet.contains (s : OwnedSet) (k : OwnedSet.Key) : Bool := HashMap.contains s k
|
||||
abbrev OwnedSet := Std.HashMap Key Unit
|
||||
def OwnedSet.insert (s : OwnedSet) (k : OwnedSet.Key) : OwnedSet := Std.HashMap.insert s k ()
|
||||
def OwnedSet.contains (s : OwnedSet) (k : OwnedSet.Key) : Bool := Std.HashMap.contains s k
|
||||
|
||||
/-! We perform borrow inference in a block of mutually recursive functions.
|
||||
Join points are viewed as local functions, and are identified using
|
||||
@@ -49,7 +49,7 @@ instance : Hashable Key := ⟨getHash⟩
|
||||
end ParamMap
|
||||
|
||||
open ParamMap (Key)
|
||||
abbrev ParamMap := HashMap Key (Array Param)
|
||||
abbrev ParamMap := Std.HashMap Key (Array Param)
|
||||
|
||||
def ParamMap.fmt (map : ParamMap) : Format :=
|
||||
let fmts := map.fold (fun fmt k ps =>
|
||||
@@ -109,7 +109,7 @@ partial def visitFnBody (fn : FunId) (paramMap : ParamMap) : FnBody → FnBody
|
||||
| FnBody.jdecl j _ v b =>
|
||||
let v := visitFnBody fn paramMap v
|
||||
let b := visitFnBody fn paramMap b
|
||||
match paramMap.find? (ParamMap.Key.jp fn j) with
|
||||
match paramMap[ParamMap.Key.jp fn j]? with
|
||||
| some ys => FnBody.jdecl j ys v b
|
||||
| none => unreachable!
|
||||
| FnBody.case tid x xType alts =>
|
||||
@@ -125,7 +125,7 @@ def visitDecls (decls : Array Decl) (paramMap : ParamMap) : Array Decl :=
|
||||
decls.map fun decl => match decl with
|
||||
| Decl.fdecl f _ ty b info =>
|
||||
let b := visitFnBody f paramMap b
|
||||
match paramMap.find? (ParamMap.Key.decl f) with
|
||||
match paramMap[ParamMap.Key.decl f]? with
|
||||
| some xs => Decl.fdecl f xs ty b info
|
||||
| none => unreachable!
|
||||
| other => other
|
||||
@@ -178,7 +178,7 @@ def isOwned (x : VarId) : M Bool := do
|
||||
/-- Updates `map[k]` using the current set of `owned` variables. -/
|
||||
def updateParamMap (k : ParamMap.Key) : M Unit := do
|
||||
let s ← get
|
||||
match s.paramMap.find? k with
|
||||
match s.paramMap[k]? with
|
||||
| some ps => do
|
||||
let ps ← ps.mapM fun (p : Param) => do
|
||||
if !p.borrow then pure p
|
||||
@@ -192,7 +192,7 @@ def updateParamMap (k : ParamMap.Key) : M Unit := do
|
||||
|
||||
def getParamInfo (k : ParamMap.Key) : M (Array Param) := do
|
||||
let s ← get
|
||||
match s.paramMap.find? k with
|
||||
match s.paramMap[k]? with
|
||||
| some ps => pure ps
|
||||
| none =>
|
||||
match k with
|
||||
|
||||
@@ -11,6 +11,7 @@ import Lean.Compiler.IR.Basic
|
||||
import Lean.Compiler.IR.CompilerM
|
||||
import Lean.Compiler.IR.FreeVars
|
||||
import Lean.Compiler.IR.ElimDeadVars
|
||||
import Lean.Data.AssocList
|
||||
|
||||
namespace Lean.IR.ExplicitBoxing
|
||||
/-!
|
||||
|
||||
@@ -152,7 +152,7 @@ def getFunctionSummary? (env : Environment) (fid : FunId) : Option Value :=
|
||||
| some modIdx => findAtSorted? (functionSummariesExt.getModuleEntries env modIdx) fid
|
||||
| none => functionSummariesExt.getState env |>.find? fid
|
||||
|
||||
abbrev Assignment := HashMap VarId Value
|
||||
abbrev Assignment := Std.HashMap VarId Value
|
||||
|
||||
structure InterpContext where
|
||||
currFnIdx : Nat := 0
|
||||
@@ -172,7 +172,7 @@ def findVarValue (x : VarId) : M Value := do
|
||||
let ctx ← read
|
||||
let s ← get
|
||||
let assignment := s.assignments[ctx.currFnIdx]!
|
||||
return assignment.findD x bot
|
||||
return assignment.getD x bot
|
||||
|
||||
def findArgValue (arg : Arg) : M Value :=
|
||||
match arg with
|
||||
@@ -303,7 +303,7 @@ partial def elimDeadAux (assignment : Assignment) : FnBody → FnBody
|
||||
| FnBody.vdecl x t e b => FnBody.vdecl x t e (elimDeadAux assignment b)
|
||||
| FnBody.jdecl j ys v b => FnBody.jdecl j ys (elimDeadAux assignment v) (elimDeadAux assignment b)
|
||||
| FnBody.case tid x xType alts =>
|
||||
let v := assignment.findD x bot
|
||||
let v := assignment.getD x bot
|
||||
let alts := alts.map fun alt =>
|
||||
match alt with
|
||||
| Alt.ctor i b => Alt.ctor i <| if containsCtor v i then elimDeadAux assignment b else FnBody.unreachable
|
||||
|
||||
@@ -94,7 +94,15 @@ def emitCInitName (n : Name) : M Unit :=
|
||||
def shouldExport (n : Name) : Bool :=
|
||||
-- HACK: exclude symbols very unlikely to be used by the interpreter or other consumers of
|
||||
-- libleanshared to avoid Windows symbol limit
|
||||
!(`Lean.Compiler.LCNF).isPrefixOf n && !(`Lean.IR).isPrefixOf n && !(`Lean.Server).isPrefixOf n
|
||||
!(`Lean.Compiler.LCNF).isPrefixOf n &&
|
||||
!(`Lean.IR).isPrefixOf n &&
|
||||
-- Lean.Server.findModuleRefs is used in SubVerso, and the contents of RequestM are used by the
|
||||
-- full Verso as well as anything else that extends the LSP server.
|
||||
(!(`Lean.Server.Watchdog).isPrefixOf n) &&
|
||||
(!(`Lean.Server.ImportCompletion).isPrefixOf n) &&
|
||||
(!(`Lean.Server.Completion).isPrefixOf n)
|
||||
|
||||
|
||||
|
||||
def emitFnDeclAux (decl : Decl) (cppBaseName : String) (isExternal : Bool) : M Unit := do
|
||||
let ps := decl.params
|
||||
@@ -249,7 +257,7 @@ def throwUnknownVar {α : Type} (x : VarId) : M α :=
|
||||
|
||||
def getJPParams (j : JoinPointId) : M (Array Param) := do
|
||||
let ctx ← read;
|
||||
match ctx.jpMap.find? j with
|
||||
match ctx.jpMap[j]? with
|
||||
| some ps => pure ps
|
||||
| none => throw "unknown join point"
|
||||
|
||||
|
||||
@@ -65,8 +65,8 @@ structure Context (llvmctx : LLVM.Context) where
llvmmodule : LLVM.Module llvmctx

structure State (llvmctx : LLVM.Context) where
var2val : HashMap VarId (LLVM.LLVMType llvmctx × LLVM.Value llvmctx)
jp2bb : HashMap JoinPointId (LLVM.BasicBlock llvmctx)
var2val : Std.HashMap VarId (LLVM.LLVMType llvmctx × LLVM.Value llvmctx)
jp2bb : Std.HashMap JoinPointId (LLVM.BasicBlock llvmctx)

abbrev Error := String

@@ -84,7 +84,7 @@ def addJpTostate (jp : JoinPointId) (bb : LLVM.BasicBlock llvmctx) : M llvmctx U

def emitJp (jp : JoinPointId) : M llvmctx (LLVM.BasicBlock llvmctx) := do
let state ← get
match state.jp2bb.find? jp with
match state.jp2bb[jp]? with
| .some bb => return bb
| .none => throw s!"unable to find join point {jp}"

@@ -531,7 +531,7 @@ def emitFnDecls : M llvmctx Unit := do

def emitLhsSlot_ (x : VarId) : M llvmctx (LLVM.LLVMType llvmctx × LLVM.Value llvmctx) := do
let state ← get
match state.var2val.find? x with
match state.var2val[x]? with
| .some v => return v
| .none => throw s!"unable to find variable {x}"

@@ -1029,7 +1029,7 @@ def emitTailCall (builder : LLVM.Builder llvmctx) (f : FunId) (v : Expr) : M llv

def emitJmp (builder : LLVM.Builder llvmctx) (jp : JoinPointId) (xs : Array Arg) : M llvmctx Unit := do
let llvmctx ← read
let ps ← match llvmctx.jpMap.find? jp with
let ps ← match llvmctx.jpMap[jp]? with
| some ps => pure ps
| none => throw s!"Unknown join point {jp}"
unless xs.size == ps.size do throw s!"Invalid goto, mismatched sizes between arguments and formal parameters."
@@ -51,8 +51,8 @@ end CollectUsedDecls
def collectUsedDecls (env : Environment) (decl : Decl) (used : NameSet := {}) : NameSet :=
(CollectUsedDecls.collectDecl decl env).run' used

abbrev VarTypeMap := HashMap VarId IRType
abbrev JPParamsMap := HashMap JoinPointId (Array Param)
abbrev VarTypeMap := Std.HashMap VarId IRType
abbrev JPParamsMap := Std.HashMap JoinPointId (Array Param)

namespace CollectMaps
abbrev Collector := (VarTypeMap × JPParamsMap) → (VarTypeMap × JPParamsMap)

@@ -10,7 +10,7 @@ import Lean.Compiler.IR.FreeVars

namespace Lean.IR.ExpandResetReuse
/-- Mapping from variable to projections -/
abbrev ProjMap := HashMap VarId Expr
abbrev ProjMap := Std.HashMap VarId Expr
namespace CollectProjMap
abbrev Collector := ProjMap → ProjMap
@[inline] def collectVDecl (x : VarId) (v : Expr) : Collector := fun m =>
@@ -148,20 +148,20 @@ def setFields (y : VarId) (zs : Array Arg) (b : FnBody) : FnBody :=
def isSelfSet (ctx : Context) (x : VarId) (i : Nat) (y : Arg) : Bool :=
match y with
| Arg.var y =>
match ctx.projMap.find? y with
match ctx.projMap[y]? with
| some (Expr.proj j w) => j == i && w == x
| _ => false
| _ => false

/-- Given `uset x[i] := y`, return true iff `y := uproj[i] x` -/
def isSelfUSet (ctx : Context) (x : VarId) (i : Nat) (y : VarId) : Bool :=
match ctx.projMap.find? y with
match ctx.projMap[y]? with
| some (Expr.uproj j w) => j == i && w == x
| _ => false

/-- Given `sset x[n, i] := y`, return true iff `y := sproj[n, i] x` -/
def isSelfSSet (ctx : Context) (x : VarId) (n : Nat) (i : Nat) (y : VarId) : Bool :=
match ctx.projMap.find? y with
match ctx.projMap[y]? with
| some (Expr.sproj m j w) => n == m && j == i && w == x
| _ => false
@@ -64,34 +64,34 @@ instance : AddMessageContext CompilerM where

def getType (fvarId : FVarId) : CompilerM Expr := do
let lctx := (← get).lctx
if let some decl := lctx.letDecls.find? fvarId then
if let some decl := lctx.letDecls[fvarId]? then
return decl.type
else if let some decl := lctx.params.find? fvarId then
else if let some decl := lctx.params[fvarId]? then
return decl.type
else if let some decl := lctx.funDecls.find? fvarId then
else if let some decl := lctx.funDecls[fvarId]? then
return decl.type
else
throwError "unknown free variable {fvarId.name}"

def getBinderName (fvarId : FVarId) : CompilerM Name := do
let lctx := (← get).lctx
if let some decl := lctx.letDecls.find? fvarId then
if let some decl := lctx.letDecls[fvarId]? then
return decl.binderName
else if let some decl := lctx.params.find? fvarId then
else if let some decl := lctx.params[fvarId]? then
return decl.binderName
else if let some decl := lctx.funDecls.find? fvarId then
else if let some decl := lctx.funDecls[fvarId]? then
return decl.binderName
else
throwError "unknown free variable {fvarId.name}"

def findParam? (fvarId : FVarId) : CompilerM (Option Param) :=
return (← get).lctx.params.find? fvarId
return (← get).lctx.params[fvarId]?

def findLetDecl? (fvarId : FVarId) : CompilerM (Option LetDecl) :=
return (← get).lctx.letDecls.find? fvarId
return (← get).lctx.letDecls[fvarId]?

def findFunDecl? (fvarId : FVarId) : CompilerM (Option FunDecl) :=
return (← get).lctx.funDecls.find? fvarId
return (← get).lctx.funDecls[fvarId]?

def findLetValue? (fvarId : FVarId) : CompilerM (Option LetValue) := do
let some { value, .. } ← findLetDecl? fvarId | return none
@@ -166,7 +166,7 @@ it is a free variable, a type (or type former), or `lcErased`.

`Check.lean` contains a substitution validator.
-/
abbrev FVarSubst := HashMap FVarId Expr
abbrev FVarSubst := Std.HashMap FVarId Expr
/--
Replace the free variables in `e` using the given substitution.
@@ -190,7 +190,7 @@ where
go (e : Expr) : Expr :=
if e.hasFVar then
match e with
| .fvar fvarId => match s.find? fvarId with
| .fvar fvarId => match s[fvarId]? with
| some e => if translator then e else go e
| none => e
| .lit .. | .const .. | .sort .. | .mvar .. | .bvar .. => e
@@ -224,7 +224,7 @@ That is, it is not a type (or type former), nor `lcErased`. Recall that a valid
expressions that are free variables, `lcErased`, or type formers.
-/
private partial def normFVarImp (s : FVarSubst) (fvarId : FVarId) (translator : Bool) : NormFVarResult :=
match s.find? fvarId with
match s[fvarId]? with
| some (.fvar fvarId') =>
if translator then
.fvar fvarId'
@@ -246,7 +246,7 @@ private partial def normArgImp (s : FVarSubst) (arg : Arg) (translator : Bool) :
match arg with
| .erased => arg
| .fvar fvarId =>
match s.find? fvarId with
match s[fvarId]? with
| some (.fvar fvarId') =>
let arg' := .fvar fvarId'
if translator then arg' else normArgImp s arg' translator

@@ -268,7 +268,7 @@ def getFunctionSummary? (env : Environment) (fid : Name) : Option Value :=
A map from variable identifiers to the `Value` produced by the abstract
interpreter for them.
-/
abbrev Assignment := HashMap FVarId Value
abbrev Assignment := Std.HashMap FVarId Value

/--
The context of `InterpM`.
@@ -332,7 +332,7 @@ If none is available return `Value.bot`.
-/
def findVarValue (var : FVarId) : InterpM Value := do
let assignment ← getAssignment
return assignment.findD var .bot
return assignment.getD var .bot
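`findD` on `Lean.HashMap` corresponds to `getD` on `Std.HashMap`: both return a supplied default when the key is absent. A sketch with hypothetical values, not taken from the diff:

```lean
import Std.Data.HashMap

-- Sketch only: illustrates the `findD` → `getD` renaming applied in this hunk.
def valueOr (m : Std.HashMap Nat String) (k : Nat) (dflt : String) : String :=
  m.getD k dflt  -- old spelling: `m.findD k dflt`
```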
/--
Find the value of `arg` using the logic of `findVarValue`.
@@ -547,13 +547,13 @@ where
| .jp decl k | .fun decl k =>
return code.updateFun! (← decl.updateValue (← go decl.value)) (← go k)
| .cases cs =>
let discrVal := assignment.findD cs.discr .bot
let discrVal := assignment.getD cs.discr .bot
let processAlt typ alt := do
match alt with
| .alt ctor args body =>
if discrVal.containsCtor ctor then
let filter param := do
if let some val := assignment.find? param.fvarId then
if let some val := assignment[param.fvarId]? then
if let some literal ← val.getLiteral then
return some (param, literal)
return none

@@ -62,7 +62,7 @@ structure State where
Whenever there is function application `f a₁ ... aₙ`, where `f` is in `decls`, `f` is not `main`, and
we visit `f` with the abstract values assigned to `aᵢ`, but first we record the visit here.
-/
visited : HashSet (Name × Array AbsValue) := {}
visited : Std.HashSet (Name × Array AbsValue) := {}
/--
Bitmask containing the result, i.e., which parameters of `main` are fixed.
We initialize it with `true` everywhere.
/--
Use to collect relevant declarations for the floating mechanism.
@@ -59,7 +59,7 @@ structure FloatState where
/--
A map from identifiers of declarations to their current decision.
-/
decision : HashMap FVarId Decision
decision : Std.HashMap FVarId Decision
/--
A map from decisions (excluding `unknown`) to the declarations with
these decisions (in correct order). Basically:
@@ -67,7 +67,7 @@ structure FloatState where
- Which declarations do we move into a certain arm
- Which declarations do we move into the default arm
-/
newArms : HashMap Decision (List CodeDecl)
newArms : Std.HashMap Decision (List CodeDecl)

@@ -116,8 +116,8 @@ up to this point, with respect to `cs`. The initial decisions are:
- `arm` or `default` if we see the declaration only being used in exactly one cases arm
- `unknown` otherwise
-/
def initialDecisions (cs : Cases) : BaseFloatM (HashMap FVarId Decision) := do
let mut map := mkHashMap (← read).decls.length
def initialDecisions (cs : Cases) : BaseFloatM (Std.HashMap FVarId Decision) := do
let mut map := Std.HashMap.empty (← read).decls.length
let folder val acc := do
if let .let decl := val then
if (← ignore? decl) then
@@ -130,25 +130,25 @@ def initialDecisions (cs : Cases) : BaseFloatM (HashMap FVarId Decision) := do
(_, map) ← goCases cs |>.run map
return map
where
goFVar (plannedDecision : Decision) (var : FVarId) : StateRefT (HashMap FVarId Decision) BaseFloatM Unit := do
if let some decision := (← get).find? var then
goFVar (plannedDecision : Decision) (var : FVarId) : StateRefT (Std.HashMap FVarId Decision) BaseFloatM Unit := do
if let some decision := (← get)[var]? then
if decision == .unknown then
modify fun s => s.insert var plannedDecision
else if decision != plannedDecision then
modify fun s => s.insert var .dont
-- otherwise we already have the proper decision

goAlt (alt : Alt) : StateRefT (HashMap FVarId Decision) BaseFloatM Unit :=
goAlt (alt : Alt) : StateRefT (Std.HashMap FVarId Decision) BaseFloatM Unit :=
forFVarM (goFVar (.ofAlt alt)) alt
goCases (cs : Cases) : StateRefT (HashMap FVarId Decision) BaseFloatM Unit :=
goCases (cs : Cases) : StateRefT (Std.HashMap FVarId Decision) BaseFloatM Unit :=
cs.alts.forM goAlt

/--
Compute the initial new arms. This will just set up a map from all arms of
`cs` to empty `Array`s, plus one additional entry for `dont`.
-/
def initialNewArms (cs : Cases) : HashMap Decision (List CodeDecl) := Id.run do
let mut map := mkHashMap (cs.alts.size + 1)
def initialNewArms (cs : Cases) : Std.HashMap Decision (List CodeDecl) := Id.run do
let mut map := Std.HashMap.empty (cs.alts.size + 1)
map := map.insert .dont []
cs.alts.foldr (init := map) fun val acc => acc.insert (.ofAlt val) []

@@ -170,7 +170,7 @@ respectively but since `z` can't be moved we don't want that to move `x` and `y`
-/
def dontFloat (decl : CodeDecl) : FloatM Unit := do
forFVarM goFVar decl
modify fun s => { s with newArms := s.newArms.insert .dont (decl :: s.newArms.find! .dont) }
modify fun s => { s with newArms := s.newArms.insert .dont (decl :: s.newArms[Decision.dont]!) }
where
goFVar (fvar : FVarId) : FloatM Unit := do
if (← get).decision.contains fvar then
@@ -223,12 +223,12 @@ Will:
If we are at `y`, `x` is still marked to be moved, but we don't want that.
-/
def float (decl : CodeDecl) : FloatM Unit := do
let arm := (← get).decision.find! decl.fvarId
let arm := (← get).decision[decl.fvarId]!
forFVarM (goFVar · arm) decl
modify fun s => { s with newArms := s.newArms.insert arm (decl :: s.newArms.find! arm) }
modify fun s => { s with newArms := s.newArms.insert arm (decl :: s.newArms[arm]!) }
where
goFVar (fvar : FVarId) (arm : Decision) : FloatM Unit := do
let some decision := (← get).decision.find? fvar | return ()
let some decision := (← get).decision[fvar]? | return ()
if decision != arm then
modify fun s => { s with decision := s.decision.insert fvar .dont }
else if decision == .unknown then
@@ -249,7 +249,7 @@ where
-/
goCases : FloatM Unit := do
for decl in (← read).decls do
let currentDecision := (← get).decision.find! decl.fvarId
let currentDecision := (← get).decision[decl.fvarId]!
if currentDecision == .unknown then
/-
If the decision is still unknown by now this means `decl` is
@@ -284,10 +284,10 @@ where
newArms := initialNewArms cs
}
let (_, res) ← goCases |>.run base
let remainders := res.newArms.find! .dont
let remainders := res.newArms[Decision.dont]!
let altMapper alt := do
let decision := .ofAlt alt
let newCode := res.newArms.find! decision
let decision := Decision.ofAlt alt
let newCode := res.newArms[decision]!
trace[Compiler.floatLetIn] "Size of code that was pushed into arm: {repr decision} {newCode.length}"
let fused ← withNewScope do
go (attachCodeDecls newCode.toArray alt.getCode)
@@ -29,7 +29,7 @@ structure CandidateInfo where
The set of candidates that rely on this candidate to be a join point.
For a more detailed explanation see the documentation of `find`
-/
associated : HashSet FVarId
associated : Std.HashSet FVarId
deriving Inhabited

/--
@@ -39,14 +39,14 @@ structure FindState where
/--
All current join point candidates accessible by their `FVarId`.
-/
candidates : HashMap FVarId CandidateInfo := .empty
candidates : Std.HashMap FVarId CandidateInfo := .empty
/--
The `FVarId`s of all `fun` declarations that were declared within the
current `fun`.
-/
scope : HashSet FVarId := .empty
scope : Std.HashSet FVarId := .empty

abbrev ReplaceCtx := HashMap FVarId Name
abbrev ReplaceCtx := Std.HashMap FVarId Name

abbrev FindM := ReaderT (Option FVarId) StateRefT FindState ScopeM
abbrev ReplaceM := ReaderT ReplaceCtx CompilerM
@@ -55,7 +55,7 @@ abbrev ReplaceM := ReaderT ReplaceCtx CompilerM
Attempt to find a join point candidate by its `FVarId`.
-/
private def findCandidate? (fvarId : FVarId) : FindM (Option CandidateInfo) := do
return (← get).candidates.find? fvarId
return (← get).candidates[fvarId]?

/--
Erase a join point candidate as well as all the ones that depend on it
@@ -69,7 +69,7 @@ private partial def eraseCandidate (fvarId : FVarId) : FindM Unit := do
/--
Combinator for modifying the candidates in `FindM`.
-/
private def modifyCandidates (f : HashMap FVarId CandidateInfo → HashMap FVarId CandidateInfo) : FindM Unit :=
private def modifyCandidates (f : Std.HashMap FVarId CandidateInfo → Std.HashMap FVarId CandidateInfo) : FindM Unit :=
modify (fun state => {state with candidates := f state.candidates })
/--
@@ -196,7 +196,7 @@ where
return code
| _, _ => return Code.updateLet! code decl (← go k)
| .fun decl k =>
if let some replacement := (← read).find? decl.fvarId then
if let some replacement := (← read)[decl.fvarId]? then
let newDecl := { decl with
binderName := replacement,
value := (← go decl.value)
@@ -244,7 +244,7 @@ structure ExtendState where
to `Param`s. The free variables in this map are the ones that the context
of said join point will be extended by, by passing in the respective parameter.
-/
fvarMap : HashMap FVarId (HashMap FVarId Param) := {}
fvarMap : Std.HashMap FVarId (Std.HashMap FVarId Param) := {}

/--
The monad for the `extendJoinPointContext` pass.
@@ -262,7 +262,7 @@ otherwise just return `fvar`.
def replaceFVar (fvar : FVarId) : ExtendM FVarId := do
if (← read).candidates.contains fvar then
if let some currentJp := (← read).currentJp? then
if let some replacement := (← get).fvarMap.find! currentJp |>.find? fvar then
if let some replacement := (← get).fvarMap[currentJp]![fvar]? then
return replacement.fvarId
return fvar

@@ -313,7 +313,7 @@ This is necessary if:
-/
def extendByIfNecessary (fvar : FVarId) : ExtendM Unit := do
if let some currentJp := (← read).currentJp? then
let mut translator := (← get).fvarMap.find! currentJp
let mut translator := (← get).fvarMap[currentJp]!
let candidates := (← read).candidates
if !(← isInScope fvar) && !translator.contains fvar && candidates.contains fvar then
let typ ← getType fvar
@@ -337,7 +337,7 @@ of `j.2` in `j.1`.
-/
def mergeJpContextIfNecessary (jp : FVarId) : ExtendM Unit := do
if (← read).currentJp?.isSome then
let additionalArgs := (← get).fvarMap.find! jp |>.toArray
let additionalArgs := (← get).fvarMap[jp]!.toArray
for (fvar, _) in additionalArgs do
extendByIfNecessary fvar

@@ -405,7 +405,7 @@ where
| .jp decl k =>
let decl ← withNewJpScope decl do
let value ← go decl.value
let additionalParams := (← get).fvarMap.find! decl.fvarId |>.toArray |>.map Prod.snd
let additionalParams := (← get).fvarMap[decl.fvarId]!.toArray |>.map Prod.snd
let newType := additionalParams.foldr (init := decl.type) (fun val acc => .forallE val.binderName val.type acc .default)
decl.update newType (additionalParams ++ decl.params) value
mergeJpContextIfNecessary decl.fvarId
@@ -426,7 +426,7 @@ where
return Code.updateCases! code cs.resultType discr alts
| .jmp fn args =>
let mut newArgs ← args.mapM (mapFVarM goFVar)
let additionalArgs := (← get).fvarMap.find! fn |>.toArray |>.map Prod.fst
let additionalArgs := (← get).fvarMap[fn]!.toArray |>.map Prod.fst
if let some _currentJp := (← read).currentJp? then
let f := fun arg => do
return .fvar (← goFVar arg)
@@ -545,7 +545,7 @@ where
if let some knownArgs := (← get).jpJmpArgs.find? fn then
let mut newArgs := knownArgs
for (param, arg) in decl.params.zip args do
if let some knownVal := newArgs.find? param.fvarId then
if let some knownVal := newArgs[param.fvarId]? then
if arg.toExpr != knownVal then
newArgs := newArgs.erase param.fvarId
modify fun s => { s with jpJmpArgs := s.jpJmpArgs.insert fn newArgs }
@@ -13,9 +13,9 @@ namespace Lean.Compiler.LCNF
LCNF local context.
-/
structure LCtx where
params : HashMap FVarId Param := {}
letDecls : HashMap FVarId LetDecl := {}
funDecls : HashMap FVarId FunDecl := {}
params : Std.HashMap FVarId Param := {}
letDecls : Std.HashMap FVarId LetDecl := {}
funDecls : Std.HashMap FVarId FunDecl := {}
deriving Inhabited

def LCtx.addParam (lctx : LCtx) (param : Param) : LCtx :=

@@ -30,7 +30,7 @@ structure State where
/-- Counter for generating new (normalized) universe parameter names. -/
nextIdx : Nat := 1
/-- Mapping from existing universe parameter names to the new ones. -/
map : HashMap Name Level := {}
map : Std.HashMap Name Level := {}
/-- Parameters that have been normalized. -/
paramNames : Array Name := #[]

@@ -49,7 +49,7 @@ partial def normLevel (u : Level) : M Level := do
| .max v w => return u.updateMax! (← normLevel v) (← normLevel w)
| .imax v w => return u.updateIMax! (← normLevel v) (← normLevel w)
| .mvar _ => unreachable!
| .param n => match (← get).map.find? n with
| .param n => match (← get).map[n]? with
| some u => return u
| none =>
let u := Level.param <| (`u).appendIndexAfter (← get).nextIdx

@@ -31,9 +31,9 @@ def sortedBySize : Probe Decl (Nat × Decl) := fun decls =>
if sz₁ == sz₂ then Name.lt decl₁.name decl₂.name else sz₁ < sz₂

def countUnique [ToString α] [BEq α] [Hashable α] : Probe α (α × Nat) := fun data => do
let mut map := HashMap.empty
let mut map := Std.HashMap.empty
for d in data do
if let some count := map.find? d then
if let some count := map[d]? then
map := map.insert d (count + 1)
else
map := map.insert d 1
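The counting loop in `countUnique` branches on presence; with `Std.HashMap` the same logic can also be written branch-free via `getD`. A sketch under the same `BEq`/`Hashable` assumptions (not part of the diff):

```lean
import Std.Data.HashMap

-- Sketch only: equivalent counting logic using `getD` with default 0.
def count [BEq α] [Hashable α] (data : Array α) : Std.HashMap α Nat := Id.run do
  let mut map : Std.HashMap α Nat := {}
  for d in data do
    map := map.insert d (map.getD d 0 + 1)
  return map
```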
@@ -40,7 +40,7 @@ structure FunDeclInfoMap where
/--
Mapping from local function name to inlining information.
-/
map : HashMap FVarId FunDeclInfo := {}
map : Std.HashMap FVarId FunDeclInfo := {}
deriving Inhabited

def FunDeclInfoMap.format (s : FunDeclInfoMap) : CompilerM Format := do
@@ -56,7 +56,7 @@ Add new occurrence for the local function with binder name `key`.
def FunDeclInfoMap.add (s : FunDeclInfoMap) (fvarId : FVarId) : FunDeclInfoMap :=
match s with
| { map } =>
match map.find? fvarId with
match map[fvarId]? with
| some .once => { map := map.insert fvarId .many }
| none => { map := map.insert fvarId .once }
| _ => { map }
@@ -67,7 +67,7 @@ Add new occurrence for the local function occurring as an argument for another f
def FunDeclInfoMap.addHo (s : FunDeclInfoMap) (fvarId : FVarId) : FunDeclInfoMap :=
match s with
| { map } =>
match map.find? fvarId with
match map[fvarId]? with
| some .once | none => { map := map.insert fvarId .many }
| _ => { map }

@@ -173,7 +173,7 @@ Execute `x` with `fvarId` set as `mustInline`.
After execution the original setting is restored.
-/
def withAddMustInline (fvarId : FVarId) (x : SimpM α) : SimpM α := do
let saved? := (← get).funDeclInfoMap.map.find? fvarId
let saved? := (← get).funDeclInfoMap.map[fvarId]?
try
addMustInline fvarId
x
@@ -185,7 +185,7 @@ Return true if the given local function declaration or join point id is marked a
`once` or `mustInline`. We use this information to decide whether to inline them.
-/
def isOnceOrMustInline (fvarId : FVarId) : SimpM Bool := do
match (← get).funDeclInfoMap.map.find? fvarId with
match (← get).funDeclInfoMap.map[fvarId]? with
| some .once | some .mustInline => return true
| _ => return false
@@ -199,9 +199,9 @@ structure State where
/-- Cache from Lean regular expression to LCNF argument. -/
cache : PHashMap Expr Arg := {}
/-- `toLCNFType` cache -/
typeCache : HashMap Expr Expr := {}
typeCache : Std.HashMap Expr Expr := {}
/-- isTypeFormerType cache -/
isTypeFormerTypeCache : HashMap Expr Bool := {}
isTypeFormerTypeCache : Std.HashMap Expr Bool := {}
/-- LCNF sequence, we chain it to create a LCNF `Code` object. -/
seq : Array Element := #[]
/--
@@ -257,7 +257,7 @@ private partial def isTypeFormerType (type : Expr) : M Bool := do
| .true => return true
| .false => return false
| .undef =>
if let some result := (← get).isTypeFormerTypeCache.find? type then
if let some result := (← get).isTypeFormerTypeCache[type]? then
return result
let result ← liftMetaM <| Meta.isTypeFormerType type
modify fun s => { s with isTypeFormerTypeCache := s.isTypeFormerTypeCache.insert type result }
@@ -305,7 +305,7 @@ def applyToAny (type : Expr) : M Expr := do
| _ => none

def toLCNFType (type : Expr) : M Expr := do
match (← get).typeCache.find? type with
match (← get).typeCache[type]? with
| some type' => return type'
| none =>
let type' ← liftMetaM <| LCNF.toLCNFType type
@@ -31,7 +31,16 @@ register_builtin_option maxHeartbeats : Nat := {
descr := "maximum amount of heartbeats per command. A heartbeat is number of (small) memory allocations (in thousands), 0 means no limit"
}

def useDiagnosticMsg := s!"use `set_option {diagnostics.name} true` to get diagnostic information"
/--
If the `diagnostics` option is not already set, gives a message explaining this option.
Begins with a `\n`, so an error message can look like `m!"some error occurred{useDiagnosticMsg}"`.
-/
def useDiagnosticMsg : MessageData :=
MessageData.lazy fun ctx =>
if diagnostics.get ctx.opts then
pure ""
else
pure s!"\nAdditional diagnostic information may be available using the `set_option {diagnostics.name} true` command."

namespace Core

@@ -300,8 +309,10 @@ register_builtin_option debug.moduleNameAtTimeout : Bool := {
def throwMaxHeartbeat (moduleName : Name) (optionName : Name) (max : Nat) : CoreM Unit := do
let includeModuleName := debug.moduleNameAtTimeout.get (← getOptions)
let atModuleName := if includeModuleName then s!" at `{moduleName}`" else ""
let msg := s!"(deterministic) timeout{atModuleName}, maximum number of heartbeats ({max/1000}) has been reached\nuse `set_option {optionName} <num>` to set the limit\n{useDiagnosticMsg}"
throw <| Exception.error (← getRef) (MessageData.ofFormat (Std.Format.text msg))
throw <| Exception.error (← getRef) m!"\
(deterministic) timeout{atModuleName}, maximum number of heartbeats ({max/1000}) has been reached\n\
Use `set_option {optionName} <num>` to set the limit.\
{useDiagnosticMsg}"

def checkMaxHeartbeatsCore (moduleName : String) (optionName : Name) (max : Nat) : CoreM Unit := do
unless max == 0 do
@@ -6,6 +6,8 @@ Author: Leonardo de Moura
prelude
import Init.Data.Nat.Power2
import Lean.Data.AssocList
import Std.Data.HashMap.Basic
import Std.Data.HashMap.Raw
namespace Lean

def HashMapBucket (α : Type u) (β : Type v) :=
@@ -269,17 +271,11 @@ def ofListWith (l : List (α × β)) (f : β → β → β) : HashMap α β :=
| none => m.insert p.fst p.snd
| some v => m.insert p.fst $ f v p.snd)

end Lean.HashMap
attribute [deprecated Std.HashMap] HashMap
attribute [deprecated Std.HashMap.Raw] HashMapImp
attribute [deprecated Std.HashMap.Raw.empty] mkHashMapImp
attribute [deprecated Std.HashMap.empty] mkHashMap
attribute [deprecated Std.HashMap.empty] HashMap.empty
attribute [deprecated Std.HashMap.ofList] HashMap.ofList

/--
Groups all elements `x`, `y` in `xs` with `key x == key y` into the same array
`(xs.groupByKey key).find! (key x)`. Groups preserve the relative order of elements in `xs`.
-/
def Array.groupByKey [BEq α] [Hashable α] (key : β → α) (xs : Array β)
: Lean.HashMap α (Array β) := Id.run do
let mut groups := ∅
for x in xs do
let group := groups.findD (key x) #[]
groups := groups.erase (key x) -- make `group` referentially unique
groups := groups.insert (key x) (group.push x)
return groups
end Lean.HashMap
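The `attribute [deprecated …]` lines above retroactively mark the old `Lean.HashMap` names as deprecated in favor of their `Std` replacements. As a sketch with hypothetical names (not from the diff), the same mechanism applied at a definition site looks like:

```lean
-- Sketch only: `newApi`/`oldApi` are illustrative names.
def newApi (n : Nat) : Nat := n + 1

-- Using `oldApi` now emits a deprecation warning pointing at `newApi`.
@[deprecated newApi] def oldApi (n : Nat) : Nat := n + 1
```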
@@ -6,6 +6,8 @@ Author: Leonardo de Moura
prelude
import Init.Data.Nat.Power2
import Init.Data.List.Control
import Std.Data.HashSet.Basic
import Std.Data.HashSet.Raw
namespace Lean
universe u v w

@@ -217,3 +219,9 @@ def insertMany [ForIn Id ρ α] (s : HashSet α) (as : ρ) : HashSet α := Id.ru
def merge {α : Type u} [BEq α] [Hashable α] (s t : HashSet α) : HashSet α :=
t.fold (init := s) fun s a => s.insert a
-- We don't use `insertMany` here because it gives weird universes.

attribute [deprecated Std.HashSet] HashSet
attribute [deprecated Std.HashSet.Raw] HashSetImp
attribute [deprecated Std.HashSet.Raw.empty] mkHashSetImp
attribute [deprecated Std.HashSet.empty] mkHashSet
attribute [deprecated Std.HashSet.empty] HashSet.empty
@@ -12,10 +12,11 @@ import Lean.Data.RBMap
namespace Lean.Json.Parser

open Lean.Parsec
open Lean.Parsec.String

@[inline]
def hexChar : Parsec Nat := do
let c ← anyChar
def hexChar : Parser Nat := do
let c ← any
if '0' ≤ c ∧ c ≤ '9' then
pure $ c.val.toNat - '0'.val.toNat
else if 'a' ≤ c ∧ c ≤ 'f' then
@@ -25,8 +26,8 @@ def hexChar : Parsec Nat := do
else
fail "invalid hex character"

def escapedChar : Parsec Char := do
let c ← anyChar
def escapedChar : Parser Char := do
let c ← any
match c with
| '\\' => return '\\'
| '"' => return '"'
@@ -41,13 +42,13 @@ def escapedChar : Parsec Char := do
return Char.ofNat $ 4096*u1 + 256*u2 + 16*u3 + u4
| _ => fail "illegal \\u escape"

partial def strCore (acc : String) : Parsec String := do
partial def strCore (acc : String) : Parser String := do
let c ← peek!
if c = '"' then -- "
skip
return acc
else
let c ← anyChar
let c ← any
if c = '\\' then
strCore (acc.push (← escapedChar))
-- as to whether c.val > 0xffff should be split up and encoded with multiple \u,
@@ -58,9 +59,9 @@ partial def strCore (acc : String) : Parsec String := do
else
fail "unexpected character in string"

def str : Parsec String := strCore ""
def str : Parser String := strCore ""

partial def natCore (acc digits : Nat) : Parsec (Nat × Nat) := do
partial def natCore (acc digits : Nat) : Parser (Nat × Nat) := do
let some c ← peek? | return (acc, digits)
if '0' ≤ c ∧ c ≤ '9' then
skip
@@ -70,7 +71,7 @@ partial def natCore (acc digits : Nat) : Parsec (Nat × Nat) := do
return (acc, digits)

@[inline]
def lookahead (p : Char → Prop) (desc : String) [DecidablePred p] : Parsec Unit := do
def lookahead (p : Char → Prop) (desc : String) [DecidablePred p] : Parser Unit := do
let c ← peek!
if p c then
return ()
@@ -78,22 +79,22 @@ def lookahead (p : Char → Prop) (desc : String) [DecidablePred p] : Parsec Uni
fail <| "expected " ++ desc
@[inline]
|
||||
def natNonZero : Parsec Nat := do
|
||||
def natNonZero : Parser Nat := do
|
||||
lookahead (fun c => '1' ≤ c ∧ c ≤ '9') "1-9"
|
||||
let (n, _) ← natCore 0 0
|
||||
return n
|
||||
|
||||
@[inline]
|
||||
def natNumDigits : Parsec (Nat × Nat) := do
|
||||
def natNumDigits : Parser (Nat × Nat) := do
|
||||
lookahead (fun c => '0' ≤ c ∧ c ≤ '9') "digit"
|
||||
natCore 0 0
|
||||
|
||||
@[inline]
|
||||
def natMaybeZero : Parsec Nat := do
|
||||
def natMaybeZero : Parser Nat := do
|
||||
let (n, _) ← natNumDigits
|
||||
return n
|
||||
|
||||
def num : Parsec JsonNumber := do
|
||||
def num : Parser JsonNumber := do
|
||||
let c ← peek!
|
||||
let sign ← if c = '-' then
|
||||
skip
|
||||
@@ -132,10 +133,10 @@ def num : Parsec JsonNumber := do
|
||||
else
|
||||
return res
|
||||
|
||||
partial def arrayCore (anyCore : Parsec Json) (acc : Array Json) : Parsec (Array Json) := do
|
||||
partial def arrayCore (anyCore : Parser Json) (acc : Array Json) : Parser (Array Json) := do
|
||||
let hd ← anyCore
|
||||
let acc' := acc.push hd
|
||||
let c ← anyChar
|
||||
let c ← any
|
||||
if c = ']' then
|
||||
ws
|
||||
return acc'
|
||||
@@ -145,12 +146,12 @@ partial def arrayCore (anyCore : Parsec Json) (acc : Array Json) : Parsec (Array
|
||||
else
|
||||
fail "unexpected character in array"
|
||||
|
||||
partial def objectCore (anyCore : Parsec Json) : Parsec (RBNode String (fun _ => Json)) := do
|
||||
partial def objectCore (anyCore : Parser Json) : Parser (RBNode String (fun _ => Json)) := do
|
||||
lookahead (fun c => c = '"') "\""; skip; -- "
|
||||
let k ← strCore ""; ws
|
||||
lookahead (fun c => c = ':') ":"; skip; ws
|
||||
let v ← anyCore
|
||||
let c ← anyChar
|
||||
let c ← any
|
||||
if c = '}' then
|
||||
ws
|
||||
return RBNode.singleton k v
|
||||
@@ -161,7 +162,7 @@ partial def objectCore (anyCore : Parsec Json) : Parsec (RBNode String (fun _ =>
|
||||
else
|
||||
fail "unexpected character in object"
|
||||
|
||||
partial def anyCore : Parsec Json := do
|
||||
partial def anyCore : Parser Json := do
|
||||
let c ← peek!
|
||||
if c = '[' then
|
||||
skip; ws
|
||||
@@ -203,7 +204,7 @@ partial def anyCore : Parsec Json := do
|
||||
fail "unexpected input"
|
||||
|
||||
|
||||
def any : Parsec Json := do
|
||||
def any : Parser Json := do
|
||||
ws
|
||||
let res ← anyCore
|
||||
eof
|
||||
|
||||
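The rename above is internal; the public JSON entry point is unchanged. A minimal sketch (the `Lean.Json.parse` function is from the existing library, not introduced by this diff):

```lean
-- `Lean.Json.parse` drives `anyCore`/`any` internally; both the success and
-- the failure path still go through the combinators renamed above.
#eval Lean.Json.parse "{\"xs\": [1, 2, 3]}"  -- an `Except.ok` with the parsed value
#eval Lean.Json.parse "[1, 2"                -- an `Except.error` with an offset message
```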
@@ -150,7 +150,7 @@ instance : FromJson RefInfo where
     pure { definition?, usages }

 /-- References from a single module/file -/
-def ModuleRefs := HashMap RefIdent RefInfo
+def ModuleRefs := Std.HashMap RefIdent RefInfo

 instance : ToJson ModuleRefs where
   toJson m := Json.mkObj <| m.toList.map fun (ident, info) => (ident.toJson.compress, toJson info)
@@ -158,7 +158,7 @@ instance : ToJson ModuleRefs where
 instance : FromJson ModuleRefs where
   fromJson? j := do
     let node ← j.getObj?
-    node.foldM (init := HashMap.empty) fun m k v =>
+    node.foldM (init := Std.HashMap.empty) fun m k v =>
       return m.insert (← RefIdent.fromJson? (← Json.parse k)) (← fromJson? v)

/--
@@ -4,6 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
 Author: Leonardo de Moura
 -/
 prelude
+import Std.Data.HashSet.Basic
 import Lean.Data.HashSet
 import Lean.Data.RBMap
 import Lean.Data.RBTree
@@ -64,14 +65,14 @@ abbrev insert (s : NameSSet) (n : Name) : NameSSet := SSet.insert s n
 abbrev contains (s : NameSSet) (n : Name) : Bool := SSet.contains s n
 end NameSSet

-def NameHashSet := HashSet Name
+def NameHashSet := Std.HashSet Name

 namespace NameHashSet
-@[inline] def empty : NameHashSet := HashSet.empty
+@[inline] def empty : NameHashSet := Std.HashSet.empty
 instance : EmptyCollection NameHashSet := ⟨empty⟩
 instance : Inhabited NameHashSet := ⟨{}⟩
-def insert (s : NameHashSet) (n : Name) := HashSet.insert s n
-def contains (s : NameHashSet) (n : Name) : Bool := HashSet.contains s n
+def insert (s : NameHashSet) (n : Name) := Std.HashSet.insert s n
+def contains (s : NameHashSet) (n : Name) : Bool := Std.HashSet.contains s n
 end NameHashSet

 def MacroScopesView.isPrefixOf (v₁ v₂ : MacroScopesView) : Bool :=
@@ -4,181 +4,6 @@ Released under Apache 2.0 license as described in the file LICENSE.
 Author: Dany Fabian
 -/
 prelude
-import Init.NotationExtra
-import Init.Data.ToString.Macro
-
-namespace Lean
-
-namespace Parsec
-inductive ParseResult (α : Type) where
-  | success (pos : String.Iterator) (res : α)
-  | error (pos : String.Iterator) (err : String)
-  deriving Repr
-end Parsec
-
-def Parsec (α : Type) : Type := String.Iterator → Lean.Parsec.ParseResult α
-
-namespace Parsec
-
-open ParseResult
-
-instance (α : Type) : Inhabited (Parsec α) :=
-  ⟨λ it => error it ""⟩
-
-@[inline]
-protected def pure (a : α) : Parsec α := λ it =>
-  success it a
-
-@[inline]
-def bind {α β : Type} (f : Parsec α) (g : α → Parsec β) : Parsec β := λ it =>
-  match f it with
-  | success rem a => g a rem
-  | error pos msg => error pos msg
-
-instance : Monad Parsec :=
-  { pure := Parsec.pure, bind }
-
-@[inline]
-def fail (msg : String) : Parsec α := fun it =>
-  error it msg
-
-@[inline]
-def tryCatch (p : Parsec α)
-    (csuccess : α → Parsec β)
-    (cerror : Unit → Parsec β)
-    : Parsec β := fun it =>
-  match p it with
-  | .success rem a => csuccess a rem
-  | .error rem err =>
-    -- We assume that it.s never changes as the `Parsec` monad only modifies `it.pos`.
-    if it.pos = rem.pos then cerror () rem else .error rem err
-
-@[inline]
-def orElse (p : Parsec α) (q : Unit → Parsec α) : Parsec α :=
-  tryCatch p pure q
-
-@[inline]
-def attempt (p : Parsec α) : Parsec α := λ it =>
-  match p it with
-  | success rem res => success rem res
-  | error _ err => error it err
-
-instance : Alternative Parsec :=
-  { failure := fail "", orElse }
-
-protected def run (p : Parsec α) (s : String) : Except String α :=
-  match p s.mkIterator with
-  | Parsec.ParseResult.success _ res => Except.ok res
-  | Parsec.ParseResult.error it err  => Except.error s!"offset {repr it.i.byteIdx}: {err}"
-
-def expectedEndOfInput := "expected end of input"
-
-@[inline]
-def eof : Parsec Unit := fun it =>
-  if it.hasNext then
-    error it expectedEndOfInput
-  else
-    success it ()
-
-@[specialize]
-partial def manyCore (p : Parsec α) (acc : Array α) : Parsec $ Array α :=
-  tryCatch p (manyCore p <| acc.push ·) (fun _ => pure acc)
-
-@[inline]
-def many (p : Parsec α) : Parsec $ Array α := manyCore p #[]
-
-@[inline]
-def many1 (p : Parsec α) : Parsec $ Array α := do manyCore p #[←p]
-
-@[specialize]
-partial def manyCharsCore (p : Parsec Char) (acc : String) : Parsec String :=
-  tryCatch p (manyCharsCore p <| acc.push ·) (fun _ => pure acc)
-
-@[inline]
-def manyChars (p : Parsec Char) : Parsec String := manyCharsCore p ""
-
-@[inline]
-def many1Chars (p : Parsec Char) : Parsec String := do manyCharsCore p (←p).toString
-
-/-- Parses the given string. -/
-def pstring (s : String) : Parsec String := λ it =>
-  let substr := it.extract (it.forward s.length)
-  if substr = s then
-    success (it.forward s.length) substr
-  else
-    error it s!"expected: {s}"
-
-@[inline]
-def skipString (s : String) : Parsec Unit := pstring s *> pure ()
-
-def unexpectedEndOfInput := "unexpected end of input"
-
-@[inline]
-def anyChar : Parsec Char := λ it =>
-  if it.hasNext then success it.next it.curr else error it unexpectedEndOfInput
-
-@[inline]
-def pchar (c : Char) : Parsec Char := attempt do
-  if (←anyChar) = c then pure c else fail s!"expected: '{c}'"
-
-@[inline]
-def skipChar (c : Char) : Parsec Unit := pchar c *> pure ()
-
-@[inline]
-def digit : Parsec Char := attempt do
-  let c ← anyChar
-  if '0' ≤ c ∧ c ≤ '9' then return c else fail s!"digit expected"
-
-@[inline]
-def hexDigit : Parsec Char := attempt do
-  let c ← anyChar
-  if ('0' ≤ c ∧ c ≤ '9')
-   ∨ ('a' ≤ c ∧ c ≤ 'f')
-   ∨ ('A' ≤ c ∧ c ≤ 'F') then return c else fail s!"hex digit expected"
-
-@[inline]
-def asciiLetter : Parsec Char := attempt do
-  let c ← anyChar
-  if ('A' ≤ c ∧ c ≤ 'Z') ∨ ('a' ≤ c ∧ c ≤ 'z') then return c else fail s!"ASCII letter expected"
-
-@[inline]
-def satisfy (p : Char → Bool) : Parsec Char := attempt do
-  let c ← anyChar
-  if p c then return c else fail "condition not satisfied"
-
-@[inline]
-def notFollowedBy (p : Parsec α) : Parsec Unit := λ it =>
-  match p it with
-  | success _ _ => error it ""
-  | error _ _ => success it ()
-
-partial def skipWs (it : String.Iterator) : String.Iterator :=
-  if it.hasNext then
-    let c := it.curr
-    if c = '\u0009' ∨ c = '\u000a' ∨ c = '\u000d' ∨ c = '\u0020' then
-      skipWs it.next
-    else
-      it
-  else
-    it
-
-@[inline]
-def peek? : Parsec (Option Char) := fun it =>
-  if it.hasNext then
-    success it it.curr
-  else
-    success it none
-
-@[inline]
-def peek! : Parsec Char := do
-  let some c ← peek? | fail unexpectedEndOfInput
-  return c
-
-@[inline]
-def skip : Parsec Unit := fun it =>
-  success it.next ()
-
-@[inline]
-def ws : Parsec Unit := fun it =>
-  success (skipWs it) ()
-end Parsec
+import Lean.Data.Parsec.Basic
+import Lean.Data.Parsec.String
+import Lean.Data.Parsec.ByteArray
src/Lean/Data/Parsec/Basic.lean (new file, 144 lines)
@@ -0,0 +1,144 @@
/-
Copyright (c) 2021 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Dany Fabian, Henrik Böving
-/
prelude
import Init.NotationExtra
import Init.Data.ToString.Macro

namespace Lean

namespace Parsec

inductive ParseResult (α : Type) (ι : Type) where
  | success (pos : ι) (res : α)
  | error (pos : ι) (err : String)
  deriving Repr

end Parsec

def Parsec (ι : Type) (α : Type) : Type := ι → Lean.Parsec.ParseResult α ι

namespace Parsec

class Input (ι : Type) (elem : outParam Type) (idx : outParam Type) [DecidableEq idx] [DecidableEq elem] where
  pos : ι → idx
  next : ι → ι
  curr : ι → elem
  hasNext : ι → Bool
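The `Input` class is what lets the same combinators run over `String.Iterator` and `ByteArray.Iterator` below. As an illustration only (this instance is hypothetical, not part of the PR), a list-backed iterator would satisfy the interface like so:

```lean
-- Hypothetical iterator over `List Char`; `consumed` serves as the
-- position index that `Input.pos` must expose for backtracking checks.
structure ListIt where
  consumed : Nat := 0
  rest     : List Char

instance : Lean.Parsec.Input ListIt Char Nat where
  pos it     := it.consumed
  next it    := { consumed := it.consumed + 1, rest := it.rest.tailD [] }
  curr it    := it.rest.headD default
  hasNext it := !it.rest.isEmpty
```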
variable {α : Type} {ι : Type} {elem : Type} {idx : Type}
variable [DecidableEq idx] [DecidableEq elem] [Input ι elem idx]

instance : Inhabited (Parsec ι α) where
  default := fun it => .error it ""

@[inline]
protected def pure (a : α) : Parsec ι α := fun it =>
  .success it a

@[inline]
def bind {α β : Type} (f : Parsec ι α) (g : α → Parsec ι β) : Parsec ι β := fun it =>
  match f it with
  | .success rem a => g a rem
  | .error pos msg => .error pos msg

instance : Monad (Parsec ι) where
  pure := Parsec.pure
  bind := Parsec.bind

@[inline]
def fail (msg : String) : Parsec ι α := fun it =>
  .error it msg

@[inline]
def tryCatch (p : Parsec ι α) (csuccess : α → Parsec ι β) (cerror : Unit → Parsec ι β)
    : Parsec ι β := fun it =>
  match p it with
  | .success rem a => csuccess a rem
  | .error rem err =>
    -- We assume that it.s never changes as the `Parsec` monad only modifies `it.pos`.
    if Input.pos it = Input.pos rem then cerror () rem else .error rem err

@[inline]
def orElse (p : Parsec ι α) (q : Unit → Parsec ι α) : Parsec ι α :=
  tryCatch p pure q
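Since `tryCatch` compares positions via `Input.pos`, `orElse` is committed choice: the right branch only runs if the left branch failed without advancing. A sketch of the observable behaviour, using the `String.Parser` instance defined later in this diff (hypothetical snippet, `parseMagic`-style names are ours):

```lean
open Lean.Parsec Lean.Parsec.String

-- The left branch consumes 'a' before failing on 'b', so `<|>` does not
-- retry and the whole parse fails:
#eval Parser.run ((pchar 'a' *> pchar 'b') <|> pstring "ac") "ac"
-- `attempt` restores the start position on failure, so here `<|>` falls
-- through to the right branch and "ac" parses:
#eval Parser.run (attempt (pchar 'a' *> pchar 'b') <|> pstring "ac") "ac"
```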
@[inline]
def attempt (p : Parsec ι α) : Parsec ι α := fun it =>
  match p it with
  | .success rem res => .success rem res
  | .error _ err => .error it err

instance : Alternative (Parsec ι) where
  failure := fail ""
  orElse := orElse

def expectedEndOfInput := "expected end of input"

@[inline]
def eof : Parsec ι Unit := fun it =>
  if Input.hasNext it then
    .error it expectedEndOfInput
  else
    .success it ()

@[specialize]
partial def manyCore (p : Parsec ι α) (acc : Array α) : Parsec ι <| Array α :=
  tryCatch p (manyCore p <| acc.push ·) (fun _ => pure acc)

@[inline]
def many (p : Parsec ι α) : Parsec ι <| Array α := manyCore p #[]

@[inline]
def many1 (p : Parsec ι α) : Parsec ι <| Array α := do manyCore p #[← p]

def unexpectedEndOfInput := "unexpected end of input"

@[inline]
def any : Parsec ι elem := fun it =>
  if Input.hasNext it then
    .success (Input.next it) (Input.curr it)
  else
    .error it unexpectedEndOfInput

@[inline]
def satisfy (p : elem → Bool) : Parsec ι elem := attempt do
  let c ← any
  if p c then return c else fail "condition not satisfied"

@[inline]
def notFollowedBy (p : Parsec ι α) : Parsec ι Unit := fun it =>
  match p it with
  | .success _ _ => .error it ""
  | .error _ _ => .success it ()

@[inline]
def peek? : Parsec ι (Option elem) := fun it =>
  if Input.hasNext it then
    .success it (Input.curr it)
  else
    .success it none

@[inline]
def peek! : Parsec ι elem := do
  let some c ← peek? | fail unexpectedEndOfInput
  return c

@[inline]
def skip : Parsec ι Unit := fun it =>
  .success (Input.next it) ()

@[specialize]
partial def manyCharsCore (p : Parsec ι Char) (acc : String) : Parsec ι String :=
  tryCatch p (manyCharsCore p <| acc.push ·) (fun _ => pure acc)

@[inline]
def manyChars (p : Parsec ι Char) : Parsec ι String := manyCharsCore p ""

@[inline]
def many1Chars (p : Parsec ι Char) : Parsec ι String := do manyCharsCore p (← p).toString

end Parsec
src/Lean/Data/Parsec/ByteArray.lean (new file, 103 lines)
@@ -0,0 +1,103 @@
/-
Copyright (c) 2024 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Henrik Böving
-/
prelude
import Lean.Data.Parsec.Basic
import Init.Data.ByteArray.Basic
import Init.Data.String.Extra

namespace Lean
namespace Parsec
namespace ByteArray

instance : Input ByteArray.Iterator UInt8 Nat where
  pos it := it.pos
  next it := it.next
  curr it := it.curr
  hasNext it := it.hasNext

abbrev Parser (α : Type) : Type := Parsec ByteArray.Iterator α

protected def Parser.run (p : Parser α) (arr : ByteArray) : Except String α :=
  match p arr.iter with
  | .success _ res => Except.ok res
  | .error it err => Except.error s!"offset {repr it.pos}: {err}"
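A sketch of how the byte-level parser is driven (hypothetical snippet; `magic` and the byte values are our own illustration against the definitions introduced here):

```lean
open Lean.Parsec.ByteArray

-- Check a two-byte magic prefix at the start of a buffer, using the
-- `skipBytes` combinator defined just below.
def magic : Parser Unit :=
  skipBytes (ByteArray.mk #[0x89, 0x50])

#eval Parser.run magic (ByteArray.mk #[0x89, 0x50, 0x4E, 0x47])  -- expected to succeed
```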
@[inline]
def pbyte (b : UInt8) : Parser UInt8 := attempt do
  if (← any) = b then pure b else fail s!"expected: '{b}'"

@[inline]
def skipByte (b : UInt8) : Parser Unit := pbyte b *> pure ()

def skipBytes (arr : ByteArray) : Parser Unit := do
  for b in arr do
    skipByte b

@[inline]
def pstring (s : String) : Parser String := do
  skipBytes s.toUTF8
  return s

@[inline]
def skipString (s : String) : Parser Unit := pstring s *> pure ()

/--
Parse a `Char` that can be represented in 1 byte. If `c` uses more than 1 byte it is truncated.
-/
@[inline]
def pByteChar (c : Char) : Parser Char := attempt do
  if (← any) = c.toUInt8 then pure c else fail s!"expected: '{c}'"

/--
Skip a `Char` that can be represented in 1 byte. If `c` uses more than 1 byte it is truncated.
-/
@[inline]
def skipByteChar (c : Char) : Parser Unit := skipByte c.toUInt8

@[inline]
def digit : Parser Char := attempt do
  let b ← any
  if '0'.toUInt8 ≤ b ∧ b ≤ '9'.toUInt8 then return Char.ofUInt8 b else fail s!"digit expected"

@[inline]
def hexDigit : Parser Char := attempt do
  let b ← any
  if ('0'.toUInt8 ≤ b ∧ b ≤ '9'.toUInt8)
   ∨ ('a'.toUInt8 ≤ b ∧ b ≤ 'f'.toUInt8)
   ∨ ('A'.toUInt8 ≤ b ∧ b ≤ 'F'.toUInt8) then return Char.ofUInt8 b else fail s!"hex digit expected"

@[inline]
def asciiLetter : Parser Char := attempt do
  let b ← any
  if ('A'.toUInt8 ≤ b ∧ b ≤ 'Z'.toUInt8) ∨ ('a'.toUInt8 ≤ b ∧ b ≤ 'z'.toUInt8) then
    return Char.ofUInt8 b
  else
    fail s!"ASCII letter expected"

private partial def skipWs (it : ByteArray.Iterator) : ByteArray.Iterator :=
  if it.hasNext then
    let b := it.curr
    if b = '\u0009'.toUInt8 ∨ b = '\u000a'.toUInt8 ∨ b = '\u000d'.toUInt8 ∨ b = '\u0020'.toUInt8 then
      skipWs it.next
    else
      it
  else
    it

@[inline]
def ws : Parser Unit := fun it =>
  .success (skipWs it) ()

def take (n : Nat) : Parser ByteArray := fun it =>
  let subarr := it.array.extract it.idx (it.idx + n)
  if subarr.size != n then
    .error it s!"expected: {n} bytes"
  else
    .success (it.forward n) subarr

end ByteArray
end Parsec
end Lean
src/Lean/Data/Parsec/String.lean (new file, 84 lines)
@@ -0,0 +1,84 @@
/-
Copyright (c) 2021 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Dany Fabian, Henrik Böving
-/
prelude
import Lean.Data.Parsec.Basic

namespace Lean
namespace Parsec
namespace String

instance : Input String.Iterator Char String.Pos where
  pos it := it.pos
  next it := it.next
  curr it := it.curr
  hasNext it := it.hasNext

abbrev Parser (α : Type) : Type := Parsec String.Iterator α

protected def Parser.run (p : Parser α) (s : String) : Except String α :=
  match p s.mkIterator with
  | .success _ res => Except.ok res
  | .error it err => Except.error s!"offset {repr it.i.byteIdx}: {err}"

/-- Parses the given string. -/
def pstring (s : String) : Parser String := fun it =>
  let substr := it.extract (it.forward s.length)
  if substr = s then
    .success (it.forward s.length) substr
  else
    .error it s!"expected: {s}"

@[inline]
def skipString (s : String) : Parser Unit := pstring s *> pure ()

@[inline]
def pchar (c : Char) : Parser Char := attempt do
  if (← any) = c then pure c else fail s!"expected: '{c}'"

@[inline]
def skipChar (c : Char) : Parser Unit := pchar c *> pure ()

@[inline]
def digit : Parser Char := attempt do
  let c ← any
  if '0' ≤ c ∧ c ≤ '9' then return c else fail s!"digit expected"

@[inline]
def hexDigit : Parser Char := attempt do
  let c ← any
  if ('0' ≤ c ∧ c ≤ '9')
   ∨ ('a' ≤ c ∧ c ≤ 'f')
   ∨ ('A' ≤ c ∧ c ≤ 'F') then return c else fail s!"hex digit expected"

@[inline]
def asciiLetter : Parser Char := attempt do
  let c ← any
  if ('A' ≤ c ∧ c ≤ 'Z') ∨ ('a' ≤ c ∧ c ≤ 'z') then return c else fail s!"ASCII letter expected"

private partial def skipWs (it : String.Iterator) : String.Iterator :=
  if it.hasNext then
    let c := it.curr
    if c = '\u0009' ∨ c = '\u000a' ∨ c = '\u000d' ∨ c = '\u0020' then
      skipWs it.next
    else
      it
  else
    it

@[inline]
def ws : Parser Unit := fun it =>
  .success (skipWs it) ()

def take (n : Nat) : Parser String := fun it =>
  let substr := it.extract (it.forward n)
  if substr.length != n then
    .error it s!"expected: {n} codepoints"
  else
    .success (it.forward n) substr
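Note that `String.Iterator.forward` advances one `Char` at a time, so `take` counts codepoints rather than bytes, unlike its `ByteArray` counterpart. A minimal sketch:

```lean
open Lean.Parsec.String

-- "αβγ" is 6 bytes but 3 codepoints; `take 2` yields the first two chars.
#eval Parser.run (take 2) "αβγ"  -- expected: Except.ok "αβ"
```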
end String
end Parsec
end Lean
@@ -4,6 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
 Authors: Leonardo de Moura
 -/
 prelude
+import Std.Data.HashMap.Basic
 import Lean.Data.HashMap
 import Lean.Data.PersistentHashMap
 universe u v w w'
@@ -28,7 +29,7 @@ namespace Lean
 -/
 structure SMap (α : Type u) (β : Type v) [BEq α] [Hashable α] where
   stage₁ : Bool := true
-  map₁ : HashMap α β := {}
+  map₁ : Std.HashMap α β := {}
   map₂ : PHashMap α β := {}

 namespace SMap
@@ -37,7 +38,7 @@ variable {α : Type u} {β : Type v} [BEq α] [Hashable α]
 instance : Inhabited (SMap α β) := ⟨{}⟩
 def empty : SMap α β := {}

-@[inline] def fromHashMap (m : HashMap α β) (stage₁ := true) : SMap α β :=
+@[inline] def fromHashMap (m : Std.HashMap α β) (stage₁ := true) : SMap α β :=
   { map₁ := m, stage₁ := stage₁ }

 @[specialize] def insert : SMap α β → α → β → SMap α β
@@ -49,8 +50,8 @@ def empty : SMap α β := {}
   | ⟨false, m₁, m₂⟩, k, v => ⟨false, m₁, m₂.insert k v⟩

 @[specialize] def find? : SMap α β → α → Option β
-  | ⟨true, m₁, _⟩, k => m₁.find? k
-  | ⟨false, m₁, m₂⟩, k => (m₂.find? k).orElse fun _ => m₁.find? k
+  | ⟨true, m₁, _⟩, k => m₁[k]?
+  | ⟨false, m₁, m₂⟩, k => (m₂.find? k).orElse fun _ => m₁[k]?

 @[inline] def findD (m : SMap α β) (a : α) (b₀ : β) : β :=
   (m.find? a).getD b₀
@@ -67,8 +68,8 @@ def empty : SMap α β := {}
 /-- Similar to `find?`, but searches for result in the hashmap first.
 So, the result is correct only if we never "overwrite" `map₁` entries using `map₂`. -/
 @[specialize] def find?' : SMap α β → α → Option β
-  | ⟨true, m₁, _⟩, k => m₁.find? k
-  | ⟨false, m₁, m₂⟩, k => (m₁.find? k).orElse fun _ => m₂.find? k
+  | ⟨true, m₁, _⟩, k => m₁[k]?
+  | ⟨false, m₁, m₂⟩, k => m₁[k]?.orElse fun _ => m₂.find? k
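The switch from `find?` to bracket lookup (`m₁[k]?`) preserves the staged-lookup contract: after stage₁, `find?` consults the persistent `map₂` first while `find?'` consults the hash map first, and the two agree as long as stage₁ keys are never overwritten. A minimal stage₁ sketch:

```lean
open Lean

-- While `stage₁ = true`, inserts land in the hash map and
-- lookups go through `m₁[k]?`.
def m : SMap Nat String := SMap.empty.insert 1 "one"

#eval m.find? 1  -- expected: some "one"
```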
 def forM [Monad m] (s : SMap α β) (f : α → β → m PUnit) : m PUnit := do
   s.map₁.forM f
@@ -96,7 +97,7 @@ def fold {σ : Type w} (f : σ → α → β → σ) (init : σ) (m : SMap α β
   m.map₂.foldl f $ m.map₁.fold f init

 def numBuckets (m : SMap α β) : Nat :=
-  m.map₁.numBuckets
+  Std.HashMap.Internal.numBuckets m.map₁

 def toList (m : SMap α β) : List (α × β) :=
   m.fold (init := []) fun es a b => (a, b)::es
@@ -13,23 +13,24 @@ namespace Lean
 namespace Xml

 namespace Parser

 open Lean.Parsec
 open Parsec.ParseResult
+open Lean.Parsec.String

 abbrev LeanChar := Char

 /-- consume a newline character sequence pretending, that we read '\n'. As per spec:
 https://www.w3.org/TR/xml/#sec-line-ends -/
-def endl : Parsec LeanChar := (skipString "\r\n" <|> skipChar '\r' <|> skipChar '\n') *> pure '\n'
+def endl : Parser LeanChar := (skipString "\r\n" <|> skipChar '\r' <|> skipChar '\n') *> pure '\n'

-def quote (p : Parsec α) : Parsec α :=
+def quote (p : Parser α) : Parser α :=
   skipChar '\'' *> p <* skipChar '\''
   <|> skipChar '"' *> p <* skipChar '"'

 /-- https://www.w3.org/TR/xml/#NT-Char -/
-def Char : Parsec LeanChar :=
+def Char : Parser LeanChar :=
   (attempt do
-    let c ← anyChar
+    let c ← any
     let cNat := c.toNat
     if (0x20 ≤ cNat ∧ cNat ≤ 0xD7FF)
      ∨ (0xE000 ≤ cNat ∧ cNat ≤ 0xFFFD)
@@ -37,11 +38,11 @@ def Char : Parsec LeanChar :=
   <|> pchar '\t' <|> endl

 /-- https://www.w3.org/TR/xml/#NT-S -/
-def S : Parsec String :=
+def S : Parser String :=
   many1Chars (pchar ' ' <|> endl <|> pchar '\t')

 /-- https://www.w3.org/TR/xml/#NT-Eq -/
-def Eq : Parsec Unit :=
+def Eq : Parser Unit :=
   optional S *> skipChar '=' <* optional S

 private def nameStartCharRanges : Array (Nat × Nat) :=
@@ -59,8 +60,8 @@ private def nameStartCharRanges : Array (Nat × Nat) :=
    (0x10000, 0xEFFFF)]

 /-- https://www.w3.org/TR/xml/#NT-NameStartChar -/
-def NameStartChar : Parsec LeanChar := attempt do
-  let c ← anyChar
+def NameStartChar : Parser LeanChar := attempt do
+  let c ← any
   if ('A' ≤ c ∧ c ≤ 'Z') ∨ ('a' ≤ c ∧ c ≤ 'z') then pure c
   else if c = ':' ∨ c = '_' then pure c
   else
@@ -69,44 +70,44 @@ def NameStartChar : Parsec LeanChar := attempt do
   else fail "expected a name character"

 /-- https://www.w3.org/TR/xml/#NT-NameChar -/
-def NameChar : Parsec LeanChar :=
+def NameChar : Parser LeanChar :=
   NameStartChar <|> digit <|> pchar '-' <|> pchar '.' <|> pchar '\xB7'
   <|> satisfy (λ c => ('\u0300' ≤ c ∧ c ≤ '\u036F') ∨ ('\u203F' ≤ c ∧ c ≤ '\u2040'))

 /-- https://www.w3.org/TR/xml/#NT-Name -/
-def Name : Parsec String := do
+def Name : Parser String := do
   let x ← NameStartChar
   manyCharsCore NameChar x.toString

 /-- https://www.w3.org/TR/xml/#NT-VersionNum -/
-def VersionNum : Parsec Unit :=
+def VersionNum : Parser Unit :=
   skipString "1." <* (many1 digit)

 /-- https://www.w3.org/TR/xml/#NT-VersionInfo -/
-def VersionInfo : Parsec Unit := do
+def VersionInfo : Parser Unit := do
   S *>
   skipString "version"
   Eq
   quote VersionNum

 /-- https://www.w3.org/TR/xml/#NT-EncName -/
-def EncName : Parsec String := do
+def EncName : Parser String := do
   let x ← asciiLetter
   manyCharsCore (asciiLetter <|> digit <|> pchar '-' <|> pchar '_' <|> pchar '.') x.toString

 /-- https://www.w3.org/TR/xml/#NT-EncodingDecl -/
-def EncodingDecl : Parsec String := do
+def EncodingDecl : Parser String := do
   S *>
   skipString "encoding"
   Eq
   quote EncName

 /-- https://www.w3.org/TR/xml/#NT-SDDecl -/
-def SDDecl : Parsec String := do
+def SDDecl : Parser String := do
   S *> skipString "standalone" *> Eq *> quote (pstring "yes" <|> pstring "no")

 /-- https://www.w3.org/TR/xml/#NT-XMLDecl -/
-def XMLdecl : Parsec Unit := do
+def XMLdecl : Parser Unit := do
   skipString "<?xml"
   VersionInfo
   optional EncodingDecl *>
@@ -115,7 +116,7 @@ def XMLdecl : Parsec Unit := do
   skipString "?>"

 /-- https://www.w3.org/TR/xml/#NT-Comment -/
-def Comment : Parsec String :=
+def Comment : Parser String :=
   let notDash := Char.toString <$> satisfy (λ c => c ≠ '-')
   skipString "<!--" *>
   Array.foldl String.append "" <$> many (attempt <| notDash <|> (do
@@ -125,45 +126,45 @@ def Comment : Parsec String :=
   <* skipString "-->"

 /-- https://www.w3.org/TR/xml/#NT-PITarget -/
-def PITarget : Parsec String :=
+def PITarget : Parser String :=
   Name <* (skipChar 'X' <|> skipChar 'x') <* (skipChar 'M' <|> skipChar 'm') <* (skipChar 'L' <|> skipChar 'l')

 /-- https://www.w3.org/TR/xml/#NT-PI -/
-def PI : Parsec Unit := do
+def PI : Parser Unit := do
   skipString "<?"
   <* PITarget <*
   optional (S *> manyChars (notFollowedBy (skipString "?>") *> Char))
   skipString "?>"

 /-- https://www.w3.org/TR/xml/#NT-Misc -/
-def Misc : Parsec Unit :=
+def Misc : Parser Unit :=
   Comment *> pure () <|> PI <|> S *> pure ()

 /-- https://www.w3.org/TR/xml/#NT-SystemLiteral -/
-def SystemLiteral : Parsec String :=
+def SystemLiteral : Parser String :=
   pchar '"' *> manyChars (satisfy λ c => c ≠ '"') <* pchar '"'
   <|> pchar '\'' *> manyChars (satisfy λ c => c ≠ '\'') <* pure '\''

 /-- https://www.w3.org/TR/xml/#NT-PubidChar -/
-def PubidChar : Parsec LeanChar :=
+def PubidChar : Parser LeanChar :=
   asciiLetter <|> digit <|> endl <|> attempt do
-    let c ← anyChar
+    let c ← any
     if "-'()+,./:=?;!*#@$_%".contains c then pure c else fail "PublidChar expected"

 /-- https://www.w3.org/TR/xml/#NT-PubidLiteral -/
-def PubidLiteral : Parsec String :=
+def PubidLiteral : Parser String :=
   pchar '"' *> manyChars PubidChar <* pchar '"'
   <|> pchar '\'' *> manyChars (attempt do
     let c ← PubidChar
     if c = '\'' then fail "'\\'' not expected" else pure c) <* pchar '\''

 /-- https://www.w3.org/TR/xml/#NT-ExternalID -/
-def ExternalID : Parsec Unit :=
+def ExternalID : Parser Unit :=
   skipString "SYSTEM" *> S *> SystemLiteral *> pure ()
   <|> skipString "PUBLIC" *> S *> PubidLiteral *> S *> SystemLiteral *> pure ()

 /-- https://www.w3.org/TR/xml/#NT-Mixed -/
-def Mixed : Parsec Unit :=
+def Mixed : Parser Unit :=
   (do
     skipChar '('
     optional S *>
@@ -175,11 +176,11 @@ def Mixed : Parsec Unit :=

 mutual
 /-- https://www.w3.org/TR/xml/#NT-cp -/
-partial def cp : Parsec Unit :=
+partial def cp : Parser Unit :=
   (Name *> pure () <|> choice <|> seq) <* optional (skipChar '?' <|> skipChar '*' <|> skipChar '+')

 /-- https://www.w3.org/TR/xml/#NT-choice -/
-partial def choice : Parsec Unit := do
+partial def choice : Parser Unit := do
   skipChar '('
   optional S *>
   cp
@@ -188,7 +189,7 @@ mutual
   skipChar ')'

 /-- https://www.w3.org/TR/xml/#NT-seq -/
-partial def seq : Parsec Unit := do
+partial def seq : Parser Unit := do
   skipChar '('
   optional S *>
   cp
@@ -198,15 +199,15 @@ mutual
 end

 /-- https://www.w3.org/TR/xml/#NT-children -/
-def children : Parsec Unit :=
+def children : Parser Unit :=
   (choice <|> seq) <* optional (skipChar '?' <|> skipChar '*' <|> skipChar '+')

 /-- https://www.w3.org/TR/xml/#NT-contentspec -/
-def contentspec : Parsec Unit := do
+def contentspec : Parser Unit := do
   skipString "EMPTY" <|> skipString "ANY" <|> Mixed <|> children

 /-- https://www.w3.org/TR/xml/#NT-elementdecl -/
-def elementDecl : Parsec Unit := do
+def elementDecl : Parser Unit := do
   skipString "<!ELEMENT"
   S *>
   Name *>
@@ -215,11 +216,11 @@ def elementDecl : Parsec Unit := do
   skipChar '>'

 /-- https://www.w3.org/TR/xml/#NT-StringType -/
-def StringType : Parsec Unit :=
+def StringType : Parser Unit :=
   skipString "CDATA"

 /-- https://www.w3.org/TR/xml/#NT-TokenizedType -/
-def TokenizedType : Parsec Unit :=
+def TokenizedType : Parser Unit :=
   skipString "ID"
   <|> skipString "IDREF"
   <|> skipString "IDREFS"
@@ -229,7 +230,7 @@ def TokenizedType : Parsec Unit :=
   <|> skipString "NMTOKENS"

 /-- https://www.w3.org/TR/xml/#NT-NotationType -/
-def NotationType : Parsec Unit := do
+def NotationType : Parser Unit := do
   skipString "NOTATION"
   S *>
   skipChar '(' <*
@@ -239,11 +240,11 @@ def NotationType : Parsec Unit := do
   skipChar ')'

 /-- https://www.w3.org/TR/xml/#NT-Nmtoken -/
-def Nmtoken : Parsec String := do
+def Nmtoken : Parser String := do
   many1Chars NameChar

 /-- https://www.w3.org/TR/xml/#NT-Enumeration -/
-def Enumeration : Parsec Unit := do
+def Enumeration : Parser Unit := do
   skipChar '('
   optional S *>
   Nmtoken *> many (optional S *> skipChar '|' *> optional S *> Nmtoken) *>
@@ -251,11 +252,11 @@ def Enumeration : Parsec Unit := do
   skipChar ')'

 /-- https://www.w3.org/TR/xml/#NT-EnumeratedType -/
-def EnumeratedType : Parsec Unit :=
+def EnumeratedType : Parser Unit :=
   NotationType <|> Enumeration

 /-- https://www.w3.org/TR/xml/#NT-AttType -/
-def AttType : Parsec Unit :=
+def AttType : Parser Unit :=
   StringType <|> TokenizedType <|> EnumeratedType

 def predefinedEntityToChar : String → Option LeanChar
@@ -267,7 +268,7 @@ def predefinedEntityToChar : String → Option LeanChar
 | _ => none

 /-- https://www.w3.org/TR/xml/#NT-EntityRef -/
-def EntityRef : Parsec $ Option LeanChar := attempt $
+def EntityRef : Parser $ Option LeanChar := attempt $
   skipChar '&' *> predefinedEntityToChar <$> Name <* skipChar ';'

 @[inline]
@@ -280,7 +281,7 @@ def digitsToNat (base : Nat) (digits : Array Nat) : Nat :=
   digits.foldl (λ r d => r * base + d) 0

 /-- https://www.w3.org/TR/xml/#NT-CharRef -/
-def CharRef : Parsec LeanChar := do
+def CharRef : Parser LeanChar := do
   skipString "&#"
   let charCode ←
   digitsToNat 10 <$> many1 (hexDigitToNat <$> digit)
@@ -289,11 +290,11 @@ def CharRef : Parsec LeanChar := do
|
||||
return Char.ofNat charCode
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-Reference -/
|
||||
def Reference : Parsec $ Option LeanChar :=
|
||||
def Reference : Parser $ Option LeanChar :=
|
||||
EntityRef <|> some <$> CharRef
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-AttValue -/
|
||||
def AttValue : Parsec String := do
|
||||
def AttValue : Parser String := do
|
||||
let chars ←
|
||||
(do
|
||||
skipChar '"'
|
||||
@@ -306,25 +307,25 @@ def AttValue : Parsec String := do
|
||||
return chars.foldl (λ s c => if let some c := c then s.push c else s) ""
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-DefaultDecl -/
|
||||
def DefaultDecl : Parsec Unit :=
|
||||
def DefaultDecl : Parser Unit :=
|
||||
skipString "#REQUIRED"
|
||||
<|> skipString "#IMPLIED"
|
||||
<|> optional (skipString "#FIXED" <* S) *> AttValue *> pure ()
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-AttDef -/
|
||||
def AttDef : Parsec Unit :=
|
||||
def AttDef : Parser Unit :=
|
||||
S *> Name *> S *> AttType *> S *> DefaultDecl
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-AttlistDecl -/
|
||||
def AttlistDecl : Parsec Unit :=
|
||||
def AttlistDecl : Parser Unit :=
|
||||
skipString "<!ATTLIST" *> S *> Name *> many AttDef *> optional S *> skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-PEReference -/
|
||||
def PEReference : Parsec Unit :=
|
||||
def PEReference : Parser Unit :=
|
||||
skipChar '%' *> Name *> skipChar ';'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-EntityValue -/
|
||||
def EntityValue : Parsec String := do
|
||||
def EntityValue : Parser String := do
|
||||
let chars ←
|
||||
(do
|
||||
skipChar '"'
|
||||
@@ -338,51 +339,51 @@ def EntityValue : Parsec String := do
|
||||
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-NDataDecl -/
|
||||
def NDataDecl : Parsec Unit :=
|
||||
def NDataDecl : Parser Unit :=
|
||||
S *> skipString "NDATA" <* S <* Name
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-EntityDef -/
|
||||
def EntityDef : Parsec Unit :=
|
||||
def EntityDef : Parser Unit :=
|
||||
EntityValue *> pure () <|> (ExternalID <* optional NDataDecl)
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-GEDecl -/
|
||||
def GEDecl : Parsec Unit :=
|
||||
def GEDecl : Parser Unit :=
|
||||
skipString "<!ENTITY" *> S *> Name *> S *> EntityDef *> optional S *> skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-PEDef -/
|
||||
def PEDef : Parsec Unit :=
|
||||
def PEDef : Parser Unit :=
|
||||
EntityValue *> pure () <|> ExternalID
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-PEDecl -/
|
||||
def PEDecl : Parsec Unit :=
|
||||
def PEDecl : Parser Unit :=
|
||||
skipString "<!ENTITY" *> S *> skipChar '%' *> S *> Name *> PEDef *> optional S *> skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-EntityDecl -/
|
||||
def EntityDecl : Parsec Unit :=
|
||||
def EntityDecl : Parser Unit :=
|
||||
GEDecl <|> PEDecl
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-PublicID -/
|
||||
def PublicID : Parsec Unit :=
|
||||
def PublicID : Parser Unit :=
|
||||
skipString "PUBLIC" <* S <* PubidLiteral
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-NotationDecl -/
|
||||
def NotationDecl : Parsec Unit :=
|
||||
def NotationDecl : Parser Unit :=
|
||||
skipString "<!NOTATION" *> S *> Name *> (ExternalID <|> PublicID) *> optional S *> skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-markupdecl -/
|
||||
def markupDecl : Parsec Unit :=
|
||||
def markupDecl : Parser Unit :=
|
||||
elementDecl <|> AttlistDecl <|> EntityDecl <|> NotationDecl <|> PI <|> (Comment *> pure ())
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-DeclSep -/
|
||||
def DeclSep : Parsec Unit :=
|
||||
def DeclSep : Parser Unit :=
|
||||
PEReference <|> S *> pure ()
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-intSubset -/
|
||||
def intSubset : Parsec Unit :=
|
||||
def intSubset : Parser Unit :=
|
||||
many (markupDecl <|> DeclSep) *> pure ()
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-doctypedecl -/
|
||||
def doctypedecl : Parsec Unit := do
|
||||
def doctypedecl : Parser Unit := do
|
||||
skipString "<!DOCTYPE"
|
||||
S *>
|
||||
Name *>
|
||||
@@ -392,19 +393,19 @@ def doctypedecl : Parsec Unit := do
|
||||
skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-prolog -/
|
||||
def prolog : Parsec Unit :=
|
||||
def prolog : Parser Unit :=
|
||||
optional XMLdecl *>
|
||||
many Misc *>
|
||||
optional (doctypedecl <* many Misc) *> pure ()
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-Attribute -/
|
||||
def Attribute : Parsec (String × String) := do
|
||||
def Attribute : Parser (String × String) := do
|
||||
let name ← Name
|
||||
Eq
|
||||
let value ← AttValue
|
||||
return (name, value)
|
||||
|
||||
protected def elementPrefix : Parsec (Array Content → Element) := do
|
||||
protected def elementPrefix : Parser (Array Content → Element) := do
|
||||
skipChar '<'
|
||||
let name ← Name
|
||||
let attributes ← many (attempt <| S *> Attribute)
|
||||
@@ -412,40 +413,40 @@ protected def elementPrefix : Parsec (Array Content → Element) := do
|
||||
return Element.Element name (RBMap.fromList attributes.toList compare)
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-EmptyElemTag -/
|
||||
def EmptyElemTag (elem : Array Content → Element) : Parsec Element := do
|
||||
def EmptyElemTag (elem : Array Content → Element) : Parser Element := do
|
||||
skipString "/>" *> pure (elem #[])
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-STag -/
|
||||
def STag (elem : Array Content → Element) : Parsec (Array Content → Element) := do
|
||||
def STag (elem : Array Content → Element) : Parser (Array Content → Element) := do
|
||||
skipChar '>' *> pure elem
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-ETag -/
|
||||
def ETag : Parsec Unit :=
|
||||
def ETag : Parser Unit :=
|
||||
skipString "</" *> Name *> optional S *> skipChar '>'
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-CDStart -/
|
||||
def CDStart : Parsec Unit :=
|
||||
def CDStart : Parser Unit :=
|
||||
skipString "<![CDATA["
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-CDEnd -/
|
||||
def CDEnd : Parsec Unit :=
|
||||
def CDEnd : Parser Unit :=
|
||||
skipString "]]>"
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-CData -/
|
||||
def CData : Parsec String :=
|
||||
manyChars (notFollowedBy (skipString "]]>") *> anyChar)
|
||||
def CData : Parser String :=
|
||||
manyChars (notFollowedBy (skipString "]]>") *> any)
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-CDSect -/
|
||||
def CDSect : Parsec String :=
|
||||
def CDSect : Parser String :=
|
||||
CDStart *> CData <* CDEnd
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-CharData -/
|
||||
def CharData : Parsec String :=
|
||||
def CharData : Parser String :=
|
||||
notFollowedBy (skipString "]]>") *> manyChars (satisfy λ c => c ≠ '<' ∧ c ≠ '&')
|
||||
|
||||
mutual
|
||||
/-- https://www.w3.org/TR/xml/#NT-content -/
|
||||
partial def content : Parsec (Array Content) := do
|
||||
partial def content : Parser (Array Content) := do
|
||||
let x ← optional (Content.Character <$> CharData)
|
||||
let xs ← many do
|
||||
let y ←
|
||||
@@ -468,20 +469,20 @@ mutual
|
||||
return res
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-element -/
|
||||
partial def element : Parsec Element := do
|
||||
partial def element : Parser Element := do
|
||||
let elem ← Parser.elementPrefix
|
||||
EmptyElemTag elem <|> STag elem <*> content <* ETag
|
||||
|
||||
end
|
||||
|
||||
/-- https://www.w3.org/TR/xml/#NT-document -/
|
||||
def document : Parsec Element := prolog *> element <* many Misc <* eof
|
||||
def document : Parser Element := prolog *> element <* many Misc <* eof
|
||||
|
||||
end Parser
|
||||
|
||||
def parse (s : String) : Except String Element :=
|
||||
match Xml.Parser.document s.mkIterator with
|
||||
| Parsec.ParseResult.success _ res => Except.ok res
|
||||
| Parsec.ParseResult.error it err => Except.error s!"offset {it.i.byteIdx.repr}: {err}\n{(it.prevn 10).extract it}"
|
||||
| .success _ res => Except.ok res
|
||||
| .error it err => Except.error s!"offset {it.i.byteIdx.repr}: {err}\n{(it.prevn 10).extract it}"
|
||||
|
||||
end Xml
|
||||
|
||||
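The renamed combinators are exercised end to end through `Xml.parse`, which runs `Parser.document` over a string iterator and reports the failing byte offset on error. A minimal usage sketch, assuming the `Xml` namespace from the hunks above is in scope (the exact error formatting depends on the `Parsec`/`Parser` version in use):

```lean
-- Sketch only: `Xml.parse` returns `Except String Element`.
#eval match Xml.parse "<greeting lang=\"en\">hello</greeting>" with
  | .ok _ => "parsed"
  | .error e => e
```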
@@ -541,7 +541,7 @@ mutual
   /--
   Process a `fType` of the form `(x : A) → B x`.
   This method assumes `fType` is a function type -/
-  private partial def processExplictArg (argName : Name) : M Expr := do
+  private partial def processExplicitArg (argName : Name) : M Expr := do
     match (← get).args with
     | arg::args =>
       if (← anyNamedArgDependsOnCurrent) then
@@ -586,6 +586,16 @@ mutual
       | Except.ok tacticSyntax =>
         -- TODO(Leo): does this work correctly for tactic sequences?
         let tacticBlock ← `(by $(⟨tacticSyntax⟩))
+        /-
+        We insert position information from the current ref into `stx` everywhere, simulating this being
+        a tactic script inserted by the user, which ensures error messages and logging will always be attributed
+        to this application rather than sometimes being placed at position (1,0) in the file.
+        Placing position information on `by` syntax alone is not sufficient since incrementality
+        (in particular, `Lean.Elab.Term.withReuseContext`) controls the ref to avoid leakage of outside data.
+        Note that `tacticSyntax` contains no position information itself, since it is erased by `Lean.Elab.Term.quoteAutoTactic`.
+        -/
+        let info := (← getRef).getHeadInfo
+        let tacticBlock := tacticBlock.raw.rewriteBottomUp (·.setInfo info)
         let argNew := Arg.stx tacticBlock
         propagateExpectedType argNew
         elabAndAddNewArg argName argNew
@@ -615,7 +625,7 @@ mutual
   This method assumes `fType` is a function type -/
   private partial def processImplicitArg (argName : Name) : M Expr := do
     if (← read).explicit then
-      processExplictArg argName
+      processExplicitArg argName
     else
       addImplicitArg argName

@@ -624,7 +634,7 @@ mutual
   This method assumes `fType` is a function type -/
   private partial def processStrictImplicitArg (argName : Name) : M Expr := do
     if (← read).explicit then
-      processExplictArg argName
+      processExplicitArg argName
     else if (← hasArgsToProcess) then
       addImplicitArg argName
     else
@@ -643,7 +653,7 @@ mutual
         addNewArg argName arg
         main
       else
-        processExplictArg argName
+        processExplicitArg argName
     else
       let arg ← mkFreshExprMVar (← getArgExpectedType) MetavarKind.synthetic
       addInstMVar arg.mvarId!
@@ -668,7 +678,7 @@ mutual
       | .implicit => processImplicitArg binderName
       | .instImplicit => processInstImplicitArg binderName
      | .strictImplicit => processStrictImplicitArg binderName
-      | _ => processExplictArg binderName
+      | _ => processExplicitArg binderName
    else if (← hasArgsToProcess) then
      synthesizePendingAndNormalizeFunType
      main
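The `Except.ok tacticSyntax` branch above handles auto-bound tactic arguments (`autoParam` default values discharged by a tactic). A hedged illustration of the behavior being improved, with a hypothetical declaration (the change makes errors from the synthesized `by` block point at the application site instead of position (1,0)):

```lean
-- Hypothetical example: `h` is an auto-param solved by the `by decide` tactic block
-- that `quoteAutoTactic` erased position information from.
def pick (n : Nat) (h : n > 0 := by decide) : Nat := n

#eval pick 3  -- auto-param discharged silently; a failure would now be reported here
```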
@@ -502,6 +502,16 @@ def elabRunMeta : CommandElab := fun stx =>
     addDocString declName (← getDocStringText doc)
   | _ => throwUnsupportedSyntax

+@[builtin_command_elab Lean.Parser.Command.include] def elabInclude : CommandElab
+  | `(Lean.Parser.Command.include| include $ids*) => do
+    let vars := (← getScope).varDecls.concatMap getBracketedBinderIds
+    for id in ids do
+      unless vars.contains id.getId do
+        throwError "invalid 'include', variable '{id}' has not been declared in the current scope"
+    modifyScope fun sc =>
+      { sc with includedVars := sc.includedVars ++ ids.toList.map (·.getId) }
+  | _ => throwUnsupportedSyntax
+
 @[builtin_command_elab Parser.Command.exit] def elabExit : CommandElab := fun _ =>
   logWarning "using 'exit' to interrupt Lean"
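The new `include` elaborator above records the listed section variables in `Scope.includedVars` so later declarations must bring them into scope even when unreferenced. A small usage sketch (illustrative, not from the diff):

```lean
section
variable (p : Prop) (hp : p)

-- Force `hp` into the declarations below, even where heuristics would drop it.
include hp

theorem p_holds : p := hp
end
```

Note the error path: `include q` for a `q` never declared with `variable` fails with the "has not been declared in the current scope" message shown in the hunk.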
@@ -257,31 +257,52 @@ partial def hasCDot : Syntax → Bool
 Return `some` if it succeeded expanding `·` notation occurring in
 the given syntax. Otherwise, return `none`.
 Examples:
-- `· + 1` => `fun _a_1 => _a_1 + 1`
-- `f · · b` => `fun _a_1 _a_2 => f _a_1 _a_2 b` -/
+- `· + 1` => `fun x => x + 1`
+- `f · · b` => `fun x1 x2 => f x1 x2 b` -/
 partial def expandCDot? (stx : Term) : MacroM (Option Term) := do
   if hasCDot stx then
-    let (newStx, binders) ← (go stx).run #[]
-    `(fun $binders* => $(⟨newStx⟩))
+    withFreshMacroScope do
+      let mut (newStx, binders) ← (go stx).run #[]
+      if binders.size == 1 then
+        -- It is nicer using `x` over `x1` if there's only a single binder.
+        let x1 := binders[0]!
+        let x := mkIdentFrom x1 (← MonadQuotation.addMacroScope `x) (canonical := true)
+        binders := binders.set! 0 x
+        newStx ← newStx.replaceM fun s => pure (if s == x1 then x else none)
+      `(fun $binders* => $(⟨newStx⟩))
   else
     pure none
where
  /--
-  Auxiliary function for expanding the `·` notation.
-  The extra state `Array Syntax` contains the new binder names.
-  If `stx` is a `·`, we create a fresh identifier, store in the
-  extra state, and return it. Otherwise, we just return `stx`. -/
+  Auxiliary function for expanding the `·` notation.
+  The extra state `Array Syntax` contains the new binder names.
+  If `stx` is a `·`, we create a fresh identifier, store it in the
+  extra state, and return it. Otherwise, we just return `stx`.
+  -/
  go : Syntax → StateT (Array Ident) MacroM Syntax
-  | stx@`(($(_))) => pure stx
-  | stx@`(·) => withFreshMacroScope do
-    let id ← mkFreshIdent stx (canonical := true)
-    modify (·.push id)
-    pure id
-  | stx => match stx with
-    | .node _ k args => do
-      let args ← args.mapM go
-      return .node (.fromRef stx (canonical := true)) k args
-    | _ => pure stx
+  | stx@`(($(_))) => pure stx
+  | stx@`(·) => do
+    let name ← MonadQuotation.addMacroScope <| Name.mkSimple s!"x{(← get).size + 1}"
+    let id := mkIdentFrom stx name (canonical := true)
+    modify (fun s => s.push id)
+    pure id
+  | stx => match stx with
+    | .node _ k args => do
+      let args ←
+        if k == choiceKind then
+          if args.isEmpty then
+            return stx
+          let s ← get
+          let args' ← args.mapM (fun arg => go arg |>.run s)
+          let s' := args'[0]!.2
+          unless args'.all (fun (_, s'') => s''.size == s'.size) do
+            Macro.throwErrorAt stx "Ambiguous notation in cdot function has different numbers of '·' arguments in each alternative."
+          set s'
+          pure <| args'.map Prod.fst
+        else
+          args.mapM go
+      return .node (.fromRef stx (canonical := true)) k args
+    | _ => pure stx

 /--
 Helper method for elaborating terms such as `(.+.)` where a constant name is expected.
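The new naming scheme can be observed directly in the goal display; a quick illustration (the generated names are exactly the ones promised by the updated docstring):

```lean
#check (· + 1)   -- expands to `fun x => x + 1` (a single binder keeps the plain name `x`)
#check (· + ·)   -- expands to `fun x1 x2 => x1 + x2`
```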
@@ -42,7 +42,7 @@ def mkCalcTrans (result resultType step stepType : Expr) : MetaM (Expr × Expr)
     unless (← getCalcRelation? resultType).isSome do
       throwError "invalid 'calc' step, step result is not a relation{indentExpr resultType}"
     return (result, resultType)
-  | _ => throwError "invalid 'calc' step, failed to synthesize `Trans` instance{indentExpr selfType}\n{useDiagnosticMsg}"
+  | _ => throwError "invalid 'calc' step, failed to synthesize `Trans` instance{indentExpr selfType}{useDiagnosticMsg}"

 /--
 Adds a type annotation to a hole that occurs immediately at the beginning of the term.
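`mkCalcTrans` glues adjacent `calc` steps together via `Trans` instances; the reworded error above fires when no instance connects two step relations. A well-typed chain for contrast (standard library instances relate `≤` and `<` on `Nat`):

```lean
-- `Trans Nat.le Nat.lt Nat.lt` (or the generic order instances) makes this chain elaborate.
example (a b c : Nat) (h₁ : a ≤ b) (h₂ : b < c) : a < c :=
  calc a ≤ b := h₁
       _ < c := h₂
```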
@@ -51,6 +51,8 @@ structure Scope where
   even if they do not work with binders per se.
   -/
   varDecls : Array (TSyntax ``Parser.Term.bracketedBinder) := #[]
+  /-- `include`d section variable names -/
+  includedVars : List Name := []
   /--
   Globally unique internal identifiers for the `varDecls`.
   There is one identifier per variable introduced by the binders
@@ -201,12 +203,12 @@ def mkMessageAux (ctx : Context) (ref : Syntax) (msgData : MessageData) (severit

 private def addTraceAsMessagesCore (ctx : Context) (log : MessageLog) (traceState : TraceState) : MessageLog := Id.run do
   if traceState.traces.isEmpty then return log
-  let mut traces : HashMap (String.Pos × String.Pos) (Array MessageData) := ∅
+  let mut traces : Std.HashMap (String.Pos × String.Pos) (Array MessageData) := ∅
   for traceElem in traceState.traces do
     let ref := replaceRef traceElem.ref ctx.ref
     let pos := ref.getPos?.getD 0
     let endPos := ref.getTailPos?.getD pos
-    traces := traces.insert (pos, endPos) <| traces.findD (pos, endPos) #[] |>.push traceElem.msg
+    traces := traces.insert (pos, endPos) <| traces.getD (pos, endPos) #[] |>.push traceElem.msg
   let mut log := log
   let traces' := traces.toArray.qsort fun ((a, _), _) ((b, _), _) => a < b
   for ((pos, endPos), traceMsg) in traces' do
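The hunk above is part of the migration from the legacy `Lean.HashMap` to `Std.HashMap`, where `findD` becomes `getD`. The append-to-bucket pattern it uses can be sketched in isolation (assuming `Std.Data.HashMap` is available, as in current toolchains):

```lean
-- Insert `v` into the bucket for `k`, defaulting to an empty array,
-- mirroring the `traces.getD (pos, endPos) #[] |>.push` idiom above.
def pushAt (m : Std.HashMap Nat (Array String)) (k : Nat) (v : String) :
    Std.HashMap Nat (Array String) :=
  m.insert k ((m.getD k #[]).push v)
```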
@@ -81,10 +81,6 @@ end Frontend

 open Frontend

-def IO.processCommands (inputCtx : Parser.InputContext) (parserState : Parser.ModuleParserState) (commandState : Command.State) : IO State := do
-  let (_, s) ← (Frontend.processCommands.run { inputCtx := inputCtx }).run { commandState := commandState, parserState := parserState, cmdPos := parserState.pos }
-  pure s
-
 structure IncrementalState extends State where
   inputCtx : Parser.InputContext
   initialSnap : Language.Lean.CommandParsedSnapshot
@@ -92,12 +88,10 @@ deriving Nonempty

 open Language in
 /--
-Variant of `IO.processCommands` that uses the new Lean language processor implementation for
-potential incremental reuse. Pass in result of a previous invocation done with the same state
-(but usually different input context) to allow for reuse.
+Variant of `IO.processCommands` that allows for potential incremental reuse. Pass in the result of a
+previous invocation done with the same state (but usually different input context) to allow for
+reuse.
 -/
--- `IO.processCommands` can be reimplemented on top of this as soon as the additional tasks speed up
--- things instead of slowing them down
 partial def IO.processCommandsIncrementally (inputCtx : Parser.InputContext)
     (parserState : Parser.ModuleParserState) (commandState : Command.State)
     (old? : Option IncrementalState) :
@@ -110,7 +104,7 @@ where
     let snap := t.get
     let commands := commands.push snap.data.stx
     if let some next := snap.nextCmdSnap? then
-      go initialSnap next commands
+      go initialSnap next.task commands
    else
      -- Opting into reuse also enables incremental reporting, so make sure to collect messages from
      -- all snapshots
@@ -126,6 +120,11 @@ where
      inputCtx, initialSnap, commands
    }

+def IO.processCommands (inputCtx : Parser.InputContext) (parserState : Parser.ModuleParserState)
+    (commandState : Command.State) : IO State := do
+  let st ← IO.processCommandsIncrementally inputCtx parserState commandState none
+  return st.toState
+
 def process (input : String) (env : Environment) (opts : Options) (fileName : Option String := none) : IO (Environment × MessageLog) := do
   let fileName := fileName.getD "<input>"
   let inputCtx := Parser.mkInputContext input fileName
@@ -144,62 +143,31 @@ def runFrontend
     : IO (Environment × Bool) := do
   let startTime := (← IO.monoNanosNow).toFloat / 1000000000
   let inputCtx := Parser.mkInputContext input fileName
-  if true then
-    -- Temporarily keep alive old cmdline driver for the Lean language so that we don't pay the
-    -- overhead of passing the environment between snapshots until we actually make good use of it
-    -- outside the server
-    let (header, parserState, messages) ← Parser.parseHeader inputCtx
-    -- allow `env` to be leaked, which would live until the end of the process anyway
-    let (env, messages) ← processHeader (leakEnv := true) header opts messages inputCtx trustLevel
-    let env := env.setMainModule mainModuleName
-    let mut commandState := Command.mkState env messages opts
-    let elabStartTime := (← IO.monoNanosNow).toFloat / 1000000000
-
-    if ileanFileName?.isSome then
-      -- Collect InfoTrees so we can later extract and export their info to the ilean file
-      commandState := { commandState with infoState.enabled := true }
-
-    let s ← IO.processCommands inputCtx parserState commandState
-    Language.reportMessages s.commandState.messages opts jsonOutput
-
-    if let some ileanFileName := ileanFileName? then
-      let trees := s.commandState.infoState.trees.toArray
-      let references ←
-        Lean.Server.findModuleRefs inputCtx.fileMap trees (localVars := false) |>.toLspModuleRefs
-      let ilean := { module := mainModuleName, references : Lean.Server.Ilean }
-      IO.FS.writeFile ileanFileName $ Json.compress $ toJson ilean
-
-    if let some out := trace.profiler.output.get? opts then
-      let traceState := s.commandState.traceState
-      -- importing does not happen in an elaboration monad, add now
-      let traceState := { traceState with
-        traces := #[{
-          ref := .missing,
-          msg := .trace { cls := `Import, startTime, stopTime := elabStartTime }
-            (.ofFormat "importing") #[]
-        }].toPArray' ++ traceState.traces
-      }
-      let profile ← Firefox.Profile.export mainModuleName.toString startTime traceState opts
-      IO.FS.writeFile ⟨out⟩ <| Json.compress <| toJson profile
-
-    return (s.commandState.env, !s.commandState.messages.hasErrors)
-
   let opts := Language.Lean.internal.minimalSnapshots.set opts true
   let ctx := { inputCtx with }
   let processor := Language.Lean.process
   let snap ← processor (fun _ => pure <| .ok { mainModuleName, opts, trustLevel }) none ctx
   let snaps := Language.toSnapshotTree snap
   snaps.runAndReport opts jsonOutput

   if let some ileanFileName := ileanFileName? then
     let trees := snaps.getAll.concatMap (match ·.infoTree? with | some t => #[t] | _ => #[])
     let references := Lean.Server.findModuleRefs inputCtx.fileMap trees (localVars := false)
     let ilean := { module := mainModuleName, references := ← references.toLspModuleRefs : Lean.Server.Ilean }
     IO.FS.writeFile ileanFileName $ Json.compress $ toJson ilean

-  let hasErrors := snaps.getAll.any (·.diagnostics.msgLog.hasErrors)
-  -- TODO: remove default when reworking cmdline interface in Lean; currently the only case
-  -- where we use the environment despite errors in the file is `--stats`
-  let env := Language.Lean.waitForFinalEnv? snap |>.getD (← mkEmptyEnvironment)
-  pure (env, !hasErrors)
+  let some cmdState := Language.Lean.waitForFinalCmdState? snap
+    | return (← mkEmptyEnvironment, false)
+
+  if let some out := trace.profiler.output.get? opts then
+    let traceState := cmdState.traceState
+    let profile ← Firefox.Profile.export mainModuleName.toString startTime traceState opts
+    IO.FS.writeFile ⟨out⟩ <| Json.compress <| toJson profile
+
+  let hasErrors := snaps.getAll.any (·.diagnostics.msgLog.hasErrors)
+  pure (cmdState.env, !hasErrors)

 end Lean.Elab
@@ -630,7 +630,7 @@ private def replaceIndFVarsWithConsts (views : Array InductiveView) (indFVars :
   let type := type.replace fun e =>
     if !e.isFVar then
       none
-    else match indFVar2Const.find? e with
+    else match indFVar2Const[e]? with
     | none => none
     | some c => mkAppN c (params.extract 0 numVars)
   instantiateMVars (← mkForallFVars params type)

@@ -425,7 +425,7 @@ private def applyRefMap (e : Expr) (map : ExprMap Expr) : Expr :=
   e.replace fun e =>
     match patternWithRef? e with
     | some _ => some e -- stop `e` already has annotation
-    | none => match map.find? e with
+    | none => match map[e]? with
      | some eWithRef => some eWithRef -- stop `e` found annotation
      | none => none -- continue
@@ -323,7 +323,49 @@ private def declValToTerminationHint (declVal : Syntax) : TermElabM TerminationH
|
||||
else
|
||||
return .none
|
||||
|
||||
private def elabFunValues (headers : Array DefViewElabHeader) : TermElabM (Array Expr) :=
|
||||
def instantiateMVarsProfiling (e : Expr) : MetaM Expr := do
|
||||
profileitM Exception s!"instantiate metavars" (← getOptions) do
|
||||
instantiateMVars e
|
||||
|
||||
/--
|
||||
Runs `k` with a restricted local context where only section variables from `vars` are included that
|
||||
* are directly referenced in any `headers`,
|
||||
* are included in `includedVars` (via the `include` command),
|
||||
* are directly referenced in any variable included by these rules, OR
|
||||
* are instance-implicit variables that only reference section variables included by these rules.
|
||||
-/
|
||||
private def withHeaderSecVars {α} (vars : Array Expr) (includedVars : List Name) (headers : Array DefViewElabHeader)
|
||||
(k : Array Expr → TermElabM α) : TermElabM α := do
|
||||
let (_, used) ← collectUsed.run {}
|
||||
let (lctx, localInsts, vars) ← removeUnused vars used
|
||||
withLCtx lctx localInsts <| k vars
|
||||
where
|
||||
collectUsed : StateRefT CollectFVars.State MetaM Unit := do
|
||||
-- directly referenced in headers
|
||||
headers.forM (·.type.collectFVars)
|
||||
-- included by `include`
|
||||
vars.forM fun var => do
|
||||
let ldecl ← getFVarLocalDecl var
|
||||
if includedVars.contains ldecl.userName then
|
||||
modify (·.add ldecl.fvarId)
|
||||
-- transitively referenced
|
||||
get >>= (·.addDependencies) >>= set
|
||||
-- instances (`addDependencies` unnecessary as by definition they may only reference variables
|
||||
-- already included)
|
||||
vars.forM fun var => do
|
||||
let ldecl ← getFVarLocalDecl var
|
||||
let st ← get
|
||||
if ldecl.binderInfo.isInstImplicit && (← getFVars ldecl.type).all st.fvarSet.contains then
|
||||
modify (·.add ldecl.fvarId)
|
||||
getFVars (e : Expr) : MetaM (Array FVarId) :=
|
||||
(·.2.fvarIds) <$> e.collectFVars.run {}
|
||||
|
||||
register_builtin_option deprecated.oldSectionVars : Bool := {
|
||||
defValue := false
|
||||
descr := "re-enable deprecated behavior of including exactly the section variables used in a declaration"
|
||||
}
|
||||
|
||||
private def elabFunValues (headers : Array DefViewElabHeader) (vars : Array Expr) (includedVars : List Name) : TermElabM (Array Expr) :=
|
||||
headers.mapM fun header => do
|
||||
let mut reusableResult? := none
|
||||
if let some snap := header.bodySnap? then
|
||||
@@ -338,6 +380,7 @@ private def elabFunValues (headers : Array DefViewElabHeader) : TermElabM (Array
|
||||
withReuseContext header.value do
|
||||
withDeclName header.declName <| withLevelNames header.levelNames do
|
||||
let valStx ← liftMacroM <| declValToTerm header.value
|
||||
(if header.kind.isTheorem && !deprecated.oldSectionVars.get (← getOptions) then withHeaderSecVars vars includedVars #[header] else fun x => x #[]) fun vars => do
|
||||
forallBoundedTelescope header.type header.numParams fun xs type => do
|
||||
-- Add new info nodes for new fvars. The server will detect all fvars of a binder by the binder's source location.
|
||||
for i in [0:header.binderIds.size] do
|
||||
@@ -348,8 +391,21 @@ private def elabFunValues (headers : Array DefViewElabHeader) : TermElabM (Array
|
||||
elabTermEnsuringType valStx type <* synthesizeSyntheticMVarsNoPostponing
|
||||
-- NOTE: without this `instantiatedMVars`, `mkLambdaFVars` may leave around a redex that
|
||||
-- leads to more section variables being included than necessary
|
||||
let val ← instantiateMVars val
|
||||
mkLambdaFVars xs val
|
||||
let val ← instantiateMVarsProfiling val
|
||||
let val ← mkLambdaFVars xs val
|
||||
unless header.type.hasSorry || val.hasSorry do
|
||||
for var in vars do
|
||||
unless header.type.containsFVar var.fvarId! ||
|
||||
val.containsFVar var.fvarId! ||
|
||||
(← vars.anyM (fun v => return (← v.fvarId!.getType).containsFVar var.fvarId!)) do
|
||||
let varDecl ← var.fvarId!.getDecl
|
||||
let var := if varDecl.userName.hasMacroScopes && varDecl.binderInfo.isInstImplicit then
|
||||
m!"[{varDecl.type}]".group
|
||||
else
|
||||
var
|
||||
logWarningAt header.ref m!"included section variable '{var}' is not used in \
|
||||
'{header.declName}', consider excluding it"
|
||||
return val
|
||||
if let some snap := header.bodySnap? then
|
||||
snap.new.resolve <| some {
|
||||
diagnostics :=
|
||||
@@ -389,7 +445,7 @@ private def instantiateMVarsAtHeader (header : DefViewElabHeader) : TermElabM De
|
||||
|
||||
private def instantiateMVarsAtLetRecToLift (toLift : LetRecToLift) : TermElabM LetRecToLift := do
let type ← instantiateMVars toLift.type
let val ← instantiateMVars toLift.val
let val ← instantiateMVarsProfiling toLift.val
pure { toLift with type, val }

private def typeHasRecFun (type : Expr) (funFVars : Array Expr) (letRecsToLift : List LetRecToLift) : Option FVarId :=
@@ -597,7 +653,7 @@ private def pickMaxFVar? (lctx : LocalContext) (fvarIds : Array FVarId) : Option
fvarIds.getMax? fun fvarId₁ fvarId₂ => (lctx.get! fvarId₁).index < (lctx.get! fvarId₂).index

private def preprocess (e : Expr) : TermElabM Expr := do
let e ← instantiateMVars e
let e ← instantiateMVarsProfiling e
-- which let-decls are dependent. We say a let-decl is dependent if its lambda abstraction is type incorrect.
Meta.check e
pure e
@@ -708,7 +764,7 @@ private def mkLetRecClosures (sectionVars : Array Expr) (mainFVarIds : Array FVa
-- This can happen when this particular let-rec has nested let-rec that have been resolved in previous iterations.
-- This code relies on the fact that nested let-recs occur before the outer most let-recs at `letRecsToLift`.
-- Unresolved nested let-recs appear as metavariables before they are resolved. See `assignExprMVar` at `mkLetRecClosureFor`
let valNew ← instantiateMVars letRecsToLift[i]!.val
let valNew ← instantiateMVarsProfiling letRecsToLift[i]!.val
letRecsToLift := letRecsToLift.modify i fun t => { t with val := valNew }
-- We have to recompute the `freeVarMap` in this case. This overhead should not be an issue in practice.
freeVarMap ← mkFreeVarMap sectionVars mainFVarIds recFVarIds letRecsToLift
@@ -821,10 +877,10 @@ def main (sectionVars : Array Expr) (mainHeaders : Array DefViewElabHeader) (mai
let letRecsToLift ← letRecsToLift.mapM fun toLift => withLCtx toLift.lctx toLift.localInstances do
Meta.check toLift.type
Meta.check toLift.val
return { toLift with val := (← instantiateMVars toLift.val), type := (← instantiateMVars toLift.type) }
return { toLift with val := (← instantiateMVarsProfiling toLift.val), type := (← instantiateMVars toLift.type) }
let letRecClosures ← mkLetRecClosures sectionVars mainFVarIds recFVarIds letRecsToLift
-- mkLetRecClosures assign metavariables that were placeholders for the lifted declarations.
let mainVals ← mainVals.mapM (instantiateMVars ·)
let mainVals ← mainVals.mapM (instantiateMVarsProfiling ·)
let mainHeaders ← mainHeaders.mapM instantiateMVarsAtHeader
let letRecClosures ← letRecClosures.mapM fun closure => do pure { closure with toLift := (← instantiateMVarsAtLetRecToLift closure.toLift) }
-- Replace fvarIds for functions being defined with closed terms
@@ -900,7 +956,7 @@ partial def checkForHiddenUnivLevels (allUserLevelNames : List Name) (preDefs :
for preDef in preDefs do
checkPreDef preDef

def elabMutualDef (vars : Array Expr) (views : Array DefView) : TermElabM Unit :=
def elabMutualDef (vars : Array Expr) (includedVars : List Name) (views : Array DefView) : TermElabM Unit :=
if isExample views then
withoutModifyingEnv do
-- save correct environment in info tree
@@ -921,9 +977,9 @@ where
addLocalVarInfo view.declId funFVar
let values ←
try
let values ← elabFunValues headers
let values ← elabFunValues headers vars includedVars
Term.synthesizeSyntheticMVarsNoPostponing
values.mapM (instantiateMVars ·)
values.mapM (instantiateMVarsProfiling ·)
catch ex =>
logException ex
headers.mapM fun header => mkSorry header.type (synthetic := true)
@@ -931,7 +987,7 @@ where
let letRecsToLift ← getLetRecsToLift
let letRecsToLift ← letRecsToLift.mapM instantiateMVarsAtLetRecToLift
checkLetRecsToLiftTypes funFVars letRecsToLift
withUsed vars headers values letRecsToLift fun vars => do
(if headers.all (·.kind.isTheorem) && !deprecated.oldSectionVars.get (← getOptions) then withHeaderSecVars vars includedVars headers else withUsed vars headers values letRecsToLift) fun vars => do
let preDefs ← MutualClosure.main vars headers funFVars values letRecsToLift
for preDef in preDefs do
trace[Elab.definition] "{preDef.declName} : {preDef.type} :=\n{preDef.value}"
@@ -1002,7 +1058,8 @@ def elabMutualDef (ds : Array Syntax) : CommandElabM Unit := do
if let some snap := snap? then
-- no non-fatal diagnostics at this point
snap.new.resolve <| .ofTyped { defs, diagnostics := .empty : DefsParsedSnapshot }
runTermElabM fun vars => Term.elabMutualDef vars views
let includedVars := (← getScope).includedVars
runTermElabM fun vars => Term.elabMutualDef vars includedVars views

builtin_initialize
registerTraceClass `Elab.definition.mkClosure

@@ -7,6 +7,9 @@ prelude
import Init.ShareCommon
import Lean.Compiler.NoncomputableAttr
import Lean.Util.CollectLevelParams
import Lean.Util.NumObjs
import Lean.Util.NumApps
import Lean.PrettyPrinter
import Lean.Meta.AbstractNestedProofs
import Lean.Meta.ForEachExpr
import Lean.Elab.RecAppSyntax
@@ -17,7 +20,6 @@ namespace Lean.Elab
open Meta
open Term

/--
A (potentially recursive) definition.
The elaborator converts it into Kernel definitions using many different strategies.
@@ -98,15 +100,33 @@ private def compileDecl (decl : Declaration) : TermElabM Bool := do
throw ex
return true

register_builtin_option diagnostics.threshold.proofSize : Nat := {
defValue := 16384
group := "diagnostics"
descr := "only display proof statistics when proof has at least this number of terms"
}

private def reportTheoremDiag (d : TheoremVal) : TermElabM Unit := do
if (← isDiagnosticsEnabled) then
let proofSize ← d.value.numObjs
if proofSize > diagnostics.threshold.proofSize.get (← getOptions) then
let sizeMsg := MessageData.trace { cls := `size } m!"{proofSize}" #[]
let constOccs ← d.value.numApps (threshold := diagnostics.threshold.get (← getOptions))
let constOccsMsg ← constOccs.mapM fun (declName, numOccs) => return MessageData.trace { cls := `occs } m!"{MessageData.ofConst (← mkConstWithLevelParams declName)} ↦ {numOccs}" #[]
-- let info
logInfo <| MessageData.trace { cls := `theorem } m!"{d.name}" (#[sizeMsg] ++ constOccsMsg)

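The `reportTheoremDiag` hunk above follows a common pattern: read a registered option from the current `Options` and only emit diagnostics past a threshold. A minimal sketch of that pattern (illustrative only — `myThreshold` and `reportIfLarge` are made-up names, and `register_builtin_option` is only available when building Lean itself):

```lean
register_builtin_option myThreshold : Nat := {  -- hypothetical option
  defValue := 100
  group := "diagnostics"
  descr := "illustrative threshold"
}

def reportIfLarge (size : Nat) : TermElabM Unit := do
  -- `opt.get (← getOptions)` reads the current value of a registered option
  if size > myThreshold.get (← getOptions) then
    logInfo m!"size {size} exceeds threshold"
```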
private def addNonRecAux (preDef : PreDefinition) (compile : Bool) (all : List Name) (applyAttrAfterCompilation := true) : TermElabM Unit :=
withRef preDef.ref do
let preDef ← abstractNestedProofs preDef
let decl ←
match preDef.kind with
| DefKind.«theorem» =>
pure <| Declaration.thmDecl {
let d := {
name := preDef.declName, levelParams := preDef.levelParams, type := preDef.type, value := preDef.value, all
}
reportTheoremDiag d
pure <| Declaration.thmDecl d
| DefKind.«opaque» =>
pure <| Declaration.opaqueDecl {
name := preDef.declName, levelParams := preDef.levelParams, type := preDef.type, value := preDef.value
@@ -144,8 +164,11 @@ def addNonRec (preDef : PreDefinition) (applyAttrAfterCompilation := true) (all
/--
Eliminate recursive application annotations containing syntax. These annotations are used by the well-founded recursion module
to produce better error messages. -/
def eraseRecAppSyntaxExpr (e : Expr) : CoreM Expr :=
Core.transform e (post := fun e => pure <| TransformStep.done <| if (getRecAppSyntax? e).isSome then e.mdataExpr! else e)
def eraseRecAppSyntaxExpr (e : Expr) : CoreM Expr := do
if e.find? hasRecAppSyntax |>.isSome then
Core.transform e (post := fun e => pure <| TransformStep.done <| if hasRecAppSyntax e then e.mdataExpr! else e)
else
return e

def eraseRecAppSyntax (preDef : PreDefinition) : CoreM PreDefinition :=
return { preDef with value := (← eraseRecAppSyntaxExpr preDef.value) }

@@ -69,12 +69,15 @@ private def ensureNoUnassignedMVarsAtPreDef (preDef : PreDefinition) : TermElabM
This method beta-reduces them to make sure they can be eliminated by the well-founded recursion module. -/
private def betaReduceLetRecApps (preDefs : Array PreDefinition) : MetaM (Array PreDefinition) :=
preDefs.mapM fun preDef => do
let value ← Core.transform preDef.value fun e => do
if e.isApp && e.getAppFn.isLambda && e.getAppArgs.all fun arg => arg.getAppFn.isConst && preDefs.any fun preDef => preDef.declName == arg.getAppFn.constName! then
return .visit e.headBeta
else
return .continue
return { preDef with value }
if preDef.value.find? (fun e => e.isConst && preDefs.any fun preDef => preDef.declName == e.constName!) |>.isSome then
let value ← Core.transform preDef.value fun e => do
if e.isApp && e.getAppFn.isLambda && e.getAppArgs.all fun arg => arg.getAppFn.isConst && preDefs.any fun preDef => preDef.declName == arg.getAppFn.constName! then
return .visit e.headBeta
else
return .continue
return { preDef with value }
else
return preDef

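The transformation above hinges on `Expr.headBeta`, which reduces a top-level beta redex. A minimal sketch (illustrative only, not part of the diff):

```lean
open Lean

/-- `headBeta` reduces `(fun x => x) a` to `a`; `betaReduceLetRecApps`
applies the same step to saturated applications of let-rec lambdas. -/
def headBetaExample (a : Expr) : Expr :=
  (Expr.app (.lam `x (.sort .zero) (.bvar 0) .default) a).headBeta
```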
private def addAsAxioms (preDefs : Array PreDefinition) : TermElabM Unit := do
for preDef in preDefs do

@@ -146,7 +146,7 @@ See issue #837 for an example where we can show termination using the index of a
we don't get the desired definitional equalities.
-/
def nonIndicesFirst (recArgInfos : Array RecArgInfo) : Array RecArgInfo := Id.run do
let mut indicesPos : HashSet Nat := {}
let mut indicesPos : Std.HashSet Nat := {}
for recArgInfo in recArgInfos do
for pos in recArgInfo.indicesPos do
indicesPos := indicesPos.insert pos

@@ -596,7 +596,7 @@ private partial def compileStxMatch (discrs : List Term) (alts : List Alt) : Ter
`(have __discr := $discr; $stx)
| _, _ => unreachable!

abbrev IdxSet := HashSet Nat
abbrev IdxSet := Std.HashSet Nat

private partial def hasNoErrorIfUnused : Syntax → Bool
| `(no_error_if_unused% $_) => true

@@ -11,27 +11,35 @@ namespace Lean
private def recAppKey := `_recApp

/--
We store the syntax at recursive applications to be able to generate better error messages
when performing well-founded and structural recursion.
We store the syntax at recursive applications to be able to generate better error messages
when performing well-founded and structural recursion.
-/
def mkRecAppWithSyntax (e : Expr) (stx : Syntax) : Expr :=
mkMData (KVMap.empty.insert recAppKey (DataValue.ofSyntax stx)) e
mkMData (KVMap.empty.insert recAppKey (.ofSyntax stx)) e

/--
Retrieve (if available) the syntax object attached to a recursive application.
Retrieve (if available) the syntax object attached to a recursive application.
-/
def getRecAppSyntax? (e : Expr) : Option Syntax :=
match e with
| Expr.mdata d _ =>
| .mdata d _ =>
match d.find recAppKey with
| some (DataValue.ofSyntax stx) => some stx
| _ => none
| _ => none

/--
Checks if the `MData` is for a recursive application.
Checks if the `MData` is for a recursive application.
-/
def MData.isRecApp (d : MData) : Bool :=
d.contains recAppKey

/--
Return `true` if `getRecAppSyntax? e` is a `some`.
-/
def hasRecAppSyntax (e : Expr) : Bool :=
match e with
| .mdata d _ => d.isRecApp
| _ => false

end Lean

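The `_recApp` metadata added by this hunk round-trips through its accessors; a minimal sketch (illustrative only, not part of the diff):

```lean
open Lean

/-- Attaching syntax to an expression and reading it back:
`getRecAppSyntax?` recovers exactly the `stx` stored by `mkRecAppWithSyntax`,
and `hasRecAppSyntax` reports its presence. -/
def roundTrip (e : Expr) (stx : Syntax) : Option Syntax :=
  let annotated := mkRecAppWithSyntax e stx
  if hasRecAppSyntax annotated then getRecAppSyntax? annotated else none
```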
@@ -445,13 +445,13 @@ private def expandParentFields (s : Struct) : TermElabM Struct := do
| _ => throwErrorAt ref "failed to access field '{fieldName}' in parent structure"
| _ => return field

private abbrev FieldMap := HashMap Name Fields
private abbrev FieldMap := Std.HashMap Name Fields

private def mkFieldMap (fields : Fields) : TermElabM FieldMap :=
fields.foldlM (init := {}) fun fieldMap field =>
match field.lhs with
| .fieldName _ fieldName :: _ =>
match fieldMap.find? fieldName with
match fieldMap[fieldName]? with
| some (prevField::restFields) =>
if field.isSimple || prevField.isSimple then
throwErrorAt field.ref "field '{fieldName}' has already been specified"
@@ -677,6 +677,10 @@ private partial def elabStruct (s : Struct) (expectedType? : Option Expr) : Term
| .error err => throwError err
| .ok tacticSyntax =>
let stx ← `(by $tacticSyntax)
-- See comment in `Lean.Elab.Term.ElabAppArgs.processExplicitArg` about `tacticSyntax`.
-- We add info to get reliable positions for messages from evaluating the tactic script.
let info := field.ref.getHeadInfo
let stx := stx.raw.rewriteBottomUp (·.setInfo info)
cont (← elabTermEnsuringType stx (d.getArg! 0).consumeTypeAnnotations) field
| _ =>
if bi == .instImplicit then

@@ -246,7 +246,7 @@ private def getSomeSyntheticMVarsRef : TermElabM Syntax := do
private def throwStuckAtUniverseCnstr : TermElabM Unit := do
-- This code assumes `entries` is not empty. Note that `processPostponed` uses `exceptionOnFailure` to guarantee this property
let entries ← getPostponed
let mut found : HashSet (Level × Level) := {}
let mut found : Std.HashSet (Level × Level) := {}
let mut uniqueEntries := #[]
for entry in entries do
let mut lhs := entry.lhs

@@ -8,8 +8,6 @@ import Init.Omega.Constraint
import Lean.Elab.Tactic.Omega.OmegaM
import Lean.Elab.Tactic.Omega.MinNatAbs

open Lean (HashMap HashSet)

namespace Lean.Elab.Tactic.Omega

initialize Lean.registerTraceClass `omega
@@ -167,11 +165,11 @@ structure Problem where
/-- The number of variables in the problem. -/
numVars : Nat := 0
/-- The current constraints, indexed by their coefficients. -/
constraints : HashMap Coeffs Fact := ∅
constraints : Std.HashMap Coeffs Fact := ∅
/--
The coefficients for which `constraints` contains an exact constraint (i.e. an equality).
-/
equalities : HashSet Coeffs := ∅
equalities : Std.HashSet Coeffs := ∅
/--
Equations that have already been used to eliminate variables,
along with the variable which was removed, and its coefficient (either `1` or `-1`).
@@ -251,7 +249,7 @@ combining it with any existing constraints for the same coefficients.
def addConstraint (p : Problem) : Fact → Problem
| f@⟨x, s, j⟩ =>
if p.possible then
match p.constraints.find? x with
match p.constraints[x]? with
| none =>
match s with
| .trivial => p
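The `find?` → `[·]?` change in this hunk is the recurring pattern of the `Std.HashMap` migration running through this diff. A minimal self-contained sketch (illustrative only; `lookupOr` is a made-up name):

```lean
open Std

/-- `Std.HashMap` exposes lookup as `m[k]?` (via `GetElem?`),
where `Lean.HashMap` used `m.find? k`. -/
def lookupOr (m : HashMap Nat String) (k : Nat) (default : String) : String :=
  match m[k]? with
  | some v => v
  | none => default
```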
@@ -313,7 +311,7 @@ After solving, the variable will have been eliminated from all constraints.
def solveEasyEquality (p : Problem) (c : Coeffs) : Problem :=
let i := c.findIdx? (·.natAbs = 1) |>.getD 0 -- findIdx? is always some
let sign := c.get i |> Int.sign
match p.constraints.find? c with
match p.constraints[c]? with
| some f =>
let init :=
{ assumptions := p.assumptions
@@ -335,7 +333,7 @@ After solving the easy equality,
the minimum lexicographic value of `(c.minNatAbs, c.maxNatAbs)` will have been reduced.
-/
def dealWithHardEquality (p : Problem) (c : Coeffs) : OmegaM Problem :=
match p.constraints.find? c with
match p.constraints[c]? with
| some ⟨_, ⟨some r, some r'⟩, j⟩ => do
let m := c.minNatAbs + 1
-- We have to store the valid value of the newly introduced variable in the atoms.
@@ -479,7 +477,7 @@ def fourierMotzkinData (p : Problem) : Array FourierMotzkinData := Id.run do
let n := p.numVars
let mut data : Array FourierMotzkinData :=
(List.range p.numVars).foldl (fun a i => a.push { var := i}) #[]
for (_, f@⟨xs, s, _⟩) in p.constraints.toList do -- We could make a forIn instance for HashMap
for (_, f@⟨xs, s, _⟩) in p.constraints do
for i in [0:n] do
let x := Coeffs.get xs i
data := data.modify i fun d =>

@@ -58,7 +58,7 @@ structure MetaProblem where
-/
disjunctions : List Expr := []
/-- Facts which have already been processed; we keep these to avoid duplicates. -/
processedFacts : HashSet Expr := ∅
processedFacts : Std.HashSet Expr := ∅

/-- Construct the `rfl` proof that `lc.eval atoms = e`. -/
def mkEvalRflProof (e : Expr) (lc : LinearCombo) : OmegaM Expr := do
@@ -80,7 +80,7 @@ def mkCoordinateEvalAtomsEq (e : Expr) (n : Nat) : OmegaM Expr := do
mkEqTrans eq (← mkEqSymm (mkApp2 (.const ``LinearCombo.coordinate_eval []) n atoms))

/-- Construct the linear combination (and its associated proof and new facts) for an atom. -/
def mkAtomLinearCombo (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
def mkAtomLinearCombo (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
let (n, facts) ← lookup e
return ⟨LinearCombo.coordinate n, mkCoordinateEvalAtomsEq e n, facts.getD ∅⟩

@@ -94,9 +94,9 @@ Gives a small (10%) speedup in testing.
I tried using a pointer based cache,
but there was never enough subexpression sharing to make it effective.
-/
partial def asLinearCombo (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
partial def asLinearCombo (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
let cache ← get
match cache.find? e with
match cache.get? e with
| some (lc, prf) =>
trace[omega] "Found in cache: {e}"
return (lc, prf, ∅)
@@ -120,7 +120,7 @@ We also transform the expression as we descend into it:
* pushing coercions: `↑(x + y)`, `↑(x * y)`, `↑(x / k)`, `↑(x % k)`, `↑k`
* unfolding `emod`: `x % k` → `x - x / k * k`
-/
partial def asLinearComboImpl (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
partial def asLinearComboImpl (e : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
trace[omega] "processing {e}"
match groundInt? e with
| some i =>
@@ -142,7 +142,7 @@ partial def asLinearComboImpl (e : Expr) : OmegaM (LinearCombo × OmegaM Expr ×
mkEqTrans
(← mkAppM ``Int.add_congr #[← prf₁, ← prf₂])
(← mkEqSymm add_eval)
pure (l₁ + l₂, prf, facts₁.merge facts₂)
pure (l₁ + l₂, prf, facts₁.union facts₂)
| (``HSub.hSub, #[_, _, _, _, e₁, e₂]) => do
let (l₁, prf₁, facts₁) ← asLinearCombo e₁
let (l₂, prf₂, facts₂) ← asLinearCombo e₂
@@ -152,7 +152,7 @@ partial def asLinearComboImpl (e : Expr) : OmegaM (LinearCombo × OmegaM Expr ×
mkEqTrans
(← mkAppM ``Int.sub_congr #[← prf₁, ← prf₂])
(← mkEqSymm sub_eval)
pure (l₁ - l₂, prf, facts₁.merge facts₂)
pure (l₁ - l₂, prf, facts₁.union facts₂)
| (``Neg.neg, #[_, _, e']) => do
let (l, prf, facts) ← asLinearCombo e'
let prf' : OmegaM Expr := do
@@ -178,7 +178,7 @@ partial def asLinearComboImpl (e : Expr) : OmegaM (LinearCombo × OmegaM Expr ×
mkEqTrans
(← mkAppM ``Int.mul_congr #[← xprf, ← yprf])
(← mkEqSymm mul_eval)
pure (some (LinearCombo.mul xl yl, prf, xfacts.merge yfacts), true)
pure (some (LinearCombo.mul xl yl, prf, xfacts.union yfacts), true)
else
pure (none, false)
match r? with
@@ -235,7 +235,7 @@ where
Apply a rewrite rule to an expression, and interpret the result as a `LinearCombo`.
(We're not rewriting any subexpressions here, just the top level, for efficiency.)
-/
rewrite (lhs rw : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
rewrite (lhs rw : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
trace[omega] "rewriting {lhs} via {rw} : {← inferType rw}"
match (← inferType rw).eq? with
| some (_, _lhs', rhs) =>
@@ -243,7 +243,7 @@ where
let prf' : OmegaM Expr := do mkEqTrans rw (← prf)
pure (lc, prf', facts)
| none => panic! "Invalid rewrite rule in 'asLinearCombo'"
handleNatCast (e i n : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
handleNatCast (e i n : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
match n with
| .fvar h =>
if let some v ← h.getValue? then
@@ -296,7 +296,7 @@ where
| (``Fin.val, #[n, x]) =>
handleFinVal e i n x
| _ => mkAtomLinearCombo e
handleFinVal (e i n x : Expr) : OmegaM (LinearCombo × OmegaM Expr × HashSet Expr) := do
handleFinVal (e i n x : Expr) : OmegaM (LinearCombo × OmegaM Expr × Std.HashSet Expr) := do
match x with
| .fvar h =>
if let some v ← h.getValue? then
@@ -342,7 +342,7 @@ We solve equalities as they are discovered, as this often results in an earlier
-/
def addIntEquality (p : MetaProblem) (h x : Expr) : OmegaM MetaProblem := do
let (lc, prf, facts) ← asLinearCombo x
let newFacts : HashSet Expr := facts.fold (init := ∅) fun s e =>
let newFacts : Std.HashSet Expr := facts.fold (init := ∅) fun s e =>
if p.processedFacts.contains e then s else s.insert e
trace[omega] "Adding proof of {lc} = 0"
pure <|
@@ -358,7 +358,7 @@ We solve equalities as they are discovered, as this often results in an earlier
-/
def addIntInequality (p : MetaProblem) (h y : Expr) : OmegaM MetaProblem := do
let (lc, prf, facts) ← asLinearCombo y
let newFacts : HashSet Expr := facts.fold (init := ∅) fun s e =>
let newFacts : Std.HashSet Expr := facts.fold (init := ∅) fun s e =>
if p.processedFacts.contains e then s else s.insert e
trace[omega] "Adding proof of {lc} ≥ 0"
pure <|
@@ -590,7 +590,7 @@ where

-- We sort the constraints; otherwise the order is dependent on details of the hashing
-- and this can cause test suite output churn
prettyConstraints (names : Array String) (constraints : HashMap Coeffs Fact) : String :=
prettyConstraints (names : Array String) (constraints : Std.HashMap Coeffs Fact) : String :=
constraints.toList
|>.toArray
|>.qsort (·.1 < ·.1)
@@ -615,7 +615,7 @@ where
(if Int.natAbs c = 1 then names[i]! else s!"{c.natAbs}*{names[i]!}"))
|> String.join

mentioned (atoms : Array Expr) (constraints : HashMap Coeffs Fact) : MetaM (Array Bool) := do
mentioned (atoms : Array Expr) (constraints : Std.HashMap Coeffs Fact) : MetaM (Array Bool) := do
let initMask := Array.mkArray atoms.size false
return constraints.fold (init := initMask) fun mask coeffs _ =>
coeffs.enum.foldl (init := mask) fun mask (i, c) =>

@@ -10,6 +10,8 @@ import Init.Omega.Logic
import Init.Data.BitVec.Basic
import Lean.Meta.AppBuilder
import Lean.Meta.Canonicalizer
import Std.Data.HashMap.Basic
import Std.Data.HashSet.Basic

/-!
# The `OmegaM` state monad.
@@ -52,7 +54,7 @@ structure Context where
/-- The internal state for the `OmegaM` monad, recording previously encountered atoms. -/
structure State where
/-- The atoms up-to-defeq encountered so far. -/
atoms : HashMap Expr Nat := {}
atoms : Std.HashMap Expr Nat := {}

/-- An intermediate layer in the `OmegaM` monad. -/
abbrev OmegaM' := StateRefT State (ReaderT Context CanonM)
@@ -60,7 +62,7 @@ abbrev OmegaM' := StateRefT State (ReaderT Context CanonM)
/--
Cache of expressions that have been visited, and their reflection as a linear combination.
-/
def Cache : Type := HashMap Expr (LinearCombo × OmegaM' Expr)
def Cache : Type := Std.HashMap Expr (LinearCombo × OmegaM' Expr)

/--
The `OmegaM` monad maintains two pieces of state:
@@ -71,7 +73,7 @@ abbrev OmegaM := StateRefT Cache OmegaM'

/-- Run a computation in the `OmegaM` monad, starting with no recorded atoms. -/
def OmegaM.run (m : OmegaM α) (cfg : OmegaConfig) : MetaM α :=
m.run' HashMap.empty |>.run' {} { cfg } |>.run'
m.run' Std.HashMap.empty |>.run' {} { cfg } |>.run'

/-- Retrieve the user-specified configuration options. -/
def cfg : OmegaM OmegaConfig := do pure (← read).cfg
@@ -162,11 +164,11 @@ def mkEqReflWithExpectedType (a b : Expr) : MetaM Expr := do
Analyzes a newly recorded atom,
returning a collection of interesting facts about it that should be added to the context.
-/
def analyzeAtom (e : Expr) : OmegaM (HashSet Expr) := do
def analyzeAtom (e : Expr) : OmegaM (Std.HashSet Expr) := do
match e.getAppFnArgs with
| (``Nat.cast, #[.const ``Int [], _, e']) =>
-- Casts of natural numbers are non-negative.
let mut r := HashSet.empty.insert (Expr.app (.const ``Int.ofNat_nonneg []) e')
let mut r := Std.HashSet.empty.insert (Expr.app (.const ``Int.ofNat_nonneg []) e')
match (← cfg).splitNatSub, e'.getAppFnArgs with
| true, (``HSub.hSub, #[_, _, _, _, a, b]) =>
-- `((a - b : Nat) : Int)` gives a dichotomy
@@ -188,7 +190,7 @@ def analyzeAtom (e : Expr) : OmegaM (HashSet Expr) := do
let ne_zero := mkApp3 (.const ``Ne [1]) (.const ``Int []) k (toExpr (0 : Int))
let pos := mkApp4 (.const ``LT.lt [0]) (.const ``Int []) (.const ``Int.instLTInt [])
(toExpr (0 : Int)) k
pure <| HashSet.empty.insert
pure <| Std.HashSet.empty.insert
(mkApp3 (.const ``Int.mul_ediv_self_le []) x k (← mkDecideProof ne_zero)) |>.insert
(mkApp3 (.const ``Int.lt_mul_ediv_self_add []) x k (← mkDecideProof pos))
| (``HMod.hMod, #[_, _, _, _, x, k]) =>
@@ -200,7 +202,7 @@ def analyzeAtom (e : Expr) : OmegaM (HashSet Expr) := do
let b_pos := mkApp4 (.const ``LT.lt [0]) (.const ``Int []) (.const ``Int.instLTInt [])
(toExpr (0 : Int)) b
let pow_pos := mkApp3 (.const ``Lean.Omega.Int.pos_pow_of_pos []) b exp (← mkDecideProof b_pos)
pure <| HashSet.empty.insert
pure <| Std.HashSet.empty.insert
(mkApp3 (.const ``Int.emod_nonneg []) x k
(mkApp3 (.const ``Int.ne_of_gt []) k (toExpr (0 : Int)) pow_pos)) |>.insert
(mkApp3 (.const ``Int.emod_lt_of_pos []) x k pow_pos)
@@ -214,7 +216,7 @@ def analyzeAtom (e : Expr) : OmegaM (HashSet Expr) := do
(toExpr (0 : Nat)) b
let pow_pos := mkApp3 (.const ``Nat.pos_pow_of_pos []) b exp (← mkDecideProof b_pos)
let cast_pos := mkApp2 (.const ``Int.ofNat_pos_of_pos []) k' pow_pos
pure <| HashSet.empty.insert
pure <| Std.HashSet.empty.insert
(mkApp3 (.const ``Int.emod_nonneg []) x k
(mkApp3 (.const ``Int.ne_of_gt []) k (toExpr (0 : Int)) cast_pos)) |>.insert
(mkApp3 (.const ``Int.emod_lt_of_pos []) x k cast_pos)
@@ -222,18 +224,18 @@ def analyzeAtom (e : Expr) : OmegaM (HashSet Expr) := do
| (``Nat.cast, #[.const ``Int [], _, x']) =>
-- Since we push coercions inside `%`, we need to record here that
-- `(x : Int) % (y : Int)` is non-negative.
pure <| HashSet.empty.insert (mkApp2 (.const ``Int.emod_ofNat_nonneg []) x' k)
pure <| Std.HashSet.empty.insert (mkApp2 (.const ``Int.emod_ofNat_nonneg []) x' k)
| _ => pure ∅
| _ => pure ∅
| (``Min.min, #[_, _, x, y]) =>
pure <| HashSet.empty.insert (mkApp2 (.const ``Int.min_le_left []) x y) |>.insert
pure <| Std.HashSet.empty.insert (mkApp2 (.const ``Int.min_le_left []) x y) |>.insert
(mkApp2 (.const ``Int.min_le_right []) x y)
| (``Max.max, #[_, _, x, y]) =>
pure <| HashSet.empty.insert (mkApp2 (.const ``Int.le_max_left []) x y) |>.insert
pure <| Std.HashSet.empty.insert (mkApp2 (.const ``Int.le_max_left []) x y) |>.insert
(mkApp2 (.const ``Int.le_max_right []) x y)
| (``ite, #[α, i, dec, t, e]) =>
if α == (.const ``Int []) then
pure <| HashSet.empty.insert <| mkApp5 (.const ``ite_disjunction [0]) α i dec t e
pure <| Std.HashSet.empty.insert <| mkApp5 (.const ``ite_disjunction [0]) α i dec t e
else
pure {}
| _ => pure ∅
@@ -248,10 +250,10 @@ Return its index, and, if it is new, a collection of interesting facts about the
* for each new atom of the form `((a - b : Nat) : Int)`, the fact:
`b ≤ a ∧ ((a - b : Nat) : Int) = a - b ∨ a < b ∧ ((a - b : Nat) : Int) = 0`
-/
def lookup (e : Expr) : OmegaM (Nat × Option (HashSet Expr)) := do
def lookup (e : Expr) : OmegaM (Nat × Option (Std.HashSet Expr)) := do
let c ← getThe State
let e ← canon e
match c.atoms.find? e with
match c.atoms[e]? with
| some i => return (i, none)
| none =>
trace[omega] "New atom: {e}"

@@ -996,7 +996,7 @@ def synthesizeInstMVarCore (instMVar : MVarId) (maxResultSize? : Option Nat := n
if (← read).ignoreTCFailures then
return false
else
throwError "failed to synthesize{indentExpr type}{extraErrorMsg}\n{useDiagnosticMsg}"
throwError "failed to synthesize{indentExpr type}{extraErrorMsg}{useDiagnosticMsg}"

def mkCoe (expectedType : Expr) (e : Expr) (f? : Option Expr := none) (errorMsgHeader? : Option String := none) : TermElabM Expr := do
withTraceNode `Elab.coe (fun _ => return m!"adding coercion for {e} : {← inferType e} =?= {expectedType}") do

@@ -7,8 +7,8 @@ prelude
|
||||
import Init.Control.StateRef
|
||||
import Init.Data.Array.BinSearch
|
||||
import Init.Data.Stream
|
||||
import Lean.Data.HashMap
|
||||
import Lean.ImportingFlag
|
||||
import Lean.Data.HashMap
|
||||
import Lean.Data.SMap
|
||||
import Lean.Declaration
|
||||
import Lean.LocalContext
|
||||
@@ -134,7 +134,7 @@ structure Environment where
|
||||
the field `constants`. These auxiliary constants are invisible to the Lean kernel and elaborator.
|
||||
Only the code generator uses them.
|
||||
-/
|
||||
const2ModIdx : HashMap Name ModuleIdx
|
||||
const2ModIdx : Std.HashMap Name ModuleIdx
|
||||
/--
|
||||
Mapping from constant name to `ConstantInfo`. It contains all constants (definitions, theorems, axioms, etc)
|
||||
that have been already type checked by the kernel.
|
||||
@@ -205,7 +205,7 @@ private def getTrustLevel (env : Environment) : UInt32 :=
|
||||
env.header.trustLevel
|
||||
|
||||
def getModuleIdxFor? (env : Environment) (declName : Name) : Option ModuleIdx :=
|
||||
env.const2ModIdx.find? declName
|
||||
env.const2ModIdx[declName]?
|
||||
|
||||
def isConstructor (env : Environment) (declName : Name) : Bool :=
|
||||
match env.find? declName with
|
||||
@@ -721,7 +721,7 @@ def writeModule (env : Environment) (fname : System.FilePath) : IO Unit := do
|
||||
Construct a mapping from persistent extension name to entension index at the array of persistent extensions.
|
||||
We only consider extensions starting with index `>= startingAt`.
|
||||
-/
|
||||
def mkExtNameMap (startingAt : Nat) : IO (HashMap Name Nat) := do
|
||||
def mkExtNameMap (startingAt : Nat) : IO (Std.HashMap Name Nat) := do
|
||||
let descrs ← persistentEnvExtensionsRef.get
|
||||
let mut result := {}
|
||||
for h : i in [startingAt : descrs.size] do
|
||||
@@ -742,7 +742,7 @@ private def setImportedEntries (env : Environment) (mods : Array ModuleData) (st
|
||||
have : modIdx < mods.size := h.upper
|
||||
let mod := mods[modIdx]
|
||||
for (extName, entries) in mod.entries do
|
||||
if let some entryIdx := extNameIdx.find? extName then
|
||||
if let some entryIdx := extNameIdx[extName]? then
|
||||
env := extDescrs[entryIdx]!.toEnvExtension.modifyState env fun s => { s with importedEntries := s.importedEntries.set! modIdx entries }
|
||||
return env

@@ -790,9 +790,9 @@ structure ImportState where
moduleData : Array ModuleData := #[]
regions : Array CompactedRegion := #[]

def throwAlreadyImported (s : ImportState) (const2ModIdx : HashMap Name ModuleIdx) (modIdx : Nat) (cname : Name) : IO α := do
def throwAlreadyImported (s : ImportState) (const2ModIdx : Std.HashMap Name ModuleIdx) (modIdx : Nat) (cname : Name) : IO α := do
let modName := s.moduleNames[modIdx]!
let constModName := s.moduleNames[const2ModIdx[cname].get!.toNat]!
let constModName := s.moduleNames[const2ModIdx[cname]!.toNat]!
throw <| IO.userError s!"import {modName} failed, environment already contains '{cname}' from {constModName}"

abbrev ImportStateM := StateRefT ImportState IO
@@ -856,21 +856,21 @@ def finalizeImport (s : ImportState) (imports : Array Import) (opts : Options) (
(leakEnv := false) : IO Environment := do
let numConsts := s.moduleData.foldl (init := 0) fun numConsts mod =>
numConsts + mod.constants.size + mod.extraConstNames.size
let mut const2ModIdx : HashMap Name ModuleIdx := mkHashMap (capacity := numConsts)
let mut constantMap : HashMap Name ConstantInfo := mkHashMap (capacity := numConsts)
let mut const2ModIdx : Std.HashMap Name ModuleIdx := Std.HashMap.empty (capacity := numConsts)
let mut constantMap : Std.HashMap Name ConstantInfo := Std.HashMap.empty (capacity := numConsts)
for h:modIdx in [0:s.moduleData.size] do
let mod := s.moduleData[modIdx]'h.upper
for cname in mod.constNames, cinfo in mod.constants do
match constantMap.insertIfNew cname cinfo with
| (constantMap', cinfoPrev?) =>
match constantMap.getThenInsertIfNew? cname cinfo with
| (cinfoPrev?, constantMap') =>
constantMap := constantMap'
if let some cinfoPrev := cinfoPrev? then
-- Recall that the map has not been modified when `cinfoPrev? = some _`.
unless equivInfo cinfoPrev cinfo do
throwAlreadyImported s const2ModIdx modIdx cname
const2ModIdx := const2ModIdx.insertIfNew cname modIdx |>.1
const2ModIdx := const2ModIdx.insertIfNew cname modIdx
for cname in mod.extraConstNames do
const2ModIdx := const2ModIdx.insertIfNew cname modIdx |>.1
const2ModIdx := const2ModIdx.insertIfNew cname modIdx
let constants : ConstMap := SMap.fromHashMap constantMap false
let exts ← mkInitialExtensionStates
let mut env : Environment := {
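The `getThenInsertIfNew?` call above performs the lookup and the conditional insertion in one pass, returning any previous value alongside the updated map. A standalone sketch of that conflict-detection pattern (`insertNoClash` is hypothetical, not from the patch):

```lean
-- Hypothetical sketch: insert `k ↦ v` unless `k` is present; report the old
-- value on a clash. The map is unchanged when a previous value is returned.
def insertNoClash (m : Std.HashMap Lean.Name Nat) (k : Lean.Name) (v : Nat) :
    Except String (Std.HashMap Lean.Name Nat) :=
  match m.getThenInsertIfNew? k v with
  | (some prev, _) => .error s!"'{k}' already mapped to {prev}"
  | (none, m')     => .ok m'
```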

@@ -936,7 +936,7 @@ builtin_initialize namespacesExt : SimplePersistentEnvExtension Name NameSSet
6.18% of the runtime is here. It was 9.31% before the `HashMap` optimization.
-/
let capacity := as.foldl (init := 0) fun r e => r + e.size
let map : HashMap Name Unit := mkHashMap capacity
let map : Std.HashMap Name Unit := Std.HashMap.empty capacity
let map := mkStateFromImportedEntries (fun map name => map.insert name ()) map as
SMap.fromHashMap map |>.switch
addEntryFn := fun s n => s.insert n
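The pattern above — populate a `Std.HashMap` eagerly, then convert it with `SMap.fromHashMap` and `switch` — can be sketched as follows; `mkNameSet` is a hypothetical illustration:

```lean
-- Hypothetical sketch: build a hash map first for speed, then freeze it into a
-- staged `SMap`; `switch` marks the imported stage as complete.
def mkNameSet (names : Array Lean.Name) : Lean.SMap Lean.Name Unit :=
  let map : Std.HashMap Lean.Name Unit :=
    names.foldl (init := {}) fun m n => m.insert n ()
  Lean.SMap.fromHashMap map |>.switch
```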

@@ -8,6 +8,7 @@ import Init.Data.Hashable
import Lean.Data.KVMap
import Lean.Data.SMap
import Lean.Level
import Std.Data.HashSet.Basic

namespace Lean

@@ -244,7 +245,7 @@ def FVarIdSet.insert (s : FVarIdSet) (fvarId : FVarId) : FVarIdSet :=
A set of unique free variable identifiers implemented using hashtables.
Hashtables are faster than red-black trees if they are used linearly.
They are not persistent data-structures. -/
def FVarIdHashSet := HashSet FVarId
def FVarIdHashSet := Std.HashSet FVarId
deriving Inhabited, EmptyCollection

/--
@@ -1388,11 +1389,11 @@ def mkDecIsTrue (pred proof : Expr) :=
def mkDecIsFalse (pred proof : Expr) :=
mkAppB (mkConst `Decidable.isFalse) pred proof

abbrev ExprMap (α : Type) := HashMap Expr α
abbrev ExprMap (α : Type) := Std.HashMap Expr α
abbrev PersistentExprMap (α : Type) := PHashMap Expr α
abbrev SExprMap (α : Type) := SMap Expr α

abbrev ExprSet := HashSet Expr
abbrev ExprSet := Std.HashSet Expr
abbrev PersistentExprSet := PHashSet Expr
abbrev PExprSet := PersistentExprSet

@@ -1417,7 +1418,7 @@ instance : ToString ExprStructEq := ⟨fun e => toString e.val⟩

end ExprStructEq

abbrev ExprStructMap (α : Type) := HashMap ExprStructEq α
abbrev ExprStructMap (α : Type) := Std.HashMap ExprStructEq α
abbrev PersistentExprStructMap (α : Type) := PHashMap ExprStructEq α

namespace Expr
@@ -1452,28 +1453,26 @@ partial def betaRev (f : Expr) (revArgs : Array Expr) (useZeta := false) (preser
else
let sz := revArgs.size
let rec go (e : Expr) (i : Nat) : Expr :=
let done (_ : Unit) : Expr :=
let n := sz - i
mkAppRevRange (e.instantiateRange n sz revArgs) 0 n revArgs
match e with
| Expr.lam _ _ b _ =>
| .lam _ _ b _ =>
if i + 1 < sz then
go b (i+1)
else
let n := sz - (i + 1)
mkAppRevRange (b.instantiateRange n sz revArgs) 0 n revArgs
| Expr.letE _ _ v b _ =>
b.instantiate revArgs
| .letE _ _ v b _ =>
if useZeta && i < sz then
go (b.instantiate1 v) i
else
let n := sz - i
mkAppRevRange (e.instantiateRange n sz revArgs) 0 n revArgs
| Expr.mdata k b =>
done ()
| .mdata _ b =>
if preserveMData then
let n := sz - i
mkMData k (mkAppRevRange (b.instantiateRange n sz revArgs) 0 n revArgs)
done ()
else
go b i
| b =>
let n := sz - i
mkAppRevRange (b.instantiateRange n sz revArgs) 0 n revArgs
| _ => done ()
go f 0
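The refactor above folds the duplicated fall-through arms of `betaRev.go` into a shared `done` continuation. A tiny usage sketch (the constant `a` is a stand-in free symbol, purely illustrative):

```lean
open Lean in
-- Illustrative: beta-reduce `(fun x : Nat => x) a` by supplying the arguments
-- in reverse order, as `betaRev` expects.
def betaExample : Expr :=
  let f := Expr.lam `x (Expr.const `Nat []) (Expr.bvar 0) .default
  f.betaRev #[Expr.const `a []]
```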

/--

@@ -31,7 +31,7 @@ namespace Lean
abbrev LabelExtension := SimpleScopedEnvExtension Name (Array Name)

/-- The collection of all current `LabelExtension`s, indexed by name. -/
abbrev LabelExtensionMap := HashMap Name LabelExtension
abbrev LabelExtensionMap := Std.HashMap Name LabelExtension

/-- Store the current `LabelExtension`s. -/
builtin_initialize labelExtensionMapRef : IO.Ref LabelExtensionMap ← IO.mkRef {}
@@ -88,7 +88,7 @@ macro (name := _root_.Lean.Parser.Command.registerLabelAttr)
/-- When `attrName` is an attribute created using `register_labelled_attr`,
return the names of all declarations labelled using that attribute. -/
def labelled (attrName : Name) : CoreM (Array Name) := do
match (← labelExtensionMapRef.get).find? attrName with
match (← labelExtensionMapRef.get)[attrName]? with
| none => throwError "No extension named {attrName}"
| some ext => pure <| ext.getState (← getEnv)

@@ -10,8 +10,8 @@ Authors: Sebastian Ullrich

prelude
import Lean.Language.Basic
import Lean.Language.Lean.Types
import Lean.Parser.Module
import Lean.Elab.Command
import Lean.Elab.Import

/-!
@@ -166,11 +166,6 @@ namespace Lean.Language.Lean
open Lean.Elab Command
open Lean.Parser

private def pushOpt (a? : Option α) (as : Array α) : Array α :=
match a? with
| some a => as.push a
| none => as

/-- Option for capturing output to stderr during elaboration. -/
register_builtin_option stderrAsMessages : Bool := {
defValue := true
@@ -178,101 +173,6 @@ register_builtin_option stderrAsMessages : Bool := {
descr := "(server) capture output to the Lean stderr channel (such as from `dbg_trace`) during elaboration of a command as a diagnostic message"
}

/-! The hierarchy of Lean snapshot types -/

/-- Snapshot after elaboration of the entire command. -/
structure CommandFinishedSnapshot extends Language.Snapshot where
/-- Resulting elaboration state. -/
cmdState : Command.State
deriving Nonempty
instance : ToSnapshotTree CommandFinishedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[]⟩

/-- State after a command has been parsed. -/
structure CommandParsedSnapshotData extends Snapshot where
/-- Syntax tree of the command. -/
stx : Syntax
/-- Resulting parser state. -/
parserState : Parser.ModuleParserState
/--
Snapshot for incremental reporting and reuse during elaboration, type dependent on specific
elaborator.
-/
elabSnap : SnapshotTask DynamicSnapshot
/-- State after processing is finished. -/
finishedSnap : SnapshotTask CommandFinishedSnapshot
/-- Cache for `save`; to be replaced with incrementality. -/
tacticCache : IO.Ref Tactic.Cache
deriving Nonempty

/-- State after a command has been parsed. -/
-- workaround for lack of recursive structures
inductive CommandParsedSnapshot where
/-- Creates a command parsed snapshot. -/
| mk (data : CommandParsedSnapshotData)
(nextCmdSnap? : Option (SnapshotTask CommandParsedSnapshot))
deriving Nonempty
/-- The snapshot data. -/
abbrev CommandParsedSnapshot.data : CommandParsedSnapshot → CommandParsedSnapshotData
| mk data _ => data
/-- Next command, unless this is a terminal command. -/
abbrev CommandParsedSnapshot.nextCmdSnap? : CommandParsedSnapshot →
Option (SnapshotTask CommandParsedSnapshot)
| mk _ next? => next?
partial instance : ToSnapshotTree CommandParsedSnapshot where
toSnapshotTree := go where
go s := ⟨s.data.toSnapshot,
#[s.data.elabSnap.map (sync := true) toSnapshotTree,
s.data.finishedSnap.map (sync := true) toSnapshotTree] |>
pushOpt (s.nextCmdSnap?.map (·.map (sync := true) go))⟩

/-- State after successful importing. -/
structure HeaderProcessedState where
/-- The resulting initial elaboration state. -/
cmdState : Command.State
/-- First command task (there is always at least a terminal command). -/
firstCmdSnap : SnapshotTask CommandParsedSnapshot

/-- State after the module header has been processed including imports. -/
structure HeaderProcessedSnapshot extends Snapshot where
/-- State after successful importing. -/
result? : Option HeaderProcessedState
isFatal := result?.isNone
instance : ToSnapshotTree HeaderProcessedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[] |>
pushOpt (s.result?.map (·.firstCmdSnap.map (sync := true) toSnapshotTree))⟩

/-- State after successfully parsing the module header. -/
structure HeaderParsedState where
/-- Resulting parser state. -/
parserState : Parser.ModuleParserState
/-- Header processing task. -/
processedSnap : SnapshotTask HeaderProcessedSnapshot

/-- State after the module header has been parsed. -/
structure HeaderParsedSnapshot extends Snapshot where
/-- Parser input context supplied by the driver, stored here for incremental parsing. -/
ictx : Parser.InputContext
/-- Resulting syntax tree. -/
stx : Syntax
/-- State after successful parsing. -/
result? : Option HeaderParsedState
isFatal := result?.isNone
/-- Cancellation token for interrupting processing of this run. -/
cancelTk? : Option IO.CancelToken

instance : ToSnapshotTree HeaderParsedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot,
#[] |> pushOpt (s.result?.map (·.processedSnap.map (sync := true) toSnapshotTree))⟩

/-- Shortcut accessor to the final header state, if successful. -/
def HeaderParsedSnapshot.processedResult (snap : HeaderParsedSnapshot) :
SnapshotTask (Option HeaderProcessedState) :=
snap.result?.bind (·.processedSnap.map (sync := true) (·.result?)) |>.getD (.pure none)

/-- Initial snapshot of the Lean language processor: a "header parsed" snapshot. -/
abbrev InitialSnapshot := HeaderParsedSnapshot

/-- Lean-specific processing context. -/
structure LeanProcessingContext extends ProcessingContext where
/-- Position of the first file difference if there was a previous invocation. -/
@@ -334,6 +234,54 @@ structure SetupImportsResult where
/-- Kernel trust level. -/
trustLevel : UInt32 := 0

/-- Performance option used by cmdline driver. -/
register_builtin_option internal.minimalSnapshots : Bool := {
defValue := false
descr := "reduce information stored in snapshots to the minimum necessary for the cmdline \
driver: diagnostics per command and final full snapshot"
}

/--
Parses values of options registered during import and left by the C++ frontend as strings, fails if
any option names remain unknown.
-/
def reparseOptions (opts : Options) : IO Options := do
let mut opts := opts
let decls ← getOptionDecls
for (name, val) in opts do
let .ofString val := val
| continue -- Already parsed by C++
-- Options can be prefixed with `weak` in order to turn off the error when the option is not
-- defined
let weak := name.getRoot == `weak
if weak then
opts := opts.erase name
let name := name.replacePrefix `weak Name.anonymous
let some decl := decls.find? name
| unless weak do
throw <| .userError s!"invalid -D parameter, unknown configuration option '{name}'

If the option is defined in this library, use '-D{`weak ++ name}' to set it conditionally"

match decl.defValue with
| .ofBool _ =>
match val with
| "true" => opts := opts.insert name true
| "false" => opts := opts.insert name false
| _ =>
throw <| .userError s!"invalid -D parameter, invalid configuration option '{val}' value, \
it must be true/false"
| .ofNat _ =>
let some val := val.toNat?
| throw <| .userError s!"invalid -D parameter, invalid configuration option '{val}' value, \
it must be a natural number"
opts := opts.insert name val
| .ofString _ => opts := opts.insert name val
| _ => throw <| .userError s!"invalid -D parameter, configuration option '{name}' \
cannot be set in the command line, use set_option command"

return opts

/--
Entry point of the Lean language processor.

@@ -379,9 +327,11 @@ where
if let some oldProcSuccess := oldProcessed.result? then
-- also wait on old command parse snapshot as parsing is cheap and may allow for
-- elaboration reuse
oldProcSuccess.firstCmdSnap.bindIO (sync := true) fun oldCmd =>
oldProcSuccess.firstCmdSnap.bindIO (sync := true) fun oldCmd => do
let prom ← IO.Promise.new
let _ ← IO.asTask (parseCmd oldCmd newParserState oldProcSuccess.cmdState oldProcSuccess.cmdState.env prom ctx)
return .pure { oldProcessed with result? := some { oldProcSuccess with
firstCmdSnap := (← parseCmd oldCmd newParserState oldProcSuccess.cmdState ctx) } }
firstCmdSnap := { range? := none, task := prom.result } } }
else
return .pure oldProcessed) } }
else return old
@@ -443,42 +393,68 @@ where
let setup ← match (← setupImports stx) with
| .ok setup => pure setup
| .error snap => return snap

let startTime := (← IO.monoNanosNow).toFloat / 1000000000
-- allows `headerEnv` to be leaked, which would live until the end of the process anyway
let (headerEnv, msgLog) ← Elab.processHeader (leakEnv := true) stx setup.opts .empty
ctx.toInputContext setup.trustLevel
let stopTime := (← IO.monoNanosNow).toFloat / 1000000000
let diagnostics := (← Snapshot.Diagnostics.ofMessageLog msgLog)
if msgLog.hasErrors then
return { diagnostics, result? := none }

let headerEnv := headerEnv.setMainModule setup.mainModuleName
let cmdState := Elab.Command.mkState headerEnv msgLog setup.opts
let cmdState := { cmdState with infoState := {
enabled := true
trees := #[Elab.InfoTree.context (.commandCtx {
env := headerEnv
fileMap := ctx.fileMap
ngen := { namePrefix := `_import }
}) (Elab.InfoTree.node
(Elab.Info.ofCommandInfo { elaborator := `header, stx })
(stx[1].getArgs.toList.map (fun importStx =>
Elab.InfoTree.node (Elab.Info.ofCommandInfo {
elaborator := `import
stx := importStx
}) #[].toPArray'
)).toPArray'
)].toPArray'
}}
let mut traceState := default
if trace.profiler.output.get? setup.opts |>.isSome then
traceState := {
traces := #[{
ref := .missing,
msg := .trace { cls := `Import, startTime, stopTime }
(.ofFormat "importing") #[]
: TraceElem
}].toPArray'
}
-- now that imports have been loaded, check options again
let opts ← reparseOptions setup.opts
let cmdState := Elab.Command.mkState headerEnv msgLog opts
let cmdState := { cmdState with
infoState := {
enabled := true
trees := #[Elab.InfoTree.context (.commandCtx {
env := headerEnv
fileMap := ctx.fileMap
ngen := { namePrefix := `_import }
}) (Elab.InfoTree.node
(Elab.Info.ofCommandInfo { elaborator := `header, stx })
(stx[1].getArgs.toList.map (fun importStx =>
Elab.InfoTree.node (Elab.Info.ofCommandInfo {
elaborator := `import
stx := importStx
}) #[].toPArray'
)).toPArray'
)].toPArray'
}
traceState
}
let prom ← IO.Promise.new
-- The speedup of these `markPersistent`s is negligible but they help in making unexpected
-- `inc_ref_cold`s more visible
let parserState := Runtime.markPersistent parserState
let cmdState := Runtime.markPersistent cmdState
let ctx := Runtime.markPersistent ctx
let _ ← IO.asTask (parseCmd none parserState cmdState cmdState.env prom ctx)
return {
diagnostics
infoTree? := cmdState.infoState.trees[0]!
result? := some {
cmdState
firstCmdSnap := (← parseCmd none parserState cmdState)
firstCmdSnap := { range? := none, task := prom.result }
}
}
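The recurring refactor in this patch replaces functions that return a `SnapshotTask` with functions that receive an `IO.Promise`, resolve it from a detached task, and let the caller wrap `prom.result` as the task. A minimal standalone sketch of that inversion (illustrative names):

```lean
-- Minimal sketch of the promise-based pattern: the worker resolves the
-- promise from a background task; the caller gets a `Task` immediately.
def spawnWorker (compute : IO Nat) : IO (Task Nat) := do
  let prom ← IO.Promise.new
  let _ ← IO.asTask do
    prom.resolve (← compute)
  return prom.result
```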
parseCmd (old? : Option CommandParsedSnapshot) (parserState : Parser.ModuleParserState)
(cmdState : Command.State) : LeanProcessingM (SnapshotTask CommandParsedSnapshot) := do
(cmdState : Command.State) (initEnv : Environment) (prom : IO.Promise CommandParsedSnapshot) :
LeanProcessingM Unit := do
let ctx ← read

-- check for cancellation, most likely during elaboration of previous command, before starting
@@ -487,83 +463,63 @@ where
-- this is a bit ugly as we don't want to adjust our API with `Option`s just for cancellation
-- (as no-one should look at this result in that case) but anything containing `Environment`
-- is not `Inhabited`
return .pure <| .mk (nextCmdSnap? := none) {
prom.resolve <| .mk (nextCmdSnap? := none) {
diagnostics := .empty, stx := .missing, parserState
elabSnap := .pure <| .ofTyped { diagnostics := .empty : SnapshotLeaf }
finishedSnap := .pure { diagnostics := .empty, cmdState }
tacticCache := (← IO.mkRef {})
}
return

let unchanged old newParserState : BaseIO CommandParsedSnapshot :=
let unchanged old newParserState : BaseIO Unit :=
-- when syntax is unchanged, reuse command processing task as is
-- NOTE: even if the syntax tree is functionally unchanged, the new parser state may still
-- have changed because of trailing whitespace and comments etc., so it is passed separately
-- from `old`
if let some oldNext := old.nextCmdSnap? then
return .mk (data := old.data)
(nextCmdSnap? := (← old.data.finishedSnap.bindIO (sync := true) fun oldFinished =>
-- also wait on old command parse snapshot as parsing is cheap and may allow for
-- elaboration reuse
oldNext.bindIO (sync := true) fun oldNext => do
parseCmd oldNext newParserState oldFinished.cmdState ctx))
else return old -- terminal command, we're done!
if let some oldNext := old.nextCmdSnap? then do
let newProm ← IO.Promise.new
let _ ← old.data.finishedSnap.bindIO fun oldFinished =>
-- also wait on old command parse snapshot as parsing is cheap and may allow for
-- elaboration reuse
oldNext.bindIO (sync := true) fun oldNext => do
parseCmd oldNext newParserState oldFinished.cmdState initEnv newProm ctx
return .pure ()
prom.resolve <| .mk (data := old.data) (nextCmdSnap? := some { range? := none, task := newProm.result })
else prom.resolve old -- terminal command, we're done!

-- fast path, do not even start new task for this snapshot
if let some old := old? then
if let some nextCom ← old.nextCmdSnap?.bindM (·.get?) then
if (← isBeforeEditPos nextCom.data.parserState.pos) then
return .pure (← unchanged old old.data.parserState)
return (← unchanged old old.data.parserState)

SnapshotTask.ofIO (some ⟨parserState.pos, ctx.input.endPos⟩) do
let beginPos := parserState.pos
let scope := cmdState.scopes.head!
let pmctx := {
env := cmdState.env, options := scope.opts, currNamespace := scope.currNamespace
openDecls := scope.openDecls
}
let (stx, parserState, msgLog) := Parser.parseCommand ctx.toInputContext pmctx parserState
.empty
let beginPos := parserState.pos
let scope := cmdState.scopes.head!
let pmctx := {
env := cmdState.env, options := scope.opts, currNamespace := scope.currNamespace
openDecls := scope.openDecls
}
let (stx, parserState, msgLog) :=
profileit "parsing" scope.opts fun _ =>
Parser.parseCommand ctx.toInputContext pmctx parserState .empty

-- semi-fast path
if let some old := old? then
-- NOTE: as `parserState.pos` includes trailing whitespace, this forces reprocessing even if
-- only that whitespace changes, which is wasteful but still necessary because it may
-- influence the range of error messages such as from a trailing `exact`
if stx.eqWithInfo old.data.stx then
-- Here we must make sure to pass the *new* parser state; see NOTE in `unchanged`
return (← unchanged old parserState)
-- on first change, make sure to cancel old invocation
-- TODO: pass token into incrementality-aware elaborators to improve reuse of still-valid,
-- still-running elaboration steps?
if let some tk := ctx.oldCancelTk? then
tk.set
-- semi-fast path
if let some old := old? then
-- NOTE: as `parserState.pos` includes trailing whitespace, this forces reprocessing even if
-- only that whitespace changes, which is wasteful but still necessary because it may
-- influence the range of error messages such as from a trailing `exact`
if stx.eqWithInfo old.data.stx then
-- Here we must make sure to pass the *new* parser state; see NOTE in `unchanged`
return (← unchanged old parserState)
-- on first change, make sure to cancel old invocation
-- TODO: pass token into incrementality-aware elaborators to improve reuse of still-valid,
-- still-running elaboration steps?
if let some tk := ctx.oldCancelTk? then
tk.set

-- definitely resolved in `doElab` task
let elabPromise ← IO.Promise.new
let tacticCache ← old?.map (·.data.tacticCache) |>.getDM (IO.mkRef {})
let finishedSnap ←
doElab stx cmdState beginPos
{ old? := old?.map fun old => ⟨old.data.stx, old.data.elabSnap⟩, new := elabPromise }
tacticCache
ctx

let next? ← if Parser.isTerminalCommand stx then pure none
-- for now, wait on "command finished" snapshot before parsing next command
else some <$> finishedSnap.bindIO fun finished =>
parseCmd none parserState finished.cmdState ctx
return .mk (nextCmdSnap? := next?) {
diagnostics := (← Snapshot.Diagnostics.ofMessageLog msgLog)
stx
parserState
elabSnap := { range? := stx.getRange?, task := elabPromise.result }
finishedSnap
tacticCache
}

doElab (stx : Syntax) (cmdState : Command.State) (beginPos : String.Pos)
(snap : SnapshotBundle DynamicSnapshot) (tacticCache : IO.Ref Tactic.Cache) :
LeanProcessingM (SnapshotTask CommandFinishedSnapshot) := do
let ctx ← read
-- definitely resolved in `doElab` task
let elabPromise ← IO.Promise.new
let finishedPromise ← IO.Promise.new
-- (Try to) use last line of command as range for final snapshot task. This ensures we do not
-- retract the progress bar to a previous position in case the command supports incremental
-- reporting but has significant work after resolving its last incremental promise, such as
@@ -571,48 +527,85 @@ where
-- `parseCmd` and containing the entire range of the command will determine the reported
-- progress and be resolved effectively at the same time as this snapshot task, so `tailPos` is
-- irrelevant in this case.
let tailPos := stx.getTailPos? |>.getD beginPos
SnapshotTask.ofIO (some ⟨tailPos, tailPos⟩) do
let scope := cmdState.scopes.head!
let cmdStateRef ← IO.mkRef { cmdState with messages := .empty }
/-
The same snapshot may be executed by different tasks. So, to make sure `elabCommandTopLevel`
has exclusive access to the cache, we create a fresh reference here. Before this change, the
following `tacticCache.modify` would reset the tactic post cache while another snapshot was
still using it.
-/
let tacticCacheNew ← IO.mkRef (← tacticCache.get)
let cmdCtx : Elab.Command.Context := { ctx with
cmdPos := beginPos
tacticCache? := some tacticCacheNew
snap? := some snap
cancelTk? := some ctx.newCancelTk
}
let (output, _) ←
IO.FS.withIsolatedStreams (isolateStderr := stderrAsMessages.get scope.opts) do
liftM (m := BaseIO) do
withLoggingExceptions
(getResetInfoTrees *> Elab.Command.elabCommandTopLevel stx)
cmdCtx cmdStateRef
let postNew := (← tacticCacheNew.get).post
tacticCache.modify fun _ => { pre := postNew, post := {} }
let cmdState ← cmdStateRef.get
let mut messages := cmdState.messages
if !output.isEmpty then
messages := messages.add {
fileName := ctx.fileName
severity := MessageSeverity.information
pos := ctx.fileMap.toPosition beginPos
data := output
let endRange? := stx.getTailPos?.map fun pos => ⟨pos, pos⟩
let finishedSnap := { range? := endRange?, task := finishedPromise.result }
let tacticCache ← old?.map (·.data.tacticCache) |>.getDM (IO.mkRef {})

let minimalSnapshots := internal.minimalSnapshots.get cmdState.scopes.head!.opts
let next? ← if Parser.isTerminalCommand stx then pure none
-- for now, wait on "command finished" snapshot before parsing next command
else some <$> IO.Promise.new
let diagnostics ← Snapshot.Diagnostics.ofMessageLog msgLog
let data := if minimalSnapshots && !Parser.isTerminalCommand stx then {
diagnostics
stx := .missing
parserState := {}
elabSnap := { range? := stx.getRange?, task := elabPromise.result }
finishedSnap := { range? := none, task := finishedPromise.result.map fun finishedSnap => {
diagnostics := finishedSnap.diagnostics
infoTree? := none
cmdState := {
env := initEnv
maxRecDepth := 0
}
let cmdState := { cmdState with messages }
-- definitely resolve eventually
snap.new.resolve <| .ofTyped { diagnostics := .empty : SnapshotLeaf }
return {
diagnostics := (← Snapshot.Diagnostics.ofMessageLog cmdState.messages)
infoTree? := some cmdState.infoState.trees[0]!
cmdState
}}
tacticCache
} else {
diagnostics, stx, parserState, tacticCache
elabSnap := { range? := stx.getRange?, task := elabPromise.result }
finishedSnap
}
prom.resolve <| .mk (nextCmdSnap? := next?.map ({ range? := some ⟨parserState.pos, ctx.input.endPos⟩, task := ·.result })) data
doElab stx cmdState beginPos
{ old? := old?.map fun old => ⟨old.data.stx, old.data.elabSnap⟩, new := elabPromise }
finishedPromise tacticCache ctx
if let some next := next? then
parseCmd none parserState finishedSnap.get.cmdState initEnv next ctx

doElab (stx : Syntax) (cmdState : Command.State) (beginPos : String.Pos)
(snap : SnapshotBundle DynamicSnapshot) (finishedPromise : IO.Promise CommandFinishedSnapshot)
(tacticCache : IO.Ref Tactic.Cache) : LeanProcessingM Unit := do
let ctx ← read
let scope := cmdState.scopes.head!
let cmdStateRef ← IO.mkRef { cmdState with messages := .empty }
/-
The same snapshot may be executed by different tasks. So, to make sure `elabCommandTopLevel`
has exclusive access to the cache, we create a fresh reference here. Before this change, the
following `tacticCache.modify` would reset the tactic post cache while another snapshot was
still using it.
-/
let tacticCacheNew ← IO.mkRef (← tacticCache.get)
let cmdCtx : Elab.Command.Context := { ctx with
cmdPos := beginPos
tacticCache? := some tacticCacheNew
snap? := if internal.minimalSnapshots.get scope.opts then none else snap
cancelTk? := some ctx.newCancelTk
}
let (output, _) ←
IO.FS.withIsolatedStreams (isolateStderr := stderrAsMessages.get scope.opts) do
liftM (m := BaseIO) do
withLoggingExceptions
(getResetInfoTrees *> Elab.Command.elabCommandTopLevel stx)
cmdCtx cmdStateRef
let postNew := (← tacticCacheNew.get).post
tacticCache.modify fun _ => { pre := postNew, post := {} }
let cmdState ← cmdStateRef.get
let mut messages := cmdState.messages
if !output.isEmpty then
messages := messages.add {
fileName := ctx.fileName
severity := MessageSeverity.information
pos := ctx.fileMap.toPosition beginPos
data := output
}
let cmdState := { cmdState with messages }
-- definitely resolve eventually
snap.new.resolve <| .ofTyped { diagnostics := .empty : SnapshotLeaf }
finishedPromise.resolve {
diagnostics := (← Snapshot.Diagnostics.ofMessageLog cmdState.messages)
infoTree? := some cmdState.infoState.trees[0]!
cmdState
}

/--
Convenience function for tool uses of the language processor that skips header handling.
@@ -620,14 +613,15 @@ Convenience function for tool uses of the language processor that skips header h
def processCommands (inputCtx : Parser.InputContext) (parserState : Parser.ModuleParserState)
(commandState : Command.State)
(old? : Option (Parser.InputContext × CommandParsedSnapshot) := none) :
BaseIO (SnapshotTask CommandParsedSnapshot) := do
process.parseCmd (old?.map (·.2)) parserState commandState
BaseIO (Task CommandParsedSnapshot) := do
let prom ← IO.Promise.new
process.parseCmd (old?.map (·.2)) parserState commandState commandState.env prom
|>.run (old?.map (·.1))
|>.run { inputCtx with }
return prom.result

/-- Waits for and returns final environment, if importing was successful. -/
partial def waitForFinalEnv? (snap : InitialSnapshot) : Option Environment := do
/-- Waits for and returns final command state, if importing was successful. -/
partial def waitForFinalCmdState? (snap : InitialSnapshot) : Option Command.State := do
let snap ← snap.result?
let snap ← snap.processedSnap.get.result?
goCmd snap.firstCmdSnap.get
@@ -635,6 +629,6 @@ where goCmd snap :=
if let some next := snap.nextCmdSnap? then
goCmd next.get
else
snap.data.finishedSnap.get.cmdState.env
snap.data.finishedSnap.get.cmdState

end Lean

src/Lean/Language/Lean/Types.lean (new file, 119 lines)
@@ -0,0 +1,119 @@
/-
Copyright (c) 2023 Lean FRO. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.

Implementation of the Lean language: parsing and processing of header and commands, incremental
recompilation

Authors: Sebastian Ullrich
-/

prelude
import Lean.Language.Basic
import Lean.Elab.Command

set_option linter.missingDocs true

namespace Lean.Language.Lean
open Lean.Elab Command
open Lean.Parser

private def pushOpt (a? : Option α) (as : Array α) : Array α :=
match a? with
| some a => as.push a
| none => as
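As a quick illustration of the helper above, `pushOpt` appends an optional element to an array and leaves the array unchanged when the option is `none`. A minimal standalone sketch (the `#eval` inputs are illustrative, not part of the change):

```lean
-- Standalone copy of `pushOpt` for illustration (the original is `private`).
def pushOpt (a? : Option α) (as : Array α) : Array α :=
  match a? with
  | some a => as.push a
  | none   => as

#eval pushOpt (some 4) #[1, 2, 3]             -- #[1, 2, 3, 4]
#eval pushOpt (none : Option Nat) #[1, 2, 3]  -- #[1, 2, 3]
```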

/-! The hierarchy of Lean snapshot types -/

/-- Snapshot after elaboration of the entire command. -/
structure CommandFinishedSnapshot extends Language.Snapshot where
/-- Resulting elaboration state. -/
cmdState : Command.State
deriving Nonempty
instance : ToSnapshotTree CommandFinishedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[]⟩

/-- State after a command has been parsed. -/
structure CommandParsedSnapshotData extends Snapshot where
/-- Syntax tree of the command. -/
stx : Syntax
/-- Resulting parser state. -/
parserState : Parser.ModuleParserState
/--
Snapshot for incremental reporting and reuse during elaboration, type dependent on specific
elaborator.
-/
elabSnap : SnapshotTask DynamicSnapshot
/-- State after processing is finished. -/
finishedSnap : SnapshotTask CommandFinishedSnapshot
/-- Cache for `save`; to be replaced with incrementality. -/
tacticCache : IO.Ref Tactic.Cache
deriving Nonempty

/-- State after a command has been parsed. -/
-- workaround for lack of recursive structures
inductive CommandParsedSnapshot where
/-- Creates a command parsed snapshot. -/
| mk (data : CommandParsedSnapshotData)
(nextCmdSnap? : Option (SnapshotTask CommandParsedSnapshot))
deriving Nonempty
/-- The snapshot data. -/
abbrev CommandParsedSnapshot.data : CommandParsedSnapshot → CommandParsedSnapshotData
| mk data _ => data
/-- Next command, unless this is a terminal command. -/
abbrev CommandParsedSnapshot.nextCmdSnap? : CommandParsedSnapshot →
Option (SnapshotTask CommandParsedSnapshot)
| mk _ next? => next?
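The `inductive`-plus-accessor pattern above is the usual workaround for the fact that a Lean `structure` cannot recursively mention itself in a field type. A minimal standalone sketch of the same shape (the `Node` type here is hypothetical, not part of this change):

```lean
-- A `structure` field may not refer back to the structure itself, so a
-- single-constructor `inductive` plus accessor `abbrev`s is used instead.
structure NodeData where
  value : Nat

inductive Node where
  | mk (data : NodeData) (next? : Option Node)

abbrev Node.data : Node → NodeData
  | .mk d _ => d

abbrev Node.next? : Node → Option Node
  | .mk _ n => n
```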
partial instance : ToSnapshotTree CommandParsedSnapshot where
toSnapshotTree := go where
go s := ⟨s.data.toSnapshot,
#[s.data.elabSnap.map (sync := true) toSnapshotTree,
s.data.finishedSnap.map (sync := true) toSnapshotTree] |>
pushOpt (s.nextCmdSnap?.map (·.map (sync := true) go))⟩

/-- State after successful importing. -/
structure HeaderProcessedState where
/-- The resulting initial elaboration state. -/
cmdState : Command.State
/-- First command task (there is always at least a terminal command). -/
firstCmdSnap : SnapshotTask CommandParsedSnapshot

/-- State after the module header has been processed including imports. -/
structure HeaderProcessedSnapshot extends Snapshot where
/-- State after successful importing. -/
result? : Option HeaderProcessedState
isFatal := result?.isNone
instance : ToSnapshotTree HeaderProcessedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot, #[] |>
pushOpt (s.result?.map (·.firstCmdSnap.map (sync := true) toSnapshotTree))⟩

/-- State after successfully parsing the module header. -/
structure HeaderParsedState where
/-- Resulting parser state. -/
parserState : Parser.ModuleParserState
/-- Header processing task. -/
processedSnap : SnapshotTask HeaderProcessedSnapshot

/-- State after the module header has been parsed. -/
structure HeaderParsedSnapshot extends Snapshot where
/-- Parser input context supplied by the driver, stored here for incremental parsing. -/
ictx : Parser.InputContext
/-- Resulting syntax tree. -/
stx : Syntax
/-- State after successful parsing. -/
result? : Option HeaderParsedState
isFatal := result?.isNone
/-- Cancellation token for interrupting processing of this run. -/
cancelTk? : Option IO.CancelToken

instance : ToSnapshotTree HeaderParsedSnapshot where
toSnapshotTree s := ⟨s.toSnapshot,
#[] |> pushOpt (s.result?.map (·.processedSnap.map (sync := true) toSnapshotTree))⟩

/-- Shortcut accessor to the final header state, if successful. -/
def HeaderParsedSnapshot.processedResult (snap : HeaderParsedSnapshot) :
SnapshotTask (Option HeaderProcessedState) :=
snap.result?.bind (·.processedSnap.map (sync := true) (·.result?)) |>.getD (.pure none)

/-- Initial snapshot of the Lean language processor: a "header parsed" snapshot. -/
abbrev InitialSnapshot := HeaderParsedSnapshot
@@ -614,9 +614,9 @@ where

end Level

abbrev LevelMap (α : Type) := HashMap Level α
abbrev LevelMap (α : Type) := Std.HashMap Level α
abbrev PersistentLevelMap (α : Type) := PHashMap Level α
abbrev LevelSet := HashSet Level
abbrev LevelSet := Std.HashSet Level
abbrev PersistentLevelSet := PHashSet Level
abbrev PLevelSet := PersistentLevelSet


@@ -34,7 +34,7 @@ def constructorNameAsVariable : Linter where
| return

let infoTrees := (← get).infoState.trees.toArray
let warnings : IO.Ref (Lean.HashMap String.Range (Syntax × Name × Name)) ← IO.mkRef {}
let warnings : IO.Ref (Std.HashMap String.Range (Syntax × Name × Name)) ← IO.mkRef {}

for tree in infoTrees do
tree.visitM' (preNode := fun ci info _ => do

@@ -149,7 +149,7 @@ def checkDecl : SimpleHandler := fun stx => do
lintField rest[1][0] stx[1] "computed field"
else if rest.getKind == ``«structure» then
unless rest[5][2].isNone do
let redecls : HashSet String.Pos :=
let redecls : Std.HashSet String.Pos :=
(← get).infoState.trees.foldl (init := {}) fun s tree =>
tree.foldInfo (init := s) fun _ info s =>
if let .ofFieldRedeclInfo info := info then

@@ -270,14 +270,14 @@ pointer identity and does not store the objects, so it is important not to store
pointer to an object in the map, or it can be freed and reused, resulting in incorrect behavior.

Returns `true` if the object was not already in the set. -/
unsafe def insertObjImpl {α : Type} (set : IO.Ref (HashSet USize)) (a : α) : IO Bool := do
unsafe def insertObjImpl {α : Type} (set : IO.Ref (Std.HashSet USize)) (a : α) : IO Bool := do
if (← set.get).contains (ptrAddrUnsafe a) then
return false
set.modify (·.insert (ptrAddrUnsafe a))
return true

@[inherit_doc insertObjImpl, implemented_by insertObjImpl]
opaque insertObj {α : Type} (set : IO.Ref (HashSet USize)) (a : α) : IO Bool
opaque insertObj {α : Type} (set : IO.Ref (Std.HashSet USize)) (a : α) : IO Bool

/--
Collects into `fvarUses` all `fvar`s occurring in the `Expr`s in `assignments`.
@@ -285,8 +285,8 @@ This implementation respects subterm sharing in both the `PersistentHashMap` and
to ensure that pointer-equal subobjects are not visited multiple times, which is important
in practice because these expressions are very frequently highly shared.
-/
partial def visitAssignments (set : IO.Ref (HashSet USize))
(fvarUses : IO.Ref (HashSet FVarId))
partial def visitAssignments (set : IO.Ref (Std.HashSet USize))
(fvarUses : IO.Ref (Std.HashSet FVarId))
(assignments : Array (PersistentHashMap MVarId Expr)) : IO Unit := do
MonadCacheT.run do
for assignment in assignments do
@@ -316,8 +316,8 @@ where
/-- Given `aliases` as a map from an alias to what it aliases, we get the original
term by recursion. This has no cycle detection, so if `aliases` contains a loop
then this function will recurse infinitely. -/
partial def followAliases (aliases : HashMap FVarId FVarId) (x : FVarId) : FVarId :=
match aliases.find? x with
partial def followAliases (aliases : Std.HashMap FVarId FVarId) (x : FVarId) : FVarId :=
match aliases[x]? with
| none => x
| some y => followAliases aliases y
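The `followAliases` change above is representative of the whole diff: lookups migrate from the deprecated `Lean.HashMap` API (`find?`, `findD`) to the `Std.HashMap` idioms `m[k]?` and `m.getD`. A minimal sketch of the new access patterns (illustrative map contents; assumes `Std.Data.HashMap` is importable in your context):

```lean
import Std.Data.HashMap

def demo : IO Unit := do
  let m : Std.HashMap Nat String := ∅
  let m := m.insert 1 "one"
  -- `m[k]?` replaces the old `find?`: an optional lookup.
  IO.println s!"{m[1]?}"
  -- `m.getD k default` replaces the old `findD`: lookup with a fallback.
  IO.println (m.getD 2 "missing")  -- prints "missing"
```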

@@ -343,17 +343,17 @@ structure References where
the spans for `foo`, `bar`, and `baz`. Global definitions are always treated as used.
(It would be nice to be able to detect unused global definitions but this requires more
information than the linter framework can provide.) -/
constDecls : HashSet String.Range := .empty
constDecls : Std.HashSet String.Range := .empty
/-- The collection of all local declarations, organized by the span of the declaration.
We collapse all declarations declared at the same position into a single record using
`FVarDefinition.aliases`. -/
fvarDefs : HashMap String.Range FVarDefinition := .empty
fvarDefs : Std.HashMap String.Range FVarDefinition := .empty
/-- The set of `FVarId`s that are used directly. These may or may not be aliases. -/
fvarUses : HashSet FVarId := .empty
fvarUses : Std.HashSet FVarId := .empty
/-- A mapping from alias to original FVarId. We don't guarantee that the value is not itself
an alias, but we use `followAliases` when adding new elements to try to avoid long chains. -/
-- TODO: use a `UnionFind` data structure here
fvarAliases : HashMap FVarId FVarId := .empty
fvarAliases : Std.HashMap FVarId FVarId := .empty
/-- Collection of all `MetavarContext`s following the execution of a tactic. We trawl these
if needed to find additional `fvarUses`. -/
assignments : Array (PersistentHashMap MVarId Expr) := #[]
@@ -391,7 +391,7 @@ def collectReferences (infoTrees : Array Elab.InfoTree) (cmdStxRange : String.Ra
if s.startsWith "_" then return
-- Record this either as a new `fvarDefs`, or an alias of an existing one
modify fun s =>
if let some ref := s.fvarDefs.find? range then
if let some ref := s.fvarDefs[range]? then
{ s with fvarDefs := s.fvarDefs.insert range { ref with aliases := ref.aliases.push id } }
else
{ s with fvarDefs := s.fvarDefs.insert range { userName := ldecl.userName, stx, opts, aliases := #[id] } }
@@ -444,7 +444,7 @@ def unusedVariables : Linter where
-- Resolve all recursive references in `fvarAliases`.
-- At this point everything in `fvarAliases` is guaranteed not to be itself an alias,
-- and should point to some element of `FVarDefinition.aliases` in `s.fvarDefs`
let fvarAliases : HashMap FVarId FVarId := s.fvarAliases.fold (init := {}) fun m id baseId =>
let fvarAliases : Std.HashMap FVarId FVarId := s.fvarAliases.fold (init := {}) fun m id baseId =>
m.insert id (followAliases s.fvarAliases baseId)

-- Collect all non-alias fvars corresponding to `fvarUses` by resolving aliases in the list.
@@ -461,7 +461,7 @@ def unusedVariables : Linter where
let fvarUses ← fvarUsesRef.get
-- If any of the `fvar`s corresponding to this declaration is (an alias of) a variable in
-- `fvarUses`, then it is used
if aliases.any fun id => fvarUses.contains (fvarAliases.findD id id) then continue
if aliases.any fun id => fvarUses.contains (fvarAliases.getD id id) then continue
-- If this is a global declaration then it is (potentially) used after the command
if s.constDecls.contains range then continue

@@ -496,7 +496,7 @@ def unusedVariables : Linter where
initializedMVars := true
let fvarUses ← fvarUsesRef.get
-- Redo the initial check because `fvarUses` could be bigger now
if aliases.any fun id => fvarUses.contains (fvarAliases.findD id id) then continue
if aliases.any fun id => fvarUses.contains (fvarAliases.getD id id) then continue

-- If we made it this far then the variable is unused and not ignored
unused := unused.push (declStx, userName)

@@ -16,8 +16,8 @@ structure State where
nextParamIdx : Nat := 0
paramNames : Array Name := #[]
fvars : Array Expr := #[]
lmap : HashMap LMVarId Level := {}
emap : HashMap MVarId Expr := {}
lmap : Std.HashMap LMVarId Level := {}
emap : Std.HashMap MVarId Expr := {}
abstractLevels : Bool -- whether to abstract level mvars

abbrev M := StateM State
@@ -54,7 +54,7 @@ private partial def abstractLevelMVars (u : Level) : M Level := do
if depth != s.mctx.depth then
return u -- metavariables from lower depths are treated as constants
else
match s.lmap.find? mvarId with
match s.lmap[mvarId]? with
| some u => pure u
| none =>
let paramId := Name.mkNum `_abstMVar s.nextParamIdx
@@ -87,7 +87,7 @@ partial def abstractExprMVars (e : Expr) : M Expr := do
if e != eNew then
abstractExprMVars eNew
else
match (← get).emap.find? mvarId with
match (← get).emap[mvarId]? with
| some e =>
return e
| none =>

@@ -4,10 +4,11 @@ Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura
-/
prelude
import Lean.Util.ShareCommon
import Lean.Data.HashMap
import Lean.Util.ShareCommon
import Lean.Meta.Basic
import Lean.Meta.FunInfo
import Std.Data.HashMap.Raw

namespace Lean.Meta
namespace Canonicalizer
@@ -47,12 +48,12 @@ State for the `CanonM` monad.
-/
structure State where
/-- Mapping from `Expr` to hash. -/
-- We use `HashMapImp` to ensure we don't have to tag `State` as `unsafe`.
cache : HashMapImp ExprVisited UInt64 := mkHashMapImp
-- We use `HashMap.Raw` to ensure we don't have to tag `State` as `unsafe`.
cache : Std.HashMap.Raw ExprVisited UInt64 := Std.HashMap.Raw.empty
/--
Given a hashcode `k` and `keyToExprs.find? h = some es`, we have that all `es` have hashcode `k`, and
are not definitionally equal modulo the transparency setting used. -/
keyToExprs : HashMap UInt64 (List Expr) := mkHashMap
keyToExprs : Std.HashMap UInt64 (List Expr) := ∅

instance : Inhabited State where
default := {}
@@ -70,7 +71,7 @@ def CanonM.run (x : CanonM α) (transparency := TransparencyMode.instances) (s :
StateRefT'.run (x transparency) s

private partial def mkKey (e : Expr) : CanonM UInt64 := do
if let some hash := unsafe (← get).cache.find? { e } then
if let some hash := unsafe (← get).cache.get? { e } then
return hash
else
let key ← match e with
@@ -107,7 +108,7 @@ private partial def mkKey (e : Expr) : CanonM UInt64 := do
return mixHash (← mkKey v) (← mkKey b)
| .proj _ i s =>
return mixHash i.toUInt64 (← mkKey s)
unsafe modify fun { cache, keyToExprs} => { keyToExprs, cache := cache.insert { e } key |>.1 }
unsafe modify fun { cache, keyToExprs} => { keyToExprs, cache := cache.insert { e } key }
return key

/--
@@ -116,7 +117,7 @@ private partial def mkKey (e : Expr) : CanonM UInt64 := do
def canon (e : Expr) : CanonM Expr := do
let k ← mkKey e
-- Find all expressions canonicalized before that have the same key.
if let some es' := unsafe (← get).keyToExprs.find? k then
if let some es' := unsafe (← get).keyToExprs[k]? then
withTransparency (← read) do
for e' in es' do
-- Found an expression `e'` that is definitionally equal to `e` and share the same key.

@@ -127,7 +127,7 @@ abbrev ClosureM := ReaderT Context $ StateRefT State MetaM
pure u
else
let s ← get
match s.visitedLevel.find? u with
match s.visitedLevel[u]? with
| some v => pure v
| none => do
let v ← f u
@@ -139,7 +139,7 @@ abbrev ClosureM := ReaderT Context $ StateRefT State MetaM
pure e
else
let s ← get
match s.visitedExpr.find? e with
match s.visitedExpr.get? e with
| some r => pure r
| none =>
let r ← f e

@@ -52,14 +52,14 @@ which appear in the type and local context of `mvarId`, as well as the
metavariables which *those* metavariables depend on, etc.
-/
partial def _root_.Lean.MVarId.getMVarDependencies (mvarId : MVarId) (includeDelayed := false) :
MetaM (HashSet MVarId) :=
MetaM (Std.HashSet MVarId) :=
(·.snd) <$> (go mvarId).run {}
where
/-- Auxiliary definition for `getMVarDependencies`. -/
addMVars (e : Expr) : StateRefT (HashSet MVarId) MetaM Unit := do
addMVars (e : Expr) : StateRefT (Std.HashSet MVarId) MetaM Unit := do
let mvars ← getMVars e
let mut s ← get
set ({} : HashSet MVarId) -- Ensure that `s` is not shared.
set ({} : Std.HashSet MVarId) -- Ensure that `s` is not shared.
for mvarId in mvars do
if ← pure includeDelayed <||> notM (mvarId.isDelayedAssigned) then
s := s.insert mvarId
@@ -67,7 +67,7 @@ where
mvars.forM go

/-- Auxiliary definition for `getMVarDependencies`. -/
go (mvarId : MVarId) : StateRefT (HashSet MVarId) MetaM Unit :=
go (mvarId : MVarId) : StateRefT (Std.HashSet MVarId) MetaM Unit :=
withIncRecDepth do
let mdecl ← mvarId.getDecl
addMVars mdecl.type

@@ -35,7 +35,7 @@ structure DiagSummary where
def DiagSummary.isEmpty (s : DiagSummary) : Bool :=
s.data.isEmpty

def mkDiagSummary (counters : PHashMap Name Nat) (p : Name → Bool := fun _ => true) : MetaM DiagSummary := do
def mkDiagSummary (cls : Name) (counters : PHashMap Name Nat) (p : Name → Bool := fun _ => true) : MetaM DiagSummary := do
let threshold := diagnostics.threshold.get (← getOptions)
let entries := collectAboveThreshold counters threshold p Name.lt
if entries.isEmpty then
@@ -43,22 +43,22 @@ def mkDiagSummary (counters : PHashMap Name Nat) (p : Name → Bool := fun _ =>
else
let mut data := #[]
for (declName, counter) in entries do
data := data.push m!"{if data.isEmpty then " " else "\n"}{MessageData.ofConst (← mkConstWithLevelParams declName)} ↦ {counter}"
data := data.push <| .trace { cls } m!"{MessageData.ofConst (← mkConstWithLevelParams declName)} ↦ {counter}" #[]
return { data, max := entries[0]!.2 }

def mkDiagSummaryForUnfolded (counters : PHashMap Name Nat) (instances := false) : MetaM DiagSummary := do
let env ← getEnv
mkDiagSummary counters fun declName =>
mkDiagSummary `reduction counters fun declName =>
getReducibilityStatusCore env declName matches .semireducible
&& isInstanceCore env declName == instances

def mkDiagSummaryForUnfoldedReducible (counters : PHashMap Name Nat) : MetaM DiagSummary := do
let env ← getEnv
mkDiagSummary counters fun declName =>
mkDiagSummary `reduction counters fun declName =>
getReducibilityStatusCore env declName matches .reducible

def mkDiagSummaryForUsedInstances : MetaM DiagSummary := do
mkDiagSummary (← get).diag.instanceCounter
mkDiagSummary `type_class (← get).diag.instanceCounter

def mkDiagSynthPendingFailure (failures : PHashMap Expr MessageData) : MetaM DiagSummary := do
if failures.isEmpty then
@@ -66,7 +66,7 @@ def mkDiagSynthPendingFailure (failures : PHashMap Expr MessageData) : MetaM Dia
else
let mut data := #[]
for (_, msg) in failures do
data := data.push m!"{if data.isEmpty then " " else "\n"}{msg}"
data := data.push <| .trace { cls := `type_class } msg #[]
return { data }

/--
@@ -85,10 +85,10 @@ def reportDiag : MetaM Unit := do
let unfoldDefault ← mkDiagSummaryForUnfolded unfoldCounter
let unfoldInstance ← mkDiagSummaryForUnfolded unfoldCounter (instances := true)
let unfoldReducible ← mkDiagSummaryForUnfoldedReducible unfoldCounter
let heu ← mkDiagSummary (← get).diag.heuristicCounter
let heu ← mkDiagSummary `def_eq (← get).diag.heuristicCounter
let inst ← mkDiagSummaryForUsedInstances
let synthPending ← mkDiagSynthPendingFailure (← get).diag.synthPendingFailures
let unfoldKernel ← mkDiagSummary (Kernel.getDiagnostics (← getEnv)).unfoldCounter
let unfoldKernel ← mkDiagSummary `kernel (Kernel.getDiagnostics (← getEnv)).unfoldCounter
let m := MessageData.nil
let m := appendSection m `reduction "unfolded declarations" unfoldDefault
let m := appendSection m `reduction "unfolded instances" unfoldInstance

@@ -695,7 +695,7 @@ def throwOutOfScopeFVar : CheckAssignmentM α :=
throw <| Exception.internal outOfScopeExceptionId

private def findCached? (e : Expr) : CheckAssignmentM (Option Expr) := do
return (← get).cache.find? e
return (← get).cache.get? e

private def cache (e r : Expr) : CheckAssignmentM Unit := do
modify fun s => { s with cache := s.cache.insert e r }

@@ -4,6 +4,7 @@ Released under Apache 2.0 license as described in the file LICENSE.
Author: Leonardo de Moura
-/
prelude
import Lean.Data.AssocList
import Lean.HeadIndex
import Lean.Meta.Basic


@@ -286,7 +286,7 @@ private structure Trie (α : Type) where
/-- Index of trie matching star. -/
star : TrieIndex
/-- Following matches based on key of trie. -/
children : HashMap Key TrieIndex
children : Std.HashMap Key TrieIndex
/-- Lazy entries at this trie that are not processed. -/
pending : Array (LazyEntry α) := #[]
deriving Inhabited
@@ -318,7 +318,7 @@ structure LazyDiscrTree (α : Type) where
/-- Backing array of trie entries. Should be owned by this trie. -/
tries : Array (LazyDiscrTree.Trie α) := #[default]
/-- Map from discriminator trie roots to the index. -/
roots : Lean.HashMap LazyDiscrTree.Key LazyDiscrTree.TrieIndex := {}
roots : Std.HashMap LazyDiscrTree.Key LazyDiscrTree.TrieIndex := {}

namespace LazyDiscrTree

@@ -445,9 +445,9 @@ private def addLazyEntryToTrie (i:TrieIndex) (e : LazyEntry α) : MatchM α Unit
modify (·.modify i (·.pushPending e))

private def evalLazyEntry (config : WhnfCoreConfig)
(p : Array α × TrieIndex × HashMap Key TrieIndex)
(p : Array α × TrieIndex × Std.HashMap Key TrieIndex)
(entry : LazyEntry α)
: MatchM α (Array α × TrieIndex × HashMap Key TrieIndex) := do
: MatchM α (Array α × TrieIndex × Std.HashMap Key TrieIndex) := do
let (values, starIdx, children) := p
let (todo, lctx, v) := entry
if todo.isEmpty then
@@ -465,7 +465,7 @@ private def evalLazyEntry (config : WhnfCoreConfig)
addLazyEntryToTrie starIdx (todo, lctx, v)
pure (values, starIdx, children)
else
match children.find? k with
match children[k]? with
| none =>
let children := children.insert k (← newTrie (todo, lctx, v))
pure (values, starIdx, children)
@@ -478,16 +478,16 @@ This evaluates all lazy entries in a trie and updates `values`, `starIdx`, and `
accordingly.
-/
private partial def evalLazyEntries (config : WhnfCoreConfig)
(values : Array α) (starIdx : TrieIndex) (children : HashMap Key TrieIndex)
(values : Array α) (starIdx : TrieIndex) (children : Std.HashMap Key TrieIndex)
(entries : Array (LazyEntry α)) :
MatchM α (Array α × TrieIndex × HashMap Key TrieIndex) := do
MatchM α (Array α × TrieIndex × Std.HashMap Key TrieIndex) := do
let mut values := values
let mut starIdx := starIdx
let mut children := children
entries.foldlM (init := (values, starIdx, children)) (evalLazyEntry config)

private def evalNode (c : TrieIndex) :
MatchM α (Array α × TrieIndex × HashMap Key TrieIndex) := do
MatchM α (Array α × TrieIndex × Std.HashMap Key TrieIndex) := do
let .node vs star cs pending := (←get).get! c
if pending.size = 0 then
pure (vs, star, cs)
@@ -508,7 +508,7 @@ def dropKeyAux (next : TrieIndex) (rest : List Key) :
| [] =>
modify (·.set! next {values := #[], star, children})
| k :: r => do
let next := if k == .star then star else children.findD k 0
let next := if k == .star then star else children.getD k 0
dropKeyAux next r

/--
@@ -519,7 +519,7 @@ def dropKey (t : LazyDiscrTree α) (path : List LazyDiscrTree.Key) : MetaM (Lazy
match path with
| [] => pure t
| rootKey :: rest => do
let idx := t.roots.findD rootKey 0
let idx := t.roots.getD rootKey 0
Prod.snd <$> runMatch t (dropKeyAux idx rest)

/--
@@ -628,7 +628,7 @@ private partial def getMatchLoop (cases : Array PartialMatch) (result : MatchRes
else
cases.push { todo, score := ca.score, c := star }
let pushNonStar (k : Key) (args : Array Expr) (cases : Array PartialMatch) :=
match cs.find? k with
match cs[k]? with
| none => cases
| some c => cases.push { todo := todo ++ args, score := ca.score + 1, c }
let cases := pushStar cases
@@ -650,8 +650,8 @@ private partial def getMatchLoop (cases : Array PartialMatch) (result : MatchRes
cases |> pushNonStar k args
getMatchLoop cases result

private def getStarResult (root : Lean.HashMap Key TrieIndex) : MatchM α (MatchResult α) :=
match root.find? .star with
private def getStarResult (root : Std.HashMap Key TrieIndex) : MatchM α (MatchResult α) :=
match root[Key.star]? with
| none =>
pure <| {}
| some idx => do
@@ -661,16 +661,16 @@ private def getStarResult (root : Lean.HashMap Key TrieIndex) : MatchM α (Match
/-
Add partial match to cases if discriminator tree root map has potential matches.
-/
private def pushRootCase (r : Lean.HashMap Key TrieIndex) (k : Key) (args : Array Expr)
private def pushRootCase (r : Std.HashMap Key TrieIndex) (k : Key) (args : Array Expr)
(cases : Array PartialMatch) : Array PartialMatch :=
match r.find? k with
match r[k]? with
| none => cases
| some c => cases.push { todo := args, score := 1, c }

/--
Find values that match `e` in `root`.
-/
private def getMatchCore (root : Lean.HashMap Key TrieIndex) (e : Expr) :
private def getMatchCore (root : Std.HashMap Key TrieIndex) (e : Expr) :
MatchM α (MatchResult α) := do
let result ← getStarResult root
let (k, args) ← MatchClone.getMatchKeyArgs e (root := true) (← read)
@@ -701,7 +701,7 @@ of elements using concurrent functions for generating entries.
-/
private structure PreDiscrTree (α : Type) where
/-- Maps keys to index in tries array. -/
roots : HashMap Key Nat := {}
roots : Std.HashMap Key Nat := {}
/-- Lazy entries for root of trie. -/
tries : Array (Array (LazyEntry α)) := #[]
deriving Inhabited
@@ -711,7 +711,7 @@ namespace PreDiscrTree
private def modifyAt (d : PreDiscrTree α) (k : Key)
(f : Array (LazyEntry α) → Array (LazyEntry α)) : PreDiscrTree α :=
let { roots, tries } := d
match roots.find? k with
match roots[k]? with
| .none =>
let roots := roots.insert k tries.size
{ roots, tries := tries.push (f #[]) }

Some files were not shown because too many files have changed in this diff.