Compare commits


46 Commits

Author SHA1 Message Date
Kim Morrison
8ab452c991 fix: delete empty .out.expected files instead of leaving them empty
The lint.sh check flags empty .out.expected files. Delete them entirely
since these tests no longer produce expected output.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 23:57:53 +00:00
Kim Morrison
04ccf7cb28 fix: remove expected constructorNameAsVariable warnings from unrelated tests
These tests are not testing the constructorNameAsVariable linter. Now
that the linter correctly respects `linter.all`, the test harness's
`-Dlinter.all=false` suppresses these warnings.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 13:46:47 +00:00
Kim Morrison
7865841547 fix: use LinterOptions in constructorNameAsVariable linter
Use `getLinterOptions` and `toLinterOptions` instead of plain `Options`
so that the linter correctly respects `linter.all`. Add inline
`unset_option` to the test file to undo the test harness's
`-Dlinter.all=false` and update expected constructor suggestions.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 12:17:59 +00:00
Kim Morrison
34a113119d chore: inline getLinterConstructorNameAsVariable
Remove the `getLinterConstructorNameAsVariable` wrapper and use
`getLinterValue linter.constructorNameAsVariable` directly at both
call sites for consistency.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-02 11:41:52 +00:00
Kim Morrison
df766e2ac4 Merge branch 'master' into constructorNameAsVariable_all 2026-03-02 11:40:00 +00:00
Sebastian Ullrich
65a0c61806 chore: idbg refinements (#12691) 2026-02-26 07:49:47 +00:00
Wojciech Różowski
d4b560ec4a test: add cbv tests adapted from LNSym (#12694)
This PR adds two `decide_cbv` stress tests extracted from LNSym (ARMv8 symbolic simulator, Apache 2.0). `cbv_aes.lean` tests a full AES-128 encryption on large bitvector computations. `cbv_arm_ldst.lean` tests ARMv8 load/store instruction decoding and execution with nested pattern matching over bitvectors.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-25 17:08:24 +00:00
Wojciech Różowski
7390024170 test: add cbv test for Collatz conjecture verification (#12692)
This PR adds a `cbv` tactic test based on a minimized example extracted
from verifying the Collatz conjecture for small numbers, suggested by
Bhavik Mehta (@b-mehta).

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Bhavik Mehta <bhavikmehta8@gmail.com>
2026-02-25 17:05:51 +00:00
Henrik Böving
805012fb84 chore: revert "perf: improve over-applied cases in ToLCNF (#12284)" (#12693)
This PR reverts commit 9b7a8eb7c8. After some more contemplation on the implications of these changes, I think this is not the direction we want to move in.
2026-02-25 15:23:24 +00:00
Garmelon
dc760cf54a chore: fail build on non-make generators (#12690)
At the moment, the build relies on make and will fail with other cmake
generators. This explicit check (as suggested by @LecrisUT in
https://github.com/leanprover/lean4/pull/12577#discussion_r2832295132)
should help prevent confusion like in #12575.
2026-02-25 13:59:40 +00:00
Garmelon
08eb78a5b2 chore: switch to new test/bench suite (#12590)
This PR sets up the new integrated test/bench suite. It then migrates
all benchmarks and some related tests to the new suite. There's also
some documentation and some linting.

For now, a lot of the old tests are left alone so this PR doesn't become
even larger than it already is. Eventually, all tests should be migrated
to the new suite though so there isn't a confusing mix of two systems.
2026-02-25 13:51:53 +00:00
Kyle Miller
bd0c6a42c8 fix: copied 11940 fix for structure command (#12680)
This PR fixes an issue where `mutual public structure` would have a
private constructor. The fix is copied from #11940.

Closes #10067. Also recloses duplicate issue #11116 (its test case is
added to the test suite).
2026-02-25 13:50:04 +00:00
Paul Reichert
c86f82161a feat: upstream List/Array/Vector lemmas from human-eval-lean (#12405)
This PR adds several useful lemmas for `List`, `Array` and `Vector`
whenever they were missing, improving API coverage and consistency among
these types.
- `size_singleton`/`sum_singleton`/`sum_push`
- `foldlM_toArray`/`foldlM_toList`/`foldl_toArray`/`foldl_toList`/`foldrM_toArray`/`foldrM_toList`/`foldr_toList`
- `toArray_toList`
- `foldl_eq_apply_foldr`/`foldr_eq_apply_foldl`, `foldr_eq_foldl`:
relates `foldl` and `foldr` for associative operations with identity
- `sum_eq_foldl`: relates sum to `foldl` for associative operations with
identity
- `Perm.pairwise_iff`/`Perm.pairwise`: pairwise properties are preserved
under permutations of arrays
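
As an illustration (a sketch, not taken from the PR; it assumes the standard `Std.Associative`/`Std.LawfulIdentity` instances for `Nat` addition), the fold lemmas relate the two fold directions:
```lean
-- Sketch using the lemma names added in this PR.
example (xs : List Nat) : xs.foldr (· + ·) 0 = xs.foldl (· + ·) 0 :=
  List.foldr_eq_foldl

example (xs : Array Nat) : xs.sum = xs.foldl (· + ·) 0 :=
  Array.sum_eq_foldl
```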
2026-02-25 12:50:31 +00:00
Paul Reichert
b548cf38b6 feat: enable partial termination proofs about WellFounded.extrinsicFix (#12430)
This PR provides `WellFounded.partialExtrinsicFix`, which makes it
possible to implement and verify partially terminating functions, safely
building on top of the seemingly less general `extrinsicFix` (which is
now called `totalExtrinsicFix`). A proof of termination is only
necessary in order to formally verify the behavior of
`partialExtrinsicFix`.
2026-02-25 12:43:39 +00:00
Henrik Böving
e96d969d59 feat: support for del, isShared, oset and setTag (#12687)
This PR implements the LCNF instructions required for the expand reset
reuse pass.
2026-02-25 10:43:15 +00:00
Sebastian Ullrich
532310313f feat: lake shake --only (#12682)
This PR extends `lake shake` with a flag for minimizing only a specific module.
2026-02-25 10:24:50 +00:00
Marc Huisinga
168c125cf5 chore: relative lean-toolchains (#12652)
This PR changes all `lean-toolchain` to use relative toolchain paths
instead of `lean4` and `lean4-stage0` identifiers, which removes the
need for manually linking toolchains via Elan.

After this PR, at least Elan 4.2.0 and 0.0.224 of the Lean VS Code
extension will be needed to edit core.

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-25 10:23:35 +00:00
Sebastian Ullrich
54be382b2f chore: fix core after rebootstrap 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
fa31b285df chore: update stage0 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
1fd9adc693 fix: update-stage0 under the Lake cache 2026-02-25 11:40:02 +01:00
Sebastian Ullrich
423671a6c0 feat: strengthen evalConst meta check 2026-02-25 11:40:02 +01:00
Markus Himmel
1e0bfe931f feat: more lemmas about String.Slice.Pos.ofSlice(From|To)? (#12685)
This PR adds some missing material about transferring positions across
the subslicing operations `slice`, `sliceFrom`, `sliceTo`.
2026-02-25 09:39:59 +00:00
Henrik Böving
1bf43863e6 fix: better LCNF pretty printing (#12684) 2026-02-25 09:30:23 +00:00
Markus Himmel
87ec768a50 fix: ensure that tail-recursive List.flatten is used everywhere (#12678)
This PR marks `List.flatten`, `List.flatMap`, `List.intercalate` as
noncomputable to ensure that their `csimp` variants are used everywhere.

We also mark `List.flatMapM` as noncomputable and provide a
tail-recursive implementation, and mark `List.utf8Encode` as
noncomputable, which only exists for specification purposes anyway (at
this point).

Closes #12676.
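
The `noncomputable` + `@[csimp]` pattern used here can be sketched as follows (illustrative declarations, not the actual core code):
```lean
-- Reference definition: clear, but not tail-recursive.
noncomputable def myRev {α : Type u} : List α → List α
  | [] => []
  | a :: as => myRev as ++ [a]

-- Tail-recursive replacement used at runtime.
def myRevTR {α : Type u} (as : List α) : List α :=
  as.foldl (fun acc a => a :: acc) []

-- `@[csimp]` makes the compiler substitute `myRevTR` for `myRev`;
-- marking `myRev` noncomputable guarantees it is never compiled directly.
@[csimp] theorem myRev_eq_myRevTR : @myRev = @myRevTR := by
  sorry -- equivalence proof omitted in this sketch
```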
2026-02-25 06:24:15 +00:00
Kyle Miller
de65af8318 feat: overriding binder kinds of parameters in inductive constructors (#12603)
This PR adds a feature where `inductive` constructors can override the
binder kinds of the type's parameters, like in #9480 for `structure`.
For example, it's possible to make `x` explicit in the constructor
`Eq.refl`, rather than implicit:
```lean
inductive Eq {α : Type u} (x : α) : α → Prop where
  | refl (x) : Eq x x
```
In the Prelude, this is currently accomplished by taking advantage of
auto-promotion of indices to parameters.

**Breaking change.** Inductive types with a constructor that starts with
typeless binders may need to be rewritten, e.g. changing `(x)` to `(x :
_)` if there is a `variable` with that name or if it is meant to shadow
one of the inductive type's parameters.
2026-02-25 02:30:12 +00:00
Kyle Miller
c032af2f51 fix: make tactic .. at * save info contexts (#12607)
This PR fixes an issue where `withLocation` wasn't saving the info
context, which meant that tactics that use `at *` location syntax and do
term elaboration would save infotrees but revert the metacontext,
leading to Infoview messages like "Error updating: Error fetching goals:
Rpc error: InternalError: unknown metavariable" if the tactic failed at
some locations but succeeded at others.

Closes #10898
2026-02-25 01:59:50 +00:00
Kyle Miller
48a715993d fix: pretty printing of constants should consider accessibility of names (#12654)
This PR fixes two aspects of pretty printing of private names.
1. Name unresolution. Now private names are not special cased: the
private prefix is stripped off and the `_root_` prefix is added, then it
tries resolving all suffixes of the result. This is sufficient to handle
imported private names in the new module system. (Additionally,
unresolution takes macro scopes into account now.)
2. Delaboration. Inaccessible private names use a deterministic
algorithm to convert private prefixes into macro scopes. The effect is
that the same private name appearing multiple times in the same
delaborated expression will now have the same `✝` suffix each time. It
used to use fresh macro scopes per occurrence.

Note: There is currently a small hack to support pretty printing in the
compiler's trace messages, which print constants that do not exist (e.g.
`obj`, `tobj`, and auxiliary definitions being compiled). Even though
these names are inaccessible (for the stronger reason that they don't
exist), we make sure that the pretty printer won't add macro scopes. It
also does some analysis of private names to see if the private names are
for the current module.

Closes #10771, closes #10772, and closes #10773
2026-02-25 00:01:19 +00:00
Wojciech Różowski
f31f50836d fix: withNamespace now correctly calls popScopes after running (#12647)
This PR adds the missing `popScopes` call to `withNamespace`, which previously only dropped scopes from the elaborator's `Command.State` but did not pop the environment's `ScopedEnvExtension` state stacks. This caused scoped syntax declarations to leak keywords outside their namespace when `withNamespace` had been called.

Closes #12630

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-24 15:24:58 +00:00
Lean stage0 autoupdater
c1ab1668b2 chore: update stage0 2026-02-24 15:19:57 +00:00
Sebastian Graf
7517f768f9 feat: lightweight dependent match motive for do match (#12673)
This PR allows for a lightweight version of dependent `match` in the
new `do` elaborator: discriminant types get abstracted over previous
discriminants. The match result type and the local context are still not
considered for abstraction. For example, if both `i : Nat` and `h : i <
len` are discriminants, then if an alternative matches `i` with `0`, we
also have `h : 0 < len`:

```lean
example {α : Type u} {β : Type v} {m : Type v → Type w} [Monad m] (as : Array α) (b : β) (f : (a : α) → a ∈ as → β → m (ForInStep β)) : m β :=
  let rec loop (i : Nat) (h : i ≤ as.size) (b : β) : m β := do
    match i, h with
    | 0,   _ => pure b
    | i+1, h =>
      have h' : i < as.size            := Nat.lt_of_lt_of_le (Nat.lt_succ_self i) h
      have : as.size - 1 < as.size     := Nat.sub_lt (Nat.zero_lt_of_lt h') (by decide)
      have : as.size - 1 - i < as.size := Nat.lt_of_le_of_lt (Nat.sub_le (as.size - 1) i) this
      match (← f as[as.size - 1 - i] (Array.getElem_mem this) b) with
      | ForInStep.done b  => pure b
      | ForInStep.yield b => loop i (Nat.le_of_lt h') b
  loop as.size (Nat.le_refl _) b
```

This feature turns out to be enough to save quite a few adaptations
(6/16) during bootstrap.
2026-02-24 14:29:29 +00:00
Sebastian Graf
96cd6909ea doc: fix comment referring to elabElem instead of elabDoElem (#12674) 2026-02-24 14:23:58 +00:00
Sebastian Graf
bb8d8da1af test: add benchmark vcgen_reader_state (#12671)
This PR adds the benchmark vcgen_reader_state that is a variant of
vcgen_add_sub_cancel that takes the value to subtract from a `ReaderT`
layer. Measurements:
```
goal_100: 201 ms, 1 VCs by sorry: 0 ms, kernel: 52 ms
goal_500: 382 ms, 1 VCs by sorry: 0 ms, kernel: 327 ms
goal_1000: 674 ms, 1 VCs by sorry: 1 ms, kernel: 741 ms
```
This suggests it scales linearly. The generated VC triggers superlinear
behavior in `grind`, though, hence it is discharged by `sorry`.
2026-02-24 13:19:15 +00:00
Sebastian Graf
8916246be5 test: speed up vcgen_get_throw_set.lean by partially evaluating specs (#12670)
This PR speeds up the vcgen_get_throw_set benchmark by a factor of 4 by
partially evaluating specs.
2026-02-24 13:10:42 +00:00
Wojciech Różowski
65f112a165 chore: rename prime filter benchmark and fix the merge sort benchmark (#12669)
This PR renames the "Eratosthenes' sieve" benchmark description to
"prime filter" in the speedcenter config (following the discussion in
https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/sieve.20of.20Eratosthenes.20benchmark/with/575310824),
and adds the missing `#eval runBenchmarks` call to the merge sort
benchmark so it actually executes.
2026-02-24 10:57:47 +00:00
Markus Himmel
75b083d20a chore: API to prepare for String.split API (#12668)
This PR adds lemmas about string positions and patterns that will be
useful for providing high-level API lemmas for `String.split` and
friends.
2026-02-24 10:03:00 +00:00
Sebastian Ullrich
c595413fcc test: robustify but also CI-disable idbg test for now (#12667) 2026-02-24 09:19:53 +00:00
Kyle Miller
cd7f55b6c9 feat: pp.mdata (#12606)
This PR adds the pretty printer option `pp.mdata`, which causes the
pretty printer to annotate terms with any metadata that is present. For
example,
```lean
set_option pp.mdata true
/-- info: [mdata noindex:true] 2 : Nat -/
#guard_msgs in #check no_index 2
```
The `[mdata ...] e` syntax is only for pretty printing.

Thanks to @Rob23oba for an initial version.

Closes #10929
2026-02-24 04:30:26 +00:00
Kyle Miller
673d1a038c feat: clean up binder annotations inside of let rec definitions (#12608)
This PR continues #9674, cleaning up binder annotations inside the
bodies of `let rec` and `where` definitions.

Closes #11025
2026-02-24 04:24:47 +00:00
Lean stage0 autoupdater
66ce282364 chore: update stage0 2026-02-24 00:40:29 +00:00
Sebastian Graf
cdbed919ec fix: preserve TermInfo for do-match discriminant variables (#12666)
This PR fixes spurious unused variable warnings for variables used in non-atomic match discriminants in `do` notation. For example, in `match Json.parse s >>= fromJson? with`, the variable `s` would be reported as unused.

The root cause is that `expandNonAtomicDiscrs?` eagerly elaborates the
discriminant via `Term.elabTerm`, which creates TermInfo for variable
references. The result is then passed to `elabDoElem` for further
elaboration. When the match elaboration is postponed (e.g. because the
discriminant type contains an mvar from `fromJson?`), the result is a
postponed synthetic mvar. The `withTermInfoContext'` wrapper in
`elabDoElemFns` checks `isTacticOrPostponedHole?` on this result,
detects a postponed mvar, and replaces the info subtree with a `hole`
node — discarding all the TermInfo that was accumulated during
discriminant elaboration.

The fix applies `mkSaveInfoAnnotation` to the result, which prevents
`isTacticOrPostponedHole?` from recognizing it as a hole. This is the
same mechanism that `elabLetMVar` uses to preserve info trees when the
body is a metavariable.
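
A minimal sketch of the affected pattern (using `Lean.Json` and `fromJson?` as in the example above):
```lean
import Lean.Data.Json
open Lean

-- Before this fix, `s` could be flagged as unused: the non-atomic
-- discriminant `Json.parse s >>= fromJson?` is elaborated eagerly, and the
-- TermInfo for `s` was discarded when the match elaboration was postponed.
def readNat (s : String) : Except String Nat := do
  match Json.parse s >>= fromJson? with
  | .ok n => pure n
  | .error e => throw e
```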

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 23:54:17 +00:00
Sebastian Ullrich
6d86c8372a perf: shake Lean.Elab.Idbg (#12664) 2026-02-23 21:59:55 +00:00
Lean stage0 autoupdater
5c23579f93 chore: update stage0 2026-02-23 20:33:27 +00:00
Sebastian Ullrich
d0f8eb7bd6 fix: @[nospecialize] is never template-like (#12663)
This PR avoids false-positive error messages on specialization
restrictions under the module system when the declaration is explicitly
marked as not specializable. It could also provide some minor public
size and rebuild savings.
2026-02-23 20:00:36 +00:00
Sebastian Graf
65e5053008 fix: add TermInfo for mut vars in ControlStack.stateT.runInBase (#12661)
This PR fixes false-positive "unused variable" warnings for mutable
variables reassigned inside `try`/`catch` blocks with the new do
elaborator.

The root cause was that `ControlStack.stateT.runInBase` packed mutable
variables into a state tuple without calling `Term.addTermInfo'`, so the
unused variable linter could not see that the variables were used. The
fix mirrors how the `for` loop elaborator handles the same pattern in
`useLoopMutVars`.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-23 19:21:40 +00:00
Sebastian Ullrich
8f80881c2f feat: idbg interactive debug expression evaluator (#12648)
This PR adds the experimental `idbg e`, a new do-element (and term)
syntax for live debugging between the language server and a running
compiled Lean program.

When placed in a `do` block, `idbg` captures all local variables in
scope and expression `e`, then:

- **In the language server**: starts a TCP server on localhost waiting for the running program to connect; the editor will mark this part of the program as "in progress" during this wait, but that will not block `lake build` of the project.
- **In the compiled program**: on first execution of the `idbg` call site, connects to the server, receives the expression, compiles and evaluates it using the program's actual runtime values, and sends the `repr` result back.

The result is displayed as an info diagnostic on the `idbg` keyword. The expression `e` can be edited while the program is running - each edit triggers re-elaboration of `e`, a new TCP exchange, and an updated result. This makes `idbg` a live REPL for inspecting and experimenting with program state at a specific point in execution. Only when `idbg` is inserted, moved, or removed does the program need to be recompiled and restarted.

# Known Limitations

* The program will poll for the server for up to 10 minutes and needs to be killed manually otherwise.
* Use of multiple `idbg` at once is untested; likely too much overhead from overlapping imports without further changes.
* `LEAN_PATH` must be properly set up so the compiled program can import its origin module.
* Untested on Windows and macOS.
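
A hypothetical usage sketch based on the description above (names are illustrative; only the `idbg` do-element is from the PR):
```lean
def process (xs : List Nat) : IO Unit := do
  let total := xs.foldl (· + ·) 0
  -- While the compiled program is running, edit the expression after `idbg`
  -- in the editor to re-evaluate it against the captured locals.
  idbg total
  IO.println s!"total = {total}"
```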
2026-02-23 17:22:44 +00:00
Kim Morrison
3827be0fec chore: constructorNameAsVariable linter respects linter.all 2024-08-09 10:56:39 +10:00
5063 changed files with 23022 additions and 2700 deletions

View File

@@ -4,29 +4,25 @@ To build Lean you should use `make -j$(nproc) -C build/release`.
## Running Tests
See `doc/dev/testing.md` for full documentation. Quick reference:
See `tests/README.md` for full documentation. Quick reference:
```bash
# Full test suite (use after builds to verify correctness)
make -j$(nproc) -C build/release test ARGS="-j$(nproc)"
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test
# Specific test by name (supports regex via ctest -R)
make -j$(nproc) -C build/release test ARGS='-R grind_ematch --output-on-failure'
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='-R grind_ematch'
# Rerun only previously failed tests
make -j$(nproc) -C build/release test ARGS='--rerun-failed --output-on-failure'
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='--rerun-failed'
# Single test from tests/lean/run/ (quick check during development)
cd tests/lean/run && ./test_single.sh example_test.lean
# ctest directly (from stage1 build dir)
cd build/release/stage1 && ctest -j$(nproc) --output-on-failure --timeout 300
# Single test from tests/foo/bar/ (quick check during development)
cd tests/foo/bar && ./run_test example_test.lean
```
The full test suite includes `tests/lean/`, `tests/lean/run/`, `tests/lean/interactive/`,
`tests/compiler/`, `tests/pkg/`, Lake tests, and more. Using `make test` or `ctest` runs
all of them; `test_single.sh` in `tests/lean/run/` only covers that one directory.
## New features
When asked to implement new features:
@@ -34,8 +30,6 @@ When asked to implement new features:
* write comprehensive tests first (expecting that these will initially fail)
* and then iterate on the implementation until the tests pass.
All new tests should go in `tests/lean/run/`. These tests don't have expected output; we just check there are no errors. You should use `#guard_msgs` to check for specific messages.
## Success Criteria
*Never* report success on a task unless you have verified both a clean build without errors, and that the relevant tests pass.

View File

@@ -85,7 +85,7 @@ jobs:
- name: CI Merge Checkout
run: |
git fetch --depth=1 origin ${{ github.sha }}
git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/lean/run/importStructure.lean
git checkout FETCH_HEAD flake.nix flake.lock script/prepare-* tests/elab/importStructure.lean
if: github.event_name == 'pull_request'
# (needs to be after "Checkout" so files don't get overridden)
- name: Setup emsdk
@@ -235,7 +235,7 @@ jobs:
# prefix `if` above with `always` so it's run even if tests failed
if: always() && steps.test.conclusion != 'skipped'
- name: Check Test Binary
run: ${{ matrix.binary-check }} tests/compiler/534.lean.out
run: ${{ matrix.binary-check }} tests/compile/534.lean.out
if: (!matrix.cross) && steps.test.conclusion != 'skipped'
- name: Build Stage 2
run: |
@@ -246,13 +246,7 @@ jobs:
make -C build -j$NPROC check-stage3
if: matrix.check-stage3
- name: Test Speedcenter Benchmarks
run: |
# Necessary for some timing metrics but does not work on Namespace runners
# and we just want to test that the benchmarks run at all here
#echo -1 | sudo tee /proc/sys/kernel/perf_event_paranoid
export BUILD=$PWD/build PATH=$PWD/build/stage1/bin:$PATH
cd tests/bench
nix shell .#temci -c temci exec --config speedcenter.yaml --included_blocks fast --runs 1
run: nix shell github:Kha/lakeprof -c make -C build -j$NPROC bench
if: matrix.test-speedcenter
- name: Check rebootstrap
run: |

View File

@@ -1,4 +1,8 @@
cmake_minimum_required(VERSION 3.11)
cmake_minimum_required(VERSION 3.21)
if(NOT CMAKE_GENERATOR MATCHES "Makefiles")
message(FATAL_ERROR "Only makefile generators are supported")
endif()
option(USE_MIMALLOC "use mimalloc" ON)
@@ -147,6 +151,7 @@ ExternalProject_Add(
INSTALL_COMMAND ""
DEPENDS stage2
EXCLUDE_FROM_ALL ON
STEP_TARGETS configure
)
# targets forwarded to appropriate stages
@@ -157,6 +162,25 @@ add_custom_target(update-stage0-commit COMMAND $(MAKE) -C stage1 update-stage0-c
add_custom_target(test COMMAND $(MAKE) -C stage1 test DEPENDS stage1)
add_custom_target(
bench
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench
DEPENDS stage2
)
add_custom_target(
bench-part1
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench-part1
DEPENDS stage2
)
add_custom_target(
bench-part2
COMMAND $(MAKE) -C stage2
COMMAND $(MAKE) -C stage2 -j1 bench-part2
DEPENDS stage2
)
add_custom_target(clean-stdlib COMMAND $(MAKE) -C stage1 clean-stdlib DEPENDS stage1)
install(CODE "execute_process(COMMAND make -C stage1 install)")

View File

@@ -1,5 +1,9 @@
# Test Suite
**Warning:** This document is partially outdated.
It describes the old test suite, which is currently in the process of being replaced.
The new test suite's documentation can be found at [`tests/README.md`](../../tests/README.md).
After [building Lean](../make/index.md) you can run all the tests using
```
cd build/release

View File

@@ -1 +1 @@
lean4
../../../build/release/stage1

View File

@@ -1 +1 @@
lean4
build/release/stage1

View File

@@ -2,21 +2,9 @@
"folders": [
{
"path": "."
},
{
"path": "src"
},
{
"path": "tests"
},
{
"path": "script"
}
],
"settings": {
// Open terminal at root, not current workspace folder
// (there is not way to directly refer to the root folder included as `.` above)
"terminal.integrated.cwd": "${workspaceFolder:src}/..",
"files.insertFinalNewline": true,
"files.trimTrailingWhitespace": true,
"cmake.buildDirectory": "${workspaceFolder}/build/release",

View File

@@ -83,7 +83,7 @@ def main (args : List String) : IO Unit := do
lastRSS? := some rss
let avgRSSDelta := totalRSSDelta / (n - 2)
IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"
IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
let _ ← Ipc.collectDiagnostics requestNo uri versionNo
(← Ipc.stdin).writeLspMessage (Message.notification "exit" none)

View File

@@ -82,7 +82,7 @@ def main (args : List String) : IO Unit := do
lastRSS? := some rss
let avgRSSDelta := totalRSSDelta / (n - 2)
IO.println s!"avg-reelab-rss-delta: {avgRSSDelta}"
IO.println s!"measurement: avg-reelab-rss-delta {avgRSSDelta*1024} b"
let _ ← Ipc.collectDiagnostics requestNo uri versionNo
Ipc.shutdown requestNo

View File

@@ -9,5 +9,5 @@ find -regex '.*/CMakeLists\.txt\(\.in\)?\|.*\.cmake\(\.in\)?' \
! -path "./stage0/*" \
-exec \
uvx gersemi --in-place --line-length 120 --indent 2 \
--definitions src/cmake/Modules/ src/CMakeLists.txt \
--definitions src/cmake/Modules/ src/CMakeLists.txt tests/CMakeLists.txt \
-- {} +

View File

@@ -1 +1 @@
lean4
../build/release/stage1

View File

@@ -1,7 +1,7 @@
#!/usr/bin/env bash
set -euo pipefail
rm -r stage0 || true
rm -rf stage0 || true
# don't copy untracked files
# `:!` is git glob flavor for exclude patterns
for f in $(git ls-files src ':!:src/lake/*' ':!:src/Leanc.lean'); do

View File

@@ -1,6 +1,4 @@
cmake_minimum_required(VERSION 3.10)
cmake_policy(SET CMP0054 NEW)
cmake_policy(SET CMP0110 NEW)
cmake_minimum_required(VERSION 3.21)
if(NOT CMAKE_GENERATOR MATCHES "Unix Makefiles")
message(FATAL_ERROR "The only supported CMake generator at the moment is 'Unix Makefiles'")
endif()

View File

@@ -1339,10 +1339,10 @@ transitive and contains `r`. `TransGen r a z` if and only if there exists a sequ
-/
inductive Relation.TransGen {α : Sort u} (r : α → α → Prop) : α → α → Prop
/-- If `r a b`, then `TransGen r a b`. This is the base case of the transitive closure. -/
| single {a b} : r a b → TransGen r a b
| single {a b : α} : r a b → TransGen r a b
/-- If `TransGen r a b` and `r b c`, then `TransGen r a c`.
This is the inductive case of the transitive closure. -/
| tail {a b c} : TransGen r a b → r b c → TransGen r a c
| tail {a b c : α} : TransGen r a b → r b c → TransGen r a c
/-- The transitive closure is transitive. -/
theorem Relation.TransGen.trans {α : Sort u} {r : α → α → Prop} {a b c} :

View File

@@ -72,6 +72,9 @@ theorem toArray_eq : List.toArray as = xs ↔ as = xs.toList := by
/-! ### size -/
theorem size_singleton {x : α} : #[x].size = 1 := by
simp
theorem eq_empty_of_size_eq_zero (h : xs.size = 0) : xs = #[] := by
cases xs
simp_all
@@ -3483,6 +3486,21 @@ theorem foldl_eq_foldr_reverse {xs : Array α} {f : β → α → β} {b} :
theorem foldr_eq_foldl_reverse {xs : Array α} {f : α → β → β} {b} :
xs.foldr f b = xs.reverse.foldl (fun x y => f y x) b := by simp
theorem foldl_eq_apply_foldr {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
simp [← foldl_toList, ← foldr_toList, List.foldl_eq_apply_foldr]
theorem foldr_eq_apply_foldl {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
simp [← foldl_toList, ← foldr_toList, List.foldr_eq_apply_foldl]
theorem foldr_eq_foldl {xs : Array α} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
@[simp] theorem foldr_push_eq_append {as : Array α} {bs : Array β} {f : α → β} (w : start = as.size) :
as.foldr (fun a xs => Array.push xs (f a)) bs start 0 = bs ++ (as.map f).reverse := by
subst w
@@ -4335,16 +4353,33 @@ def sum_eq_sum_toList := @sum_toList
@[simp, grind =]
theorem sum_append [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LeftIdentity (α := α) (· + ·) 0] [Std.LawfulLeftIdentity (α := α) (· + ·) 0]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0]
{as₁ as₂ : Array α} : (as₁ ++ as₂).sum = as₁.sum + as₂.sum := by
simp [← sum_toList, List.sum_append]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
#[x].sum = x := by
simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_push [Add α] [Zero α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} {x : α} :
(xs.push x).sum = xs.sum + x := by
simp [Array.sum_eq_foldr, Std.LawfulRightIdentity.right_id, Std.LawfulLeftIdentity.left_id,
Array.foldr_assoc]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0] (xs : Array α) : xs.reverse.sum = xs.sum := by
simp [← sum_toList, List.sum_reverse]
theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Array α} :
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]
theorem foldl_toList_eq_flatMap {l : List α} {acc : Array β}
{F : Array β → α → Array β} {G : α → List β}
(H : ∀ acc a, (F acc a).toList = acc.toList ++ G a) :

View File

@@ -126,6 +126,14 @@ theorem swap_perm {xs : Array α} {i j : Nat} (h₁ : i < xs.size) (h₂ : j < x
simp only [swap, perm_iff_toList_perm, toList_set]
apply set_set_perm
theorem Perm.pairwise_iff {R : α → α → Prop} (S : ∀ {x y}, R x y → R y x) {xs ys : Array α}
: ∀ _p : xs.Perm ys, xs.toList.Pairwise R ↔ ys.toList.Pairwise R := by
simpa only [perm_iff_toList_perm] using List.Perm.pairwise_iff S
theorem Perm.pairwise {R : α → α → Prop} {xs ys : Array α} (hp : xs ~ ys)
(hR : xs.toList.Pairwise R) (hsymm : ∀ {x y}, R x y → R y x) :
ys.toList.Pairwise R := (hp.pairwise_iff hsymm).mp hR
namespace Perm
set_option linter.indexVariables false in

View File

@@ -36,3 +36,4 @@ public import Init.Data.List.FinRange
public import Init.Data.List.Lex
public import Init.Data.List.Range
public import Init.Data.List.Scan
public import Init.Data.List.ControlImpl

View File

@@ -2186,7 +2186,7 @@ Examples:
* `List.intercalate sep [a, b] = a ++ sep ++ b`
* `List.intercalate sep [a, b, c] = a ++ sep ++ b ++ sep ++ c`
-/
def intercalate (sep : List α) (xs : List (List α)) : List α :=
noncomputable def intercalate (sep : List α) (xs : List (List α)) : List α :=
(intersperse sep xs).flatten
/-! ### eraseDupsBy -/

View File

@@ -219,9 +219,9 @@ def filterMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[inline, expose]
def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec @[specialize] loop
@[expose]
noncomputable def flatMapM {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec loop
| [], bs => pure bs.reverse.flatten
| a :: as, bs => do
let bs' ← f a

View File

@@ -0,0 +1,35 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Author: Markus Himmel
-/
module
prelude
public import Init.Data.List.Control
public import Init.Data.List.Impl
public section
namespace List
/--
Applies a monadic function that returns a list to each element of a list, from left to right, and
concatenates the resulting lists.
-/
@[inline, expose]
def flatMapMTR {m : Type u → Type v} [Monad m] {α : Type w} {β : Type u} (f : α → m (List β)) (as : List α) : m (List β) :=
let rec @[specialize] loop
| [], bs => pure bs.reverse.flatten
| a :: as, bs => do
let bs' ← f a
loop as (bs' :: bs)
loop as []
@[csimp] theorem flatMapM_eq_flatMapMTR : @flatMapM = @flatMapMTR := by
funext m _ α β f l
simp only [flatMapM, flatMapMTR]
generalize [] = m
fun_induction flatMapM.loop <;> simp_all [flatMapMTR.loop]
end List
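A small sanity check of `flatMapM` in the `Id` monad (hypothetical usage, not part of the diff; via the `@[csimp]` lemma above, compiled evaluation runs through the tail-recursive `flatMapMTR`):

```lean
-- Each element maps to a two-element list; results are concatenated left to right.
#guard List.flatMapM (m := Id) (fun n => pure [n, n + 1]) [1, 3] = [1, 2, 3, 4]
```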

View File

@@ -1838,6 +1838,11 @@ theorem sum_append [Add α] [Zero α] [Std.LawfulLeftIdentity (α := α) (· +
[Std.Associative (α := α) (· + ·)] {l₁ l₂ : List α} : (l₁ ++ l₂).sum = l₁.sum + l₂.sum := by
induction l₁ generalizing l₂ <;> simp_all [Std.Associative.assoc, Std.LawfulLeftIdentity.left_id]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
[x].sum = x := by
simp [List.sum_eq_foldr, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
@@ -2727,6 +2732,31 @@ theorem foldr_assoc {op : α → α → α} [ha : Std.Associative op] :
simp only [foldr_cons, ha.assoc]
rw [foldr_assoc]
theorem foldl_eq_apply_foldr {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
induction xs generalizing x
· simp [Std.LawfulRightIdentity.right_id]
· simp [foldl_assoc, *]
theorem foldr_eq_apply_foldl {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
have : Std.Associative (fun x y => f y x) := ⟨by simp [Std.Associative.assoc]⟩
have : Std.RightIdentity (fun x y => f y x) init := ⟨⟩
have : Std.LawfulRightIdentity (fun x y => f y x) init := ⟨by simp [Std.LawfulLeftIdentity.left_id]⟩
rw [← List.reverse_reverse (as := xs), foldr_reverse, foldl_eq_apply_foldr, foldl_reverse]
theorem foldr_eq_foldl {xs : List α} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
theorem sum_eq_foldl [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : List α} :
xs.sum = xs.foldl (init := 0) (· + ·) := by
simp [sum_eq_foldr, foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
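A concrete instance of `foldl_eq_apply_foldr` at `f := (· + ·)` and `init := 0` (illustrative only; verified by evaluation rather than by the lemma itself):

```lean
-- foldl threads the accumulator from the left; the lemma peels it off in front.
example : [1, 2, 3].foldl (· + ·) 10 = 10 + [1, 2, 3].foldr (· + ·) 0 := by decide
```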
-- The argument `f : α₁ → α₂` is intentionally explicit, as it is sometimes not found by unification.
theorem foldl_hom (f : α₁ → α₂) {g₁ : α₁ → β → α₁} {g₂ : α₂ → β → α₂} {l : List β} {init : α₁}
(H : ∀ x y, g₂ (f x) y = f (g₁ x y)) : l.foldl g₂ (f init) = f (l.foldl g₁ init) := by

View File

@@ -1525,6 +1525,11 @@ def Slice.Pos.toReplaceEnd {s : Slice} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤
theorem Slice.Pos.offset_sliceTo {s : Slice} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
(sliceTo p₀ pos h).offset = pos.offset := (rfl)
@[simp]
theorem Slice.Pos.sliceTo_inj {s : Slice} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
simp [Pos.ext_iff]
@[simp]
theorem Slice.Pos.ofSliceTo_startPos {s : Slice} {pos : s.Pos} :
ofSliceTo (s.sliceTo pos).startPos = s.startPos := by
@@ -2264,14 +2269,26 @@ theorem Slice.Pos.le_ofSliceFrom {s : Slice} {p₀ : s.Pos} {pos : (s.sliceFrom
p₀ ≤ ofSliceFrom pos := by
simp [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q < Slice.Pos.ofSliceFrom r ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Slice.Pos.ofSliceFrom_le_ofSliceFrom_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Slice.Pos.ofSliceFrom q ≤ Slice.Pos.ofSliceFrom r ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Pos.ofSliceFrom_lt_ofSliceFrom_iff {s : String} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q < Pos.ofSliceFrom r ↔ q < r := by
simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.ofSliceFrom_le_ofSliceFrom_iff {s : String} {p : s.Pos}
{q r : (s.sliceFrom p).Pos} : Pos.ofSliceFrom q ≤ Pos.ofSliceFrom r ↔ q ≤ r := by
simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
theorem Pos.get_eq_get_ofSliceFrom {s : String} {p₀ : s.Pos}
{pos : (s.sliceFrom p₀).Pos} {h} :
pos.get h = (ofSliceFrom pos).get (by rwa [← ofSliceFrom_endPos, ne_eq, ofSliceFrom_inj]) := by
@@ -2335,6 +2352,16 @@ theorem Slice.Pos.ofSliceTo_le {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀
ofSliceTo pos ≤ p₀ := by
simpa [Pos.le_iff, Pos.Raw.le_iff] using pos.isValidForSlice.le_utf8ByteSize
@[simp]
theorem Pos.ofSliceTo_lt_ofSliceTo_iff {s : String} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q < Pos.ofSliceTo r ↔ q < r := by
simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.ofSliceTo_le_ofSliceTo_iff {s : String} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Pos.ofSliceTo q ≤ Pos.ofSliceTo r ↔ q ≤ r := by
simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
/-- Given a position in `s` that is at most `p₀`, obtain the corresponding position in `s.sliceTo p₀`. -/
@[inline]
def Pos.sliceTo {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀) :
@@ -2351,6 +2378,11 @@ def Pos.toReplaceEnd {s : String} (p₀ : s.Pos) (pos : s.Pos) (h : pos ≤ p₀
theorem Pos.offset_sliceTo {s : String} {p₀ : s.Pos} {pos : s.Pos} {h : pos ≤ p₀} :
(sliceTo p₀ pos h).offset = pos.offset := (rfl)
@[simp]
theorem Pos.sliceTo_inj {s : String} {p₀ : s.Pos} {pos pos' : s.Pos} {h h'} :
p₀.sliceTo pos h = p₀.sliceTo pos' h' ↔ pos = pos' := by
simp [Pos.ext_iff, Slice.Pos.ext_iff]
@[simp]
theorem Slice.Pos.ofSliceTo_sliceTo {s : Slice} {p₀ p : s.Pos} {h : p ≤ p₀} :
Slice.Pos.ofSliceTo (p₀.sliceTo p h) = p := by
@@ -2419,6 +2451,27 @@ theorem Slice.Pos.ofSlice_inj {s : Slice} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂
ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
simp [Pos.ext_iff, Pos.Raw.ext_iff]
@[simp]
theorem Slice.Pos.le_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
simp [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.ofSlice_le {s : Slice} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
simpa [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.ofSlice_lt_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
{q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q < Slice.Pos.ofSlice r ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Slice.Pos.ofSlice_le_ofSlice_iff {s : Slice} {p₀ p₁ : s.Pos} {h}
{q r : (s.slice p₀ p₁ h).Pos} : Slice.Pos.ofSlice q ≤ Slice.Pos.ofSlice r ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
/-- Given a position in `s.slice p₀ p₁ h`, obtain the corresponding position in `s`. -/
@[inline]
def Pos.ofSlice {s : String} {p₀ p₁ : s.Pos} {h} (pos : (s.slice p₀ p₁ h).Pos) : s.Pos :=
@@ -2449,6 +2502,27 @@ theorem Pos.ofSlice_inj {s : String} {p₀ p₁ : s.Pos} {h} (pos₁ pos₂ : (s
ofSlice pos₁ = ofSlice pos₂ ↔ pos₁ = pos₂ := by
simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff]
@[simp]
theorem Pos.le_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} : p₀ ≤ ofSlice pos := by
simp [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Pos.ofSlice_le {s : String} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} : ofSlice pos ≤ p₁ := by
have := (Pos.Raw.isValidForSlice_slice _).1 pos.isValidForSlice |>.1
simpa [Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Pos.ofSlice_lt_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
{q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q < Pos.ofSlice r ↔ q < r := by
simp [Pos.lt_iff, Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.ofSlice_le_ofSlice_iff {s : String} {p₀ p₁ : s.Pos} {h}
{q r : (s.slice p₀ p₁ h).Pos} : Pos.ofSlice q ≤ Pos.ofSlice r ↔ q ≤ r := by
simp [Pos.le_iff, Slice.Pos.le_iff, Pos.Raw.le_iff]
theorem Slice.Pos.le_trans {s : Slice} {p q r : s.Pos} : p ≤ q → q ≤ r → p ≤ r := by
simpa [Pos.le_iff, Pos.Raw.le_iff] using Nat.le_trans
@@ -2472,6 +2546,48 @@ def Pos.slice {s : String} (pos : s.Pos) (p₀ p₁ : s.Pos) (h₁ : p₀ ≤ po
theorem Pos.offset_slice {s : String} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
(pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)
@[simp]
theorem Slice.Pos.offset_slice {s : Slice} {p₀ p₁ pos : s.Pos} {h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
(pos.slice p₀ p₁ h₁ h₂).offset = pos.offset.unoffsetBy p₀.offset := (rfl)
@[simp]
theorem Slice.Pos.ofSlice_slice {s : Slice} {p₀ p₁ pos : s.Pos}
{h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
Slice.Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁
@[simp]
theorem Slice.Pos.slice_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} :
(Slice.Pos.ofSlice pos).slice p₀ p₁ Slice.Pos.le_ofSlice Slice.Pos.ofSlice_le = pos := by
simp [← Slice.Pos.ofSlice_inj]
@[simp]
theorem Pos.ofSlice_slice {s : String} {p₀ p₁ pos : s.Pos}
{h₁ : p₀ ≤ pos} {h₂ : pos ≤ p₁} :
Pos.ofSlice (pos.slice p₀ p₁ h₁ h₂) = pos := by
simpa [Pos.ext_iff] using Pos.Raw.offsetBy_unoffsetBy_of_le h₁
@[simp]
theorem Pos.slice_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} :
(Pos.ofSlice pos).slice p₀ p₁ Pos.le_ofSlice Pos.ofSlice_le = pos := by
simp [← Pos.ofSlice_inj]
@[simp]
theorem Slice.Pos.slice_inj {s : Slice} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
{h₁ h₁' h₂ h₂'} :
pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
simp [Pos.ext_iff, Pos.Raw.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Pos.slice_inj {s : String} {p₀ p₁ : s.Pos} {pos pos' : s.Pos}
{h₁ h₁' h₂ h₂'} :
pos.slice p₀ p₁ h₁ h₂ = pos'.slice p₀ p₁ h₁' h₂' ↔ pos = pos' := by
simp [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.ext_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
/--
Given a position in `s`, obtain the corresponding position in `s.slice p₀ p₁ h`, or panic if `pos`
is not between `p₀` and `p₁`.

View File

@@ -10,6 +10,7 @@ public import Init.Data.String.Basic
import Init.Data.String.OrderInstances
import Init.Data.String.Lemmas.Basic
import Init.Data.Order.Lemmas
import Init.Omega
public section
@@ -56,6 +57,14 @@ theorem Slice.Pos.endPos_le {s : Slice} (p : s.Pos) : s.endPos ≤ p ↔ p = s.e
theorem Slice.Pos.lt_endPos_iff {s : Slice} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
simp [← endPos_le, Std.not_le]
@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual⟩
@[simp]
theorem Pos.lt_endPos_iff {s : String} (p : s.Pos) : p < s.endPos ↔ p ≠ s.endPos := by
simp [← endPos_le, Std.not_le]
@[simp]
theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.startPos :=
⟨fun h => Std.le_antisymm h (startPos_le _), by simp +contextual⟩
@@ -64,10 +73,6 @@ theorem Pos.le_startPos {s : String} (p : s.Pos) : p ≤ s.startPos ↔ p = s.st
theorem Pos.startPos_lt_iff {s : String} {p : s.Pos} : s.startPos < p ↔ p ≠ s.startPos := by
simp [← le_startPos, Std.not_le]
@[simp]
theorem Pos.endPos_le {s : String} (p : s.Pos) : s.endPos ≤ p ↔ p = s.endPos :=
⟨fun h => Std.le_antisymm (le_endPos _) h, by simp +contextual [Std.le_refl]⟩
@[simp]
theorem Slice.Pos.not_lt_startPos {s : Slice} {p : s.Pos} : ¬ p < s.startPos :=
fun h => Std.lt_irrefl (Std.lt_of_lt_of_le h (Slice.Pos.startPos_le _))
@@ -105,14 +110,232 @@ theorem Slice.Pos.next_ne_startPos {s : Slice} {p : s.Pos} {h} :
p.next h ≠ s.startPos :=
ne_startPos_of_lt lt_next
@[simp]
theorem Slice.Pos.ofSliceTo_lt_ofSliceTo_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q < Slice.Pos.ofSliceTo r ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Slice.Pos.ofSliceTo_le_ofSliceTo_iff {s : Slice} {p : s.Pos}
{q r : (s.sliceTo p).Pos} : Slice.Pos.ofSliceTo q ≤ Slice.Pos.ofSliceTo r ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.sliceTo_lt_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Slice.Pos.sliceTo_le_sliceTo_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Pos.sliceTo_lt_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceTo p₀ q h₁ < Pos.sliceTo p₀ r h₂ ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff]
@[simp]
theorem Pos.sliceTo_le_sliceTo_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceTo p₀ q h₁ ≤ Pos.sliceTo p₀ r h₂ ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff]
@[simp]
theorem Slice.Pos.sliceFrom_lt_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂
omega
@[simp]
theorem Slice.Pos.sliceFrom_le_sliceFrom_iff {s : Slice} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂
omega
@[simp]
theorem Pos.sliceFrom_lt_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceFrom p₀ q h₁ < Pos.sliceFrom p₀ r h₂ ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂
omega
@[simp]
theorem Pos.sliceFrom_le_sliceFrom_iff {s : String} {p₀ : s.Pos} {q r : s.Pos} {h₁ h₂} :
Pos.sliceFrom p₀ q h₁ ≤ Pos.sliceFrom p₀ r h₂ ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₂
omega
theorem Slice.Pos.ofSliceFrom_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
Pos.ofSliceFrom p < q ↔ ∃ h, p < Slice.Pos.sliceFrom p₀ q h := by
refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
rwa [Pos.sliceFrom_lt_sliceFrom_iff]
· simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]
theorem Slice.Pos.le_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
q ≤ Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h ≤ p := by
simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]
theorem Slice.Pos.ofSliceFrom_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Slice.Pos.sliceFrom p₀ q h := by
refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
rwa [Pos.sliceFrom_le_sliceFrom_iff]
· simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]
theorem Slice.Pos.lt_ofSliceFrom_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
q < Pos.ofSliceFrom p ↔ ∀ h, Slice.Pos.sliceFrom p₀ q h < p := by
simp [← Std.not_le, Pos.ofSliceFrom_le_iff]
theorem Pos.ofSliceFrom_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
Pos.ofSliceFrom p < q ↔ ∃ h, p < Pos.sliceFrom p₀ q h := by
refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt Pos.le_ofSliceFrom h), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
rwa [Pos.sliceFrom_lt_sliceFrom_iff]
· simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
rwa [Pos.ofSliceFrom_lt_ofSliceFrom_iff]
theorem Pos.le_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
q ≤ Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h ≤ p := by
simp [← Std.not_lt, Pos.ofSliceFrom_lt_iff]
theorem Pos.ofSliceFrom_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
Pos.ofSliceFrom p ≤ q ↔ ∃ h, p ≤ Pos.sliceFrom p₀ q h := by
refine ⟨fun h => ⟨Std.le_trans Pos.le_ofSliceFrom h, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceFrom_ofSliceFrom (p := p)]
rwa [Pos.sliceFrom_le_sliceFrom_iff]
· simp +singlePass only [← Pos.ofSliceFrom_sliceFrom (h := h)]
rwa [Pos.ofSliceFrom_le_ofSliceFrom_iff]
theorem Pos.lt_ofSliceFrom_iff {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {q : s.Pos} :
q < Pos.ofSliceFrom p ↔ ∀ h, Pos.sliceFrom p₀ q h < p := by
simp [← Std.not_le, Pos.ofSliceFrom_le_iff]
theorem Slice.Pos.ofSliceFrom_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Pos.lt_next, Pos.ofSliceFrom_le_iff,
Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceFrom_lt_iff]
theorem Pos.ofSliceFrom_next {s : String} {p₀ : s.Pos} {p : (s.sliceFrom p₀).Pos} {h} :
Pos.ofSliceFrom (p.next h) = (Pos.ofSliceFrom p).next (by simpa [← Pos.ofSliceFrom_inj] using h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceFrom_lt_ofSliceFrom_iff, Slice.Pos.lt_next, Pos.ofSliceFrom_le_iff,
Slice.Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceFrom_lt_iff]
theorem Slice.Pos.le_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
q ≤ Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h ≤ p := by
refine ⟨fun h => ⟨Slice.Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
rwa [Pos.sliceTo_le_sliceTo_iff]
· simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
rwa [Pos.ofSliceTo_le_ofSliceTo_iff]
theorem Slice.Pos.ofSliceTo_lt_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
Pos.ofSliceTo p < q ↔ ∀ h, p < Slice.Pos.sliceTo p₀ q h := by
simp [← Std.not_le, Slice.Pos.le_ofSliceTo_iff]
theorem Slice.Pos.lt_ofSliceTo_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
q < Pos.ofSliceTo p ↔ ∃ h, Slice.Pos.sliceTo p₀ q h < p := by
refine ⟨fun h => ⟨Std.le_of_lt (Std.lt_of_le_of_lt (Std.le_refl q) (Std.lt_of_lt_of_le h Pos.ofSliceTo_le)), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
rwa [Pos.sliceTo_lt_sliceTo_iff]
· simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]
theorem Slice.Pos.ofSliceTo_le_iff {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Slice.Pos.sliceTo p₀ q h := by
simp [← Std.not_lt, Slice.Pos.lt_ofSliceTo_iff]
theorem Pos.le_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
q ≤ Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h ≤ p := by
refine ⟨fun h => ⟨Pos.le_trans h Pos.ofSliceTo_le, ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
rwa [Pos.sliceTo_le_sliceTo_iff]
· simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
rwa [Pos.ofSliceTo_le_ofSliceTo_iff]
theorem Pos.ofSliceTo_lt_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
Pos.ofSliceTo p < q ↔ ∀ h, p < Pos.sliceTo p₀ q h := by
simp [← Std.not_le, Pos.le_ofSliceTo_iff]
theorem Pos.lt_ofSliceTo_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
q < Pos.ofSliceTo p ↔ ∃ h, Pos.sliceTo p₀ q h < p := by
refine ⟨fun h => ⟨Pos.le_of_lt (Pos.lt_of_lt_of_le h Pos.ofSliceTo_le), ?_⟩, fun ⟨h, h'⟩ => ?_⟩
· simp +singlePass only [← Pos.sliceTo_ofSliceTo (p := p)]
rwa [Pos.sliceTo_lt_sliceTo_iff]
· simp +singlePass only [← Pos.ofSliceTo_sliceTo (h := h)]
rwa [Pos.ofSliceTo_lt_ofSliceTo_iff]
theorem Pos.ofSliceTo_le_iff {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {q : s.Pos} :
Pos.ofSliceTo p ≤ q ↔ ∀ h, p ≤ Pos.sliceTo p₀ q h := by
simp [← Std.not_lt, Pos.lt_ofSliceTo_iff]
theorem Slice.Pos.ofSliceTo_ne_endPos {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
(h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
simpa [← lt_endPos_iff, ofSliceTo_lt_ofSliceTo_iff] using h
theorem Pos.ofSliceTo_ne_endPos {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos}
(h : p ≠ (s.sliceTo p₀).endPos) : Pos.ofSliceTo p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₀))
simpa [← Slice.Pos.lt_endPos_iff, ofSliceTo_lt_ofSliceTo_iff] using h
theorem Slice.Pos.ofSliceTo_next {s : Slice} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Pos.lt_next, Pos.ofSliceTo_le_iff,
Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceTo_lt_iff]
theorem Pos.ofSliceTo_next {s : String} {p₀ : s.Pos} {p : (s.sliceTo p₀).Pos} {h} :
Pos.ofSliceTo (p.next h) = (Pos.ofSliceTo p).next (ofSliceTo_ne_endPos h) := by
rw [eq_comm, Pos.next_eq_iff]
simp only [Pos.ofSliceTo_lt_ofSliceTo_iff, Slice.Pos.lt_next, Pos.ofSliceTo_le_iff,
Slice.Pos.next_le_iff_lt, true_and]
simp [Pos.ofSliceTo_lt_iff]
@[simp]
theorem Slice.Pos.slice_lt_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.Raw.lt_iff, Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Slice.Pos.slice_le_slice_iff {s : Slice} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Pos.slice_lt_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ < r.slice p₀ p₁ h₁' h₂' ↔ q < r := by
simp [Slice.Pos.lt_iff, Pos.lt_iff, Pos.Raw.lt_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
@[simp]
theorem Pos.slice_le_slice_iff {s : String} {p₀ p₁ : s.Pos} {q r : s.Pos}
{h₁ h₁' h₂ h₂'} :
q.slice p₀ p₁ h₁ h₂ ≤ r.slice p₀ p₁ h₁' h₂' ↔ q ≤ r := by
simp [Slice.Pos.le_iff, Pos.le_iff, Pos.Raw.le_iff] at h₁ h₁'
omega
theorem Slice.Pos.ofSlice_ne_endPos {s : Slice} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
(h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
simpa [← lt_endPos_iff, ofSlice_lt_ofSlice_iff] using h
theorem Pos.ofSlice_ne_endPos {s : String} {p₀ p₁ : s.Pos} {h} {p : (s.slice p₀ p₁ h).Pos}
(h : p ≠ (s.slice p₀ p₁ h).endPos) : Pos.ofSlice p ≠ s.endPos := by
refine (lt_endPos_iff _).1 (Std.lt_of_lt_of_le ?_ (le_endPos p₁))
simpa [← Slice.Pos.lt_endPos_iff, ofSlice_lt_ofSlice_iff] using h
@[simp]
theorem Slice.Pos.offset_le_rawEndPos {s : Slice} {p : s.Pos} :
p.offset ≤ s.rawEndPos :=
@@ -161,4 +384,38 @@ theorem Pos.isUTF8FirstByte_getUTF8Byte_offset {s : String} {p : s.Pos} {h} :
(s.getUTF8Byte p.offset h).IsUTF8FirstByte := by
simpa [getUTF8Byte_offset] using isUTF8FirstByte_byte
theorem Slice.Pos.get_eq_get_ofSliceTo {s : Slice} {p₀ : s.Pos} {pos : (s.sliceTo p₀).Pos} {h} :
pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
simp [Slice.Pos.get]
theorem Pos.get_eq_get_ofSliceTo {s : String} {p₀ : s.Pos}
{pos : (s.sliceTo p₀).Pos} {h} :
pos.get h = (ofSliceTo pos).get (ofSliceTo_ne_endPos h) := by
simp [Pos.get, Slice.Pos.get]
theorem Slice.Pos.get_eq_get_ofSlice {s : Slice} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} {h'} :
pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
simp [Slice.Pos.get, Nat.add_assoc]
theorem Pos.get_eq_get_ofSlice {s : String} {p₀ p₁ : s.Pos} {h}
{pos : (s.slice p₀ p₁ h).Pos} {h'} :
pos.get h' = (ofSlice pos).get (ofSlice_ne_endPos h') := by
simp [Pos.get, Slice.Pos.get]
theorem Slice.Pos.ofSlice_next {s : Slice} {p₀ p₁ : s.Pos} {h}
{p : (s.slice p₀ p₁ h).Pos} {h'} :
Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
simp only [Slice.Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Slice.Pos.offset_ofSlice]
rw [Slice.Pos.get_eq_get_ofSlice (h' := h')]
simp [Pos.Raw.offsetBy, Nat.add_assoc]
theorem Pos.ofSlice_next {s : String} {p₀ p₁ : s.Pos} {h}
{p : (s.slice p₀ p₁ h).Pos} {h'} :
Pos.ofSlice (p.next h') = (Pos.ofSlice p).next (ofSlice_ne_endPos h') := by
simp only [Pos.ext_iff, Pos.Raw.ext_iff, Slice.Pos.offset_next, Pos.offset_next,
Pos.offset_ofSlice]
rw [Pos.get_eq_get_ofSlice (h' := h')]
simp [Pos.Raw.offsetBy, Nat.add_assoc]
end String

View File

@@ -10,6 +10,10 @@ public import Init.Data.String.Pattern.Char
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega
public section
@@ -25,8 +29,7 @@ instance {c : Char} : NoPrefixForwardPatternModel c :=
theorem isMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
IsMatch c pos ↔
∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
simp only [Model.isMatch_iff, ForwardPatternModel.Matches]
rw [sliceTo_copy_eq_iff_exists_splits]
simp only [Model.isMatch_iff, ForwardPatternModel.Matches, sliceTo_copy_eq_iff_exists_splits]
refine ⟨?_, ?_⟩
· simp only [splits_singleton_iff]
exact fun ⟨t₂, h, h₁, h₂, h₃⟩ => ⟨h, h₁, h₂⟩
@@ -38,12 +41,25 @@ theorem isLongestMatch_iff {c : Char} {s : Slice} {pos : s.Pos} :
∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ s.startPos.get h = c := by
rw [isLongestMatch_iff_isMatch, isMatch_iff]
theorem isLongestMatchAt_iff {c : Char} {s : Slice} {pos pos' : s.Pos} :
IsLongestMatchAt c pos pos' ↔ ∃ h, pos' = pos.next h ∧ pos.get h = c := by
simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, Pos.ofSliceFrom_inj,
Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]
instance {c : Char} : LawfulForwardPatternModel c where
dropPrefix?_eq_some_iff {s} pos := by
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
instance {c : Char} : LawfulToForwardSearcherModel c :=
.defaultImplementation
theorem matchesAt_iff {c : Char} {s : Slice} {pos : s.Pos} :
MatchesAt c pos ↔ ∃ (h : pos ≠ s.endPos), pos.get h = c := by
simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {c : Char} :
matchAt? c pos =
if h₀ : ∃ (h : pos ≠ s.endPos), pos.get h = c then some (pos.next h₀.1) else none := by
split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
end String.Slice.Pattern.Model.Char

View File

@@ -10,6 +10,10 @@ public import Init.Data.String.Pattern.Pred
public import Init.Data.String.Lemmas.Pattern.Basic
import Init.Data.Option.Lemmas
import Init.Data.String.Lemmas.Basic
import Init.Data.String.Lemmas.Order
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Omega
public section
@@ -38,14 +42,27 @@ theorem isLongestMatch_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
∃ (h : s.startPos ≠ s.endPos), pos = s.startPos.next h ∧ p (s.startPos.get h) := by
rw [isLongestMatch_iff_isMatch, isMatch_iff]
theorem isLongestMatchAt_iff {p : Char → Bool} {s : Slice} {pos pos' : s.Pos} :
IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
simp +contextual [Model.isLongestMatchAt_iff, isLongestMatch_iff, Pos.ofSliceFrom_inj,
Pos.get_eq_get_ofSliceFrom, Pos.ofSliceFrom_next]
instance {p : Char → Bool} : LawfulForwardPatternModel p where
dropPrefix?_eq_some_iff {s} pos := by
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?]
exact ⟨fun ⟨h, h₁, h₂⟩ => ⟨h, h₂.symm, h₁⟩, fun ⟨h, h₁, h₂⟩ => ⟨h, h₂, h₁.symm⟩⟩
simp [isLongestMatch_iff, ForwardPattern.dropPrefix?, and_comm, eq_comm (b := pos)]
instance {p : Char → Bool} : LawfulToForwardSearcherModel p :=
.defaultImplementation
theorem matchesAt_iff {p : Char → Bool} {s : Slice} {pos : s.Pos} :
MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Bool} :
matchAt? p pos =
if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
namespace Decidable
instance {p : Char → Prop} [DecidablePred p] : ForwardPatternModel p where
@@ -73,6 +90,16 @@ theorem isLongestMatch_iff_isLongestMatch_decide {p : Char → Prop} [DecidableP
{pos : s.Pos} : IsLongestMatch p pos ↔ IsLongestMatch (decide <| p ·) pos := by
simp [isLongestMatch_iff_isMatch, isMatch_iff_isMatch_decide]
theorem isLongestMatchAt_iff_isLongestMatchAt_decide {p : Char → Prop} [DecidablePred p]
{s : Slice} {pos pos' : s.Pos} :
IsLongestMatchAt p pos pos' ↔ IsLongestMatchAt (decide <| p ·) pos pos' := by
simp [Model.isLongestMatchAt_iff, isLongestMatch_iff_isLongestMatch_decide]
theorem isLongestMatchAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice}
{pos pos' : s.Pos} :
IsLongestMatchAt p pos pos' ↔ ∃ h, pos' = pos.next h ∧ p (pos.get h) := by
simp [isLongestMatchAt_iff_isLongestMatchAt_decide, CharPred.isLongestMatchAt_iff]
theorem dropPrefix?_eq_dropPrefix?_decide {p : Char → Prop} [DecidablePred p] :
ForwardPattern.dropPrefix? p = ForwardPattern.dropPrefix? (decide <| p ·) := rfl
@@ -84,6 +111,15 @@ instance {p : Char → Prop} [DecidablePred p] : LawfulForwardPatternModel p whe
instance {p : Char → Prop} [DecidablePred p] : LawfulToForwardSearcherModel p :=
.defaultImplementation
theorem matchesAt_iff {p : Char → Prop} [DecidablePred p] {s : Slice} {pos : s.Pos} :
MatchesAt p pos ↔ ∃ (h : pos ≠ s.endPos), p (pos.get h) := by
simp [matchesAt_iff_exists_isLongestMatchAt, isLongestMatchAt_iff, exists_comm]
theorem matchAt?_eq {s : Slice} {pos : s.Pos} {p : Char → Prop} [DecidablePred p] :
matchAt? p pos =
if h₀ : ∃ (h : pos ≠ s.endPos), p (pos.get h) then some (pos.next h₀.1) else none := by
split <;> simp_all [isLongestMatchAt_iff, matchesAt_iff]
end Decidable
end String.Slice.Pattern.Model.CharPred

View File

@@ -6,235 +6,4 @@ Author: Markus Himmel
module
prelude
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
import all Init.Data.String.Slice
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect
set_option doc.verso true
/-!
# Verification of {name}`String.Slice.splitToSubslice`
This file verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.
This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/
namespace String.Slice.Pattern.Model
/--
Represents a list of subslices of a slice {name}`s`, the first of which starts at the given
position {name}`startPos`. This is a natural type for a split routine to return.
-/
@[ext]
public structure SlicesFrom {s : Slice} (startPos : s.Pos) : Type where
l : List s.Subslice
any_head? : l.head?.any (·.startInclusive = startPos)
namespace SlicesFrom
/--
A {name}`SlicesFrom` consisting of a single empty subslice at the position {name}`pos`.
-/
public def «at» {s : Slice} (pos : s.Pos) : SlicesFrom pos where
l := [s.subslice pos pos (Slice.Pos.le_refl _)]
any_head? := by simp
@[simp]
public theorem l_at {s : Slice} (pos : s.Pos) :
(SlicesFrom.at pos).l = [s.subslice pos pos (Slice.Pos.le_refl _)] := (rfl)
/--
Concatenating two {name}`SlicesFrom` yields a {name}`SlicesFrom` from the first position.
-/
public def append {s : Slice} {p₁ p₂ : s.Pos} (l₁ : SlicesFrom p₁) (l₂ : SlicesFrom p₂) :
SlicesFrom p₁ where
l := l₁.l ++ l₂.l
any_head? := by simpa using Option.any_or_of_any_left l₁.any_head?
@[simp]
public theorem l_append {s : Slice} {p₁ p₂ : s.Pos} {l₁ : SlicesFrom p₁} {l₂ : SlicesFrom p₂} :
(l₁.append l₂).l = l₁.l ++ l₂.l :=
(rfl)
/--
Given a {lean}`SlicesFrom p₂` and a position {name}`p₁` such that {lean}`p₁ ≤ p₂`, obtain a
{lean}`SlicesFrom p₁` by extending the left end of the first subslice from {name}`p₂` to
{name}`p₁`.
-/
public def extend {s : Slice} (p₁ : s.Pos) {p₂ : s.Pos} (h : p₁ ≤ p₂) (l : SlicesFrom p₂) :
SlicesFrom p₁ where
l :=
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts
any_head? := by split; simp
@[simp]
public theorem l_extend {s : Slice} {p₁ p₂ : s.Pos} (h : p₁ ≤ p₂) {l : SlicesFrom p₂} :
(l.extend p₁ h).l =
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts :=
(rfl)
@[simp]
public theorem extend_self {s : Slice} {p₁ : s.Pos} (l : SlicesFrom p₁) :
l.extend p₁ (Slice.Pos.le_refl _) = l := by
rcases l with ⟨l, h⟩
match l, h with
| st :: sts, h =>
simp at h
simp [SlicesFrom.extend, h]
@[simp]
public theorem extend_extend {s : Slice} {p₀ p₁ p₂ : s.Pos} {h : p₀ ≤ p₁} {h' : p₁ ≤ p₂}
{l : SlicesFrom p₂} : (l.extend p₁ h').extend p₀ h = l.extend p₀ (Slice.Pos.le_trans h h') := by
rcases l with l, h
match l, h with
| st :: sts, h => simp [SlicesFrom.extend]
end SlicesFrom
/--
Noncomputable model implementation of {name}`String.Slice.splitToSubslice` based on
{name}`ForwardPatternModel`. This is supposed to be simple enough to allow deriving higher-level
API lemmas about the public splitting functions.
-/
public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
(start : s.Pos) : SlicesFrom start :=
if h : start = s.endPos then
.at start
else
match hd : matchAt? pat start with
| some pos =>
have : start < pos := (matchAt?_eq_some_iff.1 hd).lt
(SlicesFrom.at start).append (Model.split pat pos)
| none => (Model.split pat (start.next h)).extend start (by simp)
termination_by start
@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
Model.split pat s.endPos = SlicesFrom.at s.endPos := by
simp [Model.split]
public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
{s : Slice} {start stop : s.Pos} (h : IsLongestMatchAt pat start stop) :
Model.split pat start = (SlicesFrom.at start).append (Model.split pat stop) := by
rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
split
· congr <;> exact (matchAt?_eq_some_iff.1 _).eq h
· simp [matchAt?_eq_some_iff.2 _] at *
public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
{start stop : s.Pos} (h₀ : start stop) (h : p, start p p < stop ¬ MatchesAt pat p) :
Model.split pat start = (SlicesFrom.extend start h₀ (Model.split pat stop)) := by
induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
by_cases h' : start < stop
· rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
split
· rename_i heq
simp [matchAt?_eq_none_iff.2 _] at heq
· rw [ih, SlicesFrom.extend_extend]
· simp
· simp [h']
· refine fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
· obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
simp
/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.
This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
match l with
| [] => [s.subsliceFrom currPos]
| .rejected .. :: l => splitFromSteps currPos l
| .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l
theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
[ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos pos')
(h' : p, pos p p < pos' ¬ MatchesAt pat p)
(h : IsValidSearchFrom pat pos' l) :
splitFromSteps pos l = ((Model.split pat pos').extend pos h₀).l := by
induction h generalizing pos with
| endPos =>
simp only [splitFromSteps, Model.split, reduceDIte, SlicesFrom.l_extend, List.head?_cons,
Option.any_some]
split
simp_all only [SlicesFrom.l_at, List.cons.injEq, List.nil_eq, List.head?_cons, Option.any_some,
decide_eq_true_eq, heq_eq_eq, and_true]
rename_i h
simp only [ h.1]
ext <;> simp
| matched h valid ih =>
simp only [splitFromSteps]
rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h]
simp only [SlicesFrom.append, SlicesFrom.at, List.cons_append, List.nil_append,
SlicesFrom.l_extend, List.cons.injEq]
refine ?_, ?_
· ext <;> simp
· rw [ih _ (Slice.Pos.le_refl _), SlicesFrom.extend_self]
exact fun p hp₁ hp₂ => False.elim (Std.lt_irrefl (Std.lt_of_le_of_lt hp₁ hp₂))
| mismatched h rej valid ih =>
simp only [splitFromSteps]
rename_i l startPos endPos
rw [split_eq_of_not_matchesAt (Std.le_of_lt h) rej, SlicesFrom.extend_extend, ih]
intro p hp₁ hp₂
by_cases hp : p < startPos
· exact h' p hp₁ hp
· exact rej _ (Std.not_lt.1 hp) hp₂
theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice Type}
[ToForwardSearcher pat σ]
[ s, Std.Iterator (σ s) Id (SearchStep s)] [ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
(it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
(Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
splitFromSteps currPos it.toList := by
induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
conv => rhs; rw [Std.Iter.toList_eq_match_step]
simp only [Std.Iter.toIterM_mk]
cases it.step using Std.PlausibleIterStep.casesOn with
| yield it out h =>
match out with
| .matched startPos endPos => simp [splitFromSteps, ihy h]
| .rejected startPos endPos => simp [splitFromSteps, ihy h]
| skip it h => simp [ ihs h]
| done =>
simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
simp
theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice Type} [ToForwardSearcher pat σ]
[ s, Std.Iterator (σ s) Id (SearchStep s)] [ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
(s.splitToSubslice pat).toList = splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]
end Model
open Model
public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
{σ : Slice Type} [ToForwardSearcher pat σ] [ s, Std.Iterator (σ s) Id (SearchStep s)]
[ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
(s.splitToSubslice pat).toList = (Model.split pat s.startPos).l := by
rw [toList_splitToSubslice_eq_splitFromSteps, IsValidSearchFrom.splitFromSteps_eq_extend_split pat _
s.startPos s.startPos (Std.le_refl _) _ (LawfulToForwardSearcherModel.isValidSearchFrom_toList _),
SlicesFrom.extend_self]
simp
end String.Slice.Pattern
public import Init.Data.String.Lemmas.Pattern.Split.Basic


@@ -0,0 +1,240 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Markus Himmel
-/
module
prelude
public import Init.Data.String.Lemmas.Pattern.Basic
public import Init.Data.String.Slice
import all Init.Data.String.Slice
import Init.Data.Option.Lemmas
import Init.Data.String.Termination
import Init.Data.String.Lemmas.Order
import Init.ByCases
import Init.Data.Order.Lemmas
import Init.Data.String.OrderInstances
import Init.Data.Iterators.Lemmas.Basic
import Init.Data.Iterators.Lemmas.Consumers.Collect
set_option doc.verso true
/-!
# Verification of {name}`String.Slice.splitToSubslice`
This file verifies the {name}`String.Slice.splitToSubslice` function by relating it to a model
implementation based on the {name}`String.Slice.Pattern.Model.ForwardPatternModel` class.
This gives a low-level correctness proof from which higher-level API lemmas can be derived.
-/
namespace String.Slice.Pattern.Model
/--
Represents a list of subslices of a slice {name}`s`, the first of which starts at the given
position {name}`startPos`. This is a natural type for a split routine to return.
-/
@[ext]
public structure SlicesFrom {s : Slice} (startPos : s.Pos) : Type where
l : List s.Subslice
any_head? : l.head?.any (·.startInclusive = startPos)
namespace SlicesFrom
/--
A {name}`SlicesFrom` consisting of a single empty subslice at the position {name}`pos`.
-/
public def «at» {s : Slice} (pos : s.Pos) : SlicesFrom pos where
l := [s.subslice pos pos (Slice.Pos.le_refl _)]
any_head? := by simp
@[simp]
public theorem l_at {s : Slice} (pos : s.Pos) :
(SlicesFrom.at pos).l = [s.subslice pos pos (Slice.Pos.le_refl _)] := (rfl)
/--
Concatenating two {name}`SlicesFrom` yields a {name}`SlicesFrom` from the first position.
-/
public def append {s : Slice} {p₁ p₂ : s.Pos} (l₁ : SlicesFrom p₁) (l₂ : SlicesFrom p₂) :
SlicesFrom p₁ where
l := l₁.l ++ l₂.l
any_head? := by simpa using Option.any_or_of_any_left l₁.any_head?
@[simp]
public theorem l_append {s : Slice} {p₁ p₂ : s.Pos} {l₁ : SlicesFrom p₁} {l₂ : SlicesFrom p₂} :
(l₁.append l₂).l = l₁.l ++ l₂.l :=
(rfl)
/--
Given a {lean}`SlicesFrom p₂` and a position {name}`p₁` such that {lean}`p₁ ≤ p₂`, obtain a
{lean}`SlicesFrom p₁` by extending the left end of the first subslice from {name}`p₂` to
{name}`p₁`.
-/
public def extend {s : Slice} (p₁ : s.Pos) {p₂ : s.Pos} (h : p₁ ≤ p₂) (l : SlicesFrom p₂) :
SlicesFrom p₁ where
l :=
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts
any_head? := by split; simp
@[simp]
public theorem l_extend {s : Slice} {p₁ p₂ : s.Pos} (h : p₁ ≤ p₂) {l : SlicesFrom p₂} :
(l.extend p₁ h).l =
match l.l, l.any_head? with
| st :: sts, h => st.extendLeft p₁ (by simp_all) :: sts :=
(rfl)
@[simp]
public theorem extend_self {s : Slice} {p₁ : s.Pos} (l : SlicesFrom p₁) :
l.extend p₁ (Slice.Pos.le_refl _) = l := by
rcases l with ⟨l, h⟩
match l, h with
| st :: sts, h =>
simp at h
simp [SlicesFrom.extend, h]
@[simp]
public theorem extend_extend {s : Slice} {p₀ p₁ p₂ : s.Pos} {h : p₀ ≤ p₁} {h' : p₁ ≤ p₂}
{l : SlicesFrom p₂} : (l.extend p₁ h').extend p₀ h = l.extend p₀ (Slice.Pos.le_trans h h') := by
rcases l with ⟨l, h⟩
match l, h with
| st :: sts, h => simp [SlicesFrom.extend]
end SlicesFrom
/--
Noncomputable model implementation of {name}`String.Slice.splitToSubslice` based on
{name}`ForwardPatternModel`. This is supposed to be simple enough to allow deriving higher-level
API lemmas about the public splitting functions.
-/
public protected noncomputable def split {ρ : Type} (pat : ρ) [ForwardPatternModel pat] {s : Slice}
(start : s.Pos) : SlicesFrom start :=
if h : start = s.endPos then
.at start
else
match hd : matchAt? pat start with
| some pos =>
have : start < pos := (matchAt?_eq_some_iff.1 hd).lt
(SlicesFrom.at start).append (Model.split pat pos)
| none => (Model.split pat (start.next h)).extend start (by simp)
termination_by start
@[simp]
public theorem split_endPos {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice} :
Model.split pat s.endPos = SlicesFrom.at s.endPos := by
simp [Model.split]
public theorem split_eq_of_isLongestMatchAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat]
{s : Slice} {start stop : s.Pos} (h : IsLongestMatchAt pat start stop) :
Model.split pat start = (SlicesFrom.at start).append (Model.split pat stop) := by
rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h.lt)]
split
· congr <;> exact (matchAt?_eq_some_iff.1 ‹_›).eq h
· simp [matchAt?_eq_some_iff.2 ‹_›] at *
public theorem split_eq_of_not_matchesAt {ρ : Type} {pat : ρ} [ForwardPatternModel pat] {s : Slice}
{start stop : s.Pos} (h₀ : start ≤ stop) (h : ∀ p, start ≤ p → p < stop → ¬ MatchesAt pat p) :
Model.split pat start = (SlicesFrom.extend start h₀ (Model.split pat stop)) := by
induction start using WellFounded.induction Slice.Pos.wellFounded_gt with | h start ih
by_cases h' : start < stop
· rw [Model.split, dif_neg (Slice.Pos.ne_endPos_of_lt h')]
have : ¬ MatchesAt pat start := h start (Slice.Pos.le_refl _) h'
split
· rename_i heq
simp [matchAt?_eq_none_iff.2 ‹_›] at heq
· rw [ih, SlicesFrom.extend_extend]
· simp
· simp [h']
· refine fun p hp₁ hp₂ => h p (Std.le_of_lt (by simpa using hp₁)) hp₂
· obtain rfl : start = stop := Std.le_antisymm h₀ (Std.not_lt.1 h')
simp
/--
Splits a slice {name}`s` into subslices from a list of {lean}`SearchStep s`.
This is an intermediate step in the verification. The equivalence of
{name}`String.Slice.splitToSubslice` and {name}`splitFromSteps` is pure "iteratorology", while
the equivalence of {name}`splitFromSteps` and {name}`split` is the actual correctness proof for the
splitting routine.
-/
def splitFromSteps {s : Slice} (currPos : s.Pos) (l : List (SearchStep s)) : List s.Subslice :=
match l with
| [] => [s.subsliceFrom currPos]
| .rejected .. :: l => splitFromSteps currPos l
| .matched p q :: l => s.subslice! currPos p :: splitFromSteps q l
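The recursion above can be illustrated on a hypothetical toy version over `Nat` endpoints instead of `s.Pos` (the names `Step` and `splitFromStepsNat` are illustrative and not from this file): rejected steps are skipped, while a matched step `(p, q)` emits the piece ending at `p` and continues after the separator at `q`.

```lean
-- Toy analogue of `splitFromSteps`, with pieces represented as `Nat` intervals.
inductive Step where
  | rejected (startPos endPos : Nat)
  | matched (startPos endPos : Nat)

def splitFromStepsNat (currPos : Nat) : List Step → List (Nat × Nat)
  | [] => [(currPos, currPos)]           -- stand-in for `s.subsliceFrom currPos`
  | .rejected .. :: l => splitFromStepsNat currPos l
  | .matched p q :: l => (currPos, p) :: splitFromStepsNat q l

example : splitFromStepsNat 0 [.rejected 0 1, .matched 1 2] = [(0, 1), (2, 2)] := rfl
```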
theorem IsValidSearchFrom.splitFromSteps_eq_extend_split {ρ : Type} (pat : ρ)
[ForwardPatternModel pat] (l : List (SearchStep s)) (pos pos' : s.Pos) (h₀ : pos ≤ pos')
(h' : ∀ p, pos ≤ p → p < pos' → ¬ MatchesAt pat p)
(h : IsValidSearchFrom pat pos' l) :
splitFromSteps pos l = ((Model.split pat pos').extend pos h₀).l := by
induction h generalizing pos with
| endPos =>
simp only [splitFromSteps, Model.split, reduceDIte, SlicesFrom.l_extend, List.head?_cons,
Option.any_some]
split
simp_all only [SlicesFrom.l_at, List.cons.injEq, List.nil_eq, List.head?_cons, Option.any_some,
decide_eq_true_eq, heq_eq_eq, and_true]
rename_i h
simp only [← h.1]
ext <;> simp
| matched h valid ih =>
simp only [splitFromSteps]
rw [subslice!_eq_subslice h₀, split_eq_of_isLongestMatchAt h]
simp only [SlicesFrom.append, SlicesFrom.at, List.cons_append, List.nil_append,
SlicesFrom.l_extend, List.cons.injEq]
refine ⟨?_, ?_⟩
· ext <;> simp
· rw [ih _ (Slice.Pos.le_refl _), SlicesFrom.extend_self]
exact fun p hp₁ hp₂ => False.elim (Std.lt_irrefl (Std.lt_of_le_of_lt hp₁ hp₂))
| mismatched h rej valid ih =>
simp only [splitFromSteps]
rename_i l startPos endPos
rw [split_eq_of_not_matchesAt (Std.le_of_lt h) rej, SlicesFrom.extend_extend, ih]
intro p hp₁ hp₂
by_cases hp : p < startPos
· exact h' p hp₁ hp
· exact rej _ (Std.not_lt.1 hp) hp₂
theorem SplitIterator.toList_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type}
[ToForwardSearcher pat σ]
[∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] {s : Slice}
(it : Std.Iter (α := σ s) (SearchStep s)) (currPos : s.Pos) :
(Std.Iter.mk (α := SplitIterator pat s) (.operating currPos it)).toList =
splitFromSteps currPos it.toList := by
induction it using Std.Iter.inductSteps generalizing currPos with | step it ihy ihs
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
conv => rhs; rw [Std.Iter.toList_eq_match_step]
simp only [Std.Iter.toIterM_mk]
cases it.step using Std.PlausibleIterStep.casesOn with
| yield it out h =>
match out with
| .matched startPos endPos => simp [splitFromSteps, ihy h]
| .rejected startPos endPos => simp [splitFromSteps, ihy h]
| skip it h => simp [← ihs h]
| done =>
simp only [Id.run_pure, Std.Shrink.inflate_deflate, Std.IterM.Step.toPure_yield,
Std.PlausibleIterStep.yield, Std.IterM.toIter_mk, splitFromSteps, List.cons.injEq, true_and]
rw [Std.Iter.toList_eq_match_step, Std.Iter.step_eq]
simp
theorem toList_splitToSubslice_eq_splitFromSteps {ρ : Type} {pat : ρ} {σ : Slice → Type} [ToForwardSearcher pat σ]
[∀ s, Std.Iterator (σ s) Id (SearchStep s)] [∀ s, Std.Iterators.Finite (σ s) Id] (s : Slice) :
(s.splitToSubslice pat).toList = splitFromSteps s.startPos (ToForwardSearcher.toSearcher pat s).toList := by
rw [splitToSubslice, SplitIterator.toList_eq_splitFromSteps]
end Model
open Model
public theorem toList_splitToSubslice_eq_modelSplit {ρ : Type} (pat : ρ) [ForwardPatternModel pat]
{σ : Slice → Type} [ToForwardSearcher pat σ] [∀ s, Std.Iterator (σ s) Id (SearchStep s)]
[∀ s, Std.Iterators.Finite (σ s) Id] [LawfulToForwardSearcherModel pat] (s : Slice) :
(s.splitToSubslice pat).toList = (Model.split pat s.startPos).l := by
rw [toList_splitToSubslice_eq_splitFromSteps, IsValidSearchFrom.splitFromSteps_eq_extend_split pat _
s.startPos s.startPos (Std.le_refl _) _ (LawfulToForwardSearcherModel.isValidSearchFrom_toList _),
SlicesFrom.extend_self]
simp
end String.Slice.Pattern


@@ -101,6 +101,17 @@ theorem toArray_mk {xs : Array α} (h : xs.size = n) : (Vector.mk xs h).toArray
@[simp] theorem foldr_mk {f : α → β → β} {b : β} {xs : Array α} (h : xs.size = n) :
(Vector.mk xs h).foldr f b = xs.foldr f b := rfl
@[simp, grind =] theorem foldlM_toArray [Monad m]
{f : β → α → m β} {init : β} {xs : Vector α n} :
xs.toArray.foldlM f init = xs.foldlM f init := rfl
@[simp, grind =] theorem foldrM_toArray [Monad m]
{f : α → β → m β} {init : β} {xs : Vector α n} :
xs.toArray.foldrM f init = xs.foldrM f init := rfl
@[simp, grind =] theorem foldl_toArray (f : β → α → β) {init : β} {xs : Vector α n} :
xs.toArray.foldl f init = xs.foldl f init := rfl
@[simp] theorem drop_mk {xs : Array α} {h : xs.size = n} {i} :
(Vector.mk xs h).drop i = Vector.mk (xs.extract i xs.size) (by simp [h]) := rfl
@@ -514,17 +525,32 @@ protected theorem ext {xs ys : Vector α n} (h : (i : Nat) → (_ : i < n) → x
@[grind =_] theorem toList_toArray {xs : Vector α n} : xs.toArray.toList = xs.toList := rfl
theorem toArray_toList {xs : Vector α n} : xs.toList.toArray = xs.toArray := rfl
@[simp, grind =] theorem foldlM_toList [Monad m]
{f : β → α → m β} {init : β} {xs : Vector α n} :
xs.toList.foldlM f init = xs.foldlM f init := by
rw [← foldlM_toArray, ← toArray_toList, List.foldlM_toArray]
@[simp, grind =] theorem foldl_toList (f : β → α → β) {init : β} {xs : Vector α n} :
xs.toList.foldl f init = xs.foldl f init :=
List.foldl_eq_foldlM .. ▸ foldlM_toList ..
@[simp, grind =] theorem foldrM_toList [Monad m]
{f : α → β → m β} {init : β} {xs : Vector α n} :
xs.toList.foldrM f init = xs.foldrM f init := by
rw [← foldrM_toArray, ← toArray_toList, List.foldrM_toArray]
@[simp, grind =] theorem foldr_toList (f : α → β → β) {init : β} {xs : Vector α n} :
xs.toList.foldr f init = xs.foldr f init :=
List.foldr_eq_foldrM .. ▸ foldrM_toList ..
@[simp, grind =] theorem toList_mk : (Vector.mk xs h).toList = xs.toList := rfl
@[simp, grind =] theorem sum_toList [Add α] [Zero α] {xs : Vector α n} :
xs.toList.sum = xs.sum := by
rw [← toList_toArray, Array.sum_toList, sum_toArray]
@[simp, grind =]
theorem toList_zip {as : Vector α n} {bs : Vector β n} :
(Vector.zip as bs).toList = List.zip as.toList bs.toList := by
rw [mk_zip_mk, toList_mk, Array.toList_zip, toList_toArray, toList_toArray]
@[simp] theorem getElem_toList {xs : Vector α n} {i : Nat} (h : i < xs.toList.length) :
xs.toList[i] = xs[i]'(by simpa using h) := by
cases xs
@@ -609,6 +635,11 @@ theorem toList_swap {xs : Vector α n} {i j} (hi hj) :
@[simp] theorem toList_take {xs : Vector α n} {i} : (xs.take i).toList = xs.toList.take i := by
simp [toList]
@[simp, grind =]
theorem toList_zip {as : Vector α n} {bs : Vector β n} :
(Vector.zip as bs).toList = List.zip as.toList bs.toList := by
rw [mk_zip_mk, toList_mk, Array.toList_zip, toList_toArray, toList_toArray]
@[simp] theorem toList_zipWith {f : α → β → γ} {as : Vector α n} {bs : Vector β n} :
(Vector.zipWith f as bs).toList = List.zipWith f as.toList bs.toList := by
rcases as with ⟨as, rfl⟩
@@ -703,6 +734,9 @@ protected theorem eq_empty {xs : Vector α 0} : xs = #v[] := by
/-! ### size -/
theorem size_singleton {x : α} : #v[x].size = 1 := by
simp
theorem eq_empty_of_size_eq_zero {xs : Vector α n} (h : n = 0) : xs = #v[].cast h.symm := by
rcases xs with ⟨xs, rfl⟩
apply toArray_inj.1
@@ -2448,6 +2482,21 @@ theorem foldl_eq_foldr_reverse {xs : Vector α n} {f : β → α → β} {b} :
theorem foldr_eq_foldl_reverse {xs : Vector α n} {f : α → β → β} {b} :
xs.foldr f b = xs.reverse.foldl (fun x y => f y x) b := by simp
theorem foldl_eq_apply_foldr {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulRightIdentity f init] :
xs.foldl f x = f x (xs.foldr f init) := by
simp [← foldl_toList, ← foldr_toList, List.foldl_eq_apply_foldr]
theorem foldr_eq_apply_foldl {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulLeftIdentity f init] :
xs.foldr f x = f (xs.foldl f init) x := by
simp [← foldl_toList, ← foldr_toList, List.foldr_eq_apply_foldl]
theorem foldr_eq_foldl {xs : Vector α n} {f : α → α → α}
[Std.Associative f] [Std.LawfulIdentity f init] :
xs.foldr f init = xs.foldl f init := by
simp [foldl_eq_apply_foldr, Std.LawfulLeftIdentity.left_id]
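As a hedged sanity check of `foldr_eq_foldl` on a vector literal, assuming the core `Std.Associative (· + ·)` and `Std.LawfulIdentity (· + ·) 0` instances for `Nat` are available:

```lean
example : #v[1, 2, 3].foldr (· + ·) 0 = #v[1, 2, 3].foldl (· + ·) 0 :=
  Vector.foldr_eq_foldl
```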
theorem foldl_assoc {op : α → α → α} [ha : Std.Associative op] {xs : Vector α n} {a₁ a₂} :
xs.foldl op (op a₁ a₂) = op a₁ (xs.foldl op a₂) := by
rcases xs with ⟨xs, rfl⟩
@@ -3064,8 +3113,25 @@ theorem sum_append [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
{as₁ as₂ : Vector α n} : (as₁ ++ as₂).sum = as₁.sum + as₂.sum := by
simp [← sum_toList, List.sum_append]
@[simp, grind =]
theorem sum_singleton [Add α] [Zero α] [Std.LawfulRightIdentity (· + ·) (0 : α)] {x : α} :
#v[x].sum = x := by
simp [← sum_toList, Std.LawfulRightIdentity.right_id x]
@[simp, grind =]
theorem sum_push [Add α] [Zero α] [Std.Associative (α := α) (· + ·)]
[Std.LawfulIdentity (· + ·) (0 : α)] {xs : Vector α n} {x : α} :
(xs.push x).sum = xs.sum + x := by
simp [← sum_toArray]
@[simp, grind =]
theorem sum_reverse [Zero α] [Add α] [Std.Associative (α := α) (· + ·)]
[Std.Commutative (α := α) (· + ·)]
[Std.LawfulLeftIdentity (α := α) (· + ·) 0] (xs : Vector α n) : xs.reverse.sum = xs.sum := by
simp [← sum_toList, List.sum_reverse]
theorem sum_eq_foldl [Zero α] [Add α]
[Std.Associative (α := α) (· + ·)] [Std.LawfulIdentity (· + ·) (0 : α)]
{xs : Vector α n} :
xs.sum = xs.foldl (b := 0) (· + ·) := by
simp [← sum_toList, List.sum_eq_foldl]
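A hedged instance of `sum_eq_foldl` on a literal, again assuming the `Nat` instances for `(· + ·)` and `0` exist:

```lean
example : #v[1, 2, 3].sum = #v[1, 2, 3].foldl (b := 0) (· + ·) :=
  Vector.sum_eq_foldl
```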


@@ -3098,7 +3098,7 @@ Examples:
* `[["a"], ["b", "c"]].flatten = ["a", "b", "c"]`
* `[["a"], [], ["b", "c"], ["d", "e", "f"]].flatten = ["a", "b", "c", "d", "e", "f"]`
-/
def List.flatten : List (List α) → List α
noncomputable def List.flatten : List (List α) → List α
| nil => nil
| cons l L => List.append l (flatten L)
@@ -3125,7 +3125,7 @@ Examples:
* `[2, 3, 2].flatMap List.range = [0, 1, 0, 1, 2, 0, 1]`
* `["red", "blue"].flatMap String.toList = ['r', 'e', 'd', 'b', 'l', 'u', 'e']`
-/
@[inline] def List.flatMap {α : Type u} {β : Type v} (b : α → List β) (as : List α) : List β := flatten (map b as)
@[inline] noncomputable def List.flatMap {α : Type u} {β : Type v} (b : α → List β) (as : List α) : List β := flatten (map b as)
/--
`Array α` is the type of [dynamic arrays](https://en.wikipedia.org/wiki/Dynamic_array) with elements
@@ -3453,7 +3453,7 @@ def String.utf8EncodeChar (c : Char) : List UInt8 :=
/-- Encode a list of characters (Unicode scalar value) in UTF-8. This is an inefficient model
implementation. Use `List.asString` instead. -/
def List.utf8Encode (l : List Char) : ByteArray :=
noncomputable def List.utf8Encode (l : List Char) : ByteArray :=
l.flatMap String.utf8EncodeChar |>.toByteArray
/-- A byte array is valid UTF-8 if it is of the form `List.Internal.utf8Encode m` for some `m`.


@@ -30,6 +30,7 @@ variable {α : Sort _} {β : α → Sort _} {γ : (a : α) → β a → Sort _}
set_option doc.verso true
namespace WellFounded
open Relation
/--
The function implemented as the loop {lean}`opaqueFix R F a = F a (fun a _ => opaqueFix R F a)`.
@@ -85,6 +86,23 @@ public theorem extrinsicFix_eq_apply [∀ a, Nonempty (C a)] {R : α → α
simp only [extrinsicFix, dif_pos h]
rw [WellFounded.fix_eq]
public theorem extrinsicFix_invImage {α' : Sort _} [∀ a, Nonempty (C a)] (R : α → α → Prop) (f : α' → α)
(F : ∀ a, (∀ a', R a' a → C a') → C a) (F' : ∀ a, (∀ a', R (f a') (f a) → C (f a')) → C (f a))
(h : ∀ a r, F (f a) r = F' a fun a' hR => r (f a') hR) (a : α') (h : WellFounded R) :
extrinsicFix (C := (C <| f ·)) (InvImage R f) F' a = extrinsicFix (C := C) R F (f a) := by
have h' := h
rcases h with ⟨h⟩
specialize h (f a)
have : Acc (InvImage R f) a := InvImage.accessible _ h
clear h
induction this
rename_i ih
rw [extrinsicFix_eq_apply, extrinsicFix_eq_apply, h]
· congr; ext a x
rw [ih _ x]
· assumption
· exact InvImage.wf _ _
/--
A fixpoint combinator that allows for deferred proofs of termination.
@@ -242,4 +260,273 @@ nontrivial properties about it.
-/
add_decl_doc extrinsicFix₃
/--
A fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each input {given}`a`, {lean}`partialExtrinsicFix R F a` can be verified given a *partial* termination
proof. The precise semantics are as follows.
If {lean}`Acc R a` does not hold, {lean}`partialExtrinsicFix R F a` might run forever. In this case,
nothing interesting can be proved about the recursive function; it is opaque and behaves like a
recursive function with the `partial` modifier.
If {lean}`Acc R a` _does_ hold, {lean}`partialExtrinsicFix R F a` is equivalent to
{lean}`F a (fun a' _ => partialExtrinsicFix R F a')`, both logically and regarding its termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix R F a` is equivalent to
{lean}`WellFounded.fix _ F`.
-/
@[inline]
public def partialExtrinsicFix [∀ a, Nonempty (C a)] (R : α → α → Prop)
(F : ∀ a, (∀ a', R a' a → C a') → C a) (a : α) : C a :=
extrinsicFix (α := { a' : α // a' = a ∨ TransGen R a' a }) (C := (C ·.1))
(fun p q => R p.1 q.1)
(fun a recur => F a.1 fun a' hR => recur ⟨a', by
rcases a.property with ha | ha
· exact Or.inr (TransGen.single (ha ▸ hR))
· apply Or.inr
apply TransGen.trans ?_ ‹_›
apply TransGen.single
assumption⟩ ‹_›) ⟨a, Or.inl rfl⟩
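As a hedged usage sketch (the name `halveSum` is illustrative and not part of this file): a recursive function over `<` on `Nat`, defined with the combinator above. Since `<` is well-founded, `partialExtrinsicFix_eq_apply` then recovers the usual unfolding equation for it.

```lean
-- Illustrative only: the recursive call on `n / 2` decreases with respect to `<`.
noncomputable def halveSum : Nat → Nat :=
  partialExtrinsicFix (· < ·) fun n recur =>
    if h : n = 0 then 0
    else n + recur (n / 2) (Nat.div_lt_self (Nat.pos_of_ne_zero h) (by decide))
```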
public theorem partialExtrinsicFix_eq_apply_of_acc [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a} {a : α} (h : Acc R a) :
partialExtrinsicFix R F a = F a (fun a' _ => partialExtrinsicFix R F a') := by
simp only [partialExtrinsicFix]
rw [extrinsicFix_eq_apply]
congr; ext a' hR
let f (x : { x : α // x = a' ∨ TransGen R x a' }) : { x : α // x = a ∨ TransGen R x a } :=
⟨x.val, by
cases x.property
· rename_i h
rw [h]
exact Or.inr (.single hR)
· rename_i h
apply Or.inr
refine TransGen.trans h ?_
exact .single hR⟩
have := extrinsicFix_invImage (C := (C ·.val)) (R := (R ·.1 ·.1)) (f := f)
(F := fun a r => F a.1 fun a' hR => r ⟨a', Or.inr (by rcases a.2 with ha | ha; exact .single (ha ▸ hR); exact .trans (.single hR) ‹_›)⟩ hR)
(F' := fun a r => F a.1 fun a' hR => r ⟨a', by rcases a.2 with ha | ha; exact .inr (.single (ha ▸ hR)); exact .inr (.trans (.single hR) ‹_›)⟩ hR)
unfold InvImage at this
rw [this]
· simp +zetaDelta
· constructor
intro x
refine InvImage.accessible _ ?_
cases x.2 <;> rename_i hx
· rwa [hx]
· exact h.inv_of_transGen hx
· constructor
intro x
refine InvImage.accessible _ ?_
cases x.2 <;> rename_i hx
· rwa [hx]
· exact h.inv_of_transGen hx
public theorem partialExtrinsicFix_eq_apply [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a} {a : α} (wf : WellFounded R) :
partialExtrinsicFix R F a = F a (fun a' _ => partialExtrinsicFix R F a') :=
partialExtrinsicFix_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix_eq_fix [∀ a, Nonempty (C a)] {R : α → α → Prop}
{F : ∀ a, (∀ a', R a' a → C a') → C a}
(wf : WellFounded R) {a : α} :
partialExtrinsicFix R F a = wf.fix F a := by
have h := wf.apply a
induction h with | intro a' h ih
rw [partialExtrinsicFix_eq_apply_of_acc (Acc.intro _ h), WellFounded.fix_eq]
congr 1; ext a'' hR
exact ih _ hR
/--
A 2-ary fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix₂ R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each pair of inputs {given}`a` and {given}`b`, {lean}`partialExtrinsicFix₂ R F a b` can be verified
given a *partial* termination proof. The precise semantics are as follows.
If {lean}`Acc R ⟨a, b⟩` does not hold, {lean}`partialExtrinsicFix₂ R F a b` might run forever. In this
case, nothing interesting can be proved about the recursive function; it is opaque and behaves like
a recursive function with the `partial` modifier.
If {lean}`Acc R ⟨a, b⟩` _does_ hold, {lean}`partialExtrinsicFix₂ R F a b` is equivalent to
{lean}`F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b')`, both logically and regarding its
termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix₂ R F a b` is equivalent to
a well-founded fixpoint.
-/
@[inline]
public def partialExtrinsicFix₂ [∀ a b, Nonempty (C₂ a b)]
(R : (a : α) ×' β a → (a : α) ×' β a → Prop)
(F : ∀ (a : α) (b : β a), (∀ (a' : α) (b' : β a'), R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b)
(a : α) (b : β a) :
C₂ a b :=
extrinsicFix₂ (α := α) (β := fun a' => { b' : β a' // (PSigma.mk a' b') = (PSigma.mk a b) ∨ TransGen R ⟨a', b'⟩ ⟨a, b⟩ })
(C₂ := (C₂ · ·.1))
(fun p q => R ⟨p.1, p.2.1⟩ ⟨q.1, q.2.1⟩)
(fun a b recur => F a b.1 fun a' b' hR => recur a' ⟨b', Or.inr (by
rcases b.property with hb | hb
· exact .single (hb ▸ hR)
· apply TransGen.trans ?_ ‹_›
apply TransGen.single
assumption)⟩ ‹_›) a ⟨b, Or.inl rfl⟩
public theorem partialExtrinsicFix₂_eq_partialExtrinsicFix [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : ∀ (a : α) (b : β a), (∀ (a' : α) (b' : β a'), R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (h : Acc R ⟨a, b⟩) :
partialExtrinsicFix₂ R F a b = partialExtrinsicFix (α := PSigma β) (C := fun a => C₂ a.1 a.2) R (fun p r => F p.1 p.2 fun a' b' hR => r ⟨a', b'⟩ hR) ⟨a, b⟩ := by
simp only [partialExtrinsicFix, partialExtrinsicFix₂, extrinsicFix₂]
let f (x : ((a' : α) ×' { b' // PSigma.mk a' b' = ⟨a, b⟩ ∨ TransGen R ⟨a', b'⟩ ⟨a, b⟩ })) : { a' // a' = ⟨a, b⟩ ∨ TransGen R a' ⟨a, b⟩ } :=
⟨⟨x.1, x.2.1⟩, x.2.2⟩
have := extrinsicFix_invImage (C := fun a => C₂ a.1.1 a.1.2) (f := f) (R := (R ·.1 ·.1))
(F := fun a r => F a.1.1 a.1.2 fun a' b' hR => r ⟨⟨a', b'⟩, ?refine_a⟩ hR)
(F' := fun a r => F a.1 a.2.1 fun a' b' hR => r ⟨a', b', ?refine_b⟩ hR)
(a := ⟨a, b, ?refine_c⟩); rotate_left
· cases a.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· cases a.2.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· exact .inl rfl
unfold InvImage f at this
simp at this
rw [this]
constructor
intro x
apply InvImage.accessible
cases x.2 <;> rename_i heq
· rwa [heq]
· exact h.inv_of_transGen heq
public theorem partialExtrinsicFix₂_eq_apply_of_acc [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : ∀ (a : α) (b : β a), (∀ (a' : α) (b' : β a'), R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (wf : Acc R ⟨a, b⟩) :
partialExtrinsicFix₂ R F a b = F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b') := by
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix wf, partialExtrinsicFix_eq_apply_of_acc wf]
congr 1; ext a' b' hR
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix (wf.inv hR)]
public theorem partialExtrinsicFix₂_eq_apply [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : ∀ (a : α) (b : β a), (∀ (a' : α) (b' : β a'), R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
{a : α} {b : β a} (wf : WellFounded R) :
partialExtrinsicFix₂ R F a b = F a b (fun a' b' _ => partialExtrinsicFix₂ R F a' b') :=
partialExtrinsicFix₂_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix₂_eq_fix [∀ a b, Nonempty (C₂ a b)]
{R : (a : α) ×' β a → (a : α) ×' β a → Prop}
{F : ∀ a b, (∀ a' b', R ⟨a', b'⟩ ⟨a, b⟩ → C₂ a' b') → C₂ a b}
(wf : WellFounded R) {a b} :
partialExtrinsicFix₂ R F a b = wf.fix (fun x G => F x.1 x.2 (fun a b h => G ⟨a, b⟩ h)) ⟨a, b⟩ := by
rw [partialExtrinsicFix₂_eq_partialExtrinsicFix (wf.apply _), partialExtrinsicFix_eq_fix wf]
/--
A 3-ary fixpoint combinator that can be used to construct recursive functions with an
*extrinsic, partial* proof of termination.
Given a relation {name}`R` and a fixpoint functional {name}`F` which must be decreasing with respect
to {name}`R`, {lean}`partialExtrinsicFix₃ R F` is the recursive function obtained by having {name}`F` call
itself recursively.
For each triple of inputs {given}`a`, {given}`b` and {given}`c`, {lean}`partialExtrinsicFix₃ R F a b c` can be
verified given a *partial* termination proof. The precise semantics are as follows.
If {lean}`Acc R ⟨a, b, c⟩` does not hold, {lean}`partialExtrinsicFix₃ R F a b c` might run forever. In this
case, nothing interesting can be proved about the recursive function; it is opaque and behaves like
a recursive function with the `partial` modifier.
If {lean}`Acc R ⟨a, b, c⟩` _does_ hold, {lean}`partialExtrinsicFix₃ R F a b c` is equivalent to
{lean}`F a b c (fun a' b' c' _ => partialExtrinsicFix₃ R F a' b' c')`, both logically and regarding its
termination behavior.
In particular, if {name}`R` is well-founded, {lean}`partialExtrinsicFix₃ R F a b c` is equivalent to
a well-founded fixpoint.
-/
@[inline]
public def partialExtrinsicFix₃ [∀ a b c, Nonempty (C₃ a b c)]
(R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop)
(F : (a : α) → (b : β a) → (c : γ a b) → ((a' : α) → (b' : β a') → (c' : γ a' b') → R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c)
(a : α) (b : β a) (c : γ a b) :
C₃ a b c :=
extrinsicFix₃ (α := α) (β := β) (γ := fun a' b' => { c' : γ a' b' // (⟨a', b', c'⟩ : (a : α) ×' (b : β a) ×' γ a b) = ⟨a, b, c⟩ ∨ TransGen R ⟨a', b', c'⟩ ⟨a, b, c⟩ })
(C₃ := (C₃ · · ·.1))
(fun p q => R ⟨p.1, p.2.1, p.2.2.1⟩ ⟨q.1, q.2.1, q.2.2.1⟩)
(fun a b c recur => F a b c.1 fun a' b' c' hR => recur a' b' ⟨c', Or.inr (by
rcases c.property with hb | hb
· exact .single (hb ▸ hR)
· apply TransGen.trans ?_ _
apply TransGen.single
assumption)⟩ _) a b ⟨c, Or.inl rfl⟩
public theorem partialExtrinsicFix₃_eq_partialExtrinsicFix [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : (a : α) → (b : β a) → (c : γ a b) → ((a' : α) → (b' : β a') → (c' : γ a' b') → R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (h : Acc R ⟨a, b, c⟩) :
partialExtrinsicFix₃ R F a b c = partialExtrinsicFix (α := (a : α) ×' (b : β a) ×' γ a b) (C := fun a => C₃ a.1 a.2.1 a.2.2) R (fun p r => F p.1 p.2.1 p.2.2 fun a' b' c' hR => r ⟨a', b', c'⟩ hR) ⟨a, b, c⟩ := by
simp only [partialExtrinsicFix, partialExtrinsicFix₃, extrinsicFix₃]
let f (x : ((a' : α) ×' (b' : β a') ×' { c' // (⟨a', b', c'⟩ : (a : α) ×' (b : β a) ×' γ a b) = ⟨a, b, c⟩ ∨ TransGen R ⟨a', b', c'⟩ ⟨a, b, c⟩ })) : { a' // a' = ⟨a, b, c⟩ ∨ TransGen R a' ⟨a, b, c⟩ } :=
⟨⟨x.1, x.2.1, x.2.2.1⟩, x.2.2.2⟩
have := extrinsicFix_invImage (C := fun a => C₃ a.1.1 a.1.2.1 a.1.2.2) (f := f) (R := (R ·.1 ·.1))
(F := fun a r => F a.1.1 a.1.2.1 a.1.2.2 fun a' b' c' hR => r ⟨a', b', c', ?refine_a hR⟩)
(F' := fun a r => F a.1 a.2.1 a.2.2.1 fun a' b' c' hR => r ⟨a', b', c', ?refine_b hR⟩)
(a := ⟨a, b, c, ?refine_c⟩); rotate_left
· cases a.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· cases a.2.2.2 <;> rename_i heq
· rw [heq] at hR
exact .inr (.single hR)
· exact .inr (.trans (.single hR) heq)
· exact .inl rfl
unfold InvImage f at this
simp at this
rw [this]
constructor
intro x
apply InvImage.accessible
cases x.2 <;> rename_i heq
· rwa [heq]
· exact h.inv_of_transGen heq
public theorem partialExtrinsicFix₃_eq_apply_of_acc [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ (a b c), (∀ (a' b' c'), R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (wf : Acc R ⟨a, b, c⟩) :
partialExtrinsicFix₃ R F a b c = F a b c (fun a b c _ => partialExtrinsicFix₃ R F a b c) := by
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix wf, partialExtrinsicFix_eq_apply_of_acc wf]
congr 1; ext a' b' c' hR
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix (wf.inv hR)]
public theorem partialExtrinsicFix₃_eq_apply [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ (a b c), (∀ (a' b' c'), R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
{a : α} {b : β a} {c : γ a b} (wf : WellFounded R) :
partialExtrinsicFix₃ R F a b c = F a b c (fun a b c _ => partialExtrinsicFix₃ R F a b c) :=
partialExtrinsicFix₃_eq_apply_of_acc (wf.apply _)
public theorem partialExtrinsicFix₃_eq_fix [∀ a b c, Nonempty (C₃ a b c)]
{R : (a : α) ×' (b : β a) ×' γ a b → (a : α) ×' (b : β a) ×' γ a b → Prop}
{F : ∀ a b c, (∀ a' b' c', R ⟨a', b', c'⟩ ⟨a, b, c⟩ → C₃ a' b' c') → C₃ a b c}
(wf : WellFounded R) {a b c} :
partialExtrinsicFix₃ R F a b c = wf.fix (fun x G => F x.1 x.2.1 x.2.2 (fun a b c h => G ⟨a, b, c⟩ h)) ⟨a, b, c⟩ := by
rw [partialExtrinsicFix₃_eq_partialExtrinsicFix (wf.apply _), partialExtrinsicFix_eq_fix wf]
end WellFounded
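The combinators above are easiest to grasp through a small example. The following is a hypothetical usage sketch (the function `subLoop`, its relation, and its motive are illustration only, and it assumes instance search can discharge the `Nonempty` side condition): `partialExtrinsicFix₂` defines a binary function whose termination argument lives entirely in the relation.

```lean
-- Hypothetical sketch: move `a` into `b` one step at a time. The relation
-- compares only the first component, so it is well-founded and the
-- unfolding lemma `partialExtrinsicFix₂_eq_apply` applies at every input.
def subLoop : Nat → Nat → Nat :=
  WellFounded.partialExtrinsicFix₂ (C₂ := fun _ _ => Nat)
    (fun p q => p.1 < q.1)
    (fun a b recur =>
      if h : a = 0 then b
      else recur (a - 1) (b + 1) (show a - 1 < a by omega))
```

On inputs where the relation were not accessible, the same definition would still elaborate, but no unfolding equation would be provable, mirroring a `partial` definition.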


@@ -55,22 +55,8 @@ errors from the interpreter itself as those depend on whether we are running in
-/
@[export lean_eval_check_meta]
private partial def evalCheckMeta (env : Environment) (declName : Name) : Except String Unit := do
if !env.header.isModule then
return
go declName |>.run' {}
where go (ref : Name) : StateT NameSet (Except String) Unit := do
if (← get).contains ref then
return
modify (·.insert ref)
if let some localDecl := declMapExt.getState env |>.find? ref then
for ref in collectUsedFDecls localDecl do
go ref
else
-- NOTE: We do not use `getIRPhases` here as it's intended for env decls, not IR decls. We do
-- not set `includeServer` as we want this check to be independent of server mode. Server-only
-- users disable this check instead.
if findEnvDecl env ref |>.isNone then
throw s!"Cannot evaluate constant `{declName}` as it uses `{ref}` which is neither marked nor imported as `meta`"
if getIRPhases env declName == .runtime then
throw s!"Cannot evaluate constant `{declName}` as it is neither marked nor imported as `meta`"
builtin_initialize
registerTraceClass `compiler.ir.inferMeta
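The check above is a depth-first walk over the transitive dependencies of a declaration, memoized with a `NameSet` so each declaration is visited at most once. The traversal pattern in isolation can be sketched as follows (hedged: `deps` is a hypothetical stand-in for looking up the declarations a name uses, not the compiler's actual API):

```lean
import Lean
open Lean

-- Generic memoized DFS mirroring the `go`/`StateT NameSet` structure of
-- `evalCheckMeta`. `deps` abstracts "declarations used by `n`".
partial def visitAll (deps : Name → List Name) (root : Name) : NameSet :=
  (go root).run {} |>.2
where
  go (n : Name) : StateM NameSet Unit := do
    if (← get).contains n then
      return
    modify (·.insert n)
    for d in deps n do
      go d
```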


@@ -101,6 +101,10 @@ partial def lowerCode (c : LCNF.Code .impure) : M FnBody := do
let ret ← getFVarValue fvarId
return .ret ret
| .unreach .. => return .unreachable
| .oset fvarId i y k _ =>
let y ← lowerArg y
let .var fvarId ← getFVarValue fvarId | unreachable!
return .set fvarId i y (← lowerCode k)
| .sset fvarId i offset y type k _ =>
let .var y ← getFVarValue y | unreachable!
let .var fvarId ← getFVarValue fvarId | unreachable!
@@ -109,12 +113,18 @@ partial def lowerCode (c : LCNF.Code .impure) : M FnBody := do
let .var y ← getFVarValue y | unreachable!
let .var fvarId ← getFVarValue fvarId | unreachable!
return .uset fvarId i y (← lowerCode k)
| .setTag fvarId cidx k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .setTag var cidx (← lowerCode k)
| .inc fvarId n check persistent k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .inc var n check persistent (← lowerCode k)
| .dec fvarId n check persistent k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .dec var n check persistent (← lowerCode k)
| .del fvarId k _ =>
let .var var ← getFVarValue fvarId | unreachable!
return .del var (← lowerCode k)
| .fun .. => panic! "all local functions should be λ-lifted"
partial def lowerLet (decl : LCNF.LetDecl .impure) (k : LCNF.Code .impure) : M FnBody := do
@@ -155,6 +165,9 @@ partial def lowerLet (decl : LCNF.LetDecl .impure) (k : LCNF.Code .impure) : M F
| .unbox var =>
withGetFVarValue var fun var => do
continueLet (.unbox var)
| .isShared var =>
withGetFVarValue var fun var => do
continueLet (.isShared var)
| .erased => mkErased ()
where
mkErased (_ : Unit) : M FnBody := do


@@ -187,7 +187,8 @@ private unsafe def runInitAttrs (env : Environment) (opts : Options) : IO Unit :
if !Elab.inServer.get opts && getIRPhases env decl == .runtime then
continue
if initDecl.isAnonymous then
let initFn ← IO.ofExcept <| env.evalConst (IO Unit) opts decl
-- Don't check `meta` again as it would not respect `Elab.inServer`
let initFn ← IO.ofExcept <| env.evalConst (checkMeta := false) (IO Unit) opts decl
initFn
else
runInit env opts decl initDecl


@@ -75,6 +75,7 @@ def eqvLetValue (e₁ e₂ : LetValue pu) : EqvM Bool := do
pure (i₁ == i₂ && u₁ == u₂) <&&> eqvFVar v₁ v₂ <&&> eqvArgs as₁ as₂
| .box ty₁ v₁ _, .box ty₂ v₂ _ => eqvType ty₁ ty₂ <&&> eqvFVar v₁ v₂
| .unbox v₁ _, .unbox v₂ _ => eqvFVar v₁ v₂
| .isShared v₁ _, .isShared v₂ _ => eqvFVar v₁ v₂
| _, _ => return false
@[inline] def withFVar (fvarId₁ fvarId₂ : FVarId) (x : EqvM α) : EqvM α :=
@@ -143,6 +144,11 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
eqvFVar c₁.discr c₂.discr <&&>
eqvType c₁.resultType c₂.resultType <&&>
eqvAlts c₁.alts c₂.alts
| .oset fvarId₁ i₁ y₁ k₁ _, .oset fvarId₂ i₂ y₂ k₂ _ =>
pure (i₁ == i₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqvArg y₁ y₂ <&&>
eqv k₁ k₂
| .sset fvarId₁ i₁ offset₁ y₁ ty₁ k₁ _, .sset fvarId₂ i₂ offset₂ y₂ ty₂ k₂ _ =>
pure (i₁ == i₂) <&&>
pure (offset₁ == offset₂) <&&>
@@ -155,6 +161,10 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
eqvFVar fvarId₁ fvarId₂ <&&>
eqvFVar y₁ y₂ <&&>
eqv k₁ k₂
| .setTag fvarId₁ c₁ k₁ _, .setTag fvarId₂ c₂ k₂ _ =>
pure (c₁ == c₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| .inc fvarId₁ n₁ c₁ p₁ k₁ _, .inc fvarId₂ n₂ c₂ p₂ k₂ _ =>
pure (n₁ == n₂) <&&>
pure (c₁ == c₂) <&&>
@@ -167,6 +177,9 @@ partial def eqv (code₁ code₂ : Code pu) : EqvM Bool := do
pure (p₁ == p₂) <&&>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| .del fvarId₁ k₁ _, .del fvarId₂ k₂ _ =>
eqvFVar fvarId₁ fvarId₂ <&&>
eqv k₁ k₂
| _, _ => return false
end


@@ -219,6 +219,10 @@ inductive LetValue (pu : Purity) where
| box (ty : Expr) (fvarId : FVarId) (h : pu = .impure := by purity_tac)
/-- Given `fvarId : [t]object`, obtain the underlying scalar value. -/
| unbox (fvarId : FVarId) (h : pu = .impure := by purity_tac)
/--
Return whether the object stored behind `fvarId` is shared or not. The return type is a `UInt8`.
-/
| isShared (fvarId : FVarId) (h : pu = .impure := by purity_tac)
deriving Inhabited, BEq, Hashable
def Arg.toLetValue (arg : Arg pu) : LetValue pu :=
@@ -298,7 +302,12 @@ private unsafe def LetValue.updateUnboxImp (e : LetValue pu) (fvarId' : FVarId)
@[implemented_by LetValue.updateUnboxImp] opaque LetValue.updateUnbox! (e : LetValue pu) (fvarId' : FVarId) : LetValue pu
private unsafe def LetValue.updateIsSharedImp (e : LetValue pu) (fvarId' : FVarId) : LetValue pu :=
match e with
| .isShared fvarId _ => if fvarId == fvarId' then e else .isShared fvarId'
| _ => unreachable!
@[implemented_by LetValue.updateIsSharedImp] opaque LetValue.updateIsShared! (e : LetValue pu) (fvarId' : FVarId) : LetValue pu
private unsafe def LetValue.updateArgsImp (e : LetValue pu) (args' : Array (Arg pu)) : LetValue pu :=
match e with
@@ -331,6 +340,7 @@ def LetValue.toExpr (e : LetValue pu) : Expr :=
#[.fvar var, .const i.name [], ToExpr.toExpr updateHeader] ++ (args.map Arg.toExpr)
| .box ty var _ => mkApp2 (.const `box []) ty (.fvar var)
| .unbox var _ => mkApp (.const `unbox []) (.fvar var)
| .isShared fvarId _ => mkApp (.const `isShared []) (.fvar fvarId)
structure LetDecl (pu : Purity) where
fvarId : FVarId
@@ -361,10 +371,13 @@ inductive Code (pu : Purity) where
| cases (cases : Cases pu)
| return (fvarId : FVarId)
| unreach (type : Expr)
| oset (fvarId : FVarId) (i : Nat) (y : Arg pu) (k : Code pu) (h : pu = .impure := by purity_tac)
| uset (fvarId : FVarId) (i : Nat) (y : FVarId) (k : Code pu) (h : pu = .impure := by purity_tac)
| sset (fvarId : FVarId) (i : Nat) (offset : Nat) (y : FVarId) (ty : Expr) (k : Code pu) (h : pu = .impure := by purity_tac)
| setTag (fvarId : FVarId) (cidx : Nat) (k : Code pu) (h : pu = .impure := by purity_tac)
| inc (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (k : Code pu) (h : pu = .impure := by purity_tac)
| dec (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (k : Code pu) (h : pu = .impure := by purity_tac)
| del (fvarId : FVarId) (k : Code pu) (h : pu = .impure := by purity_tac)
deriving Inhabited
end
@@ -440,25 +453,32 @@ inductive CodeDecl (pu : Purity) where
| let (decl : LetDecl pu)
| fun (decl : FunDecl pu) (h : pu = .pure := by purity_tac)
| jp (decl : FunDecl pu)
| oset (fvarId : FVarId) (i : Nat) (y : Arg pu) (h : pu = .impure := by purity_tac)
| uset (fvarId : FVarId) (i : Nat) (y : FVarId) (h : pu = .impure := by purity_tac)
| sset (fvarId : FVarId) (i : Nat) (offset : Nat) (y : FVarId) (ty : Expr) (h : pu = .impure := by purity_tac)
| setTag (fvarId : FVarId) (cidx : Nat) (h : pu = .impure := by purity_tac)
| inc (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (h : pu = .impure := by purity_tac)
| dec (fvarId : FVarId) (n : Nat) (check : Bool) (persistent : Bool) (h : pu = .impure := by purity_tac)
| del (fvarId : FVarId) (h : pu = .impure := by purity_tac)
deriving Inhabited
def CodeDecl.fvarId : CodeDecl pu → FVarId
| .let decl | .fun decl _ | .jp decl => decl.fvarId
| .uset fvarId .. | .sset fvarId .. | .inc fvarId .. | .dec fvarId .. => fvarId
| .uset fvarId .. | .sset fvarId .. | .inc fvarId .. | .dec fvarId .. | .del fvarId ..
| .oset fvarId .. | .setTag fvarId .. => fvarId
def Code.toCodeDecl! : Code pu → CodeDecl pu
| .let decl _ => .let decl
| .fun decl _ _ => .fun decl
| .jp decl _ => .jp decl
| .uset fvarId i y _ _ => .uset fvarId i y
| .sset fvarId i offset ty y _ _ => .sset fvarId i offset ty y
| .inc fvarId n check persistent _ _ => .inc fvarId n check persistent
| .dec fvarId n check persistent _ _ => .dec fvarId n check persistent
| _ => unreachable!
| .let decl _ => .let decl
| .fun decl _ _ => .fun decl
| .jp decl _ => .jp decl
| .oset fvarId i y _ _ => .oset fvarId i y
| .uset fvarId i y _ _ => .uset fvarId i y
| .sset fvarId i offset ty y _ _ => .sset fvarId i offset ty y
| .setTag fvarId cidx _ _ => .setTag fvarId cidx
| .inc fvarId n check persistent _ _ => .inc fvarId n check persistent
| .dec fvarId n check persistent _ _ => .dec fvarId n check persistent
| .del fvarId _ _ => .del fvarId
| _ => unreachable!
def attachCodeDecls (decls : Array (CodeDecl pu)) (code : Code pu) : Code pu :=
go decls.size code
@@ -469,10 +489,13 @@ where
| .let decl => go (i-1) (.let decl code)
| .fun decl _ => go (i-1) (.fun decl code)
| .jp decl => go (i-1) (.jp decl code)
| .oset fvarId idx y _ => go (i-1) (.oset fvarId idx y code)
| .uset fvarId idx y _ => go (i-1) (.uset fvarId idx y code)
| .sset fvarId idx offset y ty _ => go (i-1) (.sset fvarId idx offset y ty code)
| .setTag fvarId cidx _ => go (i-1) (.setTag fvarId cidx code)
| .inc fvarId n check persistent _ => go (i-1) (.inc fvarId n check persistent code)
| .dec fvarId n check persistent _ => go (i-1) (.dec fvarId n check persistent code)
| .del fvarId _ => go (i-1) (.del fvarId code)
else
code
@@ -488,14 +511,20 @@ mutual
| .jmp j₁ as₁, .jmp j₂ as₂ => j₁ == j₂ && as₁ == as₂
| .return r₁, .return r₂ => r₁ == r₂
| .unreach t₁, .unreach t₂ => t₁ == t₂
| .oset v₁ i₁ y₁ k₁ _, .oset v₂ i₂ y₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && y₁ == y₂ && eqImp k₁ k₂
| .uset v₁ i₁ y₁ k₁ _, .uset v₂ i₂ y₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && y₁ == y₂ && eqImp k₁ k₂
| .sset v₁ i₁ o₁ y₁ ty₁ k₁ _, .sset v₂ i₂ o₂ y₂ ty₂ k₂ _ =>
v₁ == v₂ && i₁ == i₂ && o₁ == o₂ && y₁ == y₂ && ty₁ == ty₂ && eqImp k₁ k₂
| .setTag v₁ c₁ k₁ _, .setTag v₂ c₂ k₂ _ =>
v₁ == v₂ && c₁ == c₂ && eqImp k₁ k₂
| .inc v₁ n₁ c₁ p₁ k₁ _, .inc v₂ n₂ c₂ p₂ k₂ _ =>
v₁ == v₂ && n₁ == n₂ && c₁ == c₂ && p₁ == p₂ && eqImp k₁ k₂
| .dec v₁ n₁ c₁ p₁ k₁ _, .dec v₂ n₂ c₂ p₂ k₂ _ =>
v₁ == v₂ && n₁ == n₂ && c₁ == c₂ && p₁ == p₂ && eqImp k₁ k₂
| .del v₁ k₁ _, .del v₂ k₂ _ =>
v₁ == v₂ && eqImp k₁ k₂
| _, _ => false
private unsafe def eqFunDecl (d₁ d₂ : FunDecl pu) : Bool :=
@@ -588,10 +617,13 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
| .let decl k => if ptrEq k k' then c else .let decl k'
| .fun decl k _ => if ptrEq k k' then c else .fun decl k'
| .jp decl k => if ptrEq k k' then c else .jp decl k'
| .oset fvarId offset y k _ => if ptrEq k k' then c else .oset fvarId offset y k'
| .sset fvarId i offset y ty k _ => if ptrEq k k' then c else .sset fvarId i offset y ty k'
| .uset fvarId offset y k _ => if ptrEq k k' then c else .uset fvarId offset y k'
| .setTag fvarId cidx k _ => if ptrEq k k' then c else .setTag fvarId cidx k'
| .inc fvarId n check persistent k _ => if ptrEq k k' then c else .inc fvarId n check persistent k'
| .dec fvarId n check persistent k _ => if ptrEq k k' then c else .dec fvarId n check persistent k'
| .del fvarId k _ => if ptrEq k k' then c else .del fvarId k'
| _ => unreachable!
@[implemented_by updateContImp] opaque Code.updateCont! (c : Code pu) (k' : Code pu) : Code pu
@@ -635,6 +667,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
.sset fvarId' i' offset' y' ty' k'
| _ => unreachable!
@[inline] private unsafe def updateOsetImp (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : Arg pu) (k' : Code pu) : Code pu :=
match c with
| .oset fvarId i y k _ =>
if ptrEq fvarId fvarId' && i == i' && ptrEq y y' && ptrEq k k' then
c
else
.oset fvarId' i' y' k'
| _ => unreachable!
@[implemented_by updateOsetImp] opaque Code.updateOset! (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : Arg pu) (k' : Code pu) : Code pu
@[implemented_by updateSsetImp] opaque Code.updateSset! (c : Code pu) (fvarId' : FVarId) (i' : Nat)
(offset' : Nat) (y' : FVarId) (ty' : Expr) (k' : Code pu) : Code pu
@@ -651,6 +696,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
@[implemented_by updateUsetImp] opaque Code.updateUset! (c : Code pu) (fvarId' : FVarId)
(i' : Nat) (y' : FVarId) (k' : Code pu) : Code pu
@[inline] private unsafe def updateSetTagImp (c : Code pu) (fvarId' : FVarId) (cidx' : Nat)
(k' : Code pu) : Code pu :=
match c with
| .setTag fvarId cidx k _ =>
if ptrEq fvarId fvarId' && cidx == cidx' && ptrEq k k' then
c
else
.setTag fvarId' cidx' k'
| _ => unreachable!
@[implemented_by updateSetTagImp] opaque Code.updateSetTag! (c : Code pu) (fvarId' : FVarId)
(cidx' : Nat) (k' : Code pu) : Code pu
@[inline] private unsafe def updateIncImp (c : Code pu) (fvarId' : FVarId) (n' : Nat)
(check' : Bool) (persistent' : Bool) (k' : Code pu) : Code pu :=
match c with
@@ -685,6 +743,19 @@ private unsafe def updateAltImp (alt : Alt pu) (ps' : Array (Param pu)) (k' : Co
@[implemented_by updateDecImp] opaque Code.updateDec! (c : Code pu) (fvarId' : FVarId) (n' : Nat)
(check' : Bool) (persistent' : Bool) (k' : Code pu) : Code pu
@[inline] private unsafe def updateDelImp (c : Code pu) (fvarId' : FVarId) (k' : Code pu) :
Code pu :=
match c with
| .del fvarId k _ =>
if ptrEq fvarId fvarId' && ptrEq k k' then
c
else
.del fvarId' k'
| _ => unreachable!
@[implemented_by updateDelImp] opaque Code.updateDel! (c : Code pu) (fvarId' : FVarId)
(k' : Code pu) : Code pu
private unsafe def updateParamCoreImp (p : Param pu) (type : Expr) : Param pu :=
if ptrEq type p.type then
p
@@ -753,8 +824,8 @@ partial def Code.size (c : Code pu) : Nat :=
where
go (c : Code pu) (n : Nat) : Nat :=
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => go k (n + 1)
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => go k (n + 1)
| .jp decl k | .fun decl k _ => go k <| go decl.value n
| .cases c => c.alts.foldl (init := n+1) fun n alt => go alt.getCode (n+1)
| .jmp .. => n+1
@@ -772,8 +843,8 @@ where
go (c : Code pu) : EStateM Unit Nat Unit := do
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => inc; go k
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => inc; go k
| .jp decl k | .fun decl k _ => inc; go decl.value; go k
| .cases c => inc; c.alts.forM fun alt => go alt.getCode
| .jmp .. => inc
@@ -785,8 +856,8 @@ where
go (c : Code pu) : m Unit := do
f c
match c with
| .let (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. => go k
| .let (k := k) .. | .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .setTag (k := k) .. | .del (k := k) .. => go k
| .fun decl k _ | .jp decl k => go decl.value; go k
| .cases c => c.alts.forM fun alt => go alt.getCode
| .unreach .. | .return .. | .jmp .. => return ()
@@ -1017,7 +1088,7 @@ Return `true` if `decl` is supposed to be inlined/specialized.
-/
def Decl.isTemplateLike (decl : Decl pu) : CoreM Bool := do
let env ← getEnv
if hasLocalInst decl.type then
if !hasNospecializeAttribute env decl.name && (← hasLocalInst decl.type) then
return true -- `decl` applications will be specialized
else if (← isImplicitReducible decl.name) then
return true -- `decl` is "fuel" for code specialization
@@ -1053,7 +1124,7 @@ private def collectLetValue (e : LetValue pu) (s : FVarIdHashSet) : FVarIdHashSe
| .fvar fvarId args => collectArgs args <| s.insert fvarId
| .const _ _ args _ | .pap _ args _ | .fap _ args _ | .ctor _ args _ => collectArgs args s
| .proj _ _ fvarId _ | .sproj _ _ fvarId _ | .uproj _ fvarId _ | .oproj _ fvarId _
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ => s.insert fvarId
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ | .isShared fvarId _ => s.insert fvarId
| .lit .. | .erased => s
| .reuse fvarId _ _ args _ => collectArgs args <| s.insert fvarId
@@ -1082,7 +1153,12 @@ partial def Code.collectUsed (code : Code pu) (s : FVarIdHashSet := {}) : FVarId
let s := s.insert fvarId
let s := s.insert y
k.collectUsed s
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .oset fvarId _ y k _ =>
let s := s.insert fvarId
let s := if let .fvar y := y then s.insert y else s
k.collectUsed s
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .del (fvarId := fvarId) (k := k) .. | .setTag (fvarId := fvarId) (k := k) .. =>
k.collectUsed <| s.insert fvarId
end
@@ -1095,7 +1171,11 @@ def CodeDecl.collectUsed (codeDecl : CodeDecl pu) (s : FVarIdHashSet := ∅) : F
| .jp decl | .fun decl _ => decl.collectUsed s
| .sset (fvarId := fvarId) (y := y) .. | .uset (fvarId := fvarId) (y := y) .. =>
s.insert fvarId |>.insert y
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. => s.insert fvarId
| .oset (fvarId := fvarId) (y := y) .. =>
let s := s.insert fvarId
if let .fvar y := y then s.insert y else s
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. | .setTag (fvarId := fvarId) ..
| .del (fvarId := fvarId) .. => s.insert fvarId
/--
Traverse the given block of potentially mutually recursive functions
@@ -1125,7 +1205,8 @@ where
modify fun s => s.insert declName
| _ => pure ()
visit k
| .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => visit k
| .oset (k := k) .. | .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .del (k := k) .. | .setTag (k := k) .. => visit k
go : StateM NameSet Unit :=
decls.forM (·.value.forCodeM visit)


@@ -68,7 +68,8 @@ where
eraseCode k
eraseParam auxParam
return .unreach typeNew
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. =>
| .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) ..
| .del (k := k) .. | .setTag (k := k) .. =>
return c.updateCont! (← go k)
instance : MonadCodeBind CompilerM where


@@ -149,7 +149,7 @@ def eraseCodeDecl (decl : CodeDecl pu) : CompilerM Unit := do
match decl with
| .let decl => eraseLetDecl decl
| .jp decl | .fun decl _ => eraseFunDecl decl
| .sset .. | .uset .. | .inc .. | .dec .. => return ()
| .sset .. | .uset .. | .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => return ()
/--
Erase all free variables occurring in `decls` from the local context.
@@ -300,6 +300,10 @@ private partial def normLetValueImp (s : FVarSubst pu) (e : LetValue pu) (transl
match normFVarImp s fvarId translator with
| .fvar fvarId' => e.updateUnbox! fvarId'
| .erased => .erased
| .isShared fvarId _ =>
match normFVarImp s fvarId translator with
| .fvar fvarId' => e.updateIsShared! fvarId'
| .erased => .erased
/--
Interface for monads that have a free variable substitution.
@@ -497,16 +501,26 @@ mutual
withNormFVarResult (← normFVar fvarId) fun fvarId => do
withNormFVarResult (← normFVar y) fun y => do
return code.updateSset! fvarId i offset y (← normExpr ty) (← normCodeImp k)
| .oset fvarId offset y k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
let y ← normArg y
return code.updateOset! fvarId offset y (← normCodeImp k)
| .uset fvarId offset y k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
withNormFVarResult (← normFVar y) fun y => do
return code.updateUset! fvarId offset y (← normCodeImp k)
| .setTag fvarId cidx k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return code.updateSetTag! fvarId cidx (← normCodeImp k)
| .inc fvarId n check persistent k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return code.updateInc! fvarId n check persistent (← normCodeImp k)
| .dec fvarId n check persistent k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return code.updateDec! fvarId n check persistent (← normCodeImp k)
| .del fvarId k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return code.updateDel! fvarId (← normCodeImp k)
end
@[inline] def normFunDecl [MonadLiftT CompilerM m] [Monad m] [MonadFVarSubst m pu t] (decl : FunDecl pu) : m (FunDecl pu) := do


@@ -39,12 +39,18 @@ partial def hashCode (code : Code pu) : UInt64 :=
| .cases c => mixHash (mixHash (hash c.discr) (hash c.resultType)) (hashAlts c.alts)
| .sset fvarId i offset y ty k _ =>
mixHash (mixHash (hash fvarId) (hash i)) (mixHash (mixHash (hash offset) (hash y)) (mixHash (hash ty) (hashCode k)))
| .oset fvarId offset y k _ =>
mixHash (mixHash (hash fvarId) (hash offset)) (mixHash (hash y) (hashCode k))
| .uset fvarId offset y k _ =>
mixHash (mixHash (hash fvarId) (hash offset)) (mixHash (hash y) (hashCode k))
| .setTag fvarId cidx k _ =>
mixHash (hash fvarId) (mixHash (hash cidx) (hashCode k))
| .inc fvarId n check persistent k _ =>
mixHash (mixHash (hash fvarId) (hash n)) (mixHash (mixHash (hash persistent) (hash check)) (hashCode k))
| .dec fvarId n check persistent k _ =>
mixHash (mixHash (hash fvarId) (hash n)) (mixHash (mixHash (hash persistent) (hash check)) (hashCode k))
| .del fvarId k _ =>
mixHash (hash fvarId) (hashCode k)
end


@@ -31,7 +31,7 @@ private def letValueDepOn (e : LetValue pu) : M Bool :=
match e with
| .erased | .lit .. => return false
| .proj _ _ fvarId _ | .oproj _ fvarId _ | .uproj _ fvarId _ | .sproj _ _ fvarId _
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ => fvarDepOn fvarId
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ | .isShared fvarId _ => fvarDepOn fvarId
| .fvar fvarId args | .reuse fvarId _ _ args _ => fvarDepOn fvarId <||> args.anyM argDepOn
| .const _ _ args _ | .ctor _ args _ | .fap _ args _ | .pap _ args _ => args.anyM argDepOn
@@ -46,8 +46,12 @@ private partial def depOn (c : Code pu) : M Bool :=
| .jmp fvarId args => fvarDepOn fvarId <||> args.anyM argDepOn
| .return fvarId => fvarDepOn fvarId
| .unreach _ => return false
| .sset fv1 _ _ fv2 _ k _ | .uset fv1 _ fv2 k _ => fvarDepOn fv1 <||> fvarDepOn fv2 <||> depOn k
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .oset fv1 _ fv2 k _ =>
fvarDepOn fv1 <||> argDepOn fv2 <||> depOn k
| .sset fv1 _ _ fv2 _ k _ | .uset fv1 _ fv2 k _ =>
fvarDepOn fv1 <||> fvarDepOn fv2 <||> depOn k
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .del (fvarId := fvarId) (k := k) .. | .setTag (fvarId := fvarId) (k := k) .. =>
fvarDepOn fvarId <||> depOn k
@[inline] def Arg.dependsOn (arg : Arg pu) (s : FVarIdSet) : Bool :=
@@ -66,9 +70,14 @@ def CodeDecl.dependsOn (decl : CodeDecl pu) (s : FVarIdSet) : Bool :=
match decl with
| .let decl => decl.dependsOn s
| .jp decl | .fun decl _ => decl.dependsOn s
| .uset (fvarId := fvarId) (y := y) .. | .sset (fvarId := fvarId) (y := y) .. =>
| .oset (fvarId := fvarId) (y := y) .. =>
s.contains fvarId || y.dependsOn s
| .uset (fvarId := fvarId) (y := y) ..
| .sset (fvarId := fvarId) (y := y) .. =>
s.contains fvarId || s.contains y
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. => s.contains fvarId
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. | .del (fvarId := fvarId) ..
| .setTag (fvarId := fvarId) .. =>
s.contains fvarId
/--
Return `true` if `c` depends on a free variable in `s`.


@@ -35,7 +35,7 @@ def collectLocalDeclsLetValue (s : UsedLocalDecls) (e : LetValue pu) : UsedLocal
match e with
| .erased | .lit .. => s
| .proj _ _ fvarId _ | .reset _ fvarId _ | .sproj _ _ fvarId _ | .uproj _ fvarId _
| .oproj _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ => s.insert fvarId
| .oproj _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ | .isShared fvarId _ => s.insert fvarId
| .const _ _ args _ => collectLocalDeclsArgs s args
| .fvar fvarId args | .reuse fvarId _ _ args _ => collectLocalDeclsArgs (s.insert fvarId) args
| .fap _ args _ | .pap _ args _ | .ctor _ args _ => collectLocalDeclsArgs s args
@@ -56,9 +56,8 @@ def LetValue.safeToElim (val : LetValue pu) : Bool :=
| .pure => true
| .impure =>
match val with
-- TODO | .isShared ..
| .ctor .. | .reset .. | .reuse .. | .oproj .. | .uproj .. | .sproj .. | .lit .. | .pap ..
| .box .. | .unbox .. | .erased .. => true
| .box .. | .unbox .. | .erased .. | .isShared .. => true
-- 0-ary full applications are considered constants
| .fap _ args => args.isEmpty
| .fvar .. => false
@@ -95,6 +94,13 @@ partial def Code.elimDead (code : Code pu) : M (Code pu) := do
| .return fvarId => collectFVarM fvarId; return code
| .jmp fvarId args => collectFVarM fvarId; args.forM collectArgM; return code
| .unreach .. => return code
| .oset fvarId _ y k _ =>
let k ← k.elimDead
if (← get).contains fvarId then
collectArgM y
return code.updateCont! k
else
return k
| .uset fvarId _ y k _ | .sset fvarId _ _ y _ k _ =>
let k ← k.elimDead
if (← get).contains fvarId then
@@ -102,7 +108,8 @@ partial def Code.elimDead (code : Code pu) : M (Code pu) := do
return code.updateCont! k
else
return k
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .setTag (fvarId := fvarId) (k := k) .. | .del (fvarId := fvarId) (k := k) .. =>
let k ← k.elimDead
collectFVarM fvarId
return code.updateCont! k


@@ -284,7 +284,7 @@ partial def Code.explicitBoxing (code : Code .impure) : BoxM (Code .impure) := d
let some jpDecl ← findFunDecl? fvarId | unreachable!
castArgsIfNeeded args jpDecl.params fun args => return code.updateJmp! fvarId args
| .unreach .. => return code.updateUnreach! (← getResultType)
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .oset .. | .setTag .. | .del .. => unreachable!
where
/--
Up to this point the type system of IR is quite loose so we can for example encounter situations
@@ -313,7 +313,7 @@ where
| .ctor i _ => return i.type
| .fvar .. | .lit .. | .sproj .. | .oproj .. | .reset .. | .reuse .. =>
return currentType
| .box .. | .unbox .. => unreachable!
| .box .. | .unbox .. | .isShared .. => unreachable!
visitLet (code : Code .impure) (decl : LetDecl .impure) (k : Code .impure) : BoxM (Code .impure) := do
let type ← tryCorrectLetDeclType decl.type decl.value
@@ -350,7 +350,7 @@ where
| .erased | .reset .. | .sproj .. | .uproj .. | .oproj .. | .lit .. =>
let decl ← decl.update type decl.value
return code.updateLet! decl k
| .box .. | .unbox .. => unreachable!
| .box .. | .unbox .. | .isShared .. => unreachable!
def run (decls : Array (Decl .impure)) : CompilerM (Array (Decl .impure)) := do
let decls ← decls.foldlM (init := #[]) fun newDecls decl => do


@@ -117,7 +117,7 @@ partial def collectCode (code : Code .impure) : M Unit := do
| .cases cases => cases.alts.forM (·.forCodeM collectCode)
| .sset (k := k) .. | .uset (k := k) .. => collectCode k
| .return .. | .jmp .. | .unreach .. => return ()
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
/--
Collect the derived value tree as well as the set of parameters that take objects and are borrowed.
@@ -334,6 +334,7 @@ def useLetValue (value : LetValue .impure) : RcM Unit := do
useVar fvarId
useArgs args
| .lit .. | .erased => return ()
| .isShared .. => unreachable!
@[inline]
def bindVar (fvarId : FVarId) : RcM Unit :=
@@ -547,6 +548,7 @@ def LetDecl.explicitRc (code : Code .impure) (decl : LetDecl .impure) (k : Code
addIncBeforeConsumeAll allArgs (code.updateLet! decl k)
| .lit .. | .box .. | .reset .. | .erased .. =>
pure <| code.updateLet! decl k
| .isShared .. => unreachable!
useLetValue decl.value
bindVar decl.fvarId
return k
@@ -622,7 +624,7 @@ partial def Code.explicitRc (code : Code .impure) : RcM (Code .impure) := do
| .unreach .. =>
setRetLiveVars
return code
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .del .. | .oset .. => unreachable!
def Decl.explicitRc (decl : Decl .impure) :
CompilerM (Decl .impure) := do


@@ -83,12 +83,13 @@ def LetValue.mapFVarM [MonadLiftT CompilerM m] [Monad m] (f : FVarId → m FVarI
return e.updateReuse! (← f fvarId) i updateHeader (← args.mapM (TraverseFVar.mapFVarM f))
| .box ty fvarId _ => return e.updateBox! ty (← f fvarId)
| .unbox fvarId _ => return e.updateUnbox! (← f fvarId)
| .isShared fvarId _ => return e.updateIsShared! (← f fvarId)
def LetValue.forFVarM [Monad m] (f : FVarId → m Unit) (e : LetValue pu) : m Unit := do
match e with
| .lit .. | .erased => return ()
| .proj _ _ fvarId _ | .oproj _ fvarId _ | .sproj _ _ fvarId _ | .uproj _ fvarId _
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ => f fvarId
| .reset _ fvarId _ | .box _ fvarId _ | .unbox fvarId _ | .isShared fvarId _ => f fvarId
| .const _ _ args _ | .pap _ args _ | .fap _ args _ | .ctor _ args _ =>
args.forM (TraverseFVar.forFVarM f)
| .fvar fvarId args | .reuse fvarId _ _ args _ => f fvarId; args.forM (TraverseFVar.forFVarM f)
@@ -139,14 +140,20 @@ partial def Code.mapFVarM [MonadLiftT CompilerM m] [Monad m] (f : FVarId → m F
return Code.updateReturn! c (← f var)
| .unreach typ =>
return Code.updateUnreach! c (← Expr.mapFVarM f typ)
| .oset fvarId offset y k _ =>
return Code.updateOset! c (← f fvarId) offset (← y.mapFVarM f) (← mapFVarM f k)
| .sset fvarId i offset y ty k _ =>
return Code.updateSset! c (← f fvarId) i offset (← f y) (← Expr.mapFVarM f ty) (← mapFVarM f k)
| .uset fvarId offset y k _ =>
return Code.updateUset! c (← f fvarId) offset (← f y) (← mapFVarM f k)
| .setTag fvarId cidx k _ =>
return Code.updateSetTag! c (← f fvarId) cidx (← mapFVarM f k)
| .inc fvarId n check persistent k _ =>
return Code.updateInc! c (← f fvarId) n check persistent (← mapFVarM f k)
| .dec fvarId n check persistent k _ =>
return Code.updateDec! c (← f fvarId) n check persistent (← mapFVarM f k)
| .del fvarId k _ =>
return Code.updateDel! c (← f fvarId) (← mapFVarM f k)
partial def Code.forFVarM [Monad m] (f : FVarId → m Unit) (c : Code pu) : m Unit := do
match c with
@@ -182,7 +189,12 @@ partial def Code.forFVarM [Monad m] (f : FVarId → m Unit) (c : Code pu) : m Un
f fvarId
f y
forFVarM f k
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .oset fvarId _ y k _ =>
f fvarId
y.forFVarM f
forFVarM f k
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .del (fvarId := fvarId) (k := k) .. | .setTag (fvarId := fvarId) (k := k) .. =>
f fvarId
forFVarM f k
@@ -210,17 +222,22 @@ instance : TraverseFVar (CodeDecl pu) where
| .jp decl => return .jp (← mapFVarM f decl)
| .let decl => return .let (← mapFVarM f decl)
| .uset fvarId i y _ => return .uset (← f fvarId) i (← f y)
| .oset fvarId i y _ => return .oset (← f fvarId) i (← y.mapFVarM f)
| .sset fvarId i offset y ty _ => return .sset (← f fvarId) i offset (← f y) (← mapFVarM f ty)
| .setTag fvarId cidx _ => return .setTag (← f fvarId) cidx
| .inc fvarId n check persistent _ => return .inc (← f fvarId) n check persistent
| .dec fvarId n check persistent _ => return .dec (← f fvarId) n check persistent
| .del fvarId _ => return .del (← f fvarId)
forFVarM f decl :=
match decl with
| .fun decl _ => forFVarM f decl
| .jp decl => forFVarM f decl
| .let decl => forFVarM f decl
| .uset fvarId i y _ => do f fvarId; f y
| .oset fvarId i y _ => do f fvarId; y.forFVarM f
| .sset fvarId i offset y ty _ => do f fvarId; f y; forFVarM f ty
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. => f fvarId
| .inc (fvarId := fvarId) .. | .dec (fvarId := fvarId) .. | .del (fvarId := fvarId) ..
| .setTag (fvarId := fvarId) .. => f fvarId
instance : TraverseFVar (Alt pu) where
mapFVarM f alt := do


@@ -91,7 +91,7 @@ where
| .cases cs => cs.alts.forM (·.forCodeM (goCode declName))
| .let _ k | .uset _ _ _ k _ | .sset _ _ _ _ _ k _ => goCode declName k
| .return .. | .jmp .. | .unreach .. => return ()
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .del .. | .oset .. => unreachable!
/--
Apply the inferred borrow annotations from `map` to a SCC.
@@ -121,7 +121,7 @@ where
| .cases cs => return code.updateAlts! <| cs.alts.mapMonoM (·.mapCodeM (go declName))
| .let _ k | .uset _ _ _ k _ | .sset _ _ _ _ _ k _ => return code.updateCont! (← go declName k)
| .return .. | .jmp .. | .unreach .. => return code
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
structure Ctx where
/--
@@ -300,7 +300,7 @@ where
| .cases cs => cs.alts.forM (·.forCodeM collectCode)
| .uset _ _ _ k _ | .sset _ _ _ _ _ k _ => collectCode k
| .return .. | .unreach .. => return ()
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
public def inferBorrow : Pass where


@@ -120,6 +120,10 @@ private partial def internalizeLetValue (e : LetValue pu) : InternalizeM pu (Let
match (← normFVar fvarId) with
| .fvar fvarId' => return e.updateBox! ty fvarId'
| .erased => return .erased
| .isShared fvarId _ =>
match (← normFVar fvarId) with
| .fvar fvarId' => return e.updateIsShared! fvarId'
| .erased => return .erased
def internalizeLetDecl (decl : LetDecl pu) : InternalizeM pu (LetDecl pu) := do
let binderName ← refreshBinderName decl.binderName
@@ -166,12 +170,22 @@ partial def internalizeCode (code : Code pu) : InternalizeM pu (Code pu) := do
withNormFVarResult (← normFVar fvarId) fun fvarId => do
withNormFVarResult (← normFVar y) fun y => do
return .uset fvarId offset y (← internalizeCode k)
| .oset fvarId offset y k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
let y ← normArg y
return .oset fvarId offset y (← internalizeCode k)
| .setTag fvarId cidx k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return .setTag fvarId cidx (← internalizeCode k)
| .inc fvarId n check persistent k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return .inc fvarId n check persistent (← internalizeCode k)
| .dec fvarId n check persistent k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return .dec fvarId n check persistent (← internalizeCode k)
| .del fvarId k _ =>
withNormFVarResult (← normFVar fvarId) fun fvarId => do
return .del fvarId (← internalizeCode k)
end
@@ -180,8 +194,12 @@ partial def internalizeCodeDecl (decl : CodeDecl pu) : InternalizeM pu (CodeDecl
| .let decl => return .let (← internalizeLetDecl decl)
| .fun decl _ => return .fun (← internalizeFunDecl decl)
| .jp decl => return .jp (← internalizeFunDecl decl)
| .uset fvarId i y _ =>
| .oset fvarId i y _ =>
-- Something weird should be happening if these become erased...
let .fvar fvarId ← normFVar fvarId | unreachable!
let y ← normArg y
return .oset fvarId i y
| .uset fvarId i y _ =>
let .fvar fvarId ← normFVar fvarId | unreachable!
let .fvar y ← normFVar y | unreachable!
return .uset fvarId i y
@@ -190,12 +208,18 @@ partial def internalizeCodeDecl (decl : CodeDecl pu) : InternalizeM pu (CodeDecl
let .fvar y ← normFVar y | unreachable!
let ty ← normExpr ty
return .sset fvarId i offset y ty
| .setTag fvarId cidx _ =>
let .fvar fvarId ← normFVar fvarId | unreachable!
return .setTag fvarId cidx
| .inc fvarId n check offset _ =>
let .fvar fvarId ← normFVar fvarId | unreachable!
return .inc fvarId n check offset
| .dec fvarId n check offset _ =>
let .fvar fvarId ← normFVar fvarId | unreachable!
return .dec fvarId n check offset
| .del fvarId _ =>
let .fvar fvarId ← normFVar fvarId | unreachable!
return .del fvarId
end Internalize


@@ -77,7 +77,8 @@ mutual
| .let decl k => eraseCode k <| lctx.eraseLetDecl decl
| .jp decl k | .fun decl k _ => eraseCode k <| eraseFunDecl lctx decl
| .cases c => eraseAlts c.alts lctx
| .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. =>
| .oset (k := k) .. | .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) ..
| .dec (k := k) .. | .del (k := k) .. | .setTag (k := k) .. =>
eraseCode k lctx
| .return .. | .jmp .. | .unreach .. => lctx
end


@@ -65,6 +65,8 @@ where
| .jp decl k => go decl.value <||> (do markJpVisited decl.fvarId; go k)
| .uset fvarId _ y k _ | .sset fvarId _ _ y _ k _ =>
visitVar fvarId <||> visitVar y <||> go k
| .oset fvarId _ y k _ =>
visitVar fvarId <||> pure (y.dependsOn (← read).targetSet) <||> go k
| .cases c => visitVar c.discr <||> c.alts.anyM (go ·.getCode)
| .jmp fvarId args =>
(pure <| args.any (·.dependsOn (← read).targetSet)) <||> do
@@ -76,7 +78,8 @@ where
go decl.value
| .return var => visitVar var
| .unreach .. => return false
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) .. =>
| .inc (fvarId := fvarId) (k := k) .. | .dec (fvarId := fvarId) (k := k) ..
| .setTag (fvarId := fvarId) (k := k) .. | .del (fvarId := fvarId) (k := k) .. =>
visitVar fvarId <||> go k
@[inline]


@@ -83,10 +83,10 @@ def ppLetValue (e : LetValue pu) : M Format := do
| .proj _ i fvarId _ => return f!"{← ppFVar fvarId} # {i}"
| .fvar fvarId args => return f!"{← ppFVar fvarId}{← ppArgs args}"
| .const declName us args _ => return f!"{← ppExpr (.const declName us)}{← ppArgs args}"
| .ctor i args _ => return f!"{i} {← ppArgs args}"
| .ctor i args _ => return f!"{i}{← ppArgs args}"
| .fap declName args _ => return f!"{declName}{← ppArgs args}"
| .pap declName args _ => return f!"pap {declName}{← ppArgs args}"
| .oproj i fvarId _ => return f!"proj[{i}] {← ppFVar fvarId}"
| .oproj i fvarId _ => return f!"oproj[{i}] {← ppFVar fvarId}"
| .uproj i fvarId _ => return f!"uproj[{i}] {← ppFVar fvarId}"
| .sproj i offset fvarId _ => return f!"sproj[{i}, {offset}] {← ppFVar fvarId}"
| .reset n fvarId _ => return f!"reset[{n}] {← ppFVar fvarId}"
@@ -94,6 +94,7 @@ def ppLetValue (e : LetValue pu) : M Format := do
return f!"reuse" ++ (if updateHeader then f!"!" else f!"") ++ f!" {← ppFVar fvarId} in {info}{← ppArgs args}"
| .box _ fvarId _ => return f!"box {← ppFVar fvarId}"
| .unbox fvarId _ => return f!"unbox {← ppFVar fvarId}"
| .isShared fvarId _ => return f!"isShared {← ppFVar fvarId}"
def ppParam (param : Param pu) : M Format := do
let borrow := if param.borrow then "@&" else ""
@@ -144,11 +145,15 @@ mutual
return ""
| .sset fvarId i offset y ty k _ =>
if pp.letVarTypes.get (← getOptions) then
return f!"sset {← ppFVar fvarId} [{i}, {offset}] : {← ppExpr ty} := {← ppFVar y} " ++ ";" ++ .line ++ (← ppCode k)
return f!"sset {← ppFVar fvarId}[{i}, {offset}] : {← ppExpr ty} := {← ppFVar y};" ++ .line ++ (← ppCode k)
else
return f!"sset {← ppFVar fvarId} [{i}, {offset}] := {← ppFVar y} " ++ ";" ++ .line ++ (← ppCode k)
return f!"sset {← ppFVar fvarId}[{i}, {offset}] := {← ppFVar y};" ++ .line ++ (← ppCode k)
| .uset fvarId i y k _ =>
return f!"uset {← ppFVar fvarId} [{i}] := {← ppFVar y} " ++ ";" ++ .line ++ (← ppCode k)
return f!"uset {← ppFVar fvarId}[{i}] := {← ppFVar y};" ++ .line ++ (← ppCode k)
| .oset fvarId i y k _ =>
return f!"oset {← ppFVar fvarId} [{i}] := {← ppArg y};" ++ .line ++ (← ppCode k)
| .setTag fvarId cidx k _ =>
return f!"setTag {← ppFVar fvarId} := {cidx};" ++ .line ++ (← ppCode k)
| .inc fvarId n _ _ k _ =>
if n != 1 then
return f!"inc[{n}] {← ppFVar fvarId};" ++ .line ++ (← ppCode k)
@@ -159,6 +164,8 @@ mutual
return f!"dec[{n}] {← ppFVar fvarId};" ++ .line ++ (← ppCode k)
else
return f!"dec {← ppFVar fvarId};" ++ .line ++ (← ppCode k)
| .del fvarId k _ =>
return f!"del {← ppFVar fvarId};" ++ .line ++ (← ppCode k)
partial def ppDeclValue (b : DeclValue pu) : M Format := do


@@ -58,7 +58,8 @@ where
go k
| .cases cs => cs.alts.forM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return ()
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
start (decls : Array (Decl pu)) : StateRefT (Array (LetValue pu)) CompilerM Unit :=
decls.forM (·.value.forCodeM go)
@@ -73,7 +74,8 @@ where
| .jp decl k => modify (·.push decl); go decl.value; go k
| .cases cs => cs.alts.forM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return ()
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
start (decls : Array (Decl pu)) : StateRefT (Array (FunDecl pu)) CompilerM Unit :=
decls.forM (·.value.forCodeM go)
@@ -86,7 +88,8 @@ where
| .fun decl k _ | .jp decl k => go decl.value <||> go k
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByFun (pu : Purity) (f : FunDecl pu → CompilerM Bool) : Probe (Decl pu) (Decl pu) :=
filter (·.value.isCodeAndM go)
@@ -96,7 +99,8 @@ where
| .fun decl k _ => do if (← f decl) then return true else go decl.value <||> go k
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByJp (pu : Purity) (f : FunDecl pu → CompilerM Bool) : Probe (Decl pu) (Decl pu) :=
filter (·.value.isCodeAndM go)
@@ -107,7 +111,8 @@ where
| .jp decl k => do if (← f decl) then return true else go decl.value <||> go k
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByFunDecl (pu : Purity) (f : FunDecl pu → CompilerM Bool) :
Probe (Decl pu) (Decl pu):=
@@ -118,7 +123,8 @@ where
| .fun decl k _ | .jp decl k => do if (← f decl) then return true else go decl.value <||> go k
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByCases (pu : Purity) (f : Cases pu → CompilerM Bool) : Probe (Decl pu) (Decl pu) :=
filter (·.value.isCodeAndM go)
@@ -128,7 +134,8 @@ where
| .fun decl k _ | .jp decl k => go decl.value <||> go k
| .cases cs => do if (← f cs) then return true else cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByJmp (pu : Purity) (f : FVarId → Array (Arg pu) → CompilerM Bool) :
Probe (Decl pu) (Decl pu) :=
@@ -140,7 +147,8 @@ where
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp fn var => f fn var
| .return .. | .unreach .. => return false
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByReturn (pu : Purity) (f : FVarId → CompilerM Bool) : Probe (Decl pu) (Decl pu) :=
filter (·.value.isCodeAndM go)
@@ -151,7 +159,8 @@ where
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .unreach .. => return false
| .return var => f var
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
partial def filterByUnreach (pu : Purity) (f : Expr → CompilerM Bool) : Probe (Decl pu) (Decl pu) :=
filter (·.value.isCodeAndM go)
@@ -162,7 +171,8 @@ where
| .cases cs => cs.alts.anyM (go ·.getCode)
| .jmp .. | .return .. => return false
| .unreach typ => f typ
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. => go k
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. | .del (k := k) ..
| .setTag (k := k) .. | .oset (k := k) .. => go k
@[inline]
def declNames (pu : Purity) : Probe (Decl pu) Name :=


@@ -133,6 +133,8 @@ where
| .jp decl k =>
let decl ← decl.updateValue (← decl.value.pushProj)
go k (decls.push (.jp decl))
| .oset fvarId i y k _ =>
go k (decls.push (.oset fvarId i y))
| .uset fvarId i y k _ =>
go k (decls.push (.uset fvarId i y))
| .sset fvarId i offset y ty k _ =>
@@ -141,6 +143,10 @@ where
go k (decls.push (.inc fvarId n check persistent))
| .dec fvarId n check persistent k _ =>
go k (decls.push (.dec fvarId n check persistent))
| .del fvarId k _ =>
go k (decls.push (.del fvarId))
| .setTag fvarId cidx k _ =>
go k (decls.push (.setTag fvarId cidx))
| .cases c => c.pushProjs decls
| .jmp .. | .return .. | .unreach .. =>
return attachCodeDecls decls c


@@ -53,7 +53,8 @@ partial def Code.applyRenaming (code : Code pu) (r : Renaming) : CompilerM (Code
| .ctorAlt _ k _ => return alt.updateCode (← k.applyRenaming r)
return code.updateAlts! alts
| .jmp .. | .unreach .. | .return .. => return code
| .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. =>
| .oset (k := k) .. | .sset (k := k) .. | .uset (k := k) .. | .inc (k := k) .. | .dec (k := k) ..
| .del (k := k) .. | .setTag (k := k) .. =>
return code.updateCont! (← k.applyRenaming r)
end


@@ -120,7 +120,7 @@ where
| .return .. | .jmp .. | .unreach .. => return (c, false)
| .sset _ _ _ _ _ k _ | .uset _ _ _ k _ | .let _ k =>
goK k
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
def isCtorUsing (instr : CodeDecl .impure) (x : FVarId) : Bool :=
match instr with
@@ -242,7 +242,7 @@ where
return (c.updateCont! k, false)
| .return .. | .jmp .. | .unreach .. =>
return (c, c.isFVarLiveIn x)
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
end
@@ -275,7 +275,7 @@ partial def Code.insertResetReuse (c : Code .impure) : ReuseM (Code .impure) :=
| .let _ k | .uset _ _ _ k _ | .sset _ _ _ _ _ k _ =>
return c.updateCont! (← k.insertResetReuse)
| .return .. | .jmp .. | .unreach .. => return c
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
partial def Decl.insertResetReuseCore (decl : Decl .impure) : ReuseM (Decl .impure) := do
let value ← decl.value.mapCodeM fun code => do
@@ -298,7 +298,7 @@ where
| .jp decl k => collectResets decl.value; collectResets k
| .cases c => c.alts.forM (collectResets ·.getCode)
| .unreach .. | .return .. | .jmp .. => return ()
| .inc .. | .dec .. => unreachable!
| .inc .. | .dec .. | .setTag .. | .oset .. | .del .. => unreachable!
def Decl.insertResetReuse (decl : Decl .impure) : CompilerM (Decl .impure) := do


@@ -107,7 +107,8 @@ partial def Code.simpCase (code : Code .impure) : CompilerM (Code .impure) := do
let decl ← decl.updateValue (← decl.value.simpCase)
return code.updateFun! decl (← k.simpCase)
| .return .. | .jmp .. | .unreach .. => return code
| .let _ k | .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) .. | .dec (k := k) .. =>
| .let _ k | .uset (k := k) .. | .sset (k := k) .. | .inc (k := k) .. | .dec (k := k) ..
| .setTag (k := k) .. | .del (k := k) .. | .oset (k := k) .. =>
return code.updateCont! (← k.simpCase)
def Decl.simpCase (decl : Decl .impure) : CompilerM (Decl .impure) := do


@@ -116,10 +116,18 @@ partial def Code.toExprM (code : Code pu) : ToExprM Expr := do
let value := mkApp5 (mkConst `sset) (.fvar fvarId) (toExpr i) (toExpr offset) (.fvar y) ty
let body ← withFVar fvarId k.toExprM
return .letE `dummy (mkConst ``Unit) value body true
| .oset fvarId offset y k _ =>
let value := mkApp3 (mkConst `oset) (.fvar fvarId) (toExpr offset) (← y.toExprM)
let body ← withFVar fvarId k.toExprM
return .letE `dummy (mkConst ``Unit) value body true
| .uset fvarId offset y k _ =>
let value := mkApp3 (mkConst `uset) (.fvar fvarId) (toExpr offset) (.fvar y)
let body ← withFVar fvarId k.toExprM
return .letE `dummy (mkConst ``Unit) value body true
| .setTag fvarId cidx k _ =>
let body ← withFVar fvarId k.toExprM
let value := mkApp2 (mkConst `setTag) (.fvar fvarId) (toExpr cidx)
return .letE `dummy (mkConst ``Unit) value body true
| .inc fvarId n check persistent k _ =>
let value := mkApp4 (mkConst `inc) (.fvar fvarId) (toExpr n) (toExpr check) (toExpr persistent)
let body ← withFVar fvarId k.toExprM
@@ -128,6 +136,10 @@ partial def Code.toExprM (code : Code pu) : ToExprM Expr := do
let body ← withFVar fvarId k.toExprM
let value := mkApp4 (mkConst `dec) (.fvar fvarId) (toExpr n) (toExpr check) (toExpr persistent)
return .letE `dummy (mkConst ``Unit) value body true
| .del fvarId k _ =>
let body ← withFVar fvarId k.toExprM
let value := mkApp (mkConst `del) (.fvar fvarId)
return .letE `dummy (mkConst ``Unit) value body true
end
public def Code.toExpr (code : Code pu) (xs : Array FVarId := #[]) : Expr :=


@@ -348,15 +348,15 @@ def mkParam (binderName : Name) (type : Expr) : M (Param .pure) := do
modify fun s => { s with lctx := s.lctx.mkLocalDecl param.fvarId binderName type .default }
return param
def mkLetDecl (binderName : Name) (type : Expr) (value : Expr) (type' : Expr) (arg : Arg .pure)
(nondep : Bool) : M (LetDecl .pure) := do
def mkLetDecl (binderName : Name) (type : Expr) (value : Expr) (type' : Expr) (arg : Arg .pure) :
M (LetDecl .pure) := do
let binderName ← cleanupBinderName binderName
let value' ← match arg with
| .fvar fvarId => pure <| .fvar fvarId #[]
| .erased | .type .. => pure .erased
let letDecl ← LCNF.mkLetDecl binderName type' value'
modify fun s => { s with
lctx := s.lctx.mkLetDecl letDecl.fvarId binderName type value nondep
lctx := s.lctx.mkLetDecl letDecl.fvarId binderName type value false
seq := s.seq.push <| .let letDecl
}
return letDecl
@@ -385,38 +385,6 @@ where
else
return (ps, e.instantiateRev xs)
/--
Given `e` and `args` where `mkAppN e (args.map (·.toExpr))` is not necessarily well-typed
(because of dependent typing), returns `e.beta args'` where `args'` are new local declarations each
assigned to a value in `args` with adjusted type (such that the resulting expression is well-typed).
-/
def mkTypeCorrectApp (e : Expr) (args : Array (Arg .pure)) : M Expr := do
if args.isEmpty then
return e
let type ← liftMetaM <| do
let type ← Meta.inferType e
if type.getNumHeadForalls < args.size then
-- expose foralls
Meta.forallBoundedTelescope type args.size Meta.mkForallFVars
else
return type
go type 0 #[]
where
go (type : Expr) (i : Nat) (xs : Array Expr) : M Expr := do
if h : i < args.size then
match type with
| .forallE nm t b bi =>
let t := t.instantiateRev xs
let arg := args[i]
if ← liftMetaM <| Meta.isProp t then
go b (i + 1) (xs.push (mkLcProof t))
else
let decl ← mkLetDecl nm t arg.toExpr (← arg.inferType) arg (nondep := true)
go b (i + 1) (xs.push (.fvar decl.fvarId))
| _ => liftMetaM <| Meta.throwFunctionExpected (mkAppN e xs)
else
return e.beta xs
def mustEtaExpand (env : Environment) (e : Expr) : Bool :=
if let .const declName _ := e.getAppFn then
match env.find? declName with
@@ -558,7 +526,7 @@ where
k args[arity...*]
```
-/
mkOverApplication (app : Arg .pure) (args : Array Expr) (arity : Nat) : M (Arg .pure) := do
mkOverApplication (app : (Arg .pure)) (args : Array Expr) (arity : Nat) : M (Arg .pure) := do
if args.size == arity then
return app
else
@@ -573,14 +541,11 @@ where
/--
Visit a `matcher`/`casesOn` alternative.
-/
visitAlt (casesAltInfo : CasesAltInfo) (e : Expr) (overArgs : Array (Arg .pure)) :
M (Expr × (Alt .pure)) := do
visitAlt (casesAltInfo : CasesAltInfo) (e : Expr) : M (Expr × (Alt .pure)) := do
withNewScope do
match casesAltInfo with
| .default numHyps =>
let e := mkAppN e (Array.replicate numHyps erasedExpr)
let e ← mkTypeCorrectApp e overArgs
let c ← toCode (← visit e)
let c ← toCode (← visit (mkAppN e (Array.replicate numHyps erasedExpr)))
let altType ← c.inferType
return (altType, .default c)
| .ctor ctorName numParams =>
@@ -590,7 +555,6 @@ where
let (ps', e') ← ToLCNF.visitLambda e
ps := ps ++ ps'
e := e'
e ← mkTypeCorrectApp e overArgs
/-
Insert the free variable ids of fields that are type formers into `toAny`.
Recall that we do not want to have "data" occurring in types.
@@ -615,8 +579,7 @@ where
visitCases (casesInfo : CasesInfo) (e : Expr) : M (Arg .pure) :=
etaIfUnderApplied e casesInfo.arity do
let args := e.getAppArgs
let overArgs ← (args.drop casesInfo.arity).mapM visitAppArg
let mut resultType ← toLCNFType (← liftMetaM do Meta.inferType (mkAppN e.getAppFn args))
let mut resultType ← toLCNFType (← liftMetaM do Meta.inferType (mkAppN e.getAppFn args[*...casesInfo.arity]))
let typeName := casesInfo.indName
let .inductInfo indVal ← getConstInfo typeName | unreachable!
if casesInfo.numAlts == 0 then
@@ -646,7 +609,8 @@ where
fieldArgs := fieldArgs.push fieldArg
return fieldArgs
let f := args[casesInfo.altsRange.lower]!
visit (mkAppN (mkAppN f fieldArgs) (overArgs.map (·.toExpr)))
let result ← visit (mkAppN f fieldArgs)
mkOverApplication result args casesInfo.arity
else
let mut alts := #[]
let discr ← visitAppArg args[casesInfo.discrPos]!
@@ -654,13 +618,14 @@ where
| .fvar discrFVarId => pure discrFVarId
| .erased | .type .. => mkAuxLetDecl .erased
for i in casesInfo.altsRange, numParams in casesInfo.altNumParams do
let (altType, alt) ← visitAlt numParams args[i]! overArgs
let (altType, alt) ← visitAlt numParams args[i]!
resultType := joinTypes altType resultType
alts := alts.push alt
let cases := ⟨typeName, resultType, discrFVarId, alts⟩
let auxDecl ← mkAuxParam resultType
pushElement (.cases auxDecl cases)
return .fvar auxDecl.fvarId
let result := .fvar auxDecl.fvarId
mkOverApplication result args casesInfo.arity
visitCtor (arity : Nat) (e : Expr) : M (Arg .pure) :=
etaIfUnderApplied e arity do
@@ -878,14 +843,14 @@ where
visitLet (e : Expr) (xs : Array Expr) : M (Arg .pure) := do
match e with
| .letE binderName type value body nondep =>
| .letE binderName type value body _ =>
let type := type.instantiateRev xs
let value := value.instantiateRev xs
if (← (liftMetaM <| Meta.isProp type) <||> isTypeFormerType type) then
visitLet body (xs.push value)
else
let type' ← toLCNFType type
let letDecl ← mkLetDecl binderName type value type' (← visit value) nondep
let letDecl ← mkLetDecl binderName type value type' (← visit value)
visitLet body (xs.push (.fvar letDecl.fvarId))
| _ =>
let e := e.instantiateRev xs


@@ -57,11 +57,6 @@ partial def markDeclPublicRec (phase : Phase) (decl : Decl pu) : CompilerM Unit
trace[Compiler.inferVisibility] m!"Marking {ref} as opaque because it is used by transparent {decl.name}"
markDeclPublicRec phase refDecl
register_builtin_option compiler.relaxedMetaCheck : Bool := {
defValue := false
descr := "Allow mixed `meta`/non-`meta` references in the same module. References to imports are unaffected."
}
/-- Checks whether references in the given declaration adhere to phase distinction. -/
partial def checkMeta (origDecl : Decl pu) : CompilerM Unit := do
if !(← getEnv).header.isModule || !compiler.checkMeta.get (← getOptions) then
@@ -94,18 +89,19 @@ where go (isMeta isPublic : Bool) (decl : Decl pu) : StateT NameSet CompilerM Un
throwError "Invalid public `meta` definition `{.ofConstName origDecl.name}`, \
`{.ofConstName ref}` is not accessible here; consider adding \
`public meta import {env.header.moduleNames[modIdx]!}`"
match getIRPhases env ref, isMeta with
| .runtime, true =>
let relaxedCheck := compiler.relaxedMetaCheck.get (← getOptions) && !env.isImportedConst ref
let irPhases := getIRPhases env ref
if !relaxedCheck && irPhases == .runtime && isMeta then
if let some modIdx := env.getModuleIdxFor? ref then
-- We use `public` here as a conservative default (and most common case) as necessary
-- visibility is only clear at the end of the file.
throwError "Invalid `meta` definition `{.ofConstName origDecl.name}`, \
`{.ofConstName ref}` is not accessible here; consider adding \
`public meta import {env.header.moduleNames[modIdx]!}`"
`{.ofConstName ref}` is not accessible here; consider adding \
`public meta import {env.header.moduleNames[modIdx]!}`"
else
throwError "Invalid `meta` definition `{.ofConstName origDecl.name}`, \
`{.ofConstName ref}` not marked `meta`"
| .comptime, false =>
`{.ofConstName ref}` not marked `meta`"
if !relaxedCheck && irPhases == .comptime && !isMeta then
if let some modIdx := env.getModuleIdxFor? ref then
if !isMarkedMeta env ref then
throwError "Invalid definition `{.ofConstName origDecl.name}`, may not access \
@@ -113,13 +109,13 @@ where go (isMeta isPublic : Bool) (decl : Decl pu) : StateT NameSet CompilerM Un
`import {env.header.moduleNames[modIdx]!}`"
throwError "Invalid definition `{.ofConstName origDecl.name}`, may not access \
declaration `{.ofConstName ref}` marked as `meta`"
| irPhases, _ =>
-- We allow auxiliary defs to be used in either phase but we need to recursively check
-- *their* references in this case. We also need to do this for non-auxiliary defs in case a
-- public meta def tries to use a private meta import via a local private meta def :/ .
if irPhases == .all || isPublic && isPrivateName ref then
if let some ⟨_, refDecl⟩ ← getLocalDecl? ref then
go isMeta isPublic (refDecl.castPurity! pu)
-- We allow auxiliary defs to be used in either phase but we need to recursively check
-- *their* references in this case. We also need to do this for non-auxiliary defs in case a
-- public meta def tries to use a private meta import via a local private meta def :/ .
if irPhases == .all || isPublic && isPrivateName ref then
if let some ⟨_, refDecl⟩ ← getLocalDecl? ref then
go isMeta isPublic (refDecl.castPurity! pu)
/--
Checks that imports necessary for inlining/specialization are public as otherwise we may run into


@@ -24,4 +24,9 @@ register_builtin_option compiler.checkMeta : Bool := {
intended only for debugging purposes."
}
register_builtin_option compiler.relaxedMetaCheck : Bool := {
defValue := false
descr := "Allow mixed `meta`/non-`meta` references in the same module. References to imports are unaffected."
}
end Lean.Compiler


@@ -66,3 +66,4 @@ public import Lean.Elab.DocString
public import Lean.Elab.DocString.Builtin
public import Lean.Elab.Parallel
public import Lean.Elab.BuiltinDo
public import Lean.Elab.Idbg


@@ -70,16 +70,17 @@ where go
private def addNamespace (header : Name) : CommandElabM Unit :=
addScopes (isNewNamespace := true) (isNoncomputable := false) (attrs := []) header
private def popScopes (numScopes : Nat) : CommandElabM Unit :=
for _ in *...numScopes do
popScope
def withNamespace {α} (ns : Name) (elabFn : CommandElabM α) : CommandElabM α := do
addNamespace ns
let a ← elabFn
modify fun s => { s with scopes := s.scopes.drop ns.getNumParts }
popScopes ns.getNumParts
pure a
private def popScopes (numScopes : Nat) : CommandElabM Unit :=
for _ in *...numScopes do
popScope
private def innermostScopeName? : List Scope → Option Name
| { header := "", .. } :: _ => none
| { header := h, .. } :: _ => some <| .mkSimple h


@@ -119,34 +119,59 @@ private def elabDiscrs (discrStxs : TSyntaxArray ``matchDiscr) : TermElabM (Arra
return discrs
open Meta.Match (throwIncorrectNumberOfPatternsAt logIncorrectNumberOfPatternsAt) in
private def elabPatterns (patternStxs : Array Syntax) (discrTypes : Array Expr) : TermElabM (Array Expr) :=
private def elabPatterns (patternStxs : Array Syntax) (numDiscrs : Nat) (matchType : Expr) : TermElabM (Array Expr) :=
withReader (fun ctx => { ctx with implicitLambda := false }) do
let mut patterns := #[]
let patternStxs ← Term.checkNumPatterns discrTypes.size patternStxs
let mut matchType := matchType
let patternStxs ← Term.checkNumPatterns numDiscrs patternStxs
for h : idx in *...patternStxs.size do
let patternStx := patternStxs[idx]
let discrType := discrTypes[idx]!
matchType whnf matchType
let .forallE _ discrType matchTypeBody _ := matchType
| throwError "unexpected match type {matchType}"
let pattern Term.withSynthesize <| Term.withPatternElabConfig <| Term.elabTermEnsuringType patternStx discrType
patterns := patterns.push pattern
matchType := matchTypeBody.instantiate1 pattern
return patterns
/--
Abstract earlier discriminant free variables from each discriminant type to produce
a dependent match motive. E.g., for discriminants `[i, h]` with types `[Nat, i ≤ n]` and
body `m β`, this produces `∀ (x₁ : Nat), ∀ (x₂ : x₁ ≤ n), m β`.
-/
private def mkDepMatchMotive (discrs : Array Term.Discr) (body : Expr) : TermElabM Expr := do
let mut matchType := body
for i in [0:discrs.size] do
let idx := discrs.size - 1 - i
let discr := discrs[idx]!
let discrType ← inferType discr.expr
let discrType ← transform (usedLetOnly := true) (← instantiateMVars discrType)
-- Abstract earlier discriminants (reversed so that the closest earlier discriminant
-- gets de Bruijn index 0). Non-fvar discriminants occupy index slots but won't match,
-- preserving correct de Bruijn numbering.
let earlierDiscrs := discrs[:idx].toArray.map (·.expr) |>.reverse
let discrType := if earlierDiscrs.isEmpty then discrType else discrType.abstract earlierDiscrs
let userName ← mkUserNameFor discr.expr
matchType := mkForall userName .default discrType matchType
return matchType
def withElaboratedLHS {α} (patternVarDecls : Array Term.PatternVarDecl) (patternStxs : Array Syntax)
(lhsStx : Syntax) (discrTypes : Array Expr) (k : AltLHS → TermElabM α) : TermElabM α := do
let patterns ← Term.withSynthesize <| withRef lhsStx <| elabPatterns patternStxs discrTypes
(lhsStx : Syntax) (numDiscrs : Nat) (matchType : Expr)
(k : AltLHS → TermElabM α) : TermElabM α := do
let patterns ← Term.withSynthesize <| withRef lhsStx <| elabPatterns patternStxs numDiscrs matchType
trace[Elab.do.match] "patterns: {patterns}"
let matchType := discrTypes.foldr (init := mkConst ``Unit) (fun discrType matchType => mkForall `x .default discrType matchType)
Term.withDepElimPatterns patternVarDecls patterns matchType fun localDecls patterns _matchType => do
k { ref := lhsStx, fvarDecls := localDecls.toList, patterns := patterns.toList }
private abbrev DoMatchAltView := Term.MatchAltView ``doSeq
private def elabMatchAlt (discrs : Array Term.Discr) (discrTypes : Array Expr)
private def elabMatchAlt (discrs : Array Term.Discr) (matchType : Expr)
(alt : DoMatchAltView) (dec : DoElemCont) : DoElabM (AltLHS × Expr) := withRef alt.ref do
let (patternVars, alt) ← Term.collectPatternVars alt
trace[Elab.do.match] "patternVars: {patternVars}"
controlAtTermElabM fun runInBase => do
Term.withPatternVars patternVars fun patternVarDecls => do
withElaboratedLHS patternVarDecls alt.patterns alt.lhs discrTypes fun altLHS =>
withElaboratedLHS patternVarDecls alt.patterns alt.lhs discrs.size matchType fun altLHS =>
Term.withEqs discrs altLHS.patterns fun eqs =>
withLocalInstances altLHS.fvarDecls <| runInBase do
trace[Elab.do.match] "elabMatchAlt: {alt.lhs}"
@@ -158,13 +183,8 @@ private def elabMatchAlt (discrs : Array Term.Discr) (discrTypes : Array Expr)
private def elabMatchAlts (discrs : Array Term.Discr) (alts : Array DoMatchAltView) (dec : DoElemCont) : DoElabM (Expr × Array AltLHS × Array Expr) := do
let alts ← liftMacroM <| Term.expandMacrosInPatterns alts
let discrTypes ← discrs.mapM fun discr => do
let discrType ← inferType discr.expr
transform (usedLetOnly := true) (← instantiateMVars discrType)
let (lhss, rhss) ← Array.unzip <$> alts.mapM (elabMatchAlt discrs discrTypes · dec)
let mut matchType ← mkMonadicType (← read).doBlockResultType
for discrType in discrTypes.reverse do
matchType := mkForall `x .default (← instantiateMVars discrType) matchType
let matchType ← mkDepMatchMotive discrs (← mkMonadicType (← read).doBlockResultType)
let (lhss, rhss) ← Array.unzip <$> alts.mapM (elabMatchAlt discrs matchType · dec)
return (matchType, lhss, rhss)
private def compileMatch (discrs : Array Term.Discr) (matchType : Expr) (lhss : List AltLHS)
@@ -190,7 +210,7 @@ private def elabDoMatchCore (discrs : TSyntaxArray ``matchDiscr) (alts : Array D
let (discrs, matchType, lhss, rhss) ← mapTermElabM Term.commitIfDidNotPostpone do
let discrs ← Term.withSynthesize <| elabDiscrs discrs
let (matchType, lhss, rhss) ← elabMatchAlts discrs alts dec
Term.synthesizeSyntheticMVarsUsingDefault -- Rationale similar to term match elaborator
Term.synthesizeSyntheticMVars
let lhss ← Term.instantiateAltLHSs lhss
return (discrs, matchType, lhss, rhss)
compileMatch discrs matchType lhss rhss
@@ -222,7 +242,11 @@ def getAltsPatternVars (alts : TSyntaxArray ``matchAlt) : TermElabM (Array Ident
-- Expand non-atomic discriminants for independent elaboration problems
if let some discrs ← expandNonAtomicDiscrs? discrs then
let newStx ← `(doElem| match $[(generalizing := $gen?)]? $(motive?)? $discrs,* with $alts:matchAlt*)
return Term.withMacroExpansion stx newStx <| elabDoElem newStx dec
-- We use `mkSaveInfoAnnotation` to make sure the info trees for the discriminants are saved
-- even if the inner match elaboration is postponed (returning a metavariable).
-- Without this, `withTermInfoContext'` in `elabDoElemFns` would discard the subtrees
-- containing TermInfo for variable references in the discriminants.
return Term.mkSaveInfoAnnotation (← Term.withMacroExpansion stx newStx <| elabDoElem newStx dec)
-- Expand simple matches to `let`
if let `(matchAltExpr| | $y:ident => $seq) := alts.getD 0 .missing then

View File

@@ -88,13 +88,18 @@ private def addAndCompileExprForEval (declName : Name) (value : Expr) (allowSorr
-- An alternative design would be to make `elabTermForEval` into a term elaborator and elaborate the command all at once
-- with `unsafe def _eval := term_for_eval% $t`, which we did try, but unwanted error messages
-- such as "failed to infer definition type" can surface.
let defView := mkDefViewOfDef { isUnsafe := true, visibility := .private }
let defView := mkDefViewOfDef { isUnsafe := true, visibility := .private, computeKind := .meta }
(← `(Parser.Command.definition|
def $(mkIdent <| `_root_ ++ declName) := $(← Term.exprToSyntax value)))
let declName := mkPrivateName (← getEnv) declName
-- Allow access to both `meta` and non-`meta` declarations as the compilation result does not
-- escape the current module.
withOptions (Compiler.compiler.checkMeta.set · false) do
-- For simplicity, allow arbitrary phase accesses in server, `import`-correct access otherwise.
-- These are the most permissive respective settings that are guaranteed not to lead to missing
-- IR errors.
withOptions (fun opts =>
if Elab.inServer.get opts then
Compiler.compiler.checkMeta.set opts false
else
Compiler.compiler.relaxedMetaCheck.set opts true) do
Term.elabMutualDef #[] { header := "" } #[defView]
assert! (← getEnv).contains declName
unless allowSorry do

View File

@@ -880,4 +880,22 @@ partial def withSetOptionIn (cmd : CommandElab) : CommandElab := fun stx => do
export Elab.Command (Linter addLinter)
namespace Parser.Command
/--
Returns syntax for `private` or `public` visibility depending on `isPublic`. This function should be
used to generate visibility syntax for declarations that is independent of the presence of
`public section`s.
-/
def visibility.ofBool (isPublic : Bool) : TSyntax ``visibility :=
Unhygienic.run <| if isPublic then `(visibility| public) else `(visibility| private)
/--
Returns syntax for `private` if `attrKind` is `local` and `public` otherwise.
-/
def visibility.ofAttrKind (attrKind : TSyntax ``Term.attrKind) : TSyntax ``visibility :=
visibility.ofBool <| !attrKind matches `(attrKind| local)
end Parser.Command
end Lean

View File

@@ -105,7 +105,7 @@ inductive DoElemContKind
/--
Elaboration of a `do` block `do $e; $rest`, results in a call
``elabTerm `(do $e; $rest) = elabElem e dec``, where `elabElem e ·` is the elaborator for `do`
``elabTerm `(do $e; $rest) = elabDoElem e dec``, where `elabDoElem e ·` is the elaborator for `do`
element `e`, and `dec` is the `DoElemCont` describing the elaboration of the rest of the block
`rest`.
@@ -147,8 +147,8 @@ deriving Inhabited
/--
The type of elaborators for `do` block elements.
It is ``elabTerm `(do $e; $rest) = elabElem e dec``, where `elabElem e ·` is the elaborator for `do`
element `e`, and `dec` is the `DoElemCont` describing the elaboration of the rest of the block
It is ``elabTerm `(do $e; $rest) = elabDoElem e dec``, where `elabDoElem e ·` is the elaborator for
`do` element `e`, and `dec` is the `DoElemCont` describing the elaboration of the rest of the block
`rest`.
-/
abbrev DoElab := TSyntax `doElem → DoElemCont → DoElabM Expr

View File

@@ -37,20 +37,20 @@ def ControlStack.base (mi : MonadInfo) : ControlStack where
runInBase e := pure e
restoreCont dec := pure dec
def ControlStack.stateT (baseMonadInfo : MonadInfo) (mutVars : Array Name) (σ : Expr) (base : ControlStack) : ControlStack where
def ControlStack.stateT (baseMonadInfo : MonadInfo) (mutVarIdents : Array Ident) (σ : Expr) (base : ControlStack) : ControlStack where
description _ := m!"StateT {σ} over {base.description ()}"
m := return mkApp2 (mkConst ``StateT [baseMonadInfo.u, baseMonadInfo.v]) (← getσ) (← base.m)
stM α := stM α >>= base.stM
runInBase e := do
-- `e : StateT σ m α`. Fetch the state tuple `s : σ` and apply it to `e`, `e.run s`.
-- See also `StateT.monadControl.liftWith`.
let (tuple, tupleTy) ← mkProdMkN (← mutVars.mapM (getFVarFromUserName ·)) baseMonadInfo.u
let mutExprs ← mutVarIdents.mapM fun x => do
let defn ← getLocalDeclFromUserName x.getId
Term.addTermInfo' x defn.toExpr
pure defn.toExpr
let (tuple, tupleTy) ← mkProdMkN mutExprs baseMonadInfo.u
unless ← isDefEq tupleTy σ do -- just for sanity; maybe delete in the future
throwError "State tuple type mismatch: expected {σ}, got {tupleTy}. This is a bug in the `do` elaborator."
-- throwError "tuple: {tuple}, tupleTy: {tupleTy}, {σ}"
-- let α ← mkFreshResultType `α
-- let eTy := mkApp3 (mkConst ``StateT [mi.u, mi.v]) σ mi.m α
-- let e ← Term.ensureHasType eTy e -- might need to replace mi.m by a metavariable due to match refinement
base.runInBase <| mkApp e tuple
restoreCont dec := do
-- Wrap `dec` such that the result type is `(dec.resultType × σ)` by unpacking the state tuple
@@ -59,11 +59,12 @@ def ControlStack.stateT (baseMonadInfo : MonadInfo) (mutVars : Array Name) (σ :
let resultType ← stM dec.resultType
let k : DoElabM Expr := do
let p ← getFVarFromUserName resultName
bindMutVarsFromTuple (dec.resultName :: mutVars.toList) p.fvarId! do
bindMutVarsFromTuple (dec.resultName :: mutVarNames.toList) p.fvarId! do
dec.k
base.restoreCont { resultName, resultType, k }
where
getσ := do mkProdN (← mutVars.mapM (LocalDecl.type <$> getLocalDeclFromUserName ·)) baseMonadInfo.u
mutVarNames := mutVarIdents.map (·.getId)
getσ := do mkProdN (← mutVarNames.mapM (LocalDecl.type <$> getLocalDeclFromUserName ·)) baseMonadInfo.u
stM α := return mkApp2 (mkConst ``Prod [baseMonadInfo.u, baseMonadInfo.u]) α (← getσ) -- NB: muts `σ` might have been refined by dependent pattern matches
def ControlStack.optionT (baseMonadInfo : MonadInfo) (optionTWrapper casesOnWrapper : Name)
@@ -200,9 +201,10 @@ structure ControlLifter where
def ControlLifter.ofCont (info : ControlInfo) (dec : DoElemCont) : DoElabM ControlLifter := do
let mi := (← read).monadInfo
let reassignedMutVars := (← read).mutVars |>.map (·.getId) |>.filter info.reassigns.contains
let reassignedMutVars := (← read).mutVars |>.filter (info.reassigns.contains ·.getId)
let reassignedMutVarNames := reassignedMutVars.map (·.getId)
let ρ := (← getReturnCont).resultType
let σ ← mkProdN (← reassignedMutVars.mapM (LocalDecl.type <$> getLocalDeclFromUserName ·)) mi.u
let σ ← mkProdN (← reassignedMutVarNames.mapM (LocalDecl.type <$> getLocalDeclFromUserName ·)) mi.u
let needEarlyReturn := if info.returnsEarly then some ρ else none
let needBreak := info.breaks && (← getBreakCont).isSome

View File

@@ -19,13 +19,15 @@ inductive DocScope where
| import (mods : Array Name)
private def imports := leading_parser sepBy1 ident ", "
set_option compiler.relaxedMetaCheck true in
private meta def importsM := imports
instance : FromDocArg DocScope where
fromDocArg v := private
match v with
| .str s => do
let stx ← withRef s <| parseQuotedStrLit (whitespace >> imports.fn) s
let `(imports|$modNames,*) := stx
let `(importsM|$modNames,*) := stx
| throwErrorAt stx "Expected comma-separated imports list, got `{stx}`"
let modNames : Array Ident := modNames
return .import (modNames.map (·.getId))

src/Lean/Elab/Idbg.lean Normal file
View File

@@ -0,0 +1,372 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Sebastian Ullrich
-/
module
prelude
public import Lean.Elab.Do.Basic
meta import Lean.Parser.Do
import Std.Internal.Async.TCP
/-!
# Interactive Debug Expression Evaluator (`idbg`)
`idbg` enables live communication between a running compiled Lean program and the language server.
## Protocol
Communication uses a length-prefixed TCP protocol over localhost. Both server (language server side)
and client (compiled program side) compute a deterministic port from the source location hash.
-/
open Lean Lean.Elab Lean.Elab.Term Lean.Meta
open Std.Net Std.Internal.IO.Async
namespace Lean.Idbg
/-- Custom `Name` serialization that preserves the exact structure.
The standard `ToJson`/`FromJson Name` uses `toString`/`toName` which doesn't
round-trip for hygienic names (e.g., names containing `@`). -/
def nameToJson : Name → Json
| .anonymous => Json.null
| .str p s => Json.mkObj [("str", Json.arr #[nameToJson p, toJson s])]
| .num p n => Json.mkObj [("num", Json.arr #[nameToJson p, n])]
/-- Inverse of `nameToJson`. -/
partial def nameFromJson? (j : Json) : Except String Name := do
if j.isNull then return .anonymous
if let some arr := (j.getObjVal? "str").toOption then
let #[p, s] := (← fromJson? arr : Array Json) | .error "str expects 2 elements"
return .str (← nameFromJson? p) (← fromJson? s)
if let some arr := (j.getObjVal? "num").toOption then
let #[p, n] := (← fromJson? arr : Array Json) | .error "num expects 2 elements"
return .num (← nameFromJson? p) (← fromJson? n)
.error s!"expected Name, got {j}"
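The wire schema above is language-agnostic; as an illustration only (hypothetical helper names, names modeled as component lists rather than Lean's `Name` type), a minimal Python sketch of the same round-trip:

```python
import json

def name_to_json(name):
    # `name` is a list of str/int components, outermost component last;
    # [] stands in for `.anonymous` and maps to JSON null.
    if not name:
        return None
    parent, last = name[:-1], name[-1]
    key = "str" if isinstance(last, str) else "num"
    return {key: [name_to_json(parent), last]}

def name_from_json(j):
    # Inverse of name_to_json.
    if j is None:
        return []
    if "str" in j:
        p, s = j["str"]
        return name_from_json(p) + [s]
    p, n = j["num"]
    return name_from_json(p) + [n]

name = ["Lean", "Elab", 42]  # stands in for a Name with a numeric component
wire = json.dumps(name_to_json(name))
assert name_from_json(json.loads(wire)) == name
```

Serializing the structure directly, instead of going through `toString`/`toName`, is what keeps hygienic names (with macro-scope components) round-trippable.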
/-- Serialize `BinderInfo` to JSON. -/
def binderInfoToJson : BinderInfo → Json
| .default => "default"
| .implicit => "implicit"
| .strictImplicit => "strictImplicit"
| .instImplicit => "instImplicit"
/-- Deserialize `BinderInfo` from JSON. -/
def binderInfoFromJson? : Json → Except String BinderInfo
| .str "default" => .ok .default
| .str "implicit" => .ok .implicit
| .str "strictImplicit" => .ok .strictImplicit
| .str "instImplicit" => .ok .instImplicit
| j => .error s!"expected BinderInfo, got {j}"
/-- Serialize `Literal` to JSON. -/
def literalToJson : Literal → Json
| .natVal n => Json.mkObj [("natVal", n)]
| .strVal s => Json.mkObj [("strVal", s)]
/-- Deserialize `Literal` from JSON. -/
def literalFromJson? (j : Json) : Except String Literal := do
if let some n := (j.getObjVal? "natVal").toOption then
return .natVal (← fromJson? n)
if let some s := (j.getObjVal? "strVal").toOption then
return .strVal (← fromJson? s)
.error s!"expected Literal, got {j}"
/-- Serialize `Level` to JSON. -/
partial def levelToJson : Level → Json
| .zero => Json.mkObj [("zero", Json.null)]
| .succ l => Json.mkObj [("succ", levelToJson l)]
| .max a b => Json.mkObj [("max", Json.arr #[levelToJson a, levelToJson b])]
| .imax a b => Json.mkObj [("imax", Json.arr #[levelToJson a, levelToJson b])]
| .param n => Json.mkObj [("param", nameToJson n)]
| .mvar id => Json.mkObj [("mvar", nameToJson id.name)]
/-- Deserialize `Level` from JSON. -/
partial def levelFromJson? (j : Json) : Except String Level := do
if (j.getObjVal? "zero").toOption.isSome then
return .zero
if let some l := (j.getObjVal? "succ").toOption then
return .succ (← levelFromJson? l)
if let some arr := (j.getObjVal? "max").toOption then
let #[a, b] := (← fromJson? arr : Array Json) | .error "max expects 2 elements"
return .max (← levelFromJson? a) (← levelFromJson? b)
if let some arr := (j.getObjVal? "imax").toOption then
let #[a, b] := (← fromJson? arr : Array Json) | .error "imax expects 2 elements"
return .imax (← levelFromJson? a) (← levelFromJson? b)
if let some n := (j.getObjVal? "param").toOption then
return .param (← nameFromJson? n)
if let some n := (j.getObjVal? "mvar").toOption then
return .mvar ⟨← nameFromJson? n⟩
.error s!"expected Level, got {j}"
/-- Serialize `Expr` to JSON. Metadata is stripped; free variables are preserved. -/
partial def exprToJson : Expr → Json
| .bvar i => Json.mkObj [("bvar", i)]
| .fvar id => Json.mkObj [("fvar", nameToJson id.name)]
| .mvar id => Json.mkObj [("mvar", nameToJson id.name)]
| .sort l => Json.mkObj [("sort", levelToJson l)]
| .const n ls => Json.mkObj [("const", nameToJson n), ("levels", Json.arr (ls.toArray.map levelToJson))]
| .app fn arg => Json.mkObj [("app", Json.arr #[exprToJson fn, exprToJson arg])]
| .lam n ty b bi => Json.mkObj [("lam", Json.mkObj [("name", nameToJson n), ("type", exprToJson ty), ("body", exprToJson b), ("bi", binderInfoToJson bi)])]
| .forallE n ty b bi => Json.mkObj [("forallE", Json.mkObj [("name", nameToJson n), ("type", exprToJson ty), ("body", exprToJson b), ("bi", binderInfoToJson bi)])]
| .letE n ty v b nd => Json.mkObj [("letE", Json.mkObj [("name", nameToJson n), ("type", exprToJson ty), ("value", exprToJson v), ("body", exprToJson b), ("nondep", nd)])]
| .lit l => Json.mkObj [("lit", literalToJson l)]
| .mdata _ e => exprToJson e -- strip metadata
| .proj tn i s => Json.mkObj [("proj", Json.mkObj [("typeName", nameToJson tn), ("idx", i), ("struct", exprToJson s)])]
/-- Deserialize `Expr` from JSON. -/
partial def exprFromJson? (j : Json) : Except String Expr := do
if let some i := (j.getObjVal? "bvar").toOption then
return .bvar (← fromJson? i)
if let some id := (j.getObjVal? "fvar").toOption then
return .fvar ⟨← nameFromJson? id⟩
if let some id := (j.getObjVal? "mvar").toOption then
return .mvar ⟨← nameFromJson? id⟩
if let some l := (j.getObjVal? "sort").toOption then
return .sort (← levelFromJson? l)
if (j.getObjVal? "const").toOption.isSome then
let n ← nameFromJson? (← j.getObjVal? "const")
let ls : Array Json ← fromJson? (← j.getObjVal? "levels")
return .const n (← ls.toList.mapM levelFromJson?)
if let some arr := (j.getObjVal? "app").toOption then
let #[fn, arg] := (← fromJson? arr : Array Json) | .error "app expects 2 elements"
return .app (← exprFromJson? fn) (← exprFromJson? arg)
if let some obj := (j.getObjVal? "lam").toOption then
return .lam (← nameFromJson? (← obj.getObjVal? "name"))
(← exprFromJson? (← obj.getObjVal? "type"))
(← exprFromJson? (← obj.getObjVal? "body"))
(← binderInfoFromJson? (← obj.getObjVal? "bi"))
if let some obj := (j.getObjVal? "forallE").toOption then
return .forallE (← nameFromJson? (← obj.getObjVal? "name"))
(← exprFromJson? (← obj.getObjVal? "type"))
(← exprFromJson? (← obj.getObjVal? "body"))
(← binderInfoFromJson? (← obj.getObjVal? "bi"))
if let some obj := (j.getObjVal? "letE").toOption then
return .letE (← nameFromJson? (← obj.getObjVal? "name"))
(← exprFromJson? (← obj.getObjVal? "type"))
(← exprFromJson? (← obj.getObjVal? "value"))
(← exprFromJson? (← obj.getObjVal? "body"))
(← fromJson? (← obj.getObjVal? "nondep"))
if let some l := (j.getObjVal? "lit").toOption then
return .lit (← literalFromJson? l)
if let some obj := (j.getObjVal? "proj").toOption then
return .proj (← nameFromJson? (← obj.getObjVal? "typeName"))
(← fromJson? (← obj.getObjVal? "idx"))
(← exprFromJson? (← obj.getObjVal? "struct"))
.error s!"expected Expr, got {j}"
/-- Deterministic port for a given `idbg` site, in range [10000, 65535]. -/
def idbgPort (siteId : String) : UInt16 :=
let h := hash siteId
(h % 55535 + 10000).toUInt16
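As a sanity check of the port arithmetic, a Python stand-in (illustrative only: any stable 64-bit hash can replace Lean's `hash`; here the first 8 bytes of SHA-256 are used so the sketch is deterministic across runs):

```python
import hashlib

def idbg_port(site_id: str) -> int:
    # Stand-in for Lean's `hash`: first 8 bytes of SHA-256 as an integer.
    h = int.from_bytes(hashlib.sha256(site_id.encode()).digest()[:8], "big")
    # Mirrors `h % 55535 + 10000`: maps into the unprivileged port range.
    return h % 55535 + 10000

p = idbg_port("/tmp/Foo.lean:123")   # hypothetical site id
assert 10000 <= p <= 65534           # fits in UInt16
assert p == idbg_port("/tmp/Foo.lean:123")  # deterministic
```

Note that `h % 55535 + 10000` actually tops out at 65534, so the result always fits in a `UInt16` with one value to spare.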
def sendMsg (client : TCP.Socket.Client) (msg : String) : IO Unit := do
let bytes := msg.toUTF8
let header := s!"{bytes.size}\n".toUTF8
let t ← (client.sendAll #[header, bytes]).toIO
t.block
def recvMsg (client : TCP.Socket.Client) : IO String := do
-- Read until newline to get the decimal length
let mut header := ByteArray.empty
repeat
let t ← (client.recv? 1).toIO
let some chunk ← t.block | throw (.userError "idbg: connection closed")
if chunk[0]! == '\n'.toUInt8 then break
header := header ++ chunk
let some lenStr := String.fromUTF8? header | throw (.userError "idbg: invalid header")
let some len := lenStr.toNat? | throw (.userError "idbg: invalid length")
let mut payload := ByteArray.empty
while payload.size < len do
let t ← (client.recv? (len - payload.size).toUInt64).toIO
let some chunk ← t.block | throw (.userError "idbg: connection closed")
payload := payload ++ chunk
let some s := String.fromUTF8? payload | throw (.userError "idbg: invalid UTF-8")
return s
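The wire format implemented by `sendMsg`/`recvMsg` is a decimal byte length, a newline, then the UTF-8 payload. A minimal Python sketch of the same framing over an in-memory buffer (no sockets, just the format):

```python
def frame(msg: str) -> bytes:
    # Length-prefixed frame: "<payload length in bytes>\n<payload>".
    payload = msg.encode("utf-8")
    return f"{len(payload)}\n".encode() + payload

def unframe(buf: bytes) -> tuple[str, bytes]:
    # Parse one frame; returns the message and any remaining bytes.
    header, rest = buf.split(b"\n", 1)
    length = int(header)
    return rest[:length].decode("utf-8"), rest[length:]

wire = frame('{"type": "sort"}') + frame("done")
first, rest = unframe(wire)
assert first == '{"type": "sort"}'
assert unframe(rest)[0] == "done"
```

Prefixing the byte length makes message boundaries unambiguous even when the payload itself contains newlines, which JSON-encoded expressions easily can.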
/-- Start a TCP server on the deterministic port for this site,
wait for one connection, send expression JSON, and receive the result string.
Returns `none` if the port is still held by a previous (cancelled) server. -/
def idbgServer (siteId : String) (exprJson : Json) : IO (Option String) := do
let port := idbgPort siteId
let addr := SocketAddressV4.mk (.ofParts 127 0 0 1) port
let mut server? : Option TCP.Socket.Server := none
for _ in List.range 100 do -- retry for up to 10s
match (← (do let s ← TCP.Socket.Server.mk; s.bind addr; s.listen 1; return s).toBaseIO) with
| .ok s => server? := some s; break
| .error _ => IO.sleep 100
let some server := server? | return none
let t ← server.accept |>.toIO
let client ← t.block
sendMsg client exprJson.compress
let result ← recvMsg client
let t ← client.shutdown |>.toIO
t.block
return some result
/-- Load the program's environment from its imports.
The imports include the current module (appended last by the elaborator) so that
same-file declarations are available. If its `.olean` is not found (e.g., when
running `lean` directly), we fall back to just the transitive imports. -/
def idbgLoadEnv (imports : Array Import) : IO Environment := do
try
importModules imports {} 0
catch _ =>
importModules imports.pop {} 0
/-- Compile and evaluate an expression in the given environment. -/
def idbgCompileAndEval (α : Type) [Nonempty α]
(env : Environment) (type value : Expr) : IO α := do
let name := `_idbg
let decl := Declaration.defnDecl {
name
levelParams := []
type
value
hints := .opaque
safety := .unsafe
}
let ((), {env := env', ..}) ← (addAndCompile decl).toIO
{ fileName := "<idbg>", fileMap := default, options := {} }
{ env }
match unsafe env'.evalConst α {} name (checkMeta := false) with
| .ok val => return val
| .error msg => throw (.userError s!"idbg evalConst failed: {msg}")
/-- Connect to the debug server, receive expressions, evaluate, send results. Loops forever. -/
@[nospecialize, export lean_idbg_client_loop] def idbgClientLoopImpl {α : Type} [Nonempty α]
(siteId : String) (imports : Array Import) (apply : α → String) : IO Unit := do
let baseEnv ← idbgLoadEnv imports
let port := idbgPort siteId
let addr := SocketAddressV4.mk (.ofParts 127 0 0 1) port
while true do
try
let client ← TCP.Socket.Client.mk
let t ← (client.connect addr).toIO
t.block
let msg ← recvMsg client
let json ← IO.ofExcept (Json.parse msg)
let type ← IO.ofExcept (exprFromJson? (← IO.ofExcept (json.getObjVal? "type")))
let value ← IO.ofExcept (exprFromJson? (← IO.ofExcept (json.getObjVal? "value")))
let fnVal ← idbgCompileAndEval α baseEnv type value
let result := apply fnVal
sendMsg client result
let t ← client.shutdown |>.toIO
t.block
catch e =>
-- Only log non-connection errors (connection refused is normal during reconnect)
let msg := toString e
unless (msg.find? "connection refused").isSome do
IO.eprintln s!"idbg client: {e}"
IO.sleep 500
end Lean.Idbg
namespace Lean.Elab.Do
open Lean.Idbg
@[builtin_doElem_control_info Lean.Parser.Term.doIdbg]
def controlInfoIdbg : ControlInfoHandler := fun _ => return default
/-- Core elaboration logic shared by term and do-element forms.
Elaborates `e`, wraps the result in `toString ∘ repr`, abstracts over all local declarations,
and generates both the server-side TCP exchange and the runtime client loop code. -/
private def elabIdbgCore (e : Syntax) (body : TSyntax `term) (ref : Syntax) (expectedType? : Option Expr) :
TermElabM Expr := do
let fileName ← IO.FS.realPath (← getFileName)
let siteId := toString (hash s!"{fileName}:{ref.getPos?.getD 0}")
-- Collect ALL non-aux local declarations.
-- We need all of them (not just those used in the current expression)
-- because the expression can change on the server side while the
-- compiled program's apply closure is fixed.
let lctx ← getLCtx
let mut localDecls : Array LocalDecl := #[]
for decl in lctx do
if decl.isAuxDecl then continue
localDecls := localDecls.push decl
let localFVars := localDecls.map (mkFVar ·.fvarId)
-- Elaborate e, wrap in `toString ∘ repr`.
-- `synthesizeSyntheticMVarsNoPostponing` forces pending instance resolution
-- so that `instantiateMVars` can fully resolve all metavariables.
let eExpr ← Term.elabTerm e none
Term.synthesizeSyntheticMVarsNoPostponing
let eExpr ← instantiateMVars eExpr
let reprExpr ← Meta.mkAppM ``repr #[eExpr]
let reprStrExpr ← Meta.mkAppM ``toString #[reprExpr]
Term.synthesizeSyntheticMVarsNoPostponing
let reprStrExpr ← instantiateMVars reprStrExpr
-- Abstract over ALL locals as lambdas (not lets).
-- Do-notation let-bindings have `nondep := false`, so `mkLambdaFVars` would
-- create `letE` for them. We temporarily set `nondep := true` so that
-- `generalizeNondepLet` (the default) turns them all into lambdas.
let lctx' := localDecls.foldl (init := ← getLCtx) fun lctx decl =>
lctx.modifyLocalDecl decl.fvarId (·.setNondep true)
let (abstractedValue, abstractedType) ← withLCtx' lctx' do
let abstractedValue ← Meta.mkLambdaFVars localFVars reprStrExpr
let abstractedType ← Meta.inferType abstractedValue
return (← instantiateMVars abstractedValue, ← instantiateMVars abstractedType)
if abstractedValue.hasMVar then
throwError "idbg: abstracted value still has metavariables"
if abstractedType.hasMVar then
throwError "idbg: abstracted type still has metavariables"
-- Server mode: serialize and serve in a background snapshot task.
-- Skip if expression contains sorry (partial input during editing).
if Elab.inServer.get (← getOptions) && !abstractedValue.hasSorry then
let json := Json.mkObj [
("type", exprToJson abstractedType),
("value", exprToJson abstractedValue)
]
let cancelTk ← IO.CancelToken.new
let act ← Core.wrapAsyncAsSnapshot (cancelTk? := cancelTk) fun () => do
if let some result ← idbgServer siteId json then
logInfoAt ref m!"idbg: {result}"
Core.logSnapshotTask {
stx? := some ref
task := (← BaseIO.asTask (act ()))
cancelTk? := cancelTk
}
-- Generate runtime code: `idbgClientLoop siteId imports apply >>= fun _ => body`
let siteLit := Syntax.mkStrLit siteId
let applyClosure ← withLocalDecl `f .default abstractedType fun fVar => do
let appBody := mkAppN fVar localFVars
Meta.mkLambdaFVars #[fVar] appBody
let closureStx ← Term.exprToSyntax applyClosure
-- Include the current module so the client can access same-file declarations.
-- The `.olean` should exist since the program was compiled from it.
let imports := (← getEnv).header.imports.push { module := (← getEnv).mainModule }
let importExprs ← imports.mapM fun imp => do
let nameExpr := toExpr imp.module
let importAllExpr := toExpr imp.importAll
let isExportedExpr := toExpr imp.isExported
let isMetaExpr := toExpr imp.isMeta
return mkAppN (.const ``Import.mk []) #[nameExpr, importAllExpr, isExportedExpr, isMetaExpr]
let importsExpr := mkApp2 (.const ``List.toArray [.zero])
(.const ``Import [])
(importExprs.toList.foldr (fun e acc => mkApp3 (.const ``List.cons [.zero]) (.const ``Import []) e acc)
(mkApp (.const ``List.nil [.zero]) (.const ``Import [])))
let importsStx ← Term.exprToSyntax importsExpr
Term.elabTerm (← `(
Lean.Idbg.idbgClientLoop $siteLit $importsStx $closureStx >>= fun _ => $body
)) expectedType?
@[builtin_term_elab Lean.Parser.Term.idbg]
def elabIdbgTerm : TermElab := fun stx expectedType? => do
let `(Lean.Parser.Term.idbg| idbg $e; $body) := stx | throwUnsupportedSyntax
elabIdbgCore (e := e) (body := body) (ref := stx) expectedType?
@[builtin_doElem_elab Lean.Parser.Term.doIdbg]
def elabDoIdbg : DoElab := fun stx dec => do
let `(Lean.Parser.Term.doIdbg| idbg $e) := stx | throwUnsupportedSyntax
let mγ ← mkMonadicType (← read).doBlockResultType
doElabToSyntax "idbg body" dec.continueWithUnit fun body => do
elabIdbgCore (e := e) (body := body) (ref := stx) mγ
end Lean.Elab.Do

View File

@@ -207,7 +207,8 @@ private def elabCtors (indFVars : Array Expr) (params : Array Expr) (r : ElabHea
let indFamily ← isInductiveFamily params.size indFVar
r.view.ctors.toList.mapM fun ctorView =>
withoutExporting (when := isPrivateName ctorView.declName) do
Term.withAutoBoundImplicit <| Term.elabBinders ctorView.binders.getArgs fun ctorParams =>
let (binders, paramInfoOverrides) ← elabParamInfoUpdates params ctorView.binders.getArgs (fun _ => pure true)
Term.withAutoBoundImplicit <| Term.elabBinders binders fun ctorParams =>
withRef ctorView.ref do
let elabCtorType : TermElabM Expr := do
match ctorView.type? with
@@ -263,6 +264,7 @@ private def elabCtors (indFVars : Array Expr) (params : Array Expr) (r : ElabHea
let type ← mkForallFVars (extraCtorParams ++ ctorParams) type
let type ← reorderCtorArgs type
let type ← mkForallFVars params type
let type := type.updateForallBinderInfos (params |>.map (fun param => paramInfoOverrides[param]?.map Prod.snd) |>.toList)
trace[Elab.inductive] "{ctorView.declName} : {type}"
return { name := ctorView.declName, type }
where

View File

@@ -95,7 +95,7 @@ private partial def withAuxLocalDecls {α} (views : Array LetRecDeclView) (k : A
private def elabLetRecDeclValues (view : LetRecView) : TermElabM (Array Expr) :=
view.decls.mapM fun view => do
forallBoundedTelescope view.type view.binderIds.size fun xs type => do
forallBoundedTelescope view.type view.binderIds.size (cleanupAnnotations := true) fun xs type => do
-- Add new info nodes for new fvars. The server will detect all fvars of a binder by the binder's source location.
for h : i in *...view.binderIds.size do
addLocalVarInfo view.binderIds[i] xs[i]!

View File

@@ -965,30 +965,18 @@ structure LetRecClosure where
private def mkLetRecClosureFor (toLift : LetRecToLift) (freeVars : Array FVarId) : TermElabM LetRecClosure := do
let lctx := toLift.lctx
withLCtx lctx toLift.localInstances do
lambdaTelescope toLift.val fun xs val => do
/-
Recall that `toLift.type` and `toLift.value` may have different binder annotations.
See issue #1377 for an example.
-/
let userNameAndBinderInfos ← forallBoundedTelescope toLift.type xs.size fun xs _ =>
xs.mapM fun x => do
let localDecl ← x.fvarId!.getDecl
return (localDecl.userName, localDecl.binderInfo)
/- Auxiliary map for preserving binder user-facing names and `BinderInfo` for types. -/
let mut userNameBinderInfoMap : FVarIdMap (Name × BinderInfo) := {}
for x in xs, (userName, bi) in userNameAndBinderInfos do
userNameBinderInfoMap := userNameBinderInfoMap.insert x.fvarId! (userName, bi)
let type ← instantiateForall toLift.type xs
/-
Recall that `toLift.type` and `toLift.value` may have different binder annotations.
See issue #1377 for an example.
-/
let lambdaArity := toLift.val.getNumHeadLambdas
forallBoundedTelescope toLift.type lambdaArity fun xs type => do
let val := toLift.val.beta xs
let lctx ← getLCtx
let s ← mkClosureFor freeVars <| xs.map fun x => lctx.get! x.fvarId!
/- Apply original type binder info and user-facing names to local declarations. -/
let typeLocalDecls := s.localDecls.map fun localDecl =>
if let some (userName, bi) := userNameBinderInfoMap.get? localDecl.fvarId then
localDecl.setBinderInfo bi |>.setUserName userName
else
localDecl
let type := Closure.mkForall typeLocalDecls <| Closure.mkForall s.newLetDecls type
let val := Closure.mkLambda s.localDecls <| Closure.mkLambda s.newLetDecls val
let cleanLocalDecls := s.localDecls.map fun decl => decl.setType <| decl.type.cleanupAnnotations
let type := Closure.mkForall s.localDecls <| Closure.mkForall s.newLetDecls type
let val := Closure.mkLambda cleanLocalDecls <| Closure.mkLambda s.newLetDecls val
let c := mkAppN (Lean.mkConst toLift.declName) s.exprArgs
toLift.mvarId.assign c
return {

View File

@@ -421,6 +421,47 @@ private def instantiateMVarsAtInductive (indType : InductiveType) : TermElabM In
let ctors ← indType.ctors.mapM fun ctor => return { ctor with type := (← instantiateMVars ctor.type) }
return { indType with type, ctors }
open Parser.Term in
private def typelessBinder? : Syntax → Option (Array Ident × BinderInfo)
| `(bracketedBinderF|($ids:ident*)) => some (ids, .default)
| `(bracketedBinderF|{$ids:ident*}) => some (ids, .implicit)
| `(bracketedBinderF|⦃$ids:ident*⦄) => some (ids, .strictImplicit)
| `(bracketedBinderF|[$id:ident]) => some (#[id], .instImplicit)
| _ => none
/--
Takes a binder list and interprets the prefix to see if any could be construed to be binder info updates.
Returns the binder list without these updates along with the new binder infos for these parameters.
- `params` are the parameters appearing in the header
- `binders` is the binder list to process
- `maybeParam` should return true for every local that could be a parameter
(for example, in structures we check that the ids don't refer to previously defined fields)
-/
def elabParamInfoUpdates
[Monad m] [MonadError m] [MonadLCtx m] [MonadLiftT TermElabM m]
(params : Array Expr) (binders : Array Syntax)
(maybeParam : FVarId → m Bool) :
m (Array Syntax × ExprMap (Syntax × BinderInfo)) := do
let mut overrides : ExprMap (Syntax × BinderInfo) := {}
for i in *...binders.size do
match typelessBinder? binders[i]! with
| none => return (binders.extract i, overrides)
| some (ids, bi) =>
let lctx ← getLCtx
let decls := ids.filterMap fun id => lctx.findFromUserName? id.getId
let decls ← decls.filterM fun decl => maybeParam decl.fvarId
if decls.size != ids.size then
-- Then either these are new variables or the binder isn't only for parameters
return (binders.extract i, overrides)
for decl in decls, id in ids do
Term.addTermInfo' id decl.toExpr
unless params.contains decl.toExpr do
throwErrorAt id m!"Only parameters appearing in the declaration header may have their binder kinds overridden"
++ .hint' "If this is not intended to be an override, use a binder with a type: for example, `(x : _)`"
overrides := overrides.insert decl.toExpr (id, bi)
return (#[], overrides)
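To make the mechanism concrete, here is a sketch (hypothetical declaration names) of the kind of binder-kind override `elabParamInfoUpdates` is meant to interpret: a typeless binder in a constructor's binder list that re-declares a header parameter rather than introducing a new variable.

```lean
-- Sketch: `{α}` in `mk`'s binder list matches the header parameter `α`,
-- so it is interpreted as an override making `α` implicit for `Pair.mk`,
-- not as a new variable. A typed binder such as `(α : _)` would instead
-- introduce a fresh variable.
structure Pair (α : Type) where
  mk {α} :: (fst : α) (snd : α)
```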
section IndexPromotion
/-!
## Index-to-parameter promotion
@@ -1134,8 +1175,8 @@ private def withUsed {α} (elabs : Array InductiveElabStep2) (vars : Array Expr)
private def updateParams (vars : Array Expr) (indTypes : List InductiveType) : TermElabM (List InductiveType) :=
indTypes.mapM fun indType => do
let type ← mkForallFVars vars indType.type
let ctors ← indType.ctors.mapM fun ctor => do
let ctorType ← withExplicitToImplicit vars (mkForallFVars vars ctor.type)
let ctors ← withExplicitToImplicit vars <| indType.ctors.mapM fun ctor => do
let ctorType ← mkForallFVars vars ctor.type
return { ctor with type := ctorType }
return { indType with type, ctors }

View File

@@ -336,7 +336,7 @@ def addPreDefinitions (docCtx : LocalContext × LocalInstances) (preDefs : Array
(structuralRecursion docCtx preDefs termMeasures?s)
(wfRecursion docCtx preDefs termMeasures?s))
(fun msg =>
let preDefMsgs := preDefs.toList.map (MessageData.ofExpr $ mkConst ·.declName)
let preDefMsgs := preDefs.toList.map (MessageData.ofConstName <| ·.declName)
m!"fail to show termination for{indentD (MessageData.joinSep preDefMsgs Format.line)}\nwith errors\n{msg}")
catch ex =>
logException ex

View File

@@ -225,8 +225,7 @@ def getQuotKind (stx : Syntax) : TermElabM SyntaxNodeKind := do
| .str kind "quot" => addNamedQuotInfo stx kind
| ``dynamicQuot =>
let id := stx[1]
-- local parser use, so skip meta check
match (← elabParserName id (checkMeta := false)) with
match (← withoutExporting <| elabParserName id) with
| .parser n _ => return n
| .category c => return c
| .alias _ => return (← Parser.getSyntaxKindOfParserAlias? id.getId.eraseMacroScopes).get!

View File

@@ -416,55 +416,59 @@ def structureSyntaxToView (modifiers : Modifiers) (stx : Syntax) : TermElabM Str
let modifiers := if isClass then modifiers.addFirstAttr { name := `class } else modifiers
let declId := stx[1]
let ⟨name, declName, levelNames, docString?⟩ ← Term.expandDeclId (← getCurrNamespace) (← Term.getLevelNames) declId modifiers
if modifiers.isMeta then
modifyEnv (markMeta · declName)
addDeclarationRangesForBuiltin declName modifiers.stx stx
let (binders, type?) := expandOptDeclSig stx[2]
let exts := stx[3]
let type? ←
-- Compatibility mode for `structure S extends P : Type` syntax
if type?.isNone && !exts.isNone && !exts[0][2].isNone then
logWarningAt exts[0][2][0] <| "\
The syntax is now `structure S : Type extends P` rather than `structure S extends P : Type`"
++ .note "The purpose of this change is to accommodate `structure S extends toP : P` syntax for naming parent projections."
pure (some exts[0][2][0][1])
else
if !exts.isNone && !exts[0][2].isNone then
logErrorAt exts[0][2][0] <| "\
Unexpected additional resulting type. \
The syntax is now `structure S : Type extends P` rather than `structure S extends P : Type`."
++ .note "The purpose of this change is to accommodate `structure S extends toP : P` syntax for naming parent projections."
pure type?
let parents ← expandParents exts
let derivingClasses ← getOptDerivingClasses stx[5]
let fields ← expandFields stx modifiers declName
-- Private fields imply a private constructor (in the module system only, for back-compat)
let ctor ← expandCtor
(forcePrivate := (← getEnv).header.isModule && fields.any (·.modifiers.isPrivate))
stx modifiers declName
fields.forM fun field => do
if field.declName == ctor.declName then
throwErrorAt field.ref "Invalid field name `{field.name}`: This is the name of the structure constructor"
addDeclarationRangesFromSyntax field.declName field.ref
return {
ref := stx
declId
modifiers
isClass
shortDeclName := name
declName
levelNames
binders
type?
allowIndices := false
allowSortPolymorphism := false
ctors := #[ctor]
parents
fields
computedFields := #[]
derivingClasses
docString?
}
-- In the case of mutual inductives, this is the earliest point where we can establish the
-- correct scope for each individual inductive declaration (used e.g. to infer ctor visibility
-- below), so let's do that now.
withExporting (isExporting := !isPrivateName declName) do
if modifiers.isMeta then
modifyEnv (markMeta · declName)
addDeclarationRangesForBuiltin declName modifiers.stx stx
let (binders, type?) := expandOptDeclSig stx[2]
let exts := stx[3]
let type? ←
-- Compatibility mode for `structure S extends P : Type` syntax
if type?.isNone && !exts.isNone && !exts[0][2].isNone then
logWarningAt exts[0][2][0] <| "\
The syntax is now `structure S : Type extends P` rather than `structure S extends P : Type`"
++ .note "The purpose of this change is to accommodate `structure S extends toP : P` syntax for naming parent projections."
pure (some exts[0][2][0][1])
else
if !exts.isNone && !exts[0][2].isNone then
logErrorAt exts[0][2][0] <| "\
Unexpected additional resulting type. \
The syntax is now `structure S : Type extends P` rather than `structure S extends P : Type`."
++ .note "The purpose of this change is to accommodate `structure S extends toP : P` syntax for naming parent projections."
pure type?
let parents ← expandParents exts
let derivingClasses ← getOptDerivingClasses stx[5]
let fields ← expandFields stx modifiers declName
-- Private fields imply a private constructor (in the module system only, for back-compat)
let ctor ← expandCtor
(forcePrivate := (← getEnv).header.isModule && fields.any (·.modifiers.isPrivate))
stx modifiers declName
fields.forM fun field => do
if field.declName == ctor.declName then
throwErrorAt field.ref "Invalid field name `{field.name}`: This is the name of the structure constructor"
addDeclarationRangesFromSyntax field.declName field.ref
return {
ref := stx
declId
modifiers
isClass
shortDeclName := name
declName
levelNames
binders
type?
allowIndices := false
allowSortPolymorphism := false
ctors := #[ctor]
parents
fields
computedFields := #[]
derivingClasses
docString?
}
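The syntax migration handled by the compatibility branch above can be sketched as follows (illustrative declarations):

```lean
structure P where
  x : Nat

-- Old (now warned): `structure S extends P : Type`
-- New: the resulting type precedes `extends`, freeing room for a named
-- parent projection `toP`.
structure S : Type extends toP : P where
  y : Nat
```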
/-!
@@ -958,45 +962,17 @@ private def solveParentMVars (e : Expr) : StructElabM Expr := do
discard <| MVarId.checkedAssign mvar parentInfo.fvar
return e
open Parser.Term in
private def typelessBinder? : Syntax → Option ((Array Ident) × BinderInfo)
| `(bracketedBinderF|($ids:ident*)) => some (ids, .default)
| `(bracketedBinderF|{$ids:ident*}) => some (ids, .implicit)
| `(bracketedBinderF|⦃$ids:ident*⦄) => some (ids, .strictImplicit)
| `(bracketedBinderF|[$id:ident]) => some (#[id], .instImplicit)
| _ => none
/--
Takes a binder list and interprets the prefix to see if any could be construed to be binder info updates.
Returns the binder list without these updates along with the new binder infos for these parameters.
-/
private def elabParamInfoUpdates (structParams : Array Expr) (binders : Array Syntax) : StructElabM (Array Syntax × ExprMap (Syntax × BinderInfo)) := do
let mut overrides : ExprMap (Syntax × BinderInfo) := {}
for i in *...binders.size do
match typelessBinder? binders[i]! with
| none => return (binders.extract i, overrides)
| some (ids, bi) =>
let lctx ← getLCtx
let decls := ids.filterMap fun id => lctx.findFromUserName? id.getId
-- Filter out all fields. We assume the remaining fvars are the possible parameters.
let decls ← decls.filterM fun decl => return (← findFieldInfoByFVarId? decl.fvarId).isNone
if decls.size != ids.size then
-- Then either these are new variables or the binder isn't only for parameters
return (binders.extract i, overrides)
for decl in decls, id in ids do
Term.addTermInfo' id decl.toExpr
unless structParams.contains decl.toExpr do
throwErrorAt id m!"Only parameters appearing in the declaration header may have their binder kinds overridden"
++ .hint' "If this is not intended to be an override, use a binder with a type: for example, `(x : _)`"
overrides := overrides.insert decl.toExpr (id, bi)
return (#[], overrides)
private def elabParamInfoUpdatesForField (structParams : Array Expr) (binders : Array Syntax) : StructElabM (Array Syntax × ExprMap (Syntax × BinderInfo)) := do
elabParamInfoUpdates structParams binders
-- Filter out all fields. We assume the remaining fvars are the possible parameters.
(fun fvarId => return (← findFieldInfoByFVarId? fvarId).isNone)
private def elabFieldTypeValue (structParams : Array Expr) (view : StructFieldView) :
StructElabM (Option Expr × ExprMap (Syntax × BinderInfo) × Option StructFieldDefault) := do
withoutExporting (when := view.modifiers.isPrivate) do
let state ← get
let binders := view.binders.getArgs
let (binders, paramInfoOverrides) ← elabParamInfoUpdates structParams binders
let (binders, paramInfoOverrides) ← elabParamInfoUpdatesForField structParams binders
Term.withAutoBoundImplicit <| Term.withAutoBoundImplicitForbiddenPred (fun n => view.name == n) <| Term.elabBinders binders fun params => do
match view.type? with
| none =>
@@ -1085,7 +1061,7 @@ where
if info.default?.isSome then
throwError "A new default value for field `{view.name}` has already been set in this structure"
let mut valStx := valStx
let (binders, paramInfoOverrides) ← elabParamInfoUpdates structParams view.binders.getArgs
let (binders, paramInfoOverrides) ← elabParamInfoUpdatesForField structParams view.binders.getArgs
unless paramInfoOverrides.isEmpty do
let params := MessageData.joinSep (paramInfoOverrides.toList.map (m!"{·.1}")) ", "
throwError "Cannot override structure parameter binder kinds when overriding the default value: {params}"
@@ -1182,7 +1158,7 @@ Builds a constructor for the type, for adding the inductive type to the environm
private def mkCtor (view : StructView) (r : ElabHeaderResult) (params : Array Expr) : StructElabM Constructor :=
withoutExporting (when := isPrivateName view.ctor.declName) do
withRef view.ref do
let (binders, paramInfoOverrides) ← elabParamInfoUpdates params view.ctor.binders.getArgs
let (binders, paramInfoOverrides) ← elabParamInfoUpdates params view.ctor.binders.getArgs (fun _ => pure true)
unless binders.isEmpty do
throwErrorAt (mkNullNode binders) "Expecting binders that update binder kinds of type parameters."
trace[Elab.structure] "constructor param overrides {view.ctor.binders}"
@@ -1555,9 +1531,10 @@ def elabStructureCommand : InductiveElabDescr where
addProjections params r fieldInfos
registerStructure view.declName fieldInfos
runStructElabM (init := state) do
withOptions (warn.sorry.set · false) do
mkFlatCtor levelParams params view.declName replaceIndFVars
addDefaults levelParams params replaceIndFVars
withExporting (isExporting := !isPrivateName view.ctor.declName) do
withOptions (warn.sorry.set · false) do
mkFlatCtor levelParams params view.declName replaceIndFVars
addDefaults levelParams params replaceIndFVars
let parentInfos ← withLCtx lctx localInsts <| runStructElabM (init := state) do
withOptions (warn.sorry.set · false) do
mkRemainingProjections levelParams params view

View File

@@ -86,7 +86,7 @@ def checkLeftRec (stx : Syntax) : ToParserDescrM Bool := do
markAsTrailingParser (prec?.getD 0)
return true
def elabParserName? (stx : Syntax.Ident) (checkMeta := true) : TermElabM (Option Parser.ParserResolution) := do
def elabParserName? (stx : Syntax.Ident) : TermElabM (Option Parser.ParserResolution) := do
match Parser.resolveParserName stx with
| [n@(.category catId)] =>
if let some cat := Parser.getParserCategory? ( getEnv) catId then
@@ -96,7 +96,7 @@ def elabParserName? (stx : Syntax.Ident) (checkMeta := true) : TermElabM (Option
addCategoryInfo stx catId
return n
| [n@(.parser parser _)] =>
if checkMeta && getIRPhases (← getEnv) parser == .runtime then
if getIRPhases (← getEnv) parser == .runtime then
throwError m!"Declaration `{.ofConstName parser}` must be marked or imported as `meta` to be used as parser"
recordExtraModUseFromDecl (isMeta := true) parser
addTermInfo' stx (Lean.mkConst parser)
@@ -106,8 +106,8 @@ def elabParserName? (stx : Syntax.Ident) (checkMeta := true) : TermElabM (Option
| _::_::_ => throwErrorAt stx "ambiguous parser {stx}"
| [] => return none
def elabParserName (stx : Syntax.Ident) (checkMeta := true) : TermElabM Parser.ParserResolution := do
match ← elabParserName? stx checkMeta with
def elabParserName (stx : Syntax.Ident) : TermElabM Parser.ParserResolution := do
match ← elabParserName? stx with
| some n => return n
| none => throwErrorAt stx "unknown parser {stx}"

View File

@@ -64,7 +64,7 @@ def withLocation (loc : Location) (atLocal : FVarId → TacticM Unit) (atTarget
if type then
withMainContext atTarget
| Location.wildcard =>
let worked ← tryTactic <| withMainContext <| atTarget
let worked ← tryTactic <| withSaveInfoContext <| withMainContext <| atTarget
let g ← try getMainGoal catch _ => return () -- atTarget closed the goal
g.withContext do
let mut worked := worked
@@ -72,7 +72,7 @@ def withLocation (loc : Location) (atLocal : FVarId → TacticM Unit) (atTarget
for fvarId in (← getLCtx).getFVarIds.reverse do
if (← fvarId.getDecl).isImplementationDetail then
continue
worked := worked || (← tryTactic <| withMainContext <| atLocal fvarId)
worked := worked || (← tryTactic <| withSaveInfoContext <| withMainContext <| atLocal fvarId)
unless worked do
failed (← getMainGoal)
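A tactic invocation that exercises the `Location.wildcard` branch above might look like this (a sketch; `simp at *` is one caller that goes through `withLocation`):

```lean
-- `at *` first tries the target, then each non-implementation-detail
-- hypothesis in reverse order; it fails only if no location made progress.
example (n m : Nat) (h : 0 + n = m) : n + 0 = m := by
  simp at *
  exact h
```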

View File

@@ -426,9 +426,7 @@ def elabSetLibrarySuggestions : CommandElab
-- Generate a fresh name for the selector definition
let name ← liftMacroM <| Macro.addMacroScope `_librarySuggestions
-- Elaborate the definition with the library_suggestions attribute
-- Note: @[expose] public, to ensure visibility across module boundaries
-- Use fully qualified `Lean.LibrarySuggestions.Selector` for module compatibility
elabCommand (← `(@[expose, library_suggestions] public def $(mkIdent name) : Lean.LibrarySuggestions.Selector := $selector))
elabCommand (← `(@[library_suggestions] public meta def $(mkIdent name) : Selector := $selector))
| _ => throwUnsupportedSyntax
open Lean.Elab.Tactic in

View File

@@ -6,8 +6,7 @@ Authors: Kim Morrison
module
prelude
public import Lean.LibrarySuggestions.SineQuaNon
import all Lean.LibrarySuggestions.SineQuaNon
public meta import Lean.LibrarySuggestions.SineQuaNon
/-!
# Default library suggestions engine

View File

@@ -38,7 +38,7 @@ def coercionsBannedInCore : Array Name := #[``optionCoe, ``instCoeSubarrayArray]
def coeLinter : Linter where
run := fun _ => do
let mainModule ← getMainModule
let isCoreModule := mainModule = `lean.run.linterCoe ∨ (mainModule.getRoot ∈ [`Init, `Std])
let isCoreModule := mainModule = `elab.linterCoe ∨ (mainModule.getRoot ∈ [`Init, `Std])
let shouldWarnOnDeprecated := getLinterValue linter.deprecatedCoercions (← getLinterOptions)
let trees ← Elab.getInfoTrees
for tree in trees do

View File

@@ -34,6 +34,8 @@ matches a particular constructor. Use `linter.constructorNameAsVariable` to disa
-/
def constructorNameAsVariable : Linter where
run cmdStx := do
unless getLinterValue linter.constructorNameAsVariable (← getLinterOptions) do
return
let some cmdStxRange := cmdStx.getRange?
| return
@@ -55,9 +57,9 @@ def constructorNameAsVariable : Linter where
-- Skip declarations which are outside the command syntax range, like `variable`s
-- (it would be confusing to lint these), or those which are macro-generated
if !cmdStxRange.contains range.start || ldecl.userName.hasMacroScopes then return
let opts := ci.options
let opts ← ci.options.toLinterOptions
-- we have to check for the option again here because it can be set locally
if !linter.constructorNameAsVariable.get opts then return
if !getLinterValue linter.constructorNameAsVariable opts then return
if let n@(.str .anonymous s) := info.stx.getId then
-- Check whether the type is an inductive type, and get its constructors
let ty
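The situation this linter flags can be sketched as follows: an unqualified identifier in a pattern that does not resolve to anything becomes a fresh variable, silently shadowing a constructor of the matched type.

```lean
set_option linter.constructorNameAsVariable true in
def isZero (n : Nat) : Bool :=
  match n with
  -- Without `open Nat`, `zero` here is a fresh variable matching every `Nat`,
  -- not the constructor `Nat.zero`; the linter warns about this.
  | zero => true
```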

View File

@@ -965,20 +965,6 @@ Note that the error name is not relativized to the current namespace.
@[builtin_command_parser] def registerErrorExplanationStx := leading_parser
optional docComment >> "register_error_explanation " >> ident >> termParser
/--
Returns syntax for `private` or `public` visibility depending on `isPublic`. This function should be
used to generate visibility syntax for declarations that is independent of the presence of
`public section`s.
-/
def visibility.ofBool (isPublic : Bool) : TSyntax ``visibility :=
Unhygienic.run <| if isPublic then `(visibility| public) else `(visibility| private)
/--
Returns syntax for `private` if `attrKind` is `local` and `public` otherwise.
-/
def visibility.ofAttrKind (attrKind : TSyntax ``Term.attrKind) : TSyntax ``visibility :=
visibility.ofBool <| !attrKind matches `(attrKind| local)
end Command
namespace Term

View File

@@ -63,7 +63,7 @@ def notFollowedByRedefinedTermToken :=
-- an "open" command follows the `do`-block.
-- If we don't add `do`, then users would have to indent `do` blocks or use `{ ... }`.
notFollowedBy ("set_option" <|> "open" <|> "if" <|> "match" <|> "match_expr" <|> "let" <|> "let_expr" <|> "have" <|>
"do" <|> "dbg_trace" <|> "assert!" <|> "debug_assert!" <|> "for" <|> "unless" <|> "return" <|> symbol "try")
"do" <|> "dbg_trace" <|> "idbg" <|> "assert!" <|> "debug_assert!" <|> "for" <|> "unless" <|> "return" <|> symbol "try")
"token at 'do' element"
@[builtin_doElem_parser] def doLet := leading_parser
@@ -224,6 +224,36 @@ It should only be used for debugging.
@[builtin_doElem_parser] def doDbgTrace := leading_parser:leadPrec
"dbg_trace " >> ((interpolatedStr termParser) <|> termParser)
/--
*experimental*
`idbg e` enables live inspection of program state from the editor. When placed in a `do` block,
it captures all local variables in scope and the expression `e`, then:
- **In the language server**: starts a TCP server on localhost waiting for the running program to
connect; the editor will mark this part of the program as "in progress" during this wait, but
this does not block a `lake build` of the project.
- **In the compiled program**: on first execution of the `idbg` call site, connects to the server,
receives the expression, compiles and evaluates it using the program's actual runtime values, and
sends the `repr` result back.
The result is displayed as an info diagnostic on the `idbg` keyword. The expression `e` can be
edited while the program is running; each edit triggers re-elaboration of `e`, a new TCP exchange,
and an updated result. This makes `idbg` a live REPL for inspecting and experimenting with
program state at a specific point in execution. Only when `idbg` is inserted, moved, or removed does
the program need to be recompiled and restarted.
# Known Limitations
* The program polls for the server for up to 10 minutes and must be killed manually
otherwise.
* Using multiple `idbg` sites at once is untested; overlapping imports likely add too much
overhead without further changes.
* `LEAN_PATH` must be set up properly so the compiled program can import its origin module.
* Untested on Windows and macOS.
-/
@[builtin_doElem_parser] def doIdbg := leading_parser:leadPrec
withPosition ("idbg " >> termParser)
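A minimal sketch of an `idbg` call site as described above (hypothetical program; assumes the editor/language-server setup and the `LEAN_PATH` requirement listed in the limitations):

```lean
def main : IO Unit := do
  let xs := (List.range 100).filter (· % 7 == 0)
  -- While the program runs, the expression after `idbg` can be edited in the
  -- editor; its `repr`, computed with the live runtime values, is shown inline.
  idbg xs.length
  IO.println "done"
```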
/--
`assert! cond` panics if `cond` evaluates to `false`.
-/
@[builtin_doElem_parser] def doAssert := leading_parser:leadPrec

View File

@@ -367,16 +367,20 @@ def leadingIdentBehavior (env : Environment) (catName : Name) : LeadingIdentBeha
| none => LeadingIdentBehavior.default
| some cat => cat.behavior
unsafe def evalParserConstUnsafe (declName : Name) : ParserFn := fun ctx s => unsafeBaseIO do
unsafe def evalParserConstUnsafe (declName : Name) (evalFallback? : Option ParserFn := none) : ParserFn := fun ctx s => unsafeBaseIO do
let categories := (parserExtension.getState ctx.env).categories
match (← (mkParserOfConstant categories declName { env := ctx.env, opts := ctx.options }).toBaseIO) with
| .ok (_, p) =>
-- We should manually register `p`'s tokens before invoking it as it might not be part of any syntax category (yet)
return adaptUncacheableContextFn (fun ctx => { ctx with tokens := p.info.collectTokens [] |>.foldl (fun tks tk => tks.insert tk tk) ctx.tokens }) p.fn ctx s
| .error e => return s.mkUnexpectedError e.toString
| .error e =>
if let some evalFallback := evalFallback? then
return evalFallback ctx s
else
return s.mkUnexpectedError e.toString
@[implemented_by evalParserConstUnsafe]
opaque evalParserConst (declName : Name) : ParserFn
opaque evalParserConst (declName : Name) (evalFallback? : Option ParserFn := none) : ParserFn
register_builtin_option internal.parseQuotWithCurrentStage : Bool := {
defValue := false
@@ -388,7 +392,11 @@ def evalInsideQuot (declName : Name) : Parser → Parser := withFn fun f c s =>
if c.quotDepth > 0 && !c.suppressInsideQuot && internal.parseQuotWithCurrentStage.get c.options && c.env.contains declName then
adaptUncacheableContextFn (fun ctx =>
{ ctx with options := ctx.options.set `interpreter.prefer_native false })
(evalParserConst declName) c s
-- HACK: silently fall back to running compiled `f` on eval error, otherwise parser imported
-- but not meta imported can lead to silent backtracking and confusing errors such as
-- "unexpected token `by`". Note that the above `contains` already sets a silent fallback for
-- "not imported at all".
(evalParserConst (evalFallback? := some f) declName) c s
else
f c s
@@ -696,8 +704,9 @@ private def resolveParserNameCore (env : Environment) (opts : Options) (currName
return []
/-- Resolve the given parser name and return a list of candidates. -/
def ParserContext.resolveParserName (ctx : ParserContext) (id : Ident) : List ParserResolution :=
Parser.resolveParserNameCore ctx.env ctx.options ctx.currNamespace ctx.openDecls id
def ParserContext.resolveParserName (ctx : ParserContext) (id : Ident) (unsetExporting := false) : List ParserResolution :=
let env := if unsetExporting then ctx.env.setExporting false else ctx.env
Parser.resolveParserNameCore env ctx.options ctx.currNamespace ctx.openDecls id
/-- Resolve the given parser name and return a list of candidates. -/
def resolveParserName (id : Ident) : CoreM (List ParserResolution) :=
@@ -710,7 +719,7 @@ def parserOfStackFn (offset : Nat) : ParserFn := fun ctx s => Id.run do
let parserName@(.ident ..) := stack.get! (stack.size - offset - 1)
| s.mkUnexpectedError ("failed to determine parser using syntax stack, the specified element on the stack is not an identifier")
let iniSz := s.stackSize
let s ← match ctx.resolveParserName ⟨parserName⟩ with
let s ← match ctx.resolveParserName ⟨parserName⟩ (unsetExporting := true) with
| [.category cat] =>
categoryParserFn cat ctx s
| [.parser parserName _] =>

View File

@@ -6,7 +6,8 @@ Authors: Leonardo de Moura, Sebastian Ullrich
module
prelude
public import Lean.Parser.Command
public import Lean.Parser.Module.Syntax
meta import Lean.Parser.Module.Syntax
import Init.While
public section
@@ -15,26 +16,6 @@ namespace Lean
namespace Parser
namespace Module
def moduleTk := leading_parser "module"
def «prelude» := leading_parser "prelude"
def «public» := leading_parser (withAnonymousAntiquot := false) "public"
def «meta» := leading_parser (withAnonymousAntiquot := false) "meta"
def «all» := leading_parser (withAnonymousAntiquot := false) "all"
def «import» := leading_parser
atomic (optional «public» >> optional «meta» >> "import ") >>
optional all >>
identWithPartialTrailingDot
def header := leading_parser optional (moduleTk >> ppLine >> ppLine) >>
optional («prelude» >> ppLine) >>
many («import» >> ppLine) >>
ppLine
/--
Parser for a Lean module. We never actually run this parser but instead use the imperative definitions below that
return the same syntax tree structure, but add error recovery. Still, it is helpful to have a `Parser` definition
for it in order to auto-generate helpers such as the pretty printer. -/
@[run_builtin_parser_attribute_hooks]
def module := leading_parser header >> many (commandParser >> ppLine >> ppLine)
def updateTokens (tokens : TokenTable) : TokenTable :=
match addParserTokens tokens header.info with

View File

@@ -0,0 +1,36 @@
/-
Copyright (c) 2019 Microsoft Corporation. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Leonardo de Moura, Sebastian Ullrich
-/
module
prelude
public import Lean.Parser.Command
public section
namespace Lean
namespace Parser
namespace Module
def moduleTk := leading_parser "module"
def «prelude» := leading_parser "prelude"
def «public» := leading_parser (withAnonymousAntiquot := false) "public"
def «meta» := leading_parser (withAnonymousAntiquot := false) "meta"
def «all» := leading_parser (withAnonymousAntiquot := false) "all"
def «import» := leading_parser
atomic (optional «public» >> optional «meta» >> "import ") >>
optional all >>
identWithPartialTrailingDot
def header := leading_parser optional (moduleTk >> ppLine >> ppLine) >>
optional («prelude» >> ppLine) >>
many («import» >> ppLine) >>
ppLine
/--
Parser for a Lean module. We never actually run this parser but instead use the imperative definitions in the parent module that
return the same syntax tree structure, but add error recovery. Still, it is helpful to have a `Parser` definition
for it in order to auto-generate helpers such as the pretty printer. -/
@[run_builtin_parser_attribute_hooks]
def module := leading_parser header >> many (commandParser >> ppLine >> ppLine)
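Per the combinators above, a header accepted by this parser looks like the following (module names illustrative):

```lean
module

prelude
public import Lean.Parser.Command
public meta import all Init.While
```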

View File

@@ -912,6 +912,10 @@ interpolated string literal) to stderr. It should only be used for debugging.
@[builtin_term_parser] def dbgTrace := leading_parser:leadPrec
withPosition ("dbg_trace" >> (interpolatedStr termParser <|> termParser)) >>
optSemicolon termParser
/-- Term-level form of the interactive debugger. See the `doIdbg` do element for full documentation. -/
@[builtin_term_parser] def «idbg» := leading_parser:leadPrec
withPosition ("idbg " >> checkColGt >> termParser) >>
optSemicolon termParser
/-- `assert! cond` panics if `cond` evaluates to `false`. -/
@[builtin_term_parser] def assert := leading_parser:leadPrec
withPosition ("assert! " >> termParser) >> optSemicolon termParser
@@ -1024,6 +1028,14 @@ string or a `MessageData` term.
@[builtin_term_parser] def logNamedWarningAtMacro := leading_parser
"logNamedWarningAt " >> termParser maxPrec >> ppSpace >> identWithPartialTrailingDot >> ppSpace >> (interpolatedStr termParser <|> termParser maxPrec)
/--
Representation of an expression with metadata used during pretty printing for the `pp.mdata` option.
-/
@[run_builtin_parser_attribute_hooks]
def mdataDiagnostic := leading_parser
group ("[" >> "mdata" >> many (group <| ppSpace >> ident >> optional (":" >> termParser)) >> "]") >>
ppSpace >> termParser
end Term
@[builtin_term_parser default+1] def Tactic.quot : Parser := leading_parser

View File

@@ -184,11 +184,14 @@ def binderTactic := leading_parser
def binderDefault := leading_parser
" := " >> termParser
set_option compiler.relaxedMetaCheck true in
private meta def binderDefaultM := binderDefault
open Lean.PrettyPrinter Parenthesizer Syntax.MonadTraverser in
@[combinator_parenthesizer Lean.Parser.Term.binderDefault, expose] def binderDefault.parenthesizer : Parenthesizer := do
let prec := match (← getCur) with
-- must parenthesize to distinguish from `binderTactic`
| `(binderDefault| := by $_) => maxPrec
| `(binderDefaultM| := by $_) => maxPrec
| _ => 0
visitArgs do
term.parenthesizer prec
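The ambiguity this parenthesizer resolves can be sketched as:

```lean
-- `(n : Nat := by exact 1)`   parses as a `binderTactic`: the tactic is
--                             re-run at every call site omitting `n`.
-- `(n : Nat := (by exact 1))` parses as a `binderDefault` whose value is a
--                             `by` term, so the parentheses must be kept
--                             when pretty printing.
def f (n : Nat := (by exact 1)) : Nat := n
```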

View File

@@ -9,10 +9,9 @@ prelude
public import Lean.PrettyPrinter.Delaborator.Basic
public import Lean.Meta.CoeAttr
public import Lean.Meta.Structure
import Lean.Parser.Command
public import Lean.PrettyPrinter.Formatter
public import Lean.PrettyPrinter.Parenthesizer
meta import Lean.Parser.Do
meta import Lean.Parser.Command
public section
@@ -100,17 +99,31 @@ Rather, it is called through the `app` delaborator.
-/
def delabConst : Delab := do
let Expr.const c₀ ls ← getExpr | unreachable!
let unresolveName (n : Name) : DelabM Name := do
unresolveNameGlobalAvoidingLocals n (fullNames := ← getPPOption getPPFullNames)
let mut c := c₀
if isPrivateName c₀ then
unless (← getPPOption getPPPrivateNames) do
c ← unresolveName c
if let some n := privateToUserName? c then
-- The private name could not be made non-private, so make the result inaccessible
c ← withFreshMacroScope <| MonadQuotation.addMacroScope n
else
c ← unresolveName c
let env ← getEnv
let c ←
if ← pure (isPrivateName c₀) <&&> getPPOption getPPPrivateNames then
pure c₀
else if let some c ← unresolveNameGlobalAvoidingLocals? c₀ (fullNames := ← getPPOption getPPFullNames) then
pure c
else if !env.contains c then
-- The compiler pretty prints constants that are not defined in the environment.
-- Use such names as-is, without additional macro scopes.
-- We still remove the private prefix if the name would be accessible to the current module.
if isPrivateName c && mkPrivateName env (privateToUserName c₀.eraseMacroScopes) == c₀.eraseMacroScopes then
pure <| Name.modifyBase c₀ privateToUserName
else
pure c₀
else
-- The identifier cannot be referred to. To indicate this, we add a delaboration-specific macro scope.
-- If the name is private, we also erase the private prefix and add it as an additional macro scope, ensuring
-- no collisions between names with different private prefixes.
let c := if let some (.num priv 0) := privatePrefix? c₀.eraseMacroScopes then
-- The private prefix before `0` is of the form `_private.modulename`, which works as a macro scope context
Lean.addMacroScope priv (Name.modifyBase c₀ privateToUserName) reservedMacroScope
else
c₀
pure <| Lean.addMacroScope `_delabConst c reservedMacroScope
let stx ←
if !ls.isEmpty && (← getPPOption getPPUniverses) then
let mvars ← getPPOption getPPMVarsLevels
@@ -296,7 +309,10 @@ def delabAppExplicitCore (fieldNotation : Bool) (numArgs : Nat) (delabHead : (in
let insertExplicit := needsExplicit ((← getExpr).getBoundedAppFn numArgs) numArgs paramKinds
let fieldNotation ← pure (fieldNotation && !insertExplicit) <&&> getPPOption getPPFieldNotation
<&&> not <$> getPPOption getPPAnalysisNoDot
<&&> withBoundedAppFn numArgs do pure (← getExpr).consumeMData.isConst <&&> not <$> withMDatasOptions (getPPOption getPPAnalysisBlockImplicit <|> getPPOption getPPUniverses)
<&&> withBoundedAppFn numArgs do
pure (← getExpr).consumeMData.isConst
<&&> not <$> (pure (← getExpr).isMData <&&> getPPOption getPPMData)
<&&> not <$> withMDatasOptions (getPPOption getPPAnalysisBlockImplicit <|> getPPOption getPPUniverses)
let field? ← if fieldNotation then appFieldNotationCandidate? else pure none
let (fnStx, _, argStxs) ← withBoundedAppFnArgs numArgs
(do return (← delabHead insertExplicit, paramKinds.toList, Array.mkEmpty numArgs))
@@ -357,7 +373,10 @@ Assumes `numArgs ≤ paramKinds.size`.
-/
def delabAppImplicitCore (unexpand : Bool) (numArgs : Nat) (delabHead : Delab) (paramKinds : Array ParamKind) : Delab := do
let unexpand ← pure unexpand
<&&> withBoundedAppFn numArgs do pure (← getExpr).consumeMData.isConst <&&> not <$> withMDatasOptions (getPPOption getPPUniverses)
<&&> withBoundedAppFn numArgs do
pure (← getExpr).consumeMData.isConst
<&&> not <$> (pure (← getExpr).isMData <&&> getPPOption getPPMData)
<&&> not <$> withMDatasOptions (getPPOption getPPUniverses)
let field? ←
if ← pure unexpand <&&> getPPOption getPPFieldNotation <&&> not <$> getPPOption getPPAnalysisNoDot then
appFieldNotationCandidate?
@@ -522,7 +541,12 @@ Default delaborator for applications.
@[builtin_delab app]
def delabApp : Delab := do
let delabAppFn (insertExplicit : Bool) : Delab := do
let stx ← if (← getExpr).consumeMData.isConst then withMDatasOptions delabConst else delab
let e ← getExpr
let stx ←
if ← pure e.consumeMData.isConst <&&> not <$> (pure e.isMData <&&> getPPOption getPPMData) then
withMDatasOptions delabConst
else
delab
if insertExplicit && !stx.raw.isOfKind ``Lean.Parser.Term.explicit then `(@$stx) else pure stx
delabAppCore (← getExpr).getAppNumArgs delabAppFn (unexpand := true)
@@ -843,18 +867,37 @@ where
else
x
private def reflectDataValue (t : DataValue) : Term := Unhygienic.run do
match t with
| .ofBool b => return mkIdent (if b then `true else `false)
| .ofNat n => return quote n
| .ofInt n => if n ≥ 0 then return quote n.toNat else `(-$(quote n.natAbs))
| .ofString s => return quote s
| .ofName n => return mkIdent n
| .ofSyntax _ => `(_)
@[builtin_delab mdata]
def delabMData : Delab := do
if let some _ := inaccessible? (← getExpr) then
let s ← withMDataExpr delab
if (← read).inPattern then
`(.($s)) -- We only include the inaccessible annotation when we are delaborating patterns
else
return s
else if let some _ := isLHSGoal? (← getExpr) then
withMDataExpr <| withAppFn <| withAppArg <| delab
if ← getPPOption getPPMData then
let .mdata d _ ← getExpr | unreachable!
let (keys, vals?) ← d.entries.foldlM (init := (#[], #[]))
fun ((keys : Array Ident), (vals : Array (Option Term))) (k, v) => do
return (keys.push (mkIdent k), vals.push (some <| reflectDataValue v))
let e ← withMDataOptions delab
-- Annotate to prevent overwriting the terminfo for `e`, which is
-- already inserted at the current position.
annotateCurPos =<< `(mdataDiagnostic| [mdata $[$keys $[:$vals?]?]*] $e)
else
withMDataOptions delab
if let some _ := inaccessible? (← getExpr) then
let s ← withMDataExpr delab
if (← read).inPattern then
`(.($s)) -- We only include the inaccessible annotation when we are delaborating patterns
else
return s
else if let some _ := isLHSGoal? (← getExpr) then
withMDataExpr <| withAppFn <| withAppArg <| delab
else
withMDataOptions delab
/--
Return `true` iff current binder should be merged with the nested
@@ -1503,7 +1546,7 @@ def delabSorry : Delab := whenPPOption getPPNotation <| whenNotPPOption getPPExp
open Parser Command Term in
@[run_builtin_parser_attribute_hooks]
-- use `termParser` instead of `declId` so we can reuse `delabConst`
def declSigWithId := leading_parser termParser maxPrec >> declSig
private meta def declSigWithId := leading_parser termParser maxPrec >> declSig
private unsafe def evalSyntaxConstantUnsafe (env : Environment) (opts : Options) (constName : Name) : ExceptT String Id Syntax :=
env.evalConstCheck Syntax opts ``Syntax constName


@@ -98,6 +98,10 @@ register_builtin_option pp.numericTypes : Bool := {
defValue := false
descr := "(pretty printer) display types of numeric literals"
}
register_builtin_option pp.mdata : Bool := {
defValue := false
descr := "(pretty printer) display a representation of mdata annotations"
}
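For context, a minimal usage sketch of the option registered above (the rendering shown in the comment is illustrative, not the exact output format):

```lean
-- Hedged sketch: with `pp.mdata` enabled, the delaborator renders mdata
-- annotations on terms (via the `mdataDiagnostic` syntax added in this PR)
-- instead of silently skipping them.
set_option pp.mdata true in
#check (1 : Nat)  -- any mdata nodes in the elaborated term would now be displayed
```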
register_builtin_option pp.instantiateMVars : Bool := {
defValue := true
descr := "(pretty printer) instantiate mvars before delaborating"
@@ -260,6 +264,7 @@ def getPPTagAppFns (o : Options) : Bool := o.get pp.tagAppFns.name (getPPAll o)
def getPPUniverses (o : Options) : Bool := o.get pp.universes.name (getPPAll o)
def getPPFullNames (o : Options) : Bool := o.get pp.fullNames.name (getPPAll o)
def getPPPrivateNames (o : Options) : Bool := o.get pp.privateNames.name (getPPAll o)
def getPPMData (o : Options) : Bool := o.get pp.mdata.name pp.mdata.defValue
def getPPInstantiateMVars (o : Options) : Bool := o.get pp.instantiateMVars.name pp.instantiateMVars.defValue
def getPPMVars (o : Options) : Bool := o.get pp.mvars.name pp.mvars.defValue
def getPPMVarsAnonymous (o : Options) : Bool := o.get pp.mvars.anonymous.name (pp.mvars.anonymous.defValue && getPPMVars o)


@@ -622,7 +622,7 @@ def resolveLocalName [MonadLCtx m] (n : Name) : m (Option (Expr × List String))
loop view.name [] (globalDeclFound := false)
/--
Finds a name that unambiguously resolves to the given name `n₀`.
Finds an efficient name that unambiguously resolves to the given name `n₀`, if possible.
Considers suffixes of `n₀` and suffixes of aliases of `n₀` when "unresolving".
Aliases are considered first.
@@ -639,61 +639,63 @@ If `n₀` is an accessible name, then the result will be an accessible name.
The name `n₀` may be private.
-/
def unresolveNameGlobal (n₀ : Name) (fullNames := false) (allowHorizAliases := false)
(filter : Name → m Bool := fun _ => pure true) : m Name := do
if n₀.hasMacroScopes then return n₀
-- `n₁` is the name without any private prefix, and `qn₁?` is a valid fully-qualified name.
let (n₁, qn₁?) := if let some n := privateToUserName? n₀ then
if n₀ == mkPrivateName (← getEnv) n then
-- The private name is for the current module. `ResolveName.resolveExact` allows `_root_` for such names.
(n, some (rootNamespace ++ n))
else
(n, none)
else
(n₀, some (rootNamespace ++ n₀))
def unresolveNameGlobal? (n₀ : Name)
(fullNames := false) (allowHorizAliases := false)
(filter : Name → m Bool := fun _ => pure true) :
m (Option Name) := OptionT.run do
let view := extractMacroScopes n₀
-- `n₁` is the name without any private prefix or macro scopes
let n₁ := privateToUserName view.name
if fullNames then
if let [(potentialMatch, _)] ← resolveGlobalName n₁ (enableLog := false) then
if (← pure (potentialMatch == n₀) <&&> filter n₁) then
return n₁
if let some qn₁ := qn₁? then
-- We assume that the fully-qualified name resolves.
return qn₁
else
-- This is the imported private name case. Return the original private name.
return n₀
-- `initialNames` is an array of names to try taking suffixes of.
-- First are all the names that have `n₀` as an alias.
-- If horizontal aliases are not allowed, then any aliases that aren't from a parent namespace are filtered out.
let mut initialNames := (getRevAliases (← getEnv) n₀).toArray
-- Prefer `n₀` without `_root_`, but add it if necessary.
-- (Note: `ResolveName.resolveExact` allows `_root_` for private names accessible to the current module.)
return tryResolve view n₁ <|> tryResolve view (rootNamespace ++ n₁)
-- Now we consider aliases and the (potential) fully qualified name.
-- If horizontal aliases are not allowed, then any aliases that aren't from a parent namespace of `n₁` are skipped.
-- We try all suffixes for each option, taking the first that resolves to `n₀`, if any.
let mut aliases := (getRevAliases (← getEnv) n₀).toArray
unless allowHorizAliases do
initialNames := initialNames.filter fun n => n.getPrefix.isPrefixOf n₁.getPrefix
-- After aliases is the fully-qualified name.
if let some qn₁ := qn₁? then
initialNames := initialNames.push qn₁
for initialName in initialNames do
if let some n ← unresolveNameCore initialName then
return n
-- Both non-private names and current-module private names should be handled already,
-- but as a backup we return the original name.
-- Imported private names will often get to this point.
return n₀
aliases := aliases.filter fun n => n.getPrefix.isPrefixOf n₁.getPrefix
aliases.firstM (unresolveNameCore none) -- do not apply macro scopes on `n₀` to aliases
<|> unresolveNameCore view (rootNamespace ++ n₁)
where
unresolveNameCore (n : Name) : m (Option Name) := do
if n.hasMacroScopes then return none
/-- Adds any macro scopes to `n`, then returns it if it resolves to `n₀` and isn't filtered out. -/
tryResolve (view? : Option MacroScopesView) (n : Name) : OptionT m Name := do
let n' := if let some view := view? then { view with name := n }.review else n
let [(potentialMatch, _)] ← resolveGlobalName (enableLog := false) n' | failure
guard <| potentialMatch == n₀
guard <| filter n'
return n'
unresolveNameCore (view? : Option MacroScopesView) (n : Name) : OptionT m Name := do
guard <| !n.hasMacroScopes
let n := privateToUserName n
let mut revComponents := n.componentsRev
let mut candidate := Name.anonymous
for cmpt in revComponents do
candidate := Name.appendCore cmpt candidate
if let [(potentialMatch, _)] ← resolveGlobalName (enableLog := false) candidate then
if potentialMatch == n₀ then
if (← filter candidate) then
return some candidate
return none
try
return ← tryResolve view? candidate
catch _ : Unit => pure ()
failure
/-- Like `Lean.unresolveNameGlobal`, but also ensures that the unresolved name does not conflict
with the names of any local declarations. -/
def unresolveNameGlobalAvoidingLocals [MonadLCtx m] (n₀ : Name) (fullNames := false) : m Name :=
unresolveNameGlobal n₀ (fullNames := fullNames) (filter := fun n => Option.isNone <$> resolveLocalName n)
/--
Finds an efficient name that unambiguously resolves to the given name `n₀`, if possible.
If not, it returns `n₀`. (See `unresolveNameGlobal?`, which returns `none` instead.)
-/
def unresolveNameGlobal (n₀ : Name) (fullNames := false) (allowHorizAliases := false)
(filter : Name m Bool := fun _ => pure true) : m Name := do
let n? ← unresolveNameGlobal? n₀ (fullNames := fullNames) (allowHorizAliases := allowHorizAliases) (filter := filter)
return n?.getD n₀
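A hedged sketch of calling the refactored API from `CoreM` (the choice of `Nat.succ` and the printed result are illustrative; the invariant is only that the returned name resolves back to the input):

```lean
open Lean in
-- `unresolveNameGlobal` now delegates to `unresolveNameGlobal?` and falls
-- back to the original name when no shorter unambiguous name exists.
#eval show CoreM Unit from do
  let n ← unresolveNameGlobal `Nat.succ
  -- Whatever suffix is chosen, it unambiguously resolves to `Nat.succ`.
  logInfo m!"{n}"
```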
/-- Like `Lean.unresolveNameGlobal?`, but also ensures that the unresolved name does not conflict
with the names of any local declarations. Returns `none` if the name cannot be referred to.
For example, the name may be private and not accessible to the current module, or it may have macro scopes.-/
def unresolveNameGlobalAvoidingLocals? [MonadLCtx m] (n₀ : Name) (fullNames := false) : m (Option Name) :=
unresolveNameGlobal? n₀ (fullNames := fullNames) (filter := fun n => Option.isNone <$> resolveLocalName n)
/-- Like `Lean.unresolveNameGlobalAvoidingLocals?`, but returns `n₀` unchanged if the name cannot be referred to. -/
def unresolveNameGlobalAvoidingLocals [MonadLCtx m] (n₀ : Name) (fullNames := false) : m Name := do
let n? ← unresolveNameGlobalAvoidingLocals? n₀ (fullNames := fullNames)
return n?.getD n₀
end Lean


@@ -57,7 +57,7 @@ private def deriveStructureInstance (indVal : InductiveVal) (params : Array Expr
return { $decInits:structInstField,* }
)
private def matchAltTerm := Lean.Parser.Term.matchAlt (rhsParser := Lean.Parser.termParser)
private meta def matchAltTerm := Lean.Parser.Term.matchAlt (rhsParser := Lean.Parser.termParser)
private instance : Coe (TSyntax ``matchAltTerm) (TSyntax ``Parser.Term.matchAlt) where coe s := s
private def deriveInductiveInstance (indVal : InductiveVal) (params : Array Expr)


@@ -33,6 +33,11 @@ structure Import where
deriving Repr, Inhabited, ToJson, FromJson,
BEq, Hashable -- needed by Lake (in `Lake.Load.Elab.Lean`)
-- TODO: move further up into `Init` by using simpler representation for `imports`
@[extern "lean_idbg_client_loop"]
public opaque Idbg.idbgClientLoop {α : Type} [Nonempty α]
(siteId : String) (imports : Array Import) (apply : α → String) : IO Unit
instance : Coe Name Import := ({module := ·})
instance : ToString Import := fun imp =>


@@ -11,6 +11,7 @@ public import Std.Data.DHashMap.RawDef
public import Std.Data.Internal.List.Defs
public import Std.Data.DHashMap.Internal.Index
public import Init.Data.Nat.Power2.Basic
import Init.Data.List.Impl
import Init.Omega
public section


@@ -677,45 +677,45 @@ inductive WF : {α : Type u} → {β : α → Type v} → [BEq α] → [Hashable
-- we can write down `DHashMap.map` and `DHashMap.filterMap` in `AdditionalOperations.lean`
-- without requiring these proofs just to invoke the operations.
/-- Internal implementation detail of the hash map -/
| wf {α β} [BEq α] [Hashable α] {m : Raw α β} : 0 < m.buckets.size →
| wf {α β : _} [BEq α] [Hashable α] {m : Raw α β} : 0 < m.buckets.size →
(∀ [EquivBEq α] [LawfulHashable α], Raw.WFImp m) → WF m
/-- Internal implementation detail of the hash map -/
| emptyWithCapacity₀ {α β} [BEq α] [Hashable α] {c} : WF (Raw₀.emptyWithCapacity c : Raw₀ α β).1
| emptyWithCapacity₀ {α β : _} [BEq α] [Hashable α] {c} : WF (Raw₀.emptyWithCapacity c : Raw₀ α β).1
/-- Internal implementation detail of the hash map -/
| insert₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} : WF m → WF (Raw₀.insert ⟨m, h⟩ a b).1
| insert₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h a b} : WF m → WF (Raw₀.insert ⟨m, h⟩ a b).1
/-- Internal implementation detail of the hash map -/
| containsThenInsert₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
| containsThenInsert₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
WF m → WF (Raw₀.containsThenInsert ⟨m, h⟩ a b).2.1
/-- Internal implementation detail of the hash map -/
| containsThenInsertIfNew₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
| containsThenInsertIfNew₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
WF m → WF (Raw₀.containsThenInsertIfNew ⟨m, h⟩ a b).2.1
/-- Internal implementation detail of the hash map -/
| erase₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a} : WF m → WF (Raw₀.erase ⟨m, h⟩ a).1
| erase₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h a} : WF m → WF (Raw₀.erase ⟨m, h⟩ a).1
/-- Internal implementation detail of the hash map -/
| insertIfNew₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
| insertIfNew₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h a b} :
WF m → WF (Raw₀.insertIfNew ⟨m, h⟩ a b).1
/-- Internal implementation detail of the hash map -/
| getThenInsertIfNew?₀ {α β} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a b} :
| getThenInsertIfNew?₀ {α β : _} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a b} :
WF m → WF (Raw₀.getThenInsertIfNew? ⟨m, h⟩ a b).2.1
/-- Internal implementation detail of the hash map -/
| filter₀ {α β} [BEq α] [Hashable α] {m : Raw α β} {h f} : WF m → WF (Raw₀.filter f ⟨m, h⟩).1
| filter₀ {α β : _} [BEq α] [Hashable α] {m : Raw α β} {h f} : WF m → WF (Raw₀.filter f ⟨m, h⟩).1
/-- Internal implementation detail of the hash map -/
| constGetThenInsertIfNew?₀ {α β} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a b} :
| constGetThenInsertIfNew?₀ {α β : _} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a b} :
WF m → WF (Raw₀.Const.getThenInsertIfNew? ⟨m, h⟩ a b).2.1
/-- Internal implementation detail of the hash map -/
| modify₀ {α β} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a} {f : β a → β a} :
| modify₀ {α β : _} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a} {f : β a → β a} :
WF m → WF (Raw₀.modify ⟨m, h⟩ a f).1
/-- Internal implementation detail of the hash map -/
| constModify₀ {α} {β : Type v} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a} {f : β → β} :
| constModify₀ {α : _} {β : Type v} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a} {f : β → β} :
WF m → WF (Raw₀.Const.modify ⟨m, h⟩ a f).1
/-- Internal implementation detail of the hash map -/
| alter₀ {α β} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a}
| alter₀ {α β : _} [BEq α] [Hashable α] [LawfulBEq α] {m : Raw α β} {h a}
{f : Option (β a) → Option (β a)} : WF m → WF (Raw₀.alter ⟨m, h⟩ a f).1
/-- Internal implementation detail of the hash map -/
| constAlter₀ {α} {β : Type v} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a}
| constAlter₀ {α : _} {β : Type v} [BEq α] [Hashable α] {m : Raw α (fun _ => β)} {h a}
{f : Option β → Option β} : WF m → WF (Raw₀.Const.alter ⟨m, h⟩ a f).1
/-- Internal implementation detail of the hash map -/
| inter₀ {α β} [BEq α] [Hashable α] {m₁ m₂ : Raw α β} {h₁ h₂} : WF m₁ → WF m₂ → WF (Raw₀.inter ⟨m₁, h₁⟩ ⟨m₂, h₂⟩).1
| inter₀ {α β : _} [BEq α] [Hashable α] {m₁ m₂ : Raw α β} {h₁ h₂} : WF m₁ → WF m₂ → WF (Raw₀.inter ⟨m₁, h₁⟩ ⟨m₂, h₂⟩).1
-- TODO: this needs to be deprecated, but there is a bootstrapping issue.
-- @[deprecated WF.emptyWithCapacity₀ (since := "2025-03-12")]


@@ -311,6 +311,10 @@ def lakeLongOption : (opt : String) → CliM PUnit
| "--explain" => modifyThe LakeOptions ({· with shake.explain := true})
| "--trace" => modifyThe LakeOptions ({· with shake.trace := true})
| "--fix" => modifyThe LakeOptions ({· with shake.fix := true})
| "--only" => do
let mod ← takeOptArg "--only" "minimize only this module"
modifyThe LakeOptions fun opts =>
{opts with shake.onlyMods := opts.shake.onlyMods.push mod.toName}
| opt => throw <| CliError.unknownLongOption opt
def lakeOption :=


@@ -37,6 +37,8 @@ public structure Args where
fix : Bool := false
/-- `<MODULE>..`: the list of root modules to check -/
mods : Array Name := #[]
/-- The list of modules to minimize exclusively, otherwise all reachable ones. -/
onlyMods : Array Name := #[]
/-- We use `Nat` as a bitset for doing efficient set operations.
The bit indexes will usually be a module index. -/
@@ -421,7 +423,9 @@ def visitModule (pkgs : Array Name) (srcSearchPath : SearchPath)
let s ← get
let addOnly := addOnly || module?.any (·.raw.getTrailing?.any (·.toString.contains "shake: keep-all"))
let addOnly := addOnly ||
(!args.onlyMods.isEmpty && !args.onlyMods.contains modName) ||
module?.any (·.raw.getTrailing?.any (·.toString.contains "shake: keep-all"))
let mut deps := needs
-- Add additional preserved imports


@@ -1 +1 @@
lean4-stage0
../build/release/stage0

Binary file not shown.

Some files were not shown because too many files have changed in this diff.