Compare commits

...

21 Commits

Author SHA1 Message Date
Eric Wieser
6714601ee4 fix: remove accidental type monomorphism in Id.run_seqLeft (#12936)
This PR fixes `Id.run_seqLeft` and `Id.run_seqRight` to apply when the
two monad results have different types.
2026-03-17 06:43:51 +00:00
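As a quick illustration of the generalized statements (a sketch, not taken from the PR's tests), the seq lemmas now cover the case where the two `Id` computations have different result types:

```lean
-- Sketch: `<*` keeps the left result, `*>` keeps the right one,
-- even when the two `Id` computations have different result types.
example : ((pure 1 : Id Nat) <* (pure "hi" : Id String)).run = 1 := rfl
example : ((pure 1 : Id Nat) *> (pure "hi" : Id String)).run = "hi" := rfl
```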
damiano
6b604625f2 fix: add missing pp-spaces in grind_pattern (#11686)
This PR adds a missing space to the pretty-printed output of `grind_pattern`.

[#lean4 > Some pretty printing quirks @
💬](https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/Some.20pretty.20printing.20quirks/near/563848793)

Co-authored-by: Kim Morrison <kim@tqft.net>
2026-03-17 04:15:02 +00:00
Kim Morrison
e96b0ff39c fix: use response files on all platforms to avoid ARG_MAX (#12540)
This PR extends Lake's use of response files (`@file`) from Windows-only
to all platforms, avoiding `ARG_MAX` limits when invoking `clang`/`ar`
with many object files.

Lake already uses response files on Windows to avoid exceeding CLI
length limits. On macOS and Linux, linking Mathlib's ~15,000 object
files into a shared library can exceed macOS's `ARG_MAX` (262,144
bytes). Both `clang` and `gcc` support `@file` response files on all
platforms, so this is safe to enable unconditionally.

Reported as a macOS issue at
https://leanprover.zulipchat.com/#narrow/channel/270676-lean4/topic/The.20clang.20command.20line.20with.20all.20~15.2C000.20Mathlib.20.2Ec.2Eo.2Eexport/near/574369912:
the Mathlib cache ships Linux `.so` shared libs but not macOS `.dylib`
files, so `precompileModules` on macOS triggers a full re-link that
exceeds `ARG_MAX`.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-17 04:14:37 +00:00
Kim Morrison
50ee6dff0a chore: update leantar to v0.1.19 (#12938) 2026-03-17 03:55:21 +00:00
Mac Malone
9e0aa14b6f feat: lake: fixedToolchain package configuration (#12935)
This PR adds the `fixedToolchain` Lake package configuration option.
Setting this to `true` informs Lake that the package is only expected to
function on a single toolchain (as is the case for Mathlib). This causes
Lake's toolchain update procedure to prioritize the package's toolchain,
and avoids the need to separate the package's input-to-output mappings
by toolchain version in the Lake cache.
2026-03-17 02:37:55 +00:00
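A minimal `lakefile.lean` sketch of how the new option might be set; the field syntax is assumed from the option name in the description and is not verified against the PR:

```lean
import Lake
open Lake DSL

-- Hypothetical package declaration; `fixedToolchain` is the new option.
package myPkg where
  fixedToolchain := true
```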
Garmelon
5c685465bd chore: handle absence of meld in fix_expected.py (#12934) 2026-03-16 19:07:44 +00:00
Garmelon
ef87f6b9ac chore: delete temp files before, not after tests (#12932) 2026-03-16 19:02:28 +00:00
Garmelon
49715fe63c chore: improve how test suite interacts with stages (#12913)
The tests need to run with certain environment variables set that only
CMake really knows and that differ between stages. CMake could set the
variables directly when running the tests and benchmarks, but that
would leave no good way to manually run a single benchmark. So CMake
instead generates stage-specific scripts that set the required
environment variables.

Previously, those scripts were sourced directly by the individual
`run_*` scripts, so the env scripts of different stages would overwrite
each other. This PR changes the setup so they can instead be generated
next to each other. This also simplifies the `run_*` scripts themselves
a bit, and makes `tests/bench/build` less of a hack.
2026-03-16 15:20:03 +00:00
Lean stage0 autoupdater
133fd016b4 chore: update stage0 2026-03-16 13:15:14 +00:00
Bhavik Mehta
76e593a52d fix: rename Int.sq_nonnneg to Int.sq_nonneg (#12909)
This PR fixes the typo in `Int.sq_nonnneg`.

Closes #12906.

---------

Co-authored-by: Claude <noreply@anthropic.com>
2026-03-16 10:52:57 +00:00
Jesse Alama
fa9a32b5c8 fix: correct swapped operands in Std.Time subtraction instances (#12919)
This PR fixes the `HSub PlainTime Duration` instance, which had its
operands reversed: it computed `duration - time` instead of `time -
duration`. For example, subtracting 2 minutes from `time("13:02:01")`
would give `time("10:57:59")` rather than the expected
`time("13:00:01")`. We also noticed that `HSub PlainDateTime
Millisecond.Offset` is similarly affected.

Closes #12918
2026-03-16 10:52:06 +00:00
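A toy sketch of the bug class (not the actual `Std.Time` code): an `HSub` instance must thread its operands through in the declared order.

```lean
-- Hypothetical wrapper type for illustration.
structure Seconds where
  val : Int

-- A buggy instance would compute `d - t.val`; the correct order is:
instance : HSub Seconds Int Seconds := ⟨fun t d => ⟨t.val - d⟩⟩

example : (Seconds.mk 120 - (2 : Int)).val = 118 := rfl
```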
Henrik Böving
2d999d7622 refactor: ignore borrow annotations at export/extern tricks (#12930)
This PR places `set_option compiler.ignoreBorrowAnnotation true in` on
all `export`/`extern` pairs. This is necessary because `export` forces
all arguments to be passed as owned, while `extern` respects borrow
annotations. The current approach to the `export`/`extern` trick was
always broken, but the breakage never surfaced. However, with upcoming
changes many `export`/`extern` pairs are going to be affected by borrow
annotations and would have broken without this.
2026-03-16 10:03:40 +00:00
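A hypothetical sketch of the pattern described above (names and signatures are made up for illustration; `@&` marks a borrowed argument):

```lean
-- The `extern` declaration respects the borrow annotation on `s`...
@[extern "lean_my_strlen"]
opaque myStrlenRef (s : @& String) : Nat

-- ...while `export` forces `s` to be passed as owned, so the borrow
-- annotation is suppressed to keep both calling conventions in sync.
set_option compiler.ignoreBorrowAnnotation true in
@[export lean_my_strlen]
def myStrlen (s : String) : Nat := s.length
```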
Sebastian Ullrich
ddd5c213c6 chore: CLAUDE.md: stage 2 build instructions (#12929) 2026-03-16 09:47:14 +00:00
Kim Morrison
c9ceba1784 fix: use null-safe while-read loop for subverso manifest sync (#12928)
This PR replaces `find -print0 | xargs -0 -I{} sh -c '...'` with
`find -print0 | while IFS= read -r -d '' f; do ... done` for the
subverso sub-manifest sync in release_steps.py. The original xargs
invocation had fragile nested shell quoting; the while-read loop is
both null-delimiter safe and more readable.

🤖 Prepared with Claude Code

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-16 08:17:32 +00:00
Mac Malone
57df23f27e feat: lake: cached compressed module artifacts (#12914)
This PR adds packing and unpacking of module artifacts into `.ltar`
archives using `leantar`.
2026-03-16 04:36:19 +00:00
Mac Malone
ea8fca2d9f refactor: lake: download arts by default in cache get (#12927)
This PR changes `lake cache get` to download artifacts by default.
The artifact download can be skipped with the new `--mappings-only`
option (`--download-arts` is now obsolete).

In the future, the plan is to have Lake download mappings when cloning
dependencies. Then, `lake cache get` will primarily be used to download
artifacts eagerly. Thus, it makes sense to have that as the default.
2026-03-16 02:29:44 +00:00
Paul Reichert
274997420a refactor: remove backward compatibility options from iterator/slice/range modules (#12925)
This PR removes `respectTransparency`, `reducibleClassField` and
`simp +instances` usages in the iterator/slice/range modules.
2026-03-15 14:03:51 +00:00
Wojciech Różowski
6631352136 fix: remove accidentally added code from Sym.Simp.Pattern (#12926)
This PR removes unused functions (`mkPatternCoreFromLambda`,
`mkPatternFromLambda`, `mkSimprocPatternFromExpr`) and the `import
Lean.Meta.AbstractMVars` that were added to `Lean.Meta.Sym.Pattern`
after merging #12597.
2026-03-15 10:30:26 +00:00
Leonardo de Moura
cfa8c5a036 fix: handle universe level commutativity in sym pattern matching (#12923)
This PR fixes a bug where `max u v` and `max v u` fail to match in
SymM's pattern matching. Both `processLevel` (Phase 1) and
`isLevelDefEqS` (Phase 2) treated `max` positionally, so `max u v ≠ max
v u` structurally even though they are semantically equal.

The fix has three parts:
- Eagerly normalize universe levels in patterns at creation time
(`preprocessDeclPattern`, `preprocessExprPattern`,
`mkSimprocPatternFromExpr`)
- Normalize the target level in `processLevel` before matching, using a
`where go` refactor
- Add `tryApproxMaxMax` to `processLevel` and `isLevelDefEqS`: when
positional `max/max` matching would fail, check if one argument from
each side matches structurally and match the remaining pair

Also moves `normalizeLevels` from `Grind.Util` to `Sym.Util` to avoid
code duplication, since both Sym and Grind need it.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-15 01:06:16 +00:00
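The kernel already identifies universe levels up to `max` commutativity, as in this sketch; the PR makes SymM's structural pattern matcher agree with that:

```lean
universe u v
-- Accepted: level `max u v` normalizes to the same form as `max v u`.
example : Sort (max u v) = Sort (max v u) := rfl
```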
Leonardo de Moura
7120d9aef5 fix: eta-reduce expressions in sym discrimination tree lookup (#12920)
This PR adds eta reduction to the sym discrimination tree lookup
functions (`getMatch`, `getMatchWithExtra`, `getMatchLoop`). Without
this, expressions like `StateM Nat` that unfold to eta-expanded forms
`(fun α => StateT Nat Id α)` fail to match discrimination tree entries
for the eta-reduced form `(StateT Nat Id)`.

Also optimizes `etaReduce` with an early exit for non-lambda expressions
and removes a redundant `n == 0` check.
Includes a test verifying that `P (StateM Nat)` matches a disc tree
entry for `P (StateT Nat Id)`.

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-14 16:57:10 +00:00
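A sketch of the mismatch: `StateM` is an abbreviation, so unfolding it can leave an eta-expanded form that is definitionally (but not structurally) equal to the eta-reduced head:

```lean
-- `StateM σ` is `StateT σ Id`, and eta identifies the two spellings below.
example : StateM Nat = StateT Nat Id := rfl
example : StateM Nat = (fun α => StateT Nat Id α) := rfl
```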
Joachim Breitner
c2d4079193 perf: optimize string literal equality simprocs for kernel efficiency (#12887)
This PR optimizes the `String.reduceEq`, `String.reduceNe`, and
`Sym.Simp` string equality simprocs to produce kernel-efficient proofs.
Previously, these used `String.decEq` which forced the kernel to run
UTF-8 encoding/decoding and byte array comparison, causing 86+ kernel
unfoldings on short strings.

The new approach reduces string inequality to `List Char` via
`String.ofList_injective`, then uses two strategies depending on the
difference:

- **Different characters at position `i`**: Projects to `Nat` via
`congrArg (fun l => (List.get!Internal l i).toNat)`, then uses
`Nat.ne_of_beq_eq_false rfl`. This avoids `Decidable` instances entirely
— the kernel only evaluates `Nat.beq` on two concrete natural numbers.

- **One string is a prefix of the other**: Uses `congrArg (List.drop n
·)` with `List.cons_ne_nil`, which is a definitional proof requiring no
`decide` step at all.

For equal strings, `eq_true rfl` avoids kernel evaluation entirely.

The shared proof construction is in `Lean.Meta.mkStringLitNeProof`
(`Lean/Meta/StringLitProof.lean`), used by both the standard simprocs
and the `Sym.Simp` ground evaluator.

Kernel max unfolds for `"hello" ≠ "foo"`: 86+ → 6.

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-14 10:30:31 +00:00
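The two strategies can be sketched on toy data (hand-written analogues, not the actual simproc output):

```lean
-- (1) A character difference projected to `Nat`: the kernel only has
-- to evaluate `Nat.beq` on two literals.
example : (104 : Nat) ≠ 102 := Nat.ne_of_beq_eq_false rfl

-- (2) A prefix difference reduces to `cons ≠ nil`, a definitional proof.
example : ['a'] ≠ ([] : List Char) := List.cons_ne_nil _ _
```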
238 changed files with 1436 additions and 689 deletions

View File

@@ -20,9 +20,24 @@ CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='--rerun-failed'
# Single test from tests/foo/bar/ (quick check during development)
cd tests/foo/bar && ./run_test example_test.lean
CTEST_PARALLEL_LEVEL="$(nproc)" CTEST_OUTPUT_ON_FAILURE=1 \
make -C build/release -j "$(nproc)" test ARGS='-R testname'
```
## Testing stage 2
When requested to test stage 2, build it as follows:
```
make -C build/release stage2 -j$(nproc)
```
Stage 2 is *not* automatically invalidated by changes to `src/`, which allows for faster iteration
when fixing a specific file in the stage 2 build. To invalidate files that already passed
the stage 2 build, as well as for final validation,
```
make -C build/release/stage2 clean-stdlib
```
must be run manually before building.
## New features
When asked to implement new features:

View File

@@ -79,7 +79,7 @@ if(NOT CMAKE_SYSTEM_NAME MATCHES "Emscripten")
endif()
find_program(LEANTAR leantar)
if(NOT LEANTAR)
set(LEANTAR_VERSION v0.1.18)
set(LEANTAR_VERSION v0.1.19)
if(CMAKE_SYSTEM_NAME MATCHES "Windows")
set(LEANTAR_ARCHIVE_SUFFIX .zip)
set(LEANTAR_TARGET x86_64-pc-windows-msvc)

View File

@@ -1,6 +1,3 @@
#!/usr/bin/env bash
source ../../../tests/env_test.sh
leanmake --always-make bin
capture ./build/bin/test hello world

doc/examples/run_test → doc/examples/run_test.sh Executable file → Normal file
View File

@@ -1,6 +1,3 @@
#!/usr/bin/env bash
source ../../tests/env_test.sh
capture_only "$1" \
lean -Dlinter.all=false "$1"
check_exit_is_success

View File

@@ -492,8 +492,9 @@ def execute_release_steps(repo, version, config):
'ROOT_REV=$(jq -r \'.packages[] | select(.name == "subverso") | .rev\' lake-manifest.json); '
'SUBVERSO_URL=$(jq -r \'.packages[] | select(.name == "subverso") | .url\' lake-manifest.json); '
'DEMOD_REV=$(git ls-remote "$SUBVERSO_URL" "refs/tags/no-modules/$ROOT_REV" | awk \'{print $1}\'); '
'find test-projects -name lake-manifest.json -print0 | '
'xargs -0 -I{} sh -c \'jq --arg rev "$DEMOD_REV" \'.packages |= map(if .name == "subverso" then .rev = $rev else . end)\' "{}" > /tmp/lm_tmp.json && mv /tmp/lm_tmp.json "{}"\''
'find test-projects -name lake-manifest.json -print0 | while IFS= read -r -d \'\' f; do '
'jq --arg rev "$DEMOD_REV" \'.packages |= map(if .name == "subverso" then .rev = $rev else . end)\' "$f" > /tmp/lm_tmp.json && mv /tmp/lm_tmp.json "$f"; '
'done'
)
run_command(sync_script, cwd=repo_path)
print(green("Synced de-modulized subverso rev to all test-project sub-manifests"))

View File

@@ -254,8 +254,8 @@ instance : LawfulMonad Id := by
@[simp, grind =] theorem run_bind (x : Id α) (f : α → Id β) : (x >>= f).run = (f x.run).run := rfl
@[simp, grind =] theorem run_pure (a : α) : (pure a : Id α).run = a := rfl
@[simp, grind =] theorem pure_run (a : Id α) : pure a.run = a := rfl
@[simp] theorem run_seqRight (x y : Id α) : (x *> y).run = y.run := rfl
@[simp] theorem run_seqLeft (x y : Id α) : (x <* y).run = x.run := rfl
@[simp] theorem run_seqRight (x : Id α) (y : Id β) : (x *> y).run = y.run := rfl
@[simp] theorem run_seqLeft (x : Id α) (y : Id β) : (x <* y).run = x.run := rfl
@[simp] theorem run_seq (f : Id (α → β)) (x : Id α) : (f <*> x).run = f.run x.run := rfl
end Id

View File

@@ -118,16 +118,19 @@ theorem toNat_pow_of_nonneg {x : Int} (h : 0 ≤ x) (k : Nat) : (x ^ k).toNat =
| succ k ih =>
rw [Int.pow_succ, Int.toNat_mul (Int.pow_nonneg h) h, ih, Nat.pow_succ]
protected theorem sq_nonnneg (m : Int) : 0 ≤ m ^ 2 := by
protected theorem sq_nonneg (m : Int) : 0 ≤ m ^ 2 := by
rw [Int.pow_succ, Int.pow_one]
cases m
· apply Int.mul_nonneg <;> simp
· apply Int.mul_nonneg_of_nonpos_of_nonpos <;> exact negSucc_le_zero _
@[deprecated Int.sq_nonneg (since := "2026-03-13")]
protected theorem sq_nonnneg (m : Int) : 0 ≤ m ^ 2 := Int.sq_nonneg m
protected theorem pow_nonneg_of_even {m : Int} {n : Nat} (h : n % 2 = 0) : 0 ≤ m ^ n := by
rw [← Nat.mod_add_div n 2, h, Nat.zero_add, Int.pow_mul]
apply Int.pow_nonneg
exact Int.sq_nonnneg m
exact Int.sq_nonneg m
protected theorem neg_pow {m : Int} {n : Nat} : (-m)^n = (-1)^(n % 2) * m^n := by
rw [Int.neg_eq_neg_one_mul, Int.mul_pow]

View File

@@ -435,8 +435,9 @@ theorem Iter.forIn_filterMapWithPostcondition
match ← (f out).run with
| some c => g c acc
| none => return .yield acc) := by
simp +instances [Iter.forIn_eq_forIn_toIterM, filterMapWithPostcondition, IterM.forIn_filterMapWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp only [filterMapWithPostcondition, IterM.forIn_filterMapWithPostcondition, forIn_eq_forIn_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl -- expressions are equal up to different matchers
theorem Iter.forIn_filterMapM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -448,8 +449,9 @@ theorem Iter.forIn_filterMapM
match ← f out with
| some c => g c acc
| none => return .yield acc) := by
simp +instances [filterMapM, forIn_eq_forIn_toIterM, IterM.forIn_filterMapM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp [filterMapM, forIn_eq_forIn_toIterM, IterM.forIn_filterMapM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl
theorem Iter.forIn_filterMap
[Monad n] [LawfulMonad n] [Finite α Id]
@@ -469,8 +471,8 @@ theorem Iter.forIn_mapWithPostcondition
{g : β₂ → γ → o (ForInStep γ)} :
forIn (it.mapWithPostcondition f) init g =
forIn it init (fun out acc => do g (← (f out).run) acc) := by
simp +instances [mapWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_mapWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_mapWithPostcondition]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_mapM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -498,8 +500,8 @@ theorem Iter.forIn_filterWithPostcondition
haveI : MonadLift n o := monadLift
forIn (it.filterWithPostcondition f) init g =
forIn it init (fun out acc => do if (← (f out).run).down then g out acc else return .yield acc) := by
simp +instances [filterWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_filterWithPostcondition,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterWithPostcondition, forIn_eq_forIn_toIterM, IterM.forIn_filterWithPostcondition]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_filterM
[Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -508,8 +510,8 @@ theorem Iter.forIn_filterM
[IteratorLoop α Id o] [LawfulIteratorLoop α Id o]
{it : Iter (α := α) β} {f : β n (ULift Bool)} {init : γ} {g : β γ o (ForInStep γ)} :
forIn (it.filterM f) init g = forIn it init (fun out acc => do if (← f out).down then g out acc else return .yield acc) := by
simp +instances [filterM, forIn_eq_forIn_toIterM, IterM.forIn_filterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterM, forIn_eq_forIn_toIterM, IterM.forIn_filterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.forIn_filter
[Monad n] [LawfulMonad n]
@@ -550,8 +552,9 @@ theorem Iter.foldM_filterMapM {α β γ δ : Type w}
it.foldM (init := init) (fun d b => do
let some c ← f b | pure d
g d c) := by
simp +instances [filterMapM, IterM.foldM_filterMapM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]; rfl
simp only [filterMapM, IterM.foldM_filterMapM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
rfl
theorem Iter.foldM_mapWithPostcondition {α β γ δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -563,8 +566,8 @@ theorem Iter.foldM_mapWithPostcondition {α β γ δ : Type w}
{f : β → PostconditionT n γ} {g : δ → γ → o δ} {init : δ} {it : Iter (α := α) β} :
(it.mapWithPostcondition f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do let c ← (f b).run; g d c) := by
simp +instances [mapWithPostcondition, IterM.foldM_mapWithPostcondition, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapWithPostcondition, IterM.foldM_mapWithPostcondition, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_mapM {α β γ δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -578,8 +581,8 @@ theorem Iter.foldM_mapM {α β γ δ : Type w}
haveI : MonadLift n o := MonadLiftT.monadLift
(it.mapM f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do let c ← f b; g d c) := by
simp +instances [mapM, IterM.foldM_mapM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [mapM, IterM.foldM_mapM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterWithPostcondition {α β δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -591,8 +594,8 @@ theorem Iter.foldM_filterWithPostcondition {α β δ : Type w}
{f : β → PostconditionT n (ULift Bool)} {g : δ → β → o δ} {init : δ} {it : Iter (α := α) β} :
(it.filterWithPostcondition f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do if (← (f b).run).down then g d b else pure d) := by
simp +instances [filterWithPostcondition, IterM.foldM_filterWithPostcondition, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterWithPostcondition, IterM.foldM_filterWithPostcondition, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterM {α β δ : Type w}
{n : Type w → Type w''} {o : Type w → Type w'''}
@@ -605,8 +608,8 @@ theorem Iter.foldM_filterM {α β δ : Type w}
{f : β → n (ULift Bool)} {g : δ → β → o δ} {init : δ} {it : Iter (α := α) β} :
(it.filterM f).foldM (init := init) g =
it.foldM (init := init) (fun d b => do if (← f b).down then g d b else pure d) := by
simp +instances [filterM, IterM.foldM_filterM, foldM_eq_foldM_toIterM,
instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
simp only [filterM, IterM.foldM_filterM, foldM_eq_foldM_toIterM]
rw [instMonadLiftTOfMonadLift_instMonadLiftTOfPure]
theorem Iter.foldM_filterMap {α β γ δ : Type w} {n : Type w → Type w''}
[Iterator α Id β] [Finite α Id] [Monad n] [LawfulMonad n]

View File

@@ -232,7 +232,6 @@ public theorem Iter.toArray_flatMapM {α α₂ β γ : Type w} {m : Type w → T
(it₁.flatMapM f).toArray = Array.flatten <$> (it₁.mapM fun b => do (← f b).toArray).toArray := by
simp [flatMapM, toArray_flatMapAfterM]
set_option backward.isDefEq.respectTransparency false in
public theorem Iter.toList_flatMapAfter {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]
[Finite α Id] [Finite α₂ Id]
{f : β → Iter (α := α₂) γ} {it₁ : Iter (α := α) β} {it₂ : Option (Iter (α := α₂) γ)} :
@@ -241,9 +240,9 @@ public theorem Iter.toList_flatMapAfter {α α₂ β γ : Type w} [Iterator α I
| some it₂ => it₂.toList ++
(it₁.map fun b => (f b).toList).toList.flatten := by
simp only [flatMapAfter, Iter.toList, toIterM_toIter, IterM.toList_flatMapAfter]
cases it₂ <;> simp [map, IterM.toList_map_eq_toList_mapM, - IterM.toList_map]
unfold Iter.toList
cases it₂ <;> simp [map]
set_option backward.isDefEq.respectTransparency false in
public theorem Iter.toArray_flatMapAfter {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]
[Finite α Id] [Finite α₂ Id]
{f : β → Iter (α := α₂) γ} {it₁ : Iter (α := α) β} {it₂ : Option (Iter (α := α₂) γ)} :
@@ -252,6 +251,7 @@ public theorem Iter.toArray_flatMapAfter {α α₂ β γ : Type w} [Iterator α
| some it₂ => it₂.toArray ++
(it₁.map fun b => (f b).toArray).toArray.flatten := by
simp only [flatMapAfter, Iter.toArray, toIterM_toIter, IterM.toArray_flatMapAfter]
unfold Iter.toArray
cases it₂ <;> simp [map, IterM.toArray_map_eq_toArray_mapM, - IterM.toArray_map]
public theorem Iter.toList_flatMap {α α₂ β γ : Type w} [Iterator α Id β] [Iterator α₂ Id γ]

View File

@@ -374,7 +374,6 @@ theorem IterM.toList_map_eq_toList_filterMapM {α β γ : Type w} {m : Type w
simp [toList_map_eq_toList_mapM, toList_mapM_eq_toList_filterMapM]
congr <;> simp
set_option backward.whnf.reducibleClassField false in
/--
Variant of `toList_filterMapWithPostcondition_filterMapWithPostcondition` that is intended to be
used with the `apply` tactic. Because neither the LHS nor the RHS determine all implicit parameters,
@@ -395,7 +394,7 @@ private theorem IterM.toList_filterMapWithPostcondition_filterMapWithPostconditi
(it.filterMapWithPostcondition (n := o) fg).toList := by
induction it using IterM.inductSteps with | step it ihy ihs
letI : MonadLift n o := monadLift
haveI : LawfulMonadLift n o := ⟨by simp +instances [this], by simp +instances [this]⟩
haveI : LawfulMonadLift n o := ⟨LawfulMonadLiftT.monadLift_pure, LawfulMonadLiftT.monadLift_bind⟩
rw [toList_eq_match_step, toList_eq_match_step, step_filterMapWithPostcondition,
bind_assoc, step_filterMapWithPostcondition, step_filterMapWithPostcondition]
simp only [bind_assoc, liftM_bind]
@@ -602,7 +601,6 @@ theorem IterM.toList_map_mapM {α β γ δ : Type w}
toList_filterMapM_mapM]
congr <;> simp
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_filterMapWithPostcondition {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [LawfulMonad m]
@@ -626,7 +624,6 @@ theorem IterM.toList_filterMapWithPostcondition {α β γ : Type w} {m : Type w
· simp [ihs _, heq]
· simp [heq]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_mapWithPostcondition {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [LawfulMonad m] [Iterator α Id β] [Finite α Id]
@@ -647,25 +644,25 @@ theorem IterM.toList_mapWithPostcondition {α β γ : Type w} {m : Type w → Ty
· simp [ihs _, heq]
· simp [heq]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_filterMapM {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [MonadAttach m] [LawfulMonad m] [WeaklyLawfulMonadAttach m]
[Iterator α Id β] [Finite α Id]
{f : β → m (Option γ)} (it : IterM (α := α) Id β) :
(it.filterMapM f).toList = it.toList.run.filterMapM f := by
simp [toList_filterMapM_eq_toList_filterMapWithPostcondition, toList_filterMapWithPostcondition,
PostconditionT.attachLift, PostconditionT.run_eq_map, WeaklyLawfulMonadAttach.map_attach]
simp only [toList_filterMapM_eq_toList_filterMapWithPostcondition,
toList_filterMapWithPostcondition, PostconditionT.run_eq_map]
simp [PostconditionT.attachLift, WeaklyLawfulMonadAttach.map_attach]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem IterM.toList_mapM {α β γ : Type w} {m : Type w → Type w'}
[Monad m] [MonadAttach m] [LawfulMonad m] [WeaklyLawfulMonadAttach m]
[Iterator α Id β] [Finite α Id]
{f : β → m γ} (it : IterM (α := α) Id β) :
(it.mapM f).toList = it.toList.run.mapM f := by
simp [toList_mapM_eq_toList_mapWithPostcondition, toList_mapWithPostcondition,
PostconditionT.attachLift, PostconditionT.run_eq_map, WeaklyLawfulMonadAttach.map_attach]
simp only [toList_mapM_eq_toList_mapWithPostcondition, toList_mapWithPostcondition,
PostconditionT.run_eq_map]
simp [PostconditionT.attachLift, WeaklyLawfulMonadAttach.map_attach]
@[simp]
theorem IterM.toList_filterMap {α β γ : Type w} {m : Type w → Type w'}
@@ -1303,7 +1300,6 @@ theorem IterM.forIn_filterMap
rw [filterMap, forIn_filterMapWithPostcondition]
simp [PostconditionT.run_eq_map]
set_option backward.isDefEq.respectTransparency false in
theorem IterM.forIn_mapWithPostcondition
[Monad m] [LawfulMonad m] [Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
[MonadLiftT m n] [LawfulMonadLiftT m n] [MonadLiftT n o] [LawfulMonadLiftT n o]
@@ -1314,9 +1310,9 @@ theorem IterM.forIn_mapWithPostcondition
haveI : MonadLift n o := monadLift
forIn (it.mapWithPostcondition f) init g =
forIn it init (fun out acc => do g (← (f out).run) acc) := by
rw [mapWithPostcondition, InternalCombinators.map, InternalCombinators.filterMap,
filterMapWithPostcondition, forIn_filterMapWithPostcondition]
simp [PostconditionT.run_eq_map]
unfold mapWithPostcondition InternalCombinators.map Map.instIterator Map.instIteratorLoop Map
rw [← InternalCombinators.filterMap, filterMapWithPostcondition, forIn_filterMapWithPostcondition]
simp
theorem IterM.forIn_mapM
[Monad m] [LawfulMonad m] [Monad n] [LawfulMonad n] [Monad o] [LawfulMonad o]
@@ -1480,7 +1476,7 @@ theorem IterM.foldM_filterM {α β δ : Type w}
simp [filterM, foldM_filterMapWithPostcondition, PostconditionT.run_attachLift]
congr 1; ext out acc
apply bind_congr; intro fx
cases fx.down <;> simp [PostconditionT.run_eq_map]
cases fx.down <;> simp
theorem IterM.foldM_filterMap {α β γ δ : Type w} {m : Type w → Type w'} {n : Type w → Type w''}
[Iterator α m β] [Finite α m] [Monad m] [Monad n] [LawfulMonad m] [LawfulMonad n]

View File

@@ -32,11 +32,12 @@ theorem Iter.forIn'_eq {α β : Type w} [Iterator α Id β] [Finite α Id]
IterM.DefaultConsumers.forIn' (n := m) (fun _ _ f x => f x.run) γ (fun _ _ _ => True)
it.toIterM init _ (fun _ => id)
(fun out h acc => return ⟨f out (Iter.isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc, trivial⟩) := by
simp +instances only [ForIn'.forIn']
simp only [ForIn'.forIn']
have : ∀ a b c, f a b c = (Subtype.val <$> ⟨·, trivial⟩ <$> f a b c) := by simp
simp +singlePass only [this]
rw [hl.lawful (fun _ _ f x => f x.run) (wf := IteratorLoop.wellFounded_of_finite)]
simp +instances [IteratorLoop.defaultImplementation]
simp only [IteratorLoop.forIn, Functor.map_map, id_map',
bind_pure_comp]
theorem Iter.forIn_eq {α β : Type w} [Iterator α Id β] [Finite α Id]
{m : Type x → Type x'} [Monad m] [LawfulMonad m] [IteratorLoop α Id m]
@@ -81,7 +82,7 @@ theorem Iter.forIn'_eq_forIn'_toIterM {α β : Type w} [Iterator α Id β]
letI : ForIn' m (IterM (α := α) Id β) β _ := IterM.instForIn'
ForIn'.forIn' it.toIterM init
(fun out h acc => f out (isPlausibleIndirectOutput_iff_isPlausibleIndirectOutput_toIterM.mpr h) acc) := by
simp +instances [ForIn'.forIn', monadLift]
simp [ForIn'.forIn', monadLift]
theorem Iter.forIn_eq_forIn_toIterM {α β : Type w} [Iterator α Id β]
[Finite α Id] {m : Type w → Type w''} [Monad m] [LawfulMonad m]

View File

@@ -109,10 +109,10 @@ theorem IterM.forIn'_eq {α β : Type w} {m : Type w → Type w'} [Iterator α m
letI : ForIn' n (IterM (α := α) m β) β _ := IterM.instForIn'
ForIn'.forIn' (α := β) (m := n) it init f = IterM.DefaultConsumers.forIn' (n := n)
(fun _ _ f x => monadLift x >>= f) γ (fun _ _ _ => True) it init _ (fun _ => id) (return ⟨f · · ·, trivial⟩) := by
simp +instances only [ForIn'.forIn']
simp only [ForIn'.forIn']
have : f = (Subtype.val <$> ⟨·, trivial⟩ <$> f · · ·) := by simp
rw [this, hl.lawful (fun _ _ f x => monadLift x >>= f) (wf := IteratorLoop.wellFounded_of_finite)]
simp +instances [IteratorLoop.defaultImplementation]
simp [IteratorLoop.forIn]
try rfl
theorem IterM.forIn_eq {α β : Type w} {m : Type w → Type w'} [Iterator α m β] [Finite α m]

View File

@@ -272,6 +272,12 @@ theorem PostconditionT.run_bind' {m : Type w → Type w'} [Monad m] [LawfulMonad
(x >>= f).run = x.run >>= (f · |>.run) :=
run_bind
@[simp]
protected theorem PostconditionT.run_pure {m : Type w → Type w'} [Monad m] [LawfulMonad m]
{α : Type w} {x : α} :
(pure x : PostconditionT m α).run = pure x := by
simp [run_eq_map]
@[simp]
theorem PostconditionT.property_lift {m : Type w → Type w'} [Functor m] {α : Type w}
{x : m α} : (lift x : PostconditionT m α).Property = (fun _ => True) := by

View File

@@ -208,7 +208,7 @@ public instance LawfulOrderLT.of_lt {α : Type u} [LT α] [i : Asymm (α := α)
haveI := LE.ofLT α
LawfulOrderLT α :=
letI := LE.ofLT α
{ lt_iff a b := by simp +instances [LE.le]; apply Asymm.asymm }
{ lt_iff a b := by simp [LE.le]; apply Asymm.asymm }
/--
If an `LT α` instance is asymmetric and its negation is transitive, then `LE.ofLT α` represents a
@@ -253,8 +253,7 @@ public theorem LawfulOrderInf.of_lt {α : Type u} [Min α] [LT α]
letI := LE.ofLT α
{ le_min_iff a b c := by
open Classical in
simp +instances only [LE.le]
simp [ not_or, Decidable.not_iff_not]
simp only [LE.le, not_or, Decidable.not_iff_not]
simpa [Decidable.imp_iff_not_or] using min_lt_iff a b c }
/--
@@ -283,8 +282,7 @@ public theorem LawfulOrderSup.of_lt {α : Type u} [Max α] [LT α]
letI := LE.ofLT α
{ max_le_iff a b c := by
open Classical in
simp +instances only [LE.le]
simp [ not_or, Decidable.not_iff_not]
simp only [LE.le, not_or, Decidable.not_iff_not]
simpa [Decidable.imp_iff_not_or] using lt_max_iff a b c }
/--

View File

@@ -287,7 +287,7 @@ scoped instance (priority := low) instLawfulOrderLTOpposite {il : LE α} {it : L
letI := il.opposite
letI := it.opposite
{ lt_iff a b := by
simp +instances only [LE.opposite, LT.opposite]
simp only [LE.le, LT.lt]
letI := il; letI := it
exact LawfulOrderLT.lt_iff b a }
@@ -297,7 +297,7 @@ scoped instance (priority := low) instLawfulOrderBEqOpposite {il : LE α} {ib :
LawfulOrderBEq α :=
letI := il.opposite
{ beq_iff_le_and_ge a b := by
simp +instances only [LE.opposite]
simp only [LE.le]
letI := il; letI := ib
rw [LawfulOrderBEq.beq_iff_le_and_ge]
exact and_comm }
@@ -310,7 +310,7 @@ scoped instance (priority := low) instLawfulOrderInfOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMax
{ max_le_iff a b c := by
simp +instances only [LE.opposite, Min.oppositeMax]
simp only [LE.le, Max.max]
letI := il; letI := im
exact LawfulOrderInf.le_min_iff c a b }
@@ -322,11 +322,11 @@ scoped instance (priority := low) instLawfulOrderMinOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMax
{ max_eq_or a b := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact MinEqOr.min_eq_or a b
max_le_iff a b c := by
simp +instances only [LE.opposite, Min.oppositeMax]
simp only [LE.le, Max.max]
letI := il; letI := im
exact LawfulOrderInf.le_min_iff c a b }
@@ -338,7 +338,7 @@ scoped instance (priority := low) instLawfulOrderSupOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMin
{ le_min_iff a b c := by
simp +instances only [LE.opposite, Max.oppositeMin]
simp only [LE.le, Min.min]
letI := il; letI := im
exact LawfulOrderSup.max_le_iff b c a }
@@ -350,11 +350,11 @@ scoped instance (priority := low) instLawfulOrderMaxOpposite {il : LE α} {im :
letI := il.opposite
letI := im.oppositeMin
{ min_eq_or a b := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact MaxEqOr.max_eq_or a b
le_min_iff a b c := by
simp +instances only [LE.opposite, Max.oppositeMin]
simp only [LE.le, Min.min]
letI := il; letI := im
exact LawfulOrderSup.max_le_iff b c a }
@@ -366,11 +366,11 @@ scoped instance (priority := low) instLawfulOrderLeftLeaningMinOpposite {il : LE
letI := il.opposite
letI := im.oppositeMax
{ max_eq_left a b hab := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact LawfulOrderLeftLeaningMin.min_eq_left a b hab
max_eq_right a b hab := by
simp +instances only [Min.oppositeMax]
simp only [Max.max]
letI := il; letI := im
exact LawfulOrderLeftLeaningMin.min_eq_right a b hab }
@@ -382,11 +382,11 @@ scoped instance (priority := low) instLawfulOrderLeftLeaningMaxOpposite {il : LE
letI := il.opposite
letI := im.oppositeMin
{ min_eq_left a b hab := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact LawfulOrderLeftLeaningMax.max_eq_left a b hab
min_eq_right a b hab := by
simp +instances only [Max.oppositeMin]
simp only [Min.min]
letI := il; letI := im
exact LawfulOrderLeftLeaningMax.max_eq_right a b hab }


@@ -796,7 +796,6 @@ automatically. If it fails, it is necessary to provide some of the fields manual
@[inline, expose, implicit_reducible]
public def LinearOrderPackage.ofOrd (α : Type u)
(args : Packages.LinearOrderOfOrdArgs α := by exact {}) : LinearOrderPackage α :=
-- set_option backward.isDefEq.respectTransparency false in
letI := LinearPreorderPackage.ofOrd α args.toLinearPreorderOfOrdArgs
haveI : LawfulEqOrd α := args.eq_of_compare _ _
letI : Min α := args.min


@@ -597,7 +597,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LE α] [Decidabl
LawfulIteratorLoop (Rxc.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp only [IterM.step_eq,
@@ -636,7 +636,7 @@ The pure function mapping a range iterator of type {name}`IterM` to the next ste
This function is prefixed with {lit}`Monadic` in order to disambiguate it from the version for iterators
of type {name}`Iter`.
-/
@[inline]
@[inline, implicit_reducible]
def Iterator.Monadic.step [UpwardEnumerable α] [LT α] [DecidableLT α]
(it : IterM (α := Rxo.Iterator α) Id α) :
IterStep (IterM (α := Rxo.Iterator α) Id α) α :=
@@ -1113,7 +1113,6 @@ private theorem Iterator.instIteratorLoop.loop_eq_wf [UpwardEnumerable α] [LT
· rw [WellFounded.fix_eq]
simp_all
set_option backward.isDefEq.respectTransparency false in
private theorem Iterator.instIteratorLoop.loopWf_eq [UpwardEnumerable α] [LT α] [DecidableLT α]
[LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α]
{n : Type u → Type w} [Monad n] [LawfulMonad n] (γ : Type u)
@@ -1165,14 +1164,13 @@ termination_by IteratorLoop.WithWF.mk ⟨⟨some next, upperBound⟩⟩ acc (hwf
decreasing_by
simp [IteratorLoop.rel, Monadic.isPlausibleStep_iff, Monadic.step, *]
set_option backward.isDefEq.respectTransparency false in
instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α] [LT α] [DecidableLT α]
[LawfulUpwardEnumerable α] [LawfulUpwardEnumerableLT α]
{n : Type u → Type w} [Monad n] [LawfulMonad n] :
LawfulIteratorLoop (Rxo.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp [IterM.step_eq, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure (liftBind := lift)]
@@ -1637,7 +1635,7 @@ instance Iterator.instLawfulIteratorLoop [UpwardEnumerable α]
LawfulIteratorLoop (Rxi.Iterator α) Id n where
lawful := by
intro lift instLawfulMonadLiftFunction γ it init Pl wf f
simp +instances only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
simp only [IteratorLoop.forIn, IterM.DefaultConsumers.forIn'_eq_wf Pl wf]
rw [IterM.DefaultConsumers.forIn'.wf]
split; rotate_left
· simp [Monadic.step_eq_step, Monadic.step, Internal.LawfulMonadLiftBindFunction.liftBind_pure]


@@ -248,7 +248,16 @@ instance : HasModel Int8 (BitVec 8) where
le_iff_encode_le := by simp [LE.le, Int8.le]
lt_iff_encode_lt := by simp [LT.lt, Int8.lt]
set_option backward.whnf.reducibleClassField false in
private theorem succ?_eq_minValueSealed {x : Int8} :
UpwardEnumerable.succ? x = if x + 1 = minValueSealed then none else some (x + 1) :=
(rfl)
private theorem succMany?_eq_maxValueSealed {i : Int8} :
UpwardEnumerable.succMany? n i =
have := i.minValue_le_toInt
if h : i.toInt + n ≤ maxValueSealed.toInt then some (.ofIntLE _ (by omega) (maxValueSealed_def h)) else none :=
(rfl)
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -256,16 +265,16 @@ theorem instUpwardEnumerable_eq :
apply HasModel.succ?_eq_of_technicalCondition
simp [HasModel.encode, succ?, Int8.toBitVec_inj, toBitVec_minValueSealed_eq_intMinSealed]
· ext
simp +instances [HasModel.succMany?_eq, instUpwardEnumerable, HasModel.encode, HasModel.decode,
simp [HasModel.succMany?_eq, succMany?_eq_maxValueSealed, HasModel.encode, HasModel.decode,
toInt_toBitVec, toBitVec_maxValueSealed_eq_intMaxSealed, ofIntLE_eq_ofInt]
instance : LawfulUpwardEnumerable Int8 := by
simp +instances only [instUpwardEnumerable_eq]
rw [instUpwardEnumerable_eq]
infer_instance
instance : LawfulUpwardEnumerableLE Int8 := by
simp +instances only [instUpwardEnumerable_eq]
rw [instUpwardEnumerable_eq]
infer_instance
public instance instRxcHasSize : Rxc.HasSize Int8 where
@@ -277,7 +286,7 @@ theorem instRxcHasSize_eq :
toInt_toBitVec, HasModel.toNat_toInt_add_one_sub_toInt (Nat.zero_lt_succ _)]
public instance instRxcLawfulHasSize : Rxc.LawfulHasSize Int8 := by
simp +instances only [instUpwardEnumerable_eq, instRxcHasSize_eq]
rw [instUpwardEnumerable_eq, instRxcHasSize_eq]
infer_instance
public instance : Rxc.IsAlwaysFinite Int8 := by exact inferInstance
@@ -294,7 +303,7 @@ theorem instRxiHasSize_eq :
HasModel.encode, HasModel.toNat_two_pow_sub_one_sub_toInt (show 8 > 0 by omega)]
public instance instRxiLawfulHasSize : Rxi.LawfulHasSize Int8 := by
simp +instances only [instUpwardEnumerable_eq, instRxiHasSize_eq]
rw [instUpwardEnumerable_eq, instRxiHasSize_eq]
infer_instance
public instance instRxiIsAlwaysFinite : Rxi.IsAlwaysFinite Int8 := by exact inferInstance
@@ -344,7 +353,6 @@ instance : HasModel Int16 (BitVec 16) where
le_iff_encode_le := by simp [LE.le, Int16.le]
lt_iff_encode_lt := by simp [LT.lt, Int16.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -440,7 +448,6 @@ instance : HasModel Int32 (BitVec 32) where
le_iff_encode_le := by simp [LE.le, Int32.le]
lt_iff_encode_lt := by simp [LT.lt, Int32.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -536,7 +543,6 @@ instance : HasModel Int64 (BitVec 64) where
le_iff_encode_le := by simp [LE.le, Int64.le]
lt_iff_encode_lt := by simp [LT.lt, Int64.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext
@@ -637,7 +643,6 @@ instance : HasModel ISize (BitVec System.Platform.numBits) where
le_iff_encode_le := by simp [LE.le, ISize.le]
lt_iff_encode_lt := by simp [LT.lt, ISize.lt]
set_option backward.whnf.reducibleClassField false in
theorem instUpwardEnumerable_eq :
instUpwardEnumerable = HasModel.instUpwardEnumerable := by
apply UpwardEnumerable.ext


@@ -26,7 +26,7 @@ variable {shape : RangeShape} {α : Type u}
structure SubarrayIterator (α : Type u) where
xs : Subarray α
@[inline, expose]
@[inline, expose, implicit_reducible]
def SubarrayIterator.step :
IterM (α := SubarrayIterator α) Id α → IterStep (IterM (α := SubarrayIterator α) m α) α
| xs =>


@@ -28,7 +28,6 @@ open Std Std.Iterators Std.PRange Std.Slice
namespace SubarrayIterator
set_option backward.isDefEq.respectTransparency false in
theorem step_eq {it : Iter (α := SubarrayIterator α) α} :
it.step = if h : it.1.xs.start < it.1.xs.stop then
haveI := it.1.xs.start_le_stop
@@ -215,7 +214,6 @@ public theorem Array.stop_toSubarray {xs : Array α} {lo hi : Nat} :
(xs.toSubarray lo hi).stop = min hi xs.size := by
simp [toSubarray_eq_min, Subarray.stop]
set_option backward.whnf.reducibleClassField false in
public theorem Subarray.toList_eq {xs : Subarray α} :
xs.toList = (xs.array.extract xs.start xs.stop).toList := by
let aslice := xs


@@ -70,7 +70,6 @@ end ListSlice
namespace List
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rco {xs : List α} {lo hi : Nat} :
xs[lo...hi].toList = (xs.take hi).drop lo := by
@@ -78,9 +77,9 @@ public theorem toList_mkSlice_rco {xs : List α} {lo hi : Nat} :
simp only [Std.Rco.Sliceable.mkSlice, toSlice, ListSlice.toList_eq]
by_cases h : lo < hi
· have : lo ≤ hi := by omega
simp +instances [h, List.take_drop, Nat.add_sub_cancel' _, List.take_eq_take_min]
simp [h, List.take_drop, Nat.add_sub_cancel' _, List.take_eq_take_min]
· have : min hi xs.length ≤ lo := by omega
simp +instances [h, Nat.min_eq_right this]
simp [h, Nat.min_eq_right this]
@[simp, grind =]
public theorem toArray_mkSlice_rco {xs : List α} {lo hi : Nat} :
@@ -111,12 +110,11 @@ public theorem size_mkSlice_rcc {xs : List α} {lo hi : Nat} :
xs[lo...=hi].size = min (hi + 1) xs.length - lo := by
simp [← length_toList_eq_size]
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rci {xs : List α} {lo : Nat} :
xs[lo...*].toList = xs.drop lo := by
rw [List.drop_eq_drop_min]
simp +instances [ListSlice.toList_eq, Std.Rci.Sliceable.mkSlice, List.toUnboundedSlice]
simp [ListSlice.toList_eq, Std.Rci.Sliceable.mkSlice, List.toUnboundedSlice]
@[simp, grind =]
public theorem toArray_mkSlice_rci {xs : List α} {lo : Nat} :
@@ -290,11 +288,11 @@ section ListSubslices
namespace ListSlice
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rco {xs : ListSlice α} {lo hi : Nat} :
xs[lo...hi].toList = (xs.toList.take hi).drop lo := by
simp +instances only [instSliceableListSliceNat_1, List.toList_mkSlice_rco, ListSlice.toList_eq (xs := xs)]
rw [instSliceableListSliceNat_1]
simp only [List.toList_mkSlice_rco, ListSlice.toList_eq (xs := xs)]
obtain ⟨xs, stop⟩ := xs
cases stop
· simp
@@ -329,13 +327,13 @@ public theorem size_mkSlice_rcc {xs : ListSlice α} {lo hi : Nat} :
xs[lo...=hi].size = min (hi + 1) xs.size - lo := by
simp [← length_toList_eq_size]
set_option backward.whnf.reducibleClassField false in
@[simp, grind =]
public theorem toList_mkSlice_rci {xs : ListSlice α} {lo : Nat} :
xs[lo...*].toList = xs.toList.drop lo := by
simp +instances only [instSliceableListSliceNat_2, ListSlice.toList_eq (xs := xs)]
rw [instSliceableListSliceNat_2]
simp only [ListSlice.toList_eq (xs := xs)]
obtain ⟨xs, stop⟩ := xs
simp +instances only
simp only
split <;> simp
@[simp, grind =]


@@ -55,9 +55,11 @@ end String
namespace String.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_posof"]
opaque posOf (s : String) (c : Char) : Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_offsetofpos"]
opaque offsetOfPos (s : String) (pos : Pos.Raw) : Nat
@@ -67,6 +69,7 @@ opaque extract : (@& String) → (@& Pos.Raw) → (@& Pos.Raw) → String
@[extern "lean_string_length"]
opaque length : (@& String) → Nat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pushn"]
opaque pushn (s : String) (c : Char) (n : Nat) : String
@@ -76,45 +79,57 @@ opaque append : String → (@& String) → String
@[extern "lean_string_utf8_next"]
opaque next (s : @& String) (p : @& Pos.Raw) : Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_isempty"]
opaque isEmpty (s : String) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_foldl"]
opaque foldl (f : String → Char → String) (init : String) (s : String) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_isprefixof"]
opaque isPrefixOf (p : String) (s : String) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_any"]
opaque any (s : String) (p : Char → Bool) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_contains"]
opaque contains (s : String) (c : Char) : Bool
@[extern "lean_string_utf8_get"]
opaque get (s : @& String) (p : @& Pos.Raw) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_capitalize"]
opaque capitalize (s : String) : String
@[extern "lean_string_utf8_at_end"]
opaque atEnd : (@& String) → (@& Pos.Raw) → Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_nextwhile"]
opaque nextWhile (s : String) (p : Char → Bool) (i : String.Pos.Raw) : String.Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_trim"]
opaque trim (s : String) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_intercalate"]
opaque intercalate (s : String) : List String → String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_front"]
opaque front (s : String) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_drop"]
opaque drop (s : String) (n : Nat) : String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_dropright"]
opaque dropRight (s : String) (n : Nat) : String
@@ -141,33 +156,43 @@ def List.asString (s : List Char) : String :=
namespace Substring.Raw.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_tostring"]
opaque toString : Substring.Raw → String
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_drop"]
opaque drop : Substring.Raw → Nat → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_front"]
opaque front (s : Substring.Raw) : Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_takewhile"]
opaque takeWhile : Substring.Raw → (Char → Bool) → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_extract"]
opaque extract : Substring.Raw → String.Pos.Raw → String.Pos.Raw → Substring.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_all"]
opaque all (s : Substring.Raw) (p : Char → Bool) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_beq"]
opaque beq (ss1 ss2 : Substring.Raw) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_isempty"]
opaque isEmpty (ss : Substring.Raw) : Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_get"]
opaque get : Substring.Raw → String.Pos.Raw → Char
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_substring_prev"]
opaque prev : Substring.Raw → String.Pos.Raw → String.Pos.Raw
@@ -175,9 +200,11 @@ end Substring.Raw.Internal
namespace String.Pos.Raw.Internal
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pos_sub"]
opaque sub : String.Pos.Raw → String.Pos.Raw → String.Pos.Raw
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_string_pos_min"]
opaque min (p₁ p₂ : Pos.Raw) : Pos.Raw


@@ -56,6 +56,7 @@ private def syntaxToExternAttrData (stx : Syntax) : AttrM ExternAttrData := do
return { entries := entries.toList }
-- Forward declaration
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_add_extern"]
opaque addExtern (declName : Name) (externAttrData : ExternAttrData) : CoreM Unit


@@ -623,6 +623,7 @@ private def evalSuggestSimpAllTrace : TryTactic := fun tac => do
| _ => throwUnsupportedSyntax
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_eval_suggest_tactic"] -- forward definition to avoid mutual block
opaque evalSuggest : TryTactic


@@ -752,6 +752,7 @@ private def lakeAdd (env : Environment) (cinfo : ConstantInfo) : Environment :=
}
-- forward reference due to too many cyclic dependencies
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_reserved_name"]
private opaque isReservedName (env : Environment) (name : Name) : Bool
@@ -1768,6 +1769,7 @@ private def looksLikeOldCodegenName : Name → Bool
| .str _ s => s.startsWith "_cstage" || s.startsWith "_spec_" || s.startsWith "_elambda"
| _ => false
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_ir_extra_const_names"]
private opaque getIRExtraConstNames (env : Environment) (level : OLeanLevel) (includeDecls := false) : Array Name
@@ -1804,6 +1806,7 @@ def mkModuleData (env : Environment) (level : OLeanLevel := .private) : IO Modul
constNames, constants, entries
}
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_ir_export_entries"]
private opaque exportIREntries (env : Environment) : Array (Name × Array EnvExtensionEntry)
@@ -1862,6 +1865,7 @@ private def setImportedEntries (states : Array EnvExtensionState) (mods : Array
{ s with importedEntries := s.importedEntries.set! modIdx entries }
return states
set_option compiler.ignoreBorrowAnnotation true in
/--
"Forward declaration" needed for updating the attribute table with user-defined attributes.
User-defined attributes are declared using the `initialize` command. The `initialize` command is just syntax sugar for the `init` attribute.
@@ -1872,9 +1876,12 @@ private def setImportedEntries (states : Array EnvExtensionState) (mods : Array
Later, we set this method with code that adds the user-defined attributes that were imported after we initialized `attributeExtension`.
-/
@[extern "lean_update_env_attributes"] opaque updateEnvAttributes : Environment → IO Environment
set_option compiler.ignoreBorrowAnnotation true in
/-- "Forward declaration" for retrieving the number of builtin attributes. -/
@[extern "lean_get_num_attributes"] opaque getNumBuiltinAttributes : IO Nat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_run_init_attrs"]
private opaque runInitAttrs (env : Environment) (opts : Options) : IO Unit
@@ -2399,6 +2406,7 @@ def displayStats (env : Environment) : IO Unit := do
@[extern "lean_eval_const"]
private unsafe opaque evalConstCore (α) (env : @& Environment) (opts : @& Options) (constName : @& Name) : Except String α
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_eval_check_meta"]
private opaque evalCheckMeta (env : Environment) (constName : Name) : Except String Unit


@@ -760,6 +760,7 @@ have to hard-code the true arity of these definitions here, and make sure the C
We have used another hack based on `IO.Ref`s in the past, it was safer but less efficient.
-/
set_option compiler.ignoreBorrowAnnotation true in
/--
Reduces an expression to its *weak head normal form*.
This is when the "head" of the top-level expression has been fully reduced.
@@ -768,6 +769,7 @@ The result may contain subexpressions that have not been reduced.
See `Lean.Meta.whnfImp` for the implementation.
-/
@[extern "lean_whnf"] opaque whnf : Expr → MetaM Expr
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns the inferred type of the given expression. Assumes the expression is type-correct.
@@ -822,8 +824,11 @@ def e3 : Expr := .app (.const ``Nat.zero []) (.const ``Nat.zero [])
See `Lean.Meta.inferTypeImp` for the implementation of `inferType`.
-/
@[extern "lean_infer_type"] opaque inferType : Expr → MetaM Expr
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_expr_def_eq"] opaque isExprDefEqAux : Expr → Expr → MetaM Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_is_level_def_eq"] opaque isLevelDefEqAux : Level → Level → MetaM Bool
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_synth_pending"] protected opaque synthPending : MVarId → MetaM Bool
def whnfForall (e : Expr) : MetaM Expr := do
@@ -2498,6 +2503,7 @@ def isDefEqD (t s : Expr) : MetaM Bool :=
def isDefEqI (t s : Expr) : MetaM Bool :=
withReducibleAndInstances <| isDefEq t s
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns `true` if `mvarId := val` was successfully assigned.
This method uses the same assignment validation performed by `isDefEq`, but it does not check whether the types match.


@@ -46,12 +46,14 @@ Forward definition of `getEquationsForImpl`.
We want to use `getEquationsFor` in the simplifier, but
`getEquationsFor` depends on `mkEquationsFor` which uses the simplifier.
-/
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_match_equations_for"]
opaque getEquationsFor (matchDeclName : Name) : MetaM MatchEqns
/-
Forward definition of `genMatchCongrEqnsImpl`.
-/
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_get_congr_match_equations_for"]
opaque genMatchCongrEqns (matchDeclName : Name) : MetaM (Array Name)


@@ -0,0 +1,111 @@
/-
Copyright (c) 2026 Lean FRO, LLC. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.
Authors: Joachim Breitner
-/
module
prelude
public import Lean.Meta.AppBuilder
public section
namespace Lean.Meta
/--
Builds a proof of `String.ofList l₁ ≠ String.ofList l₂` for two distinct string literals.
The proof avoids both `String.decEq` (expensive UTF-8 byte array comparison) and full
`List Char` comparison. It reduces to `List Char` via `String.ofList_injective`, then
proves the lists differ using one of two strategies depending on whether the strings
share a common prefix of the shorter string's full length:
- **Different characters at position `i`**: Uses
`congrArg (fun l => (List.get!Internal l i).toNat)` to project down to `Nat`, then
`Nat.ne_of_beq_eq_false rfl` for a single `Nat.beq` comparison. This avoids `Decidable`
instances entirely — the kernel only evaluates `Nat.beq` on two concrete natural numbers.
- **One string is a prefix of the other**: Uses `congrArg (List.drop n ·)` where `n` is the
shorter string's length, then `List.cons_ne_nil` to prove the non-empty tail differs from `[]`.
This avoids `decide` entirely since `cons_ne_nil` is a definitional proof.
-/
def mkStringLitNeProof (s₁ s₂ : String) : MetaM Expr := do
let l₁ := s₁.toList
let l₂ := s₂.toList
let l₁Expr := toExpr l₁
let l₂Expr := toExpr l₂
let charConst := mkConst ``Char
let listCharTy := mkApp (mkConst ``List [.zero]) charConst
-- Find the first differing character index
let minLen := min l₁.length l₂.length
let diffIdx := Id.run do
for i in [:minLen] do
if l₁[i]! ≠ l₂[i]! then return i
return minLen
-- Build the list inequality proof, dispatching on whether it's a character
-- difference or a prefix/length difference
let listEq := mkApp3 (mkConst ``Eq [.succ .zero]) listCharTy l₁Expr l₂Expr
let listNeProof ←
if diffIdx < minLen then
-- Both lists have a character at diffIdx: project to Nat via get!Internal + toNat,
-- then use Nat.ne_of_beq_eq_false rfl (avoids Decidable instances entirely)
let inhabCharExpr := mkConst ``Char.instInhabited
let natConst := mkConst ``Nat
let iExpr := toExpr diffIdx
-- f = fun l => (List.get!Internal l i).toNat
let projFn := mkLambda `l .default listCharTy
(mkApp (mkConst ``Char.toNat)
(mkApp4 (mkConst ``List.get!Internal [.zero]) charConst inhabCharExpr (mkBVar 0) iExpr))
let mkGetToNat (lExpr : Expr) : Expr :=
mkApp (mkConst ``Char.toNat)
(mkApp4 (mkConst ``List.get!Internal [.zero]) charConst inhabCharExpr lExpr iExpr)
let projL1 := mkGetToNat l₁Expr
let projL2 := mkGetToNat l₂Expr
let projEq := mkApp3 (mkConst ``Eq [.succ .zero]) natConst projL1 projL2
let congrArgFn := mkApp5 (mkConst ``congrArg [.succ .zero, .succ .zero])
listCharTy natConst l₁Expr l₂Expr projFn
-- Nat.ne_of_beq_eq_false rfl : n₁ ≠ n₂ (kernel evaluates Nat.beq on two literals)
let projNeProof := mkApp3 (mkConst ``Nat.ne_of_beq_eq_false)
projL1 projL2 (← mkEqRefl (mkConst ``false))
pure <| mkApp4 (mkConst ``mt) listEq projEq congrArgFn projNeProof
else
-- One list is a prefix of the other: use drop + cons_ne_nil
let nExpr := toExpr minLen
let dropFn := mkLambda `l .default listCharTy
(mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr (mkBVar 0))
let dropL1 := mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr l₁Expr
let dropL2 := mkApp3 (mkConst ``List.drop [.zero]) charConst nExpr l₂Expr
let dropEq := mkApp3 (mkConst ``Eq [.succ .zero]) listCharTy dropL1 dropL2
let congrArgFn := mkApp5 (mkConst ``congrArg [.succ .zero, .succ .zero])
listCharTy listCharTy l₁Expr l₂Expr dropFn
-- The longer list's drop has a head element; the shorter list's drop is []
let (hdChar, tlList) :=
if l₁.length < l₂.length then
let dropped := l₂.drop minLen
(dropped.head!, dropped.tail!)
else
let dropped := l₁.drop minLen
(dropped.head!, dropped.tail!)
let hdExpr := toExpr hdChar
let tlExpr := toExpr tlList
-- cons_ne_nil hd tl : hd :: tl ≠ []
let consNeNil := mkApp3 (mkConst ``List.cons_ne_nil [.zero]) charConst hdExpr tlExpr
let dropNeProof :=
if l₁.length < l₂.length then
-- l₁ is shorter: drop l₁ = [], drop l₂ = hd :: tl, need [] ≠ hd :: tl
mkApp4 (mkConst ``Ne.symm [.succ .zero]) listCharTy
(mkApp3 (mkConst ``List.cons [.zero]) charConst hdExpr tlExpr)
(mkApp (mkConst ``List.nil [.zero]) charConst)
consNeNil
else
-- l₂ is shorter: drop l₁ = hd :: tl, drop l₂ = [], need hd :: tl ≠ []
consNeNil
pure <| mkApp4 (mkConst ``mt) listEq dropEq congrArgFn dropNeProof
-- strNeProof : String.ofList l₁ ≠ String.ofList l₂ via mt ofList_injective listNeProof
let strType := mkConst ``String
let strEq := mkApp3 (mkConst ``Eq [.succ .zero]) strType
(mkApp (mkConst ``String.ofList) l₁Expr) (mkApp (mkConst ``String.ofList) l₂Expr)
return mkApp4 (mkConst ``mt) strEq listEq
(mkApp2 (mkConst ``String.ofList_injective) l₁Expr l₂Expr) listNeProof
end Lean.Meta
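The two proof shapes described in the docstring can be written out by hand for concrete literals. The following is an illustrative sketch only, using the public `List.get!` in place of the internal `List.get!Internal` the patch targets:

```lean
-- Strategy 1: the lists differ at index 1. Projecting through `Char.toNat`
-- means the kernel only evaluates `Nat.beq` on two numeric literals.
example : ('a' :: 'b' :: []) ≠ ('a' :: 'c' :: []) :=
  mt (congrArg fun l => (l.get! 1).toNat) (Nat.ne_of_beq_eq_false rfl)

-- Strategy 2: the first list is a proper prefix of the second. Dropping the
-- shared prefix reduces the goal to `[] ≠ 'b' :: []`, settled by `cons_ne_nil`.
example : ('a' :: []) ≠ ('a' :: 'b' :: []) :=
  mt (congrArg (List.drop 1)) (List.cons_ne_nil 'b' []).symm
```

Both sketches go through `mt (congrArg f)`: any equality of the lists would transport along `f`, so refuting the projected equality refutes the original one.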


@@ -30,12 +30,12 @@ then returns `f`. Otherwise, returns `e`.
Returns the original expression when not reducible to enable pointer equality checks.
-/
public def etaReduce (e : Expr) : Expr :=
go e 0
if e.isLambda then go e 0 else e
where
go (body : Expr) (n : Nat) : Expr :=
match body with
| .lam _ _ b _ => go b (n+1)
| _ => if n == 0 then e else etaReduceAux body n 0 e
| _ => etaReduceAux body n 0 e
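For reference, the `Expr`-level rewrite performed by `etaReduce` is justified by Lean's definitional eta law at the term level (illustrative example, not part of the patch):

```lean
-- `fun x => f x` and `f` are definitionally equal, so replacing the former
-- with the latter in an `Expr` preserves the term up to defeq.
example (f : Nat → Nat) : (fun x => f x) = f := rfl
```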
/-- Returns `true` if `e` can be eta-reduced. Uses pointer equality for efficiency. -/
public def isEtaReducible (e : Expr) : Bool :=


@@ -17,7 +17,7 @@ import Lean.Meta.Sym.ProofInstInfo
import Lean.Meta.Sym.AlphaShareBuilder
import Lean.Meta.Sym.Offset
import Lean.Meta.Sym.Eta
import Lean.Meta.AbstractMVars
import Lean.Meta.Sym.Util
import Init.Data.List.MapIdx
import Init.Data.Nat.Linear
import Std.Do.Triple.Basic
@@ -31,7 +31,9 @@ framework (`Sym`). The design prioritizes performance by using a two-phase appro
# Phase 1 (Syntactic Matching)
- Patterns use de Bruijn indices for expression variables and renamed level params (`_uvar.0`, `_uvar.1`, ...) for universe variables
- Matching is purely structural after reducible definitions are unfolded during preprocessing
- Universe levels treat `max` and `imax` as uninterpreted functions (no AC reasoning)
- Universe levels are eagerly normalized in patterns and normalized on the target side during matching
- `tryApproxMaxMax` handles `max` commutativity when one argument matches structurally. It is relevant
for constraints such as `max ?u a =?= max a b`
- Binders and term metavariables are deferred to Phase 2
# Phase 2 (Pending Constraints)
@@ -107,6 +109,7 @@ def preprocessDeclPattern (declName : Name) : MetaM (List Name × Expr) := do
let us := levelParams.map mkLevelParam
let type ← instantiateTypeLevelParams info.toConstantVal us
let type ← preprocessType type
let type ← normalizeLevels type
return (levelParams, type)
def preprocessExprPattern (e : Expr) (levelParams₀ : List Name) : MetaM (List Name × Expr) := do
@@ -115,6 +118,7 @@ def preprocessExprPattern (e : Expr) (levelParams₀ : List Name) : MetaM (List
let us := levelParams.map mkLevelParam
let type := type.instantiateLevelParams levelParams₀ us
let type ← preprocessType type
let type ← normalizeLevels type
return (levelParams, type)
/--
@@ -261,45 +265,6 @@ public def mkEqPatternFromDecl (declName : Name) : MetaM (Pattern × Expr) := do
let_expr Eq _ lhs rhs := type | throwError "conclusion is not an equality{indentExpr type}"
return (lhs, rhs)
/--
Like `mkPatternCore` but takes a lambda expression instead of a forall type.
Uses `lambdaBoundedTelescope` to open binders and detect instance/proof arguments.
-/
def mkPatternCoreFromLambda (lam : Expr) (levelParams : List Name)
(varTypes : Array Expr) (pattern : Expr) : MetaM Pattern := do
let fnInfos ← mkProofInstInfoMapFor pattern
let checkTypeMask := mkCheckTypeMask pattern varTypes.size
let checkTypeMask? := if checkTypeMask.all (· == false) then none else some checkTypeMask
let varInfos? ← lambdaBoundedTelescope lam varTypes.size fun xs _ =>
mkProofInstArgInfo? xs
return { levelParams, varTypes, pattern, fnInfos, varInfos?, checkTypeMask? }
def mkPatternFromLambda (levelParams : List Name) (lam : Expr) : MetaM Pattern := do
let rec go (pattern : Expr) (varTypes : Array Expr) : MetaM Pattern := do
if let .lam _ d b _ := pattern then
return (← go b (varTypes.push d))
let pattern ← preprocessType pattern
mkPatternCoreFromLambda lam levelParams varTypes pattern
go lam #[]
/--
Creates a `Pattern` from an expression containing metavariables.
Metavariables in `e` become pattern variables (wildcards). For example,
`Nat.succ ?m` produces a pattern matching `Nat.succ _` with discrimination
tree keys `[Nat.succ, *]`.
This is used for user-registered simproc patterns where the user provides
an expression with underscores (elaborated as metavariables) to specify
what the simproc should match.
-/
public def mkSimprocPatternFromExpr (e : Expr) : MetaM Pattern := do
let result ← abstractMVars e
let levelParams := result.paramNames.mapIdx fun i _ => Name.num uvarPrefix i
let us := levelParams.toList.map mkLevelParam
let expr := result.expr.instantiateLevelParamsArray result.paramNames us.toArray
mkPatternFromLambda levelParams.toList expr
structure UnifyM.Context where
pattern : Pattern
unify : Bool := true
@@ -365,35 +330,42 @@ def assignLevel (uidx : Nat) (u : Level) : UnifyM Bool := do
modify fun s => { s with uAssignment := s.uAssignment.set! uidx (some u) }
return true
def processLevel (u : Level) (v : Level) : UnifyM Bool := do
match u, v with
| .zero, .zero => return true
| .succ u, .succ v => processLevel u v
| .zero, .succ _ => return false
| .succ _, .zero => return false
| .zero, .max v₁ v₂ => processLevel .zero v₁ <&&> processLevel .zero v₂
| .max u₁ u₂, .zero => processLevel u₁ .zero <&&> processLevel u₂ .zero
| .zero, .imax _ v => processLevel .zero v
| .imax _ u, .zero => processLevel u .zero
| .max u₁ u₂, .max v₁ v₂ => processLevel u₁ v₁ <&&> processLevel u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => processLevel u₁ v₁ <&&> processLevel u₂ v₂
| .param uName, _ =>
if let some uidx := isUVar? uName then
assignLevel uidx v
else if u == v then
return true
else if v.isMVar && (← read).unify then
pushLevelPending u v
return true
else
return false
| .mvar _, _ | _, .mvar _ =>
if (← read).unify then
pushLevelPending u v
return true
else
return false
| _, _ => return false
def processLevel (u : Level) (v : Level) : UnifyM Bool :=
go u v.normalize
where
go (u : Level) (v : Level) : UnifyM Bool := do
match u, v with
| .zero, .zero => return true
| .succ u, .succ v => go u v
| .zero, .succ _ => return false
| .succ _, .zero => return false
| .zero, .max v₁ v₂ => go .zero v₁ <&&> go .zero v₂
| .max u₁ u₂, .zero => go u₁ .zero <&&> go u₂ .zero
| .zero, .imax _ v => go .zero v
| .imax _ u, .zero => go u .zero
| .max u₁ u₂, .max v₁ v₂ =>
-- tryApproxMaxMax: find a shared concrete arg between both sides
if u₂ == v₁ then go u₁ v₂
else if u₁ == v₂ then go u₂ v₁
else go u₁ v₁ <&&> go u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => go u₁ v₁ <&&> go u₂ v₂
| .param uName, _ =>
if let some uidx := isUVar? uName then
assignLevel uidx v
else if u == v then
return true
else if v.isMVar && (← read).unify then
pushLevelPending u v
return true
else
return false
| .mvar _, _ | _, .mvar _ =>
if (← read).unify then
pushLevelPending u v
return true
else
return false
| _, _ => return false
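The crosswise `max` case above mirrors the kernel's handling of commuted `max` levels. A minimal illustration with ordinary universe variables (not part of this diff): a purely structural pass on `max u v =?= max v u` would compare `u =?= v` and `v =?= u` and fail, while the crosswise check spots the shared argument and compares the remaining pair instead.

```lean
universe u v

-- Commuted `max`es are definitionally equal, so this elaborates by `rfl`;
-- the crosswise heuristic lets a structural matcher accept it without
-- falling back to full level normalization.
example : Sort (max u v) = Sort (max v u) := rfl
```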
def processLevels (us : List Level) (vs : List Level) : UnifyM Bool := do
match us, vs with
@@ -542,7 +514,7 @@ def tryAssignLevelMVar (u : Level) (v : Level) : MetaM Bool := do
/--
Structural definitional equality for universe levels.
Treats `max` and `imax` as uninterpreted functions (no AC reasoning).
Uses `tryApproxMaxMax` to handle `max` commutativity when one argument matches structurally.
Attempts metavariable assignment in both directions if structural matching fails.
-/
def isLevelDefEqS (u : Level) (v : Level) : MetaM Bool := do
@@ -556,7 +528,10 @@ def isLevelDefEqS (u : Level) (v : Level) : MetaM Bool := do
| .max u₁ u₂, .zero => isLevelDefEqS u₁ .zero <&&> isLevelDefEqS u₂ .zero
| .zero, .imax _ v => isLevelDefEqS .zero v
| .imax _ u, .zero => isLevelDefEqS u .zero
| .max u₁ u₂, .max v₁ v₂ => isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| .max u₁ u₂, .max v₁ v₂ =>
if u₂ == v₁ then isLevelDefEqS u₁ v₂
else if u₁ == v₂ then isLevelDefEqS u₂ v₁
else isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| .imax u₁ u₂, .imax v₁ v₂ => isLevelDefEqS u₁ v₁ <&&> isLevelDefEqS u₂ v₂
| _, _ => tryAssignLevelMVar u v <||> tryAssignLevelMVar v u
@@ -598,6 +573,7 @@ structure DefEqM.Context where
abbrev DefEqM := ReaderT DefEqM.Context SymM
set_option compiler.ignoreBorrowAnnotation true in
/--
Structural definitional equality. It is much cheaper than `isDefEq`.

View File

@@ -8,6 +8,7 @@ prelude
public import Lean.Meta.Sym.Pattern
public import Lean.Meta.DiscrTree.Basic
import Lean.Meta.Sym.Offset
import Lean.Meta.Sym.Eta
import Init.Omega
namespace Lean.Meta.Sym
open DiscrTree
@@ -154,7 +155,7 @@ partial def getMatchLoop (todo : Array Expr) (c : Trie α) (result : Array α) :
else if h : csize = 0 then
result
else
let e := todo.back!
let e := etaReduce todo.back!
let todo := todo.pop
let first := cs[0] /- Recall that `Key.star` is the minimal key -/
if csize = 1 then
@@ -184,6 +185,7 @@ public def getMatch (d : DiscrTree α) (e : Expr) : Array α :=
let result := match d.root.find? .star with
| none => .mkEmpty initCapacity
| some (.node vs _) => vs
let e := etaReduce e
match d.root.find? (getKey e) with
| none => result
| some c => getMatchLoop (pushArgsTodo #[] e) c result
@@ -196,6 +198,7 @@ This is useful for rewriting: if a pattern matches `f x` but `e` is `f x y z`, w
still apply the rewrite and return `(value, 2)` indicating 2 extra arguments.
-/
public partial def getMatchWithExtra (d : DiscrTree α) (e : Expr) : Array (α × Nat) :=
let e := etaReduce e
let result := getMatch d e
let result := result.map (·, 0)
if !e.isApp then

View File

@@ -8,6 +8,7 @@ prelude
public import Lean.Meta.Sym.Simp.SimpM
import Init.Sym.Lemmas
import Lean.Meta.Sym.LitValues
import Lean.Meta.StringLitProof
namespace Lean.Meta.Sym.Simp
/-!
@@ -376,6 +377,25 @@ def evalFinPred (n : Expr) (trueThm falseThm : Expr) (op : {n : Nat} → Fin n
open Lean.Sym
/--
Evaluates `@Eq String a b` for string literal arguments, producing kernel-efficient proofs.
When equal, uses `eq_self` (no kernel evaluation needed). When different, uses
`mkStringLitNeProof` which finds the first differing character position and proves
inequality via `congrArg (List.get?Internal · i)`.
-/
private def evalStringEq (a b : Expr) : SimpM Result := do
let some va := getStringValue? a | return .rfl
let some vb := getStringValue? b | return .rfl
if va = vb then
let e ← getTrueExpr
return .step e (mkApp2 (mkConst ``eq_self [.succ .zero]) (mkConst ``String) a) (done := true)
else
let neProof ← mkStringLitNeProof va vb
let proof := mkApp2 (mkConst ``eq_false)
(mkApp3 (mkConst ``Eq [.succ .zero]) (mkConst ``String) a b) neProof
let e ← getFalseExpr
return .step e proof (done := true)
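The inequality direction of `evalStringEq` rests on projecting out the first differing position with `congrArg`. A hedged analogue of that idea on lists (the index `2` is just an illustrative choice): the component fact produced by the projection is tiny, unlike deciding equality of the whole structure in the kernel.

```lean
-- From an assumed equality of two literals, project out an index where
-- they differ; refuting the resulting `some 3 = some 4` is cheap.
example (h : [1, 2, 3] = [1, 2, 4]) : False :=
  absurd (congrArg (fun l => l[2]?) h) (by decide)
```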
def evalLT (α : Expr) (a b : Expr) : SimpM Result :=
match_expr α with
| Nat => evalBinPred getNatValue? (mkConst ``Nat.lt_eq_true) (mkConst ``Nat.lt_eq_false) (. < .) a b
@@ -434,7 +454,7 @@ def evalEq (α : Expr) (a b : Expr) : SimpM Result :=
| Fin n => evalFinPred n (mkConst ``Fin.eq_eq_true) (mkConst ``Fin.eq_eq_false) (. = .) a b
| BitVec n => evalBitVecPred n (mkConst ``BitVec.eq_eq_true) (mkConst ``BitVec.eq_eq_false) (. = .) a b
| Char => evalBinPred getCharValue? (mkConst ``Char.eq_eq_true) (mkConst ``Char.eq_eq_false) (. = .) a b
| String => evalBinPred getStringValue? (mkConst ``String.eq_eq_true) (mkConst ``String.eq_eq_false) (. = .) a b
| String => evalStringEq a b
| _ => return .rfl
def evalDvd (α : Expr) (a b : Expr) : SimpM Result :=

View File

@@ -227,6 +227,7 @@ def SimpM.run' (x : SimpM α) (methods : Methods := {}) (config : Config := {})
let initialLCtxSize := (← getLCtx).decls.size
x methods.toMethodsRef { initialLCtxSize, config } |>.run' {}
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_sym_simp"] -- Forward declaration
opaque simp : Simproc

View File

@@ -109,4 +109,23 @@ where
public def _root_.Lean.MVarId.checkMaxShared (mvarId : MVarId) (msg := "") : SymM Unit := do
(← mvarId.getDecl).type.checkMaxShared msg
/-- Quick filter for checking whether we can skip `normalizeLevels`. -/
def levelsAlreadyNormalized (e : Expr) : Bool :=
Option.isNone <| e.find? fun
| .const _ us => us.any (! ·.isAlreadyNormalizedCheap)
| .sort u => !u.isAlreadyNormalizedCheap
| _ => false
/--
Normalizes universe levels in constants and sorts.
-/
public def normalizeLevels (e : Expr) : CoreM Expr := do
if levelsAlreadyNormalized e then return e
let pre (e : Expr) := do
match e with
| .sort u => return .done <| e.updateSort! u.normalize
| .const _ us => return .done <| e.updateConst! (us.map Level.normalize)
| _ => return .continue
Core.transform e (pre := pre)
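`normalizeLevels` delegates the per-level work to `Level.normalize` from core. As a rough sketch of what that buys (standalone snippet, not part of this diff): canonical forms make levels comparable syntactically.

```lean
import Lean

open Lean in
-- `Level.normalize` rewrites a level into a canonical form, so that
-- normalized levels can be compared structurally instead of via a full
-- definitional-equality check.
#eval toString (Level.max (.param `v) (.param `u)).normalize
```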
end Lean.Meta.Sym

View File

@@ -229,6 +229,7 @@ private inductive MulEqProof where
| mulVar (k : Int) (a : Expr) (h : Expr)
| none
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_cutsat_eq_cnstr_to_proof"] -- forward definition
private opaque EqCnstr.toExprProof (c' : EqCnstr) : ProofM Expr

View File

@@ -41,6 +41,7 @@ def inconsistent : GoalM Bool := do
if (← isInconsistent) then return true
return (← get').conflict?.isSome
set_option compiler.ignoreBorrowAnnotation true in
/-- Creates a new variable in the cutsat module. -/
@[extern "lean_grind_cutsat_mk_var"] -- forward definition
opaque mkVar (e : Expr) : GoalM Var
@@ -62,6 +63,7 @@ def isIntTerm (e : Expr) : GoalM Bool :=
def eliminated (x : Var) : GoalM Bool :=
return (← get').elimEqs[x]!.isSome
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_cutsat_assert_eq"] -- forward definition
opaque EqCnstr.assert (c : EqCnstr) : GoalM Unit
@@ -114,6 +116,7 @@ def DiseqCnstr.throwUnexpected (c : DiseqCnstr) : GoalM α := do
def DiseqCnstr.denoteExpr (c : DiseqCnstr) : GoalM Expr := do
return mkNot (mkIntEq (← c.p.denoteExpr') (mkIntLit 0))
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_cutsat_assert_le"] -- forward definition
opaque LeCnstr.assert (c : LeCnstr) : GoalM Unit

View File

@@ -12,6 +12,7 @@ import Lean.Meta.IntInstTesters
public section
namespace Lean.Meta.Grind.Arith.Cutsat
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_cutsat_propagate_nonlinear"]
opaque propagateNonlinearTerm (y : Var) (x : Var) : GoalM Bool

View File

@@ -109,7 +109,7 @@ where
let e ← eraseIrrelevantMData e
/- We must fold kernel projections like it is done in the preprocessor. -/
let e ← foldProjs e
normalizeLevels e
Sym.normalizeLevels e
private def markNestedProof (e : Expr) : M Expr := do
let prop ← inferType e

View File

@@ -138,7 +138,7 @@ example : f b = f c → a ≤ f a → f (f a) ≤ f (f (f a)) := by
```
-/
@[builtin_command_parser] def grindPattern := leading_parser
Term.attrKind >> "grind_pattern " >> optional ("[" >> ident >> "]") >> ident >> darrow >> sepBy1 termParser "," >> optional grindPatternCnstrs
Term.attrKind >> "grind_pattern " >> optional ("[" >> ident >> "]") >> ident >> darrow >> sepBy1 termParser ", " >> optional grindPatternCnstrs
@[builtin_command_parser] def initGrindNorm := leading_parser
"init_grind_norm " >> many ident >> "| " >> many ident

View File

@@ -63,7 +63,7 @@ def preprocessImpl (e : Expr) : GoalM Simp.Result := do
let e' ← markNestedSubsingletons e'
let e' ← eraseIrrelevantMData e'
let e' ← foldProjs e'
let e' ← normalizeLevels e'
let e' ← Sym.normalizeLevels e'
let r' ← eraseSimpMatchDiscrsOnly e'
let r ← r.mkEqTrans r'
let e' := r'.expr
@@ -98,6 +98,6 @@ but ensures assumptions made by `grind` are satisfied.
-/
def preprocessLight (e : Expr) : GoalM Expr := do
let e ← instantiateMVars e
shareCommon (← canon (← normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← Sym.unfoldReducible e))))))
shareCommon (← canon (← Sym.normalizeLevels (← foldProjs (← eraseIrrelevantMData (← markNestedSubsingletons (← Sym.unfoldReducible e))))))
end Lean.Meta.Grind

View File

@@ -1359,6 +1359,7 @@ partial def getCongrRoot (e : Expr) : GoalM Expr := do
def isInconsistent : GoalM Bool :=
return (← get).inconsistent
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns a proof that `a = b`.
It assumes `a` and `b` are in the same equivalence class, and have the same type.
@@ -1367,6 +1368,7 @@ It assumes `a` and `b` are in the same equivalence class, and have the same type
@[extern "lean_grind_mk_eq_proof"]
opaque mkEqProof (a b : Expr) : GoalM Expr
set_option compiler.ignoreBorrowAnnotation true in
/--
Returns a proof that `a ≍ b`.
It assumes `a` and `b` are in the same equivalence class.
@@ -1376,14 +1378,17 @@ It assumes `a` and `b` are in the same equivalence class.
opaque mkHEqProof (a b : Expr) : GoalM Expr
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_process_new_facts"]
opaque processNewFacts : GoalM Unit
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_internalize"]
opaque internalize (e : Expr) (generation : Nat) (parent? : Option Expr := none) : GoalM Unit
-- Forward definition
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_grind_preprocess"]
opaque preprocess : Expr → GoalM Simp.Result
@@ -1729,6 +1734,7 @@ def withoutModifyingState (x : GoalM α) : GoalM α := do
finally
set saved
set_option compiler.ignoreBorrowAnnotation true in
/-- Canonicalizes nested types, type formers, and instances in `e`. -/
@[extern "lean_grind_canon"] -- Forward definition
opaque canon (e : Expr) : GoalM Expr

View File

@@ -153,25 +153,7 @@ def foldProjs (e : Expr) : MetaM Expr := do
return .done e
Meta.transform e (post := post)
/-- Quick filter for checking whether we can skip `normalizeLevels`. -/
private def levelsAlreadyNormalized (e : Expr) : Bool :=
Option.isNone <| e.find? fun
| .const _ us => us.any (! ·.isAlreadyNormalizedCheap)
| .sort u => !u.isAlreadyNormalizedCheap
| _ => false
/--
Normalizes universe levels in constants and sorts.
-/
def normalizeLevels (e : Expr) : CoreM Expr := do
if levelsAlreadyNormalized e then return e
let pre (e : Expr) := do
match e with
| .sort u => return .done <| e.updateSort! u.normalize
| .const _ us => return .done <| e.updateConst! (us.map Level.normalize)
| _ => return .continue
Core.transform e (pre := pre)
set_option compiler.ignoreBorrowAnnotation true in
/--
Normalizes the given expression using the `grind` simplification theorems and simprocs.
This function is used for normalizing E-matching patterns. Note that it does not return a proof.

View File

@@ -7,6 +7,7 @@ module
prelude
public import Lean.Meta.Tactic.Simp.BuiltinSimprocs.Char
import Lean.Meta.StringLitProof
public section
@@ -57,6 +58,7 @@ builtin_dsimproc [simp, seval] reduceSingleton (String.singleton _) := fun e =>
let some m ← fromExpr? e.appArg! | return .continue
evalPropStep e (op n m)
@[inline] def reduceBoolPred (declName : Name) (arity : Nat) (op : String → String → Bool) (e : Expr) : SimpM DStep := do
unless e.isAppOfArity declName arity do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
@@ -67,8 +69,18 @@ builtin_simproc [simp, seval] reduceLT (( _ : String) < _) := reduceBinPred ``
builtin_simproc [simp, seval] reduceLE (( _ : String) ≤ _) := reduceBinPred ``LE.le 4 (. ≤ .)
builtin_simproc [simp, seval] reduceGT (( _ : String) > _) := reduceBinPred ``GT.gt 4 (. > .)
builtin_simproc [simp, seval] reduceGE (( _ : String) ≥ _) := reduceBinPred ``GE.ge 4 (. ≥ .)
builtin_simproc [simp, seval] reduceEq (( _ : String) = _) := reduceBinPred ``Eq 3 (. = .)
builtin_simproc [simp, seval] reduceNe (( _ : String) ≠ _) := reduceBinPred ``Ne 3 (. ≠ .)
builtin_simproc [simp, seval] reduceEq (( _ : String) = _) := fun e => do
unless e.isAppOfArity ``Eq 3 do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
let some m ← fromExpr? e.appArg! | return .continue
evalEqPropStep e (n = m) (mkStringLitNeProof n m)
builtin_simproc [simp, seval] reduceNe (( _ : String) ≠ _) := fun e => do
unless e.isAppOfArity ``Ne 3 do return .continue
let some n ← fromExpr? e.appFn!.appArg! | return .continue
let some m ← fromExpr? e.appArg! | return .continue
evalNePropStep e (n ≠ m) (mkStringLitNeProof n m)
builtin_dsimproc [simp, seval] reduceBEq (( _ : String) == _) := reduceBoolPred ``BEq.beq 4 (. == .)
builtin_dsimproc [simp, seval] reduceBNe (( _ : String) != _) := reduceBoolPred ``bne 4 (. != .)

View File

@@ -24,4 +24,40 @@ def evalPropStep (p : Expr) (result : Bool) : SimpM Step := do
else
return .done { expr := mkConst ``False, proof? := mkAppN (mkConst ``eq_false_of_decide) #[p, d.appArg!, (← mkEqRefl (mkConst ``false))] }
/--
Like `evalPropStep`, but specialized for `@Eq.{u} α a b`. When the values are equal, uses
`eq_true rfl` which requires no kernel evaluation. When different, calls `mkNeProof` to build
a proof of `a ≠ b`.
-/
def evalEqPropStep (e : Expr) (eq : Bool) (mkNeProof : SimpM Expr) : SimpM Step := do
let α := e.appFn!.appFn!.appArg!
let a := e.appFn!.appArg!
let u := e.appFn!.appFn!.appFn!.constLevels!.head!
if eq then
let proof := mkApp2 (mkConst ``eq_true) e (mkApp2 (mkConst ``Eq.refl [u]) α a)
return .done { expr := mkConst ``True, proof? := proof }
else
let neProof ← mkNeProof
let proof := mkApp2 (mkConst ``eq_false) e neProof
return .done { expr := mkConst ``False, proof? := proof }
/--
Like `evalPropStep`, but specialized for `@Ne.{u} α a b`. When the values differ, calls
`mkNeProof` to build a proof of `a ≠ b`. When equal, uses `eq_false (not_not_intro rfl)`
which requires no kernel evaluation.
-/
def evalNePropStep (e : Expr) (ne : Bool) (mkNeProof : SimpM Expr) : SimpM Step := do
if ne then
let neProof ← mkNeProof
let proof := mkApp2 (mkConst ``eq_true) e neProof
return .done { expr := mkConst ``True, proof? := proof }
else
let α := e.appFn!.appFn!.appArg!
let a := e.appFn!.appArg!
let u := e.appFn!.appFn!.appFn!.constLevels!.head!
let eqProp := mkApp3 (mkConst ``Eq [u]) α a a
let rflExpr := mkApp2 (mkConst ``Eq.refl [u]) α a
let proof := mkApp2 (mkConst ``eq_false) e (mkApp2 (mkConst ``not_not_intro) eqProp rflExpr)
return .done { expr := mkConst ``False, proof? := proof }
end Lean.Meta.Simp

View File

@@ -295,9 +295,11 @@ Executes `x` using a `MetaM` configuration for inferred from `Simp.Config`.
@[inline] def withSimpMetaConfig (x : SimpM α) : SimpM α := do
withConfigWithKey (← readThe Simp.Context).metaConfig x
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_simp"]
opaque simp (e : Expr) : SimpM Result
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_dsimp"]
opaque dsimp (e : Expr) : SimpM Expr

View File

@@ -36,6 +36,7 @@ namespace Lean.Meta
/-! # Smart unfolding support -/
-- ===========================
set_option compiler.ignoreBorrowAnnotation true in
/--
Forward declaration. It is defined in the module `src/Lean/Elab/PreDefinition/Structural/Eqns.lean`.
It is possible to avoid this hack if we move `Structural.EqnInfo` and `Structural.eqnInfoExt`

View File

@@ -174,6 +174,7 @@ See also `Lean.matchConstStructure` for a less restrictive version.
| _ => failK ()
| _ => failK ()
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_has_compile_error"]
opaque hasCompileError (env : Environment) (constName : Name) : Bool

View File

@@ -234,10 +234,12 @@ def recover'.formatter (fmt : PrettyPrinter.Formatter) := fmt
-- Note that there is a mutual recursion
-- `categoryParser -> mkAntiquot -> termParser -> categoryParser`, so we need to introduce an indirection somewhere
-- anyway.
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_mk_antiquot_formatter"]
opaque mkAntiquot.formatter' (name : String) (kind : SyntaxNodeKind) (anonymous := true) (isPseudoKind := false) : Formatter
-- break up big mutual recursion
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_pretty_printer_formatter_interpret_parser_descr"]
opaque interpretParserDescr' : ParserDescr → CoreM Formatter

View File

@@ -307,6 +307,7 @@ def recover'.parenthesizer (p : PrettyPrinter.Parenthesizer) : PrettyPrinter.Par
-- Note that there is a mutual recursion
-- `categoryParser -> mkAntiquot -> termParser -> categoryParser`, so we need to introduce an indirection somewhere
-- anyway.
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_mk_antiquot_parenthesizer"]
opaque mkAntiquot.parenthesizer' (name : String) (kind : SyntaxNodeKind) (anonymous := true) (isPseudoKind := false) : Parenthesizer
@@ -314,6 +315,7 @@ opaque mkAntiquot.parenthesizer' (name : String) (kind : SyntaxNodeKind) (anonym
liftM x
-- break up big mutual recursion
set_option compiler.ignoreBorrowAnnotation true in
@[extern "lean_pretty_printer_parenthesizer_interpret_parser_descr"]
opaque interpretParserDescr' : ParserDescr → CoreM Parenthesizer

View File

@@ -1011,7 +1011,6 @@ theorem getEntryGT?_eq_getEntryGT?ₘ [Ord α] (k : α) (t : Impl α β) :
getEntryGT? k t = getEntryGT?ₘ k t := by
rw [getEntryGT?_eq_getEntryGT?ₘ', getEntryGT?ₘ', getEntryGT?ₘ, explore_eq_applyPartition] <;> simp
set_option backward.whnf.reducibleClassField false in
theorem getEntryLT?_eq_getEntryGT?_reverse [o : Ord α] [TransOrd α] {t : Impl α β} {k : α} :
getEntryLT? k t = @getEntryGT? α β o.opposite k t.reverse := by
rw [getEntryLT?, @getEntryGT?.eq_def, Ord.opposite]
@@ -1019,7 +1018,6 @@ theorem getEntryLT?_eq_getEntryGT?_reverse [o : Ord α] [TransOrd α] {t : Impl
simp only [*, getEntryLT?.go, reverse, getEntryGT?.go, OrientedCmp.eq_swap (b := k),
Ordering.swap]
set_option backward.whnf.reducibleClassField false in
theorem getEntryLE?_eq_getEntryGE?_reverse [o : Ord α] [TransOrd α] {t : Impl α β} {k : α} :
getEntryLE? k t = @getEntryGE? α β o.opposite k t.reverse := by
rw [getEntryLE?, @getEntryGE?.eq_def, Ord.opposite]

View File

@@ -623,7 +623,6 @@ theorem RxoIterator.step_cons_of_isLT_eq_false [Ord α] {upper : α} {h : (compa
rw [step, h]
simp only [Bool.false_eq_true, reduceIte]
set_option backward.isDefEq.respectTransparency false in
@[simp]
theorem RxoIterator.val_run_step_toIterM_iter [Ord α] {z : RxoIterator α β} : (z : Iter (α := RxoIterator α β) ((a : α) × β a)).toIterM.step.run.inflate.val = z.step := by
rw [IterM.step]

View File

@@ -1895,7 +1895,6 @@ theorem contains_of_contains_insertMany_list' [TransCmp cmp] [BEq α] [LawfulBEq
(h' : contains (insertMany t l) k = true)
(w : l.findSomeRev? (fun ⟨a, b⟩ => if cmp a k = .eq then some b else none) = none) :
contains t k = true :=
set_option backward.whnf.reducibleClassField false in
Impl.Const.contains_of_contains_insertMany_list' h
(by simpa [Impl.Const.insertMany_eq_insertMany!] using h')
(by simpa [compare_eq_iff_beq, BEq.comm] using w)

View File

@@ -566,7 +566,7 @@ instance : HAdd PlainDateTime Millisecond.Offset PlainDateTime where
hAdd := addMilliseconds
instance : HSub PlainDateTime Millisecond.Offset PlainDateTime where
hSub := addMilliseconds
hSub := subMilliseconds
instance : HAdd PlainDateTime Second.Offset PlainDateTime where
hAdd := addSeconds

View File

@@ -384,7 +384,7 @@ instance : HAdd PlainTime Duration PlainTime where
hAdd pt d := PlainTime.ofNanoseconds (d.toNanoseconds + pt.toNanoseconds)
instance : HSub PlainTime Duration PlainTime where
hSub pt d := PlainTime.ofNanoseconds (d.toNanoseconds - pt.toNanoseconds)
hSub pt d := PlainTime.ofNanoseconds (pt.toNanoseconds - d.toNanoseconds)
end Duration
end Time

View File

@@ -87,21 +87,20 @@ public def compileO
}
public def mkArgs (basePath : FilePath) (args : Array String) : LogIO (Array String) := do
if Platform.isWindows then
-- Use response file to avoid potentially exceeding CLI length limits.
let rspFile := basePath.addExtension "rsp"
let h ← IO.FS.Handle.mk rspFile .write
args.forM fun arg =>
-- Escape special characters
let arg := arg.foldl (init := "") fun s c =>
if c == '\\' || c == '"' then
s.push '\\' |>.push c
else
s.push c
h.putStr s!"\"{arg}\"\n"
return #[s!"@{rspFile}"]
else
return args
-- Use response file to avoid potentially exceeding CLI length limits.
-- On Windows this is always needed; on macOS/Linux this is needed for large
-- projects like Mathlib where the number of object files exceeds ARG_MAX.
let rspFile := basePath.addExtension "rsp"
let h ← IO.FS.Handle.mk rspFile .write
args.forM fun arg =>
-- Escape special characters
let arg := arg.foldl (init := "") fun s c =>
if c == '\\' || c == '"' then
s.push '\\' |>.push c
else
s.push c
h.putStr s!"\"{arg}\"\n"
return #[s!"@{rspFile}"]
public def compileStaticLib
(libFile : FilePath) (oFiles : Array FilePath)

View File

@@ -499,43 +499,71 @@ public def cacheArtifact
/-- **For internal use only.** -/
public class ResolveOutputs (α : Type) where
/-- **For internal use only.** -/
resolveOutputs (out : Json)
(service? : Option CacheServiceName) (scope? : Option CacheServiceScope) : JobM α
resolveOutputs (out : CacheOutput) : JobM α
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using the outputs stored
in either the saved trace file or in the cached input-to-content mapping.
Retrieve artifacts from the Lake cache using only the outputs
stored in the cached input-to-content mapping.
**For internal use only.**
-/
@[specialize] public nonrec def getArtifacts?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
@[specialize] private def getArtifactsUsingCache?
[ResolveOutputs α] (inputHash : Hash) (pkg : Package)
: JobM (Option α) := do
let cache ← getLakeCache
let updateCache ← pkg.isArtifactCacheWritable
if let some out ← cache.readOutputs? pkg.cacheScope inputHash then
if let some out ← (← getLakeCache).readOutputs? pkg.cacheScope inputHash then
try
return some (← resolveOutputs out.data out.service? out.scope?)
return some (← resolveOutputs out)
catch e =>
let log ← takeLogFrom e
let msg := s!"input '{inputHash.toString.take 7}' found in package artifact cache, \
but some output(s) have issues:"
let msg := log.entries.foldl (s!"{·}\n- {·.message}") msg
logWarning msg
return none
else
return none
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using only the outputs stored in the saved trace file.
**For internal use only.**
-/
@[specialize] public def getArtifactsUsingTrace?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
: JobM (Option α) := do
if let .ok data := savedTrace then
if data.depHash == inputHash then
if let some out := data.outputs? then
try
let arts ← resolveOutputs out none none
if updateCache then
if let .error e ← (cache.writeOutputs pkg.cacheScope inputHash out).toBaseIO then
let arts ← resolveOutputs (.ofData out)
if (← pkg.isArtifactCacheWritable) then
let act := (← getLakeCache).writeOutputs pkg.cacheScope inputHash out
if let .error e ← act.toBaseIO then
logWarning s!"could not write outputs to cache: {e}"
return some arts
catch e =>
dropLogFrom e
return none
open ResolveOutputs in
/--
Retrieve artifacts from the Lake cache using the outputs stored in either
the saved trace file or (unless `traceOnly` is `true`) in the cached input-to-content mapping.
**For internal use only.**
-/
@[inline] public nonrec def getArtifacts?
[ResolveOutputs α] (inputHash : Hash) (savedTrace : SavedTrace) (pkg : Package)
: JobM (Option α) := do
if let some a ← getArtifactsUsingCache? inputHash pkg then
return some a
if let some a ← getArtifactsUsingTrace? inputHash savedTrace pkg then
return some a
return none
/-- **For internal use only.** -/
public def resolveArtifact
(descr : ArtifactDescr)
@@ -549,6 +577,7 @@ public def resolveArtifact
| .error (.noFileOrDirectory ..) =>
-- we redownload artifacts on any error
if let some service := service? then
updateAction .fetch
if let some service := ws.findCacheService? service.toString then
let some scope := scope?
| error s!"artifact with associated cache service but no scope"
@@ -573,19 +602,17 @@ public def resolveArtifact
| .error e =>
error s!"failed to retrieve artifact from cache: {e}"
def resolveArtifactOutput
(out : Json) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
(exe := false)
: JobM Artifact := do
match fromJson? out with
| .ok descr => resolveArtifact descr service? scope? exe
| .error e => error s!"ill-formed artifact output:\n{out.render.pretty 80 2}\n{e}"
/-- **For internal use only.** -/
public def resolveArtifactOutput (out : CacheOutput) (exe := false) : JobM Artifact := do
match fromJson? out.data with
| .ok descr => resolveArtifact descr out.service? out.scope? exe
| .error e => error s!"ill-formed artifact output:\n{out.data.render.pretty 80 2}\n{e}"
set_option linter.unusedVariables false in
/-- An artifact equipped with information about whether it is executable. -/
def XArtifact (exe : Bool) := Artifact
instance : ResolveOutputs (XArtifact exe) := resolveArtifactOutput (exe := exe)
instance : ResolveOutputs (XArtifact exe) := (resolveArtifactOutput (exe := exe))
/--
Construct an artifact from a path outside the Lake artifact cache.
@@ -675,8 +702,9 @@ public def buildArtifactUnlessUpToDate
if let some art ← fetchArt? (restore := true) then
return art
doBuild depTrace traceFile
if let some outputsRef := pkg.outputsRef? then
outputsRef.insert inputHash art.descr
if pkg.isRoot then
if let some outputsRef := (← getBuildContext).outputsRef? then
outputsRef.insert inputHash art.descr
setTrace art.trace
setMTime art traceFile
else

View File

@@ -6,6 +6,7 @@ Authors: Mac Malone
module
prelude
public import Lake.Config.Cache
public import Lake.Config.Context
public import Lake.Build.Job.Basic
@@ -48,6 +49,11 @@ public def JobQueue := IO.Ref (Array OpaqueJob)
public structure BuildContext extends BuildConfig, Context where
leanTrace : BuildTrace
registeredJobs : JobQueue
/--
Input-to-output(s) map for hashes of the root package's artifacts.
If `none`, tracking outputs is disabled for this build.
-/
outputsRef? : Option CacheRef := none
/-- A transformer to equip a monad with a `BuildContext`. -/
public abbrev BuildT := ReaderT BuildContext

View File

@@ -126,6 +126,9 @@ Its trace just includes its dependencies.
-/
builtin_facet leanArts : Module => ModuleOutputArtifacts
/-- A compressed archive (produced via `leantar`) of the module's build artifacts. -/
builtin_facet ltar : Module => FilePath
/-- The `olean` file produced by `lean`. -/
builtin_facet olean : Module => FilePath

View File

@@ -180,6 +180,9 @@ namespace Module
@[inherit_doc cFacet] public abbrev bc (self : Module) :=
self.facetCore bcFacet
@[inherit_doc ltarFacet] public abbrev ltar (self : Module) :=
self.facetCore ltarFacet
@[inherit_doc oFacet] public abbrev o (self : Module) :=
self.facetCore oFacet

View File

@@ -562,6 +562,7 @@ public def Module.depsFacetConfig : ModuleFacetConfig depsFacet :=
/-- Remove all existing artifacts produced by the Lean build of the module. -/
public def Module.clearOutputArtifacts (mod : Module) : IO PUnit := do
try
removeFileIfExists mod.ltarFile
removeFileIfExists mod.oleanFile
removeFileIfExists mod.oleanServerFile
removeFileIfExists mod.oleanPrivateFile
@@ -575,6 +576,7 @@ public def Module.clearOutputArtifacts (mod : Module) : IO PUnit := do
/-- Remove any cached file hashes of the module build outputs (in `.hash` files). -/
public def Module.clearOutputHashes (mod : Module) : IO PUnit := do
try
clearFileHash mod.ltarFile
clearFileHash mod.oleanFile
clearFileHash mod.oleanServerFile
clearFileHash mod.oleanPrivateFile
@@ -587,6 +589,8 @@ public def Module.clearOutputHashes (mod : Module) : IO PUnit := do
/-- Cache the file hashes of the module build outputs in `.hash` files. -/
public def Module.cacheOutputHashes (mod : Module) : IO PUnit := do
if (← mod.ltarFile.pathExists) then
cacheFileHash mod.ltarFile
cacheFileHash mod.oleanFile
if (← mod.oleanServerFile.pathExists) then
cacheFileHash mod.oleanServerFile
@@ -599,37 +603,68 @@ public def Module.cacheOutputHashes (mod : Module) : IO PUnit := do
if Lean.Internal.hasLLVMBackend () then
cacheFileHash mod.bcFile
def resolveModuleOutputs
(out : Json) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
def ModuleOutputDescrs.resolve
(descrs : ModuleOutputDescrs) (service? : Option CacheServiceName) (scope? : Option CacheServiceScope)
: JobM ModuleOutputArtifacts := do
match fromJson? out with
| .ok (descrs : ModuleOutputDescrs) => do
let arts : ModuleOutputArtifacts := {
olean := ← resolve descrs.olean
oleanServer? := ← descrs.oleanServer?.mapM resolve
oleanPrivate? := ← descrs.oleanPrivate?.mapM resolve
ir? := ← descrs.ir?.mapM resolve
ilean := ← resolve descrs.ilean
c := ← resolve descrs.c
bc? := none
}
if Lean.Internal.hasLLVMBackend () then
let some descr := descrs.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
return {arts with bc? := some (← resolve descr)}
else
return arts
| .error e =>
error s!"ill-formed module outputs:\n{out.render.pretty 80 2}\n{e}"
let arts : ModuleOutputArtifacts := {
isModule := descrs.isModule
olean := ← resolve descrs.olean
oleanServer? := ← descrs.oleanServer?.mapM resolve
oleanPrivate? := ← descrs.oleanPrivate?.mapM resolve
ir? := ← descrs.ir?.mapM resolve
ilean := ← resolve descrs.ilean
c := ← resolve descrs.c
ltar? := ← descrs.ltar?.mapM resolve
}
if Lean.Internal.hasLLVMBackend () then
let some descr := descrs.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
return {arts with bc? := some (← resolve descr)}
else
return arts
where @[inline] resolve descr := resolveArtifact descr service? scope?
instance : ResolveOutputs ModuleOutputArtifacts := resolveModuleOutputs
def resolveModuleOutputArtifacts (out : CacheOutput) : JobM ModuleOutputArtifacts := do
match fromJson? out.data with
| .ok (descrs : ModuleOutputDescrs) => descrs.resolve out.service? out.scope?
| .error e => error s!"ill-formed module outputs:\n{out.data.render.pretty 80 2}\n{e}"
instance : ResolveOutputs ModuleOutputArtifacts := resolveModuleOutputArtifacts
inductive ModuleOutputs
| ltar (art : Artifact)
| arts (arts : ModuleOutputArtifacts)
instance : Coe ModuleOutputArtifacts ModuleOutputs := .arts
def resolveModuleOutputs (out : CacheOutput) : JobM ModuleOutputs := do
if let .str descr := out.data then
match ArtifactDescr.ofFilePath? descr with
| .ok descr =>
return .ltar (← resolveArtifact descr out.service? out.scope?)
| .error e =>
error s!"ill-formed module archive output:\n{out.data.render.pretty 80 2}\n{e}"
else
match fromJson? out.data with
| .ok (descrs : ModuleOutputDescrs) =>
if let some descr := descrs.ltar? then
try
descrs.resolve out.service? out.scope?
catch e =>
dropLogFrom e
return .ltar (← resolveArtifact descr out.service? out.scope?)
else
descrs.resolve out.service? out.scope?
| .error e => error s!"ill-formed module outputs:\n{out.data.render.pretty 80 2}\n{e}"
instance : ResolveOutputs ModuleOutputs := resolveModuleOutputs
/-- Save module build artifacts to the local Lake cache. -/
private def Module.cacheOutputArtifacts
(mod : Module) (isModule : Bool) (useLocalFile : Bool)
: JobM ModuleOutputArtifacts := do
return {
isModule
olean := ← cache mod.oleanFile "olean"
oleanServer? := ← cacheIf? isModule mod.oleanServerFile "olean.server"
oleanPrivate? := ← cacheIf? isModule mod.oleanPrivateFile "olean.private"
@@ -637,6 +672,7 @@ private def Module.cacheOutputArtifacts
ilean := ← cache mod.ileanFile "ilean"
c := ← cache mod.cFile "c"
bc? := ← cacheIf? (Lean.Internal.hasLLVMBackend ()) mod.bcFile "bc"
ltar? := ← cacheIf? (← mod.ltarFile.pathExists) mod.ltarFile "ltar"
}
where
@[inline] cache file ext := do
@@ -664,11 +700,12 @@ private def Module.restoreAllArtifacts (mod : Module) (cached : ModuleOutputArti
ir? := ← restoreSome mod.irFile cached.ir?
c := ← restoreArtifact mod.cFile cached.c
bc? := ← restoreSome mod.bcFile cached.bc?
ltar? := ← restoreSome mod.ltarFile cached.ltar?
}
where
@[inline] restoreSome file art? := art?.mapM (restoreArtifact file ·)
public protected def Module.checkExists (self : Module) (isModule : Bool) : BaseIO Bool := do
public def Module.checkArtifactsExsist (self : Module) (isModule : Bool) : BaseIO Bool := do
unless (← self.oleanFile.pathExists) do return false
unless (← self.ileanFile.pathExists) do return false
unless (← self.cFile.pathExists) do return false
@@ -680,22 +717,30 @@ public protected def Module.checkExists (self : Module) (isModule : Bool) : Base
unless (← self.irFile.pathExists) do return false
return true
public protected def Module.checkExists (self : Module) (isModule : Bool) : BaseIO Bool := do
self.ltarFile.pathExists <||> self.checkArtifactsExsist isModule
@[deprecated Module.checkExists (since := "2025-03-04")]
public instance : CheckExists Module := ⟨Module.checkExists (isModule := false)⟩
public protected def Module.getMTime (self : Module) (isModule : Bool) : IO MTime := do
let mut mtime :=
(← getMTime self.oleanFile)
|> max (← getMTime self.ileanFile)
|> max (← getMTime self.cFile)
if Lean.Internal.hasLLVMBackend () then
mtime := max mtime (← getMTime self.bcFile)
if isModule then
mtime := mtime
|> max (← getMTime self.oleanServerFile)
|> max (← getMTime self.oleanPrivateFile)
|> max (← getMTime self.irFile)
return mtime
try
let mut mtime :=
(← getMTime self.oleanFile)
|> max (← getMTime self.ileanFile)
|> max (← getMTime self.cFile)
if Lean.Internal.hasLLVMBackend () then
mtime := max mtime (← getMTime self.bcFile)
if isModule then
mtime := mtime
|> max (← getMTime self.oleanServerFile)
|> max (← getMTime self.oleanPrivateFile)
|> max (← getMTime self.irFile)
return mtime
catch e =>
try getMTime self.ltarFile catch
| .noFileOrDirectory .. => throw e
| e => throw e
@[deprecated Module.getMTime (since := "2025-03-04")]
public instance : GetMTime Module := ⟨Module.getMTime (isModule := false)⟩
@@ -711,7 +756,6 @@ private def ModuleOutputArtifacts.setMTime (self : ModuleOutputArtifacts) (mtime
bc? := self.bc?.map ({· with mtime})
}
private def Module.mkArtifacts (mod : Module) (srcFile : FilePath) (isModule : Bool) : ModuleArtifacts where
lean? := srcFile
olean? := mod.oleanFile
@@ -724,6 +768,7 @@ private def Module.mkArtifacts (mod : Module) (srcFile : FilePath) (isModule : B
private def Module.computeArtifacts (mod : Module) (isModule : Bool) : FetchM ModuleOutputArtifacts :=
return {
isModule
olean := compute mod.oleanFile "olean"
oleanServer? := computeIf isModule mod.oleanServerFile "olean.server"
oleanPrivate? := computeIf isModule mod.oleanPrivateFile "olean.private"
@@ -741,6 +786,66 @@ where
instance : ToOutputJson ModuleOutputArtifacts := ⟨(toJson ·.descrs)⟩
def Module.packLtar (self : Module) (arts : ModuleOutputArtifacts) : JobM Artifact := do
let arts ← id do
if (← self.checkArtifactsExsist arts.isModule) then
return arts
else self.restoreAllArtifacts arts
let args ← id do
let mut args := #[
"-C", self.leanLibDir.toString,
"-C", self.irDir.toString,
self.ltarFile.toString,
self.fileName "trace"
]
let addArt args idx art :=
args.push "-i" |>.push idx |>.push (self.fileName art.ext)
-- Note: oleans parts must be in the order of `.olean`, `.olean.server`, `.olean.private`
args := addArt args "0" arts.olean
if let some art := arts.oleanServer? then
args := addArt args "0" art
if let some art := arts.oleanPrivate? then
args := addArt args "0" art
args := addArt args "0" arts.ilean
if let some art := arts.ir? then
args := addArt args "0" art
args := addArt args "1" arts.c
if Lean.Internal.hasLLVMBackend () then
let some art := arts.bc?
| error "LLVM backend enabled but module outputs lack bitcode"
args := addArt args "1" art
return args
proc (quiet := true) {cmd := (← getLeantar).toString, args}
if (← self.pkg.isArtifactCacheWritable) then
cacheArtifact self.ltarFile "ltar" (← self.pkg.restoreAllArtifacts)
else
computeArtifact self.ltarFile "ltar"
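For orientation, a sketch of the command line that `packLtar` assembles, written as a comment. File names are illustrative and the meaning of the `-i` member index is inferred from this code (and from `unpackLtar`, which passes the same two `-C` directories), not verified against the `leantar` CLI:

```lean
/- For a module `Foo`, the argument list above yields roughly:

     leantar -C <leanLibDir> -C <irDir> Foo.ltar Foo.trace
       -i 0 Foo.olean -i 0 Foo.olean.server -i 0 Foo.olean.private
       -i 0 Foo.ilean -i 0 Foo.ir
       -i 1 Foo.c -i 1 Foo.bc

   Index 0 members appear to belong to the first `-C` directory
   (`leanLibDir`) and index 1 members to the second (`irDir`), which
   matches where `irFile` vs. `cFile`/`bcFile` live on disk. -/
```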
def Module.unpackLtar (self : Module) (ltar : FilePath) : JobM Unit := do
let args := #[
"-C", self.leanLibDir.toString,
"-C", self.irDir.toString,
"-x", ltar.toString
]
proc (quiet := true) {cmd := (← getLeantar).toString, args}
private def Module.recBuildLtar (self : Module) : FetchM (Job FilePath) := do
withRegisterJob s!"{self.name}:ltar" <| withCurrPackage self.pkg do
(← self.leanArts.fetch).mapM fun arts => do
let art ← arts.ltar?.getDM do
if (← getNoBuild) then
modify ({· with wantsRebuild := true})
error "archive does not exist and needs to be built"
else
self.packLtar arts
newTrace s!"{self.name.toString}:ltar"
addTrace art.trace
return art.path
/-- The `ModuleFacetConfig` for the builtin `ltarFacet`. -/
public def Module.ltarFacetConfig : ModuleFacetConfig ltarFacet :=
mkFacetJobConfig recBuildLtar
private def Module.buildLean
(mod : Module) (depTrace : BuildTrace) (srcFile : FilePath) (setup : ModuleSetup)
: JobM ModuleOutputArtifacts := buildAction depTrace mod.traceFile do
@@ -786,45 +891,100 @@ private def Module.recBuildLean (mod : Module) : FetchM (Job ModuleOutputArtifac
addPureTrace mod.pkg.id? "Package.id?"
addPureTrace mod.leanArgs "Module.leanArgs"
setTraceCaption s!"{mod.name.toString}:leanArts"
let depTrace ← getTrace
let inputHash := depTrace.hash
let savedTrace ← readTraceFile mod.traceFile
have : CheckExists Module := Module.checkExists (isModule := setup.isModule)
have : GetMTime Module := Module.getMTime (isModule := setup.isModule)
let fetchArtsFromCache? restoreAll := do
let some arts ← getArtifacts? inputHash savedTrace mod.pkg
| return none
let arts ← fetchCore setup srcFile srcTrace
let arts ← trackOutputsIfEnabled arts
let arts ← adjustMTime arts
return arts
where
fetchFromCache? setup savedTrace restoreAll : JobM (ModuleOutputArtifacts ⊕ SavedTrace) := do
let inputHash := (← getTrace).hash
let some ltarOrArts ← getArtifacts? inputHash savedTrace mod.pkg
| return .inr savedTrace
match (ltarOrArts : ModuleOutputs) with
| .ltar ltar =>
mod.clearOutputArtifacts
mod.unpackLtar ltar.path
-- Note: This branch implies that only the ltar output is (validly) cached.
-- Thus, we use only the new trace unpacked from the ltar to resolve further artifacts.
let savedTrace ← readTraceFile mod.traceFile
let arts? ← getArtifactsUsingTrace? inputHash savedTrace mod.pkg
if let some (arts : ModuleOutputArtifacts) := arts? then
-- on initial unpack from cache ensure all artifacts uniformly
-- end up in the build directory and, if writable, the cache
let arts ← mod.restoreAllArtifacts {arts with ltar? := some ltar}
if (← mod.pkg.isArtifactCacheWritable) then
.inl <$> mod.cacheOutputArtifacts setup.isModule restoreAll
else
return .inl arts
else
return .inr savedTrace
| .arts arts =>
unless (← savedTrace.replayOrFetchIfUpToDate inputHash) do
mod.clearOutputArtifacts
writeFetchTrace mod.traceFile inputHash (toJson arts.descrs)
if restoreAll then
some <$> mod.restoreAllArtifacts arts
else
some <$> mod.restoreNeededArtifacts arts
let arts ← id do
if (← mod.pkg.isArtifactCacheWritable) then
let restore ← mod.pkg.restoreAllArtifacts
if let some arts ← fetchArtsFromCache? restore then
let arts ←
if restoreAll then
mod.restoreAllArtifacts arts
else
mod.restoreNeededArtifacts arts
return .inl arts
fetchCore setup srcFile srcTrace : JobM ModuleOutputArtifacts := do
let depTrace ← getTrace
have : GetMTime Module := Module.getMTime (isModule := setup.isModule)
have : CheckExists Module := Module.checkExists (isModule := setup.isModule)
let savedTrace ← readTraceFile mod.traceFile
if (← mod.pkg.isArtifactCacheWritable) then
let restore ← mod.pkg.restoreAllArtifacts
match (← fetchFromCache? setup savedTrace restore) with
| .inl arts =>
return arts
| .inr savedTrace =>
let status ← savedTrace.replayIfUpToDate' (oldTrace := srcTrace.mtime) mod depTrace
if status.isUpToDate then
unless (← mod.checkArtifactsExsist setup.isModule) do
mod.unpackLtar mod.ltarFile
else
discard <| mod.buildLean depTrace srcFile setup
if status.isCacheable then
let arts ← mod.cacheOutputArtifacts setup.isModule restore
(← getLakeCache).writeOutputs mod.pkg.cacheScope depTrace.hash arts.descrs
return arts
else
let status ← savedTrace.replayIfUpToDate' (oldTrace := srcTrace.mtime) mod depTrace
unless status.isUpToDate do
discard <| mod.buildLean depTrace srcFile setup
if status.isCacheable then
let arts ← mod.cacheOutputArtifacts setup.isModule restore
(← getLakeCache).writeOutputs mod.pkg.cacheScope inputHash arts.descrs
return arts
else
mod.computeArtifacts setup.isModule
else if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
mod.computeArtifacts setup.isModule
else
if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
unless (← mod.checkArtifactsExsist setup.isModule) do
mod.unpackLtar mod.ltarFile
mod.computeArtifacts setup.isModule
else
if (← mod.pkg.isArtifactCacheReadable) then
if let some arts ← fetchArtsFromCache? true then
return arts
mod.buildLean depTrace srcFile setup
if let some ref := mod.pkg.outputsRef? then
ref.insert inputHash arts.descrs
match (← fetchFromCache? setup savedTrace true) with
| .inl arts =>
return arts
| .inr savedTrace =>
if (← savedTrace.replayIfUpToDate (oldTrace := srcTrace.mtime) mod depTrace) then
mod.computeArtifacts setup.isModule
else
mod.buildLean depTrace srcFile setup
else
mod.buildLean depTrace srcFile setup
trackOutputsIfEnabled arts : JobM ModuleOutputArtifacts := do
if mod.pkg.isRoot then
if let some ref := (← getBuildContext).outputsRef? then
let inputHash := (← getTrace).hash
if let some ltar := arts.ltar? then
ref.insert inputHash ltar.descr
return arts
else
let ltar ← id do
if (← mod.ltarFile.pathExists) then
computeArtifact mod.ltarFile "ltar"
else
mod.packLtar arts
ref.insert inputHash ltar.descr
return {arts with ltar? := some ltar}
return arts
adjustMTime arts : JobM ModuleOutputArtifacts := do
match (← getMTime mod.traceFile |>.toBaseIO) with
| .ok mtime =>
return arts.setMTime mtime
@@ -846,7 +1006,7 @@ public def Module.leanArtsFacetConfig : ModuleFacetConfig leanArtsFacet :=
/-
Avoid recompiling unchanged OLean files.
OLean files transitively include their imports.
THowever, imports are pre-resolved by Lake, so they are not included in their trace.
However, imports are pre-resolved by Lake, so they are not included in their trace.
-/
newTrace s!"{mod.name.toString}:{facet}"
addTrace art.trace
@@ -891,7 +1051,7 @@ public def Module.cFacetConfig : ModuleFacetConfig cFacet :=
let art := arts.c
/-
Avoid recompiling unchanged C files.
C files are assumed to incorporate their own content
C files are assumed to only incorporate their own content
and not transitively include their inputs (e.g., imports).
They do, however, include `lean/lean.h`.
Lean also produces LF-only C files, so no line ending normalization.
@@ -1031,6 +1191,7 @@ public def Module.initFacetConfigs : DNameMap ModuleFacetConfig :=
|>.insert importArtsFacet importArtsFacetConfig
|>.insert importAllArtsFacet importAllArtsFacetConfig
|>.insert exportInfoFacet exportInfoFacetConfig
|>.insert ltarFacet ltarFacetConfig
|>.insert oleanFacet oleanFacetConfig
|>.insert oleanServerFacet oleanServerFacetConfig
|>.insert oleanPrivateFacet oleanPrivateFacetConfig


@@ -15,6 +15,7 @@ namespace Lake
/-- The descriptions of the output artifacts of a module build. -/
public structure ModuleOutputDescrs where
isModule : Bool
olean : ArtifactDescr
oleanServer? : Option ArtifactDescr := none
oleanPrivate? : Option ArtifactDescr := none
@@ -22,6 +23,7 @@ public structure ModuleOutputDescrs where
ir? : Option ArtifactDescr := none
c : ArtifactDescr
bc? : Option ArtifactDescr := none
ltar? : Option ArtifactDescr := none
public def ModuleOutputDescrs.oleanParts (self : ModuleOutputDescrs) : Array ArtifactDescr := Id.run do
let mut descrs := #[self.olean]
@@ -33,6 +35,7 @@ public def ModuleOutputDescrs.oleanParts (self : ModuleOutputDescrs) : Array Art
public protected def ModuleOutputDescrs.toJson (self : ModuleOutputDescrs) : Json := Id.run do
let mut obj : JsonObject := {}
obj := obj.insert "m" self.isModule
obj := obj.insert "o" self.oleanParts
obj := obj.insert "i" self.ilean
if let some ir := self.ir? then
@@ -40,6 +43,8 @@ public protected def ModuleOutputDescrs.toJson (self : ModuleOutputDescrs) : Jso
obj := obj.insert "c" self.c
if let some bc := self.bc? then
obj := obj.insert "b" bc
if let some ltar := self.ltar? then
obj := obj.insert "l" ltar
return obj
public instance : ToJson ModuleOutputDescrs := ⟨ModuleOutputDescrs.toJson⟩
@@ -50,6 +55,7 @@ public protected def ModuleOutputDescrs.fromJson? (val : Json) : Except String M
let some olean := oleanHashes[0]?
| throw "expected at least one 'o' (.olean) hash"
return {
isModule := (← obj.get? "m").getD (oleanHashes.size > 1)
olean := olean
oleanServer? := oleanHashes[1]?
oleanPrivate? := oleanHashes[2]?
@@ -57,12 +63,14 @@ public protected def ModuleOutputDescrs.fromJson? (val : Json) : Except String M
ir? := ← obj.get? "r"
c := ← obj.get "c"
bc? := ← obj.get? "b"
ltar? := ← obj.get? "l"
}
public instance : FromJson ModuleOutputDescrs := ⟨ModuleOutputDescrs.fromJson?⟩
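Concretely, the serializers above use compact single-letter keys to keep cached mappings small. A sketch of the resulting JSON shape, with hash values invented for illustration:

```lean
/- { "m": true,                                          -- isModule
     "o": ["<olean>", "<olean.server>", "<olean.private>"],
     "i": "<ilean>",                                     -- ilean
     "r": "<ir>", "c": "<c>", "b": "<bc>",               -- ir?, c, bc?
     "l": "<ltar>" }                                     -- ltar?

   The "o" array must list the olean parts in order:
   `.olean`, `.olean.server`, `.olean.private`; `fromJson?` recovers
   them positionally via `oleanHashes[0]?`, `[1]?`, `[2]?`. -/
```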
/-- The output artifacts of a module build. -/
public structure ModuleOutputArtifacts where
isModule : Bool
olean : Artifact
oleanServer? : Option Artifact := none
oleanPrivate? : Option Artifact := none
@@ -70,9 +78,11 @@ public structure ModuleOutputArtifacts where
ir? : Option Artifact := none
c : Artifact
bc? : Option Artifact := none
ltar? : Option Artifact := none
/-- Content hashes of the artifacts. -/
public def ModuleOutputArtifacts.descrs (arts : ModuleOutputArtifacts) : ModuleOutputDescrs where
isModule := arts.isModule
olean := arts.olean.descr
oleanServer? := arts.oleanServer?.map (·.descr)
oleanPrivate? := arts.oleanPrivate?.map (·.descr)
@@ -80,3 +90,4 @@ public def ModuleOutputArtifacts.descrs (arts : ModuleOutputArtifacts) : ModuleO
ir? := arts.ir?.map (·.descr)
c := arts.c.descr
bc? := arts.bc?.map (·.descr)
ltar? := arts.ltar?.map (·.descr)


@@ -260,14 +260,14 @@ public def monitorJobs
public def noBuildCode : ExitCode := 3
def Workspace.saveOutputs
[logger : MonadLog BaseIO] (ws : Workspace)
[logger : MonadLog BaseIO] (ws : Workspace) (outputsRef? : Option CacheRef)
(out : IO.FS.Stream) (outputsFile : FilePath) (isVerbose : Bool)
: BaseIO Unit := do
unless ws.isRootArtifactCacheWritable do
logWarning s!"{ws.root.prettyName}: \
the artifact cache is not enabled for this package, so the artifacts described \
by the mappings produced by `-o` will not necessarily be available in the cache."
if let some ref := ws.root.outputsRef? then
if let some ref := outputsRef? then
match (← (← ref.get).writeFile outputsFile {}) with
| .ok _ log =>
if !log.isEmpty && isVerbose then
@@ -314,27 +314,34 @@ def monitorJob (ctx : MonitorContext) (job : Job α) : BaseIO (BuildResult α) :
else
return {toMonitorResult := result, out := .error "build failed"}
def mkBuildContext' (ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue) : BuildContext where
def mkBuildContext'
(ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue)
: BaseIO BuildContext := return {
opaqueWs := ws
toBuildConfig := cfg
outputsRef? := ← id do
if cfg.outputsFile?.isSome then
some <$> CacheRef.mk
else
return none
registeredJobs := jobs
leanTrace := .ofHash (pureHash ws.lakeEnv.leanGithash)
s!"Lean {Lean.versionStringCore}, commit {ws.lakeEnv.leanGithash}"
}
def Workspace.startBuild
(ws : Workspace) (cfg : BuildConfig) (jobs : JobQueue) (build : FetchM α)
(caption := "job computation")
(bctx : BuildContext) (build : FetchM α) (caption := "job computation")
: BaseIO (Job α) := do
let bctx := mkBuildContext' ws cfg jobs
let compute := Job.async build (caption := caption)
compute.run.run'.run bctx |>.run nilTrace
def Workspace.finalizeBuild
(ws : Workspace) (cfg : BuildConfig) (ctx : MonitorContext) (result : BuildResult α)
def finalizeBuild
(cfg : BuildConfig) (bctx : BuildContext ) (mctx : MonitorContext) (result : BuildResult α)
: IO α := do
reportResult cfg ctx.out result
reportResult cfg mctx.out result
if let some outputsFile := cfg.outputsFile? then
ws.saveOutputs (logger := ctx.logger) ctx.out outputsFile (cfg.verbosity matches .verbose)
bctx.workspace.saveOutputs (logger := mctx.logger)
bctx.outputsRef? mctx.out outputsFile (cfg.verbosity matches .verbose)
match result.out with
| .ok a =>
return a
@@ -354,9 +361,10 @@ public def Workspace.runFetchM
: IO α := do
let jobs ← mkJobQueue
let mctx ← mkMonitorContext cfg jobs
let job ← ws.startBuild cfg jobs build caption
let bctx ← mkBuildContext' ws cfg jobs
let job ← startBuild bctx build caption
let result ← monitorJob mctx job
ws.finalizeBuild cfg mctx result
finalizeBuild cfg bctx mctx result
def monitorBuild (mctx : MonitorContext) (job : Job (Job α)) : BaseIO (BuildResult α) := do
let result ← monitorJob mctx job
@@ -382,7 +390,8 @@ public def Workspace.checkNoBuild
let jobs ← mkJobQueue
let cfg := {noBuild := true}
let mctx ← mkMonitorContext cfg jobs
let job ← ws.startBuild cfg jobs build
let bctx ← mkBuildContext' ws cfg jobs
let job ← startBuild bctx build
let result ← monitorBuild mctx job
return result.isOk -- `isOk` means no failures, and thus no `--no-build` failures
@@ -392,9 +401,10 @@ public def Workspace.runBuild
: IO α := do
let jobs ← mkJobQueue
let mctx ← mkMonitorContext cfg jobs
let job ← ws.startBuild cfg jobs build
let bctx ← mkBuildContext' ws cfg jobs
let job ← startBuild bctx build
let result ← monitorBuild mctx job
ws.finalizeBuild cfg mctx result
finalizeBuild cfg bctx mctx result
/-- Produce a build job in the Lake monad's workspace and await the result. -/
@[inline] public def runBuild


@@ -377,7 +377,7 @@ OPTIONS:
--platform=<target-triple> with Reservoir or --repo, sets the platform
--toolchain=<name> with Reservoir or --repo, sets the toolchain
--scope=<remote-scope> scope for a custom endpoint
--download-arts download artifacts now, not on demand
--mappings-only only download mappings, delay artifacts
--force-download redownload existing files
Downloads build outputs for packages in the workspace from a remote cache
@@ -408,10 +408,9 @@ artifacts. If no mappings are found, Lake will backtrack the Git history up to
`--max-revs`, looking for a revision with mappings. If `--max-revs` is 0, Lake
will search the repository's entire history (or as far as Git will allow).
With a named service and without a mappings file, Lake will only download
the input-to-output mappings for packages. It will delay downloading of the
corresponding artifacts to the next `lake build` that requires them. Using
`--download-arts` will force Lake to download all artifacts eagerly.
By default, Lake will download both the input-to-output mappings and the
output artifacts for a package. By using `--mappings-only`, Lake will only
download the mappings and delay downloading artifacts until they are needed.
If a download for an artifact fails or the download process for a whole
package fails, Lake will report this and continue on to the next. Once done,


@@ -64,7 +64,7 @@ public structure LakeOptions where
offline : Bool := false
outputsFile? : Option FilePath := none
forceDownload : Bool := false
downloadArts : Bool := false
mappingsOnly : Bool := false
service? : Option String := none
scope? : Option CacheServiceScope := none
platform? : Option CachePlatform := none
@@ -249,7 +249,8 @@ def lakeLongOption : (opt : String) → CliM PUnit
| "--wfail" => modifyThe LakeOptions ({· with failLv := .warning})
| "--iofail" => modifyThe LakeOptions ({· with failLv := .info})
| "--force-download" => modifyThe LakeOptions ({· with forceDownload := true})
| "--download-arts" => modifyThe LakeOptions ({· with downloadArts := true})
| "--download-arts" => modifyThe LakeOptions ({· with mappingsOnly := false})
| "--mappings-only" => modifyThe LakeOptions ({· with mappingsOnly := true})
| "--service" => do
let service ← takeOptArg "--service" "service name"
modifyThe LakeOptions ({· with service? := some service})
@@ -391,7 +392,7 @@ def serviceNotFound (service : String) (configuredServices : Array CacheServiceC
configuredServices.foldl (· ++ s!" {·.name}") msg
@[inline] private def cacheToolchain (pkg : Package) (toolchain : CacheToolchain) : CacheToolchain :=
if pkg.bootstrap then .none else toolchain
if pkg.fixedToolchain || pkg.bootstrap then .none else toolchain
@[inline] private def cachePlatform (pkg : Package) (platform : CachePlatform) : CachePlatform :=
if pkg.isPlatformIndependent then .none else platform
@@ -409,6 +410,8 @@ protected def get : CliM PUnit := do
let ws ← loadWorkspace cfg
let cache := ws.lakeCache
if let some file := mappings? then liftM (m := LoggerIO) do
if opts.mappingsOnly then
error "`--mappings-only` is not supported with a mappings file; use `lake cache add` instead"
if opts.platform?.isSome || opts.toolchain?.isSome then
logWarning "the `--platform` and `--toolchain` options do nothing for `cache get` with a mappings file"
if opts.failLv .warning then
@@ -441,6 +444,9 @@ protected def get : CliM PUnit := do
match ws.lakeEnv.cacheArtifactEndpoint?, ws.lakeEnv.cacheRevisionEndpoint? with
| some artifactEndpoint, some revisionEndpoint =>
logWarning endpointDeprecation
if opts.mappingsOnly then
error "`--mappings-only` requires services to be configured \
via the Lake system configuration (not environment variables)"
return .downloadService artifactEndpoint revisionEndpoint ws.lakeEnv.cacheService?
| none, none =>
return ws.defaultCacheService
@@ -469,7 +475,7 @@ protected def get : CliM PUnit := do
else
findOutputs cache service pkg remoteScope opts platform toolchain
cache.writeMap pkg.cacheScope map service.name? (some remoteScope)
if opts.downloadArts || service.name?.isNone then
unless opts.mappingsOnly do
let descrs ← map.collectOutputDescrs
service.downloadArtifacts descrs cache remoteScope opts.forceDownload
else if service.isReservoir then
@@ -483,7 +489,7 @@ protected def get : CliM PUnit := do
try
let map ← findOutputs cache service pkg remoteScope opts platform toolchain
cache.writeMap pkg.cacheScope map service.name? (some remoteScope)
if opts.downloadArts || service.name?.isNone then
unless opts.mappingsOnly do
let descrs ← map.collectOutputDescrs
service.downloadArtifacts descrs cache remoteScope opts.forceDownload
return ok


@@ -166,8 +166,7 @@ where go as o := do
match o with
| .null =>
return as
| .bool b =>
logError s!"unsupported output: {b}"
| .bool _ => -- boolean metadata is allowed (as of 2025-03-13)
return as
| .num o =>
match Hash.ofJsonNumber? o with
@@ -297,7 +296,8 @@ public structure CacheOutput where
namespace CacheOutput
@[inline] def ofData (data : Json) : CacheOutput := {data}
/-- **For internal use only.** -/
@[inline] public def ofData (data : Json) : CacheOutput := {data}
public protected def toJson (out : CacheOutput) : Json := Id.run do
let mut obj :=


@@ -77,6 +77,9 @@ public abbrev pkg (self : Module) : Package :=
@[inline] public def rootDir (self : Module) : FilePath :=
self.lib.rootDir
@[inline] public def fileName (ext : String) (self : Module) : String :=
FilePath.addExtension self.name.getString! ext |>.toString
@[inline] public def filePath (dir : FilePath) (ext : String) (self : Module) : FilePath :=
Lean.modToFilePath dir self.name ext
@@ -92,6 +95,9 @@ public abbrev pkg (self : Module) : Package :=
@[inline] public def leanLibPath (ext : String) (self : Module) : FilePath :=
self.filePath self.pkg.leanLibDir ext
@[inline] public def leanLibDir (self : Module) : FilePath :=
Lean.modToFilePath self.pkg.leanLibDir self.name.getPrefix ""
@[inline] public def oleanFile (self : Module) : FilePath :=
self.leanLibPath "olean"
@@ -104,18 +110,21 @@ public abbrev pkg (self : Module) : Package :=
@[inline] public def ileanFile (self : Module) : FilePath :=
self.leanLibPath "ilean"
@[inline] public def irFile (self : Module) : FilePath :=
self.leanLibPath "ir"
@[inline] public def traceFile (self : Module) : FilePath :=
self.leanLibPath "trace"
@[inline] public def irPath (ext : String) (self : Module) : FilePath :=
self.filePath self.pkg.irDir ext
@[inline] public def irDir (self : Module) : FilePath :=
Lean.modToFilePath self.pkg.irDir self.name.getPrefix ""
@[inline] public def setupFile (self : Module) : FilePath :=
self.irPath "setup.json"
@[inline] public def irFile (self : Module) : FilePath :=
self.leanLibPath "ir"
@[inline] public def cFile (self : Module) : FilePath :=
self.irPath "c"
@@ -134,6 +143,9 @@ public def bcFile? (self : Module) : Option FilePath :=
@[inline] public def bcoFile (self : Module) : FilePath :=
self.irPath "bc.o"
@[inline] public def ltarFile (self : Module) : FilePath :=
self.irPath "ltar"
/-- Suffix for single module dynlibs (e.g., for precompilation). -/
public def dynlibSuffix := "-1"


@@ -300,6 +300,10 @@ variable [Functor m]
@[inline] public def getLeanc : m FilePath :=
(·.leanc) <$> getLeanInstall
/-- Returns the path of the {lit}`leantar` binary in the detected Lean installation. -/
@[inline] public def getLeantar : m FilePath :=
(·.leantar) <$> getLeanInstall
/-- Returns the path of the {lit}`libleanshared` library in the detected Lean installation. -/
@[inline] public def getLeanSharedLib : m FilePath :=
(·.sharedLib) <$> getLeanInstall


@@ -45,7 +45,7 @@ public structure Package where
/-- The path to the package's configuration file (relative to `dir`). -/
relConfigFile : FilePath
/-- The path to the package's JSON manifest of remote dependencies (relative to `dir`). -/
relManifestFile : FilePath := config.manifestFile.getD defaultManifestFile |>.normalize
relManifestFile : FilePath := defaultManifestFile
/-- The package's scope (e.g., in Reservoir). -/
scope : String
/-- The URL to this package's Git remote. -/
@@ -78,16 +78,6 @@ public structure Package where
testDriver : String := config.testDriver
/-- The driver used for `lake lint` when this package is the workspace root. -/
lintDriver : String := config.lintDriver
/--
Input-to-output(s) map for hashes of package artifacts.
If `none`, the artifact cache is disabled for the package.
-/
inputsRef? : Option CacheRef := none
/--
Input-to-output(s) map for hashes of package artifacts.
If `none`, the artifact cache is disabled for the package.
-/
outputsRef? : Option CacheRef := none
deriving Inhabited
@@ -254,6 +244,10 @@ public def id? (self : Package) : Option PkgId :=
@[inline] public def isPlatformIndependent (self : Package) : Bool :=
self.config.platformIndependent == some true
/-- The package's `fixedToolchain` configuration. -/
@[inline] public def fixedToolchain (self : Package) : Bool :=
self.config.fixedToolchain
/-- The package's `releaseRepo`/`releaseRepo?` configuration. -/
@[inline] public def releaseRepo? (self : Package) : Option String :=
self.config.releaseRepo


@@ -29,16 +29,6 @@ public configuration PackageConfig (p : Name) (n : Name) extends WorkspaceConfig
/-- **For internal use.** Whether this package is Lean itself. -/
bootstrap : Bool := false
/--
**This field is deprecated.**
The path of a package's manifest file, which stores the exact versions
of its resolved dependencies.
Defaults to `defaultManifestFile` (i.e., `lake-manifest.json`).
-/
manifestFile : Option FilePath := none
/-- An `Array` of target names to build whenever the package is used. -/
extraDepTargets : Array Name := #[]
@@ -331,6 +321,18 @@ public configuration PackageConfig (p : Name) (n : Name) extends WorkspaceConfig
-/
allowImportAll : Bool := false
/--
Whether this package is expected to function only on a single toolchain
(the package's toolchain).
This informs Lake's toolchain update procedure (in `lake update`) to prioritize
this package's toolchain. It also avoids the need to separate input-to-output mappings
for this package by toolchain version in the Lake cache.
Defaults to `false`.
-/
fixedToolchain : Bool := false
deriving Inhabited
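A package would opt in from its Lake configuration; a minimal sketch (the package name is invented):

```lean
-- lakefile.lean sketch: a package that only builds on its pinned
-- toolchain opts into the new option (hypothetical package name).
package myPkg where
  fixedToolchain := true
```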
/-- The package's name as specified by the author. -/


@@ -88,6 +88,7 @@ def elabConfigFile
else
return s.commandState.env
set_option compiler.ignoreBorrowAnnotation true in
/--
`Lean.Kernel.Environment.add` is now private, this is an exported helper wrapping it for
`Lean.Environment`.


@@ -166,10 +166,6 @@ public def Package.loadFromEnv
else
pure self.config.lintDriver
-- Deprecation warnings
unless self.config.manifestFile.isNone do
logWarning s!"{self.prettyName}: package configuration option 'manifestFile' is deprecated"
-- Fill in the Package
return {self with
depConfigs, targetDecls, targetDeclMap


@@ -50,8 +50,9 @@ That is, Lake ignores the `-` suffix.
**v1.x.x** (versioned by a string)
- `"1.0.0"`: Switches to a semantic versioning scheme
- `"1.1.0"`: Add optional `scope` package entry field
- `"1.2.0"`: Add optional `fixedToolchain` manifest field
-/
@[inline] public def Manifest.version : StdVer := {major := 1, minor := 1}
@[inline] public def Manifest.version : StdVer := {major := 1, minor := 2}
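With the version bump, a manifest's header would carry the new field; a sketch consistent with `Manifest.toJson` further down (other entries elided, values illustrative):

```lean
/- { "version": "1.2.0",
     "fixedToolchain": true,
     "name": "myPkg",
     "lakeDir": ".lake",
     "packagesDir": ".lake/packages",
     "packages": [ ... ] } -/
```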
/-- Manifest version `0.6.0` package entry. For backwards compatibility. -/
private inductive PackageEntryV6
@@ -84,7 +85,9 @@ public structure PackageEntry where
name : Name
scope : String := ""
inherited : Bool
/-- The relative path within the package directory to the Lake configuration file. -/
configFile : FilePath := defaultConfigFile
/-- The relative path within the package directory to the Lake manifest file. -/
manifestFile? : Option FilePath := none
src : PackageEntrySrc
deriving Inhabited
@@ -139,7 +142,7 @@ public protected def fromJson? (json : Json) : Except String PackageEntry := do
| _ =>
throw s!"unknown package entry type '{type}'"
return {
name, scope, inherited,
name, scope, inherited
configFile, manifestFile? := manifestFile, src
: PackageEntry
}
@@ -172,6 +175,7 @@ end PackageEntry
public structure Manifest where
name : Name
lakeDir : FilePath
fixedToolchain : Bool
packagesDir? : Option FilePath := none
packages : Array PackageEntry := #[]
@@ -184,6 +188,7 @@ public def addPackage (entry : PackageEntry) (self : Manifest) : Manifest :=
public protected def toJson (self : Manifest) : Json :=
Json.mkObj [
("version", toJson version),
("fixedToolchain", toJson self.fixedToolchain),
("name", toJson self.name),
("lakeDir", toJson self.lakeDir),
("packagesDir", toJson self.packagesDir?),
@@ -217,11 +222,12 @@ private def getPackages (ver : StdVer) (obj : JsonObject) : Except String (Array
public protected def fromJson? (json : Json) : Except String Manifest := do
let obj ← JsonObject.fromJson? json
let ver ← getVersion obj
let fixedToolchain ← obj.getD "fixedToolchain" false
let name ← obj.getD "name" Name.anonymous
let lakeDir ← obj.getD "lakeDir" defaultLakeDir
let packagesDir? ← obj.get? "packagesDir"
let packages ← getPackages ver obj
return {name, lakeDir, packagesDir?, packages}
return {name, lakeDir, packagesDir?, packages, fixedToolchain}
public instance : FromJson Manifest := ⟨Manifest.fromJson?⟩


@@ -89,26 +89,33 @@ public structure MaterializedDep where
Used as the endpoint from which to fetch cloud releases for the package.
-/
remoteUrl : String
/-- The manifest for the dependency or the error produced when trying to load it. -/
manifest? : Except IO.Error Manifest
/-- The manifest entry for the dependency. -/
manifestEntry : PackageEntry
deriving Inhabited
namespace MaterializedDep
@[inline] public def name (self : MaterializedDep) :=
@[inline] public def name (self : MaterializedDep) : Name :=
self.manifestEntry.name
@[inline] public def scope (self : MaterializedDep) :=
@[inline] public def scope (self : MaterializedDep) : String :=
self.manifestEntry.scope
/-- Path to the dependency's manifest file (relative to `relPkgDir`). -/
@[inline] public def manifestFile? (self : MaterializedDep) :=
@[inline] public def manifestFile? (self : MaterializedDep) : Option FilePath :=
self.manifestEntry.manifestFile?
/-- Path to the dependency's configuration file (relative to `relPkgDir`). -/
@[inline] public def configFile (self : MaterializedDep) :=
@[inline] public def configFile (self : MaterializedDep) : FilePath :=
self.manifestEntry.configFile
public def fixedToolchain (self : MaterializedDep) : Bool :=
match self.manifest? with
| .ok manifest => manifest.fixedToolchain
| _ => false
end MaterializedDep
inductive InputVer
@@ -148,7 +155,7 @@ public def Dependency.materialize
match src with
| .path dir =>
let relPkgDir := relParentDir / dir
return mkDep relPkgDir "" (.path relPkgDir)
mkDep (wsDir / relPkgDir) relPkgDir "" (.path relPkgDir)
| .git url inputRev? subDir? => do
let sname := dep.name.toString (escape := false)
let repoUrl := Git.filterUrl? url |>.getD ""
@@ -196,16 +203,19 @@ public def Dependency.materialize
| _ => error s!"{pkg.fullName}: Git source not found on Reservoir"
where
materializeGit name relPkgDir gitUrl remoteUrl inputRev? subDir? : LoggerIO MaterializedDep := do
let repo := GitRepo.mk (wsDir / relPkgDir)
let pkgDir := wsDir / relPkgDir
let repo := GitRepo.mk pkgDir
let gitUrl := lakeEnv.pkgUrlMap.find? dep.name |>.getD gitUrl
materializeGitRepo name repo gitUrl inputRev?
let rev ← repo.getHeadRevision
let relPkgDir := if let some subDir := subDir? then relPkgDir / subDir else relPkgDir
return mkDep relPkgDir remoteUrl <| .git gitUrl rev inputRev? subDir?
@[inline] mkDep relPkgDir remoteUrl src : MaterializedDep := {
relPkgDir, remoteUrl,
manifestEntry := {name := dep.name, scope := dep.scope, inherited, src}
}
mkDep pkgDir relPkgDir remoteUrl <| .git gitUrl rev inputRev? subDir?
@[inline] mkDep pkgDir relPkgDir remoteUrl src : LoggerIO MaterializedDep := do
return {
relPkgDir, remoteUrl
manifest? := ← Manifest.load (pkgDir / defaultManifestFile) |>.toBaseIO
manifestEntry := {name := dep.name, scope := dep.scope, inherited, src}
}
/--
Materializes a manifest package entry, cloning and/or checking it out as necessary.
@@ -216,7 +226,7 @@ public def PackageEntry.materialize
: LoggerIO MaterializedDep :=
match manifestEntry.src with
| .path (dir := relPkgDir) .. =>
return mkDep relPkgDir ""
mkDep (wsDir / relPkgDir) relPkgDir ""
| .git (url := url) (rev := rev) (subDir? := subDir?) .. => do
let sname := manifestEntry.name.toString (escape := false)
let relGitDir := relPkgsDir / sname
@@ -239,7 +249,12 @@ public def PackageEntry.materialize
let url := lakeEnv.pkgUrlMap.find? manifestEntry.name |>.getD url
cloneGitPkg sname repo url rev
let relPkgDir := match subDir? with | .some subDir => relGitDir / subDir | .none => relGitDir
return mkDep relPkgDir (Git.filterUrl? url |>.getD "")
mkDep gitDir relPkgDir (Git.filterUrl? url |>.getD "")
where
@[inline] mkDep relPkgDir remoteUrl : MaterializedDep :=
{relPkgDir, remoteUrl, manifestEntry}
@[inline] mkDep pkgDir relPkgDir remoteUrl : LoggerIO MaterializedDep := do
let manifest? ← id do
if let some manifestFile := manifestEntry.manifestFile? then
Manifest.load (pkgDir / manifestFile) |>.toBaseIO
else
return .error (.noFileOrDirectory "" 0 "")
return {relPkgDir, remoteUrl, manifest?, manifestEntry}


@@ -171,8 +171,8 @@ private def reuseManifest
logWarning s!"{rootName}: ignoring previous manifest because it failed to load: {e}"
/-- Add a package dependency's manifest entries to the update state. -/
private def addDependencyEntries (pkg : Package) : UpdateT LoggerIO PUnit := do
match (← Manifest.load pkg.manifestFile |>.toBaseIO) with
private def addDependencyEntries (pkg : Package) (matDep : MaterializedDep) : UpdateT LoggerIO PUnit := do
match matDep.manifest? with
| .ok manifest =>
manifest.packages.forM fun entry => do
unless (← getThe (NameMap PackageEntry)).contains entry.name do
@@ -210,6 +210,36 @@ Used, for instance, if the toolchain is updated and no Elan is detected.
-/
def restartCode : ExitCode := 4
/-- The toolchain information of a package. -/
private structure ToolchainCandidate where
/-- The name of the package which provided the toolchain candidate. -/
src : Name
/-- The version of the toolchain candidate. -/
ver : ToolchainVer
/-- Whether the candidate toolchain has been fixed to a particular version. -/
fixed : Bool := false
private structure ToolchainState where
/-- The name of the dependency which provided the current candidate toolchain. -/
src : Name
/-- The current candidate toolchain version (if any). -/
tc? : Option ToolchainVer
/-- Incompatible candidate toolchains (if any). -/
clashes : Array ToolchainCandidate
/--
Whether the candidate toolchain has been fixed to a particular version.
If `false`, the search will update the toolchain further where possible.
-/
fixed : Bool
@[inline] def ToolchainState.replace
(src : Name) (tc? : Option ToolchainVer) (fixed : Bool) (self : ToolchainState)
: ToolchainState := {self with src, tc?, fixed}
@[inline] def ToolchainState.addClash
(src : Name) (ver : ToolchainVer) (fixed : Bool) (self : ToolchainState)
: ToolchainState := {self with clashes := self.clashes.push {src, ver, fixed}}
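An illustrative sketch of how the two helpers compose (names and versions are hypothetical): an unfixed candidate is replaced by a fixed dependency toolchain, and a second, different fixed toolchain can only be recorded as a clash.

```lean
example (root dep₁ dep₂ : ToolchainVer) : ToolchainState :=
  -- start from the root's (unfixed) candidate
  let s : ToolchainState := ⟨`root, some root, #[], false⟩
  -- a fixed dependency toolchain takes over as the candidate
  let s := s.replace `dep₁ (some dep₁) true
  -- another, different fixed toolchain is recorded as a clash
  s.addClash `dep₂ dep₂ true
```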
/--
Update the workspace's `lean-toolchain` if necessary.
@@ -222,23 +252,38 @@ def Workspace.updateToolchain
: LoggerIO PUnit := do
let rootToolchainFile := ws.root.dir / toolchainFileName
let rootTc? ← ToolchainVer.ofDir? ws.dir
let (src, tc?, tcs) ← rootDeps.foldlM (init := (ws.root.baseName, rootTc?, #[])) fun s dep => do
let s : ToolchainState := ⟨ws.root.baseName, rootTc?, #[], ws.root.fixedToolchain⟩
let ⟨src, tc?, tcs, fixed⟩ ← rootDeps.foldlM (init := s) fun s dep => do
let depTc? ← ToolchainVer.ofDir? (ws.dir / dep.relPkgDir)
let some depTc := depTc?
| return s
let (src, tc?, tcs) := s
let some tc := tc?
| return (dep.name, depTc?, tcs)
if depTc ≤ tc then
return (src, tc, tcs)
else if tc < depTc then
return (dep.name, depTc, tcs)
let some tc := s.tc?
| return s.replace dep.name depTc? dep.fixedToolchain
if dep.fixedToolchain then
if s.fixed then
if tc = depTc then
return s
else
return s.addClash dep.name depTc dep.fixedToolchain -- true
else
if tc ≤ depTc then
return s.replace dep.name depTc dep.fixedToolchain -- true
else
return s.addClash dep.name depTc dep.fixedToolchain -- true
else
return (src, tc, tcs.push (dep.name, depTc))
if depTc ≤ tc then
return s
else if !s.fixed && tc < depTc then
return s.replace dep.name depTc dep.fixedToolchain -- false
else
return s.addClash dep.name depTc dep.fixedToolchain -- false
if 0 < tcs.size then
let s := "toolchain not updated; multiple toolchain candidates:"
let s := if let some tc := tc? then s!"{s}\n {tc}\n from {src}" else s
let s := tcs.foldl (init := s) fun s (d, tc) => s!"{s}\n {tc}\n from {d}"
let addEntry s tc src fixed :=
let fixed := if fixed then " (fixed toolchain)" else ""
s!"{s}\n {tc}\n from {src}{fixed}"
let s := if let some tc := tc? then addEntry s tc src fixed else s
let s := tcs.foldl (init := s) fun s ⟨src, tc, fixed⟩ => addEntry s tc src fixed
logWarning s
else if let some tc := tc? then
if rootTc?.any (· == tc) then
@@ -333,7 +378,7 @@ where
loadUpdatedDep wsIdx dep matDep
@[inline] loadUpdatedDep wsIdx dep matDep : StateT Workspace (UpdateT LoggerIO) Package := do
let depPkg loadDepPackage wsIdx matDep dep.opts leanOpts true
addDependencyEntries depPkg
addDependencyEntries depPkg matDep
return depPkg
/-- Write package entries to the workspace manifest. -/
@@ -347,6 +392,7 @@ def Workspace.writeManifest
| none => arr -- should only be the case for the root
let manifest : Manifest := {
name := ws.root.baseName
fixedToolchain := ws.root.fixedToolchain
lakeDir := ws.relLakeDir
packagesDir? := ws.relPkgsDir
packages := manifestEntries


@@ -32,7 +32,6 @@ public def loadWorkspaceRoot (config : LoadConfig) : LogIO Workspace := do
Lean.searchPathRef.set config.lakeEnv.leanSearchPath
let lakeConfig ← loadLakeConfig config.lakeEnv
let (root, env?) ← loadPackageCore "[root]" {config with pkgIdx := 0}
let root := {root with outputsRef? := CacheRef.mk}
let ws : Workspace := {
root
lakeEnv := config.lakeEnv


@@ -462,6 +462,11 @@
"default": false,
"description": "Whether downstream packages can `import all` modules of this package.\n\nIf enabled, downstream users will be able to access the `private` internals of modules, including definition bodies not marked as `@[expose]`. This may also, in the future, prevent compiler optimizations that rely on `private` definitions being inaccessible outside their own package."
},
"fixedToolchain": {
"type": "boolean",
"default": false,
"description": "Whether this package is expected to only function on a single toolchain (the package's toolchain).\n\nThis informs Lake's toolchain update procedure (in `lake update`) to prioritize this package's toolchain. It also avoids the need to separate input-to-output mappings for this package by toolchain version in the Lake cache."
},
"enableArtifactCache": {
"type": "boolean",
"description": "Whether to enable Lake's local, offline artifact cache for the package.\n\nArtifacts (i.e., build products) of packages will be shared across local copies by storing them in a cache associated with the Lean toolchain.\nThis can significantly reduce initial build times and disk space usage when working with multiple copies of large projects or large dependencies.\n\nAs a caveat, build targets which support the artifact cache will not be stored in their usual location within the build directory. Thus, projects with custom build scripts that rely on the specific location of artifacts may wish to disable this feature.\n\nIf `none` (the default), this will fall back to (in order):\n* The `LAKE_ARTIFACT_CACHE` environment variable (if set).\n* The workspace root's `enableArtifactCache` configuration (if set and this package is a dependency).\n* **Lake's default**: The package can use artifacts from the cache, but cannot write to it."

Binary file not shown.

Some files were not shown because too many files have changed in this diff